

Category Archives: Quantum Computing

Quantum Computing Technologies Market, Share, Application Analysis, Regional Outlook, Competitive Strategies & Forecast up to 2025 – AlgosOnline

Market Study Report, LLC, has added a detailed study on the Quantum Computing Technologies market which provides a brief summary of the growth trends influencing the market. The report also includes significant insights pertaining to the profitability graph, market share, regional proliferation and SWOT analysis of this business vertical. The report further illustrates the status of key players in the competitive setting of the Quantum Computing Technologies market, while expanding on their corporate strategies and product offerings.

The report on the Quantum Computing Technologies market presents insights regarding the major growth drivers, potential challenges, and key opportunities that shape industry expansion over the analysis period.

Request a sample Report of Quantum Computing Technologies Market at: https://www.marketstudyreport.com/request-a-sample/2673446?utm_source=algosonline.com&utm_medium=AG

According to the study, the industry is predicted to witness a CAGR of XX% over the forecast timeframe (2020-2025) and is anticipated to generate significant returns by the end of the study period.

The COVID-19 outbreak has caused ups and downs across industries, introducing uncertainty into the business space. Along with the immediate short-term impact of the pandemic, some industries are expected to face challenges on a long-term basis.

Most businesses across various industries are taking measures to address this uncertainty and have revisited their budgets to draft a roadmap for profitability in the coming years. The report helps companies operating in this business vertical prepare a robust contingency plan that takes all notable aspects into consideration.

Key inclusions of the Quantum Computing Technologies market report:

Ask for Discount on Quantum Computing Technologies Market Report at: https://www.marketstudyreport.com/check-for-discount/2673446?utm_source=algosonline.com&utm_medium=AG

Quantum Computing Technologies Market segments covered in the report:

Regional segmentation: North America, Europe, Asia-Pacific, South America, Middle East and Africa

Product types:

Applications spectrum:

Competitive outlook:

For More Details On this Report: https://www.marketstudyreport.com/reports/global-quantum-computing-technologies-market-2020-by-company-regions-type-and-application-forecast-to-2025

Some of the Major Highlights the TOC Covers:

Chapter 1: Methodology & Scope

Definition and forecast parameters

Methodology and forecast parameters

Data Sources

Chapter 2: Executive Summary

Business trends

Regional trends

Product trends

End-use trends

Chapter 3: Quantum Computing Technologies Industry Insights

Industry segmentation

Industry landscape

Vendor matrix

Technological and innovation landscape

Chapter 4: Quantum Computing Technologies Market, By Region

Chapter 5: Company Profile

Business Overview

Financial Data

Product Landscape

Strategic Outlook

SWOT Analysis

Related Reports:

2. Global Mechanical Computer Aided Design (MCAD) Market 2021 by Company, Regions, Type and Application, Forecast to 2026: The Mechanical Computer Aided Design (MCAD) Market Report covers manufacturers' data, including shipments, price, revenue, gross profit, interview records and business distribution, which helps buyers understand the competitors better. The report also covers all regions and countries of the world, showing regional development status, including market size, volume and value, as well as price data. It additionally covers data on customers across different industries, which is critical for manufacturers. Read More: https://www.marketstudyreport.com/reports/global-mechanical-computer-aided-design-mcad-market-2021-by-company-regions-type-and-application-forecast-to-2026

Contact Us: Corporate Sales, Market Study Report LLC | Phone: 1-302-273-0910 | Toll Free: 1-866-764-2150 | Email: [emailprotected]


Quantum Computing And Investing – ValueWalk

At a conference on quantum computing and finance on December 10, 2020, William Zeng, head of quantum research at Goldman Sachs, told the audience that quantum computing could have a revolutionary impact on the bank, and on finance more broadly. In a similar vein, Marco Pistoia of JP Morgan stated that new quantum machines will boost profits by speeding up asset pricing models and digging up better-performing portfolios. While there is little dispute that quantum computing has great potential to perform certain mathematical calculations much more quickly, whether it can revolutionize investing by so doing is an altogether different matter.


The hope is that the immense power of quantum computers will allow investment managers to earn superior investment returns by uncovering exploitable patterns in prices and financial data. The dark side is that quantum computers will open the door to finding patterns that either do not actually exist or, if they did exist at one time, no longer do. In more technical terms, quantum computing may allow for a new level of unwarranted data mining and lead to further confusion regarding the role of nonstationarity.


Any actual sequence of numbers, even one generated by a random process, will have certain statistical quirks. Physicist Richard Feynman used to make this point with reference to the first 767 digits of Pi, replicated below. Allegedly (but unconfirmed), he liked to reel off the first 761 digits and then say "9-9-9-9-9" and so on.[1] If you only look at the first 767 digits, the run of six straight nines is clearly an anomaly, a potential investment opportunity. In fact, there is no discernible pattern in the digits of Pi. Feynman was purposely making fun of data mining by focusing on the first 767 digits.

3 .1 4 1 5 9 2 6 5 3 5 8 9 7 9 3 2 3 8 4 6 2 6 4 3 3 8 3 2 7 9 5 0 2 8 8 4 1 9 7 1 6 9 3 9 9 3 7 5 1 0 5 8 2 0 9 7 4 9 4 4 5 9 2 3 0 7 8 1 6 4 0 6 2 8 6 2 0 8 9 9 8 6 2 8 0 3 4 8 2 5 3 4 2 1 1 7 0 6 7 9 8 2 1 4 8 0 8 6 5 1 3 2 8 2 3 0 6 6 4 7 0 9 3 8 4 4 6 0 9 5 5 0 5 8 2 2 3 1 7 2 5 3 5 9 4 0 8 1 2 8 4 8 1 1 1 7 4 5 0 2 8 4 1 0 2 7 0 1 9 3 8 5 2 1 1 0 5 5 5 9 6 4 4 6 2 2 9 4 8 9 5 4 9 3 0 3 8 1 9 6 4 4 2 8 8 1 0 9 7 5 6 6 5 9 3 3 4 4 6 1 2 8 4 7 5 6 4 8 2 3 3 7 8 6 7 8 3 1 6 5 2 7 1 2 0 1 9 0 9 1 4 5 6 4 8 5 6 6 9 2 3 4 6 0 3 4 8 6 1 0 4 5 4 3 2 6 6 4 8 2 1 3 3 9 3 6 0 7 2 6 0 2 4 9 1 4 1 2 7 3 7 2 4 5 8 7 0 0 6 6 0 6 3 1 5 5 8 8 1 7 4 8 8 1 5 2 0 9 2 0 9 6 2 8 2 9 2 5 4 0 9 1 7 1 5 3 6 4 3 6 7 8 9 2 5 9 0 3 6 0 0 1 1 3 3 0 5 3 0 5 4 8 8 2 0 4 6 6 5 2 1 3 8 4 1 4 6 9 5 1 9 4 1 5 1 1 6 0 9 4 3 3 0 5 7 2 7 0 3 6 5 7 5 9 5 9 1 9 5 3 0 9 2 1 8 6 1 1 7 3 8 1 9 3 2 6 1 1 7 9 3 1 0 5 1 1 8 5 4 8 0 7 4 4 6 2 3 7 9 9 6 2 7 4 9 5 6 7 3 5 1 8 8 5 7 5 2 7 2 4 8 9 1 2 2 7 9 3 8 1 8 3 0 1 1 9 4 9 1 2 9 8 3 3 6 7 3 3 6 2 4 4 0 6 5 6 6 4 3 0 8 6 0 2 1 3 9 4 9 4 6 3 9 5 2 2 4 7 3 7 1 9 0 7 0 2 1 7 9 8 6 0 9 4 3 7 0 2 7 7 0 5 3 9 2 1 7 1 7 6 2 9 3 1 7 6 7 5 2 3 8 4 6 7 4 8 1 8 4 6 7 6 6 9 4 0 5 1 3 2 0 0 0 5 6 8 1 2 7 1 4 5 2 6 3 5 6 0 8 2 7 7 8 5 7 7 1 3 4 2 7 5 7 7 8 9 6 0 9 1 7 3 6 3 7 1 7 8 7 2 1 4 6 8 4 4 0 9 0 1 2 2 4 9 5 3 4 3 0 1 4 6 5 4 9 5 8 5 3 7 1 0 5 0 7 9 2 2 7 9 6 8 9 2 5 8 9 2 3 5 4 2 0 1 9 9 5 6 1 1 2 1 2 9 0 2 1 9 6 0 8 6 4 0 3 4 4 1 8 1 5 9 8 1 3 6 2 9 7 7 4 7 7 1 3 0 9 9 6 0 5 1 8 7 0 7 2 1 1 3 4 9 9 9 9 9 9
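
The claim is easy to check for yourself. Below is a minimal sketch (my own illustration, not from the article) using the third-party mpmath package to locate the run of nines; the 800-digit precision and variable names are arbitrary choices.

```python
# Minimal sketch: locate the "Feynman point" (six consecutive 9s) in the digits of Pi.
# Assumes the third-party mpmath package is installed (pip install mpmath).
from mpmath import mp

mp.dps = 800                            # work with roughly 800 significant digits
digits = str(mp.pi).replace(".", "")    # "31415926535..." as one plain digit string

pos = digits.find("999999")             # index 0 is the leading "3", index 1 is the first decimal
print("Six consecutive 9s begin at decimal digit", pos)
# Consistent with the footnote below, this prints 762: an eye-catching "anomaly"
# in an otherwise patternless sequence.
```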

When it comes to investing, there is only one sequence of historical returns. With sufficient computing power and with repeated torturing of the data, anomalies are certain to be detected. A good example is factor investing. A highly influential paper by Professors Eugene Fama and Kenneth French identified three systematic investment factors, and its publication started an industry focused on searching for additional factors. Research by Arnott, Harvey, Kalesnik and Linnainmaa reports that by year-end 2018 an implausibly large total of 400 significant factors had been discovered. One wonders how many such anomalies quantum computers might find.
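
To see how easily pure noise yields "significant" factors, here is a hedged illustration (my own sketch, not the methodology of the Arnott et al. study): test 400 candidate factors built from random returns and count how many clear a conventional significance bar.

```python
# Illustration only: how many of 400 pure-noise "factors" look statistically significant?
import numpy as np

rng = np.random.default_rng(0)
n_factors, n_months = 400, 360                               # 400 candidates, 30 years of monthly data
noise = rng.normal(0.0, 0.04, size=(n_factors, n_months))    # mean-zero returns by construction

t_stats = noise.mean(axis=1) / (noise.std(axis=1, ddof=1) / np.sqrt(n_months))
significant = int(np.sum(np.abs(t_stats) > 1.96))
print(f"{significant} of {n_factors} noise factors clear |t| > 1.96")
# Roughly 5% (about 20) clear the bar by chance alone; add enough specifications,
# transformations and sub-samples and the count climbs much higher.
```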

Factor investing is just one example among many. Richard Roll, a leading academic financial economist with in-depth knowledge of the anomalies literature, has also been an active financial manager. Based on his experience, Roll stated that his money management firms attempted to make money from numerous anomalies widely documented in the academic literature but failed to make a nickel.

The simple fact is that if you have machines that can look closely enough at any historical data set, they will find anomalies. For instance, what about the anomalous sequence 0123456789 in the expansion of Pi? That anomaly can be found beginning at digit 17,387,594,880.

The digits of Pi may be random, but they are stationary. The process that generates the first million digits is the same as the one which generates the million digits beginning at one trillion. The same is not true of investing. Consider, for example, providing a computer the sequence of daily returns on Apple stock from the day the company went public to the present. The computer could sift through the returns looking for patterns, but this is almost certainly a fruitless endeavor. The company that generated those returns is far from stationary. In 1978, Apple was run by two young entrepreneurs and had total revenues of $0.0078 billion. By 2019, the company was run by a large, experienced management team and had revenues of $274 billion, an increase of about 35,000 times. The statistical process generating those returns is almost certainly nonstationary due to fundamental changes in the company generating them. To a lesser extent, the same is true of nearly every listed company. The market is constantly in flux and companies are constantly evolving as consumer demands, government regulation, and technology, among other things, continually change. It is hard to imagine that even if there were past patterns in stock prices that were more than data mining, they would persist for long due to nonstationarity.

In the finance arena, computers and artificial intelligence work by using their massive data processing capabilities to find patterns that humans may miss. But in a nonstationary world, the ultimate financial risk is that by the time those patterns are identified they will be gone. As a result, computerized trading comes to resemble a dog chasing its tail. This leads to excessive trading and ever-rising costs without delivering superior results on average. Quantum computing risks simply adding fuel to the fire. Of course, there are individual cases where specific quant funds make highly impressive returns, but that too could be an example of data mining. Given the large number of firms in the money management business, the probability that a few do extraordinarily well is essentially one.

These criticisms are not meant to imply that quantum computing has no role to play in finance. For instance, it has great potential to improve the simulation analyses involved in assessing risk. The point here is that it will not be a holy grail for improving investment performance.

Despite the drawbacks associated with data mining and nonstationarity, there is one area in which the potential for quantum computing is particularly bright: marketing quantitative investment strategies. Selling quantitative investment has always been an art. It involves convincing people that the investment manager knows something that will make them money, but which is too complicated to explain to them and, in some cases, too complicated for the manager to understand. Quantum computing takes that sales pitch to a whole new level, because virtually no one will be able to understand how the machine decided that a particular investment strategy is attractive.

This skeptic's take is that quantum computing will have little impact on what is ultimately the source of successful investing: allocating capital to companies that have particularly bright prospects for developing profitable businesses in a highly uncertain and nonstationary world. Perhaps at some future date a computer will develop the business judgment to determine whether Tesla's business prospects justify its current stock price. Until then, being able to comb through historical data in search of obscure patterns at ever-increasing rates is more likely to produce profits through the generation of management fees than through the enhancement of investor returns.

[1] The Feynman story has been repeated so often that the sequence of 9s starting at digit 762 is now referred to as the Feynman point in the expansion of Pi.


Farewell 2020: Bleak, Yes. But a Lot of Good Happened Too – HPCwire

Here on the cusp of the new year, the catchphrase "2020 hindsight" has a distinctly different feel. Good riddance, yes. But also proof of science's power to mobilize and do good when called upon. There's gratitude from those who came through less scathed and, maybe, more willingness to assist those who didn't.

Despite the unrelenting pandemic, high performance computing (HPC) proved itself an able member of the worldwide community of pandemic fighters. We should celebrate that, perhaps quietly, since the work isn't done. HPC made a significant difference in speeding up and enabling vastly distributed research and funneling the results to those who could turn them into patient care, epidemiology guidance, and now vaccines. Remarkable really. Necessary, of course, but it actually got done too. (Forget the quarreling; that's who we are.)

Across the Tabor family of publications, we've run more than 200 pandemic-related articles. I counted nearly 70 significant pieces in HPCwire. The early standing up of Fugaku at RIKEN, now comfortably astride the Top500 for a second time and by a significant margin, to participate in COVID-19 research is a good metaphor for HPC's mobilization. Many people and organizations contributed to the HPC-versus-pandemic effort, and that continues.

Before spotlighting a few pandemic-related HPC activities and digging into a few other topics, let's do a speed-drive through the 2020 HPC/AI technology landscape.

Consolidation continued among chip players (Nvidia/Arm, AMD/Xilinx) while the AI chip newcomers (Cerebras, Habana (now Intel), SambaNova, Graphcore, et al.) were winning deals. Nvidia's new A100 GPU is amazing, and virtually everyone else is taking potshots for just that reason. Suddenly RISC-V looks very promising. Systems makers weathered 2020's storm with varying success, while IBM seems to be winding down its HPC focus; it also plans to split/spin off its managed infrastructure services. Firing up Fugaku (notably a non-accelerated system) quickly was remarkable. The planned Frontier (ORNL) supercomputer now has the pole position in the U.S. exascale race, ahead of the delayed Aurora (ANL).

The worldwide quantum computing frenzy is in full froth as the U.S. looks for constructive ways to spend its roughly $1.25 billion (U.S. National Quantum Initiative) and, impressively, China just delivered a demonstration of quantum supremacy. There's a quiet revolution going on in storage and memory (just ask VAST Data). Nvidia/Mellanox introduced its line of 400 Gb/s network devices while Ethernet launched its 800 Gb/s spec. HPC-in-the-cloud is now a thing, not a soon-to-be thing. AI is no longer an oddity but is quickly infusing throughout HPC (that happened fast).

Last but not least, hyperscalers demonstrably rule the IT roost. Chipmakers used to, consistently punching above their weight (sales volume). Not so much now.

Ok then. Apologies for the many important topics omitted (e.g. exascale and leadership systems, neuromorphic tech, software tools (can oneAPI flourish?), newer fabrics, optical interconnect, etc.).

Lets start.

I want to highlight two HPC pandemic-related efforts, one current and one early on, and also single out the efforts of Oliver Peckham, the HPCwire editor who leads our pandemic coverage, which began in earnest with articles on March 6 (Summit Joins the Fight Against the Coronavirus) and March 13 (Global Supercomputing Is Mobilizing Against COVID-19). Actually, the very first piece, Tech Conferences Are Being Canceled Due to Coronavirus, on March 3, was more about interrupted technology events, and we picked it up from our sister pub, Datanami, which ran it on March 2. We've since become a virtualized event world.

Here's an excerpt from the first Summit piece about modeling COVID-19's notorious spike:

Micholas Smith, a postdoctoral researcher at the University of Tennessee/ORNL Center for Molecular Biophysics (UT/ORNL CMB), used early studies and sequencing of the virus to build a virtual model of the spike protein. [A]fter being granted time on Summit through a discretionary allocation, Smith and his colleagues performed a series of molecular dynamics simulations on the protein, cycling through 8,000 compounds within a few days and analyzing how they bound to the spike protein, if at all.

"Using Summit, we ranked these compounds based on a set of criteria related to how likely they were to bind to the S-protein spike," Smith said in an interview with ORNL. In total, the team identified 77 candidate small-molecule compounds (such as medications) that they considered worthy of further experimentation, helping to narrow the field for medical researchers.

"It took us a day or two whereas it would have taken months on a normal computer," said Jeremy Smith, director of UT/ORNL CMB and principal researcher for the study. "Our results don't mean that we have found a cure or treatment for the Wuhan coronavirus. We are very hopeful, though, that our computational findings will both inform future studies and provide a framework that experimentalists will use to further investigate these compounds. Only then will we know whether any of them exhibit the characteristics needed to mitigate this virus."

The flood (and diversity) of efforts that followed was startling. Oliver's advice on what to highlight catches the flavor of the challenge: "You could go with something like the Fugaku vs. COVID-19 piece or the grocery store piece, maybe contrast them a bit, earliest vs. current simulations of viral particle spread, or something like the LANL retrospective piece vs. the piece I just wrote up on their vaccine modeling. Think that might work for a 'how far we've come' angle, either way."

There's too much to cover.

Last week we ran Oliver's article on LANL efforts to optimize vaccine distribution (At Los Alamos National Lab, Supercomputers Are Optimizing Vaccine Distribution). Here's a brief excerpt:

The new vaccines from Pfizer and Moderna have been deemed highly effective by the FDA; unfortunately, doses are likely to be limited for some time. As a result, many state governments are struggling to weigh difficult choices: should the most exposed, like frontline workers, be vaccinated first? Or perhaps the most vulnerable, like the elderly and immunocompromised? And after them, who's next?

LANL was no stranger to this kind of analysis: earlier in the year, the lab had used supercomputer-powered tools like EpiCast to simulate virtual cities populated by individuals with demographic characteristics to model how COVID-19 would spread under different conditions. "The first thing we looked at was whether it made a difference to prioritize certain populations such as healthcare workers or to just distribute the vaccine randomly," said Sara Del Valle, the LANL computational epidemiologist who is leading the lab's COVID-19 modeling efforts. "We learned that prioritizing healthcare workers first was more effective in reducing the number of COVID cases and deaths."

You get the idea. The well of HPC efforts to tackle and stymie COVID-19 is extremely deep. Turning unproven mRNA technology into a vaccine in record time was awe-inspiring and required many disciplines. For those unfamiliar with the mRNA mechanism, here's a brief CDC explanation as it relates to the new vaccines. Below are links to a few HPCwire articles on the worldwide effort to bring HPC computational power to bear. (The last is a link to the HPCwire COVID-19 Archive, which has links to all our major pandemic coverage):

COVID COVERAGE LINKS

Global Supercomputing Is Mobilizing Against COVID-19 (March 12, 2020)

Gordon Bell Special Prize Goes to Massive SARS-CoV-2 Simulations (November 19, 2020)

Supercomputer Research Leads to Human Trial of Potential COVID-19 Therapeutic Raloxifene (October 29, 2020)

AMDs Massive COVID-19 HPC Fund Adds 18 Institutions, 5 Petaflops of Power (September 14, 2020)

Supercomputer-Powered Research Uncovers Signs of Bradykinin Storm That May Explain COVID-19 Symptoms (July 28, 2020)

Researchers Use Frontera to Investigate COVID-19s Insidious Sugar Coating (June 16, 2020)

COVID-19 HPC Consortium Expands to Europe, Reports on Research Projects (May 28, 2020)

At SC20, an Expert Panel Braces for the Next Pandemic (December 17, 2020)

Whats New in Computing vs. COVID-19: Cerebras, Nvidia, OpenMP & More (May 18, 2020)

Billion Molecules Against COVID-19 Challenge to Launch with Massive Supercomputing Support (April 22, 2020)

Pandemic Wipes Out 2020 HPC Market Growth, Flat to 12% Drop Expected (March 31, 2020)

Folding@home Turns Its Massive Crowdsourced Computer Network Against COVID-19 (March 16, 2020)

2020 HPCwire Awards Honor a Year of Remarkable COVID-19 Research (December 23, 2020)

HPCWIRE COVID-19 COVERAGE ARCHIVE

Making sense of the processor world is challenging. Microprocessors are still the workhorses in mainstream computing, with Intel retaining its giant market share despite AMD's encroachment. That said, the rise of heterogeneous computing and blended AI/HPC requirements has shifted focus to accelerators. Nvidia's A100 GPU (54 billion transistors on 826 mm² of silicon, the world's largest seven-nanometer chip) was launched this spring. Then at SC20 Nvidia announced an enhanced version of the A100, doubling its memory to 80GB; it now delivers 2TB/s of bandwidth. The A100 is an impressive piece of work.

The A100's most significant advantage, says Rick Stevens, associate lab director, Argonne National Laboratory, is its multi-instance GPU capability.

"For many people the problem is achieving high occupancy, that is, being able to fill the GPU up, because that depends on how much work you have to do. [By] introducing this MIG, this multi-instance stuff that they have, they're able to virtualize it. Most of the real-world performance wins are actually kind of throughput wins by using the virtualization. What we've seen is our big performance improvement is not that individual programs run much faster; it's that we can run up to seven parallel things on each GPU. When you add up the aggregate performance, you get these factors of three to five improvement over the V100," said Stevens.

Meanwhile, Intel's Xe GPU line is slowly trickling to market, mostly in card form. At SC20 Intel announced plans to make its high performance discrete GPUs available to early access developers. Notably, the new chips have been deployed at ANL and will serve as a transitional development vehicle for the future (2022) Aurora supercomputer, subbing in for the delayed Intel Xe-HPC (Ponte Vecchio) GPUs that are the computational backbone of the system.

AMD, also at SC20, launched its latest GPU, the MI100. AMD says it delivers 11.5 teraflops peak double-precision (FP64), 46.1 teraflops peak single-precision matrix (FP32), 23.1 teraflops peak single-precision (FP32), 184.6 teraflops peak half-precision (FP16) floating-point performance, and 92.3 peak teraflops of bfloat16 performance. HPCwire reported, "AMD's MI100 GPU presents a competitive alternative to Nvidia's A100 GPU, rated at 9.7 teraflops of peak theoretical performance. However, the A100 is returning even higher performance than that on its FP64 Linpack runs." It will be interesting to see the specs of the GPU AMD eventually fields for use in its exascale system wins.

The stakes are high in what could become a GPU war. Today, Nvidia is the market leader in HPC.

Turning back to CPUs, which many in HPC/AI have begun to regard as the lesser half of CPU/GPU pairings. Perhaps that will change with the spectacular showing of Fujitsu's A64FX at the heart of Fugaku. Nvidia's proposed acquisition of Arm, not a done deal yet (regulatory concerns), would likely inject fresh energy into what was already a surging Arm push into the datacenter. Of course, Nvidia has jumped into the systems business with its DGX line and presumably wants a home-grown CPU. The big mover of the last couple of years, AMD's Epyc microprocessor line, continues its steady incursion into Intel x86 territory.

There's not been much discussion around Power10 beyond IBM's summer announcement that Power10 would offer a ~3x performance gain and ~2.6x core efficiency gain over Power9. The new executive director of the OpenPOWER Foundation, James Kulina, says attracting more chipmakers to build Power devices is a top goal. We'll see. RISC-V is definitely drawing interest, but exactly how it fits into the processor puzzle is unclear. Esperanto unveiled a RISC-V-based chip aimed at machine learning with 1,100 low-power cores based on the open-source RISC-V ISA. Esperanto reported a goal of 4,000 cores on a single device. Europe is betting on RISC-V. However, at least near-term, RISC-V variants are seen as specialized chips.

The CPU waters are murkier than ever.

Sort of off in a land of their own are the AI chip/system players. Their proliferation continues, with the early movers winning important deployments. Some observers think 2021 will start sifting the winners from the losers. Let's not forget that last year Intel stopped development of its newly-acquired Nervana line in favor of its even more newly-acquired Habana products. It's a high-risk, high-reward arena still.

PROCESSOR COVERAGE LINKS

Intel Xe-HP GPU Deployed for Aurora Exascale Development

Is the Nvidia A100 GPU Performance Worth a Hardware Upgrade?

LLNL, ANL and GSK Provide Early Glimpse into Cerebras AI System Performance

David Patterson Kicks Off AI Hardware Summit Championing Domain Specific Chips

Graphcore's IPU Tackles Particle Physics, Showcasing Its Potential for Early Adopters

Intel Debuts Cooper Lake Xeons for 4- and 8-Socket Platforms

Intel Launches Stratix 10 NX FPGAs Targeting AI Workloads

Nvidia's Ampere A100 GPU: Up to 2.5X the HPC, 20X the AI

AMD Launches Three New High-Frequency Epyc SKUs Aimed at Commercial HPC

IBM Debuts Power10; Touts New Memory Scheme, Security, and Inferencing

AMD's Road Ahead: 5nm Epyc, CPU-GPU Coupling, 20% CAGR

AI Newcomer SambaNova GA's Product Lineup and Offers New Service

Japan's AIST Benchmarks Intel Optane; Cites Benefit for HPC and AI

Storage and memory don't get the attention they deserve. 3D XPoint memory (Intel and Micron), declining flash costs, and innovative software are transforming this technology segment. Hard disk drives and tape aren't going away, but traditional storage management approaches such as tiering based on media type (speed/capacity/cost) are under attack. Newcomers WekaIO, VAST Data, and MemVerge are all-in on solid state, and a few leading-edge adopters (NERSC/Perlmutter) are taking the plunge. Data-intensive computing, driven by the data flood, and AI compute requirements (gotta keep those GPUs busy!) are the big drivers.

"Our storage systems typically see over an exabyte of I/O annually. Balancing this I/O-intensive workload with the economics of storage means that at NERSC, we live and breathe tiering. And this is a snapshot of the storage hierarchy we have on the floor today at NERSC. Although it makes for a pretty picture, we don't have storage tiering because we want to, and in fact, I'd go so far as to say it's the opposite of what we and our users really want. Moving data between tiers has nothing to do with scientific discovery," said NERSC storage architect Glenn Lockwood during an SC20 panel.

"To put some numbers behind this, last year we did a study that found that between 15% and 30% of that exabyte of I/O is not coming from our users' jobs, but instead coming from data movement between storage tiers. That is to say that 15% to 30% of the I/O at NERSC is a complete waste of time in terms of advancing science. But even before that study, we knew that both the changing landscape of storage technology and the emerging large-scale data analysis and AI workloads arriving at NERSC required us to completely rethink our approach to tiered storage," said Lockwood.

Not surprisingly, Intel and Micron (Optane/3D XPoint) are trying to accelerate the evolution. Micron released what it calls a heterogeneous-memory storage engine (HSE) designed for solid-state drives, memory-based storage and, ultimately, applications requiring persistent memory. "Legacy storage engines born in the era of hard disk drives have historically failed to architecturally provide for the increased performance and reduced latency of next-generation nonvolatile media," said the company. Again, we'll see.

Software-defined storage leveraging newer media has all the momentum at the moment, with all of the established players (IBM, DDN, Panasas, etc.) mixing those capabilities into their product sets. WekaIO and Intel have battled it out for the top IO500 spot the last couple of years, and Intel's DAOS (distributed asynchronous object store) is slated for use in Aurora.

"The concept of asynchronous I/O is very interesting," noted Ari Berman, CEO of the BioTeam research consultancy. "It's essentially a queue mechanism at the system write level so system waits in the processors don't have to happen while a confirmed write-back comes from the disks. So asynchronous I/O allows jobs to keep running while you're waiting on storage to happen, to a limit of course. That would really improve the data input-output pipelines in those systems. It's a very interesting idea. I like asynchronous data writes and asynchronous storage access. I can see there very easily being corruption that creeps into those types of things and data without very careful sequencing. It will be interesting to watch. If it works it will be a big innovation."
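
A generic way to picture the idea Berman describes (this sketch is my own and is not DAOS or any specific HPC stack): hand writes off to a background queue so the compute part of a job keeps running while storage confirms the write-backs later.

```python
# Minimal sketch of overlapping compute with asynchronous writes via a background queue.
import os
import tempfile
from concurrent.futures import ThreadPoolExecutor

def write_chunk(path: str, data: bytes) -> None:
    with open(path, "wb") as f:
        f.write(data)                      # the slow, blocking storage operation

outdir = tempfile.mkdtemp()
pending = []
with ThreadPoolExecutor(max_workers=4) as pool:
    for step in range(8):
        data = bytes([step % 256]) * 1_000_000        # stand-in for a compute step's output
        # Queue the write and immediately move on to the next "compute" step.
        pending.append(pool.submit(write_chunk, os.path.join(outdir, f"step{step}.bin"), data))
    for fut in pending:
        fut.result()                       # eventually confirm the write-backs ("to a limit, of course")
```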

Change is afoot and the storage technology community is adapting. Memory technology is also advancing.

Micron introduced a 176-layer 3D NAND flash memory at SC20 that it says increases read and write densities by more than 35 percent. JEDEC published the DDR5 SDRAM spec, the next-generation standard for random access memory (RAM), in the summer. Compared to DDR4, the DDR5 spec will deliver twice the performance and improved power efficiency, addressing ever-growing demand from datacenter and cloud environments, as well as artificial intelligence and HPC applications. At launch, DDR5 modules will reach 4.8 Gbps, providing a 50 percent improvement versus the previous generation. Density goes up four-fold, with maximum density increasing from 16 gigabits per die to 64 gigabits per die in the new spec. JEDEC representatives indicated there will be 8 Gb and 16 Gb DDR5 products at launch.

There are always the wildcards. IBM's memristive technology is moving closer to practical use. One outlier is DNA-based storage. Dave Turek, longtime IBMer, joined DNA storage start-up Catalog this year and says Catalog is working on proofs of concept with government agencies and a number of Fortune 500 companies. "Some of these are who's-who HPC players, but some are non-HPC players, many names you would recognize. We're at what I would say is the beginning of the commercial beginning." Again, we'll see.

STORAGE & MEMORY LINKS

SC20 Panel: OK, You Hate Storage Tiering. What's Next Then?

Intel's Optane/DAOS Solution Tops Latest IO500

Startup MemVerge on Memory-centric Mission

HPC Strategist Dave Turek Joins DNA Storage (and Computing) Company Catalog

DDN-Tintri Showcases Technology Integration with Two New Products

Intel Refreshes Optane Persistent Memory, Adds New NAND SSDs

Micron Boosts Flash Density with 176-Layer 3D NAND

DDR5 Memory Spec Doubles Data Rate, Quadruples Density

IBM Touts STT MRAM Technology at IEDM 2020

The Distributed File Systems and Object Storage Landscape: Who's Leading?

It's tempting to omit quantum computing this year. Too much happened to summarize easily, and the overall feel is of steady carry-on progress from 2019. There was, perhaps, a stronger pivot, at least by press-release count, towards seeking early applications for near-term noisy intermediate-scale quantum (NISQ) computers. Ion trap qubit technology got another important player in Honeywell, which formally rolled out its effort and first system. Intel also stepped out from the shadows a bit in terms of showcasing its efforts. D-Wave launched a giant 5,000-qubit machine (Advantage), again using a quantum annealing approach that's different from universal gate-based quantum systems. IBM announced a stretch goal of achieving one million qubits!

Calling quantum computing a market is probably premature, but monies are being spent. The Quantum Economic Development Consortium (QED-C) and Hyperion Research issued a forecast that projects the global quantum computing (QC) market, worth an estimated $320 million in 2020, to grow at a 27% CAGR between 2020 and 2024. That would reach approximately $830 million by 2024. Chump change? Perhaps, but it is real activity.
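
As a quick sanity check on those two numbers (standard CAGR arithmetic, nothing taken from the report itself):

```python
# Compound the 2020 estimate forward at the quoted CAGR and compare with the 2024 projection.
base_2020 = 320e6          # estimated 2020 market size, USD
cagr = 0.27
projected_2024 = base_2020 * (1 + cagr) ** 4
print(f"${projected_2024 / 1e6:.0f} million")   # ~833 million, in line with the ~$830M figure
```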

IBM's proposed Quantum Volume (QV) metric has drawn support as a broad benchmark of quantum computer performance. Honeywell promoted the QV 128 score of its launch system. In December IBM reported it too had achieved a QV of 128. The first QV reported by IBM was 16, in 2019 at the APS March meeting. Just what a QV of 128 means in determining practical usefulness is unclear, but it is steady progress, and even Intel agrees that QV is as good as any measure at the moment. DoE is also working on benchmarks, focusing a bit more on performance on given workloads.
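
For readers new to the metric, here is a rough paraphrase of IBM's published definition (my summary, not drawn from the article): quantum volume is two raised to the size of the largest "square" random model circuit (equal width and depth) the machine can execute while passing a heavy-output test, so a QV of 128 corresponds to width-7, depth-7 circuits.

```latex
% Paraphrase of the Quantum Volume definition (see IBM's papers for the exact protocol):
\log_2 V_Q \;=\; \max\{\, n \;:\; \text{width-}n,\ \text{depth-}n \text{ model circuits achieve heavy-output probability} > 2/3 \,\}
% Example: V_Q = 128 \;\Rightarrow\; n = \log_2 128 = 7.
```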

"[One] major component of benchmarking is asking what kind of resources it takes to run this or that interesting problem. Again, these are problems of interest to DoE, so basic science problems in chemistry and nuclear physics and things like that. What we'll do is take applications in chemistry and nuclear physics and convert them into what we consider a benchmark. We consider it a benchmark when we can distill a metric from it. So the metric could be the accuracy, the quality of the solution, or the resources required to get a given level of quality," said Raphael Pooser, PI for DoE's Quantum Testbed Pathfinder project at ORNL, during an HPCwire interview.

Next year seems likely to bring more benchmarking activity around system quality, qubit technology, and performance on specific problem sets. Several qubit technologies still vie for sway: superconducting, trapped ion, optical, quantum dots, cold atoms, et al. The need to operate at near-zero (kelvin) temperatures complicates everything. Google claimed to achieve quantum supremacy last year. This year a group of Chinese researchers also did so. The groups used different qubit technologies (superconducting vs. optical), and China's effort tried to skirt criticisms that were lobbed at Google's effort. Frankly, both efforts were impressive. Russia reported early last year that it would invest $790 million in quantum, with achieving quantum supremacy as one goal.

What's happening now is a kind of pell-mell rush among a larger and increasingly diverse quantum ecosystem (hardware, software, consultants, governments, academia). Fault-tolerant quantum computing still seems distant, but clever algorithms and error mitigation strategies to make productive use of NISQ systems, likely on narrow applications, look more and more promising.


The persistent question is when all of these efforts will pay off and whether they will be as game-changing as many believe. With new money flowing into quantum, one has the sense there will be few abrupt changes in the next couple of years, barring untoward economic turns.

QUANTUM COVERAGE LINKS

IBM's Quantum Race to One Million Qubits

Google's Quantum Chemistry Simulation Suggests Promising Path Forward

Intel Connects the (Quantum) Dots in Accelerating Quantum Computing Effort

D-Wave Delivers 5000-qubit System; Targets Quantum Advantage

Honeywell Debuts Quantum System, Subscription Business Model, and Glimpse of Roadmap

Global QC Market Projected to Grow to More Than $800 million by 2024

ORNL's Raphael Pooser on DoE's Quantum Testbed Project

Rigetti Computing Wins $8.6M DARPA Grant to Demonstrate Practical Quantum Computing

Braket: Amazon's Cloud-First Quantum Environment Is Generally Available

IBM-led Webinar Tackles Quantum Developer Community Needs

Microsoft's Azure Quantum Platform Now Offers Toshiba's Simulated Bifurcation Machine

As always, there's personnel shuffling. Lately hyperscalers have been taking HPC folks. Two long-time Intel executives, Debra Goldfarb and Bill Magro, recently left for the cloud: Goldfarb to AWS as director for HPC products and strategy, and Magro to Google as CTO for HPC. Going in the other direction, John Martinis left Google's quantum development team and recently joined Australian start-up Silicon Quantum Computing. Ginni Rometty, of course, stepped down as CEO and chairman at IBM. IBM's long-time HPC exec Dave Turek left to take a position with DNA storage start-up Catalog, and last January, IBMer Brad McCredie joined AMD as corporate VP, GPU platforms.


Collaboration is the Future – Mediate.com

Lawyers love conflict. They thrive on it. If anyone can coexist with conflict, it's a lawyer.

At least that's how most people think of lawyers. In reality, the opposite is more often true. The only people who love conflict might be candidates for the therapist's couch. Most of us, especially lawyers, are averse to it.

The lawyer turned clinical psychologist Larry Richard has given personality assessments to over 5,000 lawyers over 20 years. As a tribe, lawyers are disproportionately low in the personality traits of resilience and sociability. Resilience is the mark of emotional intelligence that allows one to accept failure, rejection and loss. We're not so good at that, it turns out.

That may be, but what does that have to do with the economics of a successful legal practice or law department? It might surprise a few of us who subscribe to the zealous-advocacy theory of legal practice that collaboration is more economically sustainable than competition alone.

Hold this thought in mind: in 2017 $10 billion in legal services revenue went from the BigLaw vault into the pockets of alternative legal service providers that are not law firms.

Why? Our conflict aversion is our greatest enemy in the Exponential Age of digital data, artificial intelligence and blockchain technologies. Doing better, faster and cheaper is the mantra of the collaborative economy. The legal business model that has worked extremely well in the competitive economy is on the verge of collapse, though that claim may seem a bit grandiose, even for a lawyer. But let's examine the evidence.

Unresolved Conflict in Workplaces is Expensive

Howatt HR Consulting provides a conflict cost calculator to gauge the cost of unresolved conflict in law firms and legal departments. I recently ran the calculator from the perspective of the most conflict-rich workplace I remember being a part of. It cost only $100,000 per year in lost productivity, absenteeism, health-care claims, turnover and other profit-destroying contributors. And that is simply the impact of one person in that workplace! Howatt points out that the Canadian economy suffers a loss of over $16 billion each year due to unresolved conflict in the nation's workplaces.

It's customarily calculated that the cost of an employee's turnover, through termination or voluntary departure and then replacement, is 120 percent of that employee's annual compensation. For a $55,000-a-year paralegal, the cost of losing him or her is $66,000. Lost productivity, training and bringing a replacement to the same level of performance as a predecessor is not cheap.

At the British Legal Technology Forum 2018, Kevin Gold, a Mishcon de Reya managing partner, stated in a plenary session that the firm had calculated the cost of bringing a new young attorney to the point of return on investment; it was £250,000, or roughly $340,000.

I have listened as partners proudly describe the economic brilliance of their firm's leverage model in terms such as, "We have one associate make partner for every eight associates we hire. They're expendable. If they can't figure out how to succeed in our business model, we don't need them. There are more waiting for the empty chair." But losing seven associates for every one who makes partner is a very expensive proposition. Most associates who aren't going to make partner are gone, voluntarily or otherwise, before they achieve third-year status.

According to Gold, the young lawyers at Mishcon de Reya become revenue-neutral somewhere close to their third year. Under the business model in my partner-friend's firm, the firm loses about $2.5 million for every successful associate. Adjust the variables however you wish, and the loss from treating associate attorneys as fungible is economically foolhardy, if not disastrous.
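
A back-of-envelope reading of the figures above (my illustration; it assumes, crudely, that each departed associate sinks roughly the full ramp-up cost Gold quoted):

```python
# Rough arithmetic behind the "about $2.5 million per successful associate" claim.
ramp_up_cost = 340_000          # ~GBP 250,000 / ~$340,000 to bring a young lawyer to ROI (Gold)
hired_per_new_partner = 8       # "one associate make[s] partner for every eight associates we hire"
departed = hired_per_new_partner - 1

loss_per_successful_associate = departed * ramp_up_cost
print(f"${loss_per_successful_associate:,}")    # $2,380,000 -- in the ballpark of the ~$2.5M cited
```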

Similarly, the numerous accounts and studies of lateral attorney hires reflect how rarely the transition is economically beneficial for the firm. The laterally hired partner usually makes out like a bandit, but the firm often breaks even at best. More often the transaction is a loss leader. It may be worth the headlines, but the effect on the bottom line can be less than rosy.

Of course, the law is one of the only professions that prohibits noncompete agreements for its practitioners. A high-value executive can be bound by non-competes, but not lawyers. As a former firm executive committee member, I can attest that we often said a law firm is the only business that allows its inventory to walk out the door each night. If the lawyer doesn't return the next day, neither do their clients, in most cases. When negotiating with a lateral attorney, the deal is usually cut on the basis of the attorney's portable business.

What's the cause of all this lost revenue and profit? Unresolved conflict is usually the culprit. Perhaps it's the associate who isn't popular enough with the firm's power brokers and influencers to be worth the effort to resource, train, develop and treat as the asset Mishcon de Reya recognizes him or her to be. Or partners at odds with each other over origination credits in the last compensation wars are more likely to engage in passive-aggressive behavior than to have a conversation intended to reach agreement over a proper allocation of credit.

Admit it, you know it's true. After 40 years of legal practice, I've witnessed more unresolved conflict in law firms and legal departments than in prisons. Prisoners just take it outside. Lawyers demonstrate what we call "Nashville Nice" around these parts. You learn how to smile to their faces and then stab them in the back with a politically correct criticism in the Nashville fashion: "Oh, she's a nice person, and I would never say anything bad about her, bless her heart." That's conflict aversion.

Frankly, it's more than an economic problem. It's a societal, emotional and health problem. Lawyer addiction, suicide and relational dysfunction exceed the general norms by a large margin. That, too, is an economic scourge.

The statistics cannot be questioned. Gender diversity in law school is far superior to that in law firms, legal departments, firm management committees, partnerships and the executive suite. Racial diversity doesn't even begin to reflect the population. The steady reduction in diversity as the organizational level of power and status increases is an indictment of our entire profession. What are the economic costs? The answer is simply unimaginable, and totally unacceptable.

Thriving in the Collaborative Economy

We all remember the 1L experience, when the most intimidating professor in our assigned classes made the recurrent sobering remark: "Look to your right, look to your left . . . ." Thus began our steady march into the competitive mindset of thinking like a lawyer. Unfortunately, for those of us wired that way, this culture of competition fed all our worst instincts. For others it was soul-destroying. Richard, the lawyer turned clinical psychologist, indicates that's the reason he became a psychologist.

While the law has perfected radical competitiveness, the rest of the business world is becoming radically collaborative. This transformative transition is due to the inevitability of digital power and pace. For a full exploration of the exponential nature of the Digital Age and its impact on commerce and culture, read The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies, by Erik Brynjolfsson and Andrew McAfee. The authors brilliantly compare the attributes of the first half of the Machine Age, from the steam engine up to 2006, to the second half. The first was competitive, leading to scarcity. The second, also known as the Exponential Age, is collaborative, leading to abundance.

A recent visit to Silicon Valley revealed how cooperative business has become. I spoke with a software engineer working for Dell who supervises a software development team. Nothing abnormal about that. However, he manages a team whose members change every day, on projects that change every day. A Dell engineer manages a team that one day might consist of developers from Microsoft, SAP, Google, Apple and others. They are working on open-source software that builds open-source software, for the benefit of all.

Some say attorneys could never do that. It would be unethical, wouldn't it? Ask Pfizer and the small number of law firms that won the privilege of doing Pfizer's legal work. A few years ago the pharmaceutical company required its successful law firm bidders to share work product, lessons learned and mistakes made with the other Pfizer core counsel after each matter. That's distinctly unconventional, and the hallmark of successful business models in the Exponential Age.

Many other professions have already arrived in the cooperative age of business. Preparing for a recent training program for the Vanderbilt Medical School Leadership College, I discovered Quantum Leadership: Building Better Partnerships for Sustainable Health, by Tim Porter-O'Grady and Kathy Malloch. Remove the word "health" and replace it with "law" and the parallels are unmistakable. The tools of technology, artificial intelligence, blockchain, the internet of things and cryptocurrency are, or will be, changing everything. Even quantum computing has arrived, making traditional computing look like the tortoise versus the hare; that is, quantum computers can calculate 100,000 times faster. As a result, the old keep-it-so-no-one-else-can-get-it mindset is evaporating. Do you want to work on IBM's quantum computer, operating at 20 qubits and soon to be 50 qubits? It's free and open source. Go right ahead.

When did all this happen, you ask? Seemingly overnight, and without warning. That's exponential. As a result, no disciplinary expertise is sufficient in itself. Cognitive diversity is the fuel of innovation. Seeing a problem from the same perspective leads to the same old solutions. Seeing the same problem from multiple perspectives (gender, racial, religious, sexual orientation, disability and national origin) brings creativity to the table, and competition is inimical to its success.

What quantum leadership requires is a new form of leadership: one that's radically collaborative. The old commercial model is hierarchical, structured and highly command-and-control oriented. The new model is flat, team-based and relational.

The new commercial model is focused on accountability rather than responsibility, and output rather than effort. My life as a lawyer was spent selling effort, not output. Time has been the coin of the realm in the law since 1956, when the ABA informed lawyers that "time is your most valuable asset." Man, did we buy that, and so did our clients, until they tired of it. Now they want value, not effort.

The difference between the old commercial order and the new is stunning. Working in teams is not taught in law school. I have been teaching Legal Project Management at Vanderbilt Law School for six years. Law students routinely report that this class is the first time they have been asked to work in a team in law school, unless they are joint J.D./M.B.A. candidates. Business students don't understand why law school doesn't value teamwork. Therein lies one of our greatest problems: our clients are team-based, and we don't know how to do that.

Replacing Hypercompetition with Collaboration

Let's return to the question of the missing $10 billion. How could BigLaw lose that much value in a year? Let's examine the data.

The data isn't secret. It's been building over 10 years. It's more than an aberration; it's a statistical trend. The data is submitted voluntarily by the nation's largest law firms, namely the Am Law 300, on a monthly basis and reported in the Thomson Reuters Peer Monitor Index reports. Although anonymized, the data collected over the last 10 years is stunning. Law firms are losing market share steadily, relentlessly and without response.

Spend time with the data reported in the Georgetown Law Center's and Thomson Reuters Legal Executive Institute's annual Report on the State of the Legal Market. Ten years of BigLaw self-reporting reveals the following: all the data reflecting financial progress, in time billed and billings realized, collected and banked in law firm treasuries, is in long-term decline. There are two rising trends: rates and costs. This dangerous economic state is obvious to everyone. Nothing is being done, except by a few high-flying firms that have figured out the antidote to demise.

Check out Table 15 in the Georgetown/Thomson Reuters report. The missing $10 billion went to nonlawyers and non-law firms such as PwC, Deloitte, Axiom, lexunited, Pangea3, LegalZoom and a growing host of alternative legal service providers doing law better, faster and cheaper, and sometimes without a law license. That's what the market wants.

The report pulls no punches this year. It states: "Stop doubling down on your failing strategy!" Citing the Harvard Business Review analysis by the same title, the report warns BigLaw leaders that their conflict aversion could make these hallmark firms irrelevant.

How so? Harvard and Georgetown Law cite the power of our mind-blindness in the face of economic peril. It's all about heuristics, the states of mind that partially determine how we react to stress and threat. Our worldview is only valuable in the context of how it was formed. Another way of saying it is, "You can't tell a room full of millionaires their business model is broken." They can't hear it. This is not a function of intelligence but of experience. We can't know what we don't know.

Specifically, the mental heuristics that take over our cognitive capacity in times of economic peril can be summarized with startling reality in the following ways:

When combined, these mental heuristics, which reflect simply how the human brain works, can be a toxic brew of mind-blindness, obscuring paths to rescue and ways out of a dilemma of our own making.

What's a body to do? We must overcome our conflict aversion and welcome a path to open, respectful and strategic conflict competence, rather than our preferred resort to passive-aggressive behavior.

The Harvard Business Review article suggests rules to follow to achieve conflict competence:

Embracing the Cooperative Economy

Although unfamiliar to those of us steeped in a competitive model of economic success, the world has moved on and is continuing to stake out new opportunities for economic success through previously unheard-of degrees of cooperative effort.

Start small and learn as you go. Discover the power and the scope of building bridges rather than silos. As our digital world continues to explode in data and the power to process it, learn to learn from other disciplines. Make friends with a data scientist, a software engineer or a legal project manager. Learn to see from their perspectives.

And, most importantly, jump in, the water's fine.


Tech trends in 2021: How artificial intelligence and technology will reshape businesses – The Financial Express

What better time than now to unveil what to look out for in the world of AI and technology in 2021.

By Prithwis De

The year 2020 will be marked as an unprecedented year in history due to the adverse impact of the coronavirus worldwide. The pandemic has started bringing extraordinary changes in some key areas. The trends of faster drug development, effective remote care, efficient supply chains, etc., will continue into 2021. Drone technology is already playing a vital role in delivering food and other essentials alongside relief activities.

With Covid-19 came a new concept, the Internet of Behaviour, within organisations to track human behaviour in the work environment and trace any slack in maintaining guidelines. From now on, organisations are set to capture and combine behaviour-related data from different sources and use it. We can assertively say it will affect the way organisations interact with people going forward. Students are experiencing distance learning, taking examinations under remotely-monitored and proctored surveillance systems through identity verification and authentication in real time.

All these changes will have a high impact on technology, which will shape our outlook in the future. Businesses around the globe are taking a giant leap to become tech-savvy with quantum computing, artificial intelligence (AI), cybersecurity, etc. AI and cloud computing are alluring us all towards an environment of efficiency, security, optimisation and confidence. What better time than now to unveil what to look out for in the world of AI and technology in 2021?

What 2020 has paved the way for is quantum computing. Now, be prepared to adapt to a hybrid computing approach (conventional plus quantum computing) to problem-solving. This paradigm shift in computing will result in the emergence of new ways to solve existing business problems and ideate new opportunities. Its effects will be visible in our ability to perform better in diverse areas: financial forecasting, weather prediction, drug and vaccine development, blood-protein analysis, supply chain planning and optimisation, etc. Quantum Computing as a Service (QCaaS) will be a natural choice for organisations to plug into the experiments as we advance. Forward-thinking businesses are excited to take the quantum leap, but the transition is still at a nascent stage. This new year will be a crucial stepping stone towards the changes to come in the following years.

Cloud providers such as Amazon (AWS), Microsoft (Azure) and Google will continue to hog the limelight as the AI tool providers for most companies leaning towards real-time experiments in their business processes in the months to follow. Efficiency, security and customisation are the advantages for which serverless and hybrid cloud computing are gaining firm ground with big enterprises. It will continue to do so in 2021.

Going forward, the aim is to make the black box of AI transparent with explainable AI. The lack of clarity still hampers our ability to trust AI. Automated machine learning (AutoML), another crucial area, is likely to be very popular in the near future. One more trend that caught on like wildfire in 2020 is Machine Learning Operations (MLOps). It gives organisations visibility into their models and has become an efficient tool to steer clear of duplicated effort in AI. Most companies have been graduating from AI experiments and pilot projects to implementation. This endeavour is bound to grow further and enable AI experts to have more control over their work from end to end.

Cybersecurity will gain prime importance in 2021 and beyond. Hacking and cybercrime prevention are priorities for every business, because sensitive data has become easier to reach with advanced phishing tools. Advanced prediction algorithms, together with AI, will play a decisive role in preventing such breaches of data security.
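One hedged example of what "prediction algorithms for security" can mean in practice is anomaly detection over activity logs. The toy sketch below uses scikit-learn's IsolationForest on invented login-activity features; it only illustrates the pattern and is not a production defence.

import numpy as np
from sklearn.ensemble import IsolationForest

# Toy login-activity features: [requests_per_minute, failed_login_ratio]
rng = np.random.default_rng(0)
normal = rng.normal(loc=[20, 0.05], scale=[5, 0.02], size=(200, 2))
suspicious = np.array([[300, 0.9], [250, 0.8]])   # bursts of failed logins

# Fit on normal traffic, then score new activity: -1 flags an outlier, +1 is normal.
model = IsolationForest(contamination=0.02, random_state=0).fit(normal)
print(model.predict(suspicious))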

AI and the Internet of Things, along with edge computing (processing data close to its source, on or near the device at the edge of the network), will usher in a new era of actionable insights from vast amounts of data. In-memory, accelerated, real-time AI will be needed, particularly now that 5G has started creating new opportunities for disruption.
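To illustrate the edge idea, here is a minimal sketch, assuming a single numeric sensor stream and an arbitrary three-sigma rule: the device itself decides whether a reading is anomalous, so only alerts, rather than the raw stream, need to cross the network.

from collections import deque
from statistics import mean, stdev

# A lightweight anomaly check meant to run on the device itself.
window = deque(maxlen=50)

def process_reading(value, threshold=3.0):
    """Flag a reading that deviates strongly from the recent rolling window."""
    if len(window) >= 10:
        mu, sigma = mean(window), stdev(window)
        if sigma > 0 and abs(value - mu) > threshold * sigma:
            return "ALERT"   # send only this event upstream to the cloud
    window.append(value)
    return "ok"

for reading in [21.0, 21.2, 20.9, 21.1, 21.0, 20.8, 21.3, 21.1, 21.0, 20.9, 48.5]:
    print(reading, process_reading(reading))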

In 2020 there was a dip in overall funding, as the pandemic badly hit the investment sector and reduced activity. Some technology start-ups are still unable to cope with the challenges created by Covid-19 and the consequent worsening economic conditions; according to NASSCOM, around 40% of Indian start-ups were forced to stop operations. In 2021, mergers and acquisitions of start-ups are expected, with larger companies likely to target smaller ones specialising in niche, innovative areas such as drug development, cybersecurity, AI chips, cloud computing and MLOps.

Businesses in 2021 and beyond will develop into more efficient workplaces for everybody who believes in the power of technology. It is important to bear in mind that these trends are not independent of one another; they support each other and work in tandem with human intervention. So, are hybrid trends and solutions here to stay for the next few years, keeping organisations running smoothly? Only time will tell. But the need for AI, newer technology adoption and modernisation will only grow manifold.

The author is an analytics and AI professional based in London, working at a large IT company. Views are personal.



Go here to read the rest:
Tech trends in 2021: How artificial intelligence and technology will reshape businesses - The Financial Express

Posted in Quantum Computing | Comments Off on Tech trends in 2021: How artificial intelligence and technology will reshape businesses – The Financial Express

Tech trends to watch in 2021 – India Today

The year 2020 has been one of the most unpredictable years, and in parallel we have seen technology transform various sectors in ways that have genuinely helped humanity predict and prepare for catastrophic conditions. With the Covid-19 pandemic as one such situation, many scientists, engineers and other technologists have realised that a lot of development is still required to make life easier with accessible technology. Here are some of the top tech trends to watch in 2021:

In the last decade we have seen that there is no limit to technology, and with the rise of digitalisation in India there will be a need for quantum computing to help protect banking systems and IT security from cybercrime. With database processing a critical strength of quantum computing, technologies such as artificial intelligence will be among the applications that benefit significantly from the superior processing power of quantum computers.

It is therefore likely that there will be intense competition among the big IT players to provide quantum-computing-based services in cybersecurity, drug development, climate prediction and other areas.

IoT applications have long faced two challenges: range and battery life. Both are now being overcome with the help of NB-IoT (Narrowband IoT). Given that approximately 21 billion devices are expected to be connected by 2025, there will be strong competition between telecoms such as Jio, Airtel, Vodafone and others to provide cost-effective, efficient solutions to their consumers in SaaS (Software as a Service) and PaaS (Platform as a Service) models. India is working actively on NB-IoT: in a first, BSNL, together with Skylo, has launched the world's first satellite-based NB-IoT network to streamline sectors including fishing, farming, construction, mining and logistics.


IPA (Intelligent Process Automation) is the advanced version of RPA (Robotic Process Automation): it combines RPA with machine learning. Because of the Covid-19 outbreak, much of the IT industry has signalled that permanent work from home may be announced, and some companies, including TCS, Deloitte and Twitter, have already done so. It is therefore imperative for any industry to gauge the engagement, productivity and output of its workforce in this scenario.

IPA techniques are therefore expected to increase process efficiency, improve customer experience, optimise workforce productivity and generate a relative surge in revenue. In 2019-2020 we saw how chatbots helped firms automate customer interaction and thereby reduce operational costs. Similarly, IPA techniques will help firms of any kind turn raw data into structured data, reducing human error and enhancing customer satisfaction.
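As a hedged, minimal illustration of "turning raw data into structured data", the sketch below uses plain Python and regular expressions to pull fields out of semi-structured text. The order lines, field names and amounts are invented for the example; a real IPA pipeline would layer ML-based extraction and validation on top of this kind of step.

import re

# Raw, semi-structured text of the kind an RPA bot might scrape from emails.
raw = """
Order #1042 from Acme Ltd, amount: Rs 15,300, due 2021-02-15
Order #1043 from Birla Traders, amount: Rs 8,250, due 2021-03-01
"""

pattern = re.compile(
    r"Order #(?P<order_id>\d+) from (?P<customer>[^,]+), "
    r"amount: Rs (?P<amount>[\d,]+), due (?P<due>\d{4}-\d{2}-\d{2})"
)

records = [m.groupdict() for m in pattern.finditer(raw)]
for r in records:
    r["amount"] = int(r["amount"].replace(",", ""))  # normalise for downstream use
print(records)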

Artificial intelligence will expand its footprint across sectors including the military, defence, agriculture, automotive, education, medicine and construction; the power and scope of AI are effectively endless. According to Fox News, an artificial-intelligence algorithm developed by Heron Systems swept a human F-16 pilot 5-0 in a simulated dogfight in August 2020. Another milestone was the launch of GPT-3, an autoregressive language model developed by the OpenAI team that uses deep learning to produce human-like text.

The model generates text of such quality that it can be difficult to tell whether it was written by a human or a machine. In the agriculture sector, too, AI techniques are expected to help increase crop productivity and thereby raise farmers' incomes.
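To show what "autoregressive" means in this context, here is a deliberately tiny sketch: a word-bigram model that generates each new word conditioned only on the previous one. GPT-3 itself is a transformer network over tokens, so this is an analogy for the sampling loop, not a description of the real model.

import random

# Train a toy bigram model: record which word tends to follow which.
corpus = "the farmer checks the soil and the farmer waters the crop".split()
model = {}
for prev, nxt in zip(corpus, corpus[1:]):
    model.setdefault(prev, []).append(nxt)

# Generate text autoregressively: sample each new word given the previous one.
random.seed(0)
word, output = "the", ["the"]
for _ in range(8):
    word = random.choice(model.get(word, corpus))
    output.append(word)
print(" ".join(output))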

With the announcement of NEP 2020 by the Ministry of Education, learning patterns will change across institutions. We can expect a rise in technologies such as artificial intelligence, machine learning, big data and blockchain, and the education ministry will put strenuous effort into upgrading the quality of India's education to build a skilled workforce.

The much-awaited 5G, or fifth-generation cellular network technology, is expected to launch in 2021 as telecom giants including Bharti Airtel, Jio and Vodafone Idea ramp up to move from early trials to commercialisation with their respective partners.

Meanwhile, Reliance CMD Mukesh Ambani has already declared that Jio is ready with the infrastructure and will pioneer the 5G revolution in India in the second half of 2021. Technology has clearly made our lives easier and better in many unfortunate situations, and it will be a primary need in the future to have humans and machines work together to protect people.

-Article by Abhishek Gupta, CEO & Co-founder, Hex N Bit


See the rest here:
Tech trends to watch in 2021 - India Today

Posted in Quantum Computing | Comments Off on Tech trends to watch in 2021 – India Today