

Category Archives: Quantum Computing

The Spooky Quantum Phenomenon You’ve Never Heard Of – Quanta Magazine

Perhaps the most famously weird feature of quantum mechanics is nonlocality: Measure one particle in an entangled pair whose partner is miles away, and the measurement seems to rip through the intervening space to instantaneously affect its partner. This "spooky action at a distance" (as Albert Einstein called it) has been the main focus of tests of quantum theory.

"Nonlocality is spectacular. I mean, it's like magic," said Adán Cabello, a physicist at the University of Seville in Spain.

But Cabello and others are interested in investigating a lesser-known but equally magical aspect of quantum mechanics: contextuality. Contextuality says that properties of particles, such as their position or polarization, exist only within the context of a measurement. Instead of thinking of particles' properties as having fixed values, consider them more like words in a language, whose meanings can change depending on the context: "Time flies like an arrow. Fruit flies like bananas."

Although contextuality has lived in nonlocality's shadow for over 50 years, quantum physicists now consider it more of a hallmark feature of quantum systems than nonlocality is. "A single particle, for instance, is a quantum system in which you cannot even think about nonlocality," since the particle is only in one location, said Bárbara Amaral, a physicist at the University of São Paulo in Brazil. "So [contextuality] is more general in some sense, and I think this is important to really understand the power of quantum systems and to go deeper into why quantum theory is the way it is."

Researchers have also found tantalizing links between contextuality and problems that quantum computers can efficiently solve that ordinary computers cannot; investigating these links could help guide researchers in developing new quantum computing approaches and algorithms.

And with renewed theoretical interest comes a renewed experimental effort to prove that our world is indeed contextual. In February, Cabello, in collaboration with Kihwan Kim at Tsinghua University in Beijing, China, published a paper in which they claimed to have performed the first loophole-free experimental test of contextuality.

The Northern Irish physicist John Stewart Bell is widely credited with showing that quantum systems can be nonlocal. By comparing the outcomes of measurements of two entangled particles, he showed with his eponymous theorem of 1964 that the high degree of correlations between the particles can't possibly be explained in terms of local "hidden variables" defining each one's separate properties. The information contained in the entangled pair must be shared nonlocally between the particles.

Bell also proved a similar theorem about contextuality. He and, separately, Simon Kochen and Ernst Specker showed that it is impossible for a quantum system to have hidden variables that define the values of all its properties in all possible contexts.

In Kochen and Specker's version of the proof, they considered a single particle with a quantum property called spin, which has both a magnitude and a direction. Measuring the spin's magnitude along any direction always results in one of two outcomes: 1 or 0. The researchers then asked: Is it possible that the particle secretly knows what the result of every possible measurement will be before it is measured? In other words, could they assign a fixed value (a hidden variable) to all outcomes of all possible measurements at once?

Quantum theory says that the magnitudes of the spins along three perpendicular directions must obey the "101 rule": the outcomes of two of the measurements must be 1 and the other must be 0. Kochen and Specker used this rule to arrive at a contradiction. First, they assumed that each particle had a fixed, intrinsic value for each direction of spin. They then conducted a hypothetical spin measurement along some unique direction, assigning either 0 or 1 to the outcome. They then repeatedly rotated the direction of their hypothetical measurement and measured again, each time either freely assigning a value to the outcome or deducing what the value must be in order to satisfy the 101 rule together with the directions they had previously considered.

They continued until, in the 117th direction, the contradiction cropped up. While they had previously assigned a value of 0 to the spin along this direction, the 101 rule was now dictating that the spin must be 1. The outcome of a measurement could not possibly return both 0 and 1. So the physicists concluded that there is no way a particle can have fixed hidden variables that remain the same regardless of context.
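
The flavor of the argument can be captured in a few lines of code. Below is a minimal, hypothetical sketch (not Kochen and Specker's actual 117-direction construction; the direction labels and contexts are made up for illustration): it brute-force searches for a fixed 0-or-1 assignment such that every orthogonal triple obeys the 101 rule. For a toy set of contexts an assignment still exists; the content of the theorem is that for 117 carefully chosen directions, none does.

```python
from itertools import product

# Toy illustration of the Kochen-Specker "101 rule" check. Each
# measurement context is an orthogonal triple of directions; a
# noncontextual hidden-variable model must assign every direction a
# fixed outcome (0 or 1) such that each triple contains exactly one 0
# and two 1s.

def admits_noncontextual_assignment(directions, triples):
    """Brute-force search for a fixed 0/1 value per direction that
    satisfies the 101 rule in every context (triple)."""
    for values in product([0, 1], repeat=len(directions)):
        v = dict(zip(directions, values))
        # sum == 2 means exactly two 1s and one 0 in the triple
        if all(sum(v[d] for d in t) == 2 for t in triples):
            return True  # a classical (noncontextual) assignment exists
    return False

# Hypothetical direction labels, with "a" shared between two contexts.
directions = ["a", "b", "c", "d", "e"]
triples = [("a", "b", "c"), ("a", "d", "e")]
print(admits_noncontextual_assignment(directions, triples))  # True here;
# Kochen and Specker needed 117 carefully chosen directions before
# every such assignment became impossible.
```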

While the proof indicated that quantum theory demands contextuality, there was no way to actually demonstrate this through 117 simultaneous measurements of a single particle. Physicists have since devised more practical, experimentally implementable versions of the original Bell-Kochen-Specker theorem involving multiple entangled particles, where a particular measurement on one particle defines a context for the others.

In 2009, contextuality, a seemingly esoteric aspect of the underlying fabric of reality, got a direct application: One of the simplified versions of the original Bell-Kochen-Specker theorem was shown to be equivalent to a basic quantum computation.

The proof, named Mermin's star after its originator, David Mermin, considered various combinations of contextual measurements that could be made on three entangled quantum bits, or qubits. The logic of how earlier measurements shape the outcomes of later measurements has become the basis for an approach called measurement-based quantum computing. The discovery suggested that contextuality might be key to why quantum computers can solve certain problems faster than classical computers, an advantage that researchers have struggled mightily to understand.
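
The parity argument at the heart of Mermin's star can be checked directly with a small state-vector calculation. The sketch below (NumPy, standard Pauli-operator conventions assumed; it reproduces the textbook GHZ-state parity check, not Mermin's full star of contexts) shows that any fixed ±1 assignment to the local observables would force the product of the XYY, YXY and YYX outcomes to equal the XXX outcome, since each Y would appear twice, while quantum mechanics gives the opposite sign.

```python
import numpy as np

# Pauli matrices and the three-qubit GHZ state (|000> + |111>)/sqrt(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)

ghz = np.zeros(8, dtype=complex)
ghz[0] = ghz[7] = 1 / np.sqrt(2)

def expect(ops):
    """Expectation value of a tensor product of single-qubit operators."""
    O = ops[0]
    for op in ops[1:]:
        O = np.kron(O, op)
    return (ghz.conj() @ O @ ghz).real

xxx = expect([X, X, X])   # +1
xyy = expect([X, Y, Y])   # -1
yxy = expect([Y, X, Y])   # -1
yyx = expect([Y, Y, X])   # -1

# A noncontextual +/-1 assignment would force
# (XYY)(YXY)(YYX) = XXX; quantum mechanics gives (-1)**3 = -1 vs +1.
print(round(xxx, 6), round(xyy * yxy * yyx, 6))  # 1.0 -1.0
```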

Robert Raussendorf, a physicist at the University of British Columbia and a pioneer of measurement-based quantum computing, showed that contextuality is necessary for a quantum computer to beat a classical computer at some tasks, but he doesn't think it's the whole story. "Whether contextuality powers quantum computers is probably not exactly the right question to ask," he said. "But we need to get there question by question. So we ask a question that we understand how to ask; we get an answer. We ask the next question."

Some researchers have suggested loopholes around Bell, Kochen and Specker's conclusion that the world is contextual. They argue that context-independent hidden variables haven't been conclusively ruled out.

In February, Cabello and Kim announced that they had closed every plausible loophole by performing a loophole-free Bell-Kochen-Specker experiment.

The experiment entailed measuring the spins of two entangled trapped ions in various directions, where the choice of measurement on one ion defined the context for the other ion. The physicists showed that, although making a measurement on one ion does not physically affect the other, it changes the context and hence the outcome of the second ion's measurement.

Skeptics would ask: How can you be certain that the context created by the first measurement is what changed the second measurement outcome, rather than other conditions that might vary from experiment to experiment? Cabello and Kim closed this "sharpness" loophole by performing thousands of sets of measurements and showing that the outcomes don't change if the context doesn't. After ruling out this and other loopholes, they concluded that the only reasonable explanation for their results is contextuality.

Cabello and others think that these experiments could be used in the future to test the level of contextuality, and hence the power, of quantum computing devices.

"If you want to really understand how the world is working," said Cabello, "you really need to go into the detail of quantum contextuality."

See the original post here:
The Spooky Quantum Phenomenon You've Never Heard Of - Quanta Magazine


Global Quantum Computing Market is estimated to be US$ 4531.04 billion by 2030 with a CAGR of 28.2% during the forecast period – By PMI -…

Covina, June 22, 2022 (GLOBE NEWSWIRE) -- The discovery of potential COVID-19 therapeutics has a bright future due to quantum computing. New approaches to drug discovery are being investigated with funding from the Penn State Institute for Computational and Data Sciences, coordinated through the Penn State Huck Institutes of the Life Sciences. For businesses in the quantum computing market, these tendencies are turning into lucrative opportunities during the forecast period. Research initiatives that are assisting in the screening of billions of chemical compounds to uncover suitable medication candidates have been made possible by the convergence of machine learning and quantum physics. Stakeholders in the quantum computing business are expanding the availability of supercomputers and growing R&D in artificial intelligence (AI) to support these studies. The energy and power sector also offers lucrative potential for businesses in the quantum computing market. With regard to well assets, workovers, and infrastructure, this technology is assisting players in the energy and power sector in making crucial investment decisions. Budgetary considerations, resource constraints, and contractual commitments may all be factors in the issues that quantum computing can help to resolve.

Region Analysis:

North America is predicted to hold a large market share for quantum computing due to its early adoption of cutting-edge technology. Additionally, the existence of a competitive market and end-user acceptance of cutting-edge technology may promote market growth. Sales are anticipated to increase throughout Europe as a result of the rise of multiple startups, favourable legislative conditions, and the growing use of cloud technology. In addition, leading companies' expansion is anticipated to accelerate market growth. The market is anticipated to grow in Asia Pacific as a result of the growing need for quantum computing solutions for simulation, optimization, and machine learning.

Before purchasing this report, request a sample or make an inquiry by clicking the following link:

https://www.prophecymarketinsights.com/market_insight/Insight/request-sample/571

Key Market Insights from the report:

Global Quantum Computing Market size accounted for US$ 387.3 billion in 2020 and is estimated to be US$ 4531.04 billion by 2030, registering a CAGR of 28.2%. The Global Quantum Computing Market is segmented based on component, application, end-user industry and region.
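
As a quick sanity check of the release's arithmetic, the stated start and end figures imply a compound annual growth rate of roughly 27.9%, close to the quoted 28.2% (a one-line illustration, using the figures exactly as stated):

```python
# Implied CAGR from the press release's own figures (US$ bn, 2020 -> 2030).
start, end, years = 387.3, 4531.04, 10
implied_cagr = (end / start) ** (1 / years) - 1
print(f"{implied_cagr:.1%}")  # ~27.9%, roughly the quoted 28.2% CAGR
```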

Competitive Landscape & Strategies in the Quantum Computing Market:

Key players in the global quantum computing market include D-Wave Systems Inc, 1QB Information Technologies Inc, QC Ware Corp, Google Inc, QxBranch LLC, Microsoft Corporation, International Business Machines Corporation, Huawei Technologies Co., Ltd, ID Quantique SA, and Atos SE.

Scope of the Report:

Global Quantum Computing Market, By Component, 2019-2029 (US$ Mn)


Browse Related Reports:

1. Photonic Integrated Circuit Market, By Integration (Monolithic Integration, Hybrid Integration, and Module Integration), By Raw Material (Gallium Arsenide, Indium Phosphide, Silica On Silicon, Silicon On Insulator, and Lithium Niobate), By Application (Optical Fiber Communication, Optical Fiber Sensors, Biomedical, and Quantum Computing), and By Region (North America, Europe, Asia-Pacific, Latin America, and Middle East & Africa) - Trends, Analysis, and Forecast till 2029

2. Edge Computing Market, By Component (Hardware, Services, Platform, and Solutions), By Application (Location Services, Analytics, Data Caching, Smart Cities, Environmental Monitoring, Optimized Local Content, Augmented Reality, and Others), By End-User (Telecommunication & IT, Healthcare, Government & Public, Retail, Media & Entertainment, Transportation, Energy & Utilities, and Manufacturing), and By Region (North America, Europe, Asia Pacific, Latin America, and Middle East & Africa) - Trends, Analysis, and Forecast till 2029

3. Global 5G Technology Infrastructure Market, By Communication Infrastructure (Small Cell, Macro Cell, Radio Access Network, and Distributed Antenna System), By Network Technology (Software Defined Networking & Network Function Virtualization, Mobile Edge Computing, and Fog Computing), By Application (Automotive, Energy & Utilities, Healthcare, Retail, and Others), and By Region (North America, Europe, Asia Pacific, Latin America, and Middle East & Africa) - Trends, Analysis and Forecast till 2029

See the article here:
Global Quantum Computing Market is estimated to be US$ 4531.04 billion by 2030 with a CAGR of 28.2% during the forecast period - By PMI -...


Alan Turing’s Everlasting Contributions to Computing, AI and Cryptography – NIST

An Enigma machine on display outside the Alan Turing Institute entrance inside the British Library, London.

Credit: Shutterstock/William Barton

Suppose someone asked you to devise the most powerful computer possible. Alan Turing, whose reputation as a central figure in computer science and artificial intelligence has only grown since his untimely death in 1954, applied his genius to problems such as this one in an age before computers as we know them existed. His theoretical work on this problem and others remains a foundation of computing, AI and modern cryptographic standards, including those NIST recommends.

The road from devising the most powerful computer possible to cryptographic standards has a few twists and turns, as does Turing's brief life.

Alan Turing

Credit: National Portrait Gallery, London

In Turing's time, mathematicians debated whether it was possible to build a single, all-purpose machine that could solve all problems that are computable. For example, we can compute a car's most energy-efficient route to a destination, and (in principle) the most likely way in which a string of amino acids will fold into a three-dimensional protein. Another example of a computable problem, important to modern encryption, is whether or not bigger numbers can be expressed as the product of two smaller numbers. For example, 6 can be expressed as the product of 2 and 3, but 7 cannot be factored into smaller integers and is therefore a prime number.

Some prominent mathematicians proposed elaborate designs for universal computers that would operate by following very complicated mathematical rules. It seemed overwhelmingly difficult to build such machines. It took the genius of Turing to show that a very simple machine could in fact compute all that is computable.

His hypothetical device is now known as a Turing machine. The centerpiece of the machine is a strip of tape, divided into individual boxes. Each box contains a symbol (such as A, C, T, G for the letters of genetic code) or a blank space. The strip of tape is analogous to today's hard drives that store bits of data. Initially, the string of symbols on the tape corresponds to the input, containing the data for the problem to be solved. The string also serves as the memory of the computer. The Turing machine writes onto the tape data that it needs to access later in the computation.

The device reads an individual symbol on the tape and follows instructions on whether to change the symbol or leave it alone before moving to another symbol. The instructions depend on the current state of the machine. For example, if the machine needs to decide whether the tape contains the text string "TC", it can scan the tape in the forward direction while switching between the states "previous letter was T" and "previous letter was not T". If while in state "previous letter was T" it reads a C, it goes to a state "found it" and halts. If it encounters the blank symbol at the end of the input, it goes to the state "did not find it" and halts. Nowadays we would recognize the set of instructions as the machine's program.
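
For readers who think in code, here is a minimal Python rendering of that two-state scanner (the state names and return messages mirror the description above; the encoding is illustrative, not NIST's):

```python
def contains_tc(tape):
    """Two-state scan for the substring "TC", in the spirit of the
    Turing-machine example above."""
    state = "prev_was_not_T"
    for symbol in tape:                 # read one cell at a time
        if state == "prev_was_T" and symbol == "C":
            return "found it"           # halt in the accepting state
        state = "prev_was_T" if symbol == "T" else "prev_was_not_T"
    return "did not find it"            # hit the blank end of the tape

print(contains_tc("GATTC"))  # found it
print(contains_tc("GCTAG"))  # did not find it
```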

It took some time, but eventually it became clear to everyone that Turing was right: The Turing machine could indeed compute all that seemed computable. No number of additions or extensions to this machine could extend its computing capability.

To understand what can be computed it is helpful to identify what cannot be computed. In a previous life as a university professor I had to teach programming a few times. Students often encounter the following problem: "My program has been running for a long time; is it stuck?" This is called the Halting Problem, and students often wondered why we simply couldn't detect infinite loops without actually getting stuck in them. It turns out a program to do this is an impossibility. Turing showed that there does not exist a machine that detects whether or not another machine halts. From this seminal result followed many other impossibility results. For example, logicians and philosophers had to abandon the dream of an automated way of detecting whether an assertion (such as whether there are infinitely many prime numbers) is true or false, as that is uncomputable. If you could do this, then you could solve the Halting Problem simply by asking whether the statement "this machine halts" is true or false.
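
The standard contradiction can be sketched in a few lines, under the explicit (and impossible) assumption that a perfect halting detector exists; the hypothetical halts() function below is exactly what Turing proved cannot be written:

```python
# Classic sketch of why a halting detector cannot exist. halts() is
# hypothetical: Turing proved no always-correct implementation of it
# can be written.

def halts(program):
    """Hypothetically returns True iff calling program() would halt."""
    raise NotImplementedError("uncomputable, per Turing")

def paradox():
    if halts(paradox):   # suppose the oracle says "paradox halts"...
        while True:      # ...then loop forever, contradicting it
            pass
    # ...and if it says "paradox loops forever", halt immediately.
    # Either answer the oracle gives is wrong, so no such oracle exists.
```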

Turing went on to make fundamental contributions to AI, theoretical biology and cryptography. His involvement with this last subject brought him honor and fame during World War II, when he played a very important role in adapting and extending cryptanalytic techniques invented by Polish mathematicians. This work broke the German Enigma machine encryption, making a significant contribution to the war effort.

Turing was gay. After the war, in 1952, the British government convicted him for having sex with a man. He stayed out of jail only by submitting to what is now called chemical castration. He died in 1954 at age 41 by cyanide poisoning, which was initially ruled a suicide but may have been an accident according to subsequent analysis. More than 50 years would pass before the British government apologized and pardoned him (after years of campaigning by scientists around the world). Today, the highest honor in computer science is called the Turing Award.

Turing's computability work provided the foundation for modern complexity theory. This theory tries to answer the question "Among those problems that can be solved by a computer, which ones can be solved efficiently?" Here, "efficiently" means not in billions of years but in milliseconds, seconds, hours or days, depending on the computational problem.

For example, much of the cryptography that currently safeguards our data and communications relies on the belief that certain problems, such as decomposing an integer number into its prime factors, cannot be solved before the Sun turns into a red giant and consumes the Earth (currently forecast for 4 billion to 5 billion years). NIST is responsible for cryptographic standards that are used throughout the world. We could not do this work without complexity theory.
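
To see why naive factoring is hopeless at cryptographic scales, consider trial division: the sketch below works instantly for small numbers, but the number of candidate divisors grows exponentially with the bit length of n, which is why moduli hundreds of digits long are considered safe. (A simple illustration, not the state of the art in factoring.)

```python
def trial_factor(n):
    """Naive trial division: fine for small n, hopeless for the
    hundreds-of-digits moduli used in real cryptography."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d   # found a nontrivial factorization
        d += 1
    return None                # n is prime

print(trial_factor(15))        # (3, 5)
print(trial_factor(7))         # None: 7 is prime
```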

Technology sometimes throws us a curve, such as the discovery that if a sufficiently big and reliable quantum computer is built it would be able to factor integers, thus breaking some of our cryptography. In this situation, NIST scientists must rely on the world's experts (many of them in-house) in order to update our standards. There are deep reasons to believe that quantum computers will not be able to break the cryptography that NIST is about to roll out. Among these reasons is that Turing's machine can simulate quantum computers. This implies that complexity theory gives us limits on what a powerful quantum computer can do.

But that is a topic for another day. For now, we can celebrate how Turing provided the keys to much of today's computing technology and even gave us hints on how to solve looming technological problems.

See the original post here:
Alan Turing's Everlasting Contributions to Computing, AI and Cryptography - NIST


Quantum computing: D-Wave shows off prototype of its next quantum annealing computer – ZDNet

Image: Wacomka/Shutterstock

Quantum-computing outfit D-Wave has announced commercial access to an "experimental prototype" of its Advantage2 quantum annealing computer.

D-Wave is beating its own path to qubit processors with its quantum annealing approach. According to D-Wave, the Advantage2 prototype available today features over 500 qubits. It's a preview of a much larger Advantage2 system, with 7,000 qubits, that the company hopes to make available by 2024.

Access to the Advantage2 prototype is restricted to customers who have a subscription to D-Wave's Leap cloud service, but developers interested in trying D-Wave's quantum cloud can sign up to get "one minute of free use of the actual quantum processing units (QPUs) and quantum hybrid solvers" that run on its earlier Advantage QPU.

The Advantage2 prototype is built with D-Wave's Zephyr connection technology, which it claims offers higher connectivity between qubits than its predecessor topology, Pegasus, used in the Advantage QPU.

D-Wave says the Zephyr design enables shorter chains in its Advantage2 quantum chips, which can make them friendlier for calculations that require extra precision.

"The Advantage2 prototype is designed to share what we're learning and gain feedback from the community as we continue to build towards the full Advantage2 system," says Emile Hoskinson, director of quantum annealing products at D-Wave.

"With Advantage2, we're pushing that envelope again demonstrating that connectivity and reduction in noise can be a delivery vehicle for even greater performance once the full system is available. The Advantage2 prototype is an opportunity for us to share our excitement and give a sneak peek into the future for customers bringing quantum into their applications."

While quantum computing is still experimental, senior execs are preparing for it to arrive as a business disruptor by 2030, according to a survey by consultancy EY. The firm found that 81% of senior UK executives expect quantum computing to play a significant role in their industry by 2030.

Fellow consultancy McKinsey this month noted funding for quantum technology startups doubled in the past two years, from $700 million in 2020 to $1.4 billion in 2021. McKinsey sees quantum computing shaking up pharmaceuticals, chemicals, automotive, and finance industries, enabling players to "capture nearly $700 billion in value as early as 2035" through improved simulation and better machine learning. It expects revenues from quantum computing to exceed $90 billion by 2040.

D-Wave's investors include PSP Investments, Goldman Sachs, BDC Capital, NEC Corp, Aegis Group Partners, and the CIA's VC firm, In-Q-Tel.

See the rest here:
Quantum computing: D-Wave shows off prototype of its next quantum annealing computer - ZDNet


Quantum computing: Definition, facts & uses | Live Science

Quantum computing is a new generation of technology that involves a type of computer 158 million times faster than the most sophisticated supercomputer we have in the world today. It is a device so powerful that it could do in four minutes what it would take a traditional supercomputer 10,000 years to accomplish.

For decades, our computers have all been built around the same design. Whether it is the huge machines at NASA, or your laptop at home, they are all essentially just glorified calculators, but crucially they can only do one thing at a time.

The key to the way all computers work is that they process and store information made of binary digits called bits. These bits only have two possible values, a one or a zero. It is these numbers that create binary code, which a computer needs to read in order to carry out a specific task, according to the book Fundamentals of Computers.

Quantum theory is a branch of physics which deals in the tiny world of atoms and the smaller (subatomic) particles inside them, according to the journal Documenta Mathematica. When you delve into this minuscule world, the laws of physics are very different to what we see around us. For instance, quantum particles can exist in multiple states at the same time. This is known as superposition.

Instead of bits, quantum computers use something called quantum bits, 'qubits' for short. While a traditional bit can only be a one or a zero, a qubit can be a one, a zero or both at the same time, according to a paper from the IEEE International Conference on Big Data.

This means that a quantum computer does not have to wait for one process to end before it can begin another; it can do them at the same time.
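
In the state-vector picture, this "both at once" behaviour is just a vector of amplitudes. The toy NumPy sketch below (an illustration of the Born rule, not a model of any particular quantum computer) prepares an equal superposition and simulates repeated measurements:

```python
import numpy as np

# A qubit as a 2-component state vector: amplitudes for |0> and |1>.
zero = np.array([1, 0], dtype=complex)
one = np.array([0, 1], dtype=complex)
plus = (zero + one) / np.sqrt(2)       # equal superposition of 0 and 1

probs = np.abs(plus) ** 2              # Born rule: amplitude -> probability
print(probs)                           # [0.5 0.5]

rng = np.random.default_rng(0)
samples = rng.choice([0, 1], size=10, p=probs)  # simulated measurements
print(samples)                         # a random mix of 0s and 1s
```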

Imagine you had lots of doors which were all locked except for one, and you needed to find out which one was open. A traditional computer would keep trying each door, one after the other, until it found the one which was unlocked. It might take five minutes, it might take a million years, depending on how many doors there were. But a quantum computer could try all the doors at once. This is what makes them so much faster.

As well as superposition, quantum particles also exhibit another strange behaviour called entanglement which also makes this tech so potentially ground-breaking. When two quantum particles are entangled, they form a connection to each other no matter how far apart they are. When you alter one, the other responds the same way even if they're thousands of miles apart. Einstein called this particle property "spooky action at a distance", according to the journal Nature.
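
Entanglement, too, can be illustrated with a toy state vector. The sketch below (again NumPy, purely illustrative) prepares a Bell pair and samples joint measurements in the computational basis; the two results always match, which is the kind of correlation Einstein found so spooky:

```python
import numpy as np

# The Bell state (|00> + |11>)/sqrt(2): both qubits, measured in the
# computational basis, always yield matching results.
bell = np.zeros(4, dtype=complex)
bell[0b00] = bell[0b11] = 1 / np.sqrt(2)

probs = np.abs(bell) ** 2              # probabilities over 00,01,10,11
rng = np.random.default_rng(1)
outcomes = rng.choice(4, size=8, p=probs)
for o in outcomes:
    print(f"{int(o):02b}")             # only 00 or 11 ever appears
```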

As well as speed, another advantage quantum computers have over traditional computers is size. Moore's Law holds that computing power doubles roughly every two years, according to the journal IEEE Annals of the History of Computing. But in order to enable this, engineers have to fit more and more transistors onto a circuit board. A transistor is like a microscopic light switch which can be either off or on. This is how a computer processes a zero or a one that you find in binary code.

To solve more complex problems, you need more of those transistors. But no matter how small you make them there's only so many you can fit onto a circuit board. So what does that mean? It means sooner or later, traditional computers are going to be as smart as we can possibly make them, according to the Young Scientists Journal. That is where quantum machines can change things.

The quest to build quantum computers has turned into something of a global race, with some of the biggest companies and indeed governments on the planet vying to push the technology ever further, prompting a rise in interest in quantum computing stocks on the money markets.

One example is the device created by D-Wave. It has built the Advantage system which it says is the first and only quantum computer designed for business use, according to a press release from the company.

D-Wave said it has been designed with a new processor architecture with over 5,000 qubits and 15-way qubit connectivity, which it said enables companies to solve their largest and most complex business problems.

The firm claims the machine is the first and only quantum computer that enables customers to develop and run real-world, in-production quantum applications at scale in the cloud. The firm said the Advantage is 30 times faster and delivers equal or better solutions 94% of the time compared to its previous generation system.

But despite the huge, theoretical computational power of quantum computers, there is no need to consign your old laptop to the wheelie bin just yet. Conventional computers will still have a role to play in any new era, and are far more suited to everyday tasks such as spreadsheets, emailing and word processing, according to Quantum Computing Inc. (QCI).

Where quantum computing could really bring about radical change, though, is in predictive analytics. Because a quantum computer can make analyses and predictions at breakneck speeds, it would be able to predict weather patterns and perform traffic modelling, things where there are millions, if not billions, of variables that are constantly changing.

Standard computers can do what they are told well enough if they are fed the right computer programme by a human. But when it comes to predicting things, they are not so smart. This is why the weather forecast is not always accurate. There are too many variables, too many things changing too quickly for any conventional computer to keep up.

Because of their limitations, there are some computations which an ordinary computer may never be able to solve, or it might take literally a billion years. Not much good if you need a quick prediction or piece of analysis.

But a quantum computer is so fast, almost infinitely so, that it could respond to changing information quickly and examine a limitless number of outcomes and permutations simultaneously, according to research by Rigetti Computing.

Quantum computers are also relatively small because they do not rely on transistors like traditional machines. They also consume comparatively less power, meaning they could in theory be better for the environment.

You can read about how to get started in quantum computing in this article by Nature. To learn more about the future of quantum computing, you can watch this TED Talk by PhD student Jason Ball.

Read the original here:
Quantum computing: Definition, facts & uses | Live Science


McKinsey thinks quantum computing could create $80b in revenue … eventually – The Register

In the hype-tastic world of quantum computing, consulting giant McKinsey & Company claims that the still-nascent field has the potential to create $80 billion in new revenue for businesses across industries.

It's a claim McKinsey has repeated nearly two dozen times on Twitter since March to promote its growing collection of research diving into various aspects of quantum computing, from startup and government funding to use cases and its potential impact on a range of industries.

The consulting giant believes this $80 billion figure represents the "value at stake" for quantum computing players but not the actual value that use cases could create [PDF]. This includes companies working in all aspects of quantum computing, from component makers to service providers.

Despite wildly optimistic numbers, McKinsey does ground the report in a few practical realities. For instance, in a Wednesday report, the firm says the hardware for quantum systems "remains too immature to enable a significant number of use cases," which, in turn, limits the "opportunities for fledgling software players." The authors add that this is likely one of the reasons why the rate of new quantum startups entering the market has begun to slow.

Even the top of McKinsey's page for quantum computing admits that capable systems won't be ready until 2030, which is in line with what various industry players, including Intel, are expecting. Like fusion, it's always a decade or so away.

McKinsey, like every company navigating whether quantum computing has any real-world value, is trying to walk a fine line, exploring the possibilities of quantum computing while showing the ways the tech is still disconnected from ordinary enterprise reality.

"While quantum computing promises to help businesses solve problems that are beyond the reach and speed of conventional high-performance computers, use cases are largely experimental and hypothetical at this early stage. Indeed, experts are still debating the most foundational topics for the field," McKinsey wrote in a December 2021 article about how use cases "are getting real."

One could argue the report is something of a metaphor for the quantum industry in 2022: wild optimism about future ecosystem profitability without really understanding what the tech will mean, to whom, and at what scale.

Go here to see the original:
McKinsey thinks quantum computing could create $80b in revenue ... eventually - The Register
