
Category Archives: Quantum Computing

Businesses brace for quantum computing disruption by end of decade – The Register

While business leaders expect quantum computing to play a significant role in industry by 2030, some experts don't believe the tech is going to be ready for production deployment in the near future.

The findings, from a survey titled "2022 Quantum Readiness" commissioned by consultancy EY, refer to UK businesses, although it is likely that the conclusions are equally applicable to global organizations.

According to EY, 81 percent of senior UK executives expect quantum computing to have a significant impact in their industry within seven and a half years, with almost half (48 percent) believing that quantum technology will begin to transform industries as soon as 2025.

As for the naysayers who say quantum tech won't be ready for live deployment any time soon, they can point out that the industry also suffers from a hype problem, with capabilities exaggerated and even some accusations of falsification flying around, as in the case of quantum startup IonQ, which Scorpion Capital recently accused of misleading investors about the effectiveness of its quantum hardware.

Joseph Reger, Fujitsu Fellow, CTO for Central and Eastern Europe, and a member of the World Economic Forum's Quantum Computing Council, told The Register he is getting some "heat" for saying quantum is not nearly a thing yet.

"There are impressive advantages that pre-quantum or quantum-inspired technologies provide. They are less sexy, but very powerful."

He added: "Some companies are exaggerating the time scales. If quantum computing gets overhyped, we are likely to face the first quantum winter."

Fujitsu is itself developing quantum systems, and announced earlier this year that it was working to integrate quantum computing with traditional HPC technology. The company also unveiled a high-performance quantum simulator based on its PRIMEHPC FX 700 systems that it said will serve as an important bridge towards the development of quantum computing applications in the future.

Meanwhile, EY claims that respondents were "almost unanimous" in their belief that quantum computing will create a moderate or high level of disruption for their own organization, industry sector, and the broader economy in the next five years.

Despite this, the survey finds that strategic planning for quantum computing is still at an embryonic stage for most organizations: only 33 percent are involved in strategic planning for how quantum will affect them, and only a quarter have appointed specialist leaders or set up pilot teams.

The survey, conducted in February-March 2022, covered 501 UK-based executives, all with senior roles in their organisations, who had to demonstrate at least a moderate (but preferably a high) level of understanding of quantum computing. EY said it originally approached 1,516 executives, but only 501 met this requirement, which in itself tells a tale.

EY's Quantum Computing Leader, Piers Clinton-Tarestad, said the survey reveals a disconnect between the pace at which some industry leaders expect quantum to start affecting business and their preparedness for those impacts.

"Maximizing the potential of quantum technologies will require early planning to build responsive and adaptable organisational capabilities," he said, adding that this is a challenge because the progress of quantum has accelerated, but it is "not following a steady trajectory."

For example, companies with quantum processors have increased the power of their hardware dramatically over the past several years, from just a handful of qubits to over a hundred in the case of IBM, which expects to deliver a 4,158-qubit system by 2025. Yet despite these advances, quantum computers remain a curiosity, with most operational systems deployed in research laboratories or made available via a cloud service for developers to experiment with.

Clinton-Tarestad said "quantum readiness" is "not so much a gap to be assessed as a road to be walked," with the next steps in the process being regularly revisited as the landscape evolves. He warned that businesses expecting to see disruption in their industry within the next three to five years need to act now.

According to EY's report, executives in consumer and retail markets are those most likely to believe that quantum will play a significant role by 2025, with just over half of technology, media and telecommunications (TMT) executives expecting an impact within the same time frame. Most respondents among health and life sciences companies think this is more likely to happen later, between 2026 and 2035.

Most organizations surveyed expect to start their quantum preparations within the next two years, with 72 percent aiming to start by 2024.

However, only a quarter of organizations have got as far as recruiting people with the necessary skills to lead quantum computing efforts, although 68 percent said they are aiming to set up pilot teams to explore the potential of quantum for their business by 2024.

Fear of falling behind as rival companies develop their own quantum capabilities is driving some respondents to start quantum projects. The applications anticipated by industry leaders would advance operations involving AI and machine learning, especially among financial services, automotive and manufacturing companies, while TMT respondents cited potential applications in cryptography and encryption as the most likely use of quantum computing.

While the EY report warns about companies potentially losing out to rivals on the benefits of quantum computing, there are also dangers that organizations should be preparing for now, as Intel warned about during its Intel Vision conference last month.

One of these is that quantum computers could be used to break current cryptographic algorithms, meaning that the confidentiality of both personal and enterprise data could be at risk. This is not a far-off threat, but something that organizations need to consider right now, according to Sridhar Iyengar, VP of Intel Labs and Director of Security and Privacy Research.

"Adversaries could be harvesting encrypted data right now, so that they can decrypt it later when quantum computers are available. This could be sensitive data, such as your social security number or health records, which are required to be protected for a long period of time," Iyengar told us.

Organizations may want to address threats like this by taking steps such as evaluating post-quantum cryptography algorithms and increasing the key sizes for current crypto algorithms like AES.
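For the symmetric-key half of that advice, the reasoning is that Grover's algorithm offers at best a quadratic speedup on brute-force key search, so doubling the key length restores the security margin: AES-256 retains roughly 128 bits of effective strength against a quantum attacker. Below is a minimal sketch of what using 256-bit AES keys looks like with Python's `cryptography` package; the plaintext and associated data are placeholders, not anything from the EY report or Intel.

```python
# Minimal sketch: AES-256-GCM with Python's "cryptography" package.
# Grover's algorithm halves effective symmetric key strength, so a
# 256-bit key keeps ~128 bits of security against a quantum adversary.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # 256-bit key rather than 128-bit
aesgcm = AESGCM(key)

nonce = os.urandom(12)                      # 96-bit nonce; never reuse per key
plaintext = b"long-lived sensitive record"  # placeholder data
aad = b"record-id:12345"                    # placeholder associated data

ciphertext = aesgcm.encrypt(nonce, plaintext, aad)
assert aesgcm.decrypt(nonce, ciphertext, aad) == plaintext
```

Public-key algorithms such as RSA and elliptic-curve cryptography have no comparable quick fix, since Shor's algorithm breaks them outright; that is what the post-quantum algorithm evaluations mentioned above are for.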

Or they may simply decide to adopt a wait and see attitude. EY will no doubt be on hand to sell consultancy services to help clarify their thinking.

Read the original post:
Businesses brace for quantum computing disruption by end of decade - The Register


Quantum computing can solve EVs' safety woes – Times of India

Recent incidents of electric vehicles (EVs) catching fire have shocked the Indian ecosystem and hindered the broad adoption of these vehicles. Before March of this year, there had been a substantial rise in demand for electric vehicles and rapid advances in innovation and technology. Improvements in battery technology, through increased efficiency and range, have made EVs more accessible to the mass public, with the sector currently dominated by two-wheelers and three-wheelers in India. According to Mordor Intelligence, India's electric vehicle market was valued at $1.4 billion in 2021, and it is expected to reach $15.4 billion by 2027, recording a CAGR of 47.09% over the forecast period (2022-2027). Since March, the challenge in EVs has shifted from affordability, charging, and range anxiety to safety. Safety is of prime importance, and an EV catching fire has dire, even fatal, consequences.

The question is, why is this happening?

A report by the Defence Research and Development Organisation's (DRDO) Centre for Fire Explosive and Environment Safety points to the EV batteries. The issues highlighted include poor-quality cells, lack of a fuse, issues with thermal management, and the battery management system (BMS).

The highlighted issues cause the batteries to experience thermal runaway, leading to the fires. This phenomenon occurs when an increase in temperature changes conditions in a manner that causes a further increase in temperature, often with destructive results. The issues highlighted in the DRDO report are all potential causes of thermal runaway. Let's explain why.

Local atmospheric temperature directly affects the operating temperature of the battery. For efficient performance, a battery's operating temperature should be around 20-35°C. To keep the battery at this temperature, EVs need a battery thermal management system (BTMS). With temperatures rising in our cities, BTMS designs are being challenged, and poor thermal management of EV batteries may be what tips them into thermal runaway.
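To make the feedback loop concrete, here is a toy, Semenov-style model: heat generation follows Arrhenius kinetics and grows exponentially with temperature, while cooling through the BTMS grows only linearly, so above some ignition temperature the balance tips and the temperature diverges. All parameters below are invented for illustration and are not representative of any real cell.

```python
# Toy thermal-runaway model (Semenov-style): dT/dt = generation - cooling.
# Exothermic reactions release heat exponentially in T (Arrhenius), while
# convective cooling is only linear in T, so a hot enough start runs away.
import math

A = 5e5             # reaction pre-factor, K/s (illustrative)
EA_OVER_R = 6000.0  # activation temperature Ea/R, K (illustrative)
H = 1.0             # cooling coefficient, 1/s (illustrative)
T_AMB = 298.0       # ambient temperature, K

def simulate(t0_kelvin, dt=0.01, t_end=60.0, runaway_at=2000.0):
    """Euler-integrate the cell temperature and report the outcome."""
    temp = t0_kelvin
    for _ in range(int(t_end / dt)):
        generation = A * math.exp(-EA_OVER_R / temp)  # Arrhenius heat release
        cooling = H * (temp - T_AMB)                  # BTMS / convective loss
        temp += dt * (generation - cooling)
        if temp > runaway_at:
            return "thermal runaway"
    return f"stabilized near {temp:.0f} K"

print(simulate(310.0))  # mild warming: cooling wins, settles near ambient
print(simulate(900.0))  # severe hot spot (e.g. internal short): runs away
```

With these made-up numbers, the ignition point sits between the two starting temperatures; a weaker cooling coefficient (a poorer BTMS) or a hotter ambient moves that threshold down, which is the qualitative point of the paragraph above.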

Another possible cause of thermal runaway is rapid battery charging. As battery technology evolves, charging technology is advancing too. While fast charging greatly improves the convenience of EVs, it increases battery-related risks. Fast charging can overheat the battery system enough to melt electrical wires and cause short circuits, with explosive consequences, as several charging-related incidents have already shown.

While hot weather and an inadequate battery thermal management system can degrade performance and shorten battery life, they alone cannot cause thermal runaway. As the DRDO report notes, an inefficient fuse, or the absence of one, removes the fail-safe mechanism that would otherwise stop these conditions from escalating into thermal runaway.

The causes of thermal runaway highlighted above can stem from inefficient design or insufficient testing by EV manufacturers. But manufacturers cannot simply spend more time on testing, because of time-to-market constraints.

What's the solution?

As stated, design and testing are critical phases of any product's manufacture. Since the advent of Industry 4.0, design and testing have moved to digital form, carried out on large, powerful computers through what are called engineering simulations (referred to simply as "simulations" hereafter). Simulations come in various types, including thermal (studying the effect of heat and temperature on an object), structural (studying an object's strength, stress, and failure), fluid (studying flow in and around an object), and electrochemical (studying the effect of chemistry on electricity). Thermal runaway is a complex engineering problem entailing all of these types. With the right tools, simulations can mimic every relevant physical condition, such as rising temperature, fast charging, or fuse placement, and locate problem areas. Once problems are identified, simulations can also help test candidate solutions and thus avoid thermal runaway altogether.
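As a flavor of the thermal variety, the sketch below integrates the one-dimensional heat equation across a cell with an explicit finite-difference scheme; production simulation suites solve far larger 3D, multi-physics versions of the same idea. This is a generic textbook method, not any particular vendor's tool.

```python
# 1D transient heat conduction, explicit FTCS finite differences:
# dT/dt = alpha * d2T/dx2 -- a bare-bones "thermal simulation".
N = 50                           # grid nodes across the cell
ALPHA = 1e-5                     # thermal diffusivity, m^2/s (illustrative)
DX = 0.002                       # node spacing, m
DT = 0.9 * DX**2 / (2 * ALPHA)   # time step kept inside the stability limit

temps = [300.0] * N              # uniform 300 K initial field...
temps[N // 2] = 400.0            # ...with a hot spot, e.g. near a busbar

for _ in range(2000):            # march forward in time
    new = temps[:]
    for i in range(1, N - 1):    # interior nodes; ends held at 300 K
        new[i] = temps[i] + ALPHA * DT / DX**2 * (
            temps[i - 1] - 2 * temps[i] + temps[i + 1])
    temps = new

print(f"peak temperature after diffusion: {max(temps):.1f} K")
```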

The question then becomes: why are we seeing such news at all?

The biggest issue EV manufacturers have with performing numerous simulations is time. Running a series of high-accuracy simulations (those whose results have minimal flaws and defects) can take months, which manufacturers cannot afford because it greatly delays time to market. So companies settle for low-accuracy simulations, which provide answers but with many minor flaws and defects, and that can lead to major mishaps such as EV fires and system failures that cost human lives. And when companies do find time to perform high-accuracy simulations, the cost is very high, because they require supercomputers, whether on-premises (setup and maintenance costs) or in the cloud (long compute runs).

So the real issue is a computing bottleneck. This is where the next generation of computing technology, quantum computers, could step in and revolutionize industries like EV and battery design. The new technology is far more powerful, promising exponential gains for these industries.

Prospect of Quantum-powered simulations

The power of quantum computers lies in their ability to perform the same simulations in much less time than classical supercomputers. This technology could therefore significantly shorten EV manufacturers' time to market.

Moreover, obtaining high accuracy from simulations is vital to using them in the product development process. Where high-accuracy simulations were once prohibitively slow, quantum-powered simulations could let manufacturers perform accurate simulations in reasonable time, hours instead of months. The added accuracy would not only help companies create more efficient designs and improve the reliability of their vehicles, but also save something invaluable: lives. In addition, the speedup from quantum computation means less compute usage, decreasing the overall cost and making it affordable for EV manufacturers.

What's next?

In the computing sphere, quantum computing is a revolutionary development, changing our understanding of computation and showing tremendous potential across various use cases. While the prospect of quantum-powered simulations offers the advantage of better, faster, and cheaper, the development is very challenging, because quantum computers work in entirely different ways.

The good news is that companies are already developing and building quantum-powered simulation software that can address thermal runaway and BTMS optimization. Quantum computing is here and now!

Views expressed above are the author's own.


The rest is here:
Quantum computing can solve EVs' safety woes - Times of India


Quantum Computing – Intel

Quantum computing employs properties of quantum physics such as superposition and entanglement to perform computation. Traditional transistors use binary encoding of data, represented electrically as on or off states. Quantum bits, or qubits, can operate in multiple states simultaneously, enabling unprecedented levels of parallelism and computing efficiency.
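In standard textbook notation (not anything Intel-specific), a qubit's state is a weighted combination of the two basis states, and a register of n qubits carries amplitudes over all 2^n basis states at once, which is the parallelism referred to above:

```latex
% One qubit: a superposition of the two basis states.
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
\qquad |\alpha|^2 + |\beta|^2 = 1
% An n-qubit register: amplitudes over all 2^n basis states at once.
|\Psi\rangle = \sum_{x=0}^{2^n-1} c_x\,|x\rangle,
\qquad \sum_x |c_x|^2 = 1
```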

Today's quantum systems include only tens or hundreds of entangled qubits, which keeps them from solving real-world problems. To achieve quantum practicality, commercial quantum systems need to scale to over a million qubits and overcome daunting challenges like qubit fragility and software programmability. Intel Labs is working to overcome these challenges with the help of industry and academic partners, and has made significant progress.

First, Intel is leveraging its expertise in high-volume transistor manufacturing to develop hot silicon spin qubits, much smaller computing devices that operate at higher temperatures. Second, the Horse Ridge II cryogenic quantum control chip provides tighter integration. And third, the cryoprober enables high-volume testing that is helping to accelerate commercialization.

Even though we may be years away from large-scale implementation, quantum computing promises to enable breakthroughs in materials, chemicals and drug design, financial and climate modeling, and cryptography.

Continued here:
Quantum Computing - Intel


The big money is here: The arms race to quantum computing – Haaretz

There's a major controversy raging in the field of quantum computing. One side consists of experts and researchers who are skeptical of quantum computers' ability to be beneficial in the foreseeable future, simply because the physical and technological challenges are too great. On the other side, if you ask the entrepreneurs and investors at firms banking on quantum computing, that hasn't been the issue for quite some time. From their standpoint, it's only a matter of time and concerted effort until the major breakthrough, and the real revolution in the field, is achieved. And they're prepared to gamble a lot of money on that.

For decades, most quantum research and development was carried out by academic institutions and government research institutes, but in recent years the transition from the academic lab to the industrial sector has accelerated. Researchers and scientists have been creating or joining companies developing quantum computing technology, and startups in the field have been cropping up at a dizzying pace. In 2021, $3.2 billion was invested in quantum firms around the world, according to The Quantum Insider, compared with $900 million in 2020.

And in the first quarter of this year, about $700 million was invested, a sum similar to the investments in the field between 2015 and 2019 combined. In addition to the surge in startup activity, tech giants such as IBM, Amazon, Google and Microsoft have been investing major resources in the field and recruiting experts as well.

"The quantum computing field was academic for a long time, and everything changed the moment that big money reached industry," said Ayal Itzkovitz, managing partner at the Pitango First fund, which has invested in several quantum companies in recent years. "Everything is moving forward more quickly. If three years ago we didn't know whether it was altogether possible to build such a computer, now we already know that there will be quantum computers that will be able to do something different from classic computers."

Quantum computers, which are based on the principles of quantum theory, are intended to provide vastly greater computing power than regular computers, with the capability to carry out a huge number of computations simultaneously. Theoretically, it should take them seconds, minutes or hours to do what would take today's regular supercomputers thousands of years.

Quantum computers are based not on bits but on qubits, produced by a quantum processing unit, which are not limited to the binary 0 or 1 but can be a combination of the two. The idea is that a workable quantum computer, if and when there is such a thing, won't be suitable for every task, but rather for a set of specific problems that require simultaneous computation, such as simulations. It would be relevant for fields such as chemistry, pharmaceuticals, finance, energy and encryption, among others.

It's still all theoretical: no working quantum computer has yet been produced that is capable of performing a task more effectively than a regular computer. But that doesn't bother those engaged in the arms race to develop a breakthrough quantum processor.

A million-qubit computer

IBM, one of the pioneers in the industry, recently unveiled a particularly large 127-qubit computer, and it is promising to produce a 1,000-qubit one within the next few years. In 2019, Google claimed quantum supremacy with a computer that managed in 3.5 minutes to perform a task that would have taken a regular computer 10,000 years to carry out. In May of last year it unveiled a new quantum center in Santa Barbara, California, and it intends to build a million-qubit computer by 2029 at an investment of billions of dollars.

Amazon has gotten into the field, recruiting researchers and recently launching a new quantum center at the California Institute of Technology, and Intel and Microsoft have also gotten into the game. In addition to their own internal development efforts, Amazon, Microsoft and Google have been offering researchers access to active quantum computers via their cloud computing services.

At the same time, several firms in the market that specialize in quantum computing have already raised considerable sums or have even gone public. One of the most prominent is the American company IonQ (which in the past attracted investments from Google, Amazon and Samsung), which last year went public via a SPAC merger. Another is the Silicon Valley firm Rigetti Computing, which also went public via a SPAC merger. Then there's Quantinuum, the product of a merger between Honeywell Quantum Solutions and Cambridge Quantum.

All that's in addition to a growing startup ecosystem of smaller companies such as Atom Computing and QuEra, which have raised initial funding to develop their own versions of a quantum processor.

In Israel in recent months, the country's first two startups trying to create a quantum processor have been established. They're still in stealth mode. One is Rehovot-based Quantum Source, which has raised $15 million to develop photonic quantum computing solutions. Its technology is based on research at the Weizmann Institute of Science, and it's headed by leading people in the Israeli processor chip sector. The second is Quantum Art, whose executives came from the Israeli defense sector. Its technology is also based on work at the Weizmann Institute.

There are also other early-stage ventures seeking to develop a quantum processor, including one created by former Intel employees and another by veterans of a defense company. Then there is LightSolver, which is seeking to develop a laser-based computer; the technology is not quantum, but it aims to provide similar performance.

Going for broke

But all of these are at an early stage technologically, and the prominent companies overseas have built, or are building, active but small quantum computers, usually of dozens of qubits, that serve only for R&D, demonstrating capabilities without practical application. That reflects a sense that an effective quantum computer with a real advantage requires millions of qubits, a major disparity that will be difficult to bridge technologically.

The problem is that investing in the here-and-now sometimes comes at the expense of investing in the future. The quantum companies are still relatively small and have limited staff. If they have an active computer, they also need to maintain it and support its users in the community and among researchers. That requires major effort and a lot of money, which may come at the expense of next-generation research, and it is already delaying the work of a large number of quantum computer manufacturers, who are watching smaller startups that focus only on next-generation development get ahead of them.

As a result, there are also companies with an entirely different approach: skip over the current generation of quantum computers and go for broke to build an effective computer with millions of qubits capable of error detection and correction, even if it takes many years.

It was on that basis that the Palo Alto, California firm PsiQuantum was founded in 2016. Last year the company raised $450 million (in part from Microsoft and BlackRock) at a valuation of $3 billion, becoming one of the hot, promising names in the field.

Itzkovitz, of the Pitango fund, was one of its early investors. "They said they wouldn't make a small computer with a few qubits because it would delay them, but would instead go straight for the real goal," he explained.

PsiQuantum is gambling on a fundamentally different paradigm: most of the companies building an active computer, including the tech giants, have chosen technology based on specific material platforms (for example, superconductors or trapped ions). In contrast, PsiQuantum is building a photonic quantum computer, based on light and optics, an approach that until recently was considered physically impossible.

Itzkovitz said that he has encountered a large number of startups building quantum processors despite the technological risk and the huge difficulty involved. "In the past two weeks, I have spoken with 12 or 13 companies making qubits, from England, Holland, Finland, the United States and Canada, as if this were the most popular thing there is now in the high-tech industry around the world," he said.

As a result, venture capital funds in Israel and overseas that had not previously entered the field are now looking for such companies to invest in, out of concern about being left out of the race as well as a desire to gain exposure to the quantum field.

It's the Holy Grail

As in the regular computing industry, in quantum computing it's not enough to build a processor. A quantum processor is a highly complex system that requires a collection of additional hardware components, as well as software and supporting algorithms, all designed to let its core function efficiently and exploit the ability and potential of qubits in the real world. So alongside the quantum processor manufacturers, recent years have seen a growing industry of startups seeking to supply them and their clients with the layers of hardware and software in the tower that stands on the shoulders of the quantum computer's processor.

A good example is the Israeli firm Quantum Machines, which was established in 2018 and has so far raised $75 million. It has developed a monitoring and control system for quantum computers consisting of hardware and software. According to the company, the system constitutes the brain of the quantum processor and enables it to perform its computing work well and fulfill its potential. Other companies in the market supply such components and others, including even the refrigerators needed to build the computers.

Some companies develop software and algorithms in the hope that they will be needed to operate the computers effectively. One of them is Qedma Quantum Computing of Israel, which has developed what it describes as an operating system for quantum computers, designed to reduce errors and increase quantum computers' reliability.

"Our goal is to provide hardware manufacturers with the tools that will enable them to do something efficient with the quantum computers, and to help create a world in which quantum algorithmic advantages can actually be realized," said Asif Sinay, the company's founder-partner and CEO. "It's the Holy Grail of all of the quantum companies in the world."

The big challenge facing these companies is proving that their technology is genuine and provides real value to companies developing quantum processors. That's in addition, of course, to providing a solution sufficiently unique that the tech giants won't be able to develop it on their own.

"The big companies don't throw money around just like that," Sinay said. "They want to create cooperation with companies that help them reach their goal and improve the quality of the quantum computer. Unlike the cyber field, for example, you can't come and scare a customer into buying your product. Here you're sitting with people at your level, really smart [people] who understand that you need to give them value that assists the company's performance and takes the computer to a higher level."

Two concurrent arms races

What the companies mentioned so far have in common is that they are building technology designed to create an efficient quantum computer, whether the processor itself or the technology surrounding it. At the same time, another type of company is gaining steam: those developing the tools for writing quantum software that will in the future let developers and firms build applications for the quantum computer.

Classiq is an Israeli company that has developed tools that make it easier for programmers to write software for quantum computers. It raised $33 million at the beginning of the year and has raised $48 million all told. A competitor in Singapore, Horizon Quantum Computing, which just days ago announced that it raised $12 million, is offering a similar solution.

Another prominent player is the U.S. firm Zapata, in which Israel's Pitango fund has also invested, which provides services for building quantum applications for corporations.

"There are two concurrent arms races happening now," says Nir Minerbi, co-founder and CEO of Classiq. "One is to build the world's first fully functional quantum computer. Many startups and tech giants are working on that, and that market is now peaking. The second race is the one for creating the applications and software that run on quantum computers and can serve these firms. This is a field that is now only taking its first steps, and it's hard to know when it will reach its goal."

See the article here:
The big money is here: The arms race to quantum computing - Haaretz


PsiQuantum’s Path to 1 Million Qubits by the Middle of the Decade – HPCwire

PsiQuantum, founded in 2016 by four researchers with roots at Bristol University, Stanford University, and York University, is one of the few quantum computing startups that's kept a moderately low PR profile. (That's if you disregard the roughly $700 million in funding it has attracted.) The main reason is that PsiQuantum has eschewed the clamorous public chase for NISQ (noisy intermediate-scale quantum) computers and set out to develop a million-qubit system the company says will deliver big gains on big problems as soon as it arrives.

When will that be?

PsiQuantum says it will have all the manufacturing processes in place by the middle of the decade, and it's working closely with GlobalFoundries (GF) to turn its vision into reality. The generous size of its funding suggests many think it will succeed. PsiQuantum is betting on a photonics-based approach called fusion-based quantum computing (paper) that relies mostly on well-understood optical technology but requires extremely precise manufacturing tolerances to scale up. It also relies on managing individual photons, something that has proven difficult for others.

Here's the company's basic contention:

Success in quantum computing will require large, fault-tolerant systems, and the current preoccupation with NISQ computers is an interesting but ultimately mistaken path. The most effective and fastest route to practical quantum computing will require leveraging (and innovating on) existing semiconductor manufacturing processes and networking thousands of quantum chips together to reach the million-qubit threshold that's widely regarded as necessary to run game-changing applications in chemistry, banking, and other sectors.

It's not that incrementalism is bad. In fact, it's necessary. But it's not well served when focused on delivering NISQ systems, argues Peter Shadbolt, one of PsiQuantum's founders and its current chief scientific officer.

"Conventional supercomputers are already really good. You've got to do some kind of step change, you can't increment your way [forward], and especially you can't increment with five qubits, 10 qubits, 20 qubits, 50 qubits to a million. That is not a good strategy. But it's also not true to say that we're planning to leap from zero to a million," said Shadbolt. "We have a whole chain of incrementally larger and larger systems that we're building along the way. Those allow us to validate the control electronics, the systems integration, the cryogenics, the networking, etc. But we're not spending time and energy trying to dress those up as something that they're not. We're not having to take those things and try to desperately extract computational value from something that doesn't have any computational value. We're able to use those intermediate systems for our own learnings and for our own development."

That's a much different approach from that of the majority of quantum computing hopefuls. Shadbolt suggests the broad message about the need to push beyond NISQ dogma is starting to take hold.

"There is a change happening now, which is that people are starting to program for error-corrected quantum computers, as opposed to programming for NISQ computers. That's a welcome change and it's happening across the whole space. If you're programming for NISQ computers, you very rapidly get deeply entangled, if you'll forgive the pun, with the hardware. You start looking under the hood, and you start trying to find shortcuts to deal with the fact that you have so few gates at your disposal. So, programming NISQ computers is a fascinating, intellectually stimulating activity, I've done it myself, but it rapidly becomes siloed and you have to pick a winner," said Shadbolt.

"With fault tolerance, once you start to accept that you're going to need error correction, then you can start programming in a fault-tolerant gate set which is hardware agnostic, and it's much more straightforward to deal with. There are also some surprising characteristics, which mean that the optimizations that you make to algorithms in a fault-tolerant regime are in many cases the diametric opposite of the optimizations that you would make in the NISQ regime. It really takes a different approach, but it's very welcome that the whole industry is moving in that direction and spending less time on these kinds of myopic, narrow efforts," he said.

That sounds a bit harsh. PsiQuantum is no doubt benefitting from the manifold efforts by the young quantum computing ecosystem to tout advances and build traction by promoting NISQ use cases. There's an old business axiom that says a little hype is often a necessary lubricant to accelerate the development of young industries; quantum computing certainly has its share. A bigger question is whether PsiQuantum will beat rivals to the end game. IBM has laid out a detailed roadmap and said 2023 is when it will start delivering quantum advantage, using a 1,000-qubit system, with plans for eventual million-qubit systems. Intel has trumpeted its CMOS strength for scaling up manufacture of its quantum dot qubits. D-Wave has been selling its quantum annealing systems to commercial and government customers for years.

It's really not yet clear which of the qubit technologies (semiconductor-based superconducting, trapped ions, neutral atoms, photonics, or something else) will prevail, and for which applications. What's not ambiguous is PsiQuantum's go-big-or-go-home strategy. Its photonics approach, argues the company, has distinct advantages in manufacturability and scalability, operating environment (less frigid), ease of networking, and error correction. Shadbolt recently talked with HPCwire about the company's approach, technology and progress.

What is fusion-based quantum computing?

Broadly, PsiQuantum uses a form of linear optical quantum computing in which individual photons are used as qubits. Over the past year and a half, the previously stealthy PsiQuantum has issued several papers describing the approach while keeping many details close to the vest (papers listed at the end of the article). The computation flow is to generate single photons and entangle them. PsiQuantum uses dual-rail entangling/encoding for photons. The entangled photons are the qubits and are grouped into what PsiQuantum calls resource states, a group of qubits if you will. Fusion measurements (more below) act as gates. Shadbolt says the operations can be mapped to a standard gate set to achieve universal, error-corrected quantum computing.
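To unpack "dual-rail": in the standard convention, a single photon occupies one of two waveguides, and which waveguide it occupies encodes the logical bit; a 50:50 beam splitter then creates superpositions. The sketch below is a two-mode toy under that textbook convention, not a description of PsiQuantum's actual hardware.

```python
# Dual-rail encoding toy model: one photon across two waveguides ("rails").
# Logical |0> = photon in rail 0, logical |1> = photon in rail 1.
import numpy as np

ket0 = np.array([1.0 + 0j, 0.0 + 0j])  # photon in rail 0
ket1 = np.array([0.0 + 0j, 1.0 + 0j])  # photon in rail 1

# A 50:50 beam splitter (one standard phase convention) acts as a
# single-qubit gate on the dual-rail qubit.
bs = np.array([[1, 1j],
               [1j, 1]]) / np.sqrt(2)

out = bs @ ket0
print(np.abs(out) ** 2)  # [0.5 0.5]: equal superposition across the rails
```

Fusion measurements then perform entangling, destructive measurements across photons from different resource states, which is where the computational power comes from; modeling those requires more than two modes.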

On-chip components carry out the process. It all sounds quite exotic, in part because it differs from the more widely used matter-based qubit technologies. The process is described in the PsiQuantum paper "Fusion-based quantum computation," issued about a year ago.

Digging into the details is best done by reading the papers, and the company has archived videos exploring its approach on its website, including a good brief summation by Mercedes Gimeno-Segovia, vice president of quantum architecture at PsiQuantum.

Shadbolt also briefly described fusion-based quantum computation (FBQC).

"Once you've got single photons, you need to build what we refer to as seed states. Those are pretty small entangled states and can be constructed, again, using linear optics. So, you take some single photons and send them into an interferometer, and together with single-photon detection you can probabilistically generate small entangled states. You can then multiplex those, and basically the task is to get as fast as possible to a large enough, complex enough, appropriately structured resource state which is ready to then be acted upon by a fusion network. That's it. You want to kill the photon as fast as possible. You don't want photons living for a long time if you can avoid it. That's pretty much it," said Shadbolt.

"The fusion operators are the smallest, simplest piece of the machine. The multiplexed single-photon sources are the biggest, most expensive piece. Everything in the middle is kind of the secret sauce of our architecture; some of that we've put out in that paper and you can see kind of how that works," he said. (At the risk of overkill, another brief description of the system from PsiQuantum is presented at the end of the article.)

One important FBQC advantage, says PsiQuantum, is that the shallow depth of optical circuits makes error correction easier. The small entangled states fueling the computation are referred to as resource states. Importantly, their size is independent of the code distance used or the computation being performed. This allows them to be generated by a constant number of operations. Since the resource states are measured immediately after they are created, the total depth of operations is also constant. As a result, errors in the resource states are bounded, which is important for fault tolerance.

PsiQuantum's FBQC design also differs in several respects from the more familiar MBQC (measurement-based quantum computing) paradigm.

Another advantage is the operating environment.

"Nothing about photons themselves requires cryogenic operation. You can do very high-fidelity manipulation and generation of qubits at room temperature, and in fact, you can even detect single photons at room temperature just fine. The efficiency of room-temperature single-photon detectors is not good enough for fault tolerance. These room-temperature detectors are based on pretty complex semiconductor devices, avalanche photodiodes, and there's no physical reason why you couldn't push those to the necessary efficiency, but it looks really difficult [and] people have been trying for a very long time," said Shadbolt.

"We use a superconducting single-photon detector, which can achieve the necessary efficiencies without a ton of development. It's worth noting those detectors run in the ballpark of 4 Kelvin, so liquid-helium temperature, which is still very cold, but it's nowhere near as cold as the milli-Kelvin temperatures required for superconducting qubits or some of the competing technologies," said Shadbolt.

This has important implications for control circuit placement, as well as for the reduced power needed to maintain the 4 Kelvin environment.

There's a lot to absorb here, and it's best done directly from the papers. PsiQuantum, like many other quantum startups, was founded by researchers who were already digging into the quantum computing space, and they've shown that PsiQuantum's FBQC flavor of linear optical quantum computing will work. While at Bristol, Shadbolt was involved in the first demonstration of running a Variational Quantum Eigensolver (VQE) on a photonic chip.

The biggest challenges for PsiQuantum, he suggests, are developing manufacturing techniques and system architecture around well-known optical technology. The company argues having a Tier-1 fab partner such as GlobalFoundries is decisive.

"You can go into infinite detail on the architecture and how all the bits and pieces go together. But the point of optical quantum computing is that the network of components is pretty complicated (all sorts of modules and structures and multiplexing strategies, and resource state generation schemes and interferometers, and so on) but they're all just made out of beam splitters, and switches, and single-photon sources and detectors. It's kind of like in a conventional CPU: you can go in with a microscope and examine the structure of the cache and the ALU and whatever, but underneath it's all just transistors. It's the same kind of story here. The limiting factor in our development is the semiconductor process enablement. The thesis has always been that if you tried to build a quantum computer anywhere other than a high-volume semiconductor manufacturing line, your quantum computer isn't going to work," he said.

"Any quantum computer needs millions of qubits. Millions of qubits don't fit on a single chip. So you're talking about heaps of chips, probably billions of components realistically, and they all need to work, and they all need to work better than the state of the art. That brings us to the progress, which is, again, rearranging those various components into ever more efficient and complex networks, in pretty close analogy with CPU architecture. It's a very key part of our IP, but it's not rate limiting, and it's not terribly expensive to change the network of components on the chip once we've got the manufacturing process. We're continuously moving the needle on that architecture development, and we've improved these architectures in terms of their tolerance to loss by more than 150x, [actually] well beyond that. We've reduced the size of the machine, purely through architectural improvements, by many, many orders of magnitude."

"The big, expensive, slow pieces of the development are in being able to build high-quality components at GlobalFoundries in New York. What we've already done there is to put single-photon sources and superconducting-nanowire single-photon detectors into that manufacturing process engine. We can build wafers, 300-millimeter wafers, with tens of thousands of components on the wafer, including a full silicon photonics PDK (process design kit), and also a very high-performing single-photon detector. That's real progress that brings us closer to being able to build a quantum computer, because that lets us build millions to billions of components."

Shadbolt says real systems will quickly follow development of the manufacturing process. PsiQuantum, like everyone in the quantum computing community, is collaborating closely with potential users. Roughly a week ago, it issued a joint paper with Mercedes-Benz discussing quantum computer simulation of Li-ion chemistry. If the PsiQuantum-GlobalFoundries process is ready around 2025, can a million-qubit system (100 logical qubits) be far behind?

Shadbolt would only say that things will happen quickly once the process has been fully developed. He noted there are three ways to make money with a quantum computer: sell machines, sell time, and sell solutions that come from the machine. "I think we're exploring all of the above," he said.

"Our customers, which is a growing list at this point (pharmaceutical companies, car companies, materials companies, big banks) are coming to us to understand what a quantum computer can do for them. To understand that, what we are doing, principally, is fault-tolerant resource counting," said Shadbolt. "So that means we're taking the algorithm, or taking the problem the customer has, working with their technical teams to look under the hood and understand the technical requirements of solving that problem. We are turning that into the quantum algorithms and subroutines that are appropriate. We're compiling that for the fault-tolerant gate set that will run on top of that fusion network, which by the way is a completely vanilla, textbook fault-tolerant gate set."

Stay tuned.

PsiQuantum Papers

Fusion-based quantum computation, https://arxiv.org/abs/2101.09310

Creation of Entangled Photonic States Using Linear Optics, https://arxiv.org/abs/2106.13825

Interleaving: Modular architectures for fault-tolerant photonic quantum computing, https://arxiv.org/abs/2103.08612

Description of PsiQuantum's Fusion-Based System from the Interleaving Paper

Useful fault-tolerant quantum computers require very large numbers of physical qubits. Quantum computers are often designed as arrays of static qubits executing gates and measurements. Photonic qubits require a different approach. In photonic fusion-based quantum computing (FBQC), the main hardware components are resource-state generators (RSGs) and fusion devices connected via waveguides and switches. RSGs produce small entangled states of a few photonic qubits, whereas fusion devices perform entangling measurements between different resource states, thereby executing computations. In addition, low-loss photonic delays such as optical fiber can be used as fixed-time quantum memories simultaneously storing thousands of photonic qubits.

Here, we present a modular architecture for FBQC in which these components are combined to form interleaving modules consisting of one RSG with its associated fusion devices and a few fiber delays. Exploiting the multiplicative power of delays, each module can add thousands of physical qubits to the computational Hilbert space. Networks of modules are universal fault-tolerant quantum computers, which we demonstrate using surface codes and lattice surgery as a guiding example. Our numerical analysis shows that in a network of modules containing 1-km-long fiber delays, each RSG can generate four logical distance-35 surface-code qubits while tolerating photon loss rates above 2% in addition to the fiber-delay loss. We illustrate how the combination of interleaving with further uses of non-local fiber connections can reduce the cost of logical operations and facilitate the implementation of unconventional geometries such as periodic boundaries or stellated surface codes. Interleaving applies beyond purely optical architectures, and can also turn many small disconnected matter-qubit devices with transduction to photons into a large-scale quantum computer.
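A quick back-of-the-envelope check on the "thousands of photonic qubits" stored in fiber (the resource-state generation rate below is an assumed round number, not a figure from the paper): light in fiber with a group index of about 1.5 travels at roughly 2x10^8 m/s, so a 1-km delay holds everything generated in the previous 5 microseconds.

```latex
% Delay of a 1 km fiber with group index n \approx 1.5:
t_{\text{delay}} = \frac{L}{c/n}
  \approx \frac{10^{3}\,\text{m}}{2\times10^{8}\,\text{m/s}}
  = 5\,\mu\text{s}
% At an assumed resource-state generation rate of f = 1\,\text{GHz}:
N_{\text{in flight}} \approx f \, t_{\text{delay}}
  = 10^{9}\,\text{s}^{-1} \times 5\times10^{-6}\,\text{s} = 5000
```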


Read more here:
PsiQuantum's Path to 1 Million Qubits by the Middle of the Decade - HPCwire

Posted in Quantum Computing | Comments Off on PsiQuantum’s Path to 1 Million Qubits by the Middle of the Decade – HPCwire

Aalto University Wins a €2.5 Million ($2.66M USD) Grant to Develop a New Type of Superconducting Qubit – Quantum Computing Report


The award was made by the European Research Council for a project named ConceptQ. It will cover a five-year period to research a new superconducting quantum device concept utilizing increased anharmonicity, a simple structure, and insensitivity to charge and flux noise. One problem with superconducting qubits is that they can sometimes end up in states other than |0> or |1>. Such devices, which can potentially occupy superpositions of three different states denoted |0>, |1>, and |2>, are sometimes called qutrits. In current quantum processors the |2> state is not desired and can cause a loss of qubit fidelity. Aalto's new qubit design is meant to reduce or eliminate occupation of the |2> state, which would remove a source of errors and help increase the accuracy of calculations. Another aspect of the project will be to develop low-temperature cryoCMOS electronics that can be used to control qubits inside a dilution refrigerator. More information about the grant and the ConceptQ project is available in a news release posted on the Aalto University website.
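For context on why "increased anharmonicity" suppresses the |2> problem (this is standard circuit-QED background, not a detail of the undisclosed ConceptQ design): anharmonicity is the amount by which the |1>-to-|2> transition frequency is detuned from the |0>-to-|1> transition. The smaller that detuning, the more readily a drive pulse tuned to the qubit frequency leaks population into |2>, which is what limits gate speed in conventional transmons, whose anharmonicity is only about the (negative) charging energy.

```latex
% Anharmonicity: the difference between successive transition frequencies.
\alpha \;=\; \omega_{12} - \omega_{01}
       \;=\; \frac{(E_2 - E_1) - (E_1 - E_0)}{\hbar},
\qquad \alpha_{\text{transmon}} \approx -E_C/\hbar
```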

April 26, 2022


See the rest here:
Aalto University Wins a €2.5 Million ($2.66M USD) Grant to Develop a New Type of Superconducting Qubit - Quantum Computing Report
