QC Investment Today
Quantum Computing (QC) proof of concept (POC) projects are growing in Q4 2021, with commercialization pilots expected by 2025 and broader adoption before 2030. Accelerated digital transformation and digital reshaping from the pandemic are driving investments and early IPOs (e.g., the Q1 announcement by IonQ). In my daily pro bono engagements with global communities (across governments, industry, computing and research organizations, NGOs, UN agencies, innovation hubs, and think tanks) of more than 60K CEOs, 30K investors, and 10K innovation leaders, I'm finding nearly 50% are planning QC pilots within five years. There's an understanding that the exponential lead provided by a breakthrough in QC warrants the early investment and learnings now, since practical adoption will take years.
As a measure of progress and to stimulate collaboration and sharing in QC, the non-profit IEEE held its first Quantum Week in October 2020 and is holding its second conference, IEEE Quantum Week 2021, October 18-22, 2021. I'll provide a follow-up article after the conference.
Quantum physics and quantum mechanics give rise to the quantum effects that underpin Quantum Information Science, which includes quantum computing, quantum communications, quantum sensing, quantum measurement, quantum-safe cryptography, and more. For simplicity, I often use QC in this article as the general term for technologies built on these quantum effects, though Quantum Information Science is the better umbrella term.
Learn From the QC Top Leaders
In this article, I will highlight the best QC insights from my chats with top QC leaders in 2021. The full pro bono video interviews can be found with non-profits such as IEEE TEMS and ACM (see the interview series by Stephen Ibaraki). IEEE is the largest non-profit electrical engineering organization and is responsible for many of the global technology standards in use today.
The QC interviewees include:
Michele Mosca: Co-founder, Institute for Quantum Computing, University of Waterloo; Founder of Quantum-Safe Canada and Quantum Industry Canada; Co-founder and CEO of the quantum-safe cybersecurity company, evolutionQ.
William Hurley, who goes by the name whurley: Innovator; Serial Entrepreneur; Founder & CEO of Strangeworks; about quantum computing.
Scott Aaronson: David J. Bruton Centennial Professor of Computer Science at the University of Texas at Austin; recipient of the ACM Prize in Computing; about theoretical computer science and quantum computing. The ACM Prize is the second-highest award from the ACM, which is the largest non-profit computing science organization.
Stefan Woerner: IBM Quantum Applications Research & Software Lead. Stefan is considered one of the top researchers in QC applications.
QC Top Leaders Best Pointers
Michele Mosca details quantum history and being at the founding of world-leading physics and quantum research groups at the University of Waterloo. We discuss the future of quantum, the probability of various success timelines, and quantum risk assessment. In addition, Michele and his students have founded companies in this area, so the entrepreneurship journey is shared as well.
We discuss categories of quantum:
Quantum computing (QC), the focus of my January Forbes article, where Google in 2019 and China in 2020 provided examples of Quantum Supremacy: problems solved in seconds that would take thousands or billions of years on classical digital computers.
Quantum-safe cryptography and designs that are safe from quantum-enabled attacks. NIST (National Institute of Standards and Technology) is working on standards here. Existing encryption is vulnerable to quantum computing capabilities, including attacks where data is harvested and stored today to be decrypted later by quantum computers.
Quantum communications, where China is leading and the UN agency ITU has programs such as Quantum Information Technology for Networks.
Quantum sensing, providing ultrasensitive capabilities to detect underwater deposits, seismic events, and much more.
William Hurley (whurley) shares his experiences as a serial entrepreneur, including having several startups exit within the same year. whurley then shares turning his attention to QC by authoring the book Quantum Computing for Babies and launching his startup Strangeworks, which provides a platform with developer tools and systems management. In our chat, whurley states: "I think if you look at IBM's public roadmap, if you look at IBM Q, and Rigetti, and all of the companies and what they're doing, Microsoft, Google; Google even announced they think they'll have their machine in 2029...and I think that they will actually do it before. So I predict Google will have a machine online closer to the 2025, 2026 range...There's over 500 startups involving quantum right now today. When I started three years ago, there were like 12...And you're going to see a big inflection point driven by government investment worldwide..." whurley talks about billions invested in France, Germany, China, and the USA: "...you've got Norway, Finland, Russia, you've got everybody in this game now."
Scott Aaronson received the 2020 ACM Prize in Computing in April 2021 for his contributions to QC. In our chat, we talk about his work and his views on QC today and into the future. It's worth viewing our chat; as noted in the ACM prize citation, Aaronson "helped develop the concept of quantum supremacy, which denotes the milestone that is achieved when a quantum device can solve a problem that no classical computer can solve in a reasonable amount of time. Aaronson established many of the theoretical foundations of quantum supremacy experiments. Such experiments allow scientists to give convincing evidence that quantum computers provide exponential speedups without having to first build a full fault-tolerant quantum computer." The ACM citation lists notable contributions including Boson Sampling, fundamental limits of quantum computers, classical complexity theory, his respected QC book Quantum Computing Since Democritus, and Scott's work making quantum computing accessible (e.g., his popular blog, Shtetl-Optimized).
Here are excerpts from my extensive chat with Stefan Woerner. The interview has been edited for clarity and brevity and I used AI to provide the transcript (which has limits). I recommend going directly to the video interview for our nuanced discussion.
I ask how Stefan got into quantum computing.
And then I started to look into how we can apply this to problems I looked into before, for example in optimization or in finance, and it turned out that there are many things that can be done...quantum computing gave me a new toolbox to look at the problems that I had already studied for quite a while, and it opened up completely new directions. It also came with quite new challenges. But I think it's extremely exciting for me, now having these additional tools, additional possibilities, to try to solve relevant problems and eventually have an impact with optimization or with Monte Carlo simulation and things like that.
That's fascinating, your grandfather sort of stimulating this interest in mathematics and the sciences in general as well...And then in your early work, using mathematics, did you use supercomputers at that time for your optimization problems?
We did some optimization on the cloud, and we used some cloud solvers, but these were not supercomputers. Our approach was more to try to find good formulations that are accessible to the solvers. We were, for example, writing our own simulations for supply chains that could be leveraged in an optimization setting.
Quantum computing is still a mystery to a lot of people, and especially to developers, so there are more and more tools coming out. You have the IBM challenge to try to make it easier for the broader community to start experimenting with quantum computing. But before we delve into the tools you have and how you make them accessible for proofs of concept, let's go back to basics: what is quantum computing?
So quantum computing is a completely new computational paradigm where you leverage the laws of quantum mechanics. If we really go back to basics: classically, you have a bit that's either zero or one. In quantum computing, you have the quantum bit, the qubit, which can be a superposition of zero and one. Sometimes this is explained as being 50% zero and 50% one, but that's not 100% true; it's really a superposition, a state in between, so you can think of it as a continuous value in a way. If you have two qubits, they can also be entangled, which means the states of the two qubits can be correlated. You can construct states that are perfectly correlated, where the state of one qubit perfectly determines the state of the other. So if the one [qubit] is zero, you know the other one is also zero, and the other way around, if the one [qubit] is one, the other is also one. This correlation of two particles, two qubits, is something that's purely quantum mechanical; it doesn't exist in classical computing and classical electronics. And if you scale this, it means the state space of a system of qubits scales exponentially, so the state space needed to describe the system grows extremely fast to something way beyond what you can handle classically. That alone would not be enough, though. There's one more feature, call it interference. You know it from sound or from water: you can have constructive and destructive interference, where waves add up or cancel out. This is something that we leverage in quantum computing as well. You can have these high-dimensional states and then let them interfere, and that's what actually amplifies the probabilities of good solutions. Now, this also tells you one important thing about the way a quantum computer works.
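Stefan's description of superposition, entanglement, and interference can be sketched in a few lines of plain Python. This is an illustrative statevector simulation I wrote for this article, not IBM's tooling: a Hadamard operation creates a superposition, and a CNOT then entangles two qubits into a Bell state, where only the perfectly correlated outcomes survive.

```python
import math

# Amplitudes of a two-qubit state, ordered |00>, |01>, |10>, |11>
state = [1.0, 0.0, 0.0, 0.0]  # start in |00>

# Hadamard on the first qubit: |0> -> (|0> + |1>)/sqrt(2), a superposition
h = 1 / math.sqrt(2)
state = [h * (state[0] + state[2]), h * (state[1] + state[3]),
         h * (state[0] - state[2]), h * (state[1] - state[3])]

# CNOT (control = first qubit): flips the second qubit when the first is 1,
# i.e. swaps the |10> and |11> amplitudes; this entangles the two qubits
state = [state[0], state[1], state[3], state[2]]

# Bell state: only |00> and |11> have nonzero probability, 0.5 each,
# so measuring one qubit perfectly determines the other
probs = [round(a * a, 3) for a in state]
print(probs)  # [0.5, 0.0, 0.0, 0.5]
```

The key point matches the interview: the two-qubit state needs four amplitudes, and in general n qubits need 2**n, which is exactly the exponential state space Stefan describes.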
And the way you program a quantum computer is completely different from how you would do this classically, because you need to translate your problem into something that leverages this interference.
There was this idea early on in quantum computing of measuring capability by the number of relatively stable qubits, or logical qubits. Then IBM came up with this idea of quantum volume, saying that maybe qubit count is not a great way of representing the capability of a quantum computer. Can you explain IBM's concept of quantum volume?
The qubits that we build today, let's refer to them as physical qubits. They are noisy, so after a while they lose their state, and the operations that we can use to control or modify their state are not perfect; there's an error. That makes it difficult to really operate with these qubits; you have to imagine this is really trying to harness nature at its extreme. In our case, these are superconducting qubits, so they're in a very cold environment, shielded from external disturbances and so on. These physical qubits are kind of fragile, and you can have lots of qubits, but if they have very high error rates, you won't be able to use all of them, because once you've operated on all of them and entangled them, you introduce so much noise that you're not getting anything meaningful out anymore. So the number of qubits is an important factor but, as you said, not the only one. There are also the error rates and the decoherence time, that is, how long a qubit keeps its state, and things like that. Now, the quantum volume is a single number determined by a benchmarking circuit. You run some operations on your quantum hardware where you know the result, or can evaluate the result, and you can then say whether it is above a certain threshold or not. If you can run this on a certain number of qubits and with a certain number of operations, this determines the quantum volume. So the quantum volume in a way determines how many qubits you can use with a certain number of operations, where the number of operations you run sequentially is about the number of qubits. It's a single-number benchmark that takes into account the number of qubits, the noise, and all these factors that actually impact the power of a quantum computer.
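As a rough illustration of the single-number benchmark Stefan describes (a simplified sketch of my own; the actual IBM protocol involves randomized square circuits and a heavy-output probability test), quantum volume is conventionally reported as 2**n, where n is the largest width-equals-depth benchmark circuit the device runs successfully:

```python
import math

def quantum_volume(largest_passing_size: int) -> int:
    """Quantum volume as conventionally reported: 2**n, where n is the
    largest width (= depth) of a square benchmark circuit the device runs
    with success probability above the required threshold."""
    return 2 ** largest_passing_size

# A device that reliably runs 6-qubit, depth-6 benchmark circuits:
print(quantum_volume(6))  # 64

# Each doubling of quantum volume means one more usable qubit/circuit layer:
print(int(math.log2(quantum_volume(7))))  # 7
```

Note how this captures Stefan's point: a machine with many noisy qubits can still have a small quantum volume, because the benchmark demands both width and depth.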
Now, this is for these physical qubits. Looking forward, once we reach a certain size and a certain quality, we can leverage error correction and get to fault-tolerant quantum computing. Here, we take many physical qubits and encode them as one logical qubit; there's an abstract, logical error correction layer on top. This overhead is relatively large: it's estimated that you need a few hundred to a thousand physical qubits to get one logical qubit. But that logical qubit has a significantly suppressed error, and then you can start to work in this clean, theoretical computational paradigm where you more or less ignore the noise from the hardware.
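The overhead arithmetic Stefan mentions is easy to sketch. The 1,000-to-1 ratio below is an assumption for illustration only, taken from the upper end of his "few hundred to a thousand" estimate:

```python
# Back-of-envelope sketch of error-correction overhead.
# ASSUMPTION for illustration: 1,000 physical qubits per logical qubit,
# the upper end of the range mentioned in the interview.
PHYSICAL_PER_LOGICAL = 1_000

def physical_qubits_needed(logical_qubits: int) -> int:
    return logical_qubits * PHYSICAL_PER_LOGICAL

# One logical qubit already needs a ~1,000-qubit chip...
print(physical_qubits_needed(1))      # 1000
# ...while a thousand logical qubits lands in the millions,
# which is why the roadmap discussed next targets that scale
print(physical_qubits_needed(1_000))  # 1000000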
IBM announced a roadmap to a million qubits by 2030. What does that roadmap mean? I know you've got some interim targets: 2023, 2025, and so on, out to 2030. What are the implications of this roadmap?
So I think the next couple of years will be very, very exciting for different reasons. The roadmap we announced says that by 2023 we reach a quantum chip with more than 1000 qubits, and it also gives some specifications on the error rates these qubits should have, because as I mentioned before, the number of qubits alone doesn't mean too much. The quality needs to be improved as well to really get a more powerful quantum computer. Currently we have 65 [qubits]; last year we released a 65-qubit system that can be accessed via the cloud. This year we plan to get to 127, I think next year 433, and then after 2023, over 1000. The roadmap also gives technical details on what leads to this improvement, what changes help us grow these chips. Now, getting to 1000 qubits is kind of an inflection point, because as I mentioned before, that is about the number you need to build a logical qubit. That's where you can really start to study fault-tolerant quantum computing, maybe at a small scale, but it will be the first time this can really be investigated in depth. And then the next step is to scale to the millions, not just by putting more and more qubits on a single chip; you could imagine, for example, combining multiple chips with 1000 qubits each and that way getting a larger quantum computer. From the standpoint of technological development, this is extremely fascinating.
And I think this path to 1000 qubits will also be extremely interesting for applications and algorithms researchers like myself, because right now we run algorithms on real hardware and also simulate them classically, which is very, very expensive computationally because it scales exponentially in the number of qubits...and once we scale to 60, 100, 400, 1000 qubits, this is really where we can see the asymptotic behavior of these heuristics, where we can start to make forecasts about how they will perform on interesting problems. And I think this will give us a much better understanding of what we can do with near-term quantum computers for optimization, for machine learning, for things like that.
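The classical simulation cost Stefan refers to can be made concrete: a full statevector of n qubits holds 2**n complex amplitudes, so memory alone grows exponentially. A minimal sketch of my own, assuming 16 bytes per complex amplitude (two 64-bit floats):

```python
# Why classical simulation "scales exponentially in the number of qubits":
# an n-qubit statevector has 2**n complex amplitudes. At 16 bytes each
# (two 64-bit floats), memory alone runs away long before 1000 qubits.
def statevector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

# 30 qubits already need 16 GiB; 50 qubits need ~16 million GiB;
# 65 qubits (IBM's then-current chip) are far beyond any classical machine
for n in (30, 50, 65):
    gib = statevector_bytes(n) / 2**30
    print(f"{n} qubits: {gib:,.0f} GiB")
```

This is why, in Stefan's words, real hardware at 100 to 1000 qubits is the only place the asymptotic behavior of these heuristics can actually be observed.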
Different companies and research groups come out with different claims. A group out of China recently claimed they've achieved a kind of quantum supremacy, that it would take a supercomputer over 2 billion years to do their quantum problem of Gaussian boson sampling. Google made some buzz in 2019 when they released the Sycamore system and claimed quantum supremacy on a quantum problem. It's not a practical problem, but really just an illustration that it can do something that maybe supercomputers can't do. And yet IBM looked at that and said maybe it's not as big a breakthrough as claimed, because really, by improving our algorithms, we could get that done on a supercomputer in maybe a few days; so maybe it's not quantum supremacy. So what is supremacy? What is quantum advantage? These words are being thrown around; what is real?
So we don't use the term quantum supremacy for multiple reasons. One is we don't believe that quantum computers will become superior to classical computers at everything; a quantum computer cannot speed up everything. A quantum computer can be used as an accelerator for some tasks, so I think it will always be a combination of classical and quantum computers working in harmony to solve some problems. You won't write your emails with a quantum computer [you will NOT be using quantum computers to write emails]; you might solve some computationally heavy quantum chemical simulations or control optimization problems with a quantum computer. And what do we mean by quantum advantage? That's when you can do something with the help of a quantum computer that has some practical value. I think what you mentioned are very nice experimental demonstrations, and important steps in the development of quantum computers, but what we're looking for is really a practical value achieved with a quantum computer. And I think that is still a bit out in the future.
You're an expert in quantum computing, but there are different kinds of quantum computing, and what I mean by that: the trapped-ion concept, the topological quantum computer that Microsoft has been chasing for some time, very-low-temperature spin, photonic. Can you give a summary of the different categories and why IBM has chosen its particular way of doing quantum computing?
Trapped ions, spins, photonics, and within superconducting there are different designs. We looked into superconducting qubits because we think that, particularly in the near term, they are the most promising to scale...superconducting qubits operate at very low temperature, about 50 millikelvin, which is, I think, 100 or 1000 times colder than outer space. So this is really just above absolute zero. That sounds very challenging, but the dilution refrigerators that get down to these temperatures are actually quite reliable and well-understood technology. So it first sounds like a big problem, but it's something that has been quite well understood. And if you have that solved, if you have the environment where you can operate them, then you can process these chips and come up with different designs; superconducting qubits can, for example, be fabricated at a larger scale than spins. So to get these near-term systems, there might be an advantage in processing, in fabricating them. And we came up with a design that is also accessible to error correction. That's an example where the theoretical research in error correction and the people who design the devices really collaborated very nicely: the design of the chip was chosen such that it's good to manufacture, and that led the error correction team to come up with new error-correcting codes that can eventually be run on this hardware. These are all pieces that fit together that make us believe we can scale this to 1000 qubits and then, for example by connecting larger chips, to the millions.
I've been in computing for such a long time, and I remember in the early days we would flick toggle switches and program literally in binary code; we moved to assembler, then we went to higher-level languages. We got to a stage where you had abstraction of the hardware through an operating system; you could write more generic code using a much higher-level language, and that made it much easier. So what work is being done in that area in quantum computing, to abstract away the underlying hardware using toolkits?
So we just released the development roadmap earlier this year, which addresses this to some extent: how the stack will grow, how levels of abstraction will be included, whether for predefined quantum circuits, so that you don't have to build the circuit yourself but have a library of circuits pre-compiled for the hardware, like an optimized instruction set, and things like that, up to actual application services. In terms of the actual languages, I think we are in a very interesting situation, a little different from what you described, because on the one side we are at the stage of defining the new assembler standard, a quantum assembly language. But at the same time we have the classical languages; we have Python, for example, that we can embed all of this in. So we are in a situation where we can leverage the existing classical high-level languages and embed these new functionalities in them; we can write classical functions that compile or assemble or optimize this quantum stuff. And that allows us, for example, to build application modules. You mentioned Qiskit before; Qiskit is our open-source Python framework to program quantum computers, to define quantum circuits, to simulate quantum circuits, and also to send them over the cloud to the real hardware. Within Qiskit we are building application modules, and at the moment we're looking into four different application areas: optimization, natural sciences, machine learning, and finance. The optimization module was released in the middle of last year. What it does is allow you to use a classical high-level language to specify your optimization problem, because that's something that has been solved already; it's nothing quantum-computing specific.
As in classical optimization, subject matter experts know how to define an optimization problem using different modeling languages, for example an IBM language, to model the problem. The Qiskit optimization module takes this classical problem and automatically translates it into different representations that are then accessible to different quantum optimization algorithms. So on the one side we still work at the assembly level, but on the other side we have the classical language that does all the translations for us, from a high-level problem down to an actual circuit. And these optimization modules are built in such a way that it's very easy to get started. If you are a subject matter expert in one of these domains, you can just download these modules, they are open source, and the tutorials actually allow you to use the quantum algorithm as a black box. So the entry barrier to running your first quantum optimization program on some illustrative example is very low. However, the whole thing is also built in such a modular and flexible way that you can use it as a black box, but you can also open the black box: you can look at every level, tear it apart, replace different pieces with your own implementation, and see whether they improve things, whether they change things, how they compare. So it's built in a way that is easy to get started with, but that also really supports cutting-edge research in these areas.
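To make the "high-level problem translated into quantum-ready representations" idea concrete, here is a pure-Python sketch of my own (deliberately not the actual Qiskit API): a tiny QUBO (quadratic unconstrained binary optimization), the standard binary form that quantum optimization algorithms such as QAOA consume, solved here by brute force as a classical baseline. The coefficients are invented for illustration.

```python
from itertools import product

# Illustrative QUBO (hypothetical coefficients, not from any real problem):
#   minimize  x0 + x1 + x2 - 2*x0*x1 - 2*x1*x2   over x_i in {0, 1}
linear = {0: 1.0, 1: 1.0, 2: 1.0}
quadratic = {(0, 1): -2.0, (1, 2): -2.0}

def qubo_value(x):
    """Evaluate the QUBO objective for a binary assignment x."""
    return (sum(c * x[i] for i, c in linear.items())
            + sum(c * x[i] * x[j] for (i, j), c in quadratic.items()))

# Brute force over all 2**3 assignments; a quantum optimizer would instead
# encode this same objective into a parameterized circuit
best = min(product([0, 1], repeat=3), key=qubo_value)
print(best, qubo_value(best))  # (1, 1, 1) -1.0
```

The point of the module Stefan describes is exactly this translation step: the expert writes the high-level model, and the toolkit produces the QUBO-like representation the quantum algorithm needs.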
But ultimately, if you want mass proliferation, or usage, you have to work at this much higher abstraction level so it's easier for people to get involved. And I guess that's the reason behind the IBM challenge, right, to get people involved. I read that last year your two biggest communities of participants were in the data science area and in financial services, but you also have people like high school students trying and completing the program. Can you talk about this challenge and what you're trying to do? Typically it involves maybe three or four stages that you put people through, and you actually get quite a few going through the entire program. Can you give us an example of what that is like?
This challenge was a collection of problems and tasks that people could apply themselves to and try to solve, including problems using Qiskit to solve an optimization problem. We have different difficulty levels...many people really reached the highest score...If I remember correctly, some people even reached a score a little higher than anticipated. The challenge was one thing, but there was also a Qiskit summer school, a global summer school with, if I remember correctly, around four or five thousand participants globally. We provide these educational offerings because it's really important, as you say, for people to be able to get into how this works and what's different, and also to grow the workforce in this area, because there will be an increasing demand. Because it is so different, because it is still new, we have just figured out the tip of the iceberg of what to use a quantum computer for, or how to use a quantum computer to solve problems. So I think it will be extremely important to educate more people about quantum computing, and you see universities picking that up and coming up with new quantum computing curricula and so on. This is important to really leverage the full potential of this technology.
Microsoft had a blog post indicating that quantum computing is really not suitable right now for problems with heavy data requirements, in getting data in and getting data out; it's really more for certain kinds of computational problems where you're taking advantage of the unique capabilities of quantum computing. And you've indicated that as well: it's not standalone; my iPhone isn't going to have a quantum computer in it; it's going to work in combination, in hybrid form in some way. You see that with D-Wave, which has a piece of this quantum capability with their quantum annealing, but they have these hybrid systems. That leads to this question: what kinds of industries are really suitable for quantum computing? What kinds of problems? What are the different categories where this quantum phenomenon is being exploited right now, or where you think we'll have some major advantages going into the future?
Let me get back to the first point you mentioned, about data, because I think that's important. I indicated at the beginning that quantum computers can solve some problems better, and that's really important: some problems, not all problems. If the problem is not that the task you want to execute is computationally very complex, but that you want to run it on a tremendously large data set, then it is very likely not a quantum computing use case, because loading this large data set into a quantum computer already has complexity on the order of the size of the data. Many classical big data algorithms have comparable complexity, unless, say, your complexity is quadratic in the data size. That means that loading a big data set into a quantum computer, and here we're most likely talking about a fault-tolerant quantum computer, can have the same complexity as solving the problem you're interested in classically. So for some problems this is just a fundamental limitation. The good example is Grover's search, which is sometimes illustrated as searching an unstructured database. But the first thing you would have to do is load this database into the quantum computer, and when you load it you have to touch every element, at which point you could just stop when you find what you're looking for. So we wouldn't load a full database into the quantum computer in order to search it. These things can happen; I think particularly in quantum machine learning algorithms this fact is often not considered. There are still some interesting theoretical results, but if you want to look into this from an application point of view, you really need to analyze it end to end, from loading the data to extracting the result. Only then can you make a statement about a potential practical quantum advantage.
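Stefan's Grover caveat can be sketched numerically. This is my own idealized query-count comparison, ignoring constant factors: Grover's algorithm needs only on the order of sqrt(N) oracle queries, but loading N stored items into the machine is already O(N), the same order as a classical scan.

```python
import math

# Idealized query counts for searching among n items (constants ignored)
def classical_queries(n: int) -> int:
    return n  # linear scan of an unstructured list

def grover_queries(n: int) -> int:
    return math.ceil(math.sqrt(n))  # Grover's O(sqrt(n)) oracle queries

n = 1_000_000
print(classical_queries(n))  # 1000000
print(grover_queries(n))     # 1000

# But loading all n stored items first costs ~n operations, erasing the
# advantage unless the oracle can *compute* entries rather than read data.
```

This is the end-to-end analysis Stefan insists on: the quadratic query advantage is real, but only when the data-loading step doesn't dominate.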
Now, on your question about industries: we are actually working with quite a lot of companies; I think in the IBM Quantum Network we have over 130 members by now. There's of course the financial services sector; we're working with JPMorgan Chase and Goldman Sachs on things like options pricing, derivatives pricing, credit risk analysis or risk analysis in general, and also optimization, portfolio optimization, things like that. I think the financial services sector has a lot of interest in quantum computing because it's a very compute-intensive industry. For example, many, many things are done by Monte Carlo simulation, where we might have some potential speedups with quantum computing. But there's also a lot of optimization and machine learning; think about credit card fraud, which still causes a lot of costs for the credit card industry. If they could reduce the false positives, they would significantly reduce costs and improve their reputation, because no customer likes having their credit card accidentally blocked. So that's one sector. Then, since quantum computing might speed up optimization problems, there eventually might be use cases around logistics, supply chains, all these things...I mean, the original idea for quantum computing, as Feynman formulated it, was to simulate quantum systems...quantum chemistry, quantum physics, materials science...and eventually use cases in the life sciences and chemical industries; these are certainly use cases that might really have large potential...We have a lot of activities around quantum chemistry and how to eventually scale this to design new materials, or to understand how chemical reactions work, to build new catalysts that allow some chemical reactions to run at ambient conditions where today we require lots of energy, and so on.
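The Monte Carlo speedup Stefan alludes to is the quadratic improvement promised by quantum amplitude estimation: classical Monte Carlo error shrinks like 1/sqrt(N) in the number of samples, while amplitude estimation scales like 1/N in the number of circuit runs. A minimal arithmetic sketch of my own, idealized and ignoring constants and hardware noise:

```python
import math

# Idealized error scaling for estimating an expectation value
def mc_error_classical(n_samples: int) -> float:
    return 1 / math.sqrt(n_samples)  # standard Monte Carlo: ~1/sqrt(N)

def mc_error_quantum(n_runs: int) -> float:
    return 1 / n_runs  # amplitude estimation promise: ~1/N

# Same budget of 10,000 evaluations, very different precision:
print(mc_error_classical(10_000))  # 0.01
print(mc_error_quantum(10_000))    # 0.0001

# Equivalently: for a fixed target error, the quantum approach needs
# quadratically fewer evaluations, which is the appeal for risk analysis
# and derivatives pricing workloads mentioned above.
```

As with Grover, this is a promise about query counts, not yet a demonstrated practical advantage on today's noisy hardware.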
Stephen Ibaraki and Stefan Woerner
I ask for further POCs in the near term and Stefan provides added examples. Stefan also looks longer term. ...opens up completely new ways of doing business, for example in financial products; if you have real-time risk tracking, it can also maybe even prevent different things, because you can react way faster. So it can lead to way more informed decision making in multiple businesses... I think quantum computing has also the potential to solve some of the really big problems that society may face in the coming years, whether this is fertilizers for food, and so on, which can use a lot of energy these days. So this is something where it might help, and there are a couple of examples where nature does something extremely efficiently and humans have no clue how to reproduce it. And I think with quantum computing, once we really figure out how to build this hardware (and there are also a lot of open questions on the algorithms), this might give us a completely new lens to look at nature, to look at how things actually work. So I would imagine that this helps us also to really push the fundamental understanding of how the world actually works, eventually.
We explore areas: quantum cryptography, quantum encryption and decryption and Shor's algorithm, a quantum accelerator, quantum sensing, quantum communications, quantum gravimeters, 20 million qubits where Shor's algorithm becomes a real factor, and in breaking RSA encryption, quantum key distribution.
We get into a discussion about quantum-inspired applications (apply the principles to solve real problems today, even though the quantum hardware isn't quite there yet; and when it's ready, it scales). Stefan provides his insights, including on improvements to classical software: It's a nice term for classical algorithms. I think, in principle, it's very cool if quantum algorithm research can also inspire finding new classical algorithms. I think this can happen by kind of de-quantizing some quantum algorithms, as we have seen in recent years: there is a quantum algorithm that promises a certain advantage, and then people have found how to mimic some of the core parts of this algorithm using classical sampling techniques, and they could show similar performance. I mean, this is always a little bit disappointing if you try to show a quantum advantage with this algorithm and then classical algorithms can beat it. But I think it's a pretty cool development. But it stays a classical algorithm...it is not to be forgotten that it is just a classical algorithm. It doesn't give you any advantage coming from quantum; it's a classical algorithm that has been designed using some ideas from quantum computing, but it runs on classical computers. So it will not give you a quantum advantage, because it's classical.
We get into philosophical discussions about new kinds of computing and on quantum effects including on consciousness.
What is quantum computing? Everything you need to know about the strange world of quantum computers – ZDNet
Quantum computing exploits the puzzling behavior that scientists have been observing for decades in nature's smallest particles: think atoms, photons or electrons. At this scale, the classical laws of physics cease to apply, and instead we shift to quantum rules.
While researchers don't understand everything about the quantum world, what they do know is that quantum particles hold immense potential, in particular to hold and process large amounts of information. Successfully bringing those particles under control in a quantum computer could trigger an explosion of compute power that would phenomenally advance innovation in many fields that require complex calculations, like drug discovery, climate modelling, financial optimization or logistics.
As Bob Sutor, chief quantum exponent at IBM, puts it: "Quantum computing is our way of emulating nature to solve extraordinarily difficult problems and make them tractable," he tells ZDNet.
Quantum computers come in various shapes and forms, but they are all built on the same principle: they host a quantum processor where quantum particles can be isolated for engineers to manipulate.
The nature of those quantum particles, as well as the method employed to control them, varies from one quantum computing approach to another. Some methods require the processor to be cooled down to freezing temperatures; others manipulate quantum particles using lasers; but all share the goal of finding out how to best exploit the value of quantum physics.
The systems we have been using since the 1940s in various shapes and forms (laptops, smartphones, cloud servers, supercomputers) are known as classical computers. Those are based on bits, a unit of information that powers every computation that happens in the device.
In a classical computer, each bit can take on either a value of one or zero to represent and transmit the information that is used to carry out computations. Using bits, developers can write programs, which are sets of instructions that are read and executed by the computer.
Classical computers have been indispensable tools in the past few decades, but the inflexibility of bits is limiting. As an analogy, if tasked with looking for a needle in a haystack, a classical computer would have to be programmed to look through every single piece of hay straw until it reached the needle.
There are still many large problems, therefore, that classical devices can't solve. "There are calculations that could be done on a classical system, but they might take millions of years or use more computer memory than exists in total on Earth," says Sutor. "These problems are intractable today."
At the heart of any quantum computer are qubits, also known as quantum bits, and which can loosely be compared to the bits that process information in classical computers.
Qubits, however, have very different properties to bits, because they are made of the quantum particles found in nature, those same particles that have been obsessing scientists for many years.
One of the properties of quantum particles that is most useful for quantum computing is known as superposition, which allows quantum particles to exist in several states at the same time. The best way to imagine superposition is to compare it to tossing a coin: instead of being heads or tails, quantum particles are the coin while it is still spinning.
By controlling quantum particles, researchers can load them with data to create qubits and thanks to superposition, a single qubit doesn't have to be either a one or a zero, but can be both at the same time. In other words, while a classical bit can only be heads or tails, a qubit can be, at once, heads and tails.
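The coin analogy can be made precise with a two-element state vector. The sketch below uses plain NumPy rather than any vendor's API: a qubit in equal superposition has amplitude 1/sqrt(2) on each of |0⟩ and |1⟩, and the measurement probabilities are the squared magnitudes of those amplitudes.

```python
import numpy as np

# |0> and |1> as basis vectors of a single qubit
ket0 = np.array([1.0, 0.0])

# A Hadamard gate puts |0> into an equal superposition of |0> and |1>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
qubit = H @ ket0            # amplitudes (1/sqrt(2), 1/sqrt(2))

probs = np.abs(qubit) ** 2  # Born rule: probability = |amplitude|^2
# The "spinning coin": 50% heads (|0>), 50% tails (|1>)
assert np.allclose(probs, [0.5, 0.5])
```

Until it is measured, the qubit genuinely carries both amplitudes at once; measurement collapses it to a single classical outcome with those probabilities.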
This means that, when asked to solve a problem, a quantum computer can use qubits to run several calculations at once to find an answer, exploring many different avenues in parallel.
So in the needle-in-a-haystack scenario above, unlike a classical machine, a quantum computer could in principle browse through all hay straws at the same time, finding the needle in a matter of seconds rather than searching for years, even centuries, before it found what it was looking for.
What's more: qubits can be physically linked together thanks to another quantum property called entanglement, meaning that with every qubit that is added to a system, the device's capabilities increase exponentially, where adding more bits only generates linear improvement.
Every time we use another qubit in a quantum computer, we double the amount of information and processing ability available for solving problems. So by the time we get to 275 qubits, we can compute with more pieces of information than there are atoms in the observable universe. And the compression of computing time that this could generate could have big implications in many use cases.
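That "more pieces of information than there are atoms in the observable universe" claim is simple arithmetic: an n-qubit register is described by 2^n complex amplitudes, and 2^275 already exceeds the commonly cited order-of-magnitude estimate of roughly 10^80 atoms.

```python
# An n-qubit register is described by 2**n complex amplitudes.
def state_space(n_qubits):
    return 2 ** n_qubits

# Common order-of-magnitude estimate for the observable universe.
ATOMS_IN_OBSERVABLE_UNIVERSE = 10 ** 80

assert state_space(275) == 2 * state_space(274)       # each qubit doubles it
assert state_space(275) > ATOMS_IN_OBSERVABLE_UNIVERSE
```

This doubling with every added qubit is exactly the exponential scaling that entanglement makes possible, and that bits, which only add capacity linearly, cannot match.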
"There are a number of cases where time is money. Being able to do things more quickly will have a material impact in business," Scott Buchholz, managing director at Deloitte Consulting, tells ZDNet.
The gains in time that researchers are anticipating as a result of quantum computing are not of the order of hours or even days. We're rather talking about potentially being capable of calculating, in just a few minutes, the answer to problems that today's most powerful supercomputers couldn't resolve in thousands of years, ranging from modelling hurricanes all the way to cracking the cryptography keys protecting the most sensitive government secrets.
And businesses have a lot to gain, too. According to recent research by Boston Consulting Group (BCG), the advances that quantum computing will enable could create value of up to $850 billion in the next 15 to 30 years, $5 to $10 billion of which will be generated in the next five years if key vendors deliver on the technology as they have promised.
Programmers write problems in the form of algorithms for classical computers to resolve and similarly, quantum computers will carry out calculations based on quantum algorithms. Researchers have already identified that some quantum algorithms would be particularly suited to the enhanced capabilities of quantum computers.
For example, quantum systems could tackle optimization algorithms, which help identify the best solution among many feasible options, and could be applied in a wide range of scenarios ranging from supply chain administration to traffic management. ExxonMobil and IBM, for instance, are working together to find quantum algorithms that could one day manage the 50,000 merchant ships crossing the oceans each day to deliver goods, to reduce the distance and time traveled by fleets.
Quantum simulation algorithms are also expected to deliver unprecedented results, as qubits enable researchers to handle the simulation and prediction of complex interactions between molecules in larger systems, which could lead to faster breakthroughs in fields like materials science and drug discovery.
With quantum computers capable of handling and processing much larger datasets, AI and machine-learning applications are set to benefit hugely, with faster training times and more capable algorithms. And researchers have also demonstrated that quantum algorithms have the potential to crack traditional cryptography keys, which for now are too mathematically difficult for classical computers to break.
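The cryptographic threat is concrete: RSA's security rests entirely on the hardness of factoring, and Shor's algorithm factors efficiently on a large fault-tolerant quantum computer. A toy example with tiny textbook primes (never usable at these sizes in practice) shows how knowing the factors of the public modulus immediately yields the private key:

```python
# Toy RSA: anyone who can factor n recovers the private key.
p, q = 61, 53
n = p * q                      # public modulus (3233)
e = 17                         # public exponent
phi = (p - 1) * (q - 1)        # computable only if p and q are known
d = pow(e, -1, phi)            # private exponent: modular inverse of e

message = 65
ciphertext = pow(message, e, n)          # encrypt with the public key
recovered = pow(ciphertext, d, n)        # decrypt with the private key
assert recovered == message

# Shor's algorithm would factor a real 2048-bit n efficiently,
# exposing d exactly as above; classically that is intractable.
```

A classical computer cannot factor a 2048-bit modulus in any realistic time, which is why RSA is safe today; a large enough quantum computer would change that, which is why post-quantum cryptography is already being standardized.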
To create qubits, which are the building blocks of quantum computers, scientists have to find and manipulate the smallest particles of nature, tiny parts of the universe that can be accessed through different mediums. This is why there are currently many types of quantum processors being developed by a range of companies.
One of the most advanced approaches consists of using superconducting qubits, which are made of electrons, and come in the form of the familiar chandelier-like quantum computers. Both IBM and Google have developed superconducting processors.
Another approach that is gaining momentum is trapped ions, which Honeywell and IonQ are leading the way on, and in which qubits are housed in arrays of ions that are trapped in electric fields and then controlled with lasers.
Major companies like Xanadu and PsiQuantum, for their part, are investing in yet another method that relies on quantum particles of light, called photons, to encode data and create qubits. Qubits can also be created out of silicon spin qubits which Intel is focusing on but also cold atoms or even diamonds.
Quantum annealing, an approach that was chosen by D-Wave, is a different category of computing altogether. It doesn't rely on the same paradigm as other quantum processors, known as the gate model. Quantum annealing processors are much easier to control and operate, which is why D-Wave has already developed devices that can manipulate thousands of qubits, where virtually every other quantum hardware company is working with about 100 qubits or fewer. On the other hand, the annealing approach is only suitable for a specific set of optimization problems, which limits its capabilities.
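The "specific set of optimization problems" that annealers target are QUBOs: quadratic unconstrained binary optimization problems. A tiny brute-forced instance (the coefficients below are arbitrary toy values) shows the form a D-Wave style annealer minimizes:

```python
from itertools import product

# QUBO: minimize the energy of a binary vector x, the native problem
# format of quantum annealers. Q holds a toy instance's coefficients.
Q = {
    (0, 0): -1.0, (1, 1): -1.0, (2, 2): -2.0,   # linear terms (diagonal)
    (0, 1): 2.0, (1, 2): 0.5,                   # quadratic couplings
}

def energy(x):
    return sum(coeff * x[i] * x[j] for (i, j), coeff in Q.items())

# Classical brute force over all 2^3 assignments; an annealer searches
# this energy landscape physically, and the space grows as 2^n.
best = min(product([0, 1], repeat=3), key=energy)
```

Brute force is fine for three variables but blows up exponentially; the bet behind annealing is that the hardware can find low-energy states of much larger instances, though only for problems that fit this QUBO mold.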
Right now, with a mere 100 qubits being the state of the art, there is very little that can actually be done with quantum computers. For qubits to start carrying out meaningful calculations, they will have to be counted in the thousands, and even millions.
"While there is a tremendous amount of promise and excitement about what quantum computers can do one day, I think what they can do today is relatively underwhelming," says Buchholz.
Increasing the qubit count in gate-model processors, however, is incredibly challenging. This is because keeping the particles that make up qubits in their quantum state is difficult, a little bit like trying to keep a coin spinning without falling on one side or the other, except much harder.
Keeping qubits spinning requires isolating them from any environmental disturbance that might cause them to lose their quantum state. Google and IBM, for example, do this by placing their superconducting processors in temperatures that are colder than outer space, which in turn require sophisticated cryogenic technologies that are currently near-impossible to scale up.
In addition, the instability of qubits means that they are unreliable, and still likely to cause computation errors. This has given rise to a branch of quantum computing dedicated to developing error-correction methods.
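The intuition behind error correction borrows from classical repetition codes: spread one logical bit across several physical bits and take a majority vote. Real quantum codes, such as the surface code, cannot copy quantum states outright (the no-cloning theorem forbids it) and instead measure error syndromes, so the classical sketch below only illustrates the redundancy idea, not the quantum mechanism.

```python
import random

def encode(bit, copies=3):
    # Repetition code: one logical bit -> several physical bits.
    return [bit] * copies

def noisy_channel(bits, flip_prob, rng):
    # Each physical bit flips independently with probability flip_prob.
    return [b ^ (rng.random() < flip_prob) for b in bits]

def decode(bits):
    # Majority vote recovers the logical bit if fewer than half flipped.
    return int(sum(bits) > len(bits) / 2)

rng = random.Random(0)
trials = 10_000
errors = sum(decode(noisy_channel(encode(1), 0.05, rng)) != 1
             for _ in range(trials))
# A logical error now needs 2 of 3 flips (~0.7% here), well below
# the 5% physical error rate of each individual bit.
```

The same trade-off drives quantum roadmaps: each reliable logical qubit will consume many noisy physical qubits, which is part of why useful machines need qubit counts in the thousands or millions.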
Although research is advancing at pace, therefore, quantum computers are for now stuck in what is known as the NISQ era: noisy, intermediate-scale quantum computing. The end goal, however, is to build a fault-tolerant, universal quantum computer.
As Buchholz explains, it is hard to tell when this is likely to happen. "I would guess we are a handful of years from production use cases, but the real challenge is that this is a little like trying to predict research breakthroughs," he says. "It's hard to put a timeline on genius."
In 2019, Google claimed that its 54-qubit superconducting processor, called Sycamore, had achieved quantum supremacy: the point at which a quantum computer can solve a computational task that is impossible to run on a classical device in any realistic amount of time.
Google said that Sycamore had calculated, in only 200 seconds, the answer to a problem that would have taken the world's biggest supercomputers 10,000 years to complete.
More recently, researchers from the University of Science and Technology of China claimed a similar breakthrough, saying that their quantum processor had taken 200 seconds to achieve a task that would have taken 600 million years to complete with classical devices.
This is far from saying that either of those quantum computers are now capable of outstripping any classical computer at any task. In both cases, the devices were programmed to run very specific problems, with little usefulness aside from proving that they could compute the task significantly faster than classical systems.
Without a higher qubit count and better error correction, proving quantum supremacy for useful problems is still some way off.
Organizations that are investing in quantum resources see this as the preparation stage: their scientists are doing the groundwork to be ready for the day that a universal and fault-tolerant quantum computer is ready.
In practice, this means that they are trying to discover the quantum algorithms that are most likely to show an advantage over classical algorithms once they can be run on large-scale quantum systems. To do so, researchers typically try to prove that quantum algorithms perform comparably to classical ones on very small use cases, and theorize that as quantum hardware improves, and the size of the problem can be grown, the quantum approach will inevitably show some significant speed-ups.
For example, scientists at Japanese steel manufacturer Nippon Steel recently came up with a quantum optimization algorithm that could compete against its classical counterpart for a small problem that was run on a 10-qubit quantum computer. In principle, this means that the same algorithm equipped with thousands or millions of error-corrected qubits could eventually optimize the company's entire supply chain, complete with the management of dozens of raw materials, processes and tight deadlines, generating huge cost savings.
The work that quantum scientists are carrying out for businesses is, therefore, highly experimental, and so far there are fewer than 100 quantum algorithms that have been shown to compete against their classical equivalents, which only points to how emergent the field still is.
With most use cases requiring a fully error-corrected quantum computer, just who will deliver one first is the question on everyone's lips in the quantum industry, and it is impossible to know the exact answer.
All quantum hardware companies are keen to stress that their approach will be the first one to crack the quantum revolution, making it even harder to discern noise from reality. "The challenge at the moment is that it's like looking at a group of toddlers in a playground and trying to figure out which one of them is going to win the Nobel Prize," says Buchholz.
"I have seen the smartest people in the field say they're not really sure which one of these is the right answer. There are more than half a dozen different competing technologies and it's still not clear which one will wind up being the best, or if there will be a best one," he continues.
In general, experts agree that the technology will not reach its full potential until after 2030. The next five years, however, may start bringing some early use cases as error correction improves and qubit counts start reaching numbers that allow for small problems to be programmed.
IBM is one of the rare companies that has committed to a specific quantum roadmap, which defines the ultimate objective of realizing a million-qubit quantum computer. In the nearer term, Big Blue anticipates that it will release a 1,121-qubit system in 2023, which might mark the start of the first experimentations with real-world use cases.
Developing quantum hardware is a huge part of the challenge, and arguably the most significant bottleneck in the ecosystem. But even a universal fault-tolerant quantum computer would be of little use without the matching quantum software.
"Of course, none of these online facilities are much use without knowing how to 'speak' quantum," Andrew Fearnside, senior associate specializing in quantum technologies at intellectual property firm Mewburn Ellis, tells ZDNet.
Creating quantum algorithms is not as easy as taking a classical algorithm and adapting it to the quantum world. Quantum computing, rather, requires a brand-new programming paradigm that can only be run on a brand-new software stack.
Of course, some hardware providers also develop software tools, the most established of which is IBM's open-source quantum software development kit Qiskit. But on top of that, the quantum ecosystem is expanding to include companies dedicated exclusively to creating quantum software. Familiar names include Zapata, QC Ware or 1QBit, which all specialize in providing businesses with the tools to understand the language of quantum.
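To give a feel for what those software stacks express, here is the canonical first quantum program, preparing an entangled Bell pair, sketched with plain NumPy state vectors rather than any particular SDK's API (an SDK like Qiskit would express the same circuit as gate calls rather than matrices):

```python
import numpy as np

# Two-qubit state-vector simulation of the canonical "hello world"
# quantum program: Hadamard on qubit 0, then CNOT, yielding a Bell pair.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I2 = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],     # basis order: |00>, |01>, |10>, |11>
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],     # flips the target when the control is 1
                 [0, 0, 1, 0]])

state = np.array([1.0, 0.0, 0.0, 0.0])     # start in |00>
state = np.kron(H, I2) @ state             # superposition on qubit 0
state = CNOT @ state                       # entangle the two qubits

probs = np.abs(state) ** 2
# Outcomes are perfectly correlated: only |00> and |11> ever occur.
assert np.allclose(probs, [0.5, 0, 0, 0.5])
```

Quantum software kits hide this linear algebra behind a circuit abstraction, but the programming model they teach is exactly this: compose gates, then read out measurement probabilities.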
And increasingly, promising partnerships are forming to bring together different parts of the ecosystem. For example, the recent alliance between Honeywell, which is building trapped-ion quantum computers, and quantum software company Cambridge Quantum Computing (CQC), has got analysts predicting that a new player could be taking a lead in the quantum race.
The complexity of building a quantum computer (think ultra-high vacuum chambers, cryogenic control systems and other exotic quantum instruments) means that the vast majority of quantum systems are currently firmly sitting in lab environments, rather than being sent out to customers' data centers.
To let users access the devices to start running their experiments, therefore, quantum companies have launched commercial quantum computing cloud services, making the technology accessible to a wider range of customers.
The four largest providers of public cloud computing services currently offer access to quantum computers on their platform. IBM and Google have both put their own quantum processors on the cloud, while Microsoft's Azure Quantum and AWS's Braket service let customers access computers from third-party quantum hardware providers.
The jury remains out on which technology will win the race, if any at all, but one thing is for certain: the quantum computing industry is developing fast, and investors are generously funding the ecosystem. Equity investments in quantum computing nearly tripled in 2020, and according to BCG, they are set to rise even more in 2021 to reach $800 million.
Government investment is even more significant: the US has unlocked $1.2 billion for quantum information science over the next five years, while the EU announced a €1 billion ($1.20 billion) quantum flagship. The UK also recently reached the £1 billion ($1.37 billion) budget milestone for quantum technologies, and while official numbers are not known in China, the government has made no secret of its desire to aggressively compete in the quantum race.
This has caused the quantum ecosystem to flourish over the past years, with new startups increasing from a handful in 2013 to nearly 200 in 2020. The appeal of quantum computing is also increasing among potential customers: according to analysis firm Gartner, while only 1% of companies were budgeting for quantum in 2018, 20% are expected to do so by 2023.
Although not all businesses need to be preparing themselves to keep up with quantum-ready competitors, there are some industries where quantum algorithms are expected to generate huge value, and where leading companies are already getting ready.
Goldman Sachs and JP Morgan are two examples of financial behemoths investing in quantum computing. That's because in banking,quantum optimization algorithms could give a boost to portfolio optimization, by better picking which stocks to buy and sell for maximum return.
In pharmaceuticals, where the drug discovery process is on average a $2 billion, 10-year-long deal that largely relies on trial and error, quantum simulation algorithms are also expected to make waves. This is also the case in materials science: companies like OTI Lumionics, for example, are exploring the use of quantum computers to design more efficient OLED displays.
Leading automotive companies including Volkswagen and BMW are also keeping a close eye on the technology, which could impact the sector in various ways, ranging from designing more efficient batteries to optimizing the supply chain, through to better management of traffic and mobility. Volkswagen, for example, pioneered the use of a quantum algorithm that optimized bus routes in real time by dodging traffic bottlenecks.
As the technology matures, however, it is unlikely that quantum computing will be limited to a select few. Rather, analysts anticipate that virtually all industries have the potential to benefit from the computational speedup that qubits will unlock.
Quantum computers are expected to be phenomenal at solving a certain class of problems, but that doesn't mean that they will be a better tool than classical computers for every single application. Particularly, quantum systems aren't a good fit for fundamental computations like arithmetic, or for executing commands.
"Quantum computers are great constraint optimizers, but that's not what you need to run Microsoft Excel or Office," says Buchholz. "That's what classical technology is for: for doing lots of maths, calculations and sequential operations."
In other words, there will always be a place for the way that we compute today. It is unlikely, for example, that you will be streaming a Netflix series on a quantum computer anytime soon. Rather, the two technologies will be used in conjunction, with quantum computers being called for only where they can dramatically accelerate a specific calculation.
Buchholz predicts that, as classical and quantum computing start working alongside each other, access will look like a configuration option. Data scientists currently have a choice of using CPUs or GPUs when running their workloads, and it might be that quantum processing units (QPUs) join the list at some point. It will be up to researchers to decide which configuration to choose, based on the nature of their computation.
Although the precise way that users will access quantum computing in the future remains to be defined, one thing is certain: they are unlikely to be required to understand the fundamental laws of quantum computing in order to use the technology.
"People get confused because the way we lead into quantum computing is by talking about technical details," says Buchholz. "But you don't need to understand how your cellphone works to use it.
"People sometimes forget that when you log into a server somewhere, you have no idea what physical location the server is in or even if it exists physically at all anymore. The important question really becomes what it is going to look like to access it."
And as fascinating as qubits, superposition, entanglement and other quantum phenomena might be, for most of us this will come as welcome news.
AI, quantum computing and other technologies poised to transform healthcare – Healthcare Finance News
The COVID-19 pandemic has created numerous challenges in healthcare, but challenges can sometimes breed innovation. Technological innovation in particular is poised to change the way care is delivered, driving efficiency in the process. Efficiency will be key as hospitals and health systems look to recover from the initial, devastating wave of the pandemic.
Ryan Hodgin, chief technology officer for IBM Global Healthcare, and Kate Huey, partner at IBM Healthcare, will speak about some of these technological innovations in their digital HIMSS21 session, "Innovation Driven Resiliency: Redefining What's Possible."
The technology in question can encompass telehealth, artificial intelligence, automation, blockchain, chatbots, apps and other elements that have become mainstays of healthcare during the course of the pandemic.
In a way, science fiction is becoming science fact: Technologies that were once in the experimental phase are now coming to life and driving innovation, particularly quantum computing. The power of quantum computing has the potential to transform healthcare just by sheer force of its impressive computational power.
One of the big factors accelerating technological innovation is the healthcare workforce, which has been placed under enormous stress over the past 18 months, with many doctors and clinicians reporting burnout or feelings of being overwhelmed. These technologies promise to reduce the burden being felt by providers.
Importantly, they also promise to more actively engage healthcare consumers, who increasingly expect healthcare to be as user-friendly and experience driven as their favorite apps or online shopping portals.
Hodgin and Huey will speak more on the topic when their session debuts on Tuesday, August 10, from 11:45 a.m. - 12:15 p.m.
IBM has unveiled a brand-new quantum computer in Japan, thousands of miles away from the company's quantum computation center in Poughkeepsie, New York, in another step towards bringing quantum technologies out of Big Blue's labs and directly to partners around the world.
A Quantum System One, IBM's flagship integrated superconducting quantum computer, is now available on-premises in the Kawasaki Business Incubation Center in Kawasaki City, for Japanese researchers to run their quantum experiments in fields ranging from chemistry to finance.
Most customers to date can only access IBM's System One over the cloud, by connecting to the company's quantum computation center in Poughkeepsie.
Recently, the company unveiled the very first quantum computer that was physically built outside of the computation center's data centers,when the Fraunhofer Institute in Germany acquired a System One. The system that has now been deployed to Japan is therefore IBM's second quantum computer that is located outside of the US.
The announcement comes as part of a long-standing relationship with Japanese organizations. In 2019, IBM and the University of Tokyo inaugurated the Japan-IBM Quantum Partnership, a national agreement inviting universities and businesses across the country to engage in quantum research. It was agreed then that a Quantum System One would eventually be installed at an IBM facility in Japan.
Building on the partnership, Big Blue and the University of Tokyo launched the Quantum Innovation Initiative Consortium last year to further bring together organizations working in the field of quantum. With this, the Japanese government has made it clear that it is keen to be at the forefront of the promising developments that quantum technologies are expected to bring about.
Leveraging some physical properties that are specific to quantum mechanics, quantum computers could one day be capable of carrying out calculations that are impossible to run on the devices that are used today, known as classical computers.
In some industries, this could have big implications; and as part of the consortium, together with IBM researchers, some Japanese companies have already identified promising use cases. Mitsubishi Chemical's research team, for example, has developed quantum algorithms capable of understanding the complex behavior of industrial chemical compounds with the goal of improving OLED displays.
A recent research paper published by the scientistshighlighted the potential of quantum computers when it comes to predicting the properties of OLED materials, which could eventually lead to more efficient displays requiring low-power consumption.
Similarly, researchers from Mizuho Financial Group and Mitsubishi Financial Group have been developing quantum algorithms that could speed up financial operations like Monte Carlo simulations, which could allow for optimized portfolio management thanks to better risk analysis and option pricing.
With access to IBM's Quantum System One, research in those fields is now expected to accelerate. But other industry leaders exploring quantum technologies as part of the partnership extend from Sony to Toyota, through Hitachi, Toshiba or JSR.
Quantum computing is still in its very early stages, and it is not yet possible to use quantum computers to perform computations that are of any value to a business. Rather, scientists are currently carrying out proofs-of-concept, by attempting to identify promising applications and testing them at a very small scale, to be prepared for the moment that the hardware is fully ready.
This is still some way off. Building and controlling the components of quantum computers is a huge challenge, which has so far been limited to the confines of specialist laboratories such as IBM's Poughkeepsie computation center.
It is significant, therefore, that IBM's Quantum System One is now mature enough to be deployed outside of the company's lab.
"Thousands of meticulously engineered components have to work together flawlessly in extreme temperatures within astonishing tolerances," said IBM in a blog post.
Back in the US, too, quantum customers are showing interest in hosting quantum hardware in their own facilities. The Cleveland Clinic, for example, recently invested $500 million for Big Blue to build quantum hardware on-premises.
Professor Michael Biercuk is CEO of quantum tech startup Q-CTRL.
Researchers at the University of Sydney and quantum control startup Q-CTRL have announced a way to identify sources of error in quantum computers through machine learning, providing hardware developers the ability to pinpoint performance degradation with unprecedented accuracy and accelerate paths to useful quantum computers.
A joint scientific paper detailing the research, titled "Quantum Oscillator Noise Spectroscopy via Displaced Cat States," has been published in Physical Review Letters, the world's premier physical science research journal and flagship publication of the American Physical Society (APS Physics).
Focused on reducing errors caused by environmental noise (the Achilles' heel of quantum computing), the University of Sydney team developed a technique to detect the tiniest deviations from the precise conditions needed to execute quantum algorithms using trapped-ion and superconducting quantum computing hardware. These are the core technologies used by world-leading industrial quantum computing efforts at IBM, Google, Honeywell, IonQ, and others.
The university team is based at the Quantum Control Laboratory, led by Professor Michael Biercuk, in the Sydney Nanoscience Hub.
To pinpoint the source of the measured deviations, Q-CTRL scientists developed a new way to process the measurement results using custom machine-learning algorithms. In combination with Q-CTRL's existing quantum control techniques, the researchers were also able to minimise the impact of background interference in the process. This allowed easy discrimination between real noise sources that could be fixed and phantom artefacts of the measurements themselves.
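The published method is far more sophisticated, but the core idea of separating persistent noise from measurement artefacts can be sketched with a toy example (all signal parameters below are invented for illustration): a real noise line appears in every probe run, while a one-off artefact appears in only one, so averaging power spectra across runs dilutes the artefact.

```python
import numpy as np

rng = np.random.default_rng(1)
n_runs, n_samples, fs = 20, 512, 512.0
t = np.arange(n_samples) / fs

spectra = []
for run in range(n_runs):
    sig = 0.5 * np.sin(2 * np.pi * 50.0 * t)        # persistent 50 Hz noise line
    sig += 0.2 * rng.standard_normal(n_samples)     # white background noise
    if run == 0:                                    # one-off artefact at 120 Hz
        sig += 1.0 * np.sin(2 * np.pi * 120.0 * t)
    spectra.append(np.abs(np.fft.rfft(sig)) ** 2)   # periodogram of this run

# The 50 Hz line survives averaging; the single-run artefact is diluted 20x.
avg = np.mean(spectra, axis=0)
freqs = np.fft.rfftfreq(n_samples, d=1 / fs)
persistent_peak = freqs[np.argmax(avg)]
```

In practice the hard part, which the machine-learning step addresses, is making this discrimination when the noise is weak, drifting, and entangled with the probe sequence itself.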
"Combining cutting-edge experimental techniques with machine learning has demonstrated huge advantages in the development of quantum computers," said Dr Cornelius Hempel of ETH Zurich, who conducted the research while at the University of Sydney. "The Q-CTRL team was able to rapidly develop a professionally engineered machine learning solution that allowed us to make sense of our data and provide a new way to see the problems in the hardware and address them."
Q-CTRL CEO Professor Biercuk said: "The ability to identify and suppress sources of performance degradation in quantum hardware is critical to both basic research and industrial efforts building quantum sensors and quantum computers."
"Quantum control, augmented by machine learning, has shown a pathway to make these systems practically useful and dramatically accelerate R&D timelines," he said.
"The published results in a prestigious, peer-reviewed journal validate the benefit of ongoing cooperation between foundational scientific research in a university laboratory and deep-tech startups. We're thrilled to be pushing the field forward through our collaboration."
Q-CTRL was spun out of the University of Sydney by Professor Michael Biercuk from the School of Physics. The startup builds quantum control infrastructure software for quantum technology end-users and R&D professionals across all applications.
Q-CTRL has assembled the world's foremost team of expert quantum-control engineers, providing solutions to many of the most advanced quantum computing and sensing teams globally. Q-CTRL is funded by SquarePeg Capital, Sierra Ventures, Sequoia Capital China, Data Collective, Horizons Ventures, Main Sequence Ventures and In-Q-Tel. Q-CTRL has international headquarters in Sydney, Los Angeles, and Berlin.
Google’s ‘time crystals’ could be the greatest scientific achievement of our lifetimes – The Next Web
Eureka! A research team featuring dozens of scientists working in partnership with Google's quantum computing labs may have created the world's first time crystal inside a quantum computer.
This is the kind of news that makes me want to jump up and do a happy dance.
These scientists may have produced an entirely new phase of matter. I'm going to do my best to explain what that means and why I personally believe this is the most important scientific breakthrough of our lifetimes.
However, for the sake of clarity, there are two points I need to make first:
In colloquial terms, it's a big screw you to Sir Isaac Newton.
Time crystals are a new phase of matter. For the sake of simplicity, let's imagine a cube of ice.
When you put a cube of ice in a glass of water, you're introducing two separate entities (the ice cube and the liquid water) to each other at two different temperatures.
Everyone knows that the water will get colder (that's why we put the ice in there) and, over time, the ice will get warmer and turn into water. Eventually you'll just have a glass of room-temperature water.
We call this process reaching thermal equilibrium.
Most people are familiar with Newton's first law of motion; it's the one that says an object at rest tends to stay at rest and an object in motion tends to stay in motion.
An important side-effect of this law of physics is that it means a perpetual motion machine is classically impossible.
According to classical physics, the universe is always moving towards entropy. In other words: if we isolate an ice cube and a room-temperature glass of water from all other external forces, the water will always melt the ice cube.
The entropy (the tendency toward disorder) of any system will always remain the same if there are no processes, and it will always increase if there are processes.
Since our universe has stars exploding, black holes sucking, and people lighting things on fire (chemical processes), entropy is always increasing.
Except when it comes to time crystals. Time crystals don't give a damn what Newton or anyone else thinks. They're lawbreakers and heart takers. They can, theoretically, maintain entropy even when they're used in a process.
Think about a crystal you're familiar with, such as a snowflake. Snowflakes aren't just beautiful because each one is unique; they're also fascinating formations that nearly break the laws of physics themselves.
Crystalline structures form in the physical world because, for whatever fundamental scientific reason, the atoms within them want to exist in certain exact points.
"Want" is a really weird word to use when we're talking about atoms (I'm certainly not implying they're sentient), but it's hard to describe the tendency toward crystalline structures in abstract terms such as "why."
A time crystal is a new phase of matter that, simplified, would be like having a snowflake that constantly cycled back and forth between two different configurations. It's a seven-pointed lattice one moment and a ten-pointed lattice the next, or whatever.
What's amazing about time crystals is that when they cycle back and forth between two different configurations, they don't lose or use any energy.
Time crystals can survive energy processes without falling victim to entropy. The reason they're called time crystals is because they can have their cake and eat it too.
They can be in a state of having eaten the whole cake, and then cycle right back to a state of still having the cake, and they can, theoretically, do this forever and ever.
Most importantly, they can do this inside of an isolated system. That means they can consume the cake and then magically make it reappear over and over again forever, without using any fuel or energy.
Literally everyone should care. As I wrote back in 2018, time crystals could be the miracle quantum computing needs.
Nearly every far-future tech humans can imagine, from teleportation to warp drives and from artificial food synthesizers to perpetual motion reactors capable of powering the world without burning fuels or harnessing energy, will require quantum computing systems.
Quantum computers can solve really hard problems. Unfortunately, they're brittle. It's hard to build them, hard to maintain them, hard to get them to do anything, and even harder to interpret the results they give. This is because of something called decoherence, which works a lot like entropy.
Computer bits in the quantum world, qubits, share a funky feature of quantum mechanics that makes them act differently when observed than when they're left alone. That makes any direct measurement of qubit states (reading the computer's output) difficult.
But time crystals want to be coherent. So putting them inside a quantum computer and using them to conduct computer processes could potentially serve an incredibly important function: ensuring quantum coherence.
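To see why coherence is so fragile, here is a minimal toy model of dephasing; this is an illustrative sketch with made-up noise parameters, not a simulation of real hardware. A qubit in superposition picks up a random phase from environmental noise, and averaging over many noise realizations drives the off-diagonal coherence toward zero:

```python
import numpy as np

rng = np.random.default_rng(0)
n_realizations, n_steps, noise_strength = 5000, 200, 0.1

# Each realization accumulates a random-walk phase from environmental noise.
phase = np.cumsum(noise_strength * rng.standard_normal((n_realizations, n_steps)),
                  axis=1)

# Ensemble-averaged off-diagonal term of the qubit's density matrix:
# random phase diffusion gives |<e^{i*phi}>| ~ exp(-var(phi)/2),
# so coherence decays from ~1 toward 0 as noise accumulates over time.
coherence = np.abs(np.mean(np.exp(1j * phase), axis=0))
```

The decay of `coherence` is the enemy; a system that intrinsically resists this decay, as a time crystal is hoped to, would be valuable precisely because it sidesteps the averaging-away shown here.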
No. No, no, no, no, no. Don't get me wrong. These are baby steps. This is infancy research. This is Antony van Leeuwenhoek becoming the first person to use a microscope to look at a drop of water under magnification.
What Google's done, potentially, is prove that humans can manufacture time crystals. In the words of the researchers themselves:
"These results establish a scalable approach to study non-equilibrium phases of matter on current quantum processors."
Basically, they believe they've proven the concept, so now it's time to see what can be done with it.
Time crystals have always been theoretical. And by "always," I mean since 2012, when they were first hypothesized.
If Google's actually created time crystals, it could accelerate the timeline for quantum computing breakthroughs from "maybe never" to "maybe within a few decades."
At the far-fetched, super-optimistic end of things we could see the creation of a working warp drive in our lifetimes. Imagine taking a trip to Mars or the edge of our solar system, and being back home on Earth in time to catch the evening news.
And even on the conservative end, with more realistic expectations, it's not hard to imagine quantum computing-based chemical and drug discovery leading to universally effective cancer treatments.
This could be the big eureka we've all been waiting for. I can't wait to see what happens in peer review.
If you want to know more, you can read Google's paper here. And if you're looking for a technical deep dive into the scientific specifics of what the researchers accomplished in the lab, this piece in Quanta Magazine by Natalie Wolchover is the bee's knees.