

Category Archives: Quantum Computing

ProBeat: AWS and Azure are generating uneasy excitement in quantum computing – VentureBeat

Quantum is having a moment. In October, Google claimed to have achieved a quantum supremacy milestone. In November, Microsoft announced Azure Quantum, a cloud service that lets you tap into quantum hardware providers Honeywell, IonQ, or QCI. Last week, AWS announced Amazon Braket, a cloud service that lets you tap into quantum hardware providers D-Wave, IonQ, and Rigetti. At the Q2B 2019 quantum computing conference this week, I got a pulse for how the nascent industry is feeling.

Binary digits (bits) are the basic units of information in classical computing, while quantum bits (qubits) are the basic units of quantum computing. Bits are always in a state of 0 or 1, while qubits can be in a state of 0, 1, or a superposition of the two. Quantum computing leverages qubits to perform computations that would be much more difficult for a classical computer. Potential applications are so vast and wide (from basic optimization problems to machine learning to all sorts of modeling) that interested industries span finance, chemistry, aerospace, cryptography, and more. But it's still so early that the industry is nowhere close to reaching consensus on what the transistor for qubits should look like.

Currently, your cloud quantum computing options are limited to single hardware providers, such as those from D-Wave and IBM. Amazon and Microsoft want to change that.

Enterprises and researchers interested in testing and experimenting with quantum are excited because they will be able to use different quantum processors via the same service, at least in theory. They're uneasy, however, because the quantum processors are so fundamentally different that it's not clear how easy it will be to switch between them. D-Wave uses quantum annealing, Honeywell and IonQ use ion trap devices, and Rigetti and QCI use superconducting chips. Even the technologies that are the same have completely different architectures.

Entrepreneurs and enthusiasts are hopeful that Amazon and Microsoft will make it easier to interface with the various quantum hardware technologies. They're uneasy, however, because Amazon and Microsoft have not shared pricing and technical details. Plus, some of the quantum providers offer their own cloud services, so it will be difficult to suss out when it makes more sense to work with them directly.

The hardware providers themselves are excited because they get exposure to massive customer bases. Amazon and Microsoft are the world's biggest and second-biggest cloud providers, respectively. They're uneasy, however, because the tech giants are really just middlemen, which of course poses its own problems of costs and reliance.

At least right now, it looks like this will be the new normal. Even hardware providers that haven't announced they are partnering with Amazon and/or Microsoft, like Xanadu, are in talks to do just that.

Overall at the event, excitement trumped uneasiness. If you're participating in a domain as nascent as quantum, you must be optimistic. The news this quarter all happened very quickly, but there is still a long road ahead. After all, these cloud services have only been announced. They still have to become available, gain exposure, pick up traction, become practical, prove useful, and so on.

The devil is in the details. How much are these cloud services for quantum going to cost? Amazon and Microsoft haven't said. When exactly will they be available in preview or in beta? Amazon and Microsoft haven't said. How will switching between different quantum processors work in practice? Amazon and Microsoft haven't said.

One thing is clear. Everyone at the event was talking about the impact of the two biggest cloud providers offering quantum hardware from different companies. The clear winners? Amazon and Microsoft.

ProBeat is a column in which Emil rants about whatever crosses him that week.

Read the rest here:

ProBeat: AWS and Azure are generating uneasy excitement in quantum computing - VentureBeat


Quantum Computers Are the Ultimate Paper Tiger – The National Interest Online

Google announced this fall, to much fanfare, that it had demonstrated quantum supremacy: that is, it performed a specific quantum computation far faster than the best classical computers could achieve. IBM promptly critiqued the claim, saying that its own classical supercomputer could perform the computation at nearly the same speed with far greater fidelity and that, therefore, the Google announcement should be taken with a large dose of skepticism.

This wasn't the first time someone cast doubt on quantum computing. Last year, Michel Dyakonov, a theoretical physicist at the University of Montpellier in France, offered a slew of technical reasons why practical quantum supercomputers will never be built in an article in IEEE Spectrum, the flagship journal of electrical and computer engineering.

So how can you make sense of what is going on?

As someone who has worked on quantum computing for many years, I believe that due to the inevitability of random errors in the hardware, useful quantum computers are unlikely to ever be built.

Whats a quantum computer?

To understand why, you need to understand how quantum computers work, since they're fundamentally different from classical computers.

A classical computer uses 0s and 1s to store data. These numbers could be voltages on different points in a circuit. But a quantum computer works on quantum bits, also known as qubits. You can picture them as waves that are associated with amplitude and phase.

Qubits have special properties: They can exist in superposition, where they are both 0 and 1 at the same time, and they may be entangled so they share physical properties even though they may be separated by large distances. It's a behavior that does not exist in the world of classical physics. The superposition vanishes when the experimenter interacts with the quantum state.

Due to superposition, a quantum computer with 100 qubits can represent 2^100 solutions simultaneously. For certain problems, this exponential parallelism can be harnessed to create a tremendous speed advantage. Some code-breaking problems could be solved exponentially faster on a quantum machine, for example.

There is another, narrower approach to quantum computing called quantum annealing, where qubits are used to speed up optimization problems. D-Wave Systems, based in Canada, has built optimization systems that use qubits for this purpose, but critics also claim that these systems are no better than classical computers.

Regardless, companies and countries are investing massive amounts of money in quantum computing. China has developed a new quantum research facility worth US$10 billion, while the European Union has developed a €1 billion ($1.1 billion) quantum master plan. The United States National Quantum Initiative Act provides $1.2 billion to promote quantum information science over a five-year period.

Breaking encryption algorithms is a powerful motivating factor for many countries if they could do it successfully, it would give them an enormous intelligence advantage. But these investments are also promoting fundamental research in physics.

Many companies are pushing to build quantum computers, including Intel and Microsoft in addition to Google and IBM. These companies are trying to build hardware that replicates the circuit model of classical computers. However, current experimental systems have less than 100 qubits. To achieve useful computational performance, you probably need machines with hundreds of thousands of qubits.

Noise and error correction

The mathematics that underpins quantum algorithms is well established, but there are daunting engineering challenges that remain.

For computers to function properly, they must correct all small random errors. In a quantum computer, such errors arise from the non-ideal circuit elements and the interaction of the qubits with the environment around them. For these reasons the qubits can lose coherency in a fraction of a second and, therefore, the computation must be completed in even less time. If random errors, which are inevitable in any physical system, are not corrected, the computer's results will be worthless.

In classical computers, small noise is corrected by taking advantage of a concept known as thresholding. It works like the rounding of numbers. Thus, in the transmission of integers where it is known that the error is less than 0.5, if what is received is 3.45, the received value can be corrected to 3.

Further errors can be corrected by introducing redundancy. Thus if 0 and 1 are transmitted as 000 and 111, then at most one bit-error during transmission can be corrected easily: A received 001 would be interpreted as 0, and a received 101 would be interpreted as 1.
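The two classical techniques described above, thresholding and the threefold repetition code, can be sketched in a few lines. This is an illustrative toy, not production error-correction code:

```python
def encode(bit):
    """Encode one bit with threefold repetition: 0 -> [0,0,0], 1 -> [1,1,1]."""
    return [bit] * 3

def decode(received):
    """Majority vote: corrects any single bit flip in the codeword."""
    return 1 if sum(received) >= 2 else 0

# Thresholding: if the error is known to be less than 0.5, rounding
# recovers the transmitted integer.
assert round(3.45) == 3

# Repetition: a single flipped bit is corrected by the majority vote.
assert decode([0, 0, 1]) == 0   # one error, corrected back to 0
assert decode([1, 0, 1]) == 1   # one error, corrected back to 1
assert decode(encode(1)) == 1   # error-free round trip
```

Two simultaneous flips would defeat the majority vote, which is why real codes use more redundancy; the quantum codes discussed next face the extra obstacle that qubits cannot simply be copied.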

Quantum error correction codes are a generalization of the classical ones, but there are crucial differences. For one, the unknown qubits cannot be copied to incorporate redundancy as an error correction technique. Furthermore, errors present within the incoming data before the error-correction coding is introduced cannot be corrected.

Quantum cryptography

While the problem of noise is a serious challenge in the implementation of quantum computers, it isn't so in quantum cryptography, where people are dealing with single qubits, for single qubits can remain isolated from the environment for a significant amount of time. Using quantum cryptography, two users can exchange the very large numbers known as keys, which secure data, without anyone being able to break the key exchange system. Such key exchange could help secure communications between satellites and naval ships. But the actual encryption algorithm used after the key is exchanged remains classical, and therefore the encryption is theoretically no stronger than classical methods.
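The key exchange described above is commonly done with the BB84 protocol. The following is a minimal toy simulation under idealized assumptions (no eavesdropper, no channel noise): Alice encodes random bits in random bases, Bob measures in random bases, and the two keep only the positions where their bases happened to match.

```python
import random

def bb84_sift(n_bits, seed=0):
    """Toy BB84 sketch: returns Alice's and Bob's sifted keys.

    Illustrative only — no eavesdropping, no noise, no authentication.
    """
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [rng.randint(0, 1) for _ in range(n_bits)]  # 0 = rectilinear, 1 = diagonal
    bob_bases   = [rng.randint(0, 1) for _ in range(n_bits)]

    # When the bases match, Bob reads Alice's bit exactly; when they
    # differ, his measurement result is random and will be discarded.
    bob_bits = [bit if ab == bb else rng.randint(0, 1)
                for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

    key_alice = [bit for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
    key_bob   = [bit for bit, ab, bb in zip(bob_bits,  alice_bases, bob_bases) if ab == bb]
    return key_alice, key_bob

ka, kb = bb84_sift(64)
assert ka == kb   # without an eavesdropper, the sifted keys agree
print(len(ka), "shared key bits sifted from 64 transmitted qubits")
```

In the real protocol, an eavesdropper measuring in the wrong basis disturbs the qubits, and the resulting error rate is what reveals the intrusion; this sketch omits that step along with the classical authentication the article goes on to discuss.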

Quantum cryptography is being commercially used in a limited sense for high-value banking transactions. But because the two parties must be authenticated using classical protocols, and since a chain is only as strong as its weakest link, it's not that different from existing systems. Banks are still using a classical-based authentication process, which itself could be used to exchange keys without loss of overall security.

Quantum cryptography technology must shift its focus to quantum transmission of information if it's going to become significantly more secure than existing cryptography techniques.

Commercial-scale quantum computing challenges

While quantum cryptography holds some promise if the problems of quantum transmission can be solved, I doubt the same holds true for generalized quantum computing. Error correction, which is fundamental to a multi-purpose computer, is such a significant challenge in quantum computers that I don't believe they'll ever be built at a commercial scale.


Subhash Kak, Regents Professor of Electrical and Computer Engineering, Oklahoma State University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Image: Reuters

See more here:

Quantum Computers Are the Ultimate Paper Tiger - The National Interest Online


Quantum expert Robert Sutor explains the basics of Quantum Computing – Packt Hub

What if we could do chemistry inside a computer instead of in a test tube or beaker in the laboratory? What if running a new experiment was as simple as running an app and having it completed in a few seconds?

For this to really work, we would want it to happen with complete fidelity. The atoms and molecules as modeled in the computer should behave exactly like they do in the test tube. The chemical reactions that happen in the physical world would have precise computational analogs. We would need a completely accurate simulation.

If we could do this at scale, we might be able to compute the molecules we want and need.

These might be for new materials for shampoos or even alloys for cars and airplanes. Perhaps we could more efficiently discover medicines that are customized to your exact physiology. Maybe we could get a better insight into how proteins fold, thereby understanding their function, and possibly creating custom enzymes to positively change our body chemistry.

Is this plausible? We have massive supercomputers that can run all kinds of simulations. Can we model molecules in the above ways today?

This article is an excerpt from the book Dancing with Qubits written by Robert Sutor. Robert helps you understand how quantum computing works and delves into the math behind it with this quantum computing textbook.

Let's start with C8H10N4O2, or 1,3,7-trimethylxanthine.

This is a very fancy name for a molecule that millions of people around the world enjoy every day: caffeine. An 8-ounce cup of coffee contains approximately 95 mg of caffeine, and this translates to roughly 2.95 × 10^20 molecules. Written out, this is

295,000,000,000,000,000,000 molecules.
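The count can be sanity-checked with a line of arithmetic: divide the mass by caffeine's molar mass (about 194.19 g/mol) and multiply by Avogadro's number.

```python
# Sanity check on the molecule count for 95 mg of caffeine.
molar_mass = 194.19        # g/mol for C8H10N4O2 (caffeine)
avogadro = 6.022e23        # molecules per mole

molecules = (0.095 / molar_mass) * avogadro   # 95 mg = 0.095 g
assert abs(molecules / 2.95e20 - 1) < 0.01    # agrees with ~2.95 × 10^20
```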

A 12-ounce can of a popular cola drink has 32 mg of caffeine, the diet version has 42 mg, and energy drinks often have about 77 mg.

These numbers are large because we are counting physical objects in our universe, which we know is very big. Scientists estimate, for example, that there are between 10^49 and 10^50 atoms in our planet alone.

To put these values in context, one thousand = 10^3, one million = 10^6, one billion = 10^9, and so on. A gigabyte of storage is one billion bytes, and a terabyte is 10^12 bytes.

Getting back to the question I posed at the beginning of this section, can we model caffeine exactly on a computer? We don't have to model the huge number of caffeine molecules in a cup of coffee, but can we fully represent a single molecule at a single instant?

Caffeine is a small molecule and contains protons, neutrons, and electrons. In particular, if we just look at the energy configuration that determines the structure of the molecule and the bonds that hold it all together, the amount of information to describe this is staggering. Specifically, the number of bits, the 0s and 1s, needed is approximately 10^48:

1,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000.

And this is just one molecule! Yet somehow nature manages to deal quite effectively with all this information. It handles everything from the single caffeine molecule, to all those in your coffee, tea, or soft drink, to every other molecule that makes up you and the world around you.

How does it do this? We don't know! Of course, there are theories, and these live at the intersection of physics and philosophy. However, we do not need to understand it fully to try to harness its capabilities.

We have no hope of providing enough traditional storage to hold this much information. Our dream of exact representation appears to be dashed. This is what Richard Feynman meant in his quote: "Nature isn't classical."

However, 160 qubits (quantum bits) could hold 2^160 ≈ 1.46 × 10^48 bits while the qubits were involved in a computation. To be clear, I'm not saying how we would get all the data into those qubits, and I'm also not saying how many more we would need to do something interesting with the information. It does give us hope, however.
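The comparison between 2^160 and the 10^48 bits quoted above is easy to verify directly, since Python handles arbitrarily large integers:

```python
# 160 qubits have a state space of 2**160 basis states; compare that
# with the ~10**48 bits needed for the caffeine configuration above.
n = 2**160
print(n)   # a 49-digit number

assert n > 10**48                       # exceeds the 10^48 bits required
assert abs(n / 10**48 - 1.46) < 0.01    # 2^160 ≈ 1.46 × 10^48
```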

In the classical case, we will never fully represent the caffeine molecule. In the future, with enough very high-quality qubits in a powerful quantum computing system, we may be able to perform chemistry on a computer.

I can write a little app on a classical computer that can simulate a coin flip. This might be for my phone or laptop.

Instead of heads or tails, let's use 1 and 0. The routine, which I call R, starts with one of those values and randomly returns one or the other. That is, 50% of the time it returns 1 and 50% of the time it returns 0. We have no knowledge whatsoever of how R does what it does.

When you see R, think random. This is called a fair flip. It is not weighted to slightly prefer one result over the other. Whether we can produce a truly random result on a classical computer is another question. Let's assume our app is fair.

If I apply R to 1, half the time I expect 1 and the other half 0. The same is true if I apply R to 0. I'll call these applications R(1) and R(0), respectively.

If I look at the result of R(1) or R(0), there is no way to tell if I started with 1 or 0. This is just like a secret coin flip, where I can't tell whether I began with heads or tails just by looking at how the coin has landed. By secret coin flip, I mean that someone else has flipped it and I can see the result, but I have no knowledge of the mechanics of the flip itself or the starting state of the coin.

If R(1) and R(0) are randomly 1 and 0, what happens when I apply R twice?

I write this as R(R(1)) and R(R(0)). It's the same answer: a random result with an equal split. The same thing happens no matter how many times we apply R. The result is random, and we can't reverse things to learn the initial value.

The quantum analog of R is an operation I'll call H; unlike R, applying H twice brings you back to the value you started with. There is a catch, though. You are not allowed to look at the result of what H does if you want to reverse its effect. If you apply H to 0 or 1, peek at the result, and apply H again to that, it is the same as if you had used R. If you observe what is going on in the quantum case at the wrong time, you are right back at strictly classical behavior.

To summarize using the coin language: if you flip a quantum coin and then don't look at it, flipping it again will return the heads or tails with which you started. If you do look, you get classical randomness.
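The quantum coin flip H is, in standard quantum computing terms, the Hadamard gate, and its reversibility is a small matrix fact that can be checked numerically. A minimal sketch with NumPy:

```python
import numpy as np

# The Hadamard gate: the quantum "coin flip". Unlike the classical
# randomizer R, it is its own inverse — as long as no one measures
# the state in between the two applications.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

zero = np.array([1.0, 0.0])   # the state |0>
once = H @ zero               # an equal superposition of 0 and 1
twice = H @ once              # back to |0> exactly

assert np.allclose(H @ H, np.eye(2))   # applying H twice undoes it
assert np.allclose(twice, zero)
print(np.abs(once) ** 2)   # measuring after one flip: 50/50 odds
```

Peeking in between corresponds to a measurement that collapses `once` to 0 or 1, after which the second H produces a random result, exactly the R behavior described above.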

A second area where quantum is different is in how we can work with simultaneous values. Your phone or laptop uses bytes as individual units of memory or storage. That's where we get phrases like megabyte, which means one million bytes of information.

A byte is further broken down into eight bits, which we've seen before. Each bit can be a 0 or 1. Doing the math, each byte can represent 2^8 = 256 different numbers composed of eight 0s or 1s, but it can only hold one value at a time. Eight qubits can represent all 256 values at the same time.
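This byte-versus-qubits contrast can be made concrete. A byte holds one of 256 values; the state vector of eight qubits, after a Hadamard on each, carries equal nonzero amplitude on all 256 basis states at once. A small NumPy sketch:

```python
import numpy as np

n = 8  # one byte's worth of bits, or eight qubits

# Classical byte: exactly one value at a time, out of 2**n choices.
byte_value = 0b10110010
assert 0 <= byte_value < 2**n

# Eight qubits in uniform superposition: 2**n amplitudes, all nonzero.
state = np.full(2**n, 1 / np.sqrt(2**n))
assert len(state) == 256
assert np.isclose(np.sum(state**2), 1.0)   # probabilities sum to 1
print(state[0] ** 2)   # each outcome has probability 1/256
```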

This is through superposition, but also through entanglement, the way we can tightly tie together the behavior of two or more qubits. This is what gives us the (literally) exponential growth in the amount of working memory.

Artificial intelligence and one of its subsets, machine learning, are extremely broad collections of data-driven techniques and models. They are used to help find patterns in information, learn from the information, and automatically perform more intelligently. They also give humans help and insight that might have been difficult to get otherwise.

Here is a way to start thinking about how quantum computing might be applicable to large, complicated, computation-intensive systems of processes such as those found in AI and elsewhere, ranging across the small, medium, and large ways quantum computing might complement classical techniques.

As I write this, quantum computers are not big data machines. This means you cannot take millions of records of information and provide them as input to a quantum calculation. Instead, quantum may be able to help where the number of inputs is modest but the computations blow up as you start examining relationships or dependencies in the data.

In the future, however, quantum computers may be able to input, output, and process much more data. Even if it is just theoretical now, it makes sense to ask if there are quantum algorithms that can be useful in AI someday.

To summarize, we explored how quantum computing works and different applications of artificial intelligence in quantum computing.

Get this quantum computing book Dancing with Qubits by Robert Sutor today where he has explored the inner workings of quantum computing. The book entails some sophisticated mathematical exposition and is therefore best suited for those with a healthy interest in mathematics, physics, engineering, and computer science.


Read the rest here:

Quantum expert Robert Sutor explains the basics of Quantum Computing - Packt Hub


Will quantum computing overwhelm existing security tech in the near future? – Help Net Security

More than half (54%) of cybersecurity professionals have expressed concerns that quantum computing will outpace the development of other security tech, according to research from Neustar.

Keeping a watchful eye on developments, 74% of organizations admitted to paying close attention to the technology's evolution, with 21% already experimenting with their own quantum computing strategies.

A further 35% of experts claimed to be in the process of developing a quantum strategy, while just 16% said they were not yet thinking about it. This shift in focus comes as the vast majority (73%) of cybersecurity professionals expect advances in quantum computing to overcome legacy technologies, such as encryption, within the next five years.

Almost all respondents (93%) believe the next-generation computers will overwhelm existing security technology, with just 7% under the impression that true quantum supremacy will never happen.

Despite expressing concerns that other technologies will be overshadowed, 87% of CISOs, CSOs, CTOs and security directors are excited about the potential positive impact of quantum computing. The remaining 13% were more cautious and under the impression that the technology would create more harm than good.

At the moment, we rely on encryption, which is possible to crack in theory, but impossible to crack in practice, precisely because it would take so long to do so, over timescales of trillions or even quadrillions of years, said Rodney Joffe, Chairman of NISC and Security CTO at Neustar.

Without the protective shield of encryption, a quantum computer in the hands of a malicious actor could launch a cyberattack unlike anything we've ever seen.

For both today's major attacks, and also the small-scale, targeted threats that we are seeing more frequently, it is vital that IT professionals begin responding to quantum immediately.

The security community has already launched a research effort into quantum-proof cryptography, but information professionals at every organization holding sensitive data should have quantum on their radar.

Quantum computing's ability to solve our great scientific and technological challenges will also be its ability to disrupt everything we know about computer security. Ultimately, IT experts of every stripe will need to work to rebuild the algorithms, strategies, and systems that form our approach to cybersecurity, added Joffe.

The report also highlighted a steep two-year increase on the International Cyber Benchmarks Index. Calculated based on changes in the cybersecurity landscape, including the impact of cyberattacks and the changing level of threat, November 2019 saw the highest score yet at 28.2. In November 2017, the benchmark sat at just 10.1, demonstrating an 18-point increase over the last couple of years.

During September and October 2019, security professionals ranked system compromise as the greatest threat to their organizations (22%), with DDoS attacks and ransomware following very closely behind (21%).

Go here to read the rest:

Will quantum computing overwhelm existing security tech in the near future? - Help Net Security


How quantum computing is set to impact the finance industry – IT Brief New Zealand

Attempting to explain quantum computing by comparing it with classical computing is like comparing the world wide web to a typewriter: there's simply next to no comparison.

That's not to say the typewriter doesn't have its own essential and commercially unique uses. It's just not the same.

However, explaining the enormous impact quantum computing could have, if it is successfully rolled out and becomes globally accessible, is a bit easier.

Archer Materials Limited (ASX:AXE) CEO Dr Mohammad Choucair outlined the impact quantum computing could have on the finance industry.

In an address to shareholders and academics, Dr Choucair noted that the global financial assets market is estimated to be worth trillions, and "I'm sure it comes as no surprise that any capability to optimise one's investment portfolio or capitalise on market volatility would be of great value to banks, governments and everyone in the audience."

Traders currently use algorithms to understand and, to a degree, predict the value movement in these markets. An accessible and operating quantum chip would provide immeasurable improvements to these algorithms, along with the machine learning that underpins them.

Archer is a materials technology-focused company that integrates the materials pulled from the ground with converging materials-based technologies that have the capability to impact global industries.

It could have an enormous impact on the computing and electric vehicle industries.

The potential for global consumer and business accessibility to quantum computing is the key differentiator between Archer Materials Ltd. and some of the other players in the market.

The companys 12CQ qubit, invented by Dr Choucair, is potentially capable of storing quantum information at room temperature.

As a result of this, the 12CQ chip could be thrown onto the motherboard of the everyday laptop, or tablet if you're tech-savvy, and operate in coexistence with a classical CPU.

This doesn't mean the everyday user can now go and live out a real-world, real-time simulation of The Matrix.

But it does mean that the laptop you have in your new, European leather tote could potentially perform extremely complex calculations to protect digital financial and communication transactions.

To head the progress of the 12CQ Project, Archer hired Dr Martin Fuechsle, a quantum physicist, who is by no means new to the high-performing Australian quantum tech industry.

In fact, Dr Fuechsle invented the world's first single-atom transistor and has over 10 years' experience in the design, fabrication and integration of quantum devices.

Archer has moved quickly over the last 12 months and landed some significant 12CQ milestones, including the first-stage assembly of the nanoscale qubit processor chip and the accurate positioning of the qubit componentry with nanoscale precision.

Both are key success factors for the commercial and technological readiness of the room-temperature chip.

Most recently, Archer announced the successful and scalable assembly of qubit array components of the 12CQ room-temperature qubit processor. Commenting on the success, Dr Choucair said: "This excellent achievement advances our chip technology development towards a minimum viable product and strengthens our commercial readiness by providing credibility to the claim of 12CQ chips being potentially scalable."

"To build an array of a few qubits in less than a year means we are well and truly on track in our development roadmap taking us into 2020."

The Archer team has commercial agreements in place with the University of Sydney to access the facilities it needs to build chip prototypes at the Research and Prototype Foundry within the world-class, $150 million purpose-built Sydney Nanoscience Hub facility.

Continue reading here:

How quantum computing is set to impact the finance industry - IT Brief New Zealand


Op Ed: Quantum Computing, Crypto Agility and Future Readiness – Bitcoin Magazine

Over the past few decades, we have seen almost unimaginable progress in computation speed and power. A watch today is a more powerful computer than the first Macintosh that my parents bought me in 1984 (I was very lucky). The weakest and lightest laptop today is more powerful than the computers that I programmed on during my undergraduate studies in university. Do you remember the days of computers with 64 kilobytes of RAM? Now we count in gigabytes and, soon, terabytes.

Yes, I know that I'm old (but at least I'm not reminiscing about punch cards and vacuum tubes), but that's not really the point. The point is to understand where all of these extremely fast advancements in computing power came from.

The answer is a combination of Moore's law (stating that the number of transistors on a chip doubles every two years, although this has now slowed down), together with many architectural improvements and optimizations by chip manufacturers. Despite this, the basic way that our most powerful computers work today is the same as in the 1970s and 1980s. Thus, although improvements are fast and impressive, they are all in the same playing field.

Quantum computing is a completely different ball game. Quantum computers work in a radically different way and could solve problems that classical computers won't be able to solve for hundreds of years, even if Moore's law continues. Stated differently, quantum computers don't follow the same rules as classical computing and are in a league of their own. This does not mean that quantum computers can solve all computationally hard problems. However, there are problems for which quantum computers are able to achieve extraordinary speedups.

Some of these problems are closely related to much of modern cryptography, and include the number factorization problem that lies at the core of the RSA cryptosystem, and the discrete log problem that lies at the core of Diffie-Hellman, ECDSA, EdDSA and other cryptosystems (as used in cryptocurrencies and blockchains).
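The link between factoring and RSA can be shown with a toy example. The primes below are deliberately tiny, so brute-force trial division stands in for the factoring step that Shor's algorithm on a large quantum computer would accelerate; real keys use primes hundreds of digits long, for which no classical method is feasible.

```python
# Toy RSA with tiny primes — illustration only, never use such sizes.
p, q = 61, 53
n = p * q                   # public modulus: 3233
e = 17                      # public exponent
phi = (p - 1) * (q - 1)     # computing phi requires knowing p and q
d = pow(e, -1, phi)         # private exponent (modular inverse, Python 3.8+)

message = 65
cipher = pow(message, e, n)           # encrypt with the public key
assert pow(cipher, d, n) == message   # decrypt with the private key

# An attacker who can factor n recovers phi and hence d. This trial
# division is the step a quantum computer would speed up exponentially.
for candidate in range(2, n):
    if n % candidate == 0:
        recovered_d = pow(e, -1, (candidate - 1) * (n // candidate - 1))
        break
assert recovered_d == d   # the private key falls out of the factorization
```

The discrete log problem underlying Diffie-Hellman and ECDSA has the same shape: easy to set up, classically hard to invert, and quantum-vulnerable via the same family of algorithms.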

The big question that still has not been answered, despite what you may have read, is whether or not such quantum computers will ever be built. I want to stress that this is still an "if" and not a "when." The fact that small quantum computers have been built does not mean that quantum computers at the scale and accuracy needed to break cryptography will ever be built. The problems that need to be overcome are considerable. I am not saying that I don't think they will succeed; I'm just saying that it's not a certainty.

The next big question is: when will such a computer, one powerful enough to break RSA or ECDSA, be built? Or, maybe more relevant: when do we have to start worrying about this possibility? I personally believe that this is many years away (I will say at least a decade, but I think it will be more like two decades at least).

Recently, Google's scientists hailed what they believe is the first demonstration of quantum supremacy. This was widely understood to mean that quantum computers are now already faster than classical ones. And if this is the case, then modern cryptography may be broken very soon, in contrast to the time span that I predicted above.

However, this claim by Google's scientists needs to be understood in context. "Quantum supremacy" is a technical term used by the academic community to mean the point when a quantum computer can do just one thing faster than a classical computer. However, this is really not what we think about when we hear "supremacy," nor is it really relevant to cryptography and other application domains. In particular, what we are really interested in knowing is when quantum computers will be able to solve hard, important problems faster than classical computers, and when quantum computers will be able to break cryptography.

Whether or not quantum supremacy was even demonstrated is not absolutely clear (see IBM's response). However, in any case, this quantum computation has no effect whatsoever on cryptography, blockchains and cryptocurrencies.

So, what does this mean concretely for us as a community? First, we should rest assured that the cryptographic world is getting ready for any eventuality. In particular, we already have good candidates for post-quantum secure public-key encryption and digital signature schemes, and NIST is working on standardization now. As such, we will not be surprised and unprepared if quantum computers that threaten our cryptographic infrastructure become close to reality.

This does not, however, mean that our actual products and software in use are ready for the post-quantum era, and this is often a really hard problem. The solution to this problem is called crypto agility, and it relates to the ease (or lack thereof) with which cryptosystems can be replaced in existing deployed systems.

There are two main aspects to crypto agility. The first is how easy it is to change code so that one cryptosystem is replaced with another. The more the specific structure of the cryptosystem is relied upon in the code, the harder it will be to replace. The second is how to make this change while preserving backward compatibility and without introducing new vulnerabilities that can arise when new and old versions operate concurrently.
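The first aspect, not relying on a cryptosystem's specific structure, usually comes down to coding against an interface rather than an algorithm. Here is a minimal sketch of that idea; the HMAC-based "signers" are stand-ins for real signature schemes (HMAC is a MAC, not a signature), and the names are hypothetical, but the shape is the point: a post-quantum scheme could slot in behind the same interface without touching call sites.

```python
import hmac
import hashlib
from typing import Protocol

class Signer(Protocol):
    """The interface application code depends on — never a concrete algorithm."""
    def sign(self, data: bytes) -> bytes: ...
    def verify(self, data: bytes, sig: bytes) -> bool: ...

class HmacSha256Signer:
    def __init__(self, key: bytes):
        self._key = key
    def sign(self, data: bytes) -> bytes:
        return hmac.new(self._key, data, hashlib.sha256).digest()
    def verify(self, data: bytes, sig: bytes) -> bool:
        return hmac.compare_digest(self.sign(data), sig)

class HmacSha3Signer(HmacSha256Signer):
    """A "new algorithm" dropped in behind the unchanged interface."""
    def sign(self, data: bytes) -> bytes:
        return hmac.new(self._key, data, hashlib.sha3_256).digest()

def release_artifact(signer: Signer, payload: bytes) -> bytes:
    # Application code names only the interface, so swapping the
    # cryptosystem requires no change here — that is crypto agility.
    return signer.sign(payload)

old, new = HmacSha256Signer(b"key"), HmacSha3Signer(b"key")
payload = b"firmware-v2"
assert old.verify(payload, release_artifact(old, payload))
assert new.verify(payload, release_artifact(new, payload))
```

The second aspect, backward compatibility, would show up here as versioned signatures and a transition window where verifiers accept both schemes, which is exactly where the new-vulnerability risk the author mentions tends to creep in.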

These are (security) software engineering considerations, and there is no general right answer. However, asking your software team what the cost would be to swap out their crypto is a really important first step.

The good thing about becoming more crypto-agile is that, even if the threat of quantum computing to cryptography never eventuates, it is still a good investment. Cryptosystems, key sizes, modes of operation and more change over time. This is a fact of life and will not change. Being more crypto-agile will enable you to respond faster to such changes and to be ahead of the market when new cryptography is introduced (whether it be for classic security systems or for cryptocurrencies and blockchains). That is always a good thing!

This is an op ed contribution by Professor Yehuda Lindell. Views expressed are his own and do not necessarily reflect those of Bitcoin Magazine or BTC Inc.

Read the original here:

Op Ed: Quantum Computing, Crypto Agility and Future Readiness - Bitcoin Magazine
