

Category Archives: Quantum Computing

House Introduces the Advancing Quantum Computing Act – Lexology

On May 19, 2020, Representative Morgan Griffith (R-VA-9) introduced the Advancing Quantum Computing Act (AQCA), which would require the Secretary of Commerce to conduct a study on quantum computing. "We can't depend on other countries . . . to guarantee American economic leadership, shield our stockpile of critical supplies, or secure the benefits of technological progress to our people," Representative Griffith explained. "It is up to us to do that."

Quantum computers use the science underlying quantum mechanics to store data and perform computations. The properties of quantum mechanics are expected to enable such computers to outperform traditional computers on a multitude of metrics. As such, there are many promising applications, from simulating the behavior of matter to accelerating the development of artificial intelligence. Several companies have started exploring the use of quantum computing to develop new drugs, improve the performance of batteries, and optimize transit routing to minimize congestion.

In addition to the National Quantum Initiative Act passed in 2018, the introduction of the AQCA represents another important, albeit preliminary, step for Congress in helping to shape the growth and development of quantum computing in the United States. It signals Congress's continuing interest in developing a national strategy for the technology.

Overall, the AQCA would require the Secretary of Commerce to conduct four categories of studies related to the impact of quantum computing; these are enumerated in the original post linked below.

Original post:
House Introduces the Advancing Quantum Computing Act - Lexology


The Role of Quantum Computing in Online Education – MarketScale

On this episode of the MarketScale Online Learning Minute, host Brian Runo dives into how quantum computing, the next revolutionary leap forward in computing, could apply to online education.

In particular, it could be used to put connectivism theory into practice and provide personalized learning for each individual, as it's not restricted by the capacity of an individual instructor.

In this way, each learner can be empowered to learn at their own pace and be presented with materials more tailored to them in real-time.

In fact, quantum computing is so revolutionary that the education world likely can't yet imagine the innovations it will enable.

For the latest news, videos, and podcasts in the Education Technology Industry, be sure to subscribe to our industry publication.


Link:
The Role of Quantum Computing in Online Education - MarketScale


The University of New Mexico Becomes IBM Q Hub’s First University Member – HPCwire

May 28, 2020 – Under the direction of Michael Devetsikiotis, chair of the Department of Electrical and Computer Engineering (ECE), The University of New Mexico recently joined the IBM Q Hub at North Carolina State University as its first university member.

The NC State IBM Q Hub is a cloud-based quantum computing hub, one of six worldwide and the first in North America to be part of the global IBM Q Network. This global network links national laboratories, tech startups, Fortune 500 companies, and research universities, providing access to IBM's largest quantum computing systems.

Mainstream computer processors inside our laptops, desktops, and smartphones manipulate bits, information that can only exist as either a 1 or a 0. In other words, the computers we are used to function through programming, which dictates a series of commands with choices restricted to yes/no or "if this, then that." Quantum computers, on the other hand, process quantum bits, or qubits, which are not restricted to a binary choice. Quantum computers can choose "if this, then that," or both, through complex physics concepts such as quantum entanglement. This allows quantum computers to process information more quickly, and in unique ways, compared to conventional computers.
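To make the contrast concrete, here is a minimal sketch in plain Python and NumPy (not any particular quantum SDK) of a classical bit next to a qubit placed in an equal superposition:

```python
import numpy as np

bit = 0  # a classical bit holds exactly one value: 0 or 1, never both

# A qubit is a length-2 complex vector of amplitudes for the states |0> and |1>.
zero = np.array([1.0, 0.0], dtype=complex)                   # the state |0>
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard gate
qubit = H @ zero                                             # superposition of |0> and |1>

# Measuring yields 0 or 1 with probabilities given by squared amplitudes.
print(np.abs(qubit) ** 2)  # [0.5 0.5] -- "both 0 and 1" until measured
```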

Access to systems such as IBM's newly announced 53-qubit processor (as well as several 20-qubit machines) is just one of the many benefits of UNM's participation in the IBM Q Hub when it comes to data analysis and algorithm development for quantum hardware. Quantum knowledge will only grow with time, and the IBM Q Hub will provide unique training and research opportunities for UNM faculty and student researchers for years to come.

How did this partnership come to be? Two years ago, a sort of call to arms was sent out among UNM quantum experts, saying now was the time for big ideas because federal support for quantum research was gaining traction. Devetsikiotis' vision was to create a quantum ecosystem, one that could unite the foundational quantum research in physics at UNM's Center for Quantum Information and Control (CQuIC) with new quantum computing and engineering initiatives for solving big real-world mathematical problems.

"At first, I thought [quantum] was something for physicists," explains Devetsikiotis. "But I realized it's a great opportunity for the ECE department to develop real engineering solutions to these real-world problems."

CQuIC is the foundation of UNM's long-standing involvement in quantum research, resulting in participation in the National Quantum Initiative (NQI) passed by Congress in 2018 to support multidisciplinary research and training in quantum information science. UNM has been a pioneer in quantum information science since the field emerged 25 years ago, as CQuIC Director Ivan Deutsch knows first-hand.

"This is a very vibrant time in our field, moving from physics to broader activities," says Deutsch, "and [Devetsikiotis] has seen this as a real growth area, connecting engineering with the existing strengths we have in the CQuIC."

With strategic support from the Office of the Vice President for Research, Devetsikiotis secured National Science Foundation funding to support a Quantum Computing & Information Science (QCIS) faculty fellow. The faculty member will join the Department of Electrical and Computer Engineering with the goal of uniting well-established quantum research in physics with new quantum education and research initiatives in engineering. This includes membership in CQuIC and implementation of the IBM Q Hub program, as well as a partnership with Los Alamos National Lab for a Quantum Computing Summer School to develop new curricula, educational materials, and mentorship of next-generation quantum computing and information scientists.

As part of the Q Hub at NC State, UNM gains access to IBM's largest quantum computing systems for commercial use cases and fundamental research. It also allows for the restructuring of existing quantum courses to be more hands-on and interdisciplinary than they have been in the past, as well as the creation of new courses, a new master's degree program in QCIS, and a new university-wide Ph.D. concentration in QCIS that can be added to several departments including ECE, Computer Science, Physics and Astronomy, and Chemistry.

"There's been a lot of challenges," Devetsikiotis says, "but there has also been a lot of good timing, and thankfully the University has provided support for us. UNM has solidified our seat at the quantum table and can now bring in the industrial side."

For additional graphics and the full announcement, visit https://news.unm.edu/news/the-university-of-new-mexico-becomes-ibm-q-hubs-first-university-member

Source: Natalie Rogers, University of New Mexico

View original post here:
The University of New Mexico Becomes IBM Q Hub's First University Member - HPCwire


What’s New in HPC Research: Astronomy, Weather, Security & More – HPCwire

In this bimonthly feature, HPCwire highlights newly published research in the high-performance computing community and related domains. From parallel programming to exascale to quantum computing, the details are here.

Developing the HPC system for the ASKAP telescope

The Australian Square Kilometre Array Pathfinder (ASKAP) telescope (itself a pilot project for the record-setting Square Kilometre Array planned for construction in the coming years) will enable highly sensitive radio astronomy that produces a tremendous amount of data. In this paper, researchers from the Commonwealth Scientific and Industrial Research Organisation (CSIRO) highlight how they are preparing a dedicated HPC platform, called ASKAPsoft, to handle the expected 5 PB/year of data produced by ASKAP.

Authors: Juan C. Guzman, Eric Bastholm, Wasim Raja, Matthew Whiting, Daniel Mitchell, Stephen Ord and Max Voronkov.
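For a rough sense of scale, a quick back-of-the-envelope check (assuming decimal petabytes and a steady, year-round flow) shows the sustained ingest rate that 5 PB/year implies:

```python
PB = 10**15                       # assumption: decimal petabytes
seconds_per_year = 365 * 24 * 3600

rate = 5 * PB / seconds_per_year  # average bytes per second
print(f"~{rate / 10**6:.0f} MB/s sustained")  # ~159 MB/s, around the clock
```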

Creating an open infrastructure for sharing and reusing HPC knowledge

In an expert field like HPC, institutional memory and information-sharing are crucial for maintaining and building on expertise, but institutions often lack cohesive infrastructures to perpetuate that knowledge. These authors, a team from North Carolina State University and Lawrence Livermore National Laboratory, introduce OpenK, an open, ontology-based infrastructure aimed at facilitating the accumulation, sharing and reuse of HPC knowledge.

Authors: Yue Zhao, Xipeng Shen and Chunhua Liao.

Using high-performance data analysis to facilitate HPC-powered astrophysics

High-performance data analysis (HPDA) is an emerging tool for scientific disciplines like bioscience, climate science and security, and now it's being used to prepare astrophysics research for exascale. In this paper, written by a team from the Astronomical Observatory of Trieste, Italy, the authors discuss the ExaNeSt and EuroExa projects, which built a prototype of a low-power exascale facility for HPDA and astrophysics.

Authors: Giuliano Taffoni, David Goz, Luca Tornatore, Marco Frailis, Gianmarco Maggio and Fabio Pasian.

Using power analysis to identify HPC activity

"Monitoring users on large computing platforms such as [HPC] and cloud computing systems," write these authors, a duo from Lawrence Berkeley National Laboratory, "is non-trivial." Users can (and have) abused access to HPC systems, they say, but process viewers and other monitoring tools can impose substantial overhead. To that end, they introduce a technique for identifying running programs with 97% accuracy using just the system's power consumption.

Authors: Bogdan Copos and Sean Peisert.
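The paper's actual method isn't reproduced here, but the general idea, telling programs apart from summary features of their power traces, can be sketched with a simple nearest-centroid classifier on made-up data:

```python
import numpy as np

def features(trace):
    """Reduce a raw power trace (watts over time) to a small feature vector."""
    return np.array([trace.mean(), trace.std(), trace.max() - trace.min()])

rng = np.random.default_rng(0)
# Hypothetical labelled training traces for two known programs.
train = {
    "matrix_mult": [rng.normal(220, 12, 1000) for _ in range(20)],
    "graph_search": [rng.normal(150, 30, 1000) for _ in range(20)],
}
centroids = {name: np.mean([features(t) for t in traces], axis=0)
             for name, traces in train.items()}

def identify(trace):
    """Label an unknown trace with the nearest program centroid."""
    f = features(trace)
    return min(centroids, key=lambda name: np.linalg.norm(f - centroids[name]))

print(identify(rng.normal(215, 13, 1000)))  # -> matrix_mult
```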

Building resilience and fault tolerance in HPC for numerical weather and climate prediction

In numerical weather and climate prediction (NWP), accuracy depends strongly on available computing power, but the increasing number of cores in top systems is leading to a higher frequency of hardware and software failures for NWP simulations. This report (from researchers at eight different institutions) examines approaches for fault tolerance in numerical algorithms and system resilience in parallel simulations for those NWP tools.

Authors: Tommaso Benacchio, Luca Bonaventura, Mirco Altenbernd, Chris D. Cantwell, Peter D. Düben, Mike Gillard, Luc Giraud, Dominik Göddeke, Erwan Raffin, Keita Teranishi and Nils Wedi.

Pioneering the exascale era with astronomy

Another team, this time from SURF, a collaborative organization for Dutch research, also investigated the intersection of astronomy and the exascale era. This paper, written by three researchers from SURF, highlights a new, OpenStack-based cloud infrastructure layer and Spider, a new addition to SURF's high-throughput data processing platform. The authors explore how these additions help to prepare the astronomical research community for the exascale era, in particular with regard to data-intensive experiments like the Square Kilometre Array.

Authors: J. B. R. Oonk, C. Schrijvers and Y. van den Berg.

Enabling EASEY deployment of containerized applications for future HPC systems

As the exascale era approaches, HPC systems are growing in complexity, improving performance but making the systems less accessible for new users. These authors, a duo from the Ludwig Maximilian University of Munich, propose a support framework for these future HPC architectures called EASEY (for Enable exAScale for EverYone) that "can automatically deploy optimized container computations with negligible overhead[.]"

Authors: Maximilian Höb and Dieter Kranzlmüller.

Do you know about research that should be included in next month's list? If so, send us an email at [emailprotected]. We look forward to hearing from you.

Original post:
What's New in HPC Research: Astronomy, Weather, Security & More - HPCwire


What Is the Many-Worlds Theory of Quantum Mechanics? – The Wire


Quantum physics is strange. At least, it is strange to us, because the rules of the quantum world, which govern the way the world works at the level of atoms and subatomic particles ("the behaviour of light and matter," as the renowned physicist Richard Feynman put it), are not the rules that we are familiar with: the rules of what we call common sense.

The quantum rules, which were mostly established by the end of the 1920s, seem to be telling us that a cat can be both alive and dead at the same time, while a particle can be in two places at once. But to the great distress of many physicists, let alone ordinary mortals, nobody (then or since) has been able to come up with a common-sense explanation of what is going on. More thoughtful physicists have sought solace in other ways, to be sure, namely coming up with a variety of more or less desperate remedies to explain what is going on in the quantum world.

These remedies, the quanta of solace, are called interpretations. At the level of the equations, none of these interpretations is better than any other, although the interpreters and their followers will each tell you that their own favored interpretation is the one true faith, and all those who follow other faiths are heretics. On the other hand, none of the interpretations is worse than any of the others, mathematically speaking. Most probably, this means that we are missing something. One day, a glorious new description of the world may be discovered that makes all the same predictions as present-day quantum theory, but also makes sense. Well, at least we can hope.

Meanwhile, I thought I might provide an agnostic overview of one of the more colorful of the hypotheses, the many-worlds, or multiple universes, theory. For overviews of the other five leading interpretations, I point you to my book, Six Impossible Things. I think you'll find that all of them are crazy, compared with common sense, and some are more crazy than others. But in this world, crazy does not necessarily mean wrong, and being more crazy does not necessarily mean more wrong.

If you have heard of the Many Worlds Interpretation (MWI), the chances are you think that it was invented by the American Hugh Everett in the mid-1950s. In a way that's true. He did come up with the idea all by himself. But he was unaware that essentially the same idea had occurred to Erwin Schrödinger half a decade earlier. Everett's version is more mathematical, Schrödinger's more philosophical, but the essential point is that both of them were motivated by a wish to get rid of the idea of the collapse of the wave function, and both of them succeeded.

Also read: If You Thought Quantum Mechanics Was Weird, Wait Till You Hear About Entangled Time

As Schrödinger used to point out to anyone who would listen, there is nothing in the equations (including his famous wave equation) about collapse. That was something that Bohr bolted on to the theory to explain why we only see one outcome of an experiment, a dead cat or a live cat, not a mixture, a superposition of states. But because we only detect one outcome, one solution to the wave function, that need not mean that the alternative solutions do not exist. In a paper he published in 1952, Schrödinger pointed out the ridiculousness of expecting a quantum superposition to collapse just because we look at it. It was, he wrote, "patently absurd" that the wave function should be "controlled in two entirely different ways, at times by the wave equation, but occasionally by direct interference of the observer, not controlled by the wave equation."
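For reference, the deterministic evolution law Schrödinger had in mind is his own time-dependent wave equation, which contains no collapse term of any kind:

```latex
% Time-dependent Schrödinger equation: the wave function \psi evolves
% smoothly and deterministically under the Hamiltonian \hat{H}.
i\hbar \, \frac{\partial \psi(\mathbf{r}, t)}{\partial t} = \hat{H} \, \psi(\mathbf{r}, t)
```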

Although Schrödinger himself did not apply his idea to the famous cat, it neatly resolves that puzzle. Updating his terminology, there are two parallel universes, or worlds, in one of which the cat lives, and in one of which it dies. When the box is opened in one universe, a dead cat is revealed. In the other universe, there is a live cat. But there always were two worlds that had been identical to one another until the moment when the diabolical device determined the fate of the cat(s). There is no collapse of the wave function. Schrödinger anticipated the reaction of his colleagues in a talk he gave in Dublin, where he was then based, in 1952. After stressing that when his eponymous equation seems to describe different possibilities, "they are not alternatives but all really happen simultaneously," he said:

Nearly every result [the quantum theorist] pronounces is about the probability of this or that or that happening, with usually a great many alternatives. The idea that they may not be alternatives but all really happen simultaneously seems lunatic to him, just impossible. He thinks that if the laws of nature took this form for, let me say, a quarter of an hour, we should find our surroundings rapidly turning into a quagmire, or sort of a featureless jelly or plasma, all contours becoming blurred, we ourselves probably becoming jelly fish. It is strange that he should believe this. For I understand he grants that unobserved nature does behave this way, namely according to the wave equation. The aforesaid alternatives come into play only when we make an observation, which need, of course, not be a scientific observation. Still it would seem that, according to the quantum theorist, nature is prevented from rapid jellification only by our perceiving or observing it; it is a strange decision.

In fact, nobody responded to Schrödinger's idea. It was ignored and forgotten, regarded as impossible. So Everett developed his own version of the MWI entirely independently, only for it to be almost as completely ignored. But it was Everett who introduced the idea of the Universe splitting into different versions of itself when faced with quantum choices, muddying the waters for decades.


Everett came up with the idea in 1955, when he was a PhD student at Princeton. In the original version of his idea, developed in a draft of his thesis, which was not published at the time, he compared the situation with an amoeba that splits into two daughter cells. If amoebas had brains, each daughter would remember an identical history up until the point of splitting, then have its own personal memories. In the familiar cat analogy, we have one universe, and one cat, before the diabolical device is triggered, then two universes, each with its own cat, and so on. Everett's PhD supervisor, John Wheeler, encouraged him to develop a mathematical description of his idea for his thesis, and for a paper published in the Reviews of Modern Physics in 1957, but along the way, the amoeba analogy was dropped and did not appear in print until later. But Everett did point out that since no observer would ever be aware of the existence of the other worlds, to claim that they cannot be there because we cannot see them is no more valid than claiming that the Earth cannot be orbiting around the Sun because we cannot feel the movement.

Also read: What Is Quantum Biology?

Everett himself never promoted the idea of the MWI. Even before he completed his PhD, he had accepted the offer of a job at the Pentagon working in the Weapons Systems Evaluation Group on the application of mathematical techniques (the innocently titled game theory) to secret Cold War problems (some of his work was so secret that it is still classified) and essentially disappeared from the academic radar. It wasn't until the late 1960s that the idea gained some momentum, when it was taken up and enthusiastically promoted by Bryce DeWitt, of the University of North Carolina, who wrote: "every quantum transition taking place in every star, in every galaxy, in every remote corner of the universe is splitting our local world on Earth into myriad copies of itself." This became too much for Wheeler, who backtracked from his original endorsement of the MWI, and in the 1970s said: "I have reluctantly had to give up my support of that point of view in the end because I am afraid it carries too great a load of metaphysical baggage." Ironically, just at that moment, the idea was being revived and transformed through applications in cosmology and quantum computing.


The power of the interpretation began to be appreciated even by people reluctant to endorse it fully. John Bell noted that "persons of course multiply with the world," and those in any particular branch would experience only what happens in that branch, and grudgingly admitted that there might be something in it:

The many worlds interpretation seems to me an extravagant, and above all an extravagantly vague, hypothesis. I could almost dismiss it as silly. And yet… It may have something distinctive to say in connection with the Einstein-Podolsky-Rosen puzzle, and it would be worthwhile, I think, to formulate some precise version of it to see if this is really so. And the existence of all possible worlds may make us more comfortable about the existence of our own world, which seems to be in some ways a highly improbable one.

The precise version of the MWI came from David Deutsch, in Oxford, and in effect put Schrödinger's version of the idea on a secure footing, although when he formulated his interpretation, Deutsch was unaware of Schrödinger's version. Deutsch worked with DeWitt in the 1970s, and in 1977 he met Everett at a conference organized by DeWitt, the only time Everett ever presented his ideas to a large audience. Convinced that the MWI was the right way to understand the quantum world, Deutsch became a pioneer in the field of quantum computing, not through any interest in computers as such, but because of his belief that the existence of a working quantum computer would prove the reality of the MWI.

This is where we get back to a version of Schrödinger's idea. In the Everett version of the cat puzzle, there is a single cat up to the point where the device is triggered. Then the entire Universe splits in two. Similarly, as DeWitt pointed out, an electron in a distant galaxy confronted with a choice of two (or more) quantum paths causes the entire Universe, including ourselves, to split. In the Deutsch-Schrödinger version, there is an infinite variety of universes (a Multiverse) corresponding to all possible solutions to the quantum wave function. As far as the cat experiment is concerned, there are many identical universes in which identical experimenters construct identical diabolical devices. These universes are identical up to the point where the device is triggered. Then, in some universes the cat dies, in some it lives, and the subsequent histories are correspondingly different. But the parallel worlds can never communicate with one another. Or can they?

Deutsch argues that when two or more previously identical universes are forced by quantum processes to become distinct, as in the experiment with two holes, there is a temporary interference between the universes, which becomes suppressed as they evolve. It is this interaction that causes the observed results of those experiments. His dream is to see the construction of an intelligent quantum machine, a computer that would monitor some quantum phenomenon involving interference going on within its "brain." Using a rather subtle argument, Deutsch claims that an intelligent quantum computer would be able to remember the experience of temporarily existing in parallel realities. This is far from being a practical experiment. But Deutsch also has a much simpler proof of the existence of the Multiverse.

What makes a quantum computer qualitatively different from a conventional computer is that the switches inside it exist in a superposition of states. A conventional computer is built up from a collection of switches (units in electrical circuits) that can be either on or off, corresponding to the digits 1 or 0. This makes it possible to carry out calculations by manipulating strings of numbers in binary code. Each switch is known as a bit, and the more bits there are, the more powerful the computer is. Eight bits make a byte, and computer memory today is measured in terms of billions of bytes: gigabytes, or GB. Strictly speaking, since we are dealing in binary, a gigabyte is 2^30 bytes, but that is usually taken as read. Each switch in a quantum computer, however, is an entity that can be in a superposition of states. These are usually atoms, but you can think of them as being electrons that are either spin up or spin down. The difference is that in the superposition, they are both spin up and spin down at the same time, both 0 and 1. Each switch is called a "qbit," pronounced "cubit."


Because of this quantum property, each qbit is equivalent to two bits. This doesn't look impressive at first sight, but it is. If you have three qbits, for example, they can be arranged in eight ways: 000, 001, 010, 011, 100, 101, 110, 111. The superposition embraces all these possibilities. So three qbits are not equivalent to six bits (2 x 3), but to eight bits (2 raised to the power of 3). The equivalent number of bits is always 2 raised to the power of the number of qbits. Just 10 qbits would be equivalent to 2^10 bits, actually 1,024, but usually referred to as a kilobit. Exponentials like this rapidly run away with themselves. A computer with just 300 qbits would be equivalent to a conventional computer with more bits than there are atoms in the observable Universe. How could such a computer carry out calculations? The question is more pressing since simple quantum computers, incorporating a few qbits, have already been constructed and shown to work as expected. They really are more powerful than conventional computers with the same number of bits.
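The scaling argument is easy to check for yourself; a few lines of Python reproduce the numbers quoted above:

```python
from itertools import product

# n qubits span 2**n basis states, so the "equivalent number of bits"
# grows exponentially with the number of qubits.
for n in (3, 10, 300):
    print(f"{n:>3} qubits -> 2**{n} = {2**n:.4g} basis states")
# 3 -> 8, 10 -> 1024, 300 -> ~2.037e+90 (more than atoms in the observable Universe)

# The eight arrangements of three qubits, exactly as listed in the text:
print(["".join(bits) for bits in product("01", repeat=3)])
# ['000', '001', '010', '011', '100', '101', '110', '111']
```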

Deutsch's answer is that the calculation is carried out simultaneously on identical computers in each of the parallel universes corresponding to the superpositions. For a three-qbit computer, that means eight superpositions of computer scientists working on the same problem using identical computers to get an answer. It is no surprise that they should collaborate in this way, since the experimenters are identical, with identical reasons for tackling the same problem. That isn't too difficult to visualize. But when we build a 300-qbit machine, which will surely happen, we will, if Deutsch is right, be involving a collaboration between more universes than there are atoms in our visible Universe. It is a matter of choice whether you think that is "too great a load of metaphysical baggage." But if you do, you will need some other way to explain why quantum computers work.

Also read: The Science and Chaos of Complex Systems

Most quantum computer scientists prefer not to think about these implications. But there is one group of scientists who are used to thinking of even more than six impossible things before breakfast: the cosmologists. Some of them have espoused the Many Worlds Interpretation as the best way to explain the existence of the Universe itself.

Their jumping-off point is the fact, noted by Schrödinger, that there is nothing in the equations referring to a collapse of the wave function. And they do mean the wave function: just one, which describes the entire world as a superposition of states, a Multiverse made up of a superposition of universes.


The first version of Everett's PhD thesis (later modified and shortened on the advice of Wheeler) was actually titled "The Theory of the Universal Wave Function." And by "universal" he meant literally that, saying:

Since the universal validity of the state function description is asserted, one can regard the state functions themselves as the fundamental entities, and one can even consider the state function of the whole universe. In this sense this theory can be called the theory of the universal wave function, since all of physics is presumed to follow from this function alone.

where for the present purpose "state function" is another name for "wave function." "All of physics" means everything, including us, the "observers" in physics jargon. Cosmologists are excited by this, not because they are included in the wave function, but because this idea of a single, uncollapsed wave function is the only way in which the entire Universe can be described in quantum mechanical terms while still being compatible with the general theory of relativity. In the short version of his thesis published in 1957, Everett concluded that his formulation of quantum mechanics "may therefore prove a fruitful framework for the quantization of general relativity." Although that dream has not yet been fulfilled, it has encouraged a great deal of work by cosmologists since the mid-1980s, when they latched on to the idea. But it does bring with it a lot of baggage.

The universal wave function describes the position of every particle in the Universe at a particular moment in time. But it also describes every possible location of those particles at that instant. And it also describes every possible location of every particle at any other instant of time, although the number of possibilities is restricted by the quantum graininess of space and time. Out of this myriad of possible universes, there will be many versions in which stable stars and planets, and people to live on those planets, cannot exist. But there will be at least some universes resembling our own, more or less accurately, in the way often portrayed in science fiction stories. Or, indeed, in other fiction. Deutsch has pointed out that according to the MWI, any world described in a work of fiction, provided it obeys the laws of physics, really does exist somewhere in the Multiverse. There really is, for example, a Wuthering Heights world (but not a Harry Potter world).

That isn't the end of it. The single wave function describes all possible universes at all possible times. But it doesn't say anything about changing from one state to another. Time does not flow. Sticking close to home, Everett's parameter, called a state vector, includes a description of a world in which we exist, and in which all the records of that world's history, from our memories, to fossils, to light reaching us from distant galaxies, exist. There will also be another universe exactly the same except that the time step has been advanced by, say, one second (or one hour, or one year).

But there is no suggestion that any universe moves along from one time step to another. There will be a "me" in this second universe, described by the universal wave function, who has all the memories I have at the first instant, plus those corresponding to a further second (or hour, or year, or whatever). But it is impossible to say that these versions of "me" are the same person. Different time states can be ordered in terms of the events they describe, defining the difference between past and future, but they do not change from one state to another. All the states just exist. Time, in the way we are used to thinking of it, does not flow in Everett's MWI.

John Gribbin is a Visiting Fellow in Astronomy at the University of Sussex, UK, and the author of In Search of Schrödinger's Cat, The Universe: A Biography and Six Impossible Things, from which this article is excerpted.

This article has been republished from The MIT Press Reader.

More here:
What Is the Many-Worlds Theory of Quantum Mechanics? - The Wire


WISeKey is Adapting its R&D and Extended Patents Portfolio to the Post-COVID 19 Economy with Specific Focus on Post-Quantum Cryptography -…


With more than 25% of its 2019 annual turnover invested in R&D, WISeKey is a significant and recognized contributor to digital trust in an interconnected world. The Company's recent publication and conference presentation on post-quantum cryptography illustrate once again that innovation is at the heart of the Company.

WISeKey is involved in the NIST PQC (Post-Quantum Cryptography) program with the sole objective of providing future-proof digital security solutions based on existing and new hardware architectures.

Geneva, Switzerland, May 28, 2020: WISeKey International Holding Ltd. ("WISeKey") (SIX: WIHN, NASDAQ: WKEY), a leading global cybersecurity and IoT company, today published a technical article (https://www.wisekey.com/articles-white-papers/) discussing how to guarantee digital security and protect against hackers who will take advantage of the power of quantum information science. This research was presented (video here: https://www.wisekey.com/videos/) during the remote International Workshop on Code-Based Cryptography (CBCrypto 2020, Zagreb, Croatia, May 9-10, 2020).

IoT products are a major component of the 4th industrial revolution, which brings together advances in computational power, semiconductors, blockchain, wireless communication, AI and data to build a vast technology infrastructure that works nearly autonomously.

According to a recent report published by Fortune Business Insights and titled "Internet of Things (IoT) Market Size, Share and Industry Analysis, By Platform (Device Management, Application Management, Network Management), By Software & Services (Software Solution, Services), By End-Use Industry (BFSI, Retail, Governments, Healthcare, Others) And Regional Forecast, 2019-2026," the IoT market was valued at USD 190.0 billion in 2018. It is projected to reach USD 1,102.6 billion by 2026, with a CAGR of 24.7% over the forecast period. Huge advances in manufacturing have allowed even small manufacturers to produce relatively sophisticated IoT products. This brings to the surface issues related to patents governing IoT products and communication standards governing devices.
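The two figures are mutually consistent: compounding USD 190.0 billion at 24.7% per year over the eight years from 2018 to 2026 lands close to the projected total, as a quick check shows (the small gap comes from rounding the CAGR):

```python
value_2018 = 190.0            # USD billions, from the report
cagr = 0.247                  # 24.7% compound annual growth rate
years = 2026 - 2018           # eight-year forecast period

projected = value_2018 * (1 + cagr) ** years
print(f"USD {projected:.1f} billion")  # ~1110.9, near the reported 1,102.6
```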

Studies of quantum computing, namely how to use quantum mechanical phenomena to perform computation, began in the early 1980s. The prospects are endless: future computers using this technology will gain incredible computing power. In the hands of hackers, such computers will become a risk to cybersecurity, because all the cryptographic algorithms used today to secure our digital world would be exposed. The US National Institute of Standards and Technology (NIST) therefore launched a broad campaign in 2016 to find new, resistant algorithms.

WISeKey's R&D department is deeply involved in this NIST PQC (Post-Quantum Cryptography) program, with the sole objective of providing the market with future-proof digital security solutions based on existing and new hardware architectures. The new article reports one of the Company's current contributions to this safer cyber future. ROLLO-I, a NIST-shortlisted algorithm, was implemented on some of WISeKey's secure chips (MS600x secure microcontrollers, VaultIC secure elements, and others), with countermeasures to make them robust against attacks.
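ROLLO-I's internals are not described in the press release, but like other NIST PQC candidates of its type it follows the standard key-encapsulation mechanism (KEM) flow. The sketch below uses hypothetical function names purely to illustrate that flow, not WISeKey's implementation:

```python
def keygen():
    """Return a (public_key, secret_key) pair. (Stub: ROLLO-I's real
    keygen derives keys from rank-metric error-correcting codes.)"""
    ...

def encapsulate(public_key):
    """Sender derives a fresh shared secret and returns it together with
    a ciphertext that encapsulates it under the receiver's public key."""
    ...

def decapsulate(ciphertext, secret_key):
    """Receiver recovers the same shared secret from the ciphertext."""
    ...

# The protocol flow every KEM follows:
#   receiver: pk, sk = keygen()
#   sender:   ct, ss_sender = encapsulate(pk)
#   receiver: ss_receiver = decapsulate(ct, sk)
# Both sides now hold ss_sender == ss_receiver, a shared symmetric key.
```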

Although nobody knows exactly when quantum computers will be massively available, this is certainly going to happen. WISeKey is investing significantly to develop new technologies and win this race.

"With a rich portfolio of more than 100 fundamental individual patents and 20 pending ones in various domains, including the design of secure chips, Near Field Communication (NFC), the development of security firmware and backend software, the secure management of data, the improvement of security protocols between connected objects and advanced cryptography, to mention a few, WISeKey has become a key technology provider in the cybersecurity arena," says Carlos Moreira, Founder and CEO of WISeKey. "This precious asset makes WISeKey the right Digital Trust Partner to deploy the current and future Internet of Everything."

Want to know more about WISeKey's Intellectual Properties? Please visit our website: https://www.wisekey.com/patents/.

About WISeKey

WISeKey (NASDAQ: WKEY; SIX Swiss Exchange: WIHN) is a leading global cybersecurity company currently deploying large-scale digital identity ecosystems for people and objects using Blockchain, AI and IoT, respecting the Human as the Fulcrum of the Internet. WISeKey microprocessors secure the pervasive computing shaping today's Internet of Everything. WISeKey IoT has an install base of over 1.5 billion microchips in virtually all IoT sectors (connected cars, smart cities, drones, agricultural sensors, anti-counterfeiting, smart lighting, servers, computers, mobile phones, crypto tokens, etc.). WISeKey is uniquely positioned to be at the edge of IoT, as our semiconductors produce a huge amount of Big Data that, when analyzed with Artificial Intelligence (AI), can help industrial applications to predict the failure of their equipment before it happens.

Our technology is trusted worldwide: the OISTE/WISeKey Swiss-based cryptographic Root of Trust (RoT) provides secure authentication and identification, in both physical and virtual environments, for the Internet of Things, Blockchain and Artificial Intelligence. The WISeKey RoT serves as a common trust anchor to ensure the integrity of online transactions among objects and between objects and people. For more information, visit www.wisekey.com.

Press and investor contacts:

Disclaimer: This communication expressly or implicitly contains certain forward-looking statements concerning WISeKey International Holding Ltd and its business. Such statements involve certain known and unknown risks, uncertainties and other factors, which could cause the actual results, financial condition, performance or achievements of WISeKey International Holding Ltd to be materially different from any future results, performance or achievements expressed or implied by such forward-looking statements. WISeKey International Holding Ltd is providing this communication as of this date and does not undertake to update any forward-looking statements contained herein as a result of new information, future events or otherwise. This press release does not constitute an offer to sell, or a solicitation of an offer to buy, any securities, and it does not constitute an offering prospectus within the meaning of article 652a or article 1156 of the Swiss Code of Obligations or a listing prospectus within the meaning of the listing rules of the SIX Swiss Exchange. Investors must rely on their own evaluation of WISeKey and its securities, including the merits and risks involved. Nothing contained herein is, or shall be relied on as, a promise or representation as to the future performance of WISeKey.

Originally posted here:
WISeKey is Adapting its R&D and Extended Patents Portfolio to the Post-COVID 19 Economy with Specific Focus on Post-Quantum Cryptography -...
