

Category Archives: Quantum Computing

New Quantum Switch Turns Metals Into Insulators by Altering the Quantum Nature of the Material – SciTechDaily

This is an artist's impression of the dissolving of the electronic traffic jam. The red atoms are different in their quantum nature and allow transport of electrons in their surroundings. Credit: SBQMI

Most modern electronic devices rely on tiny, finely-tuned electrical currents to process and store information. These currents dictate how fast our computers run, how regularly our pacemakers tick and how securely our money is stored in the bank.

In a study published in Nature Physics on January 27, 2020, researchers at the University of British Columbia have demonstrated an entirely new way to precisely control such electrical currents by leveraging the interaction between an electron's spin (which is the quantum magnetic field it inherently carries) and its orbital rotation around the nucleus.

"We have found a new way to switch the electrical conduction in materials from on to off," said lead author Berend Zwartsenberg, a Ph.D. student at UBC's Stewart Blusson Quantum Matter Institute (SBQMI). "Not only does this exciting result extend our understanding of how electrical conduction works, it will help us further explore known properties such as conductivity, magnetism, and superconductivity, and discover new ones that could be important for quantum computing, data storage, and energy applications."

Broadly, all materials can be categorized as metals or insulators, depending on the ability of electrons to move through the material and conduct electricity.

However, not all insulators are created equally. In simple materials, the difference between metallic and insulating behavior stems from the number of electrons present: an odd number for metals, and an even number for insulators. In more complex materials, like so-called Mott insulators, the electrons interact with each other in different ways, with a delicate balance determining their electrical conduction.

Measurement of a material where modification of the spin-orbit coupling has been used to make it electronically conductive. The dark colors represent electrons that are free to move through the material, and are an indicator of the conductive behavior. Credit: Berend Zwartsenberg/SBQMI

In a Mott insulator, electrostatic repulsion prevents the electrons from getting too close to one another, which creates a traffic jam and limits the free flow of electrons. Until now, there were two known ways to free up the traffic jam: by reducing the strength of the repulsive interaction between electrons, or by changing the number of electrons.

The SBQMI team explored a third possibility: was there a way to alter the very quantum nature of the material to enable a metal-insulator transition to occur?

Using a technique called angle-resolved photoemission spectroscopy, the team examined the Mott insulator Sr2IrO4, monitoring the number of electrons, their electrostatic repulsion, and finally the interaction between the electron spin and its orbital rotation.

"We found that coupling the spin to the orbital angular momentum slows the electrons down to such an extent that they become sensitive to one another's presence, solidifying the traffic jam," said Zwartsenberg. "Reducing spin-orbit coupling in turn eases the traffic jam, and we were able to demonstrate a transition from an insulator to a metal for the first time using this strategy."

"This is a really exciting result at the fundamental physics level, and it expands the potential of modern electronics," said co-author Andrea Damascelli, principal investigator and scientific director of SBQMI. "If we can develop a microscopic understanding of these phases of quantum matter and their emergent electronic phenomena, we can exploit them by engineering quantum materials atom-by-atom for new electronic, magnetic and sensing applications."

Reference: "Spin-orbit-controlled metal-insulator transition in Sr2IrO4" by B. Zwartsenberg, R. P. Day, E. Razzoli, M. Michiardi, N. Xu, M. Shi, J. D. Denlinger, G. Cao, S. Calder, K. Ueda, J. Bertinshaw, H. Takagi, B. J. Kim, I. S. Elfimov and A. Damascelli, 27 January 2020, Nature Physics. DOI: 10.1038/s41567-019-0750-y

Read the rest here:
New Quantum Switch Turns Metals Into Insulators by Altering the Quantum Nature of the Material - SciTechDaily


Could Photonic Chips Outpace the Fastest Supercomputers? – Singularity Hub

There's been a lot of talk about quantum computers being able to solve far more complex problems than conventional supercomputers. The authors of a new paper say they're on the path to showing an optical computer can do so, too.

The idea of using light to carry out computing has a long pedigree, and it has gained traction in recent years with the advent of silicon photonics, which makes it possible to build optical circuits using the same underlying technology used for electronics. The technology shows particular promise for accelerating deep learning, and is being actively pursued by Intel and a number of startups.

Now Chinese researchers have put a photonic chip to work tackling a fiendishly complex computer science challenge called the subset sum problem. It has some potential applications in cryptography and resource allocation, but primarily it's used as a benchmark to test the limits of computing.

Essentially the task is to work out whether any subset of a given selection of numbers adds up to a chosen target number. The task is NP-complete, which means the time required to solve it scales rapidly as you use a bigger selection of numbers, making it fundamentally tricky to calculate large instances of the challenge in a reasonable time using normal computing approaches.
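To make that scaling concrete, here is a minimal classical brute-force sketch in Python. It is our own illustration, not the researchers' code and not the photonic approach: it simply checks every subset, which is why the work balloons as the list of numbers grows.

```python
# Brute-force subset sum: does any subset of `numbers` add up to `target`?
# Checking every subset means roughly 2^n combinations for n numbers.
from itertools import combinations

def subset_sum(numbers, target):
    """Return one subset summing to `target`, or None if none exists."""
    for size in range(len(numbers) + 1):
        for combo in combinations(numbers, size):
            if sum(combo) == target:
                return combo
    return None

print(subset_sum([2, 7, 11, 21], 18))  # (7, 11)
print(subset_sum([2, 7, 11, 21], 5))   # None -- no subset works
```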

However, optical computers work very differently from standard ones, and the device built by the researchers was able to solve the problem in a way that suggests future versions could outpace even the fastest supercomputers. They even say it could be a step on the way to "photonic supremacy," mimicking the term "quantum supremacy" used to denote the point at which quantum computers outperform classical ones.

The chip the researchers designed is quite different from a conventional processor, though, and did not rely on silicon photonics. While most chips can be reprogrammed, the ones built by the researchers can only solve a particular instance of the subset sum problem. A laser was used to etch the task into a special glass by creating a network of waveguides that channel photons through the processor, as well as a series of junctions that get the light beams to split, pass each other, or converge.

They used a laser and series of lenses and mirrors to shoot a beam of light into one end of the processor, and a light detector then picked up the output as it came out the other side. The network of channels is designed to split the light into many different beams that explore all possible combinations of numbers simultaneously in parallel.

The team created two chips designed to solve the problem for sets of three and four numbers, and showed the chips could solve both easily and efficiently. Problems that small aren't especially tough; you could probably do them on the back of an envelope, and conventional chips can work them out in fractions of a nanosecond.

However, the researchers say their approach could fairly simply be scaled up to much bigger instances of the problem, and that's where things get interesting. For their approach, the time it takes to compute is simply a function of the speed of light and the longest path in the network. The former doesn't change, and the latter goes up fairly gradually with bigger problems, so their calculations show computing time shouldn't shift much even when scaling up to far bigger problems.

Conventional chips have to do a brute-force search of every possible combination of numbers, which expands rapidly as the problem gets bigger. The groups calculations suggest that their chip would surpass a state-of-the-art Intel i7 CPU at a problem size of just six, which they think they should be able to demonstrate in their next experiment. Their estimates also predict their approach would overtake the worlds most powerful supercomputer, Summit, at a problem size of just 28.

Obviously, the proof is in the pudding, and until they've built much larger chips it's hard to predict whether there might be unforeseen roadblocks. The fact that each chip is bespoke for a particular problem would seem to make it impractical for most applications.

While there is some prospect of mapping real-world problems onto subset sum problems that could be solved in this way, it's likely any practical application would use an alternative chip design. But the researchers say it's a great demonstration of the potential for photonic approaches to vastly outstrip conventional computers at some problems.

Image Credit: Image by Thomas-Suisse from Pixabay

Originally posted here:
Could Photonic Chips Outpace the Fastest Supercomputers? - Singularity Hub


What Is Quantum Computing and How Does it Work? – Built In

Accustomed to imagining worst-case scenarios, many cryptography experts are more concerned than usual these days: one of the most widely used schemes for safely transmitting data is poised to become obsolete once quantum computing reaches a sufficiently advanced state.

The cryptosystem known as RSA provides the safety structure for a host of privacy and communication protocols, from email to internet retail transactions. Current standards rely on the fact that no one has the computing power to test every possible way to de-scramble your data once encrypted, but a mature quantum computer could try every option within a matter of hours.
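To see why factoring is the weak point, here is a toy sketch of RSA in Python using the standard textbook-sized numbers. It is purely illustrative: real keys use primes hundreds of digits long, and these values are far too small to be secure.

```python
# Toy RSA with tiny primes (the classic textbook example), Python 3.8+.
p, q = 61, 53
n = p * q                 # public modulus (3233)
phi = (p - 1) * (q - 1)   # 3120
e = 17                    # public exponent, coprime with phi
d = pow(e, -1, phi)       # private exponent: modular inverse of e

message = 65
ciphertext = pow(message, e, n)    # encrypt with the public key (e, n)
recovered = pow(ciphertext, d, n)  # decrypt with the private key (d, n)
print(ciphertext, recovered)       # 2790 65

# Anyone who can factor n back into p and q can recompute d and read the
# message -- which is exactly the step a large, stable quantum computer
# running Shor's algorithm would make fast.
```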

It should be stressed that quantum computers haven't yet hit that level of maturity, and won't for some time, but when a large, stable device is built (or if it's built, as an increasingly diminishing minority argue), its unprecedented ability to factor large numbers would essentially leave the RSA cryptosystem in tatters. Thankfully, the technology is still a ways away, and the experts are on it.

"Don't panic." That's what Mike Brown, CTO and co-founder of quantum-focused cryptography company ISARA Corporation, advises anxious prospective clients. The threat is far from imminent. "What we hear from the academic community and from companies like IBM and Microsoft is that a 2026-to-2030 timeframe is what we typically use from a planning perspective in terms of getting systems ready," he said.

Cryptographers from ISARA are among several contingents currently taking part in the Post-Quantum Cryptography Standardization project, a contest of quantum-resistant encryption schemes. The aim is to standardize algorithms that can resist attacks levied by large-scale quantum computers. The competition was launched in 2016 by the National Institute of Standards and Technology (NIST), a federal agency that helps establish tech and science guidelines, and is now gearing up for its third round.

Indeed, the level of complexity and stability required of a quantum computer to launch the much-discussed RSA attack is very extreme, according to John Donohue, scientific outreach manager at the University of Waterloo's Institute for Quantum Computing. "Even granting that timelines in quantum computing, particularly in terms of scalability, are points of contention, the community is pretty comfortable saying that's not something that's going to happen in the next five to 10 years," he said.

When Google announced that it had achieved quantum supremacy (that is, that it used a quantum computer to run, in minutes, an operation that would take thousands of years to complete on a classical supercomputer), that machine operated on 54 qubits, the computational bedrocks of quantum computing. While IBM's 53-qubit Q system operates at a similar level, many current prototypes operate on as few as 20 or even five qubits.

But how many qubits would be needed to crack RSA? "Probably on the scale of millions of error-tolerant qubits," Donohue told Built In.

Scott Aaronson, a computer scientist at the University of Texas at Austin, underscored the same last year in his popular blog after presidential candidate Andrew Yang tweeted that "no code is uncrackable" in the wake of Google's proof-of-concept milestone.

That's the good news. The bad news is that, while cryptography experts gain more time to keep our data secure from quantum computers, the technology's numerous potential upsides, ranging from drug discovery to materials science to financial modeling, are also largely forestalled. And that question of error tolerance continues to stand as quantum computing's central, Herculean challenge. But before we wrestle with that, let's get a better elemental sense of the technology.

Quantum computers process information in a fundamentally different way than classical computers. Traditional computers operate on binary bits: information processed in the form of ones or zeroes. But quantum computers transmit information via quantum bits, or qubits, which can exist as one, zero, or both simultaneously. That's a simplification, and we'll explore some nuances below, but that capacity, known as superposition, lies at the heart of quantum's potential for exponentially greater computational power.

Such fundamental complexity both cries out for and resists succinct laymanization. When the New York Times asked 10 experts to explain quantum computing in the length of a tweet, some responses raised more questions than they answered:

Microsoft researcher David Reilly:

"A quantum machine is a kind of analog calculator that computes by encoding information in the ephemeral waves that comprise light and matter at the nanoscale."

D-Wave Systems executive vice president Alan Baratz:

"If we're honest, everything we currently know about quantum mechanics can't fully describe how a quantum computer works."

Quantum computing also cries out for a digestible metaphor. Quantum physicist Shohini Ghose, of Wilfrid Laurier University, has likened the difference between quantum and classical computing to light bulbs and candles: "The light bulb isn't just a better candle; it's something completely different."

Rebecca Krauthamer, CEO of quantum computing consultancy Quantum Thought, compares quantum computing to a crossroads that allows a traveler to take both paths. "If you're trying to solve a maze, you'd come to your first gate, and you can go either right or left," she said. "We have to choose one, but a quantum computer doesn't have to choose one. It can go right and left at the same time."

"It can, in a sense, look at these different options simultaneously and then instantly find the most optimal path," she said. "That's really powerful."

The most commonly used example of quantum superposition is Schrödinger's cat.

Despite its ubiquity, many in the QC field aren't so taken with Schrödinger's cat. The more interesting fact about superposition, rather than the two-things-at-once point of focus, is the ability to look at quantum states in multiple ways and ask them different questions, said Donohue. That is, rather than having to perform tasks sequentially, like a traditional computer, quantum computers can run vast numbers of parallel computations.

Part of Donohue's professional charge is clarifying quantum's nuances, so it's worth quoting him here at length:

"In superposition I can have state A and state B. I can ask my quantum state, 'Are you A or B?' And it will tell me, 'I'm A' or 'I'm B.' But I might have a superposition of A + B, in which case, when I ask it, 'Are you A or B?' it'll tell me A or B randomly.

"But the key of superposition is that I can also ask the question, 'Are you in the superposition state of A + B?' And then in that case, they'll tell me, 'Yes, I am the superposition state A + B.'

"But there's always going to be an opposite superposition. So if it's A + B, the opposite superposition is A - B."

That's about as simplified as we can get before trotting out equations. But the top-line takeaway is that superposition is what lets a quantum computer try all paths at once.
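For readers who want to see Donohue's point in numbers, here is a rough sketch in plain Python and NumPy. It is our own illustration, with made-up state names and shot counts, not anything from the article: asking the superposition "Are you A or B?" gives random answers, while asking "Are you A + B or A - B?" gives a definite one.

```python
import numpy as np

rng = np.random.default_rng(0)

A = np.array([1, 0])            # state A
B = np.array([0, 1])            # state B
plus = (A + B) / np.sqrt(2)     # the superposition A + B
minus = (A - B) / np.sqrt(2)    # the "opposite" superposition A - B

def measure(state, basis, shots=5000):
    """Sample measurement outcomes of `state` in the given orthonormal basis."""
    probs = [abs(np.dot(b, state)) ** 2 for b in basis]
    outcomes = rng.choice(len(basis), size=shots, p=probs)
    return np.bincount(outcomes, minlength=len(basis))

# "Are you A or B?"  ->  random answers, roughly half and half
print(measure(plus, [A, B]))         # approximately [2500 2500]

# "Are you A + B or A - B?"  ->  always "A + B"
print(measure(plus, [plus, minus]))  # [5000    0]
```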

That's not to say that such unprecedented computational heft will displace or render moot classical computers. "One thing that we can really agree on in the community is that it won't solve every type of problem that we run into," said Krauthamer.

But quantum computing is particularly well suited for certain kinds of challenges. Those include probability problems, optimization (what is, say, the best possible travel route?) and the incredible challenge of molecular simulation for use cases like drug development and materials discovery.

The cocktail of hype and complexity has a way of fuzzing outsiders' conception of quantum computing, which makes this point worth underlining: quantum computers exist, and they are being used right now.

They are not, however, presently solving climate change, turbocharging financial forecasting probabilities or performing other similarly lofty tasks that get bandied about in reference to quantum computing's potential. QC may have commercial applications related to those challenges, which we'll explore further below, but that's well down the road.

Today, we're still in what's known as the NISQ era: Noisy, Intermediate-Scale Quantum. In a nutshell, quantum noise makes such computers incredibly difficult to stabilize. As such, NISQ computers can't be trusted to make decisions of major commercial consequence, which means they're currently used primarily for research and education.

"The technology just isn't quite there yet to provide a computational advantage over what could be done with other methods of computation at the moment," said Donohue. "Most [commercial] interest is from a long-term perspective. [Companies] are getting used to the technology so that when it does catch up, and that timeline is a subject of fierce debate, they're ready for it."

Also, it's fun to sit next to the cool kids. "Let's be frank. It's good PR for them, too," said Donohue.

But NISQ computers' R&D practicality is demonstrable, if decidedly small-scale. Donohue cites the molecular modeling of lithium hydride. That's a small enough molecule that it can also be simulated using a supercomputer, but the quantum simulation provides an important opportunity to check our answers against a classical-computer simulation. NISQs have also delivered some results for problems in high-energy particle physics, Donohue noted.

One breakthrough came in 2017, when researchers at IBM modeled beryllium hydride, the largest molecule simulated on a quantum computer to date. Another key step arrived in 2019, when IonQ researchers used quantum computing to go bigger still, by simulating a water molecule.

"These are generally still small problems that can be checked using classical simulation methods. But it's building toward things that will be difficult to check without actually building a large particle physics experiment, which can get very expensive," Donohue said.

And curious minds can get their hands dirty right now. Users can operate small-scale quantum processors via the cloud through IBM's online Q Experience and its open-source software Qiskit. Late last year, Microsoft and Amazon both announced similar platforms, dubbed Azure Quantum and Braket. "That's one of the cool things about quantum computing today," said Krauthamer. "We can all get on and play with it."
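As a taste of what "playing with it" looks like, here is a minimal local sketch using the open-source Qiskit SDK mentioned above. It assumes Qiskit is installed (pip install qiskit); running on IBM's cloud hardware adds an account and a backend choice on top of this.

```python
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(1)
qc.h(0)   # Hadamard gate: puts the single qubit into an equal superposition

state = Statevector.from_instruction(qc)
print(state)                  # amplitudes of roughly 0.707 on |0> and |1>
print(state.probabilities())  # [0.5 0.5]: equal odds of measuring 0 or 1
```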


Quantum computing may still be in its fussy, uncooperative stage, but that hasn't stopped commercial interests from diving in.

IBM announced at the recent Consumer Electronics Show that its so-called Q Network had expanded to more than 100 companies and organizations. Partners now range from Delta Air Lines to the health insurer Anthem to Daimler AG, which owns Mercedes-Benz.

Some of those partnerships hinge on quantum computing's aforementioned promise in terms of molecular simulation. Daimler, for instance, is hoping the technology will one day yield a way to produce better batteries for electric vehicles.

Elsewhere, partnerships between quantum computing startups and leading companies in the pharmaceutical industry, like those established between 1QBit and Biogen, and ProteinQure and AstraZeneca, point to quantum molecular modeling's drug-discovery promise, distant though it remains. (Today, drug development is done through expensive, relatively low-yield trial-and-error.)

Researchers would need millions of qubits to compute the chemical properties of a novel substance, noted theoretical physicist Sabine Hossenfelder in the Guardian last year. But the conceptual underpinning, at least, is there. "A quantum computer knows quantum mechanics already, so I can essentially program in how another quantum system would work and use that to echo the other one," explained Donohue.

There's also hope that large-scale quantum computers will help accelerate AI, and vice versa, although experts disagree on this point. "The reason there's controversy is, things have to be redesigned in a quantum world," said Krauthamer, who considers herself an AI-quantum optimist. "We can't just translate algorithms from regular computers to quantum computers, because the rules are completely different, at the most elemental level."

Some believe quantum computers can help combat climate change by improving carbon capture. Jeremy O'Brien, CEO of Palo Alto-based PsiQuantum, wrote last year that quantum simulation of larger molecules, if achieved, could help build a catalyst for scrubbing carbon dioxide directly from the atmosphere.

Long-term applications tend to dominate headlines, but they also lead us back to quantum computing's defining hurdle, and the reason coverage remains littered with terms like "potential" and "promise": error correction.

Qubits, it turns out, are higher maintenance than even the most meltdown-prone rock star. Any number of simple actions or variables can send error-prone qubits falling into decoherence, or the loss of a quantum state (mainly that all-important superposition). Things that can cause a quantum computer to crash include measuring qubits and running operations; in other words, using it. Even small vibrations and temperature shifts will cause qubits to decohere.

That's why quantum computers are kept isolated, and the ones that run on superconducting circuits, the most prominent method, favored by Google and IBM, have to be kept at near absolute zero (a cool -460 degrees Fahrenheit).

The challenge is twofold, according to Jonathan Carter, a scientist at Berkeley Quantum. First, individual physical qubits need to have better fidelity. That would conceivably happen through better engineering, optimal circuit layouts, or the right combination of components. Second, we have to arrange them to form logical qubits.

Estimates range from hundreds to thousands to tens of thousands of physical qubits required to form one fault-tolerant qubit. "I think it's safe to say that none of the technology we have at the moment could scale out to those levels," Carter said.
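The intuition behind devoting many physical qubits to one logical qubit can be sketched with a classical analogy: copy a bit several times and take a majority vote. The Python below is our own back-of-the-envelope illustration, with made-up error rates; real quantum error-correcting codes are far subtler, since qubits cannot simply be copied, but the error-suppression trend is the point.

```python
import random

def logical_error_rate(physical_error, copies, trials=100_000):
    """Estimate the failure rate of a majority vote over noisy copies."""
    failures = 0
    for _ in range(trials):
        flips = sum(random.random() < physical_error for _ in range(copies))
        if flips > copies // 2:      # a majority of the copies got corrupted
            failures += 1
    return failures / trials

for copies in (1, 3, 7, 15):
    print(copies, logical_error_rate(0.05, copies))
# With a 5% physical error rate, the voted "logical" error rate drops
# rapidly as more physical copies are spent on each logical bit.
```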

From there, researchers would also have to build ever-more complex systems to handle the increase in qubit fidelity and numbers. So how long will it take until hardware-makers actually achieve the necessary error correction to make quantum computers commercially viable?

"Some of these other barriers make it hard to say yes to a five- or 10-year timeline," Carter said.

Donohue invokes and rejects the same figure. "Even the optimist wouldn't say it's going to happen in the next five to 10 years," he said. At the same time, some small optimization problems, specifically in terms of random number generation, could happen very soon.

"We've already seen some useful things in that regard," he said.

For people like Michael Biercuk, founder of quantum-engineering software company Q-CTRL, the only technical commercial milestone that matters now is quantum advantage, or, as he uses the term, when a quantum computer provides some time or cost advantage over a classical computer. Count him among the optimists: he foresees a five-to-eight-year timescale to achieve such a goal.

Another open question: Which method of quantum computing will become standard? While superconducting has borne the most fruit so far, researchers are exploring alternative methods that involve trapped ions, quantum annealing or so-called topological qubits. In Donohue's view, it's not necessarily a question of which technology is better so much as one of finding the best approach for different applications. For instance, superconducting chips naturally dovetail with the magnetic field technology that underpins neuroimaging.

The challenges that quantum computing faces, however, aren't strictly hardware-related. The magic of quantum computing resides in algorithmic advances, not speed, Greg Kuperberg, a mathematician at the University of California at Davis, is quick to underscore.

"If you come up with a new algorithm, for a question that it fits, things can be exponentially faster," he said, using "exponential" literally, not metaphorically. (There are currently 63 algorithms listed and 420 papers cited at Quantum Algorithm Zoo, an online catalog of quantum algorithms compiled by Microsoft quantum researcher Stephen Jordan.)

Another roadblock, according to Krauthamer, is a general lack of expertise. "There's just not enough people working at the software level or at the algorithmic level in the field," she said. Tech entrepreneur Jack Hidary's team set out to count the number of people working in quantum computing and found only about 800 to 850 people, according to Krauthamer. "That's a bigger problem to focus on, even more than the hardware," she said. "Because the people will bring that innovation."

While the community underscores the importance of outreach, the term "quantum supremacy" has itself come under fire. "In our view, 'supremacy' has overtones of violence, neocolonialism and racism through its association with 'white supremacy,'" 13 researchers wrote in Nature late last year. The letter has kickstarted an ongoing conversation among researchers and academics.

But the field's attempt to attract and expand also comes at a time of uncertainty in terms of broader information-sharing.

Quantum computing research is sometimes framed in the same adversarial terms as conversations about trade and other emerging tech; that is, U.S. versus China. An oft-cited statistic from patent analytics consultancy Patinformatics states that, in 2018, China filed 492 patents related to quantum technology, compared to just 248 in the United States. That same year, the think tank Center for a New American Security published a paper that warned, "China is positioning itself as a powerhouse in quantum science." By the end of 2018, the U.S. had passed and signed into law the National Quantum Initiative Act. Many in the field believe legislators were compelled by China's perceived growing advantage.

The initiative has spurred domestic research (the Department of Energy recently announced up to $625 million in funding to establish up to five quantum information research centers), but the geopolitical tensions give some in the quantum computing community pause, namely for fear of collaboration-chilling regulation. "As quantum technology has become prominent in the media, among other places, there has been a desire suddenly among governments to clamp down," said Biercuk, who has warned of poorly crafted and nationalistic export controls in the past.

"What they don't understand often is that quantum technology and quantum information in particular really are deep research activities where open transfer of scientific knowledge is essential," he added.

The National Science Foundation, one of the government agencies given additional funding and directives under the act, generally has a positive track record in terms of avoiding draconian security controls, Kuperberg said. Even still, the antagonistic framing tends to obscure the on-the-ground facts. "The truth behind the scenes is that, yes, China would like to be doing good research in quantum computing, but a lot of what they're doing is just scrambling for any kind of output," he said.

Indeed, the majority of the aforementioned Chinese patents are for quantum tech, but not quantum computing tech, which is where the real promise lies.

The Department of Energy has an internal list of sensitive technologies that it could potentially restrict DOE researchers from sharing with counterparts in China, Russia, Iran and North Korea. It has not yet implemented that curtailment, however, DOE Office of Science director Chris Fall told the House Committee on Science, Space, and Technology, and clarified to Science, in January.

Along with such multi-agency-focused government spending, there's been a tsunami of venture capital directed toward commercial quantum-computing interests in recent years. A Nature analysis found that, in 2017 and 2018, private funding in the industry hit at least $450 million.

Still, funding concerns linger in some corners. Even as Google's quantum supremacy proof of concept has helped heighten excitement among enterprise investors, Biercuk has also flagged the beginnings of a contraction in investment in the sector.

Even as exceptional cases dominate headlines (he points to PsiQuantum's recent $230 million venture windfall), there are lesser-reported signs of struggle. "I know of probably four or five smaller shops that started and closed within about 24 months; others were absorbed by larger organizations because they struggled to raise," he said.

At the same time, signs of at least moderate investor agitation and internal turmoil have emerged. The Wall Street Journal reported in January that much-buzzed quantum computing startup Rigetti Computing saw its CTO and COO, among other staff, depart amid concerns that the company's tech wouldn't be commercially viable in a reasonable time frame.

Investor expectations had become inflated in some instances, according to experts. "Some very good teams have faced more investor skepticism than I think has been justified ... This is not six months to mobile application development," Biercuk said.

In Kuperberg's view, part of the problem is that venture capital and quantum computing operate on completely different timelines. "Putting venture capital into this in the hope that some profitable thing would arise quickly, that doesn't seem very natural to me in the first place," he said, adding the caveat that he considers the majority of QC money prestige investment rather than strictly ROI-focused.

But some startups themselves may have had some hand in driving financiers' over-optimism. "I won't name names, but there definitely were some people giving investors outsize expectations, especially when people started coming up with some pieces of hardware, saying that advantages were right around the corner," said Donohue. "That very much rubbed the academic community the wrong way."

Scott Aaronson recently called out two prominent startups for what he described as a sort of calculated equivocation. He wrote of a pattern in which a party will speak of a quantum algorithm's promise "without asking whether there are any indications that your approach will ever be able to exploit interference of amplitudes to outperform the best classical algorithm."

And, mea culpa, some blame for the hype surely lies with tech media. "Trying to crack an area for a lay audience means you inevitably sacrifice some scientific precision," said Biercuk. (Thanks for understanding.)

It's all led to a willingness to serve up a glass of cold water now and again. As Juani Bermejo-Vega, a physicist and researcher at the University of Granada in Spain, recently told Wired, the machine on which Google ran its milestone proof of concept is "mostly still a useless quantum computer" for practical purposes.

Bermejo-Vega's quote came in a story about the emergence of a Twitter account called Quantum Bullshit Detector, which decrees, @artdecider-like, a "bullshit" or "not bullshit" quote tweet of various quantum claims. The fact that leading quantum researchers are among the account's 9,000-plus followers would seem to indicate that some weariness exists among the ranks.

But even with the various challenges, cautious optimism seems to characterize much of the industry. "For good and ill, I'm vocal about maintaining scientific and technical integrity while also being a true optimist about the field and sharing the excitement that I have and to excite others about what's coming," Biercuk said.

This year could prove to be formative in the quest to use quantum computers to solve real-world problems, said Krauthamer. "Whenever I talk to people about quantum computing, without fail, they come away really excited. Even the biggest skeptics who say, 'Oh no, they're not real. It's not going to happen for a long time.'"


Link:
What Is Quantum Computing and How Does it Work? - Built In


Budget 2020: Govt bets on AI, data analytics and quantum computing – Livemint

Finance Minister Nirmala Sitharaman on Saturday announced an outlay of ₹8,000 crore over the next five years for a national mission on quantum technologies, while emphasising the importance of leveraging artificial intelligence, data analytics, and the internet of things for digital governance.

Data centre parks will be set up in India with the help of the private sector, she said. The budget also allocated ₹6,000 crore for Bharat Net and said 1 lakh gram panchayats will get fibre-to-the-home connections under the Bharat Net scheme within a year.

"Policy on private sector building data centre parks is an exciting opportunity for fintech companies. This is also in line with the government's policy on retaining critical data within the country," said Sanjay Khan, partner, Khaitan & Co.

Maninder Bharadwaj, partner, Deloitte India, said the government's emphasis on data and digitisation is clearly highlighted in this budget. "Building of data centres, collection of nutritional information from 10 crore households and focus on fibre optic networks are initiatives that will propel India towards a digital journey," he added.

Artificial intelligence and machine learning featured extensively in the minister's speech, with proposals to use them in various existing and future projects, such as the proposed national policy for statistics and the Ayushman Bharat scheme.

While the government had previously set up a national portal for AI research and development, in the latest announcement it has continued to offer its support for tech advancements. "We appreciate the government's emphasis on promoting cutting-edge technologies in India," Atul Rai, co-founder and CEO of Staqu, said in a statement.

Governments across the world have been laying emphasis on the use of AI for digital governance. As per reports, the US government intends to spend almost $1 billion on AI-related research and development in 2020.

See more here:
Budget 2020: Govt bets on AI, data analytics and quantum computing - Livemint


This Week’s Awesome Tech Stories From Around the Web (Through February 1) – Singularity Hub

COMPUTING

Alphabet Has a Second Secretive Quantum Computing Team
Tom Simonite | Wired
"[Alphabet's moonshot incubator X's] small group of quantum researchers is not building its own quantum computing hardware. The group's leader is more interested in creating new algorithms and applications to run on quantum computers, and building software libraries that allow conventional coders to use the exotic machines."

Japan Is Building a Giant Gundam Robot That Can Walk
Evan Ackerman | IEEE Spectrum
"Gundam Factory Yokohama, which is a Gundam Factory in Yokohama, is constructing an 18-meter-tall, 25-ton Gundam robot. The plan is for the robot to be fully actuated using a combination of electric and hydraulic actuators, achieving Gundam-like movement with its 24 degrees of freedom."

How to Turn Garbage Into Graphene
Courtney Linder | Popular Mechanics
"The new process, which is called flash graphene production, yields bulk quantities of graphene flakes. Not only does this technique produce far more graphene than traditional methods, but it's also way cheaper and greener, upcycling food waste, plastic, and even coal into a valuable carbon allotrope used in various branches of material science."

Mammoth Biosciences Aims to Be Illumina for the Gene Editing Generation
Jonathan Schieber | TechCrunch
"'You will need a full toolbox of CRISPR proteins,' says [Trevor Martin, Mammoth Biosciences co-founder and chief executive]. 'That will allow you to interact with biology in the same way that we interact with software and computers. From first principles, companies will programmatically modify biology to cure a disease or decrease risk for a disease.'"

Will You Still Need a College Education in 2040?
Anisa Purbasari Horton | Fast Company
"[Six experts] shared the consensus that change is the only certainty. Workers, employers, and education providers alike need to be agile, flexible, and prepared to adapt as technology continues to disrupt industries and change what jobs will and will not be available. Here's what else they had to say."

This Is the Highest-Resolution Photo of the Sun Ever Taken
Neel V. Patel | MIT Technology Review
"The new image demonstrates the telescope's potential power. It shows off a surface that's divided up into discrete, Texas-size cells, like cracked sections in the desert soil. You can see plasma oozing off the surface, rising high into the solar atmosphere before sinking back into darker lanes." [Note: The referenced photo appears in this article's banner image.]

A History of Star Trek's Uneasy Relationship With Androids
James Whitbrook | io9
"Sci-fi has been fascinated with sentient synthetic life since its earliest days, but Star Trek, in particular, has had quite the tumultuous history with its own consideration of androids and their place in its far future. From classic interpretations of sinister bots to one of the franchise's most beloved characters, here's everything you need to know about Star Trek's androids."

Technology Is Anthropology
Jon Evans | TechCrunch
"It's hard enough getting an accurate answer of how a person would use a new technology when that's the only variable. When they live in a constantly shifting and evolving world of other new technologies, when the ones which take root and spread have a positive-feedback-loop effect on the culture and mindset toward new technologies, and when every one of your first 20 interactions with new tech changes your feelings about it, it's basically impossible."

Image Credit: NSO/AURA/NSF

Read the rest here:
This Week's Awesome Tech Stories From Around the Web (Through February 1) - Singularity Hub


The End Of The Digital Revolution Is Coming: Here’s What’s Next – Innovation Excellence

by Tom Koulopoulos

The next era of computing will stretch our minds into a spooky new world that we're just starting to understand.

In 1946 the Electronic Numerical Integrator and Computer, or ENIAC, was introduced. The world's first general-purpose electronic computer was intended to be used by the military to project the trajectory of missiles, doing in a few seconds what would otherwise take a human mathematician about three days. Its 20,000 vacuum tubes (the glowing, glass, light-bulb-like predecessors to the transistor) connected by 500,000 hand-soldered wires were a marvel of human ingenuity and technology.

Imagine if it were possible to go back to the developers and users of that early marvel and make the case that in 70 years there would be ten billion computers worldwide and half of the world's population would be walking around with computers 100,000,000 times as powerful as the ENIAC in their pants pockets.

You'd have been considered a lunatic!

I want you to keep that in mind as you resist the temptation to do the same to me because of what I'm about to share.

Quantum Supremacy

Digital computers will soon reach their limits under demanding technologies such as AI. Consider just the impact of these two projections: by 2025 driverless cars alone may produce as much data as exists in the entire world today; fully digitizing every cell in the human body would exceed ten times all of the data stored globally today. In these and many more cases we need to find ways to deal with unprecedented amounts of data and complexity. Enter quantum computing.

You've likely heard of quantum computing. Amazingly, it's a concept as old as digital computers. However, you may have discounted it as a far-off future that's about as relevant to your life as flying cars. Well, it may be time to reconsider. Quantum computing is progressing at a rate that is surprising even those who are building it.

Understanding what quantum computers are and how they work challenges much of what we know of not just computing, but the basics of how the physical world appears to operate. Quantum mechanics, the basis for quantum computing, describes the odd and non-intuitive way the universe operates at a sub-atomic level. It's part science, part theory, and part philosophy.

Classical digital computers use what are called bits, something most of us are familiar with. A bit can be a one or a zero. Quantum computers use what are called qubits (quantum bits). A qubit can also be a one or a zero, but it can also be any of an infinite number of possibilities in between the two. The thing about qubits is that while a digital bit is always either on (1) or off (0), a qubit is always in what's called a superposition state, neither on nor off.

Although it's a rough analogy, think of a qubit as a spinning coin that's just been flipped in the dark. While it's spinning, is it heads or tails? It's at the same time both and neither, until it stops spinning and we then shine a light on it. However, a binary bit is like a coin that has a switch to make it glow in the dark. If I asked you "Is it glowing?" there would only be two answers, yes or no, and those would not change as it spins.

That's what a qubit is like when compared to a classical digital bit. A qubit does not have a state until you effectively shine a light on it, while a binary bit maintains its state until that state is manually or mechanically changed.
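If it helps, here is the coin analogy as a few lines of plain Python. This is our own rough sketch, not how real quantum hardware is programmed: the qubit is described by two amplitudes, but "shining the light" (measuring) always produces a single 0 or 1.

```python
import random

def measure(alpha, beta):
    """Collapse a qubit with amplitudes (alpha, beta) to a classical bit."""
    p_zero = abs(alpha) ** 2              # probability of reading 0 ("heads")
    return 0 if random.random() < p_zero else 1

alpha = beta = 1 / 2 ** 0.5               # the evenly "spinning" state
print([measure(alpha, beta) for _ in range(10)])   # a random mix of 0s and 1s

# A classical bit, by contrast, is like the glow-in-the-dark coin: reading
# it always returns the same value until something explicitly flips it.
```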

Don't get too hung up on that analogy, because as you get deeper into the quantum world, trying to use what we know of the physical world is always a very rough and ultimately flawed way to describe the way things operate at the quantum level of matter.

However, the difficulty in understanding how quantum computers work hasn't stopped their progress. Google engineers recently talked about how the quantum computers they are building are progressing so fast that they may achieve the elusive goal of what's called quantum supremacy (the point at which quantum computers can exceed the ability of classical binary computers) within months. While that may be a bit of a stretch, even conservative projections put us on a five-year timeline for quantum supremacy.

Quantum vs Classical Computing

Quantum computers, which are built using these qubits, will not replace all classical digital computers, but they will become an indispensable part of how we use computers to model the world and to integrate artificial intelligence into our lives.

Quantum computing will be one of the most radical shifts in the history of science, likely outpacing any advances we've seen to date with prior technological revolutions, such as the advent of semiconductors. They will enable us to take on problems that would take even the most powerful classical supercomputers millions or even billions of years to solve. That's not just because quantum computers are faster, but because they can approach problem solving with massive parallelism, using the qualities of how quantum particles behave.

The irony is that the same thing that makes quantum computers so difficult to understand, their harnessing of nature's smallest particles, also gives them the ability to precisely simulate the biological world at its most detailed. This means that we can model everything from chemical reactions, to biology, to pharmaceuticals, to the inner workings of the universe, to the spread of pandemics, in ways that were simply impossible with classical computers.

A Higher Power

The reason for all of the hype behind the rate at which quantum computers are evolving has to do with what's called doubly exponential growth.

The exponential growth that most of us are familiar with, and which is being talked about lately, refers to the classical doubling phenomenon. For example, Moore's law, which projects the doubling in the density of transistors on a silicon chip every 18 months. It's hard to wrap our linear brains around exponential growth, but it's nearly impossible to wrap them around doubly exponential growth.

Doubly exponential growth simply has no analog in the physical world. Doubly exponential growth means that you are raising a number to a power and then raising that to another power. It looks like this: 5^(10^10).
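To get a feel for how fast "a power raised to a power" runs away, here is a tiny illustration with generic numbers of our own choosing, separate from the figures in this article.

```python
# Singly exponential growth (plain doubling) vs. doubly exponential growth
# (doubling in the exponent itself), for a handful of small steps.
for n in range(1, 6):
    singly = 2 ** n           # e.g. Moore's-law-style doubling
    doubly = 2 ** (2 ** n)    # the exponent itself doubles each step
    print(n, singly, doubly)
# By n = 5 the doubly exponential column is already 2**32 = 4294967296.
```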

What this means is that while a binary computer can store 256 states with 8 bits (2^8), a quantum computer with eight qubits (recall that a qubit is the conceptual equivalent of a digital bit in a classical computer) can store 10^77 bits of data! That's a number with 77 zeros, or, to put it into perspective, scientists estimate that there are about 10^78 atoms in the entire visible universe.


By the way, just to further illustrate the point, if you add one more qubit, the number of bits (or more precisely, states) that can be stored jumps to 10^154 (one more bit in a classical computer would only raise the capacity to 10^78).

Here's what's really mind-blowing about quantum computing (as if what we just described isn't already mind-blowing enough). A single caffeine molecule is made up of 24 atoms, and it can have 10^48 quantum states (there are only about 10^50 atoms that make up the Earth). Modeling caffeine precisely is simply not possible with classical computers. Using the world's fastest supercomputer, it would take 100,000,000,000,000 times the age of the universe to process the 10^48 calculations that represent all of the possible states of a caffeine molecule!

So, the obvious question is, how could any computer, quantum or otherwise, take on something of that magnitude? Well, how does nature do it? That cup of coffee you're drinking has trillions of caffeine molecules, and nature is doing just fine handling all of the quantum states they are in. Since nature is a quantum machine, what better way to model it than with a quantum computer?

Spooky Action

The other aspect of quantum computing that challenges our understanding of how the quantum world works is what's called entanglement. Entanglement describes a phenomenon in which two quantum particles are connected in such a way that, no matter how great the distance between them, they will both have the same state when they are measured.

At first blush that doesn't seem to be all that novel. After all, if I were to paint two balls red and then separate them by the distance of the universe, both would still be red. However, a quantum object is always in what's called a superposition, meaning that it has no inherent state. Think of our coin-flip example from earlier, where the coin is in a superposition state until it stops spinning.

If, instead of a color, its two states were up or down, it would always be in both states while also in neither state, that is, until an observation or measurement forces it to pick a state. Again, think back to the spinning coin.

Now imagine two coins entangled and flipped simultaneously at different ends of the universe. Once you stop the spin of one coin and reveal that it's heads, the other coin would instantly stop spinning and also be heads.
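The coin-pair picture can be sketched numerically as well. The snippet below, our own illustration in plain NumPy, samples measurements of a Bell state, the simplest entangled pair: the two outcomes are individually random but always agree. It deliberately glosses over the subtler statistics that distinguish real entanglement from a shared classical coin.

```python
import numpy as np

rng = np.random.default_rng(1)

# Bell state (|00> + |11>) / sqrt(2): amplitudes over outcomes 00, 01, 10, 11
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
probs = np.abs(bell) ** 2            # [0.5, 0, 0, 0.5]

labels = ["00", "01", "10", "11"]
samples = rng.choice(4, size=8, p=probs)
print([labels[i] for i in samples])  # only "00" or "11": the "coins" always agree
```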

If this makes your head hurt, you're in good company. Even Einstein had difficulty with entanglement, calling it "spooky action at a distance." His concern was that the two objects couldn't communicate at a speed faster than the speed of light. What's especially spooky about this phenomenon is that the two objects aren't communicating at all, in any classical sense of the term "communication."

Entanglement creates the potential for all sorts of advances in computing, from how we create 100 percent secure communications against cyberthreats, to the ultimate possibility of teleportation.

Room For Possibility

So, should you run out and buy a quantum computer? Well, it's not that easy. Qubits need to be supercooled and are exceptionally finicky particles that require an enormous, room-sized apparatus and overhead, not unlike the ENIAC once did.

You can, however, use a quantum computer for free or lease its use for more sophisticated applications. For example, IBM's Q is available both as an open-source learning environment for anyone and as a powerful tool for fintech users. However, I'll warn you that even if you're accustomed to programming computers, it will still feel as though you're teaching yourself to think in an entirely foreign language.

The truth is that we might as well be surrounded by 20,000 glowing vacuum tubes and 500,000 hand-soldered wires. We can barely imagine what the impact of quantum computing will be in ten to twenty years, no more than the early users of the ENIAC could have predicted the mind-boggling ways in which we use digital computers today.

Listen in to my two podcasts with scientists from IBM, MIT, and Harvard to find out more about quantum computing. Quantum Computing Part I, Quantum Computing Part II

This article was originally published on Inc.

Image credit: Pixabay


Tom Koulopoulos is the author of 10 books and founder of the Delphi Group, a 25-year-old Boston-based think tank and a past Inc. 500 company that focuses on innovation and the future of business. He tweets from @tkspeaks.

Link:
The End Of The Digital Revolution Is Coming: Here's What's Next - Innovation Excellence
