

Category Archives: Quantum Computing

Will Quantum Computing Really Change The World? Facts And Myths – Analytics India Magazine

In recent years, big tech companies like IBM, Microsoft, Intel, and Google have been working in relative silence on something that sounds great: quantum computing. The main problem is that it is difficult to know exactly what it is and what it can be useful for.

Some questions can be answered easily. For example, quantum computing is not going to give you more FPS on your graphics card anytime soon. Nor will it be as simple as swapping your computer's CPU for a quantum one to make it hyperfast. Quantum computing is fundamentally different from the computing we are used to, but how?

At the beginning of the 20th century, Planck and Einstein proposed that light is not a continuous wave (like the waves in a pond) but is divided into small packets, or quanta. This apparently simple idea solved a problem called the ultraviolet catastrophe. Over the years other physicists developed it and reached surprising conclusions about matter, two of which will interest us here: the superposition of states and entanglement.

To understand why we are interested, let's take a short break and think about how a classical computer works. The basic unit of information is the bit, which can have two possible states (1 or 0) and with which we can perform various logical operations (AND, NOT, OR). Putting together n bits we can represent numbers and operate on those numbers, but with limitations: we can only represent up to 2^n different states, and if we want to change x bits we have to perform at least x operations on them: there is no way to magically change them without touching them.
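To make that concrete, here is a quick Python sketch (not from the original article) that enumerates the 2^n states that n classical bits can represent:

```python
from itertools import product

# With n classical bits we can enumerate exactly 2**n distinct states.
n = 3
states = list(product([0, 1], repeat=n))
print(len(states))   # 8 == 2**3
print(states[:4])    # (0, 0, 0), (0, 0, 1), (0, 1, 0), (0, 1, 1)
```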

Well, superposition and entanglement allow us to loosen these limitations: with superposition, we can store many more than just 2^n states in n quantum bits (qubits), and entanglement maintains certain relations between qubits in such a way that operations on one qubit necessarily affect the rest.

Superposition, while looking like a blessing at first glance, is also a problem. As Alexander Holevo showed in 1973, even though we can hold many more states in n qubits, in practice we can only read out 2^n different ones. As we saw in an article in Genbeta about the foundations of quantum computing: a qubit is not simply 1 or 0 like a normal bit; it can be, say, 80% 1 and 20% 0. The problem is that when we read it we can only obtain either 1 or 0, and the probabilities each value had of coming out are lost, because measuring the qubit modifies it.

This discrepancy between the information held by the qubits and what we can read out led Benioff and Feynman to argue that a classical computer would not be able to simulate a quantum system without a disproportionate amount of resources, and to propose models of a quantum computer that could perform that simulation.

Those quantum computers would probably be nothing more than a scientific curiosity without the second concept, entanglement, which allowed two quite relevant algorithms to be developed: quantum annealing in 1989 and Shor's algorithm in 1994. The first finds minimum values of functions, which, put like that, does not sound very interesting, but it has applications in artificial intelligence and machine learning, as we discussed in another article. For example, if we manage to encode the error rate of a neural network as a function to which we can apply quantum annealing, the minimum value will tell us how to configure the neural network to be as efficient as possible.
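To give a flavor of what "finding minimum values of functions" means here, below is a purely classical Python toy (the cost surface and parameter names are invented for illustration). A quantum annealer would explore such a landscape natively; this sketch simply enumerates it:

```python
import itertools

# Toy, hypothetical "error rate" over two configuration parameters of a model.
def error_rate(config):
    lr, layers = config
    return (lr - 0.01) ** 2 + 0.1 * abs(layers - 3)   # made-up cost surface

# Classically we just try every configuration; annealing aims to find the
# minimum of such a function far more efficiently for large search spaces.
configs = itertools.product([0.001, 0.01, 0.1], [1, 2, 3, 4])
best = min(configs, key=error_rate)
print("best configuration:", best)   # (0.01, 3) minimizes the toy cost
```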

The second, Shor's algorithm, helps us decompose a number into its prime factors much more efficiently than we can on a normal computer. Put like that, again, it doesn't sound interesting at all. But consider that RSA, one of the most widely used algorithms for protecting and encrypting data on the Internet, relies on the fact that factoring numbers is exponentially slow (adding a bit to the key roughly doubles the time a brute-force attack takes), and the picture changes. A quantum computer with enough qubits would render many encryption systems completely obsolete.
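A rough classical sketch of why factoring gets painful as numbers grow (the semiprimes below are toy values built from small, well-known primes, nothing like real RSA moduli):

```python
import math
import time

def smallest_factor(n):
    """Naive classical factoring by trial division."""
    for d in range(2, math.isqrt(n) + 1):
        if n % d == 0:
            return d
    return n  # n is prime

# Toy semiprimes. For trial division the work grows roughly like sqrt(n),
# i.e. it doubles for every two extra bits; the best known classical
# algorithms do better but are still super-polynomial, which is what RSA
# relies on.
for n in (104729 * 1299709, 1299709 * 15485863):
    start = time.perf_counter()
    p = smallest_factor(n)
    print(f"{n.bit_length()}-bit number: factor {p} "
          f"found in {time.perf_counter() - start:.3f}s")
```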

So far, quantum computing is a field that hasn't seen much real-world application. To give an idea, with the twenty qubits of the commercial quantum computer announced by IBM, we could apply Shor's factorization algorithm only to numbers smaller than 1,048,576 (2^20), which, as you can imagine, is not very impressive.

Still, the field is evolving promisingly. In 1998 the first quantum computer was built: it had only two qubits, needed a nuclear magnetic resonance machine, and solved a toy problem (the so-called Deutsch-Jozsa problem). In 2001 Shor's algorithm was run for the first time. Only six years later, in 2007, D-Wave presented its first computer capable of executing quantum annealing with 16 qubits, and this year the same company announced a 2,000-qubit quantum annealing computer. The new IBM computers, on the other hand, have fewer qubits but can implement generic algorithms, not only quantum annealing. In short, the push is strong, and quantum computing should become increasingly applicable to real problems.

What might those applications be? As mentioned before, the quantum annealing algorithm is well suited to machine learning problems, which makes the computers that implement it extremely useful, even though that single algorithm is the only thing they can run. If systems capable of, say, transcribing conversations or identifying objects in images can be translated so that they are trained on quantum computers, the results could be orders of magnitude better than those that already exist. The same algorithm could also be used to find solutions to problems in medicine or chemistry, such as finding optimal treatments for a patient or studying the possible structures of complex molecules.

Generic quantum computers, which have fewer qubits right now, could run more algorithms. For example, they could be used to break much of the cryptography in use today, as we discussed earlier (which explains why the NSA wanted a quantum computer). They would also serve as super-fast search engines if Grover's search algorithm can be implemented, and for physics and chemistry they would be very useful as efficient simulators of quantum systems.

Unfortunately, algorithms and code written for classical computers cannot simply be run on quantum computers to magically gain speed: you need to develop a quantum algorithm (not a trivial thing) and implement it in order to get that improvement. That greatly restricts the applications of quantum computers for now and will remain a problem to overcome as those systems mature.

However, the main problem facing quantum computing is building the computers themselves. Compared to a normal computer, a quantum computer is an extremely complex machine: it operates at temperatures close to absolute zero (-273 °C), the qubits are implemented in superconducting circuits, and the components needed to read and manipulate the qubits are not simple either.

What does it mean for a quantum computer not to be truly quantum? As explained before, the two relevant concepts are superposition and entanglement, and without them the speed improvements that quantum algorithms promise cannot materialize. If disturbances in the computer quickly collapse superposed qubits into classical states, or break the entanglement between several qubits, what we have is not a quantum computer but an extremely expensive machine that only runs a handful of algorithms at speeds equivalent to a normal computer (and will probably give erroneous results).

Of the two properties, entanglement is the harder to maintain and to prove. The more qubits there are, the easier it is for one of them to become disentangled (which explains why increasing the number of qubits is not a trivial task). And it is not enough to build the computer and see that correct results come out to claim the qubits are entangled: looking for evidence of entanglement is a task in itself, and in fact the lack of such evidence was one of the main criticisms of D-Wave's systems in their early days.

A priori, with the materials quantum computers are currently built from, miniaturization does not seem very feasible. But there is already research into new materials that could be used to create more accessible quantum computers. Who knows: maybe fifty years from now we will be able to buy quantum CPUs to improve the speed of our computers.

Posted in Quantum Computing | Comments Off on Will Quantum Computing Really Change The World? Facts And Myths – Analytics India Magazine

Wiring the Quantum Computer of the Future: A Novel Simple Build with Existing Technology – Analytics Insight


The basic units of a quantum computer can be rearranged in 2D to solve typical design and operation challenges

Efficient quantum computing is expected to enable advancements that are impossible with classical computers. Scientists from Japan and Sydney have collaborated and proposed a novel two-dimensional design that can be constructed using existing integrated circuit technology. This design solves typical problems facing the current three-dimensional packaging for scaled-up quantum computers, bringing the future one step closer.

Quantum computing is increasingly becoming the focus of scientists in fields such as physics and chemistry, and of industrialists in the pharmaceutical, airplane, and automobile industries. Globally, research labs at companies like Google and IBM are spending extensive resources on improving quantum computers, and with good reason. Quantum computers use the fundamentals of quantum mechanics to process significantly greater amounts of information much faster than classical computers. It is expected that when error-corrected and fault-tolerant quantum computation is achieved, scientific and technological advancement will occur at an unprecedented scale.

But, building quantum computers for large-scale computation is proving to be a challenge in terms of their architecture. The basic units of a quantum computer are the quantum bits or qubits. These are typically atoms, ions, photons, subatomic particles such as electrons, or even larger elements that simultaneously exist in multiple states, making it possible to obtain several potential outcomes rapidly for large volumes of data. The theoretical requirement for quantum computers is that these are arranged in two-dimensional (2D) arrays, where each qubit is both coupled with its nearest neighbor and connected to the necessary external control lines and devices. When the number of qubits in an array is increased, it becomes difficult to reach qubits in the interior of the array from the edge. The need to solve this problem has so far resulted in complex three-dimensional (3D) wiring systems across multiple planes in which many wires intersect, making their construction a significant engineering challenge.

A group of scientists from Tokyo University of Science, Japan, RIKEN Centre for Emergent Matter Science, Japan, and University of Technology, Sydney, led by Prof Jaw-Shen Tsai, proposes a unique solution to this qubit accessibility problem by modifying the architecture of the qubit array. "Here, we solve this problem and present a modified superconducting micro-architecture that does not require any 3D external line technology and reverts to a completely planar design," they say. This study has been published in the New Journal of Physics.

The scientists began with a qubit square lattice array and stretched out each column in the 2D plane. They then folded each successive column on top of each other, forming a dual one-dimensional array called a bi-linear array. This put all qubits on the edge and simplified the arrangement of the required wiring system. The system is also completely in 2D. In this new architecture, some of the inter-qubit wiring (each qubit is also connected to all adjacent qubits in the array) does overlap, but because these are the only overlaps in the wiring, simple local 3D structures such as airbridges at the points of overlap are enough, and the system overall remains in 2D. As you can imagine, this simplifies its construction considerably.
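As a rough illustration of the idea (this is an invented index mapping, not the exact construction from the paper), one can picture stretching and folding an N×N lattice into two rows so that every qubit ends up on an edge:

```python
# Illustrative sketch only: label each qubit of an N x N lattice by its
# (row, col) position, then rearrange it into two long rows, with
# even-numbered columns feeding one row and odd-numbered columns the other.
N = 4
lattice = [[(r, c) for c in range(N)] for r in range(N)]

top    = [lattice[r][c] for c in range(0, N, 2) for r in range(N)]
bottom = [lattice[r][c] for c in range(1, N, 2) for r in range(N)]

# Every one of the N*N qubits now sits in one of two rows of length N*N/2,
# so each is reachable from outside the array without 3D wiring planes.
print(top)
print(bottom)
```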

The scientists evaluated the feasibility of this new arrangement through numerical and experimental evaluation in which they tested how much of a signal was retained before and after it passed through an airbridge. Results of both evaluations showed that it is possible to build and run this system using existing technology and without any 3D arrangement.

The scientists' experiments also showed them that their architecture solves several problems that plague the 3D structures: they are difficult to construct, there is crosstalk or signal interference between waves transmitted across two wires, and the fragile quantum states of the qubits can degrade. The novel pseudo-2D design reduces the number of times wires cross each other, thereby reducing the crosstalk and consequently increasing the efficiency of the system.

At a time when large labs worldwide are attempting to find ways to build large-scale fault-tolerant quantum computers, the findings of this exciting new study indicate that such computers can be built using existing 2D integrated circuit technology. "The quantum computer is an information device expected to far exceed the capabilities of modern computers," Prof Tsai states. The research journey in this direction has only begun with this study, and Prof Tsai concludes by saying, "We are planning to construct a small-scale circuit to further examine and explore the possibility."


Reference
Title of original paper: Pseudo-2D superconducting quantum computing circuit for the surface code: the proposal and preliminary tests

Journal: New Journal of Physics

DOI: 10.1088/1367-2630/ab7d7d

Tokyo University of Science (TUS) is a well-known and respected university, and the largest science-specialized private research university in Japan, with four campuses in central Tokyo and its suburbs and in Hokkaido. Established in 1881, the university has continually contributed to Japan's development in science through inculcating the love for science in researchers, technicians, and educators.

With a mission of "Creating science and technology for the harmonious development of nature, human beings, and society," TUS has undertaken a wide range of research from basic to applied science. TUS has embraced a multidisciplinary approach to research and undertaken intensive study in some of today's most vital fields. TUS is a meritocracy where the best in science is recognized and nurtured. It is the only private university in Japan that has produced a Nobel Prize winner and the only private university in Asia to produce Nobel Prize winners within the natural sciences field.

Website: https://www.tus.ac.jp/en/mediarelations/

Dr Jaw-Shen Tsai is currently a Professor at the Tokyo University of Science, Japan. He began research in physics in 1975 and continues to hold interest in areas such as superconductivity, the Josephson effect, quantum physics, coherence, qubits, and artificial atoms. He has 160+ research publications to his credit and is the lead author of this paper. He has also won several awards, including Japan's Medal of Honor with Purple Ribbon.

Professor Jaw-Shen Tsai

Department of Physics

Tokyo University of Science

Tsutomu Shimizu

Public Relations Divisions

Tokyo University of Science

Email: mediaoffice@admin.tus.ac.jp

Website: https://www.tus.ac.jp/en/mediarelations/


Posted in Quantum Computing | Comments Off on Wiring the Quantum Computer of the Future: A Novel Simple Build with Existing Technology – Analytics Insight

Eleven Princeton faculty elected to American Academy of Arts and Sciences – Princeton University

Princeton faculty members Rubén Gallo, M. Zahid Hasan, Amaney Jamal, Ruby Lee, Margaret Martonosi, Tom Muir, Eve Ostriker, Alexander Smits, Leeat Yariv and Muhammad Qasim Zaman have been named members of the American Academy of Arts and Sciences. Visiting faculty member Alondra Nelson also was elected to the academy.

They are among 276 scholars, scientists, artists and leaders in the public, nonprofit and private sectors elected this year in recognition of their contributions to their respective fields.

Gallo is the Walter S. Carpenter, Jr., Professor in Language, Literature, and Civilization of Spain and a professor of Spanish and Portuguese. He joined the Princeton faculty in 2002. His most recent book is Conversación en Princeton (2017) with Mario Vargas Llosa, who was teaching at Princeton when he received the Nobel Prize in Literature in 2010.

Gallo's other books include Proust's Latin Americans (2014); Freud's Mexico: Into the Wilds of Psychoanalysis (2010); Mexican Modernity: The Avant-Garde and the Technological Revolution (2005); New Tendencies in Mexican Art (2004); and The Mexico City Reader (2004). He is currently working on Cuba: A New Era, a book about the changes in Cuban culture after the diplomatic thaw with the United States.

Gallo received the Gradiva award for the best book on a psychoanalytic theme and the Modern Language Association's Katherine Singer Kovacs Prize for the best book on a Latin American topic. He is a member of the board of the Sigmund Freud Museum in Vienna, where he also serves as research director.


Hasan is the Eugene Higgins Professor of Physics. He studies fundamental quantum effects in exotic superconductors, topological insulators and quantum magnets to make new discoveries about the nature of matter, work that may have future applications in areas such as quantum computing. He joined the faculty in 2002 and has since led his research team to publish many influential findings.

Last year, Hasan's lab led research that discovered that certain classes of crystals with an asymmetry like biological handedness, known as chiral crystals, may harbor electrons that behave in unexpected ways. In 2015, he led a research team that first observed Weyl fermions, which, if applied to next-generation electronics, could allow for a nearly free and efficient flow of electricity in electronics, and thus greater power, especially for computers.

In 2013, Hasan was named a fellow of the American Physical Society for the experimental discovery of three-dimensional topological insulators, a new kind of quantum matter. In 2009, he received a Sloan Research Fellowship for groundbreaking research.


Jamal is the Edwards S. Sanford Professor of Politics and director of the Mamdouha S. Bobst Center for Peace and Justice. She has taught at Princeton since 2003. Her current research focuses on the drivers of political behavior in the Arab world, Muslim immigration to the U.S. and Europe, and the effect of inequality and poverty on political outcomes.

Jamal also directs the Workshop on Arab Political Development and the Bobst-AUB Collaborative Initiative. She is also principal investigator for the Arab Barometer project, which measures public opinion in the Arab world. She is the former president of the Association of Middle East Women's Studies.

Her books include Barriers to Democracy (2007), which won the 2008 APSA Best Book Award in comparative democratization, and Of Empires and Citizens, which was published by Princeton University Press (2012). She is co-editor of Race and Arab Americans Before and After 9/11: From Invisible Citizens to Visible Subjects (2007) and Citizenship and Crisis: Arab Detroit after 9/11 (2009).


Lee is the Forrest G. Hamrick Professor in Engineering and professor of electrical engineering. She is an associated faculty member in computer science. Lee joined the Princeton faculty in 1998. Her work at Princeton explores how the security and performance of computing systems can be significantly and simultaneously improved by hardware architecture. Her designs of secure processor architectures have strongly influenced industry security offerings and also inspired new generations of academic researchers in hardware security, side-channel attacks and defenses, secure processors and caches, and enhanced cloud computing and smartphone security.

Her research lies at the intersection of computer architecture, cybersecurity and, more recently, the branch of artificial intelligence known as deep learning.

Lee spent 17 years designing computers at Hewlett-Packard, and was a chief architect there before coming to Princeton. Among many achievements, Lee is known in the computer industry for her design of the HP Precision Architecture (HPPA or PA-RISC) that powered HP's commercial and technical computer product families for several decades, and was widely regarded as introducing key forward-looking features. In the '90s she spearheaded the development of microprocessor instructions for accelerating multimedia, which enabled video and audio streaming, leading to ubiquitous digital media. Lee is a fellow of the Association for Computing Machinery and the Institute of Electrical and Electronics Engineers.

Margaret Martonosi, the Hugh Trumbull Adams '35 Professor of Computer Science, specializes in computer architecture and mobile computing with an emphasis on power efficiency. She was one of the architects of the Wattch power modeling infrastructure, a tool that was among the first to allow computer scientists to incorporate power consumption into early-stage computer systems design. Her work helped demonstrate that power needs can help dictate the design of computing systems. More recently, Martonosi's work has also focused on architecture and compiler issues in quantum computing.

She currently serves as head of the National Science Foundation's Directorate for Computer and Information Science and Engineering, one of seven top-level divisions within the NSF. From 2017 until February 2020, she directed Princeton's Keller Center for Innovation in Engineering Education, a center focused on enabling students across the University to realize their aspirations for addressing societal problems. She is an inventor who holds seven U.S. patents and has co-authored two technical reference books on power-aware computer architecture. In 2018, she was one of 13 co-authors of a National Academies consensus study report on progress and challenges in quantum computing.

Martonosi is a fellow of the Association for Computing Machinery (ACM) and the Institute of Electrical and Electronics Engineers (IEEE). Among other honors, she has received a Jefferson Science Fellowship, the IEEE Technical Achievement Award, and the ACM SIGARCH Alan D. Berenbaum Distinguished Service Award. She joined the Princeton faculty in 1994.

Muir is the Van Zandt Williams, Jr. Class of '65 Professor of Chemistry and chair of the chemistry department. He joined Princeton in 2011 and is also an associated faculty member in molecular biology.

He leads research in investigating the physiochemical basis of protein function in complex systems of biomedical interest. By combining tools of organic chemistry, biochemistry, biophysics and cell biology, his lab has developed a suite of new technologies that provide fundamental insight into how proteins work. The chemistry-driven approaches pioneered by Muir's lab are now widely used by chemical biologists around the world.

Muir has published over 150 scientific articles and has won a number of honors for his research. He received a MERIT Award from the National Institutes of Health and is a fellow of the American Association for the Advancement of Science and the Royal Society of Edinburgh.

Nelson is the Harold F. Linder Chair in the School of Social Science at the Institute for Advanced Study and a visiting lecturer with the rank of professor in sociology at Princeton. She is president of the Social Science Research Council and is one of the country's foremost thinkers in the fields of science, technology, social inequality and race. Her groundbreaking books include "The Social Life of DNA: Race, Reparations, and Reconciliation after the Genome" (2016) and "Body and Soul: The Black Panther Party and the Fight Against Medical Discrimination" (2011). Her other books include "Genetics and the Unsettled Past: The Collision of DNA, Race, and History" (with Keith Wailoo of Princeton and Catherine Lee) and "Technicolor: Race, Technology, and Everyday Life" (with Thuy Linh Tu). In 2002 she edited "Afrofuturism," a special issue of Social Text.

Nelson's writings and commentary also have reached the broader public through a variety of outlets. She has contributed to national policy discussions on inequality and the implications of new technology on society.

She is an elected fellow of the American Academy of Political and Social Science, the Hastings Center and the Sociological Research Association. She serves on several advisory boards, including the Andrew W. Mellon Foundation and the American Association for the Advancement of Science.

Ostriker, professor of astrophysical sciences, studies the universe. Her research is in the area of theoretical and computational astrophysics, and the tools she uses are powerful supercomputers and algorithms capable of simulating the birth, life, death and reincarnation of stars in their galactic homes. Ostriker and her fellow researchers build computer models using fundamental physical laws (ones that govern gravity, fluid dynamics and electromagnetic radiation) to follow the evolution of conditions found in deep space.

Ostriker, who came to Princeton in 2012, and her team have explored the formation of superbubbles, giant fronts of hot gas that billow out from a cluster of supernova explosions. More recently, she and her colleagues turned their focus toward interstellar clouds.

The research team uses computing resources through the Princeton Institute for Computational Science and Engineering and its TIGER and Perseus research computing clusters, as well as supercomputers administered through NASA. In 2017, Ostriker received a Simons Investigator Award.


Smits is the Eugene Higgins Professor of Mechanical and Aerospace Engineering, Emeritus. His research spans the field of fluid mechanics, including fundamental turbulence, supersonic and hypersonic flows, bio-inspired flows, sports aerodynamics, and novel energy-harvesting concepts.

He joined the Princeton faculty in 1981 and transferred to emeritus status in 2018. Smits served as chair of the Department of Mechanical and Aerospace Engineering for 13 years and was director of the Gas Dynamics Laboratory on the Forrestal Campus for 33 years. During that time, he received several teaching awards, including the President's Award for Distinguished Teaching.

Smits has written more than 240 articles and three books, and edited seven volumes. He was awarded seven patents and helped found three companies. He is a member of the National Academy of Engineering and a fellow of the American Physical Society, the American Institute of Aeronautics and Astronautics, the American Society of Mechanical Engineers, the American Association for the Advancement of Science, and the Australasian Fluid Mechanics Society.

Yariv is the Uwe Reinhardt Professor of Economics. An expert in applied theory and experimental economics, her research interests concentrate on game theory, political economy, psychology and economics. She joined the faculty in 2018. Yariv also is director of the Princeton Experimental Laboratory for the Social Sciences.

She is a member of several professional organizations and is lead editor of American Economic Journal: Microeconomics, a research associate with the Political Economy Program of the National Bureau of Economic Research, and a research fellow with the Industrial Organization Programme of the Centre for Economic Policy Research.

She is also a fellow of the Econometric Society and the Society for the Advancement of Economic Theory, and has received numerous grants for research and awards for her many publications.

Zaman, who joined the Princeton faculty in 2006, is the Robert H. Niehaus '77 Professor of Near Eastern Studies and Religion and chair of the Department of Near Eastern Studies.

He has written on the relationship between religious and political institutions in medieval and modern Islam, on social and legal thought in the modern Muslim world, on institutions and traditions of learning in Islam, and on the flow of ideas between South Asia and the Arab Middle East. He is the author of Religion and Politics under the Early Abbasids (1997), The Ulama in Contemporary Islam: Custodians of Change (2002), Ashraf Ali Thanawi: Islam in Modern South Asia (2008), Modern Islamic Thought in a Radical Age: Religious Authority and Internal Criticism (2012), and Islam in Pakistan: A History (2018). With Robert W. Hefner, he is also the co-editor of Schooling Islam: The Culture and Politics of Modern Muslim Education (2007); with Roxanne L. Euben, of Princeton Readings in Islamist Thought (2009); and, as associate editor, with Gerhard Bowering et al., of the Princeton Encyclopedia of Islamic Political Thought (2013). Among his current projects is a book on South Asia and the wider Muslim world in the 18th and 19th centuries.

In 2017, Zaman received Princeton's Graduate Mentoring Award. In 2009, he received a Guggenheim Fellowship.

The mission of the academy: Founded in 1780, the American Academy of Arts and Sciences honors excellence and convenes leaders from every field of human endeavor to examine new ideas, address issues of importance to the nation and the world, and work together to cultivate every art and science which may tend to advance the interest, honor, dignity, and happiness of a free, independent, and virtuous people.

Posted in Quantum Computing | Comments Off on Eleven Princeton faculty elected to American Academy of Arts and Sciences – Princeton University

RMACC’s 10th High Performance Computing Symposium to Be Held Free Online – HPCwire

BOULDER, Colo., April 22, 2020. The Rocky Mountain Advanced Computing Consortium (RMACC) will hold its 10th annual High Performance Computing Symposium as a multi-track online event on May 20-21. Registration for the event will be free to all who would like to attend.

The online Symposium will include presentations by two keynote speakers and a full slate of tutorial sessions. Another longtime Symposium tradition, a poster competition for students to showcase their own research, will also continue. Competition winners will receive an all-expenses-paid trip to SC20 in Atlanta.

Major sponsor support is being provided by Intel, Dell and HPE with additional support from ARM, IBM, Lenovo and Silicon Mechanics.

Links to the Symposium registration, its schedule, and how to enter the poster competition can be found at www.rmacc.org/hpcsymposium.

The keynote speakers are Dr. Nick Bronn, a Research Staff Member in IBM's Experimental Quantum Computing group, and Dr. Jason Dexter, a working group coordinator for the groundbreaking black hole imaging studies published by the Event Horizon Telescope.

Dr. Bronn works at IBM's TJ Watson Research Center in Yorktown Heights, NY. He has been responsible for qubit (quantum bit) device design, packaging, and cryogenic measurement, working toward scaling up to larger numbers of qubits on a device and integrating them with novel implementations of microwave and cryogenic hardware. He will speak on the topic "Benchmarking and Enabling Noisy Near-term Quantum Hardware."

Dr. Dexter is a member of the astrophysical and planetary sciences faculty at the University of Colorado Boulder. He will speak on the role of high performance computing in understanding what we see in the first image of a black hole. Dr. Dexter is a member of both the Event Horizon Telescope and VLTI/GRAVITY collaborations, which can now image black holes.

Their appearances, along with the many tutorial sessions, continue the RMACC's annual tradition of showcasing cutting-edge HPC achievements in both education and industry.

The largest consortium of its kind, the RMACC is a collaboration among 30 academic and government research institutions in Arizona, Colorado, Idaho, Montana, Nevada, New Mexico, Utah, Washington and Wyoming. The consortium's mission is to facilitate widespread, effective use of high performance computing throughout the 9-state intermountain region.

More about the RMACC and its mission can be found at the website: www.rmacc.org.

About RMACC

Primarily a volunteer organization, the RMACC is a collaboration among 30 academic and research institutions located in Arizona, Colorado, Idaho, Montana, Nevada, New Mexico, Utah, Washington and Wyoming. The RMACC's mission is to facilitate widespread, effective use of high performance computing throughout this 9-state intermountain region.

Source: RMACC

Posted in Quantum Computing | Comments Off on RMACC's 10th High Performance Computing Symposium to Be Held Free Online – HPCwire

Advanced Encryption Standard (AES): What It Is and How It Works – Hashed Out by The SSL Store – Hashed Out by The SSL Store

Understanding the advanced encryption standard on a basic level doesn't require a degree in computer science or Matrix-level consciousness. Let's break AES encryption down into layman's terms.

Hey, all. Information security has been a hot topic since, well, forever. We entrust our personal and sensitive information to lots of major entities and still have problems with data breaches, data leaks, and so on. Some of this happens because of weak networking security protocols or bad authentication management practices, but, really, there are many ways a data breach can occur. Actually decrypting a ciphertext without the key, however, is far more difficult. For that, we can thank encryption algorithms like the popular advanced encryption standard and the secret keys that scramble our data into indecipherable gibberish.

Let's look into how AES works and its different applications. We'll be getting into a little matrix-based math, so grab your red pill and see how far this rabbit hole goes.

Let's hash it out.

You may have heard of the advanced encryption standard, or AES for short, but may not know the answer to the question "what is AES?" Here are four things you need to know about AES:

The National Institute of Standards and Technology (NIST) established AES as an encryption standard nearly 20 years ago to replace the aging data encryption standard (DES). After all, AES encryption keys can go up to 256 bits, whereas DES stopped at just 56 bits. NIST could have chosen a cipher that offered greater security, but the tradeoff would have required greater overhead that wouldn't be practical. So, they went with one that had great all-around performance and security.

AES's track record is so successful that many entities and agencies have approved it and use it to encrypt sensitive information. The National Security Agency (NSA), as well as other governmental bodies, uses AES encryption and keys to protect classified or other sensitive information. Furthermore, AES is often included in commercial products, including but not limited to:

Although it wouldn't literally take forever, it would take far longer than any of our lifetimes to crack an AES 256-bit encryption key using modern computing technology. This is from a brute force standpoint, as in trying every combination until we hear the click/unlocking sound. Certain protections are also put in place to prevent things like this from happening quickly, such as a limit on password attempts before a lockout, which may include a waiting period before trying again. When computation happens in milliseconds, waiting 20 minutes to try another five attempts seriously adds to the time it takes to crack a key.

Just how long would it take? We are venturing into "a thousand monkeys working on a thousand typewriters to write A Tale of Two Cities" territory. The number of possible combinations for AES 256-bit encryption is 2^256. Even if a computer can do multiple quadrillions of instructions per second, we are still in that eagle's-wings-eroding-Mount-Everest time frame.
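A quick back-of-the-envelope check in Python, assuming a very generous, hypothetical guessing rate of 10^18 keys per second:

```python
# Back-of-the-envelope estimate for exhausting the AES-256 key space.
keyspace = 2 ** 256
guesses_per_second = 10 ** 18          # hypothetical, very generous rate
seconds_per_year = 60 * 60 * 24 * 365

years = keyspace / guesses_per_second / seconds_per_year
print(f"{years:.3e} years")            # on the order of 10**51 years
```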

Needless to say, it's waaaaaaaaaaaaaaaaaaay (there's not enough memory on our computers to hold the number of a's I want to convey) longer than our universe has been in existence. And that's just for a 16-byte block of data. So, as you can see, brute forcing AES, even 128-bit AES, is futile.

That would likely change, though, once quantum computing becomes a little more mainstream, available, and effective. Quantum computing is expected to break AES encryption and require other methods to protect our data, but that's still a ways down the road.


To better understand what AES is, you need to understand how it works. But before we can see how the advanced encryption standard actually works, we first need to look at how it is set up and the rules of the process based on the user's chosen encryption strength. Typically, when we talk about higher bit levels of security, we're talking about things that are more secure and more difficult to break or hack. While the data blocks are always 128 bits, the key comes in a few varying lengths: 128 bits, 192 bits, and 256 bits. What does this mean? Let's back it up for a second here.

We know that encryption typically means scrambling information into something unreadable, with an associated key to decrypt the scramble. AES uses four scrambling operations applied in rounds, meaning it performs the operations and then repeats the process on the previous round's results a set number of times. Simplistically, if we put in X and get out Y, that would be one round. We would then put Y through the paces and get out Z for round 2. Rinse and repeat until we have completed the specified number of rounds.

The AES key size, specified above, determines the number of rounds that the procedure will execute. For example:

128-bit key: 10 rounds
192-bit key: 12 rounds
256-bit key: 14 rounds

As mentioned, each round has four operations.

So, you've arrived this far. Now, you may be asking: why, oh why, didn't I take the blue pill?

Before we get to the operational parts of the advanced encryption standard, let's look at how the data is structured. The data that the operations are performed on is not left-to-right sequential as we normally think of it. It's stacked in a 4×4 matrix of 128 bits (16 bytes) per block, in an array that's known as a state. A state looks something like this:

So, if your message was "blue pill or red", it would look something like this:

So, just to be clear, this is a single 16-byte block; every group of 16 bytes in a file is arranged in this fashion. At this point, the systematic scramble begins through the application of each AES encryption operation.
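Here is a small Python sketch (illustrative, not from the article) of how the 16 bytes of "blue pill or red" fill the state column by column:

```python
msg = b"blue pill or red"            # exactly 16 bytes, one AES block
assert len(msg) == 16

# AES arranges the block into a 4x4 "state", filling column by column.
state = [[msg[r + 4 * c] for c in range(4)] for r in range(4)]

for row in state:
    print([chr(b) for b in row])
# ['b', ' ', 'l', ' ']
# ['l', 'p', ' ', 'r']
# ['u', 'i', 'o', 'e']
# ['e', 'l', 'r', 'd']
```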

As mentioned earlier, once we have our data arrangement, there are certain linked operations that will perform the scramble on each state. The purpose here is to convert the plaintext data into ciphertext through the use of a secret key.

The four types of AES operations are as follows (note: we'll get into the order of the operations in the next section):

As mentioned earlier, the key size determines the number of rounds of scrambling that will be performed. AES encryption uses the Rijndael Key Schedule, which derives the subkeys from the main key to perform the Key Expansion.

The AddRoundKey operation takes the current state of the data and executes the XOR Boolean operation against the current round subkey. XOR means "exclusive or," which yields a result of true only if the inputs differ (i.e., one input must be 1 and the other must be 0). There is a unique subkey per round, plus one more (which is applied at the end).
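A minimal sketch of AddRoundKey, assuming the state and subkey are both held as 4×4 lists of byte values like the state shown earlier:

```python
def add_round_key(state, round_key):
    """XOR each byte of the state with the matching byte of the round subkey."""
    return [[s ^ k for s, k in zip(srow, krow)]
            for srow, krow in zip(state, round_key)]

# XOR is its own inverse, which is part of what makes decryption possible:
# applying the same subkey twice returns the original state.
```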

The SubBytes operation, which stands for "substitute bytes," takes the 16-byte block and runs it through an S-box (substitution box) to produce alternate values. Simply put, the operation takes each value and replaces it by spitting out another value.

The actual S-box construction is a complicated process, but just know that it's nearly impossible to reverse without the key using conventional computing. Coupled with the rest of the AES operations, it does its job to effectively scramble and obfuscate the source data. The S in the white box in the image above represents the complex lookup table for the S-box.
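A sketch of the SubBytes mechanics; note that the table below is a stand-in permutation for illustration only, not the real AES S-box, which is a specific 256-entry table defined in the standard:

```python
# Toy stand-in for the AES S-box: a fixed permutation of 0..255.
# (An affine map mod 256 with an odd multiplier is a bijection, so every
# byte maps to a unique replacement. The real S-box is a different,
# carefully designed table.)
TOY_SBOX = [(17 * b + 1) % 256 for b in range(256)]

def sub_bytes(state):
    """Replace every byte of the state via the substitution table."""
    return [[TOY_SBOX[b] for b in row] for row in state]
```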

The ShiftRows operation is a little more straightforward and easier to understand. The idea of ShiftRows is to shift the positions of the data within their respective rows, with wrap-around. Remember, the data is stacked in the state arrangement and not left to right like most of us are used to reading. The image provided helps to visualize this operation.

The first row goes unchanged. The second row shifts its bytes to the left by one position with wrap-around. The third row shifts its bytes one position beyond that, moving them to the left by a total of two positions with wrap-around. Likewise, the fourth row shifts its bytes to the left by a total of three positions with wrap-around.
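In code, the row rotation is a one-liner (again assuming the 4×4 list-of-lists state used above):

```python
def shift_rows(state):
    """Rotate row r of the state left by r positions (row 0 is unchanged)."""
    return [row[r:] + row[:r] for r, row in enumerate(state)]

# Row 0: no shift; row 1: left by 1; row 2: left by 2; row 3: left by 3,
# all with wrap-around, exactly as described above.
```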

The MixColumns operation, in a nutshell, is a linear transformation of the columns of the state. It uses matrix multiplication and bitwise XOR addition to produce its results. The column data, which can be represented as a 4×1 matrix, is multiplied against a fixed 4×4 matrix, with the arithmetic carried out in a structure called a Galois field. That will look something like the following:

As you can see, four bytes go in and are run against the 4×4 matrix. In this case, matrix multiplication has each input byte affecting each output byte and, obviously, yields an output of the same size.
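Here is a sketch of MixColumns using the standard AES column matrix and GF(2^8) arithmetic, with byte-level multiplication done via the usual xtime trick:

```python
def xtime(b):
    """Multiply a byte by x (i.e. by 2) in GF(2^8) with the AES polynomial."""
    b <<= 1
    return (b ^ 0x1B) & 0xFF if b & 0x100 else b

def gmul(a, b):
    """Multiply two bytes in GF(2^8)."""
    result = 0
    for _ in range(8):
        if b & 1:
            result ^= a
        a, b = xtime(a), b >> 1
    return result

# The fixed MixColumns matrix from the AES standard.
MIX_MATRIX = [
    [2, 3, 1, 1],
    [1, 2, 3, 1],
    [1, 1, 2, 3],
    [3, 1, 1, 2],
]

def mix_single_column(col):
    """Multiply one 4-byte column by the fixed MixColumns matrix."""
    return [gmul(MIX_MATRIX[r][0], col[0]) ^ gmul(MIX_MATRIX[r][1], col[1])
            ^ gmul(MIX_MATRIX[r][2], col[2]) ^ gmul(MIX_MATRIX[r][3], col[3])
            for r in range(4)]

def mix_columns(state):
    """Apply the column transform to each of the four state columns."""
    cols = [[state[r][c] for r in range(4)] for c in range(4)]
    mixed = [mix_single_column(col) for col in cols]
    return [[mixed[c][r] for c in range(4)] for r in range(4)]
```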

Now that we have a decent understanding of the different operations used to scramble our data via AES encryption, we can look at the order in which these operations execute. It will be as such:

1. Key Expansion (derive the round subkeys from the main key)
2. Initial AddRoundKey
3. Each main round: SubBytes, ShiftRows, MixColumns, AddRoundKey
4. Final round: SubBytes, ShiftRows, AddRoundKey

Note: The MixColumns operation is not in the final round. Without getting into the actual math of this, there's no additional benefit to performing this operation. In fact, doing so would simply make the decryption process a bit more taxing in terms of overhead.
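Tying the sketches above together, the overall order of operations looks roughly like this. This is a structural sketch only: the round subkeys would come from the Rijndael key schedule (not shown), and the toy S-box above stands in for the real one.

```python
def encrypt_block(state, round_keys):
    """Structural sketch of the AES round order for one block.
    round_keys is a list of 4x4 subkeys produced by the key schedule."""
    state = add_round_key(state, round_keys[0])        # initial whitening

    for rk in round_keys[1:-1]:                        # the "middle" rounds
        state = sub_bytes(state)
        state = shift_rows(state)
        state = mix_columns(state)
        state = add_round_key(state, rk)

    state = sub_bytes(state)                           # final round:
    state = shift_rows(state)                          # no MixColumns
    state = add_round_key(state, round_keys[-1])
    return state
```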

If we consider the number of rounds and the operations per round involved, by the end of it you should have a nicely scrambled block. And that is only a 16-byte block; consider how little information that is in the big picture. It's minuscule compared to today's file and packet sizes! So, if each 16-byte block has seemingly no discernible pattern (at least, no pattern that can be deciphered in a timely manner), I'd say AES has done its job.

We know the advanced encryption standard algorithm itself is quite effective, but its level of effectiveness depends on how it's implemented. Unlike the brute force attacks mentioned above, effective attacks are typically launched against the implementation, not the algorithm itself. This can be compared to attacking users (as in phishing attacks) rather than attacking the hard-to-breach technology behind the service. These are considered side-channel attacks, where the attacks are carried out on other aspects of the overall process rather than the focal point of the security implementation.

While I always advocate going with a reasonable, effective security option, a lot of AES encryption is happening without you even knowing it. It's locking down spots of the computing world that would otherwise be wide open. In other words, there would be many more opportunities for hackers to capture data if the advanced encryption standard weren't implemented at all. We just need to know how to identify the open holes and figure out how to plug them. Some may be able to use AES, and others may need another protocol or process.

Appreciate the encryption implementations we have, use the best ones when needed, and happy scrutinizing!

Posted in Quantum Computing | Comments Off on Advanced Encryption Standard (AES): What It Is and How It Works – Hashed Out by The SSL Store

Google’s Head of Quantum Computing Hardware Resigns – WIRED

In late October 2019, Google CEO Sundar Pichai likened the latest result from the company's quantum computing hardware lab in Santa Barbara, California, to the Wright brothers' first flight.

One of the lab's prototype processors had achieved quantum supremacy, evocative jargon for the moment a quantum computer harnesses quantum mechanics to do something seemingly impossible for a conventional computer. In a blog post, Pichai said the milestone affirmed his belief that quantum computers might one day tackle problems like climate change, and the CEO also name-checked John Martinis, who had established Google's quantum hardware group in 2014.

Here's what Pichai didn't mention: soon after the team first got its quantum supremacy experiment working a few months earlier, Martinis says, he had been reassigned from a leadership position to an advisory one. Martinis tells WIRED that the change led to disagreements with Hartmut Neven, the longtime leader of Google's quantum project.

Martinis resigned from Google early this month. "Since my professional goal is for someone to build a quantum computer, I think my resignation is the best course of action for everyone," he adds.

A Google spokesman did not dispute this account, and says that the company is grateful for Martinis' contributions and that Neven continues to head the company's quantum project. Parent company Alphabet has a second, smaller quantum computing group at its X Labs research unit. Martinis retains his position as a professor at UC Santa Barbara, which he held throughout his tenure at Google, and says he will continue to work on quantum computing.

Google's quantum computing project was founded in 2006 by Neven, who pioneered Google's image search technology, and it initially focused on software. To start, the small group accessed quantum hardware from Canadian startup D-Wave Systems, including in collaboration with NASA.


The project took on greater scale and ambition when Martinis joined in 2014 to establish Google's quantum hardware lab in Santa Barbara, bringing along several members of his university research group. His nearby lab at UC Santa Barbara had produced some of the most prominent work in the field over the past 20 years, helping to demonstrate the potential of using superconducting circuits to build qubits, the building blocks of quantum computers.

Qubits are analogous to the bits of a conventional computer, but in addition to representing 1s and 0s, they can use quantum mechanical effects to attain a third state, dubbed a superposition, something like a combination of both. Qubits in superposition can work through some very complex problems, such as modeling the interactions of atoms and molecules, much more efficiently than conventional computer hardware.

How useful that is depends on the number and reliability of qubits in your quantum computing processor. So far the best demonstrations have used only tens of qubits, a far cry from the hundreds or thousands of high-quality qubits experts believe will be needed to do useful work in chemistry or other fields. Google's supremacy experiment used 53 qubits working together. They took minutes to crunch through a carefully chosen math problem that the company calculated would take a supercomputer on the order of 10,000 years, but that has no practical application.

Martinis leaves Google as the company and rivals that are working on quantum computing face crucial questions about the technology's path. Amazon, IBM, and Microsoft, as well as Google, offer their prototype technology to companies such as Daimler and JP Morgan so they can run experiments. But those processors are not large enough to work on practical problems, and it is not clear how quickly they can be scaled up.

When WIRED visited Google's quantum hardware lab in Santa Barbara last fall, Martinis responded optimistically when asked if his hardware team could see a path to making the technology practical. "I feel we know how to scale up to hundreds and maybe thousands of qubits," he said at the time. Google will now have to do it without him.

Posted in Quantum Computing | Comments Off on Google's Head of Quantum Computing Hardware Resigns – WIRED