The exaflops horizon - and beyond ...

On the one hand, they are expected to help us defeat cancer, accurately predict the weather, and master nuclear fusion. On the other hand, there are fears that they will cause global destruction or enslave humanity. For the moment, however, these computational monsters are capable of neither great good nor universal evil.

In the 1960s, the most efficient computers had power measured in megaflops (millions of floating-point operations per second). The first computer with processing power above 1 GFLOPS (gigaflops) was the Cray-2, produced by Cray Research in 1985. The first model to exceed 1 TFLOPS (teraflops) was ASCI Red, created by Intel in 1997. The 1 PFLOPS (petaflops) mark was reached by Roadrunner, released by IBM in 2008.

The current computing power record belongs to the Chinese Sunway TaihuLight and stands at 93 PFLOPS.

Although, as you can see, the most powerful machines have not yet reached hundreds of petaflops, more and more attention is being paid to exascale systems, whose power must be counted in exaflops (EFLOPS), i.e. more than 10¹⁸ operations per second. Such designs, however, are still only projects of varying degrees of sophistication.

FLOPS (floating-point operations per second) is a unit of computing power used primarily in scientific applications. It is more versatile than the previously used MIPS unit, which denotes the number of processor instructions per second. FLOPS is not an SI unit, but it can be interpreted as 1/s.
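
As a rough illustration, a machine's theoretical peak is usually estimated as nodes × cores × clock rate × floating-point operations per cycle. Here is a minimal Python sketch; all hardware figures in it are made-up examples, not real specifications:

```python
# Theoretical peak FLOPS = nodes * cores * clock * FLOPs per cycle.
# All hardware figures below are made-up examples, not real specs.

PREFIXES = {"MFLOPS": 1e6, "GFLOPS": 1e9, "TFLOPS": 1e12,
            "PFLOPS": 1e15, "EFLOPS": 1e18}

def peak_flops(nodes, cores_per_node, clock_hz, flops_per_cycle):
    return nodes * cores_per_node * clock_hz * flops_per_cycle

# A hypothetical exascale-class design: 100,000 nodes with 128 cores
# each, a 2 GHz clock, 40 double-precision FLOPs per core per cycle.
peak = peak_flops(100_000, 128, 2e9, 40)

for name, scale in PREFIXES.items():
    print(f"{peak / scale:.3e} {name}")   # ~1.024 EFLOPS at the top end
```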

You need an exascale for cancer

An exaflops, or a thousand petaflops, is more than the combined computing power of all the machines on today's TOP500 list. Scientists hope that a new generation of machines with such power will bring breakthroughs in various fields.

Exascale processing power combined with rapidly advancing machine learning technologies should help, for example, finally crack the cancer code. The amount of data that doctors must handle in order to diagnose and treat cancer is so huge that conventional computers struggle to cope with the task. In a typical single tumor biopsy study, more than 8 million measurements are taken, during which doctors analyze the behavior of the tumor, its response to pharmacological treatment, and the effect on the patient's body. "This is a real ocean of data," said Rick Stevens of the US Department of Energy's (DOE) Argonne National Laboratory.

By combining medical research with computing power, scientists are working on the CANDLE (CANcer Distributed Learning Environment) neural network system. It is expected to predict and develop treatment plans tailored to the individual needs of each patient, help scientists understand the molecular basis of key protein interactions, build predictive drug-response models, and suggest optimal treatment strategies. Argonne believes that exascale systems will be able to run the CANDLE application 50 to 100 times faster than the most powerful supermachines known today.
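
The real CANDLE software stack is far more elaborate, but the underlying idea of learning a drug-response predictor from tumor measurements can be sketched in a few lines of Python. Everything below (the data, sizes, and model) is synthetic and purely illustrative:

```python
# Toy illustration of the idea behind drug-response prediction:
# learn a mapping from tumor measurements to treatment response.
# Synthetic data only; the real CANDLE system is vastly more complex.
import numpy as np

rng = np.random.default_rng(0)
n_patients, n_features = 200, 50          # e.g. gene-expression levels
X = rng.normal(size=(n_patients, n_features))
true_w = rng.normal(size=n_features)
y = (X @ true_w + rng.normal(scale=0.5, size=n_patients) > 0).astype(float)

# Tiny logistic-regression "neural network" trained by gradient descent.
w = np.zeros(n_features)
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w)))     # predicted response probability
    w -= 0.1 * X.T @ (p - y) / n_patients  # gradient step on the log-loss

accuracy = np.mean((1 / (1 + np.exp(-(X @ w))) > 0.5) == y)
print(f"training accuracy: {accuracy:.2f}")
```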

We are therefore looking forward to the appearance of exascale supercomputers, but the first ones will not necessarily appear in the US. Of course, the US is in the race to create them: the government there, in a project known as Aurora, is cooperating with AMD, IBM, Intel and Nvidia in an effort to get ahead of foreign competitors. However, this is not expected to happen before 2021. Meanwhile, in January 2017, Chinese experts announced the creation of an exascale prototype. A fully functioning machine of this kind, Tianhe-3, is nevertheless unlikely to be ready within the next few years.

The Chinese hold tight

The fact is that since 2013, Chinese machines have topped the list of the most powerful computers in the world. For years Tianhe-2 dominated, and now the palm belongs to the aforementioned Sunway TaihuLight. It is believed that these two most powerful machines in the Middle Kingdom are far more powerful than all twenty-one supercomputers of the US Department of Energy combined.

American scientists, of course, want to regain the leading position they held five years ago, and they are working on a system that will allow them to do so. Summit (2), being built at the Oak Ridge National Laboratory in Tennessee, is a supercomputer scheduled for commissioning later this year that should surpass the power of Sunway TaihuLight. It will be used to test and develop new materials that are stronger and lighter, to simulate the interior of the Earth using acoustic waves, and to support astrophysics projects investigating the origin of the universe.

2. Spatial plan of the Summit supercomputer

At the aforementioned Argonne National Laboratory, scientists plan to build an even faster device soon. Known as A21, its performance is expected to reach 200 petaflops.

Japan is also taking part in the supercomputer race. Although somewhat overshadowed recently by the US-China rivalry, it is Japan that plans to launch the ABCI (AI Bridging Cloud Infrastructure) system, offering 130 petaflops of power. The Japanese hope that such a supercomputer can be used to develop AI (artificial intelligence) and deep learning.

Meanwhile, the European Parliament has just decided to build an EU supercomputer worth a billion euros. This computing monster will begin its work for the research centers of our continent at the turn of 2022 and 2023. The machine will be built within the EuroHPC project, and its construction will be financed by the Member States, so Poland will also participate. Its predicted power is commonly referred to as "pre-exascale".

So far, according to the 2017 ranking, of the five hundred fastest supercomputers in the world, China has 202 such machines (40%), while America controls 144 (29%).

China also accounts for 35% of the world's computing power on the list, compared to 30% for the US. The next countries with the most supercomputers on the list are Japan (35 systems), Germany (20), France (18) and the UK (15). It is worth noting that, regardless of the country of origin, all five hundred of the most powerful supercomputers run various versions of Linux ...

They design themselves

Supercomputers are already a valuable tool supporting science and technology industries. They enable researchers and engineers to make steady progress (and sometimes even huge leaps forward) in areas such as biology, weather and climate forecasting, astrophysics, and nuclear weapons.

How much further they go will depend on their power. Over the next decades, the use of supercomputers may significantly change the economic, military and geopolitical situation of the countries that have access to this type of cutting-edge infrastructure.

Progress in this field is so rapid that designing new generations of microprocessors has already become too difficult even for large teams of human engineers. For this reason, advanced software and supercomputers are increasingly playing a leading role in the development of computers, including those with the prefix "super".

3. Japanese supercomputer

Pharmaceutical companies will soon be able to take full advantage of supercomputing power to process huge numbers of human, animal and plant genomes, helping to create new medicines and treatments for various diseases.

Military applications are another reason (actually one of the main ones) why governments are investing so much in the development of supercomputers. More efficient machines will help future military leaders develop clear combat strategies in any situation, allow the development of more effective weapons systems, and support law enforcement and intelligence agencies in identifying potential threats in advance.

Not enough power for brain simulation

New supercomputers should help decipher the natural supercomputer known to us for a long time - the human brain.

An international team of scientists has recently developed an algorithm that represents an important new step in modeling the brain's neural connections. The new NEST algorithm, described in an open-access paper published in Frontiers in Neuroinformatics, is expected to simulate the 100 billion interconnected neurons of the human brain on supercomputers. Scientists from the German research center Jülich, the Norwegian University of Life Sciences, the University of Aachen, the Japanese RIKEN Institute and the KTH Royal Institute of Technology in Stockholm were involved in the work.

Since 2014, large-scale neural network simulations have been running on the RIKEN supercomputer in Japan and on JUQUEEN at the Jülich Supercomputing Centre in Germany, simulating the connections of approximately 1% of the neurons in the human brain. Why only that much? Can supercomputers not simulate the entire brain?

Susanne Kunkel of the KTH Royal Institute of Technology in Stockholm explains: during the simulation, a neuron's action potentials (short electrical impulses) must be sent to about 100,000 small computers, called nodes, each equipped with a number of processors that perform the actual calculations. Each node then checks which of these impulses are relevant to the virtual neurons that exist on that node.

4. Modeling the brain's neural connections: so far we are only at the beginning of the journey (1%)

Obviously, the amount of computer memory required by the processors for these additional bits per neuron increases with the size of the neural network. To go beyond the 1% simulation to the entire human brain (4) would require many times more memory than is available in any supercomputer today. Whole-brain simulation can therefore only be discussed in the context of future exascale supercomputers. This is where the next-generation NEST algorithm should come into play.
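
Why the old bookkeeping scheme cannot scale can be seen with a back-of-the-envelope estimate in Python. The figures below (one marker bit per neuron on each of roughly 100,000 nodes) are illustrative assumptions, not the numbers from the NEST paper:

```python
# Back-of-the-envelope estimate: if every node keeps one marker bit for
# every neuron in the network, per-node memory grows with network size.
# Assumed figures for illustration only, not the NEST paper's numbers.

def per_node_marker_bytes(total_neurons, bits_per_neuron=1):
    """Old scheme: each node stores a marker for every neuron."""
    return total_neurons * bits_per_neuron / 8

for neurons in (1e9, 1e10, 1e11):        # from ~1% up to ~100 bn neurons
    gib = per_node_marker_bytes(neurons) / 2**30
    print(f"{neurons:.0e} neurons -> {gib:6.2f} GiB of markers per node")

# At 10^11 neurons this is ~11.6 GiB per node for bookkeeping alone,
# repeated across each of ~100,000 nodes, before storing any synapses.
```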

TOP-5 supercomputers of the world

1. Sunway TaihuLight - a 93 PFLOPS supercomputer launched in 2016 in Wuxi, China. Since June 2016, it has topped the TOP500 list of the supercomputers with the highest computing power in the world.

2. Tianhe-2 (Milky Way-2) - a supercomputer with a computing power of 33.86 PFLOPS, built by the National University of Defense Technology (NUDT) in China. From June 2013 until June 2016, it was the fastest supercomputer in the world.

3. Piz Daint - a design developed by Cray, installed at the Swiss National Supercomputing Centre (CSCS). It was recently upgraded: the Nvidia Tesla K20X accelerators were replaced with new Tesla P100s, which made it possible to increase computing power from 9.8 to 19.6 PFLOPS in the summer of 2017.

4. Gyoukou - a supercomputer developed by ExaScaler and PEZY Computing, located at the Japan Agency for Marine-Earth Science and Technology (JAMSTEC) Yokohama Institute for Earth Sciences, on the same floor as the Earth Simulator. Power: 19.14 PFLOPS.

5. Titan - a 17.59 PFLOPS supercomputer manufactured by Cray Inc. and launched in October 2012 at the Oak Ridge National Laboratory in the United States. From November 2012 to June 2013, Titan was the world's fastest supercomputer. It is currently in fifth place, but is still the fastest supercomputer in the US.

They also compete for quantum supremacy

IBM believes that within the next five years it will not be supercomputers based on traditional silicon chips but quantum computers that begin to set the tone. The industry is just beginning to understand how quantum computers can be used, according to the company's researchers. Engineers are expected to discover the first major applications for these machines within just five years.

Quantum computers use a basic computing unit called a qubit. Ordinary semiconductors represent information as sequences of 1s and 0s, while qubits exhibit quantum properties and can perform calculations as 1 and 0 simultaneously. This means that two qubits can simultaneously represent the sequences 1-0, 1-1, 0-1 and 0-0. Computing power grows exponentially with each qubit, so in theory a quantum computer with just 50 qubits could have more processing power than the world's most powerful supercomputers.
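
To make the exponential growth tangible, here is a short numpy sketch (illustrative only): simulating n qubits classically means storing 2^n complex amplitudes, so every extra qubit doubles the memory needed, and around 50 qubits the state no longer fits in any existing machine.

```python
# State of n qubits = 2**n complex amplitudes; memory doubles per qubit.
import numpy as np

for n in (10, 30, 50):
    amplitudes = 2 ** n
    bytes_needed = amplitudes * 16         # complex128 = 16 bytes each
    print(f"{n:2d} qubits: {amplitudes:.3e} amplitudes, "
          f"{bytes_needed / 2**50:.3e} PiB")

# The two-qubit case from the text: a uniform superposition gives each
# of the sequences 00, 01, 10, 11 an amplitude of 1/2 (probability 1/4).
state = np.full(4, 0.5, dtype=np.complex128)
print(np.abs(state) ** 2)                  # [0.25 0.25 0.25 0.25]
```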

D-Wave Systems is already selling a quantum computer said to have 2,000 qubits. However, D-Wave's machines (5) are debatable. Although some researchers have put them to good use, they have still not outperformed classical computers and are only useful for certain classes of optimization problems.

5. D-Wave quantum computers

A few months ago, the Google Quantum AI Lab showed off a new 72-qubit quantum processor called Bristlecone (6). It may soon achieve "quantum supremacy" by surpassing a classical supercomputer, at least when it comes to solving some problems. When a quantum processor demonstrates a sufficiently low error rate in operation, it can be more efficient than a classical supercomputer at a well-defined computing task.

6. The 72-qubit Bristlecone quantum processor

Google's processor was not the first, either: in January, for example, Intel announced its own 49-qubit quantum system, and earlier IBM introduced a 50-qubit version. Intel's Loihi chip is innovative in other ways as well: it is the first "neuromorphic" integrated circuit, designed to mimic how the human brain learns and understands. It is "fully functional" and will be available to research partners later this year.
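
What a "neuromorphic" circuit computes can be sketched with the classic leaky integrate-and-fire neuron, the kind of spiking model such chips implement in silicon. This is a generic illustration, not Intel's Loihi API:

```python
# Minimal leaky integrate-and-fire neuron - the kind of spiking model
# that neuromorphic chips implement in silicon. Generic sketch only;
# this is not Intel's Loihi API.

def simulate_lif(inputs, leak=0.9, threshold=1.0):
    """Integrate input current with leakage; emit a spike at threshold."""
    v, spikes = 0.0, []
    for current in inputs:
        v = leak * v + current        # leaky integration of the input
        if v >= threshold:            # membrane potential crosses threshold
            spikes.append(1)
            v = 0.0                   # reset after the spike
        else:
            spikes.append(0)
    return spikes

print(simulate_lif([0.3, 0.4, 0.5, 0.0, 0.6, 0.6]))  # -> [0, 0, 1, 0, 0, 1]
```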

However, this is only the beginning, because to compete with the silicon monsters you need machines with millions of qubits. A group of scientists at the Delft University of Technology in the Netherlands hopes that the way to achieve such scale is to use silicon in quantum computers, as they have found a way to use silicon to create a programmable quantum processor.

In their study, published in the journal Nature, the Dutch team controlled the spin of a single electron using microwave energy. In silicon, the electron can spin up and down at the same time while being held in place. Once that was achieved, the team connected two electrons together and programmed them to run quantum algorithms.

In this way, it was possible to create a two-qubit quantum processor based on silicon, as Dr Tom Watson, one of the authors of the study, explained to the BBC. If Watson and his team manage to couple even more electrons, it could lead to a revolution in qubit processors, bringing us one step closer to the quantum computers of the future.
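
What "programming two coupled spins" amounts to can be illustrated with a minimal state-vector simulation: a Hadamard gate on one qubit followed by a CNOT entangles the pair, the elementary step behind most two-qubit quantum algorithms. This is a generic numpy sketch, not the Delft team's actual control code:

```python
# Minimal state-vector sketch of a two-qubit entangling sequence
# (illustration only - not the Delft silicon processor's control code).
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.array([1, 0, 0, 0], dtype=complex)  # both qubits start in |0>
state = np.kron(H, I) @ state                  # superpose the first qubit
state = CNOT @ state                           # entangle the pair

print(np.round(state, 3))   # (|00> + |11>)/sqrt(2): a Bell state
```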

"Whoever builds a fully functioning quantum computer will rule the world," Manas Mukherjee of the National University of Singapore, principal investigator at the National Centre for Quantum Technology, said recently in an interview. The race between the biggest technology companies and research labs currently focuses on so-called quantum supremacy, the point at which a quantum computer can perform calculations beyond anything the most advanced modern computers can offer.

The above examples of the achievements of Google, IBM and Intel indicate that US companies (and thus the United States itself) dominate this area. However, China's Alibaba Cloud recently released an 11-qubit processor-based cloud computing platform that allows scientists to test new quantum algorithms. This means that China is not sitting idle in the field of quantum computing either.

However, efforts to create quantum supercomputers not only generate enthusiasm about new possibilities, but also controversy.

A few months ago, during the International Conference on Quantum Technologies in Moscow, Alexander Lvovsky (7) of the Russian Quantum Center, who is also a professor of physics at the University of Calgary in Canada, said that the quantum computer is a tool of destruction, not creation.

7. Professor Alexander Lvovsky

What did he mean? First of all, digital security. Currently, all sensitive digital information transmitted over the Internet is encrypted to protect the privacy of the parties involved. We have already seen cases where hackers intercepted this data by breaking the encryption.

According to Lvovsky, the appearance of a quantum computer will only make life easier for cybercriminals. No encryption tool known today could withstand the processing power of a real quantum computer.

Medical records, financial information, and even the secrets of governments and military organizations would be laid bare, which would mean, as Lvovsky notes, that the new technology could threaten the entire world order. Other experts believe these fears are unfounded, since creating a real quantum supercomputer would also make it possible to initiate quantum cryptography, which is considered unbreakable.

Another approach

In addition to traditional computer technologies and the development of quantum systems, various centers are working on other methods for building supercomputers of the future.

The American agency DARPA funds six centers working on alternative computer design solutions. The architecture used in modern machines is conventionally called the von Neumann architecture, and it is already seventy years old. The defense organization's support for university researchers aims to develop a smarter approach to handling large amounts of data than ever before.

Buffering and parallel computing are some examples of the new methods these teams are working on. Another is ADA, which makes it easier to develop applications by combining CPU and memory components into a single assembly, rather than dealing with the issues of connecting them on a motherboard.

Last year, a team of researchers from the UK and Russia successfully demonstrated that a kind of "magic dust" composed of light and matter could ultimately surpass even the most powerful supercomputers in "performance".

Scientists from the British universities of Cambridge, Southampton and Cardiff and the Russian Skolkovo Institute used quantum particles known as polaritons, which can be described as something between light and matter. This is a completely new approach to computing. According to the scientists, it could form the basis of a new type of computer capable of solving currently unsolvable problems in various fields, such as biology, finance and space travel. The results of the study are published in the journal Nature Materials.

Remember that today's supercomputers can handle only a small fraction of possible problems. Even a hypothetical quantum computer, if finally built, would at best provide a quadratic speedup for solving the most complex problems. Meanwhile, the polaritons that make up the "magic dust" are created by activating layers of gallium, arsenic, indium and aluminum atoms with laser beams.

The electrons in these layers absorb and emit light of a specific color. Polaritons are ten thousand times lighter than electrons and can reach a density sufficient to give rise to a new state of matter known as a Bose-Einstein condensate (8). The quantum phases of the polaritons in it are synchronized and form a single macroscopic quantum object, which can be detected by photoluminescence measurements.

8. Plot showing a Bose-Einstein condensate

It turns out that in this particular state, a polariton condensate can solve the optimization problems we mentioned when describing quantum computers much more efficiently than qubit-based processors. The authors of the British-Russian study have shown that as the polaritons condense, their quantum phases arrange themselves in a configuration corresponding to the absolute minimum of a complex function.
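
The principle can be imitated classically. In the toy sketch below (an analogy only, not a polariton simulation), coupled phases settle by gradient descent into a configuration minimizing an XY-type energy function, just as the condensing polaritons settle into the minimum of the function being solved:

```python
# Toy analogy: phases settling into the minimum of an XY-type energy
# E = -1/2 * sum_ij J_ij * cos(theta_i - theta_j). Not a polariton model.
import numpy as np

rng = np.random.default_rng(1)
n = 6
J = rng.normal(size=(n, n)); J = (J + J.T) / 2; np.fill_diagonal(J, 0)
theta = rng.uniform(0, 2 * np.pi, size=n)   # random initial phases

def energy(theta):
    return -0.5 * np.sum(J * np.cos(theta[:, None] - theta[None, :]))

for _ in range(2000):                       # gradient descent on E
    grad = np.sum(J * np.sin(theta[:, None] - theta[None, :]), axis=1)
    theta -= 0.05 * grad

print(f"final energy: {energy(theta):.4f}") # a (local) minimum of E
```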

“We are at the beginning of exploring the potential of polariton graphs for solving complex problems,” writes Nature Materials co-author Prof. Pavlos Lagoudakis, head of the Hybrid Photonics Laboratory at the University of Southampton. “We are currently scaling our device to hundreds of nodes while testing its underlying processing power.”

Next to these experiments in the world of subtle quantum phases of light and matter, even quantum processors seem clumsy and firmly bound to ordinary reality. As you can see, scientists are not only working on the supercomputers of tomorrow and the machines of the day after tomorrow, but are already planning what comes after that.

For now, reaching exascale will be quite a challenge; then it will be time to think about the next milestones on the FLOPS scale (9). As you might have guessed, simply adding more processors and memory is not enough. If scientists are to be believed, achieving such computing power will allow us to solve megaproblems already known to us, such as deciphering cancer or analyzing astronomical data.

9. The future of supercomputing

Match the question with the answer

What's next?

Well, in the case of quantum computers, the question arises what they should actually be used for. According to the old adage, computers solve problems that would not exist without them. So we should probably build these futuristic supermachines first, and then the problems will arise by themselves.

In what areas can quantum computers be useful?

Artificial intelligence. AI works on the principle of learning through experience, becoming more and more accurate as feedback is received, until the computer program becomes "smart". The feedback is based on calculating the probabilities of many possible options. We already know that Lockheed Martin, for example, plans to use its D-Wave quantum computer to test autopilot software that is currently too complex for classical computers, and Google is using a quantum computer to develop software that can distinguish cars from landmarks.

Molecular modeling. Thanks to quantum computers, it will be possible to accurately model molecular interactions, looking for the optimal configurations for chemical reactions. Quantum chemistry is so complex that modern digital computers can only analyze the simplest molecules. Chemical reactions are quantum in nature, because they create highly entangled quantum states that overlap, so fully developed quantum computers should easily evaluate even the most complex processes. Google already has developments in this area: it has modeled the hydrogen molecule. The result will be more efficient products, from solar panels to medicines.

Cryptography. Security systems today depend on the efficient generation of prime numbers. This can be attacked with digital computers by searching through every possible factor, but the sheer amount of time required makes such "code breaking" expensive and impractical. Meanwhile, quantum computers can do this exponentially more efficiently than digital machines, meaning that today's security methods will soon become obsolete. There are also promising quantum encryption methods being developed to take advantage of the one-way nature of quantum entanglement. Citywide networks have already been demonstrated in several countries, and Chinese scientists recently announced that they successfully sent entangled photons from an orbiting "quantum" satellite to three separate base stations on Earth.
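
The asymmetry behind this is easy to demonstrate: multiplying two primes is instant, while recovering them by brute force takes time that grows rapidly with their size. A deliberately naive Python sketch follows (toy numbers; real keys use primes hundreds of digits long, and Shor's quantum algorithm would factor them in polynomial time):

```python
# Naive illustration of the factoring asymmetry behind RSA-style crypto:
# multiplying primes is cheap, recovering them by trial division is not.
# (Shor's algorithm on a quantum computer would do this in poly time.)

def trial_division(n):
    """Deliberately naive factoring: try every odd divisor up to sqrt(n)."""
    if n % 2 == 0:
        return 2
    d = 3
    while d * d <= n:
        if n % d == 0:
            return d
        d += 2
    return n                      # n itself is prime

p, q = 1_000_003, 1_000_033      # small primes; real keys use ~300 digits
n = p * q                        # building the "public key" is instant
print(trial_division(n))         # ...recovering p takes ~500,000 divisions
```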

Financial modeling. Modern markets are among the most complex systems in existence. Although a scientific and mathematical apparatus for describing and controlling them has been developed, the effectiveness of such efforts is still largely insufficient, due to one fundamental difference from other scientific disciplines: there is no controlled environment in which experiments can be run. To solve this problem, investors and analysts have turned to quantum computing. One immediate advantage is that the randomness inherent in quantum computers matches the stochastic nature of financial markets. Investors often want to evaluate the distribution of outcomes across a very large number of randomly generated scenarios.
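
That last sentence describes classic Monte Carlo simulation, sketched below for a single asset following geometric Brownian motion (all parameters are purely illustrative); proposed quantum approaches aim to reach the same accuracy with quadratically fewer samples:

```python
# Classic Monte Carlo sketch: simulate many random price scenarios and
# inspect the distribution of outcomes. Parameters are illustrative.
import numpy as np

rng = np.random.default_rng(42)
s0, mu, sigma = 100.0, 0.05, 0.2      # start price, drift, volatility
t, n_scenarios = 1.0, 100_000         # one year, 100k random scenarios

# Geometric Brownian motion endpoint for each scenario.
z = rng.standard_normal(n_scenarios)
s_t = s0 * np.exp((mu - 0.5 * sigma**2) * t + sigma * np.sqrt(t) * z)

print(f"mean outcome: {s_t.mean():.2f}")
print(f"5th / 95th percentile: {np.percentile(s_t, 5):.2f} / "
      f"{np.percentile(s_t, 95):.2f}")
```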

Weather forecasting. NOAA Chief Economist Rodney F. Weiher claims that almost 30% of US GDP ($6 trillion) depends directly or indirectly on the weather, for example in food production, transportation and retail. The ability to better predict the weather would thus be very useful in many areas, not to mention the longer lead time it would give for protection against natural disasters. The UK's national meteorological office, the Met Office, has already begun investing in such innovations to meet the power and scalability needs it will face from 2020 onward, and has published a report on its own exascale computing needs.

Particle physics. Models in particle physics are often extremely complex, intricate constructions that require enormous computational time for numerical simulations. This makes them ideal candidates for quantum computing, and scientists have already taken advantage of this. Researchers at the University of Innsbruck and the Institute for Quantum Optics and Quantum Information (IQOQI) recently used a programmable quantum system to perform such a simulation. According to a publication in Nature, the group used a simple version of a quantum computer in which ions performed logical operations, the basic steps of any computer calculation. "The simulation showed complete agreement with real experiments of the physics described," says theoretical physicist Peter Zoller.
