Supercomputer expert wins $1 million ‘Nobel Prize of computing’

A pioneer in the development of supercomputers has won this year’s Turing Award, computer science’s equivalent of the Nobel prize.

Jack Dongarra, a professor at the University of Tennessee in Knoxville and a researcher at the Oak Ridge National Laboratory, will receive the award and the $1 million check that accompanies it, the Association for Computing Machinery, which administers the prize, said.

The 71-year-old Dongarra has made numerous contributions to the field of supercomputing, including helping to establish the benchmark against which the world’s supercomputers are most commonly ranked. Before the ranking based on it was established in 1993, different manufacturers made competing claims about their computers that were difficult for scientists to judge.

Dongarra’s LINPACK Benchmark assesses supercomputers based on how many floating-point calculations they can perform while solving a standardized set of dense linear equations. Today the world’s most powerful supercomputer, the Fugaku—built by Fujitsu in Japan—achieves 442 petaflops (442 quadrillion floating-point calculations per second) on this benchmark. Dongarra is one of five researchers who founded and run the Top500, the non-profit organization that maintains the ranking of the world’s most powerful computers.
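
To make the measurement concrete, here is a minimal sketch of the idea behind the benchmark, written in Python with NumPy: time the solution of a dense linear system and convert a conventional operation count into a flop rate. This is an illustration only, not the official benchmark code, and the problem size is tiny by supercomputing standards.

```python
# A minimal sketch of the idea behind the LINPACK Benchmark (not the
# official HPL code): time the solution of a dense linear system Ax = b,
# then convert the conventional operation count for an LU-based solve
# (roughly 2n^3/3 flops for large n) into a flop rate.
import time
import numpy as np

n = 2000                                   # tiny by supercomputing standards
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)

start = time.perf_counter()
x = np.linalg.solve(A, b)                  # LU factorization + triangular solves
elapsed = time.perf_counter() - start

flops = (2.0 / 3.0) * n**3 + 2.0 * n**2    # conventional LINPACK flop count
print(f"~{flops / elapsed / 1e9:.1f} gigaflops")

# Sanity check: the computed solution should have a tiny relative residual.
print("residual:", np.linalg.norm(A @ x - b) / np.linalg.norm(b))
```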

The Turing Award winner also made important contributions to the programming libraries for linear algebra that underlie how many scientific and engineering algorithms run on computers of all sizes, from mobile phones to the largest supercomputers. “These libraries were essential in the growth of the field—allowing progressively more powerful computers to solve computationally challenging problems,” the Association for Computing Machinery said of Dongarra’s work.
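
As a small illustration of that layering, the sketch below calls a solver from LAPACK, one of the libraries Dongarra helped create, directly through SciPy’s bindings; the choice of tooling is mine, but everyday libraries such as NumPy route their linear algebra through these same routines.

```python
# A sketch of the library layering described above: application code reuses
# a battle-tested linear-algebra routine instead of reimplementing it.
# SciPy exposes LAPACK's dgesv (LU-factorize A, then solve Ax = b) directly.
import numpy as np
from scipy.linalg import lapack

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
b = rng.standard_normal((4, 1))

lu, piv, x, info = lapack.dgesv(A, b)  # info == 0 signals success
assert info == 0
print(x.ravel())
```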

Nick Higham, a professor of applied mathematics at the University of Manchester, said that “an underlying theme throughout Dongarra’s work is that mathematics, in particular linear algebra, provides a powerful language with which to reason about and design algorithms to exploit increasingly complex computer architectures.”

Dongarra helped standardize how computers split large problems into pieces that can be solved in parallel, rather than having to wait for each piece of the puzzle to be solved in turn. This vastly improved the efficiency with which high-performance computers can run complex algorithms.

Before Dongarra helped unify these methods, each computer company had its own protocols for parallel processing. “It was like the Wild West,” Dongarra told Fortune in an interview earlier this week, noting that programmers would have to rewrite a program for each different computer brand. Dongarra helped create the Message Passing Interface (MPI), which standardized this parallel processing so that programs can run on any machine.
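
A minimal sketch of what that portability looks like in practice, using the mpi4py bindings for Python (the choice of bindings is mine; MPI itself is language-neutral): the same program scatters work across processes, each works on its piece in parallel, and a reduction combines the results. It runs unchanged on any machine with an MPI library.

```python
# A minimal sketch of the MPI programming model via the mpi4py bindings.
# The same script runs unchanged on any machine with an MPI library.
# Launch with, for example:  mpiexec -n 4 python mpi_sum.py
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()   # this process's id, from 0 to size - 1
size = comm.Get_size()   # total number of cooperating processes

N = 1_000_000            # assumes size divides N evenly
data = np.arange(N, dtype="f8") if rank == 0 else None
chunk = np.empty(N // size, dtype="f8")

comm.Scatter(data, chunk, root=0)        # hand each process one piece
partial = chunk.sum()                    # every process works in parallel
total = comm.reduce(partial, op=MPI.SUM, root=0)

if rank == 0:
    print("sum =", total)                # matches the serial result
```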

“Jack Dongarra’s work has fundamentally changed and advanced scientific computing,” Jeff Dean, Google senior fellow and senior vice president of Google Research and Google Health, said. Google sponsors the $1 million prize that accompanies the Turing Award.

Ian Foster, a professor of computer science at the University of Chicago and director of data science and learning at the Argonne National Laboratory, said Dongarra should get credit for his work rallying different groups—scientists who needed to use supercomputers in their work, the companies that built the computers, and the companies that created the software to run them—around standardized programming libraries and benchmarks. The standards were adopted through community consensus, not because any official standards-setting body mandated them.

“His visionary technological contributions plus his sustained community leadership played a bigger role, I would argue, than anyone else’s in the evolution of today’s world-changing highly parallel applications and the massively parallel computers on which they run,” Foster said. He said that many scientists and programmers now take these things for granted, but they “were by no means obvious outcomes” when Dongarra first started working on them in the 1980s.

Despite the importance of his LINPACK Benchmark to supercomputing, Dongarra said he agreed with critics who fault the technical yardstick for no longer being representative of the kinds of workloads many scientists and businesses want to run on supercomputers today, particularly in areas such as machine learning.

In the late 1970s, when the benchmark was developed, Dongarra said, floating-point calculations were expensive—in both processing power and energy. Today, that’s not the case. Not only are floating-point calculations relatively cheap, he says, but many machine learning applications don’t require the kind of precision the LINPACK Benchmark is designed to measure.

What’s expensive now, he says, is moving vast amounts of data from the various locations where it is stored to the processors where a calculation is performed, and then back again. A newer supercomputing benchmark that Dongarra also helped promulgate, called HPCG (short for “high performance conjugate gradients”), is meant to better reflect real-world uses of supercomputers. On this benchmark, supercomputers achieve only a fraction—the best are at about 3%—of their top performance on the LINPACK Benchmark.
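
For a sense of what HPCG exercises, here is a bare-bones conjugate gradient loop in Python on a classic sparse test problem. The real benchmark is far more elaborate (preconditioning, distributed memory), but the core kernel is the same: a sparse matrix-vector product whose cost is dominated by data movement rather than arithmetic.

```python
# A bare-bones conjugate gradient loop, the kernel HPCG is named for,
# sketched on a classic sparse test problem (the 1-D Poisson matrix).
# Unlike LINPACK's dense factorization, each iteration is dominated by a
# sparse matrix-vector product, whose cost is mostly memory traffic.
import numpy as np
import scipy.sparse as sp

n = 1000
A = sp.diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

x = np.zeros(n)
r = b - A @ x                  # residual
p = r.copy()                   # search direction
rs = r @ r
for _ in range(2 * n):         # CG converges in at most ~n steps here
    Ap = A @ p                 # the memory-bound kernel
    alpha = rs / (p @ Ap)
    x += alpha * p
    r -= alpha * Ap
    rs_new = r @ r
    if np.sqrt(rs_new) < 1e-8:
        break
    p = r + (rs_new / rs) * p
    rs = rs_new

print("final residual norm:", np.linalg.norm(b - A @ x))
```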

Dongarra also said that those who worry about the energy consumption of ever-larger supercomputers are right to be concerned. Today’s most powerful supercomputers consume about 20 megawatts of electrical power—which comes to about $20 million worth of electricity in a year, he says. This power consumption is set to double as the first “exascale” supercomputers (machines capable of a quintillion or more calculations per second) begin to come online this year.
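
Those figures are consistent with a back-of-envelope calculation, shown below with an assumed electricity price of about 11 cents per kilowatt-hour; the price is my assumption, as the article gives only the 20-megawatt and $20 million numbers.

```python
# Back-of-envelope check of the figures above. The electricity price is an
# assumption (about $0.11 per kWh); the article gives only the 20 MW and
# roughly $20 million numbers.
power_mw = 20
hours_per_year = 24 * 365
price_per_kwh = 0.11                       # assumed, not from the article

kwh_per_year = power_mw * 1_000 * hours_per_year
cost = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:,.0f} kWh/year -> ${cost / 1e6:.0f} million/year")
```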

He said he supports alternative benchmarks that would help drive those building high-performance computers and data centers to care as much about energy efficiency as they do about raw computing power. “It is important to come up with mechanisms that reduce that power consumption and the carbon footprint associated with it,” he says.

Dongarra also said he looked forward to a future in which quantum computers—which harness the properties of quantum physics to make their calculations—are used alongside a variety of other specialized computer chips, including graphics processing units, which handle many machine learning workloads today, and neuromorphic chips, which take inspiration from the wiring of human and animal brains. He says it will be important to extend his ideas about how to break large problems into smaller pieces, or batches, and farm them out to different kinds of hardware, in order to wring the most efficiency out of this kind of hybrid computing environment.
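
In rough outline, the idea resembles the sketch below: partition a problem into batches and hand each batch to whichever kind of processor suits it. The device functions here are hypothetical stand-ins, not real GPU, quantum, or neuromorphic APIs.

```python
# A loose sketch of the batching idea: split a large task into pieces and
# dispatch each piece to whichever "device" suits it. The device functions
# below are hypothetical stand-ins, not real accelerator runtimes.
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def run_on_cpu(batch):
    return batch.sum()       # placeholder for a CPU kernel

def run_on_accelerator(batch):
    return batch.sum()       # placeholder for a GPU or other specialized kernel

data = np.arange(1_000_000, dtype="f8")
batches = np.array_split(data, 8)

# A trivial dispatch policy: alternate batches between the two device types.
with ThreadPoolExecutor() as pool:
    futures = [
        pool.submit(run_on_accelerator if i % 2 else run_on_cpu, chunk)
        for i, chunk in enumerate(batches)
    ]
    total = sum(f.result() for f in futures)

print(np.isclose(total, data.sum()))   # partitioned result matches serial sum
```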

The Turing Award, which has been awarded since 1966, is named after Alan Turing, the British mathematician who articulated many of the theoretical foundations of computing and helped build some of the earliest computers. He also helped the British government crack the German Enigma cipher during World War II.
