Michael Stonebraker, the database pioneer who could also rightfully be called a serial entrepreneur, to borrow an overused phrase, on Wednesday won the A.M. Turing Award, computer science’s highest honor.
Now an adjunct professor at MIT CSAIL (the university’s Computer Science and Artificial Intelligence Laboratory), Stonebraker has pioneered many key database concepts since his days as an undergraduate at Princeton University and later as a graduate student at the University of Michigan. Before joining MIT he was a professor of computer science at the University of California at Berkeley for nearly 30 years.
Stonebraker founded or helped found Ingres, an early relational database whose technology lives on in Actian; PostgreSQL, still one of the most popular open-source databases around; StreamBase, an event-processing database company that is now part of Tibco; and Vertica, which Hewlett-Packard purchased in early 2011.
His latest effort, Tamr, aims to apply both machine learning and good old-fashioned human know-how to make sense of often messy corporate data that comes in many shapes and sizes.
He is famous for arguing that databases are not a one-size-fits-all category: companies need different databases for different jobs. He began making that case at a time when the big relational database giants were trying to convince the world that one big relational database could do it all, handling both the neat rows and columns of relational data and messier object-oriented data, such as images.
While they were pitching that uber-database notion, Stonebraker helped create Vertica for data warehouse applications. These are the sort of jobs in which a user needs to query a huge historical data set, say the sales of a certain item at many stores over a long period of time.
Other super-fast, real-time databases are better suited to on-the-fly queries of streaming data. He co-founded VoltDB, an in-memory database, to meet such needs.
The Turing Award is named after Alan Turing, the British computer scientist and war hero who was the subject of the 2014 film The Imitation Game. He led the effort by British intelligence to crack the Nazi Enigma code during World War II, work that was credited with saving tens of thousands of Allied lives.
The award itself is arguably computer science’s highest honor—its Oscar, if you will. Past recipients include last year’s winner, Leslie Lamport, now at Microsoft Research, for his work on distributed systems, and 2004 winners Vint Cerf and Bob Kahn for their foundational work on networking and the TCP/IP protocol that made the Internet possible. Edgar F. “Ted” Codd, the father of the relational database, won it in 1981.
The award comes with $1 million in prize money.
To hear Michael Stonebraker expand on the impending battle of the database elephants, listen to this 2013 podcast.