Has Big Data Gone Mainstream?

January 13, 2016, 9:43 PM UTC
Photograph by Getty Images/iStockphoto

Big data is no longer the hot buzzword that, just a few years ago, people strained to understand. It has entered the mainstream and can be viewed as an extension of traditional data crunching.

That’s one of the takeaways from a Deloitte report on trends in data analytics released on Wednesday.

“Big data and traditional analytics are merging,” said Tom Davenport, an independent senior advisor to Deloitte who helped write the report and who is also a Babson College professor. “It’s getting harder and harder to distinguish the two.”

The term “big data” generally refers to the idea of analyzing enormous volumes of information to make better business decisions and improve the performance of a company’s internal computer systems. Additionally, all that information doesn’t have to be stored in one place, like an Oracle (ORCL) database, for example. It can be scattered across multiple databases and systems.
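To make that idea slightly more concrete, here is a minimal sketch in Python of pulling records from two separate systems, a relational database and a flat file, and combining them for a single analysis. The table, file, and column names are hypothetical assumptions for illustration, not anything described in the Deloitte report.

```python
import sqlite3
import pandas as pd

# Hypothetical source 1: a relational database (a local SQLite file stands
# in here for something like an Oracle database).
conn = sqlite3.connect("orders.db")
orders = pd.read_sql_query("SELECT customer_id, amount FROM orders", conn)

# Hypothetical source 2: log data exported from a different system as a CSV.
clicks = pd.read_csv("web_clicks.csv")  # columns: customer_id, page_views

# Combine the scattered data to answer one business question:
# do customers who browse more also spend more?
spend = orders.groupby("customer_id")["amount"].sum().reset_index()
combined = spend.merge(clicks, on="customer_id", how="inner")
print(combined[["page_views", "amount"]].corr())
```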


It was just a few years ago that executives struggled to understand the term and had trouble finding employees who specialized in the types of technology used to crunch tons of information.

Now, however, universities offer specialized master’s degrees for advanced data analytics and companies are creating their own in-house programs to train talent in data science. The Deloitte report cites networking giant Cisco (CSCO) as an example of a company that created an internal data science training program that over 200 employees have gone through.

Thanks to media reports, consulting services, and analysts talking up “big data,” people now generally understand what the term means and how they can apply it to their own businesses.

The Deloitte report explains that Google searches for the term “big data” were high in 2010 but have since declined. Davenport explained that’s because people now grasp the term and no longer need to look up what it means.

“Frankly, I think there’s less of a desire to read about this stuff, and more of a desire to do it,” said Davenport.

Now, the new hot topic is cognitive computing, which generally refers to computer systems built to simulate human thinking and reasoning using artificial intelligence techniques like machine-learning algorithms. Cognitive computing systems can recognize speech, identify objects in pictures, and even learn to adapt to dangerous road conditions if embedded within a self-driving car, for example.
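To give a rough sense of what a “machine-learning algorithm” does in practice, here is a minimal sketch in Python using scikit-learn: a classifier that learns to identify handwritten digits from example images, a toy version of the image-recognition task described above. The library and dataset choices are illustrative assumptions, not something named in the report.

```python
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# A small built-in dataset of 8x8-pixel images of handwritten digits.
digits = load_digits()
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=0
)

# Train a model on labeled example images...
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# ...then have it "recognize" digits it has never seen before.
predictions = model.predict(X_test)
print(f"Accuracy on unseen images: {accuracy_score(y_test, predictions):.2%}")
```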


If companies are able to use these advanced data crunching techniques to solve traditional business problems, such as using IBM Watson to improve water use on farms, cognitive computing might eventually be lumped in with how people view traditional data analytics.
