Metanautix CEO Theo Vassilakis.
Courtesy: Metanautix
By Heather Clancy
February 2, 2015

Three years ago, a Google developer and Facebook image processing engineer teamed up to simplify business intelligence. Their mission: create a universal way to search for answers to basic queries like “How did I win that customer?”

Last August, their startup Metanautix emerged from stealth with $7 million in Series A funding from Sequoia Capital. Its first product, the Quest Data Compute Engine, followed shortly thereafter. Its unique twist: the company doesn't think businesses should have to reformat existing data sources to tease out the answers.

“The reality of modern enterprises is that heterogeneity is the norm in data, systems and processes,” wrote co-founder and CEO Theo Vassilakis in a blog post coinciding with the launch. He continued: “Instead of structuring analysis around storage as has been done traditionally, we help you structure analysis around computation—i.e., answering your business question.”

The inspiration for Quest was a custom search engine called Dremel, used internally at Google (GOOG) for business analysis. (Vassilakis was one of the creators.) Early adopters include Hewlett-Packard, Shutterfly, and other unnamed companies from the consumer packaged goods, pharmaceutical, and utility sectors. Fortune spoke with Vassilakis about Metanautix’s differentiation in a crowded big data software market and how the startup will use its partnership with visual analytics powerhouse Tableau Software to build momentum.

The following excerpts from the interview were edited for length and clarity.

Explain the Data Compute Engine concept to a non-technologist.

Basically, the way we explain it is to say that traditionally, when you want to analyze data, you first have to put it into a thing, like a spreadsheet or a database, so you can ask questions and get results. What a Data Compute Engine helps you do is just ask your questions of your database, your spreadsheet, whatever files you have.
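Quest's actual interface isn't shown in the article, but the general idea of querying data "where it lives" can be sketched in plain Python. In this hypothetical example (the table and column names are invented for illustration), an existing database table and a flat CSV file are joined in a single SQL query, without reshaping either source into a common store first:

```python
# Hypothetical sketch (not Metanautix's actual API): joining a CSV file
# with a database table in one SQL query, leaving both sources as-is.
import csv
import io
import sqlite3

# An existing database table of orders.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (customer_id INTEGER, item TEXT)")
db.executemany("INSERT INTO orders VALUES (?, ?)",
               [(1, "calendar"), (2, "print")])

# A flat file of customers, kept in its original CSV form.
customers_csv = "customer_id,name\n1,Ada\n2,Grace\n"
rows = list(csv.DictReader(io.StringIO(customers_csv)))

# Expose the CSV rows to the same SQL engine as a temporary table,
# so one query can span both sources.
db.execute("CREATE TEMP TABLE customers (customer_id INTEGER, name TEXT)")
db.executemany("INSERT INTO customers VALUES (?, ?)",
               [(int(r["customer_id"]), r["name"]) for r in rows])

result = db.execute(
    "SELECT c.name, o.item FROM customers c "
    "JOIN orders o ON o.customer_id = c.customer_id "
    "ORDER BY c.name"
).fetchall()
print(result)  # [('Ada', 'calendar'), ('Grace', 'print')]
```

The point of the sketch is that the analyst writes one query against heterogeneous sources; the engine, not the analyst, handles moving and matching the data.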

Describe some early applications.

An example that is more traditional would be some of the work we’ve done at Shutterfly, where they’re doing marketing analytics. They said, “Let’s look at all of our orders. We sell prints and calendars and t-shirts and things like that. How do we get those orders? What part of our marketing works? Was it the e-mail? Was it the ads? What was it that got us the order?”

They were doing that analysis for some time actually with a method called “last touch.” That means identifying the last thing that the customer did before they bought—as in they clicked an ad and then they bought whatever. The company figured it must’ve been that ad that caused the customer to buy the print. Or someone got a direct mail campaign message and then they bought the calendar. That was the motivation.

[Shutterfly] looked at that process and said, “You know, that’s a good model. It’s a good approximation, but it would be better to look at everything touching the user before their last purchase and since the purchase before that.” This greatly expanded the data that they had to consider to do the analysis, so the process became very slow. It took two days to compute the likely marketing channels for all their orders.

One thing we helped Shutterfly do was radically speed that up. We took those two days of compute down to basically 20-25 minutes on a single machine and then just three and one-half minutes on a cluster.

So that’s one class of applications: processing is just faster, simpler, easier. Then there are other classes of applications where we’re doing more exotic things. For example, for one utility company we’re helping them analyze 3D models of gas pipes and move them between laptops and the cloud, which lets them query the images and do fancier stuff with that data.

How much help are you giving your early customers? Are you doing a lot of services around those applications or are you kind of pretty much out of the way?

We are at a stage where we’ve got eight or nine customers and are still early in our development. We want to make sure every one of our customers succeeds. So we do help, but it’s not a big part of the objective to build out a whole professional services organization.

I’ll give you an example. I was talking to a large entertainment company a couple of weeks ago and they’re really interested in predicting box office outcomes of movies. Part of the discussion was about how to build a better model. And of course a lot of us at Metanautix have built models and so we understand this process. We can give input. But our objective at the end of the day wouldn’t be to say, “Hey let me build this model for you.” Our objective would be, “Hey it’s your model. You should own it.” In fact, in order for it to be impactful in your organization you have to own it because it has to change the culture of your org and how you think about collecting data.

Are you still taking on customers very selectively?

We’re a Tableau partner, and we are focusing on organizations that have Tableau because it usually represents a commitment by an organization to be data-driven and all of those good things. So one of the things that happened—and we’re excited about this—is that the Tableau partnership sort of kicked off a process where people are starting to reach in without us necessarily reaching out.

Ford recently appointed a chief data officer, and all sorts of “Internet of Things” devices are generating millions of metrics. How does Quest handle this data?

We’ve built a demonstration, a little car that you can kind of drive around with a remote control. Using Quest, we can query the car and say, “Hey car, where are you and what are your sensors telling us?” We can do that with very simple commands.

But actually a lot of data, especially in Internet of Things types of applications, maybe never gets stored. It gets generated and maybe queried on the fly. Then it goes away. Part of what we’re trying to do is to say what’s really powerful is having the same model of query and analysis for your data, regardless of where it is.
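The idea of querying data that is never stored can be sketched with a simple streaming computation. This is an illustrative example only (the function and threshold are invented, not Quest code): each sensor reading is examined once as it arrives and then discarded, so the answer is computed in constant memory:

```python
# Hypothetical illustration (assumed names, not Quest code): querying a
# stream of sensor readings "on the fly" -- each reading is seen once
# and then thrown away, so nothing is ever stored.
from typing import Iterator


def sensor_stream() -> Iterator[float]:
    # Stand-in for live IoT telemetry; real readings would arrive
    # over a network connection rather than from a list.
    for reading in [21.5, 22.0, 80.3, 21.8, 79.9]:
        yield reading


def count_over_threshold(readings: Iterator[float], limit: float) -> int:
    # Constant memory: the count is updated as readings pass through,
    # and the readings themselves are never retained.
    return sum(1 for r in readings if r > limit)


print(count_over_threshold(sensor_stream(), 75.0))  # prints 2
```

The same query-and-analysis model then applies whether the data lives in a warehouse, a flat file, or a stream that disappears as soon as it is read.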

What’s your company’s top priority for 2015?

Build on the Tableau partnership that we’ve initiated and really go after lots of big Tableau server deployments that are tied into core systems in the enterprise, such as Teradata, SAP, and Oracle. From there, we can really improve performance and simplicity, and build real examples and repeatable scenarios. Tableau is where we started, but there are obviously a lot of comparable scenarios out there.

This item first appeared in the Feb. 2 edition of Data Sheet, Fortune’s daily newsletter on the business of technology.

