A little-known name in the world of autonomous driving is paving the way for a new type of self-driving car—one that can use “common sense,” as the company calls it, to navigate an uncontrolled environment.
While most companies developing self-driving cars are focused on improving sensors, perception, and control, iSee CEO Yibiao Zhao says his company is the first to work on creating a robot that can really understand what’s going on.
Zhao founded iSee just about a year ago along with Chris Baker, his lab partner at the Massachusetts Institute of Technology, and Debbie Yu, who has a history with tech startups. The three are supported by MIT’s venture capital firm, The Engine, in Cambridge, Mass.
“We know that seeing is not equivalent to understanding,” Zhao told Fortune in a phone call last week. “Currently cars can see, but they cannot really understand what’s really going on and what other people are thinking, and what are the other people’s intentions.”
With iSee, the car’s software includes a special algorithm that allows it to collaborate with humans in an open environment. The system has two components: deep learning and the common sense engine.
Deep learning is something other companies like Waymo and Uber have already established; it’s the notion that if you practice something enough, you’ll be able to do it unconsciously. In humans, it’s the fast, subconscious thinking that allows you to multitask while driving. In self-driving cars, it’s the type of learning that lets a car stay within its lane.
When you encounter an obstacle, however, you need reasoning, or conscious thinking. When you merge onto the highway, change lanes, or come to an intersection, you have to predict the actions of other cars, negotiate with them, and weigh different possibilities in order to make a safe decision.
As a human driver, “we’ll consciously think about those types of possible parallel futures,” says Zhao. “That is enabled by our common sense engine in our mind, and that gives us the ability to handle some new scenario that we never encountered before.”
In a self-driving car, the common sense engine allows it to navigate new situations based on a handful of past experiences and general knowledge.
This component, unique to iSee, helps the car to “truly understand what is going on, and to predict what [other drivers] might do in the next two seconds,” says Zhao. This lets the robot “make safe and strategic decisions when they need to interact or even negotiate with the other drivers in the environment,” he says.
Once Zhao, Baker, and Yu had established the algorithm and validated it in a simulator about a year ago, they figured “why not” try it on a real car, says Zhao. Yu generously agreed to let her car, a hybrid SUV, serve as guinea pig.
“We spent just two weeks, and we made the car driving,” says Zhao, laughing as he recalls how cold it was working in the garage in the winter of 2017. “It was a very fun experience.”
Since the success of that first experiment, iSee has gone through multiple iterations of its software. The team tests its system with both a simulation engine and manned cars driving in multiple states.
This kind of technology, which allows robots to work fluidly with humans, has potential beyond the autonomous-vehicle industry, but Zhao says iSee is focused on self-driving cars for now.
“We believe the self-driving car is the emerging market. Everyone is working so hard towards it, and the market is ready, the customer is ready,” he says. “What is lacking is this enabling technology, so we want to make this killer application work first. In the future, we can extend it to other applications.”
With the success of the common sense engine, iSee hopes to become widely accepted—without the controversies that have surrounded industry leaders like Waymo, which is reportedly resented by its human neighbors in Arizona due to the cars’ overly conservative driving.
“I think that’s the open challenge in the field,” says Zhao. “There’s one single piece—that is this core part of the common sense understanding—and I think even Waymo and Uber, those companies, haven’t figured it out yet. We are laser focusing on that and I think that can be the enabling technology to make it really work well in a real-world scenario.”
How soon will that be? “It’s already happening,” says Zhao. “It’s not the future. It’s now.”