BMW Group banks on its Silicon Valley skunkwerks to innovate in the fashion-forward realm of active safety.
BMW Group just unveiled an eyewear product concept that uses augmented reality technology to display virtual information overlaid on a driver’s view of the road. The glasses, dubbed Mini Augmented Vision, were introduced at Auto Shanghai 2015, and emanated from research conducted at the BMW Group Technology Office in Silicon Valley.
What’s the point of these spectacles? To show how head-up displays could be worn as eyewear, anchoring graphical information to the actual environment, rather than offering limited dashboard data in a small portion of the windshield. It’s an early step from a “look-to” approach to a “look-through” strategy.
The glasses, sporting a steampunk aesthetic, use components nearly identical to those found in a smartphone or tablet, but the touchscreen is replaced with two stereoscopic high-definition displays in front of your eyes. Wi-Fi and Bluetooth connections allow data from the car and the Internet—such as speed, navigation coordinates, and points of interest—to appear as if floating over the roadway. Similarly, feeds from cameras mounted on the passenger side-view mirror and turn indicator can theoretically be superimposed on the lenses, allowing drivers to effectively see through the car’s body.
After a briefing with executives, I stepped through black curtains in a corner of the Mini dealership where I was fitted with a pair of Mini Augmented Vision glasses. “This came directly out of our secret lab,” said Robert Richter, senior advanced technology engineer at BMW Group. “You’re one of the first people to see the technology.”
The highly orchestrated demonstration revealed progress in overcoming a key hurdle in augmented reality technology: keeping data and graphics fixed in a specific location in the field of view, even though our heads are constantly moving. However, some jitter remained, which, along with the whole experience of layering digital information on top of the real world, left me slightly queasy. Moreover, the effect of seeing through sheet metal to the street did not work, creating a double image of pedestrians on the side of the road.
“There’s clearly room for improvement,” said Jay Wright, vice-president of Qualcomm Vuforia, the company’s mobile vision platform. “But compared to what’s been done in the past, it’s delivering on the promise.”
Wright explained the phenomenon known as motion-to-photon latency, the slight lag between a user’s head movement and the corresponding update of what an augmented-reality headset displays. “If you want the object to stay in exactly the right place, we need to do that with effectively zero processing time,” said Wright. Judging by how warm my glasses got during the 20-minute demonstration, the Qualcomm Snapdragon 805 processor, running Android, was working overtime.
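To get a feel for why even a few milliseconds of motion-to-photon latency matter, here is a back-of-the-envelope sketch. The numbers are illustrative assumptions on my part (a brisk head turn of roughly 100 degrees per second, a display packing about 15 pixels per degree of field of view), not figures from BMW or Qualcomm:

```python
def angular_drift_pixels(latency_ms, head_speed_deg_s=100.0,
                         display_px_per_deg=15.0):
    """Estimate how far, in display pixels, a supposedly 'anchored'
    virtual object appears to drift during one latency interval.

    head_speed_deg_s and display_px_per_deg are illustrative
    assumptions, not measured values for these glasses.
    """
    drift_deg = head_speed_deg_s * (latency_ms / 1000.0)
    return drift_deg * display_px_per_deg

for latency in (5, 20, 60):
    print(f"{latency:3d} ms latency -> "
          f"{angular_drift_pixels(latency):.1f} px of apparent drift")
```

Under these assumptions, a 20-millisecond lag already smears an anchored graphic by dozens of pixels during a quick head turn, which is consistent with the residual jitter I noticed in the demo.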
The glasses have two built-in cameras, one facing out and one facing up, as well as an accelerometer and a gyro sensor. “That’s the magic,” said Wright. The cameras and the inertial sensors are not there to take pictures, but to provide the exact location of the glasses down to fractions of a millimeter, according to Wright. Knowing precisely where the driver is looking allows the glasses to show road information only in a specific zone directly in front of the car. That information disappears, or is replaced with other contextual graphics, when the driver looks away.
“We have to know exactly where in the car the glasses are, because, for example, the position of the speedometer on the windshield has to show up in the same place for everyone,” said Wright.
In the stationary simulation, which projected a video of road conditions on a wall in front of a parked Mini, the effect was extraordinary at times. When everything successfully aligned, for example, an animated row of arrows swept along the road in front of me, emphatically showing the way forward, rather than the flat two-dimensional arrow and street name of a conventional navigation system.
Whether drivers want to use special glasses that blend and blur digital displays over real-world cars, trucks, pedestrians, and road hazards, or have the cognitive capacity to do so, remains to be seen. With Mini Augmented Vision, BMW will be among the first automakers to gauge public reaction to the concept. “This allows our designers and engineers to flex their muscles and show what’s possible,” said Richter. He said there’s no set production timing, but that feedback from this introduction would determine whether the glasses project advances to more testing and development.
Yet, for Qualcomm, the new technology that pinpoints the location of eyewear, and firmly locks context-specific data and graphics over the real environment, could be a breakthrough for its Vuforia imaging platform. The Mini Augmented Vision glasses are intended to be worn outside the car as well, providing step-by-step instructions to and from the car, and capturing addresses of points-of-interest that are seamlessly fed to the car’s navigation system.
“We think this has the potential of being a turning point for the wearables category,” said Wright.