Your Car Is About to Get Even Smarter

Consumer Reports has no financial relationship with advertisers on this site.

Engineers and designers have been fine-tuning technology that lets a car read a driver's emotions and interpret voice commands and hand gestures, updates that will soon be found in new vehicles.

That's thanks to three things coming together at the same time: more sensors, more powerful on-board computers, and access to big data clouds of consumer information, says Ken Washington, chief technology officer at Ford.

“The combination is making it possible to mimic the intelligence—to some degree—of complex (human) decision-making,” Washington says.

The latest high-tech advances reflect how the relationship between consumers and car companies is changing, says Mike Ramsey, research director at Gartner, a global research and advising firm.

In the past, dealers would sell a car and not see the buyer again until it was time for a replacement. Now automakers are pushing to promote the vehicle as a platform for technology that can be updated frequently, much like a smartphone.

Some innovations, many of which were showcased at this year’s CES consumer technology trade show in Las Vegas, are likely to show up first in luxury models.

Both BMW and Mercedes-Benz are refining their digital personal assistants. This year, each brand's assistant will be able to take Alexa-like commands, as in “Hey, BMW” and “Hey, Mercedes.”

These cars will be able to learn a driver’s habits, for example, anticipating that you’re commuting to work and preloading your route, along with live traffic reports, into the navigation system.

Mercedes says it’s constantly improving its digital assistant's ability to respond to complex questions, and it's no longer confused by the voices of multiple passengers. Sensors help the car anticipate basic needs based on hand movements, like lighting up the front passenger's side when you reach for something in the glove box.

BMW says its assistant will be able to answer real-time questions about car functions and maintenance needs, like an integrated owner’s manual.

New Monitors in the Car

Automotive suppliers are getting into the act as well, developing systems that help make sure drivers keep their eyes on the road.

Nuance, an artificial-intelligence company that works with automakers like BMW, says its system, still in development, can sense the emotions of the driver using facial recognition and voice-pattern analysis.

The car may be able to ask why you’re stressed when you climb inside, or it may ask what’s wrong if you suddenly become upset in the middle of the drive. It can compare your emotional state to what’s happening outside the vehicle, such as traffic conditions.

What does the Nuance system do with that information? If it determines that the driver is concentrating, it can keep alerts and other noncritical prompts to a minimum. If it senses you're stressed, it will keep its requests short and to the point. If it thinks you're distracted or drowsy, it will propose taking over parts of the driving task.

Computer chip maker Nvidia is also working on a driver-monitoring system, known as DRIVE IX, that tracks the driver’s gaze and hand movements. It, too, can tell whether a driver is drowsy or distracted and provide alerts or take corrective action if needed.

Nvidia’s system also uses cameras to create augmented-reality head-up displays that overlay information on the driver’s view of the road. That can mean warnings such as flashing alerts on the windshield about fast-approaching vehicles or slow-moving pedestrians ahead.

Veoneer, a tech startup spun off from airbag and seatbelt maker Autoliv, is zeroing in on the tricky handoff between a human driver and a car that’s capable of driving itself in some conditions. Using cameras and a microphone, Veoneer’s test vehicle reads moods, watches for distractions, and measures how quickly a driver notices what's happening on the road.

At the end of a trip, the driver can be scored on driving skill. The system, which is still in development, could also provide a graph showing when the driver was and wasn't attentive, and when the driver was slow to notice, say, a construction zone or a fast-approaching emergency vehicle. Ideally, drivers would be motivated to earn better scores and drive more safely.

The Driver as Pilot

In years past, some chatter at CES focused on self-driving cars. But that talk has been tamped down recently, as companies face technological hurdles that make it difficult to get them on the road.

Although Toyota has concluded that fully self-driving cars are many years away, it feels a moral obligation to share what it learns to make today’s cars safer, says Gill Pratt, CEO of the Toyota Research Institute.

Toyota is inspired by fighter jets, which use technology to amplify a pilot’s intentions, Pratt says. A pilot uses a joystick to tell the plane what to do, and the plane makes sure its speed and wing angles keep it in the air. The Japanese automaker says it is building that kind of control into its cars.

“It’s not a discrete on-off switch between human and machine,” Pratt says. “Rather, it’s a blend of both human and machine working together as a team.”

Most of the time under Toyota’s system, known as Guardian, the driver will be 100 percent in control of the car. But as a car approaches unsafe conditions, it begins to collaborate with the human driver to return to safety.




Consumer Reports is an independent, nonprofit organization that works side by side with consumers to create a fairer, safer, and healthier world. CR does not endorse products or services, and does not accept advertising. Copyright © 2019, Consumer Reports, Inc.