JOCHEN HAAB sees like a car.
Having spent years poring over radar and camera data, Mercedes’ driver assistance guru seems to instinctively know what his car can and can’t see.
“The radar knows that trees are a certain percentage water, a certain percentage cellulose. They have a predictable radar return. The camera doesn’t know what a tree is, they all look different,” he says.
Spend any time with him and you start to see like a car too.
He’s not touching the steering wheel. The new Mercedes S560 seems as happy taking instructions from its own bank of sensors as from the lump of gristle behind the wheel.
Three years ago, this technology seemed like black magic in a Tesla Model S. Now we get a similar suite of technology fitted as standard to a Honda Civic, albeit dumbed down for mass consumption.
We’re here today to help Mercedes in its validation program, riding shotgun in a car packed with computers on the home leg of a drive from Sydney to Melbourne.
We’re tracking a part of the Hume Highway that Mel Nichols immortalised for Wheels in his classic ‘HO Down the Hume’ yarn, albeit at a more sedate pace. Haab’s brief is to see how the car copes with Aussie conditions, improving the mapping data along the way.
Whenever there’s a glitch in the system, such as the car’s cameras picking up the speed limit sign from an adjacent slip road or running out of reliable road markings, Haab flags it. The data goes back to Merc’s nerve centre at Sindelfingen and, after a validation process, the improved mapping data appears in customers’ cars.
“Australia is a pretty easy place to work with autonomous cars. The roads are open and the speed differentials between cars are not great,” he explains.
“There are some anomalies, like cyclists on the major highways, digital speed limit signs that change shape and as for Melbourne’s hook turns…” He trails off and shakes his head. “The hook turn alone will probably mean Melbourne won’t be in the first phase of autonomous car trials. It’s complicated, but describable, so we can solve it. It’s just that it takes resources to do that.”
As impressive as the Benz is, driving down a well-marked highway in bright sunlight is, technically speaking, easy.
It’s like returning balls fired from a tennis machine; full autonomy, known as Level 5, is like expecting that ball machine to beat Rafa Nadal over three sets on clay.
The current state of the art is represented by the forthcoming Audi A8, which boasts Level 3 autonomy; that is, it can drive itself on highways and in traffic jams at up to 60km/h.
Level 4 trials, where fully autonomous vehicles can conduct themselves in strictly geofenced areas, are under way with the first retail products likely to go on sale by 2022. But Level 5, where the car is fully autonomous anywhere and the driver is optional, is still presenting huge challenges to software engineers, hook turns notwithstanding.
The sheer pace of change is the market dynamic that’s shaking out the hindmost.
The autonomous market leader at present appears to be Waymo – now a sister company of Google under holding group Alphabet – which is determined to tackle the complexities of Level 5 autonomy head on.
Despite public records in California showing that Waymo performed 600,000 miles of testing in the state last year – more than 30 times as much as all the other testers combined – a technological step-change has meant the company, and, indeed, Mercedes-Benz, is faced with merging two divergent design philosophies.
The current approach for self-driving start-ups is an artificial neural network, used to emulate the way a human brain functions.
‘Deep learning’ accelerates the way an autonomous car takes on new information, ingesting vast amounts of data to create more sophisticated algorithms than the more traditional rules-based approach that Waymo spent years developing.
Engineers have discovered that it’s virtually impossible to hand-code the number of scenarios the car could conceivably encounter. But where the rules-based approach requires a virtually infinite number of coders, deep learning theoretically requires an almost infinite input of data to train the system. Melding the two approaches is both a risk and an opportunity for Waymo. In short, it can’t afford to fail.
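The contrast between the two philosophies can be caricatured in a few lines of toy code. This is a deliberately simplified illustration, not anyone’s actual driving software: the function names, thresholds, and weights are all invented for the sketch.

```python
# Toy contrast between the two design philosophies described above.
# Neither function is real autonomous-driving code; both are caricatures.

def rules_based_brake(obstacle_distance_m, speed_kmh):
    """Hand-coded: an engineer must anticipate every scenario explicitly."""
    if obstacle_distance_m < 5:
        return True                      # hard stop for a close obstacle
    if speed_kmh > 60 and obstacle_distance_m < 50:
        return True                      # extra margin at highway speed
    # ...and thousands more hand-written cases for rain, cyclists, glare...
    return False

def learned_brake(features, weights, bias):
    """Learned: behaviour comes from weights fitted to vast amounts of
    training data, not from explicit rules. A bare linear model stands in
    for a deep network here."""
    score = sum(f * w for f, w in zip(features, weights)) + bias
    return score > 0

# The rules are legible but incomplete; the learned model generalises
# but only after ingesting enormous amounts of data.
print(rules_based_brake(obstacle_distance_m=4, speed_kmh=30))   # True
print(learned_brake([4.0, 30.0], weights=[-1.0, 0.1], bias=2.0))  # True
```

The trade-off in miniature: the hand-coded version needs ever more engineers to cover ever more cases, while the learned version needs ever more data, which is exactly the tension Waymo is trying to reconcile.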
Upping the stakes to this extent has stretched both software and hardware developers. Handling the deluge of sensor data from cameras, radar, lidar, and ultrasonic units has seen new power partnerships such as that between Intel and Mobileye emerge.
Processor giant Nvidia has been the computing power behind Tesla’s Autopilot system but its latest AI computer, codenamed Pegasus and due in late 2018, has been designed specifically for Level 5 vehicles and can number crunch roughly 10 times quicker than the company’s current boards.
This allows the Pegasus’ four deep learning processors to handle a terabyte of bandwidth each second. That still may not be enough. Level 5 autonomy requires multiple levels of redundancy in order to ensure safety, and Nvidia estimates that this level of computational power is 50 to 100 times greater than that of today’s autonomous cars.
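Some quick arithmetic puts those figures in perspective. This sketch assumes the terabyte-per-second figure is the total shared across the four processors, which the text implies but Nvidia’s own breakdown may differ from:

```python
# Back-of-envelope arithmetic on the Pegasus figures quoted above:
# 1 TB/s of bandwidth, shared across four deep-learning processors.
# Assumes the 1 TB/s is a board-wide total (decimal units throughout).

TOTAL_BANDWIDTH_GB_S = 1000   # 1 terabyte per second, in gigabytes
PROCESSORS = 4

per_processor = TOTAL_BANDWIDTH_GB_S / PROCESSORS
print(f"{per_processor:.0f} GB/s per processor")  # 250 GB/s
```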
Bridging the gap between where we are now and where we need to be for Level 5 has led to a latter-day gold rush as cash-rich car companies desperately try to buy their way out of stasis. General Motors has created a car-sharing business in Maven and bought into Cruise Automation and Lyft. Volvo has partnered with Uber in a deal that will see it supplying 24,000 self-driving cars to the ride-hailing service. Volkswagen has invested in mobility service provider Gett and has established mobility service provider MOIA.
Google has hoovered up microsensor business Lumedyne, crowdsource traffic app Waze and a stack of other mapping/sensing businesses: Keyhole, Endoxon, Image America, Digisfera, Where2, Zipdash, the list goes on.
Integrating these businesses while defending intellectual property is a huge issue for Google, as is its litigation with Uber over purportedly stolen IP.
The German car manufacturers are developing ways of cutting Google out of the loop, while recognising the difficulty of doing just this.
“They have some advantages because they’re a software company and we’re coming from hardware,” admits Haab. The Germans are quick learners, though.
Audi has established Autonomous Intelligent Driving (AID) GmbH, a division that has opened its doors to cooperation with partners in the automotive and IT sectors.
In effect, it’s plotting a similar pathway to the one digital map company HERE has followed. Audi, BMW, and Daimler bought the former Nokia unit for $2.9 billion in a bid to reduce any possible dependence on Google and Apple, the latter’s VoxelNet system promising to revolutionise the effectiveness of lidar.
Going head-on with these tech giants is a high-risk strategy but, if successful, it will also likely head off challenges from upstarts like Baidu, the Chinese search engine juggernaut, which has announced a US$1.5bn investment in its Apollo autonomous vehicle software.
It’s a brave new world that’s due a huge shakeout and when it comes, some very big names will be casualties. It’s also one that is yet to overcome significant user resistance.
A survey by Pew has revealed that 56 percent of Americans say they will not ride in a driverless vehicle, 81 percent believe the technology will destroy jobs and 30 percent fear that driverless cars will make roads more dangerous.
Yet the true driverless car is still some way off.
“I’d estimate Level 5 will come in western cities at some point late in the next decade,” Haab says.
“This car has the feel of Level 3, but in order to build a genuine Level 3 car, you need redundancy of systems. You currently have two braking systems. With Level 3 you need redundancy of parts like steering,” he says before admitting he’s not certain how Audi has cracked this issue with its forthcoming A8.
With Benz’s fleet comprising 175 test vehicles that have, so far, covered 9.5 million kilometres and made 1.2 million ‘improvements’, it’s hard to overestimate the scale of Haab’s task.
The data logger in the boot of our car records 70,000 data channels, storing 6GB of data every 30 seconds. “At higher automation, the map is a basic sensor,” says Haab.
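Those logger figures imply a striking data rate. As a back-of-envelope check, assuming the 6GB-per-30-seconds rate is sustained over a full day of driving:

```python
# Back-of-envelope rate from the article's data-logger figures:
# 6 GB recorded every 30 seconds (decimal units throughout).

GB_PER_INTERVAL = 6
INTERVAL_S = 30

rate_mb_s = GB_PER_INTERVAL * 1000 / INTERVAL_S      # megabytes per second
per_hour_gb = GB_PER_INTERVAL * 3600 / INTERVAL_S    # gigabytes per hour
per_8h_day_tb = per_hour_gb * 8 / 1000               # terabytes per 8h day

print(f"{rate_mb_s:.0f} MB/s")                  # 200 MB/s
print(f"{per_hour_gb:.0f} GB/hour")             # 720 GB/hour
print(f"{per_8h_day_tb:.2f} TB per 8-hour day") # 5.76 TB
```

At nearly six terabytes per car per working day, it’s clear why the map itself has to do so much of the sensing.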
His brow furrows as he scans his incoming data while the S-Class blitzes past a logging truck, noticing when the monitor eventually fuses radar and camera signatures. It’s fascinating to see the car think, but it’s also sobering to understand the current limitations of this tech.
So yes, the future’s coming. Just don’t expect it tomorrow.