Autonomous Cars – A Cautionary Story about Selling Change

Written by Jason Craker

Nothing is as dangerous to the success of a new idea or technology as the unrealistic expectations we place upon it based on our misconceptions. In the early days, the gap between those expectations and reality can be fatal. 

Consider these two very different stories about self-driving passenger cars. 

Today in the East Valley of Phoenix, Arizona, you can book a cab via an app and it’ll arrive and take you to your destination. So what’s remarkable about that? Well, if you want to make small talk, you’ll have to do it yourself, because this cab doesn’t have a driver. The cab belongs to Waymo One, and they’re running driverless cars – today. Science fiction has become a reality in Arizona!

On March 18th, 2018, Elaine Herzberg was struck and killed by a car while pushing her bicycle across a four-lane highway in Tempe, Arizona (coincidentally also part of the East Valley). A tragic but otherwise unremarkable story, except for the fact that the car was a self-driving test vehicle owned by Uber, and it didn’t brake because it didn’t recognise Herzberg. The extensive investigation into the fatality revealed that while the Uber had been “taught” to avoid pedestrians and cyclists, it had also been “taught” that cyclists sit on bicycles and that pedestrians only cross the road at crosswalks. Herzberg met neither of these criteria, so the Uber carried on driving – until it hit her.


Autonomous cars. Self-driving cars. They’re phrases we’ve lived with for so long, in books and films, in tech programmes and Wired articles. Finally, companies like Waymo (which began life as Google’s self-driving car project back in 2009) and Tesla seem to be delivering the dream. Unfortunately, the truth is a little more complicated – on one hand disappointing, but on the other astonishing.

Before we go any further, you need to understand the term operational design domain (ODD). An ODD is the set of circumstances a self-driving car is programmed to recognise and successfully react to – in layman’s terms, rules and logic applied within a clearly defined context. Human drivers are adaptive – we can deal with unexpected situations, although not always with the best outcomes. Automated cars are completely dependent on their programming and don’t currently extrapolate from previous experience.

It’s no coincidence that both the Waymo and Uber stories are based in Phoenix. It’s the most benign driving environment imaginable. With wide streets, few if any cyclists, and pedestrians who only cross at crosswalks, it’s a perfect environment for testing automated driving systems because it’s so predictable. Unfortunately, this predictability is both a blessing and a curse. It’s unlikely cars “taught” to drive in the East Valley would cope with city streets in Delhi, Beijing, Paris or London (to be honest, not many humans enjoy driving in those cities either) – but they can deal with motorways, because that driving environment is very similar to American freeways.
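To make the idea concrete, here’s a minimal sketch of what an ODD check might look like in code. Everything in it – the class, the field names, the thresholds – is invented for illustration; real systems encode their ODDs in far richer ways.

```python
from dataclasses import dataclass

@dataclass
class DrivingContext:
    """A hypothetical snapshot of the conditions the car finds itself in."""
    region: str            # where the car is operating
    road_type: str         # e.g. "suburban", "motorway", "city_centre"
    weather: str           # e.g. "clear", "rain", "snow"
    speed_limit_kph: int

# The ODD: the only circumstances this (imaginary) system is designed to handle.
ODD = {
    "regions": {"phoenix_east_valley"},
    "road_types": {"suburban", "motorway"},
    "weather": {"clear"},
    "max_speed_kph": 72,  # ~45mph, an illustrative figure
}

def within_odd(ctx: DrivingContext) -> bool:
    """The car may only drive itself while every condition holds."""
    return (
        ctx.region in ODD["regions"]
        and ctx.road_type in ODD["road_types"]
        and ctx.weather in ODD["weather"]
        and ctx.speed_limit_kph <= ODD["max_speed_kph"]
    )
```

The point is that nothing outside this whitelist exists for the system, which is exactly how a pedestrian pushing a bicycle mid-block can fall through the cracks.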

The term autonomous car is a misnomer – I much prefer the term intelligent vehicles. The experts, the Society of Automotive Engineers (SAE), define six levels of driving automation (there’s a great summary at https://www.synopsys.com/automotive/what-is-autonomous-car.html).
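For quick reference, the six levels run from 0 to 5. Here they are as a Python sketch – the one-line descriptions are my paraphrase of the SAE J3016 definitions, not the standard’s exact wording:

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """The six SAE J3016 levels of driving automation (paraphrased)."""
    NO_AUTOMATION = 0           # the human does everything
    DRIVER_ASSISTANCE = 1       # one assist feature, e.g. adaptive cruise control
    PARTIAL_AUTOMATION = 2      # combined steering and speed; driver must supervise
    CONDITIONAL_AUTOMATION = 3  # the car drives within its ODD; driver takes over on request
    HIGH_AUTOMATION = 4         # no driver needed, but only within a defined ODD
    FULL_AUTOMATION = 5         # drives anywhere a human could, in any conditions
```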

Creating a truly self-driving car is a pyramid problem. In a recent conversation with the news company Tortoise, Matthew Avery, director of research at the UK’s Thatcham Research, explained that the first 80 per cent of automating driving is easy to achieve; the next ten per cent is hard because it involves automating the road rules (what can I do, and when?); and the last ten per cent is really, really hard because it involves the unexplained and unexpected – a deer jumping out in front of your car, another driver suddenly swerving into your lane, or a cyclist running a red light. It’s that last ten per cent that gives us Level 5 and allows the driver to safely sit in the back and watch a movie.

The words we use are important. Take Tesla’s Autopilot system. We all know what autopilot means – a computer flying a plane while the pilots eat their lunch, right? Wrong. An autopilot essentially maintains the speed and heading set by the pilot in an environment with no other planes nearby – and no pedestrians or cyclists. Even automated landing systems work in a very defined set of circumstances, essentially driving a “vehicle” on a dedicated “road” packed with technology telling it where to go. By comparison, even driving in Phoenix is frighteningly unpredictable. The word autopilot triggers a set of assumptions about the technology’s capabilities, and those assumptions influence how we use – or misuse – it. Today’s reality is spelled out on Tesla’s website: “Current Autopilot features require active driver supervision and do not make the vehicle autonomous.”

Elon Musk tried to sum up the problem in this 2018 tweet:

This is crux of matter: can’t make system too annoying or people won’t use it, negatively affecting safety, but also can’t allow people to get too complacent or safety again suffers.

— Elon Musk

I think that’s only part of the problem. If you talk about advanced driver assist systems (ADAS) or automated lane keeping systems (ALKS), people’s eyes start to glaze over, but if you say Autopilot, we all assume we know what that means… Like I said, words are important because they trigger assumptions, and assumptions can be dangerous, even fatal.

Today, most car manufacturers deliver Level 2 automation technology, and some offer Level 3 on high-end models. However, they are all working hard to increase the level of automation for specific value-adding scenarios. The new Mercedes S-Class is a case in point. It will be the first to offer what the DfT termed “traffic jam chauffeur technology” in an August 2020 Autocar article – ALKS technology that allows true hands-free driving on congested motorways. Its ODD means it can’t exceed 60km/h (37mph) or change lanes (and if it encounters something it doesn’t recognise and you don’t take over, it’ll just stop), but once engaged, the car will keep itself in lane and at a safe distance from the car in front.
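As a sketch of that behaviour, here is roughly how the engage-and-fallback logic might be expressed. The function and field names are hypothetical; only the 60km/h cap, the no-lane-change rule and the “just stop” fallback come from the description above.

```python
from dataclasses import dataclass

ALKS_MAX_SPEED_KPH = 60  # the speed cap quoted above

@dataclass
class MotorwayState:
    speed_kph: float
    in_lane: bool            # lane markings tracked successfully
    scene_recognised: bool   # perception is confident about what is ahead
    driver_responded: bool   # the driver answered a takeover request

def alks_step(state: MotorwayState) -> str:
    """One hypothetical decision cycle of a traffic-jam chauffeur."""
    if (state.speed_kph <= ALKS_MAX_SPEED_KPH
            and state.in_lane
            and state.scene_recognised):
        # Inside the ODD: hold the lane and a safe gap; never change lanes.
        return "keep lane, maintain safe gap"
    if state.driver_responded:
        return "hand control back to the driver"
    # Outside the ODD with no takeover: the fallback is a controlled stop.
    return "controlled stop"
```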

Sci-fi author William Gibson once said, “The future is already here – it’s just not evenly distributed.” Waymo One and the S-Class are examples of this: Waymo’s Level 4 system is limited to a tightly defined geographical area, while the S-Class’s Level 3 system works only in a very specific set of circumstances. However, what both show is manufacturers’ commitment to constantly testing and improving the state of the art.

Elaine Herzberg’s death presents a number of important lessons, but perhaps the most important concerns the reliance we are prepared to place on technology, regardless of its level of maturity. In-car video released by the police after the crash showed the safety driver was not watching the road moments before the vehicle struck Herzberg. Perhaps she was bored because the car was driving; after all, it was performing perfectly – right up until it hit Herzberg. The numerous stories of people blindly following satnav directions speak volumes about our willingness to delegate responsibility to our devices. Automated driving systems represent a step change in driving safety, but manufacturers need to be honest about their limitations, and consumers need to understand their responsibilities and use the technology wisely.

Self-driving car technology will continue to evolve and will ultimately make our lives much safer, as drivers, cyclists and pedestrians. The trick is not to assume it can do more than it can at any given time. In a 2018 interview with Wired magazine, Matthew Avery said, “Seventy percent of people believe you can buy autonomous cars. It’s really worrying that consumers are believing the hype.” Whether we’re talking about self-driving cars or any other kind of radical change, expectation setting is crucial, and we should resist the urge to oversell just to grab people’s attention.

Seventy percent of people believe you can buy autonomous cars. It’s really worrying that consumers are believing the hype.

— Matthew Avery

I believe that, although my grandchildren will use cars, they won’t need to learn to drive. Manual driving will be an idiosyncrasy indulged in by a few, probably within carefully defined geographical areas. Automation technology will make our journeys safer and allow us to make better use of the time we once spent concentrating on the road ahead. However, successfully managing periods of change requires the ability both to build a compelling vision and to underpin it with a roadmap that delivers the totality in stages, with a clear indication of what capability is available at every point. The original “moonshot” project is a good example: each and every mission in the Apollo space programme tested a specific set of technologies and manoeuvres that cumulatively enabled the successful moon landing.

The problem most companies face is that they embark on large-scale change only infrequently, and either lack experience or are unfamiliar with the tools and techniques that reduce risk, accelerate progress, and support the changes in culture and mindset necessary to succeed. Here at changemaker, we support organisations and individuals in delivering sustainable and lasting change, especially during pivotal times of transformation. Our 5 Step change model helps identify the changes that need making, then brings a structured process to change that fully engages your people and makes change predictable, so you can be certain of realising your goals in an increasingly uncertain world.

I’ll be sharing further thoughts and insights on several other automotive-related topics in the coming months, so please look out for future blog posts. As always, if this blog or any of the others has sparked an idea you’d like to talk about, please get in touch.

If you want to learn more about us, take a look at our website www.changemaker.org.uk or email me at jason.craker@changemaker.org.uk.