Did Uber Disable Volvo Tech in the Autonomous XC90 That Killed a Pedestrian?

You may have read the news that a pedestrian in Tempe, Arizona, was killed this week when one of Uber’s autonomous test vehicles struck her as she crossed an intersection. It could be the incident that puts the brakes on autonomous testing until policymakers, insurers, the legal community and state and federal transportation agencies get their collective acts together and decide what technologies we’re comfortable with allowing, and why.

UPDATE: 3/28/18

In a 3/22/18 update, BestRide.com raised questions about Volvo’s City Safety with Pedestrian Detection technology, and whether it had been disabled in an Uber-operated XC90 that killed a 49-year-old woman in Tempe, Arizona. A spokesman for Volvo declined to comment, but now Bloomberg is reporting that Aptiv Plc, the company that supplies the technology behind City Safety with Pedestrian Detection, is suggesting that the system had been disabled prior to the crash, in favor of Uber’s own autonomous technology.

“‘We don’t want people to be confused or think it was a failure of the technology that we supply for Volvo, because that’s not the case,’ Zach Peterson, a spokesman for Aptiv Plc, told Bloomberg. The Volvo XC90’s standard advanced driver-assistance system ‘has nothing to do’ with the Uber test vehicle’s autonomous driving system, he said.

Aptiv is speaking up for its technology to avoid being tainted by the fatality involving Uber, which may have been following standard practice by disabling other tech as it develops and tests its own autonomous driving system,” read the report from Bloomberg.

UPDATE: 3/22/18

Police in Tempe have released video from the autonomous Uber test vehicle crash that killed a 49-year-old woman this week. Note that this video may be disturbing, but it doesn’t show the actual crash:

The video shows the pedestrian walking in front of the car, which was traveling at 38 miles per hour. It also shows the “driver,” 44-year-old Rafael Vasquez, “whose job is to take over the operation of the vehicle if necessary,” according to a story from NPR. “The driver’s gaze seems divided between something inside the car and the road. His eyes appear to be diverted from the road for about four or five seconds before he looks up with a startled expression less than a second before the impact.”

Here’s the truly interesting part nobody is talking about: The vehicle here is one of the 24,000 Volvos that Uber plans to purchase over the coming years for its autonomous testing, all of which come equipped with City Safety technology as a standard feature.

City Safety is designed to either prevent or mitigate the effects of a crash just like this one. From the material at Volvo.custhelp.com, “Your Volvo is equipped standard with City Safety which is a driver support system to help warn the driver to pedestrians, cyclists, and other vehicles that may emerge in front of your vehicle without much warning.  Your Volvo can alert you with a visual, sound, or automatically breaking [sic] if the driver is not able to respond quick enough.

City Safety triggers brief, forceful braking in an attempt to stop your vehicle immediately behind the vehicle or object ahead. This braking may be perceived as being very sudden.

City Safety can help avoid a collision with a vehicle or cyclist ahead by reducing your vehicle’s speed by up to 30 mph (50 km/h). In the case of pedestrians, City Safety can reduce speed by up to 28 mph (45 km/h). If the difference in speed between your vehicle and the vehicle ahead/pedestrian is greater than 30 mph (50 km/h) or 28 mph (45 km/h) respectively, City Safety’s auto-brake function cannot prevent a collision but it can help mitigate effects of the collision.

The vehicle in question did not alert the driver, nor did it apply the brakes. The driver wasn’t focused on the road, but stepping in when the driver can’t is exactly what this technology is designed to do.
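Taking Volvo’s quoted limits at face value, the arithmetic is straightforward: at 38 mph, a working City Safety system couldn’t have prevented the collision, but it could have cut the impact speed dramatically. This is a rough back-of-envelope sketch using the figures in the article, not Volvo engineering data:

```python
# Back-of-envelope sketch of City Safety's quoted pedestrian-detection
# limit applied to this crash. Figures come from the article; the real
# system's behavior depends on detection timing, road surface, etc.

VEHICLE_SPEED_MPH = 38             # reported speed of the Uber XC90
MAX_PEDESTRIAN_REDUCTION_MPH = 28  # quoted max speed reduction for pedestrians

def best_case_impact_speed(vehicle_speed, max_reduction):
    """Lowest impact speed City Safety could achieve, per the quoted spec."""
    return max(vehicle_speed - max_reduction, 0)

impact = best_case_impact_speed(VEHICLE_SPEED_MPH, MAX_PEDESTRIAN_REDUCTION_MPH)
print(f"Best-case impact speed: {impact} mph")  # 10 mph -- mitigation, not prevention
```

In other words, even in the best case the spec promises mitigation at this speed, not avoidance: a roughly 10 mph impact instead of a 38 mph one.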

Many, many questions are still unanswered. Did Uber disable this standard technology? The NTSB investigation continues.

Up until a few years ago, it seemed like the rules about motor vehicle “operators” were pretty locked down. But that’s not the case, almost solely because state agencies never anticipated that robots would be operating the cars.

As a result, states have scrambled to enact new legislation that tries to catch up with the technology, but only ends up being a day late and a dollar short, like that time you bought a pair of parachute pants in 1985, only to find that their coolness had expired a week before.

It’s no accident — pardon the pun — that this fatality occurred in Arizona. Arizona jumped into the deep end of the autonomous vehicle testing pool with both feet, still wearing its shoes, socks, khakis and Uber polo shirt.

RELATED: Do we really want autonomous vehicles? A new MIT survey says not really.

The state has become a center of autonomous driving activity for a couple of reasons. First, autonomous vehicles are kind of like your 80-year-old aunt, in that they steadfastly refuse to drive in the rain or the snow. They simply don’t work in inclement weather, and they require a state that has full sun about 99 percent of the time.

Second, Arizona is governed by laissez-faire politicians who have eschewed regulation wherever possible.

Arizona’s Republican Governor Doug Ducey — whose claim to fame prior to being elected was creating the ice cream stand where they mash your Oreos into a flavorless base with a pair of paddles — clearly had dollar signs in his eyes at the potential of Arizona becoming the nation’s leading real-world autonomous driving capital.

Ducey signed an executive order in 2015 that effectively allowed the state to step out of the way and offer up Arizona’s roadways to technology companies for live testing. “Arizona welcomes Uber self-driving cars with open arms and wide open roads,” he said in a statement in 2016, and threw California under the bus while he was at it. “While California puts the brakes on innovation and change with more bureaucracy and more regulation, Arizona is paving the way for new technology and new businesses,” the Republican governor said.

Ducey signed Executive Order 2015-09 — which specifically mentioned the business opportunities for Arizona — that allowed the testing of autonomous vehicles with only four restrictions:

The vehicle had to have a human with a driver’s license behind the wheel who could turn or stop it if necessary.

That person had to be designated as an operator by the company doing the testing.

The vehicle had to be “monitored.”

The vehicle owner had to submit proof of financial responsibility to the Director of the Arizona Department of Transportation.

Just weeks before the crash that claimed the life of 49-year-old Elaine Herzberg, Ducey updated that order with Executive Order 2018-04, which actually provides some rules for autonomous vehicle testing. “As technology advances, our policies and priorities must adapt to remain competitive in today’s economy,” Ducey said in a statement. “This executive order embraces new technologies by creating an environment that supports autonomous vehicle innovation and maintains a focus on public safety.”

RELATED: Autonomous Car Technology Explained

Following the accident, Arizona’s government sprang into action to deploy its thoughts and prayers. “Our prayers are with the victim, and our hearts go out to her family,” Patrick Ptak, the governor’s spokesman, said in an email reported in the Phoenix Business Journal. “Our office is in communication with law enforcement. Public safety is our top priority, and the governor’s latest executive order provides enhanced enforcement measures and clarity on responsibility in these accidents.”

Nevertheless, consumer advocacy group Consumer Watchdog lashed out at the state, calling it “the wild west of robot car testing with virtually no regulations in place.” According to Consumer Watchdog’s Privacy and Technology Project Director, John M. Simpson, “That’s why Uber and Waymo test there.”

A lack of regulation is fine as long as nothing bad happens, but we all knew this was going to happen, right?

In 2012, the New England Motor Press Association held a conference at MIT on the topic of vehicle autonomy, and not many of the questions that arose are any closer to having answers. Everybody knew that eventually, somebody was going to get killed. But all of the entities involved in this technology seemed to want to just wait and see what happened until it did.

“When there’s no sheriff in town, people get killed,” Consumer Watchdog’s Simpson said.

The group is calling for a nationwide moratorium on autonomous vehicle testing on public roads “until the complete details of this tragedy are made public and are analyzed by outside experts so we understand what went so terribly wrong.” The NTSB is currently investigating the crash.

It’s true that more than 37,000 people die on America’s highways every year. But that’s not why Uber and Waymo are testing autonomous vehicles. They’re testing autonomous vehicles because they want to eliminate the labor cost of paying humans to drive their cars, and they want to disrupt the entire automotive industry, making it economically feasible to just take a robot car everywhere instead of owning a car.

“If it could replace the cost of paying its drivers with autonomous cars, Uber could dramatically reduce its prices to the point that taking Uber rides everywhere would be a cheaper transportation option for most people than owning a car,” according to an article in Business Insider. “Research from asset management firm ARK Invest released earlier this year forecasted that Uber rides in autonomous cars would cost passengers 25 cents per mile. Right now the cost of an UberX taxi ride is about $2.15 per mile, according to ARK.”

As for those human drivers and their terrible driving record: yes, 37,000 people are killed every year. But there are 143,781,202 cars piloted by humans on America’s highways today. Autonomous vehicles on the road today number fewer than 750. There are currently fewer than 200 autonomous cars operating in California, for example.

So your risk of dying in a crash with a human-piloted vehicle is 1 in 3,885. Your risk of dying in a crash with a robot-piloted vehicle just leapt to 1 in 750, and that risk skyrockets in the places where “wild west” autonomous testing is underway.
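For what it’s worth, the arithmetic behind those odds can be sketched directly from the article’s own figures. Note that this per-vehicle comparison ignores miles driven, so treat it as illustrative rather than actuarial:

```python
# Sketch of the article's back-of-envelope risk comparison,
# using the article's own figures. A per-vehicle ratio, not a
# per-mile fatality rate, so it's illustrative only.

human_deaths_per_year = 37_000
human_piloted_cars = 143_781_202
autonomous_cars = 750   # upper bound cited in the article
autonomous_deaths = 1   # the Tempe fatality

human_odds = human_piloted_cars / human_deaths_per_year  # ~3,886 (article rounds to 3,885)
autonomous_odds = autonomous_cars / autonomous_deaths    # 750

print(f"Human-piloted: 1 in {human_odds:,.0f}")
print(f"Autonomous:    1 in {autonomous_odds:,.0f}")
```

A per-mile comparison would be fairer to both sides, but the article’s point stands either way: the autonomous fleet is tiny, so a single fatality moves its ratio enormously.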

The question is this: Are you willing to accept that risk so that businesses like Uber and Waymo can reduce their labor cost?


Craig Fitzgerald

Writer, editor, lousy guitar player, dad. Content Marketing and Publication Manager at BestRide.com.