
A Waymo vehicle hit a child. What can we learn from the incident?

On January 23rd, outside an elementary school in Santa Monica, California, a Waymo vehicle hit a child.

That’s what we know for sure. 

It sounds shocking, horrifying even. And it’s already giving plenty of groups cover to demand that California revoke Waymo’s license to operate its cars.

But the details matter. And once you start digging a bit, the scary headline about a kid struck down by a heartless robot clearly isn’t the whole story. 

In fact, accidents like this provide a lens through which to improve both human and robot driving—and even save lives.

Braking Hard

The specifics of the incident in Santa Monica are still coming out. As it does with any potential safety incident involving a self-driving car, the National Highway Traffic Safety Administration (NHTSA) is actively investigating.

That investigation—as well as a voluntary statement from Waymo—is already revealing quite a lot of nuance.

The incident appears to have happened during drop-off time at the SoCal school, with the Waymo vehicle driving among cars operated by parents delivering their kids.

As often happens during stressful school dropoffs (I have three kids, so believe me, I know!), a large SUV had double-parked, blocking part of the roadway.

As the Waymo approached the double-parked SUV, a child ran out from behind the SUV and into the roadway, directly in front of the Waymo.

The next bit is crucial. Waymo says that its vehicle “…braked hard, reducing speed from approximately 17 mph to under 6 mph before contact was made.”

Waymo doesn’t specify the exact distances involved. But dropping 11mph in a split second represents a slamming-on of the brakes, not a gentle slowdown. It’s an aggressive move. And it may very well have saved a life.
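To put that speed reduction in perspective, here's a rough back-of-envelope calculation (my own, not Waymo's): because kinetic energy scales with the square of speed, even a modest drop in impact speed cuts the energy of a collision dramatically.

```python
# Rough comparison of impact energy at the two speeds Waymo reported:
# 17 mph before braking vs. under 6 mph at contact.
# Kinetic energy is (1/2) * m * v^2, so the vehicle's mass cancels out
# when comparing the two speeds as a ratio.

def kinetic_energy_ratio(v1_mph: float, v2_mph: float) -> float:
    """Ratio of kinetic energy at v1 vs. v2 (mass cancels out)."""
    return (v1_mph / v2_mph) ** 2

ratio = kinetic_energy_ratio(17, 6)
print(f"Hitting at 17 mph carries ~{ratio:.0f}x the energy of hitting at 6 mph")
# ~8x the impact energy
```

In other words, that hard braking didn't just shave 11 mph off the speedometer; it reduced the energy of the impact by roughly a factor of eight.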

Waymo says that–because its vehicle was traveling only 6 mph when it made contact with the child–"the pedestrian stood up immediately" and "walked to the sidewalk" on their own.

Waymo called 911 and reported the incident to authorities. The company initially said that the child sustained “minor injuries,” but it’s not clear what injuries, if any, actually happened.

The Problem With People

To be clear, any time a child gets hit by a car, it’s a horrible incident. It’s good that the NHTSA is investigating. As a parent, I feel for the parents involved here–seeing your kid hit by any vehicle must be terrifying.

But before drawing any broader conclusions about the safety of self-driving cars, it’s important to consider the question: “Would a human driver have handled this situation any better?”

SafeKids, an advocacy organization, reports that between 2013 and 2022 almost 200 school-aged kids were killed in school zone accidents. 

And that’s only kids. Just days before the Waymo incident, two parents were killed in a crosswalk after dropping their child off at a different California school.

Why do so many people die on the way to school? Speed and distraction are the two biggest factors. 

SafeKids reports that as many as 10% of drivers are distracted while driving in school zones–mostly by phones and other devices. 

3% of drivers observed by the group were even seen using two devices at the same time–perhaps fumbling with a Bluetooth headset while also trying to sign their kid into school on their cellphone.

And most school zones, the group reports, have speed limits that are way too high–under 20 mph is ideal, but most are 25 mph or higher.

Not that drivers follow those, anyway–other data shows that when drivers hit kids in school zones, they’re traveling an average of 27 miles per hour.

Human drivers, in other words, make tons of mistakes. Especially with the stress of traffic and the pressure to avoid the dreaded “late pass,” it’s all too easy for parents to speed and to take their eyes off the road during dropoff.

Sadly, when kids are involved–with their propensity to dart into the road, as happened in Santa Monica–that combo of speed and distraction means that people die.

Worse With a Person?

Again, that raises the question, in the context of Waymo's incident, of whether a person would have done better than an AI-powered robot.

Let’s assume, for a moment, that a human was behind the wheel of the vehicle in Santa Monica. What might have gone down differently?

The average human reaction time while driving is about ¾ of a second. That means that when the child darted into the road, a car going 17 mph would have traveled about 19 feet before the driver even perceived the presence of a pedestrian.
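The ~19-foot figure follows directly from unit conversion. A quick sketch of the arithmetic (units only, no claims about this specific incident):

```python
# Distance traveled during a driver's reaction time, before braking begins.
# 1 mph = 5280 ft / 3600 s ≈ 1.467 ft/s.

FEET_PER_SECOND_PER_MPH = 5280 / 3600  # ≈ 1.467

def reaction_distance_ft(speed_mph: float, reaction_time_s: float = 0.75) -> float:
    """Feet traveled before a driver even begins to respond."""
    return speed_mph * FEET_PER_SECOND_PER_MPH * reaction_time_s

print(f"{reaction_distance_ft(17):.1f} ft")  # 18.7 ft, i.e. roughly 19 feet
```

And that 19 feet is only the reaction distance; the car still needs additional distance to actually slow down once the brakes are applied.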

Perhaps they would have immediately slammed on the brakes. But the NHTSA itself says that most people don’t. Whether through surprise or simply a delay in processing, drivers consistently underbrake, even in potentially fatal accidents.

With a person behind the wheel, it’s thus likely that the child in Santa Monica would have been hit at a much higher speed. 

Waymo says that its own independent models show “a fully attentive human driver in this same situation would have made contact with the pedestrian at approximately 14 mph.”

And again, most drivers in school zones aren’t “fully attentive.” As SafeKids points out, they’re distracted, rushing, and speeding. 

Waymos aren’t perfect by any means. But they consistently follow speed limits–sometimes to a fault. 

And because they’re constantly scanning the road, they react faster than people–and hit the brakes hard when they see something even remotely concerning. They never check their phones or try to shave while ferrying passengers around.

When a 5,000-pound robot hits a kid, there's a natural human tendency to vilify the robot. But in this specific case, the question of whether a person could have done better is far from clear.

Optimize for Safety

That doesn’t mean we should crucify autonomous vehicles–nor does it mean we should let them off the hook.

The NHTSA’s investigation will probably come down to a question not of whether Waymo outperformed a human in this incident, but rather whether self-driving cars could do more to keep kids safe near schools.

Indeed, NHTSA says it’s specifically investigating “whether the Waymo AV exercised appropriate caution given, among other things, its proximity to the elementary school during drop off hours, and the presence of young pedestrians and other potential vulnerable road users.”

Given that Waymos can be programmed to behave a certain way in specific circumstances—and will do so consistently once the parameters are set—they provide a unique opportunity to set even higher safety standards than we apply to humans.

Again, SafeKids says that most school zones have speed limits above the 20 mph ideal. There's no reason, though, that Waymo couldn't program its cars to consistently travel at a slower speed when in a school zone at pickup or dropoff times.

Perhaps Waymos could always travel 15mph when traversing an active school zone. 

That might bug the hell out of parents navigating the pickup line, but it would keep kids safer in the event of an accident. Waymos near schools could even serve as moving “traffic calming” devices, forcing distracted, impatient human drivers behind them to slow down, too!

Likewise, Waymo could set parameters that instruct its vehicles to slow to a crawl when approaching a double-parked car near a school. SafeKids specifically calls out double parking as a big risk factor for accidents near schools.

Thankfully–whether through Waymo’s ingenious driving (in the company’s telling) or dumb luck–this incident ended with a kid walking away alive. But that’s not a reason to dismiss what happened.

Rather, incidents like this provide a unique opportunity to define society’s rules for challenging circumstances like driving near kids–and then program them into a machine that (unlike people) will actually follow them.

Asking the tough questions required to set those guidelines–and holding the reality that scary incidents are also learning experiences–is a lot harder than simply blaming the robots and reverting to the human-powered status quo.

But with kids dying in school zones every year, learning the right lessons from accidents like this is absolutely crucial–even life-saving.
