On roads teeming with robotaxis, crossing the street can be harrowing

SAN FRANCISCO - Living in this city on the cutting edge, I have lately found myself in a game of chicken with cars driven by nothing but artificial intelligence.

Waymo robotaxis, owned by Google’s parent company, Alphabet, are everywhere in San Francisco - and they will soon be driving themselves in six U.S. cities. During rush hour each weekday, easily two or three dozen of the white SUVs, loaded up with cameras and spinning sensors, pass by a street near my house.

I generally find riding in a Waymo to be smooth and relaxing, and I have long assumed its self-driving technology is a net benefit for the city.

Then I started to notice something unsettling.

When I try to cross my street at a marked crosswalk, the Waymo robotaxis often won’t yield to me. I step out onto the white-striped pavement, look at the Waymo, wait to see whether it’s going to stop - and the car zips right past.

It cut me off again and again on the path I use to get to work and take my kids to the park. It happened even when I was stuck in a small median halfway across the road. So I began using my phone to film myself crossing. I documented more than a dozen Waymo cars failing to yield in the span of a week. (You can watch some of my recordings below.)

It is a cautionary tale about how AI, intended to make us safer, also needs to learn how to coexist with us. The experience has taught my family that the safest place around an autonomous vehicle is inside it, not walking around it.

At my crosswalk, which is not protected by a stop sign, the Waymo would yield for me about 3 out of 10 times. But I couldn’t figure out what made the robotaxi change its mind. I tried sticking one foot out, crossing in both heavy and light traffic, waving at the car and even pushing a baby stroller (without my baby!).

I could consistently make it stop only by darting out into the street - but that’s not how my momma taught me to behave in an intersection.

Waymo cars don’t behave this way at all intersections. Some friends report that the cars are too careful on quiet streets, while others say the vehicles are too aggressive around schools.

California law is clear about crosswalks: Whenever a pedestrian is crossing in one, drivers must yield the right of way. When I contacted the Department of Motor Vehicles about my experience, it told me that autonomous vehicles, “like all vehicles, must adhere to the rules set forth in the California Vehicle Code, including when AVs approach crosswalks.”

I showed my videos to Waymo. Spokesman Ethan Teicher said its cars are designed to follow road rules and be courteous to pedestrians. He didn’t acknowledge that Waymo cars broke any road rules, but he said there is “opportunity for continued improvement in these highly dynamic social interactions between man and machine.”

No Waymo car has hit me or any other person walking in a San Francisco crosswalk - at least so far. (One did strike a cyclist earlier this year.) The company touts that, as of October, its cars were involved in 57 percent fewer police-reported crashes than human drivers covering the same distance in the cities where it operates.

Yet major crashes are only one of the new concerns that come with living around self-driving cars. Some riders report being trapped inside Waymo cars while people harass them. The company is also under investigation by the National Highway Traffic Safety Administration for driving in an unexpected and disruptive manner, including around traffic control devices (which include road markings).

Many human drivers don’t yield for pedestrians, either. But Waymo has the power to seize the moral high ground and make its robots better than people.

What’s more, how does an AI designed to follow the law learn how to break it?

- - -

Road rules

I showed my videos to Waymo’s Anne Dorsey, a software engineer who helps lead a team focused on how the autonomous vehicles interact with pedestrians, cyclists and other road users - and who hears a lot of pedestrian feedback. She described yielding to pedestrians as a matter of “politeness” and said it’s more nuanced and complex than it might appear.

“We’re gaining more and more insight and knowledge into what the social norms are in each area that we’re driving in,” she said. The expectations of pedestrians can vary from “cars should always completely stop” to “I’ll wait for a gap in traffic to cross.”

When I’m crossing an intersection, I look a driver in the eye and know they have seen me when they start to slow down. But with a Waymo, there’s nobody to look at.

So how does the car decide when to stop? “It’s really hard for me to say that there’s one particular thing that is the determining factor in how an interaction will go down,” Dorsey said. I provided Waymo with video footage, the times and the locations of 10 occasions when its vehicles failed to yield for me; it didn’t give me a specific explanation for those instances.

But in more general terms, Waymo said its software tries to predict a pedestrian’s “intent” to cross based on signals such as their forward motion and where they are looking.

“We look at their behavior: What have they done as they approach the crosswalk? Do they seem like they’re about to cross or are they just sort of milling around waiting for someone?” Dorsey said. “Are they, for example, texting and walking or are they looking up, looking toward that roadway like they’re going to move into it?”
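
To make that concrete, here is a toy sketch of how such an intent heuristic might combine those signals. To be clear: this is my own illustration, not Waymo’s software, and every feature, weight and threshold in it is a made-up assumption.

# An illustrative sketch, not Waymo's actual software: a toy heuristic
# combining signals such as forward motion and gaze into a single
# "crossing intent" score. Every feature, weight and threshold here is
# a hypothetical assumption, for illustration only.
from dataclasses import dataclass

@dataclass
class PedestrianObservation:
    speed_toward_road_mps: float  # forward motion toward the roadway, in m/s
    facing_roadway: bool          # looking toward the street vs. down at a phone
    distance_to_curb_m: float     # how close they stand to the crosswalk edge
    in_crosswalk: bool            # feet already on the striped pavement

def crossing_intent_score(obs: PedestrianObservation) -> float:
    """Return a 0-1 score estimating whether the pedestrian intends to cross."""
    if obs.in_crosswalk:
        return 1.0  # already in the crosswalk: treat intent as certain
    score = 0.0
    if obs.speed_toward_road_mps > 0.3:
        score += 0.4  # moving toward the road
    if obs.facing_roadway:
        score += 0.3  # gaze directed at the street
    if obs.distance_to_curb_m < 1.0:
        score += 0.3  # standing near the curb
    return score

# A car might yield only when the score clears some threshold:
obs = PedestrianObservation(0.5, True, 0.8, False)
print("yield" if crossing_intent_score(obs) >= 0.5 else "proceed")

The fragility is the point: in any scheme like this, a pedestrian who pauses or glances at a phone could drop below the threshold - which would be consistent with behavior that looks arbitrary from the sidewalk.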

So an AI gets to decide whether I really want to cross - even when my feet are already in the crosswalk? It’s classic Silicon Valley hubris to assume Waymo’s ability to predict my behavior supersedes a law designed to protect me.

Waymo also said its car might decide not to stop if adjacent cars don’t yield. So is it possible that Waymo’s AI is learning from the human drivers on the road who also act like jerks?

“We don’t learn directly from other random people we see on the road,” Dorsey said. “We do indirectly learn by, for example, studying that data and learning ‘we shouldn’t do this.’”

I showed my videos to outside experts, too. Phil Koopman, a Carnegie Mellon University professor who conducts research on autonomous-vehicle safety, said Waymo had no excuse not to stop.

“A person making the decision not to yield to a pedestrian in a crosswalk is acting with the knowledge of potential penalties, ranging from a traffic ticket, to a civil tort lawsuit for injuries, to potential jail time,” Koopman said. “In California, a computer driver can’t even get a meaningful traffic ticket yet and is certainly not worried about going to jail.” (With a new law, they can get only a “notice of noncompliance.”)

He thinks Waymo should take the high ground, since it’s always touting safety. “Instead of arguing that they shouldn’t stop if human drivers are not going to stop, they could conspicuously stop for pedestrians who are standing on road pavement on a marked crosswalk,” Koopman said. “That might improve things for everyone by encouraging other drivers to do the same.”

Another theory: Waymo cars are intentionally getting more aggressive to help shed a market reputation for being slowpokes.

“I think the problem is one of pressure from the top down to move more quickly,” said Missy Cummings, an engineering professor at George Mason University who has consulted for California on self-driving cars. “They’re trying to start competing with the rideshare companies on a mile and time basis.”

Dorsey said Waymo has evolved its driving behavior over time, though safety remains its top priority. “We do factor in a variety of social dynamics. There’s some objectivity, for example, in some social norms: politeness, rudeness.”

We may need better ways to communicate with autonomous vehicles. Waymo has introduced a way to message pedestrians (and other cars) through a small screen mounted on its cars’ roofs. I saw those messages sometimes - but, of course, they’re no help when the car decides not to stop.

Cities with lots of self-driving cars may need to give pedestrians better tools, too. My intersection is an unprotected crosswalk - there are no crossing lights. But a flashing light beacon there could let me flag my intent to both humans and robots. (I asked the city of San Francisco to install one there, but it declined, saying it wasn’t a priority.)

I asked Dorsey how she crosses the street in front of a Waymo. “I walk out like I would with a human driver - maybe a little more comfortably … because I know the Waymo is seeing me,” she said.

I know conceptually that Waymos are loaded with 360-degree sensors that can spot me better than a human can.

But I also know these cars weigh about 5,000 pounds - and I need a little more than faith in AI before I step out in front of one.

- - -

Jeremy Merrill contributed to this report.

- - -

Video: Tech columnist Geoffrey A. Fowler noticed that Waymo robotaxis in San Francisco often would not stop for him at a crosswalk he uses every day. (The Washington Post, 2024)
