The unwavering belief of self-driving car boosters (and to some people, really, you can't even suggest it might not be true) is that they'll be safer than human drivers. I'm not one who worries much about safety here because, as I keep saying, if they work they'll be safe, tautologically; but there isn't any particular reason to think this is actually true. It might be! Some day! I'm sure in 2300 we'll all be uploading our brains into robot bodies, too, but that's a few months away. Yes, human drivers get distracted and drunk, blah blah blah, but self-driving car sensors aren't very good, the AI required to recognize that a stop sign is indeed a stop sign is actually pretty hard (and a stop sign is the easy case!), "intuition" and interaction with other entities on the road will be limited, and weather conditions across most of the country are not those of Mountain View.
Every time there's an accident or death involving a self-driving car, people assert that "well, self-driving cars are safer than human drivers, so overall we're going to reduce the number of accidents." Thank you, serious scientist person and stupidest economist on twitter, for this assertion without any evidence whatsoever. And even if it does turn out to be true one day, the things don't even work yet! How is it true now? I guess you don't make driving safer without cracking a few skulls first.
Fundamentally, the question is: can a self-driving car *ever* be at fault for hitting a pedestrian, and if it is, who gets the manslaughter charge? The answer, of course, is "never" and "nobody." They'll be safer one day! OK, cool, hope it isn't your kid.