Indeed, it's entirely possible to imagine a self-driving car system that always follows the letter of the law—and hence never does anything that would lead to a legal finding of fault—but is nevertheless far more dangerous than the average human driver. Such a system might behave a lot like Uber's cars do today.
In a sense, they can be programmed to never be "at fault," and yet even if they work perfectly they aren't necessarily safer than human drivers—both in terms of the accidents they could be involved in and the accidents they might cause to happen around them. If there's little pressure, legal or otherwise, for them to be safer, they certainly won't be. The point is that "they'll be safer" is just an article of faith. Possibly? Yes! Certainly? Not necessarily, even if they work "perfectly"!