Sunday, April 9, 2017

Autos!

I have two CGP Grey videos to recommend: the one from last time, Humans Need Not Apply, which mentions self-driving cars, and The Simple Solution To Traffic, which turns out to be about self-driving cars in the end.

So, self-driving cars! (Or "Autos," as CGP Grey likes to call them.) Why am I so excited about them? Safety and convenience. Driving (or being a passenger) is the most dangerous thing I do. And it's hard to imagine a more convenient way to travel.

Why might they be a bad idea? If it turns out they actually aren't safe, and maybe because they'll take away too many jobs.

So first, safety. After reading the articles, I'm less sure of their current safety. (I was 2,000% convinced they were already better than humans; now I'm 90% sure we'll get there soon.) The first warning sign was from the first Tesla article. They boast about the new hardware going into the Model 3s, but they include this worrying line: “Teslas with new hardware will temporarily lack certain features currently available on Teslas with first-generation Autopilot hardware, including some standard safety features.” Why...? They mention the need for "more robust validation" for things like "automatic emergency braking, collision warning, lane holding and active cruise control," which sounds like everything an Auto should do. They'll eventually push these with an update, but...why not have them now? What's going wrong in testing?

The article that said car makers can't "drive their way to safety" was interesting. It mentioned that with a fleet of 100 cars driving 24 hours a day, it would take twelve and a half years to reach 95% confidence that the fleet was better than humans at driving. This does make current claims about Autos' safety more dubious, but I don't really see why that fleet can't grow to 1,000 cars and bring the testing time down to a year or two. (Is that how statistics work?) Basically, we may not be sure now, which surprises me, but we will be certain in the near future.
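To sanity-check that intuition, here's a rough back-of-envelope sketch in Python. The 25 mph average speed is purely my assumption to make the numbers concrete; the point is that the miles needed for confidence are fixed, so the calendar time scales inversely with fleet size.

```python
# Back-of-envelope: how long must a test fleet drive to accumulate the miles
# needed for statistical confidence? The 25 mph average speed is an assumption
# made only to turn "100 cars, 24 hours a day, 12.5 years" into total miles.

HOURS_PER_DAY = 24
AVG_SPEED_MPH = 25          # assumed average speed, including slow city driving
DAYS_PER_YEAR = 365

def miles_per_year(fleet_size):
    """Total miles a fleet logs in one year of round-the-clock driving."""
    return fleet_size * HOURS_PER_DAY * AVG_SPEED_MPH * DAYS_PER_YEAR

# Miles implied by the article's scenario: 100 cars driving for 12.5 years.
required_miles = miles_per_year(100) * 12.5
print(f"Required miles: {required_miles:,.0f}")        # ~274 million miles

# The required miles don't change, so a bigger fleet just finishes sooner.
for fleet in (100, 1_000, 10_000):
    years = required_miles / miles_per_year(fleet)
    print(f"{fleet:>6} cars -> {years:5.2f} years")
```

So, roughly, yes: ten times the cars means one tenth the years, because the confidence comes from total miles driven (and failures observed), not from how long the calendar says the test ran.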

However, there are good reasons to think that Autos are already better drivers than humans. They can see 360 degrees around themselves, which humans never could. And they cannot get sleepy or drunk, which is incredibly important: in America, nearly 10,000 people die every year in alcohol-related crashes, about a third of all car-crash deaths! Obviously, Autos cannot get drunk.

There are a few things that worry me, though. The first big one is how much computers still struggle with object recognition. That's not reassuring for something that needs to distinguish pedestrians, animals, and random plastic bags in the blink of an eye, and radar/lidar can only do so much in that regard. One article mentioned that Uber had trouble on bridges, and the reasons for that are worrying for the scalability of the tech. Uber's cars rely on heavily detailed maps of a specific area, including everything from buildings to parked cars. On bridges, those landmarks don't exist, and the car isn't confident enough to drive itself. This seems like a serious issue, because mapping the entire country in that much detail, and keeping it constantly up to date, would be an enormous task.

Still, every single Auto learns from the experiences of every other Auto. The crash that caused the Tesla fatality will never happen again, whereas humans are doomed to make the same (easily avoidable) mistakes over and over. And progress in computing is explosive: if we're near viability now, next decade's cars will be better than we can imagine.

As far as automatic cars taking everyone's jobs away, I'll just say that I'm not a Luddite and leave it at that.

Now for the less interesting question of the "social dilemma of autonomous vehicles." Does a car save the driver, or go for the greater good of humanity? In the impossibly rare case where a car has to choose between killing the driver and killing pedestrians, what does it do? (Assume random circumstances made neither the pedestrians nor the car at fault.) I would say kill the driver. How do you go to those families and say, "Yeah, they could have been alive, but I didn't want that, and had my car sacrifice them"? But that's all I want to say on the matter. I think this question eats up way too much of the discussion about Autos, and frankly it isn't worth discussing, because the scenario is so rare.

The real question is: once we prove that Autos are safer than humans, do we still allow humans to drive? And another interesting point: Autos don't have to be perfect, just significantly safer than us. For example, let's say Autos are twice as safe. That is, they would cause 15,000 deaths every year instead of humanity's 30,000, all because of faulty programming or whatever. That's still much better than humans could ever do! Is it moral to let any humans drive when we have machines that are even that flawed? I don't think so. And that's the more interesting question. Not "Oh, I don't want to ride in a car that will/won't put my life ahead of pedestrians" but "Do we allow humans to drive at all when we have machines that are twice as safe as them?" (To be clear, I think Autos will end up far better than twice as safe as us, but the argument holds up even with an error rate that high.)

The "social dilemma" might save 4 lives in total if it kills the driver instead of the 5 pedestrians, but banning humans would save tens of thousands of lives.

Self-driving cars will drastically impact many areas of everyday life. Socially, driving will be safer, easier, and probably cheaper. I would imagine fewer people will own cars, and more will simply use a self-driving taxi service. This would have a massive impact on the economy, putting literally millions of people out of work. (Transportation is one of the biggest employers in America...) I'm not sure how we deal with this; maybe UBI? But that's a discussion for automation in general. Politically, I'm not sure. No party really seems to be rallying behind this stuff either way. If self-driving cars eventually take too many jobs, I could see it becoming a fighting point, with every accident hailed as doom for the industry, but so far Autos seem to have broad political approval.

The government seems to be doing the right thing so far, which is allowing self-driving cars but with reasonable safety measures, like a person ready to take over the wheel. (Reasonable, at least, until they are more fully proven safe.) The only change I can think of would be a federal law rather than state-by-state randomness, but things are going OK for now.

Finally, would I want a self-driving car? Sort of. Do I want to be a passenger in one? Hell yes! Do I want to own one? Not really. I'd use a self-driving taxi service for all my needs. Why own an expensive asset that requires space, insurance, and maintenance, only to have it spend 95% of its life unused?
