I missed this from earlier in the week. Barbara Marie Brannan with questions for Google and Apple on the self-driving car.
Who’s In Control? – That’s a good question to ask. Google would like to dispense with the steering wheel altogether. That just doesn’t seem right. My iPhone is a wonderfully predictable device, but sometimes it just doesn’t work. If the driverless car decides to stop driving, then who is in control? And without a steering wheel, does it matter?
Who Writes The Laws? – A self-driving car is a robot. Mechanisms and computers and algorithms and sensors combine to get the car from here to there and, ostensibly, back again, or somewhere else. What principles guide the robot car?
The Three Laws Of Robotics from Isaac Asimov:
- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
Will Apple’s iMobile follow those laws?
Accountability, My Dear Watson – Finally, who is accountable when a self-driving car commits a sin and crashes? Will insurance rates go up? Or go down for self-driving car owners and users? Am I liable because Google screwed up and drove the car into a schoolyard crowded with children because that’s where Google Maps said the freeway on-ramp was located?