You Can Have My Steering Wheel When You Can Pry It From My Cold, Dead Hands
Supposedly the Next Big Thing, or one of them anyway, is self-driving cars. That is, vehicles that can operate themselves and get you where you’re going without you ever having to do anything.
I’ve got several problems with that.
The first is – where’s the fun in it?
Sure, a lot of slackers and lazy people would be glad to be able to get in a car, tell it where to go, and then sit back and read, eat, use their phones, even sleep, while the obedient, automated vehicle goes on its merry way.
Then there are other people, like me, who just plain LIKE to drive. We like to be in control of the car, deciding and directing for ourselves what it should do instead of the machine telling us what it will do.
I own a Jeep. And it is just plain FUN to drive. In fact, it is so much fun to drive that oftentimes on my way home from work or when I’m not in a hurry to get where I’m going, I enjoy taking the long way around just for the fun of it, instead of heading straight where I’m going as quickly as I can get there. I’ll take turns at random, go this way instead of that just because, go down this road I’ve never been down before just to see what’s there to look at. I’ve found some very interesting places and scenic views that way.
How are you going to do that in a car that will only drive itself? Keep telling it, “turn right here. No, stop. Go back the other way.” That would not only be tedious, it would be ridiculous. When you are in control of the vehicle, you can just drive by feel, without even thinking about it, unconsciously making decisions almost with your muscles instead of your head.
See something interesting? Or want to avoid an obstacle or someone you don’t want to see? You don’t even have to think about it, you just make the next hard right that you can. In an automated car, you’d have to tell it what to do and then it might not understand in time or it might be too late to make the turn you suddenly and spontaneously wanted to take, or avoid someone or something you didn’t want to see.
How about wanting to speed up to catch that hot blonde in the red Mustang convertible? How would you explain that to a self-driving car? “Speed up. No, faster. Now pull up alongside the red Mustang and keep pace with it.” Does anyone seriously believe even a self-aware car would be able to understand, much less accomplish those tasks?
If I just want to go aimlessly driving around on a beautiful, warm, sunny summer day with no particular destination in mind, or if I want to cruise alongside the ocean or the beach and just take in the view at my own leisure and my own pace, I want to have the ability to do that, instead of having to input a destination for the car and then continually update it on a whim as my fancy takes me.
If I want to slow down to take in a particularly beautiful vista or even pull over for a closer, better look, I want to be able to do that, instead of having to countermand the car which otherwise will just mindlessly plug along at its preprogrammed pace to its inputted destination, regardless of the scenery.
The second problem I have with automated cars is the technology itself. It's almost a truism that the more you allow technology to "help" you or "assist" you, the less control you have over both yourself and the technology. The more you lean on it to make your life easier, the more dependent on it you become.
Will the day come when no one wants to drive themselves? Will children born in the next few years even learn HOW to drive manually?
My biggest fear is that the day will come when we are not ALLOWED to drive our own cars. With the growth of the omnipotent, oppressive Nanny State most Western governments are becoming, I can definitely see it coming, the day it will be illegal to drive your own car.
Another problem with technology, ANY technology, is that it can be hacked and broken. Your enemy doesn’t like you, so he hacks into your self-driving car’s system and takes control, using your own vehicle to kill you by smashing into a bridge support column or driving off a cliff into a river. The technology has not yet been invented that is 100% hack proof. With cars that have a constant need to talk to GPS satellites, some central control system, and each other, I fear automated cars will be one of the most hackable technologies ever invented. There are just too many ways to get in and all it takes is one weak spot or chink in the armor for someone with malicious intent to do so.
No thanks, just give me the plain, old-fashioned, mechanical distributor, steering wheel, accelerator pedal, and engine block with no computers or other electronics to "assist" me.
Another problem with automated cars where the driver (will that word even have meaning anymore?) is not able to interfere with the controls, is what happens when the autonomous vehicle must decide whether to save you or someone else in an unavoidable crash situation. This scenario has been explored on Top Gear and other automotive forums.
If the car must choose between plowing into the back of a stopped semi trailer and killing you, or mounting the curb to save you, yet killing a pedestrian, what choice will the vehicle make? Of course, the machine is only as good as its maker and can only do what it is told to do. So who will decide? Who will program that decision into the vehicle's electronic brain? Does anyone even have answers to these questions yet? They most assuredly WILL need to be addressed before total control of vehicles is taken from the hands of human drivers and given over to machines.
Also entering into the confusion is the question of liability in the case of an accident, should one or more of the fail-safe attributes of self-driving cars actually fail. We all know nothing is ever perfect, and cars, no matter how well-built or how well they are cared for, are notorious for breaking down, often precisely at the worst possible moment.
If the automated car you are riding in is involved in a wreck, or worse yet, causes one, who is responsible? Technically you were not in operational control of the vehicle since it was driving itself, but would that matter in the eyes of the law, or those whose property or health were damaged in the accident?
Would the manufacturer of the car be held accountable for the failure of the vehicle? Would it be the designers and implementers of the automated systems that were supposed to guide and control it? Would it be the governmental agency or network in charge of safely coordinating all the movements and interactions of self-driving cars? These are vital questions that must be answered before this imminent technology takes hold, as future prognosticators tell us it surely must, and fairly soon. I’m sure lawyers and government bureaucrats will have a field day hashing out this mess.
In spite of my fears and misgivings about automated, driverless cars, I can see some benefits in certain situations. Say, if you’re too sick to drive yourself to the doctor or you hurt yourself in some sort of horrendous accident and need to get to the hospital fast and no one else is around to take you. I can see it happening like this.
“Take me to the hospital! I’ve cut my carotid artery and I’m bleeding to death!”
“Which hospital, sir?”
“I don’t care which hospital! The closest one! The one you can get to the fastest!”
“Westside Regional is two point two nine six miles to the southeast. St. Joseph’s is three point one seven miles to the west. Community General is . . .”
“Just go, you stupid, freaking machine! GO!”
“I need a destination, sir.”
“I’m dying, you worthless piece of . . . .”
“I will call 911 for you, sir. Sir? Sir? Sir?”
Another advantage to a car that drives itself would be if you were late for work because you stayed up or out too late and overslept. On the way to work in the car, you could dress, eat breakfast, brush your teeth, comb your hair, check the briefings you should have gone over the night before one more time, and anything else you needed to do to be presentable, instead of doing all of that before you left the house.
On long, boring, transcontinental trips you could put the vehicle into autodrive and eat, sleep, read, play games on the onboard computer or your phone, and so on, while the car did all the work. You could leave at dark, get a good night’s sleep on the road, and arrive at your destination rested, relaxed, and refreshed the next morning instead of wiped out, fatigued, and dead tired from driving all day or all night.
If you get caught in a traffic jam, a blocked highway, a storm, or lost in an unfamiliar place, you could turn control over to the automaton (no pun intended) in charge of the automobile and sit back and let it get you out of the jam or back to civilization or find a safer, shorter, or quicker route.
Which leads me back to another potential complaint I can see coming. While my vehicle GPS is handy most of the time, sometimes it can be so STUPID! And I mean really stupid! Sometimes it picks the most ridiculous routes to get where I ask it to go. And it doesn’t know the shortcuts I know, quicker ways to get through certain neighborhoods or around potential trouble spots, which streets are less crowded at certain times of day compared to others, cutting through parking lots or behind businesses, how fast I can drive on certain streets and highways, despite the posted speed limits programmed into the device, and so on.
All it knows is what maps have been loaded into it by people who don’t live in my town and only know it through overhead satellite images and road maps. Again, the machine is only as smart as those who program it.
Sometimes I think whoever programmed my GPS was a bunch of idiots. If I ask it for the shortest route possible, at times it wants me to go past my destination and make a U-turn, when all I have to do is just turn left. Or it will asininely direct me to make three lefts when all I really need to do is make one right. It even wants me to go out of my way and drive farther to get to an address when even a six-year-old child can look at the streets on the map and tell you another way is much shorter.
Perhaps when cars are completely automated, they will have bigger brains with more memory and calculating power to really choose the shortest or quickest route, but based on the past performance of technology compared to human brain power, I seriously doubt any vehicle or machine will ever equal the decision-making of a human brain. And even then it will not know the human shortcuts, such as: I can knock a half mile off my trip by cutting through the hospital parking lot, or I can get there five minutes faster by taking the service road back behind the mall instead of the main thoroughfare in front of it.
The only way that could work would be if the computer that controls the vehicle is semi-sentient and learns as it goes the way a human does. For example, if you keep overriding it and taking a certain route you prefer instead of the one it thinks is best, it eventually will figure out that is the route you like and begin to suggest that instead of the ones it was programmed to recommend. But that would require a calculating machine with an intellect approaching that of a human being. To me, that thought is scary.
Much like nearly all technological advances that have great potential to help, self-driving, or even self-aware, cars are also fraught with peril. They can either enhance or control our lives, depending on how we react to them and what we allow them to do.
I can put up with assistance from my car, as long as I have the final say over what it does instead of its onboard computer. It’s all about control and freedom. I would never put control of my life in the hands of another human so why would I do it with a machine? I have to have the power to tell my car not only where I want to go, but how I want to get there and how fast I want to go at any particular moment, not it dictating to me which way it will go and how long it will take me to get there. I must have the freedom and the power to decide what my vehicle will do.
So someday in the future when academic scientists and government bureaucrats declare that they have all the bugs and kinks worked out and assure us that all cars can drive themselves with no human interaction at all with the controls, and subsequently make it illegal for me to operate my vehicle myself, then I will have to tell them the same thing I’d say if they came for my guns.
You can have my steering wheel when you can pry it from my cold, dead hands.
Jeff Vanderslice