Self-Driving Car Tops 120mph On The Track

That's at Thunderhill! :) Fun track.

Saw ur sig, yay DSM :D. I got a 1992 GSX 6/4 bolt combo.

But back on topic, f that, I wouldn't want a self-driving car going that fast. You'll be sitting there while it's on autopilot, then a light will come on saying "MALFUNCTION" and you go flying into a building or some shit and die. No thanks, I'll drive, please.
 
I wouldn't want to ride in a vehicle that is computer driven. Imagine if the car hit someone.
Witness: That man hit someone!
Man: No I didn't, it was the car! It can drive itself.
Witness: I saw you behind the wheel.
Man: I was just there for the ride. The car hit the man on its own.
Police: Sure it did, fellow. Let's see how the judge downtown feels about this.
Man: NOOOOOOOOOOOOO!
 
It's one thing to have a track pre-programmed, where you know every inch of the road beforehand -- and in this case, also don't have to deal with other vehicles -- and another to do it like Google's cars, scanning, learning the road, and reacting to the input and objects as you go.
 
Saw ur sig, yay DSM :D. I got a 1992 GSX 6/4 bolt combo.

But back on topic, f that, I wouldn't want a self-driving car going that fast. You'll be sitting there while it's on autopilot, then a light will come on saying "MALFUNCTION" and you go flying into a building or some shit and die. No thanks, I'll drive, please.

I'd be happy to let my car drive itself, and everyone else's cars too. There are way too many idiots out there who are drunk, unconscious, or just speeding way too aggressively for travel by car to be very safe. If robotic systems are reliable and smart, I can tell my car to take me someplace and then sleep, play Angry Birds, or do something else useful while I get taken where I want to go.

I'm sure I wouldn't want to go that fast, but at normal speeds, it'd be a great thing to get stupid people away from making stupid decisions while they're in control of their cars.
 
There are way too many people who barely use the accelerator and apply the brakes far too often and erratically. In my experience those people are far more common (haven't come across many unconscious drivers myself) and present a more frequent danger due to hesitation and a general lack of driving intelligence.

It would be a great thing to get people away from making stupid decisions with cars but any dumbass can get a license. I don't think automated driving is the answer to that problem :\
 
Excellent. Put these into production and make the driver's test about 100x harder, so only the people who can actually drive can pass it. See if we can't get the majority of stupid drivers off the road, so the speed limits can be raised for those of us who are actually capable of handling our cars at decent speeds.
 
Hollywood doesn't do stunt scenes anymore; they barely even have actors. Everything is CGI!
 
I'd be happy to let my car drive itself, and everyone else's cars too. There are way too many idiots out there who are drunk, unconscious, or just speeding way too aggressively for travel by car to be very safe. If robotic systems are reliable and smart, I can tell my car to take me someplace and then sleep, play Angry Birds, or do something else useful while I get taken where I want to go.

I'm sure I wouldn't want to go that fast, but at normal speeds, it'd be a great thing to get stupid people away from making stupid decisions while they're in control of their cars.

Grocery getting and town driving, that is it. Highway, no thanks.
 
It's one thing to have a track pre-programmed, where you know every inch of the road beforehand -- and in this case, also don't have to deal with other vehicles -- and another to do it like Google's cars, scanning, learning the road, and reacting to the input and objects as you go.

Exactly. This is old news. Great, you can pre-program a car to drive on a closed course. :rolleyes: Anybody ever played a racing game before? Same concept.
 
Was the road pre-programmed, though? There's an awful lot of alien antennae poking up above the car for me to think it was simply doing a pre-programmed drive.
 
This car may know the layout of the course, which is no different from a professional racer. However, the car itself determines the fastest lines, and then calculates how to aggressively push that line to the limits without going over. These aren't pre-programmed maneuvers; the software is continually testing how hard it can drive. Then there's the issue of course correction. This is a computer simulation being played out on a real-life physics engine that isn't 100% predictable. The software still has to be able to adjust its line when the car doesn't behave exactly the way it was supposed to.

Here's an article that has a little more detail on how this car operates:
http://news.stanford.edu/news/2010/february1/shelley-pikes-peak-020310.html
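The "keep adjusting the line" part is basically just a feedback loop: plan a target line, measure how far off it the car actually is each cycle, and correct. Here's a really hand-wavy sketch of that idea in Python -- not Stanford's actual code, and the field names and gains are completely made up:

# Toy sketch of "plan a line, then keep correcting toward it."
# Not the real Shelley software, just the general feedback idea.
def steering_correction(target, car_state, k_offset=0.8, k_heading=1.5):
    # How far the car has drifted off the planned line (m) and off the planned heading (rad)
    lateral_error = car_state["lateral_offset"] - target["lateral_offset"]
    heading_error = car_state["heading"] - target["heading"]
    # Steer back toward the line; a real controller also models speed, tire slip, etc.
    return -(k_offset * lateral_error + k_heading * heading_error)

# Every control cycle: read sensors, look up where the planned line says the car
# should be at this point on the track, apply the correction, repeat.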
 
Wasn't this done on Top Gear some years back? They drove a car around the track a bunch of times to program in the route, then shoved Jeremy in the car and set it on auto.
 
Exactly. This is old news. Great, you can pre-program a car to drive on a closed course. :rolleyes: Anybody ever played a racing game before? Same concept.
Same concept, totally different execution.

A simulator uses very simplified physics and is pixel-perfect every time.

It doesn't start drifting slightly on braking because the left caliper has slightly more brake force than the right, or because there's a crosswind, or because the tires slipped a little on the last turn, and so on.

While we've had the concept of self-driving vehicles via simulators for a long time, being able to turn that into reality with the near-infinite complexity of real-world "butterfly effect" physics at 120 mph is quite an amazing leap.
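To put the same thing in code terms: a game just integrates a clean model, while the real car has to absorb disturbances the model never predicted. Rough illustration only, with invented numbers:

import random

def game_brake_step(speed, brake_force, dt=0.01):
    # What a simple racing-game engine does: perfectly clean, symmetric braking.
    return speed - brake_force * dt

def real_brake_step(speed, yaw_rate, brake_force, dt=0.01):
    # The real car: the left caliper bites a bit harder, there's a crosswind,
    # the tires are still greasy from the last corner...
    imbalance = random.gauss(0.0, 0.05) * brake_force   # uneven brake bias
    crosswind = random.gauss(0.0, 0.02)                  # lateral push
    new_speed = speed - (brake_force + imbalance) * dt
    new_yaw_rate = yaw_rate + (imbalance + crosswind) * dt  # the car starts to rotate
    return new_speed, new_yaw_rate

The software has to notice that second return value creeping up and do something about it, every hundredth of a second, at 120 mph.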
 
Excellent. Put these into production and make the driver's test about 100x harder, so only the people who can actually drive can pass it. See if we can't get the majority of stupid drivers off the road, so the speed limits can be raised for those of us who are actually capable of handling our cars at decent speeds.

I'd never be allowed to drive legally if the driving test wasn't basically stupid-proof, but I really wouldn't mind at all if my car would do the driving for me.

Grocery getting and town driving, that is it. Highway, no thanks.

It doesn't matter to me where, but I think highways would be safer places for robotics to predict their situation. There are random kids playing, trash cans rolling out, lots more intersections, and generally more complicated traffic flow in suburban and urban areas, which makes them a more difficult setting in which to use software to predict what might happen. Highways, on the other hand, are long, flat stretches of well-marked road with controlled entries and exits. Signs are even standardized.

This car may know the layout of the course, which is no different from a professional racer. However, the car itself determines the fastest lines, and then calculates how to aggressively push that line to the limits without going over. These aren't pre-programmed maneuvers; the software is continually testing how hard it can drive. Then there's the issue of course correction. This is a computer simulation being played out on a real-life physics engine that isn't 100% predictable. The software still has to be able to adjust its line when the car doesn't behave exactly the way it was supposed to.

Here's an article that has a little more detail on how this car operates:
http://news.stanford.edu/news/2010/february1/shelley-pikes-peak-020310.html

Yeah, the software is based on their previous car, "Stanley," which competed in the DARPA off-road competition back in 2005. It really does analyze current road conditions and makes decisions on the fly about how to drive. The previous version used some LADAR scanning junk and combined it with a video camera to make smart decisions, and the array of stuff stuck to the roof of the new car makes it pretty obvious this is an updated version of the same equipment.

NOVA did a show that included Stanley and covers a lot more about it and other robotic vehicles. It's on Hulu here: http://www.hulu.com/watch/23347 and you can get it from Netflix by searching for "Great Robot Race".

Wikipedia has a little thingy on Stanley too, but it doesn't talk much about the tech that makes it work: http://en.wikipedia.org/wiki/Stanley_(vehicle)
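As I remember it from that NOVA show, the clever bit in Stanley was using the laser scanner (accurate but short range) to label the nearby ground as drivable, then teaching the camera on the fly what that surface looks like so it could spot road much farther ahead. Something roughly like this -- the data shapes and thresholds here are all invented:

def classify_road(lidar_points, image_pixels):
    # 1. Lidar is trustworthy but short-range: nearby, nearly-flat patches are drivable.
    flat = [p for p in lidar_points if abs(p["height"]) < 0.15]
    # 2. Learn what "road" looks like from the camera pixels over those patches
    #    (just average brightness here; the real thing uses proper color models).
    samples = [image_pixels[p["pixel_index"]] for p in flat]
    if not samples:
        return []
    road_brightness = sum(samples) / len(samples)
    # 3. Anything in the image that looks similar is probably road, even far
    #    beyond lidar range -- which is what lets the car drive fast.
    return [i for i, px in enumerate(image_pixels) if abs(px - road_brightness) < 20]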
 
Screw that. I have a bad enough time being a passenger with another meat-bag behind the wheel, and they want us to put our safety in software?

I'll say it again.

Whatever happened to training people how to drive in all conditions and pay attention to the road?
 
Whatever happened to training people how to drive in all conditions and pay attention to the road?
Liberals. Driving became a right instead of a privilege. Take away the driver's licenses of the dumbest 25% of drivers, and it becomes your responsibility to get them where they need to go.
 
It doesn't matter to me where, but I think highways would be safer places for robotics to predict their situation. There are random kids playing, trash cans rolling out, lots more intersections, and generally more complicated traffic flow in suburban and urban areas, which makes them a more difficult setting in which to use software to predict what might happen. Highways, on the other hand, are long, flat stretches of well-marked road with controlled entries and exits. Signs are even standardized.

If it isn't smart enough to navigate in town, then I won't feel safe when I'm doing 60 mph on a highway. There are always deer and other animals jumping out of the forest to ruin your day, as well as black ice and other hazards that can appear out of nowhere. In all honesty, I would never let a car drive me. I like driving, and I'm a very skilled driver as well. I'm not saying it's a bad idea, just that I would rather use my driving skill than let a machine do it for me, so when something bad does happen, I can truly say I tried everything I could, not "I expected my car to do this, but it didn't." Even if it's a self-driving car, you can't hold a car responsible for anything bad; it will always be the human behind the wheel, because ultimately we're supposed to be in control of our vehicle at all times. Cruise control gets stuck and smashes into a car? Driver's fault, because the driver could have used the brakes, the e-brake (if necessary), and the gears to stop it. I'll manually drive, thank you :).
 
If it isn't smart enough to navigate in town, then I won't feel safe when I'm doing 60 mph on a highway. There are always deer and other animals jumping out of the forest to ruin your day, as well as black ice and other hazards that can appear out of nowhere. In all honesty, I would never let a car drive me. I like driving, and I'm a very skilled driver as well. I'm not saying it's a bad idea, just that I would rather use my driving skill than let a machine do it for me, so when something bad does happen, I can truly say I tried everything I could, not "I expected my car to do this, but it didn't." Even if it's a self-driving car, you can't hold a car responsible for anything bad; it will always be the human behind the wheel, because ultimately we're supposed to be in control of our vehicle at all times. Cruise control gets stuck and smashes into a car? Driver's fault, because the driver could have used the brakes, the e-brake (if necessary), and the gears to stop it. I'll manually drive, thank you :).

November 3, 2007 - DARPA's Urban Challenge :)

http://archive.darpa.mil/grandchallenge/
 
Screw that. I have a bad enough time being a passenger with another meat-bag behind the wheel, and they want us to put our safety in software?

I'll say it again.

Whatever happened to training people how to drive in all conditions and pay attention to the road?

It probably went out the door with teaching everyone to cook, learn French, understand all of Shakespeare, and any number of other things you may prioritize in life but other people don't or can't.

I enjoy driving, especially driving stick, but this is better for everyone. Luddites and nostalgics will obviously hold it back for a while, but people suck at driving... even skilled race drivers get tired/sick/drunk/distracted.
 
I enjoy driving, especially driving stick, but this is better for everyone. Luddites and nostalgics will obviously hold it back for a while, but people suck at driving... even skilled race drivers get tired/sick/drunk/distracted.

One of the worst parts of being on a road with other people is their inflated idea of how good they are behind the wheel. I know very few who will ever say anything but "I'm a good driver" or "I'm extremely safe behind the wheel." Overconfident drivers tend to be younger and less experienced, drive faster, and take more risks. It sometimes takes a few totaled cars or even an injury or fatality for someone to stop depending on their own belief about how good they are at driving and start making better, safer decisions. :(
 
One of the worst parts of being on a road with other people is their inflated idea of how good they are behind the wheel. I know very few who will ever say anything but "I'm a good driver" or "I'm extremely safe behind the wheel." Overconfident drivers tend to be younger and less experienced, drive faster, and take more risks. It sometimes takes a few totaled cars or even an injury or fatality for someone to stop depending on their own belief about how good they are at driving and start making better, safer decisions. :(

I'm an excellent driver (srslyyyy). I ride a bus because that's not true of everyone else.
 
One of the worst parts of being on a road with other people is their inflated idea of how good they are behind the wheel. I know very few who will ever say anything but "I'm a good driver" or "I'm extremely safe behind the wheel." Overconfident drivers tend to be younger and less experienced, drive faster, and take more risks. It sometimes takes a few totaled cars or even an injury or fatality for someone to stop depending on their own belief about how good they are at driving and start making better, safer decisions. :(

So what happens when that person is actually a really good driver and knows it, but doesn't let their ability go to their head where it can cloud their judgment and actually impair them?
 
So what happens when that person is actually a really good driver and knows it, but doesn't let their ability go to their head where it can cloud their judgment and actually impair them?

What measurement are we using to figure out if they're good? Do we rely on their feelings about it? The most error-prone perceptions are self-perceptions. Humans pretty much fail miserably across the board at objective self-assessments because they can't detach themselves from their own feelings, even when they want to very badly and know they should.
 
What measurement are we using to figure out if they're good? Do we rely on their feelings about it? The most error-prone perceptions are self-perceptions. Humans pretty much fail miserably across the board at objective self-assessments because they can't detach themselves from their own feelings, even when they want to very badly and know they should.

Use the same measurements that were used to show they were a bad driver.
 
Use the same measurements that were used to show they were a bad driver.

Don't take the easy way out. :) What are they? Who decides what makes something "good" or "bad?" There's the obvious stuff like, "You just crashed into a telephone pole while sending a text message and then backed what was left of your car into someone's living room."

What about more subtle things, though? What happens in that half second you glance down to push a button on the stereo, or look at the passenger seat to check that the tray of freshly baked cookies didn't shift when you turned a corner, and you end up squishing something you didn't know was there? What about the lid of your drink that comes off and spills in your lap and causes you to crash? What about not noticing someone's kid who ducks behind your car to grab a runaway ball without realizing that you're backing up? How about responding to another person's distractions?

Robots and sensor systems can help prevent those kinds of mistakes. They don't get tired, they don't look down to play with a stereo or glance at an incoming call, and they aren't tempted to send just one little text message when they should be watching what's around them. DARPA wants robot vehicles to use in the battlefield so there are fewer humans at risk when a convoy is attacked, but this stuff can work really well for the average person too.
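For a concrete (made-up) example of the kind of check a sensor system runs constantly and a distracted human doesn't: is anything inside my stopping distance right now?

def should_auto_brake(obstacle_distances_m, speed_mps, decel_mps2=6.0, margin_m=0.5):
    # Simple physics: distance needed to stop at the current speed, plus a safety margin.
    stopping_distance = speed_mps ** 2 / (2 * decel_mps2) + margin_m
    return any(d < stopping_distance for d in obstacle_distances_m)

print(should_auto_brake([0.8], speed_mps=2.0))   # something right behind the bumper -> True

It never gets bored of asking that question, which is the whole point.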
 
Robots and sensor systems can help prevent those kinds of mistakes. They don't get tired, they don't look down to play with a stereo or glance at an incoming call, and they aren't tempted to send just one little text message when they should be watching what's around them. DARPA wants robot vehicles to use in the battlefield so there are fewer humans at risk when a convoy is attacked, but this stuff can work really well for the average person too.

No, sensors get covered with dirt, computers crash, electronics fail, software has bugs.....

However, someday computer-driven cars will be safer than human-driven cars (even cars with safe drivers).
With redundant sensors and computers and wireless communications between the cars, a computer-driven car will be much better at avoiding an accident than any human.
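The redundancy part is basically majority voting: run several independent sensors (or whole computers) and only trust a reading when most of them agree, so one dirty or failed sensor gets outvoted instead of crashing the car. A simplified sketch, numbers invented:

def voted_reading(readings, tolerance=0.5):
    for candidate in readings:
        agreeing = [r for r in readings if abs(r - candidate) <= tolerance]
        if len(agreeing) > len(readings) / 2:   # a majority agrees
            return sum(agreeing) / len(agreeing)
    return None                                 # no consensus -> fail safe, hand control back

print(voted_reading([10.1, 10.3, 42.0]))   # the faulty 42.0 gets outvoted -> ~10.2
print(voted_reading([10.1, 25.0, 42.0]))   # total disagreement -> None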
 
Don't take the easy way out. :) What are they? Who decides what makes something "good" or "bad?" There's the obvious stuff like, "You just crashed into a telephone pole while sending a text message and then backed what was left of your car into someone's living room."

What about more subtle things, though? What happens in that half second you glance down to push a button on the stereo, or look at the passenger seat to check that the tray of freshly baked cookies didn't shift when you turned a corner, and you end up squishing something you didn't know was there? What about the lid of your drink that comes off and spills in your lap and causes you to crash? What about not noticing someone's kid who ducks behind your car to grab a runaway ball without realizing that you're backing up? How about responding to another person's distractions?

Robots and sensor systems can help prevent those kinds of mistakes. They don't get tired, they don't look down to play with a stereo or glance at an incoming call, and they aren't tempted to send just one little text message when they should be watching what's around them. DARPA wants robot vehicles to use in the battlefield so there are fewer humans at risk when a convoy is attacked, but this stuff can work really well for the average person too.

So all of this hardware that will be used to make these systems in the millions upon millions of cars will be absolutely perfect and will never fail? I see brand-new monitors here at work fail within a week, not because the manufacturing process got messed up, but because of tiny little imperfections that can't ever be fixed. Will there be a backup system? I wouldn't mind having a backup system for when my human ability fails, but I will never let something fully take over.

Also, on the note about what makes someone a good or bad driver: I said that because you already named off what makes a person a bad driver. Speed isn't always a bad thing and is sometimes good; being slow creates a hazard for other people. Take advantage of opportunities: if you have the opportunity to make a turn or merge, do it quickly, because again, the slower you are, the more of a hazard you are. As for risk taking, only take necessary risks, such as putting your car into a tree if a kid runs out in front of you. Driving is also a giant string of processes and combinations of what might and could happen. When making a decision while driving, there is always the thought process of: can I make the maneuver, is it safe, how should I use my speed, what do I do if I can't, what do I do if I can't adjust a certain way, how do I adjust my adjustment to maneuver properly. I don't know about you, but that's almost the bare minimum that goes through my head when I go to make a maneuver such as merging in rush hour. It's a lot of thought, but it happens very fast because I do it multiple times a day and have done it since I began driving.
 
One of the worst parts of being on a road with other people is their inflated idea of how good they are behind the wheel. I know very few who will ever say anything but "I'm a good driver" or "I'm extremely safe behind the wheel." Overconfident drivers tend to be younger and less experienced, drive faster, and take more risks. It sometimes takes a few totaled cars or even an injury or fatality for someone to stop depending on their own belief about how good they are at driving and start making better, safer decisions. :(
Pfft, that's just what shit drivers say :p
 
So all of this hardware that will be used to make these systems in the millions upon millions of cars will be absolutely perfect and will never fail? I see brand-new monitors here at work fail within a week, not because the manufacturing process got messed up, but because of tiny little imperfections that can't ever be fixed. Will there be a backup system? I wouldn't mind having a backup system for when my human ability fails, but I will never let something fully take over.

Depending on other people to keep you safe is just as bad. I think the problem a lot of people have with this is that they love to drive things. For lots of reasons, people want to love their car and cherish it, because they feel they have control over something in their lives when they drive it. They don't see it as just another appliance like a fridge or a blender, a means to an end. The love for a machine never made much sense to me, though, and neither does the desire to be in complete control. No one is in control of their lives. Every day they depend on utilities controlled by someone else, buildings built by someone else, and so forth. Nothing is fully within a person's control as it is, and I'd rather depend on a machine than a person.

Also, on the note about what makes someone a good or bad driver: I said that because you already named off what makes a person a bad driver. Speed isn't always a bad thing and is sometimes good; being slow creates a hazard for other people. Take advantage of opportunities: if you have the opportunity to make a turn or merge, do it quickly, because again, the slower you are, the more of a hazard you are. As for risk taking, only take necessary risks, such as putting your car into a tree if a kid runs out in front of you. Driving is also a giant string of processes and combinations of what might and could happen. When making a decision while driving, there is always the thought process of: can I make the maneuver, is it safe, how should I use my speed, what do I do if I can't, what do I do if I can't adjust a certain way, how do I adjust my adjustment to maneuver properly. I don't know about you, but that's almost the bare minimum that goes through my head when I go to make a maneuver such as merging in rush hour. It's a lot of thought, but it happens very fast because I do it multiple times a day and have done it since I began driving.

Didn't you just say that it wasn't as safe on a highway for a robot because of the speed? Now you're saying going slow isn't safe.

Pfft, that's just what shit drivers say :p

I'll be the first person to admit that I'm not even close to a good driver. I've hit random stuff, made tons of dumb mistakes, and done dangerous and stupid things because I was in a hurry to get someplace and thought that getting those extra two minutes was somehow going to make it all better. :( I drive as little as possible (mostly to work and to get food or run necessary errands) and I try my best not to do that kinda stuff and I haven't hurt anyone else or caused an accident yet, but that doesn't mean I'm a skilled, safe, or good driver. I want something that isn't controlled by emotion to make decisions for me because, honestly, I suck at this whole car thing.
 
Depending on other people to keep you safe is just as bad. I think the problem a lot of people have with this is that they love to drive things. For lots of reasons, people want to love their car and cherish it, because they feel they have control over something in their lives when they drive it. They don't see it as just another appliance like a fridge or a blender, a means to an end. The love for a machine never made much sense to me, though, and neither does the desire to be in complete control. No one is in control of their lives. Every day they depend on utilities controlled by someone else, buildings built by someone else, and so forth. Nothing is fully within a person's control as it is, and I'd rather depend on a machine than a person.



Didn't you just say that it wasn't as safe on a highway for a robot because of the speed? Now you're saying going slow isn't safe.

I guess driving is a privilege rather than a right. I'd rather depend on myself for things that I actually can control.

And I said I wouldn't feel safe; it could be 100% safe and I still wouldn't feel safe. I feel safer driving than flying, but flying has statistically been shown to be safer.
 
Alright, take the fun out of driving too... cool idea, but kinda lame. "Hey kids! When I was your age we drove cars!" I can only imagine the first recall events with bugs/glitches... "Car drove off cliff because owner didn't wash his vehicle." What's next, emotional cars? "Honey, the car is in the driveway crying for oil again." "Dammit, it's the 5th time this week..."
 
I guess driving is a privilege rather than a right. I'd rather depend on myself for things that I actually can control.

And I said I wouldn't feel safe; it could be 100% safe and I still wouldn't feel safe. I feel safer driving than flying, but flying has statistically been shown to be safer.

I admit that it would take a few weeks to get used to something like that, and I'd be all O.O out the windshield for a while until I learned to stop worrying and love the robot.
 
I admit that it would take a few weeks to get used to something like that, and I'd be all O.O out the windshield for a while until I learned to stop worrying and love the robot.

Would definitely be weird to me. I doubt I could get used to it.
 