NEWS HEADLINE DISCUSSIONS
Self-Driving Uber Car Racks Up First Kill
Monday, March 19, 2018 7:33 PM
JEWELSTAITEFAN
Wednesday, March 21, 2018 1:33 PM
Wednesday, March 21, 2018 2:10 PM
WISHIMAY
Wednesday, March 21, 2018 11:37 PM
Sunday, March 25, 2018 9:27 AM
SECOND
The Joss Whedon script for Serenity, where Wash lives, is Serenity-190pages.pdf at www.mediafire.com/folder/1uwh75oa407q8/Firefly
Quote:Originally posted by Wishimay: I think I would still trust A.I. driving over people. AI doesn't have ego, road rage, grudges, or bad vision, and it can't be drunk, stoned, sick, or stupid.
Sunday, March 25, 2018 12:00 PM
Quote: Originally posted by Wishimay: I think I would
Sunday, March 25, 2018 12:48 PM
Quote:Originally posted by Wishimay: Quote:Originally posted by Wishimay: I think I would
I said "would". I don't trust them NOW. I think all designs are still too primitive. Give these things the ability to learn from previous mistakes and in 20 yrs I'll think about it.
Sunday, March 25, 2018 3:28 PM
Quote:Originally posted by JEWELSTAITEFAN: Wouldn't it be more beneficial if all these "previous mistakes" occurred in the lab, where the future dead could be the programmers?
Sunday, March 25, 2018 4:47 PM
Quote:Originally posted by Wishimay: They ALREADY have better accident per mile records than PEOPLE do, as Google started testing its first self-driving car in 2009. ONE fatality from then till NINE years later?? It's impressive. Not enough to make me go near it yet, but still impressive.
Sunday, March 25, 2018 8:55 PM
Sunday, March 25, 2018 9:48 PM
Quote:Originally posted by Wishimay: Quote:Originally posted by JEWELSTAITEFAN: Wouldn't it be more beneficial if all these "previous mistakes" occurred in the lab, where the future dead could be the programmers?
Every time you put tire to road, you risk your life... why would you expect perfection from these things without being out on an actual road?? You can't program every scenario in a closed course... They ALREADY have better accident per mile records than PEOPLE do, as Google started testing its first self-driving car in 2009. ONE fatality from then till NINE years later?? It's impressive. Not enough to make me go near it yet, but still impressive.
Sunday, March 25, 2018 10:54 PM
Quote:Originally posted by Wishimay: I'm talking ALL self driving vehicles, obviously. If a person doesn't have time to react, a computer system doesn't really either. A computer itself is faster, but this is an integrated system. Crackhead shouldn't have been WALKING in a busy road at night with a bike. PERIOD. She had a death wish. She didn't even LOOK at oncoming traffic. You play in traffic on a dark night with NOTHING reflective or lit up, you are GOING to get hit!
Monday, March 26, 2018 3:17 AM
Quote:Originally posted by JEWELSTAITEFAN: Of all the vehicles and human drivers that she managed to not get hit by in her life, her luck ran out when she first met a non-human driven car.
Monday, March 26, 2018 3:21 AM
Quote:Originally posted by JEWELSTAITEFAN: And then to delude yourself into considering it impressive?
Monday, March 26, 2018 3:58 AM
Quote:Originally posted by Wishimay: Quote:Originally posted by JEWELSTAITEFAN: Of all the vehicles and human drivers that she managed to not get hit by in her life, her luck ran out when she first met a non-human driven car.
Drinking it up tonight, eh? THERE WAS A HUMAN BEHIND THE WHEEL.
Monday, March 26, 2018 12:25 PM
MOOSE
Monday, March 26, 2018 2:37 PM
Quote:Originally posted by Moose: So News Headlines is now an extension of RWED and its hateful bullshit? One more place to ignore. Hopefully Haken will remove it from the latest discussion listing, just like RWED.
Monday, March 26, 2018 3:24 PM
Monday, March 26, 2018 10:38 PM
Quote:Originally posted by Moose: Oh, I’m not a fan of self driving vehicles but I do wonder if a human driver could have avoided that accident. Maybe, maybe not.
Monday, March 26, 2018 11:20 PM
Monday, March 26, 2018 11:31 PM
Quote:Originally posted by Moose: If you look back at the history of this forum, you’ll see the vast majority of the threads are about News Headlines that concern the show. But since there has been lack of Firefly in the headlines the past few years, I kinda understand you all hijacking it for your own use. But that’s not even what I’m complaining about. Reread this thread and if you can’t figure it out, oh well. It ultimately doesn’t matter. This place has been on life support for a long time. Oh, I’m not a fan of self driving vehicles but I do wonder if a human driver could have avoided that accident. Maybe, maybe not.
Monday, March 26, 2018 11:35 PM
Quote:Originally posted by second: Quote:Originally posted by Moose: Oh, I’m not a fan of self driving vehicles but I do wonder if a human driver could have avoided that accident. Maybe, maybe not.
A human driver who wasn't half asleep on that dark road would have used the high beams because there wasn't any oncoming traffic. The pedestrian would have shown up much better. The Arizona governor has expressed himself forcefully:
Updated March 26, 2018, 10:14 p.m. ET
Arizona Gov. Doug Ducey on Monday ordered Uber Technologies Inc. to suspend testing autonomous vehicles on public roadways in the state, a blow to the company’s development efforts after one of its self-driving cars struck and killed a pedestrian in Tempe. Mr. Ducey, a Republican who welcomed Uber’s self-driving technology with open arms to Arizona in 2016, said in a letter to Uber’s chief executive that he had directed the state’s department of transportation to suspend the company’s ability to test the cars.
Tuesday, March 27, 2018 9:06 PM
Wednesday, March 28, 2018 1:54 AM
Quote:Originally posted by second: March 27, 2018 / 5:13 PM
The new Uber driverless vehicle is armed with only one roof-mounted lidar sensor compared with seven lidar units on the older Ford Fusion models Uber employed, according to diagrams prepared by Uber. In scaling back to a single lidar on the Volvo, Uber introduced a blind zone around the perimeter of the SUV that cannot fully detect pedestrians, according to interviews with former employees and Raj Rajkumar, the head of Carnegie Mellon University's transportation center who has been working on self-driving technology for over a decade.
The lidar system made by Velodyne - one of the top suppliers of sensors for self-driving vehicles - sees objects in a 360-degree circle around the car, but has a narrow vertical range that prevents it from detecting obstacles low to the ground, according to information on Velodyne’s website as well as former employees who operated the Uber SUVs.
Autonomous vehicles operated by rivals Waymo, Alphabet Inc's self-driving vehicle unit, have six lidar sensors, while General Motors Co's vehicle contains five, according to information from the companies. Uber declined to comment on its decision to reduce its lidar count, and referred questions on the blind spot to Velodyne.
Velodyne acknowledged that with the rooftop lidar there is a roughly three meter blind spot around a vehicle, saying that more sensors are necessary. "If you're going to avoid pedestrians, you're going to need to have a side lidar to see those pedestrians and avoid them, especially at night," Marta Hall, president and chief business development officer at Velodyne, told Reuters.
www.reuters.com/article/us-uber-selfdriving-sensors-insight/ubers-use-of-fewer-safety-sensors-prompts-questions-after-arizona-crash-idUSKBN1H337Q
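Velodyne's "roughly three meter blind spot" is easy to sanity-check with basic trigonometry: a roof-mounted lidar whose vertical field of view stops some angle below horizontal simply cannot see the ground close to the car. The mounting height and angle below are illustrative assumptions, not Uber's or Velodyne's published specs:

```python
import math

def blind_radius(mount_height_m: float, lower_fov_deg: float) -> float:
    """Radius of the ground-level ring a single roof lidar cannot see,
    given its height and how far below horizontal its lowest beam reaches."""
    return mount_height_m / math.tan(math.radians(lower_fov_deg))

# Assumed values: sensor ~1.9 m up, lowest beam ~25 degrees below horizontal.
r = blind_radius(1.9, 25.0)
print(f"blind ring extends ~{r:.1f} m from the sensor")
```

With those made-up numbers the ring comes out near 4 m, the same order as the ~3 m figure in the Reuters piece; side-mounted lidars shrink the ring because they sit lower and look outward.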
Wednesday, March 28, 2018 7:25 AM
Quote:Originally posted by JEWELSTAITEFAN: What kind of retard thinks this is OK? At least now we know they are targeting small kids.
Wednesday, March 28, 2018 10:42 AM
ZEEK
Wednesday, March 28, 2018 12:01 PM
Quote:Originally posted by JEWELSTAITEFAN: Quote:Originally posted by Moose: If you look back at the history of this forum, you’ll see the vast majority of the threads are about News Headlines that concern the show. But since there has been lack of Firefly in the headlines the past few years, I kinda understand you all hijacking it for your own use. But that’s not even what I’m complaining about. Reread this thread and if you can’t figure it out, oh well. It ultimately doesn’t matter. This place has been on life support for a long time. Oh, I’m not a fan of self driving vehicles but I do wonder if a human driver could have avoided that accident. Maybe, maybe not.
I was using the descriptive subtitle on the "To The Boards" page, which I try to do with each of the subforums. I was not attempting hijacking.
Wednesday, March 28, 2018 1:24 PM
Quote:Originally posted by Zeek: Honestly we don't know enough details right now to determine what went wrong. Self driving systems are not so much programmed as they are trained. The only way for the systems to learn is to continue to experience more and more different situations. One theory I've heard floated around is that they were potentially training the lidar sensors by using the other sensors to control the driving. Given that this happened at night that seems like a bad idea. Lidar is supposed to be better at detecting things that are difficult to see. It could also be that the lidar sensor they were using was insufficient. That sounds negligent to me and could potentially turn into an expensive lawsuit.
Wednesday, March 28, 2018 4:29 PM
Quote:Originally posted by Moose: Quote:Originally posted by JEWELSTAITEFAN: Quote:Originally posted by Moose: If you look back at the history of this forum, you’ll see the vast majority of the threads are about News Headlines that concern the show. But since there has been lack of Firefly in the headlines the past few years, I kinda understand you all hijacking it for your own use. But that’s not even what I’m complaining about. Reread this thread and if you can’t figure it out, oh well. It ultimately doesn’t matter. This place has been on life support for a long time. Oh, I’m not a fan of self driving vehicles but I do wonder if a human driver could have avoided that accident. Maybe, maybe not. I was using the descriptive subtitle on the "To The Boards" page, which I try to do with each of the subforums. I was not attempting hijacking.
Fair enough. Sorry I’ve been bitchy in this thread, I’ve had a f’ed up couple of weeks and let it spill into here.
Thursday, March 29, 2018 10:50 AM
Quote:Originally posted by second: Humans are ready to drive a car after a few weeks and a few hundred miles of driver's education. And the humans cause only 1 fatality per 100,000,000 vehicle miles. Whatever algorithm Uber is training, it is learning extremely slowly. If Uber was creating a super-humanly safe driving system, I'd expect it to take millions of miles of practice. But I'd also expect it from the very first year to be as good as a human student driver with no more training than a student driver gets. Uber should have never let their cars off a closed test track until it was as good as some dorky 15 year old qualifying for his license.
Thursday, March 29, 2018 1:01 PM
Quote:Originally posted by Zeek: Quote:Originally posted by second: Humans are ready to drive a car after a few weeks and a few hundred miles of driver's education. And the humans cause only 1 fatality per 100,000,000 vehicle miles. Whatever algorithm Uber is training, it is learning extremely slowly. If Uber was creating a super-humanly safe driving system, I'd expect it to take millions of miles of practice. But I'd also expect it from the very first year to be as good as a human student driver with no more training than a student driver gets. Uber should have never let their cars off a closed test track until it was as good as some dorky 15 year old qualifying for his license.
That's not a fair comparison. The self driving system is learning everything, not just driving. So, if we are following the same timeline then the system needs 15 years to be equal to a human. It's learning the equivalent of hand-eye coordination. It's learning object permanence. It's learning what a dog is, what a cat is, what a deer is. It's learning the world just like we do as infants. We even get a head start because some of that is ingrained in our DNA and we just react on instinct. There's no evolution process to give a machine instincts. They have to learn it all.
However, being a specialized system it doesn't really need to learn all the things humans learn. It doesn't need to learn history, science, math, etc. It's not going to spend time on entertainment. It'll never understand pop culture, but we don't need or want it to learn those things. So, maybe 15 years is too generous. I'm fairly certain we'll see fully autonomous cars commercially available in under 5 years. So, I think it will beat a human even in learning.
The other fun advantage is that they're talking about letting the system learn in a virtual environment. Sort of a souped-up driving videogame that allows the car to learn in the matrix. So, maybe we'll never hook a human up to the matrix to teach them kung fu, but for artificial intelligence it might just be a possibility. And then no human lives are on the line while it learns.
Thursday, March 29, 2018 2:35 PM
Quote:Originally posted by JEWELSTAITEFAN: Quote:Originally posted by Moose: Quote:Originally posted by JEWELSTAITEFAN: Quote:Originally posted by Moose: If you look back at the history of this forum, you’ll see the vast majority of the threads are about News Headlines that concern the show. But since there has been lack of Firefly in the headlines the past few years, I kinda understand you all hijacking it for your own use. But that’s not even what I’m complaining about. Reread this thread and if you can’t figure it out, oh well. It ultimately doesn’t matter. This place has been on life support for a long time. Oh, I’m not a fan of self driving vehicles but I do wonder if a human driver could have avoided that accident. Maybe, maybe not. I was using the descriptive subtitle on the "To The Boards" page, which I try to do with each of the subforums. I was not attempting hijacking. Fair enough. Sorry I’ve been bitchy in this thread, I’ve had a f’ed up couple of weeks and let it spill into here.
Thanks for the reply. I suspect I understood what you were referring to, but that was beyond my control, and I couldn't conjure a way for me to satisfy the points you were making. I do hope you will again call me out if I post a thread here which is truly not a News Headline for Discussion.
Friday, March 30, 2018 10:46 AM
Friday, March 30, 2018 11:04 AM
Quote:Originally posted by JEWELSTAITEFAN: So they've only been on the road for 9 years so far, right? The 5 year mark was 4 years ago.
Quote:Originally posted by second: It’s still an open question how the video, lidar, and radar sensors all managed to miss seeing a pedestrian in front of the car. I’ll bet Uber already has a pretty good idea of what happened, but so far they aren’t telling. If I had to guess, the sensors were NOT working properly and the Uber Computer did NOT react by stopping on the side of the road with its hazard blinkers flashing while it waited for a technician to fix whatever knocked out the sensors. Maybe Uber did not program the computer to handle that unusual circumstance.
I've seen human drivers not programmed for unusual circumstances. For one example: a teenage driver with a flat tire kept driving down the road at slow speed as the tire tore itself apart and the aluminum rim was ruined. What was the reason the teenager gave as I replaced the wheel with the spare tire? Home was only a few more blocks away. This was a learning experience in what not to do on a front wheel drive car when the rear tire on the passenger side goes flat.
Friday, March 30, 2018 2:18 PM
Quote:Originally posted by second: Prevent accidents like Uber’s self-driving car crash https://qz.com/1241384
“Imagine if this camera [in Uber’s car] was actually analyzing the driver’s head pose, eye closure rate, eyes on the road or not, various emotional and cognitive states, and in real time was able to alert if the safety driver was not paying attention,” says Rana el Kaliouby, CEO of Affectiva, an AI startup working with auto manufacturers like BMW and Daimler on this problem. Since the Uber car already had a driver-facing camera, el Kaliouby notes, the technology and infrastructure for this solution already exists. All that’s needed is to implement software that actually monitors the driver.
Cadillacs with the “Super Cruise” semi-autonomous mode already have a form of this technology. These cars have a small camera located in the steering wheel that tracks a person’s head and eyes. If the camera detects that a person is not looking at the road, the steering wheel begins to flash and the car makes a warning noise, prompting them to pay attention. If the driver still isn’t watching the road, the car automatically slows down to a stop, puts on the hazard lights, and calls the emergency vehicle-assistance service OnStar.
“How long it takes before the system notices a driver is not paying attention depends on your speed,” Robb Bolio, a lead engineer for GM’s autonomous vehicles unit, told CNBC. “If you are going 75 miles per hour, it’s three or four seconds, depending on the traffic around you. If you are in bumper-to-bumper traffic going 10 miles per hour, it’s a little longer.”
Euro NCAP, an automotive safety organization backed by the European Commission and five European governments, has already said that driver monitoring systems like this will be a key factor in how it rates cars for safety. “The idea is that this is a symbiotic relationship between human and machine,” el Kaliouby says.
“We need to leverage that relationship until we’re comfortable that these vehicles can drive in an autonomous mode, I don’t think anyone would say we’re at a place where we’re comfortable doing that.”
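Those reaction windows translate into a lot of road. A quick sketch of the distances involved; the 75 mph delays come from the GM quote above, while the 6 s figure for 10 mph is an assumption, since GM only says "a little longer":

```python
def metres_travelled(speed_mph: float, seconds: float) -> float:
    """Distance covered while the monitoring system waits to react."""
    return speed_mph * 1609.344 / 3600.0 * seconds

# 3-4 s at 75 mph per the GM quote; the 6 s figure for 10 mph is assumed.
for speed, delay in [(75, 3), (75, 4), (10, 6)]:
    print(f"{speed} mph with a {delay} s delay -> ~{metres_travelled(speed, delay):.0f} m")
```

Roughly 100 to 134 metres pass at highway speed before the alerts even begin, which is why the escalation has to start well before the situation becomes critical.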
Friday, March 30, 2018 4:42 PM
Quote:Originally posted by JEWELSTAITEFAN: Does the Cadillac pull to the side of the road? Stopping in the middle lane of Interstate could be unhealthy.
Friday, March 30, 2018 4:56 PM
Quote:Originally posted by second: Quote:Originally posted by JEWELSTAITEFAN: Does the Cadillac pull to the side of the road? Stopping in the middle lane of Interstate could be unhealthy.
You won’t like what the instructions say:
The system is designed to maintain the current lane. You need to take control to change lanes, steer around a traffic situation or object, merge into traffic, or exit the freeway. Super Cruise does not detect construction zones.
3RD ALERT: If the steering wheel light bar flashes red for too long, a voice prompt will be heard. You should take over steering immediately; otherwise, the vehicle will slow in your lane of travel and eventually brake to a stop. Super Cruise and Adaptive Cruise Control will disengage. In the event of an unresponsive driver, the vehicle will come to a controlled stop, activate the hazard lights, and contact OnStar Emergency Services.
www.cadillac.com/content/dam/cadillac/na/us/english/index/ownership/technology/supercruise/pdfs/2018-cad-ct6-supercruise-personalization.pdf
If you drop dead at 70mph or fall asleep, the Cadillac won’t drive your body to the hospital or to the edge of the road. It will simply stop moving a few seconds after your heart stops or you begin snoring. It won’t cruise west on I-10 until it runs out of gas.
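The escalation sequence in that manual excerpt is essentially a small state machine. Here is a toy paraphrase; the ladder below condenses the quoted steps and is a sketch of the documented behavior, not GM code:

```python
# A toy escalation ladder paraphrasing the Cadillac manual quoted above;
# the steps are condensed from the excerpt, and the code is illustrative only.
ESCALATION = [
    "flash steering-wheel light bar red",
    "voice prompt: take the wheel",
    "slow in current lane, brake to a stop",
    "activate hazard lights",
    "contact OnStar Emergency Services",
]

def respond(driver_takes_over_at_step):
    """Walk the ladder until the driver responds; return the actions taken.
    driver_takes_over_at_step is the step index at which the driver reacts,
    or None if the driver never responds."""
    actions = []
    for i, action in enumerate(ESCALATION):
        if driver_takes_over_at_step is not None and i == driver_takes_over_at_step:
            actions.append("driver took over; Super Cruise disengaged")
            return actions
        actions.append(action)
    return actions

print(respond(1))     # driver responds after the light-bar flash
print(respond(None))  # unresponsive driver: full ladder, ending at OnStar
```

Note the design point the poster is reacting to: nothing in the ladder ever changes lanes, because (per the manual) the system only maintains the current lane.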
Saturday, March 31, 2018 9:03 AM
Quote:Originally posted by JEWELSTAITEFAN: That sounds hazardous. Think of the middle lane of 10.
Saturday, March 31, 2018 3:06 PM
Saturday, March 31, 2018 3:24 PM
Monday, May 7, 2018 4:11 PM
Monday, May 7, 2018 7:34 PM
Quote:Originally posted by second: Uber has reportedly discovered that the fatal crash involving one of its prototype self-driving cars was probably caused by software faultily set up to ignore objects in the road, sources told The Information. Specifically, the autonomous programming detects items around the vehicle and operators fine-tune its sensitivity to make sure it only reacts to true threats (solid objects instead of bags, for example). Unfortunately, the car's software was supposedly set too far in the other direction, and didn't stop in time to avoid hitting bicyclist Elaine Herzberg. www.engadget.com/2018/05/07/uber-crash-reportedly-caused-by-software-that-ignored-objects-in/
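If the report is right, the failure mode is the classic false-positive versus false-negative trade-off: a filter tuned to ignore bags and steam can be tuned so far that it ignores a person. A hypothetical sketch of that kind of tunable filter; the "score" values and names are invented for illustration, not Uber's actual software:

```python
# Hypothetical sketch: detections below the threshold are treated as noise
# (bags, steam) and ignored. Raise the threshold too far and a real
# obstacle gets dropped along with the noise.
def filter_threats(detections, ignore_below):
    """Keep only detections whose score clears the threshold."""
    return [d for d in detections if d["score"] >= ignore_below]

detections = [
    {"label": "plastic bag", "score": 0.10},
    {"label": "pedestrian with bicycle", "score": 0.55},
]

# A cautious setting reacts to the pedestrian...
print([d["label"] for d in filter_threats(detections, ignore_below=0.30)])
# -> ['pedestrian with bicycle']

# ...while an over-tuned one "ignores objects in the road", as the report puts it.
print([d["label"] for d in filter_threats(detections, ignore_below=0.70)])
# -> []
```

The point is that a single global threshold trades nuisance braking against missed obstacles, which is exactly the tension the article describes.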
Tuesday, May 8, 2018 10:27 AM
Tuesday, May 8, 2018 4:17 PM
Quote:Originally posted by Zeek: Quote:Originally posted by second: Uber has reportedly discovered that the fatal crash involving one of its prototype self-driving cars was probably caused by software faultily set up to ignore objects in the road, sources told The Information. Specifically, the autonomous programming detects items around the vehicle and operators fine-tune its sensitivity to make sure it only reacts to true threats (solid objects instead of bags, for example). Unfortunately, the car's software was supposedly set too far in the other direction, and didn't stop in time to avoid hitting bicyclist Elaine Herzberg. www.engadget.com/2018/05/07/uber-crash-reportedly-caused-by-software-that-ignored-objects-in/
I really hope Uber was trying to give a simplistic explanation of their software, because that does not sound like the proper way to build a machine learning algorithm. You don't have "sensitivity" to objects; you get the system to learn how to identify what an object is. Besides, something the size of a person and a bike should definitely not be assumed to be a non-solid object. That "sensitivity" would have to be almost entirely disabled at that point. I mean, by size that's extremely similar to a motorcycle. You can't just plow through those like they're a bag.
Uber probably should not be in the self driving business. They just don't have the resources to develop a good system. They're burning cash like crazy, and that is not a good position from which to fund a lot of research and development.
Tuesday, May 8, 2018 4:22 PM
Quote:Originally posted by second: Quote:Originally posted by Wishimay: They ALREADY have better accident per mile records than PEOPLE do, as Google started testing its first self-driving car in 2009. ONE fatality from then till NINE years later?? It's impressive. Not enough to make me go near it yet, but still impressive.
People driving had 1.02 fatalities per 100 million vehicle miles traveled in 2014. Uber has 1 fatality in only a few million autonomous vehicle miles. There is much room for improvement at Uber. Maybe next time Uber could apply the brakes? Or steer around the pedestrian? Or beep the horn? Flash the high beams? Or do something other than drive straight toward the pedestrian at a constant speed in the same lane like a railroad train on tracks. https://en.wikipedia.org/wiki/Transportation_safety_in_the_United_States#Road_safety
Maybe the Uber Computer was not paying sufficient attention because it was watching YouTube when it killed the pedestrian. Who knows?
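For a rough comparison of those rates: the 1.02 figure is from the post's Wikipedia link, while Uber's autonomous mileage is an assumption here (press reports around that time put it near 3 million miles; treat the number as illustrative):

```python
HUNDRED_MILLION = 100_000_000

def rate_per_100m(fatalities, vehicle_miles):
    """Fatalities per 100 million vehicle miles traveled."""
    return fatalities * HUNDRED_MILLION / vehicle_miles

human_rate = 1.02          # 2014 US figure quoted in the post
uber_miles = 3_000_000     # assumption: reported autonomous miles by early 2018
uber_rate = rate_per_100m(1, uber_miles)

print(f"Uber: ~{uber_rate:.0f} per 100M miles, ~{uber_rate / human_rate:.0f}x the human rate")
```

Even granting the generous assumption, one fatality in a few million miles is dozens of times the human rate per mile, which is the nub of the disagreement in this thread.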
Wednesday, May 30, 2018 12:26 AM
Wednesday, May 30, 2018 4:23 PM
Wednesday, May 30, 2018 4:28 PM
Quote:Originally posted by second: The most difficult to build is the perception module, says Sebastian Thrun, a Stanford professor who used to lead Google’s autonomous-vehicle effort. The hardest things to identify, he says, are rarely-seen items such as debris on the road, or plastic bags blowing across a highway. In the early days of Google’s AV project, he recalls, “our perception module could not distinguish a plastic bag from a flying child.”
According to the NTSB report, the Uber vehicle struggled to identify Elaine Herzberg as she wheeled her bicycle across a four-lane road. Although it was dark, the car’s radar and LIDAR detected her six seconds before the crash. But the perception system got confused: it classified her as an unknown object, then as a vehicle and finally as a bicycle, whose path it could not predict. Just 1.3 seconds before impact, the self-driving system realised that emergency braking was needed. But the car’s built-in emergency braking system had been disabled, to prevent conflict with the self-driving system; instead a human safety operator in the vehicle is expected to brake when needed. But the safety operator, who had been looking down at the self-driving system’s display screen, failed to brake in time.
The cause of the accident therefore has many elements, but is ultimately a system-design failure. When its perception module gets confused, an AV should slow down. But unexpected braking can cause problems of its own: confused AVs have in the past been rear-ended (by human drivers) after slowing suddenly. Hence the delegation of responsibility for braking to human safety drivers, who are there to catch the system when an accident seems imminent. In theory adding a safety driver to supervise an imperfect system ensures that the system is safe overall. But that only works if they are paying attention to the road at all times.
www.economist.com/the-economist-explains/2018/05/29/why-ubers-self-driving-car-killed-a-pedestrian
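The Economist's timeline invites a back-of-the-envelope check. Both numbers below are assumptions: a speed of about 40 mph (close to the speeds reported for the crash) and maximum braking at 0.8 g:

```python
def metres(speed_mph: float, seconds: float) -> float:
    """Distance covered at a constant speed over a time window."""
    return speed_mph * 1609.344 / 3600.0 * seconds

def braking_distance(speed_mph: float, decel_g: float = 0.8) -> float:
    """Distance needed to stop from speed_mph at a constant deceleration."""
    v = speed_mph * 1609.344 / 3600.0   # convert to m/s
    return v * v / (2 * decel_g * 9.81)

SPEED = 40  # mph, assumed
print(f"pedestrian first detected 6.0 s out: ~{metres(SPEED, 6.0):.0f} m away")
print(f"braking decision 1.3 s out: ~{metres(SPEED, 1.3):.0f} m away")
print(f"stopping distance from {SPEED} mph at 0.8 g: ~{braking_distance(SPEED):.0f} m")
```

Under these assumptions, by the time the system decided to brake there was barely more road left (~23 m) than a maximum-effort stop requires (~20 m), while at the six-second mark there was over 100 m, which is the article's point about slowing down as soon as the perception module gets confused.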
Wednesday, May 30, 2018 10:43 PM