
So Uber Car Technology detected you crossing road but DECIDED to KILL YOU!!

swine_flu_H1H1

Alfrescian
Loyal
http://www.phoenixnewtimes.com/news...edestrian-before-impact-did-not-stop-10403960


Governor Ducey enticed Uber to come to Arizona with the promise of lax regulations. Before one of its vehicles killed a pedestrian, Uber had programmed its cars to react less to some obstacles in the road, a bombshell article states.
@DougDucey via Twitter
Report: Uber Autonomous Car Detected Pedestrian Before Fatal Impact but Didn't Stop
Ray Stern | May 7, 2018 | 4:07pm
The autonomous Uber car that hit and killed a pedestrian in Tempe on March 18 had detected the woman, but ignored the information from its sensors, according to a bombshell article published Monday.

In a statement, Uber didn't dispute any of the findings in TheInformation.com article by Amir Efrati.

Efrati's article quotes two unnamed people who may be current or former employees of Uber. The reporting adds a significant twist to the case for autonomous technology followers, and gives the public its first update on the probe into what happened.

Uber, Tempe police, the Maricopa County Sheriff's Office, the National Transportation Safety Board, and the National Highway Traffic Safety Administration all continue to work on a probe into the fatal crash.

Elaine Herzberg, 49, was crossing mid-block on Mill Avenue just north of the bridge over the Town Lake when one of Uber's Volvo XC-90 cars in autonomous mode slammed into her at about 40 mph. An interior camera showed that the car's backup driver, Rafaela Vasquez, wasn't looking at the road in the seconds before the crash.

uber-car-ntsb.jpg

NTSB investigators in Tempe check out the damage to the Uber vehicle involved in the fatal accident on March 18.
NTSB
Governor Doug Ducey encouraged Uber's autonomous-vehicle testing program to come to Arizona with the promise of lax regulations. Since the crash, Ducey has banned Uber's autonomous vehicles from the road, but he has allowed other autonomous-vehicle companies to continue testing.

As Phoenix New Times reported last week, two companies — Waymo (Google) and Nuro, which makes autonomous delivery vehicles — have filed paperwork recently with the state to begin testing fully autonomous, no-backup-driver vehicles on Arizona roads.

According to Efrati's sources, Vasquez and the car might have been alerted to the pedestrian's presence — if only the car had been programmed to respond appropriately to obstacles in the road.

"The car’s sensors detected the pedestrian ... but Uber’s software decided it didn’t need to react right away," the article states. "That’s a result of how the software was tuned."

Uber's cars, like other self-driving vehicles, are made to ignore obstacles they detect that pose no threat.

"In this case, Uber executives believe the company’s system was tuned so that it reacted less to such objects. But the tuning went too far, and the car didn’t react fast enough," Efrati's sources said.

Efrati doesn't make clear who the sources are, but they seem to have deep inside knowledge of the company's self-driving unit. The article describes how employees broke into tears after hearing of Herzberg's death.

Uber's internal investigation showed that a "vital piece of the self-driving car was likely working properly: the 'perception' software, which combines data from the car’s cameras, lidar, and radars to recognize and 'label' objects around it. In this case, the software is believed to have seen the objects. The problem was what the broader system chose to do with that information."

Efrati's sources said that Uber had been "racing" to meet a goal to deploy fully autonomous vehicles on Arizona roads that have no backup drivers and would operate in "good weather" only.

As it worked toward the goal, the article states, Uber programmed its vehicles to react less vigorously to obstacles in the road deemed "false positives" so that passengers would have a smoother ride.
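None of the reports describe Uber's actual implementation, so the following is only a minimal sketch of the trade-off the articles describe: a decision layer that brakes only when a fused "threat" confidence clears a tuned threshold. Raising the threshold suppresses false positives (a smoother ride) at the cost of reacting later, or not at all, to real obstacles. The class, function, and confidence values are all hypothetical.

```python
# Illustrative sketch only, not Uber's code: a toy decision layer showing
# how a confidence threshold trades ride smoothness against reaction time.
from dataclasses import dataclass

@dataclass
class DetectedObject:
    label: str                # e.g. "pedestrian", "plastic_bag"
    threat_confidence: float  # hypothetical fused camera/lidar/radar score, 0..1

def should_brake(obj: DetectedObject, threshold: float) -> bool:
    """Brake only when the fused confidence clears the tuned threshold.

    A higher threshold means fewer phantom stops for harmless debris,
    but also a slower (or absent) reaction to genuine obstacles.
    """
    return obj.threat_confidence >= threshold

bag = DetectedObject("plastic_bag", 0.2)
pedestrian = DetectedObject("pedestrian", 0.7)

# Cautious tuning: the pedestrian triggers braking, the bag does not.
assert should_brake(pedestrian, threshold=0.5)
assert not should_brake(bag, threshold=0.5)

# Over-aggressive comfort tuning: the very same detection is ignored.
assert not should_brake(pedestrian, threshold=0.8)
```

The point of the sketch is that "the software detected the pedestrian" and "the software reacted" are separate steps: a detection can be present in the perception output and still be filtered out by a downstream threshold tuned for comfort.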

New Times asked Uber if any of the information in Efrati's article was unreliable or false. The company's response made no mention of anything inaccurate in the article:

“We’re actively cooperating with the NTSB in their investigation," the company said through a spokesperson. "Out of respect for that process and the trust we’ve built with NTSB, we can’t comment on the specifics of the incident."

Uber is engaged in a "top-to-bottom safety review," the company said, adding that former NTSB Chair Christopher Hart is advising Uber on "overall safety culture."

Tempe police didn't respond immediately to a request for an update on its investigation.


Ray Stern has worked as a newspaper reporter in Arizona for more than two decades. He's won many awards for his reporting, including the Arizona Press Club's Don Bolles Award for Investigative Journalism.


https://www.technologyreview.com/th...s-car-detected-a-pedestrian-but-chose-to-not/



In a fatal crash, Uber’s autonomous car detected a pedestrian—but chose to not stop


The company has found the likely cause of its self-driving-car crash in March that killed someone trying to cross the road.

The news: According to a report by the Information, the vehicle’s software did in fact detect the pedestrian, but it chose not to immediately react. The car was programmed to ignore potential false positives, or things in the road that wouldn’t interfere with the vehicle (like a plastic bag). But those adjustments were taken too far.

Why? The car may have been part of a test for increased rider comfort. Autonomous cars aren’t known for their smooth rides, and by ignoring things that are probably not a threat, a vehicle can cut down on the number of start-and-stop jerks riders experience.

What’s next? Uber is conducting a joint investigation with the National Transportation Safety Board, after which more details are expected to be released. In the meantime, the report could inspire other self-driving-vehicle companies to treat potential false positives with more caution.



https://www.recode.net/2018/5/7/17328104/uber-self-driving-crash-arizona-software-elaine-herzberg

Uber’s self-driving software detected the pedestrian in the fatal Arizona crash but did not react in time
The company’s internal investigation as well as the federal investigation are ongoing.
By Johana Bhuiyan@JMBooyah May 7, 2018, 4:00pm EDT

As part of its ongoing preliminary internal investigation, Uber has determined that its self-driving software did detect the pedestrian who was killed in a recent crash in Arizona but did not react immediately, according to The Information.

The software detected Elaine Herzberg, a 47-year-old woman who was hit by a semi-autonomous Volvo operated by Uber, as she was crossing the street but decided not to stop right away. That’s in part because the technology was adjusted to be less reactive or slower to react to objects in its path that may be “false positives” — such as a plastic bag.

Both Uber and the National Transportation Safety Board launched investigations into the crash to determine whether the software was at fault. Both investigations are ongoing. But people who were briefed on some of the findings of the investigation told The Information that the software may have been the likely cause of the crash.

Self-driving companies can tune their technology to be more or less cautious as it maneuvers around obstacles on public roads. Typically, when the tech is less sophisticated (for example, the computer-vision software that detects and identifies objects), companies will make the vehicle overly cautious.

Those rides can be clumsy and filled with hard braking as the car stops for everything that may be in its path. According to The Information, Uber adjusted the system so it didn't stop for potential false positives, but as a result it was unable to react immediately to Herzberg despite detecting her in its path.


Uber has halted all its self-driving tests on public roads and has hired the former chair of the NTSB, Christopher Hart, to help assess the safety protocols of its self-driving technology.

The company also said it was unable to comment on the investigation, as it is against NTSB policy to reveal any information unless it has been vetted by the agency.

“We have initiated a top-to-bottom safety review of our self-driving vehicles program, and we have brought on former NTSB Chair Christopher Hart to advise us on our overall safety culture,” an Uber spokesperson said in a statement. “Our review is looking at everything from the safety of our system to our training processes for vehicle operators, and we hope to have more to say soon.”

Herzberg’s death has ushered in an important debate about Uber’s safety protocols as well as a broader debate about the safety of testing semi-autonomous technology on public roads. For example, companies typically have two safety drivers — people trained to take back control of the car — until they are completely confident in the capability of the tech. However, Uber only had one vehicle operator.

That’s in spite of the self-driving technology’s slow progress relative to that of other companies, like Waymo. As of February 2017, Uber’s vehicle operators had to take back control of the car an average of once every mile, Recode first reported. As of March 2018, the company was still struggling to meet its goal of driving an average of 13 miles without a driver having to take back control, according to the New York Times.

Alphabet’s self-driving company, Waymo, had a rate of 5,600 miles per intervention in California. (At the time, Uber pointed out this is not the only metric by which to measure self-driving progress.)
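The figures quoted above use "miles per intervention" as the comparison metric, and a quick back-of-envelope calculation with the articles' own numbers shows the size of the gap (bearing in mind Uber's caveat that this is not the only metric of self-driving progress):

```python
# Back-of-envelope comparison using the figures quoted in the article.
uber_miles_per_intervention = 13      # Uber's stated goal, per the NYT
waymo_miles_per_intervention = 5600   # Waymo's California rate

ratio = waymo_miles_per_intervention / uber_miles_per_intervention
print(f"Waymo drove roughly {ratio:.0f}x farther between interventions")
# prints "Waymo drove roughly 431x farther between interventions"
```

Even measured against Uber's goal rather than its actual once-per-mile rate, Waymo's vehicles were going roughly 430 times farther between human interventions.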

But even with multiple vehicle operators, it’s unclear how dependable humans can be as a backup to a technology that is not yet fully developed. As CityLab previously reported, some Uber safety drivers shared those concerns.

 

SeeFartLoong

Alfrescian
Loyal
The software detected Elaine Herzberg, a 47-year-old woman who was hit by a semi-autonomous Volvo operated by Uber, as she was crossing the street but decided not to stop right away. That’s in part because the technology was adjusted to be less reactive or slower to react to objects in its path that may be “false positives” — such as a plastic bag.


It is OK to treat Ang Moh like plastic bags, they are worthless to run over.
 

obama.bin.laden

Alfrescian
Loyal
So proven that it is bullshit that they claimed Niggers are STEALTH at night, this time Ang Moh got killed at night by powerful Uber Technology! Ang Moh STEALTH!
 

tanwahtiu

Alfrescian
Loyal
Angmoh bible Genesis 1 say angmoh are made from image of God...

So Uber kills their God and most coders are IT ah neh ABNN.

So ABNN kill pommie God...

So proven that it is bullshit that they claimed Niggers are STEALTH at night, this time Ang Moh got killed at night by powerful Uber Technology! Ang Moh STEALTH!
 