
charlesarthur : selfdrivingcar   115

Drive.ai, a self-driving car startup once worth $200 million, is closing • SFChronicle.com
Sophia Kunthara and Melia Russell:
<p>Mountain View startup Drive.ai, which made kits to turn regular cars into autonomous ones, will shut its office in June and lay off 90 workers in a permanent closure of its business, according to a filing with a state agency.

At the same time, Apple has hired a handful of hardware and software engineers from Drive.ai, in what appears to be part of a renewed effort by the iPhone and Mac maker to branch out into self-driving cars.

Three weeks ago, Apple was said to be exploring a purchase of Drive.ai, a deal that would let Apple pick up dozens of Drive.ai engineers while eliminating a competitor from the market.

So far, five former Drive.ai employees have changed their LinkedIn profiles to say they left Drive.ai in June and joined Apple the same month. Four of those workers list “special projects” in their job titles. Those employees include data, systems and software engineers.</p>


Apple doesn't seem to quite want to let go of this idea. Can't be a sunk cost thing; they know when to stop throwing good money after bad. Either their ambitions are much bigger than we suspect, or much smaller than we infer.
apple  selfdrivingcar 
8 weeks ago by charlesarthur
Self-driving cars have a problem: safer human-driven ones • WSJ
Christopher Mims:
<p>If you buy one of many new makes and models of car today, you might be surprised to find that, as a standard feature, it can do something your previous car couldn’t: It will take over when it thinks you’re making a mistake.

In the coming years, many cars will do more than that, even driving mostly by themselves, at least on highways. And not just luxury models such as the latest Audi A8 or Cadillac CT6, but something as mainstream as a Nissan Rogue.

Some of this technology has been in development for years, but the newest versions of it—with advanced object recognition, radar-and-laser detection and lightning-fast artificial intelligence—were created for autonomous cars. Many tech entrepreneurs have argued that fleets of robo-taxis would convince us to abandon personal car ownership in favor of “transportation as a service.” Some of them have predicted these robot cars will start populating U.S. roads within the next two years.

But the paradox of how this evolution is playing out is that technology developed to give us driverless vehicles from the likes of Tesla and Alphabet’s Waymo could actually delay their adoption.

When car makers put these incremental tech advances in human-driven cars, they pre-empt one of the fully self-driving car’s supposed advantages: safety. These new systems marry the best machine capabilities—360-degree sensing and millisecond reflexes—with the best of the human brain, such as our ability to come up with novel solutions to unique problems.</p>


Maybe we'll just never quite get there; maybe it'll be an asymptotic, Zeno-style approach rather than a big bang.
selfdrivingcar 
9 weeks ago by charlesarthur
Technical glitches plague Cruise, GM’s $19bn self-driving car unit • The Information
Amir Efrati:
<p>Aside from software shutting off unexpectedly, other more common issues that have surfaced in regular testing of Cruise’s self-driving cars include near collisions with other vehicles, strange steering or unexpected braking—all of which can unnerve passengers, according to previously undisclosed data. (A human backup driver is always present to grab the wheel if anything goes wrong.) Moreover, the cars are relatively slow: in testing in San Francisco, trips typically take 80% longer than they would with a regular car, according to people with knowledge of the company. Cruise did not have a comment.

And comparing Cruise’s vehicles to how regular cars driven by people perform suggests that Cruise’s system, by the end of this year, is expected to be only 5% to 10% as safe as human-level driving in terms of the frequency of crashes, internal data shows. (See separate story.)

Cruise’s problems are not unique. Alphabet’s Waymo, which last December launched a limited robotaxi service with human backup drivers in suburban Phoenix, also has struggled. Uber, which has poured more than a billion dollars into its own self-driving car technology, has been largely stymied since one of its vehicles killed a pedestrian last year. None of the programs are close to offering safe driverless vehicles to the public at a meaningful scale, let alone showing how they might operate the vehicles profitably.</p>


I find it really hard to know where we are on this timeline. In 2012, it took <a href="https://www.wired.com/2012/06/google-x-neural-network/">a colossal effort</a> for Google to categorise cat videos. Last week a conference heard that you could do the same work for $200 in the cloud. Are we maybe just expecting too much, too soon of self-driving vehicles?
selfdrivingcar 
10 weeks ago by charlesarthur
Ford taps the brakes on the arrival of self-driving cars • WIRED
Aarian Marshall:
<p>Ford CEO Jim Hackett Tuesday joined the growing ranks of vehicle and tech execs willing to say publicly that self-driving cars won’t arrive as soon as some had hoped.

The industry “overestimated the arrival of autonomous vehicles,” Hackett told the Detroit Economic Club. Though Ford is not wavering from its self-imposed due date of 2021 for its first purpose-built driverless car, Hackett acknowledged that the vehicle’s “applications will be narrow, what we call geo-fenced, because the problem is so complex.” Bloomberg earlier reported the comments.

Hackett is the latest high-ranking industry insider to engage in public real talk about the prospects for self-driving cars, which back in 2016 seemed just around the corner…

…What’s so complicated about full self-driving? For one, there are no federal regulations for the tech, and states have struggled to fill the void with their own testing rules. Second, industry insiders say sensors need to get better—to “see” farther more cheaply—before the tech can be deployed widely. And developers are still hacking away at better algorithms, ones that can handle the uncertainty of new road situations without hurting their cargo.</p>
selfdrivingcar  ford 
may 2019 by charlesarthur
The Boeing 737 Max crash is a warning to drivers, too • Slate
Henry Grabar:
<p>automation has not made pilots’ jobs easier, says Steve Casner, a pilot and research psychologist at NASA’s Ames Research Center: “You’d think it would dumb down the role of the pilot. Contrary to expectation, you have to know more than ever.”

Casner is one of a number of pilots and analysts who see a parallel between the introduction of automation in airplanes more than 30 years ago and its arrival in cars today, as drivers prepare to relinquish the burdens of navigating the blacktop.

“It’s like 1983 all over again,” Casner told me Monday. Where airlines by and large got it right, he thinks car-makers may be overeager in sticking humans in the car with unfamiliar technologies. “I’m very concerned that even though aviation has shown us how to do it, we’re about to make a big mistake with cars. Sitting there waiting like a potted plant for the lights to blink is not one of our fortes.”

Together with the cognitive psychologist Edwin Hutchins, Casner is the author of a new paper, “What Do We Tell the Drivers? Towards Minimum Driver Training Standards for Partially Automated Cars.” One of their main points is that automation would not have made commercial flight as safe as it is today without pilots who understood how the systems worked.</p>


We're already seeing crashes where the human driver doesn't realise that the system isn't functioning correctly. Disengaging it might get harder.
selfdrivingcar  airlines 
march 2019 by charlesarthur
Uber escapes criminal charges for 2018 self-driving death in Arizona • Ars Technica
Timothy Lee:
<p>"After a very thorough review of all evidence presented, this office has determined that there is no basis for criminal liability for the Uber corporation," wrote Yavapai County Attorney Sheila Sullivan Polk in a letter dated Monday.

Tempe is in Maricopa County, not Yavapai County. But Maricopa County once collaborated with Uber on a public safety campaign. So prosecutors referred the case to Yavapai County to avoid any potential for a conflict of interest.

While Uber appears to be off the hook, Uber driver Rafael Vasquez could still face criminal charges. Dashcam video showed Vasquez repeatedly looking down at her lap in the final minutes before the crash—including five agonizing seconds just before her car struck Herzberg. Records obtained from Hulu suggest that Vasquez was streaming the television show The Voice just before the fatal crash.

Yavapai County Attorney Polk said she didn't have enough information to decide whether it would be appropriate to charge Vasquez. </p>


"The driver of the self-driving car is responsible for this death." That's going to be fun to prosecute. Vasquez was given a horrible task: in charge of a potentially lethal device, but with minimal time to avert it killing. A weird formulation of the trolley problem.
uber  selfdrivingcar  death 
march 2019 by charlesarthur
Apple self-driving car layoffs hit 190 employees in Santa Clara, Sunnyvale • SFChronicle.com
Roland Li:
<p>Apple will lay off 190 employees in Santa Clara and Sunnyvale in its self-driving car division, the company said.

The layoffs were disclosed, along with new details, in a letter this month to the California Employment Development Department. CNBC reported last month that layoffs were occurring in the self-driving car division, known as Project Titan. Tom Neumayr, an Apple spokesman, confirmed that the letter to the state referenced the same layoffs.

Most of the affected employees are engineers, including 38 engineering program managers, 33 hardware engineers, 31 product design engineers and 22 software engineers. The layoffs will take effect April 16, according to the filing.

Apple’s expansion into Santa Clara and Sunnyvale, which are close to the company’s headquarters in Cupertino, ramped up starting in 2014, according to property records and previous news reports. That was the same year the self-driving division was founded.</p>


Laying off engineers? Project Titan just added an -ic to its name.
apple  selfdrivingcar 
february 2019 by charlesarthur
Apple lays off over 200 from Project Titan autonomous vehicle group • CNBC
Lora Kolodny, Christina Farr and Paul Eisenstein:
<p>Apple dismissed just over 200 employees this week from Project Titan, its stealthy autonomous vehicle group, people familiar with the matter told CNBC.

An Apple spokesperson acknowledged the layoffs and said the company still sees opportunity in the space:

“We have an incredibly talented team working on autonomous systems and associated technologies at Apple. As the team focuses their work on several key areas for 2019, some groups are being moved to projects in other parts of the company, where they will support machine learning and other initiatives, across all of Apple,” the spokesperson said.

“We continue to believe there is a huge opportunity with autonomous systems, that Apple has unique capabilities to contribute, and that this is the most ambitious machine learning project ever.”</p>


As someone remarked (on Twitter of course), they should just add the abbreviation for "integrated car" to the end of the project name. Just can't really see Apple doing cars.
apple  cars  selfdrivingcar 
january 2019 by charlesarthur
The deadly recklessness of the self-driving car industry • Gizmodo
Brian Merchant:
<p>The newest and most glaring example of just how reckless corporations in the autonomous vehicle space can be involves the now-infamous fatal crash in Tempe, Arizona, where one of Uber’s cars struck and killed a 49-year-old pedestrian. The Information obtained an email reportedly sent by Robbie Miller, a former manager in the testing-operations group, to seven Uber executives, including the head of the company’s autonomous vehicle unit, warning that the software powering the taxis was faulty and that the backup drivers weren’t adequately trained.

“The cars are routinely in accidents resulting in damage,” Miller wrote. “This is usually the result of poor behavior of the operator or the AV technology. A car was damaged nearly every other day in February. We shouldn’t be hitting things every 15,000 miles. Repeated infractions for poor driving rarely results in termination. Several of the drivers appear to not have been properly vetted or trained.”

That’s nuts. Hundreds of self-driving cars were on the road at the time, in San Francisco, Pittsburgh, Santa Fe, and elsewhere. The AV technology was demonstrably faulty, the backup drivers weren’t staying alert, and despite repeated incidents—some clearly dangerous—nothing was being addressed. Five days after the date of Miller’s email, a Volvo using Uber’s self-driving software struck Elaine Herzberg while she was slowly crossing the street with her bicycle and killed her. The driver was apparently streaming The Voice on Hulu at the time of the accident.

This tragedy was not a freak malfunction of some cutting-edge technology—it is the entirely predictable byproduct of corporate malfeasance.</p>


There isn't a great deal that's new here (apart from his efforts to get Tesla to explain its thinking on autonomous driving), but gathering it in one place is quite startling.
selfdrivingcar  uber 
december 2018 by charlesarthur
Vigilante engineer stops Waymo from patenting key lidar technology • Ars Technica
Mark Harris:
<p>Following a surprise left-field complaint by Eric Swildens, the US Patent and Trademark Office (USPTO) has rejected all but three of 56 claims in Waymo's 936 patent, named for the last three digits of its serial number. The USPTO found that some claims replicated technology described in an earlier patent from lidar vendor Velodyne, while another claim was simply "impossible" and "magic."

Swildens, who receives no money or personal advantage from the decision, told Ars that he was delighted at the news. "The patent shouldn't have been filed in the first place," he said. "It's a very well written patent. However, my personal belief is that the thing that they say they invented, they didn't invent."

The 936 patent played a key role in last year's epic intellectual property lawsuit with Uber. In December 2016, a Waymo engineer was inadvertently copied on an email from one of its suppliers to Uber, showing a lidar circuit design that looked almost identical to one shown in the 936 patent…

…Remarkably, Swildens does not work for Uber or for Velodyne, nor for any other self-driving developer—he works for a small cloud computing startup. Swildens became interested in the patent when it surfaced during the Uber case, and he saw how simple Waymo's lidar circuit seemed to be. "I couldn't imagine the circuit didn't exist prior to this patent," he told Wired last year.

Swildens' research uncovered several patents and books that seemed to pre-date the Waymo patent. He then spent $6,000 of his own money to launch a formal challenge to 936. Waymo fought back, making dozens of filings, bringing expert witnesses to bear, and attempting to re-write several of the patent's claims and diagrams to safeguard its survival.

The USPTO was not impressed. In March, an examiner noted that a re-drawn diagram of Waymo's lidar firing circuit showed current passing along a wire between the circuit and the ground in two directions—something generally deemed impossible.</p>


As everyone on Twitter has been saying, not all heroes wear capes.
selfdrivingcar  patent  lidar 
october 2018 by charlesarthur
Fully driverless Waymo taxis are due out this year, alarming critics • Ars Technica
Timothy Lee:
<p>Waymo, Google's self-driving car project, is planning to launch a driverless taxi service in the Phoenix area in the next three months. It won't be a pilot project or a publicity stunt, either. Waymo is planning to launch a public, commercial service—without anyone in the driver's seat.

And to date, Waymo's technology has gotten remarkably little oversight from government officials in either Phoenix or Washington, DC.

If a company wants to sell a new airplane or medical device, it must undergo an extensive process to prove to federal regulators that it's safe. Currently, there's no comparable requirement for self-driving cars. Federal and state laws allow Waymo to introduce fully self-driving cars onto public streets in Arizona without any formal approval process.

That's not an oversight. It represents a bipartisan consensus in Washington that strict regulation of self-driving cars would do more harm than good.

"If you think about what would be required for some government body to examine the design of a self-driving vehicle and decide if it's safe, that's a very difficult task," says Ed Felten, a Princeton computer scientist who advised the Obama White House on technology issues.</p>


Pretty much impossible to prove "safe". But how safe? Safer than a human? My suspicion is that they will be safer than humans in general, but will do some strange things that lead to accidents a human wouldn't have had.
regulation  waymo  selfdrivingcar 
october 2018 by charlesarthur
This military tech could finally help self-driving cars master snow • Ars Technica
Jonathan Gitlin:
<p>The research conducted at the country's National Laboratories is usually highly classified and specifically aimed at solving national security problems. But sometimes you get a swords-into-ploughshares moment. That's the case here, as a startup called WaveSense looks to apply technology originally developed by MIT Lincoln Laboratory to detect buried mines and improvised explosive devices for use in self-driving cars.

If you want a car to drive itself, it has to know where it is in the world to a pretty high degree of accuracy. Until now, just about every variation of autonomous vehicle we've come across has done that through a combination of highly accurate GPS, an HD map, and some kind of sensor to detect the environment around it. Actually, you want more than one kind of sensor, because redundancy is going to be critical if humans are going to trust their lives to robot vehicles.

Most often, those sensors are a mix of optical cameras and lidar, both of which have pluses and minuses. But is a combination of lidar and camera truly redundant, if both are relying on reflected light? Other solutions have included far infrared, which works by detecting emitted light, but WaveSense's approach is truly photon-independent. What's more, it's the first sensor we've come across that should be almost completely unfazed by snow.


That's because it uses ground-penetrating radar (GPR), mounted underneath the vehicle, to sense the road beneath—now you can see where the military application was. The GPR scans the ground underneath it to a depth of around 10 feet (3m), running at a little over 120Hz to build up a picture of the subterranean world beneath it. As the car drives along, it compares that data to a map layer of already-collected GPR data for the road network and can place the car to within a couple of centimeters.

Yes, this requires pre-mapping, but so does lidar. And WaveSense says that remapping should be far less frequent as conditions under the road are less subject to change than they are above ground.</p>


OK, but don't we need them to master fair-weather roads first?
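
The localisation trick described above - compare the live GPR scan against a previously surveyed map of the same stretch of road and find the best match - is, at its core, map matching by correlation. Here is a minimal sketch of that idea; it is not WaveSense's actual algorithm, and the array shapes, resolution and names are invented for illustration:

```python
import numpy as np

def localize_against_gpr_map(live_scan: np.ndarray, map_strip: np.ndarray,
                             metres_per_sample: float) -> float:
    """Estimate along-track position by sliding a live GPR signature over a
    prior map strip and picking the best normalised correlation.

    live_scan : (n_samples, n_depth_bins) radar reflectivity from the car
    map_strip : (m_samples, n_depth_bins) previously surveyed strip, m >= n
    Returns the estimated offset of the live scan within the map, in metres.
    """
    n, m = live_scan.shape[0], map_strip.shape[0]
    # Normalise so the match is driven by the pattern of reflections,
    # not by overall signal strength (which varies with soil moisture etc.).
    live = (live_scan - live_scan.mean()) / (live_scan.std() + 1e-9)
    best_offset, best_score = 0, -np.inf
    for offset in range(m - n + 1):
        window = map_strip[offset:offset + n]
        window = (window - window.mean()) / (window.std() + 1e-9)
        score = float((live * window).mean())   # normalised cross-correlation
        if score > best_score:
            best_score, best_offset = score, offset
    return best_offset * metres_per_sample

# Toy usage: a 200-sample map strip, and a 50-sample live scan taken from
# somewhere inside it, plus a little noise.
rng = np.random.default_rng(0)
map_strip = rng.normal(size=(200, 32))
true_offset = 73
live = map_strip[true_offset:true_offset + 50] + 0.1 * rng.normal(size=(50, 32))
print(localize_against_gpr_map(live, map_strip, metres_per_sample=0.02))
```

In a real system you would match against a georeferenced 2D map tile and fuse the estimate with odometry and GPS, but the correlation step is the heart of "compare the scan to the prior map".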
selfdrivingcar  snow 
august 2018 by charlesarthur
Hackers plan to keep GM's self-driving cars safe • Yahoo News
Rob Pegoraro:
<p>Their plan for the autonomous vehicles coming from Cruise, based on the Chevy Bolt electric car, starts with a simple premise: Remove the systems that opened up those other vehicles to remote attacks.

Bluetooth? Forget it — the car is driving itself, so you don’t need hands-free calling. The radio? You’ll listen to your phone anyway. And that fancy touchscreen hardwired into the dashboard doesn’t need to exist either, not when the passengers can interact with the car via a stripped-down, locked-down tablet.

“If you don’t need something, take it out,” Valasek said. It’s Security 101 to reduce a device’s “attack surface” — the parts that respond to outside inputs, and which an adversary could therefore try to exploit. But it hasn’t always been Connected Car 101.

Miller’s and Valasek’s formula also includes a healthy dose of paranoia. Their design calls for the car to refuse any inbound connections — no data will come to the vehicle unless it asks for it first.

And much as in the locked-down framework Apple (AAPL) built for the iOS software inside iPhones and iPads, this autonomous-vehicle system will digitally sign and verify code at all levels, with messages from one component to another encrypted whenever possible.

Miller noted one possible speed bump: The wired networking in many cars is too old to support that encryption. “The components in cars are just so far behind,” he complained.

If this level of security by design sounds like something worth paying extra for — sorry, you can’t. Cruise Automation will run only as a ride-hailing service, like an Uber or Lyft but devoid of life forms in the driver’s seat.

That solves the issue of how you sell a car without a radio or Bluetooth: You don’t have to.</p>


Clever - and probably necessary.
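
The "sign and verify at all levels" principle is easy to illustrate. Below is a minimal sketch of the authentication half of it - messages between two in-car components carrying an HMAC tag, verified before anything acts on them. It is not Cruise's or Miller and Valasek's actual design, it omits the encryption layer, and the component names and key handling are invented for the example:

```python
import hmac, hashlib, json, os, time

# Illustrative only: a shared key per component pair, provisioned at build time.
BRAKE_ECU_KEY = os.urandom(32)

def sign_message(key: bytes, payload: dict) -> bytes:
    """Serialise a component-to-component message and append an HMAC tag."""
    body = json.dumps(payload, sort_keys=True).encode()
    tag = hmac.new(key, body, hashlib.sha256).digest()
    return body + b"." + tag.hex().encode()

def verify_message(key: bytes, wire: bytes) -> dict:
    """Reject anything whose tag doesn't verify; never act on unsigned input."""
    body, _, tag_hex = wire.rpartition(b".")
    expected = hmac.new(key, body, hashlib.sha256).hexdigest().encode()
    if not hmac.compare_digest(expected, tag_hex):
        raise ValueError("message failed authentication; dropping")
    return json.loads(body)

# A planner asks the brake controller to slow down; the controller acts only
# because the tag verifies.
wire = sign_message(BRAKE_ECU_KEY, {"cmd": "decelerate", "target_mps": 3.0,
                                    "ts": time.time()})
print(verify_message(BRAKE_ECU_KEY, wire))
```

The point is the refusal path: a component that only acts on messages whose tags verify has, in effect, no unauthenticated inbound interface - which is the same logic as ripping out Bluetooth and the radio.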
selfdrivingcar  hacking 
august 2018 by charlesarthur
Former Apple employee charged with theft of trade secrets related to autonomous car project • Mac Rumors
Juli Clover:
<p>Xiaolang Zhang was hired at Apple in December of 2015 to work on Project Titan, developing software and hardware for use in autonomous vehicles. Zhang specifically worked on Apple's Compute Team, designing and testing circuit boards to analyze sensor data.

He was provided with "broad access to secure and confidential internal databases" due to his position, which contained trade secrets and intellectual property for the autonomous driving project that he ultimately ended up stealing.

In April 2018, Zhang took family leave from Apple following the birth of his child, and during that time, he visited China. Shortly after, he told his supervisor at Apple he was leaving the company and moving to China to work for XMotors, a Chinese startup that also focuses on autonomous vehicle technology.

Zhang's supervisor felt that he had "been evasive" during the meeting, which led Apple's New Product Security Team to begin an investigation, looking into Zhang's historical network activity and analyzing his Apple devices, which were seized when he resigned.

Apple found that just prior to Zhang's departure, his network activity had "increased exponentially" compared to the prior two years he had worked at Apple. He accessed content that included prototypes and prototype requirements, which the court documents specify as power requirements, low voltage requirements, battery system, and drivetrain suspension mounts.</p>


Arrested at the airport as he was about to leave for China. Neil Cybart has <a href="https://twitter.com/neilcybart/status/1016792807145656320">dug into the court filing</a>, which shows there are 5,000 Apple employees who know about "Project Titan" (the self-driving vehicle project) and 2,700 who have access to the Project Titan database. Here's the <a href="https://www.scribd.com/document/383602916/USA-v-Xiaolang-Zhang#from_embed">full court filing</a>.
apple  selfdrivingcar 
july 2018 by charlesarthur
The dream of driverless cars is dying • The Spectator
Christian Wolmar went to a giant "Self-driving vehicle" exhibition in Germany, but found them in short supply:
<p>Surprisingly, I met more doomsayers than purveyors of the autonomous driving dream. The starkest warning came from Tim Mackey, who styles himself ‘senior technical evangelist’ for Black Duck Software, a company that specialises in security issues around autonomous vehicles. He believes there will be a seminal event that will stop all the players in the industry in their tracks. ‘We have had it in other areas of computing, such as the big data hacks and security lapses,’ he said, ‘and it will happen in relation to autonomous cars. At the moment, none of the big players are thinking properly about security aspects and then they will be forced to.’ He pointed to a video showing on another stand in which a man was calling up a car from a garage using a phone app: ‘That sort of thing is just too easy to hack. There’s all sorts of software put into cars that is old and easy to access. We just have to hope that the wake-up call will be minor and not kill anyone.’ Indeed, in a test a few years ago, hackers were able to get hold of a car’s steering and braking systems and Mackey is convinced that criminals will one day use the same method.

More widely, there was a general expectation these suppliers were riding the crest of a wave that will hit the rocks soon. While there is no doubting the scale of this industry, with billions being invested every year, none of the OEMs has yet made a penny from selling a driverless car. This money, benefiting these exhibitors, is therefore a punt, a high-stakes bet there is a pot of gold at the end of the rainbow. One, Johannes, told me: ‘I see a pattern like the dotcom boom. At some point, people are going to realise that the day when they start to get returns for their investment is far off, if ever. Then they will start pulling out and who knows how bad it will get. But the clever money will move somewhere else.’ The bad publicity caused by a couple of deaths in Tesla cars while its autopilot was engaged and by the Uber fatality may be seen as the start of public disenchantment with the concept.</p>

The Spectator is a fairly right-wing magazine, so you might expect it to be down on new tech; but I worked with Wolmar at The Independent, and he's fair but firm on topics like this.
selfdrivingcar 
july 2018 by charlesarthur
Uber test car driver streamed Hulu before fatal crash • Consumer Reports
Jeff Plungis and Keith Barry:
<p>The Tempe police report says distraction was a factor in the crash that killed the pedestrian, Elaine Herzberg.

During Vasquez’s ride in the Uber vehicle, which was recorded on video inside the vehicle as part of the testing, she looked down 204 times, mostly in the direction of the lower center console near her right knee, according to the police report. She was looking down for 5.2 of the final 5.7 seconds prior to the crash, the report says.

A log of Vasquez’s account provided by the video-streaming service Hulu, under a search warrant, showed that “The Voice” was streaming on her account in the final 43 minutes of the drive and that the streaming ended at 9:59 p.m., the approximate time of the collision, the police report says. 

The police concluded that the crash wouldn’t have occurred if Vasquez had been paying attention to the roadway, and indicated that she could be charged with vehicular manslaughter. Details from the police report were published Thursday by the Arizona Republic, Reuters, and other media outlets.</p>


In which case, what's the point of it being "self-driving"? The limitations are what make it pointless: you couldn't trust it on motorways, on side roads, or at night, so there's no point having it. Self-driving systems have to be really, really good, or else not deployed at all, because driver inattention will always be a factor, and accidents will keep happening.
selfdrivingcar  uber 
june 2018 by charlesarthur
Tesla lawsuit highlights risks of inside threat • CNBC
Kate Fazzini:
<p>The incidents described in CEO Elon Musk's email to employees and the <a href="https://www.cnbc.com/2018/06/20/tesla-sues-former-employee-for-allegedly-stealing-gigabytes-of-data-making-false-claims-to-media.html">company's lawsuit against the former employee</a> are jarring because they show how much access insiders have to critical systems of these vehicles, and how difficult it might be to determine whether they are altering code on machines that test the cars.

Cybersecurity professionals have demonstrated how to hack into the infotainment systems of several vehicle brands over the years. These demonstrations have shown that, while it's fairly easy to break into the computer systems that control dashboard computers, getting deeper into the systems that actually run a vehicle – and control its steering, acceleration and braking – is much harder. It is often difficult to get to these computers physically, and they typically aren't connected to the internet or remotely available, making it necessary for an attacker to have physical access to the device.

It's even less likely outside attackers could get access to computers used in vehicle testing.

But insiders have far greater access. Employees may not only have physical access to the critical systems that run manufacturing or program car components, but they may know important information that allows them to write code that can cause meaningful damage to the vehicle.</p>
tesla  software  selfdrivingcar 
june 2018 by charlesarthur
Preliminary report released for crash involving pedestrian, Uber Technologies test vehicle • NTSB
<p>The <a href="https://goo.gl/2C6ZCH">report</a> states data obtained from the self-driving system shows the system first registered radar and LIDAR observations of the pedestrian about six seconds before impact, when the vehicle was traveling 43 mph. As the vehicle and pedestrian paths converged, the self-driving system software classified the pedestrian as an unknown object, as a vehicle, and then as a bicycle with varying expectations of future travel path. At 1.3 seconds before impact, the self-driving system determined that emergency braking was needed to mitigate a collision. According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior. The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator.

In the report the NTSB said the self-driving system data showed the vehicle operator engaged the steering wheel less than a second before impact and began braking less than a second after impact. The vehicle operator said in an NTSB interview that she had been monitoring the self-driving interface and that while her personal and business phones were in the vehicle neither were in use until after the crash.

All aspects of the self-driving system were operating normally at the time of the crash, and there were no faults or diagnostic messages.</p>


It doesn't do emergency braking when it's under computer control, but it doesn't alert the "driver" either. That's all sorts of wrong. It's a pity that someone had to die for this huge error to become apparent.
ai  safety  uber  selfdrivingcar 
may 2018 by charlesarthur
Self-driving cars are here • Medium
Andrew Ng of Drive.ai, which is introducing self-driving cars in Frisco, Texas in July:
<p>It is every self-driving company’s responsibility to ensure safety. We believe the self-driving car industry should adopt these practices:

• Self-driving cars should be made visually distinctive, so that people can quickly recognize them. Even with great AI technology, it is safer if everyone recognizes our cars. After examining multiple designs, we found that a bright orange design is clearly recognizable to pedestrians and drivers.

We deliberately prioritized recognizability over beauty, since it is recognizability that enhances safety.

• While a human driver would make eye contact with a pedestrian to let them know it is safe to cross, a driverless car cannot communicate the same way. Thus, a self-driving car must have other ways to communicate with people around it. Drive.ai is using exterior panels to do this.

• Self-driving car companies should engage with local government to provide practical education programs. Just as school buses, delivery trucks, and emergency vehicles behave differently from regular cars, so too are self-driving cars a different class of vehicle with their own behaviors. It has unique strengths (such as no distracted driving) and limitations (such as inability to make eye contact or understand hand gestures). It’s important to increase the public’s awareness of self-driving through media, unique signage, and dedicated pickup and dropoff zones. We also ask members of the local community to be lawful in their use of public roads and to be considerate of self-driving cars so that we can improve transportation together.</p>


OK, but what about <a href="https://www.theinformation.com/articles/uber-finds-deadly-accident-likely-caused-by-software-set-to-ignore-objects-on-road">people who seem like plastic bags</a>?
ai  cars  selfdrivingcar  drive 
may 2018 by charlesarthur
Uber halts autonomous cars after 49-year-old pedestrian is killed in Arizona • The Washington Post
Faiz Siddiqui and Michael Laris:
<p>The National Transportation Safety Board has opened an investigation into the crash, NTSB spokesman Eric Weiss said.

Uber issued a short statement.

“Our hearts go out to the victim’s family. We are fully cooperating with local authorities in their investigation of this incident,” a company spokeswoman said.

The vehicle was in autonomous mode at the time of the crash, though a driver was behind the wheel, Tempe police said in a statement. The crash occurred about 10 p.m. Sunday in the area of Curry Road and Mill Avenue, a busy intersection with multiple lanes in every direction.

Police said the vehicle was northbound on Curry Road when a woman, identified as 49-year-old Elaine Herzberg, crossing from the west side of the street, was struck. She died at a hospital, the department said.

Missy Cummings, a robotics expert at Duke University who has been critical of the swift rollout of driverless technology across the country, said the computer-vision systems for self-driving cars are “deeply flawed” and can be “incredibly brittle,” particularly in unfamiliar circumstances.</p>


Herzberg wasn't on a "crosswalk" (UK lingo: pedestrian crossing) when she was hit. But that's irrelevant. Cars are meant to yield to pedestrians. Lots more to be discovered about this, including how fast the car was going, how well-lit things were, what system it was using to detect obstacles, and more.

So the first self-driving car has killed a non-driver. Now the really hard questions begin. Who's responsible - the person inside the car, or the authors of the software? How do you stop this happening again - or is there a level of pedestrian killing that is "acceptable"?
selfdrivingcar  killing  pedestrian 
march 2018 by charlesarthur
What’s it like to ride in a self-driving car? • The Economist
Tom Standage:
<p>The vehicle I climbed into was a modified Volvo XC90, with a bundle of extra sensors, including cameras and a spinning LIDAR unit, on its roof. Ryan, the vehicle’s safety driver, manually drove the vehicle out of the car park and onto the public roads, before pressing a button to engage the self-driving system. And then the car started driving itself.

At first, the experience is thrilling. It seems like magic when the steering wheel turns by itself, or the car gently slows to a halt at a traffic light. The autonomous Uber drove carefully but confidently in downtown traffic and light snow, slowing down when passing a school or approaching the brow of a hill, and putting its foot down (as it were) when faced with an open, straight road with no other traffic. The most noticeable difference from a human driver was that the vehicle made no attempt to avoid Pittsburgh’s notorious potholes, making the ride slightly bumpy at times. Sitting in the back seat, I could see a digital representation, displayed on an iPad mounted between the front seats, of how the car perceived the world, with other vehicles, pedestrians and cyclists highlighted in clusters of blue dots. I felt as though I was living in the future. But then, after a minute or two, the novelty wore off. When technology works as expected, it’s boring.</p>

The potholes thing would get a bit weary-making after a while, though. Also expensive getting your tyres and axles fixed.
selfdrivingcar 
march 2018 by charlesarthur
Volvo's Drive Me takes detour on road to full autonomy • Automotive News
Douglas Bolduc:
<p>Volvo’s Drive Me autonomous driving project is taking some detours compared with promises the automaker made when it announced the program four years ago, but Volvo says the changes will make its first Level 4 vehicle even better when it arrives in 2021.

In early announcements about Drive Me, Volvo promised to have 100 self-driving vehicles on the road but that has been downgraded. Volvo now says it will have 100 people involved in the Drive Me program within the next four years. Initially, the people taking part in Drive Me will test the cars with the same Level 2 semiautonomous assistance systems that are commercially available to anyone who purchases the vehicle in markets such as Europe and the U.S.

Drive Me is a public autonomous driving experiment that now includes families in Sweden and will be extended to London and China later. The goal is to provide Volvo with customer feedback for its first model with Level 4 autonomy, which means the car can drive itself but still has a steering wheel and pedals so that the driver can take control when needed.</p>


"When needed"? I don't like that phrase. How quickly might I be needed?
selfdrivingcar 
december 2017 by charlesarthur
Bob Lutz: Kiss the good times goodbye • Auto News
Bob Lutz is a former vice chairman and head of product development at General Motors. He also held senior executive positions with Ford, Chrysler, BMW and Opel:
<p>It saddens me to say it, but we are approaching the end of the automotive era.

The auto industry is on an accelerating change curve. For hundreds of years, the horse was the prime mover of humans and for the past 120 years it has been the automobile.

Now we are approaching the end of the line for the automobile because travel will be in standardized modules.

The end state will be the fully autonomous module with no capability for the driver to exercise command. You will call for it, it will arrive at your location, you'll get in, input your destination and go to the freeway.

On the freeway, it will merge seamlessly into a stream of other modules traveling at 120, 150 mph. The speed doesn't matter. You have a blending of rail-type with individual transportation.

Then, as you approach your exit, your module will enter deceleration lanes, exit and go to your final destination. You will be billed for the transportation. You will enter your credit card number or your thumbprint or whatever it will be then. The module will take off and go to its collection point, ready for the next person to call.

Most of these standardized modules will be purchased and owned by the Ubers and Lyfts and God knows what other companies that will enter the transportation business in the future.

A minority of individuals may elect to have personalized modules sitting at home so they can leave their vacation stuff and the kids' soccer gear in them. They'll still want that convenience.

The vehicles, however, will no longer be driven by humans because in 15 to 20 years — at the latest — human-driven vehicles will be legislated off the highways.

The tipping point will come when 20 to 30 percent of vehicles are fully autonomous. Countries will look at the accident statistics and figure out that human drivers are causing 99.9 percent of the accidents.</p>
selfdrivingcar 
november 2017 by charlesarthur
Building the best possible driver inside Waymo’s castle • TechCrunch
Darrell Etherington:
<p>Waymo classifies anything from Levels 1 through 3 as technically “driver assist” features, according to Krafcik, and this is an “important divide” which Waymo has observed first hand, concluding early on that it’s not an area they’re interested in pursuing.

Krafcik revealed that one of the first products Waymo considered bringing to market back in 2012 and 2013 was a highway driving assist feature, which would handle everything between onramp and exit, but that also required drivers to be fully attentive to the road and their surroundings while it was in operation.

The results, per Krafcik, were downright frightening: footage taken from the vehicles of Google employees testing the highway assist features, which the company showed us during the briefing, included people texting, doing makeup, fumbling around their seat for charge cables and even, in one particularly grievous instance, sleeping while driving 55 MPH on a freeway.

“We shut down this aspect of the project a couple of days after seeing that,” Krafcik said. “The better you make the driver assist technologies… the more likely the human behind the wheel is to fall asleep. And then when the vehicle says hey I need you to take over, they lack contextual awareness.”

This is why Waymo has been very vocal in the past and today about focusing on Level 4 (full autonomy within specific ‘domains’ or geographies and conditions) and Level 5 (full, unqualified autonomy).</p>

"Lacks contextual awareness" is a nice way to say "won't know what the hell is going on". Reminds of the old joke - "I want to die peacefully in my sleep, like my father, not screaming in terror like his passengers."
selfdrivingcar 
october 2017 by charlesarthur
Do autonomous cars dream of driverless roads? • Dark Reading
Laurence Pitt is strategic director for security at Juniper Networks in Europe/Mid-East/Africa:
<p>The UK government is seeking to take a leadership role in the development of these rules by contributing an Autonomous and Electric Vehicle bill which will create a new insurance framework for self-driving cars. In tandem, the UK Department for Transport and Centre for the Protection of National Infrastructure have released a series of documents outlining ‘principles of cyber security for connected and automated vehicles’. These documents form a modern version of Asimov’s Robotic Laws, but with the focus being on the automotive manufacturers to ensure that these vehicles are developed with a defense-in-depth approach so that they remain resilient to threat at all times – even in situations where sensors are unable to respond due to attack or failure.

This legislation will put the United Kingdom at the centre of these new and exciting technological developments, while ensuring that safety and consumer protection remain at the heart of an emerging industry.</p>


Top marks to the sub-editor who ignored Pitt's chosen narrative (Asimov's Laws, which as he points out aren't applicable because the cars aren't sentient) and went with the Philip K Dick one for the headline.

In fact, I'd say it's headline of the month.
selfdrivingcar 
september 2017 by charlesarthur
Autonomous cars: the level 5 fallacy • Monday Note
Jean-Louis Gassée on the idea that cars will be completely self-driving ("Level 5"):
<p>In prior Monday Notes that discussed electric and autonomous cars, a subject of endless fascination, I evoked scenarios where SD cars can’t cope with circumstances that require human intervention. Today, I’ll offer the pedestrian crossing at the intersection of Hayes and Octavia in San Francisco:

<img src="https://cdn-images-1.medium.com/max/1600/1*axBSoWELlJ-n8D4jEKGGFw.png" width="100%" />

Understandably, the Google Street View picture was taken in the early morning. Now, imagine the 1 pm Sunday scene with crowded sidewalks and sticky car traffic. In today’s world, pedestrians and drivers manage a peaceful if hiccuping coexistence. Through eye contact, nods, hand signals, and, yes, courteous restraint, pedestrians decide to sometimes forfeit their right-of-way and let a few cars come through. On the whole, drivers are equally patient and polite (an unceasing subject of amazement for Parisians walking the streets of San Francisco).

Can we “algorithmicize” eye contact and stuttering restraint? Can an SD car acknowledge a pedestrian’s nod, or negotiate “turning rights” with a conventional vehicle?

No, we can’t. And we don’t appear to have a path to overcome such “mundane” challenges.
But you don’t have to believe me, or think I’m not “with it”. We can listen to Chris Urmson, Google’s Director of Self-Driving Cars from 2013 to late 2016 (he had joined the team in 2009). In a SXSW talk in early 2016, Urmson gives a sobering yet helpful vision of the project’s future, summarized by Lee Gomes in an IEEE Spectrum article [as always, edits and emphasis mine]:
<p>“Not only might it take much longer to arrive than the company has ever indicated — as long as 30 years, said Urmson — but the early commercial versions might well be limited to certain geographies and weather conditions. Self-driving cars are much easier to engineer for sunny weather and wide-open roads, and Urmson suggested the cars might be sold for those markets first.”</p>
</p>
ai  machinelearning  automotive  selfdrivingcar 
september 2017 by charlesarthur
Researchers trick self-driving car cameras using stickers • CNet Roadshow
Andrew Krok:
<p>Researchers created two different sorts of attacks on a self-driving car's systems, using a whole lot of math and a little bit of printing. It involves gaining access to a car's classifier, a part within its vision system that tells the car what an object is and what it means to the vehicle. If the car's cameras detect an object, it's up to the classifier to determine how the car handles said object.

<img src="https://cnet4.cbsistatic.com/img/aJ_uWNccbLXhR7gHq1Loe1PDj90=/2017/08/07/93b4233a-70c2-4d48-bae5-b23c64b9a039/sdc-bamboozle-promo.jpg" width="100%" />

The first kind of attack involves printing out a life-size copy of a road sign and taping it over an existing one. A right-turn sign with a sort of grayed-out, pixelated arrow confused the system into believing it was either a stop sign or an added-lane sign, but not a right-turn sign. Thus, a confused vehicle may attempt to stop when it does not need to, causing additional confusion on the road.

The second kind of attack involved small stickers that give off a sort of abstract-art look. These rectangular stickers, in black and white, tricked the system into believing the stop sign was a 45-mph speed limit sign. It should be fairly obvious that nothing good can come from telling a car to hustle through an intersection at speed, as opposed to stopping like usual.

Of course, this all hinges on whether or not malicious parties have access to a vehicle system's classifier, which may be the same across different automakers if they all purchase their systems from a single supplier.</p>
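
For a sense of how these attacks work in the digital domain: the stickers are a physically robust version of an adversarial perturbation, i.e. nudging the input in whatever direction most increases the classifier's loss. Below is a minimal, generic sketch of the gradient-sign variant of that idea - untargeted, and nothing like the researchers' carefully constrained physical attack; the model and "sign" image are stand-ins:

```python
import torch
import torch.nn.functional as F

def fgsm_perturb(model: torch.nn.Module, image: torch.Tensor,
                 true_label: int, epsilon: float = 0.03) -> torch.Tensor:
    """Fast-gradient-sign perturbation: nudge every pixel a small step in the
    direction that increases the classifier's loss on the true label."""
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), torch.tensor([true_label]))
    loss.backward()
    adversarial = image + epsilon * image.grad.sign()
    return adversarial.clamp(0.0, 1.0).detach()

# Toy usage with a stand-in classifier; a real attack would need access to the
# vehicle's actual sign classifier, as the article notes.
model = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(3 * 32 * 32, 10))
stop_sign = torch.rand(1, 3, 32, 32)      # pretend 32x32 RGB crop of a sign
tampered = fgsm_perturb(model, stop_sign, true_label=0)
print(model(stop_sign).argmax().item(), model(tampered).argmax().item())
```

The researchers' contribution was making perturbations like this survive printing, distance and viewing angle; the prerequisite - knowing the classifier - is the same one the quoted piece ends on.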
selfdrivingcar  hacking 
august 2017 by charlesarthur
Autonomous driving, parking and planning • DIGITS to DOLLARS
Jonathan Greenberg:
<p>Beijing, like many rapidly growing cities, now has some formidable traffic problems. (We budgeted two hours between every meeting and found this left very little cushion for on-time arrivals.) But there is another problem with all those cars – where to put them. In recent years, we have watched with mounting horror the difficulties of parking in Beijing. We visited a shopping district in an office park far from the center of the city, and at lunch time cars were double and triple parked the length of the entire street.

If you scan the Internet you can find a whole literature about the amount of space given over to parking. Multi-car garages in the US can occupy a third of the house’s square footage. Add up the amount of land set aside for parking lots and curb parking, and then the necessary buffers, it is a staggering amount of space. In the US our cities are now planned for parking (except in San Francisco which has adopted a deliberate plan to reduce the amount of parking, which is a whole other topic). If cars were autonomous, we could radically reshape the way we build cities. We could make cities denser without making them feel more crowded. Walking around a suburb could become realistic. If we want to get fully Utopian, we could imagine the health benefits from this alone. A more quantitative approach would be to calculate the real estate savings alone from halving parking, an amount that is probably measured in hundreds of billions of dollars.

We readily admit that this is a bit fantastical, but it is not wholly unrealistic.</p>
selfdrivingcar 
july 2017 by charlesarthur
Self-driving cars prove to be labour-intensive for humans • FT
Tim Bradshaw:
<p>Self-driving cars seem like a magical idea. The concept of vehicles that can operate themselves, without steering wheels or pedals, leaps straight from the pages of science fiction. 

Yet like so many fantastical stories, there are “wizards” hidden behind the curtain — lots of them. Constructing the road to fully automated driving, it turns out, requires a lot of manual labour. 

Most companies working on this technology employ hundreds or even thousands of people, often in offshore outsourcing centres in India or China, whose job it is to teach the robo-cars to recognise pedestrians, cyclists and other obstacles. The workers do this by manually marking up or “labelling” thousands of hours of video footage, often frame by frame, taken from prototype vehicles driving around testbeds such as Silicon Valley, Pittsburgh and Phoenix. 

“Machine learning is a myth, it’s all Wizard of Oz type work,” says Jeremy Conrad, an investor at Lemnos Labs in San Francisco. “The labelling teams are incredibly important in every company, and will need to be there for some time because the outdoor environment is so dynamic.”

…“AI practitioners, in my mind, have collectively had an arrogant blind spot, which is that computers will solve everything,” says Matt Bencke, founder and chief executive of Mighty.Ai, which taps a community of part-time workers to filter and tag training data for tech companies.</p>
selfdrivingcar  ai 
july 2017 by charlesarthur
A fleet of self-driving cars will test-drive from Oxford to London • Inverse
Mike Brown:
<p>The United Kingdom is about to play host to one of the most ambitious autonomous car tests ever. Its goal? To find out what happens when you let a fleet of self-driving cars loose into the real world.

The DRIVEN consortium is a government-funded group of companies involved in several aspects of autonomous car development, starting a 30-month test project that will culminate in six to 12 self-driving cars driving between London and Oxford in the second half of 2019. The project aims to go beyond the question of whether we can make a car drive itself, exploring bigger issues like how a computer can judge risk and what happens when an autonomous car loses cellular service.

The open-road testing will put to use the technology developed by Oxford-based artificial intelligence firm Oxbotica. The cars will operate with SAE Level 4 autonomy.

“This is the first exercise where there’s a connected fleet talking to each other about risk and routes and all those sorts of things,” Dr. Graeme Smith, CEO of Oxbotica, tells Inverse.

“Typically, vehicles today work as single vehicles, so this is the first trial where we’re looking at doing some joined-up thinking between the different vehicles.”</p>
selfdrivingcar 
june 2017 by charlesarthur
Berkeley duo's plan to solve traffic jams: hyper-fast lanes for self-driving cars • The Guardian
Benjamin Preston:
<p>Hyperlane works a lot like existing dedicated commuter lanes, only instead of paying extra to use higher-speed, lower-congestion lanes in a human-driven vehicle, the separate lanes are only for autonomous vehicles. After entering an acceleration lane, Hyperlane’s central computer takes over the car’s functions and finds a slot for it in the already fast-moving traffic in the dedicated lanes. Barrs and Chen said vehicles would travel at speeds up to 120mph, and that the centralized computer control – which would be in constant communication with each vehicle using emerging 5G technology – would allow for a more tightly-packed traffic pattern.

“We liken the Hyperlane network to an air traffic control system,” Barrs said.

Sensors in the road would evaluate traffic density, weather hazards, accidents and other changes, prompting the system to adjust vehicle speed as necessary. Like Uber’s pricing structure, fees for Hyperlane would be based upon demand.</p>


Hmm. Controlled by a central computer. No chance of that going wrong, and no risk with vehicles going at 120mph.
selfdrivingcar 
june 2017 by charlesarthur
Intel CEO Krzanich: self-driving cars will double as security cameras • CNBC
Chantel McGee:
<p>The benefits of having self-driving cars go far beyond automatic parking or fewer accidents, Intel CEO Brian Krzanich told CNBC on Thursday.

Among those other benefits: Driverless cars will double as security cameras, he said from the sidelines of the Code Conference in California.

"I always say that the cars are going to be out there looking, so the next time an Amber alert comes up and they're looking for a license plate, the cars should be able to find that license plate quite rapidly," said Krzanich.

The idea could bring up concerns about privacy, but Krzanich has already thought of how to minimize those worries.

"We'll have to put limitations on it," he said. "We'll have to encrypt that data and make sure I can't tell that it's John's [car] necessarily," said Krzanich.</p>


Mass surveillance without a warrant! How delightful.
intel  selfdrivingcar 
june 2017 by charlesarthur
Uber fires former Google engineer at heart of self-driving dispute • The New York Times
Mike Isaac:
<p>Uber has fired Anthony Levandowski, a vice president of technology and the star engineer leading the company’s self-driving automobile efforts, according to an internal email sent to employees on Tuesday.

Mr. Levandowski’s termination, effective immediately, comes as a result of his involvement in a legal battle between Uber and Waymo, the self-driving technology unit spun out of Google last year. Waymo claims that Uber is using trade secrets stolen from Google to develop Uber’s self-driving vehicles, a plan aided by Mr. Levandowski, a former longtime Google employee.

Uber has long denied the accusations. But when Mr. Levandowski was ordered by a federal judge to hand over evidence and testimony to that end, he asserted his Fifth Amendment rights, seeking to avoid possible criminal charges, according to his lawyers. Uber has been unable to convince Mr. Levandowski to cooperate.</p>


The soap opera continues.
google  uber  selfdrivingcar 
may 2017 by charlesarthur
Uber allowed to continue self-driving car project but must return files to Waymo • The Guardian
Sam Levin:
<p>A judge has granted a partial reprieve to Uber in its high-profile intellectual property lawsuit with Google’s self-driving car operation, allowing the ride-hailing company to continue developing its autonomous vehicle technology.

The judge, however, has barred an Uber executive accused of stealing trade secrets from Google spin-off Waymo from continuing to work on self-driving cars’ radar technology, and has ordered Uber to return downloaded documents to Waymo. The judge also said that evidence indicates that Waymo’s intellectual property has “seeped into Uber’s own … development efforts” – suggesting that Uber could face a tough battle as the case moves ahead.

Google’s lawyers were seeking a broader injunction against Uber, which could have significantly impeded the taxi startup’s entire self-driving car program, a move that could have been a fatal setback. The partial victory for Uber follows a judge’s recommendation that federal prosecutors launch a criminal investigation into the accusations that it stole Waymo’s technology.</p>


The case has also been referred to criminal prosecutors on the basis that the code might have been stolen; and Waymo gets to review Uber's code. Uber is really screwed.
uber  waymo  lawsuit  selfdrivingcar 
may 2017 by charlesarthur
Don't worry, driverless cars are learning from Grand Theft Auto • Bloomberg
<p>Last year, scientists from Darmstadt University of Technology in Germany and Intel Labs developed a way to pull visual information from Grand Theft Auto V. Now some researchers are deriving algorithms from GTAV software that’s been tweaked for use in the burgeoning self-driving sector.

The latest in the franchise from publisher Rockstar Games Inc. is just about as good as reality, with 262 types of vehicles, more than 1,000 different unpredictable pedestrians and animals, 14 weather conditions and countless bridges, traffic signals, tunnels and intersections. (The hoodlums, heists and accumulated corpses aren’t crucial components.)

The idea isn’t that the highways and byways of the fictional city of Los Santos would ever be a substitute for bona fide asphalt. But the game “is the richest virtual environment that we could extract data from,” said Alain Kornhauser, a Princeton University professor of operations research and financial engineering who advises the Princeton Autonomous Vehicle Engineering team.

Waymo uses its simulators to create a confounding motoring situation for every variation engineers can think of: having three cars changing lanes at the same time at an assortment of speeds and directions, for instance. What’s learned virtually is applied physically, and problems encountered on the road are studied in simulation.</p>


"Yeah, this new car knows what to do if someone tries to carjack you, too!"
selfdrivingcar  gta 
may 2017 by charlesarthur
A single autonomous car has a huge impact on alleviating traffic • MIT Technology Review
Jamie Condliffe:
<p>You’ve likely seen the <a href="https://www.youtube.com/watch?v=7wm-pZp_mi0">demonstration of phantom traffic jams</a> where cars drive around in a circle to simulate the impact of a single slowing car on a road full of traffic. One car pumps its brakes for no particular reason, and the slowdown ripples through the traffic. Now, the University of Illinois research, led by Daniel Work, shows that placing even just a single autonomous car into one of these circulating rings of traffic can dampen the jam for every car on the road.

The team’s results show that by having an autonomous vehicle control its speed intelligently when a phantom jam starts to propagate, it’s possible to reduce the amount of braking performed further back down the line. The numbers are impressive: the presence of just one autonomous car reduces the standard deviation in speed of all the cars in the jam by around 50%, and the number of sharp hits to the brakes is cut from around nine per vehicle for every kilometer traveled to at most 2.5 — and sometimes practically zero.</p>


When motorways are busy, phantom jams - caused by people driving too close to the car in front, then reacting too violently - are a key cause of holdups. Autonomous cars will probably help by keeping greater distances. Except that a human will then insert their car into the, as they see it, too-big space. Repeat until the self-driving car is at the back of the line.
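
If you want to see the mechanism rather than just the numbers, here is a toy ring-road car-following simulation - my own simplification for illustration, not the Illinois team's controller, with every parameter invented - in which one car low-pass filters its target speed instead of chasing every fluctuation in the gap ahead:

```python
# Toy ring-road simulation of a phantom jam. NOT the Illinois team's
# controller: the follow rule, gains and numbers are invented purely
# to illustrate the idea of one vehicle smoothing out the wave.
import numpy as np

N_CARS = 20        # vehicles on a circular road
ROAD_LEN = 230.0   # ring circumference in metres
DT = 0.1           # time step in seconds
STEPS = 6000       # ten simulated minutes


def desired_speed(gap):
    """Crude human target speed (m/s) as a function of headway (m)."""
    return np.clip(2.0 * (gap - 5.0), 0.0, 15.0)


def simulate(smooth_one_car=False):
    pos = np.linspace(0.0, ROAD_LEN, N_CARS, endpoint=False)
    vel = np.full(N_CARS, 8.0)
    av_target = 8.0                  # the smoothing car's filtered target
    speeds = []
    for step in range(STEPS):
        gaps = (np.roll(pos, -1) - pos) % ROAD_LEN   # headway to car ahead
        target = desired_speed(gaps)
        accel = 1.5 * (target - vel)                 # over-reactive humans
        if smooth_one_car:
            # Car 0 chases a slowly varying average of its target speed
            # rather than every fluctuation, but never exceeds the safe
            # human target for its current gap.
            av_target += 0.01 * (target[0] - av_target)
            accel[0] = 1.5 * (min(av_target, target[0]) - vel[0])
        if step == 200:
            vel[5] *= 0.5                            # small shock starts the jam
        vel = np.clip(vel + accel * DT, 0.0, 20.0)
        pos = (pos + vel * DT) % ROAD_LEN
        if step > STEPS // 2:                        # measure after transients
            speeds.append(vel.copy())
    return float(np.std(np.concatenate(speeds)))


print("speed std dev, 20 human-like cars : %.2f m/s" % simulate())
print("speed std dev, one smoothing car  : %.2f m/s" % simulate(smooth_one_car=True))
```

Comparing the two printed standard deviations gives a feel for the effect; whether a single smoothing car fully calms this particular toy model depends on the parameters, but refusing to amplify the wave is the same mechanism the real experiment exploits.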
selfdrivingcar 
may 2017 by charlesarthur
Charlie Miller on why self-driving cars are so hard to secure from hackers • WIRED
Andy Greenberg:
<p>Two years ago, Charlie Miller and Chris Valasek pulled off a demonstration that shook the auto industry, remotely hacking a Jeep Cherokee via its internet connection to paralyze it on a highway. Since then, the two security researchers have been quietly working for Uber, helping the startup secure its experimental self-driving cars against exactly the sort of attack they proved was possible on a traditional one. Now, Miller has moved on, and he’s ready to broadcast a message to the automotive industry: Securing autonomous cars from hackers is a very difficult problem. It’s time to get serious about solving it.

Last month, Miller left Uber for a position at Chinese competitor Didi, a startup that’s just now beginning its own autonomous ridesharing project. In his first post-Uber interview, Miller talked to WIRED about what he learned in those 19 months at the company—namely that driverless taxis pose a security challenge that goes well beyond even those faced by the rest of the connected car industry.</p>


Consider how lousy the security on most IoT stuff is. Self-driving cars will be different, but you know they'll have sockets for maintenance…
selfdrivingcar 
april 2017 by charlesarthur
Autonomous trucking overlooks skilled labor need • Supply Chain 24/7
Joseph Kane and Adie Turner:
<p>Unsurprisingly, analysts expect automated trucks to proliferate in the next five to ten years, leading to significant job losses in the process.

The only problem? The numbers do not clearly back up the predictions.

In addition to the numerous regulatory and logistical hurdles that automated trucks still need to clear, generalizing the skilled work undertaken by millions of truck drivers and their peers overlooks how this industry functions.

In many ways, the current national conversation on the trucking industry tends to overemphasize the technology and oversimplify the complex set of labor concerns, where many jobs are not likely to disappear anytime soon.

Similar to most infrastructure jobs, truck drivers depend on a wide range of skills to carry out their jobs every day. Just as there are different types of doctors, there are different types of truck drivers – from heavy and tractor-trailer truck drivers who focus on long-haul journeys to delivery truck drivers who carry lighter loads and navigate local streets.

Not surprisingly, many of these drivers are not simply sitting behind the wheel all day on auto drive. They also inspect their freight loads, fix equipment, make deliveries, and perform other non-routinized tasks.

Standardized data verify this non-routinized conception of truck-driving. The Department of Labor’s O*NET database shows how truck drivers have a lower “degree of automation” compared to most occupations nationally.

On a scale of 0 (not at all automated) to 100 (completely automated), O*NET surveys workers across all types of occupations, where those with simpler, repeated tasks are often better suited for automated technologies, such as telephone operators and travel agents.

<img src="http://www.supplychain247.com/images/article/metro_20170321_automatedtrucking_scatterplot.jpg" width="100%" />

The average degree of automation, however, remains quite low (29.6) for all occupations, and heavy and tractor-trailer truck drivers (22) and delivery drivers (24) rate even lower than that. Significantly, they also rate lower than some of the country’s other largest occupations, including office clerks (32), cashiers (37), and receptionists (47).</p>


Expect counternarratives like this to become increasingly common as we really begin to examine what machine learning systems can and can't do. Rather like the last mile problem, it's the small but essential things humans do that make them indispensable.
trucks  autonomy  selfdrivingcar 
april 2017 by charlesarthur
End of road for trucking startup Palleter • Medium
Märt Kelder was chief executive of the aforementioned Palleter:
<p>European trucking market is broken — fragmented and inefficient. There are 2 000 000 trucks and 600 000 trucking companies in Europe. The average company size is three trucks while 80% of the companies have less than 10 trucks. All this fragmentation leads to huge inefficiencies — 25% of the trucks on the road are empty while the rest are loaded to only 59%.

We started Palleter in November 2015 believing the fragmented trucking market presents a huge opportunity and that with clever technology Palleter could increase the efficiency of trucking.

The above is a nice narrative. It’s a story investors buy easily. It’s a story we ourselves bought easily. In fact it was so good we managed to convince ourselves to work 1.5 years with no salary in order to make our dream — a truly efficient trucking marketplace — a reality. A platform where cargo is matched in real time with nearby trucks moving the same way as the freight.

Unfortunately, as you’ll soon see, the reality proved to be a little different than the narrative.</p>


Reality: trucks have less available space; they're not willing to pick up other loads. Wonder if there are lessons to be learnt for those proposing self-driving trucks. Sounds like it might be easier to disrupt humans with robots in this case. (Via <a href="https://twitter.com/Charlesknight/">Charles Knight</a> via Chris Anderson.)
trucks  selfdrivingcar 
april 2017 by charlesarthur
The customer is always wrong: Tesla lets out self-driving car data – when it suits • The Guardian
Sam Thielman:
<p>Tesla regularly communicates detailed information about crashes involving its cars with the media whenever a driver points a finger at its automation software following an accident.

“Autopilot has been shown to save lives and reduce accident rates, and we believe it is important that the public have a factual understanding of our technology,” said a company spokesperson in an email.

The Guardian could not find a single case in which Tesla had sought the permission of a customer who had been involved in an accident before sharing detailed information from the customer’s car with the press when its self-driving software was called into question. Tesla declined to provide any such examples and disputed the description of its automation software, called Autopilot, as “self-driving”.

Data that shows up in the press often comes from the onboard computers of the cars themselves and can tell the public – and law enforcement officials – whether a customer’s hands were on the wheel, when a door was opened, which of its self-driving processes were active at the moment and whether or not they had malfunctioned.

In only one case – the May death of Canton, Ohio, Tesla driver Joshua Brown – has the company publicly admitted that its software made a mistake. In that case, the Autopilot software did not “see” the white side of a tractor-trailer as it moved in front of the car against the white sky. The driver was reportedly watching one of the Harry Potter movies at that moment and did not see the vehicle, either.

Tesla takes issue with the characterization of Autopilot’s performance in the crash as a failure and told the Guardian that it only distributes detailed information from the site of auto accidents to the press when it believes someone quoted in the media is being unfair.</p>


…unfair to Tesla, that is. This is a terrifically clever piece of journalism: it's not based on an event, or an announcement. It's based on observation which reveals something deeper about how we're being manipulated by these companies.
tesla  data  privacy  selfdrivingcar 
april 2017 by charlesarthur
Inside Uber’s self-driving car mess • Recode
Johana Bhuiyan:
<p>taking drivers out of the equation would also increase the company’s profits: Self-driving cars give Uber 100 percent of the fare, the company would no longer have to subsidize driver pay and the cars can run nearly 24 hours a day.

But the company’s autonomous efforts are in turmoil. According to extensive interviews Recode conducted with former and current employees at the self-driving effort, many think it is at a technological standstill and plagued by significant internal tension, especially among its executive leadership.

The issues have included a wave of key talent departures and problematic demos. At least 20 of the company’s engineers have quit since November. One source says a “mini civil war” has broken out between those who joined Otto in search of the independence of a startup, and those who joined Uber’s ATG with ambitions to solidify the company’s place in the future of transportation.

Many of those issues and the resulting questions can be traced back to when Uber acquired Otto, several sources said. As part of the acquisition, Kalanick put its founder and CEO Anthony Levandowski in charge of all of its autonomous efforts.

Uber says that it’s normal for an entity that was founded two years ago as of January 2017 to see this level of attrition, particularly as the company recently paid out its employee bonuses. However, the departures began as early as November 2016. Additionally, a company spokesperson said Uber’s ATG has seen fewer departures than the overall company and has hired more people than have left since the start of the year.</p>


There was a meeting scheduled for Monday, though possibly it was brought forward?
uber  selfdrivingcar 
march 2017 by charlesarthur
Uber suspends self-driving car program after Arizona crash • Reuters
Gina Cherelus:
<p>Uber Technologies suspended its pilot program for driverless cars on Saturday after a vehicle equipped with the nascent technology crashed on an Arizona roadway, the ride-hailing company and local police said.

The accident, the latest involving a self-driving vehicle operated by one of several companies experimenting with autonomous vehicles, caused no serious injuries, Uber said.

Even so, the company said it was grounding driverless cars involved in a pilot program in Arizona, Pittsburgh and San Francisco pending the outcome of investigation into the crash on Friday evening in Tempe.

"We are continuing to look into this incident," an Uber spokeswoman said in an email.

The accident occurred when the driver of a second vehicle "failed to yield" to the Uber vehicle while making a turn, said Josie Montenegro, a spokeswoman for the Tempe Police Department.

"The vehicles collided, causing the autonomous vehicle to roll onto its side," she said in an email. "There were no serious injuries."

Two 'safety' drivers were in the front seats of the Uber car, which was in self-driving mode at the time of the crash, Uber said in an email, a standard requirement for its self-driving vehicles. The back seat was empty.</p>


Over at Google/Waymo, they'll be groaning. Or delighted. This was inevitable (admit it), but is it better that it happened to Uber - everyone's favourite whipping boy this month - rather than to the poster boy for self-driving cars, Waymo?
waymo  uber  selfdrivingcar  accident 
march 2017 by charlesarthur
Internal metrics show how often Uber’s self-driving cars need human help • BuzzFeed News
Priya Anand:
<p>Human drivers were forced to take control of Uber’s self-driving cars about once per mile driven in early March during testing in Arizona, according to an internal performance report obtained by BuzzFeed News. The report reveals for the first time how Uber’s self-driving car program is performing, using a key metric for evaluating progress toward fully autonomous vehicles.

Human drivers take manual control of autonomous vehicles during testing for a number of reasons — for example, to address a technical issue or avoid a traffic violation or collision. The self-driving car industry refers to such events as “disengagements,” though Uber uses the term “intervention” in the performance report reviewed by BuzzFeed News. During a series of autonomous tests the week of March 5, Uber saw disengagement rates greater than those publicly reported by some of its rivals in the self-driving car space.</p>


Once per mile. Never enough to let you relax. Sure to improve, but what is the "safe" amount?
uber  selfdrivingcar 
march 2017 by charlesarthur
The Uber bombshell about to drop • Daniel With Music
Daniel Compton:
<p>In the last few weeks Alphabet <a href="https://medium.com/waymo/a-note-on-our-lawsuit-against-otto-and-uber-86f4f98902a1#.vd51cmjdf">filed a lawsuit</a> against Uber. Alphabet and Waymo (Alphabet’s self-driving car company) allege that Anthony Levandowski, an ex-Waymo manager, stole confidential and proprietary information from Waymo, then used it in his own self-driving truck startup, Otto. Uber acquired Otto in August 2016, so the suit was filed against Uber, not Otto.

This alone is a fairly explosive claim, but the subtext of Alphabet’s filing is an even bigger bombshell. Reading between the lines, (in my opinion) Alphabet is implying that Mr Levandowski arranged with Uber to:

• Steal LiDAR and other self-driving component designs from Waymo
• Start Otto as a plausible corporate vehicle for developing the self-driving technology
• Acquire Otto for $680 million
Below, I’ll present the timeline of events, my interpretation, and some speculation on a possible (bad) outcome for Uber.</p>


It's quite an interpretation. (Also, legal things tend not to go off like bombshells. They're more like super-slow burners.) One suspects it isn't going to be that bad, but Uber could find itself a few years behind rivals if things go badly. Still, it has a ton of money which it can use to get through the hard times.
google  uber  alphabet  selfdrivingcar 
march 2017 by charlesarthur
Autonomous cars must learn to drive the Italian way, the German way and every way in-between • IB Times
Alistair Charlton:
<p>Another challenge faced by autonomous cars is how to navigate different countries and around humans using different forms of etiquette.

Callegari explained how self-driving cars will need to be taught how human driving and behaviours differ by country, and adapt accordingly.

"Blatting down the Autobahn at 250km/h (155mph) is quite common in Germany, then you'll get chased down by a Mercedes or a Porsche. Then in Italy you'll have someone in a Punto doing the same thing, but the driving conditions and the expectations there are quite different."

In other words, autonomous cars will need to be comfortable with moving quickly in Germany, where lane discipline is generally very good, but in Italy they will need to deal with far more erratic driving from locals.

Callegari went on: "People don't really tailgate in the UK; you think it's bad there but it's not that bad. But here [Switzerland] people tailgate, it's just part of the way you drive. They sit two metres off your bumper and the conditions are very, very different in those cases...also how people drive, how aggressive they are, how casual they are is very different. In [rural] US it's very relaxed but around the M25 [motorway around London] it's completely different."</p>
selfdrivingcar 
march 2017 by charlesarthur
Trump administration re-evaluating self-driving car guidance • Reuters
David Shepardson:
<p>US Transportation Secretary Elaine Chao said on Sunday she was reviewing self-driving vehicle guidance issued by the Obama administration and urged companies to explain the benefits of automated vehicles to a skeptical public.

The guidelines, which were issued in September, call on automakers to voluntarily submit details of self-driving vehicle systems to regulators in a 15-point “safety assessment” and urge states to defer to the federal government on most vehicle regulations.

Automakers have raised numerous concerns about the guidance, including that it requires them to turn over significant data, could delay testing by months and lead to states making the voluntary guidelines mandatory…

…Chao said she was "very concerned" about the potential impact of automated vehicles on employment. There are 3.5 million U.S. truck drivers alone and millions of others employed in driving-related occupations.</p>


That last bit suggests that self-driving vehicles might not get the clear road they're hoping for.
selfdrivingcar 
february 2017 by charlesarthur
A lawsuit against Uber highlights the rush to conquer driverless cars • The New York Times
Mike Isaac and Daisuke Wakabayashi:
<p>In one case, an autonomous Volvo zoomed through a red light on a busy street in front of the city’s Museum of Modern Art.

Uber, a ride-hailing service, said the incident was because of human error. “This is why we believe so much in making the roads safer by building self-driving Ubers,” Chelsea Kohler, a company spokeswoman, said in December.

But even though Uber said it had suspended an employee riding in the Volvo, the self-driving car was, in fact, driving itself when it barreled through the red light, according to two Uber employees, who spoke on the condition of anonymity because they signed nondisclosure agreements with the company, and internal Uber documents viewed by The New York Times. All told, the mapping programs used by Uber’s cars failed to recognize six traffic lights in the San Francisco area. “In this case, the car went through a red light,” the documents said.</p>


OK, so Uber is getting a reputation as being a bit of a liar. The "human error" was not stopping the car which was running autonomously from doing something wrong.

But quite separately, further down the story:
<p>[Anthony] Levandowski [who has since left Google to join Uber to run its self-driving cars project] gained some notoriety within Google for selling start-ups, which he had done as side projects, to his employer. In his biography for a real estate firm, for which he is a board member, Mr. Levandowski said he sold three automation and robotics start-ups to Google, including 510 Systems and Anthony’s Robots, for nearly $500m. After this story was published, the real estate firm updated its website erasing Mr. Levandowski’s biography and said that it had “erroneously reported certain facts incorrectly without Mr. Levandowski’s knowledge.”</p>


Feels a bit like Paul Nuttall of UKIP, the polar explorer and Martian astronaut, whose website was just wrong about him. Will Levandowski - who is part of Google's lawsuit against Uber - fit in well at his new employer, do you think?
google  uber  selfdrivingcar 
february 2017 by charlesarthur
The drunk utilitarian: Blood alcohol concentration predicts utilitarian responses in moral dilemmas • Science Direct
Aaron Duke and Laurent Bègue:
<p>In two field studies with a combined sample of 103 men and women recruited at two bars in Grenoble, France, participants were presented with a moral dilemma assessing their willingness to sacrifice one life to save five others. Participants’ blood alcohol concentrations were found to positively correlate with utilitarian preferences (r = .31, p < .001) suggesting a stronger role for impaired social cognition than intact deliberative reasoning in predicting utilitarian responses in the trolley dilemma. Implications for Greene’s dual-process model of moral reasoning are discussed.</p>


So we need self-driving cars to be drunk? ("Utilitarian response" is "kill one person to save five".)
selfdrivingcar  trolleyproblem  alcohol 
february 2017 by charlesarthur
All in a glance • mmitII
Matt Ballantine, writing as a cyclist, picks up the discussion about self-driving vehicles and the risks to cyclists:
<p>Taking a right turn North out of the top of [my] street at most times is a complex process, involving a number of tacit rules…

There are hundreds of ways in which that manoeuvre at that junction can pan out, and most of them don’t strictly follow to the letter of the Highway Code. If one were to wait for both lanes to be clear to be able to turn right, you could be there for hours.

In urban and suburban areas, there are thousands of spots that have similar tacit rules and constant negotiation between road users for them to work effectively…

…the amount of subtle interaction between people who make up the users of the road means that we are a very, very long way from the steering wheel-less motorcars of “the future”. Without being able to switch everyone over to driver-free cars at the same instant and removing all non-autonomous road users at the same time the extent to which road use is a constant form of human interaction is, it seems, lost on the robot car evangelists.</p>
selfdrivingcar 
february 2017 by charlesarthur
The self-driving car's bicycle problem • IEEE Spectrum
Peter Fairley:
<p>when it comes to spotting and orienting bikes and bicyclists, performance drops significantly. Deep3DBox is among the best, yet it spots only 74% of bikes in the benchmarking test. And though it can orient over 88% of the cars in the test images, it scores just 59% for the bikes.

Košecká says commercial systems are delivering better results as developers gather massive proprietary datasets of road images with which to train their systems. And she says most demonstration vehicles augment their visual processing with laser-scanning (ie lidar) imagery and radar sensing, which help recognize bikes and their relative position even if they can’t help determine their orientation.

Further strides, meanwhile, are coming via high-definition maps such as Israel-based Mobileye’s Road Experience Management system. These maps offer computer vision algorithms a head start in identifying bikes, which stand out as anomalies from pre-recorded street views. Ford Motor says “highly detailed 3D maps” are at the core of the 70 self-driving test cars that it plans to have driving on roads this year.</p>


How long before the first bicycle knockover?
bicycle  selfdrivingcar 
february 2017 by charlesarthur
Unexpected consequences of self driving cars • Rodney Brooks
<p>There are big AI perception challenges, just in my neighborhood, to get driverless cars to interact with people as well as driverful cars do. What if level 4 and level 5 autonomy self driving cars are not able to make that leap of fitting in as equals as current cars do?

Cars will clearly have to be able to perceive people walking along the street, even and especially on a snowy day, and not hit them. That is just not debatable. What is debatable is whether the cars will need to still pass them, or whether they will slowly follow people not risking passing them as a human driver would. That slows down the traffic for both the owner of the driverless car, and for any human drivers. The human drivers may get very annoyed with being stuck behind driverless cars. Driverless cars would then be a nuisance.

In the little side streets, when at a stop sign, cars will have to judge when someone is about to cross in front of them. But sometimes people are just chatting at the corner, or it is a parent and child waiting for the school bus that pulls up right there. How long should the driverless car wait? And might someone bully such cars by teasing them that they are about to step off the curb–people don’t try that with human drivers as there will soon be repercussions, but driverless cars doing any percussioning will just not be acceptable.

Since there are no current ways that driverless cars can give social signals to people, beyond inching forward to indicate that they want to go, how will they indicate to a person that they have seen them and it safe to cross in front of the car at a stop sign? Perhaps the cars will instead need to be 100% guaranteed to let people go. Otherwise without social interactions it would be like the case of the dark country road. In that case driverless cars would have a privileged position compared to cars with human drivers and pedestrians. That is not going to endear them to the residents.</p>


I don't think we've even begun to consider how pedestrians and self-driving vehicles are going to interact. All the focus has been on getting the vehicles to navigate themselves, which is a tiny part of driving a car.
selfdrivingcar  interaction 
january 2017 by charlesarthur
CES proves carmakers still confused about autonomous driving • The Information
Amir Efrati:
<p>Mr. Hafner’s [of Mercedes, which has teamed up with Nvidia] comments are interesting given a view among traditionalists in the self-driving field—including people who work at Waymo (formerly Google), Baidu and Ford—that Nvidia’s approach, which is sometimes called “end-to-end deep learning,” either won’t work or is outright dangerous.

Coincidentally, a day before the Mercedes-Nvidia announcement, a primitive version of Nvidia’s “AI-trained” car being demonstrated in a parking lot outside the exhibition hall veered off course. It would have crashed into a portable wall if Nvidia engineers hadn’t remotely stopped it, according to a person who saw the incident.

Danny Shapiro, senior director at Nvidia’s automotive business, said in an interview that the car’s self-driving system, called “pilot net,” had been “trained” earlier in the week during cloudy conditions so when the sun came out on Thursday, the system was unprepared. He added that the car is not representative of Nvidia-powered autonomous driving systems because it was making driving decisions based on data from just one camera. Nvidia’s latest system supports vehicles with many more cameras and other sensors.</p>

But how long will it take to train them in every conceivable weather, road and other condition?
Mercedes  selfdrivingcar  nVidia  ai 
january 2017 by charlesarthur
Uber stops San Francisco self-driving pilot as DMV revoked registrations • TechCrunch
Darrell Etherington:
<p>Uber has confirmed that it will stop its self-driving pilot in San Francisco, following a meeting today with the California DMV and Attorney General’s office. The DMV revoked the registration on 16 self-driving test vehicles Uber was using in its pilot.

The DMV tells TechCrunch that it invited Uber to complete its permitting process at the same time it revoked the vehicle registrations. Uber told TechCrunch that it will instead be looking to deploy the vehicles elsewhere for the time being. Here’s Uber’s statement on the matter in full:
<p>We have stopped our self-driving pilot in California as the DMV has revoked the registrations for our self-driving cars. We’re now looking at where we can redeploy these cars but remain 100 percent committed to California and will be redoubling our efforts to develop workable statewide rules.</p>
</p>

Amazing arrogance from Uber: it's saying it's the laws that are wrong, rather than its lawbreaking cars. So it has <a href="http://venturebeat.com/2016/12/22/ubers-self-driving-cars-flee-to-arizona-after-california-shutdown/">moved them to Arizona</a>. Good luck, folks!
uber  selfdrivingcar  california 
december 2016 by charlesarthur
Witness says self-driving Uber ran red light on its own, disputing Uber's claims • The Guardian
Sam Levin:
<p>An autonomous Uber malfunctioned while in “self-driving mode” and caused a near collision in San Francisco, according to a business owner whose account raises new safety concerns about the unregulated technology launch.

The self-driving car – which Uber introduced without permits, as part of a testing program that California has deemed illegal – accelerated into an intersection while the light was still red and while the automation technology was clearly controlling the car, said Christopher Koff, owner of local cafe AK Subs.

“It looked like the car ran the red light on its own,” Koff, 49, said of the self-driving Uber Volvo, which has a driver in the front seat who can take control when needed. Another car that had the green light had to “slam the brakes” to avoid a crash, he said.

<a href="http://www.consumerwatchdog.org/resources/ltsoublet122016redated.pdf">Koff’s story</a>, which advocacy group Consumer Watchdog shared with state officials on Tuesday, directly contradicts Uber’s public claims that red-light violations have been the result of “human error” and that the drivers, not the technology, have failed to follow traffic laws.</p>


Koff said it happened at about 5am. Early, but still almost caused a crash. I'm running out of synonyms for "foolhardy" in regard to Uber.

Also - like John Gruber - I think Uber is sliding around definitions here. Its suggestion of "human error" could actually mean "a human was meant to stop it, but didn't" instead of "a human was driving this all the time". But you can't expect people to monitor a car like this; it's both exhausting and numbing, like constantly overseeing a learner driver.
uber  selfdrivingcar 
december 2016 by charlesarthur
Uber admits to self-driving car 'problem' in bike lanes as safety concerns mount • The Guardian
Sam Levin:
<p>The San Francisco Bicycle Coalition has <a href="http://www.sfbike.org/news/a-warning-to-people-who-bike-self-driving-ubers-and-right-hook-turns/">released a warning about Uber’s cars</a> based on staff members’ first-hand experiences in the vehicles. When the car was in “self-driving” mode, the coalition’s executive director, who tested the car two days before the launch, observed it twice making an “unsafe right-hook-style turn through a bike lane”.

That means the car crossed the bike path at the last minute in a manner that posed a direct threat to cyclists. The maneuver also appears to violate state law, which mandates that a right-turning car merge into the bike lane before making the turn to avoid a crash with a cyclist who is continuing forward.

“It’s one of the biggest causes of collisions,” said coalition spokesman Chris Cassidy, noting that the group warned Uber of the problem. Company officials told the coalition that Uber was working on the issue but failed to mention that the self-driving program would begin two days later without permits, he said.

“The fact that they know there’s a dangerous flaw in the technology and persisted in a surprise launch,” he said, “shows a reckless disregard for the safety of people in our streets.”</p>


These things haven't been on (illegal) test a week yet, and the risks are mounting. This doesn't feel good. If Uber kills a cyclist, it is in very deep trouble.
uber  cycling  selfdrivingcar 
december 2016 by charlesarthur
Uber might self-certify its own autonomous cars to carry the public • Car and Driver
Mark Harris:
<p>in May, Otto carried out an unlicensed public demonstration of a driverless semi in Nevada, despite being warned by the DMV that it would contravene the state’s rules regarding autonomous testing. The truck drove on Interstate 80 near Reno for several miles with a human driver in the front seats. A DMV official called the stunt illegal and threatened to shut down the agency’s AV program, but under Nevada’s current regulations, there are currently no legal or financial penalties for breaking the rules.

Otto’s runaround of the regulations could have come back to haunt the company.

One of the DMV’s regulation documents says, “Evidence of the unfitness of an applicant to operate an ATCF includes . . . willfully failing to comply with any regulation adopted by the Department.” Another says, “The Department may . . . deny a license to an applicant, upon the grounds of willful failure of the applicant . . . to comply with the provisions of . . . any of the traffic laws [or regulations] of this State.”

Instead, the DMV granted Otto an ATCF license within days of receiving its application. The only company to have flouted Nevada’s autonomous vehicle rules is now the only company licensed to certify itself and other companies wishing to test autonomous technologies.

Jude Hurin, the DMV administrator who had termed Otto’s drive illegal, confirmed that Uber can now certify its own vehicles for public use.</p>


So with California this is now two states where Uber has flouted rules to run self-driving vehicles. Amazing.
uber  selfdrivingcar  legislation 
december 2016 by charlesarthur
Uber blames humans for self-driving car traffic offences as California orders a halt • The Guardian
Sam Levin:
<p>“It is essential that Uber takes appropriate measures to ensure safety of the public,” the California department of motor vehicles (DMV) wrote to Uber on Wednesday after it defied government officials and began piloting the cars in San Francisco without permits. “If Uber does not confirm immediately that it will stop its launch and seek a testing permit, DMV will initiate legal action.”

An Uber spokesperson said two red-light violations were due to mistakes by the people required to sit behind the steering wheel and said the company has suspended the drivers.

A <a href="https://www.youtube.com/watch?v=_CdJ4oae8f4">video</a> posted by Charles Rotter, an operations manager at Luxor, a traditional cab company, shows one of Uber’s computer-controlled cars plowing through a pedestrian crosswalk in downtown about four seconds after the light turned red. Elsewhere, a photo from a San Francisco writer showed one of the Uber vehicles entering an intersection against a red light.

“People could die,” Rotter said in an interview later. “This is obviously not ready for primetime.”</p>


Rewind: "after it defied government officials and began.." So this is Uber being both foolhardy and headstrong, as well as wrong. Can we expect fines, both for breaking traffic laws and operating without a licence?

Oh yes, and these (dangerous) offences happened on day one of the illegal testing.
uber  selfdrivingcar  risk 
december 2016 by charlesarthur
Google to spin out self-driving car project in new company, Waymo • Business Insider
Biz Carson and Danielle Muoio:
<p>the first version of Waymo's self-driving technology to become available won't be quite the revolution that Google once promised. While Google has been testing a fleet of pod-shaped autonomous vehicles without steering wheels or pedals, executives acknowledged on Tuesday that, for the time being at least, cars will continue to be piloted by humans, with Waymo's self-driving technology included as a feature.

The spinout of the self-driving car unit, which is currently housed in X, another Alphabet company, has been expected for some time. But the move comes as Google has faced some setbacks in bringing its vision of a steering-wheel free car to market and as it faces increases competition from Uber, the ride-hailing company which is also developing self-driving cars, as well as other automakers.

“We are a self-driving technology company," Krafcik said. "We’ve made it pretty clear we are not a car company…. We’re not in the business of making better cars, we’re in the business of making better drivers. We’re a self-driving technology company.”</p>


As noted before, all that stuff with the cars was just to attract interest.
google  selfdrivingcar 
december 2016 by charlesarthur
Google Said to Plan Ride-Sharing Service With Fiat Chrysler - Bloomberg
Tommaso Ebhardt , Daniele Lepido , and Mark Bergen:
<p>Google parent Alphabet plans to start a ride-sharing service with Fiat Chrysler Automobiles NA’s minivans as part of a reorganization of the tech company’s automotive unit, people familiar with the matter said.

Google will deploy a semi-autonomous version of the Chrysler Pacifica minivan that it’s developing with the Italian-American carmaker for the new service as early as the end of 2017, said the people, who asked not to be identified as the matter is private. The U.S. tech company will announce as soon as Tuesday a new business model for Alphabet’s auto unit, two of the people said. Alphabet and Fiat Chrysler declined to comment.

For the service, Google will need more than the 100 Pacificas it agreed to develop with Fiat Chrysler in May, the people said. The companies announced plans that month to create about 100 prototypes based on the Chrysler Pacifica hybrid-powered minivan for Google to test its self-driving technology.</p>


When the dust settles, how big a name will Google be in self-driving vehicles?
google  selfdrivingcar 
december 2016 by charlesarthur
Google scaled back self-driving car ambitions • The Information
Amir Efrati:
<p>The decision to pursue a less ambitious plan was made by Alphabet CEO Larry Page and CFO Ruth Porat, who determined that making a car without a steering wheel and foot pedals was impractical, say people familiar with the decision. Current U.S. regulatory guidelines call for a steering wheel and pedals.

Eliminating the steering wheel and pedals would allow designers to reimagine the experience of being in a car. For passengers who want to take a nap, for instance, there might be a reclining, bed-like seat option. The autonomous vehicle industry, including Mr. Page, by and large believes that cars without steering wheels will dominate some day.

The decision to be pragmatic and focus on building a real business with traditional cars wasn’t universally embraced by people at Chauffeur. It’s “a step back, a deviation,” said one person who has been involved with Chauffeur.

For many people at Chauffeur, focusing on a car without a steering wheel would differentiate the car from its rivals. Google co-founder Sergey Brin, who’s had a hand in Chauffeur from nearly the beginning, had been hoping the unit would continue work on a system for vehicles without a wheel, these people say. The self-driving car unit’s former chief, Chris Urmson, had also wanted to pursue this approach. He left this summer, less than a year after Mr. Page hired Mr. Krafcik to lead Chauffeur and bring much-needed structure and urgency to the program.</p>

Reality is biting hard.
Selfdrivingcar  google  chauffeur 
december 2016 by charlesarthur
Michigan lets self-driving cars on roads without human drivers • Associated Press
<p>Companies can now test self-driving cars on Michigan public roads without a driver or steering wheel under new laws that could push the state to the forefront of autonomous vehicle development.

The package of bills signed into law Friday comes with few specific state regulations and leaves many decisions up to automakers and companies like Google and Uber.

It also allows automakers and tech companies to run autonomous taxi services and permits test parades of self-driving tractor-trailers as long as humans are in each truck. And they allow the sale of self-driving vehicles to the public once they are tested and certified, according to the state.

The bills allow testing without burdensome regulations so the industry can move forward with potential life-saving technology, said Gov. Rick Snyder, who was to sign the bills. "It makes Michigan a place where particularly for the auto industry it's a good place to do work," he said.

The bills give Michigan the potential to be a leader by giving the companies more autonomy than say, California, which now requires human backup drivers in case something goes awry.</p>

Let's hope they've got the insurance details all figured out. Michigan is, of course, the home state of the vehicle manufacturing capital Detroit.
Selfdrivingcar  detroit  michigan 
december 2016 by charlesarthur
Google’s self-driving car team is hiring executives as it prepares to spin out from Alphabet’s X - Recode
Johana Bhuiyan:
<p>Google’s self-driving project, led by ex-Hyundai CEO and president John Krafcik, is expected to be graduating from Alphabet’s moonshot shop “soon.” That’s according to Krafcik, who spoke at the Nikkei Innovation Forum in Palo Alto in October.

While the timeline of the project’s impending spinout isn’t any clearer two months later, the self-driving project is evidently preparing to separate from the mothership by hiring several of its own executives to positions X, formerly known as Google X, already has.

The first was Kevin Vosen, who was hired to be the self-driving arm’s chief legal officer. Now, Alphabet’s self-driving shop is looking for a head of real estate — or someone to secure new space for the autonomous company when it “graduates” from X.

In other words, the project is moving away from having to depend on X for things like dealing with regulation and expansion.</p>


Will it still be in "Other Bets" or might it be a separate company inside Alphabet?
google  selfdrivingcar 
december 2016 by charlesarthur
Driverless cars will be like ‘Learners', so we'll bully them • The Memo
Oliver Smith:
<p><a href="http://www.thinkgoodmobility.goodyear.eu/the-survey">Surveying some 12,000 drivers</a>, the London School of Economics and Goodyear found that many drivers expect these autonomous cars to be extra cautious and patient on the road, and that they plan to take ruthless advantage of this.

“The autonomous cars are going to stop. So you’re going to mug them right off,” said one participant. “They’re going to stop and you’re just going to nip round.”

The survey found that especially among more “competitive” drivers driverless cars are perceived as a potential nuisance, an opportunity to take advantage of, or “bully” on the roads.

Overall there was a feeling that autonomous vehicles would lack “common sense” on the roads.

So next time you see a car struggling to edge out on a roundabout, will you be the sympathetic one to let the driverless car out?</p>


These cars are going to get slaughtered in London traffic.
selfdrivingcar 
october 2016 by charlesarthur
Uber’s self-driving cars are already getting into scrapes on the streets of Pittsburgh • Quartz
Alison Griswold:
<p>While it would be easy to write off these incidents [a self-driving car seen going the wrong way up a one-way street; being hit from behind by the following car, which is always the following car's fault] as minor mishaps, both suggest how much work Uber has left to do on its autonomous software, even as it’s begun putting real passengers in the cars. One reason Uber’s vehicles are currently traveling only a small area of Pittsburgh is because those are supposed to be the streets its engineers have carefully mapped and taught the cars about. If that’s really the case, no self-driving car should be turning the wrong way down a one-way street—nor should its safety driver, who is in theory the final check on the car’s autonomy.

Driverless vehicles also tend to operate in a cautious, hyper-logical manner and follow the rules of the road to a tee. Uber, again via its mapping efforts, has tried to prepare its cars to avoid certain tricky situations they might run into. On one street near the ATC in Pittsburgh, Uber engineers have instructed the self-driving cars to hang close to the curb because trucks making turns are more likely to swerve into the oncoming lane. By that same logic, the cars should also know certain intersections are hotspots for rear-ending accidents and be on the alert to avoid them, much as a savvy human driver would be. Uber’s approach differs from that of other companies such as Nvidia, which have focused on teaching computer systems to drive in a more adaptive, human-like way—by being introduced to situations a few times, and then applying what they learn to other encounters on the road.</p>
selfdrivingcar  pittsburgh  uber 
october 2016 by charlesarthur
Self-driving hype doesn’t reflect reality • WSJ
Christopher Mims digs into the details and asks the (peculiarly unasked) questions:
<p>Ford, for example, has said it would release a self-driving car by 2021. Dig into the statements and press for details, and a Ford spokesman says that car will only be self-driving in the portion of major cities where the company can create and regularly update extremely detailed 3-D street maps. Ford declines to say how big those areas will be.

Lyft is collaborating with GM and says it will introduce fully self-driving cars by 2021. But co-founder John Zimmer says the vehicles will be limited to a specific geographic area and a top speed of 25 miles an hour.

Representatives of Volvo and Israel’s Mobileye NV, which makes self-driving technology and is collaborating with Intel and BMW, will impose similar limits on their coming self-driving vehicles. Volvo’s cars might refuse to go into self-driving mode on roads that are insufficiently mapped, says Erik Coelingh, the technical lead on Volvo’s self-driving car efforts. The cars will pull over to the side of the road, or come to a stop, if inclement weather impedes the vehicle’s perceptual abilities, Mr. Coelingh says.

That is a scary thought—and one reason why early “fully autonomous” cars will require monitoring by humans.</p>


We were promised flying cars. Then we were promised fully autonomous cars…

This is, though, another example of the difference between good journalism and "oh look, another corporate blogpost" writing. Mims simply kept asking for the detail, and the detail turns out to make the painting a lot less attractive than the broad brushstrokes offered previously had implied.
selfdrivingcar 
september 2016 by charlesarthur
Google Car: sense and money impasse • Monday Note
Jean-Louis Gassée:
<p>With all of the Can You Top This? PR that surrounds driving automation, [Alphabet CEO Larry] Page’s stance [that an autonomous car must be fully autonomous] is an admirable injection of thoughtfulness, a sobriety check. The visionary statements and self-driving demos (cue demo jokes) blithely omit the “mere matter of implementation”. What’s the plan, the timeline? What are we going to do with the 235 million cars and trucks on US roads, some expected to last 20 years or more? How will manufacturers negotiate the US Department of Transportation’s <a href="https://www.transportation.gov/AV/federal-automated-vehicles-policy-september-2016">Federal Automated Vehicles Policy</a>? Sometimes, the last 5% of a project takes 200% of the time and money.

Then we have another unanswered Google Car question: The path to money.

Personally, I think a company needs one really good idea every ten years, so for a company as rich as Google, a few billion dollars for a new breakthrough looked eminently affordable…for a while. But there is such a thing as too much, such as <a href="https://en.wikipedia.org/wiki/Google_barges">Google barges</a> and many other puzzling pursuits that fall into the <em>Because We Can</em> category.

In May 2015, Ruth Porat left Morgan Stanley where she was Executive VP and Chief Financial Officer to become Alphabet’s CFO. The story is that her appointment had been heavily encouraged by investors who were concerned about Alphabet’s runaway “moonshot” projects. As expected, Porat set out to improve financial discipline and, for many projects, to demand a path to profitability. Highly speculative research, such as the Calico project’s quest to extend human life by 20 to 100 years, doesn’t entail huge financial outlays, but a grand and realistic endeavor such as developing the Google Self-Driving Car will require billions to reach its destination and raises business model questions as a result.</p>


The years when Google could pile into lots of "because we can" projects - about five years ago? - feel distant now. As the presidential debate line of some years ago went, echoing the fast food outlet's advert: "where's the beef?"
google  selfdrivingcar 
september 2016 by charlesarthur
Uber driverless car in Pittsburgh: review, photos • Business Insider
Danielle Muoio was given the VIP treatment; self-driving means there's a driver and engineer in the front just in case:
<p>Once you're actually riding in the self-driving car, it feels surprisingly ... normal. My driver had his hands on the wheels most of the time just in case he had to take over, so we had to double check a few times that the car was, in fact, self-driving.

But that speaks to just how good these cars are at handling city roads. Pittsburgh terrain isn't easy to tackle, with steep hills and several bridges, but the cars rolled through just fine.

That being said, the cars are nowhere near perfect. There were at least four occasions in our roughly five-mile route where a "ding" went off indicating the driver needed to take control. It happened once on a bridge, but also on a perfectly straight back road without any perceptible obstacles.

We've talked about why Uber's self-driving cars <a href="http://uk.businessinsider.com/autonomous-cars-bridges-2016-8?r=US&IR=T">struggle with bridges</a>.</p>


Bridges are hard because they don't have surrounding buildings, in general. Uber is definitely stealing a march here. Meanwhile, Bloomberg says "<a href="http://www.bloomberg.com/news/articles/2016-09-12/google-car-project-loses-leaders-and-advantage-as-rivals-gain">Google's self-driving car project is losing out to rivals</a>", which has these interesting paragraphs:
<p>“Google still has an imperfect system and no clear path to go to market,” said Ajay Juneja, chief executive officer of Speak With Me Inc., which offers voice recognition and related technology for cars, watches and other connected devices. “How exactly would they have shipped something by now?”

This is part of a broader challenge Google parent Alphabet Inc. faces turning research projects into profitable businesses. The company is more cautious about rolling out new technology early, after its Glass internet-connected eyewear flopped, according to one of the people. There’s also a higher bar now for projects as Chief Financial Officer Ruth Porat has said she requires clearer paths to profitability before approving more funding or expansion.</p>


Porat is starting to look like an inconvenient pragmatist. But it's early days still.
selfdrivingcar  uber  google 
september 2016 by charlesarthur
Our next chapter: Otto joins Uber • Official Otto Blog
<p>When we founded Otto, we committed to rethinking transportation. Today we are taking a leap forward by joining the Uber team to deliver on that promise.

Together with Uber, we will create the future of commercial transportation: first, self-driving trucks that provide drivers unprecedented levels of safety; and second, a platform that matches truck drivers with the right load wherever they are.

At Otto, we believe that drivers shouldn’t have to choose between safety and earnings. Our self-driving trucks will allow drivers to rest while their truck is moving, and our platform will ensure drivers can easily find loads and are paid fairly.

By combining these two technologies, we can create a freight network that is constantly learning and improving. Each truck that joins the network can provide valuable information that makes all other trucks safer and more efficient. In turn, drivers get paid more and shippers get a more reliable service. Self-driving trucks together with a marketplace create a virtuous cycle where everyone benefits.</p>


Clearly, Uber's aims go far beyond a simple taxi service now. Taken together with the news that it's going to start <a href="http://www.bloomberg.com/news/features/2016-08-18/uber-s-first-self-driving-fleet-arrives-in-pittsburgh-this-month-is06r7on">testing self-driving cars in Pittsburgh this month</a>, we can begin to discern the shape of future commercial transport. There don't seem to be a lot of human drivers in it.
uber  otto  selfdrivingcar  trucks 
august 2016 by charlesarthur
comma.ai research
<p>the comma.ai driving dataset

7 and a quarter hours of largely highway driving. Enough to train what we had in <a href="http://www.bloomberg.com/features/2015-george-hotz-self-driving-car/">Bloomberg</a> [a prototype self-driving car built in a garage].

Examples

We present two Machine Learning Experiments to show possible ways to use this dataset:

<img src="http://research.comma.ai/images/selfsteer.gif" width="100%" />

<a href="https://github.com/commaai/research/blob/master/SelfSteering.md">Training a steering angle predictor</a>


<img src="http://research.comma.ai/images/drive_simulator.gif" width="100%" />

<a href="https://github.com/commaai/research/blob/master/DriveSim.md">Training a generative image model</a></p>


45GB compressed, so you'll need a fast link. More to the point, it's out there for you to do something with - if you're in machine learning.
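
For anyone wondering what "training a steering angle predictor" on this might look like in practice, here is a minimal sketch - not comma.ai's own model. The HDF5 file name and keys ('X', 'steering_angle', 'cam1_ptr') are assumptions about the dataset layout that should be checked against the actual download, and the network is just a generic small CNN regressor:

```python
# Minimal sketch: regress steering angle from single dashcam frames.
# NOT comma.ai's model; file names/keys below are assumptions.
import h5py
import numpy as np
import tensorflow as tf


def load_episode(camera_h5, log_h5, limit=5000):
    """Return (frames, angles) for one drive, aligned per camera frame."""
    with h5py.File(camera_h5, "r") as cam, h5py.File(log_h5, "r") as log:
        frames = cam["X"][:limit]              # assumed key and layout
        ptr = log["cam1_ptr"][:]               # assumed: log row -> frame index
        steering = log["steering_angle"][:]    # assumed key
        angles = np.zeros(len(frames), dtype=np.float32)
        for i in range(len(frames)):
            rows = np.where(ptr == i)[0]
            if len(rows):
                angles[i] = steering[rows[-1]]  # last log sample for this frame
    frames = np.asarray(frames)
    if frames.shape[1] == 3:                   # channel-first -> channel-last
        frames = frames.transpose(0, 2, 3, 1)
    return frames.astype(np.float32) / 255.0, angles


def build_model(height, width):
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(height, width, 3)),
        tf.keras.layers.Conv2D(16, 8, strides=4, activation="relu"),
        tf.keras.layers.Conv2D(32, 5, strides=2, activation="relu"),
        tf.keras.layers.Conv2D(64, 5, strides=2, activation="relu"),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(256, activation="relu"),
        tf.keras.layers.Dropout(0.5),
        tf.keras.layers.Dense(1),              # predicted steering angle
    ])


if __name__ == "__main__":
    # Hypothetical episode name; one camera file pairs with one log file.
    X, y = load_episode("camera/2016-01-30--11-24-51.h5",
                        "log/2016-01-30--11-24-51.h5")
    model = build_model(X.shape[1], X.shape[2])
    model.compile(optimizer="adam", loss="mse")
    model.fit(X, y, batch_size=64, epochs=5, validation_split=0.1)
```

Even a toy like this makes the 45GB figure easy to believe: seven and a quarter hours of video is a lot of labelled frames.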
machinelearning  cars  selfdrivingcar 
august 2016 by charlesarthur
Driverless cars threaten to crash insurers’ earnings • WSJ
Leslie Scism:
<p>The insurance industry has a $160bn blind spot: the driverless car.

Car insurers last year hauled in $200bn of premiums, about a third of all premiums collected by the property-casualty industry. But as much as 80% of the intake could evaporate in coming decades, say some consultants, assuming crucial breakthroughs in driverless technology make driving safer and propel big changes in car ownership.

As the threat approaches, U.S. insurance executives are spending millions and embedding with car companies, testing the technology themselves, and wrestling with whether to lower prices as parts of the autonomous future hit America’s roads.

For the actuaries who set insurance rates, it is a puzzle like no other: How do they prepare for a world of so many fewer auto accidents? In the future, will underwriters be insuring drivers or computer code?…

…Just as air bags and seat belts did in generations past, increasingly common semi-autonomous equipment is expected to offer significant improvements in safety. Among the most effective is automatic braking, which is in fewer than 10% of cars now but will be standard on new cars by 2022, according to the insurance-industry funded Insurance Institute for Highway Safety.

The Highway Loss Data Institute, a sister organization to IIHS, last year found that 11 front-crash-prevention systems from six manufacturers showed 10% to 15% lower rates of claims for damaging other vehicles, compared with models without the gear.

Surprisingly, the institute found no consistent reduction in claim rates from “lane-departure warning” systems.</p>
driving  selfdrivingcar 
july 2016 by charlesarthur
Self-driving Mercedes-Benz bus takes a milestone 12-mile trip • TechCrunch
Darrell Etherington:
<p>CityPilot has taken a key early step towards fully autonomous public transportation: The Mercedes-Benz self-driving bus program saw one of its Future Bus vehicles drive 20 km (or around 12.4 miles) in the Netherlands, on a route that connected Amsterdam’s Schiphol airport with the nearby town of Haarlem. To make the trip, the bus had to stop at traffic lights, pass through tunnels, and navigate among pedestrians.

This is a big win for the program, which owes its origins to the transport truck-focused Highway Pilot program debuted by Mercedes two years ago. That autonomous vehicle program didn’t face the added challenges of navigating an urban environment, however, which makes the Future Bus successful test run a significant achievement.</p>
selfdrivingcar 
july 2016 by charlesarthur
Tesla’s dubious claims about autopilot’s safety record • Technology Review
Tom Simonite:
<p>Tesla and Musk’s message is clear: the data proves Autopilot is much safer than human drivers. But experts say those comparisons are worthless, because the company is comparing apples and oranges.

“It has no meaning,” says Alain Kornhauser, a Princeton professor and director of the university’s transportation program, of Tesla’s comparison of U.S.-wide statistics with data collected from its own cars. Autopilot is designed to be used only for highway driving, and may well make that safer, but standard traffic safety statistics include a much broader range of driving conditions, he says.

Tesla’s comparisons are also undermined by the fact that its expensive, relatively large vehicles are much safer in a crash than most vehicles on the road, says Bryant Walker Smith, an assistant professor at the University of South Carolina. He describes comparisons of the rate of accidents by Autopilot with population-wide statistics as “ludicrous on their face.” Tesla did not respond to a request asking it to explain why Musk and the company compare figures from very different kinds of driving.</p>

As Ben Thompson also pointed out in his Stratechery newsletter, the fact that Tesla opened its blogpost about this death <em>significantly caused by its technology</em> with statistics, rather than an expression of empathy for the dead person and those affected, is an indictment of its tone-deafness.
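
The base-rate problem is easy to show. With invented numbers (these are not real crash statistics), a system that is used only on highways and is no safer than a human highway driver still beats the all-roads average:

# Toy numbers, invented for illustration - not real crash statistics.
rate_all_roads = 1.3   # hypothetical human fatalities per 100m miles, all driving
rate_highway   = 0.6   # hypothetical human fatalities per 100m miles, highways only

# A system used only on highways, exactly as safe as a human highway
# driver, still looks "safer" against the all-roads benchmark:
autopilot_rate = rate_highway
print(autopilot_rate < rate_all_roads)   # True - but it tells you nothing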
selfdrivingcar  tesla  society 
july 2016 by charlesarthur
Google’s cars need a clear road map to revenue • The Information
Amir Efrati considers partnership (vehicle makers won't do it), licensing (vehicle makers won't do it), and suggests what's left:
<p>One natural path for Google is to reach consumers directly with an internet-based service. That’s its DNA. We know that Google’s car designers have thought long and hard about operating a “robo taxi” service to allow people to order cars on demand. It’s likely to go down that kind of path; its leaders have talked up the benefits of reducing car ownership so that one car could be used by many people throughout the day and night. Perhaps there will be subscription-type offerings that guarantee customers a pickup within a certain period of time, rather than the Uber-type system in which pickup times and prices can vary based on customer demand or driver availability.

By not needing to pay drivers, which represent the single biggest expense in ride-hailing, Google could price such a service below those run by Uber and other firms and build up its own customer base. But first, Google would need to produce these cars and get them deployed. Making thousands of new cars per year, particularly advanced models that have never been mass-produced before, would be a tough and expensive undertaking. Just ask Tesla how hard it is to make thousands of cutting-edge electric vehicles in a year.</p>


"Go-to-market" is the big important step between "have a great idea" and "make pots of money from great idea".
google  selfdrivingcar 
june 2016 by charlesarthur
Pittsburgh offers ultimate test for Uber's self-driving Fusions • TribLIVE
Aaron Aupperlee:
<p>The car took control with the click of a button.

Mathew Priest, the Uber employee in the driver's seat who was no more than a passenger at this point, took his hands off the wheel and foot off the pedal as the car drove itself east on the 31st Street Bridge.

The Ford Fusion slowed to a stop behind several cars at a red light and turned left onto River Avenue.

Uber is testing its fleet of self-driving cars on the streets, bridges and hills of Pittsburgh, the ride-sharing company confirmed Wednesday.

The San Francisco-based firm has said little about its progress in developing autonomous vehicles since it opened the Advanced Technology Center 15 months ago in Pittsburgh's Strip District.

John Bares, head of Uber's Pittsburgh lab, took a Tribune-Review reporter on a ride in a Fusion hybrid that drove itself for portions of the trip.

It was the first time Uber allowed a member of the media to ride in a test car in self-driving mode, he said.</p>


Autonomous cabs? Very <a href="https://www.youtube.com/watch?v=0H5k--n7sFI">Total Recall</a>. Uber is thinking just over the horizon, and this is how it brings that horizon closer.
selfdrivingcar  uber 
may 2016 by charlesarthur
GM, Lyft to test self-driving electric taxis • WSJ
Mike Ramsey and Gautham Nagesh:
<p>General Motors Co. and Lyft Inc. within a year will begin testing a fleet of self-driving Chevrolet Bolt electric taxis on public roads, a move central to the companies’ joint efforts to challenge Silicon Valley giants in the battle to reshape the auto industry.

The plan is being hatched a few months after GM invested $500 million in Lyft, a ride-hailing company whose services rival Uber Technologies Inc. The program will rely on technology being acquired as part of GM’s separate $1bn planned purchase of San Francisco-based Cruise Automation Inc., a developer of autonomous-driving technology.</p>


City yet to be announced. Detroit?
selfdrivingcar 
may 2016 by charlesarthur
Autonomous tractor brings in the harvest » Hackaday
Jenny List:
<p>Matt Reimer is a farmer in Southwestern Manitoba, Canada. It’s grain country, and at harvest time he has a problem. An essential task when harvesting is that of the grain cart driver, piloting a tractor and grain trailer that has to constantly do the round between unloading the combine harvester and depositing the grain in a truck. It’s a thankless, unrelenting, and repetitive task, and Matt’s problem is that labour is difficult to find when every other farmer in the region is also hiring.

His solution was to <a href="https://hackaday.io/project/10697-autonomous-tractor">replace the driver with a set of Arduinos and a Pixhawk autopilot</a> controlling the tractor’s cab actuators, and running ArduPilot, DroneKit, and his own Autonomous Grain Cart software. Since a modern tractor is effectively a fly-by-wire device this is not as annoying a task as it would have been with a tractor from several decades ago, or with a car. The resulting autonomous tractor picks up the grain from his combine, but he reminds us that for now it still deposits the harvest in the truck under human control. It is still a work-in-progress with only one harvest behind it, so this project is definitely one to watch over the next few months.</p>


Trucks, tractors... this stuff all happens quietly around the edges, and then suddenly you notice that the edges are a lot closer than you used to think.
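
For a flavour of how little glue code this takes once ArduPilot is driving the actuators, here is a minimal DroneKit sketch - not Reimer's Autonomous Grain Cart software; the telemetry address and coordinates are placeholders - that sends a vehicle to a waypoint in guided mode.

# Minimal DroneKit sketch: command an ArduPilot-controlled vehicle to a
# waypoint. Not Matt Reimer's code; the telemetry address and coordinates
# below are placeholders.
import time
from dronekit import connect, VehicleMode, LocationGlobalRelative

vehicle = connect("udp:127.0.0.1:14550", wait_ready=True)  # placeholder telemetry link

vehicle.mode = VehicleMode("GUIDED")   # accept externally commanded waypoints
vehicle.armed = True
while not vehicle.armed:               # wait for the autopilot to report armed
    time.sleep(1)

# Drive to a (placeholder) point alongside the combine at ~3 m/s
combine_position = LocationGlobalRelative(49.6342, -100.0731, 0)
vehicle.simple_goto(combine_position, groundspeed=3)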
tractor  selfdrivingcar 
april 2016 by charlesarthur
Where's the lane? Self-driving cars confused by shabby U.S. roadways » Reuters
Alexandria Sage:
<p>Volvo's North American CEO, Lex Kerssemakers, lost his cool as the automaker's semi-autonomous prototype sporadically refused to drive itself during a press event at the Los Angeles Auto Show.

"It can't find the lane markings!" Kerssemakers griped to Mayor Eric Garcetti, who was at the wheel. "You need to paint the bloody roads here!"

Shoddy infrastructure has become a roadblock to the development of self-driving cars, vexing engineers and adding time and cost. Poor markings and uneven signage on the 3 million miles of paved roads in the United States are forcing automakers to develop more sophisticated sensors and maps to compensate, industry executives say.

Tesla CEO Elon Musk recently called the mundane issue of faded lane markings "crazy," complaining they confused his semi-autonomous cars.

An estimated 65% of U.S. roads are in poor condition, according to the U.S. Department of Transportation, with the transportation infrastructure system rated 12th in the World Economic Forum's 2014-2015 global competitiveness report.

<img src="http://fingfx.thomsonreuters.com/gfx/rngs/1/1141/1709/AUTOCAR.jpg" width="100%" /></p>


Make America Navigable By Autonomous Cars Agai.. um, For The First Time.
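
The crudest version of what these systems do - find strong edges, then look for long straight segments - shows why faded paint is fatal: no paint, no gradient, no edge, no lane. A bare-bones OpenCV sketch (the image path is a placeholder):

# Bare-bones lane-marking detection: Canny edges plus a Hough transform.
# Illustrative only; "road.jpg" is a placeholder dashcam frame.
import cv2
import numpy as np

frame = cv2.imread("road.jpg")
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
blurred = cv2.GaussianBlur(gray, (5, 5), 0)
edges = cv2.Canny(blurred, 50, 150)      # faded paint -> weak gradients -> few edges

# Look for long, roughly straight segments among the edges
lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=40,
                        minLineLength=80, maxLineGap=20)
if lines is None:
    print("No lane markings found")      # roughly what the Volvo was complaining about
else:
    for x1, y1, x2, y2 in lines[:, 0]:
        cv2.line(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)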
selfdrivingcar 
march 2016 by charlesarthur
Driverless lorry convoys to be trialled in the UK » Ars Technica UK
Sebastian Anthony:
<p>Convoys of automated lorries will be trialled on UK motorways, chancellor George Osborne is expected to announce in his 2016 Budget speech later this month.

The Times reports that the trials will take place on a northerly stretch of the M6, which runs from Birmingham all the way up to the border of Scotland, near Carlisle. The Department for Transport confirms that planning for "HGV platoons" is under way, though it did not comment on whether the trials will receive funding in the Budget, nor give any kind of timeline for the fleet's deployment.

A DfT spokesman said: "We are planning trials of HGV platoons—which enable vehicles to move in a group so they use less fuel—and will be in a position to say more in due course." The Times reports that these platoons could consist of up to 10 driverless lorries, each just a few metres away from each other.

The DfT's "less fuel" claim refers to "drafting," where the first lorry in the platoon creates a slipstream, significantly reducing drag and fuel consumption for the other lorries behind it. In a semi-automated lorry demo a couple of years ago, the fuel economy for a platoon of lorries improved by about 15%. Expand that out to the thousands of trucks that are on UK roads at any one time and you're looking at potentially huge cost reductions.</p>
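
The arithmetic is straightforward, even if every input is a guess. With some illustrative numbers (mine, not the DfT's):

# Back-of-envelope only: every figure here is an assumption, not DfT data.
trucks = 10_000                 # HGVs platooning at any one time
km_per_truck_per_year = 100_000
litres_per_100km = 35           # rough HGV fuel consumption
fuel_price_gbp = 1.20           # per litre
drafting_saving = 0.15          # the ~15% quoted above

annual_litres = trucks * km_per_truck_per_year / 100 * litres_per_100km
saving_gbp = drafting_saving * annual_litres * fuel_price_gbp
print(f"~£{saving_gbp / 1e6:.0f}m a year")   # about £63m on these assumptions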
selfdrivingcar  uk  lorries 
march 2016 by charlesarthur
Google says it bears 'some responsibility' after self-driving car hit bus » Reuters
David Shepardson:
<p>The crash may be the first case of one of its autonomous cars hitting another vehicle and the fault of the self-driving car. The Mountain View-based Internet search leader said it made changes to its software after the crash to avoid future incidents.

In a Feb. 23 report filed with California regulators, Google said the crash took place in Mountain View on Feb. 14 when a self-driving Lexus RX450h sought to get around some sandbags in a wide lane.

Google said in the filing the autonomous vehicle was traveling at less than 2 miles per hour, while the bus was moving at about 15 miles per hour.

The vehicle and the test driver "believed the bus would slow or allow the Google (autonomous vehicle) to continue," it said.

But three seconds later, as the Google car in autonomous mode re-entered the center of the lane, it struck the side of the bus, causing damage to the left front fender, front wheel and a driver side sensor. No one was injured in the car or on the bus.</p>


Yeah, if you did that in a driving test, you'd get failed. It's not the bus's fault if you try to enter its right of way.
google  selfdrivingcar 
february 2016 by charlesarthur
A driverless car saved my life - no, really » Forbes
Joann Muller took a ride in Delphi's model on the Vegas roads during CES:
<p>One of the first things I noticed was how polite the self-driving car was. It always stayed under the speed limit, and always drove a safe distance behind the car in front of us. It was kind of annoying, frankly, in frenetic Las Vegas, where 170,000 heavily caffeinated tech freaks converged for CES, the big three-day consumer electronics show.

At a busy four-way intersection, the Audi navigated itself into a left-turn lane behind five or six other cars stopped at a traffic light. I thought the gap between us and the car ahead seemed excessive, but that’s how the car is programmed to behave. If I were driving, I would have inched way up behind the other guy’s bumper.

The traffic arrow turned green, and as the cars ahead started moving, so did we. Just as we approached the intersection to make the left turn, the arrow turned yellow and our car stopped abruptly. My Delphi guide, Nandita Mangal, explained that because the car detected stopped traffic on the other side of the intersection it did not feel it was safe to proceed on yellow, even though most drivers (myself included) are probably more aggressive and would have tried to make the light.

That point was driven home just a few minutes later when our car, now first in the left turn lane, got a green arrow to proceed. The Audi drove forward and started turning left, when all of a sudden, out of the corner of my eye, I saw not one, but two cars come speeding through the intersection from the right, running the red light. I wanted to yell “Look out!” but before I could even get the words out, the Audi slammed its brakes as the bad drivers swerved around us. If the self-driving car hadn’t detected what was about to happen and stopped, we likely would have been T-boned on the right side, and I might not be here to write this story.</p>


It will only take a few cases like this for SDCs to be hailed as the best thing since sliced bread. Will the bad drivers (like those running the light) get them first, though? (Note too: this isn't a Google car.)
selfdrivingcar 
january 2016 by charlesarthur