Comments on DSHR's Blog: Autonowashing
David (https://www.blogger.com/profile/14498131502038331594), updated 2024-03-28.

2023-08-19:

The AP reports that <a href="https://www.theguardian.com/us-news/2023/aug/19/cruise-halves-fleet-san-francisco-robotaxis" rel="nofollow"><i>Cruise agrees to cut fleet of San Francisco robotaxis in half after crashes</i></a>:<br /><br />"General Motors’ Cruise autonomous vehicle unit has agreed to cut its fleet of San Francisco robotaxis in half as authorities investigate two recent crashes in the city.<br /><br />The state department of motor vehicles asked for the reduction after a Cruise vehicle without a human driver collided with an unspecified emergency vehicle on Thursday.<br /><br />“The DMV is investigating recent concerning incidents involving Cruise vehicles in San Francisco,” the DMV said on Saturday in a statement to the Associated Press. 
“Cruise has agreed to a 50% reduction and will have no more than 50 driverless vehicles in operation during the day and 150 driverless vehicles in operation at night.”<br /><br />The development comes just over a week after California regulators allowed Cruise and the Google spinoff Waymo to operate autonomous robotaxis throughout San Francisco at all hours, despite safety worries spurred by recurring problems with unexpected stops and other erratic behavior.<br />...<br />On Thursday around 10pm, the Cruise vehicle had a green light, entered an intersection, and was hit by the emergency vehicle responding to a call, the San Francisco Chronicle reported, based on tweets from Cruise.<br /><br />The robotaxi was carrying a passenger, who was taken by ambulance to a hospital with injuries that were not severe, Cruise told the newspaper.<br /><br />Also on Thursday night, a Cruise car without a passenger collided with another vehicle in San Francisco, the newspaper reported."

David.

2023-05-04:

San Francisco is now infested with self-driving cars lacking the essential fully trained self-driving car driver. When they come upon situations outside their training data, the result is chaos. First responders are the ones who have to deal with unusual situations, such as a house blowing up when a meth lab malfunctions, and as Joe Eskenazi reports in <a href="https://missionlocal.org/2023/05/waymo-cruise-fire-department-police-san-francisco/" rel="nofollow"><i>‘No! You stay!’ Cops, firefighters bewildered as driverless cars behave badly</i></a>, they are getting even more annoyed than the rest of SF's residents:<br /><br />"The self-driving future has arrived in San Francisco. 
And, increasingly and all too often, it looks like a confounded cop, road flare in hand, commanding a wayward autonomous vehicle as if it were a misbehaving, two-ton puppy.<br /><br />“No!” shouts the cop, as captured in his <a href="https://videos.files.wordpress.com/M9Dl3QYN/robocars.mp4" rel="nofollow">body-worn camera footage</a>. “You stay!”<br /><br />The incident occurred on Feb. 9, during one of San Francisco’s more memorable recent emergencies: A dollar-store Walter White apparently lost control of his Sunset District garage dope factory, resulting in a lethal explosion and fire. The normally sedate neighborhood in the vicinity of 21st Avenue and Noriega Street was instantly transformed into both a disaster scene and a crime scene.<br /><br />And, to make it a truly San Francisco scene, a driverless Waymo vehicle subsequently proceeded to meander into the middle of things, like an autonomous Mr. Magoo."<br /><br />Go read the whole thing; it'll be time well spent.

David.

2022-10-06:

The title of Max Chafkin's <a href="https://www.bloomberg.com/news/features/2022-10-06/even-after-100-billion-self-driving-cars-are-going-nowhere" rel="nofollow"><i>Even After $100 Billion, Self-Driving Cars Are Going Nowhere</i></a> sums up the must-read article based largely on input from self-driving pioneer Anthony Levandowski:<br /><br />"Levandowski says his skepticism of the industry started around 2018. It was a little more than a year after Elon Musk unveiled a demo of a Tesla driving itself to the tune of Paint It Black. Levandowski checked the official road-test data that Tesla submitted to California regulators. 
The figures showed that, in that time, the number of autonomous miles Tesla had driven on public roads in the state totaled—wait for it—zero. (Tesla hasn’t reported any autonomous miles traveled in California since 2019. The company didn’t respond to a request for comment.) Although Levandowski says he admires Tesla, is impressed by its driver-assistance technology, and believes it may one day produce a truly self-driving car, he says the lack of progress by Musk and his peers forced him to question the point of his own years in the field. “Why are we driving around, testing technology and creating additional risks, without actually delivering anything of value?” he asks.<br /><br />While Tesla has argued that its current system represents a working prototype, Musk has continued to blur the lines between demos and reality."

David.

2022-07-17:

Russ Mitchell reports that <a href="https://www.latimes.com/business/story/2022-07-14/elon-musk-said-no-self-driving-tesla-had-ever-crashed" rel="nofollow"><i>Musk said not one self-driving Tesla had ever crashed. By then, regulators already knew of 8</i></a>:<br /><br />"Santa Monica investment manager and vocal Tesla booster Ross Gerber was among the allies who sprang to his defense.<br /><br />“There has not been one accident or injury since FSD beta launch,” he <a href="https://twitter.com/GerberKawasaki/status/1482809775565926404" rel="nofollow">tweeted</a> in January. “Not one. 
Not a single one.”<br /><br />To which Musk responded with a single word: “Correct.”<br /><br />In fact, by that time dozens of drivers had already filed safety complaints with the National Highway Traffic Safety Administration over incidents involving Full Self-Driving — and at least eight of them involved crashes. The complaints are in the public domain, in a <a href="https://www.nhtsa.gov/nhtsa-datasets-and-apis" rel="nofollow">database</a> on the NHTSA website."

David.

2022-07-08:

Andrew J. Hawkins' <a href="https://www.theguardian.com/politics/2022/jul/07/race-to-replace-boris-johnson-slow-to-take-shape-amid-resignation-chaos%22" rel="nofollow"><i>Two new fatal Tesla crashes are being examined by US investigators</i></a> starts:<br /><br />"Two fatal Tesla crashes are being examined by investigators at the National Highway Traffic Safety Administration. <a href="https://www.reuters.com/business/autos-transportation/us-opens-new-probe-into-fatal-tesla-pedestrian-crash-california-2022-07-07/" rel="nofollow">Reuters reported</a> that NHTSA opened a special investigation into a recent fatal crash in California, in which a Tesla driver killed a pedestrian. 
And an agency spokesperson confirmed to <i>The Verge</i> that a crash that took place on July 6th in Florida is also under examination."

David.

2022-05-19:

In <a href="https://www.bloomberg.com/news/articles/2022-05-18/fatal-tesla-model-s-crash-in-california-prompts-federal-probe?srnd=premium" rel="nofollow"><i>Fatal Tesla Model S Crash in California Prompts Federal Probe</i></a>, Keith Laing reports that:<br /><br />"The US National Highway Traffic Safety Administration is investigating a fatal crash involving a 2022 Tesla Model S that may have had its automated driving system activated.<br /><br />The accident in Newport Beach, California, killed three people earlier this month, according to details provided Wednesday by NHTSA. The electric vehicle hit a curb and slammed into construction equipment, leaving the car totaled, the Orange County Register reported.<br /><br />The collision is the 42nd included in NHTSA’s Special Crash Investigation of advanced driver assistance systems like Tesla Inc.’s Autopilot. <br />...<br />Tesla vehicles have been involved in all but seven of the advanced driver assistance system crashes that have been added to NHTSA’s investigation."<br /><br />So there are currently 35 Tesla crashes under investigation. This technology is a clear and present danger to innocent bystanders.

David.

2022-01-20:

Jonathan M. 
Gitlin reports that <a href="https://arstechnica.com/cars/2022/01/iihs-will-rate-driver-assists-like-autopilot-and-super-cruise-for-safety/" rel="nofollow"><i>IIHS will rate driver assists like Autopilot and Super Cruise for safety</i></a>:<br /><br />"On Thursday, the Insurance Institute for Highway Safety <a href="https://www.iihs.org/news/detail/iihs-creates-safeguard-ratings-for-partial-automation" rel="nofollow">announced that it is creating a rating system</a> for hands-free advanced driver-assistance systems like Tesla's Autopilot and General Motors' Super Cruise. Later this year IIHS will issue its first set of ratings, with grading levels of good, acceptable, marginal, or poor. Having a good driver-monitoring system will be vital to getting a good grade.<br /><br />And the institute is not alone. Also on Thursday, Consumer Reports revealed that it, too, will <a href="https://www.consumerreports.org/car-safety/driver-monitoring-systems-ford-gm-earn-points-in-cr-tests-a6530426322/" rel="nofollow">consider the safety of such tech features</a>, adding points if there's a good driver-monitoring system. CR says that so far, only <a href="https://arstechnica.com/cars/2018/02/the-cadillac-ct6-review-super-cruise-is-a-game-changer/" rel="nofollow">Super Cruise</a> and Ford's <a href="https://arstechnica.com/cars/2021/04/ford-will-roll-out-bluecruise-hands-free-driving-software-in-q3-2021/" rel="nofollow">BlueCruise</a> systems are safe enough to get those extra points. Meanwhile, from model year 2024, CR will start subtracting points for cars that offer partial automation without proper driver monitoring."<br /><br />This will be a problem for Tesla, so expect a vocal reaction from the fan-bois.

David.

2022-01-18:

Jonathan M. 
Gitlin reports that <a href="https://arstechnica.com/cars/2022/01/manslaughter-charges-follow-tesla-drivers-autopilot-red-light-run/" rel="nofollow"><i>Manslaughter charges follow Tesla driver’s Autopilot red light run</i></a>:<br /><br />"Prosecutors in California have charged a Tesla driver with two counts of manslaughter as a result of a fatal crash in December 2019. <a href="https://apnews.com/article/technology-business-only-on-ap-california-united-states-91b4a0341e07244f3f03051b5c2462ae" rel="nofollow">According to the Associated Press</a>, the National Highway Traffic Safety Administration confirmed that the Autopilot driver-assistance feature was active at the time of the crash. That makes this case notable in that these are the first felony charges to result from a fatal crash involving a partially automated driving system.<br /><br />The fatal crash took place in Gardena, California, on December 29, 2019. According to reports, the Tesla Model S owned by Kevin Riad exited I-91, failed to stop at a red light, and then collided with a Honda Civic, killing both of that car's occupants, Gilberto Alcazar Lopez and Maria Guadalupe Nieves-Lopez."

David.

2022-01-15:

Thank you, California State Senator Lena Gonzalez! 
As Russ Mitchell reports in <a href="https://www.latimes.com/business/story/2022-01-11/dmv-message-to-legislatures-ontesla-full-self-driving-safety-its-not-our-job" rel="nofollow"><i>DMV ‘revisiting’ its approach to regulating Tesla’s public self-driving test</i></a>, she has understood the dangers to the public of Tesla's fan-bois using the laughably named "Full Self Driving" option on public streets:<br /><br />"Concerned about public safety, Gonzalez <a href="https://www.latimes.com/business/story/2021-12-11/is-teslas-full-self-driving-public-experiment-safe-legislators-want-answers-from-dmv" rel="nofollow">asked the DMV in December</a> for its take on Tesla’s Full Self-Driving beta program, under which Tesla owners supervise the operation of cars programmed to autonomously navigate highways, city streets and neighborhood roads, stopping at traffic lights and stop signs as well as making left and right turns into traffic.<br /><br />Those are the same features being tested by other robot car developers that report crashes and disengagements to the DMV, a group that includes Waymo, Cruise, Argo and Zoox. Although their cars occasionally crash, there are few YouTube videos that show them behaving dangerously."<br /><br />And it worked:<br /><br />"For years, Tesla has tested autonomous vehicle technology on public roads without reporting crashes and system failures to the California Department of Motor Vehicles, as other robot car developers are required to do under DMV regulations.<br /><br />But confronted with dozens of viral videos showing Tesla’s Full Self-Driving beta technology driving the car into dangerous situations, and a letter of concern from a key state legislator, the DMV now says it’s reviewing Tesla’s behavior and reassessing its own policies.<br /><br />The agency informed Tesla on Jan. 
5 that it is “revisiting” its opinion that the company’s test program doesn’t fall under the department’s autonomous vehicle regulations because it requires a human driver.<br /><br />“Recent software updates, videos showing dangerous use of that technology, open investigations by the National Highway Traffic Safety Administration, and the opinions of other experts in this space” prompted the reevaluation"

David.

2022-01-11:

Emma Roth reports that, in the image of the CEO, <a href="https://www.theverge.com/2022/1/9/22875382/tesla-full-self-driving-beta-assertive-profile" rel="nofollow"><i>Tesla’s ‘Full Self-Driving’ beta has an ‘assertive’ driving mode that ‘may perform rolling stops’</i></a>:<br /><br />"In the description beneath the “Assertive” option, Tesla notes the vehicle will “have a smaller follow distance” and “perform more frequent speed lane changes.” The vehicle will also “not exit passing lanes” and “may perform rolling stops,” and it’s not entirely clear whether this means cars won’t come to a full stop at stop signs."

David.

2021-12-22:

The WaPo finally realizes that Musk is risking innocent lives in Reed Albergotti and Faiz Siddiqui's <a href="https://www.washingtonpost.com/technology/2021/12/21/tesla-test-drivers/" rel="nofollow"><i>Tesla test drivers believe they’re on a mission to make driving safer for everyone. 
Skeptics say they’re a safety hazard</i></a>:<br /><br />"Marc Hoag, a self-described Tesla fanboy and a shareholder of its stock, waited for a year and a half to get the software. But once he tried it, he was disappointed.<br /><br />“It’s still so impossibly bad,” he said.<br /><br />Hoag said the driving experience is worse in person than it looks on videos he’s posted to YouTube, which show the car taking turns too wide, speeding into curves and mistaking a crosswalk sign for a pedestrian — while otherwise acting apprehensively at intersections alongside other traffic. Its fidgety wheel and the indecisive braking make for an unpleasant ride, and its unpredictable nature make it scary, he said."

David.

2021-12-14:

More evidence of the risks Tesla is subjecting innocent bystanders to in <a href="https://www.youtube.com/watch?v=2ub2F-UnXIU" rel="nofollow">this video</a> of "FSD" in downtown San Jose. For example: asking the driver to take over and then not letting him, trying to drive down light rail tracks, and trying to hit a pedestrian who had the walk light (~8min in).

David.

2021-12-06:

<a href="https://www.nytimes.com/2021/12/06/technology/tesla-autopilot-elon-musk.html" rel="nofollow"><i>Inside Tesla as Elon Musk Pushed an Unflinching Vision for Self-Driving Cars</i></a> by Cade Metz and Neal E. Boudette is a long article that provides more detail on Musk's irresponsibility. 
For example:<br /><br />"In addition, some who have long worked on autonomous vehicles for other companies — as well as seven former members of the Autopilot team — have questioned Tesla’s practice of constant modifications to Autopilot and F.S.D., pushed out to drivers through software updates, saying it can be hazardous because buyers are never quite sure what the system can and cannot do.<br /><br />Hardware choices have also raised safety questions. Within Tesla, some argued for pairing cameras with radar and other sensors that worked better in heavy rain and snow, bright sunshine and other difficult conditions. For several years, Autopilot incorporated radar, and for a time Tesla worked on developing its own radar technology. But three people who worked on the project said Mr. Musk had repeatedly told members of the Autopilot team that humans could drive with only two eyes and that this meant cars should be able to drive with cameras alone."

David.

2021-12-03:

In <a href="https://www.theatlantic.com/ideas/archive/2021/11/deadly-myth-human-error-causes-most-car-crashes/620808/" rel="nofollow"><i>The Deadly Myth That Human Error Causes Most Car Crashes</i></a>, David Zipper points out that:<br /><br />"More than 20,000 people <a href="https://apnews.com/article/coronavirus-pandemic-business-health-transportation-pete-buttigieg-dbfb430dbcc16e5a0800dbc375efa75a" rel="nofollow">died on American roadways</a> from January to June, the highest total for the first half of <a href="https://www.nbcnews.com/news/us-news/-crisis-road-fatalities-hit-15-year-high-rcna4147" rel="nofollow">any year since 2006</a>. U.S. 
road fatalities have risen by <a href="https://injuryfacts.nsc.org/motor-vehicle/historical-fatality-trends/deaths-by-type-of-incident/" rel="nofollow">more than 10 percent</a> over the past decade, even as they have fallen across most of the developed world. In the <a href="https://ec.europa.eu/commission/presscorner/detail/en/qanda_20_1004" rel="nofollow">European Union</a>, whose population is one-third larger than America’s, traffic deaths dropped by 36 percent between 2010 and 2020, to <a href="https://ec.europa.eu/commission/presscorner/detail/en/IP_21_1767" rel="nofollow">18,800</a>. That downward trend is no accident: European regulators have <a href="https://www.euroncap.com/en/vehicle-safety/the-ratings-explained/vulnerable-road-user-vru-protection/" rel="nofollow">pushed</a> carmakers to build vehicles that are safer for pedestrians and cyclists, and governments regularly <a href="http://www.welivevisionzero.com/vision-zero/" rel="nofollow">adjust road designs</a> after a crash to reduce the likelihood of recurrence."<br /><br />Thanks to the myth that 94% of crashes are solely the fault of the driver, US car manufacturers and road designers escape responsibility. This is especially convenient for autonowashing:<br /><br />"the idea that human error causes nearly all crashes is a useful talking point for the makers of autonomous-vehicle technology, which supposedly will prevent such mistakes. Companies including General Motors, Google, and the start-up Aurora have touted the 94 percent statistic in <a href="https://www.gm.com/stories/self-driving-cars" rel="nofollow">promotional materials</a>, <a href="https://www.usatoday.com/story/tech/news/2015/12/16/google-disappointed-by-proposed-rules-from-california-dmv/77447672/" rel="nofollow">press statements</a>, and even <a href="https://www.sec.gov/Archives/edgar/data/0001828108/000119312521259457/filename1.htm" rel="nofollow">SEC filings</a>. 
But, as the Carnegie Mellon University engineering professor Phil Koopman has <a href="http://safeautonomy.blogspot.com/2018/06/a-reality-check-on-94-percent-human.html" rel="nofollow">pointed out</a>, autonomous systems will make their own errors on the road. He does not expect AVs to reduce crashes by more than 50 percent, even in a best-case scenario. And an all-autonomous driving future is still at least decades away, suggesting that AVs will not reverse the growing death toll on American roads for many years to come—if they ever do."

David.

2021-11-27:

Tim Stevens isn't impressed with Autopilot. In <a href="https://www.cnet.com/roadshow/reviews/2021-tesla-model-y-review/" rel="nofollow"><i>2021 Tesla Model Y review: Nearly great, critically flawed</i></a> he writes:<br /><br />"I can't conclusively say that it's because of the missing radar, but I can say that our Model Y is bad at detecting obstructions ahead. Really, really bad. The big issue is false positives, a problem that has become known as "phantom braking" among Tesla owners. Basically, the car often gets confused and thinks there's an obstacle ahead and engages the automatic emergency braking system. You get an instant, unwanted and often strong application of the brakes. This is not a problem unique to Teslas. I've experienced it on other cars, but very, very rarely. On our Model Y this happens constantly, at least once an hour and sometimes much more often than that. 
In a single hour of driving I caught five phantom braking incidents on camera, two hard enough to sound the automatic emergency braking chime."

David.

2021-11-20:

Matt McFarland's <a href="https://www.cnn.com/2021/11/18/cars/tesla-full-self-driving-brooklyn/index.html" rel="nofollow"><i>We tried Tesla's 'full self-driving.' Here's what happened</i></a> is scary:<br /><br />"I'd spent my morning so far in the backseat of the Model 3 using "full self-driving," the system that Tesla says will change the world by enabling safe and reliable autonomous vehicles. I'd watched the software nearly crash into a construction site, try to turn into a stopped truck and attempt to drive down the wrong side of the road. Angry drivers blared their horns as the system hesitated, sometimes right in the middle of an intersection."<br /><br />Elon Musk's response was, as usual, <a href="https://twitter.com/mims/statuses/1461751830409629701?ref_src=twsrc%5Etfw" rel="nofollow">denial</a>:<br /><br />"I suspect that article was written before the drive even took place,"<br /><br />The cult members' response was that "Full Self-Driving" wasn't working well because the driver in the front seat was "inexperienced".

David.

2021-11-11:

Mahmood Hikmet's 2:19 video <a href="https://www.youtube.com/watch?v=GmoroFK1A_o" rel="nofollow"><i>Tesla FSD Beta Danger Compilation</i></a> juxtaposes video of Elon Musk hyping "Full Self Driving" as Level 5 autonomy with video of actual "FSD" disengagements, showing that it isn't even a good Level 2 
system.

David.

2021-10-22:

Nick Carey's <a href="https://www.reuters.com/business/autos-transportation/dutch-forensic-lab-says-it-has-decoded-teslas-driving-data-2021-10-21/" rel="nofollow"><i>Dutch forensic lab says it has decoded Tesla's driving data</i></a> reveals that:<br /><br />"The Dutch government's forensic lab said on Thursday it had decrypted electric carmaker Tesla Inc's closely guarded driving data-storage system, uncovering a wealth of information that could be used to investigate serious accidents.<br /><br />It was already known that Tesla cars store data from accidents, but the Netherlands Forensic Institute (NFI) said it had discovered far more data than investigators had previously been aware of.<br /><br />The NFI said the decrypted data showed Tesla vehicles store information about the operation of its driver assistance system, known as Autopilot. 
The vehicles also record speed, accelerator pedal position, steering wheel angle and brake usage, and depending on how the vehicle is used, that data can be stored for over a year."<br /><br />And:<br /><br />"The Dutch lab said rather than seek the data from Tesla, it had "reverse engineered" data logs - a process where software is deconstructed to extract information - present in Tesla vehicles "in order to objectively investigate them."<br /><br />The NFI investigated a collision involving a Tesla driver using Autopilot and a car in front of it that suddenly braked hard.<br /><br />The investigation showed the Tesla driver reacted within the expected response time to a warning to resume control of the car, but the collision occurred because the Tesla was following the other vehicle too closely in busy traffic."<br /><br />I'm shocked, shocked that an Elon Musk company would be less than honest with regulators.

David.

2021-09-29:

Andrew J. Hawkins reports on #13 in <a href="https://www.theverge.com/2021/9/28/22698388/tesla-texas-lawsuit-cops-autopilot-crash-injury" rel="nofollow"><i>Tesla sued by Texas cops after a Model X on Autopilot slammed into five officers</i></a>:<br /><br />"A group of Texas law enforcement officials are suing Tesla after a Model X with Autopilot engaged crashed into five police officers.<br />...<br />The crash took place February 27, 2021, in Splendora, a small town in Montgomery County in the eastern part of the state. According to the lawsuit, the Model X SUV crashed into several police officers while they were engaged in a traffic stop on the Eastex Freeway in Texas. 
“All were badly injured,” the lawsuit says."<br /><br />How many more Teslas on Autopilot slamming into emergency crews will it take before the government takes action?

David.

2021-09-28:

Another example of the Musk reality distortion field in Audrey Carleton's <a href="https://www.vice.com/en/article/epneq4/a-tesla-big-battery-is-getting-sued-over-power-grid-failures-in-australia" rel="nofollow"><i>A Tesla Big Battery Is Getting Sued Over Power Grid Failures In Australia</i></a>:<br /><br />"On Wednesday, the Australian Energy Regulator (AER), the body that oversees the country’s wholesale electricity and gas markets, announced it had filed a federal lawsuit against the Hornsdale Power Reserve (HPR)—the energy storage system that owns the Tesla battery—for failing to provide “frequency control ancillary services” numerous times over the course of four months in the summer and fall of 2019. 
In other words, the battery was supposed to supply grid backup when a primary power source, like a coal plant, fails."<br /><br />They were being paid for being able to provide power at a moment's notice, but when the moment arrived they couldn't.

David.

2021-09-20:

Today's story of Tesla's Autopilot hype enabling stupid and dangerous behavior by the Musk cult members is Clive Thompson's <a href="https://boingboing.net/2021/09/20/police-use-teslas-autopilot-to-stop-the-car-after-drunk-driver-passes-out.html" rel="nofollow"><i>Police use Tesla's autopilot to stop the car after drunk driver passes out</i></a>:<br /><br />"A woman was driving drunk in her Tesla and passed out; the car, under its autopilot mode, kept going down the highway. The woman's husband was apparently driving behind her and called the police. They showed up and stopped the Tesla by parking in the highway. The autopilot detected the possible collision and, this time, worked perfectly, slowing the Tesla to a halt."<br /><br />Why would anyone think that, because they had a car whose Level 2 system required "a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment", it was OK to drive so drunk they passed out?<br /><br />Why would Tesla sell a Level 2 system that required "a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment" when the system was unable to detect that the driver had passed out drunk?

David.

2021-09-18:

Irony alert! 
<a href="https://techcrunch.com/2021/09/17/tesla-will-open-controversial-fsd-beta-software-to-owners-with-a-good-driving-record/" rel="nofollow"><i>Tesla will open controversial FSD beta software to owners with a good driving record</i></a> by Kristen Korosec explains that in order to use software that means you don't need to drive, you have to be a good enough driver not to need software that means you don't need to drive:<br /><br />"Owners who have paid for FSD, which currently costs $10,000, will be offered access to the beta software through a “beta request button.” Drivers who select the beta software will be asked for permission to access their driving behavior using Tesla’s insurance calculator, Musk wrote in a tweet.<br /><br />“If driving behavior is good for seven days, beta access will be granted,” Musk wrote."David.https://www.blogger.com/profile/14498131502038331594noreply@blogger.comtag:blogger.com,1999:blog-4503292949532760618.post-63364210550708465652021-09-17T12:30:24.455-07:002021-09-17T12:30:24.455-07:00Elon Musk is starting to lose his hold on the cult...Elon Musk is starting to lose his hold on the cult members, as Matt McFarland reports in <a href="https://edition.cnn.com/2021/09/16/cars/tesla-fsd-delay/index.html" rel="nofollow"><i>Some Tesla owners are losing trust in Elon Musk's promises of 'full self-driving'</i></a>:<br /><br />"Frustrated Tesla owners continue to wait for "full self-driving," an expensive and long-delayed software feature that isn't even guaranteed to help their cars' resale values. Some of the company's earliest backers of the "full self-driving" option are even beginning to lose faith in the promise of ever enjoying a truly autonomous Tesla.<br />...<br />Owners with "full self-driving" today don't get the big autonomous vision Musk has long promoted, but instead a set of features that drivers with only Autopilot don't have. 
They include automated lane changes on highways and "Navigate on Autopilot," which guides a Tesla from highway on-ramp to off-ramp. There's also a parking assist feature as well as "smart summon," in which the car can slowly drive through a parking lot to pick up passengers, and a feature to identify stop signs and traffic lights.<br /><br />For years, Musk made <a href="https://money.cnn.com/2016/10/19/technology/tesla-announcement/index.html" rel="nofollow">grandiose claims</a> about how soon Tesla's cars will drive themselves. Musk and Tesla have fallen short of these <a href="https://www.cnn.com/2019/05/02/tech/elon-musk-predictions/index.html" rel="nofollow">deadlines repeatedly</a>, but he's continued to make optimistic predictions."David.https://www.blogger.com/profile/14498131502038331594noreply@blogger.comtag:blogger.com,1999:blog-4503292949532760618.post-79555843369985316662021-09-15T05:51:28.439-07:002021-09-15T05:51:28.439-07:00Audrey LaForest's NHTSA seeking driver-assist ...Audrey LaForest's <a href="https://www.autonews.com/regulation-safety/nhtsa-seeking-driver-assist-data-12-automakers-tesla-autopilot-probe" rel="nofollow"><i>NHTSA seeking driver-assist data from automakers in Tesla Autopilot probe</i></a> spells more trouble for Tesla's Autopilot:<br /><br />"For each automaker, the agency is seeking the number of vehicles equipped with Level 2 systems that have been manufactured for sale, lease or operation in the U.S. as well as the cumulative mileage covered with the systems engaged and a log of the most recent updates to the systems.<br /><br />The agency also is requesting all consumer complaints, field reports, crash reports and lawsuits that may relate to the driver-assist systems.<br /><br />Automakers must describe the types of roads and driving conditions where the systems are intended to be used, and the methods and technologies used to prevent usage outside the operational design domain specified to customers. 
In addition, automakers must provide an overview of their approach to enforce driver engagement or attentiveness while the systems are in use."David.https://www.blogger.com/profile/14498131502038331594noreply@blogger.comtag:blogger.com,1999:blog-4503292949532760618.post-36506960711231988672021-09-01T08:32:23.516-07:002021-09-01T08:32:23.516-07:00Jonathan Gitlin's Tesla must tell NHTSA how Au...Jonathan Gitlin's <a href="https://arstechnica.com/cars/2021/09/tesla-must-tell-nhtsa-how-autopilot-sees-emergency-vehicles/" rel="nofollow"><i>Tesla must tell NHTSA how Autopilot sees emergency vehicles</i></a> documents the regulatory vise closing in:<br /><br />"The NHTSA sent Tesla the 11-page letter asking for detailed information on how Autopilot recognizes and reacts to emergency vehicles. The company must respond by October 22 unless it asks for an extension, and the AP says Tesla could be fined $114 million if it does not cooperate.<br /><br />Specifically, the agency wants to know how the system detects "a crash scene, including flashing lights, road flares, reflectorized vests worn by responders, and vehicles parked on the road." Additionally, Tesla must tell NHTSA how Autopilot works in low-light conditions and what happens if the system detects an emergency."<br /><br />My guess is Tesla has a number of problems here:<br /><br />1) Autopilot is an AI system, so Tesla doesn't actually know the answer except "we trained it on data and it seems to work".<br /><br />2) Autopilot, at least in current Model 3 and Model Y, is solely camera-based, so it doesn't work in low-light conditions such as the most recent 5AM crash.<br /><br />3) When Autopilot detects an emergency it can't handle, it hands-off to the "attentive" driver. Which we know doesn't work.David.https://www.blogger.com/profile/14498131502038331594noreply@blogger.com