THE CORRUPT REALITY OF ELON MUSK’S COMMUNITY NOTES


He exploits this misapprehension by crafting lies that everyone can see would easily be disproved if they were false. So, people think they can't be lies; they must be true! He posts to his 150 million followers on X, including a large number of journalists, politicians, and other influencers. They then propagate the lies to nearly everyone else, who similarly conclude they must be true.

The few voices that disprove the lies are drowned out by overwhelming disapproval from the deceived. Elon then brands the voices of truth as liars, crackpots, and conspiracy theorists.

For example, this weekend Musk posted on X: "The batting average of @CommunityNotes is so good that we actually notice when it is wrong and I have yet to see an inaccurate note survive for very long. Even once." Everyone thinks: surely he would not lie about how many false Community Notes there are when anyone can see them, count them, and check them? He absolutely would, because he knows people won't check. They will simply believe him, even though I have had ten ridiculously false Community Notes attached to my posts!

For example, a Community Note on a tweet from @RealDanODowd in February 2023 claimed that “Tesla’s FSD has over 55 million miles driven w/o any reported injuries”. The note provided no supporting evidence for this demonstrably false claim. A cursory Google search reveals a then-recent CNN article headlined: “Tesla ‘full self-driving’ triggered an eight-car crash, a driver tells police”. The article goes on to describe how a Full Self-Driving Tesla caused an eight-car pile-up and put a child in the hospital.

Furthermore, The Dawn Project recently demonstrated that the “driver” of a Full Self-Driving Tesla could eat a meal, stare out of the side window, or rummage in the back seat for more than five minutes at a time without triggering the Driver Monitoring System. We recorded a video of this and posted it on X.

Within 24 hours of the video being published the post had a Community Note attached to it which stated: “This is completely false as it’s been tested that the in cabin camera reacts well to an inattentive driver.”

The link in the Community Note does not support its first sentence: the linked article does not refer to any testing of the in-cabin camera's ability to detect an inattentive driver. Nothing in the tweet is false. The video in the tweet clearly demonstrates that the in-cabin camera does not notice the inattentive driver in two five-minute-long clips.

The author of this Community Note has no basis for any of the false statements made, which are being put forward under the Community Notes rubric of truth.

More false Community Notes are documented here.

Dan O’Dowd, Founder of The Dawn Project, commented: “Musk’s purchase of Twitter, now X, has given him an even louder megaphone to propagate lies that often go undisputed. Musk’s creation and introduction of the Community Notes feature on X provides his army of online supporters with a purportedly credible method to suppress valid concerns or criticisms about Musk and his companies.

“Despite Elon Musk’s hollow statements, Community Notes remains riddled with blatantly untrue statements. Promoted as a publicly-run system to ensure objectivity, Community Notes has instead become a weapon used by Musk’s fans to disparage those who voice concerns about his companies. Community Notes is frequently and easily abused, so Musk seeks to preserve the illusion that the function is objective.

"Community Notes is anonymous and not fact-checked by anyone other than Community Notes members, a cabal of pro-Musk users empowered to discredit and suppress genuine issues raised by other users.

"X users who are branded with a false Community Note have no recourse to respond or appeal. There needs to be an open and transparent appeals process, and users viewing a Community Note must be able to clearly see that the note is disputed by the author, together with their grounds of appeal."

Musk's claim is demonstrably false. Dan O'Dowd, Founder of The Dawn Project, a safety advocacy group campaigning to ban Tesla's defective Full Self-Driving (FSD) software, is regularly targeted with factually inaccurate Community Notes. These demonstrably false Notes often claim that The Dawn Project's safety tests are fake, and some go even further, presenting unsupported claims about the safety of the software as fact, no doubt encouraged by Musk's false tweets stating that there have not been any accidents or injuries.