Social media companies are receding from their role as watchdogs against conspiracy theories ahead of the 2024 presidential election.

  • Billiam
    77 points · 10 months ago

    That’s a lot of words to just say “Allowing bad actors to lie about elections makes us money, while hiring the staff to combat election lies costs us money, and we’re too sociopathic to spend money for the good of the country.”

  • @Steeve@lemmy.ca
    32 points · 10 months ago (edited)

    This is a shitty opinion article that links speculation, anecdotal evidence, and “experts” with no actual studies at all. Whether or not they’re doing a good job at it, all of these companies invest heavily in moderation on their platforms, and that investment hasn’t been reduced substantially.

    Everyone hates social media during elections; it’s an easy thing to blame because the loud get louder. This article is punching down early for clicks.

    • Prethoryn Overmind
      11 points · 10 months ago

      I made a comment the other day saying Lemmy users are just as biased as average people.

      Someone said, “How is this article biased?”

      I am convinced half the user base that was here before the Reddit migration doesn’t realize they are stuck in a loop of reading and posting articles here that justify their mindset, the same as anyone else.

        • @PRUSSIA_x86@lemmy.world
          4 points · 10 months ago

          Lemmy caters particularly well to the doomer crowd, since it presents an alternative to corporate-owned social media. Doomers have a lot of overlap with leftists (or liberals or whatever), and this causes a feedback loop of depression and anxiety. Consider two people, the leftist and the doomer:

          - The leftist gets mad about something and posts a few articles

          - The doomer sees this and reposts it with a more fatalistic title

          - The leftist sees the new post and becomes more outraged, posting massive walls of enraged text in the comments

          - The doomer reads the first paragraph and begins wailing about how the world is ending

          - This anger spills over into other posts and generally sours the mood for everyone involved

          Because the doomer is incapable of viewing anything in a positive light and the leftist lives in a state of perpetual butthurt, they feed into one another and fill everybody’s feed with outrage and despair.

          Swap the leftist with alt-right and you have 4chan circa 2017. Let it fester for a few years and you have a machine for churning out political extremists.

    • @Steeve@lemmy.ca
      -5 points · 10 months ago

      Yes, and stay on Lemmy, the totally unmoderated version of those platforms, filled with Russian and CCP propaganda.

          • KSP Atlas
            1 point · 10 months ago

            Yeah, there were systemic issues at play there

        • @Steeve@lemmy.ca
          0 points · 10 months ago

          If you’re going to compare it to corporations like Meta who spend billions per year on platform moderation then yes, it’s absolutely unmoderated. Even compared to the shitty volunteer moderation of Reddit, which at least has mod tools. If Lemmy was big enough to be noticed by the media it’d be right up there on that list of misinformation spreaders, because that’s just what social media is.

          I like Lemmy better than those platforms, I’m here after all, but saying “get off those other platforms” from a platform that has the same problems is the pot calling the kettle black.

          • @ItsGhost@sh.itjust.works
            1 point · 10 months ago

            I think it’s also worth bearing in mind that the average fedi user currently is well aware of the lack of platform-level moderation, both the good and the bad that come with that.

    • @TwilightVulpine@lemmy.world
      23 points · 10 months ago

      Because, like it or not, that is where a lot of people get information these days. If it keeps pushing bullshit, people believe bullshit. For example, anti-vaxxers didn’t use to be so common, until their bullshit was spread all over social media.

      I would love for people to be wise enough to verify information against reliable sources and not just believe everything they see, but sadly that’s not the world we live in.

      • @Jakeroxs@sh.itjust.works
        1 point · 10 months ago

        Anti-vax sentiment has been around for hundreds of years, long before the Internet, spread mostly through political rhetoric and/or religion. I’m not saying the spread hasn’t increased, but people believe wrong information all the time.

        • @TwilightVulpine@lemmy.world
          1 point · 10 months ago

          There is always a nutball, but my point is that, yes, it has increased significantly. Vaccines were a settled matter already; people far and wide trusted them. Now vaccination rates have gone down and diseases that we had nearly eliminated are making a comeback. This has happened because now any stupid grifter can have a worldwide platform and a following that actively spreads their nonsense.

    • Andy
      18 points · 10 months ago (edited)

      I think we need to pursue a strategy that discourages the spread of disinformation while avoiding making the platforms themselves the arbiters of truth.

      I think social media platforms are like a giant food court. If you do nothing to discourage the spread of germs, your salad bar and buffets are all going to be petri dishes of human pathogens. That doesn’t mean that the food court needs to put in hospital-level sterilization measures. It just means that the FDA requires restaurants to use dishwashers that get up to 71 C, and employees are required to wash their hands.

      In this case, I think we should experiment. What if platforms were required to let users flag something as disinformation, and share a credible source if they like? Maybe users could see all the flags and upvote or downvote them. The information would still be there, but you’d go to the InfoWars page and it would say, “Hey: You should know that 95% of people say this page posts mostly bullshit.”

      Something like that. I don’t like the role the companies play currently, but disinformation does carry the potential to cause serious harm.
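
      For illustration only, here is a minimal sketch of how the crowd-flag label described above might be aggregated. The data model, field names, and thresholds are hypothetical, not any existing platform’s system.

      ```python
      # Hypothetical sketch of the crowd-flagging idea described above.
      # Names and thresholds are made up for illustration; this is not any
      # platform's real API.

      from dataclasses import dataclass
      from typing import Optional

      @dataclass
      class Flag:
          source_url: Optional[str]  # optional credible source attached by the flagger
          upvotes: int               # users agreeing the flag is warranted
          downvotes: int             # users saying the flag is bogus

      def misinformation_score(flags: list[Flag]) -> Optional[float]:
          """Share of votes endorsing the flags, or None if there are too few votes."""
          up = sum(f.upvotes for f in flags)
          down = sum(f.downvotes for f in flags)
          total = up + down
          if total < 50:             # arbitrary minimum before showing any label
              return None
          return up / total

      flags = [Flag("https://example.org/fact-check", 180, 7), Flag(None, 12, 3)]
      score = misinformation_score(flags)
      if score is not None and score >= 0.9:
          print(f"Heads up: {score:.0%} of voters say this page posts mostly bullshit.")
      ```

      The design point is that the platform only surfaces the crowd’s verdict once enough votes exist; it never decides truth itself, which keeps it out of the “arbiter of truth” role.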

      • @root (admin)
        -4 points · 10 months ago

        Remember when social media was deleting news stories about a certain laptop?

        • Andy
          3 points · 10 months ago

          Yes?

          I can’t tell if you’re agreeing with me or not.

        • @Jakeroxs@sh.itjust.works
          1 point · 10 months ago

          I am also against deleting valid news about wrongdoing by Democrats, if you’re implying this stance is political in some way.

  • @RobotToaster@infosec.pub
    20 points · 10 months ago (edited)

    Maybe a hot take, but allowing capitalist corporations to decide what is “the truth” was always a terrible idea.

  • @Jeredin@lemm.ee
    9 points · 10 months ago

    Facebook was the start (though Yahoo and YouTube weren’t far behind). All conservatives and the rich have is money, so what’s it to them to buy out the company or its CEO? This may end up being a worse propaganda machine than FOX, but time will tell… I haven’t been on Reddit lately, so I’m not sure how their algorithm has been doing, but their ads were very conservative before I left…

    • chaogomu
      7 points · 10 months ago

      Facebook hired Joel fucking Kaplan right after he left the Bush White House. Kaplan personally exempted right-wing conspiracy news sites from Facebook’s truth standards, while also deprioritizing more overtly left-leaning sites.

      He personally nixed any change to the Facebook algorithm that would reduce the radicalization pipeline.

      Oh, and he also stuffed Facebook management with right-wing yes men.

      • @boredtortoise@lemm.ee
        3 points · 10 months ago

        Thank you. At least someone remembers that Facebook/Meta was a right-wing echo chamber well before Musk’s version.

      • @Jeredin@lemm.ee
        1 point · 10 months ago

        Best we can do is migrate… I always had this thought (stick with it): a lot of police are conservative, because how many progressive people do you know who want that difficult job? It seems true for management too: progressives don’t enjoy bossing people around and often enjoy the process of producing/making something. So who is left to fill the void? Conservatives. Worse, it doesn’t seem rare that “progressives” turn hyper-capitalist or altogether conservative; they simply join the rich club and adopt most of its ideology. Even most politically left politicians only pander to progressive social issues but tend to bend to special interests. It’s going to take a major shift to change our power structure, and who knows how that will come about…

    • Beemo Dinosaurierfuß
      6 points · 10 months ago

      What?

      Musk might be the number 1 person in the whole world when it comes to losing money this past year.

      • @CanadianCorhen@lemmy.ca
        1 point · 10 months ago

        Oh, I’m hardcore anti-Musk, but I mean “firing watchdog staff to reduce overhead,” not that Musk is financially literate.

        If Musk can fire his watchdog staff, thereby reducing staffing costs, other companies will want to follow suit. Just look at Spez admiring Musk.

      • Prethoryn Overmind
        -1 point · 10 months ago

        They are just another Lemmy user that doesn’t know what the fuck they are talking about.

  • The Snark Urge
    8 points · 10 months ago

    They know our government effectively has schizophrenia. Trouble is they’re fucked in the long run whether it’s Jekyll or Hyde.

  • @Monomate@lemm.ee
    6 points · 10 months ago

    Moral of the story: it only took one social media site to start being more lax on censorship for the other ones to follow suit. Maybe this is indicative that spending a lot of money on censorship measures is useless.

  • AutoTL;DR (bot)
    6 points · 10 months ago

    This is the best summary I could come up with:


    Social media companies are receding from their role as watchdogs against political misinformation, abandoning their most aggressive efforts to police online falsehoods in a trend expected to profoundly affect the 2024 presidential election.

    These shifts are a reaction from social media executives to being battered by contentious battles over content and concluding there is “no winning,” said Katie Harbath, former director of public policy at Facebook, where she managed the global elections strategy across the company.

    In the run-up to the 2020 presidential election, social media companies ramped up investigative teams to quash foreign influence campaigns and paid thousands of content moderators to debunk viral conspiracies.

    Civil rights groups pressured the platforms — including in meetings with Zuckerberg and Meta COO Sheryl Sandberg — to bolster their election policies, arguing the pandemic and popularity of mail-in ballots created an opening for bad actors to confuse voters about the electoral process.

    Internal momentum to impose the new rule seemed to plummet after Musk boasted of his plans to turn Twitter into a safe haven for “free speech” — a principle Zuckerberg and some board members had always lauded, one of the people said.

    Instagram head Adam Mosseri, who led efforts to build Threads, said earlier this year that the platform would not actively “encourage” politics and “hard news,” because the extra user engagement is not worth the scrutiny.


    The original article contains 2,614 words, the summary contains 226 words. Saved 91%. I’m a bot and I’m open source!

  • @deft@ttrpg.network
    2 points · 10 months ago

    tbh if you aren’t in highschool or a content creator why the fuck have an insta or Facebook or any of this at this point. nobody cares about your vacation bro