• FauxLiving@lemmy.world · 10 days ago

    For stuff like Twitter-likes and TikTok-likes I want an algorithm.

    Until recommendation algorithms are transparent and auditable, choosing to use a private service with a recommendation algorithm means handing some random social media owner control over the attention of millions of people.

    Curate your own feed, subscribe to people that you find interesting, go and find content through your social contacts.

    Don’t fall into the trap of letting someone (ex: Elon Musk) choose 95% of what you see and hear.

    Algorithmic recommendations CAN be good. But when they’re privately owned and closed to public inspection, there is no guarantee that they’re working in your best interest.

      • FauxLiving@lemmy.world · 10 days ago

        They’re good at predicting what people want to see, yes. But that isn’t the real problem.

        The problem isn’t that they predict what you want to see; it’s that they use that information to give you a feed that is 90% what you want to see and 10% what the owner of the algorithm wants you to see.

        X uses that to mix in alt-right feeds, Google uses it to mix in messages from the highest bidder on its ad network, and Amazon uses it to mix in recommendations for its own products.

        You can’t know what they’re adding to the feed, how much of it is genuine recommendation based on your needs and wants, and how much is artificially boosted content serving the needs and wants of the algorithm’s owner.

        Is your next TikTok really the highest-ranked piece of recommended content, or is it something being boosted on behalf of someone else? You can’t know.
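        To make that concrete, here is a minimal, purely hypothetical sketch of how an opaque feed mixer could work. Nothing here comes from any real platform; the names, the 90/10 split, and the blending logic are all assumptions for illustration:

```python
import random

def build_feed(organic_items, boosted_items, feed_length=20, boost_ratio=0.10):
    """Hypothetical feed mixer: mostly items predicted from your behavior,
    plus a slice of owner-selected 'boosted' items blended in.
    The viewer only sees the final list, so the split is invisible."""
    n_boosted = max(1, int(feed_length * boost_ratio))
    n_organic = feed_length - n_boosted

    feed = organic_items[:n_organic] + boosted_items[:n_boosted]
    random.shuffle(feed)  # interleave so boosted items look like any other recommendation
    return feed

# The user gets 20 posts and has no way to tell which 2 were pushed for someone else's benefit.
organic = [f"organic_post_{i}" for i in range(100)]
boosted = [f"boosted_post_{i}" for i in range(10)]
print(build_feed(organic, boosted))
```

        The point isn’t the specific numbers; it’s that, without transparency, the boost ratio and the boosted list are whatever the owner decides, and you can’t audit either.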

        This has become an incredibly important topic now that people are using these systems to drive political outcomes that have real effects on society.

          • FauxLiving@lemmy.world · 10 days ago

            I’m carrying on multiple conversations in this thread, so I’ll just copy what I said in a different thread:

            Of course people like these features; these algorithms are literally trained to maximize how likable their recommendations are.

            It’s like how people like heroin because it perfectly fits our opioid receptors. The problem is that you can’t simply trust that the person giving you heroin will always have your best interests in mind.

            I understand that the vast majority of people are simply going to follow the herd and use the thing that is most like Twitter, recommendation feed and all. However, I also believe that it’s a bad decision on their part, and that the companies taking all of these people into their alternative social networks are just going to be part of the problem in the future.

            We, as the people who are actively thinking about this topic (as opposed to the people just moving to the blue Twitter because it’s the current popular meme in the algorithm), should be considering the difference between good uses of recommendation algorithms and abusive ones.

            Having social media controlled by private entities that use black-box recommendation algorithms should be seen as unacceptable, even if people like it. Bluesky’s user growth is fundamentally due to people recognizing that Twitter’s systems are being used to push content they disagree with. Yet they’re simply moving to another private social media network that’s one sale away from being the next X.

            It’d be like living under a dictatorship and deciding that you’ve had enough, so you move to the dictatorship next door. It may be a short-term improvement, but it doesn’t address the fundamental problem: you’re still choosing to live in a dictatorship.