• 1 Post
  • 386 Comments
Joined 4 years ago
Cake day: March 24th, 2022

  • If Facebook as it is today is the only profitable way for it to function, it shouldn’t exist.

    That’s not the case, of course. A disenshittified Facebook is possible, and it would be profitable too; it just wouldn’t make literally all the money, and it wouldn’t exploit you in literally every way it could manage. Zuckerberg and the other shareholders would have to tolerate lower profits, but they wouldn’t have zero.


  • I think you’re still misunderstanding me, and the scope of existing solutions.

    It is not sufficient to have tools to monitor behavior, and blocking whole websites is too crude. If those are the only tools you have, you’re encouraged either not to use them or to overuse them. There’s no real in-between.

    These platforms make design choices. There are some things they decide not to do because it’s less profitable.

    Again, what if I want my kid to have a Facebook account so that he can coordinate with his soccer club independently, but I don’t want him to DM strangers or join strange groups? I want to facilitate independence, not have to look over his shoulder constantly, but still protect him from groomers. It is not hard for platforms to enable this kind of control. It’s just not profitable.


  • The “these tools” that the government is targeting include end-to-end encryption, and I am of course of the position that end-to-end encryption is a good thing.

    When we talk about “abusive systems” we need to be very clear about what kinds of technology or system behaviors we are discussing, or else the default government “solution” will usually be “well, then it won’t be only the platform that spies on you.”


  • That’s literally how Facebook used to function, and a big motivator as to why people joined it. People wanted to interact with people they knew.

    And this would not prevent people from making new connections.

    If you wanted to meet, I don’t know, hot singles in your area, you would actually have to talk to people who knew about the singles group on Facebook and have them share the link to the group with you. Or find it in the search bar if it’s public. You know, seek it out.

    Keep in mind that Facebook does not show you groups or people because it cares about the connections that you make. It just wants you to keep clicking. Your own desire to connect is more than sufficient to drive you to connection…with a search bar.


  • You can control a lot about how platforms specifically work with legislation. After decades of seeing how they function, we are more than capable of accurately identifying which functionalities are deleterious to the well-being of people generally or children specifically, and which should be under the control of the parents of children. That gives us a good idea of what we should legally require platforms to disable or otherwise put under parental control.

    Again I propose as an example: mark an account as a child’s account with a designated parent account, and require that any attempt to add that child account as a friend or connected account on a social media network be approved by the parent account.
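To make the proposal concrete, here is a minimal sketch of that approval flow. Everything here (`Account`, `FriendRequest`, the field names) is hypothetical and illustrative, not any real platform’s API:

```python
# Hypothetical sketch of a parent-approved friend-request flow for child accounts.

class FriendRequest:
    def __init__(self, from_user, to_user):
        self.from_user = from_user
        self.to_user = to_user
        self.status = "pending"

class Account:
    def __init__(self, name, is_child=False, parent=None):
        self.name = name
        self.is_child = is_child
        self.parent = parent          # designated parent Account, if this is a child account
        self.friends = set()
        self.pending_approval = []    # requests awaiting this parent's decision

def send_friend_request(sender, recipient):
    req = FriendRequest(sender, recipient)
    if recipient.is_child:
        # Child accounts never auto-accept: the request is routed to the parent.
        recipient.parent.pending_approval.append(req)
    else:
        req.status = "accepted"
        sender.friends.add(recipient.name)
        recipient.friends.add(sender.name)
    return req

def parent_decide(parent, req, approve):
    parent.pending_approval.remove(req)
    if approve:
        req.status = "accepted"
        req.from_user.friends.add(req.to_user.name)
        req.to_user.friends.add(req.from_user.name)
    else:
        req.status = "rejected"
```

The point is just that the decision gate lives with the parent account, not the child: a stranger’s request sits in `pending_approval` and never touches the child’s friends list until the parent acts.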


  • OS-level parental controls do not, to the best of my knowledge, give a parent control over a child’s use of a social media platform. For example, how do you prevent a child from friending someone you don’t know on Facebook, while still letting your child join a Facebook group for their soccer team? That kind of fine-grained control needs to happen at the level of the platform. Universally blocking DMs to your child’s account from accounts they are not friends with needs to happen at the level of the platform. Etc.

    Universally preventing children from joining social media is also an option, as is giving parents the tools to block their children individually from accessing known social media sites on hardware under the parent’s control, but neither is sufficient or free of negative consequences. Blocking children from social media by law requires age verification to have any effect. Blocking access to certain websites at the hardware level encourages the child to use hardware outside their parents’ control, or else excludes them from a part of social life.

    Platforms need parental control tools as well, not just operating systems, and those tools need to be sufficient to give a parent real control over what their child can access. I don’t think that will exist without legislation, because it is contrary to the platforms’ financial interest.
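The DM-blocking rule above is simple enough to sketch. This is a hypothetical illustration of a platform-side check, not any real platform’s behavior; the `is_child` and `friends` fields are assumed:

```python
# Sketch of a platform-level DM rule: child accounts only accept messages
# from accounts already on their friends list. Account shape is assumed.

def dm_allowed(sender, recipient):
    """Return True if sender may DM recipient under the child-safety rule."""
    if recipient.get("is_child", False):
        # Child accounts: DMs only from already-approved friends.
        return sender["id"] in recipient.get("friends", set())
    return True  # non-child accounts keep the platform's normal behavior
```

A check like this has to run server-side, on the platform, because no OS-level control can see who is or isn’t on a child’s friends list.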



  • “The design feature changes the state is seeking include ‘enacting effective age verification, removing predators from the platform, and protecting minors from encrypted communications that shield bad actors.’”

    Oh fuck right off.

    I’m sorry, but this is a bad “think of the children” decision. There are limits to what Meta or any platform of that size can do about bad actors without structural changes.

    What might actually help: only show people content from the groups and people they follow, preferably in chronological order, rather than constantly suggesting new groups and pages algorithmically and thereby increasing the likelihood of children interacting with strangers on the Internet.
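A followed-only chronological feed is about as simple as feed logic gets. A minimal sketch, assuming posts carry a `source` and a `ts` timestamp (both illustrative field names):

```python
# Sketch of a followed-only chronological feed: keep only posts from sources
# the user follows, then sort newest first. No algorithmic suggestions.

def build_feed(posts, followed):
    """posts: list of {"source": str, "ts": int, ...}; followed: set of source names."""
    return sorted(
        (p for p in posts if p["source"] in followed),
        key=lambda p: p["ts"],
        reverse=True,  # newest first
    )
```

Note there is no recommendation step at all: nothing outside `followed` can ever appear, which is exactly the property that keeps strangers’ groups out of a child’s feed.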

    And improve parental controls for children’s accounts. I’m fairly sure there’s nothing currently giving a “parent” account high-level control over a “child” account, but I’m happy to be corrected if I’m wrong.

    But also: require interoperability with other platforms and a standardized form of profile data export, so people can leave Facebook but stay in touch with the people who still use it.
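For the export half of that, here is a rough sketch of what a portable profile document could look like. The JSON schema (the `portable-profile/0.1` tag and its fields) is entirely hypothetical, not an existing standard; the point is only that a contact list can survive a platform switch:

```python
# Sketch of a standardized profile export/import round trip.
# The schema below is invented for illustration, not an existing format.

import json

def export_profile(display_name, contacts):
    """Serialize a profile and its contact list to a portable JSON document."""
    doc = {
        "format": "portable-profile/0.1",  # hypothetical schema version tag
        "display_name": display_name,
        "contacts": [
            {"handle": c["handle"], "home_server": c["server"]}
            for c in contacts
        ],
    }
    return json.dumps(doc, indent=2)

def import_profile(blob):
    """Parse a portable profile document; reject unknown formats."""
    doc = json.loads(blob)
    if not doc.get("format", "").startswith("portable-profile/"):
        raise ValueError("unsupported export format")
    return doc
```

Recording each contact’s home server is what makes this interoperable rather than just a backup: the importing platform knows where to reach people who stayed behind.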