• 1 Post
  • 390 Comments
Joined 4 years ago
Cake day: March 24th, 2022

  • By “tools” I generally mean software, and the options/functionalities offered by that software through the regular user interface that enable one to modify the software’s outputs, and thus one’s user experience. So in this sense Windows 11 is a “tool” that, as an operating system, enables one to use a computer, but it also therefore supplies tools to modify the experience, such as one that lets a privileged user prevent non-privileged users from uninstalling software or sharing a printer to the LAN, right? Facebook (a software deployment owned and remotely hosted by Meta) has a tool that allows a person with a Javascript-enabled web browser (also a tool) or Meta’s proprietary application to send a message to a stranger on the internet, or to a known person, along with a lot of other things, right?

    Now what Windows 11 doesn’t have is a tool that lets me locate my mouse pointer on screen easily, but that’s okay because I can install PowerToys to gain that functionality. I can also install software that modifies the Facebook experience to some degree, but there’s not a lot of that for various reasons, and certainly I can’t find any that sells itself as a child-safety or parental-control solution. But that makes sense, right? Because in order to serve that purpose it has to be deployable across every computer the child uses to access that remote service, and it has to be kept updated to match changes in that service’s software, like a shadow attached to your feet. Not practical at all.

    Obviously this is of limited use, and this is why people who use tools to modify their experience of social media sites like FB are usually doing so merely for their own comfort and enjoyment, which is valid but not the same purpose as parental control. And the relationship between the remote service and the local software developer is adversarial. This is why there are plenty of parental control tools to block a website, but none to modify one.

    I actually agree that moderation is the solution, but not in the way you mean. FB doesn’t create content, it just lets people share their own (bots too, but set that aside). I don’t think any sane person believes that Meta or lemmy [dot] world or any other platform could continue to exist if it was held responsible for what its users said. Platforms make what efforts they do at regulation to avoid getting DMCAed, to keep themselves advertiser-friendly, and to make their services sufficiently enjoyable to the users those advertisers want their ads to be seen by. That last bit’s important, but look at just the first two, a legal regulation and “regulation” by market forces in the wild, and you can see how these already cause problems. But what platforms like FB don’t give you, because they don’t want you to have it, is control over your user experience.

    FB doesn’t want you to have tools (account options) to moderate your own or your child’s experience on their platform because it would cost them money, both in development costs and opportunity costs. But that’s what’s actually needed to make FB an enjoyable and even child-safe experience. Not broad legal “moderation” demands that no platform could survive without obscenely invasive company-side tools and exploitative labor outsourcing, but functional tools (that, yes, would have to be mandated by law because they won’t do it voluntarily) that enable the user to control their own experience. It’s a question of: do you want some underpaid and thrice-subcontracted Indian/Nigerian tech workers reading your teen’s sexts with his boyfriend and making judgment calls as to their appropriateness, or do you want the capacity to simply allow communication between those two accounts without monitoring them, while retaining the ability to block DMs from unknown accounts so your kid doesn’t get groomed by a stranger? We’re constantly told we have to choose between total system control or the Wild West, but we are only encouraged to consider these possibilities because they’re what’s cheapest for the companies.




  • This is a solvable problem though. FB could create tools to allow their users to cultivate a better experience, including but not limited to parents and children. It wouldn’t require a war of attrition against automation, or infinite moderators, but allowing people to have deeper control of their experience would reduce the number of ads you could shove in their faces and the amount of profit you could make. They therefore won’t do it voluntarily, and that’s why they should be compelled to provide such functionality by law.


  • If Facebook as it is today is the only profitable way for it to function, it shouldn’t exist.

    That’s not the case, of course. A disenshittified Facebook is possible, and it would be profitable too, it just wouldn’t make literally all the money. It just wouldn’t exploit you in literally every single way it could manage. Zuckerberg and the other shareholders would have to tolerate lower profits, but they wouldn’t have zero.


  • I think you’re still misunderstanding me, and the scope of existing solutions.

    It is not sufficient to have tools to monitor behavior, and blocking whole websites is too crude. If those are the only tools you have, you’re really just encouraged either not to use them or to overuse them. There’s no real in-between.

    These platforms make design choices. There are some things they decide not to do because it’s less profitable.

    Again, what if I want my kid to have a Facebook account so that he can coordinate with his soccer club independently, but I don’t want him to DM strangers or join strange groups? I want to facilitate independence, not have to look over his shoulder constantly, but still protect him from groomers. It is not hard for platforms to enable this kind of control. It’s just not profitable.


  • The “tools” that the government is targeting include end-to-end encryption, and I am of course of the position that end-to-end encryption is a good thing.

    When we talk about “abusive systems” we need to be very clear about what kinds of technology or system behaviors we are discussing, or else the government solution by default will usually be “well it can’t be only the platform that spies on you.”


  • That’s literally how Facebook used to function, and a big part of why people joined it. People wanted to interact with people they knew.

    And this would not prevent people from making new connections.

    If you wanted to meet, I don’t know, hot singles in your area, you would actually have to talk to people who knew about the singles group on Facebook and have them share the link to the group with you. Or find it in the search bar if it’s public. You know, seek it out.

    Keep in mind that Facebook does not show you groups or people because it cares about the connections that you make. It just wants you to keep clicking. Your own desire to connect is more than sufficient to drive you to connection…with a search bar.


  • You can control a lot about how platforms specifically work with legislation. After decades of seeing how they function, we are more than capable of accurately identifying which functionalities are deleterious to the well-being of people generally or children specifically, and which should be under the control of the parents of children; therefore we have a good idea of what we should legally require platforms to disable or otherwise put under the control of parents.

    Again I propose as an example: having an account marked as a child’s account, with a designated parent account, and making it so that if someone attempts to add that child account as a friend or connected account on a social media network, the addition must be approved by the parent account.
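
    To make that proposal concrete, here is a minimal sketch of such an approval flow (in Python, with purely illustrative names; no real platform exposes anything like this, it only shows where the gate would sit):

    ```python
    # Hypothetical sketch only: illustrative names, not any real platform's API.
    from dataclasses import dataclass, field


    @dataclass
    class Account:
        account_id: str
        is_child: bool = False
        parent_id: str | None = None                  # set only on child accounts
        friends: set = field(default_factory=set)
        pending: list = field(default_factory=list)   # held (requester, child) pairs


    def request_friendship(requester, target, accounts):
        """Route a friend request; hold it if the target is a flagged child account."""
        if target.is_child and target.parent_id is not None:
            accounts[target.parent_id].pending.append(
                (requester.account_id, target.account_id)
            )
            return "held for parental approval"
        target.friends.add(requester.account_id)
        requester.friends.add(target.account_id)
        return "connected"


    def approve(parent, requester_id, child_id, accounts):
        """Parent confirms a held request; only then does the connection exist."""
        parent.pending.remove((requester_id, child_id))
        accounts[child_id].friends.add(requester_id)
        accounts[requester_id].friends.add(child_id)
    ```

    The point is only that the gate has to live on the platform’s side, where the friend graph actually is; nothing installed on the child’s device can enforce it.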


  • OS-level parental controls do not give a parent control over a child’s use of a social media platform, to the best of my knowledge. For example, how do you prevent a child from friending someone you don’t know on Facebook, while still letting your child join a Facebook group for their soccer team? That kind of fine-grained control needs to happen at the level of the platform. Universally blocking DMs to your child’s account from accounts they are not friended to needs to happen at the level of the platform. Etc.

    Universally preventing children from joining social media is also an option, as is giving parents the tools to individually block their children from accessing known social media sites from hardware under the parent’s control, but neither of these is sufficient or without negative consequences. Blocking children from social media by law requires age verification to have any effect. Blocking access to certain websites at the hardware level encourages the child to use hardware outside their parents’ control, or else excludes them from a part of social life.

    Platforms need parental control tools as well, not just operating systems, and those tools need to be sufficient to allow a parent to have real control over what their child can access. I don’t think that will exist without legislation, because it is contrary to the platform’s financial interest.



  • “The design feature changes the state is seeking include ‘enacting effective age verification, removing predators from the platform, and protecting minors from encrypted communications that shield bad actors’.”

    Oh fuck right off.

    I’m sorry but this is a bad “think of the children” decision. There are limits to what Meta or any platform can do about bad actors at that size without structural changes.

    What might actually help: only show people content from groups and people that they follow, preferably in chronological order, rather than suggesting new groups and pages algorithmically all the time and thereby increasing the likelihood of children interacting with strangers on the Internet.
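
    Mechanically that is a small change; here is a toy sketch of a followed-only, chronological feed (Python, with made-up data shapes, not anything resembling Facebook’s actual code):

    ```python
    # Toy sketch: a feed built only from sources the user already follows, newest first.
    # The data shapes are assumptions for illustration, not any platform's real API.
    from dataclasses import dataclass


    @dataclass
    class Post:
        source_id: str     # the group or person that published the post
        timestamp: float   # seconds since epoch
        text: str


    def build_feed(posts, followed_source_ids):
        """Drop everything from unfollowed sources, then sort newest first."""
        own = [p for p in posts if p.source_id in followed_source_ids]
        return sorted(own, key=lambda p: p.timestamp, reverse=True)
    ```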

    And improve parental controls for children’s accounts. I’m fairly sure there’s nothing currently giving a “parent” account high-level control over a “child” account, but I’m happy to be corrected if I’m wrong.

    But also: require interoperability with other platforms and a standardized form of profile data export, so people can leave Facebook but stay in touch with the people who still use it.