• deathbird@mander.xyz
    link
    fedilink
    English
    arrow-up
    2
    ·
    9 hours ago

    OS-level parental controls do not give a parent control over a child’s use of a social media platform, to the best of my knowledge. For example, how do you prevent a child from friending someone you don’t know on Facebook, while still letting your child join a Facebook group for their soccer team? That kind of fine-grained control needs to happen at the level of the platform. Universally blocking DMs to your child’s account from accounts they are not friends with needs to happen at the level of the platform. Etc.

    Universally preventing children from joining social media is also an option, as is giving parents the tools to individually block their children from accessing known social media sites on hardware under the parent’s control, but neither of these is sufficient or without negative consequences. Blocking children from social media by law requires age verification to have any effect. Blocking access to certain websites at a hardware level either encourages the child to use hardware outside their parents’ control, or else excludes them from a part of social life.
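    To make concrete how crude hardware-level blocking is, here is a minimal sketch (POSIX shell, with a hypothetical sample domain list) of the classic hosts-file approach: known social media domains get mapped to an unroutable address on a single machine.

    ```shell
    #!/bin/sh
    # Illustrative only: a hypothetical sample of social media domains,
    # not an exhaustive or authoritative blocklist.
    SITES="facebook.com www.facebook.com instagram.com tiktok.com"

    # Emit one hosts-file entry per domain; 0.0.0.0 is unroutable,
    # so lookups for these names go nowhere on this machine.
    for site in $SITES; do
      echo "0.0.0.0 $site"
    done
    # To apply, append the output to /etc/hosts as root, e.g.:
    #   ./blocklist.sh | sudo tee -a /etc/hosts
    # The block covers only this one machine -- any unmanaged device,
    # VPN, or alternate DNS setup bypasses it entirely.
    ```

    This is exactly the limitation described above: the restriction lives on one device, so the child needs only a friend’s phone to route around it.
    
    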

    Platforms need parental control tools as well, not just operating systems, and those tools need to be sufficient to allow a parent to have real control over what their child can access. I don’t think that will exist without legislation, because it is contrary to the platform’s financial interest.

    • lmmarsano@group.lt
      link
      fedilink
      English
      arrow-up
      1
      ·
      edit-2
      3 hours ago

      OS level parental controls do not give a parent control over a child’s use of a social media platform

      A quick web search indicates they can filter or block content, restrict apps, & report activity. Additional software can monitor communication (including social media) and alert guardians.

      However, the legal opinion wasn’t that parental control software is the best or only better solution[1], but that more effective alternatives with less adverse impact (such as non-punitive laws promoting the use of client-side parental controls) exist than punitive laws whose enforceability is limited by jurisdiction & that unnecessarily burden & deter (thus harm) the free exercise of fundamental liberties.[2] Client-side parental controls affect only their users, without affecting everyone else. Unlike regulations on site operators, they work on content originating outside a law’s jurisdiction. Even at the time of that federal court decision, parental controls could screen dynamic content (eg, live chats) over any protocol.

      By far, the most appropriate answer is responsible adult involvement & supervision and the education of children to address motivation, coping, & responsible behavior.

      The internet is global. A key problem with any coercive law is that its jurisdiction isn’t: just as 4chan.org can tell the UK’s Ofcom to go fuck itself, site operators beyond a law’s jurisdiction can tell its enforcers the same. Another issue is that the compliance burden falls harder on new entrants than on the dominant companies in the industry, which have more resources to comply, thus deterring competition. Do we really want to make it harder to displace our current social media companies with alternatives?

      Communication alone rarely poses immediate danger: there’s usually a number of steps between the communication & actual harm where anyone can intervene. We can block or ignore unwanted communication & choose the information we disclose. Responsible people can guide their children on safety & control their access to the devices they give them.

      A while ago, when my uncle struck his kid for making an unauthorized payment through the kid’s tablet, I scolded him for creating the situation where the kid could do that instead of setting up a child account with parental controls. When I asked him how child abuse is more responsible than reading some shit designed for him to understand and pressing a few buttons to use the system exactly as designed to prevent this shit from happening, he quickly got the point and did that in about an hour. This shit ain’t hard.

      Better solutions already exist, they’re effective, and governments already have solid recommendations for promoting them effectively. Governments have largely chosen not to.


      1. The cited recommendations I mentioned elsewhere went beyond parental control software into areas such as the promotion of standards & the development of better standards in the industry. ↩︎

      2. Rather than accept just any law, government has a duty to minimize compromises of fundamental rights in meeting its “compelling interests”. When government fails to prove that a law is the least adverse to fundamental liberties among alternatives that are at least as effective, that law must be rejected. ↩︎