I have to admit, I hadn’t realized it had got this bad. How did this get normalized?
I browse with most scripts disabled, and have since JS was first introduced to the browser. What I’ve observed is that some pages contain NO actual content, or just the first paragraph, when I load them. I read what’s provided and move on. If a site is hostile to me reading the content they worked so hard to get in front of me, I’m not going to do any extra work to find out what it is.
Ironically, AI is somehow making browsing with JS disabled better nowadays, because text/markdown is becoming normalized, so receiving a pure-text version of a page is a thing again.
The average user doesn’t know or understand technical details, and doesn’t believe they have any power to change anything
Also capitalism means a small number of assholes make most of the decisions for reasons that benefit them
It is mostly because the bar is measured in time to display content (I forgot the name of the metric)
So the huge amount of bullshit gets hidden by fast internet and asynchronous jobs.
I think it’s “First paint” or something like that.
https://web.dev/articles/fcp
Yeah, you are correct
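For anyone curious, FCP entries from the article linked above can be watched with the standard PerformanceObserver API. A minimal sketch; the "paint" entry type only exists in browsers, hence the feature check:

```javascript
// Sketch: log First Contentful Paint (FCP) when the browser reports it.
// The "paint" entry type is browser-only, so feature-detect before observing.
const supported =
  typeof PerformanceObserver !== "undefined" &&
  PerformanceObserver.supportedEntryTypes.includes("paint");

if (supported) {
  new PerformanceObserver((list) => {
    for (const entry of list.getEntriesByName("first-contentful-paint")) {
      // startTime is milliseconds since navigation start
      console.log(`FCP: ${entry.startTime.toFixed(0)} ms`);
    }
  }).observe({ type: "paint", buffered: true });
} else {
  console.log("paint timing not available in this environment");
}
```

This is exactly the number sites optimize for: as long as *something* paints quickly, everything loaded afterward by async scripts doesn't count against the metric.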