• 2 Posts
  • 265 Comments
Joined 2 years ago
Cake day: April 27th, 2024


  • “AI” is not a use case for a computer. Plain and simple. A real use case would be, for instance, editing videos, writing code, or creating spreadsheets, and what the everloving shit does adding ✨Agentic and Conversational AI✨ fix about literally any of those use cases?

    Sure, research can be a use case for AI stuff, as well as just talking with it, but there’s no reason to sell an entire fucking class of laptops labelled “AI PCs” when the only thing they have is Windows 11 Copilot (lobotomised ChatGPT) and an NPU advertised as a “future compatibility” feature…



  • The worst thing about ‘smart TVs’ is that they advertise so many ‘cool’ features, yet most of them have weaker processors and less RAM than my 2019 budget Galaxy A series phone, and that’s very telling. You can’t even use their dog shit built-in web browsers, since everything becomes outdated after like a week, and the performance is so bad that the expensive Hisense TV my dad bought back in 2020 can’t even load Google properly.



  • biggerbogboy@sh.itjust.works to Lemmy Shitpost@lemmy.world · “Fake moo”

    I believe there was an incident near me where it was partly human meat. I’ve heard that a worker at the meat packing plant for maccas got fed up with working there, so he tied up all his coworkers and fed them into the meat grinder. I’m actually not too sure if it was the nuggets or the patties, but then again, that’s still fucked

    Edit: I just searched it up a few times and now I’m not even sure if it’s real lol, maybe it’s a myth



  • And the funny thing about those phone plans is that once people get close to paying off their iPhone 213 XLLXQ, the carrier offers them a “free” upgrade to the iPhone 214 XLLXQ Ultra Big-Boy Edition, and if they take the bait they end up paying for the entire new phone all over again, which tons of people do unfortunately.

    Personally, I’d rather just buy some old flagship phone used, since phone features don’t really change much over the years, and I don’t even need a whole lot because I barely use phones anyway unless they’re a part of my KDE Connect “mesh” of devices


  • It’ll probably spur on a higher influx of systems with soldered unified memory, until even desktops commonly have soldered RAM and processors. It might even open the door to new socket standards, since consumers would be begging for them at that point.

    It kinda even aligns with my theory of how electronics improve: standards become incredibly commonplace but stale, which spurs new form factors that are soldered, and then the rest of the market follows, eventually creating new modularity standards to replace the old ones.







  • Yeah nah, the AI shit that he himself, as well as the industry as a whole, is peddling isn’t as useful as they say it is, even with the “revolutionary new Gemini 3 models”, which are just slop convo generators. The thing is, when AI is thrust into a person’s line of sight, the label doesn’t make them impulsively rework their entire workflow to end up with the exact same quality but with an AI in the middle; most people brush it off as system bullshit they don’t need. And even if it could help with some things in some capacity, it’s marketed as a “feed everything into me and you can use me for everything” machine, when honestly more accurate, smaller, specialised models should be made instead, even if their purpose is a bit dubious too.

    And sure, it might seem contradictory, but I do use an AI, though it’s pretty much just for brainstorming and conversational shit, refining my ideas by articulating them. What I really dislike is how my computers are full of AI services for no reason, with a few of my laptops having Copilot baked in, another with Gemini baked in, and what have you.

    I suppose the only good thing to come out of LLMs was the bump from 8 to 16 GB of RAM on many machines, but then again, that had to drop again because of said AI companies.




  • What’s with this obsession with putting everything in space? Like, don’t get me wrong, some technologies absolutely benefit from it, but why put data centres in space? Why put greenhouses in space? Why put a factory in space? Sure, it’s cool to see, but I genuinely don’t see the benefit, especially when you either have to pay tens of billions to get a standard facility working in space, or have to miniaturise it so much that there’s no advantage left at that point…

    Maybe I have a shit take, I’m not sure, but what I see is priorities mismatched on a crazy level. Then again, at least this isn’t the stupidest idea, since it has at least some grounding; AI companies wanting to send dozens of data centres into space is plain infeasible even if they manage to use neuromorphic and light-based chipsets, somehow use nuclear fusion, and pack in a radiator system dense enough that the whole facility doesn’t burn up.


  • It sucks that Microsoft sees no reason to enforce any resource usage limits for anything. Console manufacturers do this and games run incredibly well there, and likewise Apple (despite the other bullshit they pull) enforces software requirements so things run at least functionally on the oldest supported devices.

    All Microsoft has done is shoot themselves in the foot by upping the requirements so they can get lazy with coding, such as making pretty much every UI component an Electron app, or how apparently a third of their code is vibe coded. Meanwhile, with the prices of devices with reasonable amounts of RAM skyrocketing, too many consumers get the bottom-of-the-barrel configs and then wonder why their computer is insufferably slow; it’s because Microsoft is now enforcing their laziness, possibly so they can change UI components quicker through higher-level languages.