

Who doesn’t like their phone charging them by the word?


To be fair, using React for it was just an odd decision to begin with.


Is this anything new at all?
Even back in the day, you had people wanting to live in the recent past, because the past usually gets romanticised.
So people in the 1960s might have had a rosy view of the turn of the century, and wanted to go back to the 1930s days of Art Deco and balls, just as people today might want to return to what they believe were the glory days of the 1960s. Even if that isn’t actually realistic to how people lived in the past. The average citizen in 1930 was not attending balls at a swanky music lounge.
Give it a few decades, we might also have people from 2050 pining for the 2020s, believing it to be just like the advertisements, where we all live in the penthouse level of skyscrapers, overlooking a vast cityscape.


That instance needs a login to show the post.


If you don’t have a Kobo, the file conversion is also a lifesaver.
I have one of the old Kindle e-readers, and it doesn’t support epub, for example. It does support pdf, in theory, but the age of the hardware means any decently large/complicated pdf bogs it down something fierce.
Being able to use calibre to convert my books to a format it does support is nice.
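For what it’s worth, calibre ships a command-line converter alongside the GUI, so the conversion can be scripted. A minimal sketch, assuming calibre is installed and the filenames (`book.epub`, `book.mobi`) are hypothetical placeholders:

```shell
# Guarded so it degrades gracefully if calibre isn't installed.
if command -v ebook-convert >/dev/null 2>&1; then
  # Convert an EPUB into MOBI, which older Kindles can read natively.
  ebook-convert book.epub book.mobi
else
  echo "calibre's ebook-convert not found; install calibre first"
fi
```

The same command handles other targets too (AZW3, PDF) just by changing the output extension.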


Do kind of wish that they had less silly names, though.
It’s hard to recommend them without sounding like you’re just babbling nonsense.
If you get Libby and Hoopla for your Kobo, you don’t need Ploob, no matter how much Ploob has it for you.


Sort of? Apple’s reputation is traditionally that they make middle-of-the-road hardware, but make up for the shortcomings with software.
On paper, you can buy a Windows computer with better specs for cheaper, but the Apple computer still holds its own because the software is well-made, at least on the OS side of things. Even if the rest of their software were rubbish, you could get rid of it and still have a good foundation to work from. Hence the Hackintosh being all the rage some years back: in theory, you could eke out the best of both worlds.


I think that’s why we haven’t seen Apple Silicon advertised that heavily lately.
There’s also not much of a point to advertising it at this point. The M-Series chips have been around for a good while now, and are used in a bunch of their products. It’s basically turned into the status quo, so they have no need to advertise it, particularly as the improvements seem to be mostly incremental for the time being.


I do kind of wish that there was a way to bring back the old squishy gel 3D icons, though.
The current thing is a bit of an awkward cross between them and the flat colours that seem to be basically everywhere now.


It’s still a bewildering oversight that it, or something just like it, is the only way you can link with a device.
If you stuff your phone with photos, you can’t delete them by connecting the phone to a computer and sorting through them there. You have to use a utility to either import them straight onto the computer, or delete them separately on the phone. Even if you use a Mac instead of a PC, you basically need to work with an iTunes-like interface.
That’s especially jarring with the focus on trying to make the iPad a computer. You’re still largely relegated to the iTunes-type interface, unless you sidestep it with a cloud service or AirDrop.


Apple Vision Pro seemed doomed from the get-go, but they really made it worse by not launching a cheaper headset with Air branding half a year or a year in, to actually drive market share enough to make it worthwhile for developers. They could’ve given it an A-series CPU, since we now know those chips work in a laptop, so why not in XR, or whatever they’re calling this.
I think that they shot themselves in the foot by trying to make it a computer that goes on your face, and have it do as much as possible.
The interface is weird, and comes with a bunch of features that don’t seem very useful. The eye thing is simply odd, and the keyboard seems like it would run into the same problems that those laser keyboards that were all the rage back in the day had, where it’s awful to type on, since you get no feedback, and are just whacking your hand against a solid surface.
If they had stripped it all the way down into basically being a wearable monitor you can plug into your devices, with workspaces you can expand or move around as you like, in lieu of having a bunch of monitors, it would have been more of a sell.
As it is, it comes across as a proof-of-concept that’s stuffed to the gills with gimmicks to try and make it fit a niche, which in turn makes it seem a toy more so than anything else.


The skeuomorphic days of the early 2000s were nice, and gave things a bit of character. The current trend of having everything be flat colours is fine, but does lose a little bit of that whimsy.
Admittedly, part of it might also just be that the grass is greener. We could easily be saying the same thing in reverse if we were still on the gel look of the time.


I don’t think it’s short-term profits exactly, as much as he’s just focused on making a profit, to the exclusion of all else. Logistics work doesn’t tend to pay off short-term, and that is a lot of what his tenure focused on, with Apple basically bringing everything back in-house.


Their M-Series SoCs are also popular enough that they’re the face of AI outside of GPUs and datacentres, and they were pretty big for the whole computing industry, especially given the reputation Macbooks had of being slow and prone to overheating, and ARM being seen as slow and exclusively for mobile. Apple wasn’t the first to make an ARM computer, but from memory, a lot of the earlier ones were relegated to either Chromebooks or single-board computers. You’d have been silly to put an ARM-based CPU in your laptop if you were planning to do any serious work.
The whole agentic AI trend of late basically has people flocking to M-Series Macs, even when the setup is mostly routed through an external provider and could run with minute resources.
It’s equally weird to think that your Macbook runs on an iPad/iPhone chip, but there we are. If you went back 10 or 20 years and told people that Apple was making Macbooks run on old iPhone chips, they’d think you were joking about how bad they were.


Millennials are ruining the device industry smh


This is counter-productive and can get you in big trouble IMO. I don’t even get what these are protesting
It reads like a policy/implementation fault. The workers have been told to use AI, but haven’t been given clear guidance, or are presented with a bad model/interface, so they just hop on Google Bard or something familiar that works better.
It’s still using AI, so basically the same thing.


The categories that they used for “sabotage” (entering proprietary information into a different AI, using unapproved chatbots, and using low-quality AI responses as-is) seem like they’re put together so the failure of the AI rollout can be blamed on employee sabotage, rather than on employers trying to wedge it onto a bad use case, or not rolling it out properly.
The first two just seem like the company having issues with people going straight to ChatGPT and using that as-is, and the third seems to be more people not really caring and using the AI output unchanged.
None of that comes across as outright sabotage like the organisation or the article try to imply. All three seem like reasonable end-points of telling people to use AI and giving them metrics they need to meet, or a not-great interface, so they just go off and use a different AI thing, because it’s all AI, and basically the same thing, right?


It is weird that they use it as a national identification number, when they are ostensibly virulently against the concept, and it was never designed to be used in that manner to begin with.


It’s like a better iPad in a way, since you could run full-scale desktop programs on it, and use it like a desktop.
I wouldn’t be too surprised if things like Surfaces were one of the reasons why Apple seems to be making a push to try and make the iPad functional as a computer on its own.


So what happens if the artist is dead?
Freddie Mercury would find it difficult to maintain an active social media presence to prove he’s human, being rather indisposed at present.