Ever since Lion King came out the “wildebeest” name has been much more common.
But the most commonly used logo for the GNU software is an anthropomorphized gnu.
For the software, the official position is that the G in GNU is pronounced. But for the animal known as a gnu, the g isn’t pronounced.


briefly released millions of tracks that were scraped from Spotify via BitTorrent.
That’s just an awkward sentence construction, but it makes sense: they released tracks via BitTorrent. The tracks were scraped from Spotify.
I sold my car that was purchased from a dealership via private party sale.
I charged my laptop that normally accepts 100W via a 20W phone charger.
I would’ve used a “which” phrase with commas to avoid the confusion, but the sentence as written is valid and makes sense.


You can reason from a few principles:
So when people start making claims about things with clear, objective definitions (a win condition in chess, the fastest route through a maze, the best lossless compression of real-world text), it’s reasonable to believe that the current AI infrastructure can lead to breakthroughs on that front. That’s why image recognition, voice recognition, and things like that were largely solved a decade ago. Text generation with clear and simple definitions of good or bad (simple summaries, basic code that accomplishes a clearly defined goal) is what LLMs have been doing well.
On things that have much more fuzzy or even internally inconsistent definitions, the AI world gets much more controversial.
But I happen to believe that finding and exploiting bugs or security vulnerabilities falls more into the well defined problem with well defined successes and failures. So I take it seriously when people claim that AI tools are helpful for developing certain exploits.


but isn’t the memory on the Neo on the same die as the processor?
Not actually on the same die, but in the same package, stacked on top using TSMC’s Integrated Fan-Out Package on Package (InFO-PoP).
So the memory still needs to be sourced from memory manufacturers, sent to TSMC, and packaged by TSMC into a single package. It’s unclear whether they had locked up this supply at pre-AI prices, though. The underlying A18 Pro chip/package was announced and launched about 18 months ago, so if they had the manufacturing pipeline set up for that, they might have kept the contractual rights to continue buying memory at the old prices.


This reporting says that the subpoena requires that Reddit produce the information and appear for a hearing.
Shouldn’t the system be storing timestamps in UTC anyway, and then displaying them in whatever localization settings you have?
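The store-in-UTC, display-in-local pattern looks something like this minimal Python sketch (the "America/Chicago" zone is just an example, not from the comment):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# Store a single canonical instant in UTC.
stored = datetime(2026, 2, 3, 15, 30, tzinfo=timezone.utc)

# Display it in whatever zone the viewer's settings specify
# (example zone; in practice this comes from user/locale config).
local = stored.astimezone(ZoneInfo("America/Chicago"))

print(local.isoformat())  # same instant, rendered as local wall-clock time
```

The key property is that `stored` and `local` compare equal as instants in time; only the rendering changes with the viewer's settings.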
On the flip side, I’m a former sysadmin, and I only stuck around for five years because I had the educational credentials to move on to another field (and then another field). I’m glad I did the IT thing in my 20s, and I still like to tinker with homelab stuff 20 years later, but in the end it was a stepping stone toward something else (that does require formal schooling). The degree is a tool that gives you control over a few more things in your life, in the hopes that you can get where you want to end up.


No, it’s not volunteering, at least not anymore.
Subpoena is legal Latin for “under penalty,” because noncompliance with a subpoena carries a penalty.
Originally, it was an information request from the feds, and Reddit refused. Then they escalated to getting a grand jury subpoena (which means they got a bunch of normal citizens to agree that the information was relevant to a criminal investigation), so now noncompliance carries a penalty.
Reddit notified the users, who hired their own lawyers, who are resisting the subpoena and will litigate it until a judge decides whether Reddit has to turn the information over.
That’s the process for these things, and we’re a couple steps in already.


Doordash puts the tip in the order so that drivers can view them as bids for service, and they generally snatch up the highest tipping orders immediately. Low/no tip orders with long distances or a lower rated customer might languish, unclaimed, because drivers won’t want to take them for insufficient pay.


The article describes how they immediately went to look for an unsigned 32-bit millisecond counter when they noticed it was happening around 50 days since last reboot, because they already knew that association you describe.
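The 50-day/32-bit association is quick to verify with back-of-the-envelope arithmetic (not from the article, just the math):

```python
# An unsigned 32-bit millisecond uptime counter wraps after 2**32 ms.
WRAP_MS = 2**32                              # 4_294_967_296 ms
wrap_days = WRAP_MS / (1000 * 60 * 60 * 24)  # ms -> days
print(f"{wrap_days:.2f}")  # -> 49.71
```

So an uptime-related failure that reliably shows up "around 50 days" is a strong hint that a 32-bit millisecond counter is overflowing somewhere.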
Interesting writeup. Fun little story about the detective work involved.


God I wish democracy meant that we could vote on decisions like this
You can! Only problem is that it’s one vote per dollar instead of one vote per person.


It sucks though, I agree - software should get more efficient over time, just like hardware does.
It generally does, for any given computing task, but the problem is that generally software adds more features over time, not least of which is supporting new hardware that hits the ecosystem.
Arch’s package management is actually the ideal, in my opinion. Official repositories for the stuff the distro maintainers want to officially support, a user-maintained AUR for other common packages, and the ability to build your own software with the Arch Build System while letting pacman know where everything is. In a sense, the stuff in the official repositories has a privileged position, and you should be aware of the difference between the AUR and the official repositories, but you’re still always in control of what software is installed.
The AUR packages and user-specific builds can be thought of as side loading, and the distinction can matter in some circumstances. So I’m ok with having another name for different installation/upgrade/update methods.


Unless it can be paper thin, this does not look better than magnetic tape.
As the article explains, the whole purpose here is to be able to store data on a medium that can endure harsh conditions, including heat, moisture, radiation, and physical abrasion. The company’s website claims the medium can retain data for 5000 years without power, and is water and fire resistant.
I reckon you could scratch it pretty easily.
The underlying ceramic film is already used for protecting tools like drill bits and saw blades from physical damage, which is why it was chosen for this project. They already found one of the most durable materials in the world, and asked whether they could store data using that already-durable material.


With the current level of tech in a car, you’re already likely pushing 300GB in total.
The actual article (and the call it is reporting on, with statements from the CEO) says that 16GB is the average in new cars today. No need to make stuff up.
I’m with you.
GoPro obviously found a really interesting niche that they dominated for about 10 years, and POV videos can still be cool for sports and things like that where the videographer tends not to have hands available for actually holding a camera. I think that’s still pretty cool, and glasses can be a useful form factor for that general use. I’m all for making camera ergonomics better.
But the AI assistant stuff and the attempts to make them part of the actual day to day (both by attempting to make them fashionable and by socially normalizing a camera pointing at everything all the time) are obviously a bad development. Even if we implement countermeasures (re-normalizing masks in public, making lighting terrible for digital cameras, etc.), it wouldn’t be a symmetrical effort.


Education and enterprise still have a need for a lot of group-managed laptops. Not all of them will be power users, either. Some of them won’t even have sophisticated IT departments (thinking about elementary schools and the like where their IT needs might not run very high).
I agree that we’re probably seeing the waning days of the casual laptop user who administers their own system as an independent device. Everyone will either be further up the enthusiast/power user ladder or will have switched to phones and tablets.


I’m glad the MacBook Neo is only 8GB. That means they have to support it as a usable low-end target.
This is huge. Apple has traditionally supported its laptops for at least 5 major OS versions and 2 more years of security updates, so they’re essentially telling us that the macOS version they release in 2034 will not require more than 8GB of RAM to function. That’s a good thing for all users, most of whom will presumably have much more memory available.
From the user perspective, enterprise managed Windows is locked down, too, and somehow less reliable.
Most of the software engineers I know in FAANG and similar tier companies use MacBooks to program. Poke around a coffee shop in the Bay Area during a weekday and look around.
And personally, I switched to Mac about 15 years ago mainly because dependency management and the shell made more sense to me coming from Linux. Windows has always been trash, and most other non-Apple OEMs make the actual physical laptop experience worse (hinges, behavior on closing the lid, trackpad behavior and size, power management, display quality in both brightness and pixel density, webcam/audio behavior).