

It’s odd, since they used to have a rather nice HTML web interface specifically for low-performance devices, but it’s since gone away.


This doesn’t seem so bad, though. An extra 2 GB over about 10 years is a pretty reasonable increase.
It’s not like they doubled it.


A little confused, is this basically the same thing as Open Street Maps, just in app form?


Is that not on Krafton for buying Unknown Worlds for $500 million, and then offering an additional $250 million if they achieve particular goals?
If those goals were unrealistic, then Krafton shouldn’t have bought the company for that much, or written a contract with those terms.
From Unknown Worlds’ perspective, it would have been irresponsible not to take the deal, assuming no other conditions.
That Krafton’s CEO got buyer’s remorse isn’t their problem to deal with. Caveat emptor and all that.


To be fair, $500 million is a lot of money.
You can hardly blame them for not wanting to turn that down.
Should it pan out as planned, they’d get another quarter of a billion. That’s money enough that if you’re halfway sensible with it, you and your descendants would never have to work again.
Even when evenly divided across the entire company, it’s still a life-changing amount ($1.6 to $2.3 million per person).
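The quoted per-person range implies a headcount of very roughly 110 to 155 people. A quick sanity check (the headcount figures here are my own back-of-envelope assumption, not from the article):

```python
# The $250M earn-out pool is from the thread above; the headcounts
# below are assumed values chosen to bracket the quoted per-person range.
bonus_pool = 250_000_000

for headcount in (109, 156):
    per_person = bonus_pool / headcount
    print(f"{headcount} people -> ${per_person:,.0f} each")
```

With 156 people that works out to about $1.6 million each, and with 109 people about $2.3 million each, which matches the range quoted above.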


It’s also quite unexpected, given that it’s Apple, and they’ve traditionally made more expensive machines with worse hardware. In my country, for example, it is nearly unheard of for a new Apple computer to cost less than four digits in local currency (US$800 or more).
Particularly at a time when it’s more typical to hear of new computer prices going up instead, due to shortages.


A projector might be an option, but projectors have their own problems, such as contrast that isn’t great.


Would it not make sense for them to? Since they make budget televisions, they have to subsidise the cost somehow.
Either that, or because they’re so budget, you’d expect them to cheap out on the electronics and not bother with anything that sophisticated compared to a bare-minimum chip.


It’s also pretty important infrastructure. Even before AI, one of the major providers’ datacentres going down would take out a solid chunk of the modern internet.


All he made was some dinky algorithm. Google Bard could do that in three minutes flat smh.


I do wish that more games still had cheats. It does feel like a lot of newer games have foregone them entirely. You can’t type “plane” into GTA V and have a plane materialise like you could in Vice City, for example.
You’d need to mod it in.


It might also be groundwork for more complicated things on their GPUs.
The article says nothing about nVidia actually planning to enter the desktop CPU market, only that a bunch of unrelated analysts compared the CPU performance, and said it was about equal to what’s on the market.


Quite surprised that they are pushing that, seeing as one of the biggest obstacles to Windows 11 adoption was that a lot of existing hardware didn’t meet the TPM requirements it put in place.
Doing it again so soon seems like a recipe to make people not want to use 12 at all. After all, Windows 11 works fine for them, why change so soon?


Right, but the volume was the issue. The cURL team could only work through and verify reports so quickly, so the deluge of bug reports made it impractical for them to dedicate the time to sort through it all. The idea behind getting rid of the bug bounty is that there would be less of an incentive to generate and submit a bogus bug report in the first place.
If it had just been a small handful of fake security reports, they wouldn’t have minded nearly as much.
It was the volume that was the real problem with the bug bounty programme: they were flooded, and recognising a bogus report is all well and good, but not if there’s no good way to filter them out without massive collateral.


It does make them harder to find, because the phrasing is similar but not identical, due to the randomness.
Whereas before, you could probably filter a good chunk of it out by just finding the same message/keywords and filtering by that.
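A toy sketch of the sort of identical-phrasing filter described above (the phrases and reports here are made up for illustration; real triage would be far more involved):

```python
# Phrases previously seen in known-bogus submissions (hypothetical examples).
BOGUS_PHRASES = [
    "critical buffer overflow in curl",
    "as an ai language model",
]

def looks_like_known_spam(report: str) -> bool:
    """Flag a report if it contains any phrase seen in known-bogus ones."""
    text = report.lower()
    return any(phrase in text for phrase in BOGUS_PHRASES)

reports = [
    "Critical buffer overflow in curl when parsing URLs",
    "Off-by-one in cookie jar expiry handling",
]
flagged = [r for r in reports if looks_like_known_spam(r)]
```

This works when the spam repeats the same message verbatim; once an LLM paraphrases each submission differently, simple substring matching like this stops catching much of it, which is the point being made above.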


Is it reasonable for them to keep their own local snapshots?
That’s not a trivial amount of work and data, particularly if it’s multimedia.


It is an online poll. You also have to consider that some people don’t care/want to be funny, and so either choose randomly, or choose the most nonsensical answer.
According to the article linked in the article, it’s not that the operating system itself is more demanding, but that the DE and browsers/websites are more demanding now.
It feels like Canonical basically needs to do what games do: a set of minimum specs for Ubuntu to run at all, and recommended specs for Ubuntu to run well. Canonical basically bumped up the latter, but it’s being taken as the former.