You can get loads of frames per second with cloud gaming, just not necessarily from the right second.


Password managers are supposed to be designed to withstand compromise: they're only ever meant to see a mysterious blob of encrypted data, without ever having access to any information that would help decrypt it. The headline's more like "M1 Abrams Tanks Vulnerable to Small Arms Fire": it'd be totally expected for most things to die when shot with bullets, but the point of a tank is that it doesn't, so it's a big deal if it does.


If it’s the problem that I’ve seen people complain about in the past, it’s effectively the same as HTTPS ‘not supporting’ end-to-end encryption because it runs over IP, and IP packets contain the IP address of where they need to go. Someone can see that two IP addresses are communicating, but that’s unavoidable: otherwise there’s nothing to say where the data needs to go, so no way for it to get there. Someone wrote a blog post a couple of years ago claiming Matrix was insecure because encrypted messages had their destination homeserver in plaintext, but that doesn’t carry any information that isn’t already implied by the fact that the message is being sent to that homeserver’s IP.


I reckon it depends on how warm someone’s home is and how good their circulation is. If I don’t have shoes on indoors, then for half the year it feels like my feet have been stabbed because they get so cold (slippers are not enough), but I don’t wear the same shoes indoors as outdoors. I suspect that if we set the heating higher and the house wasn’t constructed in a way that makes the floor always much colder than a few inches above the floor, this wouldn’t be a problem.


Investors managed to pour billions into inflating the metaverse bubble, even though that was just video games being invented a second time by people so uninterested in them that they hadn’t noticed they’d already been around for decades. There’s no reason to think that investors know what video games are beyond something on a computer, so obviously they’d see something else on the computer as a viable competitor.
With energy prices in the UK being what they are, it’s only raw potatoes that are cheaper than bread. At least toast toasts quickly, so isn’t that energy-intensive compared with boiling a pan of water.
I dug up a manual for the Windows 3.1 SDK, and it turns out that it had the same GetVersion function with the same return value as the Windows 2000 SDK; it’s just that the live MSDN docs pretend that Windows 2000 was the first version of Windows, so they show it as the earliest version for every function that came from an older version of Windows. http://bitsavers.informatik.uni-stuttgart.de/pdf/microsoft/windows_3.1/Microsoft_Windows_3.1_SDK_1992/PC28914-0492_Windows_3.1_SDK_Getting_Started_199204.pdf page 31.
I then looked at a manual for the Windows 1.03 SDK, and it, too, has a matching GetVersion function.
The only change to GetVersion over the entire history of Windows is that at some point it switched from returning a 16-bit value, with eight bits for the major version and eight bits for the minor version, to a 32-bit value with bits split between the major version, the minor version, and the build number; later on, GetVersionEx was added to return those numbers as members of a struct instead. There has never been a version of Windows where string comparisons of the display name were appropriate or recommended by Microsoft.
If you’re checking for Windows 9 in order to disable features, which is what the jump straight to ten was supposed to protect against (when running a 16-bit binary built for 3.1/95 on 32-bit Windows 10, it lies and says it’s Windows 98), then you’re using at least the Windows 2000 SDK, whose GetVersion includes the build and revision numbers in its return value, and the revision number was increased over 7000 times by updates to Windows 2000.
There was a function that would give you a monotonically-increasing build number that you could compare against the build that any given feature was added in that people should have used, but there was also a function that gave you the name of the OS, and lots of people just checked if that contained a 9. The documentation explicitly said not to do that because it might stop working, but the documentation has never stopped people using the wrong function.
And the context was a sentence that was correct if you used OED sense 1, or MW sense 1, but you decided to parse it as MW sense 2b and then complain that the sentence was incorrect.
OED:
1. totally or partially resistant to a particular infectious disease or pathogen.
2. protected or exempt, especially from an obligation or the effects of something.
Merriam-Webster:
1: not susceptible or responsive; especially: having a high degree of resistance to a disease
2a: produced by, involved in, or concerned with immunity or an immune response
2b: having or producing antibodies or lymphocytes capable of reacting with a specific antigen
3a: marked by protection
3b: free, exempt
So unless you pretend that MW’s 2b sense is the only valid one, the immunity is immunity.
If you have a sample of HIV at 37°C in blood, but with all the immune cells removed, it’ll still all become inert after around a week simply due to chemical reactions with other components of blood etc… It’s pretty comparable to a population of animals - if you take away their ability to reproduce, they’ll die of old age when left for long enough even if you’re not actively killing them.
Edit: fat-fingered the save button while previewing the formatting
Even if you ignore that there’s an entirely valid sense of the word immune that has nothing to do with biology (i.e. the one in phrases like diplomatic immunity), my original comment is entirely consistent with the dictionary definition of the biological sense of the word. There are probably sub-fields of biology where immunity is used as jargon for something much more specific than the dictionary definition, but this is lemmyshitpost, not a peer-reviewed domain-specific publication.
When a normal person is exposed to HIV, it reproduces inside of them, so can then go on to expose more people, and if there’s enough of it, infect them in turn (if there’s a smaller amount, their immune system will normally be able to clean it up before it gets enough of a foothold). If someone’s lacking the receptor, then no matter how much they were exposed to, their immune system will eventually manage to remove it all without becoming infected because it can’t reproduce. If they had a ludicrously large viral load, then there’s a possibility that it could be passed on before it was destroyed, but most of the ways people get exposed to HIV aren’t enough to infect someone who’s vulnerable, let alone infect someone else via secondary exposure if there’s not been time for the infection to grow.
People without the receptor that HIV targets are immune to HIV because of that, like how a rock is immune to verbal abuse or double foot amputees are immune to ingrown toenails. The immune system being able to kill something isn’t the only way things can be immune to other things.
That tests the AIDS immunity, but not whether there are off-target edits. IIRC, the mothers were all HIV-positive, so the children are all pretty likely to be exposed anyway, which was part of how he justified the experiment to himself.
If he got incredibly lucky, they’re immune to AIDS. It’s much more likely that they’re not and will develop symptoms of new and exciting genetic disorders never seen before.
The biggest problem was that the technique used is really unreliable, so you’d expect off-target edits to be more common than on-target ones for a human-sized genome. For bacteria, you can get around it by letting the modified bacteria reproduce for a few generations, then testing most of them. If they’re all good, then it worked, and if any aren’t, you need to make a new batch. Testing DNA destroys the cells you’re testing, so if you test enough cells in a human embryo to be sure that the edits worked, it dies. You can’t just start when the embryo is a single cell to ensure that the whole thing’s been edited in the same way as you need to test something pre-edit to be able to detect off-target edits.


To be fair, if I had all that money, I’d probably just pay someone to figure out how to make it do the most good, and continue spending at least some of my time shitposting. It’s okay to have hobbies, but it’s bad to hoard the money or invest it in evil.
I’ve found this is really dependent on placement. If I put my Libre a couple of centimeters away from the region I usually use, it’ll read low all night, but as long as I stick to the zone I’ve determined to be fine, it’ll agree with a blood test even if I’ve had pressure on it for ages. Also, the 3 is more forgiving than the 1 or 2 because it’s smaller than the older models, so it bends and squishes the skin less.
CV padding and main character syndrome.