

It’s open source, so I don’t know what the scam is exactly…
It’s just a less than ideal piece of software that only runs on iOS, but I assume somebody will address that.
This should be an arrestable offense. Fuck these pieces of shit.
The docs I linked to LITERALLY explain what it is, how it works, and the mechanics behind it. Not until DLSS 4, released earlier this year, did the docs mention using models for anything at all.
I mean…I guess thanks for the stepping-off point? Android has the Briar Project, which couldn’t be distributed for iOS due to Apple’s license fuckery. I’m at least curious enough to look through this and see what they’ve done differently.
I think the most useless part of this is using BT only, which has a range of what…40ft?
You didn’t read up on it huh?
Hilarious.
I 100% know what DLSS is, though by the sounds of it you don’t. It is “AI” as much as any other thing is “AI”. It uses models to “learn” what it needs to reconstruct and how to reconstruct it.
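To make that concrete: here’s a toy PyTorch sketch of what “uses models to reconstruct” looks like in general. The layer sizes and names are made up (this is NOT Nvidia’s actual network), but the general shape is a small convnet that takes the low-res frame, motion vectors, and the previous output, and produces a higher-res frame.

```python
# Toy learned temporal upscaler -- illustrative only, with invented layer
# sizes. Not Nvidia's DLSS network, just the general shape of the idea.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyTemporalUpscaler(nn.Module):
    def __init__(self, scale=2):
        super().__init__()
        # Inputs: current low-res frame (3ch), motion vectors (2ch),
        # and the previous high-res output resampled to input size (3ch).
        self.net = nn.Sequential(
            nn.Conv2d(3 + 2 + 3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3 * scale * scale, 3, padding=1),
            nn.PixelShuffle(scale),  # rearranges channels into a higher-res image
        )

    def forward(self, lo_res, motion, prev_hi_res):
        # Bring the previous output down to the input resolution so it can be
        # concatenated as an extra "history" input for the model.
        prev = F.interpolate(prev_hi_res, size=lo_res.shape[-2:],
                             mode="bilinear", align_corners=False)
        return self.net(torch.cat([lo_res, motion, prev], dim=1))

# Reconstruct a 1920x1080 frame from a 960x540 internal render.
model = ToyTemporalUpscaler(scale=2)
lo_res = torch.rand(1, 3, 540, 960)
motion = torch.rand(1, 2, 540, 960)
prev = torch.rand(1, 3, 1080, 1920)
print(model(lo_res, motion, prev).shape)  # torch.Size([1, 3, 1080, 1920])
```

The point isn’t the specific layers; it’s that the reconstruction is done by a trained model fed the current render plus temporal data, not by a fixed filter.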
No, you don’t. https://en.m.wikipedia.org/wiki/Deep_Learning_Super_Sampling
This is blatantly and monumentally wrong lol. You think it’s literally rendering a dozen frames and then just picking the best one to show you out of them? Wow. Just wow lol.
Literally in the docs: https://raw.githubusercontent.com/NVIDIA/DLSS/main/doc/DLSS_Programming_Guide_Release.pdf
What it does is allow you to run a game at higher settings than you could usually at a given framerate, with little to no loss of image quality. Where you could previously only run a game at 20fps at 1080p Ultra settings, you can now run it at 30fps at “1080p” Ultra, whereas to hit 30fps otherwise you might have to drop everything to Low settings.
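For a rough sense of the math (the per-axis scale factors here are the commonly cited DLSS preset ratios, so treat them as approximate): the GPU shades only a fraction of the output pixels internally and the upscaler reconstructs the rest.

```python
# Back-of-the-envelope: internal render resolution vs. output resolution for a
# 1080p target. Scale factors are the commonly cited preset ratios (approximate).
presets = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50}
out_w, out_h = 1920, 1080

for name, s in presets.items():
    in_w, in_h = round(out_w * s), round(out_h * s)
    ratio = (in_w * in_h) / (out_w * out_h)
    print(f"{name:>11}: renders {in_w}x{in_h} internally "
          f"({ratio:.0%} of the output pixels), upscales to {out_w}x{out_h}")
```

Shading isn’t the only per-frame cost, so the FPS gain is smaller than the pixel-count ratio alone suggests, but that’s where the 20fps → 30fps headroom in the example above comes from.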
No it doesn’t. It allows you to run a game at a higher resolution for no reason at all, instead of dropping to a lower resolution that your card can handle natively. That’s it.
Keep claiming otherwise, and you’re just literally denying reality and the Nvidia link to the docs right in front of you.
Like I said…you don’t know what DLSS is, or how it works. It’s not using “AI”, that’s just marketing bullshit. Apparently it works on some people 😂
You can find tons of info on this (which is why I told you to search it up), but it uses rendering tables, inference sorting, and pattern recognition, along with tricks video formats have used for ages, to render scenes at a higher resolution cheaply from the GPU’s point of view. You render a scene a dozen times up front, then it regurgitates those renders from memory if they come up again before being ejected from the card’s cache. It doesn’t upsample, it doesn’t intelligently render anything new, and there is no additive anything. It seems you think it’s magic, but it’s just fast sorting and memory tricks.
Why you think it makes games better is subjective, but it solely works to run games with the same details at a higher resolution. It doesn’t improve rendered scenes whatsoever. It’s literally the same thing as lowering your resolution and increasing texture compression (same effect on cached rendered scenes), since you bring it up. The effect on the user is a higher FPS at a higher resolution, which you could achieve by just lowering your resolution. It absolutely does not make an otherwise unplayable game playable by adding details and texture definition, as you seem to be claiming.
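For what it’s worth, here’s a toy sketch of the “render once, replay from cache until ejected” scheme described above, with hypothetical names and keys. It’s only an illustration of that description, not anything taken from Nvidia’s docs.

```python
# Toy LRU frame cache illustrating the "replay rendered scenes from memory"
# idea described above. Hypothetical structure, for illustration only.
from collections import OrderedDict

class FrameCache:
    def __init__(self, capacity=12):
        self.capacity = capacity
        self._frames = OrderedDict()  # scene_key -> rendered frame

    def get_or_render(self, scene_key, render_fn):
        if scene_key in self._frames:
            # Scene seen recently: replay the cached render instead of redrawing.
            self._frames.move_to_end(scene_key)
            return self._frames[scene_key]
        frame = render_fn()  # expensive full render
        self._frames[scene_key] = frame
        if len(self._frames) > self.capacity:
            self._frames.popitem(last=False)  # eject the oldest cached render
        return frame

cache = FrameCache()
frame = cache.get_or_render(("camera_pose_42", "ultra"), lambda: "rendered pixels")
```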
Go read up.
And it mentioned nothing…
Continued fascist bullshit.
Are you Tucker Carlson by chance? Would LOVE to pick your brain about how you switch sides so fluidly when it’s more monetarily beneficial to you!
The exact thing a fascist would say.
Get fucked.
We really need to think about banning certain people from the Fediverse.
Get fucked, you Fascist-kink clown.
I will kink shame for this.
Low rent comment.
Second: you apparently are unaware, so just search up the phrase, but as this article very clearly explains…it’s shit. It’s not innovative, interesting, or improving performance; it’s a marketing scam. Games would run better and more efficiently if you just lowered the requirements. It’s like saying you want food to taste better, but then they serve you a vegan version of it. AMD’s version is technically more useful, but it’s still a dumb trick.
Stock isn’t money in the bank.
See the title of this very post you’re responding to. No, I’m not OP lolz
No. AMD. See my other comments in this thread. Though they are in every major gaming console, the bulk of AMD sales are aimed at the datacenter.
First, DLSS is supported on Linux.
Second, DLSS is kinda bullshit. The article goes into details that are fairly accurate.
Lastly, AMD is at feature parity with Nvidia. You can see my other comments, but AMD’s goal isn’t selling cards for gamers. Especially ones that require an entire dedicated PSU to power them.
Actually…not true. Nvidia recently became bigger in the DC because of their terrible inference cards being bought up, but AMD overtook Intel on chips with all major cloud platforms last year, and their Xilinx chips are slowly overtaking the sales of regular CPUs for special-purpose processing. By the end of this year, I bet AMD will be the most deployed brand in datacenters globally. FPGA is the only path forward in the architecture world at this point for speed and efficiency in single-purpose processing. Nvidia doesn’t have a competing product.
Because they choose not to go full idiot, though. They could make their top-line cards compete if they slammed enough into a pipeline and required a dedicated PSU, but that’s not where their product line is headed. That’s why it’s smart.
For reference: AMD has the most deployed GPUs on the planet as of right now. There’s a reason it’s in every gaming console except the Switch 1/2, and why OpenAI just partnered with them for chips. The goal shouldn’t just be making a product that churns out results at the cost of everything else, but being cost-effective and efficient. Nvidia fails at that on every level.
AMD is at least running the smart game on their hardware releases with generational leaps instead of just jacking up power requirements and clock speeds as Nvidia does. Hell, even Nvidia’s latest lines of Jetson are just recooked versions from years ago.
I’m still not sure what you’re referring to.
Jack Dorsey is one of the original Twitter guys, started Square, launched Bluesky…etc.
The only company he’s been involved in that deals with money is Square, so I’m not sure where the “crypto scam” is?