

Lol confirmed both idiot and troll, thanks :)
All right, we are done here. I’ve tried to engage with you in a fair and honest way, giving you the benefit of the doubt and trying to respond to the points you were trying to make.
But it appears you are just a troll or an idiot. Either way, I’m done.
Well, maybe one person is a little more impressed by some pretty pictures than another person. I really don’t see what that has to do with a company like Microsoft putting its money into this. They don’t make songs or movie trailers.
To me it’s stunning, but that’s just me. On top of that, we’re only about five years into AI going mainstream. Where will it be in 10 years? 20 years?
This is a common trap a lot of people fall into: look at the improvements made over the last couple of years, and who knows where it will end up, right? Unfortunately, reality doesn’t work like that. Improvements made in the past don’t guarantee improvements will continue in the future. There are ceilings that are hard to break through, and there can even be hard limits that are impossible to break. There can also be good reasons not to develop a promising technology any further. There is no such thing as infinite growth.
Edit:
Just checked out that song, man that song is shit…
“My job vanished without lift.” What does that even mean? That’s not even English.
And that’s just one of the dozens of issues I’ve seen in 30 secs. You are kidding yourself if you think this is the future, that’s one shit future bro.
What’s your point?
Sure, that’s the point of venture capital: throwing money at the wall and seeing what sticks. You expect most of them to fail, but the one good one makes up for it.
However, in this case it isn’t people throwing some money at startups. It’s large companies like Microsoft throwing trillions into this new tech. And it’s not just one company looking for a little niche to fill; all of them are all in, flooding the market with random shit.
Uber and Spotify are maybe not the best examples to use, although they are examples of people throwing away money in hopes of some sort of payoff (even though both made a small profit recently, it’s nowhere near enough to dig themselves out of the hole). They are, however, problematic in the way they operate. Uber’s whole deal is exploiting workers, turning employees into contractors just to squeeze them, and for the most part skirting the regulations around taxis. They have been found to be illegal in a lot of civilised countries and had to change the way they do business there, limit their services, or not operate in those countries at all. Spotify is music, and the music industry is a whole thing I won’t get into.
The current AI bubble isn’t comparable to venture capital investing in some startups. It’s more comparable to the dotcom bubble, where the whole industry is perceived to be moving in a certain direction: either companies invest heavily and get with the times, or they die. And smart investors put their money into anything with the new tech, since that’s where the money is going to be made. Back then the new tech was the internet; now the new tech is AI. We found out the hard way that it was total BS. The internet wasn’t the infinite money glitch people thought it was, and we all paid the price.
However, the scale of that bubble was small compared to this new AI bubble. And the internet was absolutely a transformative technology, changing the way we work and live forever. It’s too early to say if this LLM-based “AI” technology will do the same, but I doubt it. The amount of BS thrown around these days is too high. As someone with a reasonably good grasp of how LLMs actually work on a fundamental level, I can say the promises being made aren’t backed up by facts. And the amount of money being put into this isn’t anywhere near justified by even the most optimistic future payoff.
If you want a simple, oversimplified comparison: this AI boom is more like people throwing money at Theranos than anything else.
They’re not simply operating at a loss; they’re absolutely dumping their prices, giving away their products for almost nothing to gain market share. They are burning money at an impressive rate, all for some imaginary payoff in the future.
Ah yes that moment when the email comes in saying a game from your wishlist is on sale. You’ve been waiting for two years for the $39.99 game to drop in price. Damn, $7.99, still too much. I guess I’ll wait a little bit longer till it drops further and I can afford it.
And that one game that’s been $24.99 for as long as you’ve wanted it, staring you in the face. Maybe someday, maybe…
There is another factor in this which often gets overlooked. A LOT of the money invested right now goes to Nvidia chips and products built around them. As many gamers are painfully aware, these chips devalue very quickly. With technology moving so fast, what was once a top-of-the-line unit gets outclassed by mid-tier hardware within a couple of years. After 5 years their usefulness is severely diminished, and after 10 years they’re hardly worth the energy to run.
This means the window for return on investment is a lot shorter than usual in tech. For example, when creating a software service, there would be an upfront investment to buy the startup that created the software, then some scaling investment in infrastructure and such. But after that it turns into a steady state where the money going in is a lot lower than the revenue from the customer base that was grown. That allows returns on the investment for many years after the initial investment and growth phase.
With this AI shit it works a bit differently. If you want to train and run the latest models in order to remain competitive in the market, you need to keep buying the latest hardware from Nvidia. As soon as you start running on older hardware, your product gets left behind, and with all the competition out there, users would be lost very quickly. It’s very hard to see how the trillions of dollars invested now are ever going to be recovered within the span of five years, especially at a time when so many companies are dumping their products for very low prices and sometimes even for free.
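To make the payback-window point concrete, here’s a rough back-of-the-envelope sketch in Python. All the numbers (upfront cost, yearly hardware refresh, yearly profit) are made up purely for illustration; the point is the shape of the comparison, not the figures.

```python
# Rough, illustrative comparison of payback profiles.
# All numbers below are hypothetical, picked only to show the shape of the argument.

def payback_years(upfront, yearly_capex, yearly_profit, horizon=10):
    """Return the first year in which cumulative profit covers cumulative spend, or None."""
    spend, earned = upfront, 0.0
    for year in range(1, horizon + 1):
        spend += yearly_capex
        earned += yearly_profit
        if earned >= spend:
            return year
    return None

# Classic software service: big one-off buy, small ongoing costs, steady margin.
classic = payback_years(upfront=100, yearly_capex=5, yearly_profit=30)

# AI service: same revenue, but hardware has to be re-bought to stay competitive.
ai = payback_years(upfront=100, yearly_capex=60, yearly_profit=30)

print(f"classic service pays back in year: {classic}")  # pays back within a few years
print(f"AI service pays back in year:      {ai}")       # None within the horizon
```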
This bubble has to burst and it is going to be bad. For the people who were around when the dotcom bubble burst, this is going to be much worse than that ever was.
An LLM cannot be anything other than a bullshit machine. It just guesses what the next word is likely to be. And because it’s trained on source data that contains truths as well as untruths, by chance what comes out is sometimes true. But it doesn’t “know” what is true and what isn’t.
No matter what they try to do, this won’t change. It’s one of the main reasons the LLM path will never lead to AGI, although parts of what makes up an LLM could possibly be used inside something that does get to the AGI level.
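To illustrate what “just guessing the next word” means, here’s a toy sketch in Python. It is nothing like a real LLM internally (those use neural networks over tokens, not word counts), but the core idea is the same: sample a statistically likely continuation, with no notion of whether the result is true.

```python
import random
from collections import defaultdict

# Toy next-word model: count which word follows which in the training text,
# then sample continuations proportionally to those counts.

training_text = (
    "the moon is made of rock . the moon is made of cheese . "
    "the moon orbits the earth ."
)

counts = defaultdict(lambda: defaultdict(int))
words = training_text.split()
for current, nxt in zip(words, words[1:]):
    counts[current][nxt] += 1

def next_word(word):
    options = counts[word]
    return random.choices(list(options), weights=options.values())[0]

word, out = "the", ["the"]
for _ in range(6):
    word = next_word(word)
    out.append(word)

# May claim the moon is made of rock, or of cheese -- the model can't tell which is true,
# it only knows which continuations were statistically likely in its training data.
print(" ".join(out))
```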
I don’t know, most experimental technologies aren’t allowed to be tested in public until they are well and truly ready. This whole move-fast-break-often thing seems like a REALLY bad idea for something like cars on public roads.
And the war on drugs was extremely effective and didn’t cause any bad things at all!
Unfortunately, even when you are in the hospital when this happens and everyone around you is aware of what’s happening fast enough to act, it’s probably still fatal. Often it happens deep inside the brain, and there is no way to get someone into brain surgery fast enough. And even if the doctors somehow can get in there, often there is nothing to be done. If it’s deep in the brain, there is no good way of getting in without causing a lot of damage, and depending on the exact situation it can’t even be fixed.
It’s just one of those really sad things that happens without anybody being able to do something about it.
This is unfortunately common in my family, and I’ve had family members eating themselves up over it: if only they had acted faster and got them to the hospital sooner. But everyone on the hospital side was very clear about this: there is nothing anyone could have done.
What’s the plan Dutch? We’re going to kick this kid in the ribs! What the hell kind of plan is that? Damnit Arthur, you gotta have faith!
Buffy <3
Yes, I love those. The rainbow candy especially is excellent!
It’s good to see more and more vegan candy being sold. These days I’ve even come to expect it. When it isn’t advertised on the packaging I check the ingredients and more often than not I see they are in fact vegan.
I think they meant people don’t know how these models work in practice. On a theoretical level they are well understood, but in practice they behave chaotically (chaotic in the mathematical sense of the word): a small change in the input can lead to wild swings in the output. So when people want to change the way the model acts by changing the system prompt, it’s basically impossible to say in advance which change will achieve the desired outcome. Often such a change doesn’t even exist, and only something close enough is possible. So they have to resort to trial and error, tweaking things like the system prompt and seeing what happens.
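To illustrate the trial-and-error part, here’s a rough Python sketch. The model call and the acceptance check are dummy stand-ins (a real setup would call an actual LLM API and use its own evaluation), but the shape of the loop is the point: you can only try prompt variants and look at what comes out, you can’t derive the “right” prompt analytically.

```python
import random

# Trial-and-error over system prompts: run each candidate against a small test set
# and count how many outputs look acceptable. query_model is a dummy stand-in here,
# just so the sketch runs; swap in whatever LLM API you actually use.

CANDIDATE_SYSTEM_PROMPTS = [
    "You are a helpful assistant. Keep answers short.",
    "You are a helpful assistant. Keep answers short and never speculate.",
    "You are a concise assistant. Refuse to answer if you are not sure.",
]

TEST_QUESTIONS = [
    "Who won the 1987 Tour de France?",
    "Summarise this ticket in one sentence: the printer on floor 3 is jammed.",
]

def query_model(system_prompt: str, question: str) -> str:
    """Dummy stand-in for a real LLM call; output shifts wildly with any input change."""
    rng = random.Random(hash((system_prompt, question)))
    length = rng.randint(3, 30)
    return " ".join(rng.choice(["yes", "no", "maybe", "blah"]) for _ in range(length))

def looks_acceptable(answer: str) -> bool:
    """Stand-in evaluation: in practice, length limits, banned phrases, spot checks..."""
    return len(answer.split()) <= 15

for prompt in CANDIDATE_SYSTEM_PROMPTS:
    scores = [looks_acceptable(query_model(prompt, q)) for q in TEST_QUESTIONS]
    print(f"{sum(scores)}/{len(scores)} acceptable for: {prompt!r}")
```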
I humbly reserve the right to give you a hug, if you’d consent, for all you did for us. Also I would like your permission to kick you in the balls for this “announcement”. If your biology doesn’t allow for that, don’t worry, I’ve got a magic spell that can give you balls just long enough for me to kick them.
Just kidding, love you <3
I feel it’s a vaguely Australian term?