My friend is a full stack programmer with over 15 years’ experience at one of the largest financial institutions, so he can handle what you’re talking about no problem. But what IS a huge problem is that the reason he has the requisite knowledge now is that he spent years learning best practices by doing the grunt work that’s going to disappear. So in a few years they might no longer have people with the skills to do things right, and then what you’re describing will absolutely happen and build quality will go to hell. The assumption from big tech is that by then the models will have improved enough that it won’t matter.
That’s a hell of an assumption. Since we’re whipping out credentials, I’ve been in IT almost 30 years and I can tell you it’s not going to work like that.
I’m not the person you were replying to, but I’ve also been in tech since 1996, and lots of things have worked just like that. All successful technology starts off barely functional and improves over time until nearly all members of its intended audience can successfully use it.
As an example, in 1996 setting up a router was a specialty task that required training; by 2016 any moron could buy one off the shelf and have it running in an hour. As another example, basic HTML was a specialty skill in 1996, but by 2003 you could do it with Microsoft Word. Smartphones are another example: they went from barely functional Windows Mobile and Blackberry devices, which required ridiculous amounts of back-end skill just to deliver email, to iPhones and Androids that any numskull can use for nearly anything at all.
My point is this: too many people are stuck on the “What use is a newborn baby?” question without realizing that the infant is growing up at blinding speed. It’s also the first technology to carry the promise, real or not, of self-improvement once it reaches sufficient maturity. Assuming that happens, all further improvement will be increasingly automatic and will happen even faster.
AI isn’t going away and it’s only going to get better as time goes on.
Then you’re not dealing with cutting edge tech. Living in the past isn’t going to help you.
Thank you for assuming what I do or don’t do, or what I’m plugged into or not.
I can see, in programming, how the current AI trend is displacing a lot of junior programmers who won’t be senior programmers in 10 years because they never get the chance to build that experience.
AI hasn’t come for DevOps or SysAdmin jobs either, but it’s ‘good enough’ to do help-desk/tier-1-type tasks. That limits the job pool for new IT workers and will create a future shortage of experienced workers.
I’m not worried about MY job; I’ve already accumulated the experience. It’s the new guys trying to get into support positions, where they’re glorified knowledge-base/Google searchers, who are having a hard time, because AI CAN do search and summarization/RAG pretty effectively.
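For anyone who hasn’t watched that loop up close, here’s roughly what the search + summarization/RAG part looks like. This is a toy sketch only: the three-article knowledge base is made up, the keyword retrieval is deliberately naive, and llm_summarize() is a stand-in for whatever model or API a real help-desk tool would actually call, not any particular vendor’s product.

```python
# Toy sketch of the tier-1 "knowledge base searcher" loop as retrieval + summarization.
# The KB articles are invented and llm_summarize() is a placeholder for a real model call.

import re

KNOWLEDGE_BASE = {
    "KB-101": "Printer offline: power-cycle the printer, then re-add it from Settings > Devices.",
    "KB-204": "VPN will not connect: check credentials, confirm the MFA token, restart the VPN client.",
    "KB-317": "Outlook not syncing: clear the OST cache and restart Outlook in safe mode.",
}

def tokenize(text: str) -> set[str]:
    """Lowercased word set; real systems would use embeddings, not keyword overlap."""
    return set(re.findall(r"\w+", text.lower()))

def retrieve(ticket: str, k: int = 2) -> list[tuple[str, str]]:
    """Rank KB articles by keyword overlap with the ticket text and keep the top k."""
    query = tokenize(ticket)
    scored = sorted(
        ((len(query & tokenize(text)), doc_id, text) for doc_id, text in KNOWLEDGE_BASE.items()),
        reverse=True,
    )
    return [(doc_id, text) for score, doc_id, text in scored[:k] if score > 0]

def llm_summarize(ticket: str, context: list[tuple[str, str]]) -> str:
    """Placeholder: a real tool would send the ticket plus the retrieved articles to an LLM."""
    if not context:
        return "No KB match; escalate to tier 2."
    cited = ", ".join(doc_id for doc_id, _ in context)
    return f"Suggested steps (from {cited}): {context[0][1]}"

if __name__ == "__main__":
    ticket = "User says the VPN client will not connect after a password change"
    print(llm_summarize(ticket, retrieve(ticket)))
```

The point isn’t the thirty lines of Python; it’s that the “search the knowledge base, summarize the fix” part of the job is exactly the part a model does cheaply, which is why the entry-level rungs get squeezed first.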
Bingo!