For years, we heard about the tech talent shortage: that there was a glut of jobs and not enough bodies to fill them.
The first sentence of the article shows the problem.
The problem was never “bodies,” though people have always misunderstood that. It’s qualified workers.
I worked in tech for a long time, at a bunch of different companies, and I never once worked anywhere that there was a glut of jobs and “not enough bodies” to fill them.
The people going into these careers include a large number who want the money but aren’t qualified to do what we’re looking for.
It’s more than that; companies also continuously propagate the message of a “shortage of workers” while raising the requirements for entry-level positions higher and higher. It reaches the point where “entry level” is unattainable for most fresh grads trying to get experience, and the resulting oversupply keeps starting wages (and ongoing wages) depressed.
It’s a very targeted campaign to make sure educated workers are oversupplied, tied down with student debt, and don’t get too many ideas of independence in their heads.
It’s a bit more nuanced than that. A lot of college grads I’ve interviewed come out expecting to be senior level when they don’t even have a basic foundation in IT. Don’t expect to get paid six figures right out of college when you have zero experience and can’t even provide basic answers to questions that help desk people know. Colleges have lied to them that we (the IT industry) need them and that they’re special. Show me you have the foundation before telling me how the industry works.
can’t even provide basic answers to questions that help desk people know
University is not a job training program, though. A degree demonstrates that you have the skills to figure things out, not that you already have everything figured out. Even with decades of experience, it takes me a bit of time to spin up on a new library, framework, or programming language.
Companies are supposed to provide this training, not just to new hires, but to all employees. It does take a little extra time to teach new hires, but their salaries are also lower so it should balance out. And if they want to keep those employees around, then they should give them generous pay increases so they don’t just jump for a salary increase.
Sorry, but a degree just demonstrates that you can pass exams and follow rules. Almost all the new graduates I knew had a big ego and a lack of critical thinking, which combined into a massive Dunning-Kruger effect. They’re better middle-management material than engineers. They can’t even RTFM, like c’mon. And AI is just making all of this worse.
I don’t expect you to know everything, but while you’re in college you can still learn AD, spin up a server, and make a domain. See the basics of a web server, see how HFWs work… the foundation of IT. Companies shouldn’t be paying you, and paying to train you, to learn things that, if you’re interested in this career path, you should have learned on your own.
I don’t mean “doesn’t know the flavour of Linux”; I mean doesn’t conceptually know what a web server is, so can’t restart the service running on the box.
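For what it’s worth, the “conceptual foundation” in question here is small: a web server is just a process listening on a TCP port and answering HTTP requests, and “restarting the service” means stopping and starting that process. A minimal, purely illustrative Python sketch (nothing distro- or vendor-specific is assumed):

```python
import http.server
import threading
import urllib.request

# A web server is just a process listening on a TCP port.
# Bind one to an ephemeral port on localhost:
server = http.server.HTTPServer(
    ("127.0.0.1", 0), http.server.SimpleHTTPRequestHandler
)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# Prove it is answering HTTP requests:
status = urllib.request.urlopen(f"http://127.0.0.1:{port}/").status
print(status)  # 200: the server answered

# "Restarting the service" is conceptually just stop + start:
server.shutdown()
```

On a real box the same idea is usually `systemctl status nginx` followed by `systemctl restart nginx`, but the concept being tested is the one above.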
Yeah, it’s going to be a couple of years before you break into the high-earner bracket. The problem is that silly valley was hiring tech grads at $300k total comp when money was cheap. Money isn’t cheap anymore.
AI money is stupid cheap if you know who to bullshit. And, y’know, have no principles.
God this is true.
I’ve seen some real snake-oil projects get massive funding and everyone on board getting promos.
The number of times I’ve had to just say “thank you for your time” and cut an interview short is way too high. Shit like this is way, way too common.
Not to mention that many IT degrees are basically worthless as far as practical experience is concerned. You’d be better off spending $100k on certification training.
100% agreed on that. The amount of on-the-job training I’ve had to put into fresh college grads is insane.
Fresh college grads should presumably be taking entry-level / junior positions unless something about the candidate speaks for itself. It’s wild how hostile you’re acting to the notion of having to teach people who are new to the field how to work professionally in it.
Out of college, they’d typically have, at best, internship experience of some kind. People have got to start somewhere.
I knew of a company that listed an internal tool as a job requirement so they could claim a skill shortage and hire foreign workers. They coached them to put that tool on their resume.
It doesn’t help that companies lie about the requirements in job postings. Even entry-level retail jobs are asking for 2-3 years of retail experience. That’s insulting to those with retail experience and an impossible “entry-level” requirement. It leads people to just ignore requirements entirely.
It’s not just that. HR departments (who, let’s be honest, were never exactly super-clear on what tech roles are or do because they’re busy with everything else) have been infected by AI to the point that no one can just see a job and apply for it unless they rearrange everything in the resume to match the job posting verbiage exactly.
Everyone who makes it past that hurdle is sorted lowest-to-highest by salary requirements. Oh, you have seventeen years of experience? Fuck you. Everyone after that is sorted by age/race/whatever. It’s the perfect system for fucking up tech hiring.
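To make the keyword-matching complaint concrete, here is a deliberately naive sketch of that kind of screen. The posting text, resumes, and scoring are all made up for illustration; no real ATS product is being described:

```python
# Illustrative only: a naive keyword-overlap resume screen.
# Every name and number below is invented for the example.
posting = "senior cloud engineer kubernetes terraform python devops"
resumes = {
    "A": "seventeen years of infrastructure automation and orchestration",
    "B": "kubernetes terraform python devops cloud cloud cloud",
}

required = set(posting.split())

def overlap(resume: str) -> float:
    # Fraction of posting keywords that literally appear in the resume.
    words = set(resume.split())
    return len(required & words) / len(required)

scores = {name: overlap(text) for name, text in resumes.items()}
print(scores)
# Candidate A describes the same skills in different words and scores
# near zero; candidate B parrots the posting verbatim and scores high.
```

That is the whole failure mode: literal word overlap rewards rearranging your resume to match the posting’s verbiage, not demonstrating the skill.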
Unless you rebrand everything you do as AI. Then you’ll get $100 million from Zuckabug. (It used to be “cloud,” but that was a long time ago now.) So the tech manager who knows what they’re looking for gets a bunch of applications from newbies who talk like AI is everything, and they don’t want that.
It’s super fucked.
How much of it is that these companies don’t want to train? I have a hard time believing your job is so advanced and technical that you couldn’t find someone qualified at any point.
Training people up would be a great idea when you could assume they’d keep working there for 30 years. Those old “company man” jobs are all but gone. If you stay at a job five years, people start to wonder if there’s something wrong with you, and that’s just starting to be enough time for training to be worth the investment.
If tech were unionized, and the union had the attitude that it is basically a trade guild that will build up your skills, that would change things.
In my experience there is a huge gap between those who are smart and enthusiastic and those who are just average. I consider myself part of the former group, and I can’t blame coworkers for just doing their job and going home. But it means the gap just widens.
This has been my experience as well, since I started in community college in the early 2000s.
There is an unfortunately large difference in tech between a person who has an innate interest and someone who is checking the boxes to get and keep a job.
Both would get the job done wouldn’t they?
But one you can underpay and abuse because they’re excited. The other has a much better idea of what they’ll accept and will leave when it’s not worth it anymore.
Not in the same way… which is the issue.
It’s a skilled profession, so ideally you want someone who is more skilled, and the person who has interest is more skilled.
It works similarly with other skilled professions like carpenters.
I’ve been in both industries. When you hire carpenters, you’re hiring people with qualifications and experience, the way it should be.
You’re not trying to make the carpenters calculate roofing truss cuts through three convoluted days of interviews.
I believe tech hiring is more about the egos of the hiring managers and the team than it is about hiring qualified people.
I’ve never been on a team or seen a team where this was the case. We just wanted people who could do the job well, and they were hard to find.
I actually don’t understand where manager/team ego ever fits in, as someone who hired a lot of bootcamp grads.
Depends on what you see as “the job”. I would prefer many projects to be better than they currently are, both from the end user and the developer side.
When I think about the projects I have seen, you need very good people to clean up technical debt in a viable, sustainable way, and to develop in a way that stays maintainable long term.
If you don’t have very good people, code quality devolves quickly, though the negative impact is felt a bit later. At that point it’s hard for most people to clean up and improve the project in any reasonable fashion, and it usually never happens.
The gap in skill, experience, and the ability to grasp what needs to be grasped is one thing; how long people stay on a project or at a firm is another.
In the end, it depends on what the job is. Sure, most apps work. But there are so many applications that annoy and hinder me as a user. Even as a user, it’s a mess. I’m sure the dev team doesn’t have it much better on those projects.
With very good people acting as mentors and guidance, others can certainly get the job done, and contribute in productive ways. Most importantly, they learn and improve significantly.
I guess overall it’s not really about the big gap, but more of a continuum of skill. There’s certainly a weighted spread though.
I don’t think you answered the question.
You went on, though, to describe how difficult and technical the skill needed to wade through this code is. Which is kind of what I’ve seen with so many jobs: people in roles for long periods of time have this ability to make their job seem like the most difficult thing possible. Lately I’ve been watching these window-tinting competitions. Listening to those guys describe putting on window tint always reminds me of tech guys. At some point, just chill out. It isn’t that important.
Average, in my mind, is what the hiring process should be looking for. The average candidate is someone who gets the job done. There are diminishing returns in trying to find above average or higher; the effort needed to get the best of the best turns into what we have now. Plus, from what I see, the best of the best are job hopping anyway.
(…) I never once worked anywhere that there was a glut of jobs and “not enough bodies” to fill them.
I have. My first job wasn’t the worst at this, but it happened to some extent. My last company had such a huge disparity between work and employees that every single one of their IT projects (dunno about the rest) was in a constant state of delays, hotfixing, and putting out fires. Things were so bad that people were moved between projects on a daily basis, and at one point management decided to throw everyone in the department (including folks who had just joined the company and newbies with little to no programming experience) at triaging one of them.
That’s not to mention poaching team members from projects they had promised more bodies to (only informing the client of the latter decision), and many other issues. They absolutely needed more people, but the way the company is run does little to help with that.
The worst part? They’re still growing as a company while their burnout rate stays unchanged. So yeah, it’s a thing.