Nursing/Culinary majors agree to meet at the same rehab/psych ward.
We had a new member join the US military with a master's degree in music. Dude was like, "I am not washing dishes, I have a master's degree!" Look, E-3, it is your turn to wash these dishes; you can sing while you wash.
join the US Military
Dude was like I am not washing dishes
Dude couldn’t swing officer with a master’s?
It’s not that straightforward. Since he clearly didn’t do ROTC, he would have to compete for an OCS slot to commission. To get that he would need a very high GPA.
Now, the Army does have many musicians, but promotion is so slow and competitive that there are E-6s and E-7s in the military with master’s degrees in music.
Officer placement is weird (from my civilian perspective).
ROTC gets you in, then there’s OCS (I think?) for professionals? But you need to have something that’s relevant. Music won’t get you in, but engineering, some sciences, etc. will.
And some professions like M.D.s and J.D.s can get direct commissions.
It’s been a while since I had it explained so I probably messed a bunch of that up.
Is CS not a good option these days?
The CS jobs market fluctuates like any other market. Right this minute all the dumbass CEOs are trying to replace people with AI, just like they’ve repeatedly tried to have cheaper people in India do the jobs in the past.
Having people in India do it used to be called outsourcing, then offshoring, then a few other names, because every time it fails they have to call it something else to try again. The same will happen with AI.
I’m not the slightest worried about my own job, but it is currently a shitty market for fresh grads. Probably due to all the post-covid layoffs saturating the talent pool with more experienced people, and the aforementioned AI fad.
Which let me tell you, was real f-ing fun to have to watch unfold during my last year studying for my IT degree. The degree I went for thinking it would be the kind of thing least likely to be automated.
A lot of younger folks in IT, like myself, have been on the brink of exhaustion since 2022.
Sure, there was the “obsoletion” of PHP, Java, plain JS, etc. before, e.g. in favor of one of the JS frameworks that get released every other day.
But this one feels different. They are trying to sell you the idea that everything related to software development and programming will get “outsourced” to a computer. The problem is, LLMs can’t do the thinking necessary to build resilient systems because they can’t “think”, neither effectively nor efficiently. They can be great tools when used the right way, but that’s about it.
This blog post summarizes this admittedly subjective experience way better than I could.
I started my degree in 2002, two years after the dotcom bust. I figured the market would rebound within five years. Right after I graduated (but thankfully after I got a job) the housing bubble burst. There’s always something happening, but software engineering is still needed and we still make bank. Being unlucky with the timing will set back your career, but probably won’t end it.
I’m not the slightest worried about my own job, but it is currently a shitty market for fresh grads. Probably due to all the post-covid layoffs saturating the talent pool with more experienced people, and the aforementioned AI fad.
It’s a bit more than that, I think. IT is killing the entry-level job pipeline that grew people into seniors. In the infra space, we don’t really troubleshoot systems in the “pets” fashion anymore, we just redeploy new “cattle”, meaning all the troubleshooting skills and underlying understanding of our systems you would have gained don’t get learned anymore. For those of us who had to go through that, we’re fine because we developed the skills, but the new folks we bring in, we just tell them to redeploy to get it working.
I’m seeing this too in the software dev space. Small modules worth a few story points would have been given to junior developers to learn on and knock out, getting some work done, but more importantly getting those juniors trained up with trial and error. Now an LLM can crank out mostly working code for that small module in seconds, and after a few minutes of human review that module is done. So the work is being done faster now, but the critical educational experience the juniors had before is missing.
In both the infra and software dev spaces we’re cutting off our ankles, then our legs, because when we retire very few will have the skills we had to learn, since we never gave them the chance to learn them.
You still have to debug things in a cattle approach, though. If anything there’s even more and more complex things to debug. Training will just have to shift from throwing the new hire into the deep end of the kiddie pool to something else. Granted, “something else” is probably going to be offloading it on educational institutions, which sucks for recent grads, so they’ll have to work it out somehow. Probably by creating a market for post-grad practical skills classes, is my guess.
There’s still coworkers who can’t debug worth a shit. I don’t understand. Like that was CS101
Just because they passed the class doesn’t mean they retained any of the knowledge.
You still have to debug things in a cattle approach, though. If anything there’s even more and more complex things to debug.
I would disagree with your complexity metric (for the purposes of learning troubleshooting) for cattle. What can be more complex than a completely unique system that only exists because of 10+ years of running on that same hardware, with multiple in-place OS upgrades along with sporadic (but not complete) patches to both the OS and the application? Throw in the extra complexity of 9 other unrelated applications running on that same server (possibly bare metal) because the org was too cheap to spring for separate servers or OS licenses for a whole hypervisor.
If you have a memory leak in an application in a container running on k8s, one that kills the pod after 72 consecutive hours of running, would you even notice it if you have multiple pods running it on a whole cluster, as long as the namespace is still available?
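The scenario above — automatic restarts quietly papering over a slow leak — is why pod restart counts are worth watching. Here’s a minimal, hypothetical sketch (the pod names, field trimming, and threshold are all made up for illustration) of how you might flag suspicious pods from the JSON that `kubectl get pods -o json` produces:

```python
import json

# Hypothetical sample of `kubectl get pods -o json` output, trimmed to the
# fields we need: each container reports how many times it has restarted.
sample = json.loads("""
{
  "items": [
    {"metadata": {"name": "api-7f9c-abc12"},
     "status": {"containerStatuses": [{"restartCount": 0}]}},
    {"metadata": {"name": "api-7f9c-def34"},
     "status": {"containerStatuses": [{"restartCount": 23}]}}
  ]
}
""")

def leaky_suspects(pod_list, threshold=5):
    """Return (name, restarts) for pods whose restart count suggests
    something -- like a slow leak hitting the OOM killer -- is being
    masked by automatic restarts. Threshold is an arbitrary example."""
    suspects = []
    for pod in pod_list["items"]:
        restarts = sum(c["restartCount"]
                       for c in pod["status"].get("containerStatuses", []))
        if restarts >= threshold:
            suspects.append((pod["metadata"]["name"], restarts))
    return suspects

print(leaky_suspects(sample))
```

A pod dying every ~72 hours racks up restarts that stand out this way, even though the namespace looks perfectly healthy from the outside.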
I’ve maintained both and still do. While you may not be debugging memory leaks on k8s (although you should), you get all sorts of other fun things to debug. Things like:
- Why did our AWS bills suddenly triple?
- Why is that node accepting jobs but just hanging when they start?
- Why is that statefulset not coming back up? Is the storage still attached somewhere else perhaps?
- Why did all the data in our Kafka suddenly disappear?
- Why is everything still down after that outage? Maybe a circular dependency, thundering herd problem, or both?
- What’s wrong with my Helm chart this time?
The list goes on and on. With increased complexity you don’t get fewer problems, just different ones.
And nearly all of those problems are ones that other people have run into or at least have guidance on how to go about addressing. Old organically grown systems are many times unique one-offs which have little to no established path except to start diving into the fundamentals about the hardware and software.
I’m not here to get into a pissing match about whose job is/was harder. If you think juniors have a better chance at learning on today’s systems than they did in the past, I still disagree with you. Problems exist on modern systems, but juniors will rarely if ever get a chance to try to solve them and thereby learn from them.
I’m starting to think we’re talking past each other. Your last paragraph seems to imply that legacy systems were more approachable for a newbie to debug. If that’s your point I wholeheartedly agree. It’s not that hard as long as you get over the fear of fucking something up.
I do agree that juniors had an easier time learning on legacy systems, and that’s been true since the dawn of technology. Things get more complicated, and thus harder to get a deep understanding of, the more time passes. It’s a lot easier to understand older and simpler technology.
I’m a little confused why you seem to be arguing both that the issues I mentioned are easy to google, while at the same time saying newbies never get a chance to debug them. Surely, if it’s so easy, the newbie can take a stab at it?
Personally, I like to let the newbies have a stab at non-urgent issues first, nudging them if they get stuck. They may not be able to solve the problem solo, but they know a lot more about how the system works afterwards anyway.
Now it’s “staff augmentation”
Right now my company calls it “Investing in IST”
I do feel this is a bit exaggerated. I’ve been in the industry for less than 5 years with a computer science degree, and I think there is a lack of genuinely good engineers. You also kinda have to ignore tech Twitter and LinkedIn telling you AI is going to replace software developers.
But long term, I think they will try to pay people less and less. I also know a bunch of artists (mostly small musicians), and I can confidently say we are fucking them over way more than software engineers. My opinion is that we should band together with the artists and demand everyone be compensated more fairly.
Every ‘entry level’ job opening wants 5 years of experience in some piece of software that has only existed for 2 years.
I got my degree 8 years ago, it’s been gathering dust as I’ve been stuck in an unrelated dead-end job, and now I fear it’s a red flag for employers that I’m in my 30s with no relevant experience in the field.
Software development has been oversaturated for ages. There’s simply far too many applicants and too few open positions. Literally every job offer I’ve seen lately gets hundreds of applicants. Open applications are often not much more fruitful.
I’d be happy to go freelance/consulting/self-employed route, but our unemployment benefits folks recently did a brilliant move of restricting that even further (literally no one on any field liked that). Universal Basic Income would solve so many problems.
You say this like there aren’t intentionally fewer open jobs than there are jobs that need to be filled.
No sadly
It’s extremely oversaturated
Here’s my pet theory as to why CS did so well for so long and why that probably won’t remain true moving forward.
Programming / tech is a relatively new field that, as a proportion of how much time it takes as part of people’s waking hours (as a rough indicator of how much of the economy it can penetrate), has gone from essentially 0% to 99% in only a few decades. We went from only large corporations having one or two mainframes, to office computers, to home computers, to smartphones, etc. Add in social media, streaming, etc. and people have gone from spending virtually no time on programmable devices to all their time on programmable devices.
As tech continued to have this (apparently) exponential growth, there was a chronic shortage of programmers, leading to massive salaries. As salaries exploded, programming developed a reputation for being a relatively easy, well-paying job, provided you were somewhat intelligent. As a result, hordes of students studied CS to help keep up with the growing demand, although always lagging. For seniors the lag for new hires to reach their level is quite a bit longer, so seniors have remained in high demand.
Now as we catch up to the present though, it’s hard to see spaces where new jobs for programmers can be created without cannibalizing existing ones. VR? You’d take work away from game developers. Metaverse? From traditional social media sites. In short we’ve put computers on watches, sleep trackers, fridges, TVs, cars, light switches, etc. There’s no more room for the industry as a whole to grow. AI might be the exception for this - if it actually succeeds it could keep tech growing by eating into the jobs of other industries, but then I expect it would end up eating many tech jobs too, so for the purpose of my argument it’ll either hurt the programming job market or have minimal effect.
So - we reach the present. Lured in by the high salaries of previous years, and the high salaries seniors currently have, we have an overabundance of juniors on the job market. If tech had continued its previous rate of growth, things would have been fine - but it can’t. As a result, there just aren’t enough jobs for all the current juniors and there likely won’t ever be - the industry can’t grow to accommodate them. Many of them will need to switch to other careers, and fewer students will need to study CS, for balance to arrive. There’s still a shortage of seniors at the moment, but as the current juniors who are employed gain experience and move up the job ladder, this will change. Current seniors can’t count on older tech workers retiring quite yet, due to how young-skewed tech is (because of the job growth pattern we previously had), so they should expect growing job competition as juniors develop, and for salaries to stagnate (already seeing this at my employer).
This isn’t all bad news though - consumers will benefit. With a shortage of new industries to move into, the glut of workers who remain will best find work opportunities by selling products that outperform and/or are cheaper than the existing products. In other words, expect more alternatives to MS Office, social media, Photoshop, etc. People will be able to create work for themselves by undercutting the current incumbents - we should expect to see an explosion in competitors for existing products. In some ways we’re seeing this already - more and more great indie games that outperform the AAA giants, and open source software that provides better experiences than the proprietary options (Lemmy vs Reddit, Mastodon vs Twitter, Forgejo/Gitea vs Github, etc.)
I fully expect to see deviations from this - new hype cycles that temporarily create demand, boom/bust cycles depending on the present economic circumstances, an eventual (short-term) shortage of workers once today’s tech workers do start to retire - but long-term I expect ‘programmer’ to become just another generic white-collar job with similar pay.
TL;DR - unless you’re already a senior in tech, you might want to look at professions that are actually in demand as the glory days for software developers won’t come back.
shit tons of tech money is now earmarked for ai, so new graduates that focus on ai are very highly sought after and can expect to make a truckload of money in the next 5-10 years of working 80 hour weeks and being abused by zuckerberg, musk, et al.
that means there’s far less spending on traditional development positions, especially junior devs who these companies are now trying to replace with ai. add on top that covid spending led to a massive increase in dev jobs which have been laid off since then.
tl;dr there are opportunities out there, but it’s a very unhealthy, lopsided market that’s not able to support the same population it could just a couple years ago. and all of this within a bubble that’s already showing signs of being stretched to capacity.
It only sucks if you don’t get in somewhere, which is more common now. Shit, I am working in IT and I work in my spare time on game development because I can’t stand my day job. The only job that’s easy and lucrative today is like day trading or being an already established influencer. The future sucks.
Get into datacenter work and you’re good. Nobody wants to do anything physical anymore but I don’t mind the work
It’s cute you think there’s a homeless shelter with room for y’all.
I don’t think that’s what the word cute means
You sound like a Karen because I imagine you said this with a really bitchy mannerism
I read it more as ‘lol, good luck getting into a shelter without at least three jobs’.
I think it would have worked better if the comment wasn’t prefaced with “it’s cute you think” which is inherently insulting.
The meme is already self deprecating, like “haha things aren’t looking good for me” so they’re aware of the situation. It’s just heaping an insult and bad news on someone who already is experiencing a negative outlook.