Hacker News

This is much like other advances in computing.

Being able to write code that compiled into assembly, instead of directly writing assembly, meant you could do more. Which soon meant you had to do more, because now everyone was expecting it.

The internet meant you could take advantage of open source to build more complex software. Now, you have to.

Cloud meant you could orchestrate complicated apps. Now you can't afford not to know how it works.

LLMs will be the same. At the moment people are still mostly playing with it, but pretty soon it will be "hey why are you writing our REST API consumer by hand? LLM can do that for you!"

And they won't be wrong: if you can get the lower-level components of a system done easily by an LLM, you need to be looking at a higher level.



> LLMs will be the same. At the moment people are still mostly playing with it, but pretty soon it will be "hey why are you writing our REST API consumer by hand? LLM can do that for you!"

Not everyone wants to be a "prompt engineer", or let their skills rust and be replaced with a dependency on a proprietary service. Not to mention the potentially detrimental cognitive effects of delegating all your thinking to LLMs in the long term.


I recall hearing a lot of assembly engineers not wanting to let their skills rust either. They didn't want to be a "4th gen engineer" and have their skills replaced by proprietary compilers.

Same with folks who were used to FTPing directly into prod and using folders instead of source control.

Look, I get it, it's frustrating to be really good at current tech and feel like the rug is getting pulled. I've been through a few cycles of all new shiny tools. It's always been better for me to embrace the new with a cheerful attitude. Being grumpy just makes people sour and leave the industry in a few years.


This is a different proposition, really. It’s one thing to move up the layers of abstraction in code. It’s quite another thing to delegate authoring code altogether to a fallible statistical model.

The former puts you in command of more machinery, but the tools are dependable. The latter requires you to stay sharp at your current level, else you won’t be able to spot the problems.

Although… I would argue that in the former case you should learn assembly at least once, so that your computer doesn’t seem like a magic box.


> It’s quite another thing to delegate authoring code altogether to a fallible statistical model.

Isn't this what a compiler is really doing? A JIT optimizes code based on heuristics, like whether a code path is considered hot. Sure, we might be able to annotate it, but by and large you let the tools figure it out so that you can focus on other things.


But the compiler’s heuristic optimization doesn’t change the effects of the code, does it? Admittedly I’m no compiler expert, but I’ve always been able to have 100% trust that my compiled code will function as written.


I agree that not everyone wants to be. I think OP's point, though, is that the market will make "not being a prompt engineer" a niche, like being a COBOL programmer in 2025.

I’m not sure I entirely agree, but I do think the paradigm is shifting enough that I feel bad for my coworkers who intentionally don’t use AI. I can see a new skill developing in myself that augments my ability to perform, while they are still taking ages doing the same old thing. Frankly, now is the sweet spot: expectations haven’t risen enough to meet the output, so you can either squeeze out time to tackle that tech debt or kick up your feet until the industry catches up.


Even the example in the post seemed closely related to other advances in consumer-level computing:

  I re-created this system using an RPi5 compute module and a $20 camera sensor plugged into it. Within two hours I wrote my first machine learning [application], using the AI to assist me and got the camera on a RPi board to read levels of wine in wine bottles on my test rig. The original project took me six weeks solid!
Undoubtedly this would have taken longer without AI. But I imagine the Raspberry Pi + camera was easier to set up out-of-the-box than whatever they used 14 years ago, and it's definitely easier to set up a paint-by-numbers ML system in Python.



