Hacker News | leokennis's comments

Idea! Maybe these now redundant humans can be turned into a kind of battery, so they serve as a source of energy for the machines?

Perhaps it's then smart to give the humans a brain/computer interface, to make them dream/think they are living in a normal society so they don't revolt.


Agents everywhere!!

Do you like what I've done with the place?


A tragedy is that supposedly the original idea was to use human minds for processing power, which would have been far superior to the thermodynamically laughable idea to use them as power sources.

> The obvious objection is that code produced at that speed becomes unmanageable, a liability in itself. That is a reasonable concern, but it largely applies when agents produce code that humans then maintain. Agentic platforms are being iterated upon quickly, and for established patterns and non-business-critical code, which is the majority of what most engineering organizations actually maintain, detailed human familiarity with the codebase matters less than it once did. A messy codebase is still cheaper to send ten agents through than to staff a team around. And even if the agents need ten days to reason through an unfamiliar system, that is still faster and cheaper than most development teams operating today. The liability argument holds in a human-to-human or agent-to-human world. In an agent-to-agent world, it largely dissolves.

Then I'd wager it's the same for the courses and workshops this guy is selling... an LLM can probably give me at least 75% of the financial insights for not even 0.1% of what this "agile coach" is asking for his workshops and courses.

Maybe the "agile coach LLM" can explain to the "coding LLMs" why they're too expensive, and then the "coding LLMs" can tell the "agile coach LLM" to take the next standby shift, if he knows so much about code?

And then we actual humans can have a day off and relax at the pool.


Ceding the premise that the AGI is gonna eat my job: my job involves reading the spec to be able to verify the code and output, so that there's a human to fire and sue. There are five layers of fluffy management and corporate BS before we get to that part, and the AGI is more competent at those fungible skills.

With the annoying process people out of the picture, even reviewing vibeslop full time sounds kinda nice… Feet up, warm coffee, just me and my agents so I can swear whenever I need to. No meetings, no problems.



Amazing bit of history, thank you!

There’s gonna be one guy in charge of you, and he’s going to expect you to be putting out 20x output while thanking him for the privilege of being employed, assuming all goes the way every management team seems to want

I dont think this will happen because AI has become a straight up cult and things that are going well don’t need so many people performatively telling each other how well things are going.


If a SWE could truly output 20x their effort, that person would probably be better off freelancing or teaming up with another SWE. If anything can be automated away to AI, it's project management. Also, there has to be a point where delivering more and faster code doesn't matter, because the choke points are somewhere else in the project life cycle, say waiting for legal, other vendors, budgets, suppliers, etc. So productivity could max out at, say, 3x, after which, unless you have a strong pipeline of work, your engineers will be sitting around waiting for the next phase of the project to start.

> If a SWE could truly output 20x their effort, that person would probably be better off freelancing or teaming up with another SWE.

Yes but this requires the willingness to take on the additional stress and risk of managing your own sales, marketing, accounting, etc.


If AI can 20x an engineer, it can handle all this too.

Sadly it can’t.


Except now you also team up with people who are adept at sales, marketing, accounting, etc. to form a cooperative instead of a corporation. Maybe workers can get back some of the rights and fruits of their labor.

To add to this, I remember somebody here on HN pointing out a few months ago that they’ve never seen so much investment in businesses that are going “we don’t actually know what the billion dollar application is so we’re going to sell y’all some rough tools and bank on the rest of you figuring it out for us.”

> There’s gonna be one guy in charge of you, and he’s going to expect you to be putting out 20x output while thanking him for the privilege of being employed, assuming all goes the way every management team seems to want

A perfect summation.


But that's really not the point of this particular article.

The point being made is, do you know what financial impact your work is having in terms of increasing revenues or decreasing costs?

If the company revenue is going down and costs increasing, developers will be laid off regardless of how many tickets they close.


I got 99 problems but an agent ain’t one.

I think you missed the key capitalist part:

There needs to be someone to benefit from all your labor. No, no, it can't be you. You have conflicts of interest!


What if your work isn't benefitting anyone?

If it sells, you don't own it! No one said a product needs to benefit people.

> my job involves reading the spec to be able verify the code and output so the there’s a human to fire and sue.

So, you're the programmer (verify code) and the QA (verify output) and the project manager (read the spec)?


That's the difference between programming and software engineering.

A software engineer should be able to talk directly to customers to capture requirements, turn that into a spec sheet, create an estimate and a bunch of work items, write the whole system (or involve other developers/engineers/programmers to work on their work items), and finally be able to verify and test the whole system.

That entire role is software engineering. Many in the industry suck at most of the parts and only like the programming part.

I think the hardest part is requirements gathering (e.g. creating organized and detailed notes) and offloading planned work to other developers in a neat way, generally speaking, based on what I see. In other words, human friction areas.


> That entire role is software engineering. Many in the industry suck at most of the parts and only like the programming part.

I'm always amused when I read anecdotes from role-siloed, heavily staffed tech orgs with all these various roles.

I've never had a spec handed to me in my career. My job has always been end to end. Talk to users -> write spec into a ticket -> do the ticket -> test the feature -> document the feature -> deploy the feature -> support the feature in production from on-call rotation.

Often I have a few juniors or consultants working for me that I oversee doing parts of the implementation, but that's about it.

The talking to users part is where a lot of people fall down. It is not simply stenography. Remember most users are not domain/technical experts in the same things as you, and it's all just a negotiation.

It's teasing out what people actually want (cars vs faster horses), thinking on your feet fast enough to express tradeoffs (lots of cargo space vs fuel efficiency vs seating capacity vs acceleration) and finding the right cost/benefit balance on requirements (you said the car needs to go 1000 miles per tank but your commute is 30 miles.. what if..).


> I've never had a spec handed to me in my career.

We call those places "feature factories".

I have been required to talk with many in my life, and I have never seen one add value to anything. (There are obvious reasons for that.) And yet, the dominant schools in management and law insist they are the correct way to create software, so they are the most common kind of employment position worldwide.


Careful with that though. The guy whose entire job is to "take requirements from the customers and bring them to the engineers" really does get awful tetchy if the engineers start presuming to fill his role. Ask me how I know.

Are you talking about the "engineer talked to a customer and now both are mad at each other" trope?

While I have seen this happening, it usually has nothing to do with engineers and more to do with the fact that talking to customers and identifying requirements is a task that requires respect and practice to become good at. Percentage-wise, I've seen more junior MBAs alienate customers than I have seen engineers do it.


Please tell more.

I have the same impression. But that is where it is going: roles merging, and being able to do the full spectrum will be valuable.


Nothing much to tell. About 10 years ago on another job, I wanted more context in what I was building, so I suggested shadowing a user so I could see what they actually did, what value the software provided, and where the pain points were. A business analyst attached to my team became somewhat upset because she felt that impinged on her job. So there went that idea.

How do you know?

QA long ago merged with programming in "unified engineering". Also with SRE ("devops"), and now the trend is to merge with CSE and product management too ("product mindset", forward-deployed engineers). So yeah, pretty much, that's the trend. What would you trust more: an engineer doing project management too, or a project manager doing the engineering job?

The PMs and QAs I know would disagree with that assessment.

> What would you trust more: an engineer doing project management too, or a project manager doing the engineering job?

If one of the three, {PM, QA, coder}, was replaced by AI, as a customer I'd prefer to pick the team missing the coder. But for teams replacing two roles with AI, I'd rather keep the coder.

But a deeper problem now is, as a customer, perhaps I can skip the team entirely and do it all myself? That way, no game of telephone from me to the PM to the coder and QA and back to me saying "no" and having another expensive sprint.


If I'm managing a company of about 10 people to do something in the physical world, I'd probably skip the PM and QA, hire the engineer, and have the engineer task the LLM with QA given a clear set of requirements, and then manage the projects given a clear set of deadlines. A good SE can do a "good enough" job at QA and PM in a small company that you won't notice the PM and QA are missing. But the PM and QA can always be added, or QA can be augmented with a specialist, assuming you're LLM-driven.

Of course if none of your software projects are business-critical to the degree that downtime costs money pretty directly then you can skip it all and just manage it yourself.

The other thing you should probably understand is that the feedback cycle for an LLM is so fast that you don't need to think of it in terms of sprints or "development cycles" since in many cases if you're iterating on something your work to acceptance test what you're getting is actually the long pole, especially if you're multitasking.


> If one of the three, {PM, QA, coder}, was replaced by AI, as a customer I'd prefer to pick the team missing the coder.

I am curious: why? In all my years of career I've seen engineers take on extra responsibilities and do anywhere from a decent to a fantastic job at it, while people who start out much more specialized (like QA / sysadmins / managers) I have historically observed struggling more. Obviously there are many talented exceptions; they just never were the majority, is my anecdotal evidence.

In many situations I'd bet on the engineer becoming a T-shaped employee (wide area of surface-to-decent level of skills + a few where deep expertise exists).


> The PMs and QAs I know would disagree with that assessment.

It just depends on the org structure and what the org calls different skills. In lots of places now PM (as in project, not product) is in no way a leadership role.


QA is still alive and well in many companies, including manual QA. I'm sure there's a wide range these days based on industry and scale, but you simply don't ship certain products without humans manually testing them against specs, especially in a highly regulated industry.

I also wouldn't be so sure that programming is the hardest of the three roles for someone to learn. Each role requires a different skill set, and plenty of people will naturally be better at or more drawn to only one of those.


From my experience with modern software and services, the actual practice of QA has plainly atrophied.

In my first gig (~30 years ago), QA could hold up a release even if our CTO and President were breathing down their necks, and every SDE bug-hunted hard throughout the programs.

Now QA (if they even exist) are forced to punt thousands of issues and live with inertial debt. Devs are hostile to QA and reject responsibility constantly.

Back to the OP, these things aren't calculable, but they'll kill businesses every time.


Continuous delivery really killed QA.

It's not the role of QA to be a gatekeeper; they give the CTO and President information on the bugs and testing, but it's a business decision to ship or not.

I’m not a native English speaker, but isn’t gatekeeping exactly that? Blocking suspicious entities unless they’re allowed through by someone higher in the hierarchy?

QA merged originally out of programming.

emerged?

I mean, yes?

Maybe it's different where you live but QA pretty much disappeared a few years ago and project managers never had anything to do with the actual software


Exactly. It's been a while since I've read an LLM hot take which couldn't have been written by an LLM, and this one is no exception.

There's a 99% chance that the training materials on sale are equally replaceable with a prompt.


True. And yet, as an organization, when you buy OP's training, you don't buy the material. You buy the feeling that you are making your organization more productive. You buy the signal to your boss that you are innovative and working to make your organization more productive. And you buy the time and headspace of your engineers, who are then thinking, if only for 2 hours, about making the organization more productive. The latter can be well worth the cost, and the former surely too.

They're buying a defensible (or laudable) justification when the training company's fee appears as a line item in the company budget.

This doesn't mean the training has to be good, useful, or original in the slightest, but the provider does need to have credentials, ones that aren't just "some dev with a hot take", that a fellow executive would recognize.


Yeah, that paragraph really betrayed the author's ignorance of software development. At the very least, it proves that they have no hands-on experience with LLM-assisted development.

Getting these tools to "understand", or be able to generate good results in, a codebase is not a function of the number of agents or the time you let them run. Rather, if the tools fail to produce anything useful after a few minutes, you can bet your ass that they're not going to work better after hours, or days. If they come up with a mess, and your reaction is to just let them work on it for a few days, I can confidently predict what you'll end up with.

They come close to grasping what we've learned about where these new tools are useful and where they aren't, only to end up falling for the pretty words these generators use to lipstick their turds. As right as they may be about the financial considerations, there are going to be some very uncomfortable bills to pay for those who share this belief in the magical abilities of LLMs.


In general, there’s very little info that costs much to learn nowadays. The human standing in the front is a disciplinarian to force you to learn it.

Or, more likely, a snake oil seller dedicating more to marketing than to the product.

I read ZoneAlarm and it was like suddenly a part of my brain that went unvisited for 25 years lit up...


WinAmp skins. Alternate shells for Windows(!). Cygwin because I still played too many games to go full Linux.

Yeah...


btw all versions of WinAmp + skins still work great in 2026 even on Win11 :)


It still whips the llama's ass.


Goodness I miss litestep


While Microsoft in general is a mess, this article is like saying: what even is “save”? Microsoft has 1286 save products! Save in Word, Save in Paint, Save in Notepad…

Copilot means there’s a button/menu/command in the Microsoft app/site/tool that allows the user to pass whatever text/file/site/context/prompt is on the screen to the Copilot AI backend so it can summarize/transform/expand/explain it, and then have the user wait an inordinate amount of time for a mediocre response.


I don’t think the comparison is fair. Some of the products presented here are named Copilot themselves, or at least, for some, Copilot + the domain of the base product. It’s not just a functionality like saving.

Which can get even messier in people’s heads, since they will usually refer to any product they use as Copilot, when they may sometimes be talking about different ones.

For instance, my friends who use Teams or the 365 suite refer to Copilot as the integrated AI tool within those softwares. Whereas, as a SWE, when I hear about Copilot, it usually refers to the coding assistant/AI code completion/agent tools for me.


That's a bad analogy you made. Copilot is a product platform; Save is a basic software function that even my grandma could explain. You don't have to believe me, test it yourself: let your grandma explain what Save does in Microsoft Word or Excel. Then let her explain what Copilot does in Outlook, VSCode, Bing, GitHub Copilot, SharePoint, Microsoft 365 and so on.


> Microsoft 365

That's Microsoft 365 Copilot now actually.


My favorite coding font is the one that is pre-installed on my work laptop because I cannot install additional ones...

So Aptos Mono or Consolas it is.


Honda seems to love pulling out of things just when they are about to succeed. Both in F1 and apparently now also in EVs.


I admire and applaud Kagi for what they are trying to be.

But on the AI front, the Assistant is simply worse than using for example Gemini or ChatGPT directly. It is slower, it cannot generate images etc.


That's why I don't think anybody is going to pay for the Assistant alone.


On the other hand, it doesn't monetize any of the data you give. Six of one, half dozen of the other, yeah?

Re: image gen... it's a search engine. Why would I need my refrigerator to toast my bread?


The privacy buffer between the LLM providers and the user is a large part of the appeal. Having it hooked up to a search engine outside of the data broker space makes it uniquely attractive.

Given that window dressing, having toasted bread in a kitchen that isn't selling my data is something I want.


Because at least for my own usage of Google the LLM started out as an interactive search with substantially better context filtering that could tune the results to my desired technical level. However I promptly started just having it explain the subject matter to me rather than spending 30+ minutes consulting various docs and forum posts because it makes for an excellent secretary/tutor combo provided you vigilantly watch for misinformation.

So to answer your question, while charts might not be particularly useful for a search engine a tutor certainly benefits from them.


The research assistant should be able to generate images


I see the slowness, and now the news that they might remove the Assistant from the $10 plans, as evidence of how costly it is to run LLMs, and by extension how unsustainable it must be for OpenAI, Anthropic, Microsoft etc. to be offering such performance for free or very low prices. Surely something has to give soon.


It is not worse, I would say. It uses a neutral system prompt by default, whereas Gemini and ChatGPT will please you too much and mislead you badly. Also, the base search is much better. You can control the search, while with Gemini, for example, you can't.


Keep in mind that kagi offers a wide range of models, not just one. I wouldn't want to have multiple subscriptions (for chatgpt, anthropic, gemini etc.)


This is also true if you use something like OpenRouter, and it will almost always be cheaper or better (excluding Kagi Search).

I love Kagi (I can't imagine going back to any other search engine), but it isn't competitive when it comes to LLMs. In its defense, that's largely because the others are bleeding money.


In my experience that is true, but I get the answers I need much faster from Gemini, ChatGPT or Claude than from any Kagi search, with or without their AI integration. I was rooting for them, but they just seemed too far behind the leading LLMs for search. Same with Perplexity; I could never figure out why it needs to exist. If I want citations, I'll just ask one of the LLMs I mentioned to provide them.

Also, their news app is pretty underwhelming.


Every time some site or person tries to make me feel bad for using AdGuard DNS, ad blockers etc. I read an article like this and I feel fine.

I see three options:

1. Show me reasonable ads and I will disable ad blocking

2. Do the crap described in this article and don't complain when I arm myself against it

3. Do a hard paywall and no ads; force me to pay to see your content


It's my device. I decide what I download, execute and display on my device. A website is free to offer me to download an ad and I am free to decline that offer. Demanding me to download anything on my device or even worse execute someone else's programs [JS] and claiming that I have a moral obligation to do so is deeply creepy.


I was about to comment this too.

For someone that explicitly states:

> I don’t listen to pop songs. I prefer music of the 500-year tradition (...)

And who apparently wants to stream music, it is wild he's not subscribed to Apple Music Classical which exactly circumvents all complaints in this article...


I have a 2020 MacBook Air M1 with 16GB RAM - for development work, there is 0 reason to upgrade it. All day battery, silent, small, no lag...

