Understanding unfamiliar legacy code with a bunch of kludges and "domain knowledge" or "business logic" baked into it, when the original authors are uninterested/unavailable and management has zero respect or understanding for the fact that it often takes an order of magnitude longer to modify/maintain this stuff than it does to write it.
Oh, and there's no test suite.
That is my nightmare. I've been there a few times and it can be literally impossible to shine in that role.
"What do you mean, it will take you ten hours to add a button? They coded this whole module in one week and there are 100 buttons!"
aarrrrgh
Now, as an engineer, I understand and embrace the challenge of explaining this sort of thing to management. I get it. That's the type of skill that separates code monkeys from senior developers and architects. But, no matter how good you are, you can't force management to listen or understand.
If you are on a legacy project, it's hard to know what to prioritise in terms of improving code. However, getting requests to take somewhere within an order of magnitude of how long people think it should take is usually my guide. I never tell people I'm going to refactor something. I just do it. Then after the fact I say, "Remember when this used to take a week? Now it takes an hour". Once you do that a couple of times, management gets the point (usually). The only problem I've come across is when you get on a team where they actively discourage code churn. On a legacy project, that's death, so it's best to find another job in that case.
I really enjoy working on legacy code. Usually everybody else leaves me alone because nobody wants to touch that code. I get a tonne of freedom. As long as I don't break things (requires a fair amount of experience) I can quietly improve things under the hood -- targeting areas where we get a lot of requests. Legacy code is also great because usually nobody wants new features just for the hell of it. You almost always get real requests from real users that have real pain points. The biggest downside (as you imply) is that it's hard to shine because the best you can do is make things acceptable. It's difficult to hype the legacy project so you don't get a lot of recognition for your work.
I liked reading this! I'm glad you found happiness doing this sort of work!
One thing I'd say is that it's awfully tough to do this sort of work as a contractor when there's a (direct) hourly cost for your time. It's tough when you're on salary working on an internal project... but doubly tough (not impossible) when you're a contractor! Theoretically, it would be in clients' best interests to improve efficiencies, but it's really hard to get a client to sign off on hundreds or many thousands of dollars of work for a future gain that's hard for them to grasp.
(Getting them to grasp it is part of our job, of course)
> Then after the fact I say, "Remember when this used to take a week? Now it takes an hour". Once you do that a couple of times, management gets the point (usually)
This is great advice. It's all about those metrics and demonstrating the value of your work!
It's definitely a daunting task, and I've been in that position once; I think it was a pivotal role in my career. I was hired as the only Android engineer on a project that had churned through 3 devs already, each with their own code style, and they all built layers on top of one another. It's hard to be in the position where a full re-write would be simpler technically, but impossible politically. My strategy was basically to refactor layer by layer, bringing the codebase back to a mostly working state in between modules. Over the course of 6 months I basically re-wrote the whole app and reduced the LOC by 50% while taking our crash rate from ~10 percent to < 1 percent. It was painful, but I'm no longer overwhelmed or overly annoyed by legacy code filled with cruft. It really is possible to chip away at it piece by piece and end up in a great position, and knowing how to pull that off is what separates the good from the great. There's no perfect codebase, and having the skills to continually refactor and improve while delivering features is key.
Oh yeah. I've spent my last (almost) three years (thankfully they're over now) working on that kind of codebase and it's been an absolute nightmare. The very few original(-ish) authors still on the team were generally very helpful, but the codebase was huge, some of it was bought from someplace else (so there was no one to explain anything, and no documentation because of course not), and the team had been downsized in the course of time. Patching it up with contractors, needless to say, didn't quite have the stellar effect that management had hoped it would have.
Three years later I was still struggling to figure out even the most basic of features. It would regularly take me days, sometimes even weeks to fix trivial bugs. The convoluted code definitely didn't help (message passing between several processes, 4000-line functions...). There were a lot of features that I could barely use, let alone debug. I had no mental model of the code at all. Figuring out where my mental model didn't match reality was pretty much out of the question under the circumstances -- and my only source for creating a mental model of the code was, well, the buggy code.
Thankfully, I also worked on other things during these three years, so my sanity is still mostly intact. But that thing was really terrible.
> The convoluted code definitely didn't help (message passing between several processes, 4000-line functions...)
Oh man, yeah. This kind of thing's hard enough in a single-threaded monolith, much less when you've got multiple processes!
One time I was actually able to kill/retire a whole bunch of external processes and bring stuff back into the monolith. Yeah it made the monolith 1% bigger or whatever, but I think it was 100x better than throwing more services into the absolute gangbang of dependencies in which we were already drowning.
I'm glad you were able to move on and had other stuff to work on. Looking back, I probably should have just found another job rather than push through those legacy maintenance slogs. Lost sleep and hair for years, and it was sort of a career black hole.
I have that problem even worse in that I have to work with a developer who actively develops like that. Every time he has to add a feature, instead of adding it to the relevant module, he'll add it to 2 or 3 different modules which have the knock-on effect of doing what he wants.
It took me months to take his spaghetti code and modularise it, and yet every time he commits there's still a constant argument about things being in the wrong place in the code.
And of course, management often loves that sort of dude. The burden of proof is usually on you to prove why your slower-in-the-short-term approach is better than that guy's approach, which will get working code into production two hours from now.
Well, if the manager is blind to your explanations, then it's better to search for a new job. Time is precious, and nothing good can come of this work.
Use a glass. Every time a manager asks you to implement a feature in an unrealistic timeframe, you drop one or more pebbles into the glass. When the glass is full, no more features can be added. When management gives you time to restructure your code, you remove some pebbles from the glass.
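That pebble-glass budget is easy to make concrete. A toy sketch in Python (all names are mine, purely illustrative):

```python
class PebbleGlass:
    """Toy model of the 'glass of pebbles' feature budget."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.pebbles = 0

    def rushed_feature(self, pebbles=1):
        """Each feature done on an unrealistic timeframe drops pebbles in.
        Once the glass is full, no more rushed features can be accepted."""
        if self.pebbles + pebbles > self.capacity:
            raise RuntimeError("glass is full: refactoring time before new features")
        self.pebbles += pebbles

    def refactor(self, pebbles=1):
        """Time spent restructuring code removes pebbles from the glass."""
        self.pebbles = max(0, self.pebbles - pebbles)
```

The point of the exercise is making the trade-off visible: management can always choose to keep dropping pebbles, but they have to do it knowingly.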
It can get worse when the managers are the ones who wrote the legacy code and are still actively acting as individual contributors themselves, instead of empowering their own people.
How about we add financial pressure to the mix that doesn't allow you to invest in anything, so you're forced to add even more weirdness to the pile of crap instead?
True, especially when management is much more invested financially in the company than rank-and-file engineers. That happens a lot in the European startup scene from my observations (as equity comp here isn't that appealing for most ICs).
> I suddenly realize it's in the compiler. It was the compiler. And every time you compile the original code and run it, it puts the subliminal message code into the source code. I'd heard of this before.
However, I imagine that Ken Thompson hardly had in mind some low-brow text manipulation hacks that go as far as to clobber the textual inputs to the compiler.
A coder's worst nightmare is working at a company that could desperately use good technology but whose management completely devalues it. The existentialist feeling of expending hard mental labor on something that no one cares about.
Great story, but I don't think it qualifies as a nightmare. It's just a really interesting and challenging puzzle. The fact that the author elected to work on it for free ("This is nerd war") clearly places it in that category. If you can exit a nightmare whenever you like, it's not really a nightmare. Speaking for myself, the only coder's nightmare I have is the one I think every coder has. It's the universal programmer nightmare: you don't really _know_ your stuff works, the way you can know that turning a key in a lock will open a door. Maybe you missed something. Maybe you didn't test something. Maybe it's going to break over the coming holiday weekend.
For some reason, I don’t find that story believable at all. It sounds like somebody read “Reflections on Trusting Trust” and decided to exercise their creative writing muscle by passing it off as their own experience.
My take on why this isn't a nightmare: he got great support from everybody involved. No extra pressure, no artificial deadlines, no unpaid work expected from him, and there was even little denial of the problem (except that his statements were not taken as truth without presenting evidence, which I think is fair).
True, but he discovered that had he written a new version from scratch, the new version would have been erased and replaced. It was detecting the question set -- he would have been required to put the questions into his version too.
So by declaring nerd war, he actually saved himself a day.
The problem itself was interesting, but a compromised compiler is absolutely a nightmare. It took that long to figure out even with blatant evidence of something fishy. How long does it persist when the vulnerability is subtle?
My worst nightmare is I publish a paper, post the code on github, and then someone contacts me pointing to a bug in my code that invalidates the main result of the paper.
Almost happened to me. My paper got accepted, and one reviewer suggested an improvement to how I did the benchmarks. Doing so I noticed a bug in my code, but thank god the bug was hindering the performance of my solution, so in the end my results were even stronger.
If, in scientists' or journal editors' minds, this is an actual reason not to publish source (as opposed to any other reasons they might have), then shame on them. I can imagine the shrinking feeling when something like that comes out, but hey -- you're a bloody scientist. At least it's for the public good.
This is indeed the most commonly cited reason, but as others have pointed out it's not a good one, and I'll posit it's not the real one either. The real reason is that, most of the time, nobody in the lab has any idea if it works or ever worked.
Yes, the feeling is not good. Really though, the reason in this case is the potential hit to the journal's reputation, not the feelings of the research team (who already paid the journal to submit the work, after all).
You might then suggest that the journal should be prepared to also review the code alongside the claims it produced in the article -- this seems like a good idea to me too, but in reality it would require hiring more reviewers (doubtful the tenured nonagenarians are up to the task), all while tightening the restrictions on submitters. It just doesn't make sense for a business.
The most compelling reason I've heard against releasing code is to encourage independent implementations. While others may find bugs in released code, they may also just use it as-is, extending it to test new ideas.
This could propagate any bugs in the code to multiple works. A similar problem occurs with blog code snippets that are written with shortcuts. Despite warnings that these snippets need to be expanded for production settings, they are often copy-pasted as-is into hundreds of code bases, propagating incomplete or buggy code.
Still, you could solve this in other ways. E.g. maybe reproduction-focused journals could require reimplementation.
Yes, and this already happens -- check out the "cluster failure" paper documenting bugs that had been baked into statistical processing packages for fMRI analysis for years. Whole fields of science have no doubt already been built on top of these bugs; now might be a good time to start writing unit tests.
If the team is good, it could pay off to continue working with them. Not that I would condone physical violence, and I definitely suggest a less blunt approach, but here is how Charlie Watts solved the problem:
> On one occasion, according to Richards, Mick was in the mood to do some recording -- at five in the morning -- so he called Watts on the phone and said, "Where's my drummer?" Says Richards, these were the days when Mick's ego had gotten onto everyone's nerves, that it seemed all was about him.
> Watts got into his car and 20 minutes later arrived at the place. When Keith opened the door Watts walked right past him, went over to Mick, picked him up by the lapel and slugged him in the face. "Never call me your drummer again."
I understand that negative feeling. Coming from a non-technical person or role, what wording would be better? Developer? Software architect (it's different, but let's assume non-technical people don't understand the difference)? Code monkey? What would be the best term we can find for our profession?
Engineer or developer is pretty accurate if you want to be nonspecific. Most of us have specific roles at any given time, as well as longer-term specializations -- just as there are no generic PhDs or scientists -- and it's usually better to identify those traits than to say something generic.
"Coding" irks me as well, probably because it's used uniquely by outsiders, in my experience. I tried to ponder why the word might be inappropriate for what we do, but didn't get anywhere. There's that obvious difference from "to program": "to program" identifies what we do, while "coding" identifies what we use. Another aspect is that "code" is probably a leftover from the punched-card programming days, when source code was very cryptic. Now we use programming languages, which are readable, so suddenly the "something cryptic" meaning of the word "code" isn't appropriate anymore. Of course, you could say that coding refers to the fact that all programming is conversion of intent into some form of messages, like processor instructions or ones and zeroes. In that sense a synonym for coder could be "intent digitizer". A bit more sci-fi.
I am a coder/programmer/developer/engineer/whatever it is called, and find "coder" and "coding" is fine. I am actually surprised to hear that "coder" has some negative connotations, despite working in that industry for quite many years.
You have multiple options, this is how to choose one:
1. "software engineer" (might drop the "software" for specific roles like "front-end engineer" or "machine learning engineer" etc.) - use this for people who value their education/fancy degree and/or who pride themselves on code quality, low bug/defect rates, reliability etc., and they'll love it!
2. "software developer" - use this for people priding themselves on craftsmanship and the "ability to ship the right stuff on time", with no respect for degrees etc., and they'll love it!
2b. feel free to mix up (1) and (2) and nothing bad will happen :) also, using (1) or (2) instead of the others will tend not to offend, just show that the person using it is a bit clueless.
3. "programmer" - this is the most direct / no-bullshit term for people who hate fancy extra words and pride themselves on practicality and getting stuff done (but it might offend ppl with fancy degrees or make them feel devalued - never use it for a PhD unless you know beforehand he/she would approve)
4. "software architect" (this is a dangerous one) - can be used to boost the self-confidence of smart but terribly insecure people (use it sparingly bc some might be pushed to the other extreme and end up with over-inflated egos bc of this... better to add "senior" before "engineer", it's more honest; what we call "architecture" in software is closer to "structural engineering planning" or smth, very far off from what building architects do) ...also, practical senior no-bullshit people might be offended if someone junior to them gets this title!
5. "code monkey" ...the '99 bubble was over, like, 20 years ago; forget about this, unless you're in SV or some other hip place and some club/group uses it for nostalgic value ...otherwise it's offensive, and God have mercy on your soul if you mistakenly use it in a place where ppl are not used to it, and at the same time use it to refer to someone with non-white skin color!
6. "coder" - might be acceptable in some places, but it's also cunningly devaluing (see sister comment for details)... unless you have the malevolent intent of "placing a hint of their personal irrelevance in someone's mind while at the same time avoiding offending them", please avoid it! (it might be the worst of all actually, bc nobody can complain, but at the same time you spray them with a depressing hint of their inferiority while mildly boosting your superiority and subtly lowering everyone's morale... and then as a manager you end up asking yourself why it all turned to shit despite you being "such an awesome and nice guy")
When you realize your code has caused grave injury to someone: hurt them physically, cost them their life, ruined them financially, or ruined a relationship.
Everyone pays attention to a plane or spacecraft crash. But bad software in cars, air bags, trains, and medical equipment has cost lives. Bad software at financial institutions has occasionally ruined people financially. Bad social media software has ruined relationships. Bad local software that crashed and took data with it could ruin people in ways many can't imagine.
Someone out there is using something as simple as a text editor to write their favorite book, keep track of their finances, or run their business. If the editor screws up their data, you can't imagine the hardship.
You never know how your users will use your software, don't take it for granted.
My most stressful times as a developer were always related to databases and manipulating production data.
Subtle corruption that silently degrades your whole company's data, hearing customers come back with complaints days after you've released your code because the stats don't seem to make sense anymore... And understanding, after you've found the issue, that you'll have to run batches to manually fix the bad data and restore everything that was affected.
That's definitely an experience I'll never ever forget.
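For those manual fix-up batches, one defensive pattern is to make the repair idempotent and give it a dry-run mode, so you can inspect exactly what would change before touching production rows. A generic, hypothetical sketch (no real schema assumed; dicts stand in for DB records):

```python
def repair_rows(rows, is_corrupt, fix, dry_run=True):
    """Plan (and optionally apply) fixes for corrupt rows.

    rows       -- iterable of mutable dicts standing in for DB records
    is_corrupt -- predicate identifying bad rows
    fix        -- pure function taking a row dict, returning a corrected copy
    """
    planned = []
    for row in rows:
        if is_corrupt(row):
            fixed = fix(dict(row))            # work on a copy, never in place
            planned.append((dict(row), fixed))  # (before, after) for review
            if not dry_run:
                row.clear()
                row.update(fixed)
    return planned
```

Running it once with `dry_run=True` gives you a reviewable before/after plan; only after eyeballing that do you run it for real.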
I too have nightmares about working at a place so poorly run that it would be possible for someone to delete the production database on their first day.
Surprised to see no one has mentioned 3rd-party APIs yet. Especially half-baked, poorly documented SaaS APIs where you're left wondering whether it was designed with malice - to protect some vague business interest - or simply out of incompetence / lack of care / lack of time.
I've found the best mental defence in such situations is to assume the worst from the outset, by expecting to have to burn untold hours figuring out how this black box _really_ works, and then be pleasantly surprised when some parts actually work OK. Better that than assuming "this can't be that hard" and ending up frustrated.
Exactly, and also frequent API changes for no reason. TensorFlow is a good example of this. It looks like every month they schedule a meeting on what they're gonna break next. They seem to enjoy the thousands of GitHub issues about cryptic messages on renamed nodes in neural network models.
Struggling through their newly released tensorflow-graphics library right now. They wrote it, published a few simple Google Colab tutorials and left it at that. No documentation anywhere else.
What is the basis for the belief that there was a single person, who wrote the program for the psychologist, and perpetrated the compiler hack?
This could be a case where person B was innocently working on the curses-based questionnaire program for the psychologist, while person A was aware of this work and hacked the compiler to recognize and doctor that program (including squashing it down to one line), in addition to /sbin/login and possibly others.
I'm curious how the program managed to include various headers, like <curses.h>. That requires multiple lines. I'm thinking that the hacked compiler took the preprocessed original innocent program and then output it as one line not requiring any preprocessing directives. Not even necessarily to obfuscate it, but because the division into lines was gone at the point in the compiler where this was done; i.e. this was a manipulation of the token stream of the program, and the tokens were just converted back to text and output as one line.
Could even be that this was all done in the C preprocessor rather than the compiler proper, by recognition and replacement of token sequences. Though modern compilers integrate the preprocessing phases of the language, this sounds like it was in a classic Unix environment, in which cpp was still a separate program.
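Whether it lived in cpp or the compiler proper, the "squash it down to one line" step is trivial once you are working on a token stream: re-emit the tokens separated by spaces and all line structure is gone. A toy illustration in Python (a real C lexer is far more involved; this regex only handles simple code):

```python
import re

# Very rough C token pattern: string literals, identifiers/numbers,
# one- or two-character operators, then any other single character.
# Not a real lexer -- just enough to show how line structure vanishes
# once you operate on tokens.
TOKEN = re.compile(r'"(?:\\.|[^"\\])*"|\w+|[+\-*/=<>!&|]{1,2}|\S')

def flatten(preprocessed_source):
    """Re-emit an already-preprocessed translation unit as a single line."""
    return " ".join(TOKEN.findall(preprocessed_source))

src = """int main(void)
{
    int x = 1;
    return x + 1;
}"""
print(flatten(src))  # int main ( void ) { int x = 1 ; return x + 1 ; }
```

This also matches the parent's observation that the `#include` lines would already have been consumed by the preprocessor before any such flattening happened.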
It is strange to think that someone with this level of skills was in a graduate program in psychology (as opposed to working in industry as an engineer).
> I'm curious how the program managed to include various headers,
Isn't it kind of obvious that this is a story, with some creative freedoms taken? The theme is an old one in hacker lore ("it was the compiler all along!"). And while you are completely right, this detail isn't important to the story.
Spending ages debugging javascript with alert boxes, publishing the final code, and then hearing that the homepage has an alert box that says "hi" on it.
Is this a regional thing? I haven't ever perceived it that way over the internet or here in Europe, but I've seen several comments who agree with you on HN.
I'm in NA and also don't understand why coder is negative. I remember it was very common in the old demoscene days. I've never even seen someone non-technical use the term. Usually non-technical people use programmer.
The one that irks me is engineer. I've met very few developers who have the sort of professionalism that the term engineer requires. Most developers are more like cowboys whose primary goals are making other developers and themselves happy. An engineer needs to instead feel responsible for serving the users of the application, which includes maintenance by others as a subgoal. No doubt my definition of engineering is not popular here.
I haven't seen it as a pejorative either, yet I would not refer to myself as a coder.
A good software engineer/developer has more than just code on their mind when getting something done. In fact, code is ideally the last thing you get around to.
Yeah, my first thought was of the guy who blogged the whole time an inoperable brain tumor was killing him. Over the course of multiple years I watched as it robbed him of his intellect. The cleverness faded. The subtlety faded. His posts started getting shorter, and then started getting sloppier. His grasp of grammar was failing. Shortly before he died his final post was "love you kids. love you wife."
I'm not sure there are objectively worse things than that.
But I think one could make a good argument that his final post indicates he had people who cared for him during his deterioration, in part because love for them had been a north star for him.
Maybe suffering such an end without that would be worse.
It happened to me about a year ago. I hated it at first but began to like some parts of it eventually. I'll never like the "paperwork" and the chores navigating inscrutable internal applications and processes, but having real influence over the direction of products is very satisfying.
Happening to me right now, although I'm going through with it and taking it upon myself to show the company/industry what a good tech manager can actually be.
I'm living that right now, except he has no MBA, is a consultant, has the dogged determination of an elderly employee on his last ever career project and used his leverage to bring in a bunch of unqualified architect yes-men.
This is easily the number one for me. The amount of damage someone like that can do (while swiftly being promoted up the ranks of course) is staggering.
Whenever I identify one, I start looking for jobs.
Bugs that you can reproduce perfectly in production but not at all in any of 20+ non-production environments is my usual one... and it usually winds up being some updated Oracle binary...
Better or worse than a bug that appears once every few months in Prod, but only when it's raining at the customer's location, and never on Sundays?
(This is an actual issue I've encountered. It had to do with code scraping weather data for a horse racetrack, specifically if the page was updated while the scraper ran. It was buried in a PHP monolith with no tests. The track didn't run races on Sunday.)
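One cheap defence against that kind of mid-scrape update is to fetch the page twice and only trust the parse when nothing changed in between. A hedged sketch -- `fetch` and `parse` here stand in for whatever the real scraper used:

```python
import hashlib

def scrape_consistently(fetch, parse, retries=3):
    """Parse a page, but only trust the result if the raw content was
    identical on a second fetch (i.e. nobody updated it mid-scrape)."""
    for _ in range(retries):
        first = fetch()          # snapshot the page once
        result = parse(first)    # all parsing works off that snapshot
        # Re-fetch and compare: if the page changed while we parsed,
        # discard the result and try again.
        if hashlib.sha256(fetch()).digest() == hashlib.sha256(first).digest():
            return result
    raise RuntimeError("page kept changing underneath us; giving up")
```

The real fix in cases like that PHP monolith is usually even simpler: fetch the raw bytes exactly once and never touch the network again during parsing.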
I would add 'unforgiving use of linters' to that, where 'unforgiving' generally means that straying even a hair's breadth from the default settings is forbidden.
Especially so when you can't adjust those defaults, thus making your post-lint code substantially more abstract and less readable than it would have been at first.
Case in point: forcing your simple 11-line function to become something more complicated that has to maintain state through each call.
I'll second that. Code reviews are -- quite often -- a senseless sh_tshow of arrogance, pedantry, and psychological issues that have nothing to do with the code.
Being asked to "improve" a piece of internal software with not a single hint of what actually needed doing. After two meetings where nothing more concrete than "make it better" came up, I quietly continued working on actually achievable stuff until I left a few months later. When I checked up on my colleagues a few years later the damn thing was still in limbo.
Obtaining poor requirements, then being held accountable for maintaining the feature and codebase despite having had little influence on the decisions, especially if the people behind the requirements have now left the organisation.
Being hired by a company and realizing afterwards that you must work on a project that's designed and built on potatoes, using potato languages and potato tools.
Working in a company where I have only one monitor and a PC without an SSD. And I'm not allowed to bring my own mouse/keyboard. And I don't have admin rights for the OS.
I totally agree with the general idea. Especially the mouse.
However, since I bought my 27” 4k monitor I've pretty much stopped using extra ones, unless I have to for something very specific, like debugging an app which outputs to multiple monitors.
This story of course makes obligatory a link to Reflections on Trusting Trust, Ken Thompson's Turing award lecture, which first popularized this concept.
Going viral on the internet for an embarrassing situation... if that ever happens, I will finally realize my dream of becoming a hermit in the wilderness.