this post by ben northrop, Reflections of an "old" programmer, struck me a bit, mostly because it covers two topics: an attempt to understand the role of a senior-level engineer and a meditation on the cycle of churn in tech.

i can't speak to the first part: even though i've been around a while, i don't know that i can speak to his experience or situation. but i can speak to the feeling of "fatigue" and "boredom".

i feel somewhat confident talking about my experience partly because

  • i first stared into the void when i discovered the seemingly endless rat race of MCSE certification. but also
  • i seem to have made peace with tech churn's place in my life (to some degree).

there are two things i can point to that changed my perspective on churn. the first was my math+cs undergrad degree, the second has been my work on the web, on and off, for the past 15 years or so.

the degree

there's this line in cs education that most people encounter at some point

computer science is neither about computers nor science.

The statement sounded pithy in the first few minutes of 6.001, but two years later, knee-deep in an automata class learning about computation, state machines, dumbed-down proofs, oracles, and decidability, the line resounded in my head.

The other thing which probably hurt me was my school's nearly allergic reaction to talking about and dealing in actual implementations [1]. I'm told this has changed, but at the time i think it was perhaps a bit too principled a stance on doing the right thing. So I, at least, spent a lot of time thinking about good solutions & approaches and less time implementing naive ones that would probably have worked just fine.

This fledgling caution was partly a reaction to seeing the results of bad (sometimes disastrous) theoretical implementations in classes and then reading case studies about even worse practical implementations: you develop a simple pattern matcher for hubris and for the rabbit holes you've seen other projects fall into. You decide not to try your hand at implementing halt(f) anytime soon.

When i tell people why a CS education is valuable, i usually talk about this pattern matching: you develop a good spider sense for counterintuitively bad edge cases, you become highly sensitive to polynomially disastrous performance from ten thousand feet, and you have a good understanding of why architectures were designed the way they were simply by looking at their exposed APIs. You can absolutely learn to spot these things, to understand this behavior outside of school, but it can be easier when pattern matching is your full time job.

ideas

when i applied to design school, i cribbed a line i will paraphrase from ellen lupton:

design is a tool for communicating complex ideas

there was this thing going around twitter this summer where you'd list the first six programming languages you learned as a kind of guilt-free rorschach test. Here was my list

  • Basic 1989
  • C 1996
  • RPL 1997
  • HTML 1998
  • CSS 1999
  • PHP 1999
  • Java 2000
  • LPC 2000
  • SQL 2000
  • Scheme 2001
  • C 2001
  • Common Lisp 2003
  • Python 2003
  • ELisp 2005
  • JavaScript 2005
  • Ruby 2005
  • CSS 2005
  • XSLT 2006
  • Erlang 2006
  • Actionscript 2008
  • Objective C 2009
  • R 2010
  • JavaScript 2010
  • Python 2011
  • CSS 2011
  • Swift 2013
  • ES2015 + Flow [2] 2015

The thing i think most about when i look at this list is that in each of those years, i was applying or re-learning a language while attempting to solve a new flavor of problem with it (or refining some approach in a problem domain).

if you take a computability class like the one i mentioned earlier, you are taught that the kind of language you need to express all possible computable ideas is pretty straightforward, even if cumbersome. My takeaway was that any syntax beyond that basic turing-machine-language is an ideologically-aligned mechanism for expressing ideas about some problem domain.
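
one way to see that claim, as a sketch in javascript rather than anything from the computability class itself: church numerals, built from nothing but single-argument functions, are enough to do arithmetic, and exactly as cumbersome as advertised (the names below are just the conventional ones).

    // numbers encoded as "apply f this many times": the entire vocabulary is functions
    const ZERO = f => x => x;
    const SUCC = n => f => x => f(n(f)(x));
    const ADD  = a => b => f => x => a(f)(b(f)(x));

    // escape hatch back to ordinary numbers so we can inspect a result
    const toNumber = n => n(x => x + 1)(0);

    const TWO = SUCC(SUCC(ZERO));
    console.log(toNumber(ADD(TWO)(TWO)));   // 4, expressed very laboriously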

People talk about Domain Specific Languages (DSLs) all the time (often dismissively!), but all general purpose languages are domain-specific. Nobody writes C because they are in love with memory management; you write it because it's easy to compile for most general purpose CPUs. Or you write it because you're tired of assembly. You don't write it because you feel like mallocing your way to health. And even if you do, you're probably looking at Rust sitting pretty in the corner of your hard drive over there. All languages, from Postscript (printing) to Lisp (list processing) to Java (platform-independent vm), are built around solving some problem.

Sure, the purists say, you could get away with just lisp, but there are tons of languages out there; why aren't they all lisps? The answer is simple: lisp is a very nice and expressive round hole, but some problems are quite literally square pegs that have their own DSLs built around them. Sure, you could recreate its api in lisp (or transpile!), but why not use that dsl directly? It was made for a reason, and that reason is (almost always) to be expressive and relevant to some problem domain (or, more interestingly, to address the limitations of a bygone era's operating system, or hardware limitations, or cultural norms). I'm not saying you should program your next startup in COBOL, write cgi-bins in C, or use the Apache webserver, but have you ever wondered why whole generations of programmers did?

the churn

two parts of ben northrop's post jumped out at me

If I learned nothing else from this point forward, I bet that only about a half of my knowledge could I still use in 2026 (long live SQL!), and the other half would probably be of no use (React Native, perhaps?). Now of course I will be gaining new knowledge to replace the dead stuff, but will it be enough? Will I know more (useful) knowledge in 2026 than I do now?

...

Some of what we learned early in our career is now out-dated. All that time "we" (read: I) spent learning GWT? Lost! Essentially, both forces, knowledge decay and knowledge accumulation rate, begin to work against us.

I think most professional programmers have grappled with these two aspects, so i want to address them.

what will i retain?

i didn't put jquery on that list, but i think lots of people invested a significant amount of their brain real estate in being productive with jquery. But now "everyone" uses react or angular or vue, or at least not jquery, and what of all the mental energy you invested solving the web's problems with jquery? is that (like GWT/prototype/mootools/closure/et al) lost?

why is jquery great? because it papers over browser inconsistencies. at some point, though, we mistook it for a dom interface and its own tech stack. that's fine, but during those heady days we learned to harness the web: we finally had robust tools for making xhr calls, we began to tame animation, and we learned what ui we actually needed to build to make productive web pages.

At some point, we realized that lots of those things had become standardized across the browsers we actually cared about, and for some reason it was as if we needed to cut jquery out of our lives like trans fats or sugar or e-cigs. that reaction is odd because it ignores the fact that jquery still has a purpose (albeit a very different one now that we've made a habit of casting aside older browsers)!

still, regardless of your feelings about jquery now, you have to admit that during that time, we learned how to make web applications where before we were just struggling to make something jiggle consistently at 13fps during hover events. The lessons learned about toggling ui element styles in response to some user action may be obviated by a more efficient dsl or library (or ux pattern), but that doesn't change the fact that our relationship to solving that problem is now a question of "how do i do this thing nowadays" vs "is it even possible to do this thing?"
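
as a small sketch of that shift (the selectors and endpoint below are invented for illustration), here's the same "respond to a click, toggle some ui, fetch some data" problem in the jquery idiom and in the standards-track one we eventually got:

    // the jquery-era version: one api papering over event, class, and xhr differences
    $('#refresh').on('click', function () {
      $('#panel').toggleClass('loading');
      $.getJSON('/api/items', function (items) {
        $('#panel').toggleClass('loading');
        // ...render items...
      });
    });

    // the "how do i do this thing nowadays" version: addEventListener, classList, fetch
    document.querySelector('#refresh').addEventListener('click', async () => {
      const panel = document.querySelector('#panel');
      panel.classList.toggle('loading');
      const items = await (await fetch('/api/items')).json();
      panel.classList.toggle('loading');
      // ...render items...
    });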

i go back to my cs education being a big pattern matching thing: is the fundamental takeaway from a tech experience "wow, i got really good and efficient at using that api" or "i have developed techniques to solve a series of problems"? it's fair to have both takeaways, but not at the expense of ignoring the more fundamental "hot damn, i know web kung fu."

all that time lost

i think people underestimate that lots of programming 'churn' is a reaction to the fact that lots of problems we have now, we didn't have ten years ago. Is the point of React Native to develop iphone apps, or is it, like jquery, a tool for papering over the fact that it's a tremendous pain in the ass to develop for two mostly different deployment targets at the same time?

Hey, all you folks who used Swing/AWT/Qt in the 90s, how did that experience turn out? When i looked at react native, it was tempting to shrug it off for all the same reasons: boring ui, terrible ux, hilarious java-flavored api, etc.

but i also think about all the lessons i learned from exposure to those systems. maybe those lessons make me a cautious RN dev, or maybe they just give me a different perspective. Who knows. I don't think that's a negative, and certainly, i am able to use that perspective to evaluate whether RN is the API i would choose to set sail with for my next cross-platform journey.

I find it easy to accidentally conflate "time i spent learning arcane details of this in javascript" with "something important i learned about solving a class of problems." Sometimes they're helpfully related, but other times one is just useless trivia, or a detail that is optimized away by a transpiler or minifier (which both know my cargo culting is redundant), or caught by a linter (which is better at remembering thousands of stupid rules than i am), or hopefully by code review, or eventually, in a year or so, by the vm.

I try to be hawkish about differentiating trivia from conceptual knowledge because, above all, i remain haunted by the career path of somebody who needed to take those stupid mcse certifications every year. We're lucky that nobody is asking us to take yearly exams about the inner workings of VMs or apis to have a bullshit stamp we can show off to people. It's easy to conflate Keeping Current And Up To Date with having one of those stamps, but that's trivia; our job remains solving problems.

not having fun anymore

The counterargument to all this is every situation where we've been forced or "encouraged" to learn some tech we morally disagree with. It happens all the time, even when it's not foisted on us from up high. Want to use a traditional (oracle/postgres/mysql) database? it's time to make peace with SQL (if you dare). Turn up your nose all you want at javascript, but it's the only game in town for writing clientside webapps (unless you transpile from coffee, elm, et al.). or, until recently, if you wanted to write a performant app for ios, you needed to have The Talk about manual memory management.

I think it would be unfair of me to discount the fact that sometimes we're put in positions where we spend an outsized amount of time solving problems that aren't actually the ones we were hired to deal with:

  • You want to write a webapp, but instead you need to know a smattering of linux system administration
  • You just want an accordion menu, but you need to debug a memory leak in some library you imported
  • You are trying to do string formatting, but a dependency broke semver out from under you; now you manage a private npm repository and are a wizard at pinning and shrinkwrapping your app.
  • You invest 40+ hours integrating a complex and poorly crafted library in your codebase, learning its mental model only to realize it's not suitable for production use or that making it usable would require the same amount of time as having written it from the outset.

These are the sorts of "orthogonal concerns" that i hear about when people talk about "wasted time" or "experience that doesn't help my career", except that a good percentage of the time that experience is your career. [3] And all those experiences, I believe, are valuable data points that can (and should) trigger your spider-sense in the future when you're stuck with a broken dependency or need to assess how much it will delay a feature.

It's fair to assert that those are the kinds of pedestrian concerns you don't want to deal with in your career. Like i said, this profession is all about solving problems, and eventually you either delegate them away, solve them yourself, or develop some coping strategy for how they share space with your day-to-day life. It's fine to feel grumbly about good tech that got the raw end of some deal. Lots of people write their JS as if it were lisp and a ton write it as if it were C++. Everyone takes their experiences with them on new journeys.

who knows where the time goes? [4]

As i've gotten older, i think about how even something as "trivial" as a Javascript Promise solves problems i didn't have when i was building my first web application with PHP. Wow! Such power! New language features have the capacity to expand our conceptual vocabulary, so in the future we can talk productively about and address problems we didn't have before.
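
to make that concrete, a minimal sketch (the endpoints and field names are invented) of the kind of in-page coordination a Promise handles that a request-per-page PHP app never faced:

    // two dependent requests, sequenced without nesting callbacks,
    // with one place to catch a failure in either step
    function loadProfile(userId) {
      return fetch(`/api/users/${userId}`)
        .then((res) => res.json())
        .then((user) =>
          fetch(`/api/avatars/${user.avatarId}`)
            .then((res) => res.blob())
            .then((avatar) => ({ user, avatar }))
        );
    }

    loadProfile(42)
      .then(({ user }) => console.log('loaded', user.name))
      .catch((err) => console.error('either request failed', err));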

It's fine to love a boring stack, especially if the problems you're solving aren't changing much or if those boring parts of your stack are solving a commodity need.

In general, though, I reject the idea that adopting a new patois is pointless when thinking and learning about new or novel problems. Or that the problem space has remained static since the 60s (which is reductive). Solving problems is what we do all day long, so it's confounding to hear a drumbeat of folks dismissing out of hand that today's networks and norms around ui have completely changed the way that we approach problems like latency, optimistic updates, partition tolerance, reconciliation, and so forth. In the past, you could make a good case that the audience for these sorts of 'niche' problems was in the tens of people; now they are part and parcel of something as boring as a shared TodoMVC app.

Even if your work never directly interacts with that lower layer, your UI should be aware of its effects. Is that useless knowledge? People used to talk about paradigm shifts. Expanding a widget's states from [on, off] to [mounting, no-data, off, loading, on, unmounting] is a legit paradigm shift. Even if the widget doesn't reflect all those states, how do you go back to [on, off] when you know that a component's behavior is dependent on so many more things? We've gone from a world where we could expect computers or buttons to either work or not to a world where our digital data feels more like high-resolution analog input.
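
a quick sketch of what that expanded vocabulary looks like in code (the state names come from the list above; the render helpers are hypothetical):

    // the widget's rendering now depends on mount status, network status,
    // and whether data ever arrived, not just a boolean
    function render(widget) {
      switch (widget.state) {
        case 'mounting':
        case 'loading':
          return renderSpinner();           // hypothetical helpers
        case 'no-data':
          return renderEmptyState();
        case 'on':
          return renderWidget(widget.data);
        case 'off':
        case 'unmounting':
          return renderNothing();
      }
    }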

This is partly why I remain optimistic about even transitional blips in our knowledge, flashes in the pan whose utility is replaced by standards-track tech. At their best, they give us new ways to approach old problems and new ways to talk about things we previously ignored. How is that not a net positive?


  1. at least at my level of education, the only practical places to actually Make Things Go were on your own time: at MITERS, in your dorm, in a club, or in an undergraduate research job. But even then, you would encounter the comic upturned nose at a hack and the accompanying semi-sarcastic rejoinder "what do you mean, no closed form solution!?"

  2. These days, i find something like Flow fascinating because it's such a change in mindset from the classic approach of strictly typed languages: instead of rocking the type-hairshirt from the outset of your next multi-million dollar web-scale paas, you can decide to strategically invest in typing the areas of a codebase that are type-sensitive while ignoring the ones that aren't. That's not a "new idea" by a mile, but the approach is refreshingly modern for something which has had historically high startup costs in traditional languages.
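
    a rough sketch of how that plays out (the function below is invented): a single file opts in with the // @flow pragma and gets checked, while its untyped neighbors are left alone.

        // @flow
        // only this module opts in to checking; the rest of the codebase stays untyped
        function totalCents(prices: Array<number>): number {
          return prices.reduce((sum, p) => sum + Math.round(p * 100), 0);
        }

        totalCents([1.5, 2.25]);        // fine
        // totalCents(['1.50']);        // flow would flag this call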

  3. i made a snide comment about this on twitter because i think most people are pretty blasé about this point: if you take on a dependency, you should be prepared to own it in some capacity (especially if this software is critical to your livelihood). the person i owe this mindset to is, unsurprisingly, craig silverstein, who got me into the habit of forking each dependency and assuming responsibility for merging (and dealing with conflicts from) upstream changes.

    i mention this because it's in vogue to complain about some package that a developer has stopped maintaining and pin the worries (as in the mailing list post from the twitter thread) on concerns like "we aren't able to commit to the main repo" or "the package has stopped receiving active maintenance." Two points here: the first problem is a non-problem, especially if the community is larger than just the missing developer. Anybody looking for the package will run across the new fork and see its active development history, or discover a mailing list post (or stack overflow mention) of the newer package. The second is probably the bigger issue: if there are active patches downstream just waiting for a maintainer, a hostile fork is already at an advantage over the original project. But if the project is beloved by the community but is receiving no active contributions, what exactly is the problem? Further, if the project has met its stated goals, what responsibility does it have to merge any patches? As anyone who has contributed to a project knows: unsolicited work that doesn't solve an existing pain point often goes unmerged. But also, if nobody's contributing work, who's there to complain about a dying project?

    Many people, i think, lump free/open source software into a bucket of "things i'm glad i didn't write and am glad to take advantage of" without acknowledging just how integral they are to the basic function of their product. It's fine to resent a lib or app for not being a fully fleshed-out product, but it's irresponsible to ignore the fact that without them, you'd be on the hook for writing that same code yourself. This is the argument for open source as critical infrastructure. When the access road needs maintenance, do you pay to have it repaved in the interim, or close up shop?

  4. the title of this blog post was originally this heading, but it felt too big and too weighty. instead, i replaced it with unhalfbricking, which is the name of a Fairport Convention album (they also have a song called Who Knows Where The Time Goes). It's good! I would recommend it heartily!