ChatGPT: Climbing the Engineering Ladder Requires More Than Just Code Generation


Like many other folks in the tech industry, I've been going down the rabbit hole of understanding ChatGPT and other recent generative AI advances.

There's a tremendous amount of hype coming out, but underneath that hype there's also a lot of reality. These are super powerful technologies, and they are already changing the way that software engineering is done.

But I think it's important to highlight where the hype has gotten way out ahead of reality.

In the first episode of the Latent Space podcast, Logan Kilpatrick from OpenAI asserted that ChatGPT's programming capabilities would climb from junior-engineer level to principal-engineer level as it got better at code generation.

This is a fundamental misunderstanding of what higher-level engineers do. ChatGPT is indeed climbing a learning curve, generating better and better code for well-defined problems with well-defined context. But beyond the senior level, most of engineering is about problem identification and definition, working with a large body of implicit, often poorly defined context.

Understanding Staff-Level Engineering

To understand the complexity of engineering roles beyond the senior level, let's explore some examples of what staff-level and higher engineers do:

  1. Maintaining Contextual Knowledge: Staff-level engineers create and maintain a large set of context in their heads, blending explicit written content (current state of codebase, written design docs, etc.), explicit verbal content (conversations about what needs to be done and why), and implicit information (mental model of the current state of the business, business goals, relationships with stakeholders, relative importance of different timelines).

  2. Clarifying Outcomes and Constraints: Using the context they have, staff-level engineers ask key questions to clarify desired outcomes, constraints, and decompose problems into useful chunks. They also estimate the difficulty of these tasks in their company's unique context.

  3. Reframing Business Problems: Staff-level engineers find ways to reframe or constrain business problems to translate impossible or very difficult challenges into things that are understandable and achievable.

  4. Identifying and Solving Complex Patterns: Staff-level engineers identify patterns and recurring problem areas, which typically occur at the boundaries of systems and interfaces. These issues are frequently not well documented and may have combinatoric complexity. Staff-level engineers diagnose them through experimentation and iterative problem solving, commonly working from very incomplete data to debug scenarios that have rarely, if ever, been described or documented before, and then design solutions.

The Realities of Advancing in Engineering

It's important to note that nowhere in these examples is "write better code" a key aspect. Being able to write code (even relatively complex, full systems) is what will advance you from junior to mid-level and from mid-level to senior. But it is merely table stakes at the upper levels of engineering.

ChatGPT is a solid code-generation tool. So are tools like GitHub Copilot and the growing set of open source alternatives. They can phenomenally improve our productivity as engineers, and once problems are well defined they can make generating code to solve them much faster.

But we shouldn't overestimate what they can do. Advancing in your career as an engineer requires you to move up the conceptual stack from straightforward implementation to problem identification and definition, coordination and translation between engineering & other domains, and interpersonal leadership.

These are not skills that ChatGPT is likely to replace soon, if ever.


Note: I actually used ChatGPT (with GPT-4) to help write this post. Once I knew what I wanted to say, I was able to prompt it to get a solid first draft. But that draft needed a fair amount of editing. And without the guidance of what I wanted to say, it wasn't able to generate anything very useful. Generic prompts or questions around the premise of ChatGPT replacing principal engineers produced very uninformative responses. The situation is very similar with writing code. Once you know what you want, the AI is a phenomenal assistant. But the work of defining what you want is the core problem to start with.