Unbundling Work

Recently I had a conversation with an infrastructure team supporting an ML modeling group. The two orgs used to collaborate to ship experiments: the modeling team would come up with ideas, the infra team would augment their frameworks and build out tooling to make those ideas scalable. Together, they would ship an experiment every couple of weeks. Now the modeling team is largely making the framework changes and performance improvements themselves, thanks to coding agents, and are shipping a few experiments every single week. The infra team are still busy, but they are firefighting and debugging when the agents get stuck. The modeling team are much more productive, undeniably, and all the humans are busy, but the work for the infra team has ended up somewhat worse.

If you are a tech CEO who has recently returned to coding, you could look at the team doing the lower-scale firefighting and think “do I need these people?” If you keep taking that question to its conclusion you eventually ask… do I need anyone to do anything at all?

This question, helpfully, predates the term AI¹: back in the ’30s, Coase wrote his theory of the firm on why companies do some things in-house and buy others from the market. For a brief period in the early 00s it looked like software jobs would go to the market, thanks to outsourcing. This largely didn’t happen because, as Coase predicted, specifying a project is tough. Creating software is an iterative process; you don’t know exactly what you’re building until you start, so you need people with technical taste making decisions in a consistent way.

There are a lot of Steve Jobs stories with this flavor. For one, Jobs wasn’t happy with the jiggling of icons when you held them down on the iPhone to remove them. The team built a UI with sliders so he could adjust the jiggle rate until he was satisfied. Once perfected, the effect was easy to copy, but assembling a group that cares about those kinds of details is hard.

One way to find those people is to train them. Gary Becker wrote about human capital back in the ’60s; in his framing, some training imparts skills that are marketable, and some that are firm-specific. Companies will pay for firm-specific training but are less keen on paying for marketable skills, because rivals can free-ride on that investment by poaching employees once they are trained.

From The AI Becker Problem:

“If Company A invests time and money to turn a raw college graduate into an expert, Company B can hire that person after five years of experience for a higher salary, collecting the benefits of skills Company A paid to build. In the past, firms tolerated this risk because juniors were producing valuable work along the way. Without that value, the economic foundation of apprenticeship collapses entirely.”

This shows up in the L3-L5 progression at big tech companies. I’ve seen many hiring managers hesitate to hire an “industry four” because they don’t yet have the rounded, marketable skills the manager wants. But within the companies they have (effectively) apprenticed at, L4s contribute a huge amount of value. Is AI blowing up that trade-off?

The author of the AI Becker note, Luis Garicano, recently put out another paper on AI disruption asking when AI actually displaces jobs. In Garicano’s framing jobs are bundles of tasks and responsibilities; AI’s impact depends on how tightly these tasks are tied together.

“In a strong bundle, breaking the job destroys enough value that the job survives as a whole: AI assists but the human still sells the full service and retains a large share of revenue. In a weak bundle, the cost of splitting is small: AI replaces some tasks, the human role narrows, and the labor share falls.”

Software engineering involves writing code, operating services, decomposing problems, and aligning with others (both project-wise and culturally). Current AI coding agents attack part of this bundle, but humans comparatively excel at social dynamics and maintaining the larger world view necessary to know which problems to focus on.

At the senior levels the ties seem strong: you can take the coding and task breakdown out of the job, but those were never the main value of your L7-9 engineers anyway. At less vaunted levels, companies will need far fewer software engineers to churn out code than they employ today. But as the ML infra example earlier showed, that doesn’t necessarily mean you don’t need some of the other things those engineers can do.

This opens a risk for the business: if you need senior folks but don’t have enough valuable work to justify training them yourself, you are stuck paying market rates for increasingly rare talent. Right now, if you happen to have, say, scaled LLM post-training experience, you can command a very significant salary. Or just start your own company.

Hiring is hard even for deep-pocketed executives when key skills are firm-specific rather than marketable. Apple generally can’t go out and hire the kind of people with the taste it develops internally. But how much are firms willing to roll the dice on developing the next Jeff Dean, and how much are they willing to risk someone else hiring them away?

For a similar dynamic, look at investing. Over the past decades, much of the junior analyst work that undergirded investment firms has been replaced by automation. The structure that emerged was the pod shop, or more formally a multi-strategy hedge fund. They operate more like a platform that hosts “pods”, each led by a portfolio manager who is supported by analysts, data scientists and traders. Each pod has its own domain of speciality, and its own profit and loss. The firm centrally manages risk and allocates capital to pods. Successful portfolio managers earn a healthy percentage of the profits they generate, while unsuccessful pods are taken out behind the woodshed and shot. This both gives a talent development pipeline and a rigorous performance standard, albeit not a very collaborative one.

This works, in part, because there is a very clear scorecard, measured in dollars. We might be able to copy the structure in engineering teams, but actually evaluating how well things are going is hard!

Firms that have the highest dependence on people you can’t easily hire are exactly the ones who are at risk of struggling in this transition: they have the most need to grow their own people, and the least economic reason to do so. Apple can’t buy another Apple, and neither can anyone else.

  1. Back when people still used the term Cybernetics. AI researcher drama is literally as old as AI. ↩︎
