Five times the data. Same number of people. No new hires. The only way through? Rethink the architecture.
The same 8-person data team. The company went from 50 to 150 employees in 18 months. During that growth, I watched them spend 6 months trying to hire senior data engineers. They couldn't find them; the market was dry, and their budget couldn't compete with enterprise offers.
Instead of waiting, they stopped and redesigned the work. dbt and Airflow took over the boring pipeline chores. Then, domain teams began to own their own data. Suddenly, there was less that needed a specialist.
This change forced a decision point: Which pipelines actually need a senior engineer? Which ones can a domain team run if you give them guardrails? It turned out most of it could move.
For example, marketing had been requesting access to the pipeline for months. The data team kept saying no; it was too risky. But when they finally handed it over with guardrails, marketing broke it once, then fixed it themselves. That self-sufficiency never happened when everything ran through the data team.
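"Guardrails" in a setup like this usually means automated checks that fail a change before it ships, so a domain team can safely own a model. A minimal, hypothetical sketch using dbt tests (the model and column names are illustrative, not from this team):

```yaml
# Hypothetical dbt schema.yml: tests act as guardrails so a domain
# team can change a model without silently breaking downstream data.
# Model and column names are illustrative, not from the original story.
version: 2

models:
  - name: marketing_campaign_spend
    description: "Owned by the marketing team; CI runs these tests on every change."
    columns:
      - name: campaign_id
        tests:
          - not_null
          - unique
      - name: channel
        tests:
          - accepted_values:
              values: ['email', 'paid_search', 'social']
      - name: spend_usd
        tests:
          - not_null
```

With checks like these running in CI, a domain team that breaks something sees exactly which test failed and can fix it themselves, which is roughly what played out here.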
As a result, analytics turnaround stayed at 3 days, even with three times the demand. The team didn't work faster; they just stopped doing the work that didn't matter.
Reflecting on all this, I used to think you could hire your way through scaling. I don’t anymore. When hiring gets hard, it’s usually a sign that the architecture needs to change first.
What role has been open for 90+ days? What if the team didn’t need it at all?
