I graduated with honors in computer science. I could prove a language Turing complete. I could explain computational complexity. I could implement sorting algorithms in my sleep.
On my first day at work, I realized I was completely unprepared.
Not unprepared for the technical work. Unprepared for everything else. The actual job.
What School Taught Me
My degree was thorough. Algorithms, data structures, discrete mathematics, operating systems, databases, networks. Everything a computer scientist needs to know.
Except almost none of it was what I actually needed to know.
Don’t get me wrong. That foundation was valuable. But it was maybe 10% of what I needed to learn. The other 90% wasn’t taught because it can’t be graded.
The 90% You Learn at Work
How to read code written by someone else. How to understand a codebase you didn’t write. How to ask good questions when you’re stuck. How to debug production issues at 2 AM when something is broken and people are depending on it.
How to work in a team. How to take feedback without getting defensive. How to give feedback that helps instead of hurts. How to communicate what you’re thinking to people who don’t know what you’re thinking.
How to estimate how long something will take (spoiler: you'll be wrong, but less wrong over time). How to prioritize when you have more work than time. How to say no without burning trust.
How to learn something you’ve never seen before in a week because the job requires it. How to ship something imperfect because perfect is the enemy of shipped. How to iterate based on real feedback instead of imagined requirements.
How to handle technical debt. How to argue for something you believe in when senior people disagree. How to change your mind when you’re wrong without losing credibility.
How to deal with ambiguity. How to make decisions with incomplete information. How to live with being wrong sometimes and learning from it.
None of this was on an exam. None of this could be graded objectively. So none of this was taught.
Why Schools Can’t Teach This
Here’s the thing: these skills require real context. You can’t learn how to debug production code from a textbook. You need actual production code that’s actually broken.
You can’t learn how to work in a team from a classroom where everyone is trying to maximize their individual grade. You need a team that actually depends on each other and has real incentives to collaborate.
You can’t learn how to make decisions with incomplete information from homework where all the information is provided upfront. You need real ambiguity and real stakes.
Schools are built around being able to measure things. How do you measure “good at working with ambiguity”? You can’t put it on a rubric. You can’t give it a numerical score. So it doesn’t get taught or assessed.
The result: graduates with deep theoretical knowledge and zero practical ability to do the actual job.
The Illusion of Preparation
And here’s the frustrating part: schools tell students they’re preparing them for the real world. “You’re learning what you need to know for your career,” professors say.
It’s not true. Or at least, it’s only partially true.
You’re learning some of what you need to know. The 10% that can be standardized, tested, and graded. The rest you’ll learn by doing it, making mistakes, and figuring it out.
Schools could be honest about this. “Here’s the theoretical foundation. Here’s what you’ll need to learn on the job.” Instead, they imply that a degree is a complete preparation. It’s not.
What Actually Works
The people who are most prepared when they start their first job aren’t necessarily the ones with the best grades. They’re the ones who built things beforehand. Who worked in internships. Who contributed to open source. Who had real feedback loops on their work.
They learned the unmeasurable skills by actually doing the work. Not in a classroom. In the real world.
Here's the insight: you don't need a classroom to learn these skills. You just need to build things and get feedback. You need a problem that matters. You need someone to review your work and tell you what's wrong. You need to iterate.
You can do all that without a degree. In fact, you can do it better without a degree, because there’s real urgency. Real stakes. Real feedback.
The Skill Gap
So you end up with a weird situation: someone with a degree in CS is often less prepared for their first engineering job than someone who spent four years building projects and contributing to open source.
The degree holder has theory. The builder has practice. In the real world, practice wins.
This isn’t unique to tech. A business graduate doesn’t know how to actually run a business. A communications graduate doesn’t know how to actually navigate office politics. An education graduate doesn’t know how to actually manage a classroom.
The gap between what’s taught and what’s needed is massive. And it’s systematic. It’s not a flaw in a particular school or program. It’s baked into the model of standardized education itself.
The Real Curriculum
Here’s what I wish schools would say: “We’re going to teach you the foundations. The rest you’ll learn by working. Here’s how to learn effectively. Here’s how to get feedback. Here’s how to iterate. Here’s how to build things that matter.”
That would be honest. That would be useful.
Instead, they imply that the degree is the preparation. That you’ll walk out ready to work. You won’t. Nobody does.
The person who gets hired first is the one who already learned the unmeasurable skills. By building. By doing. By getting real feedback on real work.
Tomorrow we look at the cost of all this. The creative potential we’re crushing by teaching for tests instead of teaching for work.
