- Noah Jacobs Blog
On XP Farming
Or why you should still do hard things.
2026.03.01
CXLI
[Doing Hard Things; Aggressive Purple Belts; 10X Seller; LLM Hegemony; Thinking Like Investors; Incentive Issue; Intellectual Serfdom]
Thesis: Doing hard things is how to get better, and getting better still matters, even when you’re being sold that it doesn’t.
[Do Hard Things]
Breaking news: you get better at something by doing that thing.
And usually, to get really good, you have to spend a lot of time in discomfort. The less comfortable something is, the more likely it is you're getting better at the skill.
This is true for jiu jitsu, selling, and programming.
Like it's true for any skill—debating, modeling, accounting, writing, etc.
LLMs don't change that.
[Aggressive Purple Belts]
When I was rolling this week, I had a particular purple belt be quite aggressive with me.
To me, this is very flattering. I am an upper ranked blue belt, which is lower than a purple belt.
I was actually explicitly told by a purple belt that I am good enough that most purple belts will have to be aggressive and high pressure with me if they want to win, because I've eliminated a lot of the easy paths.
Two things I'm seeing used against me now:
Camping: Putting their weight on me in such a way that I'm exerting force to keep the position stable, but they are not exerting any force
Timing Pressure: Aggressive top pressure that increases in time with my exhale to make it harder for me to inhale
Objectively, these both suck! They make it harder for me to think clearly and play right. But I'm actually really happy it's happening, not just because it's flattering, but because it's how I'm going to get better.
It shows me where I'm weak and reveals what I need to do to get better.
If I was just ripping omoplatas on players who were less experienced than me, I'd only be getting better at part of the game. Learning to defend and escape and sweep when a better player is pressuring aggressively sucks the most but is definitely the highest leverage part to get better at.
[10X Seller]
I'm not great at sales.
I'm not bad, either, to be clear.
I'm maybe good.
I'm actually in the dangerous channel where I'm just good enough to be confident that I know what I'm doing, but not good enough to call myself anywhere near great.
If I let sales take a mental backseat, then I'll get stuck at 'good enough.'
I don't want to stay at good enough! Jack & I still think we can make a combination of sales / product tweaks that raises our ACV by at least 2x, if not 3x or maybe even 4x! And reduce churn, too.
That would be really, really good, because it would meaningfully improve our business model and our long-term outcome.
Going from good enough to great is really uncomfortable, though. In sales calls, we need to ask harder, more direct questions faster, which is uncomfortable. Being uncomfortable isn't fun.
But it's really the same thing as jiu jitsu. You don't go from being a blue belt to a purple belt by only rolling with white belts.
That discomfort is where your performance increases and you get better.
[LLM Hegemony]
Nothing changes in this pattern of discomfort leading to gains when we look at programming or writing or knowledge work or anything an LLM can do, just because an LLM can do it.
Yes, LLMs give us a terrifically cheap way to do a lot of things decently well, and it really is like magic.
One of my friends used Vibe Coding to make his own investor dashboard and internal tooling for his business.
BirdDog's internal dashboard for tracking customers was built by Jack and me ping-ponging a file between us and LLMs. It took us a couple of hours, most of which was us thinking about what data we actually wanted to see. The coding I did was copying, pasting, and editing a couple of boilerplate functions and writing the data structure.
But the notion that we're entering some "post knowledge work society"* tomorrow is still pretty outlandish to me, and not something I would at all bet on.
*Yes, I am talking about that article, and yes, I know it was not a 'prediction,' and yes, I don't think it matters that it was not a 'prediction', and yes, it was still fear mongering to get you to buy their investment advice
[Thinking Like Investors]
Let's look at the different outcomes for LLM/AI progression and approach it like we're placing a bet. Yes, I'm focusing on coding, but you can replace agentic coding with automated therapists or LLM lawyers or LLM Consultants or LLM Accountants or LLM Bankers or whatever you want & it's going to be roughly the same.
| Case | Outcome | Course of Action |
|---|---|---|
| 1. AGI is just around the corner | We're all fucked anyways | Probably go start a commune in some remote place in the mountains |
| 2. Agentic Coding will completely eliminate programmers | Barrier to entry for coding goes to zero; no value in coding skill | Focus on distribution, sales, and learning how to solve real problems for real customers--the 'agent' will execute for you.* |
| 3. Agentic Coding gets really good and is widely adopted, but we still need some programmers | Good programmers are still in demand | XP Farm and become a better programmer |
| 4. Agentic Coding is completely overhyped and produces masses of small, difficult-to-maintain projects that create more work for everyone | We still need lots of good programmers | XP Farm and become a better programmer |
For starters, if you seriously believe in option 1, you're not going to help your own situation by vibe coding a b2b SaaS faster than someone else.
You should probably build a self-sufficient Faraday cage of a bunker in a remote location where you can survive with a small group of humans to one day repopulate the Earth aeons after the AI Overlords cause a Terminator-esque Homo sapiens holocaust. Or, if you think AGI will lead to infinite surplus and global peace, then go be an artist or go outside or something.
I'm personally in between outcomes 3 and 4, closer to bucket 3 over a long enough time scale. I also don't think outcome 2 is at all impossible!
But, even if you're closer to believing in outcome 2, the recommended Course of Action in bucket 2 doesn't preclude you from playing the Course of Action in outcomes 3 & 4 in the meantime. The more people who believe in outcome 2 before it actually happens, the higher the return on XP Farming.
Meaning, if you think that Agentic Coding is ALREADY so good that we don't need programmers at all, you're not paying enough attention.** But, the more people who believe that, the higher the return for being one of the few who are actually getting better at the thing.
Again, I think this is just as true for any knowledge work that isn't coding, maybe even more so, as the CAPEX that's gone into replacing programmers with LLMs seems to far outstrip that going into replacing other people.
In short, unless you genuinely believe that complete human obsolescence of a valuable skill is happening immediately, becoming better at it is still a good bet to make.
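Since the table frames this as placing a bet, the logic can be sketched as a toy expected-value calculation. The probabilities and payoffs below are made-up illustration values, not claims from the post; the point is only that XP Farming pays off in every scenario except the two where nothing you do matters anyway.

```python
# Toy expected-value sketch of the "keep XP Farming" bet.
# All probabilities and payoffs are invented for illustration.
scenarios = {
    "1. AGI around the corner":            {"p": 0.05, "payoff": 0},   # nothing helps
    "2. Programmers fully eliminated":     {"p": 0.15, "payoff": 0},   # skill worthless
    "3. Agents good, some devs needed":    {"p": 0.45, "payoff": 10},  # skill pays
    "4. Agents overhyped, more dev work":  {"p": 0.35, "payoff": 10},  # skill pays more
}

# Expected payoff of continuing to get better at the skill.
ev = sum(s["p"] * s["payoff"] for s in scenarios.values())
print(f"Expected payoff of XP Farming: {ev:.1f}")
```

Note the asymmetry: in scenarios 1 and 2 the payoff of XP Farming is zero, not negative, so even a believer in those outcomes loses little by farming in the meantime, which is the post's point about hedging.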
*Depending on your business, this is likely something you should at least spend sometime on as a founder anyways
**Cursor’s ‘browser’ didn’t work, Salesforce regrets firing so many people to replace them with AI, Anthropic’s C compiler built from ‘scratch’ doesn’t work without calling the existing GCC C compiler…, etc. etc. etc. The Cursor / Anthropic ones remind me of a post someone wrote about founders and lying.
[Incentive Issue]
All of this is without mentioning that the very companies driving the "LLMs will obsolesce everything" narrative profit from you believing it.
Every person who says, "Yeah, I can stop learning things now" and just goes with the flow represents additional recurring revenue for these companies.
And again, this is not me saying, "don't use AI." I use it very frequently and wouldn't have been able to build BirdDog without it! It truly is miraculous technology.
But that can be true at the same time that it's wise to have a healthy skepticism and reluctance to depend on these systems.
AI being super valuable can be true at the same time that vibe coding doesn't teach you the way coding by hand does. Or that writing a blog post with AI doesn't sharpen your critical thinking the way writing one yourself does.
If you don't want to become an exceptional programmer, none of this matters so much.
But what does matter is that you hear the other side of the story--there is still value in getting good at things. And getting smashed by purple belts and being uncomfortable will help you more than hitting omoplatas on white belts.
Don't give up on XP Farming because marketing dollars tell you to.
Hey hey, thanks for reading my weekly rant. If you enjoyed and want to read more about skills, deep work, creating things, founding companies, AI, and all that, please give it a subscribe.
[Intellectual Serfdom]
Imagine a world where cars and scooters and other motorized transport were used so much that everyone became morbidly obese and could hardly walk (see: Wall-E).
Since you could hardly move on your own, you'd be stuck paying your 'tax' to whoever made such devices and powered them if you wanted to do anything significant. I don't think that'd be good.
Now, imagine a world where everyone went all in on LLMs for writing code, writing english, writing books, strategizing, and even critical thinking.
I see that world as no more desirable than the world of people who can't walk.
Just like the benefits of staying fit even though you don't "need" to are monumental, so too are the benefits of keeping your brain fit.
And those benefits are only going to go up as more and more people give in to the temptation to let that organ atrophy.
Live Deeply,
