
When I first started my career in software development, mentorship was a clear and defined role. A senior engineer—or if you were lucky, a great manager—might serve as your technical guide, career coach, advocate, therapist, and accountability partner all in one. But that model is breaking apart.

Today, AI tools like ChatGPT are stepping into mentorship roles with astonishing effectiveness. They can explain unfamiliar codebases, generate architecture proposals, simulate code reviews, and even advise on career planning. In many ways, they've already surpassed what the average human mentor can provide, at least in speed, availability, and breadth of knowledge.

But this shift is also revealing something deeper: mentorship isn’t one thing. It’s a bundle of roles, and not all of them are equally replicable by machines.


The New Mentorship Stack

We can now decouple mentorship into several core components:

  1. Technical Guidance — Reviewing code, explaining unfamiliar concepts, improving performance, recommending libraries, etc.
  2. Career Strategy — Mapping out growth paths, identifying roles, simulating what-if scenarios for job changes or skills acquisition.
  3. Emotional Support — Listening, validating, providing empathy in moments of burnout, anxiety, or imposter syndrome.
  4. Sponsorship / Advocacy — Promoting someone’s work, creating opportunities, publicly backing someone’s career.
  5. Cultural Transmission — Modeling team norms, values, communication styles, and conflict resolution approaches.
  6. Network Access — Making intros, offering visibility, pulling someone into their orbit.
  7. Ethical / Moral Framing — Helping someone think through difficult decisions that touch on integrity or long-term purpose.
  8. Accountability — Nudging progress, holding people to goals, reminding them of commitments.

AI is already excellent at the first two. It’s decent at a lightweight version of accountability (think reminders, reflection prompts), and improving rapidly as a brainstorming partner or synthetic coach. But for the rest? That’s where humans still shine.


Why AI Falls Short: The Context Problem

It’s not that AI couldn’t, in theory, perform emotional coaching or cultural mediation. It’s that in practice, it’s locked out of the context required to do those things well.

Even with memory and personalization features improving, AIs don’t carry continuity the way humans do. A mentor who’s worked with you for a year understands not just your skillset, but your hesitations, blind spots, and values. That kind of long-term relational context is hard to encode.


Mentorship Reimagined

What this adds up to is a new, modular mentorship model. Instead of expecting a single person—or AI—to do it all, we can think in terms of a “mentorship stack.”

This reframe is not a loss. It’s a new design.

In fact, unbundling mentorship may allow each part to thrive more fully, as the summary below suggests.


Summary: Expanded Mentorship Roles

| Role | Can AI Do It Well? | Human Still Needed? |
| --- | --- | --- |
| Technical guidance | ✅ Yes | 🟡 Optional |
| Career strategy | ✅ Yes | ✅ Yes |
| Emotional support | 🟡 Simulated only | ✅ Yes |
| Sponsorship / advocacy | ❌ No | ✅ Yes |
| Network access | ❌ No | ✅ Yes |
| Cultural transmission | ❌ No | ✅ Yes |
| Moral framing | 🟡 Simulated only | ✅ Yes |
| Accountability | ✅ Limited by context | ✅ Yes |
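
For the engineers in the room, here is the same table as a minimal TypeScript sketch. The role names and coverage values come straight from the table above; the type names and the little filter at the end are just my own way of illustrating the "stack" idea, nothing more.

```typescript
// A playful sketch of the "mentorship stack" as a typed data model.
// Role names and coverage values come from the summary table above;
// the type names and helper below are illustrative only.

type Coverage = "yes" | "simulated" | "limited" | "no";

interface MentorshipRole {
  name: string;
  aiCoverage: Coverage; // how well AI handles this today
  humanNeeded: boolean; // "Optional" in the table maps to false
}

const mentorshipStack: MentorshipRole[] = [
  { name: "Technical guidance",     aiCoverage: "yes",       humanNeeded: false },
  { name: "Career strategy",        aiCoverage: "yes",       humanNeeded: true  },
  { name: "Emotional support",      aiCoverage: "simulated", humanNeeded: true  },
  { name: "Sponsorship / advocacy", aiCoverage: "no",        humanNeeded: true  },
  { name: "Network access",         aiCoverage: "no",        humanNeeded: true  },
  { name: "Cultural transmission",  aiCoverage: "no",        humanNeeded: true  },
  { name: "Moral framing",          aiCoverage: "simulated", humanNeeded: true  },
  { name: "Accountability",         aiCoverage: "limited",   humanNeeded: true  },
];

// Which parts of the stack still need a person in the loop?
const humanOnly = mentorshipStack.filter((role) => role.humanNeeded);
console.log(humanOnly.map((role) => role.name).join(", "));
```

Run it and the only role that drops out is technical guidance, which lines up with where AI already shines.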

Closing Thought

Mentorship in the age of AI isn’t disappearing. It’s just changing shape.

The tools we use are getting smarter. But in the end, the parts of mentorship that require trust, advocacy, culture, and care—those are still very much human. And that’s a good thing.
