Series Intro - From Working Code to Walking Away: Why I Built (But Didn't Launch) an AI Investment Scraper

What happens when a finance guy uses AI tools to solve a real problem—and discovers the solution isn't worth shipping

I spent two weeks building a web scraper that works perfectly. Then I decided not to launch it.

I'm a finance professional, not a developer. But I was four hours into manually researching private equity portfolio companies when I realized: this is stupid. Every PE firm publishes their investments online—company names, industries, investment dates, ownership stakes. All public information. Yet here I was, clicking through hundreds of pages, copying data into Excel like it's 2005.

Two full workdays of data entry before I could even start my actual analysis.

So I did what any finance person would do: I ran a make-or-buy analysis. Could I automate this? What would it cost? What's my time worth? The numbers said build it—especially with new AI coding tools that let non-developers ship functional products.
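To make that concrete, here is a simplified version of the back-of-envelope math. Every figure below is an illustrative placeholder, not a number from my actual analysis:

```python
# Illustrative make-or-buy math -- all figures are placeholder assumptions,
# not the actual numbers from my analysis.

hourly_rate = 100           # what an hour of my time is worth (assumption)
manual_hours_per_run = 16   # two workdays of copy-paste per research round
runs_per_year = 6           # how often the data needs refreshing (assumption)

build_hours = 40            # estimated build time with AI tooling (assumption)
tool_cost_per_year = 600    # AI / scraping API subscriptions (assumption)

manual_cost = manual_hours_per_run * runs_per_year * hourly_rate
build_cost = build_hours * hourly_rate + tool_cost_per_year

print(f"Manual collection: ${manual_cost:,.0f} per year")
print(f"Build + run:       ${build_cost:,.0f} in year one")
print("Build it" if build_cost < manual_cost else "Keep doing it by hand")
```

If you can build this in a spreadsheet, you already understand the logic; the point is simply that the decision was made with numbers, not enthusiasm.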

Two weeks later, I had a working application. It takes any PE firm's portfolio URL, automatically scrapes their investments, and exports clean data to Excel. It works for 60-70% of firms without any custom configuration.
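To give you a rough sense of the shape of that pipeline, here is a heavily simplified stand-in. It uses requests, BeautifulSoup, and pandas instead of the actual Firecrawl-plus-LLM stack covered later in the series, and the URL, CSS selectors, and column names are hypothetical:

```python
# Heavily simplified stand-in for the real pipeline: fetch a portfolio page,
# pull out company entries, and write them to Excel. The real system uses
# Firecrawl plus an LLM for extraction; selectors and columns here are hypothetical.
import requests
from bs4 import BeautifulSoup
import pandas as pd

def scrape_portfolio(url: str) -> pd.DataFrame:
    html = requests.get(url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    rows = []
    # Hypothetical markup: one card per portfolio company.
    for card in soup.select(".portfolio-company"):
        rows.append({
            "company": card.select_one(".name").get_text(strip=True),
            "industry": card.select_one(".industry").get_text(strip=True),
            "invested": card.select_one(".date").get_text(strip=True),
        })
    return pd.DataFrame(rows)

if __name__ == "__main__":
    df = scrape_portfolio("https://example-pe-firm.com/portfolio")  # placeholder URL
    df.to_excel("portfolio.xlsx", index=False)
```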

And then I killed it.

Why This Story Matters

Most technical content for finance professionals follows a template: person learns to code, builds something, launches it, makes money. You're supposed to be inspired.

But that's survivorship bias. Nobody writes about the projects that work but shouldn't ship. The ones where unit economics don't pencil out. Where your time is better spent elsewhere. Where the smartest decision is walking away from something you built.

This isn't a cautionary tale about failure. The scraper works. The architecture is solid. The extraction logic handles edge cases. But working and worth-launching are different questions, and finance professionals should know that better than anyone.

What You'll Actually Learn

Over this series, I'll show you exactly how I built this system—and why I shelved it:

  • The tools: Claude, ChatGPT, Lovable, and Firecrawl (with real cost breakdowns)
  • The architecture: Discovery, extraction, fallbacks, and quality control
  • The economics: Why per-user API costs killed my business model
  • The decision: When to cut losses on a working product

But unlike most tutorials, we're approaching this from a finance mindset:

  • Make-or-buy analysis at every decision point
  • Real unit economics, not fantasy Excel projections
  • Opportunity cost calculations that factor in your actual hourly rate
  • Sunk cost discipline when the numbers say quit

You'll learn web scraping, AI extraction, and application architecture. But through the lens of someone who had to decide whether shipping was worth it.

Who This Is For

You're a finance professional, consultant, or analyst who:

  • Wastes time on manual data collection that should be automated
  • Wonders if AI coding tools actually work for non-technical people
  • Thinks "I could build this" but needs a reality check on whether you should
  • Values honest economics over aspirational tech success stories

No programming experience required. You need curiosity, basic logic skills, and the discipline to make hard decisions about projects you've invested in. If you can build an Excel model, you can follow this.

What Makes This Different

Most "learn to code" content assumes you want to become a software engineer. I don't. I wanted to solve one specific problem: aggregating investment data at scale.

So this series skips the gatekeeping. No "proper architecture" lectures. No "best practices" that trade your time for theoretical elegance. Just practical examples, honest mistakes, and real costs.

Because sometimes the best decision is to build the prototype, learn the lessons, and move on. That's not failure—it's good capital allocation.

The Journey Ahead

  • Part 1: The Make-or-Buy Decision
  • Part 2: Building a Scraper That Doesn't Break
  • Part 3: How to Know If Your Data Is Garbage
  • Part 4: When Unit Economics Kill a Working Product
  • Part 5: The 80/20 Rule for Scraping Scalability
  • Part 6: The Decision Framework for Walking Away

Each post includes real code examples, actual costs, and honest mistakes. No survivorship bias. No humble-bragging. No assumption you should launch just because you can.

By the end, you won't be a software engineer. But you'll know how to evaluate whether automation is worth building, use AI tools to prototype fast, calculate realistic unit economics, and make the call to ship, pivot, or walk away.

Which might save you more time and money than any automation ever could.


Part 1 drops next week.

Working on something similar? Wondering if your automation idea is worth pursuing? The comments are open—let's talk about it.