
Part 4 - When Unit Economics Kill a Working Product: Building for Profitability

My scraper worked perfectly. But every power user would cost me $100/month in API fees while paying $49. I could either raise prices (and have no customers) or lose money on every user. Here's the math that killed my launch.

The Fatal Flaw: When Your Best Customers Are Your Worst Customers

The vision was elegant: charge $49/month for unlimited portfolio scraping. Users paste any PE firm's URL, get clean Excel exports, everyone's happy. The scraper worked. The extraction pipeline was sophisticated. The AI-powered fallbacks handled edge cases beautifully.

Then I calculated the actual costs.

Average scrape cost breakdown:

  • Firecrawl API: $0.10 per page
  • Typical portfolio site: 25 pages crawled
  • One complete scrape: $2.50 in API costs

That seemed manageable. Until I modeled actual user behavior.

Testing phase reality:

  • New users test 10-15 different firms while hunting for the right data
  • Cost: 12 scrapes × $2.50 = $30 before they even decide to subscribe

Power user monthly pattern:

  • Monitoring 8-10 portfolio companies
  • Weekly re-scrapes for fresh data
  • 40 scrapes per month × $2.50 = $100 in API costs
  • Their subscription: $49/month
  • Net loss per power user: -$51/month
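
Here's that power-user math as a quick Python sketch. The per-page cost and usage counts are rough averages from my own logs, and the "light user" at 5 scrapes/month is a hypothetical I added for contrast:

```python
# Back-of-the-envelope per-user economics for the scraper.
# Per-page cost and usage counts are rough averages, not guarantees.

COST_PER_PAGE = 0.10        # Firecrawl cost per page crawled
PAGES_PER_PORTFOLIO = 25    # typical portfolio site
SUBSCRIPTION_PRICE = 49.00  # monthly price

cost_per_scrape = COST_PER_PAGE * PAGES_PER_PORTFOLIO  # $2.50

def monthly_margin(scrapes_per_month: int) -> float:
    """Subscription revenue minus API cost at a given usage level."""
    return SUBSCRIPTION_PRICE - scrapes_per_month * cost_per_scrape

print(f"Trial user, 12 scrapes before subscribing: -${12 * cost_per_scrape:.2f}")
print(f"Light user, 5 scrapes/month (hypothetical): {monthly_margin(5):+.2f}")   # +36.50
print(f"Power user, 40 scrapes/month:               {monthly_margin(40):+.2f}")  # -51.00
```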

The more engaged the user, the more money I lost. This is the opposite of how SaaS economics should work.

The Four Hidden Cost Multipliers

The API costs were just the beginning. Here's what made the unit economics completely unworkable:

Cost Multiplier #1: The Customization Tax

Every new customer sounded the same: "This is great! Can you add support for [Firm XYZ]?"

The real translation: "Will you spend 3 hours building a custom scraper for my specific use case, then maintain it indefinitely, all included in my $49/month?"

The math:

  • 3 hours custom development at $75/hour = $225
  • Break-even point: 4.5 months of subscription
  • Average customer wanted: 5-10 firms customized
  • Actual work required: 15-30 hours per customer
  • Effective hourly rate (one month's $49 spread over 15 hours): $3.27/hour
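
Spelled out as a throwaway script (the hours per firm and the $75/hour opportunity cost are my own estimates):

```python
# The customization tax: what "can you add support for Firm XYZ?" costs.
# Hours per firm and the $75/hour opportunity cost are estimates.

HOURLY_RATE = 75
SUBSCRIPTION_PRICE = 49
HOURS_PER_FIRM = 3

for firms in (5, 10):  # typical range requested per customer
    hours = HOURS_PER_FIRM * firms
    dev_cost = hours * HOURLY_RATE
    breakeven_months = dev_cost / SUBSCRIPTION_PRICE
    first_month_rate = SUBSCRIPTION_PRICE / hours
    print(f"{firms} firms: {hours}h of work (${dev_cost}), "
          f"{breakeven_months:.1f} months to break even, "
          f"${first_month_rate:.2f}/hour against the first month's revenue")
```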

My "universal" scraper worked on maybe 40% of sites out-of-the-box. The other 60% needed manual configuration—URL patterns, CSS selectors, pagination logic. Each PE firm's website was a unique snowflake of poor design choices.

Cost Multiplier #2: Data Freshness Becomes Cost Inflation

Investment portfolios change constantly. A firm exits a company, closes a new deal, updates ownership stakes. Users don't want a scraping tool—they want current data.

Which means either:

  • Users scrape constantly (burning through credits rapidly)
  • Or you scrape proactively (burning through YOUR credits)

Weekly monitoring scenario:

  • 10 firms monitored
  • 4 scrapes per firm per month
  • 40 scrapes × $2.50 = $100/month in costs
  • This isn't a bug—it's the core use case

The more valuable the product, the more it costs to run. Every feature that made users happy made my P&L worse.

Cost Multiplier #3: The Maintenance Trap

In three months of testing, I watched the web scraping landscape actively work against me:

  • 12 PE firms redesigned their portfolio pages
  • 4 firms migrated from static HTML to React SPAs (requiring expensive JS rendering)
  • 8 firms added CAPTCHA or bot detection
  • 3 firms removed their public portfolio pages entirely

Every break meant support tickets, emergency debugging, and rushed fixes. The scraper wasn't a product—it was a part-time job maintaining brittle integrations.

Maintenance cost projection:

  • 5-10 hours per week fixing broken scrapers
  • At $75/hour: $375-750/week = $1,500-3,000/month
  • To cover with subscriptions: 30-60 paying customers minimum
  • To reach 30 customers: probably 6-12 months of marketing
  • Break-even: 12-18 months... if nothing else breaks
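
The same projection as a two-line loop (the hours-per-week range is extrapolated from three months of testing, not measured at production scale):

```python
# Maintenance as a recurring cost. Hours/week extrapolated from
# three months of testing, not measured at production scale.

HOURLY_RATE = 75
SUBSCRIPTION_PRICE = 49
WEEKS_PER_MONTH = 4

for hours_per_week in (5, 10):
    monthly_cost = hours_per_week * WEEKS_PER_MONTH * HOURLY_RATE
    subscribers_needed = monthly_cost / SUBSCRIPTION_PRICE
    print(f"{hours_per_week}h/week of fixes: ${monthly_cost}/month, "
          f"~{subscribers_needed:.0f} subscribers just to cover maintenance")
```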

Cost Multiplier #4: The Market Reality Check

The TAM looked huge: 5,000+ PE firms globally, plus thousands of VCs and analysts. Surely hundreds of them would pay $49/month?

The actual market had two segments:

Segment A: Enterprise buyers

  • Already pay $20K-40K/year for PitchBook or Preqin
  • Want comprehensive, verified, constantly-updated data
  • Need compliance, auditability, legal guarantees
  • Won't trust a solo developer's side project

Segment B: Budget-conscious users

  • Can't afford enterprise tools
  • Also can't afford $49/month for partial data
  • Want free or one-time payment
  • High churn, high support burden

I was trying to sell into a market with either enterprise budgets or no budget. There was no middle.

The Three Paths Forward (And Why None Made Sense)

Path A: Custom Scrapers Per Site

Build individual parsers for each major PE firm:

  • Time investment: 2-4 hours × 500 firms = 1,000-2,000 hours
  • Ongoing maintenance: 5-10 hours/week as sites break
  • Total lifetime commitment: 2,500+ hours over 2 years

At $75/hour opportunity cost: $187,500 in labor. To break even at $49/month subscriptions: 319 paying customers for 12 months straight.
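
Same arithmetic in script form (the firm count, hours per parser, and the 2,500-hour lifetime figure are estimates):

```python
# Path A arithmetic. Firm count, hours per parser, and the 2,500-hour
# lifetime figure are estimates, not measurements.

HOURLY_RATE = 75
SUBSCRIPTION_PRICE = 49
FIRMS = 500

build_hours = (2 * FIRMS, 4 * FIRMS)       # 1,000-2,000 hours up front
lifetime_hours = 2500                      # build plus ~2 years of maintenance
labor_cost = lifetime_hours * HOURLY_RATE  # $187,500

customers_for_breakeven = labor_cost / (SUBSCRIPTION_PRICE * 12)
print(f"Build: {build_hours[0]:,}-{build_hours[1]:,} hours")
print(f"Lifetime labor: ${labor_cost:,}")
print(f"Customers paying for 12 straight months to break even: {customers_for_breakeven:.0f}")
```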

Path B: Become a Data Company

Scrape everything yourself, sell access to the database:

  • Compete with PitchBook, Preqin, Crunchbase
  • Capital needed: $500K-2M to build, staff, market
  • Ongoing costs: $50K-100K/month in infrastructure and labor
  • Time to profitability: 24-36 months (if successful)

This isn't a side project—it's a venture-backed startup with massive risk.

Path C: Open Source and Walk Away

The path I chose. Ship the code to GitHub as a reference implementation. Learn the lessons. Move on before throwing good money after bad.


Unit Economics Reality Check Framework

Here's the calculator I wish I'd built before writing any code:

API-BASED PRODUCT UNIT ECONOMICS

STEP 1: CALCULATE TRUE COGS (Cost of Goods Sold)
□ API cost per unit × average usage = $_____/user/month
□ Support time per user × hourly rate = $_____/user/month
□ Ongoing maintenance (% of dev time) = $_____/user/month

TOTAL COGS PER USER: $_____/month

STEP 2: CALCULATE REQUIRED PRICING
□ COGS × 2.5 (minimum healthy margin) = $_____/month minimum price
□ Support + maintenance overhead × 2 = $_____/month realistic price

STEP 3: MARKET REALITY CHECK
□ Can your target market afford this price? YES / NO
□ Do competitors charge similar prices? YES / NO
□ Will power users cost 3x+ average users? YES / NO

RED FLAGS (check every box that applies):
□ Monthly subscription < (COGS × 2.5)
□ Power users cost more than they pay
□ Customization requests = hidden dev costs
□ Data freshness requires constant re-processing
□ Core infrastructure breaks weekly

IF YOU HAVE 2+ RED FLAGS: DON'T LAUNCH
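
And since I called it a calculator: here's a rough Python version of the same check. It only encodes the two red flags that are purely numeric, and the 2.5x multiplier plus the sample inputs below are my own rules of thumb and estimates, not industry standards.

```python
# A rough version of the reality check above. Only the two numeric red
# flags are encoded; the 2.5x margin multiplier is my rule of thumb.

def unit_economics_check(api_cost_per_user: float,
                         support_cost_per_user: float,
                         maintenance_cost_per_user: float,
                         planned_price: float,
                         power_user_multiplier: float) -> dict:
    """Return COGS, a minimum viable price, and any red flags triggered."""
    cogs = api_cost_per_user + support_cost_per_user + maintenance_cost_per_user
    minimum_price = cogs * 2.5

    red_flags = []
    if planned_price < minimum_price:
        red_flags.append("subscription below COGS x 2.5")
    if api_cost_per_user * power_user_multiplier > planned_price:
        red_flags.append("power users cost more than they pay")

    return {
        "cogs_per_user": round(cogs, 2),
        "minimum_price": round(minimum_price, 2),
        "red_flags": red_flags,
        "launch": len(red_flags) < 2,
    }

# My scraper's rough numbers: ~$12.50/user/month in API costs at an assumed
# 5-scrape average, plus estimated support and maintenance allocations,
# against a $49 price and power users at ~8x average usage.
print(unit_economics_check(api_cost_per_user=12.50,
                           support_cost_per_user=10.00,
                           maintenance_cost_per_user=25.00,
                           planned_price=49.00,
                           power_user_multiplier=8.0))
```

With those inputs it comes back with two red flags and launch: False, which is the whole post in one dictionary.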

What I Should Have Done: Validate Economics First

Here's the embarrassing truth: I built the entire product before validating anyone would pay enough to make it profitable.

The right sequence:

  1. Build a landing page: "Pre-order: $299 per-firm portfolio extraction"
  2. Target: Get 20 pre-orders
  3. Manually fulfill the first 10 orders (about 4 hours each, so roughly $75/hour)
  4. Use revenue to build automation
  5. Only automate what's profitable

What I actually did:

  1. Build sophisticated AI extraction pipeline
  2. Implement multi-tier fallback system
  3. Add caching and optimization
  4. Calculate costs
  5. Realize it's unprofitable
  6. Panic

The product worked. The business model didn't. I optimized the wrong problem.


Key Takeaway: Working ≠ Profitable

The scraper was technically impressive. The code was clean. The extraction accuracy was good enough. The caching layer was clever. The confidence scoring was sophisticated.

None of that mattered because the unit economics made it impossible to profitably serve customers who actually needed what I built.

Working and profitable are different questions. Sometimes the smartest move is building the prototype, learning the lessons, and walking away before you spend money on a business model that doesn't work.

The code now lives on GitHub as a well-documented reference implementation. Maybe someone smarter will find the business model I couldn't. Or maybe the real value was the spreadsheet I built to realize the economics didn't work—that sheet has saved me from three other bad ideas since.


Next time: The 80/20 rule for scraping scalability - we'll discuss why perfect is the enemy of great.

Question for you: Have you ever built something that worked perfectly but couldn't be profitable? How did you decide whether to pivot the business model or walk away?