Forecaster

We were always making bets. We just weren't tracking them.

Szymon Korytnicki
So, what?

Before Forecaster, we thought we were building products.

In reality, we were placing bets.

Some were obvious: “This feature will increase activation by 15%.” “This onboarding change will reduce churn.”

Others were implicit: “This is worth building.” “This customer segment matters.” “Now is the right time.”

We debated them, argued over them, and committed weeks—sometimes months—of work behind them.

But we didn’t treat them as bets.

The uncomfortable realization

Looking back, a pattern emerges:

  • We had strong opinions, weak probabilities
  • We tracked outputs, not expectations
  • We celebrated outcomes, not decision quality

When something worked, we told ourselves a story about why we were right. When it failed, we moved on quickly—onto the next idea, the next sprint, the next “opportunity.”

What we didn’t do was ask:

  • How confident were we, really?
  • What did we expect to happen?
  • Were we wrong—or just unlucky?

Without those answers, learning was shallow. Sometimes illusory.


The cost of untracked bets

Not tracking decisions doesn’t feel like a problem in the moment. Work moves fast. Shipping feels like progress.

But over time, the cost compounds:

  • The same mistakes reappear in different forms
  • Confidence is driven by memory, not evidence
  • Strong personalities outweigh calibrated judgment
  • Teams optimize for momentum, not accuracy

And perhaps most importantly:

You lose the ability to distinguish skill from luck.


Our own failures

We’ve made all the classic mistakes:

  • Overestimating the impact of “obvious” improvements
  • Underestimating how long things would take
  • Shipping features with no clear success criteria
  • Doubling down on ideas that felt right but weren’t working

In hindsight, many of these weren’t bad ideas.

They were unmeasured bets.

We didn’t define what success looked like. We didn’t assign probabilities. We didn’t revisit our assumptions.

So when reality diverged, we had nothing to compare it against.


The shift

At some point, the question changed from:

“What should we build?”

to:

“What do we actually believe will happen?”

That shift is subtle but fundamental.

It forces clarity:

  • What outcome are we predicting?
  • On what timeline?
  • With what confidence?
  • Based on which assumptions?

It also introduces accountability—not to others, but to reality.


Why we built Forecaster

Forecaster wasn’t born from a grand vision.

It came from a simple frustration:

We were already making decisions as if they were bets—but without the discipline that makes betting rational.

So we started treating them explicitly:

  • Every initiative became a prediction
  • Every prediction had a probability
  • Every outcome was measured against expectations

Not perfectly. Not consistently. But enough to see the difference.

Patterns started to emerge:

  • Where we were overconfident
  • Where we systematically underestimated risk
  • Which types of bets actually paid off

That feedback loop changed how we think.
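To make that loop concrete: once every prediction has a probability and a resolved outcome, calibration can be measured with something as simple as a Brier score (0.0 is perfect; always guessing 50% scores 0.25). This is a minimal sketch, not Forecaster's actual implementation, and the example bets are hypothetical:

```python
def brier_score(predictions):
    """predictions: list of (probability, outcome) pairs,
    where probability is in [0, 1] and outcome is True/False."""
    return sum((p - (1.0 if happened else 0.0)) ** 2
               for p, happened in predictions) / len(predictions)

# Three hypothetical product bets, recorded up front:
bets = [
    (0.80, True),   # "activation +15%", predicted at 80%, it happened
    (0.80, False),  # "onboarding change reduces churn", it didn't
    (0.60, True),   # "segment X converts", it did
]

print(round(brier_score(bets), 3))  # lower means better calibrated
```

Averaged over many bets, a score drifting well above 0.25 is exactly the "overconfident" pattern described above: high probabilities on outcomes that don't materialize.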


What this blog will be about

This is not a blog about productivity hacks or generic product advice.

It’s about decision-making under uncertainty.

We’ll write about:

  • How to think in probabilities without overcomplicating things
  • How to structure product bets so they can be evaluated
  • What calibration actually looks like in practice
  • Where teams tend to fool themselves—and how to avoid it

And we’ll be honest about our own mistakes along the way.


A simple starting point

You don’t need a system to begin.

Take one decision you’re making this week and write down:

  • What outcome you expect
  • When it should happen
  • How confident you are (in %)

That alone will put you ahead of most teams.

Not because you’ll be right.

But because, for the first time, you'll be able to tell whether you were.
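If you'd rather start in code than on paper, those three answers fit in a few lines of Python. The field names and dates here are illustrative, not a prescribed format:

```python
from datetime import date

# One bet for this week: what we expect, by when, and how sure we are.
prediction = {
    "claim": "New onboarding flow lifts week-1 activation",
    "expected_outcome": "activation rate moves from 30% to 35%",
    "resolve_by": date(2025, 3, 31).isoformat(),  # when to check
    "confidence": 0.65,                           # stated up front, in [0, 1]
    "actual_outcome": None,                       # filled in on resolve_by
}
```

The only discipline required is filling in `actual_outcome` when the date arrives, before memory rewrites what you expected.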


We were always making bets.

Now we’re learning how to take them seriously.

Pilot Program

Ready to improve your team's decision-making?

Join our Prediction Facilitator program. A 2-week fast-track to help your team reduce uncertainty and capture learning from real outcomes.

Learn More about the Pilot