
A new direction

We are discontinuing Replay Test Suites so that we can focus on exploring the intersection of replayability and AI.
Jason Laster
We started Replay.io in 2020 because we believed that being able to record and replay web applications would change the way people saw and understood their software.
It’s been incredible to see how Replay DevTools has helped developers fix some of the hardest issues in some of the most important projects. Today, it’s difficult to build a web application without using a library that’s been improved with Replay.
Like most DevTools startups, we’ve had to balance two goals: creating a great free single-player tool and developing an excellent collaborative paid team tool.
We've always felt confident that Replay DevTools would surpass Chrome DevTools as the best way to inspect React applications. In this respect, Replay has been hugely successful and it's been great to see features like Live Console Logs and Jump to Code reimagine what's possible in browser devtools.
On the second goal, it’s been disappointing to see Replay DevTools struggle to find a home within teams. When we started, we acknowledged that most developers were not looking for more powerful devtools, but we felt that if we could remove the frustration of fixing hard-to-reproduce customer issues and flaky browser tests, that would be valuable enough to land at progressive companies.

The search for Product Market Fit

Finding Product Market Fit means quickly validating your hypotheses. This has been challenging for two reasons:
  1. We started with a solution and have been searching for a problem.
  2. Time-travel debugging is a difficult product to build, so it has been hard to distinguish quality problems from market signals.

Bug Reports

We launched Replay in the fall of 2021 with a focus on bug reports because it was easy to find developers who were frustrated by vague bug reports they could not reproduce, and who were excited to retroactively debug an issue as if they had been there when it happened.
While that sentiment was correct, it ignored two important points. First, users were concerned that replays could contain sensitive data, and recording required downloading a separate browser. Second, most issues are reproducible once you can see the video and network requests. And once you can reproduce the problem, you don’t need a time-travel debugger to understand the root cause.
Put another way, if you’re starting with the problem of making it easy to file great bug reports you’ll likely arrive at a simpler solution.

Test Suites

In the spring of 2022, we pivoted from bug reports to flaky browser tests because everyone we talked to said they had flaky tests. In fact, it was kind of a joke to ask “do you have flaky tests?” because of course they did; who doesn’t?
We were also excited that Test Suites mitigated the concerns around privacy and downloading a separate browser, and that once set up, every developer would get links to replays of their failing tests in their PRs. This felt like a product “no-brainer,” equivalent to the PR preview branch.
Unfortunately, with Test Suites we ran into the same challenges where we were starting with a solution instead of a problem. Yes, it was true that developers were frustrated by test failures in CI, but it was also true that they were happy with the Cypress Timeline in local development and would be satisfied if the Cypress Timeline were also available in CI.
Our initial belief was that developers would want to use Browser DevTools to understand the underlying issues in their applications and have tests that passed the first time, but in reality, most users were okay fixing the test so that it would pass eventually.
These truths can be difficult to hear when you’re excited about your specific solution, but are the reason why focusing on the problem and validating your assumptions is so important.

Where do we go from here?

First, we will be discontinuing Replay Test Suites on August 31st. If you have any questions, please reach out to us at support@replay.io.
We are also conducting our first reduction in force (RIF) as a company and letting some really talented folks go. This decision reflects the fact that we do not need to actively scale. And while we’re well funded, taking our foot off the gas will give us the space to focus on validating a new direction. If you are hiring, take a look at this sheet.
Second, Replay DevTools is not going anywhere. We have some exciting next-gen React features in the pipeline that will make it much easier to inspect component lifecycles and fix common footguns with Suspense and useEffect.
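As an aside for readers less familiar with React, here is a minimal, hypothetical sketch (illustrative only, not Replay code) of the kind of useEffect footgun we mean: a stale closure caused by a missing dependency, the sort of lifecycle bug that is much easier to see when you can replay the run.

```tsx
import { useEffect, useState } from "react";

// Hypothetical example of a common useEffect footgun: the interval callback
// closes over the initial value of `count`, so the counter stalls at 1.
function Counter() {
  const [count, setCount] = useState(0);

  useEffect(() => {
    const id = setInterval(() => {
      setCount(count + 1); // stale closure: `count` is always 0 here
    }, 1000);
    return () => clearInterval(id);
  }, []); // `count` is missing from the dependency array

  return <span>{count}</span>;
}

// One fix is the functional updater, which never reads the stale value:
// setCount((c) => c + 1);
export default Counter;
```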
DevTools is the way that developers inspect their applications. As long as we’re recording and replaying runtimes, we’ll continue to invest in great time-travel devtools.

We’ll be exploring the intersection of replayability and AI.

When we raised our initial round in 2020, we were inspired by how replay environments were being used to train self-driving cars and agents for video games like Dota 2.
It’s been exciting to watch the advancements in AI over the past couple of years, but also difficult because we believe replayability could play an important role at both inference and training time.
At inference time, we believe replayability can offer software agents incredible tools. There’s research showing that runtime traces can improve performance on the HumanEval benchmark [1]. We’re excited to see how replayability can generalize these approaches and help AI agents develop hypotheses and avoid rabbit-holing in more complex codebases.
At training time, we believe that great datasets and benchmarks are critical. Here, we’re excited to see whether recording and replaying web applications could be the basis of a new benchmark or training environment that better reflects what it is like to be a frontend developer than benchmarks like HumanEval or SWE-bench.
If you’re exploring similar problems and would like to talk, please reach out (hi@replay.io).

Onward and upward

Building a startup is never easy. If you’re going to embark on the journey, it’s important to find a mission you deeply believe in and a team and community to join you. At times like this we are incredibly grateful for your continued support. Onward and upward.
Replay employees