Meticulous Onboarding Guide

Welcome to Meticulous! This guide will help you set up automated visual testing for your web application in 3 main steps.


Quick Start - Choose Your Path

Answer 3 questions to get a customized setup guide:

Question 1: What framework are you using?

  • Next.js, Nuxt, or another server-rendered framework → Use the upload-container action (Option B in Step 2)
  • React, Vue, or Angular without server-side rendering → Use the upload-assets action (Option A in Step 2)
  • Not sure? → Continue reading below

Question 2: What are you testing?

  • Static site (HTML/JS/CSS, no SSR) → Use upload-assets action (simplest and most reliable)
  • Server-rendered app (Next.js, Nuxt, etc.) → Use upload-container action (Docker)
  • Not sure? → Continue reading below

Question 3: How do you deploy?

  • Vercel or Netlify preview URLs → Use the preview URL integration (Option C in Step 2)
  • GitHub Actions with a static build → Use the upload-assets action (Option A)
  • GitHub Actions with a Docker image → Use the upload-container action (Option B)
  • None of the above → Tunnel to a running instance (Option D, last resort)


Overview - 3 Steps to Get Started

  1. [5 minutes] Install recorder and test locally
  2. [15 minutes] Set up CI/CD workflow
  3. [Ongoing] Record sessions and review diffs

Total setup time: ~20 minutes for basic setup


Step 1: Install Recorder and Test Locally (5 minutes)

Before setting up CI, let's verify Meticulous works on your local machine.

1.1 Add Recorder Script

  • Always install on localhost (for local development)
  • Recommended: Install on staging/preview environments
  • Production: Contact support for guidance

Add the Meticulous recorder script to your HTML <head>:

<script
  data-project-id="YOUR_PROJECT_ID"
  src="https://snippet.meticulous.ai/v1/meticulous.js"
></script>

Get your project ID: Meticulous Dashboard → Project Settings

Where to add it:

  • Next.js App Router: app/layout.tsx in <head>
  • Next.js Pages: pages/_document.tsx in <Head>
  • React/Vue/Angular: index.html or src/index.html in <head>

1.2 Verify Recorder Works

  1. Start your dev server: npm run dev
  2. Open browser DevTools Console
  3. Check for: "Meticulous recorder initialized" message
  4. Verify: window.Meticulous object exists

If recorder doesn't load, see Troubleshoot Recorder.
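
The console check above can also be scripted. A minimal sketch (the exact shape of window.Meticulous is an assumption; this only verifies that a non-null object exists under that name):

```typescript
// Sketch of check 1.2.4: confirm the recorder attached itself to the page.
// The exact shape of window.Meticulous is an assumption; we only verify
// that a non-null object exists under that name.
function recorderLoaded(win: { Meticulous?: unknown }): boolean {
  return typeof win.Meticulous === "object" && win.Meticulous !== null;
}

// In the browser console you would call recorderLoaded(window);
// here we simulate both outcomes with plain objects:
console.log(recorderLoaded({ Meticulous: {} })); // true
console.log(recorderLoaded({}));                 // false
```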

1.3 Test Local Simulation

Let's verify Meticulous can replay a session on your local machine:

# Get a session ID from the dashboard first
npx @alwaysmeticulous/cli simulate \
  --apiToken="<API_TOKEN>" \
  --sessionId="<SESSION_ID>" \
  --appUrl="http://localhost:3000"

Where to get session ID:

  1. Use your app normally with recorder installed
  2. Go to Meticulous Dashboard
  3. Find a recent session and copy its ID

Success criteria:

  • Command completes without errors
  • You can view the simulation in the dashboard
  • Screenshots look correct

Common issues:

  • Authenticated pages fail to simulate → see the troubleshooting doc on authentication & authorization issues
  • Sessions recorded on another environment fail against localhost → see the troubleshooting doc on cross-environment issues

Debug tools:

  • View simulation events: Dashboard → Simulation → Timeline & Logs tab
  • Step through replay: Add --debugger --devTools to simulate command

Debugging simulation issues on your local machine is much easier than debugging in CI. Don't proceed to Step 2 until local simulation works!


Step 2: Set Up CI/CD Workflow (15 minutes)

Now that local simulation works, let's run tests in CI on every pull request.

2.1 Choose Your CI Approach

Option A: Upload static assets (Recommended for static sites)

  • If your app can be served as static HTML/JS/CSS without server-side rendering
  • Simplest and most reliable approach
  • Not recommended for typical Next.js apps
  • Guide: GitHub Actions Setup

Option B: Upload a Docker container (Recommended for server-rendered apps)

  • For Next.js, Nuxt, and other SSR frameworks
  • You build a Docker image, Meticulous hosts it
  • Guide: GitHub Actions Setup

Option C: Preview URL integration

  • Works with Vercel, Netlify, and other preview URL providers
  • No CI workflow needed (Vercel has a direct integration)
  • Guide: Cloud Replay Setup

Option D: Tunnel to a running instance (Last resort)

  • Most brittle; requires starting the app in CI
  • Use only if none of the above work
  • Guide: GitHub Actions Setup

2.2 Add API Token to CI

  1. Get API token: Dashboard → Project Settings
  2. Add to GitHub: Repo Settings → Secrets → Actions
  3. Create secret: Name it METICULOUS_API_TOKEN

2.3 Create Workflow File

Follow the CI Setup Guide for step-by-step instructions and example workflow files for your chosen approach (upload static assets, upload a Docker container, or tunnel to a running instance).

If using preview URLs, follow the Cloud Replay Setup Guide instead.

2.4 Verify Workflow Runs

  1. Commit and push the workflow file
  2. Create a test PR with a small change
  3. Check Actions tab for workflow run
  4. Expected first PR: "No base test run found" (this is normal!)

2.5 Establish Base Run

For the first PR to work, Meticulous needs a base run on your main branch:

  1. Merge the PR that adds Meticulous workflow
  2. Wait for workflow to run on main branch
  3. Verify in Actions tab: main branch run succeeded
  4. Check dashboard: You should see sessions and test runs

2.6 Test with Second PR

  1. Make a small UI change
  2. Create another PR
  3. Expected: Meticulous comment with test results
  4. If you changed UI: Diffs are detected
  5. If no changes: "No diffs detected"

If you see a Meticulous comment on your PR, congratulations! You're now running automated visual tests.


Checklist: Is Everything Working?

Use this checklist to verify your setup:

Recorder Installation

  • Recorder script added to HTML <head>
  • Project ID is correct
  • Console shows "Meticulous recorder initialized"
  • window.Meticulous object exists
  • Sessions appear in dashboard after using app

Local Simulation

  • npx @alwaysmeticulous/cli simulate completes successfully
  • Can simulate authenticated pages
  • Can simulate different environments (if needed)
  • Simulation screenshots look correct

CI Setup

  • API token added as repository secret
  • Workflow file created and pushed
  • Workflow runs on PRs
  • Base run exists on main branch
  • Meticulous comments on PRs

Ready for Production

  • No false positive diffs
  • Auth works in tests
  • All critical user flows are recorded
  • Team knows how to review diffs

Step 3: Record Sessions and Review Diffs (Ongoing)

3.1 Record Real User Sessions

The more sessions you record, the better your test coverage:

Where to record:

  • Development: Engineers testing locally
  • Staging: QA and product testing
  • Preview URLs: Testing on PRs
  • Production: Contact support for guidance

What gets recorded:

  • User clicks, typing, scrolling
  • Network requests and responses
  • DOM changes and state

3.2 Curate Your Golden Set

Meticulous automatically selects sessions that maximize coverage:

  1. View selected sessions: Dashboard → Selected Sessions tab
  2. Increase count if needed: Configure → Number of Sessions
  3. Manually add important flows: Select specific sessions

3.3 Review Diffs on Pull Requests

When Meticulous detects visual differences:

  1. Review diff screenshots in PR comment
  2. Determine if change is:
    • Expected: Click "Approve" to mark as intentional
    • Bug: Fix the code
    • False positive: See Fix False Positives

3.4 Make CI Check Blocking (Optional)

Prevent merging PRs with unacknowledged diffs:

  1. Go to repository settings
  2. Add Meticulous check as required status
  3. Now PRs can't merge with pending diffs

See: Make CI Check Blocking Guide


Original Setup Details

Below are the detailed instructions from the original onboarding guide.

1. Installing the session recorder

Before installing the Meticulous session recorder, you should decide which environments are a good fit for recording user sessions:

  • The session recorder should always be installed on localhost to ensure that all sessions generated by engineers developing and testing your application locally are captured.
  • We also recommend installing it on other internal environments, like staging stacks or preview URLs.
  • If you are interested in recording production sessions, please reach out to Meticulous support for additional details.

There are two ways to add the Meticulous recorder to your web application:

  1. By inserting it as a script tag (recommended)
  2. By installing an NPM package

If possible, we recommend using the script tag, as it is the only way to guarantee that the recorder initializes before any other scripts execute, ensuring Meticulous can capture all network responses (learn more).

However, if it's not possible to template your HTML so that the script tag is included only in the environments where you want to record sessions, you can use the loader package instead.

More information on troubleshooting recorder problems can be found here and here.

2. Setting up tests to run in CI

Validating that sessions can be simulated

Before setting up tests to run in CI, you should ensure that sessions can be simulated on your local machine. Debugging any issues on your local machine is much easier than debugging any issues that occur in CI. You can replay locally using the following CLI command:

npx @alwaysmeticulous/cli simulate \
  --apiToken="<API_TOKEN>" \
  --sessionId="<SESSION_ID>" \
  --appUrl="<URL_WHERE_APP_IS_RUNNING_LOCALLY>"

Important session simulation scenarios to check:

  • Can you simulate sessions on authenticated pages? If not, see this doc on troubleshooting authentication & authorization issues.
  • Can you simulate sessions on a different environment from where they were recorded (e.g. can you simulate a session recorded on sandbox against your localhost environment)? If not, see this doc on troubleshooting cross-environment issues.

Two tools to debug simulation issues:

  • Once a simulation has completed and been uploaded to Meticulous, you can view all the events that happened during the simulation on the Timeline & Logs tab of the simulation’s page in the Meticulous UI
  • Simulating a session with the flags --debugger --devTools will let you step through the simulation one user event at a time and pinpoint exactly where the issue is occurring

Running tests in CI

For test runs to be executed in CI, Meticulous needs to be able to simulate sessions in 3 different contexts:

  1. PR test runs: whenever a new commit is pushed to a PR branch, Meticulous kicks off a test run where it simulates user sessions against the app running that commit. Ideally, this version of the app would be hosted on a commit-specific URL, but Meticulous can also run against the app if it's spun up in CI.
  2. New-commit-on-main test runs: whenever a new commit is pushed to the main branch, Meticulous kicks off a test run to take new baseline screenshots. Ideally, Meticulous would run against a commit-specific URL here as well, but running the app in CI also works.
  3. Session selection test runs: each night, Meticulous simulates every new session from the last 24 hours to determine if any of them add new coverage and should be included in the selected set. In this context, we want to test against the latest version of your app, so usually a publicly accessible staging URL works best.

How to get Meticulous to trigger test runs across these contexts depends on how you build and serve your app:

  • [Recommended] Upload static assets. If your app can be served as a folder of static assets (HTML/JS/CSS) without server-side rendering, this is the simplest and most reliable approach. Follow the guide here to set up the upload-assets action. This approach is not recommended for typical Next.js apps.
  • Upload a Docker container. For server-rendered apps (Next.js, Nuxt, etc.), you can build a Docker image and have Meticulous host it. Follow the guide here to set up the upload-container action.
  • Vercel preview URLs. If you use Vercel preview URLs, the complexity of dealing with three different environments is completely abstracted. You can add Meticulous’ Vercel integration here and then follow the guide here to complete installation.
  • Other preview URL providers (e.g. Netlify). If you deploy your app to a preview URL without Vercel, you can follow the doc here to set up Meticulous.
  • Tunnel to a running instance. If none of the above approaches work for your app, you can build and run your app in CI and have Meticulous connect via a tunnel. Follow the guide here to configure the Meticulous GitHub action. This is the most brittle approach and should be used as a last resort.

For a comprehensive list of all GitHub Action options and configuration parameters, see the report-diffs-action README.

Common Gotchas

Before diving into performance tuning, be aware of these common issues that can cause test failures:

Static Assets with Absolute URLs

Meticulous automatically swaps the base URL for page navigation and API requests, but static assets (CSS, JS, images) referenced with absolute URLs in your HTML are NOT rewritten.

If your HTML contains:

<script src="https://production.example.com/dist/app.js"></script>

Change it to a relative URL:

<script src="/dist/app.js"></script>

See Troubleshooting Cross-Environment Issues for more details.
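
The fix above can also be applied programmatically, for example in a build step. A hypothetical helper (the function name and origin parameter are illustrative, not part of Meticulous):

```typescript
// Illustrative helper (not part of Meticulous): rewrite absolute
// same-origin asset URLs to relative paths so they resolve correctly
// when the app is served from a different base URL during simulation.
function toRelativeAssetUrl(url: string, productionOrigin: string): string {
  if (!url.startsWith(productionOrigin)) {
    return url; // different origin (e.g. a third-party CDN): leave untouched
  }
  const path = url.slice(productionOrigin.length);
  return path.startsWith("/") ? path : "/" + path;
}

console.log(
  toRelativeAssetUrl(
    "https://production.example.com/dist/app.js",
    "https://production.example.com"
  )
); // "/dist/app.js"
```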

Understanding Network Mocking

Meticulous automatically mocks XHR, Fetch, and WebSocket requests, but does NOT mock static assets loaded via HTML tags. For a complete explanation of what is and isn't mocked, see How Meticulous Handles Network Requests.
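
The rule above can be summarized in code (a sketch restating the paragraph, not Meticulous internals):

```typescript
// Sketch restating the rule above (not Meticulous internals):
// XHR, Fetch, and WebSocket traffic is mocked during simulation;
// static assets loaded via HTML tags (<script>, <link>, <img>) are not.
type RequestKind = "xhr" | "fetch" | "websocket" | "html-static-asset";

function isMockedDuringSimulation(kind: RequestKind): boolean {
  return kind !== "html-static-asset";
}

console.log(isMockedDuringSimulation("fetch"));             // true
console.log(isMockedDuringSimulation("html-static-asset")); // false
```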

Cross-Environment Configuration

If you're recording sessions in one environment (e.g., production) and simulating against another (e.g., localhost or preview URLs), ensure your environments have consistent:

  • Authentication configuration
  • URL routing patterns
  • Environment variables

See our FAQ & Troubleshooting guide for more details.

3. Tuning Performance

Once Meticulous is set up to run in CI, there is usually a tuning period (~1-2 weeks) required to get Meticulous working at full capacity. A variety of features that can be used to tune Meticulous are described below.

Testing feature flags

If you make heavy use of feature flags we recommend configuring Meticulous to support using old sessions to test new features that are gated behind feature flags. Get started here.

Coverage

If you feel that coverage is insufficient, you can try to address it in a couple different ways:

  • Increase the number of selected sessions. When you click the Configure button in the Selected Sessions tab of your project dashboard in the Meticulous UI, you will see a Number of Sessions to Auto Select section. Below this, Meticulous will recommend increasing the number of selected sessions if doing so would add more coverage.
  • Enable the session recorder on production. Increasing the number of sessions being ingested by Meticulous should increase the level of coverage that Meticulous is able to provide. If you are interested in this option, please reach out to Meticulous support for more information.

Flakes & false diffs

You may run into a flake or false diff during the tuning period — below are some strategies to manage them:

  • If the flake is not too frequent or problematic, you can click on the screenshot of the diff and then click Flag as Unexpected. This sends a bug report to the Meticulous team so that the flake can be prioritized and fixed.
  • If the flake is problematic, you can use the techniques described in this doc to debug or ignore the problematic element.