Fix False Positive Diffs
Overview
Ideally, every difference you see in the Meticulous UI for a given PR should be directly caused by a code change introduced by that PR. Differences that are unrelated to your code change can show up for a few reasons:
- Changes in content derived from database data, or the current date/time, when using server side rendering (when using client side rendering Meticulous handles this automatically)
- Differences in how you build the code for the two environments being compared (e.g. PR build vs main branch build)
- Asynchronous tasks that Meticulous doesn't handle natively
- Other causes
In all of these cases, if you can't solve the underlying cause, you can mark the diff to be ignored:
Configuring certain diffs to be ignored
You can configure diffs inside certain elements to be ignored by adding a CSS selector to the 'Elements to ignore when comparing screenshots' list in the 'Screenshotting behavior' section of your project settings, or by adding the `meticulous-ignore` class to an element.
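For example, a hypothetical React component rendering a volatile timestamp could be excluded from screenshot comparisons like this (the component name and prop are illustrative):

```jsx
// Hypothetical example: any element with the meticulous-ignore class is
// skipped when Meticulous compares screenshots, so this changing timestamp
// won't produce false positive diffs.
export function LastUpdated({ updatedAt }) {
  return (
    <span className="meticulous-ignore">
      Last updated: {updatedAt.toLocaleTimeString()}
    </span>
  );
}
```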
Please note that if you open a pull request to add the `meticulous-ignore` class to an element, the ignore rule will only apply to Meticulous test runs for new PRs opened after the original PR adding the class was merged.
Alternatively, you can detect whether the app is being rendered as part of a Meticulous test, and disable the part of the UI or code that is causing false positive diffs. For frontend components you can use the `Meticulous` object on the window; for server side components or server side rendering you can use the `meticulous-is-test` header.
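As a minimal sketch, a NextJS server component can branch on the `meticulous-is-test` header, which Meticulous always sends with the value '1' during tests (the component below is hypothetical; the client side check via `window.Meticulous` is shown in an example further down this page):

```jsx
import { headers } from 'next/headers'

// Sketch: Meticulous sends the meticulous-is-test header (value '1') on every
// request it makes during a test, so we can detect tests server side.
const isMeticulousTest = () => headers().get('meticulous-is-test') === '1';

// Hypothetical usage: skip rendering a component that causes false positive
// diffs when running as part of a Meticulous test.
export function LiveVisitorCount({ count }) {
  if (isMeticulousTest()) return null;
  return <p>{count} people viewing now</p>;
}
```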
Diffs due to changes in data or the current date when using server side rendering, or rendering NextJS server components
By default Meticulous stubs out responses for any fetch or XHR requests from the browser and stubs out the browser's Date functions. This means that even if the data in your database changes or the time changes, you won't see any false positive diffs.
However, if you're using NextJS server components, Meticulous re-renders those server components on the backend every time it replays a session. This means that if the data in your database (or the date) changes in the short window between when Meticulous replays the session on the base commit and when it replays the session on the head commit, and you render that data or date to the page inside a server component, then you could see false positive diffs.
Meticulous sends two headers in every request it makes to your NextJS server to allow you to resolve this:
- `meticulous-is-test`: You can use this header to disable parts of your server components which cause flakes in Meticulous tests. It'll always be present (with value '1') if the request is being made as part of a Meticulous test.
- `meticulous-simulated-date`: If you're ever rendering text based on the current time (for example "Posted 7 minutes ago"), first check if the `meticulous-simulated-date` header is present on the request, and if so use that date instead of the current time. This avoids false positive diffs due to the time changing (e.g. "Posted 7 minutes ago" vs "Posted 8 minutes ago"): Meticulous will send the same timestamp every time for the same request. The timestamp is a UTC date in RFC 7231 format, which you can parse using `Date.parse`:
```js
import { headers } from 'next/headers'

const getCurrentDate = () => {
  // Meticulous sends a simulated date header when running tests:
  // use that instead of the current date if present
  const simulatedDate = headers().get('meticulous-simulated-date');
  return simulatedDate ? new Date(Date.parse(simulatedDate)) : new Date();
};
```
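A hypothetical server component could then use `getCurrentDate` to render stable relative timestamps during tests:

```jsx
// Hypothetical usage in a NextJS server component: relative times like
// "Posted 7 minutes ago" stay stable across replays because getCurrentDate
// returns the same simulated date for the same request during tests.
// postedAt is assumed to be a Date.
export function PostedAgo({ postedAt }) {
  const minutesAgo = Math.round(
    (getCurrentDate().getTime() - postedAt.getTime()) / 60000
  );
  return <span>Posted {minutesAgo} minutes ago</span>;
}
```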
You can also configure Meticulous to ignore the diffs using CSS selectors.
Diffs due to differences between environments
Using Vercel
If you use Vercel, Meticulous will try to automatically generate previews using the same environment configuration for commits to both the main branch and PR branches, so you shouldn't see any false positive diffs due to differences between environments. If you do, reach out to the Meticulous support team.
Using Netlify or other preview providers
If you use another preview URL provider, such as Netlify, then Meticulous will compare visual snapshots from the preview URL of the base commit on the main branch to snapshots from the preview URL of the head commit of the pull request branch.
In this case, the environment variables and configuration you use to run and build your app need to be the same for deployments of the main branch (production deploys) and deployments of pull request branches (preview deploys). If this isn't the case, Meticulous could display false screenshot differences.
For example, if you configure production deploys of your app (from the main branch) to have a blue banner, and preview deploys (from pull request branches) to have a red banner, then Meticulous would display screenshot diffs of the banner changing from blue to red on every screen. You want the only screenshot diffs Meticulous shows to be due to code changes introduced by the pull request being tested, rather than environmental differences between the environments being compared.
To fix this, check that the environment variables and configuration you use to run and build your app are the same for deployments of the main branch (production deploys) and deployments of pull request branches (preview deploys).
If it's not possible to unify the configuration across the environments then you can configure Meticulous to ignore the diffs.
Using GitHub Actions
If, instead of preview URLs, you're using the `report-diffs-action` GitHub action, then Meticulous will compare snapshots from running your app at the base commit of the main branch to snapshots from running your app at the head commit of the pull request branch. In this case it's similarly important to make sure that you compile and run your app with the same configuration for both the main branch and the pull request branches.
Diffs due to asynchronous tasks not handled natively by Meticulous
Meticulous automatically waits for most browser tasks to complete before continuing with JavaScript execution. This ensures the resulting screenshots are deterministic. However, if your application waits for asynchronous events that Meticulous doesn't handle natively, you can use the `Meticulous` object on the window to pause the execution of the replay while the asynchronous task is in progress.
For example, let's say you send a message to a custom Chrome extension and then wait for a response. In this case you can tell Meticulous to pause the replay until you have received the expected response:
```js
function sendMessageToExtension() {
  if (window.Meticulous?.isRunningAsTest) {
    // Meticulous will pause test execution for up to 30 seconds. If we don't
    // call pause() here Meticulous will sometimes take a screenshot before the
    // Chrome extension has responded, and sometimes after, causing flaky tests.
    window.Meticulous.replay.pause();
  }
  chrome.runtime.sendMessage(MY_EXTENSION_ID, "My message", (response) => {
    if (window.Meticulous?.isRunningAsTest) {
      // Important: we continue the replay even if the request fails
      window.Meticulous.replay.resume();
    }
    if (response.success) {
      doSomething(response.data);
    }
  });
}
```
False positive diffs due to other reasons
Meticulous ensures the session simulation executes identically every time, even if there are animations, timers, random number generators, changing data, or changing dates and times. So under normal operation false positive diffs or flakes should not happen.
However, if you are making extensive use of web workers, WebGL, or WASM, it is possible that in some cases you could see false positive diffs. If you do notice a false positive diff, please reach out to the Meticulous support team and we'll look into it. You can also configure Meticulous to ignore the diffs.
Where can I reach out for support?
Reach out to eng@meticulous.ai and we'll be happy to help. You can also join our community Discord.