An audit is not an opinion. It's a procedure that produces a number anyone can reproduce by running the same steps. This page is the procedure. It's public because methodology that isn't public can't function as a standard.
The question it answers: is the warmup tool actually moving the only number that matters in 2026, independent inbox placement on real campaigns sent to real prospects? Yes, no, or by how much.
Prerequisites
- One sending domain with at least four weeks of consistent campaign volume.
- Clean SPF, DKIM (1024-bit minimum, 2048-bit preferred), and DMARC at p=none or stricter.
- Access to your warmup vendor dashboard and its score export.
- An independent seed network the vendor cannot influence — accounts outside every warmup pool, with organic histories, polled by IMAP.
- One real campaign template you actually send. Don't audit a synthetic test template; audit production traffic.
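Polling the seed network can be sketched with Python's standard imaplib. Everything here is illustrative: the seed accounts, credentials, and spam-folder names are placeholders (Gmail exposes "[Gmail]/Spam" over IMAP; other providers use different names), not part of any vendor's API.

```python
import imaplib

# Placeholder seed accounts -- in practice, accounts outside every
# warmup pool, with organic histories. Credentials are hypothetical.
SEEDS = [
    {"host": "imap.gmail.com", "user": "seed1@example.com",
     "password": "app-password", "spam": "[Gmail]/Spam"},
]

def placement(seed, subject):
    """Return 'inbox', 'spam', or 'missing' for one seed account."""
    conn = imaplib.IMAP4_SSL(seed["host"])
    try:
        conn.login(seed["user"], seed["password"])
        for verdict, folder in (("inbox", "INBOX"), ("spam", seed["spam"])):
            conn.select(folder, readonly=True)
            _, data = conn.search(None, "SUBJECT", f'"{subject}"')
            if data[0].split():          # any message matched the subject
                return verdict
        return "missing"
    finally:
        conn.logout()

def inbox_rate(verdicts):
    """Share of delivered messages that landed in the inbox."""
    delivered = [v for v in verdicts if v != "missing"]
    return sum(v == "inbox" for v in delivered) / len(delivered) if delivered else 0.0
```

The daily independent-placement reading is then just `inbox_rate` over all seed verdicts for that day's send.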
The eight-step protocol
- Day 0 — Baseline capture. Record vendor dashboard score, vendor pool composition (number of accounts, providers represented), independent placement on the real template. Capture all three on the same date.
- Day 1-3 — Continue normal sending with warmup ON. Send the real campaign at the normal time of day. Measure independent placement on the same template each day. Record vendor dashboard score each day.
- Day 4 — Pause warmup. Disable the warmup tool. Continue normal campaign sending unchanged. This is the kill test inside the audit.
- Day 5-6 — Warmup OFF, measure. Same template, same time, same volume. Measure independent placement each day.
- Day 7 — Resume warmup. Re-enable the tool. One day of resume traffic before the next measurement.
- Day 8 — Warmup ON, measure again. Final independent placement reading.
- Compute deltas. Vendor score during ON vs OFF. Independent placement during ON vs OFF. Per-provider breakdown.
- Write the report. Three artefacts, one paragraph each. See the template below.
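The delta computation in step 7 reduces to comparing mean readings across ON and OFF days. A minimal sketch, with illustrative numbers that are not from any real audit:

```python
from statistics import mean

# Hypothetical daily readings from the eight-day protocol.
# Each entry: (day, warmup_on, vendor_score, independent_placement)
READINGS = [
    (1, True, 98, 0.84), (2, True, 97, 0.86), (3, True, 98, 0.85),
    (5, False, 91, 0.84), (6, False, 89, 0.85),
    (8, True, 98, 0.86),
]

def deltas(readings):
    """Mean ON reading minus mean OFF reading, for both score sources."""
    on = [r for r in readings if r[1]]
    off = [r for r in readings if not r[1]]
    return {
        "vendor_delta": mean(r[2] for r in on) - mean(r[2] for r in off),
        "independent_delta": mean(r[3] for r in on) - mean(r[3] for r in off),
    }
```

In this fabricated example the vendor score drops sharply when warmup is off while independent placement barely moves, which is exactly the divergence the audit exists to surface.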
Controls — what must stay constant
- Content. Same subject, same body, same links, same tracking-pixel state.
- Time of day. Sends within the same one-hour window each day.
- Volume. Same daily send count, same ramp.
- Authentication. No DNS changes during the audit week.
- List quality. Same prospect list segment, no fresh import mid-audit.
The only variable allowed to change is warmup ON / OFF. If you change anything else, the audit is invalid and you have to start over.
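One way to enforce this is to snapshot the controlled fields each day and refuse any pair of days that differ in anything except warmup state. The field names below are hypothetical, not a standard schema:

```python
# Controlled fields that must match across every pair of audit days.
# Names are illustrative; hash content and DNS records however you like.
CONTROLLED = ("subject", "body_hash", "send_hour", "volume", "dns_hash", "segment")

def valid_pair(day_a, day_b):
    """True if only the warmup state may differ between two daily snapshots."""
    return all(day_a[f] == day_b[f] for f in CONTROLLED)
```

Run this over every pair of daily snapshots before computing deltas; one failing pair invalidates the audit.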
Report template
Three artefacts. One page. Defensible to anyone — including a vendor that disputes the result.
- Artefact 1 — Vendor score over time. Eight daily readings from the vendor dashboard, ON and OFF days marked.
- Artefact 2 — Independent placement over time. Eight daily readings from the seed network, ON and OFF days marked, broken out by Gmail / Outlook / Yahoo / one regional.
- Artefact 3 — Delta narrative. One short paragraph: the vendor score during OFF days, the independent score during OFF days, and whether the warmup ON/OFF state changed the independent score by more than the daily variance baseline.
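The Artefact 3 test, "more than the daily variance baseline", can be made concrete in a few lines. This is one reasonable definition of the baseline (the larger within-state standard deviation), not the only one:

```python
from statistics import mean, pstdev

def beats_noise(on_scores, off_scores):
    """True if the ON/OFF gap exceeds the within-state daily variance baseline."""
    gap = abs(mean(on_scores) - mean(off_scores))
    baseline = max(pstdev(on_scores), pstdev(off_scores))
    return gap > baseline
```

If `beats_noise` is False for the independent placement series, the warmup tool's ON/OFF state did not move the number that matters beyond ordinary day-to-day wobble.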
Defending the result if the vendor disputes it
- You ran the same content, same auth, same time. The methodology is on this page.
- The seed network is outside every warmup pool — including the vendor's own pool. Provider verdicts are real-time.
- The audit measured independent placement, not pool-internal delivery. Those are different things; the vendor knows this.
- The eight-day window is short enough to control for content and ramp drift. If the vendor asks for a longer window, run a second audit. The number rarely moves much.
Inbox Check is independent of every warmup vendor, free for the campaigns you need to audit, and reports a per-provider breakdown by folder.
How often to repeat the audit
Quarterly is sufficient for most teams. Repeat after any major change: new domain, new ESP, large list refresh, content overhaul, or after any classifier update at a major provider. 2026 had two such updates already.