Performance work fails when it stays vague. "Make the site faster" is not a task anyone can act on. A fix list with evidence, priorities, and owners is.
This post gives you two practical tools you can use today: a copy-paste performance audit checklist and a report template formatted for handing to a developer or presenting to a client.
Both are designed to support the full audit process described in How to audit your website speed (Lighthouse + CrUX), and the broader performance playbook in Website Performance & Core Web Vitals: the Australian business guide.
Key takeaways
- A good audit records evidence — screenshots and specific URLs — not just scores.
- Five pages is enough to identify the patterns causing most of your problems.
- Prioritise fixes that improve LCP, INP, or CLS on your highest-value pages first.
- Track before/after so you know what worked and can demonstrate the improvement.
Why checklists matter for performance work
Most performance audits fail at the handover stage. A developer receives a Lighthouse screenshot and a request to "fix the performance." Without knowing which pages were tested, which metric is worst, what the LCP element is, or which third-party scripts are causing INP problems, the implementation work is guesswork.
A structured checklist and report template solve this. They force you to capture the specific evidence needed to turn a performance audit into scoped, actionable implementation work — and they make it easy to measure whether the work actually improved things.
Performance audit checklist
Use this checklist every time you run a performance audit. Work through it in order.
Scope and setup
- [ ] Identify the primary business goal for this audit (lead generation, ecommerce conversion, organic rankings, ad quality score)
- [ ] Choose five URLs to test — prioritise pages that drive the most revenue or traffic
- Homepage
- Top service or product page
- Lead form or quote page
- Top organic landing page (from Search Console)
- One "heavy" page (gallery, long-form, or video-heavy)
- [ ] Confirm you will test Mobile performance (not desktop — mobile is where most visitors arrive and where problems are most severe)
Field data — real users
- [ ] Run PageSpeed Insights for each of the five URLs
- [ ] Record whether field data is available for each URL (yes/no — smaller sites may not have enough Chrome user data)
- [ ] Record field Core Web Vitals for each URL: LCP status, INP status, CLS status (Good / Needs Improvement / Poor)
- [ ] Identify your single weakest metric across all pages — this is your top priority
- [ ] Note any specific pages where field data differs significantly from lab results
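To keep the Good / Needs Improvement / Poor calls consistent across pages, the thresholds can be applied programmatically. A minimal sketch in Python using Google's published Core Web Vitals bands (LCP 2.5s/4s, INP 200ms/500ms, CLS 0.1/0.25); the page names and values are illustrative:

```python
# Core Web Vitals thresholds published by Google.
# Each tuple is (good_max, needs_improvement_max); above the second value is Poor.
THRESHOLDS = {
    "LCP": (2.5, 4.0),    # seconds
    "INP": (200, 500),    # milliseconds
    "CLS": (0.1, 0.25),   # unitless score
}

def rate(metric: str, value: float) -> str:
    """Classify a field metric value as Good / Needs Improvement / Poor."""
    good_max, ni_max = THRESHOLDS[metric]
    if value <= good_max:
        return "Good"
    if value <= ni_max:
        return "Needs Improvement"
    return "Poor"

def weakest_metric(pages: dict) -> str:
    """Find the metric failing on the most pages across the audit set.

    `pages` maps page name -> {"LCP": ..., "INP": ..., "CLS": ...}.
    """
    failures = {m: 0 for m in THRESHOLDS}
    for values in pages.values():
        for metric, value in values.items():
            if rate(metric, value) != "Good":
                failures[metric] += 1
    return max(failures, key=failures.get)

# Illustrative audit readings for two of the five test URLs.
pages = {
    "Homepage":     {"LCP": 4.8, "INP": 180, "CLS": 0.02},
    "Service page": {"LCP": 3.1, "INP": 250, "CLS": 0.05},
}
print(rate("LCP", 4.8))       # Poor
print(weakest_metric(pages))  # LCP
```

The metric returned by `weakest_metric` is the "single weakest metric" the checklist asks you to identify.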
Lab data — Lighthouse debugging
- [ ] Run Lighthouse (Mobile) for each URL in Chrome DevTools
- [ ] Run each test two to three times and use the median score
- [ ] Screenshot the Performance score, Opportunities section, and Diagnostics section for each URL
- [ ] Record the specific LCP element for each page (Lighthouse identifies this — it's usually an image or headline)
- [ ] Note any pages where the LCP element is a slider, video, or CSS background image
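Lighthouse lab scores fluctuate with network and CPU conditions, which is why the checklist asks for the median of two to three runs rather than a single result. A small helper, assuming you record the mobile Performance scores by hand:

```python
from statistics import median

def median_score(runs: list[int]) -> int:
    """Return the median Lighthouse Performance score from repeated runs.

    A single lab run can be misleading; the median of 2-3 runs
    smooths out one-off network or CPU spikes.
    """
    if not runs:
        raise ValueError("record at least one Lighthouse run")
    return round(median(runs))

print(median_score([68, 75, 71]))  # 71
```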
Image audit
- [ ] Identify all images on the page using DevTools Network tab, filtering by image type
- [ ] Check that the hero or LCP image is in WebP or AVIF format (not JPEG or PNG)
- [ ] Verify images are served at appropriate dimensions (not a 2,400px image on a 390px phone)
- [ ] Check that all images have explicit width and height attributes (prevents CLS)
- [ ] Look for images loaded via CSS background-image on the LCP element (these cannot be preloaded easily)
- [ ] Check for unnecessary sliders or carousels in the first visible viewport
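The "appropriate dimensions" check can be made concrete: compare the image's intrinsic width (visible in the Network tab) against the width it is displayed at, allowing for device pixel ratio. A rough sketch; the 2x DPR allowance and 25% slack are assumptions for typical phones, not fixed rules:

```python
def is_oversized(intrinsic_width: int, display_width: int, dpr: float = 2.0) -> bool:
    """Flag an image served much larger than its rendered size.

    A 2,400px-wide image shown at 390 CSS pixels wastes bandwidth even
    on a 2x-density screen, which only needs ~780 physical pixels.
    """
    needed = display_width * dpr
    return intrinsic_width > needed * 1.25  # 25% slack for srcset rounding

print(is_oversized(2400, 390))  # True  -- the checklist's example offender
print(is_oversized(800, 390))   # False -- fine for a 2x phone screen
```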
Third-party script audit
- [ ] Open DevTools → Network tab and record every third-party domain loading scripts or resources
- [ ] Categorise each script: analytics, advertising pixels, chat, heatmap/session recording, A/B testing, social embeds, booking system, other
- [ ] Mark each script as: Keep (essential, actively measured) / Defer (useful but not critical on first load) / Remove (unused, redundant, or unmeasured)
- [ ] Check for duplicate pixels (e.g., two versions of the Facebook Pixel, or both GA3 and GA4 running simultaneously)
- [ ] Estimate the combined JavaScript weight of third-party scripts
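Tallying the Keep / Defer / Remove decisions makes the payoff of the script audit visible before any work starts. A sketch that sums transfer sizes (read from the Network tab) by decision; the script names and sizes below are illustrative, not real measurements:

```python
# Illustrative entries: (script, category, decision, transfer size in KB).
scripts = [
    ("GA4",            "analytics", "Keep",   90),
    ("Facebook Pixel", "ads",       "Keep",  110),
    ("Old FB Pixel",   "ads",       "Remove", 110),
    ("Chat widget",    "chat",      "Defer",  320),
    ("Heatmap tool",   "heatmap",   "Remove", 150),
]

def weight_by_decision(entries):
    """Sum third-party JavaScript weight under each Keep/Defer/Remove decision."""
    totals = {"Keep": 0, "Defer": 0, "Remove": 0}
    for _name, _category, decision, kb in entries:
        totals[decision] += kb
    return totals

print(weight_by_decision(scripts))
# {'Keep': 200, 'Defer': 320, 'Remove': 260}
```

The Remove and Defer totals are the JavaScript weight you can cut or move off the critical path without touching anything essential.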
Layout stability (CLS)
- [ ] Check all images and iframes for explicit width and height attributes
- [ ] Look for cookie banners or promotional bars that appear above the hero content after load
- [ ] Check font loading — is font-display: swap set? Does the fallback font match the web font size closely?
- [ ] Look for any content that loads asynchronously and expands in the viewport
- [ ] Check ad slots — are their dimensions pre-allocated?
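Why do late-loading banners matter so much? A layout shift is scored as impact fraction times distance fraction. A rough sketch of that calculation for a single shift, under the simplifying assumption that the banner pushes down content which already fills the viewport:

```python
def layout_shift_score(banner_height: int, viewport_height: int) -> float:
    """Estimate the layout shift from a banner injected above full-height content.

    Shift score = impact fraction * distance fraction. When the pushed
    content fills the viewport, the union of its before and after
    positions covers the whole viewport (impact fraction ~1), so the
    score reduces to the distance moved over the viewport height.
    """
    distance_fraction = banner_height / viewport_height
    impact_fraction = 1.0  # simplifying assumption: content fills the viewport
    return round(impact_fraction * distance_fraction, 3)

# A 90px cookie bar on an 844px phone viewport:
print(layout_shift_score(90, 844))  # 0.107 -- already past the 0.1 CLS target
```

One injected banner can blow the entire CLS budget on its own, which is why reserving its space up front is a P1-grade fix.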
Server and hosting
- [ ] Record Time to First Byte (TTFB) for each URL using the Network tab in DevTools
- [ ] Check whether a CDN is in use and whether static assets are served from it
- [ ] For WordPress sites: check whether full-page caching is enabled
- [ ] Note the hosting provider and plan — shared hosting is a common TTFB bottleneck
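Once TTFB values are recorded, a consistent rating helps the comparison across pages. The bands below follow Google's web.dev guidance, which treats roughly 800ms as the upper bound for a good TTFB and 1,800ms as the start of poor:

```python
def rate_ttfb(ms: int) -> str:
    """Classify Time to First Byte per web.dev guidance (800ms / 1800ms bands)."""
    if ms <= 800:
        return "Good"
    if ms <= 1800:
        return "Needs Improvement"
    return "Poor"

print(rate_ttfb(350))   # Good
print(rate_ttfb(1200))  # Needs Improvement -- common on uncached shared hosting
```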
Prioritisation
- [ ] Assign each issue a priority: P1 (Core Web Vitals on key pages) / P2 (template-level, multi-page impact) / P3 (secondary improvements)
- [ ] Mark which fixes are template-level (fix once, benefit many pages) — prioritise these
- [ ] Assign an owner to each fix: content team, marketing team, developer, agency
- [ ] Add rough effort estimates: hours, days, or weeks
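With priorities, owners, and effort recorded, the backlog largely sorts itself: P1 before P2 before P3, and within a priority band, cheapest first so early wins land quickly. A sketch with illustrative issues and hour estimates:

```python
# Illustrative backlog entries: (issue, priority, effort in hours).
backlog = [
    ("Remove unused heatmap script", "P1", 1),
    ("Convert hero images to WebP",  "P1", 4),
    ("Enable full-page caching",     "P2", 3),
    ("Self-host web fonts",          "P3", 2),
]

def sprint_order(issues):
    """Order fixes P1 -> P3, cheapest first within each priority band."""
    # "P1" < "P2" < "P3" sorts correctly as strings.
    return sorted(issues, key=lambda i: (i[1], i[2]))

for issue, priority, hours in sprint_order(backlog):
    print(f"{priority} ({hours}h): {issue}")
```

The first few rows of this ordering become your Sprint 1 scope.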
Re-test and measure
- [ ] Record all baseline metrics before implementation begins
- [ ] Re-run Lighthouse after each sprint of fixes
- [ ] Record updated LCP, INP, CLS (field data where available, lab data otherwise)
- [ ] Record conversion metrics before and after if you have reliable tracking
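A simple before/after delta keeps the "did it work" question honest when you re-test. A sketch comparing baseline and post-sprint readings; the numbers are illustrative:

```python
# Illustrative baseline and post-Sprint-1 readings.
before = {"Lighthouse": 48, "LCP (s)": 4.8, "CLS": 0.21}
after  = {"Lighthouse": 71, "LCP (s)": 2.9, "CLS": 0.06}

def deltas(before, after):
    """Report the change in each metric after a sprint of fixes."""
    return {metric: round(after[metric] - before[metric], 2) for metric in before}

print(deltas(before, after))
# {'Lighthouse': 23, 'LCP (s)': -1.9, 'CLS': -0.15}
```

Negative deltas on LCP and CLS are improvements; a positive delta on the Lighthouse score is.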
Performance audit report template
Copy this template, fill it in during your audit, and send it to your developer or agency.
Project: [Site name]
Primary URL: [https://example.com]
Audit date: [Date]
Auditor: [Your name]
Business goal: [Lead generation / Ecommerce / Organic rankings / Paid search quality score]
Audience context: [e.g. Brisbane tradies on mobile, ecommerce buyers, professional services prospects]
URLs tested
- [URL 1 — describe page type]
- [URL 2]
- [URL 3]
- [URL 4]
- [URL 5]
Executive summary
[2–4 sentences: What's slow, what's causing it, and what we'll fix first.]
Example: "Mobile performance is poor across all five pages, driven primarily by an oversized hero image (LCP 4.8s) and three heavy third-party scripts adding significant JavaScript weight. The LCP element is a full-width JPEG on the homepage. Priority fixes are image conversion to WebP with responsive sizes, and removing two unused tracking pixels."
Core Web Vitals snapshot — field data
| Page | LCP | INP | CLS | Field data available? |
|------|-----|-----|-----|-----------------------|
| Homepage | | | | Yes / No |
| [Page 2] | | | | Yes / No |
| [Page 3] | | | | Yes / No |
| [Page 4] | | | | Yes / No |
| [Page 5] | | | | Yes / No |
LCP target: ≤ 2.5s | INP target: ≤ 200ms | CLS target: ≤ 0.1
Top issues (ranked by impact)
Issue 1:
- Description: [What is the problem]
- Evidence: [Screenshot / URL / Lighthouse finding]
- Root cause: [Why it's happening]
- Fix: [What needs to be done]
- Owner: [Developer / Content team / Marketing]
- Priority: P1 / P2 / P3
- Effort estimate: [Hours / Days]
Issue 2:
- Description:
- Evidence:
- Root cause:
- Fix:
- Owner:
- Priority:
- Effort estimate:
Issue 3:
- Description:
- Evidence:
- Root cause:
- Fix:
- Owner:
- Priority:
- Effort estimate:
Full backlog
P1 — Fix in Sprint 1 (this week):
P2 — Fix in Sprint 2 (next 1–2 weeks):
P3 — Later (backlog):
Before/after tracking
| Metric | Before | After Sprint 1 | After Sprint 2 |
|--------|--------|----------------|----------------|
| Lighthouse Mobile | | | |
| LCP (field) | | | |
| INP (field) | | | |
| CLS (field) | | | |
| Bounce rate | | | |
| Conversion rate | | | |
Technical notes
- CMS / platform: [WordPress, Webflow, Next.js, Shopify, etc.]
- Hosting / CDN: [Provider and plan]
- Third-party scripts in use: [List]
- Must-keep scripts (non-negotiable): [List]
- Known constraints: [e.g. marketing team owns GTM, developer access limited]
How to use this after the audit
Once you've completed the checklist and filled in the report template, your next steps are:
- Share the report with your developer or agency — it's the scoped brief for the implementation work.
- Run Sprint 1 fixes (P1 items) and re-test with Lighthouse immediately after.
- Check field data two to four weeks post-implementation — Core Web Vitals field data reflects a 28-day rolling window, so improvements in lab data won't show in field data immediately.
- Run Sprint 2 fixes and repeat.
If your analytics tracking is not reliable enough to measure before/after conversion impact, address that first — gut-feel performance wins are easy to claim but hard to defend without data. Our Analytics & Tracking service covers exactly this.
FAQs
Should I include every Lighthouse recommendation?
No. Focus only on findings that connect to LCP, INP, or CLS on your high-value pages. Lighthouse can generate dozens of recommendations — most of them secondary. Work the P1 list first.
What if I can't get field data for my site?
Use lab data for debugging. Many smaller Australian business sites don't have enough Chrome user traffic to generate CrUX field data. In that case, Lighthouse lab data is your primary tool, and you measure success through conversion metrics rather than field Core Web Vitals.
Do I need a developer to run this audit?
No — the audit itself can be done by anyone with a Chrome browser. Implementation of the fixes typically requires developer support, particularly for caching, JavaScript changes, and template-level work. If you want a professional audit with a developer-ready backlog built in, start with our Website Performance service.
Next step
Use the checklist on your top five pages, fill in the report template, and hand it to your developer. For the full context on what each finding means and how to fix it: Website Performance & Core Web Vitals: the Australian business guide.
And if you want this done professionally — audit, prioritised backlog, and implementation — get in touch.