Every local SEO discussion eventually includes someone claiming page speed is a top ranking factor. Someone else responds that content and links matter more. Both are partially right, and the data tells a more nuanced story than either camp admits.
Core Web Vitals are a confirmed Google ranking factor. But the effect size is small, the mechanism is indirect, and the impact varies by competitive intensity. Here is what the evidence actually supports.
The Claim vs the Evidence: Does Page Speed Move Local Rankings?
Google’s Official Statements on CWV and Local Search
Core Web Vitals are part of Google’s Page Experience signals, which are a confirmed ranking factor. Google has never disputed this. But Google has also been explicit about the weight.
Google’s John Mueller has stated that simply having better CWV scores than competitors does not guarantee higher rankings. Content relevance and quality remain far more important.
The practical interpretation: CWV acts as a tiebreaker. When two pages thoroughly address the same query with similar content quality, similar authority, and similar relevance signals, the page with better CWV scores is more likely to rank higher. It is the deciding factor when everything else is close.
For local search specifically, proximity, Google Business Profile signals, and reviews carry much more weight than page speed. A slow site with great reviews and a verified GBP will outrank a fast site with no reviews in almost every local query.
Correlation Studies: What Third-Party Data Shows (and Its Limits)
Analysis from multiple sources shows that domains failing CWV score about 3.7 percentage points worse in search visibility on average compared to domains passing CWV. That is real but modest.
More than 50% of websites still do not pass CWV as of the most recent data. This means simply meeting the threshold puts you ahead of half the web on this signal. Whether that matters depends on how competitive your local market is.
The limitation of correlation studies: sites that invest in CWV tend to also invest in content quality, mobile experience, and overall technical health. Isolating CWV’s independent contribution is difficult because it correlates with general site quality investment.
Case studies show stronger effects. Rakuten saw conversion rates jump 33% and revenue per visitor increase 53% after CWV optimization. CoinStats fixed LCP issues (Base64 images bloating HTML) and saw “Good” CWV URLs increase by 300% alongside a 300% increase in search impressions. But these are conversion and impression improvements, not pure ranking improvements. CWV’s biggest impact may be on what users do after they arrive, not on where you rank.
LCP, CLS, and INP: What Each Metric Measures
Largest Contentful Paint and Why Local Sites Fail It
LCP measures how long it takes for the largest visible element on the page to fully render. Google’s threshold: under 2.5 seconds is “Good,” 2.5 to 4 seconds is “Needs Improvement,” over 4 seconds is “Poor.”
Local business sites commonly fail LCP because of uncompressed hero images. The homepage slider showing your office, team photo, or project gallery loads a 3MB image file that takes 4 seconds to render on mobile.
Other LCP killers on local sites: server response time on cheap shared hosting, render-blocking CSS and JavaScript that prevents the page from painting until every resource loads, and web fonts that cause invisible text until the font file downloads.
Cumulative Layout Shift: The Hidden Killer on Service Pages
CLS measures visual stability. When content, buttons, and text shift position as the page loads, that is layout shift. Google’s thresholds: under 0.1 is “Good,” 0.1 to 0.25 is “Needs Improvement,” over 0.25 is “Poor.”
Local service pages fail CLS because of: images without width and height attributes (the browser does not know how much space to reserve until the image loads), ad units or chat widgets that inject themselves into the page after initial render, and dynamically loaded review widgets that push content down.
CLS is the metric most local businesses have never heard of but that most directly damages user experience. A user tries to tap the “Call Now” button and the page shifts, causing them to tap something else. That user leaves.
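The arithmetic behind layout shift is simple: each individual shift is scored as the impact fraction (how much of the viewport the unstable elements occupy) times the distance fraction (how far they moved, relative to the viewport height), and CLS is the largest sum of such scores within a session window. A minimal sketch of the per-shift calculation, using hypothetical numbers for an injected review widget:

```python
def shift_score(impact_fraction: float, distance_fraction: float) -> float:
    # One layout shift's score is the product of:
    #  - impact fraction: share of the viewport touched by the unstable elements
    #  - distance fraction: largest move distance / viewport height
    return impact_fraction * distance_fraction

# Hypothetical example: a review widget injects itself, affecting 50% of the
# viewport and pushing content down by 14% of the viewport height.
print(shift_score(0.5, 0.14))  # 0.07
```

At 0.07, that single shift already eats most of the 0.1 “Good” budget; a second injection of similar size pushes the page into failing territory.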
Interaction to Next Paint: The Newest Metric Nobody Optimizes
INP replaced First Input Delay (FID) as the interactivity metric. While FID only measured the first interaction, INP measures overall page responsiveness across all interactions during a visit. Google’s thresholds: under 200 milliseconds is “Good,” 200 to 500 milliseconds is “Needs Improvement,” over 500 milliseconds is “Poor.”
INP is the metric most local sites ignore entirely. It fails when pages have heavy JavaScript execution: complex booking widgets, interactive maps, real-time availability checkers, and embedded third-party tools that lock up the main thread.
For most simple local business sites (static pages with a contact form), INP is not a problem. For sites with booking systems, interactive service calculators, or multiple embedded tools, INP is likely failing.
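The three metrics above share the same three-band structure, so they collapse into one small classifier. This is a sketch against Google’s published cutoffs (web.dev/vitals), where “Needs Improvement” spans the gap between the Good and Poor boundaries for each metric:

```python
# Good / Poor cutoffs per Google's published thresholds (web.dev/vitals).
# Note the mixed units: seconds for LCP, a unitless score for CLS,
# and milliseconds for INP.
THRESHOLDS = {
    "LCP": (2.5, 4.0),   # seconds
    "CLS": (0.1, 0.25),  # layout-shift score
    "INP": (200, 500),   # milliseconds
}

def rate(metric: str, value: float) -> str:
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "Good"
    if value <= poor:
        return "Needs Improvement"
    return "Poor"

print(rate("LCP", 2.1))   # Good
print(rate("CLS", 0.18))  # Needs Improvement
print(rate("INP", 650))   # Poor
```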
Where CWV Matters Most in Local SEO
Mobile Performance as a Tiebreaker in Competitive Local Markets
Google uses mobile-first indexing, which means mobile CWV scores are what count for rankings. Desktop performance is secondary.
This matters disproportionately in competitive local markets. A plumber in a small town with three competitors does not need to worry about CWV as a ranking factor because there is not enough competition for tiebreakers to matter.
A personal injury lawyer in Houston competing against 200 other firms for the same Map Pack positions is in a different situation. When content quality, reviews, and authority are clustered tightly, CWV becomes one of the signals that separates position 3 from position 7. At that level, every marginal signal counts.
The practical rule: if your local niche has low competition, fix CWV for user experience but do not expect ranking changes. If your niche is highly competitive, CWV optimization is part of the overall competitive strategy.
Bounce Rate and Dwell Time: The Indirect Ranking Signal
Google has been evasive about whether bounce rate and dwell time are ranking factors. But the behavioral chain is real: a slow, glitchy page causes users to leave quickly. A page that loads fast and stays visually stable keeps users engaged longer. Users who stay longer are more likely to convert.
This is CWV’s indirect ranking pathway. Improved CWV leads to improved user behavior metrics, which leads to improved engagement signals, which Google’s algorithms can observe.
The effect is not attributable to CWV alone, which makes it hard to measure in isolation. But the chain is mechanistically sound. Every fraction of a second of load time improvement reduces the probability that a user hits the back button before the page finishes rendering.
Fixing CWV Issues on Typical Local Business Sites
Image Optimization for Location and Team Photos
Images are the number one CWV problem on local business sites. Location photos, team photos, project galleries, and hero images are often uploaded directly from cameras or phones at full resolution.
Fix: compress all images to WebP format, set explicit width and height attributes in HTML, implement lazy loading for below-the-fold images, and serve appropriately sized images for each device using srcset.
A 4000×3000 pixel team photo does not need to be served at full resolution to a mobile screen. Serve an 800-pixel-wide version to mobile and reserve the full resolution for desktop. This alone can cut LCP in half on image-heavy pages.
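The srcset mechanism works roughly like this: the browser multiplies the viewport’s CSS width by the device pixel ratio and picks the smallest candidate image that covers it. A simplified model of that selection, with hypothetical candidate widths you might generate from a 4000-pixel original:

```python
def pick_srcset_width(candidates, viewport_css_px, dpr):
    """Approximate how a browser chooses a srcset candidate for a
    full-width image (sizes="100vw"): the smallest width that covers
    viewport CSS pixels * device pixel ratio."""
    needed = viewport_css_px * dpr
    fitting = [w for w in sorted(candidates) if w >= needed]
    return fitting[0] if fitting else max(candidates)

# Hypothetical widths generated from a 4000px-wide team photo:
widths = [400, 800, 1600, 2400, 4000]
print(pick_srcset_width(widths, 390, 2))   # phone viewport, 2x screen -> 800
print(pick_srcset_width(widths, 1440, 1))  # desktop viewport -> 1600
```

The real algorithm also honors the `sizes` attribute and browser heuristics, but the takeaway holds: the mobile device downloads the 800-pixel file, not the 4000-pixel one.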
Third-Party Script Bloat: Chat Widgets, Booking Tools, Review Embeds
Every third-party tool on your site adds JavaScript that competes for the browser’s main thread. Chat widgets, booking tools, review embed scripts, analytics pixels, and social media widgets all fire on page load.
Common offenders on local sites: Tawk.to or Drift chat widgets loading immediately instead of after user interaction, Google Maps embeds loading the full Maps API on every page instead of just the contact page, review widgets from Birdeye or Podium loading full libraries before the page renders.
Fix: defer non-critical scripts, load chat widgets on scroll or click instead of page load, self-host critical fonts and scripts when possible, and audit every third-party script for actual business value. If you have a booking widget that 2% of visitors use, do not let it degrade the experience for the other 98%.
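As an illustration of the deferral step, here is a hypothetical helper that adds `defer` to external script tags that carry neither `defer` nor `async`. A real site should set the attribute in its templates rather than rewriting HTML after the fact, and the regex here is deliberately naive:

```python
import re

# Matches the opening tag of an external script: <script ... src="..." ...>
SCRIPT_TAG = re.compile(r'<script\s+([^>]*src="[^"]+"[^>]*)>', re.IGNORECASE)

def defer_external_scripts(html: str) -> str:
    """Add `defer` to external <script> tags so they stop blocking the
    initial render. Simplistic sketch: a plain substring check for
    defer/async, no handling of inline scripts (they have no src)."""
    def add_defer(match: re.Match) -> str:
        attrs = match.group(1)
        if "defer" in attrs or "async" in attrs:
            return match.group(0)  # already non-blocking; leave it alone
        return f'<script defer {attrs}>'
    return SCRIPT_TAG.sub(add_defer, html)

html = '<script src="https://example.com/chat-widget.js"></script>'
print(defer_external_scripts(html))
# <script defer src="https://example.com/chat-widget.js"></script>
```

For chat widgets specifically, `defer` is the floor, not the ceiling: loading the script only after the first scroll or click keeps it off the critical path entirely.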
Hosting and Server Response Time for Single-Location Sites
Server response time (Time to First Byte) directly impacts LCP. If your server takes 1.5 seconds just to begin sending the HTML, you have already used more than half of the 2.5-second LCP budget before a single element renders.
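That budget arithmetic is worth making explicit: whatever TTFB consumes comes straight out of the 2.5-second “Good” window before any rendering work can start.

```python
LCP_GOOD_BUDGET = 2.5  # seconds

def render_budget_after_ttfb(ttfb_seconds: float) -> float:
    """Seconds left for everything after the server's first byte:
    HTML download, CSS, image fetch, and the paint itself."""
    return LCP_GOOD_BUDGET - ttfb_seconds

print(render_budget_after_ttfb(1.5))  # 1.0
```

A 1.5-second TTFB leaves one second for everything else, which is rarely enough on a mobile connection with a large hero image.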
Cheap shared hosting plans are the most common server speed problem for local businesses. When your site shares a server with 500 other sites, response times spike during traffic surges.
For a single-location business site with modest traffic, a quality shared hosting plan from a reputable provider (not the cheapest option) is sufficient. For multi-location sites or sites with dynamic content, managed WordPress hosting or a basic VPS provides more consistent response times.
A CDN (Content Delivery Network) helps for geographically distributed visitors but is less critical for local businesses whose visitors are concentrated in one area. The closer your server is to your visitors, the faster the initial response. Choose a hosting provider with servers near your service area.
How to Monitor CWV Without a Dev Team
PageSpeed Insights vs CrUX Data vs Search Console Report
Three tools, three different data sources:
PageSpeed Insights runs a lab test on your page and gives you a score. It is useful for diagnosing specific issues but it reflects a simulated environment, not real user experience. Lighthouse scores from PageSpeed Insights do not directly impact rankings.
Chrome User Experience Report (CrUX) collects real user data from Chrome browsers. This is the data Google actually uses for ranking purposes. CrUX data appears at the top of PageSpeed Insights when available (it requires a threshold of real user traffic). If CrUX shows your site passing CWV, that is what matters for rankings regardless of what lab tests say.
Google Search Console’s Core Web Vitals report shows which pages are Good, Need Improvement, or Poor based on CrUX data. It groups pages with similar characteristics, so fixing one page’s issue may resolve the status for a group of similar pages.
The monitoring workflow: check Search Console monthly for CWV status changes. When issues appear, use PageSpeed Insights to diagnose the specific problem. After fixing, wait for CrUX data to update (typically 28-day rolling average) before assessing impact.
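The diagnosis step can be scripted. The PageSpeed Insights API (v5) returns the lab audit plus, when enough field data exists, the CrUX metrics under a `loadingExperience` key. A sketch of building the request URL and reading the field categories; the field names follow Google’s API documentation, the HTTP fetch itself is omitted, and the response shown is a trimmed illustrative shape, not live data:

```python
from urllib.parse import urlencode

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_request_url(page_url: str, api_key: str) -> str:
    # strategy=mobile, because mobile field data is what matters here
    params = {"url": page_url, "strategy": "mobile", "key": api_key}
    return f"{PSI_ENDPOINT}?{urlencode(params)}"

def field_categories(response: dict) -> dict:
    """Pull per-metric CrUX categories (FAST / AVERAGE / SLOW) from the
    loadingExperience block, when real-user data is available."""
    metrics = response.get("loadingExperience", {}).get("metrics", {})
    return {name: data["category"] for name, data in metrics.items()}

# Trimmed sample of the response shape (illustrative, not live data):
sample = {
    "loadingExperience": {
        "metrics": {
            "LARGEST_CONTENTFUL_PAINT_MS": {"percentile": 2100, "category": "FAST"},
            "INTERACTION_TO_NEXT_PAINT": {"percentile": 310, "category": "AVERAGE"},
        }
    }
}
print(field_categories(sample))
```

Running this monthly against your key money pages gives you the same CrUX view Search Console uses, without waiting for the report UI.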
Setting Thresholds That Actually Matter for Your Traffic Level
CrUX data requires a minimum number of real user visits before it reports. Small local business sites with under a few hundred monthly visitors may not have enough data for CrUX to populate at the page level. In that case, reports fall back to origin-level (site-wide) data, which is less granular, or show no field data at all.
If your site is too small for CrUX data, do not obsess over CWV for ranking purposes. Focus on content, reviews, and GBP optimization, which will have far greater impact at your traffic level. Fix obvious speed problems (massive images, broken scripts) for user experience, but do not treat CWV optimization as a ranking strategy.
Google looks at the 75th percentile of real user experiences, meaning the slowest 25% of your visitors’ experiences determine your CWV status. A site that loads fast for 74% of visitors but poorly for 26% still fails. This is by design: Google wants even your worst user experiences to be acceptable.
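The 75th-percentile rule is easy to demonstrate with numbers. In the sketch below (a simplified nearest-rank percentile; CrUX’s own aggregation differs), five fast visits out of eight still fail the metric when the slowest three blow past the threshold:

```python
def passes_cwv(samples_ms, good_threshold_ms):
    """A metric passes when its 75th-percentile field value is within
    the Good threshold: the slowest quarter of visits decides."""
    ordered = sorted(samples_ms)
    # nearest-rank 75th percentile: ceil(n * 0.75) - 1 as a zero-based index
    idx = max(0, -(-len(ordered) * 75 // 100) - 1)
    return ordered[idx] <= good_threshold_ms

# 8 LCP samples in ms: five fast visits, three slow ones (37.5% slow)
lcp = [1200, 1300, 1400, 1500, 1600, 4300, 4800, 5200]
print(passes_cwv(lcp, 2500))  # False
```

Flip just one of the slow samples to a fast one and the slow share drops to 25%, the p75 value lands inside the threshold, and the metric passes. That is the design intent: a fast median cannot paper over a bad tail.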
CWV thresholds and metric definitions in this guide reflect Google’s specifications as of February 2026. INP replaced FID as of March 2024. For current thresholds and implementation guidance, see web.dev/vitals.