Vivotiv

How Our Website Scan Works

Last updated April 2026

Six analysis tracks, one report

Our scan combines six complementary analysis tracks to give you a complete picture of your website's health. Each track is purpose-built for specific types of checks, and together they cover performance, SEO, accessibility, trust and security, website quality, and AI readiness.

The entire scan runs in the background and typically completes in 15 to 20 seconds.

Lighthouse: performance, SEO, and best practices

Lighthouse is an open-source tool developed by Google. It is the same engine that powers the audits in Chrome DevTools and Google's PageSpeed Insights.

We use Lighthouse to measure:

  • Core Web Vitals including Largest Contentful Paint (LCP), Total Blocking Time (TBT), and Cumulative Layout Shift (CLS)
  • Loading performance including First Contentful Paint, Speed Index, and Time to First Byte
  • SEO fundamentals including meta tags, canonical URLs, viewport configuration, and crawlability
  • Best practices including HTTPS usage, safe JavaScript patterns, and modern API usage

Lighthouse is an industry standard for measuring web quality. Some Lighthouse metrics, particularly Core Web Vitals, overlap with Google's page experience signals used in search ranking.
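Lighthouse can emit its full audit as JSON, and the metrics above are read out of that report. The sketch below shows the idea using the field names from Lighthouse's JSON report format (`audits[...].numericValue`, category scores from 0 to 1); the sample report is a hand-made illustration, not real scan output.

```python
# Sketch: extracting Core Web Vitals from a Lighthouse JSON report.
# The nested field names follow Lighthouse's report format; the sample
# report below is illustrative, not real scan output.

sample_report = {
    "categories": {"performance": {"score": 0.92}},
    "audits": {
        "largest-contentful-paint": {"numericValue": 1840.0},  # milliseconds
        "total-blocking-time": {"numericValue": 120.0},        # milliseconds
        "cumulative-layout-shift": {"numericValue": 0.04},     # unitless
    },
}

def core_web_vitals(report: dict) -> dict:
    """Pull LCP, TBT, CLS, and the performance score out of a report dict."""
    audits = report["audits"]
    return {
        "lcp_ms": audits["largest-contentful-paint"]["numericValue"],
        "tbt_ms": audits["total-blocking-time"]["numericValue"],
        "cls": audits["cumulative-layout-shift"]["numericValue"],
        # Lighthouse category scores are 0-1; scale to the familiar 0-100.
        "performance_score": round(report["categories"]["performance"]["score"] * 100),
    }

vitals = core_web_vitals(sample_report)
print(vitals)
```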

axe-core: accessibility testing

axe-core is developed by Deque Systems and is one of the most widely used accessibility testing engines. It is integrated into tools like Chrome DevTools and is used by many accessibility auditing platforms.

We run axe-core using Playwright, a browser automation framework, which lets us test your site in a real browser environment rather than against static HTML. This matters because many accessibility issues only appear after JavaScript has rendered the page.

We test against both desktop and mobile viewport sizes, and check all WCAG 2.1 A and AA rules. The scan reports:

  • Violations categorized by severity (critical, serious, moderate, minor)
  • Incomplete findings that need manual verification
  • The number of affected elements per violation

axe-core is used by organizations including Microsoft and Google, and is designed to minimize false positives. When it reports a violation, it is highly likely to be a real issue.
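The three reported items above map directly onto axe-core's result format: each violation carries an `impact` (severity) and a list of affected `nodes`. A minimal sketch of that grouping, with illustrative sample data rather than real axe output:

```python
# Sketch: grouping axe-core violations by severity ("impact" in axe's
# result format) and counting affected elements ("nodes" per violation).
# The sample violations are illustrative, not real scan output.
from collections import Counter

sample_violations = [
    {"id": "color-contrast", "impact": "serious", "nodes": [{}, {}, {}]},
    {"id": "image-alt", "impact": "critical", "nodes": [{}]},
    {"id": "region", "impact": "moderate", "nodes": [{}, {}]},
]

def summarize(violations):
    by_severity = Counter(v["impact"] for v in violations)
    affected = {v["id"]: len(v["nodes"]) for v in violations}
    return by_severity, affected

severity_counts, affected_elements = summarize(sample_violations)
print(dict(severity_counts))
print(affected_elements)
```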

HTTP and TLS header analysis

This track inspects your server's response headers and SSL/TLS configuration. It does not require loading the page in a browser, so it runs in parallel with the other tracks.

We check:

  • Content Security Policy (CSP) quality and whether overly permissive policies (like unsafe-inline or unsafe-eval) are present
  • HSTS (HTTP Strict Transport Security) configuration including max-age value and includeSubDomains presence
  • Security headers including X-Frame-Options, X-Content-Type-Options, Referrer-Policy, and Permissions-Policy
  • Server exposure headers that may reveal technology versions to potential attackers
  • SSL/TLS certificate validity and trust chain status

These headers are your first line of defense against common web attacks. Missing or misconfigured headers leave your site and your visitors exposed.
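The checks in this track boil down to inspecting a response-header map. The sketch below shows the shape of that logic; the header names are standard, but the specific thresholds (such as treating an HSTS max-age under one year as weak) are illustrative assumptions, not necessarily the scan's exact rules.

```python
# Sketch of the header checks described above: flag missing security
# headers, permissive CSP directives, and weak HSTS settings.
# Thresholds here are illustrative assumptions.
import re

REQUIRED = [
    "content-security-policy",
    "strict-transport-security",
    "x-frame-options",
    "x-content-type-options",
    "referrer-policy",
    "permissions-policy",
]

def audit_headers(headers: dict) -> list:
    headers = {k.lower(): v for k, v in headers.items()}  # header names are case-insensitive
    findings = []
    for name in REQUIRED:
        if name not in headers:
            findings.append(f"missing: {name}")
    csp = headers.get("content-security-policy", "")
    for keyword in ("unsafe-inline", "unsafe-eval"):
        if keyword in csp:
            findings.append(f"permissive CSP: {keyword}")
    hsts = headers.get("strict-transport-security", "")
    match = re.search(r"max-age=(\d+)", hsts)
    if match and int(match.group(1)) < 31536000:  # less than one year
        findings.append("HSTS max-age below one year")
    if hsts and "includesubdomains" not in hsts.lower():
        findings.append("HSTS missing includeSubDomains")
    return findings

findings = audit_headers({
    "Content-Security-Policy": "default-src 'self' 'unsafe-inline'",
    "Strict-Transport-Security": "max-age=86400",
    "X-Content-Type-Options": "nosniff",
})
print(findings)
```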

External security checks

We cross-reference your site against external security databases to detect threats that internal analysis cannot catch:

  • Google Web Risk checks whether your URL has been flagged for malware, social engineering, or unwanted software
  • MDN Observatory grades your site's security configuration against Mozilla's recommended standards

These checks run via external APIs and complement the header analysis with an outside-in perspective on your site's security posture.

Broken link detection

We crawl the links on your page and verify that they resolve correctly. Broken links hurt both user experience and SEO, as search engines treat dead links as a signal of poor site maintenance. The scan reports the number of broken links found and which URLs are affected.
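The first half of that process, collecting candidate links from a page, can be sketched with the standard library. In the real scan each URL would then be fetched and anything returning a 4xx/5xx status (or failing to connect) reported as broken; the fetch step is omitted here to keep the sketch self-contained and offline.

```python
# Sketch: extracting candidate links from a page's HTML. Fragment-only,
# mailto:, and javascript: links are skipped since they are not
# fetchable targets; relative links are resolved against the page URL.
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href and not href.startswith(("#", "mailto:", "javascript:")):
                self.links.append(urljoin(self.base_url, href))

page = '<a href="/about">About</a> <a href="https://example.org/x">X</a> <a href="#top">Top</a>'
extractor = LinkExtractor("https://example.com/")
extractor.feed(page)
print(extractor.links)  # ['https://example.com/about', 'https://example.org/x']
```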

AI readiness analysis

AI-powered search engines (ChatGPT, Perplexity, Gemini, Google AI Overviews) are changing how people find businesses online. Gartner projects that traditional search engine volume will drop 25% by 2026. This track checks whether your site is prepared for this shift.

We analyze:

  • AI crawler access by parsing your robots.txt to determine which AI bots can reach your content. Blocking search crawlers like OAI-SearchBot or PerplexityBot means your business is invisible to AI search. Blocking training crawlers (GPTBot, ClaudeBot) is a valid choice and does not affect the score.
  • llms.txt presence and validity. This emerging standard (adopted by 844,000+ sites) provides a structured summary of your site for AI consumption.
  • Structured data completeness beyond the basic SEO check. We evaluate whether your schema is detailed enough for an AI to identify your business name, location, and offering.
  • Content renderability by comparing your raw HTML (what AI crawlers see) against the browser-rendered page. If your content requires JavaScript to appear, most AI crawlers cannot see it.
  • Semantic HTML usage. Tags like <main>, <article>, and <nav> help AI distinguish your actual content from navigation and chrome.
  • Entity clarity by cross-referencing your business name across title, meta description, Open Graph tags, and structured data for consistency.
  • Citation readiness by checking for content patterns that correlate with higher AI citation rates: data tables, lists, statistics, and self-contained answer sections.
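The AI crawler access check above can be sketched with the standard library's robots.txt parser. The bot names are real AI crawler user agents from the text; the robots.txt content and site URL are illustrative.

```python
# Sketch: which AI crawlers does a robots.txt let through? This example
# blocks the training crawler GPTBot while leaving search crawlers open,
# the configuration described as a valid choice above.
from urllib.robotparser import RobotFileParser

SEARCH_BOTS = ["OAI-SearchBot", "PerplexityBot"]  # power AI search results
TRAINING_BOTS = ["GPTBot", "ClaudeBot"]           # collect training data

robots_txt = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Disallow:
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

access = {bot: parser.can_fetch(bot, "https://example.com/")
          for bot in SEARCH_BOTS + TRAINING_BOTS}
print(access)
```

Under this configuration the search bots remain allowed (so the site stays visible to AI search), while GPTBot is denied.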

These checks run across two tracks: DOM analysis via Playwright (structured data, semantic HTML, entity clarity, citation readiness) and lightweight HTTP fetches (robots.txt, llms.txt, raw HTML for the renderability comparison). Both run in parallel with all other scan tracks.
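The renderability comparison can be illustrated as a simple length ratio between the visible text of the raw HTML and of the rendered page. This is a minimal sketch under that assumption; the scan's actual heuristic may differ, and the two HTML snippets below stand in for a real raw-versus-rendered pair.

```python
# Sketch: compare the visible text of raw HTML (what most AI crawlers
# see) against the browser-rendered page. A ratio near zero means the
# content only exists after JavaScript runs.
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    SKIP = {"script", "style", "noscript"}  # tags whose text is never visible

    def __init__(self):
        super().__init__()
        self.depth = 0      # > 0 while inside a skipped tag
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self.depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self.depth:
            self.depth -= 1

    def handle_data(self, data):
        if not self.depth and data.strip():
            self.chunks.append(data.strip())

def visible_text(html: str) -> str:
    extractor = TextExtractor()
    extractor.feed(html)
    return " ".join(extractor.chunks)

raw = '<main><script>render()</script></main>'                    # JS-only shell
rendered = '<main><h1>Acme Co</h1><p>We build widgets.</p></main>'

ratio = len(visible_text(raw)) / max(len(visible_text(rendered)), 1)
print(f"raw/rendered text ratio: {ratio:.2f}")  # near 0: content needs JS
```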

How we score results

Each of the six categories receives a score from 0 to 100.

Checklist-based categories (performance, SEO, trust and security, website quality, AI readiness) use a fixed set of checks. Each check returns pass, warn, or fail, and the category score is a weighted average.

Accessibility uses a deduction model. The score starts at 100 and points are deducted for each violation based on severity:

  • Critical: -15 points
  • Serious: -10 points
  • Moderate: -5 points
  • Minor: -2 points
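The deduction model is a few lines of arithmetic. One detail is assumed here: that the score floors at zero rather than going negative, which the text does not state explicitly.

```python
# Sketch of the accessibility deduction model, using the per-severity
# deductions listed above.
DEDUCTIONS = {"critical": 15, "serious": 10, "moderate": 5, "minor": 2}

def accessibility_score(violations):
    """violations: list of severity strings, one entry per violation."""
    score = 100
    for severity in violations:
        score -= DEDUCTIONS[severity]
    return max(score, 0)  # assumption: the score floors at zero

print(accessibility_score(["critical", "serious", "minor"]))  # 100-15-10-2 = 73
```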

The overall score is a weighted average across all six categories:

  • Performance: 20%
  • SEO: 20%
  • Accessibility: 20%
  • Trust and security: 20%
  • Website quality: 10%
  • AI readiness: 10%
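Using those weights, the overall score works out as follows; the category scores in the example are hypothetical inputs, not real scan results.

```python
# The overall score as a weighted average, with the category weights
# listed above (they sum to 1.0).
WEIGHTS = {
    "performance": 0.20, "seo": 0.20, "accessibility": 0.20,
    "trust_security": 0.20, "website_quality": 0.10, "ai_readiness": 0.10,
}

def overall_score(category_scores: dict) -> float:
    return round(sum(category_scores[c] * w for c, w in WEIGHTS.items()), 1)

print(overall_score({
    "performance": 90, "seo": 80, "accessibility": 73,
    "trust_security": 95, "website_quality": 60, "ai_readiness": 40,
}))  # 18 + 16 + 14.6 + 19 + 6 + 4 = 77.6
```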

What the scores mean

  • 90 to 100 (green): Good. The category meets or exceeds current standards.
  • 50 to 89 (orange): Needs attention. There are issues that should be addressed.
  • 0 to 49 (red): Significant problems. Immediate action is recommended.
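The three bands reduce to a small threshold lookup:

```python
# The score bands above as a lookup: 90+ green, 50-89 orange, below 50 red.
def band(score: int) -> str:
    if score >= 90:
        return "green"
    if score >= 50:
        return "orange"
    return "red"

print(band(92), band(77), band(30))  # green orange red
```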

The scan is designed to surface real, actionable issues. We do not inflate scores to make results look better, and we do not suppress findings to avoid alarming you. The report shows what the tools found, scored according to how much each finding impacts your site.
