A Step-by-Step SEO Audit You Can Do Yourself

By Stream

Phase 1: Pre-Audit Checklist and Toolkit Assembly

Define Your Primary SEO Goals

Before a single URL is crawled, you must define what success looks like. An audit without goals is just data collection. Your objectives will dictate where you focus your energy. Are you trying to:

  • Increase Organic Traffic? This is a broad goal. Refine it: Are you targeting more traffic to specific product pages, or are you aiming to grow your blog’s readership to build top-of-funnel awareness?
  • Generate More Leads or Sales? This is a conversion-focused goal. The audit will need to pay close attention to transactional pages, user journey funnels, and call-to-action (CTA) effectiveness.
  • Improve Brand Visibility and Authority? This goal focuses on ranking for high-volume, informational keywords, securing featured snippets, and building a strong E-E-A-T (Experience, Expertise, Authoritativeness, Trustworthiness) profile.
  • Recover from a Traffic Drop? This requires a forensic approach, often involving a deep dive into Google algorithm update history, technical errors, and backlink profile changes.

Select one primary and one or two secondary goals. Write them down. Every finding in your audit should be weighed against these objectives. A “critical” issue for a lead generation goal might be a “low priority” issue for a brand visibility goal. This framework prevents you from getting lost in a sea of minor fixes.

Gather Your Essential Auditing Tools

While you can perform a basic audit with free tools, a comprehensive DIY audit is made significantly more efficient and accurate with a combination of free and paid software.

  • Google Search Console (GSC): This is non-negotiable and free. It is your direct line of communication with Google. It tells you how Google sees your site, including indexing status, crawl errors, mobile usability, Core Web Vitals, security issues, and the keywords your site is actually ranking for. If you haven’t set this up, stop and do it now. All other tools are secondary to the data you get from GSC.
  • Google Analytics 4 (GA4): Another free and essential tool. GA4 provides invaluable data on user behavior: how users arrive on your site, which pages they visit, how long they stay, and what actions they take. This data helps you understand content performance, identify user experience problems, and track goal completions from organic traffic.
  • A Website Crawler: This software acts like a search engine bot, systematically crawling your website to find technical issues. The industry standard for DIY auditors is Screaming Frog SEO Spider. It has a free version that can crawl up to 500 URLs, which is sufficient for many small websites. For larger sites, the paid version is a worthwhile investment. Alternatives include Sitebulb (paid, very user-friendly) or various web-based crawlers. Your crawler is your primary tool for finding broken links, redirect chains, issues with title tags and meta descriptions, duplicate content, and much more.
  • An All-in-One SEO Suite (Paid): Tools like Ahrefs, Semrush, or Moz Pro are the powerhouses of SEO. While they have significant costs, their backlink analysis, keyword research, and competitive intelligence capabilities are unparalleled. They allow you to see your backlink profile, analyze your competitors’ strategies, and find keyword gaps. Most offer limited free trials or free versions with restricted data, which can be sufficient for a one-time audit.
  • Google’s Public Tools:
    • PageSpeed Insights: Analyzes the performance of a page on both mobile and desktop devices and provides suggestions on how that page may be improved. It’s the primary tool for diagnosing Core Web Vitals issues.
    • Rich Results Test: Checks if your page’s structured data is valid and eligible for Google’s rich results (like FAQs, reviews, and product snippets).
    • Mobile-Friendly Test: A quick way to check whether a specific URL meets Google’s mobile usability criteria. Note that Google retired the standalone tool in late 2023; the same checks now run as part of a Lighthouse report in Chrome DevTools or PageSpeed Insights.

Phase 2: The Technical SEO Deep Dive

Technical SEO is the foundation of your website. If search engines cannot efficiently crawl, render, and index your content, all your on-page and off-page efforts will be wasted.

Accessibility and Indexability: The Bedrock

This section ensures that search engine bots can find and understand your content without roadblocks.

  • Robots.txt Review: Your robots.txt file, located at yourdomain.com/robots.txt, gives instructions to web crawlers.
    • Action: Open the file in your browser. Look for Disallow: directives. Are there any rules unintentionally blocking important sections of your site? For example, Disallow: /blog/ would prevent Google from crawling your entire blog. Conversely, you should be disallowing private areas like Disallow: /wp-admin/ or internal search result pages.
    • Check: Ensure there’s a line pointing to your XML sitemap, like Sitemap: https://yourdomain.com/sitemap.xml.
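To make those checks concrete, here is a minimal robots.txt sketch for a typical WordPress site. The paths are illustrative; your private areas and internal search URLs may differ:

```
# Apply to all crawlers
User-agent: *
# Keep admin and internal search results out of the crawl
Disallow: /wp-admin/
Disallow: /?s=
# WordPress needs this endpoint for front-end rendering
Allow: /wp-admin/admin-ajax.php

# Point crawlers at the sitemap
Sitemap: https://yourdomain.com/sitemap.xml
```

A rule like Disallow: /blog/ anywhere in this file would be the kind of unintentional block you are auditing for.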
  • XML Sitemap Analysis: Your XML sitemap is a roadmap for search engines.
    • Action: Use your crawler (like Screaming Frog) to crawl your XML sitemap. The crawler will list every URL found.
    • Check: Verify that this list contains only the “good” pages you want indexed. It should not include non-canonical URLs, redirected URLs, 404 error pages, or pages blocked by robots.txt. Every URL in the sitemap should return a 200 OK status code. In Google Search Console, check the Sitemaps report for any errors Google has found.
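For reference, a valid XML sitemap is simply a list of canonical URLs in the following shape (URLs and dates are placeholders). Anything in this file that redirects, 404s, or carries a noindex tag is an audit finding:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Only canonical, indexable, 200 OK URLs belong here -->
  <url>
    <loc>https://yourdomain.com/</loc>
    <lastmod>2024-05-01</lastmod>
  </url>
  <url>
    <loc>https://yourdomain.com/services/seo-audit</loc>
    <lastmod>2024-04-18</lastmod>
  </url>
</urlset>
```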
  • Meta Robots Tag and X-Robots-Tag Check: These tags provide page-specific indexing instructions.
    • Action: Use your website crawler to crawl your entire site. Add columns for “Meta Robots 1” and “Indexability.”
    • Check: Sort by the “Indexability” column. Look for pages that are mistakenly marked as “Non-Indexable.” This is often caused by a noindex tag left over from a staging environment. The most common meta robots tag is <meta name="robots" content="index, follow">, which is the default and doesn’t need to be added. You are looking for <meta name="robots" content="noindex"> or <meta name="robots" content="noindex, nofollow"> on pages that should be ranking. The X-Robots-Tag is an HTTP header equivalent used for non-HTML files like PDFs and should also be checked by your crawler.
  • Crawl Budget Optimization: Crawl budget is the number of pages Googlebot will crawl on your site in a given period. For large sites, this is critical.
    • Action: Go to the Crawl Stats report in GSC (under Settings). This shows Googlebot’s activity over the last 90 days.
    • Check: Look for spikes in “Crawl requests” that don’t correlate with new content additions. This can indicate a problem. Look at the “By purpose” and “By file type” reports. Is Google spending too much time crawling low-value URLs like those generated by faceted navigation (e.g., ?color=blue&size=large), infinite scroll, or old redirect chains? Fixing these issues by using nofollow attributes, canonical tags, or improving your robots.txt file can preserve your crawl budget for your most important pages.

Site Architecture and URL Structure

A logical site structure helps both users and search engines navigate your site.

  • URL Structure Analysis:
    • Action: Export a list of all crawlable URLs from your crawler.
    • Check: Are your URLs clean, logical, and descriptive? A descriptive URL (yourdomain.com/services/seo-audit) beats a cryptic, parameter-laden one (yourdomain.com/page-id=123?cat=4). Use hyphens to separate words, not underscores or spaces. Keep them as short as is reasonable.
  • Click Depth: This is the number of clicks it takes to get from the homepage to any other page.
    • Action: Your crawler (Screaming Frog’s “Crawl Depth” column) will show this for every URL.
    • Check: Your most important pages (key service pages, top-selling products) should have a low click depth, ideally 3 or less. If a critical page is buried 7 clicks deep, it signals to Google that it’s not important, and it will receive less link equity and be crawled less frequently. Improve click depth by linking to important pages from your main navigation, footer, or relevant high-level pages.
  • Breadcrumbs Implementation: Breadcrumbs are secondary navigation aids that show a user’s location in the site’s hierarchy.
    • Action: Browse your key pages (blog posts, product pages).
    • Check: Are breadcrumbs present? They improve user experience and help Google understand your site structure. For maximum SEO benefit, they should be implemented with BreadcrumbList schema markup. You can test this using the Rich Results Test tool.
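If your crawler doesn’t expose a crawl-depth column, you can compute click depth yourself with a breadth-first search over an exported internal-link list. A minimal sketch, assuming the export is a list of (source, target) URL pairs:

```python
from collections import deque

def click_depth(edges, homepage):
    """Compute the minimum number of clicks from the homepage to
    every reachable URL, given (source, target) internal-link pairs."""
    graph = {}
    for src, dst in edges:
        graph.setdefault(src, []).append(dst)

    depths = {homepage: 0}
    queue = deque([homepage])
    while queue:
        url = queue.popleft()
        for nxt in graph.get(url, []):
            if nxt not in depths:  # first visit in BFS = shortest path
                depths[nxt] = depths[url] + 1
                queue.append(nxt)
    return depths

# Hypothetical edge list exported from a crawler
edges = [
    ("/", "/services"),
    ("/services", "/services/seo-audit"),
    ("/", "/blog"),
    ("/blog", "/blog/old-post"),
]
print(click_depth(edges, "/"))
```

Any URL in your crawl that is missing from the result is unreachable from the homepage; any URL with a depth above 3 or 4 is a candidate for better internal linking.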

Index Status and Health in Google Search Console

The Index Coverage report in GSC is your health check for how Google is indexing your site.

  • Action: Navigate to the “Pages” report under the “Indexing” section in GSC.
  • Check: Pay close attention to the chart showing “Not indexed” and “Indexed” pages.
    • “Not indexed” Pages: This is where the most critical issues are found. Click into this report to see the reasons.
      • Server error (5xx): A serious issue with your server. This needs immediate attention from your developer or hosting provider.
      • Redirect error: Indicates a problem with your redirects, like a redirect chain that is too long or a redirect loop.
      • Submitted URL blocked by robots.txt: You’ve told Google to index a page in your sitemap but are blocking it with robots.txt. This is a contradiction you need to fix.
      • Not found (404): The page doesn’t exist. If these are important pages that have been deleted, you should implement a 301 redirect to the next most relevant page.
      • Crawled - currently not indexed: Google has crawled the page but decided not to index it, often due to perceived low quality or thin content. This is a content quality issue.
      • Discovered - currently not indexed: Google knows the page exists but hasn’t crawled it yet, often due to low site authority or poor internal linking making the page seem unimportant.
    • “Indexed” Pages: Review the “Indexed” report to ensure no strange or low-value URLs (like search result pages or tag pages) have been indexed.

Website Speed and Core Web Vitals

Site speed is a confirmed ranking factor. Google’s Core Web Vitals (CWV) are specific metrics used to measure user experience.

  • Understanding Core Web Vitals (CWV):
    • Largest Contentful Paint (LCP): Measures loading performance. Aim for under 2.5 seconds.
    • Interaction to Next Paint (INP): Measures interactivity and responsiveness. It replaced First Input Delay (FID) as a Core Web Vital in March 2024. A good INP is below 200 milliseconds.
    • Cumulative Layout Shift (CLS): Measures visual stability. Aim for a score of less than 0.1.
  • How to Test:
    • Action: Use Google PageSpeed Insights. Test your homepage, a key service or product page, and a typical blog post to get a representative sample. Look at both the “Field Data” (real-world user data from the Chrome User Experience Report) and “Lab Data” (a controlled test). Field data is what Google uses for ranking.
  • Common Fixes to Look For:
    • Image Optimization: Are images properly sized and compressed? Are you using next-gen formats like WebP? This is often the biggest and easiest win.
    • Leverage Browser Caching: Configure your server to tell browsers how long they should store resources like images, CSS, and JavaScript.
    • Minify CSS, JavaScript, and HTML: Remove unnecessary characters, comments, and whitespace from code to reduce file sizes.
    • Use a Content Delivery Network (CDN): A CDN stores copies of your site’s assets on servers around the world, so they are delivered faster to users regardless of their location.
    • Reduce Server Response Time: This is often related to your hosting plan. If your Time to First Byte (TTFB) is consistently high, you may need to upgrade your hosting.
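The CWV thresholds above are easy to encode if you want to classify an exported batch of field data in bulk. A small sketch using Google’s published good/poor boundaries (the “needs improvement” band sits between them):

```python
# Google's published thresholds per Core Web Vital:
# (upper bound for "good", lower bound for "poor")
THRESHOLDS = {
    "LCP": (2.5, 4.0),    # seconds
    "INP": (200, 500),    # milliseconds
    "CLS": (0.1, 0.25),   # unitless layout-shift score
}

def rate(metric, value):
    """Classify a field-data measurement as good / needs improvement / poor."""
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "good"
    if value <= poor:
        return "needs improvement"
    return "poor"

print(rate("LCP", 2.1))   # good
print(rate("INP", 350))   # needs improvement
print(rate("CLS", 0.31))  # poor
```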

Mobile-Friendliness and Security

With mobile-first indexing, Google predominantly uses the mobile version of your content for indexing and ranking.

  • Mobile Usability:
    • Action: Check the “Mobile Usability” report in GSC. Note that Google retired this report and the standalone Mobile-Friendly Test in late 2023, so if your property no longer shows it, run a Lighthouse report in Chrome DevTools or PageSpeed Insights for the same checks.
    • Check: Look for site-wide errors. Common issues include “Viewport not set,” “Content wider than screen,” “Text too small to read,” and “Clickable elements too close together.”
  • Security (HTTPS):
    • Action: Use your crawler to get a list of all URLs.
    • Check: The entire site must use HTTPS. A valid SSL certificate is a must-have trust signal. Look for “mixed content” issues, where an HTTPS page loads insecure (HTTP) resources like images or scripts. Browsers will flag this as a security risk, and it can harm user trust and rankings.

Structured Data (Schema Markup)

Schema markup is code that helps search engines understand the context of your content, which can lead to rich results in the SERPs.

  • Action: Identify pages where schema would be appropriate: articles, products, FAQs, local business information, recipes, events, etc.
  • Check: Use the Rich Results Test tool to paste in a URL or code snippet. The tool will tell you if the schema is valid and if the page is eligible for rich results. Check for errors and warnings. Missing recommended fields can prevent you from getting a rich snippet even if the basic schema is valid.
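As a reference point, here is what the BreadcrumbList markup mentioned earlier looks like as JSON-LD, placed inside a <script type="application/ld+json"> tag in the page’s HTML (names and URLs are placeholders):

```json
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    {
      "@type": "ListItem",
      "position": 1,
      "name": "Home",
      "item": "https://yourdomain.com/"
    },
    {
      "@type": "ListItem",
      "position": 2,
      "name": "Services",
      "item": "https://yourdomain.com/services"
    },
    {
      "@type": "ListItem",
      "position": 3,
      "name": "SEO Audit"
    }
  ]
}
```

The last item represents the current page, so it omits the item URL. Paste the marked-up page into the Rich Results Test to confirm it validates.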

Phase 3: The Comprehensive On-Page SEO Audit

On-page SEO involves optimizing the individual elements of your web pages to improve rankings and user experience. This phase uses your crawler’s export and manual checks.

Keyword Targeting and Content Alignment

Every important page should have a clear purpose and target a specific keyword or topic.

  • Page-to-Keyword Mapping:
    • Action: Create a spreadsheet. List your most important URLs in one column. In the next column, define the primary keyword each page should target.
    • Check: Is there a clear 1:1 relationship for your core pages? Does the content on the page thoroughly address the user intent behind the target keyword? (e.g., A page targeting “how to tie a tie” should be an informational blog post or video, not a product page selling ties).
  • Identifying Keyword Cannibalization: This occurs when multiple pages on your site compete for the same keyword, confusing Google and diluting your authority.
    • Action: Use a site search operator in Google: site:yourdomain.com "target keyword".
    • Check: If multiple pages from your site appear in the top results, you may have a cannibalization issue. For example, if you have three different blog posts all about “best running shoes,” Google doesn’t know which one is the definitive resource.
    • Fixes: The best solution is often to consolidate the competing pages into one comprehensive “ultimate guide” and 301 redirect the older posts to the new one. Alternatively, you can de-optimize the less important pages for the target term or use a canonical tag to point to the preferred version.
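The canonical-tag fix is a one-line change in the <head> of each less important page (the URL here is a placeholder for your preferred version):

```html
<!-- On the duplicate page, point Google at the preferred version -->
<link rel="canonical" href="https://yourdomain.com/blog/best-running-shoes" />
```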

HTML Element Optimization

These are the classic, fundamental elements of on-page SEO.

  • Title Tags (<title>): This is a heavily weighted ranking factor.
    • Action: Use your crawler to export a list of all URLs with their corresponding title tags.
    • Check:
      • Presence: Is any title tag missing or duplicated across multiple pages?
      • Length: Are they under the recommended ~60 character limit to avoid being truncated in search results?
      • Keyword Placement: Does the primary keyword appear, preferably near the beginning?
      • Compellingness: Is it written for humans to entice a click, not just stuffed with keywords?
  • Meta Descriptions: While not a direct ranking factor, they heavily influence click-through rate (CTR).
    • Action: Export a list of all URLs and their meta descriptions from your crawler.
    • Check:
      • Presence: Are any missing or duplicated? Every indexable page should have a unique meta description.
      • Length: Are they under the ~160 character limit?
      • Persuasiveness: Do they accurately summarize the page and include a call-to-action or compelling reason to click?
  • Header Tags (H1, H2, H3): Headers structure your content for readers and search engines.
    • Action: Use a browser extension like “SEO META in 1 CLICK” to quickly check the header structure of key pages, or configure your crawler to extract them.
    • Check:
      • H1: There should be one, and only one, H1 tag per page. It should be similar to the title tag and contain the primary keyword.
      • H2s/H3s: Are these used logically to break up content into subsections? Do they contain secondary keywords and related terms that add context? A well-structured document with a clear hierarchy is easy for both users and bots to understand.
  • Body Content Quality and Depth:
    • Action: Manually review your most important pages.
    • Check: Is the content unique, or is it scraped or duplicated from elsewhere? Does it fully answer the user’s query? Is it well-researched and up-to-date? Look for “thin content”—pages with very little text that offer no real value. These pages should either be improved with more substantial content or removed and redirected.
  • Image SEO:
    • Action: Use your crawler to find images that are missing alt text. Manually check the file names of recently uploaded images.
    • Check:
      • Alt Text: Does every meaningful image have descriptive alt text? This is crucial for accessibility and helps Google understand the image’s content. Alt text for a product image should be alt="blue nike running shoe" not alt="image123".
      • File Names: Are file names descriptive (e.g., diy-seo-audit-checklist.jpg)?
      • File Size: Are images compressed to ensure fast loading times?
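Checks like title presence, length, and duplication are easy to script against a crawler export. A minimal sketch, assuming the export is a list of (URL, title) pairs:

```python
def audit_titles(pages, max_len=60):
    """Flag missing, over-length, and duplicated title tags in a
    crawler export of (url, title) pairs."""
    issues = []
    seen = {}  # title -> first URL that used it
    for url, title in pages:
        if not title or not title.strip():
            issues.append((url, "missing title"))
            continue
        if len(title) > max_len:
            issues.append((url, "title over %d characters" % max_len))
        if title in seen:
            issues.append((url, "duplicate of " + seen[title]))
        else:
            seen[title] = url
    return issues

# Hypothetical crawler export
pages = [
    ("/services/seo-audit", "A Step-by-Step SEO Audit You Can Do Yourself"),
    ("/blog/audit-guide", "A Step-by-Step SEO Audit You Can Do Yourself"),
    ("/contact", ""),
]
for url, problem in audit_titles(pages):
    print(url, "->", problem)
```

The same pattern works for meta descriptions by swapping in a ~160 character limit.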

Internal Linking and Site Connectivity

Internal links guide users and search engines through your site and distribute link equity (PageRank).

  • Auditing Internal Links:
    • Action: Use your crawler. It will show you the number of “inlinks” to every page.
    • Check:
      • Orphaned Pages: Sort by “Inlinks” to find pages with 0 internal links. These pages are effectively invisible to search engines unless they appear in your sitemap or have external backlinks. Link to them from relevant pages on your site.
      • Link Equity Flow: Are your most important pages receiving the most internal links from other high-authority pages on your site?
      • Anchor Text: Export a list of all internal links and their anchor text. Is the anchor text descriptive and varied? Over-optimized anchor text (e.g., every link is “best SEO company”) can look spammy. Use a mix of exact match, partial match, and branded anchor text.
  • Fixing Broken Links:
    • Action: Your crawler will provide a report of all broken internal links (404 errors) and the pages they are on.
    • Check: This is a simple but important fix. Go to the source pages and either update the link to the correct URL or remove it. Broken links create a poor user experience and waste crawl budget.
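Orphan detection can also be scripted from a crawler’s link export. A sketch assuming you have the full URL list and a list of (source, target) internal links:

```python
def find_orphans(all_urls, internal_links):
    """Return crawled URLs that receive zero internal links
    (the homepage is excluded, since nothing needs to link to it)."""
    targets = {dst for _, dst in internal_links}
    return sorted(u for u in all_urls if u not in targets and u != "/")

# Hypothetical crawl data
all_urls = ["/", "/services", "/blog", "/blog/forgotten-post"]
links = [("/", "/services"), ("/", "/blog")]
print(find_orphans(all_urls, links))   # ['/blog/forgotten-post']
```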

Phase 4: The Off-Page SEO and E-E-A-T Audit

Off-page SEO refers to actions taken outside of your own website to impact your rankings. This is primarily about your backlink profile and brand reputation.

Backlink Profile Analysis

You will need a paid tool like Ahrefs, Semrush, or Moz for a thorough backlink audit.

  • Gathering Your Backlink Data:
    • Action: Enter your domain into your chosen tool and export your full backlink profile. You can supplement this with the Links report from GSC, but third-party tools provide more data and metrics.
  • Assessing Backlink Quality vs. Quantity: A single high-quality link is worth more than a thousand low-quality ones.
    • Action: Analyze the list of referring domains.
    • Check:
      • Domain Authority: Look at the authority metric of the linking sites (e.g., Ahrefs’ Domain Rating or Moz’s Domain Authority). Are you getting links from authoritative, well-known websites?
      • Relevance: Are the linking sites topically relevant to your industry? A link from a respected industry blog is far more valuable than a link from a random, unrelated site.
      • Link Placement: Is the link contextually placed within the body of the content, or is it a sitewide footer or sidebar link? Contextual links carry more weight.
  • Analyzing Anchor Text Distribution:
    • Action: Use your tool’s anchor text report.
    • Check: Is the distribution natural or does it look manipulative? A healthy profile has a high percentage of branded anchors (“Your Company Name”) and naked URL anchors (“yourdomain.com”), with a smaller, more diverse mix of keyword-rich anchors. A profile with 70% of its links using the anchor text “best cheap widgets” is a major red flag for Google’s Penguin algorithm.
  • Identifying and Dealing with Toxic Links:
    • Action: Manually review your backlinks, sorting by the lowest domain authority first. Look for patterns.
    • Check: Red flags for toxic links include links from spammy directories, private blog networks (PBNs), sites in foreign languages unrelated to your business, and sites with obviously over-optimized anchor text.
    • The Disavow Tool: If you identify a clear pattern of toxic, spammy links that you believe are harming your site and you cannot get them removed manually, you can use Google’s Disavow Tool. This is an advanced tool and should be used with extreme caution. For most sites, ignoring spammy links is better than disavowing them incorrectly.
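Anchor-text distribution is simple to compute from a backlink export if you want exact percentages rather than a tool’s chart. A sketch over a plain list of anchor strings:

```python
from collections import Counter

def anchor_distribution(anchors):
    """Share of each anchor text across a backlink export, as percentages."""
    counts = Counter(anchors)
    total = len(anchors)
    return {text: round(100 * n / total, 1) for text, n in counts.most_common()}

# Hypothetical export: mostly branded and naked-URL anchors is healthy
anchors = (
    ["Your Company Name"] * 5
    + ["yourdomain.com"] * 3
    + ["best cheap widgets"] * 2
)
print(anchor_distribution(anchors))
```

If a single keyword-rich anchor dominates the output, that is the red-flag pattern described above.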

Competitor Backlink Gap Analysis

This is where you find your best new link-building opportunities.

  • Action: Use the “Link Gap” (Ahrefs) or “Backlink Gap” (Semrush) tool. Enter your domain and 2-3 of your top organic competitors.
  • Check: The tool will generate a list of websites that link to one or more of your competitors but not to you. This list is a goldmine. These sites are already linking to content in your niche, meaning they might be willing to link to your content as well, especially if you can create a superior resource.

Auditing E-E-A-T Signals

E-E-A-T (Experience, Expertise, Authoritativeness, Trustworthiness) is a concept from Google’s Quality Rater Guidelines. It’s especially crucial for “Your Money or Your Life” (YMYL) topics like finance, health, and law.

  • On-Site E-E-A-T Signals:
    • Action: Review your key site pages.
    • Check: Do you have a detailed “About Us” page? Are there clear author biographies for your blog posts, complete with credentials, experience, and links to social profiles? Is your contact information (address, phone number) easy to find? Are you citing sources and linking out to other authoritative websites to back up your claims?
  • Off-Site E-E-A-T Signals:
    • Action: Perform brand searches for your company name, your key employees, and your products.
    • Check: What do people say about you? Look for mentions on reputable news sites, reviews on third-party platforms (like Google, Trustpilot, Capterra), and forum discussions. A positive reputation across the web is a powerful trust signal.

Phase 5: The Local SEO Audit (For Bricks-and-Mortar Businesses)

If your business serves a specific geographic area, local SEO is paramount.

Google Business Profile (GBP) Optimization

Your GBP listing is your most important local ranking factor.

  • Action: Log in to your Google Business Profile manager.
  • Check:
    • Verification: Is the profile claimed and verified?
    • NAP Consistency: Is the Name, Address, and Phone number (NAP) 100% accurate and consistent with the information on your website?
    • Completeness: Is every single section filled out? This includes selecting primary and secondary categories, adding all relevant services or products, listing attributes (e.g., “wheelchair accessible,” “free Wi-Fi”), and uploading high-quality, recent photos of your business.
    • GBP Features: Are you actively using Google Posts to announce offers and updates? Are you seeding and answering questions in the Q&A section?

On-Page Local Signals

Your website needs to reinforce your location to search engines.

  • Action: Review your website’s footer and contact page.
  • Check: Is your full NAP clearly visible on your site? Is it marked up with LocalBusiness schema to explicitly tell Google what it is? If you have multiple locations, does each one have its own dedicated, unique page with its specific NAP, hours, and local information?
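A minimal LocalBusiness JSON-LD sketch, placed in a <script type="application/ld+json"> tag (every value here is a placeholder; the NAP must match your site and GBP exactly):

```json
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Bakery",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main Street",
    "addressLocality": "Springfield",
    "postalCode": "12345",
    "addressCountry": "US"
  },
  "telephone": "+1-555-123-4567",
  "url": "https://yourdomain.com/",
  "openingHours": "Mo-Fr 08:00-18:00"
}
```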

Local Citation and Review Audit

Citations are mentions of your business’s NAP on other websites.

  • Action: Use a tool like BrightLocal or Whitespark to run a citation audit. Manually search for your business on major review platforms.
  • Check:
    • Citation Consistency: The audit tool will find inconsistencies in your NAP across major directories (like Yelp, Yellow Pages, etc.). It’s critical to clean these up so that Google sees a consistent signal.
    • Review Management: What is your review quantity, velocity (how frequently you get them), and average rating? Are you actively responding to all reviews, both positive and negative? Responding shows that you are engaged and value customer feedback.

Phase 6: Content and Keyword Opportunity Analysis

The final phase of the audit shifts from fixing past problems to identifying future opportunities.

Auditing Existing Content Performance

  • Action: Go to the Performance report in Google Search Console.
  • Check:
    • Content Decay: Look for pages that used to get significant traffic but have declined over the past 6-12 months. These are candidates for a content refresh—update the information, add new details, and improve the on-page SEO.
    • “Striking Distance” Keywords: Filter your queries by position and enter a value greater than 10 and less than 21. This shows you all the keywords for which you rank on page 2 of Google. These are your lowest-hanging fruit. Improving the on-page SEO, adding internal links, or building a few high-quality backlinks to the corresponding pages can often push them onto page 1 for a significant traffic boost.
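The striking-distance filter is easy to reproduce on a GSC Performance export. A sketch assuming rows with query, position, and impressions fields:

```python
def striking_distance(rows, lo=11, hi=20):
    """Keep GSC queries ranking on page 2 (average position 11-20),
    sorted by impressions so the biggest opportunities come first."""
    hits = [r for r in rows if lo <= r["position"] <= hi]
    return sorted(hits, key=lambda r: r["impressions"], reverse=True)

# Hypothetical rows from a GSC Performance export
rows = [
    {"query": "seo audit checklist", "position": 12.4, "impressions": 900},
    {"query": "what is seo",         "position": 3.1,  "impressions": 5000},
    {"query": "diy seo audit",       "position": 18.0, "impressions": 400},
]
for r in striking_distance(rows):
    print(r["query"], r["position"])
```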

Conducting a Competitor Keyword Gap Analysis

This process, also known as a content gap analysis, finds keywords your competitors rank for, but you don’t.

  • Action: Use the Keyword Gap tool in Ahrefs or Semrush. Enter your domain and 2-3 of your top competitors.
  • Check: The tool will output a list of keywords. Filter this list to find keywords that are relevant to your business, have reasonable search volume, and a manageable keyword difficulty score. This list forms the basis of your future content strategy, ensuring you create content that your audience is searching for and that your competitors have already proven to be valuable.

Topic Cluster and Content Hub Audit

Modern SEO is about topical authority, not just individual keywords.

  • Action: Visually map out your site’s content.
  • Check: Is your content organized into topic clusters? A topic cluster consists of a main “pillar page” (a long, comprehensive guide on a broad topic) and several “cluster pages” (shorter posts that cover specific subtopics in more detail). All cluster pages link back to the pillar page. This structure signals to Google that you have deep expertise on a subject. If your content is just a chronological list of disconnected blog posts, consider reorganizing it into a topic cluster model to build authority and improve rankings for entire groups of related keywords.