45 Actionable Checks

Help Confused Search Spiders

Improve Access for Humans

Unearth Transient Bugs

Fix Bad HTTP Headers

Prevent Canonical Issues



Checklist




Is the server free of the most basic issues?

These first few checks verify how well the most basic server features are working.

When these basics break, search spiders and would-be link partners can't reach your site, even if it loads fine for you.

Is DNS completely healthy?

Enter the domain name in the mxtoolbox test.

Errors mentioning SMTP, SPF, or MX are email issues. They may impact deliverability, but that's another audit, so log no issues for them. Also ignore warnings about servers in the same subnet, SOA serials, and the SOA Expire Value.

Raise all other errors with a web host (small organization) or server administrator (large organization).
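
If you want a quick scripted sanity check alongside mxtoolbox, a few lines of Python's standard library will confirm the domain resolves at all (a rough sketch; "domain.com" is a placeholder for the site you're auditing):

# Sanity check: confirm the domain's A/AAAA records resolve.
# "domain.com" is a placeholder.
import socket

def resolves(domain):
    try:
        records = socket.getaddrinfo(domain, 80)
    except socket.gaierror as error:
        print(f"DNS lookup failed for {domain}: {error}")
        return False
    addresses = sorted({record[4][0] for record in records})
    print(f"{domain} resolves to: {', '.join(addresses)}")
    return True

resolves("domain.com")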

Is the site free of packet loss?

Now we'll test the network.

macOS and Linux

  1. Open Terminal (macOS) or shell (Linux).
  2. Type "ping -c 1000 domain.com".

Windows

  1. Start > Run > "cmd".
  2. Type "ping -n 1000 domain.com".
  3. OR: Type "pathping domain.com" for deeper data.

This will take a while; let it run and move on, then circle back at the end and log the packet loss. A great web host will show 0.0% loss. Anything above 0.1% is an issue.

If it's high, do this again on a popular site (like google.com) to be sure it's not just you. If there's an issue, ask a network admin (in a large organization) or web host (in a small organization) to address it.
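
If you'd rather script this check on macOS or Linux, here's a rough sketch that wraps ping and pulls the loss percentage out of its summary line (Windows formats ping output differently, so this parse won't match there):

# Run ping and extract the loss percentage from its summary line (macOS/Linux).
import re
import subprocess

def packet_loss(host, count=100):
    result = subprocess.run(
        ["ping", "-c", str(count), host],
        capture_output=True,
        text=True,
    )
    match = re.search(r"([\d.]+)% packet loss", result.stdout)
    return float(match.group(1)) if match else None

print(packet_loss("domain.com"))  # 0.0 is great; above 0.1 is an issue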

Is the site's IP neighborhood acceptable?

A variety of patents and research papers suggest that Google is looking at the sites that share your vhost and subnet to identify webspam patterns.

To check this, run a free scan.

Anything damaging would have to be pretty deliberate spam: look for very low-quality blogs with no theme, or content that appears to be automated, pornographic, or littered with strange links.

Is there uptime monitoring?

Google has told us that "short" downtime is OK, but "longer" downtime impacts rankings.

Anything important needs 24/7 monitoring. Aside from catching outright downtime, this also helps diagnose when a website is slow or unresponsive at times when we're not looking.

The free version of UptimeRobot checks on a 5-minute interval; SiteUptime or Pingdom are also fine to start. Keep in mind there are different types of downtime: your website can technically be online but broken, serving a blank page.

We recommend configuring a monitor to check for content that should always appear (such as the business name).
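
Whichever monitor you choose, this is the kind of check it should run. A minimal sketch in Python (standard library only; the URL and "Example Widgets Inc" are placeholders):

# A 200 response alone isn't proof of life; also confirm text that should
# always appear. The URL and business name below are placeholders.
from urllib.request import urlopen

def site_is_really_up(url, expected_text):
    try:
        with urlopen(url, timeout=10) as response:
            body = response.read().decode("utf-8", errors="replace")
    except Exception as error:
        return False, f"request failed: {error}"
    if expected_text not in body:
        return False, "responded, but the expected content is missing"
    return True, "ok"

print(site_is_really_up("https://domain.com/", "Example Widgets Inc"))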

Is the site secure?

This isn't a security audit. But Google has been vocal about signals related to SSL, malware, and phishing.

There are also indirect effects: browser security warnings hurt bounce rates and task completion times, and a secure site earns backlinks at a higher rate. In case security itself isn't motivation enough.

Can you reach the correct content with both "http" and "https"?

Manually type in a URL using "http", then again using "https". You may get redirected, but both should look the same and be free of visual inconsistencies.

Keep an eye out for test pages, missing images, and other signs of a negatively altered experience.
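
If you'd rather script the comparison, here's a minimal sketch (standard library; "domain.com" is a placeholder) that fetches both versions and reports where each one lands:

# Request the same path over "http" and "https" (redirects followed) and
# compare the final URL and status. Both should land in the same place.
from urllib.request import urlopen

def final_destination(url):
    with urlopen(url, timeout=10) as response:
        return response.geturl(), response.status

for scheme in ("http", "https"):
    print(scheme, "->", final_destination(f"{scheme}://domain.com/"))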

Leave an "https" page open for the next check.

Do "http" versions of the site 301-redirect to "https"?

Using the header viewer, type the "http://" version of a page in.

Repeat for a few noteworthy subfolders (especially "/blog/").

Do the resulting headers contain "301" and "redirect"?

Duplicate content issues exist if "http" and "https" versions both resolve with no redirect. Other methods of redirect (302, JavaScript, meta, etc.) are all possible issues. To fix, have a developer put this at the top of .htaccess:

# Rewrite to SSL (via northcutt.com)
RewriteEngine On
RewriteCond %{HTTPS} !=on
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

If SSL is not an option right now, force visitors the other way instead ("https" to "http"):

# Rewrite to no-SSL (via northcutt.com)
RewriteEngine On
RewriteCond %{HTTPS} !=off
RewriteRule ^ http://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
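
Whichever direction you standardize on, a quick scripted check confirms the redirect fires as a single 301 with the expected Location header. A rough sketch ("domain.com" is a placeholder):

# Request pages without following redirects and print status + Location.
import http.client

def show_redirect(host, path="/"):
    connection = http.client.HTTPConnection(host, timeout=10)
    connection.request("GET", path)
    response = connection.getresponse()
    print(path, response.status, response.getheader("Location"))
    connection.close()

show_redirect("domain.com", "/")
show_redirect("domain.com", "/blog/")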

Is SSL completely healthy?

If SSL works (above), run the SSL Labs test to make sure it's secure.

Once complete, click to expand each IP address (if there are multiple). Log any errors as issues.

Leave the results open.
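
As a quick local companion to SSL Labs, Python's standard library can confirm the certificate validates for the hostname and report when it expires (a sketch; "domain.com" is a placeholder):

# Open a TLS connection, validate the certificate, and report its expiry.
import socket
import ssl
from datetime import datetime, timezone

def certificate_expiry(host, port=443):
    context = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    expires = ssl.cert_time_to_seconds(cert["notAfter"])
    return datetime.fromtimestamp(expires, tz=timezone.utc)

print("Certificate expires:", certificate_expiry("domain.com"))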

Are all browsers free of SSL issues?

After your SSL Labs test, check Handshake Simulation under Configuration.

If some browsers can't validate your SSL (listed under this section), there may be issues with your SSL settings, vendor, or a missing chain (a.k.a. intermediate) certificate. This will also present issues if the certificate is self-signed (no Certificate Authority).

Some very old browsers have become impossible to secure and can be ignored. Those browser names will appear in red text in this list. As of writing this, that's just a few very old releases of Internet Explorer.

Look for notes in red for issues. Green notes are okay.

Leave the results open for one more step.

Are ciphers hardened?

After your SSL Labs test, look at Cipher Suites under Configuration.

Note any cipher-related vulnerabilities. This is a great resource that's updated frequently on cipher hardening.

You can now close out of this test.

Is the site free of mixed-security content?

Use the JitBit test to scan for linked content (images, videos, etc.) that still isn't encrypted.

This causes a giant security alert with Chrome's default settings.

Can you afford to lose 10% of potential inbound links because that warning scares people off?

The answer is no. No you can't.
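
If you'd like a scripted second opinion alongside JitBit, here's a rough sketch that flags resources still referenced over plain "http" on a page. It's a crude regex pass over src/href values, so treat it as a spot check, not a full scan (the URL is a placeholder):

# Fetch an "https" page and list resources referenced over plain "http".
import re
from urllib.request import urlopen

def insecure_resources(url):
    with urlopen(url, timeout=10) as response:
        html = response.read().decode("utf-8", errors="replace")
    return re.findall(r'(?:src|href)=["\'](http://[^"\']+)', html)

for resource in insecure_resources("https://domain.com/"):
    print("Insecure:", resource)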

Is the site free of malware?

Google is so direct about preventing malware that they have their own malware scanner. Use that to test.

Is the site free of phishing schemes?

See if you appear in this database of active phishing schemes using this Google search:

 site:phishtank.com "domain.com"

This is a common side-effect of a hacked site and devastates rankings.

Have you run a self-crawl?

There are many great SEO crawlers and most will work for this phase. We use Screaming Frog because it's cheap, lightweight, and has every feature you'll probably ever need.

Before beginning this phase, navigate to Configuration > Spider in Screaming Frog and configure it like so:

Follow internal "nofollow": YES
Follow external "nofollow": YES
Ignore robots.txt: NO
Show Internal URLs blocked by robots.txt: YES
All other settings: default

Enter the homepage URL and click START.

Leave the results open for this entire phase.
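
If you also want a quick scripted spot check (a sketch, not a Screaming Frog replacement), here's a small breadth-first crawl of internal links that records each page's status code; the start URL is a placeholder:

# Breadth-first crawl of internal links, recording status codes.
from collections import deque
from html.parser import HTMLParser
from urllib.error import HTTPError
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, limit=50):
    site = urlparse(start_url).netloc
    queue, seen, results = deque([start_url]), {start_url}, {}
    while queue and len(results) < limit:
        url = queue.popleft()
        try:
            with urlopen(url, timeout=10) as response:
                results[url] = response.status
                html = response.read().decode("utf-8", errors="replace")
        except HTTPError as error:
            results[url] = error.code  # 4xx/5xx responses land here
            continue
        except Exception as error:
            results[url] = str(error)
            continue
        collector = LinkCollector()
        collector.feed(html)
        for link in collector.links:
            absolute = urljoin(url, link).split("#")[0]
            if urlparse(absolute).netloc == site and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return results

for url, status in crawl("https://domain.com/").items():
    print(status, url)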

Did you discover all of the pages that you expected?

Filter your crawl results by "HTML" (top left). Record the # of crawled pages to the right.

More pages than expected probably means there are duplicate or thin content problems. That's a different audit.

Fewer pages than expected probably means there are crawling or indexing issues. If important sections of the site (products, categories, collections, blog posts, etc.) weren't discovered by our crawl, they probably won't be discovered by major search engines either.

Is the site free of self-inflicted 3XX redirects?

Sort results on the Response Codes tab.

Keep these results up for the next several checks.

Get the total internal 301s + 302s from the Overview folder on the right. If there are issues, go to Bulk Export > Response Codes > Redirection (3xx) in the top menu to get a CSV that includes the originating URLs. Upload it to Google Drive, link it to the right, and note the total.

Don't worry about external 301s/302s.

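The exported CSV is easy to summarize with a few lines of Python. A sketch using the csv module; the "Status Code" column name is an assumption, so check it against your export's header row. The same approach works for the 4xx and 5xx exports in the checks below:

# Count exported URLs by status code. The column name is an assumption.
import csv
from collections import Counter

def count_by_status(path):
    counts = Counter()
    with open(path, newline="", encoding="utf-8") as export:
        for row in csv.DictReader(export):
            counts[row.get("Status Code", "unknown")] += 1
    return counts

print(count_by_status("redirection_3xx.csv"))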

Is the site free of 403 errors?

These are usually password-protected areas or situations where no index.html (or .php, etc.) exists.

Resolve by blocking Google from crawling the location in robots.txt or by creating a useful index file. Or just don't publicly link to sections that the public can't reach (it's a terrible user experience).

Is the site free of 404 errors?

404 Errors aren't a problem by themselves, but discovering them through crawling your own site is.

If pages have moved, resolve with a 301 redirect and update old links to match. Broken external links need fixing too as a measure of quality. Get the total internal + external 404s from the Overview folder on the right.

If there are issues, navigate the top menu to Bulk Export > Response Codes > Client Error (4xx) to get a CSV that includes source URLs. Upload to Google Drive and link to the right + note the total.

Is the site free of 5XX errors?

These errors happen when server-side code, or the server itself, fails.

Sometimes these are intermittent; that's no less a problem. A developer should begin by reviewing access logs. If they exist in any significant number, export them just as you did above and record the total.

There are 25 more steps in this audit.

