45 Actionable Checks

Help Confused Search Spiders

Improve Access for Humans

Unearth Transient Bugs

Fix Bad HTTP Headers

Prevent Canonical Issues



Are there basic server issues?

These first few checks verify how well the most basic server features are working.

These issues prevent search spiders or would-be link partners from reaching your site, even if it loads for you.

Is DNS completely healthy?

Enter the domain name in the MxToolbox test.

Errors containing SMTP, SPF, and MX are email issues. They may impact deliverability, but that's another audit, so log no issues there. Also, ignore warnings for servers in the same subnet, SOA serials, and SOA Expire Value.

Fix all other errors with a web host or server admin.
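If you'd like a command-line spot check alongside the MxToolbox report, a few dig queries confirm the basics (a sketch; "domain.com" is a placeholder for the real domain):

```shell
# Spot-check basic DNS health (replace domain.com with the real domain)
dig +short NS domain.com      # authoritative nameservers
dig +short A domain.com       # the site's IP address(es)

# Two or more nameservers is the norm; count them
NS_COUNT=$(dig +short NS domain.com | wc -l)
echo "Nameservers found: $NS_COUNT"
```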

Is there packet loss?

Now we'll test the network. A network admin or web host can fix any issues found.


On OS X or Linux:

  1. Open Terminal (OS X) or a shell (Linux).
  2. Type "ping -c 1000 domain.com".

On Windows:

  1. Start > Run > "cmd".
  2. Type "ping -n 1000 domain.com".
  3. OR: Type "pathping domain.com" for deeper data.

This will take a while; let it run and move on. Circle back at the end and log packet loss. A great web host will be 0.0%. Above 0.1% is an issue.

If it's high, do this again on a popular site (like google.com) to be sure it's not just you.
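On OS X or Linux you can pull just the loss summary out of the ping output, which makes the circle-back easier (a sketch; flags vary slightly between platforms):

```shell
# Run the full test and keep only the summary line
# (replace domain.com with the real domain)
ping -c 1000 domain.com | grep 'packet loss'
# The summary line looks like:
# 1000 packets transmitted, 1000 received, 0% packet loss, time 999000ms
```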

Are you in a bad IP neighborhood?

A variety of patents and research papers suggest that Google is looking at the sites that share your vhost and subnet to identify webspam patterns.

To check this, run a free scan.

Anything damaging would have to be pretty deliberate spam. Just make sure there's not a collection of spammy-looking sites that share your web server's IP range.
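To find the IP (and its reverse DNS) that you'd feed into such a scan, something like this works from a terminal (a sketch, assuming dig is installed and "domain.com" stands in for the real domain):

```shell
# Resolve the site's IP, then look up its reverse DNS
IP=$(dig +short A domain.com | head -n1)
echo "Server IP: $IP"
dig +short -x "$IP"
```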

Is there uptime monitoring?

Google told us that "short" downtime is OK, but "longer" impacts rankings.

Anything important needs 24/7 monitoring. The free version of UptimeRobot does checks on a 5-minute interval. SiteUptime or Pingdom are also fine to start.

There are different types of downtime. For example, your web hosting can be working while your software (like WordPress) is broken. The best monitoring checks for content that should always appear (like your business name).
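A minimal content-aware check can be sketched with curl, assuming a string that should always appear on the homepage (the domain and string here are placeholders):

```shell
# UP only if the page loads AND contains the expected string
if curl -s --max-time 10 https://domain.com/ | grep -q 'Business Name'; then
  echo "UP: expected content found"
else
  echo "DOWN: site unreachable or expected content missing"
fi
```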

Is the site secure?

This isn't a security audit.

But Google has been vocal about signals related to SSL, malware, and phishing.

There are also indirect effects from browser security warnings (likely harm to behavioral signals like bounce rate and task completion time) and a higher conversion rate on earning backlinks. In case security itself isn't motivation enough.

Does the intended content load at "https"?

When you visit the site using "https", do you end up with something useful?

Visitors and inbound linkers use "https" even if it's not our desired version of the site.

Make sure it's not just a test page or something else useless and leave the page open for the next step.

Do "http" versions of the site 301-redirect to "https"?

Using the header viewer, type the "http://" version of a page in.

Repeat for a few noteworthy subfolders (especially "/blog/").

Do the resulting headers contain "301" and "redirect"?
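The same check can be run from a terminal with curl (a sketch; swap in the real domain and repeat for each subfolder):

```shell
# -s silences progress, -I requests headers only
curl -sI http://domain.com/ | head -n 5
# A healthy result starts with a 301 status and a Location header
# pointing at the "https" version, e.g.:
#   HTTP/1.1 301 Moved Permanently
#   Location: https://domain.com/
```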

Duplicate content issues exist if "http" and "https" versions both resolve with no redirect. Other methods of redirect (302, JavaScript, meta, etc.) are all possible issues. To fix, have a developer put this at the top of .htaccess:

# Rewrite to SSL (via northcutt.com)
RewriteCond %{HTTPS} !=on
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

If SSL is not an option right now, force visitors the other way instead ("https" to "http"):

# Rewrite to no-SSL (via northcutt.com)
RewriteCond %{HTTPS} !=off
RewriteRule ^ http://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

Is SSL completely healthy?

If SSL works (above), run the SSL Labs test to make sure it's secure.

Once complete, click to expand each IP address (if there are multiple). Log any errors as issues.

Leave the results open.
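As a quick cross-check of the certificate itself from a terminal (a sketch, assuming OpenSSL is installed; replace the domain):

```shell
# Show the live certificate's issuer and validity window
openssl s_client -connect domain.com:443 -servername domain.com </dev/null 2>/dev/null \
  | openssl x509 -noout -issuer -dates
```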

Do specific browsers have issues?

After your SSL Labs test, check Handshake Simulation under Configuration.

If some browsers can't validate your SSL, there may be issues with your SSL settings, vendor, or a missing chain (a.k.a. intermediate) certificate. This will also present issues if the certificate is self-signed (no Certificate Authority).

Some very old browsers have become impossible to support safely and can be ignored. Those browser names will appear in red text in this list (as of writing this, that's just a few very old releases of Internet Explorer).

Look for notes in red for issues. Green notes are okay.

Leave the results open for one more step.

Are ciphers hardened?

After your SSL Labs test, look at Cipher Suites under Configuration.

Note any cipher-related vulnerabilities. This is a great resource that's updated frequently on cipher hardening.
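To see what a hardened cipher string actually permits before applying it to a server config, OpenSSL can expand it locally (a sketch; the string below is an illustrative example, not a recommendation for your stack):

```shell
# Expand a typical "hardened" cipher string into the suites it allows
openssl ciphers -v 'HIGH:!aNULL:!MD5:!3DES' | head

# If nmap is available, you can also enumerate what the live server
# actually negotiates:
#   nmap --script ssl-enum-ciphers -p 443 domain.com
```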

You can now close out of this test.

Is there mixed-security content?

Use the JitBit test to scan for linked content (images, videos, etc.) that still isn't served over an encrypted connection.

This causes a giant security alert with Chrome's default settings.
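A crude terminal version of the same scan, for a single page (a sketch; this only catches attributes in the fetched HTML, not resources loaded by scripts):

```shell
# List insecure (plain-http) resource references on the homepage
curl -s https://domain.com/ \
  | grep -Eo '(src|href)="http://[^"]+"' \
  | sort -u
```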

Can you afford to lose 10% of potential inbound links because of that scaring people off?

The answer is no. No you can't.

Does the site have malware?

Google is so direct about preventing malware that they have their own malware scanner. Use that to test.

Do you host a phishing scheme?

See if you appear in this database of active phishing schemes using this Google search:

site:phishtank.com "domain.com"

This is a common side-effect of a hacked site and devastates rankings.

Have you run a self-crawl?

There are many great SEO crawlers and most will work for this phase. We use Screaming Frog because it's cheap, lightweight, and has every feature you'd ever need for this task. Not a paid endorsement.

Before beginning this phase, navigate to Configuration > Spider in Screaming Frog and set like so:

Follow internal "nofollow": YES
Follow external "nofollow": YES
Ignore robots.txt: NO
Show Internal URLs blocked by robots.txt: YES
All other settings: default

Enter the homepage URL and click START.

Leave the results open for this entire phase.

Did you discover fewer pages than expected?

Filter your crawl results by "HTML" (top left). Record the # of crawled pages to the right.

More pages than expected probably means there are duplicate or thin content problems. That's a different audit.

Fewer pages than expected probably means there are crawling or indexing issues. If important sections of the site weren't discovered by our crawl, they'll probably never get discovered by Google either.

Are there internal 301/302 redirects?

Sort results on the Response Codes tab.

Keep these results up for the next several checks.

Get the total internal 301s + 302s from the Overview folder on the right. If there are issues, navigate the top menu to Bulk Export > Response Codes > Redirection (3xx) to get a CSV that includes the originating URLs. Upload to Google Drive and link to the right + note the total.

Don't worry about external 301s/302s.


Are there 403 pages?

These are usually password-protected areas or situations where no index.html (or .php, etc.) exists.

Resolve by blocking Google from crawling this location in robots.txt or create a useful index file.
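For the robots.txt route, a rule like this blocks the offending location (the path shown is a hypothetical example):

```
# robots.txt - keep crawlers out of a directory with no index file
User-agent: *
Disallow: /protected-area/
```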

Are there 404 links?

404 Errors aren't a problem by themselves, but discovering them through crawling your own site is.

If pages have moved, resolve with a 301 redirect and update old links to match. Broken external links need fixing too as a measure of quality. Get the total internal + external 404s from the Overview folder on the right.
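In the same .htaccess style used earlier, a moved page can be 301-redirected like this (both paths are hypothetical examples):

```
# 301-redirect a moved page (via .htaccess)
Redirect 301 /old-page/ https://domain.com/new-page/
```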

If there are issues, navigate the top menu to Bulk Export > Response Codes > Client Error (4xx) to get a CSV that includes source URLs. Upload to Google Drive and link to the right + note the total.

Are there 500-level errors?

These errors happen when server-side code, or the server itself, has errors.

Sometimes these are intermittent. That's no less a problem. A developer should begin by reviewing access logs. Export these just as you did above if they exist in any significant number, and record the total.
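To give that developer a head start, a quick summary of which URLs are throwing 5xx errors can be pulled from an access log (a sketch; the log path and combined-log format are assumptions about your server):

```shell
# Count 5xx responses per URL in an Apache/Nginx combined-format log
grep -E '" 50[0-9] ' /var/log/apache2/access.log \
  | awk '{print $7}' \
  | sort | uniq -c | sort -rn | head
```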

There are 25 more steps in this audit.
