The 2017 Professional SEO's Website Audit Checklist

— by John Heard

When starting a new SEO project or taking a fresh look at an existing project, it can be hard to know where to start. That's why we've decided to share our in-house checklist to save you the trouble of making your own and to help you avoid missing the important aspects along the way.

The following steps can be used for a comprehensive technical review and to help generate a plan of action to fix problems on most websites. Of course each website is different, but we intentionally made this checklist generic, so it should be applicable to most sites you come across.


1. Review # of indexed pages

  • The most accurate method to determine the number of indexed URLs (HTML, not images) in Google is to view the Google Search Console Index Status Report. This report excludes URLs that are considered duplicate, non-canonical, or have a meta noindex tag. You can also see the number of URLs that Google is aware of but cannot reach because they are blocked by robots.txt.

[Screenshot: Google Search Console Index Status report]

  • Download the Chart Data for future reference.
  • Review the number of pages indexed, ideally the number will be climbing as new content is added. URLs indexed will fluctuate, especially on larger sites as Google changes their algorithm and/or other factors change. Note: The site in the screen shot above has over 30,000 URLs, but Google only wants to index approximately 6,000.
  • The second most accurate method is to do a site: search at Google. The value Google Search Console reports and the number of results returned from a search will vary, sometimes greatly.
  • If you have an XML sitemap for pages on the site, and it's been submitted to Google, the Crawl > Sitemaps report will display how many of the URLs in the Sitemap are in the index. You may wish to record this value in a spreadsheet locally, as GSC will not show you changes over time for the XML sitemap indexed values.
  • Bing's Webmaster Tools > Reports & Data report will display the historical number of pages indexed; this value will likely not match a search at Bing in many cases. We typically found the Bing Webmaster Tools number to be higher than the search-reported result count, which may be due to duplicate content or other filtered search results.

[Screenshot: Bing Webmaster Tools pages indexed trend]

  • Consider why the number of indexed URLs may be higher or lower between Bing and Google.
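
Since these reports won't keep history for you, it helps to snapshot the numbers yourself. A minimal sketch (the CSV path, column names and counts are just illustrative):

```python
import csv
import datetime
import pathlib

def record_index_count(csv_path, source, count):
    """Append today's indexed-URL count so trends can be tracked over time."""
    path = pathlib.Path(csv_path)
    is_new = not path.exists()
    with path.open("a", newline="") as fh:
        writer = csv.writer(fh)
        if is_new:
            writer.writerow(["date", "source", "indexed_urls"])  # header once
        writer.writerow([datetime.date.today().isoformat(), source, count])

# Example: record the GSC figure and the site: search figure side by side
# record_index_count("index_counts.csv", "gsc_index_status", 6000)
# record_index_count("index_counts.csv", "google_site_search", 6400)
```

Run it on a schedule and you can chart Google vs. Bing indexation over months, which none of the engine dashboards will do for you.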

2. Review # of indexed images

  • Do a search in Google Image Search and add &sout=1 to the end of the URL; this shows the "old" style of image search, which displays the approximate number of indexed images.

  • Bing will not display the number of indexed images for a search in Bing Image Search, but you can do a quick visual review to see how things are going. Remember you can restrict a search to specific directories to see more targeted results.
  • Review how well the images are indexed using page title, img alt text, caption or nearby text.

3. Review Data from Google and Bing

You should do a full review of all the reports in both Google Search Console (GSC) and Bing Webmaster Tools (BWT). This is valuable data direct from the engines that can be very helpful, so make use of it! At the very least, review these most important sections in GSC.

  • Review any site message reports (GSC & BWT).
  • GSC Search Appearance Section, review the Structured Data Report for errors, and the HTML improvements report for miscellaneous problems.
  • GSC Search Traffic Section, check for any Manual Actions and record the number of Links to your site in the Links to Your Site subsection.
  • GSC Search Traffic Section, closely review the Mobile Usability Report for errors. Any errors in this area can affect mobile ranking.
  • GSC International Targeting Section, review settings and verify they are appropriate.
  • GSC Google Index Section, review Index Status Trend and look for changes, it's a good idea to download the chart data and save for record keeping. Also check the Remove URLs section.
  • GSC Google Index Section, check Blocked Resources Section. Verify you have nothing blocked so that Google can render your pages.
  • GSC Crawl Section, Review All Crawl Errors (Web / Smartphone). Large numbers of 404 or other errors indicate poor site quality, something Google mentions in their Quality Raters Guidelines.
  • GSC Crawl Section, test a sample of page layouts (home, subpage, category, etc.) using Google's Fetch and Render Tool as BOTH desktop and mobile. You're looking for pages that don't fully render under those tests.
  • GSC Crawl Section, check for Robots.txt errors and warnings.
  • GSC Crawl Section, review Sitemap Section - is a sitemap in use? When was it last processed? Are there any issues reported?
  • Record Pages Crawled per Day under Crawl > Crawl Stats for future reference. Does the crawl chart look consistent, trending up, or down?
  • Also check GSC Security Issues Section for Malware Warnings.
  • BWT - Review Site Messages.
  • BWT in the Reports & Data Section, Export the Inbound Links and Search Keywords reports for future records. Also review the Crawl Information and Malware reports.
  • Malware - If you do run into Malware Warnings, see "Was your Site Hacked? How to Recover from a Google Malware Attack."

4. Check Cached Content for important pages

  • How recent are the cached dates? Important and frequently updated pages should be crawled frequently.
  • Are there additional links showing in the cache that aren't showing when you look at the same URL in a browser? That may be a sign of Malware or that the server has been compromised.

5. Is the Site Ranking for Company Name, Brand and Unique Terms?

6. Does the site return a proper 404 Error for missing pages?

  • If you request a nonsense page from the site (a URL that does not exist), does the site return a 404 Error Response Page?
  • Error pages should also generate a 404 Server Header which says HTTP/1.1 404 Not Found. You can use the SEN HTTP Header Analyzer Tool to check the 404 error response from the server. If you get any other response code for a missing URL request, this problem must be corrected.
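
This check can be scripted: request a path that cannot exist and inspect the status code the server returns. A sketch using only the standard library (the helper names are our own):

```python
import urllib.error
import urllib.request
import uuid

def missing_url_status(base_url):
    """Request a nonsense path and return the HTTP status the server sends."""
    probe = base_url.rstrip("/") + "/" + uuid.uuid4().hex  # path that can't exist
    try:
        with urllib.request.urlopen(probe) as resp:
            return resp.status              # a 2xx here is a "soft 404" problem
    except urllib.error.HTTPError as err:
        return err.code                     # proper error responses land here

def is_proper_404(status):
    """Missing URLs should answer 404 (or 410 Gone); anything else needs fixing."""
    return status in (404, 410)
```

A 200 response for a nonsense URL is the classic "soft 404" and should be treated as a bug to fix.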

7. Review robots.txt and .htaccess files

  • What URLs are blocked in robots.txt? URLs blocked in robots.txt can't transfer page rank.
  • Verify no URLs necessary to render the page are included in robots.txt such as images, CSS and JavaScript. You can use the Google Search Console's Fetch as Googlebot function to verify an individual URL's access to everything it needs to render the page.
  • What URLs should be blocked, but aren't in robots.txt?
  • Is there content that is blocked in robots.txt that shouldn't be?
  • Are there settings in robots.txt that can be removed? For example perhaps you were blocking some content from spidering that has long since disappeared from the search engines and from the site.
  • Are there 301 redirects, or other commands in the .htaccess file that can be removed? For example, 301 redirects placed there 10 years ago probably are no longer needed.
  • Remember that Google no longer supports linking to XML sitemaps in the robots.txt file. So if you're doing that, this is a good time to remove those links; you'll need to submit the XML sitemaps directly via GSC if you are not already.
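
You can spot-check render-critical URLs against your rules with Python's built-in robots.txt parser; a sketch with made-up rules and URLs:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents for illustration
RULES = """\
User-agent: *
Disallow: /checkout/
Disallow: /assets/
""".splitlines()

parser = RobotFileParser()
parser.parse(RULES)

# URLs a crawler needs in order to render the page must NOT be blocked
render_urls = [
    "https://example.com/assets/site.css",
    "https://example.com/assets/app.js",
]
blocked = [u for u in render_urls if not parser.can_fetch("Googlebot", u)]
for url in blocked:
    print("BLOCKED render resource:", url)
```

Here the blanket /assets/ block would prevent Googlebot from fetching the CSS and JavaScript, exactly the situation the Fetch and Render test exposes.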

8. Review XML Sitemaps

  • If XML sitemaps are used, are they correct and kept up to date? When were they last updated?
  • Are proper and reasonably accurate update frequencies used?
  • Are Video, Image or Geo Sitemaps in use?
  • If it's a WordPress site, check to make sure you're not generating sitemaps for content you don't want indexed and/or have set to meta noindex. For example image attachment sitemaps, slider sitemaps, category or tag sitemaps.
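
A quick way to check whether a sitemap is stale is to parse its <lastmod> values; a sketch using a toy sitemap (the URLs and dates are invented):

```python
import xml.etree.ElementTree as ET

SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc><lastmod>2017-01-15</lastmod></url>
  <url><loc>https://example.com/products/</loc><lastmod>2015-06-02</lastmod></url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(SITEMAP)
entries = [
    (url.findtext("sm:loc", namespaces=NS),
     url.findtext("sm:lastmod", namespaces=NS))
    for url in root.findall("sm:url", NS)
]
# ISO dates compare correctly as strings; flag anything untouched in over a year
stale = [loc for loc, lastmod in entries if lastmod and lastmod < "2016-01-01"]
```

The same loop also gives you a URL list you can diff against the crawl to see which sitemap entries never made it into the index.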

9. Does the site have an indexable navigation system?

  • Is JavaScript or Flash based technology used for the menu or other navigation links?
  • If you're not sure, turn JavaScript and CSS off in your browser and check whether you can still use the links. If so, you're good to go.

10. Check the site with a Smartphone such as an Android or iPhone

11. Conduct a Test Crawl to watch for errors

  • Use SEN's Super Spider, Xenu or Screaming Frog's Desktop Spider to test crawl the site and observe results. Are these spiders reaching all levels of the site?
  • Observe the number of 404 and other errors, these should be fixed.
  • Note how many URLs are redirected (301 or 302), these redirects should NOT be necessary other than short term fixes. If you're redirecting an on-site link, you really need to update the old link to point to the new URL, not redirect it.
  • If the spiders can't crawl the site, then search engines will likely have problems as well.
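
At its core, a test crawl is just extracting every link and checking its response; the extraction step alone can be sketched with the standard library's HTML parser (the sample page is invented):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect href targets so each can then be fetched and checked for errors."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

page = '<p><a href="/shop/">Shop</a> and <a href="/old-page/">an old link</a></p>'
collector = LinkCollector()
collector.feed(page)
# collector.links now holds the URLs a crawler would queue for status checks
```

Note this only sees links in the HTML source, which is precisely why JavaScript-generated navigation is invisible to simple spiders.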

Duplicate Content

1. Check for off-site duplicate content problems

  • Search for a unique snippet of text within quotes from recent, but not brand new content – does it show up anywhere else?
  • If there are copies of content, attempt to determine the source. For example, are they scraping the site? RSS feed? Content syndication?
  • How about images, are the site's images copied to other websites? Use Google Image search to find duplicates. Use the Similar and Visually Similar Search Options to discover variations and alterations.
  • Conduct searches for the company's phone numbers (toll free and standard), and postal address to help discover additional websites.
  • Has the company moved in recent years? Conduct Searches for previous phone and address references if so.

2. Are both the www and non-www versions of the site indexed?

  • Check using a site: search on each version.

3. Is the site indexed by IP address or by host name?

  • Find the site's IP address using a hostname-to-IP lookup tool.
  • Do a search at Google using the site: operator on the IP address (e.g. site:203.0.113.10).
  • If the site is indexed by IP, you have a duplicate content issue on your server. You'll need to 301 redirect your site's IP address to your domain name to prevent the IP address from getting spidered.
  • Look to see if the site is indexed by the server host name; some servers are set up to be reachable by a hosting-company host name. You can look for this error by searching for unique text from the page, such as the phone number, combined with a -site: filter to remove your site (and subdomains) from the results. This should return all sites that have that unique text string indexed EXCEPT for your domain name. Note: You may also find other problems, like site mirrors, with this test.

4. Is there a duplicate issue with mobile content?

  • If a separate mobile site or mobile customization is in use, does that create a duplicate content situation on the main site, or on the mobile site?

5. Are there 301 redirects in place to the preferred (www or non-www) canonical version?
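
On Apache servers these redirects are commonly handled with a rewrite rule in .htaccess; a minimal sketch, with example.com standing in for the actual canonical host:

```apache
RewriteEngine On
# Send any request for a non-canonical host (non-www, raw IP, etc.)
# to the preferred www host with a permanent (301) redirect
RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

Because the condition matches any host that isn't the canonical one, this single rule also covers the IP-address indexing problem from the previous section.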

6. Do a site: search and check for omitted results

  • At the end of the listed results, are you seeing the “in order to show you the most relevant results, we have omitted some entries very similar to the XXX already displayed” message? This is a strong indication of duplicate content on the site. You may not see this message on the first page of results; click further into the result set to see if it shows up.

7. Is the site a dynamic database driven site, or does it use tracking URLs?

  • URL variables and tracking URLs can often cause content to be indexed multiple times. Take a snippet of body text from an indexed page, and then search for it using the “text snippet” method to see if you can find alternate URL structures.
  • Scan the site with a tool like SEN's Super Spider, Xenu or Screaming Frog's Desktop Spider and sort the results by title and by URL to look for duplication.
  • If the site uses URL parameters, such as ?category=shoe&brand=nike&color=red&size=5, you might want to make use of URL Parameters Tool in GSC to avoid problems with duplicate content.
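
Grouping crawl results by title is easy to script once you have the export; a sketch with invented crawl data:

```python
from collections import defaultdict

# (url, title) pairs as exported from a desktop-spider crawl (invented data)
crawl = [
    ("https://example.com/shoe?color=red",        "Red Shoes | Example"),
    ("https://example.com/shoe?color=red&size=5", "Red Shoes | Example"),
    ("https://example.com/about",                 "About Us | Example"),
]

by_title = defaultdict(list)
for url, title in crawl:
    by_title[title].append(url)

# Several URLs sharing one title usually means parameter-driven duplication
duplicates = {t: urls for t, urls in by_title.items() if len(urls) > 1}
```

Each group of URLs sharing a title is a candidate for a canonical tag, a parameter setting in GSC, or a robots rule.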

8. Check for additional Domains, subdomains, subdirectories and secure servers for duplicate content.

  • Does the same content exist on different domains, subdomains, subdirectories, or secure servers (https)?
  • Review Google Webmaster Tools -> Search Appearance-> HTML Improvements -> Duplicate Title Tags report. Duplicate titles often indicate duplicate content pages.
  • Do other domain names point to the same content on the server? This can be a severe problem, and many companies do this without realizing the possible problems it can cause. Ask if the company owns other domains, get a list of any “extra domains” and check to see what content they are pointed at and do a site: search to see if they are indexed.
  • Are there other domains representing the company, such as Yellowpages or other promotional domains representing the company at the same address? These can be a source of problems for Google My Business Page resulting in lower ranking and/or incorrect address citations.

9. Check Image Duplicate Content

  • Load sample image URLs into Google's "search by image" option and see if the images show up on other sites. Product photos are very likely culprits.

Load Time

1. Google Page Speed Insights

  • Test a sampling of the site's pages (Home, Product, Gallery, etc.) with Google's Page Speed Insights. Record Mobile and Desktop Ratings.
  • At a minimum your site should score better than your competition's site; ideally it should score in the 80-100 range for both desktop and mobile tests if possible.
  • Note that Google Page Speed tests show you what improvements your page and site need; they are not an actual speed measurement.
  • If the site is using Google Analytics, review the Behavior > Site Speed reports. The Page Timings section can be sorted by Avg. Page Load Time showing you the slowest pages on the site. Additionally you can show in the 2nd column the page Bounce Rate. Specifically make note of slow pages and high bounce rate for needed improvements.
  • Next in the Google Analytics Site Speed section, the Speed Suggestions report will display pages that are ranked by page views. Make note of pages with low scores and high average page load time for improvements.

2. Test Website Response Time

We recommend using a well-established page-load testing service for performance benchmarks; it's a fast way to analyze Load Time. According to Google, sites that take 20 seconds or more to load are likely to cause ranking issues for the URL. These tests will help identify areas of the site that need further improvement. Make sure to test the protocol actually used on the site (http/https); failure to do so will result in inaccurate reporting.

  • Test Cold Cache vs. Hot Cache - this reports page load time on first visit and subsequent visits. You should test your home page and other important sections of your site. We recommend testing with both Cable (5/1 Mbps 28ms RTT) and Mobile 3G (1.6 Mbps/768 Kbps 300ms RTT). Make note of the Load Time, First Byte and Number of Requests for both First View (Cold Cache) and Repeat View (Hot Cache). These are important metrics; for comparison you should see a 2-3 second First View and a sub-one-second Repeat View on Cable. For Mobile, look for better than a 5 second First View and a 1 second Repeat View; those values are both reasonably fast (and obtainable).
  • Note the Grade (A through F) reported in each section: First Byte Time, Keep-alive, Compressed Transfer, Compressed Images, Progressive Images and Caching. Each section should score an "A" on an optimized web page. Again, test more than just the home page; it's important to test major sections of the site. Ideally, each time your site template changes (product, home, category, gallery, FAQ, etc.) you should test for irregularities.
  • Specifically note pages with a high Time to First Byte (TTFB); these are pages that may have backend problems, hosting issues, DNS issues, etc. that need resolving. Slow TTFB pages are VERY aggravating for users, and have been suspected of having an impact on search ranking as a site quality factor. Ideally we would like to see a TTFB below 200 ms if possible; Google Page Speed Insights will ding you points if it's much slower than that. Also note TTFB is often load dependent - if the site is getting heavy traffic it will often slow down, so this time will likely vary from test to test.
  • The "Start Render" value is the amount of time before the user sees the page start to load, which is a big factor in high bounce rates. Improvements such as progressive JPG images, removal of render blocking CSS and JavaScript will all improve this value.
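
TTFB can also be measured directly from a script, and the targets above turned into a quick grade. A sketch (the grading cutoffs follow the targets mentioned above; the function names are ours):

```python
import http.client
import time

def measure_ttfb(host, path="/", use_https=True):
    """Time from sending the request to receiving the response status line."""
    conn_cls = http.client.HTTPSConnection if use_https else http.client.HTTPConnection
    conn = conn_cls(host, timeout=10)
    start = time.perf_counter()
    conn.request("GET", path)
    response = conn.getresponse()       # returns once the first bytes arrive
    ttfb_ms = (time.perf_counter() - start) * 1000
    conn.close()
    return response.status, ttfb_ms

def ttfb_grade(ms):
    """Roughly mirror the checklist targets: ~200 ms good, much slower is a problem."""
    if ms <= 200:
        return "good"
    return "needs work" if ms <= 600 else "poor"
```

Because TTFB is load dependent, run measure_ttfb several times at different hours and grade the median rather than a single sample.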

3. DNS Server Tests

  • DNS errors can cause severe performance problems; free online DNS testing tools can check yours.
  • DNS response time adds latency and impacts your site load time, including Time To First Byte (TTFB) performance. An average under 10 ms is a good response time; over 100 ms, you should work on improvements.
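
A rough DNS timing check is possible from the standard library as well; note that resolver caching means the first lookup is the interesting one (a sketch):

```python
import socket
import time

def dns_lookup_ms(hostname):
    """Very rough resolution time for one lookup; repeat and average in practice."""
    start = time.perf_counter()
    socket.getaddrinfo(hostname, None)   # raises socket.gaierror on DNS failure
    return (time.perf_counter() - start) * 1000
```

Run it against your own hostname from several networks; per the guideline above, averages under 10 ms are good and over 100 ms warrant attention.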

See How the Pros Engineer a Lightning Fast Website - Part 1 and Part 2 for tips on improving these numbers.


Content Quality

1. Does the home page of the site have indexable text of at least one paragraph?

  • Lack of indexable text can be a severe ranking problem.
  • A rough rule of thumb: if you hit CTRL-A then CTRL-C to copy the text on the site and paste it into a text editor, that text is approximately what is indexable.
  • Ideally the page would have 1500-2500 words, but in certain situations that's difficult to do. At least try to have a 150 word minimum for your pages - more is better. Pages that have a high number of links but a small amount of text are typically going to be considered low quality.
  • For pages that you need but that are very light on text, such as gallery pages with little text content, consider setting them to meta noindex to avoid any Panda penalties.
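
The CTRL-A / CTRL-C test can be approximated in code by stripping markup and counting what's left; a sketch using the standard library parser (class and function names are ours):

```python
import re
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Keep only visible text, skipping script and style blocks."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth:
            self.chunks.append(data)

def visible_word_count(html):
    parser = TextExtractor()
    parser.feed(html)
    return len(re.findall(r"[A-Za-z0-9']+", " ".join(parser.chunks)))
```

Pages scoring under the ~150-word floor above are candidates for more copy or for a meta noindex.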

2. Spelling, Grammar and Content Quality

  • Bing has specifically stated that their engine pays attention to spelling and grammar errors. Google's Matt Cutts on the other hand previously said don't worry about spelling errors in user comments, but he really didn't address errors in main content. We believe a lack of spelling and grammar errors can be a small quality signal.
  • Does the content read like a magazine or newspaper article, or does it read like someone paid a worker in India $1/hr to produce?
  • Would you trust the content presented on the pages? Does it seem to be written by an expert or someone well versed on the topic?
  • Does the site have multiple articles on the same topic, with very similar content but with only certain keywords switched out? That's a very good sign of low quality content that users won't like.
  • If this is an ecommerce site, is there a lack of security signals that would make you not trust them with your credit card?

3. What is the Readability Level of Important Content ?

  • Copy the plain text from the home page (not the HTML code) and paste it into an online tool such as a Dale-Chall readability tester. Record the Grade Level, the number of words NOT found in the word list (unique words), and the Final Score. Also test other important content pages on your site such as FAQ, product pages, white papers, etc. Studies have shown that low readability scores (low grade level) have some correlation with lower ranking.
  • Has readability been compromised for the sake of optimization?
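
If you'd rather score readability locally, the Flesch-Kincaid grade level is a simple stand-in for the Dale-Chall test (the syllable counter below is a crude heuristic, not an exact one):

```python
import re

def count_syllables(word):
    """Crude heuristic: count vowel groups, treating a trailing 'e' as silent."""
    word = re.sub(r"e$", "", word.lower())
    return max(1, len(re.findall(r"[aeiouy]+", word)))

def fk_grade(text):
    """Flesch-Kincaid grade level: higher means harder-to-read text."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * len(words) / sentences + 11.8 * syllables / len(words) - 15.59
```

The absolute number matters less than the comparison: run it across your key pages and against competitors ranking for the same terms.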

4. Does the site have over usage of important keywords?

  • Is there excessive keyword repetition within the indexable body text, image alt text, and/or links? Does the content look "spammy"?
  • Is the site a "Your Money or Your Life" website? If so, would you trust the site with your credit card information, email address, or other personal health details?
  • Are outbound links relevant to the topic on the page?
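
Keyword repetition is easy to quantify; a sketch that reports what fraction of the body text a phrase accounts for (the threshold you act on is a judgment call, not a Google number):

```python
import re

def keyword_density(text, phrase):
    """Fraction of the word count consumed by repetitions of a phrase."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    hits = len(re.findall(re.escape(phrase.lower()), text.lower()))
    return hits * len(phrase.split()) / max(1, len(words))

body = "cheap widgets cheap widgets buy cheap widgets today"
density = keyword_density(body, "cheap widgets")  # well over half the copy
```

Run it over body text, alt text and anchor text separately; a phrase dominating any one of those is the "spammy" pattern this section warns about.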

5. Meta Tags

  • Review Meta keyword tags. This tag is no longer observed by search engines, so excessive content in it is typically wasted code (with the exception that the tag is sometimes used for on-site search).
  • Review Meta description tags. Does each page have a unique, well written description? Google Webmaster Tools has a report that shows duplicate Meta descriptions.
  • Look for use of the Meta robots tag. The noindex variable should only be used on pages that indeed should not be indexed. Make sure this tag is not in use on pages that should be indexed.
  • Use of the Meta robots index, follow tag is wasteful and only adds to code bloat on the site. Those settings are assumed and do not need tags to indicate index or follow.

6. Title Tags

  • Does each page have a unique title tag? Google Webmaster Tools has a report that lists pages with duplicate title tags. You should also use a tool like SEN's Super Spider, Xenu or Screaming Frog's Desktop Spider to scan the site to find duplicate title tags.
  • Is the primary keyword phrase at the beginning or near the beginning of the title tag? This is typically a ranking boost.
  • Are there any title tags over approximately 70 characters long? If so, they will not display fully in search results and the extra text may not be useful. Long title tags can also cause Google to swap out the title for more appropriate text.
  • Do the title tags appear to be spammy, or keyword stuffed? This can reduce click through rates in search results and result in lower ranking.
  • Every page should have a well optimized title tag, it is still one of the most important on-page ranking factors.
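
The length and placement rules above are mechanical enough to script; a sketch (the ~70-character cutoff and the checks come from this section, the function is ours):

```python
def title_check(title, primary_phrase):
    """Flag the title-tag issues called out in the checklist."""
    issues = []
    if len(title) > 70:
        issues.append("over ~70 chars; may be truncated or rewritten by Google")
    if title.lower().count(primary_phrase.lower()) > 1:
        issues.append("phrase repeated; may look keyword-stuffed")
    if primary_phrase.lower() not in title.lower()[:40]:
        issues.append("primary phrase not near the beginning")
    return issues
```

Feed it the (url, title) export from a site crawl and you get a page-by-page punch list instead of a manual review.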

7. Are images SEO optimized?

  • Use of keywords in file name?
  • Use of keywords in img alt text?
  • Do pages that are dedicated to images, like a Gallery, have appropriate title tags to help describe the image?

For detailed image optimization recommendations, see Six Steps to Quickly and Expertly Optimize Your Site's Images.

8. Text Formatting

  • Are H tags in use?
  • Does the site use more than one H1 tag per page? It shouldn't, unless you're working with a Blog where individual post sections are marked with h1 tags for each post title in Categories, Blog Home, etc.
  • Is content formatted well and easy to read?

9. Advertising

  • How many ad blocks are in use throughout the site?
  • What approximate percentage is the ad to content ratio in regards to screen real estate?
  • Is over 50% of the above the fold content composed of ads? If so, that may be viewed as poor quality by Google, and impact ranking negatively. On a desktop, a 1280x1000 pixel area is typically regarded as content "above the fold" - obviously mobile devices are MUCH smaller.

10. Desktop Compatibility

  • Review site using current and older versions of Edge/Internet Explorer, Chrome and Firefox. Check for variations, problems rendering pages, layout variations and using features such as navigation.
  • Is the site usable if you turn off JavaScript and CSS in the browser?

11. Mobile / Tablet Compatibility

  • Check the site using a Smartphone such as an Android phone or iPhone; does it have at least minimal functionality? Test with 10" and 7" tablets as well, and note problem areas if found. If you do not have access to these devices, at least test with a Smartphone emulator and/or the Chrome DevTools mobile emulation (look for the smartphone icon at top left).
  • Test with Opera Mobile especially if you're using web fonts.
  • When using a Smartphone, can you find the address, email contact and phone number?
  • Can you use the site's main navigation links?
  • If there are videos present, can you view them on the Smartphone? Google lists unusable video as a ranking factor for mobile and does not want you to use Flash videos. In fact, there is a penalty if you serve Flash to non-supporting devices.

12. Is Structured Data Markup compatible with in use?

  • Is the markup properly formatted? Test using Google's Structured Data Testing Tool; excessive errors can result in Google penalties.
  • Check for errors such as incorrect hours, business name, address or phone (NAP).
  • If using other markup formats such as hCard, consider updating to those endorsed by
  • Does the site have structured data hidden using CSS display:none or visibility:hidden? That practice can easily result in Google penalties.
  • Is the structured data misleading, inaccurate, or false? If so, it would violate Google's Guidelines and can result in a penalty for Spammy Structured Markup. An example is a site marking up hidden text in order to generate a review rich snippet, but the actual review is not visible on the page when viewed.
  • Are there aggregate review values marked up, but the reviews are not visible? If so, that's a potential penalty problem. Are reviews posted that were sourced from someplace other than the site such as Google My Business, Trustpilot, Yelp, etc.? That's also a problem.
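
For JSON-LD markup, basic field checks can be done in a few lines before reaching for Google's tool; a sketch with an invented LocalBusiness block:

```python
import json

JSONLD = """{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Acme Widgets",
  "address": {"@type": "PostalAddress", "streetAddress": "123 Main St"},
  "telephone": "+1-555-0100"
}"""

data = json.loads(JSONLD)   # malformed JSON raises ValueError right here
missing = [field for field in ("name", "address", "telephone") if field not in data]
```

This only catches syntax and missing-field problems; whether the values match the visible page (the penalty risk above) still needs a human check.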

13. Are Facebook/G+ Open Graph Tags in use?

  • Are Facebook and Google+ Social Buttons in use? Home page only? Sitewide?
  • Test Page's Open Graph tags using Facebook's Lint Tester and look for errors. Facebook's Lint Test is the most complete method you can use for testing these.
  • Is the content within the tags unique on each page and formatted correctly?
  • When a URL is “liked” on the site, does the presentation on the person's wall look correct?
  • Are the same Open Graph tags used site wide, or do they correctly reference each page uniquely?
  • Is the Like button properly referencing the URL where the Like Button is at? Or do they have all the Like Buttons referencing the home page, or their Facebook page?
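
The "every page references the home page" mistake is easy to detect by comparing og:url against the page's own URL; a sketch (the page markup is invented):

```python
from html.parser import HTMLParser

class OGCollector(HTMLParser):
    """Gather og:* meta properties from a page's head."""
    def __init__(self):
        super().__init__()
        self.og = {}

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attr = dict(attrs)
            if attr.get("property", "").startswith("og:"):
                self.og[attr["property"]] = attr.get("content")

page_url = "https://example.com/red-widgets/"
head = '<meta property="og:url" content="https://example.com/" />'
collector = OGCollector()
collector.feed(head)
# og:url pointing at the home page instead of this page is the classic error
mismatch = collector.og.get("og:url") != page_url
```

Run the same comparison across a crawl and identical og:title/og:description values sitewide will also stand out immediately.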

14. Canonical Issues

  • Is the rel=canonical tag in use on the site? If so, is it being used correctly?
  • Does the internal linking structure agree with the canonical version of the site you prefer? For example, if you prefer the www version over the non-www version, do the links within the site use the www?
  • Remember other content formats, such as .PDF, .DOC, Video, etc. – All content should be using the same canonical version of the domain name.

15. Are iframes or frames in use?

  • Both iframes and regular frames are bad for ranking. Check to make sure the use of these methods is limited and not used to present any major portions of content. Iframe tags are quite common (the Facebook Like button, for example), but should be avoided for content that needs to be indexed.

16. Is JavaScript or Ajax used to generate content, or used excessively?

  • Search engines typically have a problem reading JavaScript and in general, you should not expect JavaScript generated content to be indexed, or usable by a search engine.
  • JavaScript navigation can make a site hard or impossible to be spidered by search engines.
  • Ajax generated content can be problematic, not impossible, but certainly can be difficult to get indexed. It should be avoided for SEO best practices unless you fully research and test.

17. Does the site hide text using accordion, tab or other user toggle features?

  • Content that uses CSS display:none or is otherwise hidden may be ignored to a certain extent by Google, for example jQuery tabs on a product detail page. Ideally no important content is placed in those sections if possible. Google has gone back and forth on how they view the use of these features.
  • Is any other content or links being hidden accidentally? For example anchor text links in the footer that are the same color as the background color. Issues like that can trigger a Google Penalty.
  • Use of CSS display:none or visibility:hidden is certainly something Google looks to penalize when used improperly. However, even innocent usage may cause a problem, so avoid these features unless the user can toggle the text or feature on or off. If the user has no way of displaying what is hidden, you should probably consider not using that method to hide the content.

18. Content Date

  • Is the creation date listed in the content? Last Updated Date? These are quality signals according to Google. Even on older, evergreen content Google likes to see the content date published on the page.
  • Is the Copyright Year in the footer of the site up to date?

19. Your Money or Your Life Requirements

YMYL (Your Money or Your Life) sites, such as those that deal with financial, shopping, medical, legal and safety information, are held to very high quality standards by Google. They should typically have these content pages, as described by Google's Quality Raters Guidelines:

  • Privacy Policy.
  • Terms of Service.
  • Robust and Complete Customer Service and/or Contact pages.
  • Functioning Order Forms.
  • About Us/Company Info - these are typical pages on a legitimate business website.
  • HTTPS is highly recommended for YMYL content.

Site Architecture

1. Click Depth

  • How many clicks is the majority of the important content from the home page?
  • What's the maximum depth of content in clicks from the home page?
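
Click depth is a breadth-first search from the home page over the internal link graph; a sketch with an invented mini-site:

```python
from collections import deque

def click_depths(graph, home="/"):
    """BFS from the home page gives the minimum clicks to reach each URL."""
    depth = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in graph.get(page, []):
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

# page -> pages it links to, as gathered by a crawl (invented data)
links = {
    "/": ["/shoes/", "/about/"],
    "/shoes/": ["/shoes/red/", "/shoes/blue/"],
}
depths = click_depths(links)
max_depth = max(depths.values())
```

Pages missing from the result entirely are orphans: no internal path reaches them from the home page at all.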

2. Content Structure

  • How many categories are there?
  • How many sub-categories are there?
  • How many product or detail pages are there? Compare to number of indexed pages earlier and review what percentage of products are indexed.

3. Navigation

  • How many links are there in the main navigation?
  • Are there over 100 links on upper level pages? Over 100 might be getting excessive and may be spreading available Link Juice too thin.

4. Link Anchor Text Structure

  • Does the site utilize reasonable keywords in internal link anchor text? For example links to product and category pages linked to using appropriate keywords?
  • Excessive repetition of primary keyword in navigation structure can trigger a penalty or reduced ranking.

5. Are Mobile Redirects working correctly?

  • If the site customizes content for mobile devices, or redirects users to a separate mobile site, verify the redirects are working correctly. Desktop site subpages should NOT redirect to the mobile site's home page, and desktop URLs should not redirect to 404 error pages on the mobile site either.
  • If a desktop user lands on the desktop site, can they navigate (or be redirected) successfully to the mobile equivalent?
  • If a desktop user lands on the mobile site, can they navigate successfully to the desktop page equivalent?
  • Does the site redirect Tablet users to a mobile site? That might not be the best solution if the desktop site works well for tablets.

6. Search Engine Friendly URLs

  • Does the site utilize excessive URL parameters and/or session IDs? These can cause crawling and duplicate content problems.
  • Shorter URLs are better for usability and are easier to link to.
  • Descriptive URLs can aid in ranking and increase click through rates.

7. General Usability

One sign of a high quality website is great usability. While this isn't a full usability test, it's important to make note of glaring issues such as -

  • Note any particularly annoying or hard to use features of the site.
  • Are there pop-ups that can't be easily closed? Do Pop-ups cover up content on mobile devices? That can cause an interstitial penalty in some cases. Ideally they should not obscure content except under specific conditions such as age verification.
  • Can forms be filled out easily? Is autofill enabled?
  • Are there advertising links closely located to other navigation features making them difficult to avoid clicking?

Local Search

1. Does the business have an Owner Verified Google My Business Page?

  • Is there more than one My Business Page Listing for the business?
  • Who is in control of the account via the listed email address?

2. External Citations and Reviews

  • Do external citations include keyword stuffed references describing the business such as "best plumber in Houston"? Would an editor from Google use the words in a description?
  • Are reviews duplicated in multiple locations? Does the website publish their Yelp, Google My Business Page, or other duplicate reviews on the site? This can cause a loss of reviews in Places when Google finds duplicates.
  • Are on-site reviews or ratings marked up with structured data?
  • Note the last review or ratings date, are new ones being posted?

3. Do Address & Phone Number on the Google My Business Page and website match exactly?

  • This is a factor in Local Search for Google.
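Comparing NAP data by hand gets tedious across many citations, so a loose normalization pass helps spot formatting mismatches. This sketch assumes simple US-style phone numbers; Google wants the listing and the site to match exactly, so treat any difference this check surfaces as something to fix:

```python
import re

def normalize_phone(phone):
    """Reduce a phone number to digits only so formats compare equal."""
    return re.sub(r"\D", "", phone)

def normalize_text(text):
    """Lowercase and collapse punctuation/whitespace for loose comparison."""
    return re.sub(r"[^a-z0-9]+", " ", text.lower()).strip()

def nap_matches(site, gmb):
    """Rough equality check between the website NAP and the GMB listing."""
    return (normalize_phone(site["phone"]) == normalize_phone(gmb["phone"])
            and normalize_text(site["address"]) == normalize_text(gmb["address"]))

site = {"phone": "(316) 555-0134", "address": "123 E. Main St., Wichita, KS"}
gmb  = {"phone": "316-555-0134",   "address": "123 E Main St, Wichita KS"}
print(nap_matches(site, gmb))  # True
```

Note this won't catch substantive differences like "St" vs "Street" or a wrong suite number; it only removes formatting noise so real mismatches stand out.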

4. Does the Website Title Tag contain the City and State?

  • Use of the City and State can benefit Geo Targeted Search ranking.

5. Are City and State, and/or other location related Keywords in use within H tags and Body Text?

  • Use of the City and State as keywords can improve Local Search Ranking.

6. Are the Business Name, Address, Phone and Contact info listed on every page?

  • Business contact info (NAP) on every page is highly recommended, both by Google's Quality Rater Guidelines and from a local search perspective.
  • Ideally the NAP (Name, Address, Phone) info is marked up with structured data markup.
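As an illustration of NAP markup, here is a sketch that emits a JSON-LD block using the schema.org LocalBusiness and PostalAddress types (a vocabulary Google accepts for local business data); all the business details are placeholders:

```python
import json

# Placeholder business data -- substitute the real NAP values.
nap = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Plumbing Co.",
    "telephone": "+1-316-555-0134",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 E Main St",
        "addressLocality": "Wichita",
        "addressRegion": "KS",
        "postalCode": "67202",
    },
}

# Emit the script block to paste into the site template.
print('<script type="application/ld+json">')
print(json.dumps(nap, indent=2))
print("</script>")
```

Because the JSON-LD lives in a single script block, it can be added to a site-wide footer template so the NAP appears on every page, as recommended above.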

Secure Server Errors / HTTPS

If the site uses a security certificate, make sure to check the following.

1. Inspect the padlock icon in the browser

  • Does the padlock icon show an X, slash or red label indicating the certificate is invalid?
  • When you click the padlock icon are there any mixed content errors or other errors being reported?

2. Test the Site using Qualys SSL Labs Server Test

  • Does the summary report show at least a "C" Overall Rating? If not, there are issues that need to be resolved.
  • Does the Configuration support TLS 1.2, TLS 1.1 and TLS 1.0? These are the suggested protocols to be using at this time.
  • In the Authentication Section, when does the certificate expire?
  • In the Authentication Section, does the cert use SHA2 (such as SHA-256 with RSA)? That's what Google recommends; SHA-1 is being phased out.
  • Does the Authentication Key show RSA 2048 bit? If not, the cert should be upgraded.
  • In the Protocol Details Section, what Vulnerabilities are being reported? Make the site owner aware of any issues.
  • Alternate Tests available at - SSL Store Checker & Geocerts SSL Checker
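Much of the certificate expiration checking above can also be scripted with Python's standard ssl module. A sketch follows; the hostname would be your own site, and the `days_until` helper isolates the date math so it can be verified without a network connection:

```python
import socket
import ssl
from datetime import datetime

def cert_days_remaining(hostname, port=443):
    """Fetch the server certificate and return days until it expires."""
    ctx = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=hostname) as ssock:
            cert = ssock.getpeercert()
    return days_until(cert["notAfter"], datetime.utcnow())

def days_until(not_after, now):
    """Parse the certificate's notAfter field, e.g. 'Jun 01 12:00:00 2025 GMT'."""
    expires = datetime.strptime(not_after, "%b %d %H:%M:%S %Y %Z")
    return (expires - now).days

# Example usage (requires network access):
#   print(cert_days_remaining("example.com"))
```

Pairing this with a scheduled job gives an early warning well before the expiration date the manual check records.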

3. Are there unintentionally indexed insecure URLs?

  • If you do a search at Google, are you finding any HTTP URLs indexed?
  • Are both the HTTP and HTTPS versions getting indexed?
  • Test the server redirect from HTTP to HTTPS using SEN's HTTP Header Analyzer; the redirect should generate a 301 code.
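The redirect test can also be reproduced with a short script that refuses to follow redirects, so you see the first response the server actually sends. A sketch using only the standard library:

```python
import urllib.request
import urllib.error

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Handler that stops urllib from silently following redirects."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # do not follow; the redirect surfaces as an HTTPError

def check_http_redirect(url):
    """Return (status, Location header) for the first response only."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        resp = opener.open(url, timeout=10)
        return resp.getcode(), None
    except urllib.error.HTTPError as e:
        return e.code, e.headers.get("Location")

def redirect_ok(status):
    """A clean HTTP-to-HTTPS redirect should be a single permanent 301."""
    return status == 301

# Example usage (requires network access):
#   status, location = check_http_redirect("http://example.com/")
```

On a well-configured server, the first response to the HTTP URL should be a 301 whose Location header is the HTTPS equivalent of the same page.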

4. Are internal links correct?

  • Do the pages' rel=canonical tags point to the correct HTTPS version of the URL?
  • If the pages are meant to be viewed with either HTTP or HTTPS, do they use protocol-relative URLs? (example <a href="//www.example.com/page.html">)
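Checking canonical tags page by page is easy to script. A minimal sketch with Python's built-in HTML parser; the sample markup is illustrative:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collect the href of the first <link rel="canonical"> tag seen."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")

def find_canonical(html):
    finder = CanonicalFinder()
    finder.feed(html)
    return finder.canonical

sample = '<html><head><link rel="canonical" href="https://www.example.com/page.html"></head></html>'
url = find_canonical(sample)
print(url, url.startswith("https://"))
```

Feed each crawled page through `find_canonical` and flag any page whose canonical is missing or still points at an HTTP URL.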

5. Any chained redirects present?

  • If you attempt to load the HTTP version of a URL, are you redirected through more than one hop before reaching the final HTTPS URL? You can test this using SEN's HTTP Header Analyzer. A properly configured URL returns a single "200" response; if you see a 301 response code followed by a code 200 in the server headers, the URL is being redirected, and a series of 301s indicates a redirect chain that should be collapsed into a single redirect.

6. Scan the site for non-secure content.

  • Use an online scanner (some can scan up to 200 URLs) or a tool such as Screaming Frog's desktop spider to identify non-secure URLs. Any resource a page calls while loading must be HTTPS for the page to be fully secure. This includes remote images loaded in a page, local and remote CSS and JavaScript files, etc.
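If you'd rather script a rough first pass, a regex scan over a page's HTML will surface http:// resource references. This is deliberately crude: it will also flag plain outbound <a href> links, which are not mixed content, so treat the output as a list of candidates to review:

```python
import re

# Capture http:// URLs appearing in src or href attributes.
INSECURE = re.compile(r'(?:src|href)=["\'](http://[^"\']+)["\']', re.IGNORECASE)

def insecure_resources(html):
    """Return http:// URLs referenced by src/href attributes in a page."""
    return INSECURE.findall(html)

page = '''
<link rel="stylesheet" href="https://example.com/site.css">
<img src="http://example.com/logo.png">
<script src="http://example.com/app.js"></script>
'''
print(insecure_resources(page))
```

Here only the image and script are flagged; the stylesheet already loads over HTTPS and is skipped.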

Videos and Images

1. If there is video hosted on the site, is an XML Video Sitemap used?

  • Video Sitemaps can improve the chances of getting a video to show up in Organic Search.

2. Are XML Image Sitemaps used?

  • Image Sitemaps can improve ranking in Image Search and the chances of images showing in Organic Search.
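Image sitemaps follow the standard sitemap schema plus Google's image extension namespace. Here is a sketch that generates a minimal one with the standard library; the URLs are placeholders:

```python
from xml.etree import ElementTree as ET

SM = "http://www.sitemaps.org/schemas/sitemap/0.9"
IMG = "http://www.google.com/schemas/sitemap-image/1.1"

def image_sitemap(pages):
    """pages: dict mapping a page URL to the list of image URLs it contains."""
    ET.register_namespace("", SM)
    ET.register_namespace("image", IMG)
    urlset = ET.Element("{%s}urlset" % SM)
    for page, images in pages.items():
        url = ET.SubElement(urlset, "{%s}url" % SM)
        ET.SubElement(url, "{%s}loc" % SM).text = page
        for img in images:
            image = ET.SubElement(url, "{%s}image" % IMG)
            ET.SubElement(image, "{%s}loc" % IMG).text = img
    return ET.tostring(urlset, encoding="unicode")

sitemap_xml = image_sitemap({"https://example.com/": ["https://example.com/photo.jpg"]})
print(sitemap_xml)
```

The resulting XML can be saved to a file, uploaded, and submitted through Google Search Console like any other sitemap.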


1. Form Testing

  • Test all forms to make sure they are functioning correctly. Verify form reply messages are correct and verify form data is sent to the appropriate people.
  • Also test email contact addresses that are listed on site.
  • Does the form request a password or credit card data? If so, both the page and the form submission URL need to be HTTPS, or Google's Chrome browser may throw up a non-secure warning.

2. Review WHOIS contact data and Domain Expiration Date

  • Is the WHOIS domain registration information public? It should be. Also verify all contact information is correct, including address, email and phone. This is a Google Quality Rater Guideline.
  • Note the renewal date for the domain. Who is the contact that will be renewing it? Are they aware of what to do when it's time for renewal? It's not a bad idea to set a date in your calendar as a reminder to make certain the domain is renewed.

3. HTTPS Certificate Expiration

  • When does the HTTPS certificate expire? Do you or your client have a plan in place to renew or replace the security certificate? There are several tools available to check the expiration date, such as the SSL Checker.

4. What is the Site's uptime?

  • Numerous services will monitor the site and alert you when it goes down. How would you know if there is a problem if you're not monitoring it?
  • Some of these services will email you when the content on the site changes. How long would it take you to find out that your site has been hacked and has something embarrassing on the home page?
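The change-detection idea behind those services is simple to prototype: hash the page content and compare against a stored baseline. A sketch (a real monitor would fetch the live page on a schedule and persist the baseline hash):

```python
import hashlib

def page_fingerprint(html):
    """Hash page content so unexpected changes (e.g. a hack) stand out."""
    return hashlib.sha256(html.encode("utf-8")).hexdigest()

# Baseline taken from a known-good copy of the home page.
baseline = page_fingerprint("<html><body>Welcome to Example Co.</body></html>")

# Later fetch of the same page -- the content has changed.
current = page_fingerprint("<html><body>Something else entirely</body></html>")

print("changed!" if current != baseline else "ok")
```

Dynamic pages need a stable region (or a normalization step stripping timestamps and session tokens) before hashing, or every check will report a change.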

5. Google Analytics

Many websites make use of Google Analytics, but many also have things configured incorrectly. Do a quick review of the following:

  • Log into the Google Analytics account and verify the account is properly tracking hits to the site. You should be able to go to the real time traffic display, load a page and see the traffic as a hit in real time.
  • Verify only one instance of the same Google Analytics Tracking Script is being used on each page. Multiple instances of the tracking code can inflate site traffic.
  • Is Google Search Console associated with the Google Analytics account?
  • Is the Google Analytics setup making use of demographics, events, goals and conversion tracking?
  • Is the account set up for the proper protocol that the website is using (HTTP/HTTPS)?
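Duplicate tracking snippets are easy to detect by counting tracking-ID occurrences in the page source. A sketch matching the classic UA-XXXXX-Y ID format Google Analytics uses at the time of writing:

```python
import re

# Classic Google Analytics tracking IDs look like UA-12345-1.
GA_ID = re.compile(r"UA-\d{4,10}-\d{1,4}")

def ga_ids(html):
    """Return each Google Analytics tracking ID with its occurrence count."""
    counts = {}
    for tracking_id in GA_ID.findall(html):
        counts[tracking_id] = counts.get(tracking_id, 0) + 1
    return counts

page = """
<script>ga('create', 'UA-12345-1', 'auto');</script>
<script>ga('create', 'UA-12345-1', 'auto');</script>
"""
print(ga_ids(page))  # the same tracker appears twice -- traffic will be inflated
```

Any ID with a count above one on a single page is a candidate for the double-counting problem described above.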

6. Dangerous Remote Resources

  • Review the source code of sample pages and look for calls to off-site resources. Specifically, look for remote references to JavaScript, fonts, and other resources. Do you recognize the domain names for things such as jquery.js? Are there multiple calls for the same resource? Look for off-site scripting such as

    <script type="text/javascript" src="">
    If you don't recognize the domain name, this could be a potential source of malware or viruses. It may or may not be active when you test it! If you get zero information back from one of those, it could very likely be that the site has been hacked in the past, but the dangerous code hasn't been activated... yet. A call to a blank resource is a good indication of a trojan horse that can be activated at any time!
  • Well-known hosted library CDNs are generally a safe resource. If in doubt, do some research to make sure that off-site resource is safe.
  • These issues can be hard to locate; close examination of your HTML code is required to find them on a compromised website. Be especially vigilant if you know the site has been hacked or compromised in some way in the past.
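One way to triage off-site scripts is to compare each script's host against an allowlist of domains you trust. A sketch follows; the allowlist entries are examples, not a definitive safe list:

```python
import re
from urllib.parse import urlparse

# Example allowlist of hosts you have vetted -- adjust for your site.
TRUSTED = {"ajax.googleapis.com", "cdnjs.cloudflare.com", "code.jquery.com"}

SCRIPT_SRC = re.compile(r'<script[^>]+src=["\']([^"\']+)["\']', re.IGNORECASE)

def suspicious_scripts(html, trusted=TRUSTED):
    """Return script URLs whose host is off-site and not on the allowlist."""
    flagged = []
    for src in SCRIPT_SRC.findall(html):
        host = urlparse(src).netloc
        if host and host not in trusted:  # relative URLs have no host
            flagged.append(src)
    return flagged

page = '''
<script src="https://ajax.googleapis.com/ajax/libs/jquery/3.2.1/jquery.min.js"></script>
<script src="http://unknown-domain.example/jquery.js"></script>
<script src="/js/local.js"></script>
'''
print(suspicious_scripts(page))
```

Only the unrecognized host is flagged; local scripts and allowlisted CDNs pass. Anything flagged deserves the manual research described above before being trusted.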

7. CMS / WordPress Software Out of Date

  • Log into the site's CMS and review the current CMS release. Is there a notification that a newer version is available? How many versions is the software out of date? Google is now sending warnings to verified sites when it detects WordPress installs that need to be updated. This may be a quality signal for Google, as out-of-date software is often a security risk. Note: Web Hosts may shut down a site that is allowed to get too far out of date due to security risks.
  • Review the site's plugins to see if any need to be updated.
  • Is the theme the site uses out of date or in need of an update? Themes can have security problems as well and should be kept up to date.
  • Don't forget to check Forum software and other software powered resources. While Google isn't reporting on these yet, it's likely to happen in the near future as they expand on what they monitor.
  • Don't underestimate the potential for errors or compatibility problems when updating a site's CMS, plugins or themes. Make certain you have a FULL working backup of the site's files and database/s before an update, and make sure you have a contingency plan should an update cause major issues or make the site totally unusable.

Additional Checklists

Now you have the checklist we start with when doing research for our SEN Website Consultations. Be sure to bookmark this article or download the PDF to be ready for your next website audit. 

You're only as good as the checklist you follow!