While creating strategic, optimized content is key for SEO success, other factors need to be considered to maintain your site health and user experience. There are crucial technical audits and spot-maintenance checks you need to schedule to stay on track and keep your site healthy. Below I will walk you through each of these elements and show you where to look to find everything that needs your attention. Some things, like resolving server errors, you can fix on your own if you are technical enough; for others you may need to hire a professional to help.
As we talked about in Part 1 of this series, Search Console (previously Google Webmaster Tools/GWT) has evolved into almost all you need to manage and keep your on-site SEO and search presence healthy. It is your best friend for identifying issues and managing your site maintenance quickly, easily, and for $0.
Before we begin, make sure to download the swipe file that includes: the ebook version of these blog posts (Parts 1 & 2), 2 Bonus On-page SEO Techniques (+ how to use your IA template), plus a spreadsheet template that shows you how to do an SEO Content Audit & Information Architecture Health Check. Whoa! Grab it by clicking below.
If you are adding a lot of content on a regular basis you should let Google know about it as quickly as possible. It can take a while for search engines to get around to crawling your site and indexing its content. Although there are no guarantees that submitting your sitemap to Google will make you rank faster, it doesn’t hurt. Why not increase your chances of refreshing your presence sooner and possibly improving your indexation rate?
Check out this video of Google engineer Matt Cutts “hamming it up” that explains indexation well.
Every website ought to have a sitemap, which search engines use to understand the organization of your site content. Every month, you should review your website sitemap, and if you have made changes to your site, generate an updated version and resubmit it to search engines. You can resubmit in Search Console.
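Before resubmitting, it can help to sanity-check the sitemap yourself: count the URLs it lists so you can compare that number against the page totals Search Console reports. Here is a minimal sketch; it assumes a standard XML sitemap using the sitemaps.org namespace, so adjust for your own file:

```python
import xml.etree.ElementTree as ET

# Standard namespace from the sitemaps.org protocol
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Return the <loc> URLs listed in a sitemap XML document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

# Usage: sitemap_urls(open("sitemap.xml").read())
```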
A 100% indexed site is ideal but is very rare. Realistically, 90+% of the sitemap pages should be indexed. This amount will be lower for new sites, but you should be able to see the rate increase over time. You should be looking for two indicators:
- Is the percentage of indexed pages increasing over time, or holding steady?
- Does the number of pages submitted match the total number of pages that are actually on the site?
If the answer to either one of these questions is “no,” you should monitor this and search for possible reasons why. Common issues include things like 404 errors and blocked resources that I go into detail about below.
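To put numbers on those two checks each month, you can compute the indexation rate and compare the submitted count to your actual page count. A tiny helper; the 90% target here is the rough guideline from above, not an official Google threshold:

```python
def indexation_rate(submitted, indexed):
    """Percentage of submitted sitemap pages that have been indexed."""
    return 100.0 * indexed / submitted if submitted else 0.0

def needs_attention(submitted, indexed, site_page_count, target=90.0):
    """Flag the two warning signs: a low indexation rate, or a sitemap
    that doesn't match the number of pages actually on the site."""
    return indexation_rate(submitted, indexed) < target or submitted != site_page_count
```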
Review 404 errors
A 404 error means a web page that you are trying to reach cannot be found on your site. This can occur for a number of reasons, most commonly a broken link or a bad redirect. Although 404 errors are not directly harmful to your site’s SEO, they may indirectly cause harm by affecting usability and trust. No one likes to land on a 404 page, not even a search engine. Run a monthly audit and fix these.
In Search Console, click Crawl, and select Crawl Errors. See what you’re dealing with.
This particular site above only has three 404 errors. Probably not something to panic about, unless it's very important content. Use your best judgment and prioritize.
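If your crawl-errors list is longer than a handful, you can download it as a CSV from Search Console and triage it in a few lines. A sketch; the column names are assumptions, so check the header row of your own export:

```python
import csv
import io

def not_found_urls(csv_file, url_col="URL", code_col="Response Code"):
    """Return the URLs in a crawl-errors CSV export whose response
    code is 404. Column names are assumptions; check your export."""
    return [row[url_col] for row in csv.DictReader(csv_file)
            if row.get(code_col, "").strip() == "404"]

# Usage:
#   with open("crawl_errors.csv", newline="") as f:
#       print(not_found_urls(f))
```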
Review Structured Data
Structured data is markup you add to your content so that Google can display “rich results” (what the user sees as rich snippets) such as:
If you or your developer take the time to mark up the HTML with structured data, you will get these visually appealing search results. These types of results are known to improve click-through rates by as much as 20-30%, according to Moz. From a user experience perspective, they provide benefits such as:
- Drawing a user's attention to your relevant result.
- Providing instant information related to their search query.
These aren’t available for all types of sites, so see the full list of the type of content that Google supports rich snippets for here: Google documentation.
If you have structured data on your site you should review it for errors as well as opportunities to add new snippets. To review structured data, head over to Search Console > Search Appearance > Structured Data in the drop-down menu.
The report above shows you the total number of structured data items found, trends in structured data over time, types of structured data found on the site, and the errors.
Check which items have errors. Click on each URL in the report to see the markup. The most common markup error is a missing field, e.g. an Event item that has the location and performer marked up but not the date. Again, not all sites use structured data, so if you go into Search Console and see that no structured data is found on your site, you can move on, or consider adding some in the future. Again, prioritize. For assistance understanding your errors please see Search Console Help.
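As an illustration, here is what complete Event markup looks like in JSON-LD form (Google also accepts microdata and RDFa). Every value below is a made-up placeholder; note that the date is included, which is exactly the kind of field the missing-field errors above are about:

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Event",
  "name": "Example Jazz Night",
  "startDate": "2016-09-24T20:00",
  "location": {
    "@type": "Place",
    "name": "Example Club",
    "address": "123 Main St, Springfield"
  },
  "performer": {
    "@type": "Person",
    "name": "Jane Doe"
  }
}
</script>
```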
Check Title Tags
As I previously detailed in Part 1, a title tag is one of the most important on-page factors to consider. You should monitor these on a regular basis to make sure there are no errors such as:
- Missing title tags
- Duplicate title tags
- Too long or too short
- Non-informative title tags (i.e. not relevant to the page content)
Also, make sure that they are optimized for one main keyword. Once again head to Search Console > Search Appearance > HTML Improvements to check your title tags.
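If you would rather audit titles in bulk (say, from a crawl export), the checks above are easy to script. A sketch, assuming you already have (url, title) pairs; the 30-60 character range is a commonly cited guideline, not a Google rule:

```python
def audit_titles(pages, min_len=30, max_len=60):
    """Check (url, title) pairs for the common title-tag problems:
    missing, too short, too long, or duplicated across pages."""
    issues = []
    seen = {}  # title -> first URL it appeared on
    for url, title in pages:
        title = (title or "").strip()
        if not title:
            issues.append((url, "missing title"))
        elif len(title) < min_len:
            issues.append((url, "too short"))
        elif len(title) > max_len:
            issues.append((url, "too long"))
        if title and title in seen:
            issues.append((url, "duplicate of " + seen[title]))
        elif title:
            seen[title] = url
    return issues
```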
Check “Links To Your Site” Report for "Bad" Links
PageRank is an important signal, but it is only one of the roughly 200 factors Google looks at to determine the rank of your site. Even so, many site owners build, or hire someone to build, backlinks that in turn get them penalized by Google. Paid links and other link schemes violate Google's Webmaster Guidelines, so you want to avoid them in order to rank well.
If you review your “Links To Your Site” report you will get a list of backlinks. If any look low-quality or spammy, you should do what you can to remove that link from your profile. If you are unable to remove a link, you can take further action by disavowing it in Search Console. This is a two-step process.
First, download all the links to your site. Next, you’ll create a file containing only the links you want to disavow and upload this to Google. I will show you how.
Go to Search Console > Search Traffic > Links To Your Site. Look at Who links the most. If any of those looks suspicious, click on it to get a list of all your pages that are linked from it. If the site is really spammy, like a porno or gambling site, you should probably disavow it. If you aren't sure, you should check with your SEO to see if that link could be harming your domain. If it isn't, you can skip the disavow process.
Once you have a list of URLs you want to disavow, create a text file (the file type must be .txt and it must be encoded in UTF-8 or 7-bit ASCII), with one link per line. If you want Google to ignore all links from an entire domain (like example.com), add the line "domain:example.com".
Here's a sample of a valid file:
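The format is simple: one URL or domain per line, with optional comment lines starting with #. All the domains below are placeholders:

```
# Two individual pages to disavow
http://spam.example.com/stuff/comments.html
http://spam.example.com/stuff/paid-links.html
# Disavow an entire domain
domain:shadyseo.example.com
```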
Upload a list of links to disavow:
- Go to the disavow links tool page.
- Select your website.
- Click Disavow links.
- Click Choose file.
Give Google some time to process this information. Hopefully, if you were diligent, you will only have to do this one time.
Look at historical site performance
Reviewing historical site performance means looking at the current quarter, the last 12 months, and all-time data. You are comparing current performance, i.e. quarter over quarter and year over year, against past performance over the same spans.
Here is how you look at quarter over quarter in Google Analytics:
Look at the last 3 full months of data and check Compare to: Previous period. Analytics will give you the previous quarter and you’ll have yourself a comparison!
If you want to see year over year select a date range within the current year and check Compare to: Previous year.
What you want to look out for are any anomalies, either good or bad in your traffic and conversions. To do this dig into your individual traffic sources/channels and look for large spikes either up or down.
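Once you export sessions per channel for the two periods, flagging those spikes is one line of arithmetic per channel. A sketch with a made-up 50% threshold; tune it to what counts as an anomaly for your site:

```python
def channel_changes(current, previous, threshold=50.0):
    """Compare sessions per channel across two periods and return the
    channels whose percent change (up or down) exceeds the threshold."""
    flagged = {}
    for channel, sessions in current.items():
        prev = previous.get(channel, 0)
        if prev == 0:
            continue  # no baseline to compare against
        change = 100.0 * (sessions - prev) / prev
        if abs(change) >= threshold:
            flagged[channel] = round(change, 1)
    return flagged
```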
Using a quarterly audit as an example, I see a big decline in referral traffic, over 80%. So I would want to investigate the individual channels (click on Referral to see individual sites) that declined to find out if this is a good or bad thing. In the case below, we get lucky and have no reason to be alarmed! The referrers were mostly spammy sites which have all since gone away. Hence the drop in my referral traffic.
So in a case like this, no action would be needed, besides maybe setting up a spam filter in GA. However, if the declines came from a site where you have a guest post or an article submission, for example, then you’d definitely want to reach out to the site owner and investigate: perhaps your content was removed or a link is broken? These are all things you can determine by auditing traffic channels regularly.
If you are running an e-commerce site, you definitely want to keep an eye on Revenue and Transactions. How are those doing quarter over quarter, year over year? This could be a post all on its own, but you get my point. Any steep declines in revenue and you may have on-page conversion problems to solve.
Review your robots.txt file
This text file, which lives in the root of your website's folder, communicates a set of guidelines to search engine crawlers. Know what it is that you want to communicate and what you want to “disallow.” For example, an e-commerce site's robots.txt would want to disallow crawlers from accessing the shopping cart, so the file would contain lines like:
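For instance (the paths here are placeholders; match them to your own cart and checkout URLs):

```
User-agent: *
Disallow: /cart/
Disallow: /checkout/
```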
It’s also fine to reference your sitemap in your robots.txt, since you will be updating that regularly. That line looks like:
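Assuming your sitemap lives at the root of your site, the line is simply:

```
Sitemap: https://www.example.com/sitemap.xml
```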
What you don’t want to ever see is a line in your robots.txt that looks like:
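That is, a blanket rule of this form:

```
User-agent: *
Disallow: /
```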
Because that is basically telling search engines to not crawl your entire site.
Testing to make sure there are no errors in your robots.txt file is also easy to do using Search Console.
Go to Crawl > robots.txt Tester and input the path of your robots.txt file, and Google will tell you whether it can access it and also give you a list of any errors or warnings.
You can also edit and submit your robots.txt file here, but be sure to read the Google guidelines on this before you get started.
Test for mobile friendliness
Last but certainly not least is mobile friendliness. For 2016 and beyond, this may be the most important update you can make to your landing pages. Luckily, Google has a Mobile-Friendly Test where you can input your URL and get a report on whether the page has a mobile-friendly design.
Start by testing your home page:
Hopefully, you will get this message:
If you haven’t updated your site in a while, this is the message you will most likely see, with problems specific to your landing page listed below the message:
Follow the directions on the report page that Google provides.
A recent mobile marketing statistics compilation from SmartInsights shows just where we are in terms of mobile usage. It’s no longer a question of whether visitors will want to access your site from a mobile device; it’s to the point where you should know your visitors' preferences, i.e. tablet, mobile phone, or other devices.
Mobile media time in the US is now significantly higher at 51% compared to desktop at 42%. If you cannot fix errors yourself, it is time to hire a designer to get your site up to snuff for 2016 and beyond.
Let’s recap what we’ve learned so far:
Search Console has evolved into the go-to tool for managing and keeping your on-site SEO and search presence healthy. It is your best friend for identifying issues and managing your site maintenance quickly, easily, and for $0. The best way to stay on top of site maintenance is to schedule it. Put the dates in your calendar and follow up with everything on the above list.
Key Takeaways for Perfect Technical SEO:
- Keep your sitemap updated and resubmit it to Search Console when you add new content.
- Try and resolve as many 404 errors as you can on your own and seek help if things are beyond your expertise.
- If your site features content such as mobile apps, products, recipes, or reviews, adding structured data can really enhance your search presence and create attractive calls to action in search results.
- Google tells you what to do to improve your HTML and metadata. Do it.
- Depending on the link-building activities you’ve engaged in over the years, you could potentially have some backlinks that are hurting your rankings. Go through the report, check any links that look suspicious, and contact the webmasters to remove them.
- Approximately every 3 months you should look at historical site performance (this quarter, the last 12 months, and all-time) to see how current performance compares to past performance. If you have numbers that are down for a certain KPI, it’s time to dig a little deeper there.
There you have it, there is a lot to maintain in the typical website environment. If you’re looking to outsource this process to ensure you have all your boxes checked for a healthy and well-maintained site at the end of every month, let us know. We carefully choose a few new sites each month to help with site maintenance.
If you are ready to take this on for yourself, this is your last chance to enter your email below and receive the swipe file that includes: the ebook version of these blog posts (Parts 1 & 2), 2 Bonus On-page SEO Techniques (+ how to use your IA template), plus a spreadsheet template that shows you how to do an SEO Content Audit & Information Architecture Health Check. Whoa! Grab it up before it's gone!