This is part two of our 2020 guide to SEO. Part one covered on-page SEO; this part will guide you through how to implement technical SEO.
There are two primary components of SEO: on-page and technical. On-page SEO is about understanding the buying intent of people searching for the problem your product or service solves, then including relevant search terms in your quality content. It focuses on optimizing the words on your webpages.
Technical SEO differs by focusing on user experience and accessibility. For example, when determining where your website ranks in search results, Google’s algorithms consider how long visitors stay on your website and how quickly they bounce. Therefore, factors like page speed (aka load time), the security of your site, whether it’s mobile-friendly, and how quickly visitors can find what they’re looking for are all taken into account.
Below, we walk through the most important elements of technical SEO today and how you can get your site in optimal shape. Here are some free technical SEO tools we’ll mention throughout if you want to bookmark them first:
Additional (paid) technical SEO tools to consider:
Before we begin, make sure to download this swipe file that includes: Bonus on-page SEO techniques (plus, how to use your included Information Architecture (IA) template), and a spreadsheet template that shows you how to do an SEO content audit & IA health check. Download it here:
Note: You may need assistance from your developer for some of this work.
Structure Your Content and Create a Site Map
Your website’s navigation matters, and it’s a great first step in getting your site into optimal shape. Navigation here means the order in which your information is presented to users and search engines. This is often referred to as information architecture and is reflected in your website’s menu.
Your visitors want to find what they’re looking for fast. Otherwise, they’ll bounce on over to your competitor’s website. Create a hierarchy that flows, makes logical sense, and gets visitors to where they want to go—ideally, to where they’ll make a purchase.
A website’s navigation can also help search engines understand what content you think is important. According to Google: “Although Google’s search results are provided at a page level, Google likes to have a sense of what role a page plays in the bigger picture of the site.”
Unless your site has only a handful of pages, think about how visitors will go from your root page (your homepage or a landing page) to a page containing more specific content. Here’s an example of a well-organized website hierarchy from Google:
The image above reflects the menu (or navigation) of a small website selling baseball cards. Your menu should be simple to navigate and present options a potential buyer would expect.
How to Create and Submit a Sitemap
Your site’s hierarchy informs your sitemap. Google says, “A sitemap is a file where you provide information about the pages, videos, and other files on your site, and the relationships between them. Search engines like Google read this file to more intelligently crawl your site.”
Additionally, a sitemap provides valuable information about these files, such as when the page was last updated, how often the page is changed, and any alternate language versions of a page.
A sitemap is created in an XML file format. Most content management platforms and many plugins will provide an XML sitemap for you. Yoast, a popular SEO plugin for WordPress sites, for example, provides one automatically.
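If you’ve never seen one, a minimal XML sitemap looks something like this (the URLs and dates below are placeholders; your CMS or plugin will fill in your real pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want search engines to crawl -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2020-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/blog/</loc>
    <lastmod>2020-01-10</lastmod>
    <changefreq>daily</changefreq>
  </url>
</urlset>
```

If your CMS or an SEO plugin generates this file for you, you shouldn’t need to edit it by hand.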
Once you have your sitemap downloaded, you’ll need to submit it to search engines. To do this in Google, follow these steps:
1. Sign in to Google Search Console (GSC).
2. In the sidebar, select your website.
3. Click on ‘Sitemaps.’ The ‘Sitemaps’ menu is under the ‘Index’ section. If you do not see ‘Sitemaps,’ click on ‘Index’ to expand the section.
4. Remove any outdated or invalid sitemaps (such as an old sitemap.xml).
5. Enter ‘sitemap_index.xml’ in the ‘Add a new sitemap’ field to complete the sitemap URL.
6. Click ‘Submit.’
The same should be done in Bing Webmaster Tools.
Review your website sitemap every month. If you’ve made changes to your site, generate an updated version, and re-submit to Google Search Console and Bing Webmaster Tools.
Add Your Sitemap to Your Robots.txt File
You’ll also need to add your sitemap’s location to your robots.txt file. A robots.txt file tells search engine crawlers which pages or files they can or can’t request from your site. Controlling which pages crawlers can access in turn affects which pages get indexed, a topic covered more extensively below.
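A typical robots.txt is a short plain-text file at the root of your domain. Here’s a sketch (the disallowed paths below are placeholders; adjust them to your own site):

```txt
# Allow all crawlers, but keep them out of admin and cart pages
User-agent: *
Disallow: /wp-admin/
Disallow: /cart/

# Point crawlers to your sitemap
Sitemap: https://www.example.com/sitemap_index.xml
```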
Once your sitemap and robots.txt are captured in Google Search Console (GSC), you can manage most aspects of your technical SEO from the GSC dashboard. In some cases, it will make sense to invest in software such as SEMRush to create a backlink audit report and to help you manage the workflow of removing toxic backlinks and identifying helpful ones, as shown in the “Fix Broken Links” section below.
Once you’ve submitted your sitemap, expect that Google and other search engines will begin indexing the pages of your site to show up in search results.
When a search engine like Google indexes your website, it stores and organizes the content found during the crawling process. Once a page is in the index, it’s in the running to be displayed as a search result to relevant user queries. Example search result for my brand name query:
Ideally, 90% or more of your website should be indexed, but it takes some time for search engines to catch up to newly published pages. That’s why keeping your sitemap and robots.txt files updated is so important.
However, there are also elements of your site you likely don’t want to be indexed, especially anything that could be considered “duplicate content” from what’s already on your site. This might be elements like category pages or duplicate pages.
Choosing what is and is not indexed on your site will look different for each CMS (WordPress, Squarespace, etc.). Check out their help center for more information.
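Under the hood, most CMS indexing settings work by adding a robots meta tag to the page’s `<head>`. If you ever need to do it by hand, it’s a single line:

```html
<!-- Tells search engines not to index this page, but still follow its links -->
<meta name="robots" content="noindex, follow">
```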
Use Secure Protocol (HTTPS)
The security of your website impacts your search engine ranking. An SSL (Secure Sockets Layer) certificate tells search engines that your website is safe.
According to Norton, “an SSL certificate is a type of digital certificate that provides authentication for a website and enables an encrypted connection. These certificates communicate to the client that the web service host demonstrated ownership of the domain to the certificate authority at the time of certificate issuance.”
They’re commonly used on pages that require users to submit personal or credit card information, like e-commerce sites. By ensuring that all data passed between the two parties remains private and secure, SSL encryption can help prevent hackers from stealing private information like credit card numbers, bank information, names, and addresses.
Having an SSL certificate means your site’s address appears as “https://www….” rather than “http://www….”, demonstrating to visitors that your site is secure. Here’s more information on how to secure your website. This is what it looks like when you have an SSL certificate:
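Once your certificate is installed, you’ll also want all HTTP traffic redirected to HTTPS. If your site runs on an Apache server, a common approach is a rule like this in your .htaccess file (this is a sketch; many hosts and CMSs offer a one-click setting instead):

```apache
RewriteEngine On
# If the request did not arrive over HTTPS...
RewriteCond %{HTTPS} off
# ...send a permanent (301) redirect to the HTTPS version of the same URL
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```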
Add Structured Data to Your Website
In SEO, structured data refers to markup implemented on a webpage to provide additional detail about the page’s content. This markup improves search engines’ understanding of that content, which can help with relevancy signals and also enables a site to benefit from enhanced results in search engine results pages, or SERPs (source: Moz).
These enhancements include your content appearing in rich snippets, rich cards, carousels, and knowledge boxes when someone searches for your brand or a relevant term. Because this type of markup needs to be parsed and understood consistently by search engines as well as by people, there are standardized formats and vocabularies for it.
Schema.org is the popular vocabulary used for the markup resulting in rich snippets, cards, and carousels. I suggest working with your developer to implement this in your site’s HTML.
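As a sketch of what your developer would add, schema.org markup is commonly embedded as a JSON-LD block in the page’s `<head>`. All values below are placeholders, continuing the baseball card example:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "1952 Vintage Baseball Card",
  "image": "https://www.example.com/images/card.jpg",
  "offers": {
    "@type": "Offer",
    "price": "150.00",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```

Google provides testing tools that will validate markup like this before you publish it.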
If you or your developer take the time to mark up your HTML with structured data, you’ll get these visually appealing search results. These types of results are known to improve click-through rates by as much as 20-30%, according to Moz.
Here’s an example of a rich snippet:
A newer enhanced result is rich cards, which look like this:
Here’s what rich snippets and rich cards look like on mobile:
Image source: Google Webmasters.
From a user experience perspective, they provide benefits such as:
- Drawing a user’s attention to your relevant result.
- Providing instant information as related to their search query.
These aren’t available for all types of sites, so see the full list of content types that Google supports rich snippets for here: Google documentation
Get Rid of Duplicate Content
Duplicate content annoys your visitors and trips up search engine algorithms, causing them to penalize your website. When crawlers find duplicate content, they don’t know which version to rank or index, which one to give authority to, or whether that authority should be split between them.
Duplicate content can be created in a number of ways, both intentionally and unintentionally. Either way, it’s a problem for good technical SEO. Examples include URL variations, different site versions (“www.site.com” and “site.com”), and scraped or copied content. Sometimes scammy sites will use bots to scrape content and add it to their own, which only hurts your site.
Find duplicate content with SEMRush’s Site Audit Tool.
Instead of deleting duplicate pages, use 301 redirects to send visitors and search engines to the page you want them to find.
BigCommerce provides another example and suggests “preventing your CMS from publishing multiple versions of a page or post by disabling Session IDs where they’re not vital to the functionality of your website and getting rid of printer-friendly versions of your content.”
Canonical tags also help eliminate duplicate content but it’s important to use them sparingly and carefully. Using canonical links lets search engines know where the ‘main’ version of your content resides. Read more on adding canonical tags from Moz.
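A canonical tag is a single line in the `<head>` of each duplicate version, pointing at the “main” URL (the URL below is a placeholder):

```html
<!-- On every variant of the page, point to the preferred URL -->
<link rel="canonical" href="https://www.example.com/blue-widgets/">
```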
Optimize Your Site Speed
You want a fast website. What’s called “time-to-first-byte” (TTFB) in the SEO world correlates highly with rankings: it’s the amount of time a browser needs to receive the first byte of your web page’s data. You want your TTFB to be low.
Use Google’s Page Speed Tool to see how fast your site loads and runs.
One contributing factor to site speed is fast hosting. Do your research before committing to a host. Here are some ranked hosts for online stores.
Other tactics for optimizing your site’s speed include:
- Use a fast DNS (domain name system) provider like Cloudflare or WordPress.com
- Minimize HTTP requests by keeping the use of scripts and plugins to a minimum.
- Use one CSS stylesheet (the code used to tell a website browser how to display your website) instead of multiple CSS stylesheets or inline CSS.
- Compress your images before uploading them into your CMS.
- Try Google’s Lazy Image Loader.
- Condense any files you upload to your CMS. We like Smush for WordPress.
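For the image tactics above, modern browsers also support native lazy loading via a single HTML attribute. A sketch (the file name and dimensions are placeholders):

```html
<!-- width/height reserve space before the image loads;
     loading="lazy" defers off-screen images until the user scrolls near them -->
<img src="/images/product-compressed.jpg"
     width="600" height="400"
     loading="lazy"
     alt="Product photo">
```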
If you’re not a developer, you’ll likely need some help fixing some of the issues from the diagnostics report.
Take a Mobile-Friendly and Mobile-First Approach
Considering that more than 50% of web traffic today comes from mobile, it’s fair to say that nearly every website should take a mobile-first approach, and every website should be mobile-friendly. That means responsive design, navigation that makes sense in mobile browsers, fast load times, no weird pop-ups on mobile, etc. User experience is important in its own right, but Google’s algorithms also began indexing mobile sites first in 2018.
You can test the mobile-friendliness of your website with Google’s Mobile-Friendly Test.
One related option is AMP (Accelerated Mobile Pages). That said, we recently worked on AMP for a client: it’s expensive to implement, and there are no guarantees the AMP versions of your pages will convert better. I’d only recommend implementing AMP on a blog that’s already generating a high level of traffic and seeing a solid conversion rate from that content. Read more about AMP
Fix Broken Links & 404 Errors
A 404 error means a visitor is trying to reach a webpage that cannot be found on your site. This most commonly results from a broken link or a page that was deleted without a redirect.
Search engine crawlers flag 404 errors and they impact usability and trust. No one likes to land on a 404 page, not even a search engine. Run a monthly audit and fix these.
To do so, use Google Search Console. Crawl errors are shown at the site level in the new Index Coverage report (previously the Crawl Errors Report) and at the individual URL level in the new URL Inspection tool. These reports help prioritize the severity of the issue, and group pages by similar issues to help you find common underlying causes.
Here’s what it looks like in Search Console.
Typically, setting up a 301 redirect will fix a 404 error. In doing this, you’re diverting traffic from the broken link to a better destination. A 301 redirect is an appropriate solution if you no longer need a page, deleted one, or if an external site linked to the wrong page: it maps visitors to the page you intended them to visit.
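On an Apache server, a single-page 301 redirect is one line in your .htaccess file (the paths below are placeholders; other servers and most CMSs offer their own redirect settings or plugins):

```apache
# Permanently redirect the old URL to its replacement
Redirect 301 /old-page/ https://www.example.com/new-page/
```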
Check Links to Your Site
Depending on the link-building activities you’ve engaged in over the years, you could have some backlinks that are hurting your rankings. Use Google Search Console or SEMRush to run a backlink report. Go through the report, check any links that look suspicious, and try to get them removed. Here’s where to find backlink reports in Google Search Console:
Again, here’s where some additional software could come in handy. If you find a bunch of shady links pointing to your site, you should consider SEMRush to help you manage the process of getting them removed.
Create Awesome Content Consistently
With every guide or piece of advice, we’re always going to come back to creating well-executed, valuable content. If you want your visitors to buy from you or sign up for your newsletter, you have to demonstrate what they can expect from you.
People are generally smart. No matter how many SEO tactics you implement, they’ll see right through bad content. Prioritize quality first, then worry about everything else.
About every three months, look at historical site performance (this quarter, the last 12 months, and all-time) to see how your content’s current performance compares to the past. For the sake of this article, I’m defining performance as visits from organic search and search engine results page rankings. If the numbers are down for a certain goal or objective, it’s time to dig a little deeper there.
Content auditing on a schedule is important because catching issues early can mean avoiding major dips and penalties in search engine rankings. I include a schedule in the free swipe file below.
If you’re ready to take this on for yourself, enter your email below and receive the swipe file that includes: bonus on-page SEO techniques, a how-to on using the information architecture template, plus a spreadsheet template demonstrating how to do an SEO Content Audit & Information Architecture Health Check.
I know I just threw a lot of information at you. If you have any questions about technical SEO or are looking for help getting your site into shape, contact us any time.