The Complete Technical SEO Checklist

When you first think of Search Engine Optimization, or SEO for short, you might envision yourself researching lists of keywords, drafting a slew of optimized image descriptions, and creating a series of local landing pages.

While, of course, these content optimizations are all essential components of increasing your website’s visibility, there’s another portion of this process that you should never ignore: technical SEO. Read on to learn more about what technical SEO is, why it matters, and how you can make sure your technical SEO checklist is complete.

The Basics of Technical SEO

Among digital marketing experts, there are four core categories of SEO that are critical for improving and maintaining your site ranking: on-site SEO, off-site SEO, local SEO, and technical SEO.

  • On-site SEO refers to the optimization of content within the pages of your website, and is likely the type of SEO that first comes to mind when you think of the term. This includes the use of keywords in your pages, optimizing your images' ALT text, and so on.

  • Off-site SEO includes the different efforts used to bring visibility to your site from external sources, such as guest blogs and press releases that link back to your pages.

  • Local SEO has to do with geographic information, and ensuring your site contains enough localized data to rise to the top of searches made by people in your local or regional area, for example: "best marketing agencies near me."

  • Technical SEO relates to the mechanics on the back end of your website and how successfully modern search engines can parse through your site data. It’s an umbrella category for all optimizations of the way that your site is structured, tagged, and secured.

Think of technical SEO as the vehicle by which search engines can "crawl" through and recognize all of the incredible content optimizations you're making on your website, so that they can put your business at the top of results for the queries that relate most to your offerings. It's the bridge between your site's mechanics and your creative work, and we truly can't overstate how critical it is for businesses to create this connection through an optimization checklist.

Completing Your Checklist

For those who don’t have a background in information systems or coding, the concept of technical SEO can seem scary—but this category is essential to improving your organic search rankings, and in most cases, these optimizations can be made without a technical education. 

Read on to explore the basics of your technical SEO checklist, and why each of them matters.

Ensure Your Site’s Secure ✔️

First things first: In a world where trust is earned, not given, your site must be secure. A Secure Sockets Layer (SSL) certificate helps ensure your site visitors, your customers, and your business are all protected. In essence, an SSL certificate enables an encrypted connection between a user's web browser and the server hosting your website, which helps prevent dangerous data breaches.

You'll notice that sites protected by an SSL certificate are listed as https:// rather than http:// in a browser. This signals that an encrypted agreement has been made on the back end between the visitor's browser and the server that hosts your website (it's sometimes called the "SSL handshake"), validating that the connection is safe.

In addition to, of course, communicating to potential customers that your website is safe to visit and make transactions with, having a secure site with an https:// url is now an important ranking factor in search engine results.

Build Your XML Sitemap ✔️

If you were an author or a researcher, you wouldn't publish your book or paper without a table of contents at the front. Think of your eXtensible Markup Language (XML) sitemap as the table of contents for your website: it tells search engines where and how to find the information within your site. A well-built XML sitemap is readable by both people and machines, allows search engines to comb your website more quickly, and can therefore improve your chances of landing a better ranking. Just as you want people to understand where to find the information they're looking for through clean navigation, you want search engine crawlers to easily understand your website's structure. In general, your sitemap should contain only the URLs you want to appear in search engine results.

If you use a content management system or web template, there are typically options for you to edit and export your sitemap. If your website does not have many URLs, you can also draft one manually following Google's XML sitemap guidelines. Keep in mind that your sitemap will change as you add new pages to your website, so consider using a tool or plugin that updates it automatically.
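
For reference, here is a minimal sketch of what a sitemap file can look like, following the standard sitemap XML format. The two URLs and dates are hypothetical placeholders.

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One <url> entry per page you want search engines to find -->
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2024-01-15</lastmod>
      </url>
      <url>
        <loc>https://www.example.com/services/</loc>
        <lastmod>2024-01-10</lastmod>
      </url>
    </urlset>

Each <loc> entry holds the full address of a page you want ranked, and the optional <lastmod> date tells crawlers when that page last changed.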

Upload Your Robots.txt File ✔️

Your robots.txt file might seem similar to your XML sitemap, but the key difference is that your robots.txt file specifically instructs search bots which pages of your website they should crawl and, importantly, helps ensure that pages you do not want crawled (such as private pages that require a login) are excluded.

When a search bot arrives at your website, it seeks out your robots.txt file first to help it understand where to look. Unlike your XML sitemap, which is more of a resource than a set of instructions, your robots.txt file uses basic directives such as "Allow" and "Disallow" so that you can control which parts of your website crawlers spend their time on, and which they skip.

Your robots.txt file should always be kept up to date, and it's best practice to reference your XML sitemap within this file using a Sitemap: line.
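
For illustration, here is a minimal sketch of a robots.txt file; the blocked path is a hypothetical stand-in for a private, login-only section of a site.

    # Apply these rules to every crawler
    User-agent: *
    # Keep bots out of a hypothetical members-only area
    Disallow: /account/
    # Everything else may be crawled
    Allow: /

    # Point crawlers to the sitemap
    Sitemap: https://www.example.com/sitemap.xml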

Clean Up Canonical Tags ✔️

When a bot crawls through your website, it will, by default, look at the URLs that exist rather than the web pages themselves. Consider the XML sitemap we talked about earlier: it's a reference listing of URLs. Oftentimes, we have duplicate, near-duplicate, or very similar pages that load at unique URLs without even realizing it.

For example, every time you build a marketing campaign that ends on a landing page, you might notice that tracking information populates at the end of that landing page's URL ("?utm_source=", etc.). That means every tagged version of the landing page, whether it arrives from a Facebook ad, an email, or a partner blog, can be treated as a separate page in organic search results. That is, unless you specifically define which version you want search engines to prioritize.

By adding a simple canonical tag (rel="canonical") to the <head> section of your page's HTML code, you indicate to search engines that this is, in fact, the priority version of the page and should be indexed over any other similar ones. In general, designate only one version of a set of similar pages as canonical to decrease the amount of time crawlers need to spend parsing through the URLs.
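
In practice, the canonical tag is a single link element in the page's <head>; here is a minimal sketch, with a hypothetical URL standing in for your preferred version of the page.

    <head>
      <!-- Tell crawlers which version of this page should be indexed -->
      <link rel="canonical" href="https://www.example.com/landing-page/">
    </head>

The same tag, pointing to the same address, belongs on every duplicate or parameter-tagged variant of that page.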

Use hreflang Tags ✔️

Hreflang tags are used to distinguish content that accounts for regional or language variation. For example, while British English and American English may be considered largely interchangeable, the same content may be written differently for each regional dialect.

Hreflang tags tell crawlers that the page they found also exists in another language or regional variant, for example, if you have a United States version of a product page with American spelling and slang used instead of the UK version that was found first. Hreflang values are built from standard two-letter language codes, optionally paired with a two-letter region code (such as en-us and en-gb), and they also help bots avoid processing the variants as duplicate pages. Think of these tags as a way to further canonicalize your URLs.
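
Here is a minimal sketch of how those tags might look in the <head> of each version of that product page, with hypothetical URLs.

    <head>
      <!-- Point crawlers to the US and UK variants of the same page -->
      <link rel="alternate" hreflang="en-us" href="https://www.example.com/us/product/">
      <link rel="alternate" hreflang="en-gb" href="https://www.example.com/uk/product/">
    </head>

Both versions of the page should carry the same pair of tags so that the relationship is declared in both directions.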

SEO-Friendly Pagination ✔️

Pagination refers to the way that site visitors can navigate through content in a clear and linear way without having to load everything at once. This is particularly important when you have a lot of content in a series, such as in a long-standing blog or landing page for a particular type of product (like women’s shoes on an apparel site).

In general, there are three ways you can load content from a series:

  • using pages where the site visitor can navigate by clicking "next," "previous," or numbered page links that direct them to each,

  • by using buttons that prompt the visitor to load more content on the page, or

  • using “infinite scroll” where hitting the “end” of the page prompts more to load beyond it.

No matter how you design your content to load, there should always be HTML links within your code on the back end that allow bots to navigate through these pages. We mentioned earlier that canonical tags should be used to keep duplicate content from being crawled; this situation is almost the exact opposite: you want all of this content to be crawlable, since each component of the series is unique.

To make this happen, use standard HTML anchor links (<a href="...">) that give crawlers a path to follow. This ensures that older content, or pages at the end of a long series, still appears in search results.
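
For example, a paginated blog archive might expose its neighboring pages through plain links like these (the URLs are hypothetical).

    <nav>
      <!-- Plain anchor links that crawlers can follow from page to page -->
      <a href="https://www.example.com/blog/page/1/">Previous</a>
      <a href="https://www.example.com/blog/page/3/">Next</a>
    </nav>

Even if visitors see a "load more" button or infinite scroll, keeping crawlable links like these in the markup gives bots a route to every page in the series.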

Eliminate Crawl Errors ✔️

Just as you don't want crawlers wasting time going through multiple versions of what is essentially the same web page, you don't want to lead them to dead ends! Crawl errors occur when a bot follows a path to reach a certain page on your website, but the page fails to deliver the content the bot is looking for.

If you have broken pages on your website, they will produce the dreaded 404 ERROR message, which means the file is not found on the server. Bots generally won't want to deliver a broken page in search results because it doesn't have the relevant content the user is looking for. Temporary redirects (the 302 REDIRECT) can also be confusing for bots and increase the amount of time it takes for them to parse your content, making them less likely to deliver the page as a result; if a page has moved for good, use a permanent 301 redirect instead. Another huge error that will stop a bot in its tracks is the 500 SERVER ERROR, where the server fails to return any page at all due to a hosting or application error.

Ultimately, you should comb through your site's navigation to ensure crawl errors are minimized, and as mentioned earlier, your server should be reliable so that you can avoid server errors as much as possible. You can even set up a free Google Search Console account for your website, which will help you identify the different types of crawl errors Google has encountered.

Get those Pages Loading Quickly ✔️

Ask any customer visiting your site: If a page loads too slow, away they’ll go. Reports have claimed that customers would rather take the time to search for a new, faster site than wait for a slow-loading page to fully render.

While you may have heard of the term site speed and how the overall quickness of your site affects rankings, the truth is that individual page speed is now just as important a ranking factor on mobile. Here are a few quick steps you can take to help the content on each of your web pages load faster:

  • For overall site speed, ensure you’re using a fast and reliable web host or dedicated server so that your connection isn’t limiting your visitors’ experience.

  • Resize and/or compress the images on your website. The larger the file that needs to load, the longer it will take. If you’re using a website template, you should be able to find the ideal sizes that you should use for your images. 

  • Be mindful of the file types you upload to your site; use web-friendly image formats (like WebP or JPEG), otherwise they might not load at all.

  • Remove website scripts (pieces of code embedded in your website that communicate information on the back end and are not visible on the front end) that are no longer in use, and defer or move non-critical scripts further down in your site's code so that the visible content, the components your visitors care about most, loads first (see the sketch after this list).
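
As a rough sketch of those last two points, the snippet below defers a non-critical script and serves a resized, web-friendly image; the file names, dimensions, and description are hypothetical placeholders.

    <!-- Load a non-critical script without blocking the visible content -->
    <script src="/assets/analytics.js" defer></script>

    <!-- A compressed WebP image, sized for its slot and lazy-loaded until needed -->
    <img src="/images/storefront-800w.webp" width="800" height="600"
         alt="Storefront of the shop at dusk" loading="lazy">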

Keep Text-Based Content Crawlable ✔️

There are many new tools that are available to help business owners and marketing professionals build beautiful websites without having to build site code entirely from scratch. While web apps that help with site creation are incredibly helpful and save time, the code that lives on the back-end sometimes hides or delays the reveal of important content from crawlers, and in turn, affects your page ranking. 

While you might place your larger creative files lower in your code so that your page loads more quickly, you should still always have text-based content describing the images, video, or plugins used on your page. Bots rely on text to understand the content on your site, and cannot parse video content or media that lives within iframes unless you indicate what it is through proper markup, such as ALT text, titles, captions, or transcripts.
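
Here is a small sketch of what that descriptive text can look like in the markup; the file name, video address, and wording are hypothetical.

    <!-- ALT text gives crawlers a readable description of the image -->
    <img src="/images/handmade-mug.webp" alt="Hand-thrown ceramic mug in matte blue">

    <!-- A title attribute describes media that lives inside an iframe -->
    <iframe src="https://www.example.com/embed/product-tour"
            title="Two-minute product tour video"></iframe>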

Code in Structured Data ✔️

Structured data is code that labels the information on each of your web pages so that crawlers can understand it and present it in search results the way you'd like it to be prioritized. Depending on what type of business your website is promoting, you might choose different pieces of information to highlight.

  • A restaurant may want its star ratings and location to appear at the top of its search result so that people in the local area can easily see what others think, as well as the restaurant's proximity.

  • Users looking to purchase an expensive camera may ask several questions while researching their options; therefore, if you sell cameras, you may want a featured response from your FAQ page to populate in the search result.

Google has markup resources that can help you identify which data you should work to highlight based on the type of business you manage, as well as where to place the code on the back-end of your web page.
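
To make this concrete, here is a minimal sketch of structured data for the restaurant example above, written as schema.org JSON-LD inside a script tag; the business name, address, and rating figures are invented placeholders.

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Restaurant",
      "name": "Example Bistro",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main Street",
        "addressLocality": "Springfield"
      },
      "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",
        "reviewCount": "182"
      }
    }
    </script>

The properties shown here mirror the rating and location details mentioned above; Google's markup resources list which fields each content type supports.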

Completing the Checklist

Now that you have the foundation of what should be included in a technical SEO checklist, feel free to build yours out with more details that relate to your unique business—from a running log of internal page rankings, to records of your sitemap, and more—and don’t forget to refer back to it over time. Factors that influence search rankings can change as search engines update their own technological processes, so it’s important to regularly keep tabs on your site’s performance.

If you’d like help making sure you’ve completed your technical SEO checklist correctly, or if you feel you could use support with monitoring the rankings of your website, the team at Doukas Media is here to help.