
How often have you recommended technical SEO improvements, only to see the same mistakes made all over again when new pages or websites were created? At CLICKTRUST, we understand that this can be very frustrating for both SEOs and developers. Moreover, businesses everywhere are losing precious time and money because of misalignment, caused by a lack of knowledge and understanding of each other’s priorities (Search Engine Journal).

For these reasons, and inspired by the recurring issues on our clients’ websites, we decided to create our own checklist focusing on the technical SEO requirements for a new website or webpage. We hope that it can serve as a starting point for communicating priorities to your developer and webmaster teams. However, it’s up to you as a digital marketer or SEO specialist to provide the necessary context and guide your development team through the process, as things are rarely black and white in SEO.

This technical SEO requirements checklist will cover all of the following:

  • Content
  • Media
  • Crawlability
  • Performance
  • Security
  • International SEO
  • Mobile SEO
  • JavaScript SEO
  • Structured data
  • Tracking & tools


Throughout the checklist, we will refer to useful resources as much as possible to help you save time. You will also see that some checks are very self-evident, while others are more easily overlooked. Even though this checklist can be a useful tool, we want to point out that it won’t help you set up a well-structured website. Nor will it help you with keyword research or identifying content opportunities.

Finally, we also realize that no website can ever be perfect from the start. For that reason, some checks should get a higher priority than others. To help you, we will point out what is absolutely necessary and what is ‘nice to have’ throughout the list.

Content

We’re often so busy with the latest technical developments that we sometimes forget the very basics of SEO: high-quality content and well-optimized metadata. Without these, it’s nearly impossible to rank in a top 10 position in Google.

Even though content is usually the responsibility of the SEO specialist or content marketer, there are some things a developer should keep in mind as well. Therefore, we start this checklist with some absolute essentials:

Title tags

  • There should be exactly one <title> element on every page
  • The <title> element should not be empty
  • The <title> element should be no longer than 70 characters
  • The <title> element should be unique for every page (alternate languages excluded)
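
A minimal example of what this looks like in the <head> of a page (the title text is just a placeholder):

    <head>
      <title>Handmade leather bags | Example Shop</title>
    </head>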


Meta descriptions

  • There should be exactly one meta description on every page
  • The meta description should not be empty
  • The meta description should be no longer than 130 characters
  • The meta description should be unique for every page (alternate languages excluded)
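
Like the title, the meta description is placed in the <head> of the page; a placeholder example:

    <meta name="description" content="Discover our collection of handmade leather bags, crafted in Belgium and shipped within two days.">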


Headings

  • There should be exactly one <h1> on every page
  • The <h1> should not be empty
  • Headings should be used to bring structure to a text
  • The headings should follow a logical, hierarchical structure
  • Headings should not be used for banners or formatting alone
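
A quick sketch of a logical heading hierarchy (placeholder content):

    <h1>Handmade leather bags</h1>      <!-- exactly one <h1> per page -->
    <h2>Shoulder bags</h2>
    <h3>Crossbody models</h3>           <!-- only as a subsection of an <h2> -->
    <h2>Travel bags</h2>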


Images

  • All important images should have a descriptive filename
  • All important images should have a descriptive alt="…" attribute
  • All important images should have a descriptive title="…" attribute
  • The images on your website should not be broken
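
Put together, a well-optimized image could look like this (filename, alt and title are placeholders):

    <img src="/images/brown-leather-shoulder-bag.jpg"
         alt="Brown leather shoulder bag with adjustable strap"
         title="Brown leather shoulder bag">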


Media & layout

What is a website without a nice layout, images and videos? Nothing. At the same time, these are the things that often slow down your website, ultimately having a negative impact on user experience and conversion rate. To avoid this, we recommend following these technical guidelines:

Images & videos


CSS

  • Avoid CSS @import
  • Put CSS in the document head
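
In practice, this means linking your stylesheets in the <head> rather than pulling them in with @import (the file path is a placeholder):

    <head>
      <!-- Preferred: a regular stylesheet link in the head -->
      <link rel="stylesheet" href="/css/main.css">
    </head>

    <!-- Avoid: @import delays the discovery of the stylesheet -->
    <style>
      @import url("/css/main.css");
    </style>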


Crawlability

You can have the best copy and the nicest visuals on your website, but this means nothing if Google can’t crawl your website properly, if the website/URL structure is chaotic, or if canonicals or pagination are implemented incorrectly. Therefore, we recommend sticking to the following technical SEO best practices. Most of them are absolutely crucial, though a sitemap won’t have that much impact for very small and simple websites.

Robots.txt

  • There should be a robots.txt file for each subdomain
  • There should be a reference to the sitemap(s) in the robots.txt file
  • The robots.txt must not contain “User-agent: * Disallow: /” on production
  • The robots.txt should not disallow pages that need to be indexed
  • The robots.txt should not disallow pages that contain crucial links, even if you don’t want them in the index
  • The robots.txt should not disallow resources needed for the correct rendering of a page
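
A minimal robots.txt that respects these rules could look like this (the disallowed path and sitemap URL are placeholders for your own setup):

    User-agent: *
    Disallow: /cart/

    Sitemap: https://www.example.com/sitemap.xml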


Sitemap

  • There should be at least one sitemap
  • The number of URLs in the sitemap should not be more than 50,000
  • The size of the sitemap should not exceed 50MB
  • The sitemap should contain all URLs that you want indexed, including new ones
  • The sitemap should not contain any of the following:
    • Redirected URLs
    • Canonicalized URLs
    • 4xx pages
    • Noindex URLs
    • Disallowed pages
    • Orphaned pages
    • HTTP pages
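
For reference, a bare-bones XML sitemap with a single placeholder URL:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/leather-bags/</loc>
      </url>
    </urlset>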


Links

  • Internal links should not contain a nofollow (except in very specific cases)
  • All pages should have at least one incoming link, but preferably more than one for important pages
  • Links should never point to:
    • Redirected pages
    • 404 pages
    • HTTP pages
  • Internal links should be consistent (e.g. always link to the www. version rather than a mix of www. and non-www., and point to the same URL in the navigation on all pages)
  • Anchor text should be descriptive (so not only “click here”, “more info here”, …)
  • Use proper <a> tags with resolvable URLs
  • Make sure links don’t require user interaction to load.
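
In other words, prefer a plain, crawlable anchor over a JavaScript-only link (URLs and anchor texts are placeholders):

    <!-- Good: a resolvable URL with descriptive anchor text -->
    <a href="https://www.example.com/leather-bags/">Browse our leather bags</a>

    <!-- Avoid: no crawlable href and a generic anchor text -->
    <a onclick="goTo('leather-bags')">Click here</a>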


Breadcrumbs

  • Add breadcrumbs where possible and relevant
  • Breadcrumbs should be clickable
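
A basic clickable breadcrumb trail could be marked up like this (paths and labels are placeholders):

    <nav aria-label="Breadcrumb">
      <a href="/">Home</a> &gt;
      <a href="/bags/">Bags</a> &gt;
      <span>Brown leather shoulder bag</span>
    </nav>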


Crawl depth

  • Important pages should not be more than 3 clicks away from the homepage


URLs

  • URLs should follow a logical and consistent structure
  • Every separate page should get a separate URL
  • URLs should be descriptive
  • URLs should not be longer than 100 characters
  • URLs should not contain underscores, non-ASCII characters, unnecessary characters or capital letters
  • There should not be any duplicate URLs caused by www. and non-www variations or HTTP and HTTPS variations
  • All URLs with a 200 response code should be either www. or non-www, not a combination of both (the non-chosen version should have a 301 redirect)
  • The use of parameters in URLs should be limited to an absolute minimum


Redirects

  • Avoid 302 redirects for redirects that are supposed to be permanent
  • There should not be any page with a meta refresh tag
  • Avoid (built-in) redirect chains and internal links to redirected pages


Robot directives

Meta robots

  • Only put a ‘nofollow’ on pages whose links shouldn’t be followed
  • Only put a ‘noindex’ on pages you don’t want in the index*
  • As long as a website is in staging, put a ‘noindex’ on each page


*You might be confused when it comes to the difference between ‘noindex’ and disallowing a page in the robots.txt file. It works as follows: If you don’t want a page to be crawled, you can ‘disallow’ it. Do keep in mind that disallowing a page won’t necessarily prevent it from being shown in the SERPs. If you want to prevent indexation, whether you want the page to be crawled or not, you should use a ‘noindex’.
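
A ‘noindex’ is set in the <head> of the page itself with a meta robots tag, for example:

    <!-- Keeps the page out of the index; its links can still be followed -->
    <meta name="robots" content="noindex, follow">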

Canonicals

  • There should be exactly one canonical present on every page
  • The canonical should be put in the head of the HTML source code
  • The canonical should be self-referential, unless otherwise specified (e.g. to avoid duplicate content issues)
  • Canonicals should not point to any of the following:
    • Canonicalized URLs
    • Redirected URLs
    • 4xx pages
    • Noindex URLs
    • Disallowed pages
    • Orphaned pages
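
A self-referential canonical in the <head> looks like this (placeholder URL):

    <link rel="canonical" href="https://www.example.com/leather-bags/">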


Pagination

  • When only part of the content is loaded on the initial page load and the rest is spread over additional pages or loaded as you scroll, you will need pagination.
  • The pagination tags should be put in the head of the code
  • Paginated pages should either:
    • Have a canonical pointing to themselves, with a rel="next" and a rel="prev" between the pages (except for the first and last page, which only need one of the two)
    • Have canonicals pointing to the ‘View all’ page
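
As a sketch of the first option, page 2 of a paginated series would carry something like this in its <head> (URLs are placeholders):

    <link rel="canonical" href="https://www.example.com/bags/page/2/">
    <link rel="prev" href="https://www.example.com/bags/">
    <link rel="next" href="https://www.example.com/bags/page/3/">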


Performance

We already discussed how to optimize images and videos, but there is more you can do to optimize your loading speed:

Compression and minification

  • CSS needs to be compressed
  • JavaScript needs to be compressed
  • CSS needs to be minified
  • JavaScript needs to be minified


Eliminate render-blocking resources

  • Defer the loading of non-critical resources
  • Inline critical resources
  • Remove unused resources
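
A common pattern is to inline the critical CSS and defer non-critical JavaScript (file paths are placeholders):

    <head>
      <style>/* critical above-the-fold CSS inlined here */</style>
      <script src="/js/app.min.js" defer></script>
    </head>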


Browser caching

  • Set the browser caching expiration to one year for static content
  • Set the browser caching expiration to at least one week for other content
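
What this can look like in the HTTP response headers (the exact directives depend on your server configuration):

    Static content such as images, CSS and JS (one year):
      Cache-Control: public, max-age=31536000

    Other content (at least one week):
      Cache-Control: public, max-age=604800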


Want to know how fast your website is loading? Some of our favourite tools for this are Google PageSpeed Insights, Lighthouse and GTmetrix. You definitely want to check out their documentation as well.

Security

Cybersecurity is a major ranking factor as Google wants to protect its users from malware and unscrupulous websites. Not having an SSL certificate might even make users feel uncomfortable and make them want to leave your page as quickly as possible. Google Chrome already shows warnings to users when trying to visit a non-secure HTTP website. Do you really want to miss out on valuable traffic because of this?

If you genuinely care about the security of your website and your customers’ personal data, you need to have the following in place:


Not sure which security headers were implemented on your website? You can easily find out on securityheaders.com.

International SEO

Hreflang tags are key if you are active in multiple countries and/or languages. Even though the concept of hreflang tags is relatively simple, we often see it being overlooked, and a lot of mistakes are made here. As a result, we regularly see French domains rank above Belgian domains, for example, even for French queries searched by a Belgian user. If you want to avoid this type of issue, these are the technical requirements you need to follow:

  • There should be hreflang tags on every canonical page (= every page that is not canonicalized)
  • There should be a self-referential hreflang tag on every canonical page
  • The hreflang tags should not all point to the homepage. They should point to the most relevant alternative instead.
  • The hreflang tag should use either a language code (e.g. nl) or a language & country code (e.g. nl-BE), but never a country code alone (e.g. BE)
  • Use a hyphen instead of an underscore to separate the language and country codes in the hreflang tags
  • Use hreflang tags on all pages of all domains
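
As an illustration, the <head> of the Dutch Belgian homepage of a hypothetical example.com setup could contain (the first tag being the self-referential one):

    <link rel="alternate" hreflang="nl-BE" href="https://www.example.com/nl-be/" />
    <link rel="alternate" hreflang="fr-BE" href="https://www.example.com/fr-be/" />
    <link rel="alternate" hreflang="nl" href="https://www.example.com/nl/" />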


The same requirements apply if you decide to use an hreflang sitemap instead of hreflang tags.

We also recommend going through our guide on how to combine hreflang tags and canonicals.

Mobile SEO

Mobile-friendliness

More than half of all searches worldwide come from mobile devices (Think With Google). Google understands this and demotes websites that are not mobile-friendly. If you’re not thinking about mobile SEO yet, it’s time to finally do so and follow these technical guidelines:

  • All pages should be responsive
  • There should be a viewport tag on every page
  • Fonts should be readable on all devices
  • There should not be any large, intrusive ads on your page
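
The viewport tag is a one-liner in the <head>:

    <meta name="viewport" content="width=device-width, initial-scale=1">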


Not sure if your page is mobile-friendly? You can easily test this using Google’s Mobile-Friendly Test.

Don’t forget about mobile-first indexing

Since Google is using mobile-first indexing, it’s no longer enough to just be mobile-friendly though. The search engine is now using the mobile version of a page for indexing and ranking, which means that your website also needs to meet the following technical requirements:

  • The actual content should be the same for mobile users as for desktop users
    • Avoid “read more” buttons that need to be clicked before all content is shown on mobile
    • Make sure headings are the same for mobile and desktop.
    • Make sure metadata is the same for mobile and desktop.
    • Make sure the same structured data is used for mobile and desktop.
  • Use the same meta robots tags on the desktop and mobile website
  • Make sure content does not need user interaction (e.g. swiping) to load.


JavaScript SEO

More and more web developers are making use of JavaScript frameworks such as React, Angular and Vue. Even though this might have several advantages from a development perspective, it’s one of the biggest frustrations of SEO specialists nowadays. This is because the use of JavaScript frameworks can make it very hard or even impossible for search engines to crawl your website. However, it doesn’t have to end in a nightmare as long as you stick to the following JavaScript SEO guidelines:

  • SEO elements (metadata, canonicals, hreflang tags and also links) should be loaded regardless of JavaScript support
  • If possible, use server-side rendering when using JavaScript frameworks
  • When using client-side rendering, make sure to serve a pre-rendered version to search engine crawlers (e.g. through dynamic rendering)
  • If you’re relying on image search traffic, avoid swapping different src attributes in and out
  • Avoid a single block element whose content is swapped in and out if that content is important; use CSS to show and hide it instead.
  • When using events to change content, make sure that each separate piece of content lives on its own page.
  • When using events to change content, make sure that there is an actual <a> link between all areas


We also recommend checking out this guide on the basics of JavaScript framework SEO.

Structured data

The type of schema.org markup you need will depend on your industry and the type of website you have. It is definitely not a priority, but there are some types that almost all websites can benefit from, regardless of the industry:

  • Use ‘Organization’ schema on the homepage or contact page
  • Use ‘BreadcrumbList’ schema on all pages
  • Use ‘FAQ’ schema on the FAQ page and all pages with FAQ questions
  • Use ‘Review’ schema on pages with customer reviews
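
As a sketch, ‘Organization’ markup can be added as JSON-LD in the <head> or <body> of the page (all values are placeholders for your own organization):

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Organization",
      "name": "Example Company",
      "url": "https://www.example.com",
      "logo": "https://www.example.com/images/logo.png"
    }
    </script>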


Whatever schema.org markup you decide to implement in the end, always test your code using a tool such as Google’s Structured Data Testing Tool before putting it live.

Tracking & Tools

As performance marketers and SEOs, we like to measure the impact of our actions and campaigns. To make this possible, the following tracking tools should be implemented correctly:


Not sure how SEO can help your company or organization? Need assistance with identifying and solving (technical) SEO issues & opportunities on your website? We are more than happy to help you at CLICKTRUST! Please contact us at contact@clicktrust.be for more information.


Lore Dessent, Marketing Executive