
Technical SEO

Technical SEO is one of the most critical and complex pillars of search engine optimization.

An SEO specialist must be capable of intervening on the website and optimizing aspects such as:

  • Page loading speed;
  • Broken links and HTTP status errors;
  • HTTP/HTTPS protocols;
  • Security and vulnerabilities.

For this reason, as an SEO professional, you have to interface with developers and provide them with guidelines on how to optimize the website from a technical point of view.

There’s no shortcut: if you want to become an SEO expert, you have to know technical SEO.

Don’t worry. With this guide, you can learn the basics of the craft.

Technical SEO: definition

Technical SEO is the process of optimizing the technical aspects of a website to facilitate the work of search engine crawlers.

How? By making a website more accessible and improving its speed, internal structure and security. 

Sitemap

Let’s start with one of the most critical files for a website: the sitemap.

The sitemap is a file containing the hierarchical structure of the pages and files of the website.

The sitemap is important for SEO because search engine crawlers use this file as a compass when crawling a website.

It comes in handy when the website has many pages or when they are poorly connected (bad internal linking).
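A minimal XML sitemap, following the sitemaps.org protocol, might look like this (the domain and date are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- One <url> entry per page you want crawlers to discover -->
    <loc>https://www.domainname.com/blog/main-keyword</loc>
    <!-- Optional: when the page was last modified -->
    <lastmod>2022-06-01</lastmod>
  </url>
</urlset>
```

The file is usually placed at the root of the site (e.g. /sitemap.xml) and can be submitted to Google through Search Console.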

Robots.txt

The robots.txt file is a simple text file in the root directory of your website. It gives crawlers instructions on how to access specific resources and directories.

Search engines can decide whether or not to follow these instructions: they are guidelines, not commands.

While robots.txt is not a reliable way to keep web pages out of search results, we can use it to prevent search engines from accessing:

  • Media files (images, videos, audio).
  • Resource files.
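As a sketch, a robots.txt that blocks crawling of an image directory (the directory name here is hypothetical) could look like this:

```txt
# Applies to all crawlers
User-agent: *

# Block crawling of the image directory
Disallow: /images/

# Point crawlers to the sitemap
Sitemap: https://www.domainname.com/sitemap.xml
```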

You can learn more about this by studying Google’s guidelines on robots.txt.

Website performance

Monitoring and improving your website's performance has a significant impact on its ranking.

Why?

A fast-loading website has positive effects on:

  • Dwell time;
  • Bounce rate.

A faster site can provide a better user experience, which is critical for a good ranking.

How to improve the performance of a website?

Before you start optimizing your site, you should know what issues you need to solve.

You can use the SEO Tester Online tools, such as the SEO Checker and the SEO Spider.


Meta robots

As we said, the robots.txt file does not prevent pages from being indexed.

If we want to exclude pages from indexing, we need to use the meta robots tag.

What is it?

Meta robots is a tag that allows you to control the indexability of a page.

The meta robots tag looks like this and should be placed inside the <head> tag of the page:

<meta name="robots" content="noindex">

The meta robots tag consists of two attributes: name and content.

With the name attribute, we define which crawler we want to target with our directive (in the example above, the value "robots" means the "noindex" directive applies to all search crawlers).

In the content attribute, we specify the action we want the crawler to perform.
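For example, to target only Google's crawler instead of all of them, you could set the name attribute to "googlebot" (a sketch; the combination of directives is just an illustration):

```html
<meta name="googlebot" content="noindex, nofollow">
```

Here "nofollow" additionally asks the crawler not to follow the links on the page.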

URL

The URL is critical in technical SEO. As you may know, it should be SEO-friendly. But what does that mean?

An SEO-friendly URL should:

  • Describe the content;
  • Include the main keyword we want to rank for;
  • Be short and easy to read.

An example of an SEO-friendly URL?

https://www.domainname.com/blog/main-keyword

As you can see, the URL is clear and easy to read. You can easily guess the structure of the website from it.

Always prefer simplicity and clarity when you can. Users and crawlers appreciate it.

Status codes

As we have already said, technical SEO is more complicated than other aspects of search engine optimization for those who do not have an adequate technical background.

However, it is essential to understand concepts such as HTTP requests and status codes.

When we try to visit a website, we send a request to the server that hosts it. After receiving the request, the server processes it and “responds” through specific messages: status codes.

Any example?

Even those who are not insiders have encountered messages such as error 404 (Not Found) or error 502 (Bad Gateway).

An SEO specialist must know all of them. You have to know how to interpret them and how to act when problems arise.
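As a quick reference, Python's standard library already maps status codes to their standard reason phrases; a small sketch:

```python
from http import HTTPStatus

# Map a few common status codes to their standard reason phrases.
phrases = {code: HTTPStatus(code).phrase for code in (200, 301, 404, 502)}

for code, phrase in sorted(phrases.items()):
    print(code, phrase)  # e.g. "404 Not Found"
```

The 2xx codes signal success, 3xx redirection, 4xx client-side errors, and 5xx server-side errors.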

HTTP/HTTPS

An aspect that can affect the way search engines "see" a website is the use of authentication certificates that guarantee its security.

An SEO specialist must understand the difference between HTTP (HyperText Transfer Protocol) and HTTPS (HTTP over SSL/TLS) and know how to configure authentication certificates via SSL or TLS.
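As a sketch, assuming an nginx web server (the domain and certificate paths are hypothetical), a typical setup serves the site over HTTPS and redirects plain HTTP with a 301:

```nginx
# Redirect all HTTP traffic to HTTPS
server {
    listen 80;
    server_name www.domainname.com;
    return 301 https://$host$request_uri;
}

# Serve the site over HTTPS with a TLS certificate
server {
    listen 443 ssl;
    server_name www.domainname.com;
    ssl_certificate     /etc/ssl/certs/domainname.pem;
    ssl_certificate_key /etc/ssl/private/domainname.key;
}
```

The 301 redirect ensures both users and crawlers end up on the secure version of every URL.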

Rel Canonical

While browsing the web, it's easy to come across identical pages that are reachable from different URLs.

To help search engines, the SEO specialist should always select a "canonical" URL that the search engine should use to index that page.

The rel="canonical" tag tells the search engine which URL we want to prioritize over the others.
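For instance, if https://www.domainname.com/blog/main-keyword is the preferred version, every duplicate page would include this tag in its <head>:

```html
<link rel="canonical" href="https://www.domainname.com/blog/main-keyword">
```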
