Technical SEO

Technical SEO is the process of auditing a site's technical aspects to make sure it follows best practices related to how it appears on search engines. These checks cover whether the site can be crawled (that the search engine knows the site exists), whether its pages are stored in the search engine's index, whether all page information is readable by search engine crawlers, and other technical matters.

If the definition is not clear enough, no problem: once you learn how to do it, you will have a clearer and more complete picture.

How does a technical SEO audit work?

There are many things to pay attention to when auditing a site from the technical side. Although there is no single standardized checklist, there are stages you can start with, and each stage has several points that explain what you have to do before moving on to the next.

For example, before worrying about whether the site's links are readable, I have to make sure that search engine crawlers can reach those links in the first place!

Therefore, we will divide the technical SEO plan into several stages, so we can cover a complete picture of what each stage includes.

 Site Structure

Is your site just hundreds or thousands of scattered pages? Or is there a clear page hierarchy that lets search engine crawlers understand the site in layers?

To put the question another way: are your pages organized like this?

Youtube SEO basics

So that you can reach any page easily from the home page, or like this:

A website whose page structure is not organized

Search crawlers are not very different from a person visiting your site: if reaching the internal pages is complicated and there is no easy path from the home page, the crawling and indexing of those pages can be delayed, especially if you do not use a sitemap.
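
To make the idea concrete, here is a rough sketch of what a layered structure might look like (the section names are made up); every page sits under a logical parent, only a few clicks away from the home page:

    yourwebsite.com/
        yourwebsite.com/blog/
            yourwebsite.com/blog/technical-seo/
            yourwebsite.com/blog/keyword-research/
        yourwebsite.com/services/
            yourwebsite.com/services/seo-audit/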

 Crawlability & Archiving

In the previous step, we assumed that once search crawlers visit our site they will find their way to all of its pages; now let's make sure they can reach the site and its pages in the first place.

By the way, it is one thing for crawlers to access our site, and another thing for its pages to be indexed. Why? Because with one simple technical error we can prevent crawlers from archiving any page!

When crawlers access any page or site, they first check that they are allowed to archive the page, by making sure it does not carry the noindex tag, and they also go straight to the robots file located at:

yourwebsite.com/robots.txt

We explained in the first part of the series that even a simple error in this file can be catastrophic: instead of blocking a specific page from archiving, you block the entire site. So you must make sure the file is available at this link, and in the correct form that allows archiving of all the pages you want to appear on the search engine.
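
As a quick sketch (the folder name is just an example), this is what the two mechanisms look like in practice; note how close the harmless robots.txt rule is to the catastrophic one that blocks the whole site:

    <!-- on a single page you do NOT want archived -->
    <meta name="robots" content="noindex">

    # robots.txt: blocks only a private folder, allows everything else
    User-agent: *
    Disallow: /private/

    # robots.txt: the catastrophic version, one character blocks the whole site
    User-agent: *
    Disallow: /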

I assume you have started using Google Search Console; it is very important to look at the Coverage report in the tool, which gives you information about the pages that were accessed: are they archived? Are they excluded from archiving? Or do they have technical problems?

Coverage report in Search Console

 Use a sitemap

This is not the map that some sites put in the main menu listing all their pages (although it serves almost the same purpose); the sitemap is an XML file, and it contains all the pages of your site that you want the search engine to archive and index.

The sitemap is usually at:

 yourwebsite.com/sitemap.xml

If you upload it to a different address, it is better to put that address in the robots file; this helps ensure the crawlability of all pages and their indexing in the search engine's index.
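
A minimal sketch of what a sitemap looks like (the URLs and dates here are placeholders), along with the robots.txt line you would add if the file lives at a non-default address:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://yourwebsite.com/</loc>
        <lastmod>2021-01-15</lastmod>
      </url>
      <url>
        <loc>https://yourwebsite.com/blog/technical-seo/</loc>
        <lastmod>2021-01-10</lastmod>
      </url>
    </urlset>

    # in robots.txt, if the sitemap is not at the default address
    Sitemap: https://yourwebsite.com/custom-sitemap.xml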

One of the most important things to check on any sitemap is that it is constantly up-to-date, not just created once. The sitemap is the second most important way (after links) to tell search engines about your site pages.

You can also speed up the search engine's access to your sitemap by submitting it through Search Console (and I'd be surprised if you've made it this far in the series without using it).

 Domain Redirects

To this day, when many people want to give you a link to a site, they start with www, even though reality has changed and many sites no longer use this prefix. But because people are still used to typing it before the domain name, it can cause a problem.

If your site is reachable under two different spellings, such as:

yourwebsite.com

www.yourwebsite.com

That is, if one of these two addresses is not redirected to the other, it is possible (and very likely) that search engine crawlers will treat them as two different links and archive both, meaning every page on your site will have two copies on the search engine, and that is something you do not want to happen.

Another similar problem is HTTPS: whether or not you use it, the site can be found at two addresses:

http://yourwebsite.com

https://yourwebsite.com

Yes, you might say that this is the same site, and ask what encryption has to do with archiving the domain at all! The problem is that search crawlers see links, and nothing but links; any difference between two links means they are two different links, and they will be archived separately.

The solution? Very simple: choose your preferred domain, with or without www, and of course choose the encryption protocol (it is also one of the most important things to pay attention to in a technical SEO audit), then redirect all the other possible addresses to it.

For example, on this site the preferred address is seocorner.net (with HTTPS, of course), and I set up redirects (via the .htaccess file) from the other possible addresses to it.
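
As a rough sketch of what such redirects can look like, assuming an Apache server with mod_rewrite enabled (the domain is a placeholder, not this site's actual rules), both the http and the www variants are sent to the single preferred https address with a permanent 301 redirect:

    RewriteEngine On

    # send any http request to https
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://yourwebsite.com/$1 [L,R=301]

    # send any www request to the bare domain
    RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
    RewriteRule ^(.*)$ https://%1/$1 [L,R=301]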

 Correct use of Headings

We mentioned this in the fourth part of our series, on keywords, when we said you should make sure keywords appear logically in your page headings, because you may simply discover that your pages use plain text that is merely displayed in a large size!

You can check this easily: select the heading you want to verify, right-click and choose Inspect in Chrome or Firefox, and the heading's element type will appear as in the image:
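
A small illustration of the difference (the markup is hypothetical): the first snippet is a real heading that crawlers understand as part of the page structure, the second is only text styled to look like one:

    <!-- a real heading, visible to search engines as page structure -->
    <h2>Technical SEO audit checklist</h2>

    <!-- plain text that merely looks like a heading -->
    <p style="font-size: 28px; font-weight: bold;">Technical SEO audit checklist</p>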

 Site Speed & Image Sizes

In most cases you will not find a site free of these problems: either the page loads slowly, or the images on it load late because their file sizes are absurd!

I have worked on websites that contained images larger than 5 megabytes without any compression. Do you know what this means for the user? Can you imagine how many seconds a visitor would need to download the entire page on a medium-speed connection (I'm talking about 5-10 megabits per second, and slower connections certainly exist)!

Assuming the connection speed is 5 megabits per second and there is just one image of that size, the image alone will need 8 seconds to load; then ask yourself why Google would ever like a page like this.
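
For the curious, this is where the 8 seconds comes from (an idealized calculation that ignores everything else on the page):

    5 megabytes × 8 bits per byte = 40 megabits
    40 megabits ÷ 5 megabits per second = 8 seconds for that one image alone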

Here you have to pay attention to the dimensions of the images to make sure their size is reasonable. Photos taken with a phone have very large dimensions, several thousand pixels across; after choosing suitable dimensions, you can use tools such as TinyPNG.com (which also works for JPG images) to compress them with almost no visible loss of quality.

We will not revisit the topic of site speed and testing tools here; we covered it in detail in the third part of our series (ranking factors) and you can go back to it if you wish, but there is another important point here.

You will find that many websites use PNG images, which are much larger in size than JPEG images, without knowing the difference between the two formats. Pay attention to this: it is possible to save more than 50% of an image's size just by changing the format (type) of the images on the site!

There are some great WordPress plugins that easily convert all your site's images from PNG to JPEG, such as PNG to JPG.
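
If you are not on WordPress, the same conversion can be scripted. Here is a minimal sketch using Python with the Pillow library (the file names are hypothetical, and quality 85 is just a common starting point):

    from PIL import Image

    # open the PNG, drop the alpha channel (JPEG has none), and re-save as JPEG
    image = Image.open("photo.png").convert("RGB")
    image.save("photo.jpg", "JPEG", quality=85)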

Mobile-Responsive Compatibility

You have to check that the site displays properly on different screens. It is not enough to show the desktop version of the page shrunk down on phone screens, nor even to set up a separate link for visits that come from phones.
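
A very small sketch of what "responsive" usually involves (the class names and breakpoint are arbitrary): the viewport meta tag tells phones not to render the miniature desktop view, and CSS media queries adapt the layout to the screen width:

    <!-- in the page <head>: render at the device's real width -->
    <meta name="viewport" content="width=device-width, initial-scale=1">

    /* in the CSS: simplify the layout on narrow screens */
    @media (max-width: 600px) {
      .sidebar { display: none; }
      .content { width: 100%; }
    }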

To make sure your site is compatible with different screens, use Google's Mobile-Friendly Test tool; just enter your site's link to check it:

Check a site using a mobile-friendly test

 Duplicate Content

If you have more than one page with the same content, beware of archiving them all, for more than one reason: first, Google does not like duplicate content and may even penalize it; second, if one of the pages carries weight (from external links, for example), that weight is lost on the other page, because the two pages are different as far as the search engine is concerned.

It is always better not to have duplicate content, but if it is unavoidable for whatever reason, there are two ways to solve the problem:

1) Redirecting one page to the other: this blocks access to the first page, but it transfers its weight to the second, and only the second appears on the search engine, and on your site as well.

2) Using a Canonical URL: this keeps the two pages (or more) available to visitors, but the weight of the first is passed to the second (from an SEO point of view), and the visitor will not notice anything. It is done through a snippet placed on the first page only, which points to the second page as the main (canonical) version, as shown below.
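
A minimal sketch of that snippet (the URL is a placeholder): the tag goes inside the <head> of the duplicate page and points to the page you want search engines to treat as the original:

    <link rel="canonical" href="https://yourwebsite.com/the-original-page/">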

Use of Site Audit Tools

One of the best-known and best free tools in the field is Screaming Frog (yes, that really is its name!).

The point is, it is simple to use: after you download it, you scan your site and it gives you a report on the pages (up to 500 pages in the free version), their titles, subheadings, descriptions, whether they are redirected, and more. Here is a picture of one of the tool's reports:

A Screaming Frog report

With these nine points, we have covered the most important things to pay attention to once you start working on any site to improve it for search engines. And despite how easy they are, as we have seen, these errors are frequently found on real sites, and unfortunately their impact is significant if they are not corrected. Have you run into one of them while auditing a site? What was the outcome once the problem was fixed?

Share your answer in the comments, then move on to the sixth part of the SEO learning series for beginners: internal and external links. Here are links to the rest of the series:
