According to Google: This is the optimal form of links for search engines

Improving your internal links is very important for helping your site's pages rank better, but first you need to understand how these links actually work.

Whether you are about to build a new site or already have one and want to improve its visibility in search results, you should know the URL format Google prefers, so that you do not fall into the trap of complex links that search engines cannot crawl and index.

Make your website links simple and clear!


Your site's URLs should be as clear and simple as possible, and should not contain session identifiers or unnecessary parameters, because these can get in the way of Google's crawlers.

Whenever possible, use keywords in your URLs instead of numbers and symbols; this helps your pages rank better.

http://www.example.com/index.php?id_sezione=360&sid=3a5ebc944f41daa6f849f730f1

A link like the one above is unfriendly to both visitors and Google's crawlers.

Google also prefers URLs that use punctuation, as shown in the following example:

Google considers a URL that uses punctuation, such as http://www.example.com/green-dress.html, better than one that runs the words together, such as http://www.example.com/greendress.html. Google also recommends separating keywords with a dash ( - ) rather than an underscore ( _ ).
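As a minimal sketch of this idea (the slugify helper below is my own, not anything Google provides), keywords can be turned into a lowercase, dash-separated slug before they are used in a URL:

import re

def slugify(title: str) -> str:
    """Turn a page title into a lowercase, dash-separated URL slug."""
    slug = title.lower()
    slug = re.sub(r"[_\s]+", "-", slug)      # spaces and underscores become dashes
    slug = re.sub(r"[^a-z0-9-]", "", slug)   # drop anything that is not a letter, digit, or dash
    return slug.strip("-")

print(slugify("Green Dress"))   # green-dress  ->  example.com/green-dress.html
print(slugify("green_dress"))   # green-dress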

Google also states that URLs containing multiple identifiers or parameters make crawling harder: the crawler may need more time and bandwidth to reach all of your pages, and sometimes it may not be able to reach them at all.
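As a rough illustration of why this happens (the parameter names and value counts below are my own assumptions, not figures from Google), a few optional parameters multiply into many crawlable URL variants for what is really one page:

from itertools import product

# Hypothetical optional parameters a category page might accept.
params = {
    "sort": ["relevance", "price", "rating"],
    "view": ["list", "grid"],
    "sid":  ["a1", "b2", "c3", "d4"],   # session identifiers make the explosion far worse
}

variants = list(product(*params.values()))
print(len(variants))   # 3 * 2 * 4 = 24 distinct URLs for the same content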

The link formats Google finds hardest to crawl and index:

Links that filter a group of items: On sites such as classified-ads or hotel sites, you may find several links leading to the same section or the same listings. When these are all reached through filtering or sorting options, Google may find it hard to crawl and index the pages because of the number of links it has to follow to get to each one.

An example of a site that filters the hotels it offers by quality:


An example of a site that filters hotels by quality and whether the hotel has a beach:

An example of a site that filters hotels by quality and whether the hotel has a beach and a gym:

Sites that filter their results according to visitors' searches should provide static links to those results that visitors can reach directly from the main menu, a sub-menu, or even a Sitemap.html page; this ensures the pages get crawled and indexed by Google.

The correct form of links for filtering results:

An example of a site that filters the hotels it offers by quality:

http://www.example.com/hotel-search-results/value/

An example of a site that filters hotels by quality and whether the hotel has a beach:

http://www.example.com/hotel-search-results/value/beaches/

An example of a site that filters hotels by quality and whether the hotel has a beach and a gym:

http://www.example.com/hotel-search-results/value/beaches/gyms/
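Below is a small sketch of how a site could build these static filter paths instead of appending query parameters (the build_filter_url helper and its argument names are hypothetical):

def build_filter_url(base: str, filters: list[str]) -> str:
    """Build a static, keyword-based path for a set of active filters."""
    segments = [base.strip("/")] + [f.strip("/") for f in filters]
    return "http://www.example.com/" + "/".join(segments) + "/"

print(build_filter_url("hotel-search-results", ["value"]))
# http://www.example.com/hotel-search-results/value/
print(build_filter_url("hotel-search-results", ["value", "beaches", "gyms"]))
# http://www.example.com/hotel-search-results/value/beaches/gyms/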

Sorting parameters: Some large online-shopping sites use these parameters to order products and present them neatly to visitors. They can produce URLs full of unnecessary parameters, which leads to poor crawling and indexing of the site.

An example of a page that displays videos from category 25, sorted by relevance to the search query:

http://www.example.com/results?search_type=search_videos&search_query=tpb&search_sort=relevance&search_category=25

The correct form of the link:
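As a sketch of what such a static link could look like, the snippet below maps the parameterized URL above onto a keyword-based path. The path layout (/videos/category-25/relevance/tpb/) is only an illustrative assumption, not a form prescribed by Google:

from urllib.parse import urlparse, parse_qs

def static_video_url(dynamic_url: str) -> str:
    """Map a parameterized video-search URL onto a hypothetical static path."""
    params = parse_qs(urlparse(dynamic_url).query)
    category = params["search_category"][0]
    sort = params["search_sort"][0]
    query = params["search_query"][0]
    return f"http://www.example.com/videos/category-{category}/{sort}/{query}/"

print(static_video_url(
    "http://www.example.com/results?search_type=search_videos"
    "&search_query=tpb&search_sort=relevance&search_category=25"
))
# http://www.example.com/videos/category-25/relevance/tpb/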

Quick tips at the end:

Whenever possible, make your site's URLs static rather than dynamic.

Use the robots.txt file to block crawling of dynamic URLs that are generated automatically by users' searches and filters. This prevents your rankings from suffering because of a large number of pages with little valuable content, known as thin content.

You can use wildcard patterns (similar to regular expressions) in the robots.txt file to block a large number of URLs at once, as in the following example:

User-agent: *

Disallow: /*example$
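The rule above blocks any URL whose path ends with "example": the * matches any sequence of characters and the $ anchors the match to the end of the URL. As a way to sanity-check such rules before deploying them, here is a minimal Python sketch, using my own helper functions rather than any robots.txt library, that converts the * and $ wildcards into regular expressions and tests a few paths; the extra "/*?" rule that blocks query strings is an assumption added for illustration:

import re

def robots_pattern_to_regex(pattern: str) -> re.Pattern:
    """Convert a robots.txt path pattern ('*' wildcard, optional trailing '$') to a regex."""
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    body = "".join(".*" if ch == "*" else re.escape(ch) for ch in pattern)
    return re.compile(body + ("$" if anchored else ""))

def is_blocked(path: str, disallow_patterns: list[str]) -> bool:
    """Return True if any Disallow pattern matches the path from its start."""
    return any(robots_pattern_to_regex(p).match(path) for p in disallow_patterns)

rules = ["/*example$", "/*?"]   # the rule from the example above, plus a query-string blocker

print(is_blocked("/category/example", rules))                    # True: path ends with "example"
print(is_blocked("/results?search_sort=relevance", rules))       # True: contains a query string
print(is_blocked("/hotel-search-results/value/beaches/", rules)) # False: the static filter URL stays crawlable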

Finally, if you have any questions about your site's links, leave them in the comments and I will reply as soon as possible.

