Website and SEO structure: how to design and optimize the site tree

Website and SEO structure: there is a close correlation between the structure of a website and its ranking on search engines, but to understand the connection we need to take a step back and talk about search engine spiders, and in particular Google's, the Googlebot.
Google spiders: how Googlebots work (also called crawlers, spiders or robots)

The key objective of an SEO strategy is to make the site "communicate" with search engines: the more clearly they can read our site and the clearer we make the information it intends to convey, the better our chances of climbing the search results.

Specifically, SEO must pave the way for search engine spiders, in order to make crawling faster and therefore more productive.

Spiders are automatic crawling programs deployed by search engines with the task of periodically analyzing the web and the pages of each site. The spider crawls and analyzes the pages, then sends the data and information it collects to the search engine, which decides, based on its own algorithm, whether or not a site deserves to be included in the various SERPs, and in which positions for specific search queries (i.e. the questions people ask search engines).

What is the Crawl Budget

Spiders don't have unlimited time and resources, so they allocate a "budget" of time to each site, called the crawl budget. If they find everything well organized, they will crawl more pages: in the same time frame, a problematic website might have 10 pages crawled, while an optimized website might have 100.

Spiders will keep coming back if they know they will always find something new, which is why we must commit to updating our site often, so as not to discourage them.

As we have already seen, through Search Console we can analyze the work that the Googlebot does on our site: how many pages are downloaded and crawled every day by the search engine, and in how much time?

We can increase the chances that spiders analyze more pages through various improvements (a fast site, the presence of a sitemap, a high-performing server, frequent site updates, etc.), but the fundamental element is certainly the structure of the website: the tidier and more consistent it is with the main topic, the more easily the Googlebot will be able to move between the pages of the site.
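To get a rough idea of how many pages we are actually exposing to crawlers, the sitemap can be inspected programmatically. The following is a minimal sketch in Python, assuming the site publishes a standard sitemap.xml; the address used is a placeholder, and this is an illustration rather than any official Google tooling.

import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder address
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# Download and parse the sitemap
with urllib.request.urlopen(SITEMAP_URL) as response:
    tree = ET.parse(response)

# Collect every <loc> entry, i.e. every URL offered to crawlers
urls = [loc.text for loc in tree.getroot().findall("sm:url/sm:loc", NS)]
print(f"{len(urls)} URLs listed in the sitemap")
for url in urls[:10]:  # preview the first few entries
    print(url)

Comparing this count with the crawl activity reported in Search Console can give a first hint of whether the crawl budget is being spent well.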

Website and SEO structure: how to structure a site in an optimal way

To understand the importance of an effective structure, let's start with an example: imagine a book that catches our eye on the shelf because of its title and cover (they could be the title of one of our articles and the snippet that Google shows to someone making a search).

We begin to leaf through it, but we realize that it has no index (the site menu), it is not divided into chapters and paragraphs (the site's categories and pages), and it deals with topics irrelevant to the main theme.

We would certainly not buy that book, just as we would not browse that site. A Googlebot crawling a problematic site will waste a lot of time because the structure does not guide its exploration, the topics are unclear and it gets confused, with the result that only a few pages are crawled periodically. So our goal must be to make our site communicate effectively, both to readers and to Google's spiders.

If we keep imagining our sites as books, where the chapters are the categories, the sub-chapters the sub-categories and the pages the individual articles, it becomes clearer how we should structure a website. From our main topic a series of related topics (the categories) should branch out, from which further sub-topics may arise (the sub-categories or child categories). We will then fill our categories with relevant content.
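To make the book analogy concrete, here is a minimal sketch in Python that represents such a tree and prints the URL path of each page; the topics and article names are invented for illustration.

# Main topic -> categories -> sub-categories -> articles (the leaves)
site_tree = {
    "travel-in-italy": {
        "seaside": ["amalfi-coast", "cinque-terre"],
        "mountains": ["dolomites", "gran-paradiso"],
        "cities": {
            "art-cities": ["florence", "venice"],
        },
    },
}

def print_paths(node, prefix=""):
    """Walk the tree and print the URL path of every page."""
    if isinstance(node, dict):
        for name, child in node.items():
            print_paths(child, f"{prefix}/{name}")
    else:  # a list of articles, i.e. the leaves of the tree
        for article in node:
            print(f"{prefix}/{article}")

print_paths(site_tree)
# e.g. /travel-in-italy/seaside/amalfi-coast
#      /travel-in-italy/cities/art-cities/florence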

Useful tools to plan an SEO-friendly structure

An SEO-friendly structure, that is, one that search engines like, is an orderly structure that guides the navigation of both users and spiders. In this context, the order in which we distribute categories and tags and the names we assign to them are of significant importance.

How to organize the categories of a website

The names that we assign to each category have an importance that should not be underestimated for the ranking of a site. The URLs, which heavily influence ranking, will also depend on them. So it is necessary to plan a strategy, carrying out research on vertical keywords and keywords related to our main topic, taking into account how competitive they are and how relevant they are to the content of the category.
These keywords will be useful for naming the site's pages, categories, subcategories and tags.
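As a small illustration of the step from keyword to URL, the sketch below (Python) turns a keyword phrase chosen for a category into a clean, URL-friendly slug; the example phrase is invented.

import re
import unicodedata

def slugify(phrase: str) -> str:
    """Lowercase, strip accents and replace non-alphanumerics with hyphens."""
    text = unicodedata.normalize("NFKD", phrase)
    text = text.encode("ascii", "ignore").decode("ascii")
    text = re.sub(r"[^a-z0-9]+", "-", text.lower())
    return text.strip("-")

print(slugify("Località di mare in Liguria"))  # -> localita-di-mare-in-liguria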

To create a winning site structure we can use tools such as:

Coggle: an online tool that allows you to create mind maps, often used to sketch the structure of sites graphically. It is very useful to have a graphic representation of our ideas, so as to see at a glance whether our project is coherent and balanced.

Tags: how to use them to give greater order to the structure of the site

Tags are almost never used correctly, yet they can give our site an extra edge, as they allow us to categorize content according to vertical topics as well.

With tags we can order our articles in an alternative way, for example by seaside resorts, historic locations, mountain resorts, etc.

Tags therefore allow us to make our site even more orderly and schematic.

Improve SEO by optimizing the user experience

In the SERPs, Google privileges sites that guarantee the best user experience for its users, so ensuring good navigability for readers, valid and up-to-date content and a high-performing site must be one of our priorities. Here are some tips not to be underestimated:

Avoid creating "matryoshka" categories, so as to prevent the labyrinth effect: nesting sub-categories within sub-categories will make life difficult for both crawlers and users. The simpler and more concise the structure, the more intuitive and effective the site will be.

Make rational use of tags and do not get carried away: ideally, create a new tag only if you have at least 5 pieces of content to label with it, otherwise it is better to avoid it. Each article should have a maximum of three tags (again, so as not to confuse the spider about the main topic of the post), and in general the site should not have more than 50 tags in total; a quick self-check of these limits is sketched below.
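A minimal sketch in Python of how these guidelines could be checked against a hypothetical article-to-tags mapping; the articles and tags are invented examples.

from collections import Counter

articles = {  # invented example data: article slug -> list of tags
    "amalfi-coast": ["seaside", "campania"],
    "cinque-terre": ["seaside", "liguria"],
    "dolomites": ["mountains", "unesco"],
    "florence": ["art-cities", "tuscany"],
}

# How many articles carry each tag
tag_counts = Counter(tag for tags in articles.values() for tag in tags)

for article, tags in articles.items():
    if len(tags) > 3:
        print(f"{article}: more than 3 tags, risk of confusing the spider")

for tag, count in tag_counts.items():
    if count < 5:
        print(f"tag '{tag}' labels only {count} article(s); consider removing it")

if len(tag_counts) > 50:
    print("more than 50 tags overall: the taxonomy is probably too fragmented")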

The menu must be accessible on every page: we do not know which page the user will visit first, and if we want them not to leave the site immediately, raising the bounce rate*, we must give them all the tools to browse other content.

*The bounce rate is the percentage of visits that end after viewing a single piece of content without taking any further action. It is an enemy of SEO, since Google's algorithm takes it into account: if users leave immediately, our site probably has something to review.
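As a worked illustration of the definition above, here is a minimal sketch in Python; the session counts are invented.

total_sessions = 200          # visits to the site
single_page_sessions = 40     # visits that left without any further action
bounce_rate = single_page_sessions / total_sessions * 100
print(f"Bounce rate: {bounce_rate:.0f}%")  # -> Bounce rate: 20%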

The home page must be reachable from any page of the site in one click, following the same principle as the menu.

It is better to enter related content manually rather than letting the site do it: plugins sometimes suggest irrelevant related posts because they could not find anything better, and this confuses both the context and the spiders.
