Following good internal linking practices improves three key aspects of a website:
- Internal links are a way to facilitate navigation for users.
- They also help consolidate the hierarchy of information on a website.
- In addition, they distribute authority among the pages we want to rank, improving their search engine positioning.
An internal linking strategy will help us improve the SEO of our website. However, we cannot rely on common sense alone; we must thoroughly analyze our website first.
But what are the steps we must take to get to that strategy? We recommend the following actions as the best internal linking practices:
- Full crawling of the website that is going to be optimized
- Structure analysis
- Content and internal linking analysis
- Content management
- Link insertion
- Measurement and iteration
Full Crawling of the Website that Is Going to Be Optimized
Crawling the site is the first good practice we need to undertake to improve internal linking. In order to improve the internal links on our site, we must first understand them.
It is important to crawl the entire website to audit its contents, its structure and its internal links. If the website is huge and has hundreds of thousands of pages, you can select one or more related sections—such as a group of pages targeting a specific geography.
This crawl will reveal quick fixes that we must implement: broken internal links, links to redirects, pages that receive no links, among others. In addition, it will lay the foundations for everything that comes afterwards (the structure and content analysis).
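As an illustration, once a crawl has produced the set of known pages and the internal links between them, a quick fix like orphan pages (pages that receive no internal links) can be found with a few lines of code. This is a minimal sketch; the URLs and the link graph below are hypothetical examples, and in practice this data would come from your crawler's export.

```python
# Minimal sketch: finding orphan pages from crawl data.
# `all_pages` would typically come from the sitemap; `links` from the crawl.
# All URLs here are hypothetical examples.

def find_orphans(all_pages, links):
    """Return pages that receive no internal links, sorted for stable output."""
    linked_targets = {target for _, target in links}
    return sorted(all_pages - linked_targets)

all_pages = {"/", "/blog", "/blog/post-1", "/old-landing"}
links = [
    ("/", "/blog"),
    ("/blog", "/blog/post-1"),
]

# The homepage appears too, because nothing links back to it in this tiny example.
print(find_orphans(all_pages, links))  # ['/', '/old-landing']
```

In a real audit you would exclude pages like the homepage that are reachable through other means, but the core idea—set difference between known pages and link targets—stays the same.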
The most used tool to crawl a website and to be able to easily locate those quick fixes is Screaming Frog. While some specific Screaming Frog settings may require research, time, or experience with the program, some improvement actions are really straightforward and come pretty much out of the box. Such is the case of redirect chains.
A redirect chain occurs when there is more than one redirect between the initial URL and the destination URL.
Therefore, redirect chains occur when (after changing content on successive occasions) we have page A that redirects to page B and page B that redirects to page C.
Ideally, both page A and page B should redirect to page C—avoiding redirect chains.
Screaming Frog makes this good practice a breeze. Once the site is crawled, we simply need to export the report called ‘Redirect Chains’. Then we just have to modify those redirects, either in our CMS or in the .htaccess file.
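The fix described above—making both page A and page B point directly at page C—can be sketched as a small chain-flattening routine. This is a minimal sketch; the URL mapping below is a hypothetical A → B → C example, not output from any real tool.

```python
# Minimal sketch: collapsing redirect chains so that every source URL
# points directly at the final destination of its chain.
# The mapping below is a hypothetical example (A -> B -> C).

def collapse_chains(redirects):
    """Rewrite each redirect to point at the end of its chain."""
    flattened = {}
    for source in redirects:
        target = redirects[source]
        seen = {source}
        # Follow the chain until it ends, guarding against redirect loops.
        while target in redirects and target not in seen:
            seen.add(target)
            target = redirects[target]
        flattened[source] = target
    return flattened

redirects = {"/page-a": "/page-b", "/page-b": "/page-c"}
print(collapse_chains(redirects))  # {'/page-a': '/page-c', '/page-b': '/page-c'}
```

The flattened mapping is exactly what you would then write back into the CMS or the .htaccess rules: one hop per redirect, no chains.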
Structure Analysis
After the crawl, we already have all the material we need to carry out the next good practice: analyzing page groupings according to their type or the topic they cover. We need to understand how they relate to each other and find weaknesses.
Are the different categories of the site equally important? At what depth within the site's hierarchy do pages begin to rank badly? How many internal links does a subset of URLs need in order to get search engine traffic?
It is not always easy to have a consistent and easily crawlable structure. For example, crawling gets more complicated when we talk about sites with millions of pages, which use filters, faceted navigation and pagination.
In fact, pagination can be a serious obstacle to good search engine crawling. E-commerce sites often use this form of navigation for product listings and can be affected by pagination errors that impair the indexing of hundreds or thousands of product listing pages.
According to a test carried out by Audisto's crawling tool, the proper way to paginate should vary depending on whether all the elements of the list and all the pages of the same concept need to have the same relevance or if the business chooses to give more importance to the initial pages and items.
For example, if (as a business) we choose to prioritize initial content, we can boost those pages and purposely downplay the last pages of the pagination. We should choose a pagination that always links to the first five pages of the listings.
In this image, we can see this type of pagination. The user is on page 25, and has links to the previous page and the next one. Furthermore, the initial five pages are also linked.
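The pagination scheme just described can be sketched as a small function: always link the first five pages, plus the previous and next page relative to the current one. This is a minimal sketch under the assumptions above; the page counts are hypothetical.

```python
# Minimal sketch of the pagination scheme described above:
# always link the first five pages of the listing, plus the
# previous and next page. Page numbers are 1-based.

def pagination_links(current, total, boosted=5):
    """Return the sorted list of page numbers to link from `current`."""
    links = set(range(1, min(boosted, total) + 1))  # first five pages always linked
    if current > 1:
        links.add(current - 1)  # previous page
    if current < total:
        links.add(current + 1)  # next page
    links.discard(current)  # no self-link
    return sorted(links)

# A user on page 25 of a 40-page listing, as in the image above:
print(pagination_links(25, 40))  # [1, 2, 3, 4, 5, 24, 26]
```

Because the first five pages are linked from every page of the listing, they accumulate internal links, while the deeper pages are purposely downplayed.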
Both Screaming Frog and Audisto have options to analyze site structure and pagination depth.
We strongly recommend reviewing this article by Audisto, where you will find more information about the best internal linking strategies for pagination.
Content and Internal Linking Analysis
Once we have identified improvements in the structure, we must focus on the content itself, and on the internal links. Which pages do we want to promote and with which links?
At this point, the structure of the site is clear. We must understand which pages are the strongest in terms of organic traffic and which pages we want to promote; this will tell us which pages to boost.
A good practice is to identify the pages that get the most traffic and those that rank on the second page of Google results, then look for relationships between them: are they connected by their content, by the number of links they receive, or by the places they are linked from?
Thus, we discover possibilities for improvement by focusing directly on the pages that can get us a greater increase in search engine traffic.
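Identifying those second-page candidates can be sketched as a simple filter and sort over ranking data. This is a minimal sketch; the URLs, positions and impression counts below are hypothetical, and in practice the rows would come from a tool such as Google Search Console.

```python
# Minimal sketch: flag URLs ranking on the second page of results
# (average positions 11-20) and order them by traffic potential
# (impressions). All figures below are hypothetical.

def second_page_candidates(rows):
    """Return second-page URLs, highest traffic potential first."""
    candidates = [r for r in rows if 11 <= r["position"] <= 20]
    return sorted(candidates, key=lambda r: r["impressions"], reverse=True)

rows = [
    {"url": "/guide", "position": 12.4, "impressions": 5400},
    {"url": "/pricing", "position": 3.1, "impressions": 9000},
    {"url": "/faq", "position": 18.0, "impressions": 700},
]

for row in second_page_candidates(rows):
    print(row["url"])  # /guide first, then /faq; /pricing already ranks well
```

Pages near the top of this list are the ones where a few well-placed internal links can yield the largest traffic gain.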
DWX InLinks, in combination with DWX Crawler, is the ideal tool to automate this task and achieve the best results.
The ultimate goals are to create links between pages—from the strongest ones to those that need a boost—and to add links from thematically related pages.
It is important that our linking strategy avoids using "nofollow" links as a way to better distribute the link juice.
This attribute was used years ago for PageRank sculpting, and today it should no longer be used for that purpose. It is no longer effective because link juice leaks away: "nofollow" links still consume a share of the page's link juice, but that share is not passed on, nor is it redistributed among the rest of the links on the page.
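The leak can be made concrete with the simplified model described above: a page's link juice is split across all of its links, but nofollow links do not pass their share on. This is a minimal sketch of that folk model, purely illustrative—it is not how search engines actually compute rankings.

```python
# Minimal sketch of the simplified "link juice" model described above:
# a page's equity is split across ALL of its links, but nofollow links
# do not pass their share on, so that share is simply lost.
# The numbers are illustrative only.

def equity_passed(page_equity, followed, nofollowed):
    """Total equity actually passed on by a page's followed links."""
    total_links = followed + nofollowed
    share = page_equity / total_links  # every link consumes one share
    return share * followed            # only followed links pass theirs on

# 10 followed links pass all the equity on:
print(equity_passed(1.0, followed=10, nofollowed=0))  # 1.0
# Marking 5 of the 10 links nofollow leaks half of it:
print(equity_passed(1.0, followed=5, nofollowed=5))   # 0.5
```

This is why nofollow sculpting stopped working: the nofollowed shares vanish instead of flowing to the links you wanted to favor.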
Content Management
This next best practice for improving internal linking is related to content.
We connect contents that are related to each other with internal links. Therefore, we must understand what content we have duplicated or what content is so similar that it serves the same user intent.
We need to fix that content (using redirects or canonical URLs) so that it does not hurt us and can instead help us achieve better rankings.
In addition, thanks to the crawling and the structure analysis we can discover that it is convenient for us to create certain content that we do not have. We can also find out if we need to make modifications to be able to link different pages.
For example, this case study includes links that connect the pages that have more external links with the pages on our site that we are trying to rank.
It shows how the weight of those pages benefits the pages on our site that they link to, and how it improves their rankings for the keywords used in the anchor text.
The location of the links within the site also matters. A link in a menu does not carry the same weight as a link in the main content of the page.
It is a good internal linking practice to think about menus from another perspective. We need to understand how relevant their links are to the pages they lead to.
The more links a page contains, the more its link juice is divided. If we prioritize the categories, subcategories and products that most interest us (as a business), those links will benefit and will carry more relevance for search engines.
Screaming Frog has configuration options that allow the links to be classified and analyzed based on their location within the page.
Link Insertion
The fifth step is link insertion.
Once the crawling has been carried out, the structure has been understood and the content has been analyzed, we must proceed to insert the links on the appropriate pages. Thus, we'll be able to boost those pages that will help us improve our organic positioning.
We must also take into account the good practices that we know about links:
- A good link gives context and is helpful to the user.
- The location of the link within the page matters.
- The anchor text of the link is very relevant.
- We must not overdo it by including too many links on a single page.
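Putting these practices together, the insertion itself can be sketched as wrapping the first occurrence of a keyword in the page copy with an anchor, so the anchor text gives context and matches the target topic. This is a minimal sketch; the text and the target URL below are hypothetical, and real insertion would need to respect existing HTML markup.

```python
# Minimal sketch: inserting a contextual internal link at the first
# occurrence of a keyword, so the anchor text matches the target topic.
# The content and the URL are hypothetical examples.

def insert_link(text, keyword, target_url):
    """Wrap the first occurrence of `keyword` in an anchor tag."""
    anchor = f'<a href="{target_url}">{keyword}</a>'
    return text.replace(keyword, anchor, 1)  # only the first occurrence

text = "Our internal linking guide explains how internal linking works."
print(insert_link(text, "internal linking", "/seo/internal-linking"))
```

Linking only the first occurrence keeps the page from being flooded with repeated links, in line with the practices listed above.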
DWX makes it possible to perform the link insertion we need to improve our internal linking. It helps us obtain better rankings thanks to the ranking analysis it carries out on the pages with the most potential to grow in organic traffic.
Measurement and Iteration
Last but not least!
The last best practice for internal linking improvement includes measuring the results.
At the end of the process, we must always take a measurement. Have we got more search engine traffic to the pages that we wanted to rank and with the keywords that we were interested in?
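That final measurement can be sketched as a simple before/after comparison of organic clicks for the pages we set out to boost. This is a minimal sketch; the figures below are hypothetical, and in practice they would come from your analytics or search console data.

```python
# Minimal sketch: comparing organic clicks per target page before and
# after the link insertion, to check whether the boosted pages grew.
# All figures are hypothetical.

def traffic_delta(before, after):
    """Change in clicks per URL; pages missing from `after` count as 0."""
    return {url: after.get(url, 0) - clicks for url, clicks in before.items()}

before = {"/guide": 120, "/faq": 40}
after = {"/guide": 310, "/faq": 35}
print(traffic_delta(before, after))  # {'/guide': 190, '/faq': -5}
```

Pages with a negative delta feed directly into the next iteration: either their new links were poorly placed, or their content needs further work.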
The DWX solution measures the results of its actions and takes them into account in successive iterations, improving its performance over time.
This process can and should be repeated periodically. These good internal linking practices keep the website permanently adapted and well structured, so that it remains easily crawlable by search engines and matched to the different user intents for which our value proposition is a good solution.