Big SEO update

Benoit Lamouche
7 min read · Dec 23, 2023

Version 1.48 of BagFinder.cc has just been deployed. This version has a strong focus on SEO (Search Engine Optimization).

As a reminder, if you wish to follow the development of BagFinder closely, I invite you to join the backstages. It's free, and I share quality content there.

SEO with Content Creation

Travel guide on the homepage

Example: Top 10 tips for traveling with kids — Travel guide and tips (bagfinder.cc)

A site's content is a major source of SEO value. For this reason, a new "Travel Guide" section has been added. It includes editorial articles, tips and tricks, travel guides, and other general information for travelers from around the world. The goal is to generate traffic through content and keywords.

Open Graph Meta

Open Graph Meta is essentially a set of instructions that you can add to a webpage to control how the page is described and displayed when it’s shared on social media platforms like Facebook, Twitter, LinkedIn, etc.

Open Graph Meta is a way of making sure that your webpage looks good and is accurately represented when it’s shared on social media. It’s like giving social media platforms a little cheat sheet about what your page contains.
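As an illustration, here is what a minimal set of Open Graph tags might look like in a page's `<head>`. The titles, descriptions, and URL paths below are hypothetical examples, not BagFinder's actual markup:

```html
<!-- Minimal Open Graph tags; values are illustrative placeholders -->
<meta property="og:title" content="Top 10 tips for traveling with kids" />
<meta property="og:description" content="A travel guide with tips for flying with children." />
<meta property="og:type" content="article" />
<meta property="og:url" content="https://bagfinder.cc/en/guide/traveling-with-kids" />
<meta property="og:image" content="https://bagfinder.cc/images/guide/kids-cover.jpg" />
```

When the page is shared, platforms like Facebook or LinkedIn read these tags to build the link preview card instead of guessing from the page body.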

301 Redirects from Old URLs

Before the switch to multiple languages, the site's URLs were already indexed. However, the move to a multilingual structure changed the URL scheme.

When you change the URL of a page on your website (for instance, updating or renaming a page), it’s important for SEO (Search Engine Optimization) purposes to create a 301 redirect from the old URL to the new one. Here’s why:

1. Maintaining Search Rankings: Search engines, like Google, rank web pages based on various factors, including the URL’s history and its backlinks (other sites linking to it). If you change a URL without a redirect, the search engine treats the new URL as a completely different page. This means all the rankings and trust the old URL had built up are lost. A 301 redirect tells search engines that the new URL is a direct replacement, transferring most of the search ranking power to the new URL.
2. Preserving User Experience: Users might have bookmarked your old URL or found it through old links on other websites. Without a redirect, they would land on a 404 error page, which is frustrating and might cause them to leave your site. A 301 redirect seamlessly takes them to the new page, preserving the user experience.
3. Consolidating Link Equity: Over time, various external links may point to your website’s pages. These links contribute to your site’s overall SEO value. A 301 redirect helps in transferring this “link equity” from the old URL to the new one, ensuring that the value of those inbound links is not lost.
4. Avoiding Content Duplication: If both the old and new URLs are live and show the same content, it can lead to duplicate content issues. Search engines might not know which version to index, and this can negatively impact your SEO. A 301 redirect clearly indicates which is the correct URL to be indexed.
5. Updating Indexing Quickly: Search engines are generally quick to understand and process 301 redirects. This means they will update their index to reflect the new URL faster, helping maintain your page’s visibility in search results.

In summary, 301 redirects are essential for maintaining SEO performance when URLs change. They help in transferring the trust and link equity from the old URL to the new one, preserve user experience, prevent duplicate content issues, and ensure that search engines quickly update their index with the correct page information.
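As a sketch, a 301 redirect can be configured at the web server level. The example below uses nginx with hypothetical old and new paths (BagFinder's actual URL structure and server setup may differ):

```nginx
# Redirect a single pre-multilingual URL to its new language-prefixed equivalent
location = /airlines {
    return 301 /en/airlines;
}

# Redirect a whole old section in one rule, preserving the rest of the path
rewrite ^/guide/(.*)$ /en/guide/$1 permanent;
```

The `301`/`permanent` status is what tells crawlers the move is definitive, so they transfer the old URL's ranking signals rather than treating it as temporary.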

Language Alternate Meta

The “language alternate” meta tag plays a crucial role in websites that have content available in multiple languages. This tag is part of the HTML code of a webpage and serves several important purposes:

  1. Improving User Experience: When a website uses language alternate tags correctly, it helps ensure that visitors are presented with the version of the site that’s in their preferred or local language. This improves the overall user experience, as users are more likely to engage with content in their own language.
  2. Enhancing SEO: Search engines like Google use these tags to understand the language and regional targeting of a webpage. When you tell a search engine that certain pages are alternate versions of the same page in different languages, it can serve the most appropriate version in its search results based on the user’s language preference or geographical location. This can increase the visibility of your site in international markets.
  3. Avoiding Duplicate Content Issues: Without language alternate tags, a search engine might treat the same content in different languages as duplicate content, which can negatively impact a site’s SEO. These tags help search engines understand that the content is not duplicate, but rather localized versions of the same content.
  4. Facilitating Content Discovery: By properly implementing these tags, you make it easier for search engines to discover all language versions of your content. This is especially important for new or less linked-to content, as it might not be discovered by search engines as quickly without these tags.
  5. Assisting in Geotargeting: For multinational and multilingual websites, language alternate tags help in geotargeting, allowing search engines to show the most relevant version of the site to users based on their location. This is crucial for businesses that cater to diverse geographical markets.

In summary, language alternate meta tags are essential for multilingual websites as they enhance user experience, improve SEO for international audiences, prevent duplicate content issues, help in content discovery, and assist in accurate geotargeting.
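In practice, these are `hreflang` annotations placed in each page's `<head>`. A minimal sketch, assuming an English and a French version (the language set and paths are illustrative):

```html
<!-- Each language version should list all alternates, including itself -->
<link rel="alternate" hreflang="en" href="https://bagfinder.cc/en/" />
<link rel="alternate" hreflang="fr" href="https://bagfinder.cc/fr/" />
<!-- x-default is the fallback for users whose language has no dedicated version -->
<link rel="alternate" hreflang="x-default" href="https://bagfinder.cc/" />
```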

Robots TXT File

The `robots.txt` file is a simple text file that’s part of a website’s structure, used to communicate with web crawlers and other web robots. It essentially acts as a set of instructions, telling these robots which parts of the website they should and shouldn’t access or index. Here’s a breakdown of its importance and functions:

  1. Controlling Crawler Access: The primary function of `robots.txt` is to tell search engine crawlers which pages or sections of the site they can crawl and index. For instance, a website might want to prevent search engines from indexing certain private or non-essential pages.
  2. Preventing Overload: By restricting crawler access to certain parts of a website, `robots.txt` can help prevent the site’s servers from being overloaded by requests. This is particularly important for large websites or those with bandwidth limitations.
  3. Securing Sensitive Data: While it’s not a foolproof security measure (and should not be relied upon for securing sensitive data), `robots.txt` can be used to discourage crawlers from accessing pages that contain sensitive information, such as user data or unpublished content.
  4. Managing Search Engine Indexing: It’s useful for managing how a site is indexed. For example, if there are duplicate pages or pages that don’t add value to search engine results (like print versions of pages), these can be excluded to help ensure that only the most relevant and valuable content is indexed.
  5. Efficient Use of Crawler Resources: By directing crawlers away from irrelevant or insignificant pages, `robots.txt` ensures that the crawlers’ limited resources are used more effectively to index important content.

However, it’s important to remember a few key points about `robots.txt`:

  • Not all robots follow the rules: While well-behaved crawlers like those of major search engines will respect `robots.txt`, not all crawlers do. This means that it should not be used to hide information.
  • It’s publicly visible: The contents of `robots.txt` are publicly accessible. Anyone can read it to see which parts of your site you don’t want crawlers to index.
  • Not for user security: It should not be used as a method to secure sensitive user information or to hide confidential parts of websites.

In essence, `robots.txt` serves as a guide for crawlers, helping to manage and optimize the indexing process of websites, but it’s not a tool for privacy or security.
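For reference, a simple `robots.txt` along these lines might look like the following (the paths and sitemap location are hypothetical, not BagFinder's actual file):

```text
# Apply to all crawlers
User-agent: *
# Keep a private area out of the index
Disallow: /admin/
# Everything else is fair game
Allow: /

# Help crawlers find every page, including new guide articles
Sitemap: https://bagfinder.cc/sitemap.xml
```

The file lives at the site root (`https://example.com/robots.txt`), which is the only location crawlers check.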

New Airlines Added

A few more airlines have been added to the database:

- Antrak Air
- Ariana Afghan Airlines
- Arik Air
- Armavia
- Aruba Airlines
- Aserca Airlines
- Astra Airlines
- Allegiant Air
- Frontier Airlines
- JetBlue
- Spirit
- Sun Country Airlines

And a few more SEO improvements…

One more reminder: if you wish to follow the development of BagFinder closely, I invite you to join the backstages. It's free, and I share quality content there.
