
Google removes URL parameters tool – here’s what you need to know

Here’s what you need to know about Google removing its URL parameters tool… 👇🏻

Later this month, Google will be removing its URL parameters tool. 

Originally, the tool gave site owners granular control over how Google crawled URLs with parameters on their site. But Google’s technology has become much better at working out which parameters are useful and which are not, so the tool now adds very little value.

So, what do you need to do?

Absolutely nothing (kind of). 

Google has said: ‘Going forward you don’t need to do anything to specify the function of URL parameters on your site, Google’s crawlers will learn how to deal with URL parameters automatically.’

However, if you are looking to have some control over your site, here is what you should do: 

Use the robots.txt file to tell Google which URLs you don’t want crawled.

This may require a developer, but it’s the best way to keep control over how specific parameters on your website are handled.
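As a starting point, here’s a minimal robots.txt sketch that blocks Google from crawling URLs containing certain query parameters. The parameter names (‘sessionid’ and ‘sort’) are hypothetical examples; swap in whichever parameters your site actually uses:

  # Stop Googlebot crawling URLs that contain these example parameters
  User-agent: Googlebot
  Disallow: /*?*sessionid=
  Disallow: /*?*sort=

One thing to be aware of: robots.txt controls crawling, not indexing, so a blocked URL can still appear in search results if other pages link to it.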


Why is this important? 

Introducing index bloat and crawl budget.


What is Index Bloat? 

Index bloat is when search engines index pages on your website that aren’t useful and gain nothing from appearing in search results.

For example, if a page has duplicate or low-quality content, you don’t need it to show up in search results. Save the rankings for your good-quality content.

This is important to be aware of because, in some cases, duplicate content can cause your better-quality pages to rank lower. In a nutshell, the poor-quality pages can end up stealing organic traffic from the pages you actually want to rank.

If you do have pages that fit the ‘poor-quality’ category, you can either delete them or make sure Google doesn’t index them; a simple way to do the latter is sketched below. Previously, this is where you would’ve used the URL parameters tool.
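For reference, the standard way to keep a live page out of Google’s index is a noindex robots meta tag. A minimal sketch:

  <!-- Place inside the page's <head> to keep it out of Google's index -->
  <meta name="robots" content="noindex">

Note that Google has to be able to crawl the page to see this tag, so don’t also block the same URL in robots.txt.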


What is a crawl budget?

A crawl budget is the number of pages a search engine is willing to crawl on your site within a given period. Some websites have a handful of pages whilst others have thousands, so rather than crawling everything, search engines allocate a limited amount of crawling to each site.

If you have lots of low-quality pages, Google may spend your crawl budget on these and leave your high-quality pages without being crawled and indexed. The result? The lower-quality pages rank while the higher-quality pages miss out on the rankings they deserve.


In conclusion: 

  1. Optimise your high-quality pages to ensure they receive the best rankings possible
  2. Delete unwanted pages or use the robots.txt file (plus noindex tags) to keep them out of Google’s index
  3. Speak to an SEO expert for more advice if you get stuck

Here’s Google’s introduction to robots.txt to help you get started.

We know technical SEO can be tricky so as always, you can pick the easy route: Ask a professional for help! 
