Any tool that we use for content writing and publishing, linking, or any other part of an SEO campaign for our website or application can only be as good as our understanding of it. If we do not understand a tool in its totality, then we are only using it at the most basic level. One example of a tool that needs thorough understanding is Google Search Console and its Google Index section.
Browsers now favor HTTPS over HTTP for URLs, so it is necessary for us to know the web tools that can help us adjust our websites accordingly. Google Search Console is one of the best tools available to SEOs. This is the third part of the series, which means the first part, about Search Appearance, and the second article, about Search Traffic, have already been published. Let’s get started.
The Google Index tab of Google Search Console can help SEOs in various ways. Whether you are investigating indexing issues, finding out if CSS is blocked, or simply removing URLs, the Google Index section is definitely a gold mine.
The Index Status section gives you, the user, a report on the URLs of your website that Google has indexed over a specific period of time, usually the past year.
One use of the Index Status feature is to find out whether your website suffers from index bloat. You can do this by comparing the data you receive from the Index Status report in Google Search Console with the data from Google Analytics.
What you’ll do when comparing them is:
Look at the number of pages in the Index Status data → Match it with the number of landing pages that receive organic traffic in your Google Analytics data

If they do not match, it probably means that only a small number of your indexed pages are receiving organic traffic.
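The comparison above can be sketched in a few lines of Python. The URL lists here are hypothetical; in practice you would export the indexed URLs from the Index Status report and the landing pages from a Google Analytics organic-traffic report.

```python
# URLs Google reports as indexed (hypothetical export from Index Status)
indexed_pages = {
    "/", "/about", "/blog/post-1", "/blog/post-2",
    "/blog/post-1?sort=asc", "/blog/post-1?sort=desc",
}

# Landing pages that receive organic traffic (hypothetical GA export)
organic_landing_pages = {"/", "/about", "/blog/post-1"}

# Indexed pages that never receive organic traffic are index-bloat candidates
bloat_candidates = indexed_pages - organic_landing_pages

print(f"{len(indexed_pages)} indexed, "
      f"{len(organic_landing_pages)} receiving organic traffic")
print(sorted(bloat_candidates))
```

If the first number is much larger than the second, the leftover set is where to start looking for pages that should not be indexed.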
To identify index bloat, here’s what you should do:
Go to Index Status → Google Index → Head to Google and perform a site:[website URL] search → Inspect each page shown in the search results to detect a pattern in the pages’ parameters → Find out if there are indexed pages that should not be indexed → Add the noindex tag to those pages → Disallow them in the robots.txt file
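The pattern-spotting step above can be rough-sketched in Python: count how often each query-string parameter appears across the indexed URLs, since recurring parameters (sort orders, filters, session IDs) often mark bloat. The URL list is made up for illustration.

```python
from collections import Counter
from urllib.parse import urlparse, parse_qs

# Hypothetical URLs collected from a site: search
indexed_urls = [
    "https://example.com/shop/shoes",
    "https://example.com/shop/shoes?color=red",
    "https://example.com/shop/shoes?color=blue",
    "https://example.com/shop/shirts?color=red&size=m",
    "https://example.com/blog/guide",
]

# Count how often each parameter name appears across the indexed URLs
param_counts = Counter(
    name
    for url in indexed_urls
    for name in parse_qs(urlparse(url).query)
)

# Parameters that recur are index-bloat suspects
print(param_counts.most_common())  # → [('color', 3), ('size', 1)]
```

A parameter that shows up on many indexed URLs is a candidate for a noindex tag or a robots.txt rule.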
Unblocking URLs is simple, and here’s how to do it:
Remove the URLs from your robots.txt’s disallow section → Use Google Search Console’s robots.txt tester tool to see if your updated robots.txt file works → Inspect your pages to make sure that there are no instances of noindex or nofollow tags → Input the URLs into Fetch As Google tool to see if they are being properly rendered
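As a hypothetical illustration of the first step, unblocking a /blog/ section would mean deleting its Disallow line while leaving the rest of the file intact (the paths here are made up):

```text
# robots.txt — before
User-agent: *
Disallow: /blog/
Disallow: /private/

# robots.txt — after unblocking /blog/
User-agent: *
Disallow: /private/
```

After saving the change, run the updated file through the robots.txt tester before moving on to the remaining steps.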
Using the Remove URLs feature of Google Search Console can be complicated, but it can be exceptionally useful for SEOs. Additionally, it is common for SEOs to find that a website they are handling has thin or duplicate content.
I’ve written an article about permanent solutions to duplicate or similar content, but you can use the Remove URLs feature to temporarily hide such pages from Google. Just add the URL to the tool in Google Search Console. By doing this, the URL will be temporarily removed for 90 days, and the request takes a day or two to process.
What I really like about this feature is that I can use it to clean up my content before a Panda update comes in and to remove URLs that have case-sensitivity issues.
So, before you remove pages through Google Search Console, you should:
- Add noindex meta tags to each and every page
- Insert the rel=canonical tag to each page
- Disallow them in the robots.txt file
- Submit them to the Remove URLs feature
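The first two items on the checklist live in the page’s head. A hypothetical example for a duplicate sort-order page, with the canonical pointing at the clean URL (the paths are made up):

```html
<!-- <head> of the duplicate page at /blog/post-1?sort=asc -->
<head>
  <meta name="robots" content="noindex">
  <link rel="canonical" href="https://example.com/blog/post-1">
</head>
```

The robots.txt disallow and the Remove URLs submission are then handled in Google Search Console itself.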
I have said it before, and I will say it again: knowing and understanding Google Search Console (GSC) is a must for every SEO professional. It can help you with each and every aspect of a website. From content to the technical side, Google Search Console has it all.