5 REASONS WHY YOUR INDEXED PAGES ARE GOING DOWN

When you create a website, you also want it to rank high on the Google SERP. Ranking is only possible once you submit your website to Google and it gets indexed.

How can you see how many pages Google has indexed?

It’s really simple.

Use the site: operator
Check the status of your XML sitemap submissions in GSC (Google Search Console)
Check your overall indexation status

All of the methods mentioned above give a slightly different number, but that is another story. For the sitemap method, a quick way to get your own baseline count is sketched below.
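
To know what number GSC should roughly report, you can count how many URLs your XML sitemap actually lists. Here is a minimal sketch in Python, assuming the requests library is installed; the sitemap URL is a hypothetical placeholder for your own.

import requests
import xml.etree.ElementTree as ET

# Hypothetical sitemap location; adjust to wherever your sitemap lives.
sitemap_url = "https://www.example.com/sitemap.xml"

root = ET.fromstring(requests.get(sitemap_url, timeout=10).content)

# Standard sitemap namespace; each <url><loc> entry is one submitted URL.
namespace = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
urls = [loc.text for loc in root.findall("sm:url/sm:loc", namespace)]

print(f"{len(urls)} URLs listed in the sitemap")
# Note: a sitemap index file lists child sitemaps instead, and each of those
# would need to be fetched and counted the same way.

Compare this count with the indexed count GSC shows for the same sitemap; a large gap is the first hint that something is wrong.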

Now let’s discuss why the number of pages Google has indexed might be dropping.

If your pages are not being indexed by Google, it can be a sign that Google does not like your pages or cannot crawl them easily. Some common reasons why your indexed pages go down are listed below.

Your website is penalized.
Your pages are irrelevant according to Google.
Google is not able to crawl your pages.

In this article, I am going to discuss how to diagnose and fix the issues behind a drop in the number of indexed pages.

1. PAGE SPEED

Are the pages loading properly? Make sure every page of your website returns a proper 200 HTTP header status. Has your server had frequent or long periods of downtime? Did the domain recently expire and only get renewed after a delay?

SOLUTION:

You can use a free header status checking tool to determine whether all your pages return the proper status. For larger websites, you can use Xenu, DeepCrawl, Screaming Frog, or Botify to run the check. The proper header status is 200. Sometimes 3xx (other than 301), 4xx, or 5xx statuses appear; none of these are good news for your website, and they can lead to de-indexing. You can also spot-check status codes with a short script, as sketched below.
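
Here is a minimal sketch of such a spot check in Python, assuming the requests library is installed; the URLs are hypothetical placeholders for pages on your own site.

import requests

urls = [
    "https://www.example.com/",
    "https://www.example.com/blog/",
    "https://www.example.com/old-page/",
]

for url in urls:
    try:
        # HEAD keeps the request light; allow_redirects=False reports 3xx codes as-is.
        response = requests.head(url, allow_redirects=False, timeout=10)
    except requests.RequestException as error:
        print(f"{url} -> request failed: {error}")
        continue

    status = response.status_code
    if status == 200:
        print(f"{url} -> 200 OK")
    else:
        print(f"{url} -> {status} (anything other than 200 deserves a closer look)")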

2. CHANGES TO YOUR URLS

Did your URLs change recently? If you have changed your URLs, you need to resubmit them. A search engine may still remember the old URLs, but if those pages have redirection issues, a lot of pages can become de-indexed.

SOLUTION:

Hopefully, a copy of the old site is still available. Take note of all the old URLs and set up a 301 redirect from each one to its corresponding new URL. You can then verify the redirects in bulk, as sketched below.
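
Here is a minimal redirect verifier in Python, assuming the requests library is installed; the old/new URL pairs are hypothetical placeholders for your own mapping.

import requests

# Hypothetical old -> new URL pairs; replace with your own mapping.
redirect_map = {
    "https://www.example.com/old-about/": "https://www.example.com/about/",
    "https://www.example.com/old-services/": "https://www.example.com/services/",
}

for old_url, expected_target in redirect_map.items():
    try:
        response = requests.get(old_url, allow_redirects=False, timeout=10)
    except requests.RequestException as error:
        print(f"{old_url} -> request failed: {error}")
        continue

    status = response.status_code
    location = response.headers.get("Location", "")
    if status == 301 and location == expected_target:
        print(f"{old_url} -> 301 to {location} (OK)")
    else:
        print(f"{old_url} -> {status} {location or '(no Location header)'} (check this one)")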

3. DUPLICATE CONTENT ISSUE

Did you recently fix duplicate content issues? Fixing duplicate content involves implementing canonical tags, 301 redirects, noindex meta tags, or disallow rules in robots.txt, and all of these actions can result in a decrease in indexed URLs.
This is one case where a drop in the number of indexed URLs is something to welcome.

SOLUTION:

This is good news for your website, but you should still cross-check that these fixes, and nothing else, are the reason your indexed URLs are dropping. A quick way to confirm that a page is being de-indexed deliberately is sketched below.
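
Here is a rough check in Python for the deliberate de-indexing signals mentioned above (robots.txt, noindex, canonical), assuming the requests library is installed; the URLs are hypothetical placeholders, and the HTML checks are crude string matches rather than a full parser.

import requests
from urllib.robotparser import RobotFileParser

# Hypothetical URL; replace with a page that dropped out of the index.
url = "https://www.example.com/some-page/"

# 1. Is the URL blocked by robots.txt?
robots = RobotFileParser("https://www.example.com/robots.txt")
robots.read()
print("Allowed by robots.txt:", robots.can_fetch("Googlebot", url))

# 2. Crude checks for a noindex directive and a canonical link in the HTML.
html = requests.get(url, timeout=10).text.lower()
print("Mentions a robots noindex directive:", 'name="robots"' in html and "noindex" in html)
print("Declares a canonical link:", 'rel="canonical"' in html)

If any of these come back positive for pages you expected to stay indexed, the "fix" has gone too far.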

4. ARE YOUR PAGES TIMING OUT?

Some servers have bandwidth restrictions because of the associated costs; if you need more bandwidth, you have to upgrade your hosting plan.
In the case of hardware-related problems, you may need to upgrade your server's processing power or memory.

SOLUTION:

If you are running into server bandwidth limits, this is the right time to upgrade your hosting. A quick way to spot pages that time out is sketched below.
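
Here is a minimal timeout probe in Python, assuming the requests library is installed; the URLs and the five-second threshold are assumptions you should adjust to your own site.

import time
import requests

# Hypothetical URLs; crawlers tend to give up on pages that respond very slowly.
urls = [
    "https://www.example.com/",
    "https://www.example.com/heavy-page/",
]

for url in urls:
    start = time.time()
    try:
        requests.get(url, timeout=5)  # treat anything slower than ~5 seconds as suspect
        print(f"{url} -> responded in {time.time() - start:.1f}s")
    except requests.Timeout:
        print(f"{url} -> timed out")
    except requests.RequestException as error:
        print(f"{url} -> request failed: {error}")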

5. DO SEARCH ENGINE BOTS SEE YOUR WEBSITE DIFFERENTLY?

Many times the search engine spider sees your website differently than you do.
Some developers build the website the way they prefer without knowing the SEO implications, or rely on an out-of-the-box platform without checking whether it is SEO friendly.
In other cases, the website has been hacked. Hackers create hidden pages that are shown only to Google to promote their hidden links, or cloak redirects to their own website.
In one of the worst situations, web pages get infected with malware, and Google automatically de-indexes them.

SOLUTION:

Google Search Console’s Fetch and Render feature is the best way to confirm whether Googlebot is seeing the page the same way you are. As a rough self-check, you can also compare the page served to a Googlebot user agent with the page served to a regular browser, as sketched below.
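
Here is a minimal sketch of that comparison in Python, assuming the requests library is installed; the URL is a hypothetical placeholder, and this is only a rough check, not a replacement for Google Search Console's own rendering tools.

import requests

# Hypothetical URL; replace with a page you suspect is being cloaked.
url = "https://www.example.com/"

browser_headers = {"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"}
googlebot_headers = {
    "User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
}

browser_html = requests.get(url, headers=browser_headers, timeout=10).text
googlebot_html = requests.get(url, headers=googlebot_headers, timeout=10).text

print("Browser response length:  ", len(browser_html))
print("Googlebot response length:", len(googlebot_html))
if browser_html != googlebot_html:
    # Small differences can be harmless (timestamps, nonces); large ones deserve a manual diff.
    print("Responses differ - inspect them for hidden links or cloaked redirects.")
else:
    print("Responses are identical for both user agents.")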