What are zombie pages in SEO?

Zombie pages are pages that provide so little value that Google won't rank them anywhere near the top of its search results, and they attract little to no organic traffic.

Google has a system in place to help users find the best results for their query.

But for pages that are not as strong as the top results, there can be many reasons they fail to rank, and one of the easiest to overlook is a technical problem such as a robots.txt rule blocking the page.

In these cases, the pages that do manage to rank tend to be the ones with the most links, and those links often come from low-quality pages.

So, if a page is weak and isn't ranking high, Google may give it little to no ranking value.

For example, if you search for [zombie pages], you'll find plenty of pages on the internet that talk about them.

But if you audit many of those pages, you'll often find a high bounce rate and little to no organic traffic.

That isn't necessarily because the pages are bad; rather, Google treats them as pages with no link value.

And that’s why, in some cases, Google may remove these pages from the search results.

So, if you are trying to rank a page and you notice that other pages on your website rank well while this one does not, you may want to check that your robots.txt file isn't blocking it.

The important check is whether robots.txt is blocking pages you want to rank, rather than only the pages you intend to keep out of search.
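One quick way to run that check yourself is Python's built-in urllib.robotparser, which reads a live robots.txt file and answers whether a given user agent may fetch a given URL. This is only a minimal sketch: the domain and URL list below are placeholders, so swap in your own site and the pages you expect to rank.

    # Check whether specific URLs are blocked for Googlebot by robots.txt.
    # example.com and the URL list are placeholders; use your own site.
    from urllib.robotparser import RobotFileParser

    robots = RobotFileParser("https://example.com/robots.txt")
    robots.read()  # fetches and parses the live robots.txt file

    pages_to_rank = [
        "https://example.com/important-page/",
        "https://example.com/blog/zombie-pages/",
    ]

    for url in pages_to_rank:
        if robots.can_fetch("Googlebot", url):
            print(f"OK       {url}")
        else:
            print(f"BLOCKED  {url}")

Any URL reported as BLOCKED is one Googlebot is being told not to crawl, so double-check that this is actually what you intend.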

Take, as an example, the robots.txt file of a website we'll call "Wrecking Balls".

When I was reviewing that site, I noticed a URL referenced in the robots.txt file that redirected to a 404 error page.

There was also another URL referenced there with a good number of links pointing at it, and I wasn't sure whether that page was worth keeping.

So I wanted to do a quick check to see whether the 404 page was actually being blocked.

I crawled the site with the free version of Screaming Frog's SEO Spider.

The crawl showed that the URL I wanted to block wasn't blocked at all; it simply returned a 301 redirect that ended at the 404 page.
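If you'd rather confirm a redirect chain in code than in a crawler, here is a minimal sketch using the third-party requests package. The starting URL is a placeholder for whichever redirecting URL you found.

    # Follow a redirect chain and report each hop plus the final status code.
    # The starting URL is a placeholder for the redirecting page you found.
    import requests

    url = "https://example.com/old-page/"
    response = requests.get(url, allow_redirects=True, timeout=10)

    for hop in response.history:
        print(f"{hop.status_code}  {hop.url}")  # 301/302 hops along the way
    print(f"{response.status_code}  {response.url}")  # final destination

    if response.status_code == 404:
        print("This redirect ends at a 404 error page.")

A chain that ends in a 404 is exactly the kind of redirect the rest of this walkthrough is about blocking or removing.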

So, I updated the robots.txt file and added a rule to keep Googlebot away from the 404 page:

    User-agent: Googlebot
    Disallow: /404.html

Before this rule was added, the robots.txt file wasn't blocking anything, which is why the 404 page could still turn up in the search results. Keep in mind that a Disallow rule only stops crawling; on its own it doesn't guarantee the URL will drop out of Google's index.

I still didn't know whether the page at the other end of the redirect was any good, so I wanted to check it.

So, I used the free version of Screaming Frog again to check the links pointing to and from that page.

I found that the page has a lot of links pointing at it, but it also links out to a page with a high bounce rate and low organic traffic.

So it's possible these pages aren't good and won't rank well.
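Metrics like bounce rate and organic traffic come from your analytics rather than from a crawl, but you can at least pull a page's outbound links and check whether any of them are broken. Here is a minimal sketch using the third-party requests package and Python's built-in html.parser; the page URL is a placeholder for the page you're auditing.

    # List the outbound links on a page and check each one's HTTP status.
    # Quality signals like bounce rate come from analytics, not from this check.
    from html.parser import HTMLParser
    from urllib.parse import urljoin
    import requests

    PAGE = "https://example.com/page-with-links/"  # placeholder URL

    class LinkCollector(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                href = dict(attrs).get("href", "")
                # Skip mailto:, javascript:, and other non-HTTP links.
                if href.startswith(("http://", "https://", "/")):
                    self.links.append(urljoin(PAGE, href))

    parser = LinkCollector()
    parser.feed(requests.get(PAGE, timeout=10).text)

    for link in parser.links:
        try:
            status = requests.head(link, allow_redirects=True, timeout=10).status_code
        except requests.RequestException:
            status = "ERR"
        print(f"{status}  {link}")

Links that come back as 404 or ERR are the ones dragging the page down, and they are worth fixing or removing.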

But with this kind of test, you can separate the pages worth keeping from the true zombie pages.

Again: if you are trying to rank a good page and other pages on your website rank well while it does not, check that your robots.txt file isn't blocking it.

That is why it's important to check whether your robots.txt file is blocking the pages you want to rank, rather than only the pages you mean to exclude.

In some cases, it can be a good idea to add robots.txt rules for URLs that redirect to bad pages, so those URLs don't soak up crawling and ranking value.

But, it is important to check these redirects before you make any changes to your robots.txt file.

If you are not sure where a redirect ends up, or whether a page is blocked, you can check with the free version of Screaming Frog.

And this is going to help you find some pages that are not ranking as you would like.

2. Remove Redirects

When you have redirects on your site pointing to bad pages, you may want to remove those redirects (along with any robots.txt rules built around them).

Google follows redirects as it crawls, so it can tell when a page has been redirected to a bad or broken destination.

So, if you find pages on your website that should rank well but don't because they are redirected to a bad page, it's time to remove the redirect.

Google may hold these pages back for as long as the redirect points at a bad destination, so removing it can help them recover their rankings.

So, let’s see how to remove a redirect.

If your website has a lot of redirects pointing to bad pages, start by listing every redirect and checking whether any of them actually point to a good page you want to keep.

If you are not sure which redirects point to bad pages, you can check with the free version of Screaming Frog.

You can also use it to list every redirect that ends at a 404 error page, so you know exactly which redirects to remove.

And if you are not sure whether any redirects point to good pages, check all the redirects pointing at your pages, either in the crawler or with a quick script like the one below.
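If you have a list of redirecting URLs exported from your crawl, a short script can flag the ones whose chain ends at a 404. This is a sketch only: the list below is a placeholder, so load your own export instead, and it again relies on the third-party requests package.

    # Given a list of redirecting URLs, report the ones whose chain ends in a 404.
    # redirect_urls is a placeholder list; load it from your crawl export instead.
    import requests

    redirect_urls = [
        "https://example.com/old-post/",
        "https://example.com/retired-product/",
    ]

    for url in redirect_urls:
        try:
            final = requests.get(url, allow_redirects=True, timeout=10)
        except requests.RequestException as exc:
            print(f"ERROR  {url}  ({exc})")
            continue
        print(f"{final.status_code}  {url} -> {final.url}")

The URLs that end in a 404 are the redirects worth removing or repointing at a live, relevant page.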

Closing thoughts

Robots.txt files are important for keeping crawlers away from the unwanted, spammy, or broken parts of your website.

It is important to make sure your robots.txt file only blocks the pages you intend to block, so you aren't keeping crawlers away from the pages you want to rank.

And once you review your robots.txt file alongside a crawl, you can quickly see which pages are blocked on purpose and which are blocked by mistake.

And if you find good pages that are blocked or redirected away, check the rules and redirects involved and remove the ones that are holding a good page back.
