September 5, 2019

How To Exclude WordPress Content From Google Search?

There are times when you may need to exclude WordPress content from showing up in your Google results. Is there a way that you can do this?

There is: indexing. The word once referred mainly to the index at the back of a book, but with the emergence of Google it has taken on a completely different meaning.

What is Google indexing?

Each search engine defines indexing slightly differently. For Google, indexing refers to the process of adding web pages to its searchable database. These pages can contain documents, images, or videos.

In short, for your web page to appear on Google, it needs to be stored in Google's index.

How does Google build this index? Through spiders or crawlers, unappealing names for something rather wonderful. These programs crawl through websites, following links and deciding which pieces of information to store and which to ignore.

Websites need to be indexed in order for us to find a particular piece of content. In the digital age, it would be impossible to navigate through billions of websites ourselves.

Indexing has become an essential part of how search engines work, and Google, in particular, does a fantastic job at it. It identifies specific phrases and expressions, helping searchers find the relevant piece of information.

For website owners, indexing also plays a crucial part in marketing. Ranking well on Google ultimately means that you'll get more visitors and, in turn, more sales.

With the use of keywords, sites can be seen and discovered by more people. Although SEO and Google's algorithm are often updated, indexing will always play an essential part in the process. This is why it is essential to integrate your keywords correctly into the content you publish. There are now many services that can assist you with that, such as Trust My Paper, BestEssay.Education, and Grab My Essay; Grammarly is also a great tool to enhance your readability, which matters for SEO as well.

How does this relate to excluding WordPress pages?

You might be wondering how all of this is relevant. Indexing also plays a crucial part in blocking individual pages.

One of the main reasons why people seek to stop content from being indexed is security.

In the early 2000s, hackers found their way to credit card information from websites using simple search queries.

There have also been rumors that simple exploits of Google search could expose the confidential and private information of individuals and businesses.

Obviously, hacks do happen online. We're constantly reminded to stay safe and keep all information incredibly secure.

In short, if you want to exclude WordPress content from search results, you need to dive into indexing.

Below are the best ways that you can hide a WordPress page from Google, without affecting the SEO of your site.

Use Robots.txt

Essentially, robots.txt is a file located at the root of your site. It provides Google (and any other search engine) with instructions on what they may crawl and what they shouldn't.

Robots.txt is predominately used to control crawling traffic. However, it can also prevent various images from appearing in Google search results.

A robots.txt file, on a standard WordPress site, would look like this:

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
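The same pattern works for hiding an individual page rather than a whole directory. The `/private-page/` path below is a hypothetical slug, standing in for whichever page you want crawlers to skip:

```
User-agent: *
Disallow: /private-page/
```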

With this, you can also keep bots away from specific digital files, such as MP4s, JPEGs, and PDFs, which helps keep private material out of search results.

For PDF files, that would look something like this:

User-agent: *
Disallow: /pdfs/ # Block the /pdfs/ directory.
Disallow: /*.pdf$ # Block PDF files from all bots.

The above examples only exclude your content from being indexed; the files themselves are still accessible to anybody who knows where and how to look.

If you want to make it so that your files are entirely private, and nobody can access them, you will need to find another method. This can be achieved with content restriction plugins.

Is Robots.txt a good option?

For the most part, robots.txt is a great, easy option that anybody can implement. It adds a level of security to your site and will help exclude some pages.

However, this is not a reliable way to protect confidential or sensitive files. A robots.txt file can only instruct well-behaved crawlers; some bots will simply ignore its instructions.

Not to mention, if the content or pages that you block are linked from other sources, search engines can still find them.

Use no-index Meta Tag

Arguably, this is a much more reliable option for blocking search indexing. In comparison to robots.txt, this method is placed in the <head> section of your website, arguably the most crucial part. It's a simple HTML tag that looks like the following:

<meta name="robots" content="noindex">

Any page with this instruction placed in the header will not appear in Google search results. You can also build on this and instruct web crawlers not to follow links or offer translations.
You can also target specific crawlers individually, which requires a different HTML tag for each, based on your preferences.
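For instance, to address Google's crawler alone you can name it in the tag, and directives can be combined; these are standard robots meta values:

```html
<!-- noindex for Google's crawler only -->
<meta name="googlebot" content="noindex">

<!-- noindex for all crawlers, and don't follow links on the page -->
<meta name="robots" content="noindex, nofollow">
```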
To add this code to your website, you need to edit your WordPress theme. In functions.php, you can add a function that inserts a noindex meta tag, or anything else that you want, into the page header.
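A minimal sketch of that functions.php approach, using WordPress's standard wp_head hook; the 'private-page' slug is a hypothetical example, so substitute the slug of the page you want hidden:

```php
<?php
// Add to your (child) theme's functions.php.
// Prints a noindex meta tag in the <head> of one specific page.
function my_noindex_meta_tag() {
    // 'private-page' is a placeholder slug -- use your own page's slug.
    if ( is_page( 'private-page' ) ) {
        echo '<meta name="robots" content="noindex">' . "\n";
    }
}
add_action( 'wp_head', 'my_noindex_meta_tag' );
```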
If this isn’t a reliable option for you, there’s also the SEO plugin route.
"You can control your page's visibility with Yoast SEO, which will ultimately change the meta tags for you. This is many people's favorite option. It's easy to do and is much more useful than robots.txt," says Jeremy Fell, an SEO specialist at Studicus.

Use page authentication

The above options don't solve all your problems, though. While they may prevent your private documents and sensitive data from appearing in Google search results, any user with a link can still access them.
Many people make the mistake of thinking that they've taken the appropriate steps, but fail to complete the last one: authentication.
For security purposes, it’s essential that you set up page authentication with a username and password.
You should also consider role-based access permissions.

For pages containing personal profiles, staff details, or sensitive data, you don't want anonymous users to have access, regardless of who they may be. It's not worth the risk.
By instructing crawlers, you prevent your WordPress pages from being indexed. And if someone manages to find your pages anyway, they will be asked for credentials before they can view any content.
The good news is, this is really easy to do with WordPress. You simply have to set the visibility of the post or pages to “password-protected.”
From here, you can create a password, and your site becomes much more secure. You can also consider adding WordPress plugins for extra security.
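If you prefer to set this programmatically rather than through the editor, WordPress exposes the same setting through wp_update_post(). A minimal sketch; the post ID and password below are placeholder examples:

```php
<?php
// Password-protect an existing post or page programmatically.
// 123 is a placeholder post ID -- replace it with your own.
wp_update_post( array(
    'ID'            => 123,
    'post_password' => 'choose-a-strong-password',
) );
```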
That being said, for comprehensive protection of sensitive documents, you should consider hiring a professional security service.


The goal for the majority of websites is to rank high. However, many business owners and bloggers alike fail to consider what search engines can see. With the methods above, you can keep sensitive WordPress content out of Google's index while keeping the rest of your site's SEO intact.
