How to hide blog pages from search engine crawlers to avoid duplicate content issues?

Some of your posts may be included in multiple categories and have several tags. As a result, different category and tag pages can contain the same content, which search engines may flag as duplicate. Hide these pages from indexing to avoid this problem.

To hide certain pages from search engines (for the default Blog Pro configuration), modify the robots.txt file on your server by adding the following lines. Note that Disallow directives only take effect inside a User-agent group, so include one if your robots.txt does not already have it:

User-agent: *

Disallow: /blog/tag/

Disallow: /blog/category/

In general, take the part of the URL that comes after your domain name and place it in a Disallow rule between slashes. For example, to hide example.com/category/, add:

Disallow: /category/

This way you can hide any particular page.
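Before deploying, you can verify that your rules block the intended pages. A minimal sketch using Python's standard-library urllib.robotparser, which evaluates Disallow directives the same way compliant crawlers do (the example.com domain and the sample post/tag paths are placeholders, assuming the default Blog Pro URLs above):

```python
from urllib.robotparser import RobotFileParser

# The rules we plan to add to robots.txt (User-agent group included,
# since Disallow directives only apply inside one).
rules = """
User-agent: *
Disallow: /blog/tag/
Disallow: /blog/category/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Tag and category pages should be blocked for all crawlers...
print(parser.can_fetch("*", "https://example.com/blog/tag/seo/"))        # False
print(parser.can_fetch("*", "https://example.com/blog/category/news/"))  # False

# ...while individual blog posts remain crawlable.
print(parser.can_fetch("*", "https://example.com/blog/my-post/"))        # True
```

If can_fetch returns False for the tag and category URLs but True for regular posts, the rules behave as intended and the posts themselves stay indexable.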

See more details on the Blog Pro page.