Two tips to avoid Duplicate Content:
Robots.txt or Meta Robots WordPress Plugin
Do you use tags? Did you know they can hurt your Google PageRank? And that you can fix it?
Reading Graywolf’s blog, I was reminded to watch out for duplicate content issues in WordPress. It turns out the WordPress default doesn’t nofollow “tags”.
Because bloggers who tag posts tend to create zillions of tags, they often end up with exactly one post in many individual “/tag/” directories. This nearly always creates duplicate content, which is not a good thing.
You’ll want to fix this; it’s fairly easy. I fixed the issue by modifying my robots.txt file.
What’s a robots.txt file?
The robots.txt file is a plain text file you place in your site’s root directory. It tells robots not to crawl specific files or directories, thereby eliminating the duplicate content issue.
The robots.txt file for BigBucksBlogger now includes rules that keep robots out of the tag directories.
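If you want to do something similar, here’s a minimal sketch of the kind of rule involved. This is only an illustration, not the actual BigBucksBlogger file, and it assumes your tag archives live at /tag/ off the root of your site:

```
# Illustrative robots.txt — adjust the path to match your own permalink structure
User-agent: *
Disallow: /tag/
```

If your permalinks put tag archives somewhere else, change the Disallow path to match. The Meta Robots plugin route mentioned in the title takes a different approach, adding noindex meta tags to archive pages instead of blocking crawling outright.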
Related Posts:
- Blog Security: htaccess block
- Lucia's Linky Love for WP 2.3: Option to follow trackback immediately.
- Improve Your Better Feed: Wordpress Plugin
- Andy Beard Wants Dramatic Titles: Just Like Muhamed Saleem's.