Does the sitemap changefreq "never" instruction avoid a duplicate content penalty from robots?
Hello,
I want search engines to start seeing the content in a new subdirectory that was previously hidden from their view. Is the following approach OK? First, remove the instruction from the robots.txt file that blocks access to /en/. Second, add this new path to the sitemap.xml file. Third, set the change frequency to "never" for the duplicated files in the sitemap. Also, I do not intend to block the older files with duplicate content via robots.txt, because I want them to stay in Google's index and because I want to keep the older version of my site fully intact. Is there anything missing from the above?
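To make the plan concrete, here is roughly what one new entry in sitemap.xml might look like for a page under /en/, with changefreq set to "never"; the domain example.com and the page name are placeholders, not my real URLs. The robots.txt side of the change would simply be deleting the existing line that blocks the directory, typically something like "Disallow: /en/" under the relevant User-agent block.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Newly unblocked page under /en/ (placeholder URL for illustration) -->
  <url>
    <loc>https://example.com/en/sample-page.html</loc>
    <!-- Hint to crawlers that this duplicated page is not expected to change -->
    <changefreq>never</changefreq>
  </url>
</urlset>
```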
Very good post, thanks for sharing.
Thanks for the info.