This is an SEO question. I have a domain, xyz.com, and a subdomain, 123.xyz.com.
I am listing a lot of affiliate products on the subdomain, building a book library. However, to avoid duplicate-content errors, I went to the robots.txt of the subdomain 123.xyz.com and added a disallow rule so that the subdomain is not crawled or indexed by Google at all, and hence no duplicate-content / thin-content errors.
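For reference, a minimal sketch of what the subdomain's robots.txt currently does (assuming a blanket disallow for all crawlers, since I haven't pasted my exact file):

    # robots.txt served at http://123.xyz.com/robots.txt
    # Block all crawlers from every path on the subdomain
    User-agent: *
    Disallow: /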
Should I also disallow 123.xyz.com in the robots.txt of the main domain (that is, in the robots.txt of xyz.com)? In other words, I would be adding a rule disallowing 123.xyz.com/ to xyz.com's robots.txt.
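This is roughly the rule I have in mind for the main domain's file (just a sketch of my proposal; I'm not sure whether referencing a subdomain like this in a Disallow line is even valid, which is part of what I'm asking):

    # robots.txt served at http://xyz.com/robots.txt
    User-agent: *
    # Proposed rule: block the subdomain from the main domain's robots.txt
    Disallow: 123.xyz.com/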