The Robots feature tells search engine crawlers (spiders) which URLs they can access on your site and which they cannot. It is mainly used to avoid overloading your site with requests; it is not a mechanism for keeping a web page out of Google.

You can find this under Site > Site Settings > Robots.

Enter your robots.txt file content into this area and click Update.
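As a starting point, a typical robots.txt might look like the sketch below. The paths and sitemap URL are illustrative assumptions; replace them with values that match your own site.

```
# Illustrative example only — adjust paths to your site
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```

`User-agent: *` applies the rules to all crawlers, `Disallow` blocks crawling of the listed path, and the `Sitemap` line points crawlers to your sitemap.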
