Robots.txt is a text file that tells search engine crawlers which pages or sections of a website they should not crawl; it can also point crawlers to your XML sitemap. Proper use of robots.txt ensures your website “works” for search engines as well as it looks to your users.
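For reference, here is a minimal robots.txt sketch. The directives (User-agent, Disallow, Allow, Sitemap) are standard, but the specific paths and URL below are hypothetical examples, not values your plugin will generate:

```
# Apply the rules below to all crawlers
User-agent: *

# Block a hypothetical admin area from being crawled
Disallow: /wp-admin/

# Re-allow one file inside the blocked area (example path)
Allow: /wp-admin/admin-ajax.php

# Point crawlers to the XML sitemap (example URL)
Sitemap: https://www.example.com/sitemap.xml
```

Note that robots.txt controls crawling, not indexing: a page blocked here can still appear in search results if other sites link to it.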
To begin, click on Settings from the main navigation bar.
From this screen you can adjust a variety of settings for your sitemap, such as which search engines receive blog update notifications and which page and post types you would like indexed. To manage your robots.txt file, go to the “ROBOTS.TXT” section of your settings panel.
Under these settings, you can discourage search engines from indexing your website altogether by unchecking the box at the bottom of the page. When you’re done adjusting your robots.txt and sitemap settings, hit “Save Settings.”