Using Robots.Txt to Hide Azure Blob Storage Data from Search Engines

Hello,

What if you want to make an Azure Blob Storage container public so that you can quickly share data with someone, but you don't really want the rest of the world to discover that it's there? A simple approach is to serve a robots.txt file from the root of the storage account's blob endpoint (Azure's special $root container, since crawlers only honor a robots.txt served from the root of a host) with the following contents:

User-agent: *
Disallow: /
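
As a rough sketch of how the file could be uploaded, here's one way to do it with the azure-storage-blob Python SDK. The connection string is a placeholder, and this assumes the $root container has already been created on the account:

# Sketch: upload robots.txt to the $root container so it is served from
# https://<account>.blob.core.windows.net/robots.txt (placeholder values).
from azure.storage.blob import BlobServiceClient, ContentSettings

ROBOTS_TXT = "User-agent: *\nDisallow: /\n"

# Placeholder connection string -- substitute your own.
service = BlobServiceClient.from_connection_string("<your-connection-string>")

# The special $root container maps to the root of the blob endpoint,
# which is the only location crawlers check for robots.txt.
blob = service.get_blob_client(container="$root", blob="robots.txt")
blob.upload_blob(
    ROBOTS_TXT,
    overwrite=True,
    content_settings=ContentSettings(content_type="text/plain"),
)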

A robots.txt like this asks search engines not to crawl the storage account, which reduces the risk of the data being surfaced to unwanted parties through search results. Keep in mind that robots.txt only deters well-behaved crawlers; it does not stop anyone who already has the URL from downloading the data directly.
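
To confirm the file really is being served from the root of the endpoint, a quick check along these lines works (the account URL is a placeholder):

import requests

# Expect a 200 response containing the User-agent / Disallow rules above.
resp = requests.get("https://<your-account>.blob.core.windows.net/robots.txt")
print(resp.status_code)
print(resp.text)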

[Image: Azure blob storage robots.txt search engine]
