Choose whether to allow all bots or block them entirely. Custom rules are controlled by the Allow/Disallow lists below.
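For reference, the two global settings map to these robots.txt outputs (a sketch; the exact file your site serves may differ):

```
# Allow all bots
User-agent: *
Disallow:

# Block all bots
User-agent: *
Disallow: /
```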
Use Disallow to list the paths crawlers should not access. Leave it empty to allow all paths.
Use Allow to explicitly permit paths that might otherwise be blocked by broader Disallow rules.
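A minimal sketch combining both lists (the paths are hypothetical examples, not defaults):

```
User-agent: *
Disallow: /admin/
Disallow: /private/
# Allow overrides the broader Disallow above
Allow: /private/press-kit/
```

Because the Allow rule is more specific than the matching Disallow, crawlers that honor Allow (such as Googlebot) will still fetch /private/press-kit/.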
Set a dedicated policy for each of the common search engine crawlers listed below: Default follows the global rules, Disallow blocks the crawler completely, and Allow fully allows it (see the example after the table).
| Crawler | User-agent |
| --- | --- |
| Google Web | Googlebot |
| Google Image | Googlebot-Image |
| Google News | Googlebot-News |
| Bing Web | Bingbot |
| Bing Preview | BingPreview |
| Baidu | Baiduspider |
| Yandex | YandexBot |
| DuckDuckGo | DuckDuckBot |
| Sogou | Sogou web spider |
| 360 Search | 360Spider |
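As an illustration, setting one crawler's policy to Disallow and another's to Allow might produce per-agent rules like these (the crawler choices are arbitrary examples):

```
# Disallow policy: block this crawler completely
User-agent: Baiduspider
Disallow: /

# Allow policy: fully allow this crawler
User-agent: Googlebot
Allow: /
```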
Not all search engines support the Crawl-delay directive, but it can help reduce server load.
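Assuming the delay setting emits a standard Crawl-delay rule, the output would look like this (the 10-second value is illustrative):

```
User-agent: *
Crawl-delay: 10
```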
Add a fully qualified sitemap URL to help crawlers discover your content.
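For example, a fully qualified Sitemap directive (the URL is a placeholder):

```
Sitemap: https://www.example.com/sitemap.xml
```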