# robots.txt for https://stickandfly.com/
# Prevents crawling of sensitive pages, allows everything else

User-agent: *

# Allow general pages
Allow: /

# Block directories/pages that should not be indexed
Disallow: /admin/
Disallow: /app/
Disallow: /config/
Disallow: /logs/
Disallow: /scss/
Disallow: /vendor/

# Optionally block legal pages if not relevant for SEO
Disallow: /datenschutz
Disallow: /impressum
#Disallow: /sitemap

# Link to sitemap
Sitemap: https://stickandfly.com/sitemap.xml