# As a condition of accessing this website, you agree to abide by the following
# content signals:
#
# (a) If a Content-Signal = yes, you may collect content for the corresponding
#     use.
# (b) If a Content-Signal = no, you may not collect content for the
#     corresponding use.
# (c) If the website operator does not include a Content-Signal for a
#     corresponding use, the website operator neither grants nor restricts
#     permission via Content-Signal with respect to the corresponding use.
#
# The content signals and their meanings are:
#
# search: building a search index and providing search results (e.g., returning
# hyperlinks and short excerpts from your website's contents). Search does not
# include providing AI-generated search summaries.
#
# ai-input: inputting content into one or more AI models (e.g., retrieval
# augmented generation, grounding, or other real-time taking of content for
# generative AI search answers).
#
# ai-train: training or fine-tuning AI models.
#
# ANY RESTRICTIONS EXPRESSED VIA CONTENT SIGNALS ARE EXPRESS RESERVATIONS OF
# RIGHTS UNDER ARTICLE 4 OF THE EUROPEAN UNION DIRECTIVE 2019/790 ON COPYRIGHT
# AND RELATED RIGHTS IN THE DIGITAL SINGLE MARKET.

# BEGIN Cloudflare Managed content

User-agent: *
Content-Signal: search=yes,ai-train=no
Allow: /

User-agent: Amazonbot
Disallow: /

User-agent: Applebot-Extended
Disallow: /

User-agent: Bytespider
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: GPTBot
Disallow: /

User-agent: meta-externalagent
Disallow: /

# END Cloudflare Managed Content
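# Illustrative example only (not an active rule for this site; "ExampleBot" is
# a hypothetical crawler): a group combining the content signals described
# above could look like:
#
#   User-agent: ExampleBot
#   Content-Signal: search=yes,ai-input=no,ai-train=no
#   Allow: /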
#
# robots.txt
#
# This file is to prevent the crawling and indexing of certain parts
# of your site by web crawlers and spiders run by sites like Yahoo!
# and Google. By telling these "robots" where not to go on your site,
# you save bandwidth and server resources.
#
# This file will be ignored unless it is at the root of your host:
# Used:    http://example.com/robots.txt
# Ignored: http://example.com/site/robots.txt
#
# For more information about the robots.txt standard, see:
# http://www.robotstxt.org/robotstxt.html
#
# For syntax checking, see:
# http://www.frobee.com/robots-txt-check

User-agent: Googlebot
User-agent: AdsBot-Google
User-agent: Googlebot-Image
User-agent: Googlebot-Mobile
User-agent: Mediapartners-Google
User-agent: bingbot
User-agent: msnbot
User-agent: msnbot-media
User-agent: adidxbot
User-agent: slurp
User-agent: yandex
User-agent: baiduspider
User-agent: baiduspider-image
User-agent: baiduspider-mobile
User-agent: YandexBot
User-agent: Twitterbot
User-agent: facebookexternalhit/1.1
User-agent: facebookexternalhit/1.1 (+http://www.facebook.com/externalhit_uatext.php)
Allow: /

Sitemap: https://www.kixify.com/sitemap.xml

User-agent: *
Disallow: /
Crawl-delay: 10

# Directories
Disallow: /includes/
Disallow: /misc/
Disallow: /modules/
Disallow: /profiles/
Disallow: /scripts/
Disallow: /themes/

# Files
Disallow: /CHANGELOG.txt
Disallow: /cron.php
Disallow: /INSTALL.mysql.txt
Disallow: /INSTALL.pgsql.txt
Disallow: /install.php
Disallow: /INSTALL.txt
Disallow: /LICENSE.txt
Disallow: /MAINTAINERS.txt
Disallow: /update.php
Disallow: /UPGRADE.txt
Disallow: /xmlrpc.php

# Paths (clean URLs)
Disallow: /admin/
Disallow: /comment/reply/
Disallow: /filter/tips/
Disallow: /logout/
Disallow: /node/add/
Disallow: /node/*/edit
Disallow: /user/register/
Disallow: /user/password/
Disallow: /user/login/
Disallow: /follow/*
Disallow: /flag/*
Disallow: /api/*
Disallow: /app/*
Disallow: /feedbacks/*
Disallow: /paypalapi/*
Disallow: /mobile/*
Disallow: /paypal_rest/*
Disallow: /add/*

# Paths (no clean URLs)
Disallow: /?q=admin/
Disallow: /?q=comment/reply/
Disallow: /?q=filter/tips/
Disallow: /?q=logout/
Disallow: /?q=node/add/
Disallow: /?q=node/*/edit
Disallow: /?q=user/password/
Disallow: /?q=user/register/
Disallow: /?q=user/login/
Disallow: /?q=follow/*
Disallow: /?q=feedbacks/*
Disallow: /?q=paypal_rest/*
Disallow: /?q=paypalapi/*
Disallow: /?q=mobile/*

# User-agent: *
# Disallow: /

# User-agent: YandexBot
# User-agent: SemrushBot
# User-agent: AhrefsBot
# User-agent: Mail.RU_Bot
# User-agent: Mail.Ru
# User-agent: dotbot
# User-agent: rogerbot
# Disallow: /
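# Illustrative matching examples only (assuming the wildcard handling used by
# major crawlers, where "*" matches any character sequence and paths are
# otherwise matched by prefix):
#   Disallow: /node/*/edit   would block /node/123/edit
#   Disallow: /?q=admin/     would block /?q=admin/settings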