# robots.txt to prevent search engine crawlers
User-agent: *
Disallow: /cis/
Disallow: /core/
Disallow: /dox/
Disallow: /e3/
Disallow: /ecube/
Disallow: /ecube/OLDecube/
Disallow: /roots/
Disallow: /signin/
Disallow: /todos/
Disallow: /googlehostedservice.html
Disallow: /research/scecnumber/
Disallow: /workshops/travel/