    disallow irrelevant pages by default in robots · 73a144b7adab
    Ben Bodenmiller authored
    Update the default robots.txt rules to disallow irrelevant pages that
    search engines should not care about. Important pages such as files,
    commit details, merge requests, issues, and comments will still be
    crawled.
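    A minimal sketch of the kind of rules such a change might contain. The
    paths below are illustrative assumptions, not the actual paths touched
    by this commit:

        # Illustrative example only; the commit's real rules may differ.
        User-Agent: *
        Disallow: /admin      # administrative pages are irrelevant to search
        Disallow: /search     # search result pages add no value to crawlers
        Disallow: /*/edit     # edit forms should not be indexed
        Allow: /              # everything else remains crawlable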