# PlugRobots
An Elixir plug for serving `robots.txt`.
## Installation
Add `plug_robots` to your list of dependencies in `mix.exs`:
```elixir
def deps do
  [{:plug_robots, "~> 0.1.0"}]
end
```

## Usage
- Add it to your endpoint before the router:

```elixir
# endpoint.ex
plug PlugRobots, rules: :allow_all
plug MyApp.Router
```

## Examples
- Deny all crawlers:

```elixir
plug PlugRobots, rules: :deny_all
plug MyApp.Router
```

- Return different rules by environment:
```elixir
case Mix.env() do
  :prod -> plug PlugRobots, rules: :allow_all
  _ -> plug PlugRobots, rules: :deny_all
end

plug MyApp.Router
```

- Return custom rules:
```elixir
@googlebot_rule """
User-agent: GoogleBot
Disallow: /images/dogs.jpg
"""

plug PlugRobots, rules: @googlebot_rule
plug MyApp.Router
```

## Options
`rules` accepts one of:

- `:allow_all`: allow all crawlers on all paths
- `:deny_all`: disallow all crawlers on all paths
- any string: serve it verbatim as custom rules
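
For reference, the two built-in rules presumably map to the conventional `robots.txt` directives shown below. This is a sketch based on standard robots.txt conventions, not verified against the library's actual output:

```
# :allow_all — an empty Disallow permits crawling everywhere
User-agent: *
Disallow:

# :deny_all — "/" blocks the entire site
User-agent: *
Disallow: /
```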