Robots
Plugin to create the robots.txt file automatically
Options
- filename (string): The robots.txt file name. Default: "/robots.txt"
- allow (string | string[]): Bots explicitly allowed. Default: "*"
- disallow (string | string[]): Bots explicitly disallowed.
- rules (object[]): Additional rules for more specific configuration.
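With the default options, the generated file allows every user agent. The output should look roughly like this (a sketch of the expected content, based on the defaults above):
User-agent: *
Allow: /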
Description
This plugin creates the robots.txt file automatically, which is used to configure which search engines (and other bots, like AI data scrapers) have access to the website.
Installation
Import this plugin in your _config.ts file to use it:
import lume from "lume/mod.ts";
import robots from "lume/plugins/robots.ts";
const site = lume();
site.use(robots(/* Options */));
export default site;
Usage
The allow and disallow options accept a bot name or a list of bot names. For example:
// Explicitly allow access to Google and Bing
site.use(robots({
allow: ["Googlebot", "Bingbot"],
}));
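With only allow configured, the generated file should contain something like the following (a sketch of the expected output; note that nothing blocks other user agents):
User-agent: Googlebot
Allow: /

User-agent: Bingbot
Allow: /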
Note that this configuration only gives explicit permission to those bots, but it doesn't prevent other bots from scanning the site. If you only want to give access to these bots, add the "*" value to disallow:
// Give access only to Google and Bing
site.use(robots({
allow: ["Googlebot", "Bingbot"],
disallow: "*",
}));
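The generated file should now also include a block that disallows every other user agent, roughly:
User-agent: *
Disallow: /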
Advanced options
The rules option contains an array of rules for more specific configuration. For example:
// Deny access to the /admin folder to all user agents
site.use(robots({
rules: [
{
userAgent: "*",
disallow: "/admin",
},
],
}));
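Each rule object is written out as a block in the generated file, so this configuration should produce something like:
User-agent: *
Disallow: /admin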
More info
You can see a complete list of bots at Dark Visitors.