Plugin to create the robots.txt file automatically

Options

filename (string): the robots.txt file name.
allow (string[] | string): user agents explicitly allowed.
disallow (string[] | string): user agents explicitly disallowed.
rules (object[]): an array of rules for more specific configurations.
This plugin creates the robots.txt file automatically, which configures which search engines (and other bots, like AI data scrapers) have access to the website.


Import this plugin in your _config.ts file to use it:

import lume from "lume/mod.ts";
import robots from "lume/plugins/robots.ts";

const site = lume();

site.use(robots(/* Options */));

export default site;


The plugin accepts a list of allowed and disallowed bots (a single user agent name or an array of them). For example:

// Explicitly allow access to Google and Bing
site.use(robots({
  allow: ["Googlebot", "Bingbot"],
}));

Note that this configuration only gives explicit permission to those bots; it doesn't prevent other bots from scanning the site. If you want to give access only to these bots, add the * value to disallow:

// Give access only to Google and Bing
site.use(robots({
  allow: ["Googlebot", "Bingbot"],
  disallow: "*",
}));
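With that configuration, the generated robots.txt could look roughly like this (the exact output depends on the plugin version; this is only meant to illustrate the semantics):

```txt
User-agent: Googlebot
Allow: /

User-agent: Bingbot
Allow: /

User-agent: *
Disallow: /
```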

Advanced options

The rules option contains an array of rules for more specific configurations. For example:

// Deny access to the /admin folder for all user agents
site.use(robots({
  rules: [
    {
      userAgent: "*",
      disallow: "/admin",
    },
  ],
}));
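For reference, a rule like the one above maps directly to the standard robots.txt directives:

```txt
User-agent: *
Disallow: /admin
```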

More info

You can see a complete list of bots at Dark Visitors.
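For instance, to block some well-known AI data scrapers, you could disallow their user agents by name (GPTBot is OpenAI's crawler and CCBot is Common Crawl's; this is an illustrative selection, not an exhaustive list):

```ts
// Block a couple of known AI data scrapers (illustrative, not exhaustive)
site.use(robots({
  disallow: ["GPTBot", "CCBot"],
}));
```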