This is a lightweight package that adds a /robots.txt endpoint at the root of an Umbraco website. The package is configured using appSettings.
Command Line
dotnet add package Our.Umbraco.Blend.RobotsTxt
Or NuGet
Install-Package Our.Umbraco.Blend.RobotsTxt
In Startup.cs
there is a configuration you need to add for the /robots.txt
path to render.
In app.UseUmbraco()
, under .WithEndpoints(u =>
, add:
u.EndpointRouteBuilder.MapControllers();
This will use the /robots.txt
route declared in the controller.
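Put together, the registration above can be sketched as a minimal Startup.Configure, assuming Umbraco 9+ and the default project template (the surrounding middleware and endpoint calls are the template's defaults, not part of this package):

```csharp
// Startup.cs — a minimal sketch, not the package's exact code.
public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
    app.UseUmbraco()
        .WithMiddleware(u =>
        {
            u.UseBackOffice();
            u.UseWebsite();
        })
        .WithEndpoints(u =>
        {
            // Maps attribute-routed controllers, including the
            // package's controller that serves /robots.txt.
            u.EndpointRouteBuilder.MapControllers();

            u.UseInstallerEndpoints();
            u.UseBackOfficeEndpoints();
            u.UseWebsiteEndpoints();
        });
}
```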
If there are not any configurations in the appSettings.json file and no environment is found the default robots.txt will be:
User-agent: *
Allow: /
Disallow: /umbraco
If an environment is found and its name is Production
, the above will be rendered. For all other environments the output will be:
User-agent: *
Disallow: /
The /umbraco
path is the global Umbraco path set in appSettings. If it is set to a different path, the rendered rule will update to match.
In the root of your appSettings.json
you can configure custom settings. You can also use appSettings.[Environment].json
to have specific settings for each environment.
"Robots": [
{
"UserAgent": "*",
"Allow": [ "/" ],
"Disallow": [ "/umbraco" ],
"Sitemap": "/sitemap.xml"
}
]
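For example, an environment-specific file could loosen or tighten the rules per environment. A hypothetical appSettings.Production.json (the file name follows the ASP.NET Core environment name) might look like:

```json
{
  "Robots": [
    {
      "UserAgent": "*",
      "Allow": [ "/" ],
      "Disallow": [ "/umbraco" ],
      "Sitemap": "/sitemap.xml"
    }
  ]
}
```

A matching appSettings.Development.json could instead set "Disallow": [ "/" ] to keep crawlers out of non-production sites.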
Robots
is an array of objects, configured as needed for your use case.
UserAgent
is an optional string. If left blank, *
will be used.
Allow
is an optional string array of paths to allow.
Disallow
is an optional string array of paths to disallow.
Sitemap
is an optional string. If left blank, no Sitemap line will be included.
If Allow
and Disallow
are both empty, Allow: /
will be set.
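Given the sample configuration above, the rendered /robots.txt would look roughly like this (assuming the usual rule ordering; the package's exact output formatting may differ slightly):

```
User-agent: *
Allow: /
Disallow: /umbraco
Sitemap: /sitemap.xml
```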