Create a Sitemap.xml with Eleventy

Sitemaps are an underrated tool for helping search engines crawl every corner of your website. Fortunately, building an XML sitemap in Eleventy (11ty) like this one is quick and easy.

Demo on GitHub

Start by creating sitemap.njk in your source directory.

---
permalink: /sitemap.xml
eleventyExcludeFromCollections: true
---
<?xml version="1.0" encoding="utf-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
{% for page in collections.all %}
  <url>
    <loc>{{ site.url }}{{ page.url | url }}</loc>
    <lastmod>{{ page.date.toISOString() }}</lastmod>
    {% if page.data.changeFreq %}<changefreq>{{ page.data.changeFreq }}</changefreq>{% endif %}
  </url>
{% endfor %}
</urlset>
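
With the template in place, each page in `collections.all` renders as one `<url>` entry. As a sketch, the generated sitemap.xml for a homepage might look like this (the date and changefreq are illustrative values, not output from this site):

```xml
<?xml version="1.0" encoding="utf-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.belter.io/</loc>
    <lastmod>2020-05-01T00:00:00.000Z</lastmod>
    <changefreq>daily</changefreq>
  </url>
</urlset>
```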

Let's take a closer look at a few lines...

Global Data

<loc>{{ site.url }}{{ page.url | url }}</loc>

While we could always hardcode our domain, it's a nice idea to lean on 11ty's global data and pull it from there. My global JSON file is at src/_data/site.json. Now "https://www.belter.io" is prepended to each page's path.

{
"name": "Duncan McDougall",
"url": "https://www.belter.io",
...
}

changefreq

{% if page.data.changeFreq %}<changefreq>{{ page.data.changeFreq }}</changefreq>{% endif %}

Some pages change more frequently than others, and it's worth hinting to search engines that you'd like them recrawled more often. Achieve this by adding a changeFreq value to those pages' front matter data. (Wrapping the tag in a conditional means pages without a value won't emit an empty element.)

Valid values, per the sitemap protocol, are: always, hourly, daily, weekly, monthly, yearly and never.

As I've been evolving this site quite a bit recently, I've set my homepage's changefreq to daily.
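
As a sketch, the homepage's front matter (assuming it lives at src/index.njk; the title is a placeholder) would gain one line:

```yaml
---
title: Home
changeFreq: daily
---
```

Any page without this key simply won't get a changefreq hint, which is fine — the element is optional in the sitemap protocol.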

Letting the bots know

Once you have a working sitemap, your next step should be to let search engines know where to find it. The simplest way is to add a reference to your robots.txt file.

Sitemap: https://www.belter.io/sitemap.xml

User-agent: *
Disallow:

You can also point Google to it directly via Search Console.