Add Search to an Eleventy website with Elasticlunr

I recently found out about Elasticlunr, a lightweight full-text search engine written in JavaScript. It got me thinking about how it could be integrated into an Eleventy site for client-side search. As it happens, it's pretty easy.

I have a collection of 100 markdown documents, each about a different movie. Their frontmatter data has a title, excerpt, and a list of genres. This is what we'll add to the index.

---
title: "Parasite"
excerpt: "All unemployed, Ki-taek's family takes peculiar interest in the wealthy and glamorous Parks for their livelihood until they get entangled in an unexpect..."
genres:
- Comedy
- Drama
- Thriller
---

Building an index

We're going to create an 11ty filter. We'll use it to convert our movies collection into an Elasticlunr index, which will be output as JSON at the URL /search-index.json.

src/filters/searchFilter.js

const elasticlunr = require("elasticlunr");

module.exports = function (collection) {
  // the fields our index will consist of
  const index = elasticlunr(function () {
    this.addField("title");
    this.addField("excerpt");
    this.addField("genres");
    this.setRef("id");
  });

  // loop through each page and add it to the index
  collection.forEach(page => {
    index.addDoc({
      id: page.url,
      title: page.template.frontMatter.data.title,
      excerpt: page.template.frontMatter.data.excerpt,
      genres: page.template.frontMatter.data.genres,
    });
  });

  return index.toJSON();
};
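To make the mapping in that filter concrete, here's a standalone sketch with a mock page object. The `template.frontMatter.data` shape mirrors what the filter above reads from 11ty collection items; the movie data is just the example from earlier.

```javascript
// Mock of the shape the filter reads from each 11ty collection item.
const pages = [
  {
    url: "/movies/parasite/",
    template: {
      frontMatter: {
        data: {
          title: "Parasite",
          excerpt: "All unemployed, Ki-taek's family takes peculiar interest...",
          genres: ["Comedy", "Drama", "Thriller"],
        },
      },
    },
  },
];

// The same transformation the filter performs before handing docs to elasticlunr:
// flatten each page into a document keyed by its URL.
const docs = pages.map(page => ({
  id: page.url,
  title: page.template.frontMatter.data.title,
  excerpt: page.template.frontMatter.data.excerpt,
  genres: page.template.frontMatter.data.genres,
}));

console.log(docs[0].id); // "/movies/parasite/"
```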

.eleventy.js

const searchFilter = require("./src/filters/searchFilter");

module.exports = function (config) {
  ...
  config.addFilter("search", searchFilter);
  config.addCollection("movies", collection => {
    return [...collection.getFilteredByGlob("./src/movies/**/*.md")];
  });
  ...
};

search-index.json.njk

---
permalink: /search-index.json
---

{{ collections.movies | search | dump | safe }}

Search Page

Now that we have an index, let's use it on a page with just HTML and a little JavaScript. The code below instructs the browser to fetch search-index.json, load it into Elasticlunr, search it on each keystroke, and add the results to an <ol> element.

<div class="field">
  <label for="searchField">Search</label>
  <input type="search" placeholder="Search..." id="searchField" />
</div>
<ol id="searchResults"></ol>
<!-- Only 5.7kb GZipped. You may want to bundle this with your application code. -->
<script src="https://cdnjs.cloudflare.com/ajax/libs/elasticlunr/0.9.6/elasticlunr.min.js"></script>
<script>
  (function (window, document) {
    "use strict";

    const search = e => {
      const results = window.searchIndex.search(e.target.value);

      const resEl = document.getElementById("searchResults");
      resEl.innerHTML = "";
      if (results && results.length > 0) {
        results.forEach(r => {
          const { id, title, excerpt } = r.doc;

          // this is where you'll go wild with your own HTML injection.
          const el = document.createElement("li");
          resEl.appendChild(el);

          const h3 = document.createElement("h3");
          el.appendChild(h3);

          const a = document.createElement("a");
          a.setAttribute("href", id);
          a.textContent = title;
          h3.appendChild(a);

          const p = document.createElement("p");
          p.textContent = excerpt;
          el.appendChild(p);
        });
      }
    };

    fetch("/search-index.json").then(response =>
      response.json().then(rawIndex => {
        window.searchIndex = elasticlunr.Index.load(rawIndex);
        document.getElementById("searchField").addEventListener("input", search);
      })
    );
  })(window, document);
</script>

Final notes

Keep in mind the number of pages in your collection and how much content you're indexing. Since this is client-side search, the browser downloads the entire index. My demo site of 100 pages averages ~150 indexed words per page, and the index is 26.5kb GZipped.

Once you've got search up and running, it's worth checking out the Elasticlunr docs and having a play with things like query-time boosting, boolean logic, and token expansion.
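As a taste of those options, here's roughly what a tuned query looks like, based on Elasticlunr's documented search configuration. The stub below stands in for the index loaded on the page (and its result shape is illustrative only) so the sketch runs standalone; the options object is the real point.

```javascript
// Stub standing in for the elasticlunr index loaded on the page,
// so this sketch runs without the library. It ignores its arguments
// and returns one canned, illustrative result.
const searchIndex = {
  search: (query, options) => [{ ref: "/movies/parasite/", score: 1.0 }],
};

const results = searchIndex.search("comedy thriller", {
  fields: {
    title: { boost: 2 },   // matches in the title count double
    excerpt: { boost: 1 },
    genres: { boost: 1 },
  },
  bool: "AND",  // every query token must match, not just one
  expand: true, // token expansion: "thrill" also matches "thriller"
});

console.log(results.length); // 1
```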

View the full source code on GitHub