Setting up the Algolia service

Algolia has a free plan that allows for 10 search units (10,000 requests) every month, which gives me access to their widget library, analytics and most of the configuration options for the index.

For this little old website that's more than enough, and the tools provided are plenty powerful for all my search needs. So after signing up and landing on the dashboard, I realised the first thing I had to do was tell Algolia about my data.

What's an index?

An index is a collection of data in the form of objects in JSON format. These objects don't have any required fields, so the structure is flexible enough to match your data and searchable properties.

You can obtain this data in multiple ways, depending on your platform: database exports, API calls, or, if you have the right plan, letting the Algolia Crawler scan your site and collect the content directly. Since my site is static but generated with Eleventy, I created a file that regenerates the index for me every time the content changes, and I can upload the generated data directly into Algolia's dashboard.

{
    "title": "I build the web",
    "url": "https://ramono.me/",
    "description": "UX and Frontend Developer, Tech Lead, Web Designer and Accessibility learner based in Colorado Springs",
    "date": "2021-05-24"
}
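For anyone curious, the template that produces those records could look something like this (a minimal sketch of a hypothetical search-index.11ty.js Eleventy template; the field names match the record above):

// search-index.11ty.js — hypothetical Eleventy template that writes
// all pages into a single JSON file, ready to upload to Algolia.
module.exports = class {
    data() {
        return {
            permalink: '/search-index.json',
            eleventyExcludeFromCollections: true,
        };
    }

    render({ collections }) {
        const records = collections.all.map((page) => ({
            title: page.data.title,
            url: page.url,
            description: page.data.description,
            date: page.date.toISOString().split('T')[0],
        }));
        return JSON.stringify(records, null, 2);
    }
};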

Once the file is uploaded to Algolia, you can try the search within their interface right away, and then start fine-tuning the searchable properties by adding and ordering attributes.
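The same upload and tuning can also be scripted with Algolia's JavaScript API client if you'd rather skip the dashboard. A minimal sketch, assuming the v4 client and an admin API key (which, unlike the search-only key, should never be shipped to the browser):

// upload-index.js — hypothetical Node script using Algolia's v4 JS client.
const algoliasearch = require('algoliasearch');
const records = require('./search-index.json');

// The admin key can write to the index; keep it server-side only.
const client = algoliasearch('<API_APP_ID>', '<API_ADMIN_KEY>');
const index = client.initIndex('ramono');

async function run() {
    // Push the generated records; objectIDs are created automatically.
    await index.saveObjects(records, { autoGenerateObjectIDIfNotExist: true });

    // The same tuning you'd do in the dashboard: which attributes are searchable.
    await index.setSettings({
        searchableAttributes: ['title', 'description'],
    });
}

run().catch(console.error);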

Install on the website

I'm going with the most basic installation setup here. There are only two JS files to include: the lite version of the main search client, and InstantSearch, which provides the widgets to create the search form and results.

I'm only using the form and results widgets for now. There are many other widgets for pagination, filters, etc., but I don't have much use for them until my content grows massively.

<script src="/assets/js/min/algoliasearch-lite.umd.js"></script>
<script src="/assets/js/min/instantsearch.production.min.js"></script>
<script src="/assets/js/min/search.js"></script>

Note that in my case I downloaded the files and serve them locally to avoid having to change my CSP rules (and to avoid a DNS lookup for these two small files).
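For reference, the hosted alternative pulls the same two builds straight from a CDN, something like this (version tags are illustrative):

<script src="https://cdn.jsdelivr.net/npm/algoliasearch@4/dist/algoliasearch-lite.umd.js"></script>
<script src="https://cdn.jsdelivr.net/npm/instantsearch.js@4/dist/instantsearch.production.min.js"></script>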

In my case I created a new page for the search instead of an overlay. I may change my mind in the future, but since the site is so fast I thought it would do nicely, and this way I get to load the JavaScript files only on that page.
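The search page itself only needs two empty containers for the widgets to mount into, something like this (a minimal sketch; the IDs just have to match the ones in the script below):

<div id="search_widget"></div>
<div id="search_results"></div>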

// Search-only credentials: safe to expose in the browser
const searchClient = algoliasearch('<API_APP_ID>', '<API_SEARCH_KEY>');

const search = instantsearch({
    indexName: 'ramono',
    searchClient,
});

search.addWidgets([
    // Search input, focused when the page loads
    instantsearch.widgets.searchBox({
        container: '#search_widget',
        autofocus: true,
    }),

    // Results list, with the matched terms highlighted in each hit
    instantsearch.widgets.hits({
        container: '#search_results',
        templates: {
            item: `
                <h3>{{#helpers.highlight}}{ "attribute": "title" }{{/helpers.highlight}}</h3>
                <p>{{#helpers.highlight}}{ "attribute": "description" }{{/helpers.highlight}}</p>
                <p><a href="{{ url }}">Read more</a></p>
            `,
        },
        escapeHTML: true,
    })
]);

search.start();

The search script is quite simple as well: just configuring access, adding the widgets, and triggering the search. They provide a CSS file with basic styling, but I'd rather use my own to match the site. Their only requirement on the free plan is to include their "Powered by" logo, which I've added.
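If you'd rather not hand-roll that attribution, InstantSearch also ships a poweredBy widget that renders the logo for you. A sketch, assuming a container for it exists in the markup:

search.addWidgets([
    instantsearch.widgets.poweredBy({
        container: '#search_attribution',
    }),
]);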

The widgets take care of all the interactions, and I feel they did a great job with how people actually use search. Focus styles, keyboard navigation and the speed of the results are all great, and the relevance out of the box is pretty good, but you can influence it as you want from the admin interface.
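For example, newer posts could be nudged up with a custom ranking, reusing the admin client from the earlier sketch and assuming a numeric timestamp attribute were added to each record:

// Hypothetical: break relevance ties by recency, assuming each record
// carried a numeric date_timestamp field alongside the ISO date.
index.setSettings({
    customRanking: ['desc(date_timestamp)'],
});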

Conclusion

I found the whole process very easy and straightforward, and the result is fantastic (go try it). I love the speed, and how easy it is for me to add records and influence the results, which for a small static site is more than enough.

There are ways to automate many of these tasks, including a crawler on higher tiers and many other options, but for my needs this is great.

Definitely recommended.