Talking Tech: Max on Elasticsearch
We’re very lucky to have a hugely talented Development team here at Pickr, so you’ll be hearing more from them directly in our Talking Tech series.

We’ll document our challenges, learnings and experiences as the platform evolves, all from the perspective of the people who build it! First up is Max Andreassen, who used Elasticsearch to cut Pickr’s page load times from 40s to 400ms.



“Elasticsearch is an open source search and analytics engine for all types of data. That’s quite a mouthful, but what it means is that it’s great for searching pretty much anything in record time. Used correctly, it can perform efficiently even with billions of indexed rows. If Pickr’s internal roadmap is anything to go by, it’s a capability we definitely need!

One of the reasons Elasticsearch can handle such scale is because it’s built on the free, open-source search engine library Apache Lucene. With Lucene, Elasticsearch delivers full-text indexing and searching capabilities straight out of the box. This indexing is what enables Elasticsearch to perform such fast, efficient queries. For example, if you had a document that contained the string “Hello to the world”, it could index those words in many different ways – as the full string, as individual words ([“hello”, “to”, “the”, “world”]) or as only significant words ([“hello”, “world”]).
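The three indexing strategies above roughly correspond to Elasticsearch’s keyword, standard and stop analyzers. A minimal Python sketch of the idea (the function and stopword list here are illustrative, not Elasticsearch’s actual implementation):

```python
import re

# Tiny illustrative stopword list; real analyzers ship with fuller ones.
STOPWORDS = {"to", "the", "a", "an", "of"}

def analyze(text: str) -> dict:
    """Produce the three index representations described above."""
    words = re.findall(r"[a-z]+", text.lower())
    return {
        "full_string": text,        # the whole string kept as one term
        "words": words,             # every individual (lowercased) word
        "significant": [w for w in words if w not in STOPWORDS],
    }

print(analyze("Hello to the world"))
# {'full_string': 'Hello to the world',
#  'words': ['hello', 'to', 'the', 'world'],
#  'significant': ['hello', 'world']}
```

Which representation you pick matters: the full string only matches exact lookups, while the significant-words form makes the index smaller and queries more forgiving.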

These words become keys in an index and enable text search that’s orders of magnitude more efficient than a SQL database. This increased speed does come at a bit of a cost in terms of precision, however. For discrete record retrieval, Elasticsearch is less capable than a SQL database, and it also can’t join document types the way SQL can join tables.
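The data structure behind this is an inverted index: each term maps to the set of documents containing it, so a term lookup is a single dictionary access rather than a row-by-row scan. A minimal sketch with made-up documents:

```python
from collections import defaultdict

# Map each term to the set of document ids that contain it.
index = defaultdict(set)

docs = {
    1: "hello to the world",
    2: "goodbye cruel world",
    3: "hello again",
}

# Build the inverted index once, up front.
for doc_id, text in docs.items():
    for term in text.split():
        index[term].add(doc_id)

def search(term: str) -> set:
    # One dictionary lookup, regardless of corpus size --
    # no LIKE '%...%' scan over every row as in SQL.
    return index.get(term, set())

print(search("hello"))  # {1, 3}
print(search("world"))  # {1, 2}
```

The trade-off mentioned above falls straight out of this structure: term lookups are near-instant, but there is no built-in way to join one document type against another.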

Elasticsearch is a brilliant tool in your arsenal for certain situations but it should rarely be used as your primary data store. Here at Pickr, we use a MySQL database as our source of truth and sync the relevant data over to Elasticsearch when we need to perform high speed, complex searching on it.
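The source-of-truth-plus-search-index pattern can be sketched as follows. This is a hypothetical illustration, not Pickr’s actual code: the field names are invented, and a stub class stands in for a real Elasticsearch client. The key point is that, since Elasticsearch can’t join, each row is flattened into one self-contained document before syncing:

```python
def row_to_document(row: dict) -> dict:
    """Flatten a relational row (plus any joined data) into one
    self-contained search document that carries everything a
    query will need."""
    return {
        "id": row["id"],
        "name": row["name"],
        "skills": row.get("skills", []),  # denormalised from a join table
    }

class FakeSearchIndex:
    """Stand-in for a real Elasticsearch client (illustration only)."""
    def __init__(self):
        self.docs = {}

    def index(self, doc_id, document):
        self.docs[doc_id] = document  # upsert keeps the copy in sync

def sync(rows, search_index):
    for row in rows:
        search_index.index(row["id"], row_to_document(row))

rows = [{"id": 7, "name": "Max", "skills": ["fibre", "splicing"]}]
es = FakeSearchIndex()
sync(rows, es)
print(es.docs[7]["skills"])  # ['fibre', 'splicing']
```

Because the SQL database remains the source of truth, the search index can always be dropped and rebuilt from scratch if the mapping changes.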

Our first real use case for Elasticsearch was our potential candidate review boards. These review boards allow our internal operations team to find the best candidates for a job out of 10,000s of potential contractors. Our first attempt to deliver the required functionality was built directly in SQL as one big stored procedure. This worked for a while, but as our contractor count grew, the requirements for filtering became more and more complicated and quickly spiralled out of control. Near the end of its life, our potential candidate stored procedure was taking in excess of 40s to load each review board! It meant an awful lot of time was being wasted waiting for pages to load. We needed a better solution, and we found it in Elasticsearch.

Elasticsearch took our review board page loading time down to less than 400ms and actually led to better, more contextual filtering. If you consider the business benefit and do a little adding up, you can see that it’s a life saver: at 200 review board loads per day, shaving almost 40s off each load saves every customer over 800 hours of wasted waiting per year!
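The adding up is easy to check (assuming, as a rough upper bound, a load every day of the year):

```python
# Sanity-check the time saving quoted above.
before_s, after_s = 40.0, 0.4   # per-load time, before and after
loads_per_day = 200

saved_per_day_h = loads_per_day * (before_s - after_s) / 3600
saved_per_year_h = saved_per_day_h * 365

print(round(saved_per_day_h, 1))  # 2.2 hours saved per day
print(round(saved_per_year_h))    # 803 hours saved per year
```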

Now that we’ve been using Elasticsearch for a while, we’ve started to develop conventions and best practices around the way we work with it internally. As a .NET-based back-end team, we used NEST as the basis of our integration with Elasticsearch. In building these integrations, we ensured that we followed the SOLID principles. As any good developer knows, adhering to these principles allows us to build clean integrations that will stand the test of time.

In following these principles, we have ended up with a ‘Pickr Elasticsearch’ foundation project. This allows us to build out queries, indexes and searches faster than ever before. This foundation serves us wonderfully as it now means we can create new indexes, filled with data to search on in under a day. I would highly recommend this approach to anyone repeatedly using Elasticsearch as you will benefit from the initial effort.
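The shape of such a foundation can be sketched in a few lines. Pickr’s actual project is C# built on NEST and isn’t shown in this post, so this is a hypothetical Python sketch of the idea only: a new searchable type supplies just its index name and its row-to-document mapping, and inherits everything else:

```python
from abc import ABC, abstractmethod

class IndexDefinition(ABC):
    """Shared base for all searchable types: subclasses only declare
    what the index is called and how a record becomes a document."""
    index_name: str

    @abstractmethod
    def to_document(self, record: dict) -> dict:
        ...

class CandidateIndex(IndexDefinition):
    """One concrete index: everything needed to add it in under a day."""
    index_name = "candidates"

    def to_document(self, record: dict) -> dict:
        return {"id": record["id"], "name": record["name"]}

idx = CandidateIndex()
print(idx.index_name)                              # candidates
print(idx.to_document({"id": 1, "name": "Max"}))   # {'id': 1, 'name': 'Max'}
```

The base class is where shared concerns (bulk indexing, retries, mapping creation) would live once, which is what makes each new index cheap to add.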

I’m more excited than ever to keep pushing the boundaries with Elasticsearch and to see where it takes us. If our journey here at Pickr sounds like something you want to be a part of, please reach out. If you’d like to hear me ramble on about Elasticsearch some more, feel free to drop me an email. I’m always excited to hear from anyone as enthusiastic about Elasticsearch as I am!”

Liked this article? You can also check out Max chatting about Elasticsearch in the Pickr lounge, or learn more about Pickr and the work we do here.

