Best practices for dealing with large amounts of data?


I have a search bar that is filtering a large database that we have.

The search bar is there to narrow down the data, and as of right now there are about 15,000 different options.

My programmer used Select2 and there’s a ton of input lag when using the search bar. When we click on the search bar, it takes about 2-3 seconds for the dropdown box to appear. When we type into the search box, it takes 2-3 seconds for letters to appear. If we try to delete a letter, again, it’s multiple seconds before the letter is deleted. When we select an option, it takes a couple of seconds for it to actually get selected.

As of right now it’s populating these options from our database.

I’m trying to speed this up. My idea to test out was to bring all the options to the front end (since they won’t be changing too often). My thoughts are that since it’s trying to get the options from the database, the communication back and forth is what’s slowing it down.

But is it the fact that we have 15,000 options that’s slowing it down, or would moving them to the front end make the user experience less laggy/clunky?


It’s the number of items you’re loading into Select2, not the communication with the database, that is slowing you down. Select2 slows down significantly past about 1,000 items, and I recommend keeping it under 500.

The “official” solution is to use the AJAX feature to query just the data that matches the user’s search term from the database. The filtering and sorting are done on the server side. You will probably have to write a server-side API endpoint that takes the query from Select2, formats it into a database query, retrieves the result set from the database, and returns it in the format Select2 expects. See the AJAX section of the Select2 documentation for more details.
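As a rough sketch of what that endpoint’s core logic could look like (names like `formatResults`, `PAGE_SIZE`, and the `/api/options` URL are illustrative placeholders, not anything from your codebase): the server filters the full list by the search term, pages the matches, and returns them in the `{ results: [...], pagination: { more } }` shape Select2’s AJAX transport expects.

```javascript
// Illustrative page size; tune to taste. Select2 only ever needs one
// page of matches at a time, which is why this stays fast at 15,000 rows.
const PAGE_SIZE = 50;

// Filter and format options the way Select2's AJAX data source expects.
// In a real endpoint the filtering would likely be a database query
// (e.g. a LIKE / full-text search) rather than an in-memory scan.
function formatResults(allItems, term, page = 1) {
  const q = (term || "").toLowerCase();
  const matches = allItems.filter(it => it.text.toLowerCase().includes(q));
  const start = (page - 1) * PAGE_SIZE;
  const slice = matches.slice(start, start + PAGE_SIZE);
  return {
    results: slice.map(it => ({ id: it.id, text: it.text })),
    pagination: { more: start + PAGE_SIZE < matches.length }
  };
}

// Client-side wiring (runs in the browser with jQuery + Select2 loaded;
// shown commented out since it needs the DOM):
// $('#mySelect').select2({
//   minimumInputLength: 2,          // don't query until 2 chars typed
//   ajax: {
//     url: '/api/options',          // hypothetical endpoint serving formatResults
//     dataType: 'json',
//     delay: 250,                   // debounce keystrokes
//     data: params => ({ term: params.term, page: params.page || 1 })
//   }
// });
```

With this setup the dropdown never holds more than one page of options, so the typing and selection lag should disappear regardless of how large the underlying table grows.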