How can I fix slow select2 performance with 1000+ options in a combobox?

I am using select2 to create a searchable combobox. It works fine with fewer options, but performance degrades once the select has more than 1000 options. Is there a way to resolve this?

First, I would suggest redesigning your application so it doesn't require 1000+ items in your combobox at one time. That said, I've only seen a slight performance degradation with data sets of around 1000 items; it's certainly not unusable, even when doing extra work like highlighting the search term in matched items. For example, see my CodePen, which holds 2000 data items and seems pretty speedy to me (at least on my laptop; it would probably be slower on a mobile device).
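For reference, a client-side setup like that CodePen can be sketched as follows. The `buildItems` helper and the item names are illustrative, but the `{id, text}` object shape and the `data` option are what select2 expects for local arrays:

```javascript
// Build a large local data set in the {id, text} shape select2 expects.
// The helper name and item labels here are hypothetical.
function buildItems(count) {
  const items = [];
  for (let i = 1; i <= count; i++) {
    items.push({ id: i, text: 'Item ' + i });
  }
  return items;
}

// In the browser you would then pass the array straight to select2:
// $('.searchSelect2').select2({ data: buildItems(2000) });
```

Even at 2000 items, building the array is cheap; the cost that matters is select2 rendering and filtering the dropdown.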

I suppose if your data elements themselves are large or complicated, that could affect performance. Again, I would suggest redesigning your application so you don’t need to hold so many data items in memory at one time.

It seems like you're getting your results from a database. In that case, I recommend using AJAX and passing in the delay parameter, so select2 fetches only the matching rows instead of loading everything up front. The example below serves data from an Oracle database.

HTML:

<select class="searchSelect2" name="item_id" id="item_id" required></select>

JavaScript:

$(document).ready(function () {
  $('.searchSelect2').select2({
    placeholder: '---Select an Item---',
    ajax: {
      url: 'company_data.php',
      type: 'GET',
      dataType: 'json',
      delay: 250,
      data: function (params) {
        return {
          q: params.term
        };
      },
      cache: true
    }
  });
});
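Two optional tweaks, not in my original snippet, can cut request volume further: `minimumInputLength`, so select2 only queries once the user has typed a couple of characters, and `processResults`, which is only needed if your endpoint does not already return the `{results: [...]}` shape select2 expects. A sketch, with the remapped payload key being hypothetical:

```javascript
$('.searchSelect2').select2({
  placeholder: '---Select an Item---',
  minimumInputLength: 2, // don't hit the server until 2+ characters are typed
  ajax: {
    url: 'company_data.php',
    dataType: 'json',
    delay: 250,
    data: function (params) {
      return { q: params.term };
    },
    // Only needed if the endpoint returns, say, {items: [...]} instead of
    // {results: [...]}; this remaps a hypothetical payload into shape.
    processResults: function (data) {
      return { results: data.items };
    },
    cache: true
  }
});
```

With the PHP script below, which already returns `{results: [...]}`, you can leave `processResults` out entirely.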

Script file (company_data.php):

<?php
// $connection is assumed to be an established oci_connect() handle.
header('Content-Type: application/json');

// Uppercase both sides so the match is case-insensitive.
$search_param = '%' . strtoupper($_GET['q'] ?? '') . '%';
$strQuery = 'SELECT item_id, item_name FROM tbl_items
             WHERE UPPER(item_name) LIKE :search_bv
             ORDER BY item_name';

$result = oci_parse($connection, $strQuery);
oci_bind_by_name($result, ':search_bv', $search_param);
oci_execute($result);

// Build the {results: [{id, text}, ...]} shape select2 expects.
$json = [];
while ($row = oci_fetch_array($result, OCI_ASSOC)) {
    $json[] = ['id' => intval($row['ITEM_ID']), 'text' => $row['ITEM_NAME']];
}
oci_free_statement($result);

echo json_encode(['results' => $json]);
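For clarity, the filtering the query performs amounts to a case-insensitive substring match over the item names. In JavaScript terms (the function name is illustrative) it behaves like:

```javascript
// Equivalent of the server-side LIKE '%...%' filter: keep items whose
// text contains the search term, ignoring case.
function filterItems(items, term) {
  const needle = String(term).toUpperCase();
  return items.filter(item => item.text.toUpperCase().includes(needle));
}
```

The database just does this at scale, with the `ORDER BY` sorting the matches before they are sent back.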

This allowed me to easily manage a dataset of about 20,000 records. When the user types some text, the request fires 250 milliseconds after they stop typing and filters the results based on their input. If they amend the search term, it waits again and, once they're done typing, serves up the new result set.