
  • Non-selective query against large object type (more than 100000 rows)

    Posted by Naman on March 23, 2016 at 11:29 am

    Non-selective query against large object type (more than 100000 rows). Consider an indexed filter or contact salesforce.com about custom indexing.
    Even if a field is indexed a filter might still not be selective when:
    1. The filter value includes null (for instance binding with a list that contains null)
    2. Data skew exists whereby the number of matching rows is very large (for instance, filtering for a particular foreign key value that occurs many times)

    The above error occurs in production, where we have approximately 138,000 records. We have a couple of ways to try to rectify this, such as optimizing the query, adding null checks, or making the field an external ID or unique. Somehow the number of matching rows exceeds the selectivity threshold, which is why the query is not selective. Any help would be appreciated.
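    A common first step for the two causes listed in the error message is to drop nulls before binding a collection and to filter on an indexed field. A minimal sketch (the Opportunity/Account relationship here is illustrative, not from the original post):

    ```apex
    // Collect parent Ids, skipping nulls: binding a list that contains null
    // is one of the documented reasons an indexed filter stops being selective.
    List<Id> accountIds = new List<Id>();
    for (Opportunity opp : Trigger.new) {
        if (opp.AccountId != null) {
            accountIds.add(opp.AccountId);
        }
    }

    // Filtering on Id (always indexed) keeps the query selective.
    List<Account> accounts = [SELECT Id, Name FROM Account WHERE Id IN :accountIds];
    ```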

    Naman replied 8 years, 9 months ago 1 Member · 2 Replies
  • 2 Replies
  • Naman

    Member
    March 28, 2016 at 7:26 am

    I have been in touch with Salesforce support and also created a case regarding this. When a trigger gives an error like the one above and you have tried almost everything to resolve it, it means your trigger is exceeding the CPU time limit, so an alternative solution could be a batch class that processes the records.
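    The batch-class alternative mentioned above can be sketched with the standard `Database.Batchable` interface. This is only an outline; the object and field names are placeholders, not taken from the original case:

    ```apex
    // Minimal batch sketch: each execute() chunk runs with its own governor
    // limits (including CPU time), which is why batching can sidestep the
    // limit a synchronous trigger hits.
    global class AccountProcessingBatch implements Database.Batchable<SObject> {
        global Database.QueryLocator start(Database.BatchableContext bc) {
            return Database.getQueryLocator('SELECT Id, Name FROM Account');
        }
        global void execute(Database.BatchableContext bc, List<Account> scope) {
            for (Account acc : scope) {
                // placeholder for the per-record work the trigger was doing
            }
            update scope;
        }
        global void finish(Database.BatchableContext bc) {}
    }
    // Invoked, e.g. from Anonymous Apex, with a chunk size of 200:
    // Database.executeBatch(new AccountProcessingBatch(), 200);
    ```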

  • Naman

    Member
    April 4, 2016 at 10:21 am

    The above was the first solution Salesforce support gave. Even after using a batch class, the trigger was not able to perform the operations, so I contacted Salesforce support again, and they finally enabled custom indexing on some fields (in my case, Account fields) that I had to use in my WHERE clause. This made my query run successfully, and the "Non-selective query" error message no longer appears.
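    Once support enables a custom index, the query itself usually needs no change beyond filtering on the newly indexed field. A hypothetical example (the field name `Region__c` is illustrative, as the post does not name the indexed fields):

    ```apex
    // With a custom index on Region__c, an equality filter on it lets the
    // query optimizer treat the query as selective even on a large object.
    List<Account> accts = [
        SELECT Id, Name
        FROM Account
        WHERE Region__c = 'EMEA'
    ];
    ```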

