sunspot_index_queue
Document not found error with Mongoid
Query processing runs in a loop with a 30-second timeout. It works fine until some records of the Mongoid model are deleted. After that, every query fails and the data is never removed from the index.
Document not found for class MongoDoc with id(s) 4f47a487d6194c6ac1000001, 4f47a488d6194c6ac1000008, 4f47a488d6194c6ac100000b, 4f47a488d6194c6ac1000010, 4f47a488d6194c6ac1000013, 4f47a489d6194c6ac1000018, 4f47a48ad6194c6ac100003c, 4f47a489d6194c6ac100001b, 4f47a489d6194c6ac1000024, 4f47a489d6194c6ac1000029, 4f47a489d6194c6ac100002e, 4f47a489d6194c6ac1000036, 4f47a48ad6194c6ac100003b, 4f47a48ad6194c6a
How is it that the models cannot be deleted?
The problem is in the Sunspot::Mongoid::DataAccessor#load_all(ids) method. Ultimately it loads the records like this:
def load_all(ids)
  criteria(ids)
end

private

def criteria(id)
  @clazz.criteria.find(id)
end
But find raises an error when any of the requested records is missing. This is reproducible as follows:
- Stop queue processing.
- Create a few MongoDB docs (so they get queued for indexing).
- Delete these docs.
- Try to process the queue.
By the way, ActiveRecord handles this case more gracefully and doesn't raise an error, so the problem lies entirely in the Mongoid adapter.
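The difference between the two lookup styles can be sketched in plain Ruby without a database. FakeStore, where_in, and the DocumentNotFound class below are stand-ins for illustration, not real Mongoid API; the point is only the contract: a find over ids raises when any id is missing, while a where(:_id.in => ids)-style query silently returns just the documents that still exist.

```ruby
# Stand-in for Mongoid::Errors::DocumentNotFound (hypothetical, for the sketch).
class DocumentNotFound < StandardError; end

# A tiny in-memory stand-in for a document collection.
class FakeStore
  def initialize(docs)
    @docs = docs # { id => doc }
  end

  # find-style lookup: raises if any requested id is missing.
  def find(ids)
    missing = ids.reject { |id| @docs.key?(id) }
    raise DocumentNotFound, "id(s) #{missing.join(', ')}" unless missing.empty?
    ids.map { |id| @docs[id] }
  end

  # where(:_id.in => ids)-style lookup: skips missing ids instead of raising.
  def where_in(ids)
    ids.select { |id| @docs.key?(id) }.map { |id| @docs[id] }
  end
end

store = FakeStore.new('a' => 'doc-a', 'b' => 'doc-b')

# The tolerant lookup just drops the deleted id.
puts store.where_in(%w[a b c]).inspect

# The strict lookup blows up on the deleted id, which is what
# kills the whole queue run.
begin
  store.find(%w[a b c])
rescue DocumentNotFound => e
  puts "raised: #{e.message}"
end
```

With the strict behaviour, one deleted document poisons the entire batch of queued ids; with the tolerant one, the queue processor simply indexes whatever still exists.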
I solved the issue like this:
module Sunspot
  module Mongoid
    class DataAccessor
      def load_all(ids)
        @clazz.where(:_id.in => ids).to_a
      end
    end
  end
end