feat: index numerical and date fields in Solr with appropriate types + more targeted search result highlighting
What this PR does / why we need it:
Currently, all fields regardless of type are indexed in Solr as English text (text_en). With this PR, numerical and date fields are indexed in Solr with appropriate types:
| Field type defined in TSV | Field type indexed in Solr |
|---|---|
| int | plong |
| float | pdouble |
| date | date_range (solr.DateRangeField) |
I chose to index dates as DateRangeField because it can represent dates at any precision, e.g. a day (YYYY-MM-DD), a month (YYYY-MM) or a year (YYYY). See: Date Formatting and Date Math :: Apache Solr Reference Guide
This matches the allowed formats in a date field as defined by Dataverse.
This means that range queries are now possible on numerical and date fields, e.g. exampleIntegerField:[25 TO 50] or exampleDateField:[2000-11-01 TO 2014-12-01].
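For example, the same range queries can also be run through the Search API. A minimal sketch, assuming a local installation at localhost:8080 and the example field names from the sample metadata block:

```shell
# Range query on an integer field
curl -sG "http://localhost:8080/api/search" \
  --data-urlencode 'q=exampleIntegerField:[25 TO 50]'

# Range query on a date field (endpoints may use any supported precision)
curl -sG "http://localhost:8080/api/search" \
  --data-urlencode 'q=exampleDateField:[2000-11-01 TO 2014-12-01]'
```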
Which issue(s) this PR closes:
This PR implements range queries as discussed in #370 (that issue was already closed).
It is also related to #8813 and https://github.com/IQSS/dataverse-frontend/issues/278 (the range queries that are now possible lay the groundwork for a nicer search facet UI).
Special notes for your reviewer:
For testing, I've created a sample TSV containing all relevant fields here.
Suggestions on how to test this:
- Load the sample TSV and update + reload the Solr schema as described in the docs (example commands are sketched after this list)
- In the UI:
- Activate metadata block
- Activate facets for all three fields
- Create dataset with values in all three fields
- Run test range queries via the search bar, e.g. exampleIntegerField:[25 TO 50] or exampleDateField:[2000-11-01 TO 2014-12-01]
- Check that facets are working correctly
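For the first step, a minimal sketch of the commands in the Docker dev environment (the TSV filename is a placeholder; the schema update one-liner is the same one quoted further down in this thread):

```shell
# Load the sample metadata block (TSV filename is a placeholder)
curl http://localhost:8080/api/admin/datasetfield/load -X POST \
  -H "Content-type: text/tab-separated-values" \
  --upload-file solr-field-types.tsv

# Update the Solr schema with the new fields, then reload/restart Solr
curl http://localhost:8080/api/admin/index/solr/schema | \
  docker run -i --rm -v ./docker-dev-volumes/solr/data:/var/solr \
  gdcc/configbaker:unstable update-fields.sh /var/solr/data/collection1/conf/schema.xml
```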
Does this PR introduce a user interface change? If mockups are available, please link/include them here:
Facets still look the same as before. There is only a small change in the highlighting of search results; see my comment below.
Is there a release notes update needed for this change?:
Yes, there should be a short text describing the new feature plus instructions for how to activate it:
- the Solr schema.xml needs to be updated
- all datasets need to be reindexed (see the command sketch after this list)
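For the reindexing step, a sketch of the commands from the Admin Guide ("Solr Search Index" section): clear the index, then kick off an async in-place reindex.

```shell
# Clear the existing search index
curl http://localhost:8080/api/admin/index/clear

# Reindex all datasets asynchronously
curl http://localhost:8080/api/admin/index
```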
Additional documentation:
/
Additionally, I've set hl.requireFieldMatch to true:
If false, all query terms will be highlighted for each field to be highlighted (hl.fl) no matter what fields the parsed query refer to. If set to true, only query terms aligning with the field being highlighted will in turn be highlighted.
https://solr.apache.org/guide/solr/latest/query-guide/highlighting.html
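For illustration, the effect of the parameter can be reproduced with a direct Solr query (a sketch; assumes a local Solr on port 8983 with the collection1 core, and the field name is just an example):

```shell
# With hl.requireFieldMatch=true, only terms that match in dsDescription itself are highlighted
curl -sG "http://localhost:8983/solr/collection1/select" \
  --data-urlencode 'q=dsDescription:replication' \
  --data-urlencode 'hl=true' \
  --data-urlencode 'hl.fl=*' \
  --data-urlencode 'hl.requireFieldMatch=true'
```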
Two reasons:
- Querying Solr with a date range query with highlighting activated, using the default (unified) highlighter without requireFieldMatch, triggers a 500 error in Solr (see my post on the Solr mailing list; my guess is that Solr attempts to highlight the matched date range within fields in a nonsensical way, which triggers the error)
- I think this improves the highlighting of search results. Previously, a match of my search term was highlighted anywhere, even if I limited my query to a specific field. E.g. here "replication" is also highlighted in the title even though I limited my search specifically to the description:
With this change, the highlighting is limited to specific fields if the query is:
@qqmyers Would it make sense to include this feature in the next release? It's a rather small adaptation that improves the search experience.
@johannes-darms it's a very cool feature that adds a lot of value, something I've wanted for years.
Let's see what @cmbz and @scolapasta think.
One question - is it possible to enter or have legacy values that don't fit the new types that would break indexing?
2024/10/15: Added to sprint ready after conversation with @pdurbin
@qqmyers good question. When I tried to enter invalid data (a non-integer in an integer field, a non-float in a float field, or a non-date in a date field), I got the following errors via the UI:
...and via the API:
{"status":"ERROR","message":"Validation Failed: Example integer field is not a valid integer. (Invalid value:edu.harvard.iq.dataverse.DatasetFieldValueValue[ id=null ]), Example floating point field is not a valid number. (Invalid value:edu.harvard.iq.dataverse.DatasetFieldValueValue[ id=null ]), Example date field is not a valid date. \"yyyy\" is a supported format. (Invalid value:edu.harvard.iq.dataverse.DatasetFieldValueValue[ id=null ]).java.util.stream.ReferencePipeline$3@6e9605fb"}
The validator code seems to be quite old, so I don't know if there could be any installations with legacy invalid values entered before it was added.
However, looking at the validator code, I found that date fields allow some formats which are not documented: YYYY followed by AD or BC, a "Bracket format" (I'm not familiar with this), and datetime formats (yyyy-MM-dd'T'HH:mm:ss, yyyy-MM-dd'T'HH:mm:ss.SSS and yyyy-MM-dd HH:mm:ss).
When I add a dataset using one of those formats and try to index it, the indexing fails. The Dataverse log shows an error like dev_solr> org.apache.solr.common.SolrException: ERROR: [doc=dataset_2_draft] Error adding field 'exampleDateField'='2024-09-01 12:34:56' msg=Couldn't parse date because: Improperly formatted datetime: 2024-09-01 12:34:56 and the dataset is missing from the UI.
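The parsing behaviour can be reproduced directly against Solr, outside of Dataverse. A sketch, assuming a local Solr on port 8983, the collection1 core with the updated schema, and throwaway document IDs:

```shell
# Accepted by solr.DateRangeField (truncated date, month precision)
curl -s "http://localhost:8983/solr/collection1/update?commit=true" \
  -H 'Content-Type: application/json' \
  -d '[{"id": "date-format-test-1", "exampleDateField": "2024-09"}]'

# Rejected: the space-separated datetime triggers "Improperly formatted datetime"
curl -s "http://localhost:8983/solr/collection1/update?commit=true" \
  -H 'Content-Type: application/json' \
  -d '[{"id": "date-format-test-2", "exampleDateField": "2024-09-01 12:34:56"}]'
```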
So, yes, there may be some installations with date field values using the above formats which would cause invisible datasets due to indexing errors. I am not sure how we should deal with this. Are those date formats intended to be officially fully supported/widely used?
If no, it might be OK to offer a workaround in the upgrade instructions like "Please ensure that all date fields containing legacy dates in formats other than YYYY, YYYY-MM, YYYY-MM-DD, or YYYY-MM-DDThh:mm:ssZ are not updated to the new date_range type in the Solr schema. Otherwise, datasets with these legacy dates will fail to index and disappear from your Dataverse page.".
If yes, this feature becomes a bit more complicated, because Solr does not support those date formats and we would need to work around that somehow.
@pdurbin I've just added an API test for the range queries as you suggested.
I don't know what is required but I wouldn't be surprised if BC dates are something people want and want to have indexed. Perhaps @jggautier would know more about what legacy values exist and what's required. W.r.t. validation - I'd also make sure that the API calls don't allow bad values - I know there was newer validation code just added there. W.r.t. the code, I'd definitely suggest doing checks for int, float, date values when indexing or somehow assure that a bad legacy value can't break the overall submission and we just drop that field rather than have the dataset not index. (I didn't see such a check but I might have missed it).
I wouldn't be surprised if BC dates are something people want and want to have indexed.
It wouldn't be a problem in general. Solr does support BC dates, however in a different format than YYYYBC:
-0009 – The year 10 BC. A 0 in the year position is 0 AD, and is also considered 1 BC.
https://solr.apache.org/guide/solr/latest/indexing-guide/date-formatting-math.html
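So once values in that format are indexed, a range query covering 10 BC through 1 BC would look something like this (illustration only; how Dataverse should accept BC dates as input is what #10843 is about):

```shell
# Dates from 10 BC (-0009) through 1 BC (0000), using the example field name
curl -sG "http://localhost:8080/api/search" \
  --data-urlencode 'q=exampleDateField:[-0009 TO 0000]'
```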
W.r.t. validation - I'd also make sure that the API calls don't allow bad values - I know there was newer validation code just added there.
Is the code doing checks for API-submitted datasets different from the code doing the UI checks? I assumed both go through the same code I linked above, since the error messages are the same.
I'd definitely suggest doing checks for int, float, date values when indexing or somehow assure that a bad legacy value can't break the overall submission and we just drop that field rather than have the dataset not index.
Yes, that would be nice.
Hi all. I haven't been following this issue closely enough to contribute and won't have the time to catch up. But I agree with Jim and encourage folks to look into how others have and are using these fields. My dataset at https://doi.org/10.7910/DVN/2SA6SN might be helpful for seeing who's using the fields in different ways. And the list of contacts in our spreadsheet of Dataverse installations might help for contacting particular installations to learn more.
Here's the related issue about BC dates:
- #10843
I've just pushed a commit that implements the suggestion above (if encountering a bad legacy value in an int/float/date field, just drop that field, but index the rest of the dataset).
So, if a dataset contains a bad legacy value in an int/float/date field, this means that queries on that field will not yield that dataset, since the field hasn't been indexed. But for any other query, the dataset will still be found. (I've also added a test showing this)
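A sketch of how this can be checked manually (the title and range values are hypothetical; the added API test covers the same behaviour):

```shell
# A query on the dropped field does not return the dataset with the bad legacy value...
curl -sG "http://localhost:8080/api/search" \
  --data-urlencode 'q=exampleIntegerField:[0 TO 100]'

# ...but a query on any other field (e.g. the title) still finds it
curl -sG "http://localhost:8080/api/search" \
  --data-urlencode 'q=title:"Dataset with a legacy value"'
```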
I think this is a small limitation. And we could add BC support relatively easily in a future PR.
@vera I poked your PR today. I'm very excited about this feature. I started looking into how existing fields will be affected. Please take a look at this PR of suggestions:
- https://github.com/vera/dataverse/pull/1
Also, it's expected that range queries do not produce highlights, right? I'm getting hits but no highlights:
Thank you! Yes, I think that's expected. I also haven't seen any highlighting for range queries, and didn't find any indication that Solr supports it in the documentation. I've just sent a mail to the Solr user mailing list to make sure. I'll take a look at your suggestions now. :)
@pdurbin Suggestions looked good, I have merged them!
I came across an issue while testing this in my local environment, after loading the sample TSV and updating + reloading the Solr schema as described in the docs:
- Create a dataverse using the new Solr Metadata fields
- Create a dataset within the same collection, ensuring solr metadata fields are populated as follows: Integer: 22 Float: 19 Date: 2024
- Save and publish the dataset
- Go back to search and try to find the dataset
Issue: Dataset does not display in the UI after publishing
https://github.com/user-attachments/assets/beaa6d1d-7f09-4d6f-8981-9e0265e5b13e
@ofahimIQSS can you please provide more of server.log?
Also, let's add @vera to the PR to let her know you're having some trouble. Maybe she can help
serverlog.txt (attaching the extended server.log file)
@vera One more note: it may be a problem with getting Solr configured rather than a code problem. I am still retesting on my end.
@ofahimIQSS ah you're still re-testing. Yes, the error in your log...
dev_solr> org.apache.solr.common.SolrException: ERROR: [doc=dataset_65] unknown field 'exampleDateField'
... means that you need to update your schema.xml file.
I'm not sure if this helps, but I added this one-liner...
curl http://localhost:8080/api/admin/index/solr/schema | docker run -i --rm -v ./docker-dev-volumes/solr/data:/var/solr gdcc/configbaker:unstable update-fields.sh /var/solr/data/collection1/conf/schema.xml
... to a PR I'm working on at #11024.
Hi Vera - I've tried to validate this ticket but could not see the datasets after publishing them in my local environment. Steps to reproduce:
- Build PR in local environment
- Load the Solr TSV file
- Update the Solr schema: curl http://localhost:8080/api/admin/index/solr/schema | docker run -i --rm -v ./docker-dev-volumes/solr/data:/var/solr gdcc/configbaker:unstable update-fields.sh /var/solr/data/collection1/conf/schema.xml
- Restart Solr: in the local environment, go to the Solr container and run bin/solr restart
- Clear all data from Solr and start an async reindex based on https://guides.dataverse.org/en/6.4/admin/solr-search-index.html
- Create a collection with the Solr Field Types Test Metadata
- Create a dataset within the collection and publish it

Issue: After publishing the dataset, it does not appear in the UI. The server.log file can be found below: server.log.txt
You can ignore this comment; I have since resolved the issue on my end.
Overall, the PR looks good. One observation I had was with the "Example Date Field": I can enter a 2-digit value and save it as a date, but the formats specified for that field are YYYY-MM-DD, YYYY-MM, or YYYY.
@ofahimIQSS thanks for testing! Yes, I am seeing the same behaviour. The document indexed in Solr for that metadata looks like this:
While exampleFloatField and exampleIntegerField are indexed, exampleDateField is not because "11" is an invalid date value (according to Solr).
I think this is a pre-existing bug/inconsistency in the date field validation code not caused by this PR. We might want to open an issue for that.
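For reference, what actually ended up in the index can be checked with a direct Solr query (a sketch; the document ID is a placeholder):

```shell
curl -sG "http://localhost:8983/solr/collection1/select" \
  --data-urlencode 'q=id:dataset_2_draft' \
  --data-urlencode 'fl=id,exampleIntegerField,exampleFloatField,exampleDateField'
```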
Hmm, this makes me wonder about two and three digit years from ancient history.
For example, the philosopher Epictetus was born in 50 and died in 135. It sounds like Dataverse doesn't want to index these as real dates. Do we have to pad them as 0050 and 0135? (I haven't tried this.)
This issue about BCE dates is related:
- #10843
To quote @vera from that issue, "Since the Solr search index underlying Dataverse supports BC dates (using the ISO 8601 format: 1 BC = +0000, 2 BC = -0001, and so on)."
Yes, padding with zeroes seems to be the way intended by ISO8601/Solr. I just tried inputting "0011" instead of "11" and it indexes fine.
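Once padded, such years behave like any other value in range queries, e.g.:

```shell
# Finds datasets whose exampleDateField falls in the first century AD (years 0001 through 0100)
curl -sG "http://localhost:8080/api/search" \
  --data-urlencode 'q=exampleDateField:[0001 TO 0100]'
```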
Wanted to drop another observation (edge case) I found during testing, related to display. When entering very long values in the metadata fields, the numbers go past the border instead of wrapping around.
To test, populate the Solr Field Types Test Metadata as follows:
Integer: add as many 0's as possible, followed by a 2, i.e. 0000...00002
Floating Point: 2.01000290329039 - copy and paste everything after the decimal point to make it long
Date: 2024-11-11 (or any valid date)
@vera I'm afraid we had to bump the milestone to 6.6 but I think this feature will be very popular! Thanks again! ❤️
Testing Passed - Merging PR
https://github.com/user-attachments/assets/37f50c13-3ab4-4cfc-b5e2-6efda9d66d73
@vera and @ofahimIQSS can you please take a look at this follow-up PR I just made? Thanks.
- #11140