duplicate_definition should take into account axiom annotations
When we combine ontologies, we copy definitions and add provenance. For example, we re-use the definition of HP:0032154 in MONDO:0005318:
```
[Term]
id: HP:0032154
name: Aphthous ulcer
def: "Oral aphthous ulcers typically present as painful, sharply circumscribed fibrin-covered mucosal defects with a hyperemic border." [PMID:25346356]
synonym: "Canker sore" EXACT layperson []

[Term]
id: MONDO:0005318
name: canker sore
def: "Oral aphthous ulcers typically present as painful, sharply circumscribed fibrin-covered mucosal defects with a hyperemic border." [HP:0032154, PMID:25346356]
```
I believe the duplicate_definition check should be relaxed to take axiom annotations into account. What do you think @cmungall @jamesaoverton @balhoff?
I think this makes sense. Other annotation checks should probably ignore the axiom annotations as well.
I'd like to see a PR with the updated SPARQL.
@matentzn I'm assigning this to you, to show us how the SPARQL would be different.
Are all ROBOT report checks pure Python? Properly resolving the provenance is going to be impossible with pure SPARQL. The alternative would be to exclude definitions from the check when they carry a particular axiom annotation like rdfs:isDefinedBy (or something new), on the assumption that what follows is a reference saying: “this definition comes from elsewhere”. See the sketch below.
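A rough sketch of that exclusion, assuming rdfs:isDefinedBy as the marker (the property is a placeholder, and the base query shape is paraphrased from the existing check):

```sparql
PREFIX obo: <http://purl.obolibrary.org/obo/>
PREFIX owl: <http://www.w3.org/2002/07/owl#>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>

SELECT DISTINCT ?entity ?property ?value WHERE {
  VALUES ?property { obo:IAO_0000115 }
  ?entity ?property ?value .
  ?entity2 ?property ?value2 .
  FILTER (?entity != ?entity2)
  FILTER (str(?value) = str(?value2))
  # Skip definitions whose annotation axiom is marked as copied from elsewhere.
  FILTER NOT EXISTS {
    ?axiom a owl:Axiom ;
           owl:annotatedSource ?entity ;
           owl:annotatedProperty ?property ;
           owl:annotatedTarget ?value ;
           rdfs:isDefinedBy ?origin .
  }
  FILTER NOT EXISTS { ?entity owl:deprecated true }
  FILTER (!isBlank(?entity))
}
```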
ROBOT does not include any Python. All the ROBOT reports are in SPARQL: https://github.com/ontodev/robot/tree/master/robot-core/src/main/resources/report_queries
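For reference, duplicate_definition is roughly of this shape (paraphrased; see the linked directory for the exact file):

```sparql
PREFIX obo: <http://purl.obolibrary.org/obo/>
PREFIX owl: <http://www.w3.org/2002/07/owl#>

# Flags pairs of distinct, non-deprecated entities that share the same
# definition string (IAO:0000115). Axiom annotations are never consulted,
# so deliberately copied definitions with provenance still get reported.
SELECT DISTINCT ?entity ?property ?value WHERE {
  VALUES ?property { obo:IAO_0000115 }
  ?entity ?property ?value .
  ?entity2 ?property ?value2 .
  FILTER (?entity != ?entity2)
  FILTER (str(?value) = str(?value2))
  FILTER NOT EXISTS { ?entity owl:deprecated true }
  FILTER (!isBlank(?entity))
}
```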
Sorry @jamesaoverton, I meant to say: OWL API. Brain scrambled. From your answer I take it the answer stays the same: pure SPARQL, no OWL API. If so, this is a tough one.
Implementing this in SPARQL is seriously too complicated. I think we should do something else: implement ROBOT normalise (https://github.com/ontodev/robot/issues/901). That way, axioms like the ones causing this issue would be merged first. As soon as the idea of ROBOT normalise is approved, we can close this issue.