What content needs to be added where in dumps to preserve current VFB functionality
Q. What axioms need to be present for automated classification of individuals?
A. (I think) KB content + ontologies is currently sufficient. We don't classify by neuron:neuron connectivity (@Clare72, is this true?), and classification by neuron-region connectivity is currently opaque to data-driven recording of connectivity (although that could change).
Q. What axioms need to be present for SPARQL generated neo labels?
A: All connectivity - but this step does not require reasoning.
Q. What axioms need to be present for reasoning generated neo labels?
A: KB + ontologies (I think)
Q. What A-box axioms need to be loaded into ELK to drive reasoning?
A. Currently only neuron-region connectivity is needed in addition to KB content + ontologies (although this should be checked against the API).
TODO Document how long each step takes.
Nico's suggestion - use named graphs to exclude connectomics data from reasoning step
This will still require connectomics data to be loaded into, and dumped from, the triplestore, which is slow(ish). A faster solution would be to load the connectomics OWL files later in the pipeline. However, this approach would require quite a bit of re-engineering of the Makefile, because the SPARQL-based neo: label addition runs against the triplestore and is used in a `patsubst` to structure the Makefile and direct content to be loaded into the various endpoints.
Neo4j needs all 3 graphs + label graphs as in current build
Owlery needs graph1 + graph3; (pre-)reasoning is not needed. It needs to retain the filter step that removes annotation axioms.
SOLR needs graph + label graphs as in current build.
Super quick and dirty fix to get pipeline running again:
Remove loading of connectomics data to triplestore
Merge in all connectomics OWL files at the dump step - Owlery and Neo4j get all connectomics.
Add an additional script to add connectomics-flag neo4j:labels directly in PDB & side-load these to SOLR.
We will do this. @Robbie1977 will edit Makefile -> PR for us to review.
Experiment worth doing
Change the reasoning step and Owlery to use the WHELK reasoner.
To be documented: the axiom-requirement Q&A above.