Ontology Content Type for Plone

Plone is a natural choice for managing ontologies and for acting as an ontology repository.

Using Plone to store local and remote ontologies means that we can:

  • Use workflow to track the state of an ontology and what still needs to be done to, e.g., publish it.
  • Add extra useful information to the ontologies, either as separate "metadata" or incorporated into the ontologies themselves. Some notes on this:
    • We need to model carefully how ontologies are used in practice, e.g. sometimes an ontology might have multiple quite different versions all with the same URI; sometimes a new version will be at a new URI, in which case there should a) be a relation to the previous version and b) preferably be a "migration" ontology interpreting the old ontology in terms of the new, where feasible.
    • We need to attribute our information to the ontology separately from the "pristine" ontology, but it would still be good to incorporate that new information into the ontology. One way to do this, especially if the ontology is stored within Plone with a new URL, is to treat that new URL as the name of the local graph and make statements about that graph, rather than about the original ontology URI (see the named-graph sketch after this list).
    • Most of this modelling is going to be "annotational" and not have much bearing on reasoning.
  • When storing a remote ontology locally, we need to consider some issues:
    • The remote ontology may not be fetch-able from the ontology URI, so we'd just be uploading some document we'd got hold of.
    • If the remote ontology is fetch-able, we're basically doing caching. Coupled with the fact that the remote ontology may change without changing its URI (or may even change its URI while staying at the same URL!), we'd need to deal with the consequences of keeping the cache fresh (a conditional-GET sketch follows this list).
  • We can programmatically apply checks for "best practice" (two of these checks are sketched in code after this list):
    • The ontology is fetch-able from its ontology URI (the knock-on effect for local ontologies is that we should allow ontologies to be placed at some sensible path within our registry, e.g. /ontologies/YYYY/MM/ont).
    • Subclasses of the ontology type for "interpretation" and "migration" ontologies, along with constraints on their contents (e.g. they should import other ontologies).
    • Check that the ontology declares terms with URIs strictly underneath the ontology URI.
    • Run Pellint and show a report.
    • More esoteric checks, e.g.: flag global rdfs:domain and rdfs:range constraints as deprecated; flag multiple rdfs:range statements, which are interpreted as an intersection though a union is often intended; check that top-level classes are disjoint; etc.
  • We can offer up a "trusted repository", where we sign off the ontologies in the repository.
  • We can offer some simple canned searches:
    • All ontologies containing some URI or URI prefix, useful for when an application receives some RDF with a URI it doesn't recognise (a rough search sketch follows this list).
    • Any potential interpretation ontologies for a given set of ontologies.
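
As a concrete illustration of the named-graph idea above, here is a rough Python sketch using rdflib (an assumption; nothing here is tied to a particular RDF library). The pristine ontology is parsed, untouched, into a graph named after its hypothetical local URL, and our extra statements are made about that graph name rather than about the original ontology URI. All URLs, dates and the ontology data are illustrative.

    from rdflib import Dataset, URIRef, Literal
    from rdflib.namespace import DCTERMS, OWL

    LOCAL_URL = URIRef("http://example.org/ontologies/2008/06/ont")  # hypothetical local URL in Plone
    ORIGINAL_URI = URIRef("http://purl.org/example/ontology")        # hypothetical original ontology URI

    ds = Dataset()

    # Keep the pristine ontology untouched in its own named graph, named by the local URL.
    pristine = ds.graph(LOCAL_URL)
    pristine.parse(data="""
        @prefix owl: <http://www.w3.org/2002/07/owl#> .
        <http://purl.org/example/ontology> a owl:Ontology .
    """, format="turtle")

    # Our "metadata" lives in a separate graph and talks about the local graph name,
    # not the original ontology URI, so the pristine ontology is never modified.
    meta = ds.graph(URIRef("http://example.org/ontologies/meta"))
    meta.add((LOCAL_URL, DCTERMS.source, ORIGINAL_URI))
    meta.add((LOCAL_URL, DCTERMS.issued, Literal("2008-06-01")))
    meta.add((LOCAL_URL, OWL.versionInfo, Literal("local snapshot, reviewed")))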
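
On the cache-freshness point, one plausible approach (a sketch only, using the standard library's urllib rather than anything Plone provides) is a conditional GET: remember the ETag / Last-Modified headers from the previous fetch and only re-download the remote ontology when the server reports a change.

    import urllib.request
    import urllib.error

    def refresh_cached_ontology(url, etag=None, last_modified=None):
        """Return (data, etag, last_modified); data is None if the cached copy is still fresh."""
        request = urllib.request.Request(url)
        if etag:
            request.add_header("If-None-Match", etag)
        if last_modified:
            request.add_header("If-Modified-Since", last_modified)
        try:
            with urllib.request.urlopen(request) as response:
                return (response.read(),
                        response.headers.get("ETag"),
                        response.headers.get("Last-Modified"))
        except urllib.error.HTTPError as err:
            if err.code == 304:  # 304 Not Modified: the cached copy is still fresh
                return None, etag, last_modified
            raise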
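
Two of the "best practice" checks lend themselves to a short sketch: fetch-ability of the ontology from its ontology URI, and the requirement that declared terms sit strictly underneath that URI. Again this assumes rdflib, and the set of declaration types checked is an illustrative guess rather than a complete list.

    from rdflib import Graph, URIRef
    from rdflib.namespace import RDF, RDFS, OWL

    def is_fetchable(ontology_uri):
        """Check that dereferencing the ontology URI yields a parseable, non-empty graph."""
        try:
            g = Graph()
            g.parse(ontology_uri)  # rdflib dereferences the URI and guesses the format
            return len(g) > 0
        except Exception:
            return False

    def terms_outside_namespace(graph, ontology_uri):
        """Yield declared classes/properties whose URIs do not fall under the ontology URI."""
        declared_types = (OWL.Class, RDFS.Class, OWL.ObjectProperty,
                          OWL.DatatypeProperty, OWL.AnnotationProperty, RDF.Property)
        for term_type in declared_types:
            for term in graph.subjects(RDF.type, term_type):
                if isinstance(term, URIRef) and not str(term).startswith(str(ontology_uri)):
                    yield term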
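
The first canned search might look something like this rough sketch, where the "ontologies" mapping from local path to parsed graph is an illustrative stand-in for whatever the Plone catalogue would actually give us:

    from rdflib import URIRef

    def ontologies_mentioning(ontologies, prefix):
        """Return the local paths of ontologies whose triples use any URI starting with prefix."""
        hits = []
        for path, graph in ontologies.items():
            for triple in graph:
                if any(isinstance(term, URIRef) and str(term).startswith(prefix) for term in triple):
                    hits.append(path)
                    break  # one match is enough for this ontology
        return hits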

We could also do some of the following, but given that there are already "Ontology Management" tools out there which will probably do a better job, we can treat these as lower priority:

  • Visualise the contents of the ontologies, e.g. a human-readable form for class hierarchies; dependency graphs; etc.
  • Add/remove axioms in the ontologies to hammer them into shape.
  • Dump all the ontologies into Jena/Pellet so we can offer up a SPARQL-DL endpoint. This could be handy for reports and introspection, especially coupled with some named-graph use cases (signing, "metadata", ontology patching -- fixing broken ontologies without hurting the original); see the sketch below.
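
This is not the Jena/Pellet SPARQL-DL endpoint itself, but a small rdflib sketch of the sort of named-graph report such an endpoint would make possible: asking which stored ontology (named graph) declares a given class. The graph name and data are made up for the example.

    from rdflib import Dataset

    ds = Dataset()
    ds.parse(data="""
        @prefix owl: <http://www.w3.org/2002/07/owl#> .
        @prefix ex:  <http://example.org/terms/> .
        <http://example.org/ontologies/2008/06/ont> {
            ex:Widget a owl:Class .
        }
    """, format="trig")

    # Which named graph (i.e. which stored ontology) declares each class?
    query = """
        PREFIX owl: <http://www.w3.org/2002/07/owl#>
        SELECT ?g ?cls WHERE {
            GRAPH ?g { ?cls a owl:Class }
        }
    """
    for row in ds.query(query):
        print(f"{row.cls} is declared in graph {row.g}")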
