Mercurial > repos > mikel-egana-aranguren > oppl
inference.xml @ 19:cc270db37d33 draft: Directories re-arranged
author:   Mikel Egana Aranguren <mikel-egana-aranguren@toolshed.g2.bx.psu.edu>
date:     Sat, 06 Oct 2012 21:50:39 +0200
parents:  OPPL/inference.xml@d3616fac4ca5
children:
<tool id="inference" name="Perform inference in an OWL ontology" version="1.0.1">
  <description>Performs inference on an OWL ontology and generates a new ontology with the inferred axioms added as asserted axioms</description>

  <!-- For big ontologies I use -Xmx3000M -Xms250M -DentityExpansionLimit=1000000000. If that is too much for your machine, delete or modify these settings at will; since Galaxy is usually run in a server setting, it makes sense to allocate a big chunk of memory. -->

  <command>
    java -Xmx3000M -Xms250M -DentityExpansionLimit=1000000000 -jar ${__tool_data_path__}/shared/jars/inference.jar $input $reasoner $axioms > $output
  </command>

  <inputs>
    <param name="input" type="data" label="Input ontology file"/>
    <param name="reasoner" type="select" label="Choose reasoner">
      <option value="Pellet" selected="true">Pellet</option>
      <option value="HermiT">HermiT</option>
      <option value="FaCTPlusPlus">FaCT++</option>
      <option value="Elk">Elk (Not all axioms supported)</option>
    </param>
    <param name="axioms" type="select" display="checkboxes" multiple="true" label="Select which axioms to add as asserted">
      <option value="CLASS_ASSERTIONS">CLASS_ASSERTIONS</option>
      <option value="CLASS_HIERARCHY">CLASS_HIERARCHY</option>
      <option value="DATA_PROPERTY_HIERARCHY">DATA_PROPERTY_HIERARCHY</option>
      <option value="DISJOINT_CLASSES">DISJOINT_CLASSES</option>
      <option value="OBJECT_PROPERTY_HIERARCHY">OBJECT_PROPERTY_HIERARCHY</option>
    </param>
  </inputs>
  <outputs>
    <data format="text" name="output"/>
  </outputs>
  <!--<tests>
    <test>
      <param name="input" value="test.owl"/>
      <param name="reasoner" value="Pellet"/>
      <param name="axioms" value="CLASS_ASSERTIONS,CLASS_HIERARCHY,OBJECT_PROPERTY_HIERARCHY"/>
      <output name="output" file="test_new.owl"/>
    </test>
  </tests>-->
  <help>

**About Inference-Galaxy**

Inference-Galaxy performs automated reasoning on an OWL ontology and then injects the inferred axioms as asserted axioms, generating a new OWL ontology.

**Usage**

An ontology is needed as input: load it with Get Data >> Upload File from your computer, or redirect the output of another Galaxy tool. Inference-Galaxy uses the OWL API, and can therefore load any ontology format that the OWL API supports: OBO flat file, OWL (RDF/XML, OWL/XML, Functional, Manchester), Turtle, and KRSS. If the loaded ontology includes OWL imports, Inference-Galaxy will try to resolve them.

The reasoner can be Pellet, HermiT, FaCT++, or Elk (note that Elk does not support all axiom types).

The types of inferred axioms to add as asserted axioms can be selected.
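
For reference, the wrapped jar can also be run outside Galaxy. This is a hedged sketch based only on the tool's ``command`` template above; the jar location and file names are illustrative, and the argument order (input ontology, reasoner, comma-separated axiom types) is assumed from that template::

```shell
# Stand-alone invocation mirroring the Galaxy <command> template.
# Paths are illustrative; adjust -Xmx/-Xms to your machine.
java -Xmx3000M -Xms250M -DentityExpansionLimit=1000000000 \
    -jar shared/jars/inference.jar \
    ontology.owl Pellet CLASS_ASSERTIONS,CLASS_HIERARCHY > inferred.owl
```

The new ontology, with the selected inferred axioms asserted, is written to standard output and redirected to a file here, just as Galaxy redirects it to ``$output``.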

**Contact**

Please send any request or comment to mikel.egana.aranguren@gmail.com.

  </help>

</tool>