The module-test module in UnifiedViews/Core contains classes that can be used to test a DPU outside of the UnifiedViews environment.

The DPU tests use JUnit. The base class for testing is TestEnvironment, which provides an environment in which DPUs can be executed. Preparing a test can be split into several steps:

  • Create new testing environment
  • Prepare DPU's configuration
  • Prepare DPU instance and configure it
  • Add input and output data units
  • Run the execution
  • Check the results

The code sample below shows how to test a DPU that has an input files data unit and an output RDF data unit. It runs the DPU, after which the output triples can be retrieved and checked against the expected result.

public class MyDpuTest {

    @Test
    public void execute() throws Exception {
        // Prepare config.
        MyDpuConfig_V1 config = new MyDpuConfig_V1();

        // Prepare DPU.
        MyDpu dpuInstance = new MyDpu();
        dpuInstance.configure((new ConfigurationBuilder()).setDpuConfiguration(config).toString());

        // Prepare test environment.
        TestEnvironment environment = new TestEnvironment();

        // Prepare data unit.
        FilesDataUnit filesInput = environment.createFilesInputFromResource("input", "cities.csv");
        WritableRDFDataUnit rdfOutput = environment.createRdfOutput("rdfOutput", false);

        try {
            // Run.
            environment.run(dpuInstance);

            // Check that the output satisfies your requirements; use the
            // standard approaches for examining data units.
            RDFHelper.getGraphs(rdfOutput);
            RepositoryConnection connection = null;
            try {
                connection = rdfOutput.getConnection();
                RepositoryResult<Statement> statements = connection.getStatements(null, null, null, false);
                while (statements.hasNext()) {
                    Statement statement = statements.next();
                    // TODO: assert that the statement matches the expected output.
                }
            } finally {
                if (connection != null) {
                    connection.close();
                }
            }
        } finally {
            // Release resources.
            environment.release();
        }
    }

}
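
As a concrete example of the assertion step, the output connection can be queried for an expected triple with the same OpenRDF/Sesame API that is already used above. This is only a sketch: the subject and predicate URIs below are placeholders and have to be replaced with values that match your DPU's output.

// Sketch of a concrete check inside the try block, after environment.run(dpuInstance).
RepositoryConnection connection = rdfOutput.getConnection();
try {
    ValueFactory valueFactory = connection.getValueFactory();
    // Placeholder URIs - replace with the subject/predicate your DPU is expected to produce.
    URI expectedSubject = valueFactory.createURI("http://example.org/city/Prague");
    URI expectedPredicate = valueFactory.createURI("http://example.org/ontology/name");
    Assert.assertTrue("Expected triple is missing in the output.",
            connection.hasStatement(expectedSubject, expectedPredicate, null, false));
} finally {
    connection.close();
}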

 

To use the content of a certain file as input to the DPU (in this case cities.csv will be the only entry in the input files data unit; the file cities.csv has to be available in src/test/resources/):

FilesDataUnit filesInput = environment.createFilesInputFromResource("input", "cities.csv");

To fill an RDF input data unit with triples loaded from a file "input.ttl" (note: the file has to be available in src/test/resources/):

WritableRDFDataUnit input = environment.createRdfInput("rdfConfig", false);

InputStream inputStream = Thread.currentThread().getContextClassLoader().getResourceAsStream("input.ttl");
URI graph = input.addNewDataGraph("test");

RepositoryConnection connection = input.getConnection();
try {
    connection.add(inputStream, "", RDFFormat.TURTLE, graph);
} finally {
    connection.close();
}


For further examples of such testing, see: 

 

 Notes: 

  • Files to be loaded as input files should be placed in "src/test/resources". They can then be loaded via FilesDataUnit filesInput = environment.createFilesInputFromResource("input", "fileName");

  • Use ConfigurationBuilder to convert the configuration into its string form so that it can be set on the DPU, as shown in the sketch below.
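
A minimal sketch of that conversion, using the example classes from the test above (MyDpuConfig_V1 and MyDpu come from the code sample on this page):

// Build the configuration object, turn it into its string form
// via ConfigurationBuilder, and hand it to the DPU instance.
MyDpuConfig_V1 config = new MyDpuConfig_V1();
String configAsString = (new ConfigurationBuilder()).setDpuConfiguration(config).toString();

MyDpu dpuInstance = new MyDpu();
dpuInstance.configure(configAsString);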

 

Outdated content, revision is needed: 

The following code shows a simple test used for the core DPU SPARQL Transformer. The SPARQL Transformer is populated with certain test data and produces output data which can then be checked.

@Test
public void constructAllTest() throws Exception {
    // prepare dpu instance and configure it
    SPARQLTransformer trans = new SPARQLTransformer();
    SPARQLTransformerConfig config = new SPARQLTransformerConfig();
    config.isConstructType = true;
    config.SPARQL_Update_Query = "CONSTRUCT {?s ?p ?o} where {?s ?p ?o }";
    trans.configureDirectly(config);
    // prepare test environment, we use system tmp directory
    TestEnvironment env = TestEnvironment.create();
    // prepare input and output data units
    // here we can simply pre-fill input data unit with content from
    // resource file
    RDFDataUnit input = env.createRdfInputFromResource("input", false, "metadata.ttl", RDFFormat.TURTLE);
    RDFDataUnit output = env.createRdfOutput("output", false);
    // first test - check that something has been loaded into input data unit
    assertTrue(input.getTripleCount() > 0);
    try {
        // run the execution
        env.run(trans);
        // verify result
        assertTrue(input.getTripleCount() == output.getTripleCount());
    } finally {
        // release resources
        env.release();
    }
}
 

It is also possible to use Virtuoso for RDF data during testing. The connection to Virtuoso for the purpose of DPU tests must be configured separately in the test class, as the following example shows.

@BeforeClass
public static void virtuoso() {
    // Adjust this to your virtuoso configuration.
    TestEnvironment.virtuosoConfig.host = "localhost";
    TestEnvironment.virtuosoConfig.port = "1111";
    TestEnvironment.virtuosoConfig.user = "dba";
    TestEnvironment.virtuosoConfig.password = "dba";
}

Then you can simply use env.createRdfOutput("output", true); instead of env.createRdfOutput("output", false); (i.e., just change the boolean value) and the test environment will use the given Virtuoso instance instead of the local RDF storage.

The user can also specify additional information to be used during tests, such as:

  • path to jar files where the DPU is packed as OSGi bundle
  • time of last execution

See the methods on the TestEnvironment class; a rough sketch is shown below.
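
A rough sketch of how such a setup could look. The method names setJarPath and setLastExecution below are only illustrative assumptions and may not match the real API; consult the TestEnvironment class for the actual method names.

TestEnvironment environment = new TestEnvironment();
// NOTE: the two calls below use hypothetical method names for illustration only;
// check the TestEnvironment class for the actual methods.
environment.setJarPath("target/my-dpu-1.0.0.jar");   // path to the jar where the DPU is packed as an OSGi bundle
environment.setLastExecution(new java.util.Date());  // time of the last execution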
