Commit

Merge branch 'main' into 161-case-insensitive-find-entity-by-partial-sign-community-and-pattern
GCHQDeveloper42 authored Jan 18, 2024
2 parents c2dca67 + b232647 commit ef10919
Showing 525 changed files with 4,411 additions and 6,858 deletions.
44 changes: 44 additions & 0 deletions .github/workflows/maven.yml
@@ -0,0 +1,44 @@
# This workflow will build a Java project with Maven, and cache/restore any dependencies to improve the workflow execution time
# For more information see: https://docs.github.com/en/actions/automating-builds-and-tests/building-and-testing-java-with-maven

# This workflow uses actions that are not certified by GitHub.
# They are provided by a third-party and are governed by
# separate terms of service, privacy policy, and support
# documentation.

name: Maven Build

on:
  push:
    branches: [ "main" ]
  pull_request:
    branches: [ "main" ]

jobs:
  build:

    runs-on: ubuntu-latest

    steps:
    - uses: actions/checkout@v3
    - name: Set up JDK 17
      uses: actions/setup-java@v3
      with:
        java-version: '17'
        distribution: 'temurin'
        cache: maven
    - name: Build with Maven
      run: mvn -B package --file pom.xml
  javadocs:
    runs-on: ubuntu-latest

    steps:
    - uses: actions/checkout@v3
    - name: Set up JDK 17
      uses: actions/setup-java@v3
      with:
        java-version: '17'
        distribution: 'temurin'
        cache: maven
    - name: Generate Javadocs
      run: mvn javadoc:aggregate
14 changes: 7 additions & 7 deletions README.md
@@ -1,19 +1,19 @@
# Magma Core

Magma Core is a lightweight set of Java classes to enable [HQDM data objects](https://github.com/gchq/HQDM) to be created and used as RDF Linked Data through [Apache Jena](https://jena.apache.org).
Magma Core is a lightweight set of Java classes to enable [HQDM](https://github.com/hqdmTop/hqdmFramework/wiki) data objects to be created and used as RDF Linked Data through [Apache Jena](https://jena.apache.org).

## Introduction

Representing things that are of common interest in data presents a consistency challenge. How can we use the apparent logic of computer based systems to store, process and query without suffering the losses due to the use of different models, structures and requirements on which interconnected systems are based? There is a lot to be gained from achieving consistency of representation, from addressing the core challenges to do with data quality and information management through to enabling emerging topics like Digital Twins & Artificial Intelligence. Inconsistent representations in data also undermines the ability to achieve improved information security, policy controls and ensuring that information is fit for its intended purpose.

This code release is a contribution to support the adoption of data models that have properties that enable consistency of representation, particularly in the face of emerging and changing requirements. The theory on which the model that this repo adopts is based upon the entity-relationship model published by Dr. Matthew West\* as the [High Quality Data Model Framework](http://www.informationjunction.co.uk/hqdm_framework/). At its core are some ontological commitments drawn upon the identity of actual things being based on distinct existence in space-time. A collection of arguments, including Set Theoretic commitments, provide a powerful model framework that can be extended to (almost) any application area of interest. This approach is often summarised as 4-dimensionalism and is being incorporated into the [UK's Information Management Framework]() (part of the National Digital Twin), as the Foundation Data Model - a backbone of distributed information systems that are sufficiently integrated to address system-wide data quality. One of the challenges in adopting such models is simultaneously addressing the analytic & technological challenges. This release is provided to help those interested in learning about and adopting such models by lowering the technical barriers.
This code release is a contribution to support the adoption of data models that have properties that enable consistency of representation, particularly in the face of emerging and changing requirements. The theory on which the model that this repo adopts is based upon the entity-relationship model published by Dr. Matthew West\* as the [High Quality Data Model Framework](https://github.com/hqdmTop/hqdmFramework/wiki). At its core are some ontological commitments drawn upon the identity of actual things being based on distinct existence in space-time. A collection of arguments, including Set Theoretic commitments, provide a powerful model framework that can be extended to (almost) any application area of interest. This approach is often summarised as 4-dimensionalism and is being incorporated into the UK's Information Management Framework (part of the National Digital Twin), as the Foundation Data Model - a backbone of distributed information systems that are sufficiently integrated to address system-wide data quality. One of the challenges in adopting such models is simultaneously addressing the analytic & technological challenges. This release is provided to help those interested in learning about and adopting such models by lowering the technical barriers.

\* Dr. West was not involved in the creation of this project. GCHQ are grateful for his open contributions to the field of ontology and data quality management.


## The High Quality Data Model for Data Integration in Java

HQDM contains the replication of an openly available data model based on key ontological foundations to enable the consistent integration of data. The HQDM Java package comprises a set of Java classes and respective interfaces, 230 in total, to replicate the entity-relationship model published by Dr Matthew West as the [High Quality Data Model Framework](http://www.informationjunction.co.uk/hqdm_framework/). This class model can be used to create extensions of the entity types, based on the founding ontological commitments and logical restrictions (such as cardinalities), and instances of those types all in Java application code. This, in theory at least, provides a framework for the consistent representation of almost anything that is, or could be, real\*. All the data model patterns published in the HQDM framework are supported by the HQDM package. The object properties are constructed around a top-level Java HQDM Object class with some root attributes to enable class-instances to be managed in a database. The choice of database can be left to the user but the structure of the attributes is optimised for the use of [Linked Data IRIs](https://www.w3.org/TR/rdf11-concepts/#section-IRIs) and [RDF triples](https://www.w3.org/TR/rdf11-concepts/#section-triples) to represent HQDM object relationships and other object properties as predicates. All of the HQDM objects can be created and searched using the HQDMObject methods and collections can be handled using the Object Factory. To complement this there is an OWL version of the HQDM data model that is a close match for the original EXPRESS model and the HQDM Java package.
HQDM contains the replication of an openly available data model based on key ontological foundations to enable the consistent integration of data. The HQDM Java package comprises a set of Java classes and respective interfaces, 230 in total, to replicate the entity-relationship model published by Dr Matthew West as the [High Quality Data Model Framework](https://github.com/hqdmTop/hqdmFramework/wiki). This class model can be used to create extensions of the entity types, based on the founding ontological commitments and logical restrictions (such as cardinalities), and instances of those types all in Java application code. This, in theory at least, provides a framework for the consistent representation of almost anything that is, or could be, real\*. All the data model patterns published in the HQDM framework are supported by the HQDM package. The object properties are constructed around a top-level Java HQDM Object class with some root attributes to enable class-instances to be managed in a database. The choice of database can be left to the user but the structure of the attributes is optimised for the use of [Linked Data IRIs](https://www.w3.org/TR/rdf11-concepts/#section-IRIs) and [RDF triples](https://www.w3.org/TR/rdf11-concepts/#section-triples) to represent HQDM object relationships and other object properties as predicates. All of the HQDM objects can be created and searched using the HQDMObject methods and collections can be handled using the Object Factory. To complement this there is an OWL version of the HQDM data model that is a close match for the original EXPRESS model and the HQDM Java package.

\* This is a gross simplification, but it characterises the goal of the model and in use it has proved to be very capable. The UK's National Digital Twin programme is developing a model that aims to address this goal with even more rigour, called the Foundation Data Model (FDM). Data created using HQDM is likely to be mappable to the FDM with low mapping (due to similar ontological commitments).
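The README passage above describes the HQDM object pattern: a top-level object with an identifier and object properties held as predicates, optimised for RDF triples. As a simplified, self-contained illustration of that shape (the class and method names here are illustrative stand-ins, not the real MagmaCore API, which is keyed by IRI types rather than strings):

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

public class ThingSketch {
    /**
     * A minimal HQDM-style object: an id plus a map of predicate -> set of
     * values, mirroring RDF triples (subject, predicate, object).
     */
    static class Thing {
        private final String id;
        private final Map<String, Set<Object>> predicates = new HashMap<>();

        Thing(final String id) {
            this.id = id;
        }

        /** Add a value for a predicate, allowing multiple values per predicate. */
        void addValue(final String predicate, final Object value) {
            predicates.computeIfAbsent(predicate, k -> new HashSet<>()).add(value);
        }

        Map<String, Set<Object>> getPredicates() {
            return predicates;
        }

        String getId() {
            return id;
        }
    }

    public static void main(final String[] args) {
        final Thing person = new Thing("https://example.com/test#person1");
        person.addValue("https://example.com/test#member_of", "https://example.com/test#society1");
        System.out.println(person.getId() + " has " + person.getPredicates().size() + " predicate(s)");
    }
}
```

Each (id, predicate, value) entry maps directly onto one RDF triple, which is why the attribute structure works well with Jena.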

@@ -32,9 +32,9 @@ Magma Core can be incorporated into other maven projects using the following dep

```xml
<dependency>
<groupId>uk.gov.gchq.magmacore</groupId>
<groupId>uk.gov.gchq.magma-core</groupId>
<artifactId>core</artifactId>
<version>1.0</version>
<version>3.0.2</version>
</dependency>
```

@@ -46,10 +46,10 @@ We welcome contributions to the project. Detailed information on our ways of wor

In brief:

- Sign the [GCHQ Contributor Licence Agreement](https://cla-assistant.io/gchq/MagmaCore).
- Sign the [GCHQ Contributor License Agreement](https://cla-assistant.io/gchq/MagmaCore).
- Push your changes to a new branch.
- Submit a pull request.

## License

Magma Core is released under the [Apache 2.0 Licence](https://www.apache.org/licenses/LICENSE-2.0) and is covered by [Crown Copyright](https://www.nationalarchives.gov.uk/information-management/re-using-public-sector-information/copyright-and-re-use/crown-copyright/).
Magma Core is released under the [Apache 2.0 License](https://www.apache.org/licenses/LICENSE-2.0) and is covered by [Crown Copyright](https://www.nationalarchives.gov.uk/information-management/re-using-public-sector-information/copyright-and-re-use/crown-copyright/).
2 changes: 0 additions & 2 deletions checkstyle-suppressions.xml
@@ -15,8 +15,6 @@
<!-- hqdm-core suppressions -->
<suppress checks="MethodName" files="impl/*" />
<suppress checks="TypeName" files="Function_.java" />

<!-- hqdm-rdf suppressions -->
<suppress checks="ConstantName" files="HQDM.java" />
<suppress checks="AbbreviationAsWordInName" files="IRI.java|HQDM.java|RDFS.java|UID.java" />
</suppressions>
2 changes: 1 addition & 1 deletion core/pom.xml
@@ -25,7 +25,7 @@
</dependency>
<dependency>
<groupId>uk.gov.gchq.magma-core</groupId>
<artifactId>hqdm-rdf</artifactId>
<artifactId>hqdm</artifactId>
</dependency>
<dependency>
<groupId>junit</groupId>
3 changes: 1 addition & 2 deletions core/src/main/java/module-info.java
@@ -26,8 +26,7 @@
requires com.fasterxml.jackson.annotation;
requires java.net.http;

requires uk.gov.gchq.magmacore.hqdm;
requires transitive uk.gov.gchq.magmacore.hqdm.rdf;
requires transitive uk.gov.gchq.magmacore.hqdm;

exports uk.gov.gchq.magmacore.service;
exports uk.gov.gchq.magmacore.service.dto;
@@ -32,10 +32,14 @@
public interface MagmaCoreDatabase {

/**
* Start a transaction in READ mode and which will switch to WRITE if an update is attempted but
* only if no intermediate transaction has performed an update.
* Start a transaction in READ mode.
*/
void begin();
void beginRead();

/**
* Start a transaction in Write mode.
*/
void beginWrite();

/**
* Commit a transaction - Finish the current transaction and make any changes permanent (if a
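The interface change above replaces the single promotable `begin()` (READ promoted to WRITE on demand) with explicit `beginRead()` and `beginWrite()` methods that refuse to nest, throwing `IllegalStateException` when a transaction is already open. A generic, stdlib-only sketch of that guard logic (the `TxnState` class is illustrative; the real implementations delegate to Jena's dataset or connection):

```java
public class TxnSketch {
    enum Mode { NONE, READ, WRITE }

    /** Tracks the current transaction mode and rejects nested transactions. */
    static class TxnState {
        private Mode mode = Mode.NONE;

        void beginRead() {
            begin(Mode.READ);
        }

        void beginWrite() {
            begin(Mode.WRITE);
        }

        private void begin(final Mode requested) {
            if (mode != Mode.NONE) {
                throw new IllegalStateException("Already in a transaction");
            }
            mode = requested;
        }

        void commit() {
            mode = Mode.NONE;
        }
    }

    public static void main(final String[] args) {
        final TxnState txn = new TxnState();
        txn.beginRead();
        txn.commit();
        txn.beginWrite();
        txn.commit();
    }
}
```

Making the caller choose the mode up front removes the reliance on READ_PROMOTE semantics, which (as noted later in this diff) the remote SPARQL connection does not support.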
@@ -24,12 +24,14 @@
import java.util.stream.Collectors;

import org.apache.jena.query.Dataset;
import org.apache.jena.query.DatasetFactory;
import org.apache.jena.query.Query;
import org.apache.jena.query.QueryExecution;
import org.apache.jena.query.QueryExecutionFactory;
import org.apache.jena.query.QueryFactory;
import org.apache.jena.query.QuerySolution;
import org.apache.jena.query.ResultSet;
import org.apache.jena.query.TxnType;
import org.apache.jena.rdf.model.Literal;
import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;
@@ -69,7 +71,7 @@ public class MagmaCoreJenaDatabase implements MagmaCoreDatabase {
* Constructs a MagmaCoreJenaDatabase with a new in-memory Jena dataset.
*/
public MagmaCoreJenaDatabase() {
dataset = TDB2Factory.createDataset();
dataset = DatasetFactory.createTxnMem();
}

/**
@@ -113,9 +115,21 @@ public void register(final IriBase base) {
* {@inheritDoc}
*/
@Override
public void begin() {
public void beginRead() {
if (!dataset.isInTransaction()) {
dataset.begin();
dataset.begin(TxnType.READ);
} else {
throw new IllegalStateException("Already in a transaction");
}
}

/**
* {@inheritDoc}
*/
@Override
public void beginWrite() {
if (!dataset.isInTransaction()) {
dataset.begin(TxnType.WRITE);
} else {
throw new IllegalStateException("Already in a transaction");
}
@@ -178,7 +192,7 @@ public Thing get(final IRI iri) {
public void create(final Thing object) {
final Model defaultModel = dataset.getDefaultModel();

final Resource resource = defaultModel.createResource(object.getId());
final Resource resource = defaultModel.createResource(object.getId().getIri());

object.getPredicates().forEach((iri, predicates) -> predicates.forEach(value -> {
if (value instanceof IRI) {
@@ -425,7 +439,7 @@ public final List<Thing> toTopObjects(final QueryResultList queryResultsList) {
*/
@Override
public void dump(final PrintStream out) {
begin();
beginRead();
final Model model = dataset.getDefaultModel();
final StmtIterator statements = model.listStatements();

@@ -443,7 +457,7 @@ public void dump(final PrintStream out) {
* @param language RDF language syntax to output data as.
*/
public final void dump(final PrintStream out, final Lang language) {
begin();
beginRead();
RDFDataMgr.write(out, dataset.getDefaultModel(), language);
abort();
}
@@ -455,7 +469,7 @@ public final void dump(final PrintStream out, final Lang language) {
* @param language RDF language syntax to output data as.
*/
public final void load(final InputStream in, final Lang language) {
begin();
beginWrite();
final Model model = dataset.getDefaultModel();
RDFDataMgr.read(model, in, language);
commit();
@@ -88,9 +88,19 @@ public MagmaCoreRemoteSparqlDatabase(final String serviceUrl, final Dataset data
/**
* {@inheritDoc}
*/
public final void begin() {
public final void beginRead() {
if (!connection.isInTransaction()) {
connection.begin(TxnType.READ);
} else {
throw new IllegalStateException("Already in a transaction");
}
}

/**
* {@inheritDoc}
*/
public final void beginWrite() {
if (!connection.isInTransaction()) {
// The default TxnType.READ_PROMOTE is not supported.
connection.begin(TxnType.WRITE);
} else {
throw new IllegalStateException("Already in a transaction");
@@ -153,7 +163,7 @@ public void create(final Thing object) {

final Model model = ModelFactory.createDefaultModel();

final Resource resource = model.createResource(object.getId());
final Resource resource = model.createResource(object.getId().getIri());

object.getPredicates().forEach((iri, predicates) -> predicates.forEach(value -> {
if (value instanceof IRI) {
@@ -431,7 +441,7 @@ public final void dump(final PrintStream out, final Lang language) {
* @param language RDF language syntax to output data as.
*/
public final void load(final InputStream in, final Lang language) {
begin();
beginWrite();
final Dataset dataset = connection.fetchDataset();
final Model model = dataset.getDefaultModel();
RDFDataMgr.read(model, in, language);
@@ -698,7 +698,7 @@ public DbTransformation createDbTransformation(final Collection<? extends Thing>
private static DbChangeSet toDbChangeSet(final Thing thing) {

// Map the Thing's predicates to DbCreateOperation objects.
final IRI iri = new IRI(thing.getId());
final IRI iri = thing.getId();
final List<DbCreateOperation> creates = thing.getPredicates()
.entrySet()
.stream()
@@ -750,7 +750,7 @@ public Thing get(final IRI iri) {
*/
public Thing getInTransaction(final IRI iri) {
try {
database.begin();
database.beginRead();
final Thing result = database.get(iri);
database.commit();
return result;
@@ -765,9 +765,25 @@ public Thing getInTransaction(final IRI iri) {
*
* @param func {@link Function} to run.
*/
public void runInTransaction(final Function<MagmaCoreService, MagmaCoreService> func) {
public void runInReadTransaction(final Function<MagmaCoreService, MagmaCoreService> func) {
try {
database.begin();
database.beginRead();
func.apply(this);
database.commit();
} catch (final Exception e) {
database.abort();
throw e;
}
}

/**
* Run a {@link Function} in a transaction.
*
* @param func {@link Function} to run.
*/
public void runInWriteTransaction(final Function<MagmaCoreService, MagmaCoreService> func) {
try {
database.beginWrite();
func.apply(this);
database.commit();
} catch (final Exception e) {
@@ -784,7 +800,7 @@ public void runInTransaction(final Function<MagmaCoreService, MagmaCoreService>
*/
public Map<String, Thing> findByEntityNameInTransaction(final List<String> entityNames) {
try {
database.begin();
database.beginRead();
final HashMap<String, Thing> result = new HashMap<String, Thing>();

for (final String name : entityNames) {
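The `runInReadTransaction` and `runInWriteTransaction` helpers above both follow the same discipline: begin, apply the caller's function, commit on success, and abort then rethrow on failure. A stdlib-only sketch of that wrapper (the `FakeDatabase` class is illustrative, standing in for a `MagmaCoreDatabase`):

```java
import java.util.function.Function;

public class TxnRunnerSketch {
    /** Illustrative stand-in recording whether commit or abort was called. */
    static class FakeDatabase {
        boolean committed;
        boolean aborted;

        void beginWrite() {
            // Start a WRITE transaction (no-op in this sketch).
        }

        void commit() {
            committed = true;
        }

        void abort() {
            aborted = true;
        }
    }

    /** Commit on success; abort and rethrow if the function throws. */
    static void runInWriteTransaction(final FakeDatabase db,
            final Function<FakeDatabase, FakeDatabase> func) {
        try {
            db.beginWrite();
            func.apply(db);
            db.commit();
        } catch (final Exception e) {
            db.abort();
            throw e;
        }
    }
}
```

The abort-on-exception branch is what guarantees a failed function never leaves a transaction open or half-applied.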
@@ -2,6 +2,10 @@

/**
* A Data Transfer Object for sign and pattern data.
*
* @param sign Some sign.
* @param pattern Some pattern.
* @param representationByPattern Some representationByPattern.
*/
public record SignPatternDto(String sign, String pattern, String representationByPattern) {
}
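The change above only adds `@param` Javadoc tags; the record itself is unchanged. Java records generate one accessor per component plus value-based `equals`/`hashCode`, which is why the DTO needs no further code. A self-contained copy of the record (duplicated here purely for illustration) shows the generated accessors in use:

```java
/** Copy of the DTO above, for illustration: accessors are generated per component. */
public record SignPatternDto(String sign, String pattern, String representationByPattern) {
}
```

For example, `new SignPatternDto("s", "p", "r").sign()` returns `"s"`, and two instances with equal components compare equal.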
@@ -57,7 +57,7 @@ public MagmaCoreService apply(final MagmaCoreService mcService) {
final Thing thing = mcService.get(subject);

if (thing == null) {
final Thing newThing = SpatioTemporalExtentServices.createThing(subject.getIri());
final Thing newThing = SpatioTemporalExtentServices.createThing(subject);
newThing.addValue(predicate, object);
mcService.create(newThing);
} else {
@@ -586,7 +586,7 @@ public class DataIntegrityReport {
* @return A {@link List} of {@link Thing} that represent data integrity errors.
*/
public static List<Thing> verify(final MagmaCoreDatabase db) {
db.begin();
db.beginRead();

final List<Thing> errors = new ArrayList<>();
