Database partition mode Part 2 (#6409)
* First pass at segmenting out changes

* Test fixes

* Test fixes

* Test fix

* Work on tests

* Fix test

* Test fix

* Test fix

* Fixes

* Test fixes

* Test fix

* License headers

* Test fix

* Add changelog

* Address review comments

* Review comments

* Fixes

* Work on tests

* Address review comments

* Fix

* Add tests

* Cleanup

* Rename JpaPidValueTuples

* Add changelog

* Spotless

* Resolve compile issues

* Fix signature issue

* Test fix

* Add header

* Spotless

* Test cleanup

* Work on merge

* Work on tests
jamesagnew authored Dec 2, 2024
1 parent c9c8371 commit 744a0d7
Showing 98 changed files with 3,820 additions and 1,011 deletions.
@@ -153,7 +153,7 @@ public List<IBaseResource> fetchAllConformanceResources() {
if (myCodeSystems != null) {
retVal.addAll(myCodeSystems.values());
}
if (myStructureDefinitionResources != null) {
if (myStructureDefinitions != null) {
retVal.addAll(myStructureDefinitions.values());
}
if (myValueSets != null) {
@@ -19,9 +19,22 @@
*/
package ca.uhn.fhir.rest.param;

import java.util.Collections;
import java.util.Map;

public class HistorySearchDateRangeParam extends DateRangeParam {
/**
* Constructor
*
* @since 8.0.0
*/
public HistorySearchDateRangeParam() {
this(Collections.emptyMap(), new DateRangeParam(), null);
}

/**
* Constructor
*/
public HistorySearchDateRangeParam(
Map<String, String[]> theParameters, DateRangeParam theDateRange, Integer theOffset) {
super(theDateRange);
@@ -29,7 +29,7 @@ public static String formatFileSize(long theBytes) {
if (theBytes <= 0) {
return "0 " + UNITS[0];
}
int digitGroups = (int) (Math.log10(theBytes) / Math.log10(1024));
int digitGroups = (int) (Math.log10((double) theBytes) / Math.log10(1024));
digitGroups = Math.min(digitGroups, UNITS.length - 1);
return new DecimalFormat("###0.#").format(theBytes / Math.pow(1024, digitGroups)) + " " + UNITS[digitGroups];
}
@@ -0,0 +1,7 @@
---
type: fix
issue: 6409
title: "When performing a `_history` query using the `_at` parameter, the time value
is now converted to a zoned date before being passed to the database. This should
avoid conflicts around date changes on some databases.
"
@@ -0,0 +1,7 @@
---
type: perf
issue: 6409
title: "When searching in versioned tag mode, the JPA server now skips the redundant
lookup of un-versioned tags, avoiding an unnecessary extra database query
in some cases.
"
@@ -0,0 +1,11 @@
---
type: perf
issue: 6409
title: "The JPA server will no longer use the HFJ_RES_VER_PROV table to store and index values from
the `Resource.meta.source` element. Beginning in HAPI FHIR 6.8.0 (and Smile CDR 2023.08.R01), a
new pair of columns has been used to store data for this element, so this change only affects
data which was stored in HAPI FHIR prior to version 6.8.0 (released August 2023). If you have
FHIR resources which were stored in a JPA server prior to this version, and you use the
`Resource.meta.source` element and/or the `_source` search parameter, you should perform a complete
reindex of your server to ensure that data is not lost. See the upgrade notes for more information.
"
@@ -1,4 +1,20 @@
# Upgrade Notes

The JPA server stores values for the field `Resource.meta.source` in dedicated columns in its database so that they can be indexed and searched as needed, using the `_source` Search Parameter.

Prior to HAPI FHIR 6.8.0 (and Smile CDR 2023.08.R01), these values were stored in a dedicated table called `HFJ_RES_VER_PROV`. Beginning in HAPI FHIR 6.8.0 (Smile CDR 2023.08.R01), two new columns were added to the `HFJ_RES_VER`
table which store the same data and make it available for searches.

As of HAPI FHIR 8.0.0, the legacy table is no longer searched by default. If you do not have Resource.meta.source data stored in HAPI FHIR that was last created/updated prior to version 6.8.0, this change will not affect you and no action needs to be taken.

If you do have such data, take the following steps:

* Enable the JpaStorageSettings setting `setAccessMetaSourceInformationFromProvenanceTable(true)` to configure the server to continue using the legacy table (see the sketch following this list).

* Perform a server resource reindex by invoking the [$reindex Operation (server)](https://smilecdr.com/docs/fhir_repository/search_parameter_reindexing.html#reindex-server) with the `optimizeStorage` parameter set to `ALL_VERSIONS`.

* When this reindex operation has successfully completed, the setting above can be disabled. Disabling this setting avoids an extra database round-trip when loading data, so this change will have a positive performance impact on your server.
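
The steps above name a specific setting and a server-level operation; as a rough illustration (not part of this change-set), the sketch below shows how the two could be wired together against an R4 server. Only `setAccessMetaSourceInformationFromProvenanceTable(true)` and the `optimizeStorage` = `ALL_VERSIONS` parameter come from these notes; the class and method names, the generic-client invocation of `$reindex`, and the datatype used for the parameter are assumptions.

```java
// Sketch only, not part of this commit. Assumes a HAPI FHIR JPA server on 8.0.0+
// (where the setting below exists, per the notes above) and a plain R4 generic
// client pointed at that server. Class and method names are illustrative.
import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.jpa.api.config.JpaStorageSettings;
import ca.uhn.fhir.rest.client.api.IGenericClient;
import org.hl7.fhir.r4.model.CodeType;
import org.hl7.fhir.r4.model.Parameters;

public class MetaSourceMigrationSketch {

	// Step 1: keep serving Resource.meta.source / _source from the legacy
	// HFJ_RES_VER_PROV table until the reindex below has completed.
	public void enableLegacyProvenanceReads(JpaStorageSettings theStorageSettings) {
		theStorageSettings.setAccessMetaSourceInformationFromProvenanceTable(true);
	}

	// Step 2: ask the server to reindex all resource versions so that source and
	// request-id values are moved into the HFJ_RES_VER columns.
	public void requestServerReindex(String theServerBase) {
		FhirContext ctx = FhirContext.forR4();
		IGenericClient client = ctx.newRestfulGenericClient(theServerBase);

		Parameters inParams = new Parameters();
		// optimizeStorage = ALL_VERSIONS, as recommended in the upgrade notes;
		// the use of a code datatype for the value is an assumption.
		inParams.addParameter().setName("optimizeStorage").setValue(new CodeType("ALL_VERSIONS"));

		client.operation()
				.onServer()
				.named("$reindex")
				.withParameters(inParams)
				.execute();
	}
}
```

The server-level `$reindex` typically runs as a background job, so keep the legacy-table setting enabled until that job has finished successfully, then disable it to regain the performance benefit described above.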

# Fulltext Search with _lastUpdated Filter

Fulltext searches have been updated to support `_lastUpdated` search parameter. A reindexing of Search Parameters
is required to migrate old data to support the `_lastUpdated` search parameter.
Fulltext searches have been updated to support the `_lastUpdated` search parameter. If you are using Advanced Hibernate Search indexing and wish to use the `_lastUpdated` search parameter with this feature, a full reindex of your repository is required.
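
As a minimal client-side illustration (again, not part of this change-set) of what this combination looks like, the sketch below issues a fulltext `_content` search filtered by `_lastUpdated`. It assumes Advanced Hibernate Search indexing is enabled on the server and that the repository has been reindexed as described above; the resource type, search values, and server base URL are arbitrary.

```java
// Sketch only, not part of this commit. Combines a fulltext _content match with a
// _lastUpdated filter via the R4 generic client.
import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.rest.api.Constants;
import ca.uhn.fhir.rest.client.api.IGenericClient;
import ca.uhn.fhir.rest.gclient.StringClientParam;
import ca.uhn.fhir.rest.param.DateRangeParam;
import org.hl7.fhir.r4.model.Bundle;

public class FulltextLastUpdatedSearchSketch {

	public Bundle search(String theServerBase) {
		IGenericClient client = FhirContext.forR4().newRestfulGenericClient(theServerBase);

		// Equivalent to: GET [base]/Observation?_content=diabetes&_lastUpdated=ge2024-01-01
		return client.search()
				.forResource("Observation")
				.where(new StringClientParam(Constants.PARAM_CONTENT).matches().value("diabetes"))
				.lastUpdated(new DateRangeParam("ge2024-01-01", null))
				.returnBundle(Bundle.class)
				.execute();
	}
}
```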
@@ -121,11 +121,12 @@
import ca.uhn.fhir.jpa.search.builder.predicate.NumberPredicateBuilder;
import ca.uhn.fhir.jpa.search.builder.predicate.QuantityNormalizedPredicateBuilder;
import ca.uhn.fhir.jpa.search.builder.predicate.QuantityPredicateBuilder;
import ca.uhn.fhir.jpa.search.builder.predicate.ResourceHistoryPredicateBuilder;
import ca.uhn.fhir.jpa.search.builder.predicate.ResourceHistoryProvenancePredicateBuilder;
import ca.uhn.fhir.jpa.search.builder.predicate.ResourceIdPredicateBuilder;
import ca.uhn.fhir.jpa.search.builder.predicate.ResourceLinkPredicateBuilder;
import ca.uhn.fhir.jpa.search.builder.predicate.ResourceTablePredicateBuilder;
import ca.uhn.fhir.jpa.search.builder.predicate.SearchParamPresentPredicateBuilder;
import ca.uhn.fhir.jpa.search.builder.predicate.SourcePredicateBuilder;
import ca.uhn.fhir.jpa.search.builder.predicate.StringPredicateBuilder;
import ca.uhn.fhir.jpa.search.builder.predicate.TagPredicateBuilder;
import ca.uhn.fhir.jpa.search.builder.predicate.TokenPredicateBuilder;
@@ -699,8 +700,15 @@ public TokenPredicateBuilder newTokenPredicateBuilder(SearchQueryBuilder theSear

@Bean
@Scope("prototype")
public SourcePredicateBuilder newSourcePredicateBuilder(SearchQueryBuilder theSearchBuilder) {
return new SourcePredicateBuilder(theSearchBuilder);
public ResourceHistoryPredicateBuilder newResourceHistoryPredicateBuilder(SearchQueryBuilder theSearchBuilder) {
return new ResourceHistoryPredicateBuilder(theSearchBuilder);
}

@Bean
@Scope("prototype")
public ResourceHistoryProvenancePredicateBuilder newResourceHistoryProvenancePredicateBuilder(
SearchQueryBuilder theSearchBuilder) {
return new ResourceHistoryProvenancePredicateBuilder(theSearchBuilder);
}

@Bean
@@ -29,7 +29,6 @@
import ca.uhn.fhir.jpa.api.svc.ISearchCoordinatorSvc;
import ca.uhn.fhir.jpa.dao.ISearchBuilder;
import ca.uhn.fhir.jpa.dao.SearchBuilderFactory;
import ca.uhn.fhir.jpa.dao.data.IResourceSearchViewDao;
import ca.uhn.fhir.jpa.dao.data.IResourceTagDao;
import ca.uhn.fhir.jpa.dao.tx.HapiTransactionService;
import ca.uhn.fhir.jpa.model.config.PartitionSettings;
@@ -89,9 +88,6 @@ public class SearchConfig {
@Autowired
private DaoRegistry myDaoRegistry;

@Autowired
private IResourceSearchViewDao myResourceSearchViewDao;

@Autowired
private FhirContext myContext;

@@ -169,7 +165,6 @@ public ISearchBuilder newSearchBuilder(String theResourceName, Class<? extends I
myInterceptorBroadcaster,
myResourceTagDao,
myDaoRegistry,
myResourceSearchViewDao,
myContext,
myIdHelperService,
theResourceType);
@@ -57,7 +57,6 @@
import ca.uhn.fhir.jpa.model.entity.BaseHasResource;
import ca.uhn.fhir.jpa.model.entity.BaseTag;
import ca.uhn.fhir.jpa.model.entity.ResourceEncodingEnum;
import ca.uhn.fhir.jpa.model.entity.ResourceHistoryProvenanceEntity;
import ca.uhn.fhir.jpa.model.entity.ResourceHistoryTable;
import ca.uhn.fhir.jpa.model.entity.ResourceLink;
import ca.uhn.fhir.jpa.model.entity.ResourceTable;
@@ -561,8 +560,8 @@ protected EncodedResource populateResourceIntoEntity(
} else {
ResourceHistoryTable currentHistoryVersion = theEntity.getCurrentVersionEntity();
if (currentHistoryVersion == null) {
currentHistoryVersion = myResourceHistoryTableDao.findForIdAndVersionAndFetchProvenance(
theEntity.getId(), theEntity.getVersion());
currentHistoryVersion =
myResourceHistoryTableDao.findForIdAndVersion(theEntity.getId(), theEntity.getVersion());
}
if (currentHistoryVersion == null || !currentHistoryVersion.hasResource()) {
changed = true;
@@ -1083,7 +1082,7 @@ public ResourceTable updateEntity(
*/
if (thePerformIndexing) {
if (newParams == null) {
myExpungeService.deleteAllSearchParams(JpaPid.fromId(entity.getId()));
myExpungeService.deleteAllSearchParams(entity.getPersistentId());
entity.clearAllParamsPopulated();
} else {

@@ -1315,8 +1314,8 @@ private void createHistoryEntry(
* this could return null if the current resourceVersion has been expunged
* in which case we'll still create a new one
*/
historyEntry = myResourceHistoryTableDao.findForIdAndVersionAndFetchProvenance(
theEntity.getResourceId(), resourceVersion - 1);
historyEntry =
myResourceHistoryTableDao.findForIdAndVersion(theEntity.getResourceId(), resourceVersion - 1);
if (historyEntry != null) {
reusingHistoryEntity = true;
theEntity.populateHistoryEntityVersionAndDates(historyEntry);
@@ -1374,29 +1373,12 @@ private void createHistoryEntry(
boolean haveSource = isNotBlank(source) && shouldStoreSource;
boolean haveRequestId = isNotBlank(requestId) && shouldStoreRequestId;
if (haveSource || haveRequestId) {
ResourceHistoryProvenanceEntity provenance = null;
if (reusingHistoryEntity) {
/*
* If version history is disabled, then we may be reusing
* a previous history entity. If that's the case, let's try
* to reuse the previous provenance entity too.
*/
provenance = historyEntry.getProvenance();
}
if (provenance == null) {
provenance = historyEntry.toProvenance();
}
provenance.setResourceHistoryTable(historyEntry);
provenance.setResourceTable(theEntity);
provenance.setPartitionId(theEntity.getPartitionId());
if (haveRequestId) {
String persistedRequestId = left(requestId, Constants.REQUEST_ID_LENGTH);
provenance.setRequestId(persistedRequestId);
historyEntry.setRequestId(persistedRequestId);
}
if (haveSource) {
String persistedSource = left(source, ResourceHistoryTable.SOURCE_URI_LENGTH);
provenance.setSourceUri(persistedSource);
historyEntry.setSourceUri(persistedSource);
}
if (theResource != null) {
@@ -1406,8 +1388,6 @@ private void createHistoryEntry(
shouldStoreRequestId ? requestId : null,
theResource);
}

myEntityManager.persist(provenance);
}
}

@@ -44,6 +44,7 @@
import ca.uhn.fhir.jpa.api.model.LazyDaoMethodOutcome;
import ca.uhn.fhir.jpa.api.svc.IIdHelperService;
import ca.uhn.fhir.jpa.api.svc.ResolveIdentityMode;
import ca.uhn.fhir.jpa.dao.data.IResourceHistoryProvenanceDao;
import ca.uhn.fhir.jpa.dao.tx.HapiTransactionService;
import ca.uhn.fhir.jpa.delete.DeleteConflictUtil;
import ca.uhn.fhir.jpa.model.cross.IBasePersistedResource;
@@ -52,6 +53,7 @@
import ca.uhn.fhir.jpa.model.entity.BaseTag;
import ca.uhn.fhir.jpa.model.entity.PartitionablePartitionId;
import ca.uhn.fhir.jpa.model.entity.ResourceEncodingEnum;
import ca.uhn.fhir.jpa.model.entity.ResourceHistoryProvenanceEntity;
import ca.uhn.fhir.jpa.model.entity.ResourceHistoryTable;
import ca.uhn.fhir.jpa.model.entity.ResourceTable;
import ca.uhn.fhir.jpa.model.entity.TagDefinition;
@@ -206,6 +208,9 @@ public abstract class BaseHapiFhirResourceDao<T extends IBaseResource> extends B
@Autowired
private IJobCoordinator myJobCoordinator;

@Autowired
private IResourceHistoryProvenanceDao myResourceHistoryProvenanceDao;

private IInstanceValidatorModule myInstanceValidator;
private String myResourceName;
private Class<T> myResourceType;
@@ -562,17 +567,15 @@ private DaoMethodOutcome doCreateForPostOrPut(
thePerformIndexing);

// Store the resource forced ID if necessary
JpaPid jpaPid = JpaPid.fromId(updatedEntity.getResourceId());
JpaPid jpaPid = updatedEntity.getPersistentId();

// Populate the resource with its actual final stored ID from the entity
theResource.setId(entity.getIdDt());

// Pre-cache the resource ID
jpaPid.setAssociatedResourceId(entity.getIdType(myFhirContext));
String fhirId = entity.getFhirId();
if (fhirId == null) {
fhirId = Long.toString(entity.getId());
}
assert fhirId != null;
myIdHelperService.addResolvedPidToFhirIdAfterCommit(
jpaPid, theRequestPartitionId, getResourceName(), fhirId, null);
theTransactionDetails.addResolvedResourceId(jpaPid.getAssociatedResourceId(), jpaPid);
@@ -1016,7 +1019,7 @@ public void beforeCommit(boolean readOnly) {

protected ResourceTable updateEntityForDelete(
RequestDetails theRequest, TransactionDetails theTransactionDetails, ResourceTable theEntity) {
myResourceSearchUrlSvc.deleteByResId(theEntity.getId());
myResourceSearchUrlSvc.deleteByResId(theEntity.getPersistentId());
Date updateTime = new Date();
return updateEntity(theRequest, null, theEntity, updateTime, true, true, theTransactionDetails, false, true);
}
@@ -1261,7 +1264,7 @@ public IBundleProvider history(
return myPersistedJpaBundleProviderFactory.history(
theRequest,
myResourceName,
entity.getId(),
entity.getPersistentId(),
theSince,
theUntil,
theOffset,
@@ -1291,7 +1294,7 @@ public IBundleProvider history(
return myPersistedJpaBundleProviderFactory.history(
theRequest,
myResourceName,
entity.getId(),
JpaPid.fromId(entity.getId()),
theHistorySearchDateRangeParam.getLowerBoundAsInstant(),
theHistorySearchDateRangeParam.getUpperBoundAsInstant(),
theHistorySearchDateRangeParam.getOffset(),
@@ -1391,8 +1394,8 @@ protected <MT extends IBaseMetaType> void doMetaAddOperation(
doMetaAdd(theMetaAdd, latestVersion, theRequest, transactionDetails);

// Also update history entry
ResourceHistoryTable history = myResourceHistoryTableDao.findForIdAndVersionAndFetchProvenance(
entity.getId(), entity.getVersion());
ResourceHistoryTable history =
myResourceHistoryTableDao.findForIdAndVersion(entity.getId(), entity.getVersion());
doMetaAdd(theMetaAdd, history, theRequest, transactionDetails);
}

@@ -1439,8 +1442,8 @@ public <MT extends IBaseMetaType> void doMetaDeleteOperation(
} else {
doMetaDelete(theMetaDel, latestVersion, theRequest, transactionDetails);
// Also update history entry
ResourceHistoryTable history = myResourceHistoryTableDao.findForIdAndVersionAndFetchProvenance(
entity.getId(), entity.getVersion());
ResourceHistoryTable history =
myResourceHistoryTableDao.findForIdAndVersion(entity.getId(), entity.getVersion());
doMetaDelete(theMetaDel, history, theRequest, transactionDetails);
}

@@ -1705,7 +1708,7 @@ private void reindexOptimizeStorage(
ResourceTable entity, ReindexParameters.OptimizeStorageModeEnum theOptimizeStorageMode) {
ResourceHistoryTable historyEntity = entity.getCurrentVersionEntity();
if (historyEntity != null) {
reindexOptimizeStorageHistoryEntity(entity, historyEntity);
reindexOptimizeStorageHistoryEntityThenDetachIt(entity, historyEntity);
if (theOptimizeStorageMode == ReindexParameters.OptimizeStorageModeEnum.ALL_VERSIONS) {
int pageSize = 100;
for (int page = 0; ((long) page * pageSize) < entity.getVersion(); page++) {
@@ -1715,39 +1718,44 @@ private void reindexOptimizeStorage(
// different pages as the underlying data gets updated.
PageRequest pageRequest = PageRequest.of(page, pageSize, Sort.by("myId"));
Slice<ResourceHistoryTable> historyEntities =
myResourceHistoryTableDao.findForResourceIdAndReturnEntitiesAndFetchProvenance(
myResourceHistoryTableDao.findAllVersionsExceptSpecificForResourcePid(
pageRequest, entity.getId(), historyEntity.getVersion());

for (ResourceHistoryTable next : historyEntities) {
reindexOptimizeStorageHistoryEntity(entity, next);
reindexOptimizeStorageHistoryEntityThenDetachIt(entity, next);
}
}
}
}
}

private void reindexOptimizeStorageHistoryEntity(ResourceTable entity, ResourceHistoryTable historyEntity) {
boolean changed = false;
/**
* Note that the entity will be detached after being saved if it has changed
* in order to avoid accumulating too many resources in memory
*/
private void reindexOptimizeStorageHistoryEntityThenDetachIt(
ResourceTable entity, ResourceHistoryTable historyEntity) {
if (historyEntity.getEncoding() == ResourceEncodingEnum.JSONC
|| historyEntity.getEncoding() == ResourceEncodingEnum.JSON) {
byte[] resourceBytes = historyEntity.getResource();
if (resourceBytes != null) {
String resourceText = decodeResource(resourceBytes, historyEntity.getEncoding());
if (myResourceHistoryCalculator.conditionallyAlterHistoryEntity(entity, historyEntity, resourceText)) {
changed = true;
}
myResourceHistoryCalculator.conditionallyAlterHistoryEntity(entity, historyEntity, resourceText);
}
}
if (isBlank(historyEntity.getSourceUri()) && isBlank(historyEntity.getRequestId())) {
if (historyEntity.getProvenance() != null) {
historyEntity.setSourceUri(historyEntity.getProvenance().getSourceUri());
historyEntity.setRequestId(historyEntity.getProvenance().getRequestId());
changed = true;
if (myStorageSettings.isAccessMetaSourceInformationFromProvenanceTable()) {
if (isBlank(historyEntity.getSourceUri()) && isBlank(historyEntity.getRequestId())) {
Long id = historyEntity.getId();
Optional<ResourceHistoryProvenanceEntity> provenanceEntityOpt =
myResourceHistoryProvenanceDao.findById(id);
if (provenanceEntityOpt.isPresent()) {
ResourceHistoryProvenanceEntity provenanceEntity = provenanceEntityOpt.get();
historyEntity.setSourceUri(provenanceEntity.getSourceUri());
historyEntity.setRequestId(provenanceEntity.getRequestId());
myResourceHistoryProvenanceDao.delete(provenanceEntity);
}
}
}
if (changed) {
myResourceHistoryTableDao.save(historyEntity);
}
}

private BaseHasResource readEntity(