MODDATAIMP-957: remove initial saving of records in SRS #845

Merged · 8 commits · Jan 19, 2024
5 changes: 5 additions & 0 deletions NEWS.md
@@ -1,12 +1,17 @@
## 2023-xx-xx v3.8.0-SNAPSHOT
* [MODSOURMAN-1085](https://issues.folio.org/browse/MODSOURMAN-1085) MARC record with a 100 tag without a $a is being discarded on import.
* [MODSOURMAN-1020](https://issues.folio.org/browse/MODSOURMAN-1020) Add table to save incoming records for DI logs
* [MODSOURMAN-1021](https://issues.folio.org/browse/MODSOURMAN-1021) Provide endpoint for getting parsed content for DI log
* [MODSOURMAN-1022](https://issues.folio.org/browse/MODSOURMAN-1022) Remove step of initial saving of incoming records to SRS
* [MODSOURMAN-1070](https://issues.folio.org/browse/MODSOURMAN-1070) Fill in Journal Records for created MARC when INSTANCE_CREATED event received
* [MODSOURMAN-1030](https://issues.folio.org/browse/MODSOURMAN-1030) The number of updated records is not correctly displayed in the 'SRS Marc' column in the 'Log summary' table
* [MODSOURMAN-976](https://issues.folio.org/browse/MODSOURMAN-976) Incorrect error counts
* [MODSOURMAN-1093](https://issues.folio.org/browse/MODSOURMAN-1093) EventHandlingUtil hangs forever on error
* [MODSOURMAN-1043](https://issues.folio.org/browse/MODSOURMAN-1043) Improper behavior in multiples for holdings when update action on match and create on non-match
* [MODSOURMAN-1091](https://issues.folio.org/browse/MODSOURMAN-1091) The '1' number of Instance is displayed in cell in the row with the 'Updated' row header at the individual import job's log
* [MODSOURMAN-1108](https://issues.folio.org/browse/MODSOURMAN-1108) MARC authority record is not created when use Job profile with match profile and action only on no-match branch
* [MODSOURMAN-1106](https://issues.folio.org/browse/MODSOURMAN-1106) The status of Instance is '-' in the Import log after uploading file. The numbers of updated SRS and Instance are not displayed in the Summary table.
* [MODSOURMAN-1063](https://issues.folio.org/browse/MODSOURMAN-1063) Update RecordProcessingLogDto to contain incoming record id

## 2023-10-13 v3.7.0
* [MODSOURMAN-1045](https://issues.folio.org/browse/MODSOURMAN-1045) Allow create action with non-matches for instance without match profile
15 changes: 15 additions & 0 deletions descriptors/ModuleDescriptor-template.json
@@ -501,6 +501,15 @@
"permissionsRequired": [
"metadata-provider.jobexecutions.get"
]
},
{
"methods": [
"GET"
],
"pathPattern": "/metadata-provider/incomingRecords/{recordId}",
"permissionsRequired": [
"metadata-provider.incomingrecords.get"
]
}
]
},
@@ -651,6 +660,11 @@
"displayName": "Metadata Provider - get jobExecution logs",
"description": "Get JobExecutionLogDto"
},
{
"permissionName": "metadata-provider.incomingrecords.get",
"displayName": "Metadata Provider - get incoming record",
"description": "Get IncomingRecord"
},
{
"permissionName": "change-manager.jobexecutions.post",
"displayName": "Change Manager - create jobExecutions",
@@ -718,6 +732,7 @@
"subPermissions": [
"metadata-provider.jobexecutions.get",
"metadata-provider.logs.get",
"metadata-provider.incomingrecords.get",
"change-manager.jobexecutions.post",
"change-manager.jobexecutions.put",
"change-manager.jobexecutions.get",
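The descriptor changes above expose a new GET endpoint, `/metadata-provider/incomingRecords/{recordId}`, guarded by the `metadata-provider.incomingrecords.get` permission. A minimal sketch of how a client might build a request for it — the Okapi URL, tenant, token, and record id below are placeholders, not values from this PR:

```java
import java.net.URI;
import java.net.http.HttpRequest;

// Builds (but does not send) a request for the new incomingRecords endpoint.
// Sending it requires a running mod-source-record-manager instance and a user
// with the metadata-provider.incomingrecords.get permission.
public class GetIncomingRecordExample {

    static HttpRequest buildRequest(String okapiUrl, String tenant, String token, String recordId) {
        return HttpRequest.newBuilder()
            .uri(URI.create(okapiUrl + "/metadata-provider/incomingRecords/" + recordId))
            .header("X-Okapi-Tenant", tenant)
            .header("X-Okapi-Token", token)
            .GET()
            .build();
    }

    public static void main(String[] args) {
        HttpRequest request = buildRequest("http://localhost:9130", "diku", "<token>",
            "00000000-0000-0000-0000-000000000000");
        System.out.println(request.method() + " " + request.uri());
    }
}
```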
@@ -0,0 +1,32 @@
package org.folio.dao;

import io.vertx.core.Future;
import io.vertx.sqlclient.Row;
import io.vertx.sqlclient.RowSet;
import org.folio.rest.jaxrs.model.IncomingRecord;

import java.util.List;
import java.util.Optional;

/**
* DAO interface for the {@link IncomingRecord} entity
*/
public interface IncomingRecordDao {

/**
* Searches for {@link IncomingRecord} by id
*
* @param id incomingRecord id
* @param tenantId tenant id
* @return optional of incomingRecord
*/
Future<Optional<IncomingRecord>> getById(String id, String tenantId);

/**
* Saves {@link IncomingRecord} entities into DB
*
* @param incomingRecords {@link IncomingRecord} entities to save
* @param tenantId tenant id
* @return future with created incomingRecords entities represented as row set
*/
Future<List<RowSet<Row>>> saveBatch(List<IncomingRecord> incomingRecords, String tenantId);
}
@@ -0,0 +1,79 @@
package org.folio.dao;

import io.vertx.core.Future;
import io.vertx.core.Promise;
import io.vertx.core.json.JsonObject;
import io.vertx.sqlclient.Row;
import io.vertx.sqlclient.RowSet;
import io.vertx.sqlclient.Tuple;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.folio.dao.util.PostgresClientFactory;
import org.folio.rest.jaxrs.model.IncomingRecord;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Repository;

import java.util.List;
import java.util.Optional;
import java.util.UUID;

import static java.lang.String.format;
import static org.folio.rest.persist.PostgresClient.convertToPsqlStandard;

@Repository
public class IncomingRecordDaoImpl implements IncomingRecordDao {

private static final Logger LOGGER = LogManager.getLogger();
public static final String INCOMING_RECORDS_TABLE = "incoming_records";
private static final String GET_BY_ID_SQL = "SELECT * FROM %s.%s WHERE id = $1";
private static final String INSERT_SQL = "INSERT INTO %s.%s (id, job_execution_id, incoming_record) VALUES ($1, $2, $3)";

@Autowired
private PostgresClientFactory pgClientFactory;

@Override
public Future<Optional<IncomingRecord>> getById(String id, String tenantId) {
LOGGER.debug("getById:: Get IncomingRecord by id = {} from the {} table", id, INCOMING_RECORDS_TABLE);
Promise<RowSet<Row>> promise = Promise.promise();
try {
String query = format(GET_BY_ID_SQL, convertToPsqlStandard(tenantId), INCOMING_RECORDS_TABLE);
pgClientFactory.createInstance(tenantId).selectRead(query, Tuple.of(UUID.fromString(id)), promise);
} catch (Exception e) {
LOGGER.warn("getById:: Error getting IncomingRecord by id", e);
promise.fail(e);
}
return promise.future().map(rowSet -> rowSet.rowCount() == 0 ? Optional.empty()
: Optional.of(mapRowToIncomingRecord(rowSet.iterator().next())));
}

@Override
public Future<List<RowSet<Row>>> saveBatch(List<IncomingRecord> incomingRecords, String tenantId) {
LOGGER.debug("saveBatch:: Save {} IncomingRecord entities to the {} table", incomingRecords.size(), INCOMING_RECORDS_TABLE);
Promise<List<RowSet<Row>>> promise = Promise.promise();
try {
String query = format(INSERT_SQL, convertToPsqlStandard(tenantId), INCOMING_RECORDS_TABLE);
List<Tuple> tuples = incomingRecords.stream().map(this::prepareInsertQueryParameters).toList();
LOGGER.debug("saveBatch:: Save query = {}; tuples = {}", query, tuples);
pgClientFactory.createInstance(tenantId).execute(query, tuples, promise);
} catch (Exception e) {
LOGGER.warn("saveBatch:: Error saving IncomingRecord entity", e);
promise.fail(e);
}
return promise.future().onFailure(e -> LOGGER.warn("saveBatch:: Error saving IncomingRecord entity", e));
}

private IncomingRecord mapRowToIncomingRecord(Row row) {
JsonObject jsonObject = row.getJsonObject("incoming_record");
return new IncomingRecord().withId(String.valueOf(row.getUUID("id")))
.withJobExecutionId(String.valueOf(row.getUUID("job_execution_id")))
.withRecordType(IncomingRecord.RecordType.fromValue(jsonObject.getString("recordType")))
.withOrder(jsonObject.getInteger("order"))
.withRawRecordContent(jsonObject.getString("rawRecordContent"))
.withParsedRecordContent(jsonObject.getString("parsedRecordContent"));
}

private Tuple prepareInsertQueryParameters(IncomingRecord incomingRecord) {
return Tuple.of(UUID.fromString(incomingRecord.getId()), UUID.fromString(incomingRecord.getJobExecutionId()),
JsonObject.mapFrom(incomingRecord));
}
}
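The DAO above stores `id` and `job_execution_id` as dedicated columns and serializes the whole `IncomingRecord` into the `incoming_record` JSON column; `getById` then reads the record type, order, and raw/parsed content back out of that JSON. A plain-Java sketch of that mapping (no Vert.x; the POJO and `Map` stand in for the generated `IncomingRecord` and `JsonObject`):

```java
import java.util.Map;

// Stand-in for the generated IncomingRecord, to illustrate mapRowToIncomingRecord:
// scalar UUID columns come from the row, the remaining fields from the JSON column.
public class IncomingRecordMappingSketch {
    String id;
    String jobExecutionId;
    String recordType;
    Integer order;
    String rawRecordContent;
    String parsedRecordContent;

    static IncomingRecordMappingSketch fromRow(String id, String jobExecutionId,
                                               Map<String, Object> incomingRecordJson) {
        IncomingRecordMappingSketch r = new IncomingRecordMappingSketch();
        r.id = id;                       // "id" column
        r.jobExecutionId = jobExecutionId; // "job_execution_id" column
        r.recordType = (String) incomingRecordJson.get("recordType");
        r.order = (Integer) incomingRecordJson.get("order");
        r.rawRecordContent = (String) incomingRecordJson.get("rawRecordContent");
        r.parsedRecordContent = (String) incomingRecordJson.get("parsedRecordContent");
        return r;
    }

    public static void main(String[] args) {
        IncomingRecordMappingSketch r = fromRow(
            "00000000-0000-0000-0000-000000000001",
            "00000000-0000-0000-0000-000000000002",
            Map.of("recordType", "MARC_BIB", "order", 0,
                   "rawRecordContent", "<raw>", "parsedRecordContent", "{}"));
        System.out.println(r.recordType + " #" + r.order);
    }
}
```

Because `prepareInsertQueryParameters` serializes the entire entity with `JsonObject.mapFrom`, the JSON column also duplicates `id` and `jobExecutionId`; the read path simply prefers the dedicated columns.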
@@ -56,6 +56,7 @@
import static java.lang.String.format;
import static java.util.Objects.nonNull;
import static org.apache.commons.lang3.StringUtils.EMPTY;
import static org.folio.dao.IncomingRecordDaoImpl.INCOMING_RECORDS_TABLE;
import static org.folio.dao.util.JobExecutionDBConstants.COMPLETED_DATE_FIELD;
import static org.folio.dao.util.JobExecutionDBConstants.CURRENTLY_PROCESSED_FIELD;
import static org.folio.dao.util.JobExecutionDBConstants.ERROR_STATUS_FIELD;
@@ -597,12 +598,13 @@ public Future<Boolean> hardDeleteJobExecutions(long diffNumberOfDays, String ten
return Future.succeededFuture();
}

UUID[] uuids = jobExecutionIds.stream().map(UUID::fromString).collect(Collectors.toList()).toArray(UUID[]::new);
UUID[] uuids = jobExecutionIds.stream().map(UUID::fromString).toList().toArray(UUID[]::new);

Future<RowSet<Row>> jobExecutionProgressFuture = Future.future(rowSetPromise -> deleteFromRelatedTable(PROGRESS_TABLE_NAME, uuids, sqlConnection, tenantId, rowSetPromise, postgresClient));
Future<RowSet<Row>> jobExecutionSourceChunksFuture = Future.future(rowSetPromise -> deleteFromRelatedTableWithDeprecatedNaming(JOB_EXECUTION_SOURCE_CHUNKS_TABLE_NAME, uuids, sqlConnection, tenantId, rowSetPromise, postgresClient));
Future<RowSet<Row>> journalRecordsFuture = Future.future(rowSetPromise -> deleteFromRelatedTable(JOURNAL_RECORDS_TABLE_NAME, uuids, sqlConnection, tenantId, rowSetPromise, postgresClient));
return CompositeFuture.all(jobExecutionProgressFuture, jobExecutionSourceChunksFuture, journalRecordsFuture)
Future<RowSet<Row>> incomingRecordsFuture = Future.future(rowSetPromise -> deleteFromRelatedTable(INCOMING_RECORDS_TABLE, uuids, sqlConnection, tenantId, rowSetPromise, postgresClient));
return CompositeFuture.all(jobExecutionProgressFuture, jobExecutionSourceChunksFuture, journalRecordsFuture, incomingRecordsFuture)
.compose(ar -> Future.<RowSet<Row>>future(rowSetPromise -> deleteFromJobExecutionTable(uuids, sqlConnection, tenantId, rowSetPromise, postgresClient)))
.map(true);
}));
@@ -4,9 +4,9 @@
import io.vertx.sqlclient.Row;
import io.vertx.sqlclient.RowSet;
import org.folio.rest.jaxrs.model.JobExecutionSummaryDto;
import org.folio.rest.jaxrs.model.JobLogEntryDtoCollection;
import org.folio.rest.jaxrs.model.JournalRecord;
import org.folio.rest.jaxrs.model.RecordProcessingLogDto;
import org.folio.rest.jaxrs.model.RecordProcessingLogDtoCollection;

import java.util.List;
import java.util.Optional;
@@ -55,7 +55,7 @@ public interface JournalRecordDao {
Future<Boolean> deleteByJobExecutionId(String jobExecutionId, String tenantId);

/**
* Searches for JobLogEntryDto entities by jobExecutionId and sorts them using specified sort criteria and direction
* Searches for RecordProcessingLogDtoCollection by jobExecutionId and sorts them using specified sort criteria and direction
*
* @param jobExecutionId job execution id
* @param sortBy sorting criteria
@@ -67,7 +67,7 @@ public interface JournalRecordDao {
* @param tenantId tenantId
* @return future with RecordProcessingLogDtoCollection
*/
Future<JobLogEntryDtoCollection> getJobLogEntryDtoCollection(String jobExecutionId, String sortBy, String order, boolean errorsOnly, String entityType, int limit, int offset, String tenantId);
Future<RecordProcessingLogDtoCollection> getRecordProcessingLogDtoCollection(String jobExecutionId, String sortBy, String order, boolean errorsOnly, String entityType, int limit, int offset, String tenantId);

/**
* Searches for RecordProcessingLogDto entities by jobExecutionId and recordId