NuGet Libraries
Authorization
Permissions
Builders
General Import flow description
Example Import flows
API Documentation
API Response
ImportJob & DataSource States
Errors
.NET Console Application - How-to
.NET Framework & Kepler Console Application - How-to
PowerShell scripts - How-to
The Relativity Import Service API is a Kepler service that provides functionality for importing large numbers of documents, images, and Relativity Dynamic Objects (RDOs) into a Relativity workspace.
The import process operates on structured data sets that are described by a load file and located in a place accessible to the workspace.
The main principle of operation is based on creating a managed import job with a list of data sets (intended for import) assigned to it.
Thanks to the RESTful API you can easily create an import job, configure it, and run it. A dataset (containing structured data) that you want to import can then be added as a source to the job. The system takes care of the rest in the background: it adds the source to the queue, schedules it, starts importing the data to the destination workspace, and, if necessary, resumes the import process. All that remains for the user is to monitor the import status and current progress - all through the provided API.
Job and data source configurations allow you to flexibly adjust the import to your needs. In addition, the adopted error handling helps you identify the source of potential problems.
NOTE: Import Service (Import) is delivered as a RAP application installed in Relativity One.
-
The following Relativity applications must be installed:
Application name | Application GUID | Where installed |
---|---|---|
Import | 21F65FDC-3016-4F2B-9698-DE151A6186A2 | workspace |
DataTransfer.Legacy | 9f9d45ff-5dcd-462d-996d-b9033ea8cfce | instance |
-
Appropriate user permissions need to be set.
-
Data set - load files and source files (native documents, images, text files) - needs to be placed in a destination fileshare location accessible to the workspace. You can use TAPI (https://platform.relativity.com/RelativityOne/#Transfer_API/Relativity_Transfer_SDK.htm) to upload all files to the destination location.
-
The following packages need to be installed in the client application: NOTE: Required only when the Kepler .NET client is used.
- Relativity.Import.SDK
- Relativity.Import.Models.SDK (added automatically as dependency to Relativity.Import.SDK)
- Relativity.Kepler.Client.SDK (added automatically as dependency to Relativity.Import.SDK)
Data importing - Functionality that uploads structured data sets into the destination workspace.
Dataset - Structured data containing metadata, native documents, images, and text files, described by a load file or Opticon file. A dataset can be referenced during data source configuration and MUST be located in a place accessible to the workspace.
ImportJob - The main object in the import service taking part in the import flow. It represents a single import entity described by its configuration, which determines import behavior, e.g. import type, overlay mode, fields mapping, etc.
In addition, the ImportJob object holds information about its current state and import progress.
An import job aggregates data sources - a single import job can consist of many sources.
DataSource - An object that corresponds to a single set of data to be imported. Each data source has its own configuration that indicates the physical location of the data set (load file). The data source configuration also affects how data in the load file is read. In addition, the data source stores information about the current state and import progress of the particular source.
Kepler service - An API service created using the Relativity Kepler framework. This framework provides you with the ability to build custom REST endpoints via a .NET interface. Additionally, the Kepler framework includes a client proxy that you can use when interacting with the services through .NET. See more information
Item Error - An error that may occur during the import process and concerns only one imported record from the load file. Common reasons for these errors are data validation failures or records that already exist in the workspace.
Import Service is built as a standard Relativity Kepler service. It provides sets of endpoints that must be called sequentially in order to execute an import. The following sections outline how to make calls to the import service.
HTTP clients
You can make calls to the import service using any standard REST or HTTP client, because all APIs (Kepler APIs) are exposed over the HTTP protocol. You need to set the required X-CSRF-Header. more details
HttpClient httpClient = new HttpClient();
httpClient.DefaultRequestHeaders.Add("X-CSRF-Header", "-");
var createImportJobUri = $"{host}/Relativity.REST/api/import.service/v1....
var response = await httpClient.PostAsJsonAsync(createImportJobUri, payload);
When using the .NET client, the Relativity.Import.Models.SDK package containing the contract models should be used.
Please look at the dedicated code samples for .NET 7 or for PowerShell scripts.
Kepler .NET client
You can access a Kepler service from any .NET language using the client library provided as part of the Kepler framework. It exposes a factory class that you can use to create the client proxy by passing URIs to import services and credentials. Then use the .NET proxy to interact with the import service as a set of .NET objects. When you call a member method, the proxy makes a corresponding HTTP request to the respective service endpoint. more details
Kepler contracts for the import service are exposed in the Relativity.Import.SDK package.
using (Relativity.Import.V1.Services.IImportJobController importJobController =_serviceFactory.CreateProxy<Relativity.Import.V1.Services.IImportJobController>())
{
// Create import job.
Response response = await importJobController.CreateAsync(
importJobID: importId,
workspaceID: workspaceId,
applicationName: "Import-service-sample-app",
correlationID: "Sample-job-0001");
}
Please look at dedicated code samples for .NET 4.6.2 with Kepler.
Relativity.Import.SDK is a .NET library that contains Kepler interfaces for the import service. It provides and simplifies executing imports in a client application. Relativity.Import.SDK targets .NET Framework 4.6.2.
NOTE: Use this package when your application USES Kepler.
Install-Package Relativity.Import.SDK
Relativity.Import.Models.SDK is a .NET library that contains contract models for the API and builders which help users prepare payloads in a correct and consistent way.
Relativity.Import.Models.SDK targets .NET Standard 2.0. The NuGet package also includes direct targets for .NET Framework 4.6.2.
NOTE:
This package is automatically installed as a dependency when using Relativity.Import.SDK.
NOTE: You can install this package directly when your application does not use Kepler.
Install-Package Relativity.Import.Models.SDK
HTTP clients
Import Service API conforms to the same authentication rules as other Relativity REST APIs.
More details can be found under the following link: REST_API_authentication
Kepler .NET client
The Kepler framework uses a proxy to handle client requests. More details can be found under the following link: Proxies_and_authentication
The following Relativity permissions are required to use the import features provided by the Import Service API.
Object Security section | Permission |
---|---|
• Document | View, Add, Edit |
• Relativity Import Job | View, Add, Edit |
• Relativity Import Data Source | View, Add, Edit |
Tab Visibility |
---|
• Documents |
Admin Operation |
---|
• Allow Import |
Builders provided in the Relativity.Import.Models.SDK package help to create settings for import jobs and data sources in a correct and consistent way. It is highly recommended to prepare these objects this way in a .NET application. The builders are implemented in a fluent API pattern, so they are very easy to use. Moreover, using them in a client application avoids the risk of incorrect and inconsistent configuration, which may lead to errors during the import process.
ImportDocumentsSettingsBuilder - builds ImportDocumentsSettings used for import job configuration (documents import).
ImportRdoSettingsBuilder - builds ImportRdoSettings used for import job configuration (RDO import).
DataSourceSettingsBuilder - builds DataSourceSettings used for data source configuration.
C#
// Example of using ImportDocumentSettingsBuilder to create ImportDocumentSettings.
ImportDocumentSettings importSettings = ImportDocumentSettingsBuilder.Create()
    .WithOverlayMode(x => x
        .WithKeyField(overlayKeyField)
        .WithMultiFieldOverlayBehaviour(MultiFieldOverlayBehaviour.MergeAll))
    .WithNatives(x => x
        .WithFilePathDefinedInColumn(filePathColumnIndex)
        .WithFileNameDefinedInColumn(fileNameColumnIndex))
    .WithoutImages()
    .WithFieldsMapped(x => x
        .WithField(controlNumberColumnIndex, "Control Number")
        .WithExtractedTextInSeparateFiles(f => f
            .WithEncoding("UTF-16")
            .WithFileSizeDefinedInColumn(fileSizeColumnIndex)))
    .WithFolders(f => f
        .WithRootFolderID(rootFolderId, r => r
            .WithFolderPathDefinedInColumn(folderPathColumnIndex)));
C#
// Example of using DataSourceSettingsBuilder to create DataSourceSettings.
DataSourceSettings dataSourceSettings = DataSourceSettingsBuilder.Create()
.ForLoadFile(loadFile01Path)
.WithDelimiters(d => d
.WithColumnDelimiters('|')
.WithQuoteDelimiter('^')
.WithNewLineDelimiter('#')
.WithNestedValueDelimiter('&')
.WithMultiValueDelimiter('$'))
.WithFirstLineContainingHeaders()
.WithEndOfLineForWindows()
.WithStartFromBeginning()
.WithDefaultEncoding()
.WithDefaultCultureInfo();
NOTE: Please review the samples to learn more about the builders.
The general flow consists of several steps, each involving sending an appropriate HTTP request.
-
Create Import Job
Creates an import job entity in a particular workspace. The job is defined by its unique ID, generated by the user and provided in the request, which is used in the next steps.
-
Configure Import Job
Configures an existing import job by defining a set of significant parameters, including the import type, its mode, and fields mapping. This step covers two configuration options:
-
Documents Configuration
-
Rdos Configuration.
-
-
Add one or multiple DataSources
Creates a data source entity or entities for a particular import job. A data source represents the configuration that corresponds to the dataset being imported. It is identified by its unique ID, generated by the user and provided in the request. The data source configuration includes the path to the "load file" and other significant parameters telling the system how data in the load file will be read and interpreted.
Many data sources can be added to the same import job. Data sources can be added both before the job is started and after, so the user can add additional sources to a running import job.
-
Begin Job
Starts the import job, which enables the process that schedules importing data to the workspace based on the configuration assigned in previous steps. A started job does not mean that data is instantly imported; rather, data sources are added to the queue and scheduled by a background mechanism. The import job state and data source states show the current stage. -
Cancel Job
The user can cancel a running import job at any moment. All related data sources will not be imported, except for those whose import has already started.
-
End Import Job
Ends an import job that was already started. This is an optional step, but it is highly recommended when no more data sources are planned to be added to the particular job. All data sources added to the job before the end request was sent will be imported.
-
Create Import Job
curl
curl -X POST 'https://relativity-host/Relativity.REST/api/import-service/v1/workspaces/10000/import-jobs/e694ad62-198d-4ecb-936d-1862ddfa4235' -H 'X-CSRF-Header: -' -d '{ "applicationName": "simpleImportDocuments", "correlationID": "c0r31ati0n_ID" }'
-
Create Import Job Configuration
curl
curl -X POST 'https://relativity-host/Relativity.REST/api/import-service/v1/workspaces/10000/import-jobs/e694ad62-198d-4ecb-936d-1862ddfa4235/documents-configurations/' -H 'X-CSRF-Header: -' -d "$importSettingsPayloadJson"
Import Configuration payload example:
JSON
{
  "importSettings": {
    "Overlay": null,
    "Native": {
      "FilePathColumnIndex": "22",
      "FileNameColumnIndex": "13"
    },
    "Image": null,
    "Fields": {
      "FieldMappings": [
        {
          "ColumnIndex": 0,
          "Field": "Control Number",
          "ContainsID": false,
          "ContainsFilePath": false
        }
      ]
    },
    "Folder": {
      "RootFolderID": 1003663,
      "FolderPathColumnIndex": 2
    }
  }
}
C# Builders
ImportDocumentSettings importSettings = ImportDocumentSettingsBuilder.Create()
    .WithAppendMode()
    .WithNatives(x => x
        .WithFilePathDefinedInColumn(filePathColumnIndex)
        .WithFileNameDefinedInColumn(fileNameColumnIndex))
    .WithoutImages()
    .WithFieldsMapped(x => x
        .WithField(controlNumberColumnIndex, "Control Number"))
    .WithFolders(f => f
        .WithRootFolderID(rootFolderId, r => r
            .WithFolderPathDefinedInColumn(folderPathColumnIndex)));
C#
ImportDocumentSettings importSettings = new ImportDocumentSettings()
{
    Overlay = null,
    Native = new NativeSettings
    {
        FileNameColumnIndex = fileNameColumnIndex,
        FilePathColumnIndex = filePathColumnIndex,
    },
    Fields = new FieldsSettings
    {
        FieldMappings = new[]
        {
            new FieldMapping
            {
                Field = "Control Number",
                ContainsID = false,
                ColumnIndex = 0,
                ContainsFilePath = false,
            },
        },
    },
    Folder = new FolderSettings
    {
        FolderPathColumnIndex = folderPathColumnIndex,
        RootFolderID = 1003663,
    },
    Other = null,
};
-
Add DataSource
curl
curl -X POST 'https://relativity-host/Relativity.REST/api/import-service/v1/workspaces/10000/import-jobs/e694ad62-198d-4ecb-936d-1862ddfa4235/sources/0cb922a2-8df4-42fd-9429-c241410a0d1e' -H 'X-CSRF-Header: -' -H 'Content-Type: application/json' -d "$dataSourceSettingsJson"
Data source configuration payload example:
JSON
{
  "dataSourceSettings": {
    "Path": "\\\\files\\T001\\StructuredData\\Import\\SampleDataSources\\load_file.dat",
    "FirstLineContainsColumnNames": true,
    "StartLine": 1,
    "ColumnDelimiter": "|",
    "QuoteDelimiter": "^",
    "NewLineDelimiter": "#",
    "MultiValueDelimiter": ";",
    "NestedValueDelimiter": "&",
    "EndOfLine": 0,
    "Encoding": null,
    "CultureInfo": "en-US",
    "Type": 2
  }
}
C# builders
DataSourceSettings dataSourceSettings = DataSourceSettingsBuilder.Create()
    .ForLoadFile("\\\\files\\T001\\StructuredData\\Import\\SampleDataSources\\load_file.dat")
    .WithDelimiters(d => d
        .WithColumnDelimiters('|')
        .WithQuoteDelimiter('^')
        .WithNewLineDelimiter('#')
        .WithNestedValueDelimiter('&')
        .WithMultiValueDelimiter(';'))
    .WithFirstLineContainingHeaders()
    .WithEndOfLineForWindows()
    .WithStartFromBeginning()
    .WithDefaultEncoding()
    .WithDefaultCultureInfo();
C#
DataSourceSettings dataSourceSettings = new DataSourceSettings
{
    Type = DataSourceType.LoadFile,
    Path = "\\\\files\\T001\\StructuredData\\Import\\SampleDataSources\\load_file.dat",
    NewLineDelimiter = '#',
    ColumnDelimiter = '|',
    QuoteDelimiter = '^',
    MultiValueDelimiter = ';',
    NestedValueDelimiter = '&',
    Encoding = null,
    CultureInfo = "en-us",
    EndOfLine = DataSourceEndOfLine.Windows,
    FirstLineContainsColumnNames = true,
    StartLine = 0,
};
-
Begin Job
curl
curl -X POST 'https://relativity-host/Relativity.REST/api/import-service/v1/workspaces/10000/import-jobs/e694ad62-198d-4ecb-936d-1862ddfa4235/begin/' -H 'X-CSRF-Header: -' -d ''
-
End Import Job
curl
curl -X POST 'https://relativity-host/Relativity.REST/api/import-service/v1/workspaces/10000/import-jobs/e694ad62-198d-4ecb-936d-1862ddfa4235/end/' -H 'X-CSRF-Header: -' -d ''
-
Create Import Job
curl
curl -X POST 'https://relativity-host/Relativity.REST/api/import-service/v1/workspaces/10000/import-jobs/4c4215bf-d8a3-48d4-a3e0-3a40428415e7/' -H 'X-CSRF-Header: -' -d '{ "applicationName": "simpleImportImages", "correlationID": "img0r22ati0n_ID" }'
-
Create Import Job Configuration
curl
curl -X POST 'https://relativity-host/Relativity.REST/api/import-service/v1/workspaces/10000/import-jobs/4c4215bf-d8a3-48d4-a3e0-3a40428415e7/documents-configurations/' -H 'X-CSRF-Header: -' -d "$importSettings"
Import Configuration payload example:
JSON
{
  "importSettings": {
    "Overlay": null,
    "Native": null,
    "Image": {
      "PageNumbering": 1,
      "ProductionID": null,
      "LoadExtractedText": false,
      "FileType": 0
    },
    "Fields": null,
    "Folder": null
  }
}
C# Builder
ImportDocumentSettings importSettings = ImportDocumentSettingsBuilder.Create()
    .WithAppendMode()
    .WithoutNatives()
    .WithImages(i => i
        .WithAutoNumberImages()
        .WithoutProduction()
        .WithoutExtractedText()
        .WithFileTypeAutoDetection())
    .WithoutFieldsMapped()
    .WithoutFolders();
-
Add DataSource
curl
curl -X POST 'https://relativity-host/Relativity.REST/api/import-service/v1/workspaces/10000/import-jobs/4c4215bf-d8a3-48d4-a3e0-3a40428415e7/sources/0cb922a2-8df4-42fd-9429-c241410a0002' -H 'X-CSRF-Header: -' -H 'Content-Type: application/json' -d "$dataSourceSettingsJson"
JSON
{
  "dataSourceSettings": {
    "Path": "\\\\files\\T001\\StructuredData\\Import\\SampleDataSources\\opticon_file.opt",
    "FirstLineContainsColumnNames": false,
    "StartLine": 0,
    "EndOfLine": 0,
    "Encoding": null,
    "CultureInfo": null,
    "Type": 1
  }
}
C# Builders
DataSourceSettings dataSourceSettings = DataSourceSettingsBuilder.Create()
    .ForOpticonFile("\\\\files\\T001\\StructuredData\\Import\\SampleDataSources\\opticon_file.opt")
    .WithDefaultDelimitersForOpticonFile()
    .WithEndOfLineForWindows()
    .WithStartFromBeginning()
    .WithDefaultEncoding()
    .WithDefaultCultureInfo();
C#
DataSourceSettings dataSourceSettings = new DataSourceSettings
{
    Type = DataSourceType.Opticon,
    Path = "\\\\files\\T001\\StructuredData\\Import\\SampleDataSources\\opticon_file.opt",
    NewLineDelimiter = default,
    ColumnDelimiter = default,
    QuoteDelimiter = default,
    MultiValueDelimiter = default,
    NestedValueDelimiter = default,
    Encoding = null,
    CultureInfo = null,
    EndOfLine = DataSourceEndOfLine.Windows,
    FirstLineContainsColumnNames = false,
    StartLine = 0,
};
-
Begin Job
curl
curl -X POST 'https://relativity-host/Relativity.REST/api/import-service/v1/workspaces/10000/import-jobs/4c4215bf-d8a3-48d4-a3e0-3a40428415e7/begin/' -H 'X-CSRF-Header: -' -d ''
-
End Import Job
curl
curl -X POST 'https://relativity-host/Relativity.REST/api/import-service/v1/workspaces/10000/import-jobs/4c4215bf-d8a3-48d4-a3e0-3a40428415e7/end/' -H 'X-CSRF-Header: -' -d ''
-
Create Import Job
curl
curl -X POST 'https://relativity-host/Relativity.REST/api/import-service/v1/workspaces/10000/import-jobs/77140fb9-f515-4b65-a2ce-c347492e2905/' -H 'X-CSRF-Header: -' -d '{ "applicationName": "simpleImportRdo", "correlationID": "rdor31ati0n_ID" }'
-
Create Import Job Configuration
curl
curl -X POST 'https://relativity-host/Relativity.REST/api/import-service/v1/workspaces/10000/import-jobs/77140fb9-f515-4b65-a2ce-c347492e2905/rdo-configurations/' -H 'X-CSRF-Header: -' -d "$importRdoSettings"
Import RDO Configuration payload example:
JSON
{
  "importSettings": {
    "OverwriteMode": "Append",
    "Fields": {
      "FieldMappings": [
        {
          "ColumnIndex": 0,
          "Field": "Name"
        }
      ]
    },
    "Rdo": {
      "ArtifactTypeID": 1000066,
      "ParentColumnIndex": null
    }
  }
}
C# Builder
ImportRdoSettings importSettings = ImportRdoSettingsBuilder.Create()
    .WithAppendMode()
    .WithFieldsMapped(f => f
        .WithField(nameColumnIndex, "Name"))
    .WithRdo(r => r
        .WithArtifactTypeId(domainArtifactTypeID)
        .WithoutParentColumnIndex());
C#
ImportRdoSettings importSettings = new ImportRdoSettings()
{
    Overlay = null,
    Fields = new FieldsSettings
    {
        FieldMappings = new[]
        {
            new FieldMapping
            {
                Field = "Name",
                ContainsID = false,
                ColumnIndex = nameColumnIndex,
                ContainsFilePath = false,
            },
        },
    },
    Rdo = new RdoSettings
    {
        ArtifactTypeID = rdoArtifactTypeID,
        ParentColumnIndex = null,
    },
};
-
Add DataSource
curl
curl -X POST 'https://relativity-host/Relativity.REST/api/import-service/v1/workspaces/10000/import-jobs/4c4215bf-d8a3-48d4-a3e0-3a40428415e7/sources/0cb922a2-8df4-42fd-9429-c241410a0002' -H 'X-CSRF-Header: -' -H 'Content-Type: application/json' -d "$dataSourceSettingsJson"
Data source configuration payload example:
JSON
{
  "dataSourceSettings": {
    "Path": "\\\\files\\T001\\StructuredData\\Import\\SampleDataSources\\load_file.dat",
    "FirstLineContainsColumnNames": true,
    "StartLine": 1,
    "ColumnDelimiter": "|",
    "QuoteDelimiter": "^",
    "NewLineDelimiter": "#",
    "MultiValueDelimiter": ";",
    "NestedValueDelimiter": "&",
    "EndOfLine": 0,
    "Encoding": null,
    "CultureInfo": "en-US",
    "Type": 2
  }
}
C# builder
DataSourceSettings dataSourceSettings = DataSourceSettingsBuilder.Create()
    .ForLoadFile("\\\\files\\T001\\StructuredData\\Import\\SampleDataSources\\load_file.dat")
    .WithDelimiters(d => d
        .WithColumnDelimiters('|')
        .WithQuoteDelimiter('^')
        .WithNewLineDelimiter('#')
        .WithNestedValueDelimiter('&')
        .WithMultiValueDelimiter(';'))
    .WithFirstLineContainingHeaders()
    .WithEndOfLineForWindows()
    .WithStartFromBeginning()
    .WithDefaultEncoding()
    .WithDefaultCultureInfo();
C#
DataSourceSettings dataSourceSettings = new DataSourceSettings
{
    Type = DataSourceType.LoadFile,
    Path = "\\\\files\\T001\\StructuredData\\Import\\SampleDataSources\\load_file.dat",
    NewLineDelimiter = '#',
    ColumnDelimiter = '|',
    QuoteDelimiter = '^',
    MultiValueDelimiter = ';',
    NestedValueDelimiter = '&',
    Encoding = null,
    CultureInfo = "en-us",
    EndOfLine = DataSourceEndOfLine.Windows,
    FirstLineContainsColumnNames = true,
    StartLine = 0,
};
-
Begin Job
curl
curl -X POST 'https://relativity-host/Relativity.REST/api/import-service/v1/workspaces/10000/import-jobs/77140fb9-f515-4b65-a2ce-c347492e2905/begin/' -H 'X-CSRF-Header: -' -d ''
-
End Import Job
curl
curl -X POST 'https://relativity-host/Relativity.REST/api/import-service/v1/workspaces/10000/import-jobs/77140fb9-f515-4b65-a2ce-c347492e2905/end/' -H 'X-CSRF-Header: -' -d ''
Review the OpenAPI spec for the import service: OpenAPI spec
Each HTTP response to a POST request has a unified schema:
{
"IsSuccess": true,
"ErrorMessage": "",
"ErrorCode": "",
"ImportJobID": "00000000-0000-0000-0000-000000000000"
}
Each HTTP response to a GET request has a unified schema:
{
"Value": {
....
....
},
"IsSuccess": true,
"ErrorMessage": "",
"ErrorCode": "",
"ImportJobID": "00000000-0000-0000-0000-000000000000"
}
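As a minimal sketch (the model class below is a local stand-in shaped after the schema above, not an official contract class from Relativity.Import.Models.SDK), the unified response can be deserialized and checked before a client proceeds to the next call:

```csharp
using System;
using System.Text.Json;

// Hypothetical minimal model mirroring the unified response schema above;
// not an official type from the Import SDK packages.
public class UnifiedResponse
{
    public bool IsSuccess { get; set; }
    public string ErrorMessage { get; set; }
    public string ErrorCode { get; set; }
    public Guid ImportJobID { get; set; }
}

public static class UnifiedResponseDemo
{
    // Deserialize the raw HTTP response body into the model.
    public static UnifiedResponse Parse(string json) =>
        JsonSerializer.Deserialize<UnifiedResponse>(json);

    // Throw when the service reported a failure, so callers can
    // assume success after this returns.
    public static UnifiedResponse EnsureSuccess(UnifiedResponse response)
    {
        if (!response.IsSuccess)
        {
            throw new InvalidOperationException(
                $"[{response.ErrorCode}] {response.ErrorMessage}");
        }
        return response;
    }
}
```

A client could wrap every POST/GET in `EnsureSuccess(Parse(body))` so a failed step stops the flow with the service-provided error code and message.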
The import job state can be read from the GET import job details response:
curl
curl -X GET 'https://relativity-host/Relativity.REST/api/import-service/v1/workspaces/10000/import-jobs/ca04baf0-4a1a-4787-94d8-5bba89d2eb0f/details'
-H 'X-CSRF-Header: -'
response JSON
{
"Value": {
"IsFinished": false,
"State": "New",
"ApplicationName": "import demo",
"Errors": [],
"CreatedBy": 9,
"CreatedOn": "2023-01-11T13:47:45.513",
"LastModifiedBy": 9,
"LastModifiedOn": "2023-01-11T13:47:45.513"
},
"IsSuccess": true,
"ErrorMessage": "",
"ErrorCode": "",
"ImportJobID": "ca04baf0-4a1a-4787-94d8-5bba89d2eb0f"
}
Value | State | Description |
---|---|---|
10 | New | Initial state, job created. |
13 | Configured | Job has been configured and is ready to begin. |
16 | InvalidConfiguration | Job has been configured but the configuration is invalid. |
20 | Idle | Job is ready to run but is waiting on a new data source, or all data sources have been processed. |
22 | Scheduled | Job is waiting in the queue to begin the import process. |
25 | Inserting | Job is executing; import of a data source is currently in progress. |
26 | PendingCompletion_Scheduled | Job has ended but a data source is still waiting in the queue to begin the import process. |
27 | PendingCompletion_Inserting | Job has ended but the import of a data source is still in progress. |
29 | Paused | Job is paused and waiting. |
30 | Canceled | Job canceled. |
40 | Failed | Job has failed to import data. |
50 | Completed | Job has ended with success. |
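For client-side polling, the numeric values in the table above can be captured in a local enum. This is a sketch, not an SDK type, and the grouping of terminal states (Canceled, Failed, Completed) is an assumption based on the state descriptions:

```csharp
using System;

// Hypothetical enum mirroring the job state table above
// (values taken from the "Value" column); not an official SDK type.
public enum ImportJobState
{
    New = 10,
    Configured = 13,
    InvalidConfiguration = 16,
    Idle = 20,
    Scheduled = 22,
    Inserting = 25,
    PendingCompletion_Scheduled = 26,
    PendingCompletion_Inserting = 27,
    Paused = 29,
    Canceled = 30,
    Failed = 40,
    Completed = 50,
}

public static class ImportJobStateExtensions
{
    // Assumption: Canceled, Failed, and Completed are the states after which
    // no further progress occurs, so a polling loop can stop there.
    public static bool IsTerminal(this ImportJobState state) =>
        state == ImportJobState.Canceled
        || state == ImportJobState.Failed
        || state == ImportJobState.Completed;
}
```

A monitoring loop could call the details endpoint periodically and stop once `IsTerminal` returns true, instead of hard-coding individual state checks at each call site.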
The data source state can be read from the GET data source details response:
curl
curl -X GET 'https://relativity-host/Relativity.REST/api/import-service/v1/workspaces/10000/import-jobs/ca04baf0-4a1a-4787-94d8-5bba89d2eb0f/sources/40ddb007-4330-41cc-b5aa-2ea6961073a5/details'
-H 'X-CSRF-Header: -'
response JSON
{
"Value": {
"State": "New",
"DataSourceSettings": {
...
...
},
"JobLevelErrors": []
},
"IsSuccess": true,
"ErrorMessage": "",
"ErrorCode": "",
"ImportJobID": "ca04baf0-4a1a-4787-94d8-5bba89d2eb0f"
}
Value | State | Description |
---|---|---|
0 | Unknown | Invalid state for a data source. |
10 | New | Initial state, data source was created. |
22 | Scheduled | Data source is waiting in the queue to begin the import process. |
24 | PendingInserting | Data source has been sent to a worker to begin the import. |
25 | Inserting | Data source is currently being processed. |
30 | Canceled | Data source canceled. |
40 | Failed | Failed to import data from Data source. |
45 | CompletedWithItemErrors | Data source processed, import finished with item errors. |
50 | Completed | Data source processed, import finished. |
Import Service reports two kinds of errors:
- job level errors
- item level errors.
Job level errors are severe enough to cause the entire import job to fail. These errors can be found in the GetDetailsAsync endpoints for IImportJobController and IImportSourceController.
Item level errors are specific to rows within the data source being imported. Unlike job level errors, item level errors do not cause the entire import job to fail. Instead, they are logged and the import process continues with the next row from the load file. Item level errors can result in a whole record (meaning a document or RDO) not being imported to the workspace, or in the record in the workspace being incomplete. You can retrieve all item level errors that occurred during the import process from the GetItemErrorsAsync endpoint of IImportSourceController.
Error handling in the Import Service returns error codes and error messages:
- in every response to a failed HTTP request
- when requested by the user, for all job level errors that occurred during the import of a particular data source, e.g.:
curl
curl -X GET 'https://relativity-host/Relativity.REST/api/import-service/v1/workspaces/10000/import-jobs/00000000-0000-0000-0000-000000000000/sources/00000000-0000-0000-0000-000000000000/details'
-H 'accept: application/json'
-H 'X-CSRF-Header: -'
JSON response
{
"Value": {
"State": "Failed",
"DataSourceSettings": {
"Path": "file.dat",
"EndOfLine": "Windows",
"Type": "LoadFile",
"FirstLineContainsColumnNames": true,
"StartLine": 0,
"Encoding": "utf-8",
"ColumnDelimiter": "a",
"QuoteDelimiter": "b",
"NewLineDelimiter": "c",
"MultiValueDelimiter": "d",
"NestedValueDelimiter": "e"
},
"JobLevelErrors": [
{
"LineNumber": -1,
"ErrorDetails": [
{
"ColumnIndex": -1,
"ErrorCode": "S.RD.EXT.0217",
"ErrorMessage": "Cannot read Data Source. Could not open file for reading by RestartableStream.",
"ErrorMessageTemplate": "Cannot read Data Source. Could not open file for reading by RestartableStream.",
"ErrorProperties": {}
}
]
}
],
"CreatedBy": 777,
"CreatedOn": "2022-10-18T15:09:12.69",
"LastModifiedBy": 777,
"LastModifiedOn": "2022-10-18T15:10:00.497"
},
"IsSuccess": true,
"ErrorMessage": "",
"ErrorCode": "",
"ImportJobID": "00000000-0000-0000-0000-000000000000"
}
- when requested by the user, for all item errors that occurred during the import of a particular data source, e.g.:
curl
curl -X GET 'https://relativity-host/Relativity.REST/api/import-service/v1/workspaces/10000/import-jobs/00000000-0000-0000-0000-000000000000/sources/00000000-0000-0000-0000-000000000000/itemerrors?start=0&length=10'
-H 'accept: application/json'
-H 'X-CSRF-Header: -'
JSON response
{
"Value": {
"DataSourceID": "00000000-0000-0000-0000-000000000000",
"Errors": [
{
"ErrorDetails": [
{
"ColumnIndex": 1,
"ErrorCode": "S.LN.INT.0001",
"ErrorMessage": "Error message.",
"ErrorMessageTemplate": "Template error message.",
"ErrorProperties": {
"additionalProp1": "string",
"additionalProp2": "string",
"additionalProp3": "string"
}
}
],
"LineNumber": 1
}
],
"TotalCount": 1,
"NumberOfSkippedRecords": 0,
"NumberOfRecords": 1,
"HasMoreRecords": false
},
"IsSuccess": true,
"ErrorMessage": "",
"ErrorCode": "",
"ImportJobID": "00000000-0000-0000-0000-000000000000"
}
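As an illustration of working with this payload (the model classes below are local stand-ins shaped after the JSON above, not the official contract models), a page of item errors can be flattened into one message per error detail, e.g. for logging:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical stand-in models shaped after the item-errors JSON above;
// not official contract types from Relativity.Import.Models.SDK.
public class ItemErrorDetail
{
    public int ColumnIndex { get; set; }
    public string ErrorCode { get; set; }
    public string ErrorMessage { get; set; }
}

public class ItemError
{
    public int LineNumber { get; set; }
    public List<ItemErrorDetail> ErrorDetails { get; set; } = new List<ItemErrorDetail>();
}

public static class ItemErrorsDemo
{
    // Produce one human-readable line per error detail,
    // keyed by load-file line and column.
    public static IEnumerable<string> Flatten(IEnumerable<ItemError> errors) =>
        errors.SelectMany(e => e.ErrorDetails.Select(d =>
            $"line {e.LineNumber}, column {d.ColumnIndex}: [{d.ErrorCode}] {d.ErrorMessage}"));
}
```

Since the response is paged (`HasMoreRecords`, `start`, `length`), a real client would repeat the request with an advanced `start` offset until all errors are collected.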
An error code returned from the Import Service API endpoint has the following structure:
[Resource].[Action].[ErrorType].[ErrorNumber]
Examples:
Error code | Description |
---|---|
J.CR.VLD.1501 | Cannot create job because validation has failed. |
Resource code | Description |
---|---|
J | Job |
C | Document Configuration |
S | Source |
E | ItemErrors |
R | RDO Configuration |
Action code | Description |
---|---|
BEG | Begin |
CR | Create |
CNL | Cancel |
END | End |
GET | Get |
GET_COL | Get columns |
GET_CFG | Get config |
GET_DAT | Get data |
GET_DTLS | Get details |
GET_PRG | Get progress |
LN | Line |
PS | Pause |
RD | Read |
RES | Resume |
RUN | Run |
Error type code | Description |
---|---|
INT | Internal service error |
EXT | External dependency error |
VLD | Validation error |
The error number has 4 digits. The digits in the first and second positions have special meaning.
The meaning of the first digit is the same for all error types.
Error number | Description |
---|---|
0XXX | General error |
1XXX | Job related error |
2XXX | Configuration related error |
3XXX | Source related error |
4XXX | ItemErrors related error |
The meaning of the second digit differs for each error type.
Error type | Error number | Description |
---|---|---|
INT | X[0-9]XX | Service errors |
EXT | X[0-9]XX | Runtime errors |
VLD | X0XX | Invalid input data |
VLD | X5XX | System state does not allow the request to be executed |
VLD | X6XX | Data in the system does not exist |
VLD | X7XX | Data in the system is incorrect |
VLD | X9XX | Data in the system is corrupted |
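The four-part structure above can be split mechanically. The following is a small sketch (not part of the SDK) that parses a code such as J.CR.VLD.1501 into its parts:

```csharp
using System;

// Hypothetical helper; the four-part shape follows the
// [Resource].[Action].[ErrorType].[ErrorNumber] structure described above.
public class ErrorCodeParts
{
    public string Resource { get; }
    public string Action { get; }
    public string ErrorType { get; }
    public string ErrorNumber { get; }

    public ErrorCodeParts(string resource, string action, string errorType, string errorNumber)
    {
        Resource = resource;
        Action = action;
        ErrorType = errorType;
        ErrorNumber = errorNumber;
    }
}

public static class ErrorCodeParser
{
    public static ErrorCodeParts Parse(string errorCode)
    {
        // Expected shape: four dot-separated segments.
        string[] parts = errorCode.Split('.');
        if (parts.Length != 4)
        {
            throw new ArgumentException($"Unexpected error code format: {errorCode}");
        }
        return new ErrorCodeParts(parts[0], parts[1], parts[2], parts[3]);
    }
}
```

For J.CR.VLD.1501 this yields resource J (Job), action CR (Create), error type VLD (validation), and number 1501, whose first digit marks it as a job related error per the tables above.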
There are three types of sample applications that demonstrate the use of Import Service API features.
- DotNetClientConsole - .NET console application (.NET 7, C#).
- KeplerClientConsole - .NET console application (.NET Framework 4.6.2, Kepler client, C#).
- REST client - PowerShell scripts.
Examples structure:
- Each code example for a particular import flow is contained in a separate file (e.g. Sample04_AddDataSourceToRunningJob.cs).
- Each sample is numbered. The sample number is included in the file and class name.
- The individual samples for each application are consistent. For instance, Sample_08 in Net7ConsoleClient presents the same import flow as in KeplerClientConsole and in the PowerShell scripts.
- Sample code contains accurate comments describing the flow.
- The expected console output of the demonstrated sample flow is included at the end of the file.
- The samples repository also contains the structured data sets used in all examples - load files, Opticon files, a folder structure with native files, images, and text files.
List of samples:
Sample name | .NET | .Net Framework & Kepler | PowerShell |
---|---|---|---|
Sample01_ImportNativeFiles | Sample01 | Sample01 | Sample01 |
Sample02_ImportDocumentsInOverlayMode | Sample02 | Sample02 | Sample02 |
Sample03_ImportFromTwoDataSources | Sample03 | Sample03 | Sample03 |
Sample04_AddDataSourceToRunningJob | Sample04 | Sample04 | Sample04 |
Sample05_ImportDocumentsWithExtractedText | Sample05 | Sample05 | Sample05 |
Sample06_ImportDocumentsToSelectedFolder | Sample06 | Sample06 | Sample06 |
Sample07_DirectImportSettingsForDocuments.cs | Sample07 | Sample07 | Sample07 |
Sample08_ImportImages | Sample08 | Sample08 | Sample08 |
Sample09_ImportProductionFiles | Sample09 | Sample09 | Sample09 |
Sample10_ImportImagesInAppendOverlayMode | Sample10 | Sample10 | Sample10 |
Sample11_DirectImportSettingsForImages | Sample11 | Sample11 | Sample11 |
Sample12_ImportRelativityDynamicObject | Sample12 | Sample12 | Sample12 |
Sample13_ImportRdoWithParent | Sample13 | Sample13 | Sample13 |
Sample14_DirectImportSettingsForRdo | Sample14 | Sample14 | Sample14 |
Sample15_ReadImportRdoSettings | Sample15 | Sample15 | Sample15 |
Sample16_ReadImportDocumentSettings | Sample16 | Sample16 | Sample16 |
Sample17_GetImportJobs | Sample17 | Sample17 | Sample17 |
Sample18_GetDataSource | Sample18 | Sample18 | Sample18 |
Sample19_GetImportJobDetailsAndProgress | Sample19 | Sample19 | Sample19 |
Sample20_GetDataSourceDetailsAndProgress | Sample20 | Sample20 | Sample20 |
Sample21_CancelStartedJob | Sample21 | Sample21 | Sample21 |
Sample22_ReadResponse | Sample22 | Sample22 | Sample22 |
Sample23_GetDataSourceErrors | Sample23 | Sample23 | Sample23 |
To run a sample code:
-
Copy the content of sample dataset to your Relativity fileshare.
-
Uncomment the line with the sample invocation you want to run in the Main method.
// await sampleCollection.Sample08_ImportImages();
// await sampleCollection.Sample09_ImportProductionFiles();
await sampleCollection.Sample10_ImportImagesInAppendOverlayMode();
-
Set the proper credentials and URI of your Relativity instance in the RelativityUserSettings helper class.
public class RelativityUserSettings
{
    public const string BaseAddress = "https://host/Relativity.REST/";
    public const string UserName = "[email protected]";
    public const string Password = "password!";
}
-
Update the workspaceId const to the ID of the workspace where you intend to import data. It is required in each sample.
// Destination workspace artifact ID.
const int workspaceId = 1000000;
-
Update the other IDs related to your workspace - productionSetsArtifactId, rootFolderId, rdoArtifactTypeID. They are required only by specific samples.
-
Update the const that defines the path to the load file (e.g. const string loadFile01Path) according to the location where you copied the sample data.
// Path to the file in Opticon format used in data source settings.
const string opticonFilePath = "\\\\files\\T001\\StructuredData\\Import\\SampleDataSources\\opticon_01.opt";
-
Run the application.
To run a sample code:
-
Copy the content of sample dataset to your Relativity fileshare.
-
Uncomment the line with the sample invocation you want to run in the Main method.
-
Set the proper credentials and host address of your Relativity instance in the RelativityUserSettings helper class.
public static class RelativityUserSettings
{
    public const string HostAddress = "https://host-address";
    public const string UserName = "[email protected]";
    public const string Password = "password!";
}
-
Update the workspaceId const to the ID of the workspace where you intend to import data. It is required in each sample.
-
Update the other IDs related to your workspace - productionSetsArtifactId, rootFolderId, rdoArtifactTypeID. They are required only by specific samples.
-
Update the const that defines the path to the load file (e.g. const string loadFile01Path) according to the location where you copied the sample data.
-
Run the application.
To run a sample code:
-
Install the PowerShell "Pester" module (version 5.3.3 or later).
Find-Module -Name "Pester" | Install-Module -Force;
-
Copy the content of sample dataset to your Relativity fileshare.
-
Uncomment the line with the sample invocation you want to run in run-sample-import.ps1.
Describe "Sample import" {
    . "$global:rootDir\SamplesCollection\sample01-import-native-files.ps1"
    # . "$global:rootDir\SamplesCollection\sample02-import-documents-in-overlay-mode.ps1"
-
Set the proper credentials and host address of your Relativity instance in "run-sample-import.ps1".
$hostAddress = "https://sample-host/"
$userName = "sample@username"
$password = "password!"
-
Update workspaceId to the ID of the workspace where you intend to import data. It is required in each sample.
-
Update the variable that defines the path to the load file or Opticon file (e.g. $opticonFilePath) according to the location where you copied the sample data.
$workspaceId = 1000000
$loadFilePath = "\\files\T001\StructuredData\Import\SampleDataSources\load_file_01.dat"
-
Update the other IDs related to your workspace - productionSetsArtifactId, rootFolderId, rdoArtifactTypeID. They are required only by specific samples.
-
Invoke run-sample-import.ps1.
For best performance, we highly recommend using UTF-16 encoding for any single long text field (including Extracted Text). Other encodings are still supported, but they are converted to UTF-16, which adds delay to the document or image import process.
For the document workflow, set FieldMapping.Encoding to UTF-16. Similarly, for the image workflow, set ImageSettings.ExtractedTextEncoding to UTF-16. With these settings in place, the conversion overhead is eliminated and your files are copied directly in UTF-16 encoding, resulting in faster processing times.
ImportDocumentSettings importDocuments = ImportDocumentSettingsBuilder.Create()
.WithAppendMode()
.WithNatives(x => x
.WithFilePathDefinedInColumn(filePathColumnIndex)
.WithFileNameDefinedInColumn(fileNameColumnIndex))
.WithoutImages()
.WithFieldsMapped(x => x
.WithField(controlNumberColumnIndex, "Control Number")
.WithExtractedTextField(extractedTextPathColumnIndex, e => e
.WithExtractedTextInSeparateFiles(f => f
.WithEncoding("UTF-16")
.WithFileSizeDefinedInColumn(fileSizeColumnIndex))))
.WithoutFolders();
ImportDocumentSettings importImages = ImportDocumentSettingsBuilder.Create()
.WithAppendMode()
.WithoutNatives()
.WithImages(i => i
.WithAutoNumberImages()
.WithoutProduction()
.WithExtractedText(e => e.WithEncoding("UTF-16"))
.WithFileTypeAutoDetection())
.WithoutFieldsMapped()
.WithoutFolders();
If your mapping contains more than one long text field, you must use UTF-16; no other encodings are supported in this case.
Another valuable setting that can enhance performance is the FieldMapping.FileSizeColumnIndex. By configuring this setting, the need for additional file size calculations can be eliminated. The file sizes will be automatically extracted from the load file, streamlining the process and saving valuable processing time.
Note: The FileSizeColumnIndex setting takes effect only if FieldMapping.ContainsFilePath is set to true and FieldMapping.Encoding is set to UTF-16. This property applies only to long text fields stored in Data Grid, including Extracted Text.
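In the fluent builder API, these three conditions correspond to the extracted text calls shown in the document workflow example above. The fragment below is a sketch only, mapping each direct setting to its builder call; fileSizeColumnIndex is an assumed variable that must point at the file size column of your load file.

```
.WithExtractedTextInSeparateFiles(f => f        // FieldMapping.ContainsFilePath = true
    .WithEncoding("UTF-16")                     // FieldMapping.Encoding = UTF-16
    .WithFileSizeDefinedInColumn(fileSizeColumnIndex)) // FieldMapping.FileSizeColumnIndex
```

With all three in place, the service copies the extracted text files directly and reads their sizes from the load file instead of calculating them.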