
Shanoir account migration


GME account migration:

The decision was made to build a mixed SQL/programmatic solution to migrate data from a study to another server. The idea is to use the REST API as much as possible to migrate the studies and their data. Reference data (centers, equipment, etc.) can be migrated with SQL scripts.

SQL Scripts

Elements that can be migrated using an SQL script:

import

  • nifticonverter

migrations

  • migrations (the database will likely be up to date with migrations)

preclinical

  • anesthetic
  • anesthetic_ingredient
  • contrast_agent
  • pathology
  • pathology_model
  • reference
  • therapy

studies

  • acquisition_equipment
  • center
  • coil
  • manufacturer
  • manufacturer_model
  • timepoint

users

  • role

We have to use the MySQL "INSERT IGNORE INTO" statement so that already existing elements are not updated.
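As a minimal sketch of that idiom (the connection URL, credentials and the role table's columns are assumptions, not taken from the real schema), executed here over JDBC:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class RoleMigration {
    public static void main(String[] args) throws Exception {
        // Placeholder connection parameters; adapt to the target server.
        try (Connection target = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/users", "shanoir", "secret")) {
            // INSERT IGNORE skips any row whose primary key already exists,
            // so reference data already present on the target is left alone.
            String sql = "INSERT IGNORE INTO role (id, display_name, name) "
                       + "VALUES (?, ?, ?)";
            try (PreparedStatement ps = target.prepareStatement(sql)) {
                ps.setLong(1, 1L);
                ps.setString(2, "Administrator");
                ps.setString(3, "ROLE_ADMIN");
                ps.executeUpdate(); // returns 0 when the row was ignored
            }
        }
    }
}
```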

Users microservice

Users do not have to be migrated; they have to be re-created by the concerned users themselves.

Study microservice

Elements to be moved:

Subject

  • reset ID

Study

  • reset ID (see the sketch after this list)
  • reset SubjectStudy so it can be updated with the newly created subjects
  • add protocol / DUA files
  • set StudyUser to only the current user
  • reset studyExaminations
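A sketch of what "reset ID" means in practice when going through the REST API; the endpoint URL is an assumption, and the study JSON is supposed to have been stripped of its "id", subjectStudy, studyUser and examination references beforehand:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class StudyMigration {
    // Re-create a study read from the source server on the target server,
    // letting the target assign a fresh database ID.
    static String recreateStudy(String strippedStudyJson, String token) throws Exception {
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://gme-server/shanoir-ng/studies")) // assumed URL
                .header("Authorization", "Bearer " + token)
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(strippedStudyJson))
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        // The response body carries the entity as created on GME,
        // including its new ID, to be recorded for later foreign keys.
        return response.body();
    }
}
```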

Preclinical microservice

Elements to be moved:

animal_subject

  • reset ID
  • update subject ID with animalSubjectID (an ID-mapping sketch follows these lists)

subject_pathology

  • reset ID
  • Update subject ID with animalSubjectID

subject_therapy

  • reset ID
  • Update subject ID with animalSubjectID
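Since every re-created entity receives a fresh ID on the target server, the migration program has to keep an old-to-new ID map and rewrite foreign keys with it; a minimal sketch (all names illustrative):

```java
import java.util.HashMap;
import java.util.Map;

public class IdMapper {
    // Source subject ID -> ID assigned by the target server.
    private final Map<Long, Long> subjectIds = new HashMap<>();

    // Called once the subject has been re-created on the target.
    public void registerSubject(long sourceId, long targetId) {
        subjectIds.put(sourceId, targetId);
    }

    // Used when rewriting the subject reference of a subject_pathology or
    // subject_therapy row before it is re-created.
    public long mapSubject(long sourceSubjectId) {
        Long mapped = subjectIds.get(sourceSubjectId);
        if (mapped == null) {
            throw new IllegalStateException(
                    "Subject " + sourceSubjectId + " has not been migrated yet");
        }
        return mapped;
    }
}
```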

To be moved after the datasets:

examination_anesthetic

  • reset ID
  • reset examination ID

examination_extradata

  • reset ID
  • reset examination ID

Dataset microservice

This is the most complicated microservice, as it contains all the dataset logic. The nesting below also dictates the order in which entities have to be re-created (a traversal sketch follows the tree).

StudyCards

  • Update study ID
    • StudyCardRule
      • StudyCardCondition
      • StudyCardAssignment

Examinations

  • DatasetAcquisitions

    • Update studyCard ID
    • Protocol, one of:
      • CT
      • PET
      • MR
        • DiffusionGradient
        • MrProtocolMetadata
          • mr_protocol_metadata_mr_scanning_sequence (complicated)
          • mr_protocol_metadata_mr_sequence_variant (complicated)
    • datasets, one of:
      • CalibrationDataset
      • CtDataset
      • EegDataset
        • Channel
        • Event
      • MegDataset
      • MeshDataset
      • MrDataset
        • DiffusionGradient
        • EchoTime
        • FlipAngle
        • InversionTime
        • RepetitionTime
        • MrDatasetMetadata
      • ParameterQuantificationDataset
      • PetDataset
      • RegistrationDataset
      • SegmentationDataset
      • SpectDataset
      • StatisticalDataset
      • TemplateDataset
    • for each dataset:
      • referencedDatasetForSuperimposition (ChildrenList) (?)
      • DatasetProcessing
        • input_of_dataset_processing
      • DatasetMetadata
      • DatasetExpression
        • DatasetFile
    • InstrumentBasedAssessment
      • VariableAssessment
        • InstrumentVariable (coded/num)
          • ScaleItem
        • Score (coded/num)
          • ScaleItem
      • Instrument
        • Scientific instrument
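The nesting above implies a strict re-creation order: parents first, children after, since each child stores its parent's newly assigned ID. A sketch of the traversal, with all types reduced to illustrative stand-ins:

```java
import java.util.List;

public class ExaminationMigrator {
    // Illustrative stand-ins for the real entities.
    record DatasetExpression(long id) {}
    record Dataset(long id, List<DatasetExpression> expressions) {}
    record DatasetAcquisition(long id, List<Dataset> datasets) {}
    record Examination(long id, List<DatasetAcquisition> acquisitions) {}

    private long nextId = 1;

    void migrate(Examination exam) {
        long newExamId = create("examination", exam.id());
        for (DatasetAcquisition acq : exam.acquisitions()) {
            long newAcqId = create("acquisition of exam " + newExamId, acq.id());
            for (Dataset ds : acq.datasets()) {
                long newDsId = create("dataset of acquisition " + newAcqId, ds.id());
                for (DatasetExpression expr : ds.expressions()) {
                    // Expression files go last: DICOM to the PACS, NIfTI to disk.
                    create("expression of dataset " + newDsId, expr.id());
                }
            }
        }
    }

    // Stand-in for the REST call that re-creates the entity on GME.
    private long create(String what, long sourceId) {
        long targetId = nextId++;
        System.out.println(what + ": source " + sourceId + " -> target " + targetId);
        return targetId;
    }

    public static void main(String[] args) {
        Examination exam = new Examination(42, List.of(
                new DatasetAcquisition(7, List.of(
                        new Dataset(13, List.of(new DatasetExpression(99)))))));
        new ExaminationMigrator().migrate(exam);
    }
}
```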

Related datasets (update datasetId and studyId)

Add extra-data files

Other important points:

  • Create a simple "add dicom image" entry point just to push a DICOM image into the PACS, matching a DatasetFile
  • Create a simple "add nifti image" entry point just to copy a NIfTI file onto the server
  • Solr indexing has to be triggered too
  • What about the fact that the user has to enter their password directly into Shanoir? I think that is not OK: the tool should connect directly (see what was done in the uploader before) and get a Keycloak token instead (a sketch follows this list)
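For reference, a Keycloak password-grant token request looks like the sketch below; the realm name, client ID and server URL are placeholders (the real values are whatever ShanoirUploader already uses):

```java
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

public class KeycloakLogin {
    static String fetchTokenResponse(String user, String password) throws Exception {
        String form = "grant_type=password"
                + "&client_id=shanoir-uploader" // placeholder client
                + "&username=" + URLEncoder.encode(user, StandardCharsets.UTF_8)
                + "&password=" + URLEncoder.encode(password, StandardCharsets.UTF_8);
        HttpRequest request = HttpRequest.newBuilder()
                // Placeholder server and realm.
                .uri(URI.create("https://shanoir-server/auth/realms/shanoir-ng/protocol/openid-connect/token"))
                .header("Content-Type", "application/x-www-form-urlencoded")
                .POST(HttpRequest.BodyPublishers.ofString(form))
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        // The JSON body contains an "access_token" to be sent as a Bearer
        // header on every subsequent REST call.
        return response.body();
    }
}
```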

Other envisaged solutions

Moving data to GME: how can we do this properly? Several possibilities:

  1. Copy the whole database (dump), then remove the unneeded studies one by one from the interface
  • Can we bring a hard disk with all the content to the GME? How else do we transfer the data?
  • Can we (even temporarily) copy all the ‘private’ data of Neurinfo?
  • How do we copy the data from the PACS? Is it possible to make a dump too? (Or use a dcmtk tool?)
  • If we delete a study, is all its data deleted? (In the PACS: no, I guess (this has to be corrected, by the way), but if the data is not there it is not a problem.)
  2. Programmatically, using the API/interfaces

Create a small program that queries shanoir-neurinfo for a chosen study, then re-creates the study in GME with all its elements.

  • Difficult to list ALL the elements to re-create
  • Maybe use Shanoir uploader?
  • Would we then use the import interface? What about re-anonymization?
  3. Use some kind of BIDS archive that we can export subject by subject, then re-import directly into GME
  • Problem of re-anonymization, since we redo an import?
  • Not BIDS, as we essentially want to work with DICOM
  • Recreate all the studies / study cards / centers manually?
  • How do we do this automatically (export then re-import)?
  4. Create a huge SQL request / procedure to move the data
  • How do we move the files?
  • How do we move the PACS data?
  • The list of all the data will be enormous
  5. Mixed solution
  • Move all the reference data (studies, centers, etc.) with an SQL script (can be applied easily)
  • Move subjects, exams, datasets and dataset files programmatically, only for the studies we want, importing the data with some kind of "Shanoir uploader"

Pertinent questions:

  • How do we move the DICOM data in the PACS? Can we make the two PACS communicate, i.e. create an alternative PACS duplicated from Neurinfo with the chosen studies? (Or use a dcmtk tool? A sketch follows below.) Is it possible to make a dump too?

That would mean (temporarily) opening Shanoir's/GME's PACS to the outside world?
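If the dcmtk route is chosen, one possible building block (a sketch; AE titles, host and port are placeholders) is to push each exported DICOM file to the GME PACS with storescu:

```java
import java.io.File;

public class DicomPush {
    // Sends one DICOM file to the target PACS via dcmtk's storescu,
    // which must be installed on the machine running the migration.
    static void push(File dicomFile) throws Exception {
        Process p = new ProcessBuilder(
                "storescu",
                "-aet", "MIGRATION",       // calling AE title (placeholder)
                "-aec", "GME_PACS",        // called AE title of the target PACS (placeholder)
                "gme-pacs-host", "11112",  // placeholder host and port
                dicomFile.getAbsolutePath())
                .inheritIO()
                .start();
        if (p.waitFor() != 0) {
            throw new IllegalStateException("storescu failed for " + dicomFile);
        }
    }
}
```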

  • How do we move the files on the server, if not programmatically or with a hard drive?

Lots of questions here, not a lot of answers: to be discussed!
