- Email [email protected] for a database username.
- Install Docker (https://www.docker.com/). Linux users also need to install Docker Compose separately. For Mac, see https://docs.docker.com/docker-for-mac/.
- Fork the repository (https://github.com/int-brain-lab/IBL-pipeline) onto your own GitHub account.
- Clone the forked repository, i.e. copy the files to your local machine:
  ```
  git clone [email protected]:YourUserName/IBL-pipeline.git
  ```
- Create a `.env` file in the cloned directory and modify the user and password values per Step 1. File contents of `.env`:
  ```
  DJ_HOST=datajoint.internationalbrainlab.org
  DJ_USER=username
  DJ_PASS=password
  ```
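  To verify that the credentials work, you can mirror the `.env` values in DataJoint's config from Python (a minimal sketch; `username` and `password` are the placeholders from above):
  ```python
  import datajoint as dj

  # mirror the .env values in the DataJoint configuration
  dj.config['database.host'] = 'datajoint.internationalbrainlab.org'
  dj.config['database.user'] = 'username'      # value from DJ_USER
  dj.config['database.password'] = 'password'  # value from DJ_PASS

  dj.conn()  # establishes the connection; raises an error on bad credentials
  ```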
- Copy your `.one_params` file into `IBL-pipeline/root` so that you are not prompted for an Alyx login (see https://ibllib.readthedocs.io/en/latest/02a_installation_python.html).
  Note: if you first build the docker container and only then add `.one_params`, running `ONE()` in Python may still prompt you for your Alyx and FlatIron login details. In this case, rebuild the image:
  ```
  docker-compose down
  docker image rm ibl-pipeline_datajoint:latest
  docker-compose up -d
  docker exec -it ibl-pipeline_datajoint_1 /bin/bash
  ```
  ToDo: clarify whether this step is necessary when not importing `ONE()`.
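  A quick way to confirm that the credentials were picked up inside the container (a minimal sketch, assuming ibllib's `oneibl` package is installed there):
  ```python
  # if .one_params was copied correctly, instantiating ONE should not prompt
  from oneibl.one import ONE

  one = ONE()
  ```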
- To save figures into AlyxPlots on the Google Drive, you can mount this path to somewhere inside the docker, save the figures into that docker folder, and the saved results will automatically appear in the outside folder you mounted (see the sketch after these steps):
  a. `docker-compose down`
  b. Open `docker-compose.yml`.
  c. Add `~/Google Drive/Rig building WG/DataFigures/BehaviourData_Weekly/Snapshot_DataJoint/:/Snapshot_DataJoint_shortcut` to the `volumes:` section.
  d. Save and close the file.
  e. `docker-compose up -d`
  Then save the plots from Python into `/Snapshot_DataJoint_shortcut` inside the docker, and you'll see the plots in the folder you want.
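  For example, saving a matplotlib figure into the mounted folder (a minimal sketch; the figure contents and filename are made up):
  ```python
  import matplotlib.pyplot as plt

  fig, ax = plt.subplots()
  ax.plot([0, 1, 2], [0, 1, 4])

  # /Snapshot_DataJoint_shortcut is the container-side mount point added above;
  # the saved file will appear in the Google Drive folder on the host
  fig.savefig('/Snapshot_DataJoint_shortcut/example_figure.png')
  ```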
- If you would like to enter the docker and run scripts through the terminal, `cd` to the IBL-pipeline directory, then run `chmod +x ibl_docker_setup.sh` (only needed once; it gives you permission to treat this file as an executable). This will allow you to run:
  ```
  ./ibl_docker_setup.sh
  ```
  which contains the following individual steps (as well as starting Docker):
  ```
  docker-compose up -d
  docker exec -it ibl-pipeline_datajoint_1 /bin/bash
  ```
  After Docker has started, you'll be dropped into a new terminal. To go from there to the `IBL-pipeline/ibl_pipeline/analyses` folder containing the Python scripts: `cd /src/ibl-pipeline/ibl_pipeline/analyses`. Then run, e.g., the behavioral snapshot code: `python behavioral_snapshot.py` or `python behavioral_overview_perlab.py`.
- Move into the cloned directory in a terminal, then run `docker-compose up -d`.
- Go to http://localhost:8888/tree in your favorite browser to open Jupyter Notebook.
- Open "Datajoint pipeline query tutorial.ipynb".
- Run through the notebook and feel free to experiment; a typical query is sketched below.
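For orientation, a query of the kind the tutorial walks through (a sketch; the table names come from the pipeline, but the restriction value is illustrative):
```python
from ibl_pipeline import subject, acquisition

# fetch all subjects as a pandas DataFrame
subjects = subject.Subject().fetch(format='frame')

# sessions of one subject, via a restriction with the Subject table
sessions = acquisition.Session & (subject.Subject & 'subject_nickname = "example_mouse"')
print(sessions)
```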
To run a local instance of the database in the background, run the docker-compose command as follows:
```
docker-compose -f docker-compose-local.yml up -d
```
This will create a docker container with a local database inside. To access the docker from the terminal, first get the docker container ID with `docker ps`, then run:
```
docker exec -it CONTAINER_ID /bin/bash
```
Now that we are inside the docker, run the bash script for the ingestion:
```
bash /src/ibl-pipeline/scripts/ingest_alyx.sh ../data/alyx_dump/2018-10-30_alyxfull.json
```
Make sure that the json file is in the correct directory, as shown above.
To stop the containers, run:
```
docker-compose -f docker-compose-local.yml down
```
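If you would like to query the local database from Python on the host, the DataJoint configuration looks roughly like this (a sketch; the host, port, and credentials depend on what `docker-compose-local.yml` sets up and are placeholders here):
```python
import datajoint as dj

# placeholders; check docker-compose-local.yml for the actual values
dj.config['database.host'] = '127.0.0.1'
dj.config['database.user'] = 'root'
dj.config['database.password'] = 'password'

dj.conn()  # connect to the local database container
```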
To insert Alyx data into the remote Amazon RDS, create a `.env` file in the same directory as your `docker-compose.yml`, as instructed in the `.env` step above. Now run docker-compose as follows; it will by default run through the file `docker-compose.yml`:
```
docker-compose up -d
```
This will create a docker container linked to the remote Amazon RDS. Then follow the same ingestion instructions as for the local database.
Alyx-corresponding schemas include `reference`, `subject`, `action`, `acquisition`, and `data`. Their combined ERD can be saved with `all_erd.save('/images/all_erd.png')`, as sketched below.
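A minimal sketch of producing that combined diagram, assuming the `ibl_pipeline` schema modules and DataJoint's `ERD` composition:
```python
import datajoint as dj
from ibl_pipeline import reference, subject, action, acquisition, data

# combine the ERDs of the Alyx-corresponding schemas into one diagram
all_erd = (dj.ERD(reference) + dj.ERD(subject) + dj.ERD(action)
           + dj.ERD(acquisition) + dj.ERD(data))

all_erd.save('/images/all_erd.png')  # write the combined diagram to disk
```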