This project provides a connector to allow for communication between an HRM (Huygens Remote Manager) and an OMERO server.
Its purpose is to simplify data transfer by allowing raw images to be downloaded from OMERO and deconvolution results to be uploaded back to OMERO directly from within the HRM web interface.
NOTE: strictly speaking, Java is only required for uploading data from the HRM to OMERO. So if you plan to use the connector in a download-only fashion, you may skip installing the Java packages below. Keep in mind, though, that this scenario is not covered by our testing.
# CentOS / RHEL 7: install the build-time requirements for Python 3.6 and
# Java 1.8 for Bio-Formats
sudo yum install \
    python36 \
    python36-devel \
    openssl-devel \
    bzip2-devel \
    readline-devel \
    gcc-c++ \
    java-1.8.0-openjdk
# define the target path for the virtual environment:
HRM_OMERO_VENV="/opt/venvs/hrm-omero"
# create a Python 3.6 virtual environment:
python3 -m venv $HRM_OMERO_VENV
# upgrade pip, install wheel:
$HRM_OMERO_VENV/bin/pip install --upgrade pip wheel
# Ubuntu 20.04: install the requirements for Python 3 and Java 11 (for Bio-Formats)
apt install -y \
    python3-venv \
    openjdk-11-jre-headless
# define the target path for the virtual environment:
HRM_OMERO_VENV="/opt/venvs/hrm-omero"
# create a Python virtual environment:
python3 -m venv $HRM_OMERO_VENV
# upgrade pip, install wheel:
$HRM_OMERO_VENV/bin/pip install --upgrade pip wheel
# install the pre-built Ice wheel from the OME project:
ICE_WHEEL="zeroc_ice-3.6.5-cp38-cp38-linux_x86_64.whl"
wget "https://github.com/ome/zeroc-ice-ubuntu2004/releases/download/0.2.0/$ICE_WHEEL"
$HRM_OMERO_VENV/bin/pip install $ICE_WHEEL
# Ubuntu 22.04: install the requirements for Python 3 and Java 11 (for Bio-Formats)
apt install -y \
    python3-venv \
    openjdk-11-jre-headless
# define the target path for the virtual environment:
HRM_OMERO_VENV="/opt/venvs/hrm-omero"
# create a Python virtual environment:
python3 -m venv $HRM_OMERO_VENV
# upgrade pip, install wheel:
$HRM_OMERO_VENV/bin/pip install --upgrade pip wheel
# install the pre-built Ice wheel from the OME project:
ICE_WHEEL="zeroc_ice-3.6.5-cp310-cp310-linux_x86_64.whl"
wget "https://github.com/ome/zeroc-ice-py-github-ci/releases/download/0.2.0/$ICE_WHEEL"
$HRM_OMERO_VENV/bin/pip install $ICE_WHEEL
# install the connector - NOTE: on CentOS 7 this takes quite a while (~15min) as it
# needs to build (compile) the ZeroC Ice bindings; on Ubuntu the pre-built wheel
# installed above is used instead:
$HRM_OMERO_VENV/bin/pip install hrm-omero
# from now on you can simply call the connector using its full path; there is no
# need to pre-activate the virtual environment (you could even drop a previously
# used pyenv completely):
$HRM_OMERO_VENV/bin/ome-hrm --help
# it can even be used as a drop-in replacement for the legacy `ome_hrm.py` script:
cd $PATH_TO_YOUR_HRM_INSTALLATION/bin
mv "ome_hrm.py" "__old__ome_hrm.py"
ln -s "$HRM_OMERO_VENV/bin/ome-hrm" "ome_hrm.py"
Add the following lines to `/etc/hrm.conf` and fill in the desired values:
# Interaction with OMERO (if switched on in hrm/config).
OMERO_HOSTNAME="omero.example.xy"
# OMERO_PORT="4064"
OMERO_CONNECTOR_LOGLEVEL="DEBUG"
# OMERO_CONNECTOR_LOGFILE_DISABLED="true"
On top of that it is necessary to explicitly set two environment variables for the Apache process. By default (at least on recent Ubuntu and CentOS / RHEL versions) the system user running Apache is not allowed to write to its `$HOME` directory for security reasons. Therefore it is required to specify where the OMERO Python bindings and Java may store cache files and preferences. This can be done by running the following command:
systemctl edit apache2.service # Debian / Ubuntu
systemctl edit httpd.service # CentOS / RHEL / AlmaLinux
There, add the following section, adjusting the path if desired:
[Service]
Environment=OMERO_USERDIR=/var/cache/omero
Environment=JAVA_OPTS="-Djava.util.prefs.userRoot=/var/cache/omero/javaUserRoot"
Now make sure the specified directory exists and is writable by the Apache system user:
mkdir -v /var/cache/omero
chown www-data:www-data /var/cache/omero # Debian / Ubuntu
chown apache:apache /var/cache/omero # CentOS / RHEL / AlmaLinux
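To make sure the permissions are correct, a quick write test can be done as the Apache user (shown for Debian / Ubuntu; use `apache` instead of `www-data` on CentOS / RHEL):

sudo -u www-data touch /var/cache/omero/write-test && rm -v /var/cache/omero/write-test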
Finally, restart Apache by running the respective `systemctl` command from above, replacing `edit` with `restart`.
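That is, one of the following, depending on your distribution:

systemctl restart apache2.service  # Debian / Ubuntu
systemctl restart httpd.service    # CentOS / RHEL / AlmaLinux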
The connector will try to place log messages in a file in the directory specified as `$HRM_LOG` in the HRM configuration file, unless a configuration option named `OMERO_CONNECTOR_LOGFILE_DISABLED` is present and non-empty. In a standard setup this will result in the log file being `/var/log/hrm/omero-connector.log`.

In addition, log messages produced by the connector when called by HRM will be sent to `stderr`, which usually means they will end up in the web server's error log.
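In a standard setup the connector log can be followed with the usual tools, for example:

tail -f /var/log/hrm/omero-connector.log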
By default the connector will be rather silent, as otherwise the log files will be cluttered up quite a bit on a production system. However, it is possible to increase the log level by specifying `-v`, `-vv` and so on.

Since this is not useful when being operated through the HRM web interface (which is the default), it is also possible to set the verbosity level by adjusting `OMERO_CONNECTOR_LOGLEVEL` in `/etc/hrm.conf`.

Valid settings are `"SUCCESS"`, `"INFO"`, `"DEBUG"` and `"TRACE"`. If the option is commented out in the configuration file, the level will be set to `WARNING`.
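For example, when testing the connector manually on the command line, verbosity could be raised like this (assuming the flags are given before the subcommand; `checkCredentials` is described below):

$HRM_OMERO_VENV/bin/ome-hrm -vv --user $OMERO_USER checkCredentials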
Store username and password in variables and export the `OMERO_PASSWORD` variable:
read OMERO_USER
read -s OMERO_PASSWORD
export OMERO_PASSWORD # use 'set --export OMERO_PASSWORD $OMERO_PASSWORD' for fish
ome-hrm \
    --user $OMERO_USER \
    checkCredentials
Set the `--id` parameter according to which part of the tree should be retrieved:
OMERO_ID="ROOT" # fetches the base tree view for the current user
OMERO_ID="G:4:Experimenter:9" # fetches the projects of user '9' in group '4'
OMERO_ID="G:4:Project:12345" # fetches the datasets of project '12345'
OMERO_ID="G:4:Dataset:65432" # lists the images of dataset '65432'
Then run the actual command to fetch the information; the result will be a JSON tree:
ome-hrm \
    --user $OMERO_USER \
    retrieveChildren \
    --id "$OMERO_ID"
For example this could be the output when requesting `"G:4:Dataset:65432"`:
[
    {
        "children": [],
        "class": "Image",
        "id": "G:4:Image:1311448",
        "label": "4321_mko_ctx_77.tif",
        "owner": "somebody"
    },
    {
        "children": [],
        "class": "Image",
        "id": "G:4:Image:1566150",
        "label": "test-image.tif",
        "owner": "somebody"
    }
]
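Since the result is plain JSON, it can also be processed with standard tools. A minimal sketch assuming `jq` is installed, listing only the image labels:

ome-hrm --user $OMERO_USER retrieveChildren --id "$OMERO_ID" | jq -r '.[].label'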
This will fetch the second image from the example tree above and store it in `/tmp/`:
ome-hrm \
    --user $OMERO_USER \
    OMEROtoHRM \
    --imageid "G:4:Image:1566150" \
    --dest /tmp/
The command below will import a local image file into the example dataset from above:
ome-hrm \
    --user $OMERO_USER \
    HRMtoOMERO \
    --dset "G:4:Dataset:65432" \
    --file test-image.tif
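To confirm the upload succeeded, the dataset can simply be queried again using the `retrieveChildren` call from above; the newly imported image should then show up in the resulting JSON tree:

ome-hrm \
    --user $OMERO_USER \
    retrieveChildren \
    --id "G:4:Dataset:65432"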