Deploy preview for PR 57 πŸ›«
mjaehn committed Feb 1, 2024
1 parent 5a139d2 commit bbce8da
Showing 5 changed files with 61 additions and 48 deletions.
Binary file modified pr-preview/pr-57/.doctrees/environment.pickle
Binary file modified pr-preview/pr-57/.doctrees/howtorun.doctree
54 changes: 31 additions & 23 deletions pr-preview/pr-57/_sources/howtorun.rst.txt
contain additional runscripts to be submitted via ``sbatch``.

.. hint::
Technically, you can run several cases (instead of a single case) in one command,
which is useful for nested runs, for example. This can be achieved by running
``./run_chain.py <case1> <case2>``. With that, the full chain is executed for
``case1`` first, and afterwards for ``case2``.
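
For example, running the ``cosmo-ghg-test`` case described below on its own, or
chaining it with a second case (the second case name here is purely hypothetical),
looks like this::

    $ ./run_chain.py cosmo-ghg-test
    $ ./run_chain.py cosmo-ghg-test my-second-case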

Without specifying a job list, the default job list defined in
``config/models.yaml`` will be executed.

There are several optional arguments available to change the behavior of the chain:

$ ./run_chain.py -h

* ``-h``, ``--help``
Show this help message and exit.
* ``-j [JOB_LIST ...]``, ``--jobs [JOB_LIST ...]``
List of job names to be executed.
A job is a ``.py`` file in ``jobs/`` with a ``main()`` function, which
handles one aspect of the Processing Chain, for
example copying ``meteo`` input data or launching a
job for ``int2lm``. Jobs are executed in the order
in which they are given here. If no jobs are
given, default jobs will be executed as defined
in ``config/models.yaml``. An example invocation is shown below this list.
* ``-f``, ``--force``
Force the Processing Chain to redo all specified
jobs, even if they have been started already or
were finished previously. WARNING: Only logfiles
get deleted, other effects of a given job
(copied files etc.) are simply overwritten. This
may cause errors or unexpected behavior.
* ``-r``, ``--resume``
Resume the Processing Chain by restarting the
last unfinished job. WARNING: Only the logfile
gets deleted, other effects of a given job
(copied files etc.) are simply overwritten. This
may cause errors or unexpected behavior.
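
As an illustration, rerunning only the ``prepare_cosmo``, ``int2lm`` and ``cosmo``
jobs of the ``cosmo-ghg-test`` case and forcing them to be redone could look like
this (the particular combination of flags is just an example)::

    $ ./run_chain.py cosmo-ghg-test -j prepare_cosmo int2lm cosmo -f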

What it Does
------------

The script ``run_chain.py`` reads the command line arguments and the config file
from the specified case.
It then calls the function :func:`run_chain.restart_runs`, which divides the
simulation time according to the specified restart steps. Then it calls
:func:`run_chain.run_chunk` for each part (chunk) of the simulation workflow.
This function sets up the directory structure of the chain and then submits the
specified :ref:`jobs<jobs-section>` via ``sbatch`` to the Slurm workload manager,
taking job dependencies into account.
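
As a rough sketch only (the chain generates its runscripts and submission commands
internally, so the runscript names below are hypothetical placeholders), this
dependency handling corresponds to chaining Slurm jobs along these lines::

    # Conceptual illustration; runscript names are placeholders.
    jobid_int2lm=$(sbatch --parsable int2lm_runscript.sh)
    sbatch --dependency=afterok:${jobid_int2lm} cosmo_runscript.sh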

Test Cases
----------
the script::
This will run all the individual scripts in ``jenkins/scripts/``, which
can also be launched separately if desired.

These cases are tested regularly to ensure that the Processing Chain runs
correctly. A corresponding Jenkins plan is launched on a weekly basis and
whenever it is triggered from a GitHub pull request.

Directory Structure
-------------------

run looks like this::
β”œβ”€β”€ cfg.int2lm_input/
β”œβ”€β”€ cfg.int2lm_work/
└── cfg.int2lm_output/

As the listing shows, the chain creates working directories for both the ``int2lm``
preprocessor and ``cosmo``. In addition, the ``checkpoints`` directory, which is
always created, holds all the job logfiles. Whenever a job has finished successfully,
its logfile is copied from the ``working`` to the ``finished`` sub-directory.
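
For example, to check which jobs of a given chunk have already completed, one can
list the corresponding ``finished`` sub-directory (``<chunk>`` is a placeholder for
the chunk directory name)::

    $ ls work/cosmo-ghg-test/<chunk>/checkpoints/finished/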

Running the ``cosmo-ghg-test`` case therefore produces the following
directories and files (showing four levels of directories deep)::
β”‚ β”‚ β”‚ β”œβ”€β”€ online_vprm
β”‚ β”‚ β”‚ β”œβ”€β”€ post_cosmo
β”‚ β”‚ β”‚ β”œβ”€β”€ post_int2lm
β”‚ β”‚ β”‚ └── prepare_cosmo
β”‚ β”‚ └── working/
β”‚ β”‚ β”œβ”€β”€ biofluxes
β”‚ β”‚ β”œβ”€β”€ cosmo
Expand All @@ -134,7 +142,7 @@ directories and files (showing four levels of directories deep)::
β”‚ β”‚ β”œβ”€β”€ online_vprm
β”‚ β”‚ β”œβ”€β”€ post_cosmo
β”‚ β”‚ β”œβ”€β”€ post_int2lm
β”‚ β”‚ └── prepare_cosmo
β”‚ β”œβ”€β”€ cosmo/
β”‚ β”‚ β”œβ”€β”€ input/
β”‚ β”‚ β”‚ β”œβ”€β”€ oem/
β”‚ β”‚ β”œβ”€β”€ online_vprm
β”‚ β”‚ β”œβ”€β”€ post_cosmo
β”‚ β”‚ β”œβ”€β”€ post_int2lm
β”‚ β”‚ └── prepare_cosmo
β”‚ └── working/
β”‚ β”œβ”€β”€ biofluxes
β”‚ β”œβ”€β”€ cosmo
β”‚ β”œβ”€β”€ online_vprm
β”‚ β”œβ”€β”€ post_cosmo
β”‚ β”œβ”€β”€ post_int2lm
β”‚ └── prepare_cosmo
β”œβ”€β”€ cosmo/
β”‚ β”œβ”€β”€ input/
β”‚ β”‚ β”œβ”€β”€ oem

-------------------------------------------

.. autofunction:: run_chain.run_chunk

-------------------------------------------

53 changes: 29 additions & 24 deletions pr-preview/pr-57/howtorun.html