Merge pull request #340 from elfi-dev/dev

Release v.0.7.7

hpesonen authored Oct 12, 2020
2 parents 823d34b + 9ae457f commit 5ce82ed
Showing 26 changed files with 353 additions and 131 deletions.
6 changes: 6 additions & 0 deletions .gitignore
@@ -105,3 +105,9 @@ ENV/
 *.swp

 notebooks/mydask.png
+
+# vscode-settings
+.vscode
+
+# dask
+dask-worker-space
24 changes: 24 additions & 0 deletions .readthedocs.yml
@@ -0,0 +1,24 @@
+# .readthedocs.yml
+# Read the Docs configuration file
+# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details
+
+# Required
+version: 2
+
+# Build documentation in the docs/ directory with Sphinx
+sphinx:
+  configuration: docs/conf.py
+
+# Build documentation with MkDocs
+#mkdocs:
+#  configuration: mkdocs.yml
+
+# Optionally build your docs in additional formats such as PDF
+formats:
+  - pdf
+
+# Optionally set the version of Python and requirements required to build your docs
+python:
+  version: 3.5
+  install:
+    - requirements: requirements.txt
6 changes: 3 additions & 3 deletions .travis.yml
@@ -2,10 +2,10 @@ matrix:
   include:
   - os: linux
     language: python
-    python: 3.5
+    python: 3.6
   - os: linux
     language: python
-    python: 3.6
+    python: 3.7
   - os: osx
     language: generic
     before_install:
@@ -25,6 +25,6 @@ install:
 - pip install -e .

 script:
-- if [[ "$TRAVIS_OS_NAME" == "linux" ]]; then ipcluster start -n 2 --daemon ; fi
+- if [[ "$TRAVIS_OS_NAME" == "linux" ]]; then ipcluster start -n 2 --daemonize ; fi
 #- travis_wait 20 make test
 - make test
13 changes: 13 additions & 0 deletions CHANGELOG.rst
@@ -1,6 +1,19 @@
 Changelog
 =========

+
+0.7.7 (2020-10-12)
+------------------
+- Update info to reflect setting python 3.6 as the default version
+- Update documentation to setting python 3.6 as default
+- Add dask support to elfi client options
+- Add python 3.7 to travis tests and remove python 3.5 due to clash with dask
+- Modify progress bar to better indicate ABC-SMC inference status
+- Change networkx support from 1.X to 2.X
+- Improve docstrings in elfi.methods.bo.acquisition
+- Fix readthedocs-build by adding .readthedocs.yml and restricting the build to
+  python3.5, for now
 
 0.7.6 (2020-08-29)
 ------------------
 - Fix incompatibility with scipy>1.5 in bo.utils.stochastic_optimization
6 changes: 3 additions & 3 deletions CONTRIBUTING.rst
@@ -75,10 +75,10 @@ Ready to contribute? Here's how to set up `ELFI` for local development.
        $ python -V

 4. Install your local copy and the development requirements into a conda
-   environment. You may need to replace "3.5" in the first line with the python
+   environment. You may need to replace "3.6" in the first line with the python
    version printed in the previous step::

-       $ conda create -n elfi python=3.5 numpy
+       $ conda create -n elfi python=3.6 numpy
        $ source activate elfi
        $ cd elfi
        $ make dev
@@ -127,7 +127,7 @@ Before you submit a pull request, check that it meets these guidelines:
 2. If the pull request adds functionality, the docs should be updated. Put
    your new functionality into a function with a docstring, and add the
    feature to the list in README.rst.
-3. The pull request should work for Python 3.5 and later. Check
+3. The pull request should work for Python 3.6 and later. Check
    https://travis-ci.org/elfi-dev/elfi/pull_requests
    and make sure that the tests pass for all supported Python versions.

6 changes: 3 additions & 3 deletions README.md
@@ -1,4 +1,4 @@
-**Version 0.7.6 released!** See the [CHANGELOG](CHANGELOG.rst) and [notebooks](https://github.com/elfi-dev/notebooks).
+**Version 0.7.7 released!** See the [CHANGELOG](CHANGELOG.rst) and [notebooks](https://github.com/elfi-dev/notebooks).

 **NOTE:** For the time being NetworkX 2 is incompatible with ELFI.

@@ -40,7 +40,7 @@ is preferable.
 Installation
 ------------

-ELFI requires Python 3.5 or greater. You can install ELFI by typing in your terminal:
+ELFI requires Python 3.6 or greater. You can install ELFI by typing in your terminal:

 ```
 pip install elfi
@@ -70,7 +70,7 @@ with your default Python environment and can easily use different versions of Py
 in different projects. You can create a virtual environment for ELFI using anaconda with:

 ```
-conda create -n elfi python=3.5 numpy
+conda create -n elfi python=3.6 numpy
 source activate elfi
 pip install elfi
 ```
10 changes: 5 additions & 5 deletions docs/installation.rst
@@ -3,7 +3,7 @@
 Installation
 ============

-ELFI requires Python 3.5 or greater (see below how to install). To install ELFI, simply
+ELFI requires Python 3.6 or greater (see below how to install). To install ELFI, simply
 type in your terminal:

 .. code-block:: console
@@ -18,16 +18,16 @@ process.
 .. _Python installation guide: http://docs.python-guide.org/en/latest/starting/installation/


-Installing Python 3.5
+Installing Python 3.6
 ---------------------

 If you are new to Python, perhaps the simplest way to install it is with Anaconda_ that
-manages different Python versions. After installing Anaconda, you can create a Python 3.5.
+manages different Python versions. After installing Anaconda, you can create a Python 3.6.
 environment with ELFI:

 .. code-block:: console

-    conda create -n elfi python=3.5 numpy
+    conda create -n elfi python=3.6 numpy
     source activate elfi
     pip install elfi
@@ -51,7 +51,7 @@ Resolving these may sometimes go wrong:
 * If you receive an error about missing ``numpy``, please install it first.
 * If you receive an error about `yaml.load`, install ``pyyaml``.
 * On OS X with Anaconda virtual environment say `conda install python.app` and then use `pythonw` instead of `python`.
-* Note that ELFI requires Python 3.5 or greater
+* Note that ELFI requires Python 3.6 or greater
 * In some environments ``pip`` refers to Python 2.x, and you have to use ``pip3`` to use the Python 3.x version
 * Make sure your Python installation meets the versions listed in requirements_.
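Several of the troubleshooting items above reduce to running ELFI under too old an interpreter. A generic fail-fast guard, not part of ELFI itself but a sketch of the check implied by the Python 3.6 requirement:

```python
import sys

REQUIRED = (3, 6)  # minimum Python for ELFI 0.7.7

# Compare only (major, minor); raise early with a readable message
# instead of failing later on 3.6-only syntax or dependencies.
if sys.version_info[:2] < REQUIRED:
    raise RuntimeError(
        "Python {}.{} or greater is required, found {}.{}".format(
            REQUIRED[0], REQUIRED[1], *sys.version_info[:2]))
print("Python version check passed")
```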
2 changes: 1 addition & 1 deletion docs/quickstart.rst
@@ -3,7 +3,7 @@ Quickstart

 First ensure you have
 `installed <http://elfi.readthedocs.io/en/stable/installation.html>`__
-Python 3.5 (or greater) and ELFI. After installation you can start using
+Python 3.6 (or greater) and ELFI. After installation you can start using
 ELFI:

 .. code:: ipython3
2 changes: 1 addition & 1 deletion elfi/__init__.py
@@ -26,4 +26,4 @@
 __email__ = '[email protected]'

 # make sure __version_ is on the last non-empty line (read by setup.py)
-__version__ = '0.7.6'
+__version__ = '0.7.7'
10 changes: 8 additions & 2 deletions elfi/client.py
@@ -159,7 +159,8 @@ def submit(self, batch=None):
         loaded_net = self.client.load_data(self.compiled_net, self.context, batch_index)
         # Override
         for k, v in batch.items():
-            loaded_net.node[k] = {'output': v}
+            loaded_net.nodes[k].update({'output': v})
+            del loaded_net.nodes[k]['operation']

         task_id = self.client.submit(loaded_net)
         self._pending_batches[batch_index] = task_id
@@ -299,7 +300,12 @@ def compile(cls, source_net, outputs=None):
             outputs = source_net.nodes()
         if not outputs:
             logger.warning("Compiling for no outputs!")
-        outputs = outputs if isinstance(outputs, list) else [outputs]
+        if isinstance(outputs, list):
+            outputs = set(outputs)
+        elif isinstance(outputs, type(source_net.nodes())):
+            outputs = outputs
+        else:
+            outputs = [outputs]

         compiled_net = nx.DiGraph(
             outputs=outputs, name=source_net.graph['name'], observed=source_net.graph['observed'])
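The widened `outputs` handling above accepts an explicit list, a NetworkX node view, or a single node name. The same normalization idea can be sketched with the standard library alone — `normalize_outputs` is a hypothetical helper name standing in for the inline branching, not ELFI's actual code:

```python
def normalize_outputs(outputs):
    """Coerce `outputs` into a collection of node names (illustrative only).

    Mirrors the branching in the diff: an explicit list is deduplicated
    into a set, any other iterable (e.g. a graph's node view) becomes a
    set too, and a lone node name is wrapped in a list.
    """
    if isinstance(outputs, list):
        return set(outputs)
    if isinstance(outputs, str):
        return [outputs]  # a single node name
    try:
        return set(outputs)  # node views and other iterables
    except TypeError:
        return [outputs]  # any other single item


print(normalize_outputs(['mu', 'mu', 'd']))
print(normalize_outputs('d'))
```

Keeping a set avoids compiling the same output node twice; the diff's `isinstance(outputs, type(source_net.nodes()))` check plays the role of the duck-typed `try` here.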
111 changes: 111 additions & 0 deletions elfi/clients/dask.py
@@ -0,0 +1,111 @@
+"""This module implements a multiprocessing client using dask."""
+
+import itertools
+import os
+
+from dask.distributed import Client as DaskClient
+
+import elfi.client
+
+
+def set_as_default():
+    """Set this as the default client."""
+    elfi.client.set_client()
+    elfi.client.set_default_class(Client)
+
+
+class Client(elfi.client.ClientBase):
+    """A multiprocessing client using dask."""
+
+    def __init__(self):
+        """Initialize a dask client."""
+        self.dask_client = DaskClient()
+        self.tasks = {}
+        self._id_counter = itertools.count()
+
+    def apply(self, kallable, *args, **kwargs):
+        """Add `kallable(*args, **kwargs)` to the queue of tasks. Returns immediately.
+
+        Parameters
+        ----------
+        kallable: callable
+
+        Returns
+        -------
+        task_id: int
+
+        """
+        task_id = self._id_counter.__next__()
+        async_result = self.dask_client.submit(kallable, *args, **kwargs)
+        self.tasks[task_id] = async_result
+        return task_id
+
+    def apply_sync(self, kallable, *args, **kwargs):
+        """Call and returns the result of `kallable(*args, **kwargs)`.
+
+        Parameters
+        ----------
+        kallable: callable
+
+        """
+        return self.dask_client.run_on_scheduler(kallable, *args, **kwargs)
+
+    def get_result(self, task_id):
+        """Return the result from task identified by `task_id` when it arrives.
+
+        Parameters
+        ----------
+        task_id: int
+
+        Returns
+        -------
+        dict
+
+        """
+        async_result = self.tasks.pop(task_id)
+        return async_result.result()
+
+    def is_ready(self, task_id):
+        """Return whether task with identifier `task_id` is ready.
+
+        Parameters
+        ----------
+        task_id: int
+
+        Returns
+        -------
+        bool
+
+        """
+        return self.tasks[task_id].done()
+
+    def remove_task(self, task_id):
+        """Remove task with identifier `task_id` from scheduler.
+
+        Parameters
+        ----------
+        task_id: int
+
+        """
+        async_result = self.tasks.pop(task_id)
+        if not async_result.done():
+            async_result.cancel()
+
+    def reset(self):
+        """Stop all worker processes immediately and clear pending tasks."""
+        self.dask_client.shutdown()
+        self.tasks.clear()
+
+    @property
+    def num_cores(self):
+        """Return the number of processes.
+
+        Returns
+        -------
+        int
+
+        """
+        return os.cpu_count()
+
+
+set_as_default()
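The dask client above keeps a simple ledger: `itertools.count` issues task ids and a dict maps each id to a pending future. The same bookkeeping pattern, sketched with the standard library's `concurrent.futures` in place of `dask.distributed` (the `MiniClient` name and two-worker pool are illustrative, not ELFI's API):

```python
import itertools
from concurrent.futures import ThreadPoolExecutor


class MiniClient:
    """Toy client mirroring the apply/get_result/is_ready bookkeeping."""

    def __init__(self, workers=2):
        self._pool = ThreadPoolExecutor(max_workers=workers)
        self._tasks = {}                  # task_id -> future
        self._ids = itertools.count()     # monotonically increasing ids

    def apply(self, fn, *args, **kwargs):
        """Queue fn(*args, **kwargs); return a task id immediately."""
        task_id = next(self._ids)
        self._tasks[task_id] = self._pool.submit(fn, *args, **kwargs)
        return task_id

    def get_result(self, task_id):
        """Block until the task finishes, then drop it from the ledger."""
        return self._tasks.pop(task_id).result()

    def is_ready(self, task_id):
        """Report whether the task has completed, without blocking."""
        return self._tasks[task_id].done()


client = MiniClient()
tid = client.apply(pow, 2, 10)
print(client.get_result(tid))  # 1024
```

Popping the future in `get_result` mirrors the dask client: once a result is handed back, the ledger no longer tracks the task.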
25 changes: 12 additions & 13 deletions elfi/compiler.py
@@ -54,8 +54,8 @@ def compile(cls, source_net, compiled_net):
         compiled_net.add_edges_from(source_net.edges(data=True))

         # Compile the nodes to computation nodes
-        for name, data in compiled_net.nodes_iter(data=True):
-            state = source_net.node[name]
+        for name, data in compiled_net.nodes(data=True):
+            state = source_net.nodes[name]['attr_dict']
             if '_output' in state and '_operation' in state:
                 raise ValueError("Cannot compile: both _output and _operation present "
                                  "for node '{}'".format(name))
@@ -92,7 +92,7 @@ def compile(cls, source_net, compiled_net):
         uses_observed = []

         for node in nx.topological_sort(source_net):
-            state = source_net.node[node]
+            state = source_net.nodes[node]['attr_dict']
             if state.get('_observable'):
                 observable.append(node)
                 cls.make_observed_copy(node, compiled_net)
@@ -113,14 +113,14 @@
             else:
                 link_parent = parent

-            compiled_net.add_edge(link_parent, obs_node, source_net[parent][node].copy())
+            compiled_net.add_edge(link_parent, obs_node, **source_net[parent][node].copy())

         # Check that there are no stochastic nodes in the ancestors
         for node in uses_observed:
             # Use the observed version to query observed ancestors in the compiled_net
             obs_node = observed_name(node)
             for ancestor_node in nx.ancestors(compiled_net, obs_node):
-                if '_stochastic' in source_net.node.get(ancestor_node, {}):
+                if '_stochastic' in source_net.nodes.get(ancestor_node, {}):
                     raise ValueError("Observed nodes must be deterministic. Observed "
                                      "data depends on a non-deterministic node {}."
                                      .format(ancestor_node))
@@ -148,11 +148,10 @@ def make_observed_copy(cls, node, compiled_net, operation=None):
             raise ValueError("Observed node {} already exists!".format(obs_node))

         if operation is None:
-            compiled_dict = compiled_net.node[node].copy()
+            compiled_dict = compiled_net.nodes[node].copy()
         else:
             compiled_dict = dict(operation=operation)
-
-        compiled_net.add_node(obs_node, compiled_dict)
+        compiled_net.add_node(obs_node, **compiled_dict)
         return obs_node
@@ -176,8 +175,8 @@ def compile(cls, source_net, compiled_net):
         instruction_node_map = dict(_uses_batch_size='_batch_size', _uses_meta='_meta')

         for instruction, _node in instruction_node_map.items():
-            for node, d in source_net.nodes_iter(data=True):
-                if d.get(instruction):
+            for node, d in source_net.nodes(data=True):
+                if d['attr_dict'].get(instruction):
                     if not compiled_net.has_node(_node):
                         compiled_net.add_node(_node)
                     compiled_net.add_edge(_node, node, param=_node[1:])
@@ -203,8 +202,8 @@ def compile(cls, source_net, compiled_net):
         logger.debug("{} compiling...".format(cls.__name__))

         _random_node = '_random_state'
-        for node, d in source_net.nodes_iter(data=True):
-            if '_stochastic' in d:
+        for node, d in source_net.nodes(data=True):
+            if '_stochastic' in d['attr_dict']:
                 if not compiled_net.has_node(_random_node):
                     compiled_net.add_node(_random_node)
                 compiled_net.add_edge(_random_node, node, param='random_state')
@@ -230,7 +229,7 @@ def compile(cls, source_net, compiled_net):

         outputs = compiled_net.graph['outputs']
         output_ancestors = nbunch_ancestors(compiled_net, outputs)
-        for node in compiled_net.nodes():
+        for node in list(compiled_net.nodes()):
             if node not in output_ancestors:
                 compiled_net.remove_node(node)
         return compiled_net
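The last hunk wraps the node iterator in `list(...)` because NetworkX 2 returns a live view: removing nodes while iterating over `G.nodes()` raises a `RuntimeError`. The same rule holds for any dict-backed container in Python; a minimal illustration without NetworkX, using a plain dict as a stand-in for the node set:

```python
# A dict standing in for a graph's node set.
nodes = {'mu': {}, 'sigma': {}, '_orphan': {}}
keep = {'mu', 'sigma'}

# Iterating over a snapshot (list) lets us delete from the live dict,
# which is what `for node in list(compiled_net.nodes())` achieves;
# iterating the dict directly while deleting would raise RuntimeError.
for name in list(nodes):
    if name not in keep:
        del nodes[name]

print(sorted(nodes))  # ['mu', 'sigma']
```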