diff --git a/.graphics/gillespy2-UML-class-diagram.png b/.graphics/gillespy2-UML-class-diagram.png
index 182602595..6163e3084 100644
Binary files a/.graphics/gillespy2-UML-class-diagram.png and b/.graphics/gillespy2-UML-class-diagram.png differ
diff --git a/UML_CLASS_DIAGRAM.md b/UML_CLASS_DIAGRAM.md
index c5fb578bb..6038c3ad5 100644
--- a/UML_CLASS_DIAGRAM.md
+++ b/UML_CLASS_DIAGRAM.md
@@ -10,8 +10,8 @@ This diagram was built from a [UML class model](gillespy2-UML-class-model.pyns)
-GillesPySolver: a mathematical algorithm for running a Model object, creating a Results object containing simulation data.
--Results: a dictionary containing data from a simulation trajectory generated by running a Model via a solver.
+-Trajectory: a dictionary containing data from a simulation trajectory generated by running a Model via a solver.
--EnsembleResults: a list of data dictionaries from multiple trajectories generated by running the same Model over multiple instances.
+-Results: a list of data dictionaries from one or more trajectories generated by running the same Model over multiple instances.
![gillespy2-UML-class-diagram](.graphics/gillespy2-UML-class-diagram.png)
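
For context, the relationship between the renamed classes can be seen in a minimal usage sketch (illustrative only; the decay model, its species `S`, and parameter `k` are made up, and it assumes the standard `gillespy2.Model.run()` entry point):

```python
# Illustrative sketch: Model.run() returns a Results container holding one
# Trajectory (a dict of arrays keyed by species name) per requested run.
import numpy
import gillespy2

class Decay(gillespy2.Model):
    def __init__(self):
        gillespy2.Model.__init__(self, name="decay")
        k = gillespy2.Parameter(name="k", expression=0.3)
        s = gillespy2.Species(name="S", initial_value=100)
        self.add_parameter(k)
        self.add_species(s)
        self.add_reaction(gillespy2.Reaction(name="degrade",
                                             reactants={s: 1}, products={},
                                             rate=k))
        self.timespan(numpy.linspace(0, 20, 101))

results = Decay().run(number_of_trajectories=5)  # Results: one entry per trajectory
trajectory = results[0]                          # Trajectory: dict of arrays
print(trajectory["time"][:3], trajectory["S"][:3])
```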
diff --git a/docs/build/html/.buildinfo b/docs/build/html/.buildinfo
index e8b00d4b6..201930fad 100644
--- a/docs/build/html/.buildinfo
+++ b/docs/build/html/.buildinfo
@@ -1,4 +1,4 @@
# Sphinx build info version 1
# This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done.
-config: 90a763d479747475ebce9163f6d985c4
+config: 80c92b260bd66298b23222542d9d1bb0
tags: 645f666f9bcd5a90fca523b33c5a78b7
diff --git a/docs/build/html/.doctrees/classes/gillespy2.core.doctree b/docs/build/html/.doctrees/classes/gillespy2.core.doctree
index f539b8fed..d2db46e56 100644
Binary files a/docs/build/html/.doctrees/classes/gillespy2.core.doctree and b/docs/build/html/.doctrees/classes/gillespy2.core.doctree differ
diff --git a/docs/build/html/.doctrees/classes/gillespy2.doctree b/docs/build/html/.doctrees/classes/gillespy2.doctree
index 7a24c419e..880fedb5b 100644
Binary files a/docs/build/html/.doctrees/classes/gillespy2.doctree and b/docs/build/html/.doctrees/classes/gillespy2.doctree differ
diff --git a/docs/build/html/.doctrees/classes/gillespy2.sbml.doctree b/docs/build/html/.doctrees/classes/gillespy2.sbml.doctree
index 9ba81555b..547878e3f 100644
Binary files a/docs/build/html/.doctrees/classes/gillespy2.sbml.doctree and b/docs/build/html/.doctrees/classes/gillespy2.sbml.doctree differ
diff --git a/docs/build/html/.doctrees/classes/gillespy2.solvers.auto.doctree b/docs/build/html/.doctrees/classes/gillespy2.solvers.auto.doctree
index 3a9f49b50..a1dbfab53 100644
Binary files a/docs/build/html/.doctrees/classes/gillespy2.solvers.auto.doctree and b/docs/build/html/.doctrees/classes/gillespy2.solvers.auto.doctree differ
diff --git a/docs/build/html/.doctrees/classes/gillespy2.solvers.cpp.doctree b/docs/build/html/.doctrees/classes/gillespy2.solvers.cpp.doctree
index 189200c05..9e6b8b931 100644
Binary files a/docs/build/html/.doctrees/classes/gillespy2.solvers.cpp.doctree and b/docs/build/html/.doctrees/classes/gillespy2.solvers.cpp.doctree differ
diff --git a/docs/build/html/.doctrees/classes/gillespy2.solvers.cython.doctree b/docs/build/html/.doctrees/classes/gillespy2.solvers.cython.doctree
index fdddc59a1..7b80e5415 100644
Binary files a/docs/build/html/.doctrees/classes/gillespy2.solvers.cython.doctree and b/docs/build/html/.doctrees/classes/gillespy2.solvers.cython.doctree differ
diff --git a/docs/build/html/.doctrees/classes/gillespy2.solvers.doctree b/docs/build/html/.doctrees/classes/gillespy2.solvers.doctree
index d382e68e8..4c4e9ddd7 100644
Binary files a/docs/build/html/.doctrees/classes/gillespy2.solvers.doctree and b/docs/build/html/.doctrees/classes/gillespy2.solvers.doctree differ
diff --git a/docs/build/html/.doctrees/classes/gillespy2.solvers.numpy.doctree b/docs/build/html/.doctrees/classes/gillespy2.solvers.numpy.doctree
index 173787471..b5ca910e8 100644
Binary files a/docs/build/html/.doctrees/classes/gillespy2.solvers.numpy.doctree and b/docs/build/html/.doctrees/classes/gillespy2.solvers.numpy.doctree differ
diff --git a/docs/build/html/.doctrees/classes/gillespy2.solvers.stochkit.doctree b/docs/build/html/.doctrees/classes/gillespy2.solvers.stochkit.doctree
index 8dbc43e58..911122ace 100644
Binary files a/docs/build/html/.doctrees/classes/gillespy2.solvers.stochkit.doctree and b/docs/build/html/.doctrees/classes/gillespy2.solvers.stochkit.doctree differ
diff --git a/docs/build/html/.doctrees/environment.pickle b/docs/build/html/.doctrees/environment.pickle
index da304d6b5..7b95137eb 100644
Binary files a/docs/build/html/.doctrees/environment.pickle and b/docs/build/html/.doctrees/environment.pickle differ
diff --git a/docs/build/html/.doctrees/getting_started/basic_usage/basic_usage.doctree b/docs/build/html/.doctrees/getting_started/basic_usage/basic_usage.doctree
index 1af77fac6..fcaf38d37 100644
Binary files a/docs/build/html/.doctrees/getting_started/basic_usage/basic_usage.doctree and b/docs/build/html/.doctrees/getting_started/basic_usage/basic_usage.doctree differ
diff --git a/docs/build/html/.doctrees/getting_started/installation/installation.doctree b/docs/build/html/.doctrees/getting_started/installation/installation.doctree
index 42acca564..dee01a5da 100644
Binary files a/docs/build/html/.doctrees/getting_started/installation/installation.doctree and b/docs/build/html/.doctrees/getting_started/installation/installation.doctree differ
diff --git a/docs/build/html/.doctrees/index.doctree b/docs/build/html/.doctrees/index.doctree
index 54cc1c7c0..ef80a1848 100644
Binary files a/docs/build/html/.doctrees/index.doctree and b/docs/build/html/.doctrees/index.doctree differ
diff --git a/docs/build/html/.doctrees/tutorials/tut_michaelis_menten/tut_michaelis_menten.doctree b/docs/build/html/.doctrees/tutorials/tut_michaelis_menten/tut_michaelis_menten.doctree
index 9141b6a0a..727967474 100644
Binary files a/docs/build/html/.doctrees/tutorials/tut_michaelis_menten/tut_michaelis_menten.doctree and b/docs/build/html/.doctrees/tutorials/tut_michaelis_menten/tut_michaelis_menten.doctree differ
diff --git a/docs/build/html/.doctrees/tutorials/tut_sbml/tut_sbml.doctree b/docs/build/html/.doctrees/tutorials/tut_sbml/tut_sbml.doctree
index c65b5af04..496686735 100644
Binary files a/docs/build/html/.doctrees/tutorials/tut_sbml/tut_sbml.doctree and b/docs/build/html/.doctrees/tutorials/tut_sbml/tut_sbml.doctree differ
diff --git a/docs/build/html/.doctrees/tutorials/tut_toggle_switch/tut_toggle_switch.doctree b/docs/build/html/.doctrees/tutorials/tut_toggle_switch/tut_toggle_switch.doctree
index 65769bfbf..1ac78ea8e 100644
Binary files a/docs/build/html/.doctrees/tutorials/tut_toggle_switch/tut_toggle_switch.doctree and b/docs/build/html/.doctrees/tutorials/tut_toggle_switch/tut_toggle_switch.doctree differ
diff --git a/docs/build/html/_modules/collections.html b/docs/build/html/_modules/collections.html
new file mode 100644
index 000000000..c77c83a55
--- /dev/null
+++ b/docs/build/html/_modules/collections.html
@@ -0,0 +1,1379 @@
[Sphinx "_modules" source listing: verbatim copy of the Python standard library
collections module (namedtuple, deque, ChainMap, Counter, OrderedDict, defaultdict,
UserDict, UserList, UserString) omitted here.]
[Sphinx "_modules" source listing: verbatim copy of the Python standard library
contextlib module (AbstractContextManager, ContextDecorator, closing, redirect_stdout,
redirect_stderr, suppress, ExitStack) omitted here.]
[The Sphinx source listing for gillespy2.core.events follows.]
+class EventAssignment:
+    """
+    An EventAssignment describes a change to be performed on the current model
+    simulation. The assignment can either be fired at the time its associated
+    trigger changes from false to true, or after a specified delay, depending
+    on how the Event to which it is assigned is configured.
+
+    Attributes
+    ----------
+    variable : gillespy2.Species, gillespy2.Parameter
+        Target model component to be modified by the EventAssignment
+        expression. Valid target variables include gillespy2 Species,
+        Parameters, and Compartments.
+    expression : str
+        String to be evaluated when the event is fired. This expression must
+        be evaluable within the model namespace, and the result of its
+        evaluation will be assigned to the EventAssignment variable.
+    """
+
+    def __init__(self, variable=None, expression=None):
+
+        self.variable = variable
+        self.expression = expression
+
+        if expression is not None:
+            self.expression = str(expression)
+
+        from gillespy2.core.gillespy2 import Species, Parameter
+        # TODO: ADD Compartment to valid variable types once implemented
+        valid_variable_types = [Species, Parameter, str]
+
+        if not type(variable) in valid_variable_types:
+            print(variable)
+            print(type(variable))
+            raise EventError(
+                'GillesPy2 Event Assignment variable must be a valid gillespy2 species')
+        if not isinstance(self.expression, str):
+            raise EventError(
+                'GillesPy2 Event Assignment expression requires a '
+                'valid string expression')
+
+    def __str__(self):
+        return self.variable.name + ': ' + self.expression
+
+
+
+class EventTrigger:
+    """
+    A Trigger detects changes in model/environment conditions in order to fire
+    an event. A Trigger contains an expression, a mathematical function that
+    can be evaluated to a boolean value within a model's namespace. Upon
+    transitioning from 'false' to 'true', this trigger causes the immediate
+    execution of an event's list of assignments if no delay is present;
+    otherwise the delay evaluation is initialized.
+
+    Attributes
+    ----------
+    expression : str
+        String for a function calculating EventTrigger values. Should be
+        evaluable within the namespace of the Model.
+    value : bool
+        Value of the EventTrigger at simulation start (time t=0).
+    persistent : bool
+        Determines whether the trigger condition is persistent.
+    """
+
+    def __init__(self, expression=None, initial_value=False, persistent=False):
+
+        if isinstance(expression, str):
+            self.expression = expression
+        else:
+            raise EventError('EventTrigger expression must be a string')
+
+        if isinstance(initial_value, bool):
+            self.value = initial_value
+        else:
+            raise EventError('EventTrigger initial_value must be bool')
+
+        if isinstance(persistent, bool):
+            self.persistent = persistent
+        else:
+            raise EventError('EventTrigger.persistent must be bool')
+
+    def __str__(self):
+        return self.expression
+
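Likewise, a minimal EventTrigger sketch; the expression assumes a species named 'protein' exists in the model namespace:

    # Fires the first time 'protein' exceeds 100 during the simulation.
    trig = EventTrigger(expression="protein > 100", initial_value=False,
                        persistent=True)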
[docs]class Event:
+    """
+    An Event describes a discontinuous change to be applied to a model
+    simulation when its trigger condition transitions from false to true,
+    either immediately or after an optional delay.
+
+    Attributes
+    ----------
+    name : str
+        The name by which this Event is called or referenced in reactions.
+    assignments : list
+        List of EventAssignments to be executed at trigger or delay
+    trigger : EventTrigger
+        Contains a math expression which can be evaluated to
+        a boolean result. Upon the transition from 'False' to 'True',
+        event assignments may be executed immediately, or after a
+        designated delay.
+    delay : string
+        Contains a math expression evaluable within the model namespace.
+        This expression designates a delay between the trigger of
+        an event and the execution of its assignments.
+    priority : string
+        Contains a math expression evaluable within the model namespace.
+        TODO: MORE INFO
+    use_values_from_trigger_time : boolean
+    """
+
+    def __init__(self, name="", delay=None, assignments=[], priority="0",
+                 trigger=None, use_values_from_trigger_time=False):
+
+        # Events can contain any number of assignments
+        self.assignments = []
+
+        # Name
+        if isinstance(name, str):
+            self.name = name
+        else:
+            raise EventError(
+                'name must be a valid string')
+
+        # Trigger
+        if hasattr(trigger, 'expression'):
+            self.trigger = trigger
+        else:
+            raise EventError(
+                'trigger must be set to a valid EventTrigger')
+
+        # Delay
+        if delay is None or isinstance(delay, str):
+            self.delay = delay
+        else:
+            raise EventError(
+                'delay must be a valid string or None')
+
+        # Priority
+        self.priority = priority
+
+        # Assignments
+        if isinstance(assignments, list):
+            for assign in assignments:
+                if hasattr(assign, 'variable'):
+                    self.assignments.append(assign)
+                else:
+                    raise EventError('assignment list contains an item '
+                                     'that is not an EventAssignment.')
+        elif hasattr(assignments, 'variable'):
+            self.assignments.append(assignments)
+        else:
+            raise EventError(
+                'assignments must contain only EventAssignments '
+                'or a list of EventAssignments')
+
+        # Use Values from Trigger Time
+        if isinstance(use_values_from_trigger_time, bool):
+            self.use_values_from_trigger_time = use_values_from_trigger_time
+        else:
+            raise EventError(
+                'use_values_from_trigger_time requires bool')
+
+    def __str__(self):
+        print_string = self.name
+        print_string += '\n\tTrigger: ' + str(self.trigger)
+        if len(self.assignments):
+            print_string += '\n\tAssignments:'
+            for a in self.assignments:
+                print_string += '\n\t\t' + a.variable.name + ': ' + a.expression
+        return print_string
+
+
[docs]def add_assignment(self, assignment):
+        """
+        Adds an EventAssignment or a list of EventAssignments to this Event.
+
+        Attributes
+        ----------
+        assignment : EventAssignment or a list of EventAssignments
+            The assignment or list of assignments to be added to this event.
+        """
+
+        if hasattr(assignment, 'variable'):
+            self.assignments.append(assignment)
+        elif isinstance(assignment, list):
+            for assign in assignment:
+                if hasattr(assign, 'variable'):
+                    self.assignments.append(assign)
+                else:
+                    raise EventError('add_assignment failed to add EventAssignment. '
+                                     'Assignment to be added must be of type EventAssignment '
+                                     'or list of EventAssignment objects.')
+        else:
+            raise ModelError("Unexpected parameter for add_assignment. Parameter must be EventAssignment or list of EventAssignments")
+        return assignment
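Putting the three classes together, a hedged usage sketch (the 'protein' Species is hypothetical and would normally already belong to a Model):

    # Reset 'protein' to 0, two time units after it first exceeds 100.
    trig = EventTrigger(expression="protein > 100")
    reset = EventAssignment(variable=protein, expression="0")
    ev = Event(name="reset_protein", trigger=trig, delay="2",
               assignments=[reset])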
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
\ No newline at end of file
diff --git a/docs/build/html/_modules/gillespy2/core/gillespy2.html b/docs/build/html/_modules/gillespy2/core/gillespy2.html
index 686571a6b..c47aef66e 100644
--- a/docs/build/html/_modules/gillespy2/core/gillespy2.html
+++ b/docs/build/html/_modules/gillespy2/core/gillespy2.html
@@ -4,7 +4,7 @@
- gillespy2.core.gillespy2 — GillesPy2 1.3.0 documentation
+ gillespy2.core.gillespy2 — GillesPy2 1.4.0 documentation
@@ -106,11 +106,13 @@
"""Base class for GillesPy2 objects that are sortable."""def__eq__(self,other):
- return(isinstance(other,self.__class__)
- andordered(self)==ordered(other))
+ returnstr(self)==str(other)def__ne__(self,other):returnnotself.__eq__(other)
@@ -290,27 +291,27 @@
The species or list of species to be added to the model object. """
-        if isinstance(obj, Species):
-            problem = self.problem_with_name(obj.name)
-            if problem is not None:
-                raise problem
-            self.listOfSpecies[obj.name] = obj
-            self._listOfSpecies[obj.name] = 'S{}'.format(len(self._listOfSpecies))
-        elif isinstance(obj, list):
+        if isinstance(obj, list):
            for S in sorted(obj):
                self.add_species(S)
        else:
-            raise ModelError("Unexpected parameter for add_species. Parameter must be Species or list of Species.")
+            try:
+                problem = self.problem_with_name(obj.name)
+                if problem is not None:
+                    raise problem
+                self.listOfSpecies[obj.name] = obj
+                self._listOfSpecies[obj.name] = 'S{}'.format(len(self._listOfSpecies))
+            except Exception as e:
+                raise ParameterError("Error using {} as a Species. Reason given: {}".format(obj, e))
        return obj

            for p in sorted(params):
                self.add_parameter(p)
        else:
-            if isinstance(params, Parameter):
+            try:
                problem = self.problem_with_name(params.name)
                if problem is not None:
                    raise problem
                self.listOfParameters[params.name] = params
                self._listOfParameters[params.name] = 'P{}'.format(len(self._listOfParameters))
-            else:
-                raise ParameterError("Could not resolve Parameter expression {} to a scalar value.".format(params))
+            except Exception as e:
+                raise ParameterError("Error using {} as a Parameter. Reason given: {}".format(params, e))
        return params

+        else:
+            try:
+                self.listOfFunctionDefinitions[function_definitions.name] = function_definitions
+            except Exception as e:
+                raise ParameterError("Error using {} as a Function Definition. Reason given: {}".format(function_definitions, e))
            raise InvalidModelError("StochKit only supports uniform timespans")

[docs]def get_reaction(self, rname):
+        """
+        :param rname: name of reaction to return
+        :return: Reaction object
+        """
        return self.listOfReactions[rname]

[docs]def get_all_reactions(self):
+        """
+        :return: dict of all Reaction objects
+        """
        return self.listOfReactions

[docs]def delete_reaction(self, obj):
+        """
+        :param obj: Name of Reaction to be removed
+        """
        self.listOfReactions.pop(obj)
        self._listOfReactions.pop(obj)

[docs]def delete_all_reactions(self):
+        """
+        Clears all reactions in model
+        """
        self.listOfReactions.clear()
        self._listOfReactions.clear()
+
[docs]def get_event(self, ename):
+        """
+        :param ename: Name of Event to get
+        :return: Event object
+        """
+        return self.listOfEvents[ename]
+
+
[docs]def get_all_events(self):
+        """
+        :return: dict of all Event objects
+        """
+        return self.listOfEvents
+
+
[docs]def delete_event(self, ename):
+        """
+        Removes specified Event from model
+        :param ename: Name of Event to be removed
+        """
+        self.listOfEvents.pop(ename)
+        self._listOfEvents.pop(ename)

[docs]def get_rate_rule(self, rname):
+        """
+        :param rname: Name of Rate Rule to get
+        :return: RateRule object
+        """
+        return self.listOfRateRules[rname]
+
+
[docs]def get_all_rate_rules(self):
+        """
+        :return: dict of all Rate Rule objects
+        """
+        return self.listOfRateRules
+
+
[docs]def delete_rate_rule(self, rname):
+        """
+        Removes specified Rate Rule from model
+        :param rname: Name of Rate Rule to be removed
+        """
+        self.listOfRateRules.pop(rname)
+        self._listOfRateRules.pop(rname)
+
+
[docs]def delete_all_rate_rules(self):
+        """
+        Clears all of the model's Rate Rules
+        """
+        self.listOfRateRules.clear()
+        self._listOfRateRules.clear()
+
+
[docs]def get_assignment_rule(self, aname):
+        """
+        :param aname: Name of Assignment Rule to get
+        :return: Assignment Rule object
+        """
+        return self.listOfAssignmentRules[aname]

[docs]def delete_assignment_rule(self, aname):
+        """
+        Removes an assignment rule from a model
+        :param aname: Name of AssignmentRule object to be removed from model
+        """
+        self.listOfAssignmentRules.pop(aname)
+        self._listOfAssignmentRules.pop(aname)
+
+
[docs]def delete_all_assignment_rules(self):
+        """
+        Clears all assignment rules from model
+        """
+        self.listOfAssignmentRules.clear()
+        self._listOfAssignmentRules.clear()
+
+
[docs]def get_function_definition(self, fname):
+        """
+        :param fname: name of Function to get
+        :return: FunctionDefinition object
+        """
+        return self.listOfFunctionDefinitions[fname]
+
+
[docs]def get_all_function_definitions(self):
+        """
+        :return: dict of the model's function definitions
+        """
+        return self.listOfFunctionDefinitions
+
+
[docs]def delete_function_definition(self, fname):
+        """
+        Removes specified Function Definition from model
+        :param fname: Name of Function Definition to be removed
+        """
+        self.listOfFunctionDefinitions.pop(fname)
+        self._listOfFunctionDefinitions.pop(fname)
+
+
[docs]def delete_all_function_definitions(self):
+        """
+        Clears all Function Definitions from a model
+        """
+        self.listOfFunctionDefinitions.clear()
+        self._listOfFunctionDefinitions.clear()
+
+
[docs]def get_element(self, ename):
+        """
+        Get the element specified by name
+        :param ename: name of element to search for
+        :return: value of element, or 'element not found'
+        """
+        if ename in self.listOfReactions:
+            return self.get_reaction(ename)
+        if ename in self.listOfSpecies:
+            return self.get_species(ename)
+        if ename in self.listOfParameters:
+            return self.get_parameter(ename)
+        if ename in self.listOfEvents:
+            return self.get_event(ename)
+        if ename in self.listOfRateRules:
+            return self.get_rate_rule(ename)
+        if ename in self.listOfAssignmentRules:
+            return self.get_assignment_rule(ename)
+        if ename in self.listOfFunctionDefinitions:
+            return self.get_function_definition(ename)
+        return 'Element not found!'
+
[docs]def run(self, solver=None, timeout=0, **solver_args):
        """ Function calling simulation of the model. There are a number of
@@ -686,10 +874,9 @@
Source code for gillespy2.core.gillespy2
        Return
        ----------
- If show_labels is False, returns a numpy array of arrays of species population data. If show_labels is True and
- number_of_trajectories is 1, returns a results object that inherits UserDict and supports plotting functions.
- If show_labels is False and number_of_trajectories is greater than 1, returns an ensemble_results object that
- inherits UserList and contains results objects and supports ensemble graphing.
+        If show_labels is False, returns a numpy array of arrays of species population data. If show_labels is
+        True, returns a Results object that inherits UserList and contains one or more Trajectory objects that
+        inherit UserDict. The Results object supports graphing and csv export.

        Attributes
        ----------
@@ -703,75 +890,42 @@
Source code for gillespy2.core.gillespy2
            solver-specific arguments to be passed to solver.run()
        """
-        if os.name == 'nt' and timeout > 0:
-            from gillespy2.core import log
-            log.warning('Timeouts are not currently supported in Windows.')
-        @contextmanager
-        def time_out(time):
-            # Register a function to raise a TimeoutError on the signal.
-            signal.signal(signal.SIGALRM, raise_time_out)
-            # Schedule the signal to be sent after ``time``.
-            signal.alarm(time)
-
+        if solver is not None:
            try:
-                yield
-            except TimeoutError:
-                print('GillesPy2 solver simulation exceeded timeout')
-                pass
-            finally:
-                # Unregister the signal so it won't be triggered
-                # if the time_out is not reached.
-                signal.signal(signal.SIGALRM, signal.SIG_IGN)
+                solver_results, rc = solver.run(model=self, t=self.tspan[-1],
+                                                increment=self.tspan[-1] - self.tspan[-2], timeout=timeout, **solver_args)
+            except Exception as e:
+                raise SimulationError(
+                    "argument 'solver={}' to run() failed. Reason Given: {}".format(solver, e))
+        else:
+            from gillespy2.solvers.auto import SSASolver
+            solver = SSASolver
+            solver_results, rc = SSASolver.run(model=self, t=self.tspan[-1],
+                                               increment=self.tspan[-1] -
+                                               self.tspan[-2], timeout=timeout, **solver_args)

-        def raise_time_out(signum, frame):
+        if rc == 33:
            from gillespy2.core import log
-            import sys
-            def excepthook(type, value, traceback):
-                pass
-            sys.excepthook = excepthook
            log.warning('GillesPy2 simulation exceeded timeout.')
-            raise SimulationTimeoutError()
-
-
-        with time_out(timeout):
-            if solver is not None:
-                if ((isinstance(solver, type)
-                        and issubclass(solver, GillesPySolver))) or issubclass(type(solver), GillesPySolver):
-                    if solver.name == 'SSACSolver':
-                        signal.signal(signal.SIGALRM, signal.SIG_IGN)
-                        solver_args['timeout'] = timeout
-                    solver_results, rc = solver.run(model=self, t=self.tspan[-1], increment=self.tspan[-1] - self.tspan[-2], **solver_args)
-                else:
-                    raise SimulationError(
-                        "argument 'solver' to run() must be a subclass of GillesPySolver")
-            else:
-                from gillespy2.solvers.auto import SSASolver
-                solver = SSASolver
-                if solver.name == 'SSACSolver':
-                    signal.signal(signal.SIGALRM, signal.SIG_IGN)
-                    solver_args['timeout'] = timeout
-                solver_results, rc = SSASolver.run(model=self, t=self.tspan[-1],
-                                                   increment=self.tspan[-1] - self.tspan[-2], **solver_args)
-
-        if rc == 33:
-            from gillespy2.core import log
-            log.warning('GillesPy2 simulation exceeded timeout.')
-
-        if isinstance(solver_results[0], (np.ndarray)):
-            return solver_results
-
-        if len(solver_results) is 1:
-            return Results(data=solver_results[0], model=self,
-                           solver_name=solver.name, rc=rc)
-
-        if len(solver_results) > 1:
-            results_list = []
-            for i in range(0, solver_args.get('number_of_trajectories')):
-                results_list.append(Results(data=solver_results[i], model=self, solver_name=solver.name,
-                                            rc=rc))
-            return EnsembleResults(results_list)
-        else:
-            raise ValueError("number_of_trajectories must be non-negative and non-zero")
        if mode == 'continuous':
            self.initial_value = np.float(initial_value)
        else:
-            if not isinstance(initial_value, int):
-                raise ValueError('Discrete values must be of type int.')
+            if np.int(initial_value) != initial_value:
+                raise ValueError("'initial_value' for Species with mode='discrete' must be an integer value. Change to mode='continuous' to use floating point values.")
            self.initial_value = np.int(initial_value)
        if not allow_negative_populations:
            if self.initial_value < 0:
                raise ValueError('A species initial value must be \
@@ -1013,7 +1168,10 @@
    Attributes
    ----------
    name : str
-        The name by which the reaction is called.
+        The name by which the reaction is called (optional).
    reactants : dict
        The reactants that are consumed in the reaction, with stoichiometry. An
        example would be {R1 : 1, R2 : 2} if the reaction consumes two of R1 and
@@ -1070,7 +1228,10 @@
Source code for gillespy2.core.gillespy2
"""# Metadata
- self.name=name
+ ifname==""ornameisNone:
+ self.name='rxn'+str(uuid.uuid4()).replace('-','_')
+ else:
+ self.name=nameself.annotation=""# We might use this flag in the future to automatically generate
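A sketch of the new auto-naming behaviour (the Species and Parameter placeholders A, B and k1 are assumptions):

    # A Reaction created without a name now receives a unique auto-generated one.
    r = Reaction(reactants={A: 1}, products={B: 1}, rate=k1)
    print(r.name)    # e.g. 'rxn3f2a..._...' instead of an empty string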
@@ -1115,16 +1276,107 @@
import warnings
-import csv
-import os
from datetime import datetime
+from gillespy2.core.gillespyError import *
+import pickle
from collections import UserDict, UserList

-# List of 50 hex color values used for ploting graphs
+# List of 50 hex color values used for plotting graphs
common_rgb_values = ['#1f77b4', '#ff7f0e', '#2ca02c', '#d62728', '#9467bd', '#8c564b', '#e377c2', '#7f7f7f',
                     '#bcbd22', '#17becf', '#ff0000', '#00ff00', '#0000ff', '#ffff00', '#00ffff', '#ff00ff',
                     '#800000', '#808000', '#008000', '#800080', '#008080', '#000080', '#ff9999', '#ffcc99',
@@ -118,9 +118,8 @@
[docs]class Results(UserDict):
-    """ Results Dict created by a gillespy2 solver with single trajectory, extends the UserDict object.
+
[docs]class Trajectory(UserDict):
+    """ Trajectory Dict created by a gillespy2 solver containing a single trajectory, extends the UserDict object.

    Attributes
    ----------
-    data : UserList
-        A list of Results that are created by solvers with multiple trajectories
+    data : UserDict
+        A dictionary of trajectory values created by a solver
+    model : string
+        The name of the model used to create the trajectory
+    solver_name : string
+        The name of the solver used to create the trajectory
+    rc : int
+        The solver's status return code.
+    status : string
+        The solver status (e.g. 'Success', 'Timed Out')
    """

    def __init__(self, data, model=None, solver_name="Undefined solver name", rc=0):
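A hedged sketch of dict-style access on a Trajectory, assuming 'results' came from an earlier model.run() call and the model defines a species named 'protein':

    traj = results[0]            # one Trajectory from a Results list
    print(traj['time'][-1])      # final time point
    print(traj['protein'][-1])   # final population of the assumed species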
@@ -199,167 +206,99 @@
Source code for gillespy2.core.results
        status_list = {0: 'Success', 33: 'Timed Out'}
        self.status = status_list[rc]

    def __getitem__(self, key):
-        if type(key) is type(1):
-            warnings.warn("Results is of type dictionary. Use results['species'] instead of results[0]['species'] ")
+        if type(key) is int:
+            warnings.warn("Trajectory is of type dictionary. Use trajectory['species'] instead of trajectory[0]['species'] ")
            return self
        if key in self.data:
            return self.data[key]
        if hasattr(self.__class__, "__missing__"):
            return self.__class__.__missing__(self, key)
-        raise KeyError(key)
+        raise KeyError(key)
-
[docs]def to_csv(self, path=None, nametag=None, stamp=None):
-        """ outputs the Results to one or more .csv files in a new directory.
-        Attributes
-        ----------
-        nametag: allows the user to optionally "tag" the directory and included files. Defaults to the model name.
-        path: path to the location for the new directory and included files. Defaults to model location.
-        stamp: allows the user to optionally identify the directory (not included files). Defaults to timestamp.
-        """
-        if stamp is None:
-            now = datetime.now()
-            stamp = datetime.timestamp(now)
-        if nametag is None:
-            identifier = (self.model.name + " - " + self.solver_name)
-        else:
-            identifier = nametag
-        if isinstance(self.data, dict):  # if only one trajectory
-            if path is None:
-                directory = os.path.join(".", str(identifier) + str(stamp))
-            else:
-                directory = os.path.join(path, str(identifier) + str(stamp))
-            os.mkdir(directory)
-            filename = os.path.join(directory, identifier + ".csv")
-            field_names = []
-            for species in self.data:  # build the header
-                field_names.append(species)
-            with open(filename, 'w', newline='') as csv_file:
-                csv_writer = csv.writer(csv_file)
-                csv_writer.writerow(field_names)  # write the header
-                for n, time in enumerate(self.data['time']):  # write all lines of the CSV file
-                    this_line = []
-                    for species in self.data:  # build one line of the CSV file
-                        this_line.append(self.data[species][n])
-                    csv_writer.writerow(this_line)  # write one line of the CSV file
-
-
-
-
[docs]def plot(self, xaxis_label="Time (s)", yaxis_label="Species Population", title=None, style="default",
-             show_legend=True, included_species_list=[], save_png=False, figsize=(18, 10)):
-        """ Plots the Results using matplotlib.
+
[docs]class Results(UserList):
+    """ List of Trajectory objects created by a gillespy2 solver, extends the UserList object.

-    Attributes
+    Attributes
    ----------
-    xaxis_label : str
-        the label for the x-axis
-    yaxis_label : str
-        the label for the y-axis
-    title : str
-        the title of the graph
-    show_legend : bool
-        whether or not to display a legend which lists species
-    included_species_list : list
-        A list of strings describing which species to include. By default displays all species.
-    save_png : bool or str
-        Should the graph be saved as a png file. If True, File name is title of graph. If a string is given, file
-        is named after that string.
-    figsize : tuple
-        the size of the graph. A tuple of the form (width,height). Is (18,10) by default.
-
+    data : UserList
+        A list of Trajectory objects
    """
-        import matplotlib.pyplot as plt
-
-        try:
-            plt.style.use(style)
-        except:
-            warnings.warn("Invalid matplotlib style. Try using one of the following {}".format(plt.style.available))
-            plt.style.use("default")
-
-        if title is None:
-            title = (self.model.name + " - " + self.solver_name)
-
-        plt.figure(figsize=figsize)
-        plt.title(title, fontsize=18)
-        plt.xlabel(xaxis_label)
-        plt.ylabel(yaxis_label)
-
-        _plot_iterate(self, included_species_list=included_species_list)
-
-        plt.plot([0], [11])
-
-        if show_legend:
-            plt.legend(loc='best')
-
-        if isinstance(save_png, str):
-            plt.savefig(save_png)
-
-        elif save_png:
-            plt.savefig(title)
+    def __init__(self, data):
+        self.data = data
-
[docs]def plotplotly(self, xaxis_label="Time (s)", yaxis_label="Species Population", title=None, show_legend=True,
-                   included_species_list=[], return_plotly_figure=False):
-        """ Plots the Results using plotly. Can only be viewed in a Jupyter Notebook.
+    def __getattribute__(self, key):
+        if key == 'model' or key == 'solver_name' or key == 'rc' or key == 'status':
+            if len(self.data) > 1:
+                warnings.warn("Results is of type list. Use results[i]['model'] instead of results['model'] ")
+            return (getattr(Results.__getattribute__(self, key='data')[0], key))
+        else:
+            return UserList.__getattribute__(self, key)

-        Attributes
-        ----------
-        xaxis_label : str
-            the label for the x-axis
-        yaxis_label : str
-            the label for the y-axis
-        title : str
-            the title of the graph
-        show_legend : bool
-            whether or not to display a legend which lists species
-        included_species_list : list
-            A list of strings describing which species to include. By default displays all species.
-        return_plotly_figure : bool
-            whether or not to return a figure dictionary of data(graph object traces) and layout options
-            which may be edited by the user.
+    def __getitem__(self, key):
+        if key == 'data':
+            return UserList.__getitem__(self, key)
+        if type(key) is str and key != 'data':
+            if len(self.data) > 1:
+                warnings.warn("Results is of type list. Use results[i]['model'] instead of results['model'] ")
+            return self.data[0][key]
+        else:
+            return (UserList.__getitem__(self, key))
+        raise KeyError(key)

-        """
+    def __add__(self, other):
+        combined_data = Results(data=(self.data + other.data))
+        consistent_solver = combined_data._validate_solver()
+        consistent_model = combined_data._validate_model()

-        from plotly.offline import init_notebook_mode, iplot
-        import plotly.graph_objs as go
+        if consistent_solver is False:
+            warnings.warn("Results objects contain Trajectory objects from multiple solvers.")

-        init_notebook_mode(connected=True)
+        consistent_model = combined_data._validate_model()

-        if title is None:
-            title = (self.model.name + " - " + self.solver_name)
+        if consistent_model is False:
+            raise ValidationError('Results objects contain Trajectory objects from multiple models.')

-        trace_list = _plotplotly_iterate(self, included_species_list=included_species_list, show_labels=True)
+        combined_data = self.data + other.data
+        return Results(data=combined_data)

-        layout = go.Layout(
-            showlegend=show_legend,
-            title=title,
-            xaxis=dict(
-                title=xaxis_label),
-            yaxis=dict(
-                title=yaxis_label)
-        )
-        fig = dict(data=trace_list, layout=layout)
-
-        if return_plotly_figure:
-            return fig
+    def _validate_model(self, reference=None):
+        is_valid = True
+        if reference is not None:
+            reference_model = reference
        else:
-            iplot(fig)
-
-
[docs]class EnsembleResults(UserList):
-    """ List of Results Dicts created by a gillespy2 solver with multiple trajectories, extends the UserList object.
-
-    Attributes
-    ----------
-    data : UserList
-        A list of Results
-    """
-
-    def __init__(self, data):
-        self.data = data
+            reference_model = self.data[0].model
+        for trajectory in self.data:
+            if trajectory.model != reference_model:
+                is_valid = False
+        return is_valid
+
+    def _validate_solver(self, reference=None):
+        is_valid = True
+        if reference is not None:
+            reference_solver = reference
+        else:
+            reference_solver = self.data[0].solver_name
+        for trajectory in self.data:
+            if trajectory.solver_name != reference_solver:
+                is_valid = False
+        return is_valid
+
+    def _validate_title(self):
+        if self._validate_model():
+            title_model = self.data[0].model.name
+        else:
+            title_model = 'Multiple Models'
+        if self._validate_solver():
+            title_solver = self.data[0].solver_name
+        else:
+            title_solver = 'Multiple Solvers'
+        title = (title_model + " - " + title_solver)
+        return title
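A short sketch of the new __add__ behaviour, assuming results_a and results_b came from runs of the same model:

    # Results objects can be concatenated; mixing models raises a ValidationError.
    combined = results_a + results_b
    print(len(combined))    # total number of trajectories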
-
[docs]def to_csv(self, path=None, nametag=None, stamp=None):
        """ outputs the Results to one or more .csv files in a new directory.
        Attributes
@@ -368,11 +307,14 @@
Source code for gillespy2.core.results
        path: the location for the new directory and included files. Defaults to model location.
        stamp: Allows the user to optionally "tag" the directory (not included files). Default is timestamp.
        """
+        import csv
+        import os
+
        if stamp is None:
            now = datetime.now()
            stamp = datetime.timestamp(now)
        if nametag is None:
-            identifier = (self[0].model.name + " - " + self[0].solver_name)
+            identifier = self._validate_title()
        else:
            identifier = nametag
        if path is None:
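Usage sketch (directory and tag names are illustrative):

    # Export every trajectory to CSV files in a new, timestamped directory.
    results.to_csv(path=".", nametag="my_run")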
@@ -396,12 +338,13 @@
Source code for gillespy2.core.results
                    this_line.append(trajectory[species][n])
                    csv_writer.writerow(this_line)  # write one line of the CSV file

[docs]def plot(self, index=None, xaxis_label="Time (s)", yaxis_label="Species Population", style="default",
             title=None, show_legend=True, multiple_graphs=False, included_species_list=[],
             save_png=False, figsize=(18, 10)):
        """ Plots the Results using matplotlib.

        Attributes
        ----------
+        index : if not None, the index of the Trajectory to be plotted
        xaxis_label : str
            the label for the x-axis
        yaxis_label : str
@@ -423,20 +366,26 @@
[docs]def plotplotly(self, index=None, xaxis_label="Time (s)", yaxis_label="Species Population", title=None,
                   show_legend=True, multiple_graphs=False, included_species_list=[], return_plotly_figure=False):
        """ Plots the Results using plotly. Can only be viewed in a Jupyter Notebook.

        Attributes
        ----------
+        index : if not None, the index of the Trajectory to be plotted
        xaxis_label : str
            the label for the x-axis
        yaxis_label : str
@@ -499,15 +449,24 @@
[docs]def average_ensemble(self):
        """
-        Generate a single Results dictionary that is made of the means of all trajectories' outputs
-        :return: the Results dictionary
+        Generate a single Results object with a Trajectory that is made of the means of all trajectories' outputs
+        :return: the Results object
        """

-        results_list = self.data
-        number_of_trajectories = len(results_list)
+        trajectory_list = self.data
+        number_of_trajectories = len(trajectory_list)

-        output = Results(data={}, model=results_list[0].model, solver_name=results_list[0].solver_name)
+        output_trajectory = Trajectory(data={}, model=trajectory_list[0].model, solver_name=trajectory_list[0].solver_name)

-        for species in results_list[0]:  # Initialize the output to be the same size as the inputs
-            output[species] = [0] * len(results_list[0][species])
+        for species in trajectory_list[0]:  # Initialize the output to be the same size as the inputs
+            output_trajectory[species] = [0] * len(trajectory_list[0][species])

-        output['time'] = results_list[0]['time']
+        output_trajectory['time'] = trajectory_list[0]['time']

-        for i in range(0, number_of_trajectories):  # Add every value of every Results Dict into one output Results
-            results_dict = results_list[i]
-            for species in results_dict:
-                if species is 'time':
+        for i in range(0, number_of_trajectories):  # Add every value of every Trajectory Dict into one output Trajectory
+            trajectory_dict = trajectory_list[i]
+            for species in trajectory_dict:
+                if species == 'time':
                    continue
-                for k in range(0, len(output[species])):
-                    output[species][k] += results_dict[species][k]
+                for k in range(0, len(output_trajectory[species])):
+                    output_trajectory[species][k] += trajectory_dict[species][k]

-        for species in output:  # Divide for mean of every value in output Results
-            if species is 'time':
+        for species in output_trajectory:  # Divide for mean of every value in output Trajectory
+            if species == 'time':
                continue
-            for i in range(0, len(output[species])):
-                output[species][i] /= number_of_trajectories
+            for i in range(0, len(output_trajectory[species])):
+                output_trajectory[species][i] /= number_of_trajectories
+
+        output_results = Results(data=[output_trajectory])  # package output_trajectory in a Results object
-        return output
+        return output_results
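Usage sketch, assuming 'results' holds several trajectories of a model with a species named 'protein':

    # Collapse an ensemble into a single mean trajectory.
    avg = results.average_ensemble()       # a Results holding one Trajectory
    print(avg[0]['time'][-1], avg[0]['protein'][-1])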
[docs]def stddev_ensemble(self, ddof=0):
        """
-        Generate a single Results dictionary that is made of the sample standard deviations of all trajectories'
-        outputs.
+        Generate a single Results object with a Trajectory that is made of the sample standard deviations of all
+        trajectories' outputs.

        Attributes
        ----------
@@ -607,49 +568,50 @@
Source code for gillespy2.core.results
        the number of trajectories. Sample standard deviation uses ddof of 1. Defaults to population
        standard deviation where ddof is 0.
-        :return: the Results dictionary
+        :return: the Results object
        """
        from math import sqrt

-        results_list = self.data
-        number_of_trajectories = len(results_list)
+        trajectory_list = self.data
+        number_of_trajectories = len(trajectory_list)

        if ddof == number_of_trajectories:
            warnings.warn("ddof must be less than the number of trajectories. Using ddof of 0")
            ddof = 0

-        average_list = self.average_ensemble()
+        average_list = self.average_ensemble().data[0]

-        output = Results(data={}, model=results_list[0].model, solver_name=results_list[0].solver_name)
+        output_trajectory = Trajectory(data={}, model=trajectory_list[0].model, solver_name=trajectory_list[0].solver_name)

-        for species in results_list[0]:  # Initialize the output to be the same size as the inputs
-            output[species] = [0] * len(results_list[0][species])
+        for species in trajectory_list[0]:  # Initialize the output to be the same size as the inputs
+            output_trajectory[species] = [0] * len(trajectory_list[0][species])

-        output['time'] = results_list[0]['time']
+        output_trajectory['time'] = trajectory_list[0]['time']

        for i in range(0, number_of_trajectories):
-            results_dict = results_list[i]
-            for species in results_dict:
-                if species is 'time':
+            trajectory_dict = trajectory_list[i]
+            for species in trajectory_dict:
+                if species == 'time':
                    continue
-                for k in range(0, len(output[species])):
-                    output[species][k] += (results_dict[species][k] - average_list[species][k]) \
-                                          * (results_dict[species][k] - average_list[species][k])
+                for k in range(0, len(output_trajectory['time'])):
+                    output_trajectory[species][k] += (trajectory_dict[species][k] - average_list[species][k]) \
+                                                     * (trajectory_dict[species][k] - average_list[species][k])

-        for species in output:  # Divide for mean of every value in output Results
-            if species is 'time':
+        for species in output_trajectory:  # Divide for mean of every value in output Trajectory
+            if species == 'time':
                continue
-            for i in range(0, len(output[species])):
-                output[species][i] /= (number_of_trajectories - ddof)
-                output[species][i] = sqrt(output[species][i])
+            for i in range(0, len(output_trajectory[species])):
+                output_trajectory[species][i] /= (number_of_trajectories - ddof)
+                output_trajectory[species][i] = sqrt(output_trajectory[species][i])

-        return output
+        output_results = Results(data=[output_trajectory])  # package output_trajectory in a Results object
+        return output_results
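And the standard-deviation counterpart, using a sample standard deviation (ddof=1); the species name is again an assumption:

    sd = results.stddev_ensemble(ddof=1)
    print(sd[0]['protein'][-1])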
[docs]def plotplotly_std_dev_range(self, xaxis_label="Time (s)", yaxis_label="Species Population", title=None,
                                 show_legend=True, included_species_list=[], return_plotly_figure=False, ddof=0):
        """
-        Plot a plotly graph depicting standard deviation and the mean graph of an ensemble_results object
+        Plot a plotly graph depicting standard deviation and the mean graph of a results object

        Attributes
        ----------
@@ -673,8 +635,8 @@
[docs]def plot_std_dev_range(self, xaxis_label="Time (s)", yaxis_label="Species Population", title=None,
                           style="default", show_legend=True, included_species_list=[], ddof=0,
                           save_png=False, figsize=(18, 10)):
        """
-        Plot a matplotlib graph depicting standard deviation and the mean graph of an ensemble_results object
+        Plot a matplotlib graph depicting standard deviation and the mean graph of a results object

        Attributes
        ----------
@@ -779,8 +741,8 @@
Source code for gillespy2.solvers.numpy.basic_tau_hybrid_solver
            curr_state['time'] = curr_time

            # Integrate until end or tau is reached
+            # TODO: Need a way to exit solve_ivp when timeout is triggered
            sol = solve_ivp(rhs, [curr_time, model.tspan[-1]], y0, method=integrator, dense_output=True,
                            events=tau_event, **integrator_options)
@@ -705,7 +704,7 @@
Source code for gillespy2.solvers.numpy.basic_tau_hybrid_solver
    def run(self, model, t=20, number_of_trajectories=1, increment=0.05, seed=None, debug=False, profile=False,
            show_labels=True, tau_tol=0.03, event_sensitivity=100, integrator='LSODA',
-           integrator_options={}, **kwargs):
+           integrator_options={}, timeout=None, **kwargs):
        """
        Function calling simulation of the model. This is typically called by the run function in GillesPy2 model
        objects and will inherit those parameters which are passed with the model as the arguments to this run function.
@@ -870,19 +869,56 @@
Source code for gillespy2.solvers.numpy.basic_tau_hybrid_solver
        # Main trajectory loop
        for trajectory_num in range(number_of_trajectories):
-            if self.interrupted: break
+            if self.stop_event.is_set():
+                print('exiting')
+                self.rc = 33
+                break
            trajectory = trajectory_base[trajectory_num]  # NumPy array containing this simulation's results
            propensities = OrderedDict()  # Propensities evaluated at current state
@@ -974,7 +1013,6 @@
Source code for gillespy2.solvers.numpy.basic_tau_hybrid_solver
            # One-time compilations to reduce time spent with eval
            compiled_reactions, compiled_rate_rules, compiled_inactive_reactions, compiled_propensities = self.__compile_all(model)
-
            all_compiled = OrderedDict()
            all_compiled['rxns'] = compiled_reactions
            all_compiled['inactive_rxns'] = compiled_inactive_reactions
@@ -999,12 +1037,14 @@
Source code for gillespy2.solvers.numpy.basic_tau_hybrid_solver
            # Each save step
            while curr_time < model.tspan[-1]:
-                if self.interrupted: break
+                if self.stop_event.is_set():
+                    self.rc = 33
+                    break

                # Get current propensities
                if not pure_ode:
                    for i, r in enumerate(model.listOfReactions):
                        try:
-                            propensities[r] = eval(compiled_propensities[r], eval_globals, curr_state)
+                            propensities[r] = eval(compiled_propensities[r], {**eval_globals, **curr_state})
                        except Exception as e:
                            raise SimulationError('Error calculating propensity for {0}.\nReason: {1}'.format(r, e))
@@ -1014,16 +1054,16 @@
Source code for gillespy2.solvers.numpy.basic_tau_hybrid_solver
                            model, propensities, curr_state, curr_time, save_times[0]]
                tau_step = save_times[-1] - curr_time if pure_ode else Tau.select(*tau_args)

+                # Process switching if used
+                if not pure_stochastic and not pure_ode:
+                    switch_args = [model, propensities, curr_state, tau_step, det_spec]
+                    sd, CV = self.__calculate_statistics(*switch_args)
+
                # Calculate sd and CV for hybrid switching and flag deterministic reactions
                if pure_stochastic:
                    deterministic_reactions = frozenset()  # Empty if non-det
                else:
                    deterministic_reactions = self.__flag_det_reactions(model, det_spec, det_rxn, dependencies)
-
-                # Process switching if used
-                if not pure_stochastic and not pure_ode:
-                    switch_args = [model, propensities, curr_state, tau_step, det_spec]
-                    sd, CV = self.__calculate_statistics(*switch_args)

                if debug:
                    print('mean: {0}'.format(mu_i))
@@ -1064,7 +1104,8 @@
Source code for gillespy2.solvers.numpy.basic_tau_hybrid_solver
Source code for gillespy2.solvers.numpy.basic_tau_leaping_solver
[docs]class BasicTauLeapingSolver(GillesPySolver):
    name = 'BasicTauLeapingSolver'
-    interrupted = False
    rc = 0
+    stop_event = None
+    result = None
    """ A Basic Tau Leaping Solver for GillesPy2 models. This solver uses an algorithm that calculates
    multiple reactions in a single step over a given tau step size. The change in propensities
@@ -124,8 +125,9 @@
Source code for gillespy2.solvers.numpy.basic_tau_leaping_solver
[docs]    @classmethod
    def run(self, model, t=20, number_of_trajectories=1, increment=0.05, seed=None,
-           debug=False, profile=False, show_labels=True, tau_tol=0.03, **kwargs):
+           debug=False, profile=False, show_labels=True,
+           timeout=None, tau_tol=0.03, **kwargs):
        """
        Function calling simulation of the model.
        This is typically called by the run function in GillesPy2 model objects
@@ -191,19 +194,48 @@
Source code for gillespy2.solvers.numpy.basic_tau_leaping_solver
        show_labels : bool (True)
            Use names of species as index of result object rather than position numbers.
        """
-        def timed_out(signum, frame):
-            self.rc = 33
-            self.interrupted = True
-        signal.signal(signal.SIGALRM, timed_out)
-
-
-        if not isinstance(self, BasicTauLeapingSolver):
+        if isinstance(self, type):
            self = BasicTauLeapingSolver(debug=debug, profile=profile)
+        self.stop_event = Event()
+        if timeout is not None and timeout <= 0:
+            timeout = None
        if len(kwargs) > 0:
            for key in kwargs:
                log.warning('Unsupported keyword argument to {0} solver: {1}'.format(self.name, key))
+
+        sim_thread = Thread(target=self.___run, args=(model,), kwargs={'t': t,
+                            'number_of_trajectories': number_of_trajectories,
+                            'increment': increment, 'seed': seed,
+                            'debug': debug, 'show_labels': show_labels,
+                            'timeout': timeout, 'tau_tol': tau_tol})
+        try:
+            sim_thread.start()
+            sim_thread.join(timeout=timeout)
+            self.stop_event.set()
+            while self.result is None: pass
+        except:
+            pass
+        if hasattr(self, 'has_raised_exception'):
+            raise self.has_raised_exception
+        return self.result, self.rc
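The thread-plus-Event timeout pattern used above can be shown in isolation; the worker below is a stand-in for the solver's internal ___run method, not GillesPy2 code:

    # Standalone sketch of the thread/Event timeout pattern.
    from threading import Thread, Event
    import time

    stop_event = Event()
    result = {}

    def worker():
        for step in range(1000):
            if stop_event.is_set():        # solver checks this flag each step
                result['rc'] = 33          # 33 == 'Timed Out' status code
                break
            time.sleep(0.01)               # pretend to do one simulation step
        result.setdefault('rc', 0)

    t = Thread(target=worker)
    t.start()
    t.join(timeout=0.1)                    # wait at most 0.1 s
    stop_event.set()                       # ask the worker to stop
    t.join()                               # wait for it to acknowledge
    print(result['rc'])                    # 33 if it timed out, else 0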
[docs]    @classmethod
-    def run(self, model, t=20, number_of_trajectories=1, increment=0.05, seed=None, debug=False, show_labels=True, **kwargs):
+    def run(self, model, t=20, number_of_trajectories=1, increment=0.05,
+            seed=None, debug=False, show_labels=True, timeout=None, **kwargs):
        """
        Run the SSA algorithm using NumPy for storing the data in arrays and generating the timeline.
        :param model: The model on which the solver will operate.
@@ -132,20 +135,46 @@
Source code for gillespy2.solvers.numpy.ssa_solver
        :param show_labels: Use names of species as index of result object rather than position numbers.
        :return: a list of each trajectory simulated.
        """
-        def timed_out(signum, frame):
-            self.rc = 33
-            self.interrupted = True
-        signal.signal(signal.SIGALRM, timed_out)
-
-
-        if not isinstance(self, NumPySSASolver):
+        if isinstance(self, type):
            self = NumPySSASolver()
+        self.stop_event = Event()
+        if timeout is not None and timeout <= 0:
+            timeout = None
+
        if len(kwargs) > 0:
            for key in kwargs:
                log.warning('Unsupported keyword argument to {0} solver: {1}'.format(self.name, key))

+        sim_thread = Thread(target=self.___run, args=(model,), kwargs={'t': t,
+                            'number_of_trajectories': number_of_trajectories,
+                            'increment': increment, 'seed': seed,
+                            'debug': debug, 'show_labels': show_labels,
+                            'timeout': timeout})
+        try:
+            sim_thread.start()
+            sim_thread.join(timeout=timeout)
+            self.stop_event.set()
+            while self.result is None: pass
+        except:
+            pass
+        if hasattr(self, 'has_raised_exception'):
+            raise self.has_raised_exception
+        return self.result, self.rc
Function calling simulation of the model. There are a number of
-parameters to be set here.
-
Returns
-
-If show_labels is False, returns a numpy array of arrays of species population data. If show_labels is True and
-number_of_trajectories is 1, returns a results object that inherits UserDict and supports plotting functions.
-If show_labels is False and number_of_trajectories is greater than 1, returns an ensemble_results object that
-inherits UserList and contains results objects and supports ensemble graphing.
-
-solver : gillespy.GillesPySolver
-    The solver by which to simulate the model. This solver object may
-    be initialized separately to specify an algorithm. Optional,
-    defaults to ssa solver.
-
-timeout : int
-    Allows a time_out value in seconds to be sent to a signal handler, restricting simulation run-time
-
-solver_args
-    solver-specific arguments to be passed to solver.run()
Generate a dictionary mapping user chosen parameter names to simplified formats which will be used
-later on by GillesPySolvers evaluating reaction propensity functions.
-:return: the dictionary mapping user parameter names to their internal GillesPy notation.
Generate a dictionary mapping user chosen species names to simplified formats which will be used
-later on by GillesPySolvers evaluating reaction propensity functions.
-:return: the dictionary mapping user species names to their internal GillesPy notation.
Representation of a well mixed biochemical model. Contains reactions,
-parameters, species.
-
-name : str
-    The name of the model, or an annotation describing it.
-
-population : bool
-    The type of model being described. A discrete stochastic model is a
-    population model (True), a deterministic model is a concentration model
-    (False). Automatic conversion from population to concentration models
-    may be used, by setting the volume parameter.
-
-volume : float
-    The volume of the system matters when converting from population to
-    concentration form. This will also set a parameter "vol" for use in
-    custom (i.e. non-mass-action) propensity functions.
-
-tspan
-    The timepoints at which the model should be simulated. If None, a
-    default timespan is added. May be set later, see Model.timespan
A parameter can be given as an expression (function) or directly
-as a value (scalar). If given an expression, it should be
-understood as evaluable in the namespace of a parent Model.
-
-name : str
-    The name by which this parameter is called or referenced in reactions.
-
-expression : str
-    String for a function calculating parameter values. Should be evaluable
-    in namespace of Model.
-
-value
-    Value of a parameter if it is not dependent on other Model entities.
A RateRule is used to express equations that determine the rates of change
-of variables. This would correspond to a function in the form of dx/dt=f(W)
Models a single reaction. A reaction has its own dicts of species
-(reactants and products) and parameters. The reaction's propensity
-function needs to be evaluable (and result in a non-negative scalar
-value) in the namespace defined by the union of those dicts.
-
-name : str
-    The name by which the reaction is called.
-
-reactants : dict
-    The reactants that are consumed in the reaction, with stoichiometry. An
-    example would be {R1 : 1, R2 : 2} if the reaction consumes two of R1 and
-    one of R2, where R1 and R2 are Species objects.
-
-products : dict
-    The species that are created by the reaction event, with stoichiometry.
-    Same format as reactants.
-
-propensity_function : str
-    The custom propensity fcn for the reaction. Must be evaluable in the
-    namespace of the reaction using C operations.
-
-massaction : bool
-    The switch to use a mass-action reaction. If set to True, a rate value
-    is required.
-
-rate : float
-    The rate of the mass-action reaction. Take care to note the units…
-
-annotation : str
-    An optional note about the reaction.
-
Notes
-
-For a species that is NOT consumed in the reaction but is part of a mass
-action reaction, add it as both a reactant and a product.
-
-Mass-action reactions must also have a rate term added. Note that the input
-rate represents the mass-action constant rate independent of volume.
Chemical species. Can be added to Model object to interact with other
-species or time.
-
-name : str
-    The name by which this species will be called in reactions and within
-    the model.
-
-initial_value : int >= 0
-    Initial population of this species. If this is not provided as an int,
-    the type will be changed when it is added by numpy.int
-
-constant : bool
-    If true, the value of the species cannot be changed.
-    (currently BasicTauHybridSolver only)
-
-boundary_condition : bool
-    If true, species can be changed by events and rate rules, but not by
-    reactions. (currently BasicTauHybridOnly)
-
-mode : str
-    *FOR USE WITH BasicTauHybridSolver ONLY*
-    Sets the mode of representation of this species for the TauHybridSolver,
-    can be discrete, continuous, or dynamic.
-    mode='dynamic' - Default, allows a species to be represented as
-    either discrete or continuous
-    mode='continuous' - Species will only be represented as continuous
-    mode='discrete' - Species will only be represented as discrete
-
-allow_negative_populations : bool
-    If true, population can be reduced below 0
-
-switch_tol : float
-    *FOR USE WITH BasicTauHybridSolver ONLY*
-    Tolerance level for considering a dynamic species deterministically,
-    value is compared to an estimated sd/mean population of a species after a
-    given time step. This value will be used if a switch_min is not
-    provided. The default value is 0.03
-
-switch_min
-    *FOR USE WITH BasicTauHybridSolver ONLY*
-    Minimum population value at which species will be represented as
-    continuous. If a value is given, switch_min will be used instead of
-    switch_tol
Creates a StochKit XML document from an existing Model object.
-This method assumes that all the parameters in the model are already
-resolved to scalar floats (see Model.resolveParamters).
-
Note, this method is intended to be used internally by the model's
-'serialization' function, which performs additional operations and
-tests on the model prior to writing out the XML file. You should NOT do:
SBML to GillesPy model converter. NOTE: non-mass-action rates
-in terms of concentrations may not be converted for population
-simulation. Use caution when importing SBML.
-
-gillespy2.core.gillespy2.filename : str
-    Path to the SBML file for conversion.
-
-gillespy2.core.gillespy2.name : str
-    Name of the resulting model.
-
-gillespy2.core.gillespy2.gillespy_model
-    If desired, the SBML model may be added to an existing GillesPy model.
Plot a matplotlib graph depicting standard deviation and the mean graph of an ensemble_results object
-
-Attributes
-
-xaxis_label : str
-    the label for the x-axis
-yaxis_label : str
-    the label for the y-axis
-title : str
-    the title of the graph
-show_legend : bool
-    whether or not to display a legend which lists species
-included_species_list : list
-    A list of strings describing which species to include. By default displays all species.
-ddof : int
-    Delta Degrees of Freedom. The divisor used in calculations is N - ddof, where N represents
-    the number of trajectories. Sample standard deviation uses ddof of 1. Defaults to population
-    standard deviation where ddof is 0.
-save_png : bool or str
-    Should the graph be saved as a png file. If True, File name is title of graph. If a string is given, file
-    is named after that string.
-figsize : tuple
-    the size of the graph. A tuple of the form (width,height). Is (18,10) by default.
Plot a plotly graph depicting standard deviation and the mean graph of an ensemble_results object
-
-Attributes
-
-xaxis_label : str
-    the label for the x-axis
-yaxis_label : str
-    the label for the y-axis
-title : str
-    the title of the graph
-show_legend : bool
-    whether or not to display a legend which lists species
-included_species_list : list
-    A list of strings describing which species to include. By default displays all species.
-return_plotly_figure : bool
-    whether or not to return a figure dictionary of data(graph object traces) and layout options
-    which may be edited by the user.
-ddof : int
-    Delta Degrees of Freedom. The divisor used in calculations is N - ddof, where N represents
-    the number of trajectories. Sample standard deviation uses ddof of 1. Defaults to population
-    standard deviation where ddof is 0.
Generate a single Results dictionary that is made of the sample standard deviations of all trajectories'
-outputs.
-
-Attributes
-
-ddof : int
-    Delta Degrees of Freedom. The divisor used in calculations is N - ddof, where N represents
-    the number of trajectories. Sample standard deviation uses ddof of 1. Defaults to population
-    standard deviation where ddof is 0.
outputs the Results to one or more .csv files in a new directory.
-
-Attributes
-
-nametag: allows the user to optionally "tag" the directory and included files. Defaults to the model name.
-path: the location for the new directory and included files. Defaults to model location.
-stamp: Allows the user to optionally "tag" the directory (not included files). Default is timestamp.
outputs the Results to one or more .csv files in a new directory.
-
-Attributes
-
-nametag: allows the user to optionally "tag" the directory and included files. Defaults to the model name.
-path: path to the location for the new directory and included files. Defaults to model location.
-stamp: allows the user to optionally identify the directory (not included files). Defaults to timestamp.
Generate a single Results object with a Trajectory that is made of the sample standard deviations of all
+trajectories' outputs.
+
+Attributes
+
+ddof : int
+    Delta Degrees of Freedom. The divisor used in calculations is N - ddof, where N represents
+    the number of trajectories. Sample standard deviation uses ddof of 1. Defaults to population
+    standard deviation where ddof is 0.
Function calling simulation of the model. This is typically called by the run function in GillesPy2 model
objects and will inherit those parameters which are passed with the model as the arguments to this run function.
This Solver uses a root-finding interpretation of the direct SSA method,
-along with ODE solvers to simulate ODE and Stochastic systems
-interchangeably or simultaneously.
Function calling simulation of the model. This is typically called by the run function in GillesPy2 model
-objects and will inherit those parameters which are passed with the model as the arguments to this run function.
If true, simulation returns a list of trajectories, where each list entry is a dictionary containing key value pairs of species : trajectory. If false, returns a numpy array with shape [traj_no, time, species]
A Basic Tau Leaping Solver for GillesPy2 models. This solver uses an algorithm that calculates
multiple reactions in a single step over a given tau step size. The change in propensities
over this step is bounded by bounding the relative change in state, yielding greatly improved
@@ -557,7 +401,7 @@
Function calling simulation of the model.
This is typically called by the run function in GillesPy2 model objects
and will inherit those parameters which are passed with the model
@@ -587,188 +431,11 @@
This Solver uses a root-finding interpretation of the direct SSA method,
-along with ODE solvers to simulate ODE and Stochastic systems
-interchangeably or simultaneously.
Function calling simulation of the model. This is typically called by the run function in GillesPy2 model
-objects and will inherit those parameters which are passed with the model as the arguments to this run function.
If true, simulation returns a list of trajectories, where each list entry is a dictionary containing key value pairs of species : trajectory. If false, returns a numpy array with shape [traj_no, time, species]
Run the SSA algorithm using NumPy for storing the data in arrays and generating the timeline.
:param model: The model on which the solver will operate.
:param t: The end time of the solver.
@@ -809,6 +476,11 @@
Run the SSA algorithm using NumPy for storing the data in arrays and generating the timeline.
:param model: The model on which the solver will operate.
:param t: The end time of the solver.
@@ -849,6 +521,11 @@
A Basic Tau Leaping Solver for GillesPy2 models. This solver uses an algorithm that calculates
multiple reactions in a single step over a given tau step size. The change in propensities
over this step is bounded by bounding the relative change in state, yielding greatly improved
@@ -922,7 +604,7 @@
Function calling simulation of the model.
This is typically called by the run function in GillesPy2 model objects
and will inherit those parameters which are passed with the model
@@ -952,6 +634,11 @@
Function calling simulation of the model. This is typically called by the run function in GillesPy2 model
objects and will inherit those parameters which are passed with the model as the arguments to this run function.
GillesPy2 is an open-source Python package for stochastic simulation of biochemical systems. It offers an object-oriented approach for creating mathematical models of biological systems, as well as a variety of methods for performing time simulation of those models. The methods include the Gillespie direct method (SSA), several variant stochastic simulation methods including tau leaping, and numerical integration of ODEs. The solvers support a variety of user environments, with optimized code for C++, Cython, and NumPy. Models can also be read from files in SBML format.