
Releases: colgreen/sharpneat

v2.2.4 (Efficacy Sampling - Last Minute Changes)

29 Apr 17:04

This is the version of the software used to run the efficacy sampling experiments. Previously this was reported to be version 2.2.3, but last-minute changes were in fact made; this release therefore captures the state of the software used to generate the data in that efficacy sampling post.

Changes include a code refactor of the generative function task and tweaks to its fitness evaluation scheme.

v2.2.3 (Efficacy Sampling)

22 Mar 22:28

This version is primarily a marker for the state of the software against which the efficacy sampling experiments were performed. Future changes can be tested with efficacy sampling on the same tasks to determine whether they improve the efficacy of SharpNEAT.

Other changes:

  • Function regression task: implementation and evaluation scheme changes.
  • New Generative Sinewave task.

v2.2.2

28 Jan 21:30

SharpNEAT 2.2.2
2017-01-28
Colin Green

Changes from previous release

Fixes

Enhancements

Miscellany

v2.2.1

21 Feb 22:21

SharpNEAT 2.2.1
2015-06-15
Colin Green

Changes from previous release

Fixes

  • Performance: ZigguratGaussianSampler.cs: Switched SampleTail() to use NextDoubleNonZero() instead of NextDouble(),
    to avoid attempting to compute Log(0). That wasn't a defect per se, because Log() returns NaN rather than throwing
    an exception, but it did cause a slow execution path.
  • FIX: FastAcyclicNetworkFactory.cs: Lookup of a definition node index from a new/working index was wrong; the lookup
    table worked the other way around, i.e. it mapped definition indexes to new indexes. At the time this affected
    only the Walker 2D (HyperNEAT) problem domain.
  • FIX: Box2dDomainView.Designer.cs: Added openGlControl.DestroyContexts() based on report of a blue screen failure
    on one box being remedied by this (almost certainly a video driver issue).
  • FIX: WalkerBox2dExperimentHyperNeat.cs: Nasty bug in setting of substrate node coordinates.
  • FIX: FastRandom.cs: Seeds are now hashed. Without this, the first random samples for nearby seeds (1, 2, 3, etc.)
    are very similar (have similar bit patterns). Thanks to Francois Guibert for identifying this problem.
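The idea behind the seed-hashing fix can be illustrated with a small sketch (in Java rather than SharpNEAT's C#; the hash function and names here are hypothetical, not FastRandom's actual implementation). An avalanche-style integer hash maps nearby seeds such as 1, 2, 3 to very different bit patterns before they are used to initialise the generator state:

```java
public final class SeedHashDemo {

    // Wang-style 32-bit mixer: small changes in the input flip roughly half
    // of the output bits, so consecutive seeds no longer share bit patterns.
    static int hashSeed(int seed) {
        int h = seed;
        h = (h ^ 61) ^ (h >>> 16);
        h = h + (h << 3);
        h = h ^ (h >>> 4);
        h = h * 0x27d4eb2d;
        h = h ^ (h >>> 15);
        return h;
    }

    public static void main(String[] args) {
        // Raw seeds 1, 2, 3 differ in only one or two bits; the hashed
        // values are spread across the whole 32-bit range.
        for (int seed = 1; seed <= 3; seed++) {
            System.out.printf("seed %d -> 0x%08x%n", seed, hashSeed(seed));
        }
    }
}
```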

v2.2.0

21 Feb 22:23

SharpNEAT 2.2.0
2012-04-02
Colin Green

Changes from previous release

New Features

  • Walker BOX2D Problem Domain.
  • Support for multiple auxiliary fitness values per genome (plotted on graphs).
  • Acyclic networks as HyperNEAT CPPNs.

Fixes

  • FIX: Prey Capture problem domain: Only one of the four ANN output signals was being read,
    which effectively broke the prey capture domain entirely.
  • IntPoint: Fixed equality and inequality operators and CalculateDistance().
    In the released code these defects severely affected the prey capture domain.
  • FIX: Config loading: Relaxing network delta threshold setting was being parsed as an Int32 instead of a Double.
  • Fix to RandomClusteringStrategy.cs: Genome.SpecieIdx was not being set upon allocation.
    Added a debug assertion to check that specieIdx is correctly set following speciation.
  • FIX/MOD: Network visualisation: Layout logic failed when a layout layer contained so many neurons that the gap
    between them was less than 0.5 of a pixel; the gap was rounded down to 0, positioning all nodes in that layer
    at the same coordinates.
  • FIX/MOD: Ensure ID generators are set accordingly when loading genomes using a pre-existing genome factory.
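The layout defect in the list above (sub-pixel gaps rounding down to zero) comes down to where the rounding happens. A minimal sketch of the failure mode and the fix, in Java with hypothetical names rather than SharpNEAT's actual layout code:

```java
public final class LayerLayoutDemo {

    // Buggy variant: spacing computed in integer pixels. When a layer has
    // more nodes than pixels of width, the gap truncates to 0 and every
    // node in the layer lands at x == 0.
    static int xPositionBuggy(int nodeIdx, int nodeCount, int widthPx) {
        int gap = widthPx / nodeCount;   // 0 when nodeCount > widthPx
        return nodeIdx * gap;
    }

    // Fixed variant: keep the spacing in floating point and round only the
    // final coordinate, so nodes stay spread across the available width.
    static int xPositionFixed(int nodeIdx, int nodeCount, int widthPx) {
        double gap = (double) widthPx / nodeCount;
        return (int) Math.round(nodeIdx * gap);
    }
}
```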

v2.1.0

21 Feb 22:26

SharpNEAT 2.1
2011-09-16
Colin Green

Changes from previous release

Major New Features

  • Support for evolution of acyclic networks. This includes:
    • Development of NeatGenome and associated classes to support efficient
      evolution of acyclic networks.
    • Development of neural network classes to efficiently execute neural
      networks.
    • Development of network visualization to lay out nodes by layer. This also
      affects how cyclic networks are laid out, because the algorithm that
      determines which layer a node is in for acyclic nets was extended to also
      calculate a sensible layer number for cyclic networks. This greatly
      improves the layouts, making it easier to understand a network's
      architecture from its visual rendering.
    • Modified all function regression and binary logic themed experiments to
      use acyclic networks.
  • Added support for Box2D (2D physics engine) based problem domains and
    visualisation of Box2D worlds with OpenGL.
    • New experiment - Single pole balancing using Box2D, including
      visualization.
    • New experiment - Inverted double pendulum using Box2D, including
      visualization.

Other Developments

  • Improved assertions when running in debug mode. This improves code quality
    by reducing the number of undetected defects in released code.
  • New ZigguratGaussianSampler class for generating Gaussian noise for
    mutations and simulations (where required). This approach greatly reduces
    the number of calls to expensive floating point operations such as
    Math.Sqrt() and Math.Log(), and is typically about 2x faster than sampling
    with the simpler Box-Muller method used previously.
  • FastRandom now seeds using random numbers from a global FastRandom. This
    prevents multiple instances from obtaining the same seed from the system
    tick count when initialising within the same clock tick.
  • Log(n) function regression experiment.
  • XOR and binary multiplexer experiments modified to use fitness score based
    on squared error. This change improves search efficiency.
  • Function regression experiments changed to have peaks and troughs at y = 0.9
    and 0.1 respectively. This avoids requiring activation functions to output
    values at the extremes of their ranges.
  • CPPNs modified to use a Gaussian activation function instead of
    BipolarGaussian. My opinion is that BipolarGaussian isn't directly useful
    and the equivalent functionality can be achieved if necessary by combining
    Gaussian with Linear.
  • Refactoring of mutation type selection logic/code.
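For context on the ziggurat change above: the Box-Muller transform needs a log(), a sqrt() and two trigonometric calls for every pair of Gaussian samples, which is exactly the per-sample cost the ziggurat method avoids on all but its rare tail cases. A minimal Box-Muller sketch (illustrative only, not SharpNEAT's code):

```java
import java.util.Random;

public final class BoxMullerDemo {

    // Classic Box-Muller: turns two uniform samples into two independent
    // standard Gaussian samples, at the cost of log, sqrt, cos and sin.
    static double[] samplePair(Random rng) {
        double u1 = 1.0 - rng.nextDouble();          // in (0, 1]; avoids log(0)
        double u2 = rng.nextDouble();
        double r = Math.sqrt(-2.0 * Math.log(u1));   // expensive calls, every pair
        double theta = 2.0 * Math.PI * u2;
        return new double[] { r * Math.cos(theta), r * Math.sin(theta) };
    }
}
```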

Fixes

  • Fix to crossover logic whereby connection genes were not copied, and were thus
    shared between parent and child genomes.
  • RelaxingCyclicNetwork and FastRelaxingCyclicNetwork: The IsStateValid property
    was defined the wrong way around for relaxed networks: it returned false when
    the networks were relaxed (which is the valid state). Note: none of the
    current experiments shipped with SharpNEAT use relaxing networks.
  • XML I/O was not culture neutral. RBF-NEAT uses comma-separated numbers
    within genome XML, which conflicted with the use of the comma as the numeric
    decimal separator in some cultures.
  • Fix to genome loading. It now uses the genome factory from the current
    experiment; previously it was hard-coded to NeatGenomeFactory, which was
    incorrect when using sub-classes such as CppnGenomeFactory.
  • Genetic crossover of CPPN genomes randomly regenerated the activation
    function on each node of a child genome, instead of taking the activation
    functions from the parent genome.
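The culture-neutrality fix above generalises beyond C#: any numeric formatting that uses the machine's default locale can silently change the decimal separator. A Java analogue of the pitfall (hypothetical names; SharpNEAT's actual fix lives in its C# XML I/O code):

```java
import java.util.Locale;

public final class NumericFormatDemo {

    // Locale-dependent: on a German-locale machine this yields "0,50",
    // which collides with comma-separated value lists in serialized data.
    static String formatWith(Locale locale, double v) {
        return String.format(locale, "%.2f", v);
    }

    // Locale-neutral: Locale.ROOT always uses '.' as the decimal separator,
    // so files written on one machine parse correctly on any other.
    static String formatNeutral(double v) {
        return String.format(Locale.ROOT, "%.2f", v);
    }
}
```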