Releases: Craigacp/MIToolbox
MIToolbox 3.0.2
MIToolbox 3.0.1
Bug fix release to ensure ANSI C compatibility.
Added '-pedantic' to the Makefile so non-ANSI constructs are caught at compile time and this doesn't happen again.
MIToolbox v3.0.0
Major refactor of the code, and a reorganisation of the repository into a more sensible layout.
- Refactored all C functions to expose a version which takes unsigned integer inputs (see the sketch after this list).
- Rearranged the repository to separate out headers from source, and MATLAB code from C library code.
Minor changes:
- General code cleanup to reduce duplicated code.
- Added a COMPILE_R flag to go with the COMPILE_C flag, to make it easier to produce an R wrapper.
- All code now compiles cleanly with "-std=c89 -Wall -Werror".
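As an illustration of the dual entry points mentioned above, here is a minimal sketch of calling the C library. The header path and the calcMutualInformation / discAndCalcMutualInformation names and signatures are assumptions about the 3.x API rather than something stated in these notes, so check the headers before relying on them.

```c
/* Sketch only: the header path and function names below are assumptions
 * about the 3.x API -- verify them against the headers in the repository. */
#include <stdio.h>
#include "MIToolbox/MutualInformation.h"

int main(void) {
  unsigned int x[6] = {0, 0, 1, 1, 2, 2};
  unsigned int y[6] = {0, 1, 0, 1, 0, 1};
  double xr[6] = {0.1, 0.9, 1.2, 1.8, 2.4, 2.9};
  double yr[6] = {0.0, 1.0, 0.0, 1.0, 0.0, 1.0};
  double miDiscrete, miReal;

  /* uint entry point: the data is already discrete, so no floor() step is needed. */
  miDiscrete = calcMutualInformation(x, y, 6);

  /* double entry point: real-valued data is discretised (x = floor(x)) before counting. */
  miReal = discAndCalcMutualInformation(xr, yr, 6);

  printf("I(X;Y): discrete inputs = %f, real-valued inputs = %f\n", miDiscrete, miReal);
  return 0;
}
```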
MIToolbox v2.1.2
- Relicensed from LGPL to BSD.
- Added checks to the MATLAB functions (MIToolbox.m, mi.m, cmi.m, condh.m, h.m, joint.m, RenyiMIToolbox.m and WeightedMIToolbox.m) to ensure the inputs are double vectors or matrices.
MIToolbox 2.1.1
Fixes some issues with the C library compilation.
MIToolbox v2.1
This is MIToolbox v2.1, a bugfix release for v2.0. A few memory-related bugs are fixed, along with some additions to the Makefile and new helper functions for FEAST. These changes aid compatibility with PyFeast.
MIToolbox v2.0
This release adds functions from Guiasu's formulation of Weighted Information Theory as described in "Information Theory with Applications", S. Guiasu, 1977.
New information theoretic functions:
- Weighted Entropy - H_w(X)
- Weighted Conditional Entropy - H_w(X|Y)
- Weighted Mutual Information - I_w(X;Y)
There are also functions to calculate and manipulate weighted probability distributions.
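For orientation, Guiasu's definitions of these weighted quantities, with weight function w, are roughly as follows. This is a sketch of the textbook formulation, not a quotation from the toolbox's documentation, so the exact conventions used in the code may differ.

```
H_w(X)   = -\sum_x     w(x)   \, p(x)   \log p(x)
H_w(X|Y) = -\sum_{x,y} w(x,y) \, p(x,y) \log p(x|y)
I_w(X;Y) =  \sum_{x,y} w(x,y) \, p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)}
```

Setting the weights to 1 everywhere recovers the usual Shannon entropy, conditional entropy and mutual information.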
MIToolbox v1.0.3
MIToolbox v1.0.3 for C/C++ and MATLAB/Octave
MIToolbox contains a set of functions to calculate information theoretic quantities from data, such as the entropy and mutual information. The toolbox contains implementations of the most common quantities from Shannon's information theory, and also the lesser-known Renyi entropy. The toolbox only supports discrete distributions, not continuous ones; all real-valued inputs are discretised via x = floor(x). These functions are targeted at feature selection algorithms rather than communication channels, and so expect all the data to be available before execution and sample their own probability distributions from the data.
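As a concrete illustration of the floor-based discretisation and the "sample their own probability distributions" behaviour, here is a small self-contained sketch. It mirrors the description above rather than the library's actual implementation: real values are floored, an empirical distribution is built by counting, and the Shannon entropy is computed from it.

```c
/* Self-contained sketch (not the library's code) of the behaviour described
 * above: floor() discretisation, count-based probability estimation, and a
 * Shannon entropy computed from the resulting distribution. */
#include <stdio.h>
#include <math.h>

static double entropyFromSamples(const double *data, int n) {
  int counts[256] = {0};  /* assumes the floored states fall in [0,255] */
  double h = 0.0;
  int i, s;
  for (i = 0; i < n; i++) {
    counts[(int)floor(data[i])]++;        /* x = floor(x) discretisation */
  }
  for (s = 0; s < 256; s++) {
    if (counts[s] > 0) {
      double p = (double)counts[s] / n;   /* empirical probability of state s */
      h -= p * (log(p) / log(2.0));       /* entropy in bits */
    }
  }
  return h;
}

int main(void) {
  double x[6] = {0.2, 0.7, 1.1, 1.9, 1.5, 0.3};  /* floors to {0,0,1,1,1,0} */
  printf("H(X) = %f bits\n", entropyFromSamples(x, 6));
  return 0;
}
```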
Functions contained:
- Entropy
- Conditional Entropy
- Mutual Information
- Conditional Mutual Information
- Generating a joint variable
- Generating a probability distribution from a discrete random variable
- Renyi's Entropy
- Renyi's Mutual Information
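For reference, Renyi's entropy of order alpha generalises Shannon's entropy, which is recovered in the limit alpha -> 1. This is a sketch of the standard definition, not a quotation from the toolbox:

```
H_\alpha(X) = \frac{1}{1-\alpha} \log \sum_x p(x)^\alpha , \qquad \alpha > 0, \ \alpha \neq 1
```

Renyi's mutual information has several competing formulations in the literature, so the exact definition used here should be taken from RenyiMIToolbox.m.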