- avgdigamma(points, dvec)
- Internal helper function
- cmi(x, y, z, k=3, base=2)
- Mutual information of x and y, conditioned on z; x, y, and z should each be a list of vectors, e.g. x = [[1.3],[3.7],[5.1],[2.4]] if x is one-dimensional and we have four samples
- cmidd(x, y, z)
- Discrete conditional mutual information estimator; samples can be any hashable object
- elog(x)
- entropy(x, k=3, base=2)
- The classic Kozachenko-Leonenko (K-L) k-nearest-neighbor continuous entropy estimator; x should be a list of vectors, e.g. x = [[1.3],[3.7],[5.1],[2.4]] if x is one-dimensional and we have four samples (see the usage sketch after this list)
- entropyd(sx, base=2)
- Discrete entropy estimator; samples can be any hashable object
- entropyfromprobs(probs, base=2)
- hist(sx)
- kldiv(x, xp, k=3, base=2)
- KL divergence between p and q for x ~ p(x), xp ~ q(x); x and xp should each be a list of vectors, e.g. x = [[1.3],[3.7],[5.1],[2.4]] if x is one-dimensional and we have four samples
- log(...)
- log(x[, base])
Return the logarithm of x to the given base.
If the base is not specified, returns the natural logarithm (base e) of x.
- mi(x, y, k=3, base=2)
- Mutual information of x and y; x and y should each be a list of vectors, e.g. x = [[1.3],[3.7],[5.1],[2.4]] if x is one-dimensional and we have four samples
- micd(x, y, k=3, base=2, warning=True)
- Computes mutual information when x is continuous and y is discrete
- midd(x, y)
- Discrete mutual information estimator; samples can be any hashable object
- shuffle_test(measure, x, y, z=False, ns=200, ci=0.95, **kwargs)
- Shuffle test: repeatedly shuffle the x-values and then estimate measure(x, y, [z]).
Returns the mean and confidence interval (ci=0.95 by default) over ns runs. 'measure' can be, for example, mi or cmi, and any keyword arguments are passed through to it. Under shuffling, mutual information and CMI should have a mean near zero.
- vectorize(scalarlist)
- Turn a list of scalars into a list of one-dimensional vectors
- zip2(*args)
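The estimators above share one input convention: continuous samples are lists of vectors, and discrete samples are lists of hashable objects. The sketch below shows how a few of them fit together. It assumes the module can be imported as `entropy_estimators` and that `shuffle_test` returns the mean and confidence interval as a single result; both are assumptions based only on the listing above, so adjust names to your setup.

```python
# Minimal usage sketch, based only on the signatures listed above.
# Assumption: the module is importable as "entropy_estimators"; adjust
# the import to wherever this file lives in your project.
import random
import entropy_estimators as ee

random.seed(0)

# Continuous samples are lists of vectors: wrap each scalar observation
# in a one-element list (vectorize() does this for a list of scalars).
x = [[random.gauss(0, 1)] for _ in range(1000)]
y = [[xi[0] + random.gauss(0, 1)] for xi in x]    # y depends on x
z = [[random.gauss(0, 1)] for _ in range(1000)]   # z is independent noise

print("H(x)     =", ee.entropy(x, k=3, base=2))   # differential entropy, in bits
print("I(x;y)   =", ee.mi(x, y, k=3, base=2))     # mutual information, in bits
print("I(x;y|z) =", ee.cmi(x, y, z, k=3, base=2)) # conditional mutual information

# Discrete estimators take lists of hashable samples.
sx = [random.randint(0, 3) for _ in range(1000)]
print("H_d(sx)  =", ee.entropyd(sx, base=2))

# Shuffle test: shuffling x destroys its dependence on y, so the shuffled
# MI should have a mean near zero; the reported confidence interval gives
# a significance threshold for the unshuffled estimate above.
result = ee.shuffle_test(ee.mi, x, y, ns=200, ci=0.95, k=3, base=2)
print("shuffled I(x;y) (mean, confidence interval):", result)
```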