- conditional_entropy(f1, f2)
- This function calculates the conditional entropy, where ce = H(f1|f2) = H(f1) - I(f1;f2)
Input
-----
f1: {numpy array}, shape (n_samples,)
f2: {numpy array}, shape (n_samples,)
Output
------
ce: {float}
ce is the conditional entropy of f1 given f2
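A minimal sketch of this computation for discrete-valued features is shown below; it is not the package's own implementation, and the helpers entropy and mutual_information are illustrative plug-in estimators based on empirical counts.

    import numpy as np
    from collections import Counter

    def entropy(x):
        # plug-in estimate of H(x) in nats from empirical counts
        counts = np.array(list(Counter(x).values()), dtype=float)
        p = counts / counts.sum()
        return -np.sum(p * np.log(p))

    def mutual_information(x, y):
        # I(x;y) = H(x) + H(y) - H(x,y) for discrete x, y
        joint = list(zip(x, y))
        return entropy(x) + entropy(y) - entropy(joint)

    def conditional_entropy(f1, f2):
        # ce = H(f1|f2) = H(f1) - I(f1;f2)
        return entropy(f1) - mutual_information(f1, f2)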
- information_gain(f1, f2)
- This function calculates the information gain, where ig(f1,f2) = H(f1) - H(f1|f2)
Input
-----
f1: {numpy array}, shape (n_samples,)
f2: {numpy array}, shape (n_samples,)
Output
------
ig: {float}
ig is the information gain of f1 and f2
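As a sketch (again assuming discrete features and the same hypothetical plug-in entropy helper; not the package's own implementation), information gain can be computed from joint and marginal entropies, since H(f1|f2) = H(f1,f2) - H(f2):

    import numpy as np
    from collections import Counter

    def entropy(x):
        # plug-in estimate of H(x) in nats from empirical counts
        counts = np.array(list(Counter(x).values()), dtype=float)
        p = counts / counts.sum()
        return -np.sum(p * np.log(p))

    def information_gain(f1, f2):
        # ig(f1,f2) = H(f1) - H(f1|f2), with H(f1|f2) = H(f1,f2) - H(f2)
        joint = list(zip(f1, f2))
        return entropy(f1) - (entropy(joint) - entropy(f2))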
- su_calculation(f1, f2)
- This function calculates the symmetrical uncertainty, where su(f1,f2) = 2*IG(f1,f2)/(H(f1)+H(f2))
Input
-----
f1: {numpy array}, shape (n_samples,)
f2: {numpy array}, shape (n_samples,)
Output
------
su: {float}
su is the symmetrical uncertainty of f1 and f2
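A sketch of the same idea for symmetrical uncertainty (illustrative only, assuming discrete features; entropy is the same hypothetical plug-in helper as above):

    import numpy as np
    from collections import Counter

    def entropy(x):
        # plug-in estimate of H(x) in nats from empirical counts
        counts = np.array(list(Counter(x).values()), dtype=float)
        p = counts / counts.sum()
        return -np.sum(p * np.log(p))

    def su_calculation(f1, f2):
        # su(f1,f2) = 2*IG(f1,f2)/(H(f1)+H(f2)), where IG(f1,f2) = I(f1;f2)
        joint = list(zip(f1, f2))
        ig = entropy(f1) + entropy(f2) - entropy(joint)
        return 2.0 * ig / (entropy(f1) + entropy(f2))

    # example: su lies in [0, 1]; identical features give su = 1
    f1 = np.array([0, 0, 1, 1, 2, 2])
    print(su_calculation(f1, f1.copy()))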