Estimate the conditional mutual information between target and interacting variables given conditioning variables.
Arguments
- data
Observation data.
- target
Integer vector of column indices for the target variables.
- interact
Integer vector of column indices for the interacting variables.
- conds
Integer vector of column indices for the conditioning variables.
- base
(optional) Logarithm base of the entropy. Defaults to `exp(1)` (nats). Use `2` for bits or `10` for dits.
- type
(optional) Estimation method: `"disc"` for discrete entropy or `"cont"` for continuous entropy (KSG estimator).
- k
(optional) Number of nearest neighbors used by the continuous estimator. Ignored when `type = "disc"`.
- normalize
(optional) Logical; if `TRUE`, return normalized mutual information.
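A minimal usage sketch in R follows. The function name `cmi` and the column layout are assumptions for illustration only, since the help text above does not name the function; substitute the actual exported name. For discrete data, the estimate reduces to the plug-in entropy identity I(X; Y | Z) = H(X, Z) + H(Y, Z) - H(X, Y, Z) - H(Z).

```r
# Hypothetical call; `cmi` stands in for the package's exported function.
set.seed(1)
x <- matrix(rnorm(300), ncol = 3)  # columns: 1 = target, 2 = interact, 3 = cond

# Continuous (KSG) estimate of I(X1; X2 | X3) in nats,
# using k = 5 nearest neighbors
cmi(x, target = 1, interact = 2, conds = 3,
    base = exp(1), type = "cont", k = 5)

# Discrete estimate in bits on binned data
xd <- apply(x, 2, function(col) cut(col, breaks = 4, labels = FALSE))
cmi(xd, target = 1, interact = 2, conds = 3,
    base = 2, type = "disc")
```

Because the true conditional mutual information of independent Gaussian columns is zero, both calls should return values near 0 up to estimator bias.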