Added Distributions for use in Clustering (Mixture Modelling), Function Models, Regression Trees, Segmentation, and mixed Bayesian Networks in Inductive Programming 1.2

Lloyd Allison,
TR 2008/224, FIT, Monash University,
April 2008
 
Inductive programming (IP) is a machine learning paradigm that combines functional programming (FP) with the information-theoretic criterion Minimum Message Length (MML). IP 1.2 now includes the Geometric and Poisson distributions over non-negative integers and Student's t-distribution over continuous values, as well as the Multinomial and Normal (Gaussian) distributions from earlier versions. All of these can be used with IP's model-transformation operators and its structure-learning algorithms, including clustering (mixture models), classification (decision) trees and other regressions, and mixed Bayesian networks, provided only that the types of the corresponding components agree, whether Model, transformation, structured model, or variable: discrete, continuous, sequence, multivariate, and so on.
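For illustration only, the Haskell sketch below shows the flavour of such typed models; the names Model, nlPr, geometric, mixture and msgLen are invented for this note and are not IP 1.2's actual API. A Model is indexed by the type of the variable it describes, nlPr gives a datum's message length (negative log-probability), and a mixture can only be formed from components whose types agree, which is the type-matching condition referred to above.

  -- A minimal sketch, not IP 1.2's API: models are indexed by the type of
  -- the variable they describe, and composition (here a mixture) only
  -- type-checks when the component types agree.

  type MsgLen = Double                 -- negative log-probability, in nits

  -- A model of a data space t can state the message length of a datum.
  data Model t = Model { mname :: String, nlPr :: t -> MsgLen }

  -- Geometric distribution over non-negative integers, Pr(n) = (1-p)^n p.
  geometric :: Double -> Model Int
  geometric p = Model
    { mname = "geometric " ++ show p
    , nlPr  = \n -> negate (fromIntegral n * log (1 - p) + log p)
    }

  -- A two-component mixture; both components must model the same type t.
  mixture :: Double -> Model t -> Model t -> Model t
  mixture w m1 m2 = Model
    { mname = "mix(" ++ mname m1 ++ ", " ++ mname m2 ++ ")"
    , nlPr  = \x -> negate (log (w * exp (negate (nlPr m1 x))
                                 + (1 - w) * exp (negate (nlPr m2 x))))
    }

  -- Second part of a two-part MML message: the data given the model.
  -- (Stating the model itself, the first part, is omitted in this sketch.)
  msgLen :: Model t -> [t] -> MsgLen
  msgLen m = sum . map (nlPr m)

  main :: IO ()
  main = do
    let m = mixture 0.5 (geometric 0.2) (geometric 0.8)
    putStrLn (mname m)
    print (msgLen m [0, 1, 1, 2, 5, 0, 0, 3])

The sketch only scores data under fixed models; IP itself also estimates parameters and structure by minimising the total two-part message length.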

[paper.pdf], [source-code].