Consistency of local classifiers under exchangeable data inputs

Traditionally, proofs of universal consistency of particular machine learning algorithms, including local learning algorithms such as the k-NN classifier, are given under the assumption that the data inputs are independent, identically distributed random variables. This assumption is often too strong, for instance when modelling learning and generalization for time series.

A sequence of random variables X_1, X_2, … is called exchangeable if the joint law of any finite subsequence is invariant under permutations of its members. For instance, if we have two biased coins, A and B, with different probabilities of landing tails, then an exchangeable sequence of random variables is obtained by first tossing a third coin, C, to decide once and for all which of A or B to use, and then tossing the chosen coin repeatedly. A famous theorem of de Finetti states that, in essence, every exchangeable sequence of random variables arises in this way, as a mixture of a family of i.i.d. sequences.

The notion of learnability has to be modified if the data inputs are assumed exchangeable, and in earlier work of the project supervisor [V. Pestov, Predictive PAC learnability: a paradigm for learning from exchangeable input data, in: Proc. 2010 IEEE Int. Conference on Granular Computing (San Jose, CA, 14-16 Aug. 2010), Symposium on Foundations and Practice of Data Mining, pp. 387-391, doi: 10.1109/GrC.2010.102] it was shown that a "conditional", or "predictive", version of a probably approximately correct learnable concept class behaves well for such input variables and satisfies an analogue of the classical i.i.d. result. The next interesting question is to reformulate the notion of universal consistency, and to prove that a "conditional", or "predictive", universal consistency of the k-NN classifier holds under exchangeable inputs.
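To make the two-coin construction concrete, here is a minimal simulation sketch; the parameter values (tail probabilities 0.3 and 0.8, a fair coin C) are illustrative assumptions, not part of the project statement:

```python
import random

def exchangeable_coin_sequence(n, p_a=0.3, p_b=0.8, p_c=0.5, seed=0):
    """Simulate the two-coin exchangeable sequence: toss coin C once to
    choose between coins A and B, then toss the chosen coin n times.
    Mixing over the single initial choice makes the joint law invariant
    under permutations (exchangeable), even though the sequence is not
    i.i.d. -- conditionally on the chosen coin, however, it IS i.i.d.,
    exactly as de Finetti's theorem predicts.
    (All parameter values here are hypothetical illustrations.)"""
    rng = random.Random(seed)
    p = p_a if rng.random() < p_c else p_b  # coin C picks A or B, once
    return [1 if rng.random() < p else 0 for _ in range(n)]
```

Note that unconditionally each toss has tail probability p_c·p_a + (1 − p_c)·p_b, yet the tosses are correlated: observing many tails raises the posterior probability that coin B was selected, which is precisely what the "predictive" point of view exploits.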
Moreover, as exchangeable inputs can be easily simulated, it would be very interesting to stage a large-scale experiment using the High Performance Computing Virtual Laboratory (HPCVL) in order to test the asymptotic performance of the learning algorithm under such inputs and compare it to the performance under the traditional i.i.d. inputs. Thus, the research problem has both a theoretical and a practical dimension, which can be pursued in parallel or even independently of each other. This is a rich project, sufficient not only for a Summer MITACS project, but for a good Ph.D. thesis as well!
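As a hint of how the experimental side might begin (long before any HPCVL-scale run), here is a self-contained toy sketch: it draws labelled samples that are exchangeable but not i.i.d., by de Finetti mixing over a latent class-balance parameter, and measures the test error of a small k-NN classifier when train and test points come from the same exchangeable draw. The distribution, parameter values, and one-dimensional feature space are all illustrative assumptions:

```python
import random

def knn_predict(train, x, k=3):
    """Classify x by majority vote among its k nearest training points (1-D)."""
    neighbours = sorted(train, key=lambda pt: abs(pt[0] - x))[:k]
    votes = sum(label for _, label in neighbours)
    return 1 if 2 * votes > k else 0

def sample_exchangeable(n, rng):
    """Draw one exchangeable labelled sample: pick a latent class-balance
    parameter once (the de Finetti mixing step), then draw n points i.i.d.
    conditionally on it, with class-conditional Gaussian features."""
    theta = rng.choice([0.25, 0.75])      # hypothetical latent mixture component
    data = []
    for _ in range(n):
        label = 1 if rng.random() < theta else 0
        x = rng.gauss(float(label), 0.5)  # class 0 near 0.0, class 1 near 1.0
        data.append((x, label))
    return data

def error_rate(n_train=200, n_test=200, trials=50, seed=0):
    """Average test error of k-NN when train and test points come from the
    SAME exchangeable draw -- the 'predictive' setting discussed above."""
    rng = random.Random(seed)
    errs = []
    for _ in range(trials):
        sample = sample_exchangeable(n_train + n_test, rng)
        train, test = sample[:n_train], sample[n_train:]
        wrong = sum(knn_predict(train, x) != y for x, y in test)
        errs.append(wrong / n_test)
    return sum(errs) / trials
```

A full experiment would vary n_train over several orders of magnitude and compare the error curve against the same pipeline run on genuinely i.i.d. inputs (i.e. re-drawing theta for every point), which is where parallel hardware becomes useful.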

Faculty Supervisor:

Vladimir Pestov

University of Ottawa


Globalink Research Internship
