CSCE 633 Machine Learning (Spring 2010)                          due: Tues, Apr 20

Project 4
---------

Goal: Implement and test a Naive Bayes learner, and compare its performance to your decision-tree, neural-network, and nearest-neighbor algorithms on 5 different datasets.

This is a relatively straightforward project. Mainly, you just have to compute statistical summaries of your training data (prior and conditional probabilities), and then apply them to test examples using the formula in the book (based on Bayes' Rule plus the independence assumption).

For continuous attributes, model them as normally distributed. (Optionally, you might want to compare this to a simpler approach: discretize each continuous attribute into k bins, treat it as nominal, and see how the performance compares to using Gaussian distributions.)

For nominal attributes, you might want to try one of the several approaches we talked about in class, or perhaps compare two approaches to see which is best.