Decision Trees
For the following two projects, use WEKA to build a Decision Tree. Choose the Trees >> J48 model from the Classify tab.
Use Cross-validation with 10 folds. This partitions the dataset into 10 equal folds and builds 10 trees, each trained on 9 folds and tested on the remaining fold, so every instance serves as test data exactly once. The 10 resulting accuracies are then averaged as a best-guess estimate of real-world performance.
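To see the mechanics outside of WEKA, the sketch below runs the same 10-fold procedure in scikit-learn. Note the assumption: scikit-learn's DecisionTreeClassifier is a CART-style tree, not an exact J48/C4.5 implementation, so its numbers will differ slightly from WEKA's.

```python
# Sketch of 10-fold cross-validation, assuming scikit-learn is installed.
# DecisionTreeClassifier is CART, not J48/C4.5 -- numbers will differ from WEKA.
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
tree = DecisionTreeClassifier(random_state=0)

# cross_val_score builds 10 trees, each tested on a different held-out fold
scores = cross_val_score(tree, X, y, cv=10)
print("per-fold accuracies:", scores)
print(f"mean accuracy: {scores.mean():.3f}")
```

The mean printed at the end is the single number WEKA reports as "Correctly Classified Instances" under 10-fold cross-validation.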
1) Use WEKA to build a Decision Tree on the Iris data set
Append the cross-validated tree's results to the TEST columns of the Iris model-comparison table you have been building all semester. Then train a separate model on the whole dataset (choose "Use training set" as the Test option) and append those results to your TRAIN columns.
Display the complete table and discuss how this model's results compare to the rest.
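The gap between the TRAIN and TEST columns is the point of the comparison: evaluating on the training set is optimistic, while cross-validation approximates performance on unseen data. A minimal sketch of both numbers, again using scikit-learn's CART tree as a stand-in for J48:

```python
# Sketch of the TRAIN vs TEST comparison, assuming scikit-learn
# (CART, not J48) as a stand-in for WEKA's two evaluation modes.
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# TRAIN columns: fit and evaluate on the whole dataset (optimistic)
tree = DecisionTreeClassifier(random_state=0).fit(X, y)
train_acc = tree.score(X, y)

# TEST columns: mean 10-fold cross-validated accuracy (realistic estimate)
test_acc = cross_val_score(DecisionTreeClassifier(random_state=0),
                           X, y, cv=10).mean()

print(f"TRAIN accuracy: {train_acc:.3f}")
print(f"TEST  accuracy: {test_acc:.3f}")
```

An unpruned tree typically scores near-perfectly on its own training data, so expect the TRAIN number to sit above the TEST number; that gap is what your discussion should address.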
2) Use WEKA to build a J48 Decision Tree on a classification dataset of your choice, downloaded from the UCI Machine Learning Repository
For both trees, paste a copy of WEKA's complete text-output page for your final J48 model into your report.
Also paste a screenshot of WEKA's tree visualisation.
Interpret these results in as much detail as possible.
Attachment: Decision Trees.rar