Problems - For the following problems, use these weights:
w1 = 0.75
w2 = 0.45
w0 = 0.5
With this sample:
x1 = 2
x2 = 0.6
1. In a linear regression model, what would the output be?
2. In a perceptron, what would the output be? (Positive class or negative class?)
3. In a logistic regression model, what would the output be (in terms of a posterior probability)?
4. Say we add a higher-order product term, x1*x2, with a new weight w3 = 0.25. Compute the posterior probability for this logistic regression model.
5. Say the sample above is a new training sample with yt = 1 and rt = 0. Using the stochastic update rule with learning rate η = 0.1, what are the updated weights w1, w2, and w0?
6. After completing the programming assignment, adjust the L2 regularization on the notMNIST logistic regression model. Try the values 1, 0.1, 0.01, and 0. Which produces the lowest error? Why do you think this is the case?
Answers must show the work used to reach each solution; the sketch below can be used to check the arithmetic for problems 1 through 5.
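As a sanity check on the hand computations, here is a minimal Python sketch, assuming the standard sigmoid posterior for logistic regression, a perceptron that thresholds the weighted sum at zero, and the stochastic update rule Δw_j = η(r − y)x_j. These modeling choices are assumptions about what the course intends; they are not a substitute for the written work.

```python
import math

# Weights and sample from the problem statement
w0, w1, w2 = 0.5, 0.75, 0.45
x1, x2 = 2.0, 0.6

# Problem 1: linear regression output is the weighted sum
o = w1 * x1 + w2 * x2 + w0
print("linear output:", o)  # 0.75*2 + 0.45*0.6 + 0.5 = 2.27

# Problem 2: perceptron thresholds the weighted sum at 0 (assumed convention)
print("perceptron class:", "positive" if o > 0 else "negative")

# Problem 3: logistic regression passes the weighted sum through the sigmoid
def sigmoid(a):
    return 1.0 / (1.0 + math.exp(-a))

print("posterior P(C1 | x):", sigmoid(o))

# Problem 4: add the product term x1*x2 with weight w3 = 0.25
w3 = 0.25
o4 = w1 * x1 + w2 * x2 + w3 * x1 * x2 + w0
print("posterior with product term:", sigmoid(o4))

# Problem 5: stochastic update rule w_j <- w_j + eta*(r - y)*x_j (assumed form)
eta, r, y = 0.1, 0.0, 1.0
w1_new = w1 + eta * (r - y) * x1
w2_new = w2 + eta * (r - y) * x2
w0_new = w0 + eta * (r - y) * 1.0  # bias input is 1
print("updated weights:", w1_new, w2_new, w0_new)
```

Running this prints a linear output of 2.27, a positive perceptron class, posteriors of roughly 0.906 and 0.929 for problems 3 and 4, and updated weights of 0.55, 0.39, and 0.4, under the assumptions stated above.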