91. Cross validation is a model evaluation method. Leave-one-out cross validation (LOOCV) is K-fold cross validation taken to its logical extreme, with K equal to N, the number of data points in the set. That means that N separate times, the function approximator is trained on all the data except for one point and a prediction is made for that point; this is repeated until every point has been held out exactly once. What can be the major issues in LOOCV? a. Low variance b. High variance c. Faster run time compared to K-fold cross validation d. Slower run time compared to K-fold cross validation
a and c
a and d
b and c
b and d
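The N-fits-of-N-minus-1-points procedure described above can be sketched in pure Python. This is a hypothetical illustration (the 1-nearest-neighbour model and the tiny dataset are made up, not part of the question) showing that LOOCV trains the model N separate times, which is why it is slower than K-fold with small K:

```python
def nn_predict(train, query):
    """Predict the label of `query` from its single nearest neighbour."""
    nearest = min(train, key=lambda point: abs(point[0] - query))
    return nearest[1]

def loocv_error(data):
    """Leave-one-out CV: fit N times, each time holding out one point."""
    errors = 0
    for i, (x, y) in enumerate(data):
        held_in = data[:i] + data[i + 1:]   # all points except the i-th
        if nn_predict(held_in, x) != y:
            errors += 1
    return errors / len(data)               # average held-out error rate

# toy 1-D dataset: position -> class label
data = [(0.0, 'a'), (0.1, 'a'), (0.9, 'b'), (1.0, 'b'), (0.4, 'a')]
print(loocv_error(data))  # fraction of held-out points misclassified
```

Note that `loocv_error` performs `len(data)` separate fits, versus only K fits for K-fold CV, which is the run-time issue the question is probing.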
92. State whether the statements are True or False. Statement A: When the hypothesis space is richer, overfitting is more likely. Statement B: When the feature space is larger, overfitting is more likely.
False, False
True, False
True, True
False, True
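A minimal sketch of Statement A, assuming a polynomial-regression setting not mentioned in the question: enlarging the hypothesis space (raising the polynomial degree) lets the model fit the training sample ever more closely, which is exactly the regime where overfitting becomes likely. The data-generating function and noise level here are made up for illustration.

```python
import numpy as np

# Toy training set: a sine curve plus noise (hypothetical example).
rng = np.random.default_rng(0)
x_train = np.linspace(0.0, 1.0, 8)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0.0, 0.3, size=8)

# Richer hypothesis spaces = higher polynomial degrees.
train_mse = {}
for degree in (1, 3, 7):
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse[degree] = float(
        np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    )

print(train_mse)  # training error shrinks as the hypothesis space grows
```

Because the degree-7 space contains the degree-3 and degree-1 spaces, its least-squares training error can only be lower; with 8 points a degree-7 polynomial interpolates the noise exactly, the hallmark of overfitting.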
93. What is the purpose of restricting hypothesis space in machine learning?
Can be easier to search
May avoid overfit since they are usually simpler (e.g. linear or low order decision surface)
Both of the above
None of the above
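The "easier to search" option can be made concrete with a back-of-the-envelope count (a hypothetical illustration, not part of the question): over n binary features there are 2**(2**n) distinct boolean labelling functions, but far fewer hypotheses if we restrict the space to, say, monotone conjunctions.

```python
# Hypothetical count: unrestricted boolean hypotheses vs conjunctions only.
n = 4
all_boolean_functions = 2 ** (2 ** n)   # every labelling of the 2**n inputs
conjunctions_only = 3 ** n              # each feature: required, negated, or absent
print(all_boolean_functions, conjunctions_only)  # 65536 vs 81
```

The restricted space is small enough to search exhaustively and, being simpler, is also less prone to overfit, which is why "Both of the above" is the natural answer.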
94. What is true about K-Means clustering? 1. K-means is extremely sensitive to cluster center initialization. 2. Bad initialization can lead to poor convergence speed. 3. Bad initialization can lead to bad overall clustering.
1 and 2
1 and 3
All of the above
2 and 3
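Initialization sensitivity can be demonstrated with a small, self-contained Lloyd's-algorithm sketch on 1-D data (the dataset and both initializations are made up for illustration): starting all centers inside one group leaves K-means stuck in a poor local optimum.

```python
def kmeans_1d(points, centers, steps=20):
    """Plain Lloyd's algorithm on 1-D data from the given initial centers."""
    for _ in range(steps):
        # assignment step: each point joins its nearest center
        clusters = [[] for _ in centers]
        for p in points:
            idx = min(range(len(centers)), key=lambda i: abs(p - centers[i]))
            clusters[idx].append(p)
        # update step: each center moves to its cluster's mean
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

points = [0, 1, 2, 10, 11, 12, 20, 21, 22]   # three well-separated groups
good = kmeans_1d(points, [1, 11, 21])         # one center per true group
bad = kmeans_1d(points, [0, 1, 2])            # all centers in one group
print(good, bad)  # different final centers from different initializations
```

The good initialization recovers the three group means, while the bad one leaves two centers trapped near the first group and merges the other two groups, i.e. a bad overall clustering despite running the same algorithm on the same data.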
95. In which of the following cases will K-Means clustering fail to give good results? a. Data points with outliers b. Data points with round shapes c. Data points with non-convex shapes d. Data points with different densities
a and b
a, c and d
c and d
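For the outlier case (option a), a one-line calculation makes the failure mode concrete (the numbers are made up for illustration): a K-means centroid is an arithmetic mean, so a single outlier drags it far from the cluster it should represent. For options c and d, the geometric reason is that K-means partitions space into convex (Voronoi) cells around the centroids, so it cannot trace non-convex cluster shapes or adapt cell sizes to differing densities.

```python
# Hypothetical sketch: one outlier drags a mean-based centroid far away.
cluster = [1.0, 1.1, 0.9, 1.0]
centroid = sum(cluster) / len(cluster)           # sits at 1.0, inside the cluster
with_outlier = cluster + [100.0]
shifted = sum(with_outlier) / len(with_outlier)  # dragged out to 20.8
print(centroid, shifted)
```

A median-based method (e.g. K-medoids) would be far less affected by the same outlier, which is why outlier-heavy data is listed as a failure case for K-means specifically.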