
Machine Learning MCQs with Answers - Page 1

Dear candidates, you will find Machine Learning MCQ questions here. Study these questions to prepare for upcoming examinations and interviews; the correct answer is given for each question.

Q. Support vectors are the data points that lie closest to the decision surface (hyperplane).

(A) TRUE
(B) FALSE
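For reference, the support vectors are the training points nearest the separating hyperplane, and they alone determine the margin. A minimal sketch, assuming scikit-learn is installed (the toy data below is generated purely for illustration):

    from sklearn.datasets import make_blobs
    from sklearn.svm import SVC

    # Toy two-class data (illustrative only).
    X, y = make_blobs(n_samples=60, centers=2, random_state=0)

    # Linear SVM; after fitting, the points closest to the decision
    # boundary are stored as the support vectors.
    clf = SVC(kernel="linear", C=1.0).fit(X, y)

    print("Support vectors per class:", clf.n_support_)
    print("Points closest to the hyperplane:")
    print(clf.support_vectors_)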

Q. How can we assign weights to the outputs of different models in an ensemble?
1. Use an algorithm to return the optimal weights
2. Choose the weights using cross-validation
3. Give higher weights to more accurate models

(A) 1 and 2
(B) 1 and 3
(C) 2 and 3
(D) All of the above
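All three approaches are used in practice. A minimal sketch of weighted soft voting, assuming scikit-learn; here the cross-validation score of each model is reused as its voting weight, which is one illustrative way of giving more accurate models more influence, not a tuned recipe:

    from sklearn.datasets import make_classification
    from sklearn.ensemble import VotingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_classification(n_samples=300, random_state=0)

    models = [("lr", LogisticRegression(max_iter=1000)),
              ("dt", DecisionTreeClassifier(random_state=0))]

    # Estimate each model's accuracy by cross-validation and use the
    # scores as voting weights: more accurate models get higher weight.
    weights = [cross_val_score(m, X, y, cv=5).mean() for _, m in models]

    ensemble = VotingClassifier(estimators=models, voting="soft", weights=weights)
    print("Weights:", weights)
    print("Ensemble CV accuracy:", cross_val_score(ensemble, X, y, cv=5).mean())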

Q. The main disadvantage of maximum likelihood methods is that they are . . . . . . . .

(A) mathematically less folded
(B) mathematically less complex
(C) mathematically less complex
(D) computationally intense
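The "computationally intense" point can be made concrete: maximum likelihood usually means numerically maximizing a log-likelihood over the parameters. A minimal sketch, assuming NumPy and SciPy are available, fitting a Gaussian by minimizing the negative log-likelihood (the data are simulated purely for illustration):

    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    rng = np.random.default_rng(0)
    data = rng.normal(loc=2.0, scale=1.5, size=200)  # simulated sample

    def neg_log_likelihood(params):
        mu, log_sigma = params
        # Optimize log(sigma) so the scale parameter stays positive.
        return -norm.logpdf(data, loc=mu, scale=np.exp(log_sigma)).sum()

    # Iterative numerical optimization -- this is where the computational
    # cost of maximum likelihood typically comes from.
    result = minimize(neg_log_likelihood, x0=[0.0, 0.0])
    mu_hat, sigma_hat = result.x[0], np.exp(result.x[1])
    print("MLE estimates: mu =", round(mu_hat, 3), "sigma =", round(sigma_hat, 3))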

Q. Frequent item sets are

(A) a superset of only closed frequent item sets
(B) a superset of only maximal frequent item sets
(C) a subset of maximal frequent item sets
(D) a superset of both closed frequent item sets and maximal frequent item sets
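The containment can be checked by brute force on a tiny transaction database: every maximal frequent item set is closed, and every closed one is frequent, so the frequent item sets are the largest of the three collections. A minimal sketch in pure Python (the transactions are made up for illustration):

    from itertools import combinations

    transactions = [{"a", "b", "c"}, {"a", "b", "c"}, {"a", "b"}, {"b", "c"}]
    items = sorted(set().union(*transactions))
    min_support = 2

    def support(itemset):
        return sum(itemset <= t for t in transactions)

    # All frequent item sets (support >= min_support).
    frequent = [frozenset(c)
                for r in range(1, len(items) + 1)
                for c in combinations(items, r)
                if support(set(c)) >= min_support]

    # Closed: no proper superset has the same support.
    closed = [f for f in frequent
              if not any(f < g and support(g) == support(f) for g in frequent)]

    # Maximal: no proper superset is frequent at all.
    maximal = [f for f in frequent if not any(f < g for g in frequent)]

    print(len(frequent), "frequent >=", len(closed), "closed >=", len(maximal), "maximal")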

Q. If there is only a discrete number of possible outcomes, they are called . . . . . . . .

(A) Model-free
(B) Categories
(C) Predictions
(D) None of the above

Q. Which of the following algorithms comes under classification?

(A) Apriori
(B) Brute force
(C) DBSCAN
(D) k-nearest neighbor
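For context, k-nearest neighbor is the classification algorithm in this list, while Apriori is association-rule mining and DBSCAN is clustering. A minimal sketch, assuming scikit-learn, on the bundled iris data:

    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Classify each test point by a majority vote among its 5 nearest
    # training points.
    knn = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
    print("Test accuracy:", knn.score(X_test, y_test))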

Q. Which of the following are real-world applications of SVMs?

(A) Text and hypertext categorization
(B) Image classification
(C) Clustering of news articles
(D) All of the above
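Text and hypertext categorization is a classic SVM use case: documents are turned into sparse TF-IDF vectors and a linear SVM separates the classes. A minimal sketch, assuming scikit-learn (the tiny labelled corpus is made up for illustration):

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import LinearSVC

    docs = ["win a free prize now", "cheap meds free shipping",
            "meeting agenda for monday", "quarterly report attached"]
    labels = ["spam", "spam", "ham", "ham"]  # toy labels for illustration

    # TF-IDF features + linear SVM, a common setup for text categorization.
    model = make_pipeline(TfidfVectorizer(), LinearSVC())
    model.fit(docs, labels)

    print(model.predict(["free prize shipping", "monday meeting report"]))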

Q. . . . . . . . . dataset with many features contains information proportional to the independence of all features and their variance.

(A) normalized
(B) unnormalized
(C) Both A and B
(D) None of the Mentioned
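A quick way to see the roles of variance and feature independence is to standardize a dataset and inspect how its variance spreads across principal components. A minimal sketch, assuming scikit-learn, on the bundled iris data:

    from sklearn.datasets import load_iris
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    X, _ = load_iris(return_X_y=True)

    # Normalize (zero mean, unit variance) so no single feature dominates
    # purely because of its scale.
    X_scaled = StandardScaler().fit_transform(X)

    # The explained-variance ratios show how the dataset's information is
    # spread across (nearly uncorrelated) directions.
    pca = PCA().fit(X_scaled)
    print("Explained variance ratio per component:", pca.explained_variance_ratio_)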
