🤖 Top AI Interview Questions: Part-3
Continue your AI interview prep with advanced ML & Data Science concepts.
- Generative Models: Learn joint probability P(x, y) and can generate new data. Examples: Naive Bayes, GANs, HMM.
- Discriminative Models: Learn conditional probability P(y|x) and focus on classification. Examples: Logistic Regression, SVM, Neural Networks.
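The generative/discriminative split can be made concrete with a tiny 1-D Gaussian Naive Bayes: it models P(x|y) and P(y), so it can both classify via Bayes' rule and sample new data. A minimal NumPy sketch (toy data, illustrative numbers):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D data: class 0 centred at 0, class 1 centred at 4.
X = np.concatenate([rng.normal(0, 1, 100), rng.normal(4, 1, 100)])
y = np.concatenate([np.zeros(100), np.ones(100)])

# Generative step: fit a Gaussian per class, i.e. model P(x|y) and P(y).
mu = [X[y == c].mean() for c in (0, 1)]
sigma = [X[y == c].std() for c in (0, 1)]
prior = [np.mean(y == c) for c in (0, 1)]

def score(x, c):
    # Unnormalized P(x|y=c) * P(y=c)
    return prior[c] * np.exp(-((x - mu[c]) ** 2) / (2 * sigma[c] ** 2)) / sigma[c]

# Classify via Bayes' rule: P(y|x) is proportional to P(x|y) * P(y) ...
pred = int(score(3.5, 1) > score(3.5, 0))

# ... and, because the model is generative, sample brand-new points for class 1.
new_points = rng.normal(mu[1], sigma[1], 5)
```

A discriminative model (e.g. logistic regression) would learn only the decision boundary P(y|x) and could not produce `new_points`.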
PCA (Principal Component Analysis) is a dimensionality reduction technique.
It transforms correlated features into uncorrelated principal components, keeping only the top components that explain most of the variance.
Benefits:
- Reduces overfitting
- Improves visualization
- Speeds up training
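A minimal PCA sketch with NumPy, via SVD on centred data (toy data where one feature is nearly redundant, so most variance fits in two components):

```python
import numpy as np

rng = np.random.default_rng(1)
# 200 samples in 3-D; the third feature is almost a copy of the first,
# so nearly all variance lives in fewer than 3 directions.
X = rng.normal(size=(200, 2))
X = np.column_stack([X[:, 0], X[:, 1], X[:, 0] + 0.01 * rng.normal(size=200)])

# PCA: centre the data, then project onto the top right-singular vectors.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = S**2 / np.sum(S**2)   # variance ratio per component
X_reduced = Xc @ Vt[:2].T         # keep the top 2 principal components
```

In practice a library implementation (e.g. scikit-learn's `PCA`) does exactly this centring-plus-SVD under the hood.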
Feature selection chooses the most relevant features for a model.
Benefits:
- Reduces overfitting
- Improves model accuracy
- Speeds up training
Methods:
- Filter methods (e.g., correlation with the target)
- Wrapper methods (e.g., RFE, Recursive Feature Elimination)
- Embedded methods (e.g., Lasso)
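The filter method can be sketched in a few lines: rank features by absolute correlation with the target and keep the strongest (synthetic toy data where only feature 0 matters):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
relevant = rng.normal(size=n)
noise = rng.normal(size=(n, 3))
X = np.column_stack([relevant, noise])       # feature 0 is the useful one
y = 2 * relevant + 0.1 * rng.normal(size=n)  # target depends only on feature 0

# Filter method: rank features by absolute Pearson correlation with the target.
corrs = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])])
best = int(np.argmax(corrs))  # index of the most relevant feature
```

Wrapper and embedded methods differ mainly in that they consult the model itself (repeated fits for RFE, the learned sparse weights for Lasso) rather than a model-free statistic.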
One-hot encoding converts categorical data into numerical format.
Each category becomes a binary column (0 or 1).
Example: Color = Red, Green, Blue → Red=[1,0,0], Green=[0,1,0], Blue=[0,0,1].
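A minimal pure-Python encoder for a single categorical column (note the column order here is alphabetical, so the vectors differ from the example above, which orders columns as listed):

```python
# Minimal one-hot encoder: one binary column per distinct category.
def one_hot(values):
    categories = sorted(set(values))  # column order: alphabetical
    return [[1 if v == c else 0 for c in categories] for v in values]

colors = ["Red", "Green", "Blue", "Red"]
encoded = one_hot(colors)
# Columns are Blue, Green, Red:
# "Red" -> [0, 0, 1], "Green" -> [0, 1, 0], "Blue" -> [1, 0, 0]
```

Libraries such as pandas (`get_dummies`) or scikit-learn (`OneHotEncoder`) do the same thing with extras like handling unseen categories.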
Dimensionality reduction means reducing the number of input variables while retaining the important information.
Techniques:
- PCA (Principal Component Analysis, unsupervised)
- LDA (Linear Discriminant Analysis, supervised)
Regularization prevents overfitting by penalizing large weights.
- L1 (Lasso): Adds the sum of absolute weight values to the loss; can shrink some weights exactly to zero (built-in feature selection)
- L2 (Ridge): Adds the sum of squared weight values to the loss; reduces weight magnitudes but keeps all features
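The L2 penalty has a convenient closed form, which makes the shrinkage easy to demonstrate. A small NumPy sketch on synthetic data (the `alpha` values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(100, 5))
w_true = np.array([3.0, -2.0, 0.0, 0.0, 0.0])
y = X @ w_true + 0.1 * rng.normal(size=100)

# L2 (Ridge) closed form: w = (X^T X + alpha * I)^-1 X^T y
def ridge(X, y, alpha):
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(n_features), X.T @ y)

w_plain = ridge(X, y, 0.0)    # ordinary least squares (no penalty)
w_ridge = ridge(X, y, 100.0)  # penalized: weights are pulled toward zero
```

L1 has no closed form (the absolute value is not differentiable at zero), which is why Lasso is fit with iterative solvers such as coordinate descent.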
The curse of dimensionality: as dimensions increase, data becomes sparse and harder to model.
Distance-based algorithms like KNN become less effective.
Solution: Use dimensionality reduction or feature selection.
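The effect on distance-based methods can be seen directly: in high dimensions, the nearest and farthest points from a query look almost equally far away. A small NumPy demo (random Gaussian points, dimensions chosen for illustration):

```python
import numpy as np

rng = np.random.default_rng(4)

def distance_spread(dim, n=500):
    """(max - min) / min of distances from the origin for n random points."""
    X = rng.normal(size=(n, dim))
    d = np.linalg.norm(X, axis=1)
    return (d.max() - d.min()) / d.min()

# In high dimensions distances concentrate around a common value,
# so the relative spread collapses and "nearest" loses meaning.
low_dim_spread = distance_spread(2)
high_dim_spread = distance_spread(1000)
```

This concentration of distances is exactly why KNN and other neighbour-based methods degrade without dimensionality reduction.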
K-Means is an unsupervised algorithm that groups data into k clusters.
Steps:
- Choose k centroids
- Assign each point to the nearest centroid
- Recalculate centroids
- Repeat until convergence
- KNN (K-Nearest Neighbors): Supervised, classification/regression, uses labeled data
- K-Means: Unsupervised, clustering, does not use labels
Naive Bayes is a probabilistic classifier based on Bayes' Theorem.
Assumes features are independent (naive assumption).
Works well for text classification like spam detection and is very fast.
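A toy spam filter shows both the "naive" independence assumption (per-word probabilities are simply multiplied, i.e. summed in log space) and why it is fast. The corpus below is invented for illustration, and equal class priors are assumed:

```python
from collections import Counter
import math

# Tiny invented corpus for spam vs ham.
spam = ["win money now", "free money offer", "win free prize"]
ham = ["meeting at noon", "project status update", "lunch at noon"]

def word_stats(docs):
    counts = Counter(w for d in docs for w in d.split())
    return counts, sum(counts.values())

spam_counts, spam_total = word_stats(spam)
ham_counts, ham_total = word_stats(ham)
vocab = set(spam_counts) | set(ham_counts)

def log_prob(msg, counts, total):
    # Naive assumption: words are independent, so log-probabilities add.
    # Laplace (+1) smoothing avoids zero probability for unseen words.
    return sum(math.log((counts[w] + 1) / (total + len(vocab)))
               for w in msg.split())

def classify(msg):
    # Equal class priors assumed (3 training messages per class).
    spam_score = log_prob(msg, spam_counts, spam_total)
    ham_score = log_prob(msg, ham_counts, ham_total)
    return "spam" if spam_score > ham_score else "ham"
```

Training is just word counting and classification is a single pass over the message, which is why Naive Bayes scales so well on text.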