
Introduction to Machine Learning - EECS 836

1. Course Overview and Administration


Instructor & Schedule:
The course “EECS 836: Machine Learning” is taught by Professor Zijun Yao. It includes
regular lectures (Monday/Friday) with designated office hours, and all class materials
(slides, assignments, projects) are posted on Canvas.

Course Components:
- Lectures: Attend and actively engage.
- Slides & Readings: Review these materials before class.
- Assignments & Exams: Submit on time.
- Team Project: A group project (3–4 students) where you apply machine learning to a real-
world problem.

Grading Policy:
- Components Include: Attendance, assignments, two exams, and a team project.
- Academic Integrity: Cheating is not tolerated, and suspiciously similar answers must be explained.

2. Motivation and Future of AI/ML


Why Study Machine Learning?
- AI is expected to create millions of new jobs while transforming industries (e.g.,
healthcare, finance).
- AI might contribute trillions to global GDP in the future.

Definition of AI and Its Subfields:


- AI: The broad science of making machines that can mimic human thinking and behavior.
- ML: A subset of AI focused on building algorithms that learn from data without being
explicitly programmed.
- Deep Learning: A technique within ML that uses neural networks with many layers for
tasks like image or speech recognition.

3. Historical Perspective and Key Milestones


Early AI (Turing Test, 1950): Introduced the idea of testing whether a machine can exhibit
intelligent behavior indistinguishable from a human's.
- 1990s: Development of techniques like SVMs and Bayesian networks.
- 1997: Deep Blue (a chess AI) beats the reigning world chess champion.
- 2010s–2020s: Advances such as AlexNet, GANs, AlphaGo, and ChatGPT showcase AI's rapid progress.

4. Machine Learning Tasks and Data Types


Tasks in ML:
- Regression: Predicting a continuous output (e.g., stock prices). Example formula (sketched
in code after this list):
y = b + w * x
- Classification: Assigning an input into predefined classes (e.g., spam filtering).
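
A minimal Python sketch of the regression formula above; the weight and bias values are
made-up illustrations, not course data:

def predict(x, w, b):
    # Linear regression prediction: one weight (slope) and one bias (intercept).
    return b + w * x

w, b = 2.0, 0.5              # hypothetical parameters
print(predict(3.0, w, b))    # 6.5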

Data Types and Attributes:


- Nominal: Names or labels.
- Binary: Two possible states (0 or 1).
- Ordinal: Ordered values (e.g., small, medium, large).
- Numeric: Continuous (e.g., speed) or discrete values.

5. Neural Networks and Optimization


Neural Networks Basics:
- Consist of layers of interconnected neurons processing input data.
- Key Components: Weights, biases, and activation functions (e.g., the sigmoid function); a
single-neuron sketch follows this list.
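
A small sketch of one neuron with a sigmoid activation; the input, weight, and bias values
are illustrative assumptions:

import math

def sigmoid(z):
    # Sigmoid activation: squashes any real number into the range (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

def neuron(inputs, weights, bias):
    # One neuron: weighted sum of inputs plus bias, passed through the activation function.
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return sigmoid(z)

print(neuron([0.5, -1.2], [0.8, 0.3], 0.1))   # example activation in (0, 1)

Stacking many such neurons into layers, with each layer's outputs feeding the next layer's
inputs, gives the multi-layer networks used in deep learning.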

Loss Functions and Error Measurement:


- Mean Squared Error (MSE):
L(b, w) = (1/N) * Σ (y_n - ŷ_n)^2
- Mean Absolute Error (MAE): Uses absolute differences instead of squared ones (both are
sketched in code below).
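
Both losses are straightforward to compute; here is a minimal sketch with hypothetical
targets and predictions:

def mse(y_true, y_pred):
    # Mean Squared Error: average of squared differences.
    return sum((y - yhat) ** 2 for y, yhat in zip(y_true, y_pred)) / len(y_true)

def mae(y_true, y_pred):
    # Mean Absolute Error: average of absolute differences.
    return sum(abs(y - yhat) for y, yhat in zip(y_true, y_pred)) / len(y_true)

y_true = [3.0, 5.0, 2.5]    # hypothetical targets
y_pred = [2.8, 5.4, 2.0]    # hypothetical model outputs
print(mse(y_true, y_pred), mae(y_true, y_pred))

MSE penalizes large errors more heavily because the differences are squared, while MAE
treats all errors proportionally.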

Optimization and Gradient Descent:


- Gradient Descent Formula:
w_new = w_old - η * (∂L/∂w)
where η is the learning rate and ∂L/∂w is the gradient of the loss with respect to the
weight (a worked one-step example follows).
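
A worked single update for the linear model y = b + w * x under MSE; the toy dataset and
learning rate are illustrative assumptions, not course data:

xs = [1.0, 2.0, 3.0]
ys = [3.0, 5.0, 7.0]          # roughly y = 1 + 2x
w, b, eta = 0.0, 0.0, 0.01    # initial parameters and learning rate
n = len(xs)

# Partial derivatives of L(b, w) = (1/N) * Σ (y_n - (b + w * x_n))^2
dL_dw = (-2.0 / n) * sum((y - (b + w * x)) * x for x, y in zip(xs, ys))
dL_db = (-2.0 / n) * sum((y - (b + w * x)) for x, y in zip(xs, ys))

# Apply the update rule: parameter_new = parameter_old - η * gradient
w = w - eta * dL_dw
b = b - eta * dL_db
print(w, b)

Repeating this update many times moves w and b toward values that minimize the loss.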

6. Summary: Steps in Machine Learning


Step 1: Problem Formulation - Define the type of function needed for the task (e.g.,
classification, regression).
Step 2: Measuring Error - Use a loss function to quantify model performance.
Step 3: Optimization - Adjust model parameters using techniques like gradient descent to
minimize error.
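
A compact end-to-end sketch combining the three steps: a linear model (Step 1), MSE as the
loss (Step 2), and repeated gradient-descent updates (Step 3). The toy data and
hyperparameters are illustrative assumptions:

xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.1, 4.9, 7.2, 8.8]          # roughly y = 1 + 2x with a little noise

w, b, eta, epochs = 0.0, 0.0, 0.01, 2000
n = len(xs)

for _ in range(epochs):
    preds = [b + w * x for x in xs]                                  # Step 1: model
    dL_dw = (-2.0 / n) * sum((y - p) * x for x, y, p in zip(xs, ys, preds))
    dL_db = (-2.0 / n) * sum((y - p) for y, p in zip(ys, preds))
    w -= eta * dL_dw                                                 # Step 3: update
    b -= eta * dL_db

mse = sum((y - (b + w * x)) ** 2 for x, y in zip(xs, ys)) / n        # Step 2: loss
print(f"w={w:.2f}, b={b:.2f}, MSE={mse:.4f}")   # w and b end up near the toy data's slope and intercept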
