Description
Stone River Elearning – Math for Machine Learning
Learn the core topics of Machine Learning to open doors to data science and artificial intelligence.
Would you like to learn a mathematical subject that is crucial to many high-demand, lucrative career fields, such as:
- Computer Science
- Data Science
- Artificial Intelligence
If you’re looking to gain a solid foundation in machine learning to further your career goals, studying on your own schedule at a fraction of the cost of a traditional university, this online course is for you. It is suited both to working professionals who need a refresher on machine learning and to complete beginners learning the subject for the first time.
Why you should take this online course:
- You need to refresh your knowledge of machine learning to advance your career and earn a higher salary.
- You need to learn machine learning because it is a required mathematical subject for your chosen career field, such as data science or artificial intelligence.
- You intend to pursue a master’s degree or PhD, and machine learning is a required or recommended subject.
Why you should choose this instructor: I earned my PhD in Mathematics from the University of California, Riverside. I have created many successful online math courses in linear algebra, discrete math, and calculus that students around the world have found invaluable.
In this course, we will cover core concepts such as:
- Linear Regression
- Linear Discriminant Analysis
- Logistic Regression
- Artificial Neural Networks
- Support Vector Machines
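To give a sense of the level of the material, here is a minimal sketch of the first topic: linear regression fitted by the least squares method using the linear-algebra (normal equations) solution. The sketch is not taken from the course; the data and variable names are illustrative only.

```python
import numpy as np

# Illustrative data (not from the course): noisy points near y = 2x + 1.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

# Design matrix with a column of ones for the intercept term.
X = np.column_stack([np.ones_like(x), x])

# Least squares solution to X @ beta ≈ y via the normal equations:
# beta = (X^T X)^{-1} X^T y
beta = np.linalg.solve(X.T @ X, X.T @ y)

print("intercept, slope:", beta)   # roughly [1.0, 2.0]
print("fitted values:", X @ beta)
```

For larger or ill-conditioned problems, np.linalg.lstsq is a more robust choice than solving the normal equations directly; the normal-equations form is shown here because it mirrors the linear-algebra derivation named in the curriculum below.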
Course Curriculum
- Introduction (2:46)
- Linear Regression (7:32)
- The Least Squares Method (11:25)
- Linear Algebra Solution to Least Squares Problem (12:50)
- Example Linear Regression (4:05)
- Summary Linear Regression (0:33)
- Problem Set Linear Regression
- Solution Set Linear Regression
- Classification (1:15)
- Linear Discriminant Analysis (0:44)
- The Posterior Probability Functions (3:42)
- Modelling the Posterior Probability Functions (7:13)
- Linear Discriminant Functions (5:32)
- Estimating the Linear Discriminant Functions (6:00)
- Classifying Data Points Using Linear Discriminant Functions (3:09)
- LDA Example 1 (13:52)
- LDA Example 2 (17:38)
- Summary Linear Discriminant Analysis (1:34)
- Problem Set Linear Discriminant Analysis
- Solution Set Linear Discriminant Analysis
- Logistic Regression (1:15)
- Logistic Regression Model of the Posterior Probability Function (3:02)
- Estimating the Posterior Probability Function (8:57)
- The Multivariate Newton-Raphson Method (9:14)
- Maximizing the Log-Likelihood Function (13:51)
- Logistic Regression Example (9:55)
- Summary Logistic Regression (1:21)
- Problem Set Logistic Regression
- Solution Set Logistic Regression
- Artificial Neural Networks (0:36)
- Neural Network Model of the Output Functions (12:59)
- Forward Propagation (0:51)
- Choosing Activation Functions (4:30)
- Estimating the Output Functions (2:17)
- Error Function for Regression (2:27)
- Error Function for Binary Classification (6:15)
- Error Function for Multiclass Classification (4:38)
- Minimizing the Error Function Using Gradient Descent (6:27)
- Backpropagation Equations (4:16)
- Summary of Backpropagation (1:27)
- Summary Artificial Neural Networks (1:47)
- Problem Set Artificial Neural Networks
- Solution Set Artificial Neural Networks
- Maximal Margin Classifier (2:29)
- Definitions of Separating Hyperplane and Margin (5:43)
- Proof 1 (6:42)
- Maximizing the Margin (3:36)
- Definition of Maximal Margin Classifier (1:01)
- Reformulating the Optimization Problem (7:37)
- Proof 2 (1:13)
- Proof 3 (4:52)
- Proof 4 (8:41)
- Proof 5 (5:10)
- Solving the Convex Optimization Problem (1:05)
- KKT Conditions (1:24)
- Primal and Dual Problems (1:24)
- Solving the Dual Problem (3:31)
- The Coefficients for the Maximal Margin Hyperplane (0:29)
- Classifying Test Points (1:50)
- The Support Vectors (0:58)
- Maximal Margin Classifier Example 1 (9:50)
- Maximal Margin Classifier Example 2 (11:41)
- Summary Maximal Margin Classifier (0:31)
- Problem Set Maximal Margin Classifier
- Solution Set Maximal Margin Classifier
- Support Vector Classifier (3:54)
- Slack Variables: Points on Correct Side of Hyperplane (3:47)
- Slack Variables: Points on Wrong Side of Hyperplane (1:37)
- Formulating the Optimization Problem (3:52)
- Definition of Support Vector Classifier (0:44)
- A Convex Optimization Problem (1:46)
- Solving the Convex Optimization Problem (Soft Margin) (6:38)
- The Coefficients for the Soft Margin Hyperplane (2:09)
- The Support Vectors (Soft Margin) (1:37)
- Classifying Test Points (Soft Margin) (1:36)
- Support Vector Classifier Example 1 (14:53)
- Support Vector Classifier Example 2 (9:19)
- Summary Support Vector Classifier (0:41)
- Problem Set Support Vector Classifier
- Solution Set Support Vector Classifier
- Support Vector Machine Classifier (1:19)
- Enlarging the Feature Space (5:22)
- The Kernel Trick (4:24)
- Summary Support Vector Machine Classifier (1:07)
- Concluding Letter (Math for Machine Learning)
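The support vector machine module above closes with the kernel trick: computing inner products in an enlarged feature space without ever constructing that space. As a small illustration, assuming a standard quadratic polynomial kernel and its textbook feature map (neither is taken from the course), the sketch below checks numerically that the two computations agree:

```python
import numpy as np

def poly_kernel(u, v):
    """Quadratic polynomial kernel K(u, v) = (u . v + 1)^2 for 2-D inputs."""
    return (np.dot(u, v) + 1.0) ** 2

def phi(u):
    """Explicit feature map into the enlarged (6-dimensional) space whose
    inner product reproduces the quadratic kernel above."""
    x1, x2 = u
    return np.array([x1**2, x2**2,
                     np.sqrt(2.0) * x1 * x2,
                     np.sqrt(2.0) * x1,
                     np.sqrt(2.0) * x2,
                     1.0])

u = np.array([0.5, -1.0])
v = np.array([2.0, 0.3])

# The kernel trick: both numbers below agree (2.89 for this data),
# but the kernel never forms the enlarged feature vectors.
print(poly_kernel(u, v))          # kernel evaluated in the original space
print(np.dot(phi(u), phi(v)))     # inner product in the enlarged space
```

Because the classifier only ever needs kernel evaluations, the same support vector machinery carries over to the enlarged feature space unchanged.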
Sale Page: https://stoneriverelearning.com/p/math-for-machine-learning
Archive: https://archive.ph/wip/aJCkj