In this course, you’ll learn various supervised ML algorithms and prediction tasks applied to different kinds of data. You’ll learn when to use which model and why, and how to improve model performance. We will cover models such as linear and logistic regression, KNN, and decision trees; ensemble methods such as Random Forest and Boosting; and kernel methods such as SVM.
This course is part of the Machine Learning: Theory and Hands-on Practice with Python Specialization
Course Information
Calculus, Linear algebra, Python
What you will learn
Use modern machine learning tools and Python libraries.
Compare logistic regression’s strengths and weaknesses.
Explain how to deal with linearly inseparable data.
Explain what a decision tree is and how it splits nodes.
Skills you will gain
- Hyperparameter
- Decision Tree
- Ensembling
- sklearn
Offered by

University of Colorado Boulder
CU-Boulder is a dynamic community of scholars and learners on one of the most spectacular college campuses in the country. As one of 34 U.S. public institutions in the prestigious Association of American Universities (AAU), we have a proud tradition of academic excellence, with five Nobel laureates and more than 50 members of prestigious academic academies.
Syllabus - What you will learn from this course
Introduction to Machine Learning, Linear Regression
This week, we will build our supervised machine learning foundation. Data cleaning and EDA might not seem glamorous, but the process is vital for guiding your real-world data projects. The chances are that you have heard of linear regression before. With the buzz around machine learning, perhaps it seems surprising that we are starting with such a standard statistical technique. In "How Not to Be Wrong: The Power of Mathematical Thinking", Jordan Ellenberg refers to linear regression as "the statistical technique that is to social science as the screwdriver is to home repair. It's the one tool you're pretty much going to use, whatever the task" (51). Linear regression is an excellent starting place for solving problems with a continuous outcome. Hopefully, this week will help you appreciate how much you can accomplish with a simple model like this.
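If you would like a preview of what fitting such a model looks like in code, here is a minimal sketch using scikit-learn (the sklearn library listed under skills above). The synthetic data and variable names are illustrative only, not the lab's actual dataset.

```python
# Illustrative only: a minimal simple linear regression with scikit-learn
# on synthetic data (the course labs use their own datasets).
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(100, 1))               # one explanatory variable
y = 3.0 * X.ravel() + 2.0 + rng.normal(0, 1, 100)   # continuous outcome with noise

model = LinearRegression()
model.fit(X, y)

print("slope:", model.coef_[0])        # estimated coefficient (true value 3.0)
print("intercept:", model.intercept_)  # estimated intercept (true value 2.0)
print("R^2:", model.score(X, y))       # goodness of fit on the training data
```

Even this tiny example shows the fit/predict workflow you will reuse with every model in the course.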
Multilinear Regression
This week we are building on last week's foundation and working with more complex linear regression models. After this week, you will be able to create linear models with several explanatory and categorical variables. Mathematically and syntactically, multiple linear regression models are a natural extension of the simpler linear regression models we learned last week. One of the differences that we must keep in mind this week is that our data space is now 3D instead of 2D. The difference between 3D and 2D has implications when considering how to do things like creating meaningful visualizations. It is essential to understand how to interpret coefficients. Machine learning involves strategically iterating and improving upon a model. In this week's lab and Peer Review, you will identify weaknesses with linear regression models and strategically improve on them. Hopefully, as you progress through this course specialization, you will get better and better at this iterative process.
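As a hedged illustration of how a categorical explanatory variable can enter a multiple linear regression, the sketch below one-hot encodes a made-up "city" column alongside a numeric feature. The column names and numbers are invented for illustration and are not taken from the course materials.

```python
# Illustrative sketch: multiple linear regression with one numeric and one
# categorical explanatory variable, encoded as one-hot dummies. The data and
# column names here are made up, not the course's dataset.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.preprocessing import OneHotEncoder
from sklearn.pipeline import make_pipeline
from sklearn.linear_model import LinearRegression

df = pd.DataFrame({
    "sqft":  [850, 1200, 1500, 2100, 950, 1800],
    "city":  ["Boulder", "Denver", "Boulder", "Denver", "Denver", "Boulder"],
    "price": [300, 380, 450, 520, 330, 505],
})

preprocess = ColumnTransformer([
    ("onehot", OneHotEncoder(drop="first"), ["city"]),  # encode the categorical variable
], remainder="passthrough")                              # keep numeric columns as-is

model = make_pipeline(preprocess, LinearRegression())
model.fit(df[["sqft", "city"]], df["price"])

# Coefficients now include one term per dummy column plus the numeric slope,
# which is why interpreting coefficients carefully matters this week.
print(model.named_steps["linearregression"].coef_)
```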
Logistic Regression
Even though the name logistic regression might suggest otherwise, we will be shifting our attention from regression tasks to classification tasks this week. Logistic regression is a particular case of a generalized linear model. Like linear regression, logistic regression is a widely used statistical technique and one of the foundational tools for your data science toolkit. There are many real-world applications for classification tasks, including the financial and biomedical realms. In this week's lab, you will see how this classic algorithm will help you predict whether a biopsy slide from the famous Wisconsin Breast Cancer dataset shows a benign or malignant mass. We also advise starting this week on the final project, which you will turn in during Week 7 of the course: find a project dataset, start performing EDA, and define your problem. Use the project rubric as a guide, and don't be afraid to look at a few datasets until you find one well-suited to the project.
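For a flavor of the lab task, the sketch below fits a logistic regression to the copy of the Wisconsin Breast Cancer data that ships with scikit-learn. The course lab may load and preprocess the dataset differently, so treat this as an assumption-laden outline rather than the lab solution.

```python
# Illustrative sketch: logistic regression for a benign-vs-malignant
# classification task. scikit-learn ships a copy of the Wisconsin Breast
# Cancer data via load_breast_cancer; the course lab may use its own copy.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.linear_model import LogisticRegression

X, y = load_breast_cancer(return_X_y=True)   # target: 0 = malignant, 1 = benign
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y)

# Scaling helps the solver converge; max_iter is one tunable hyperparameter.
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
clf.fit(X_train, y_train)

print("test accuracy:", clf.score(X_test, y_test))
```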
Non-parametric Models
This week we will learn about non-parametric models. k-Nearest Neighbors makes sense on an intuitive level. Decision trees are a supervised learning model that can be used for either regression or classification tasks. In Module 2, we learned about the bias-variance tradeoff, and we've kept that tradeoff in mind as we've moved through the course. Highly flexible tree models have the benefit that they can capture complex, non-linear relationships. However, they are prone to overfitting. This week and next, we will explore strategies like pruning to avoid overfitting with tree-based models. In this week's lab, you will make a KNN classifier for the famous MNIST dataset and then build a spam classifier using a decision tree model. This week we will once again appreciate the power of simple, understandable models. Keep going with your final project. Once you've finalized your dataset and EDA, start on the initial approach for your main supervised learning task. Review the course material, read research papers, look at GitHub repositories and Medium articles to understand your topic and plan your approach.
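As a rough preview of this week's two models, the sketch below fits a k-nearest neighbors classifier and a depth-limited decision tree to scikit-learn's small digits dataset, used here as a stand-in for MNIST. The actual labs use MNIST and a spam dataset, so the data, names, and parameters here are illustrative only.

```python
# Illustrative sketch of the two non-parametric models mentioned this week.
# load_digits is a small stand-in for MNIST; the real labs use MNIST and a
# spam dataset.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

# k-Nearest Neighbors: predictions come from the k closest training points.
knn = KNeighborsClassifier(n_neighbors=3)
knn.fit(X_train, y_train)
print("KNN accuracy:", knn.score(X_test, y_test))

# Decision tree: max_depth caps tree growth (a simple form of pre-pruning)
# to curb the overfitting discussed above.
tree = DecisionTreeClassifier(max_depth=10, random_state=0)
tree.fit(X_train, y_train)
print("Tree accuracy:", tree.score(X_test, y_test))
```

Varying n_neighbors and max_depth is a small, concrete way to see the bias-variance tradeoff from Module 2 in action.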
About the Machine Learning: Theory and Hands-on Practice with Python Specialization
In the Machine Learning specialization, we will cover Supervised Learning, Unsupervised Learning, and the basics of Deep Learning. You will apply ML algorithms to real-world data, learn when to use which model and why, and improve the performance of your models. Starting with supervised learning, we will cover linear and logistic regression, KNN, Decision trees, ensembling methods such as Random Forest and Boosting, and kernel methods such as SVM. Then we turn our attention to unsupervised methods, including dimensionality reduction techniques (e.g., PCA), clustering, and recommender systems. We finish with an introduction to deep learning basics, including choosing model architectures, building/training neural networks with libraries like Keras, and hands-on examples of CNNs and RNNs.
