Linear regression with one variable - Cost function intuition I

Abstract: This article is a verbatim transcript of lesson 8, "Cost function intuition I", in Chapter 2, "Linear regression with one variable", of Andrew Ng's Machine Learning course. I transcribed it word for word while watching the videos so I could refer back to it later, and I am sharing it here. If you find any errors, corrections are welcome and sincerely appreciated. I hope it also helps with your studies.

In the previous video (article), we gave the mathematical definition of the cost function. In this video (article), let's look at some examples to get back to intuition about what the cost function is doing, and why we want to use it.


To recap, here's what we had last time. We want to fit a straight line to our data, so we have the hypothesis h_θ(x) = θ₀ + θ₁x with parameters θ₀ and θ₁, and with different choices of the parameters we end up with different straight-line fits to the data. And we have a cost function J(θ₀, θ₁), which is our optimization objective. For this video (article), in order to better visualize the cost function J, I'm going to work with a simplified hypothesis function, shown on the right: h_θ(x) = θ₁x. You can, if you want, think of this as setting the parameter θ₀ = 0. So I have only one parameter θ₁, and my cost function is similar to before, except that now h_θ(x) = θ₁x. So I have only one parameter θ₁, and my optimization objective is to minimize J(θ₁). In pictures, what this means is that setting θ₀ = 0 corresponds to choosing only hypothesis functions that pass through the origin, the point (0, 0). Using this simplified definition of the hypothesis and cost function, let's try to understand the cost function concept better.


It turns out that there are two key functions we want to understand. The first is the hypothesis function, and the second is the cost function. Notice that the hypothesis h_θ(x) = θ₁x, for a fixed value of θ₁, is a function of x. So the hypothesis is a function of the size of the house x. In contrast, the cost function J(θ₁) is a function of the parameter θ₁, which controls the slope of the straight line. Let's plot these functions and try to understand them both better. Let's start with the hypothesis. On the left, let's say here's my training set with three points: (1, 1), (2, 2), and (3, 3). Let's pick a value for θ₁, say θ₁ = 1, and if that's my choice for θ₁, then my hypothesis is going to look like this straight line over here. And I'm going to point out that when I'm plotting my hypothesis function, my horizontal axis is labeled x, the size of the house. Now, temporarily, set θ₁ = 1. What I want to do is figure out what J(θ₁) is when θ₁ = 1. So let's go ahead and compute what the cost function equals for the value one. Well, as usual, my cost function is defined as the sum over my training set of the usual squared error term: J(θ₁) = (1/(2m)) · Σᵢ₌₁ᵐ (h_θ(x⁽ⁱ⁾) − y⁽ⁱ⁾)², where h_θ(x⁽ⁱ⁾) = θ₁x⁽ⁱ⁾.
And this is therefore equal to (1/(2m)) · [(1 − 1)² + (2 − 2)² + (3 − 3)²], and if you simplify, this turns out to be (1/(2m)) · 0, which is of course just equal to 0. Inside the cost function, it turns out, each of these terms is equal to 0, because for the specific training set I have, for my 3 training examples (1, 1), (2, 2), (3, 3), if θ₁ = 1, then h_θ(x⁽ⁱ⁾) = y⁽ⁱ⁾ exactly. And so h_θ(x⁽ⁱ⁾) − y⁽ⁱ⁾ = 0 for each of these terms, which is why I find that J(1) = 0. Let's plot that. What I'm going to do on the right is plot my cost function J. And notice, because my cost function is a function of my parameter θ₁, when I plot my cost function, the horizontal axis is now labeled with θ₁. So I have J(1) = 0; let's go ahead and plot that. We end up with an × over there. Now let's look at some other examples. θ₁ can take on a range of different values, right? It can take on negative values, zero, and positive values. So, what if θ₁ = 0.5? Let's go ahead and plot that.
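The computation above can be sketched in a few lines of Python. This is a minimal sketch, assuming the lecture's toy training set (1, 1), (2, 2), (3, 3):

```python
# Simplified single-parameter cost function from the lecture:
# J(theta1) = (1/2m) * sum over i of (theta1 * x_i - y_i)^2

def cost(theta1, data):
    """Squared-error cost for the hypothesis h(x) = theta1 * x."""
    m = len(data)
    return sum((theta1 * x - y) ** 2 for x, y in data) / (2 * m)

# Toy training set used in the lecture's plot (an assumption here).
data = [(1, 1), (2, 2), (3, 3)]
print(cost(1.0, data))  # 0.0: the line y = x passes through every point
```

With θ₁ = 1 every residual is zero, so the cost comes out to exactly 0, matching J(1) = 0 above.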


I'm now going to set θ₁ = 0.5, and in that case my hypothesis looks like this: a line with slope equal to 0.5. And let's compute J(0.5). So that is going to be (1/(2m)) times my usual sum of squared errors. It turns out that the cost is the sum of the squared height of this line, plus the squared height of that line, plus the squared height of that line, right? Because each vertical distance is the difference between y⁽ⁱ⁾ and the predicted value h_θ(x⁽ⁱ⁾). So the first example contributes (0.5 − 1)². For my second example, I get (1 − 2)², because my hypothesis predicted one, but the actual housing price was two. And finally, plus (1.5 − 3)². And so that's equal to (1/(2 · 3)) · (0.25 + 1 + 2.25) = 3.5/6 ≈ 0.58. So now we know J(0.5) is about 0.58. Let's go and plot that; it lands maybe about over there. Now, let's do one more. How about if θ₁ = 0? What is J(0) equal to?
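The J(0.5) arithmetic can be checked term by term, again assuming the toy training set (1, 1), (2, 2), (3, 3):

```python
# Term-by-term check of J(0.5) for h(x) = 0.5 * x.
theta1 = 0.5
data = [(1, 1), (2, 2), (3, 3)]  # assumed toy training set

# Squared vertical distance between each point and the line.
squared_errors = [(theta1 * x - y) ** 2 for x, y in data]
print(squared_errors)                         # [0.25, 1.0, 2.25]
print(sum(squared_errors) / (2 * len(data)))  # 3.5 / 6 ≈ 0.583
```

The three terms 0.25, 1, and 2.25 are exactly the (0.5 − 1)², (1 − 2)², and (1.5 − 3)² contributions worked out above.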


It turns out that if θ₁ = 0, h_θ(x) is just equal to 0, you know, this flat line that just goes horizontally like this. And so, measuring the errors, we have J(0) = (1/(2 · 3)) · (1² + 2² + 3²) = 14/6 ≈ 2.3. So let's go ahead and plot that as well; it ends up with a value around 2.3. And of course, we can keep on doing this for other values of θ₁. It turns out that θ₁ can take on negative values as well. So if θ₁ is negative, say θ₁ = −0.5, then h_θ(x) = −0.5x, and that corresponds to a hypothesis with a slope of −0.5. And you can actually keep on computing these errors. For θ₁ = −0.5, it turns out to have a really high error; it works out to be 5.25, and so on. And for different values of θ₁, you can compute these things. And by computing a range of values, you can slowly trace out what this function J(θ₁) looks like. To recap: each value of θ₁ corresponds to a different hypothesis, that is, to a different straight-line fit on the left.
And for each value of θ₁, we can then derive a different value of J(θ₁). For example, θ₁ = 1 corresponds to this straight line (in cyan) through the data, whereas θ₁ = 0.5, the point shown in magenta, corresponds to maybe that line (in magenta). And θ₁ = 0, shown in blue, corresponds to this horizontal line (in blue). So, for each value of θ₁, we wound up with a different value of J(θ₁), and we can use this to trace out the plot on the right. Now, remember that the optimization objective for our learning algorithm is to choose the value of θ₁ that minimizes J(θ₁). This was our objective function for linear regression. Well, looking at this curve, the value that minimizes J(θ₁) is θ₁ = 1. And lo and behold, that is indeed the best possible straight-line fit through the data: by setting θ₁ = 1, for this particular training set, we actually end up fitting it perfectly. And that's why minimizing J(θ₁) corresponds to finding a straight line that fits the data well. So, to wrap up, in this video (article), we looked at some plots to understand the cost function.
To do so, we simplified the algorithm so that it only had one parameter θ₁, by setting the parameter θ₀ = 0. In the next video (article), we'll go back to the original problem formulation and look at some visualizations involving both θ₀ and θ₁, that is, without setting θ₀ = 0. And hopefully that will give you an even better sense of what the cost function J is doing in the original linear regression.
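The sweep the lecture performs by hand, evaluating J(θ₁) over a range of values and spotting its minimum, can be sketched like this (training-set values are the assumed toy data):

```python
# Trace out J(theta1) over a grid of values, as done by hand in the
# lecture, and locate the minimizing theta1.
data = [(1, 1), (2, 2), (3, 3)]  # assumed toy training set

def cost(theta1):
    """J(theta1) = (1/2m) * sum((theta1*x - y)^2) for h(x) = theta1 * x."""
    m = len(data)
    return sum((theta1 * x - y) ** 2 for x, y in data) / (2 * m)

grid = [i / 10 for i in range(-10, 31)]  # theta1 from -1.0 to 3.0
best = min(grid, key=cost)
print(best, cost(best))      # minimum at theta1 = 1.0, where J = 0.0
print(round(cost(-0.5), 2))  # 5.25, the high error noted for slope -0.5
```

The grid sweep recovers the bowl-shaped curve from the plot: J is high for negative slopes, dips to 0 at θ₁ = 1, and rises again beyond it.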
