Polynomial regression is a form of linear regression in which the relationship between the independent variable x and the dependent variable y is modelled as an nth degree polynomial.
Here is the simplest polynomial: one independent variable, second order (also known as a quadratic function): y = θ0 + θ1 x + θ2 x^{2}
The next larger polynomial: one independent variable, third order (also known as a cubic function): y = θ0 + θ1 x + θ2 x^{2} + θ3 x^{3}
One independent variable, fourth order (also known as a quartic function): y = θ0 + θ1 x + θ2 x^{2} + θ3 x^{3} + θ4 x^{4}
So how do we actually fit a model like this to our data?
Using the machinery of multivariate linear regression, we can do this with a simple modification to our hypothesis: h(x) = θ0 + θ1 x1 + θ2 x2 + θ3 x3, in which x1 = x, x2 = x^{2}, x3 = x^{3}.
In this way, the polynomial regression problem becomes a multivariate linear regression problem, and we can use gradient descent or the normal equation to solve it.
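As a minimal sketch of this idea in NumPy (the data and variable names here are illustrative, not from the original post): we build a design matrix whose columns are 1, x, x², and x³, then solve the resulting linear least-squares problem, which is equivalent to the normal-equation solution.

```python
import numpy as np

# Toy data that follows y = x^3 + 1 exactly (chosen for illustration)
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.0, 2.0, 9.0, 28.0, 65.0])

# Design matrix: columns are 1, x1 = x, x2 = x^2, x3 = x^3,
# turning the cubic fit into a multivariate linear regression
X = np.column_stack([np.ones_like(x), x, x**2, x**3])

# Least-squares solve, mathematically equivalent to the normal
# equation theta = (X^T X)^{-1} X^T y but numerically more stable
theta, *_ = np.linalg.lstsq(X, y, rcond=None)

print(np.round(theta, 3))  # recovers approximately [1, 0, 0, 1]
```

Using `np.linalg.lstsq` rather than inverting X^T X directly avoids numerical trouble when the polynomial features are nearly collinear; in practice one would also scale the features before running gradient descent, since x, x², and x³ can differ by orders of magnitude.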
