When using gradient descent as the learning algorithm for a regression problem, there are several ways to improve it, though in the beginning it is often unclear where to start. This article shows how: choosing the right learning rate and initial parameters, and applying feature scaling and vectorization.
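As a minimal sketch of two of these ideas, the snippet below runs vectorized batch gradient descent on hypothetical data after z-score feature scaling, so a single learning rate works across features of very different magnitudes (the data, learning rate, and iteration count are illustrative assumptions, not values from this article):

```python
import numpy as np

# Hypothetical data: 100 samples, 3 features on very different scales.
rng = np.random.default_rng(0)
X = rng.uniform([0, 0, 0], [1, 1000, 50], size=(100, 3))
y = X @ np.array([2.0, 0.01, -0.5]) + 3.0

# Feature scaling (z-score normalization) so one learning rate fits all features.
mu, sigma = X.mean(axis=0), X.std(axis=0)
X_scaled = (X - mu) / sigma
Xb = np.hstack([np.ones((len(X_scaled), 1)), X_scaled])  # prepend bias column

# Vectorized batch gradient descent on the mean squared error.
w = np.zeros(Xb.shape[1])  # initial parameters
alpha = 0.1                # learning rate
for _ in range(500):
    grad = Xb.T @ (Xb @ w - y) / len(y)  # gradient over all samples at once
    w -= alpha * grad
```

Without the scaling step, the feature ranging up to 1000 would force a much smaller learning rate, slowing convergence for the other features; the vectorized gradient avoids a Python loop over samples.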
Machine learning often starts out with linear regression and gradient descent on a univariate training set. But often the relationship in your data isn't linear. That's where polynomial regression comes into play: selecting a model type that fits your underlying data.
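A minimal sketch of the idea: expand the single input into polynomial features, then fit those features with an ordinary linear least-squares solve (the quadratic data below is a made-up example for illustration):

```python
import numpy as np

# Hypothetical univariate data with a quadratic relationship.
x = np.linspace(-3, 3, 50)
y = 1.0 + 2.0 * x + 0.5 * x**2

# Polynomial regression: expand x into polynomial features, then fit linearly.
degree = 2
X = np.vander(x, degree + 1, increasing=True)  # columns: 1, x, x^2

# Closed-form least-squares fit; gradient descent on X would also work.
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
```

The key point is that the model stays linear in its parameters; only the features are nonlinear in x, so the same gradient descent machinery still applies.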