We say the gradient always increases and gradient ascent maximizes the values. Can I then say that the terms "gradient" and "gradient ascent" can be used interchangeably?
Do you mean "the gradient always points in the direction of greatest increase"? In any case, the two terms are not interchangeable, as the gradient refers to a function and gradient ascent refers to an algorithm. – probably_someone Jul 26 '17 at 08:00
No. The gradient is a derivative. Gradient ascent is a dynamical system, a numerical algorithm. – Rodrigo de Azevedo Jul 26 '17 at 20:56
-
As far as I understand it from other sources, the derivative is a scalar quantity, while the gradient is a vector when the number of independent variables is more than one. From this, the definition I get is as follows: at any given point, it tells you the direction in which the function changes at the greatest rate. – Nikhil Bansal Jul 30 '17 at 08:54
1 Answer
Your comment seems to be correct. Let $f:\mathbb{R}^n\rightarrow\mathbb{R}$ be a function.
The gradient of $f$ is given by: $$ \nabla f = (\partial_{x_1}f,\ldots,\partial_{x_n}f)$$ which is a vector field (i.e. a vector-valued function) $\nabla f:\mathbb{R}^n\rightarrow\mathbb{R}^n$. So, at every point $\vec{x}$, the gradient at that point, $\nabla f(\vec{x})$, is a vector that points in the direction of greatest increase of $f$.
So, given a function $f$, the gradient gives a special set of vectors (one for each point in space) that everywhere point in the direction one should move to increase $f$. Notice that the gradient reduces to the ordinary derivative when $n=1$.
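To make this concrete, here is a minimal sketch in Python that approximates the gradient by central finite differences and checks it against the analytic gradient. The example function $f(x,y)=x^2+y^2$, the helper name `numerical_gradient`, and the step size `h` are illustrative choices, not anything from the original post.

```python
import numpy as np

def f(x):
    """Example function f(x, y) = x^2 + y^2 (an illustrative choice)."""
    return x[0]**2 + x[1]**2

def numerical_gradient(f, x, h=1e-6):
    """Approximate the gradient of f at x via central differences."""
    grad = np.zeros_like(x, dtype=float)
    for i in range(len(x)):
        step = np.zeros_like(x, dtype=float)
        step[i] = h
        grad[i] = (f(x + step) - f(x - step)) / (2 * h)
    return grad

x = np.array([1.0, 2.0])
print(numerical_gradient(f, x))  # ~ [2., 4.]
```

For this $f$ the analytic gradient is $(2x, 2y)$, so at the point $(1, 2)$ the printed vector should be close to $(2, 4)$.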
More precisely, I would say that the gradient is an operator whose input is a function and whose output is a vector field with this special property.
On the other hand, gradient ascent is an algorithm for maximizing functions. Suppose we have a set $D\subseteq \mathbb{R}^n$, and we want to find a point $y\in D$ at which $f$ is maximal: $$ y= \arg\max_{x\in D} f(x) $$ How can we do so? We use the special property of the gradient from before. If we want to maximize $f$, the smartest thing to do (ignoring local maxima) is to follow the direction of greatest increase of $f$, which is exactly the direction specified by the gradient. Concretely, starting from some initial guess $x_0$, we repeatedly take a small step uphill: $$ x_{k+1} = x_k + \eta\,\nabla f(x_k), $$ where $\eta > 0$ is a step size.
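Here is a minimal sketch of this iteration in Python, assuming we already have the gradient available as a function. The names `gradient_ascent` and `learning_rate`, the specific step size, and the example function are illustrative assumptions, not part of the original answer.

```python
import numpy as np

def gradient_ascent(grad_f, x0, learning_rate=0.1, num_steps=100):
    """Follow the gradient uphill: x_{k+1} = x_k + eta * grad_f(x_k).
    Converges to a local maximum for suitable step sizes; it is not
    guaranteed to find the global maximum."""
    x = np.asarray(x0, dtype=float)
    for _ in range(num_steps):
        x = x + learning_rate * grad_f(x)
    return x

# Maximize f(x, y) = -(x - 1)^2 - (y + 2)^2, whose gradient is
# (-2(x - 1), -2(y + 2)); the unique maximum is at (1, -2).
grad_f = lambda x: np.array([-2 * (x[0] - 1), -2 * (x[1] + 2)])
print(gradient_ascent(grad_f, x0=[0.0, 0.0]))  # approaches [1., -2.]
```

With a suitably small step size the iterates contract toward a local maximum; too large a step can overshoot and diverge, which is why $\eta$ is usually tuned or chosen by a line search.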
Summary: the gradient is a vector field associated with a function, while gradient ascent is an optimization algorithm that can find the locations of extrema of a given function, and which uses the special property of the gradient field to work.
