The Problem
The gradient descent algorithm finds a local minimum of a function, but in general it does not guarantee that the minimum it finds is the global one. I also don't know whether it is possible to find a minimum of a function analytically from its derivative (I assume not, since otherwise what would be the reason for using the gradient descent algorithm at all). Just for clarification: throughout this question, the function is assumed to be multidimensional.
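To make the problem concrete, here is a minimal sketch (a toy example of my own, in one dimension for simplicity): gradient descent on f(x) = x^4 - 3x^2 + x, which has two local minima. Depending on the starting point, the iterates settle into different basins, and only one of them contains the global minimum.

```python
def grad_descent(df, x, lr=0.01, steps=5000):
    """Plain gradient descent: repeatedly step against the derivative."""
    for _ in range(steps):
        x -= lr * df(x)
    return x

# Derivative of f(x) = x**4 - 3*x**2 + x
df = lambda x: 4 * x**3 - 6 * x + 1

left = grad_descent(df, -2.0)    # settles near x ≈ -1.30 (global minimum)
right = grad_descent(df, 2.0)    # settles near x ≈ 1.13 (worse local minimum)
print(left, right)
```

The same effect occurs in many dimensions; the 1-D case just makes it easy to see.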
An Alternate Idea
An alternate idea came to mind when trying to solve this problem (please don't judge too strictly, as I am not a professional in this field). Take a horizontal plane y = c relative to which the given function is convex (I am almost sure this is a wrong formulation, but I hope you understand what I wanted to say). While the plane intersects the function, move it down in steps (the step size can initially be very large, unlike the gradient descent step) until it no longer intersects. Then move the plane back up, halve the step size, and keep moving it up and down, halving the step on each iteration, until the step is smaller than some epsilon. At that point, one of the intersection points of the plane with the function must lie close to the global minimum, so we can finish from there with the gradient descent algorithm.
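The plane-lowering steps above can be sketched as follows. This is my own toy implementation, again in one dimension: the hard part, deciding whether the level y = c still intersects the graph of f, is faked here with dense grid sampling, which only works in low dimensions.

```python
def crosses(f, c, xs):
    """Toy crossing oracle: does f(x) <= c hold for some sampled x?"""
    return any(f(x) <= c for x in xs)

def lowest_level(f, c, step, xs, eps=1e-6):
    """Move the plane y = c down in big steps while it still crosses,
    back up when it overshoots, halving the step each iteration."""
    while step > eps:
        if crosses(f, c, xs):
            c -= step      # plane still intersects: push it lower
        else:
            c += step      # overshot below the function: pull it back up
        step /= 2
    return c

f = lambda x: x**4 - 3 * x**2 + x
xs = [i / 1000 for i in range(-3000, 3001)]   # sample grid on [-3, 3]
print(lowest_level(f, 10.0, 8.0, xs))         # approaches min f ≈ -3.51
```

The up/down-with-halving loop itself converges quickly (it is essentially a bisection on the level c), so the whole question reduces to how expensive the `crosses` oracle is.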
The Actual Problem
The problem with this algorithm is that I am unsure whether it is always possible to determine if a plane and a function intersect. I mean, imagine a function of 50 variables: would it even be feasible to determine whether it crosses the plane y = 10? And if so, how expensive (in terms of time) would this algorithm be for finding the global minimum, compared to gradient descent?
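For a rough sense of why the naive crossing test gets expensive, consider checking the level by grid sampling, as in the toy sketch: with k sample points per coordinate, a d-dimensional grid requires k**d evaluations of f, which is astronomically large already at d = 50.

```python
# Cost of a naive grid-sampled crossing test: k samples per axis,
# d dimensions => k**d function evaluations.
k = 100
for d in (1, 2, 10, 50):
    print(d, k ** d)   # at d = 50 this is 10**100 evaluations
```

This is only the naive oracle, of course; whether a fundamentally cheaper crossing test exists is exactly what the question asks.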