We consider the minimization problem for a nonconvex function with Lipschitz continuous gradient on a proximally smooth (possibly nonconvex) subset of a finite-dimensional Euclidean space. We introduce
an error bound condition with exponent α ∈ (0, 1] for the gradient mapping.
Under this condition, it is shown that the standard gradient projection
algorithm converges to a solution of the problem linearly or sublinearly,
depending on the value of the exponent α. The paper is purely theoretical.
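For orientation, one common way to formalize these objects, in generic notation that may differ from the paper's (the feasible set $S$, step size $t>0$, gradient mapping $G_t$, solution set $X^*$, and constant $\mu>0$ are assumptions of this sketch), is

% Sketch in generic notation: gradient projection step, gradient mapping,
% and an error bound condition with exponent alpha; S, t, G_t, X^*, mu
% are assumed names, not necessarily those used in the paper.
\[
  x_{k+1} = P_S\bigl(x_k - t\,\nabla f(x_k)\bigr),
  \qquad
  G_t(x) = \tfrac{1}{t}\bigl(x - P_S\bigl(x - t\,\nabla f(x)\bigr)\bigr),
\]
\[
  \operatorname{dist}\bigl(x, X^*\bigr) \le \mu\,\|G_t(x)\|^{\alpha}
  \quad \text{for all } x \in S,
\]
where $P_S$ denotes the metric projection onto $S$.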