The position is first chosen randomly from the search-space and then updated iteratively according to the following formula, regardless of fitness improvement:

x(t+1) = x(t) − η ∇f(x(t))
As shown in the preceding formula, η is the step-size. When the task is a minimization problem, the descent direction is followed; that is, we subtract the gradient from the current position instead of adding it, as we would have done when ascending in a maximization problem.
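The update rule can be sketched in a few lines of Python. This is a minimal illustration, not the text's own implementation; the function names, the bounds of the search-space, and the quadratic test objective are all assumptions made for the example.

```python
import random

def gradient_descent(grad, dim, step_size=0.1, iters=100, bounds=(-5.0, 5.0)):
    # Start from a random position in the search-space.
    x = [random.uniform(*bounds) for _ in range(dim)]
    for _ in range(iters):
        g = grad(x)
        # Minimization: subtract the gradient to follow the descent direction
        # (adding it instead would ascend, as in a maximization problem).
        x = [xi - step_size * gi for xi, gi in zip(x, g)]
    return x

# Example objective: f(x) = sum(x_i^2), whose gradient is 2 * x.
result = gradient_descent(lambda x: [2 * xi for xi in x], dim=2)
```

Note that the step is applied every iteration regardless of whether the fitness improved, exactly as the formula prescribes; a too-large step-size can therefore overshoot the minimum.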