Rapidly Convergent Steffensen-based Methods for Unconstrained Optimization

Authors: M. Afzalinejad
Journal: RAIRO-Operations Research
Presented under the university's name: Yes
Article type: Original Research
Publication date: 2018
Journal rank: Scientific-Research
Journal type: Print
Country of publication: Iran
Journal index: ISI

Abstract

A difficulty with rapidly convergent methods for unconstrained optimization, such as Newton's method, is the computational cost arising especially from the second derivative. In this paper, a class of methods for solving unconstrained optimization problems is proposed that implicitly applies approximations to derivatives. This class is based on a modified Steffensen method for finding roots of a function and builds a quadratic model of the objective without using its second derivative. Two computationally inexpensive methods of this kind are proposed that use only the first derivative of the function. Derivative-free versions of these methods are also suggested for cases where gradient formulas are unavailable or difficult to evaluate. Both the theory and numerical experiments confirm the rapid convergence of this class of methods.
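To illustrate the general idea behind this class of methods (not the authors' specific modified schemes, which the abstract does not detail), the minimal sketch below applies the classical Steffensen root-finding iteration to the gradient of a one-dimensional objective: the divided difference (g(x + g(x)) - g(x)) / g(x) plays the role of the second derivative, so the step has Newton-like form without evaluating f''. All names here are illustrative.

```python
def steffensen_min(grad, x0, tol=1e-10, max_iter=100):
    """Classical Steffensen iteration on g(x) = f'(x) = 0 (1-D sketch).

    Illustrative only: the paper proposes modified Steffensen methods,
    whose exact formulas are not given in the abstract.
    """
    x = x0
    for _ in range(max_iter):
        g = grad(x)
        if abs(g) < tol:           # stationary point found
            break
        denom = grad(x + g) - g    # divided difference replaces f''(x)
        if denom == 0.0:
            break                  # degenerate step; stop rather than divide by zero
        x = x - g * g / denom      # Newton-like step without the second derivative
    return x

# Example: minimize f(x) = (x - 2)**2 + 1, whose gradient is 2*(x - 2).
print(steffensen_min(lambda x: 2.0 * (x - 2.0), x0=0.0))  # converges to ~2.0
```

A derivative-free variant, in the spirit the abstract describes, would replace `grad` with a finite-difference approximation of f', so that neither gradient nor Hessian formulas are needed.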
