On the influence of center-Lipschitz conditions in the convergence analysis of multi-point iterative methods

The aim of this article is to extend the local as well as the semilocal convergence analysis of multi-point iterative methods using center-Lipschitz conditions in combination with our idea of the restricted convergence region. It turns out that in this way a finer convergence analysis for these methods is obtained than in earlier works, and without additional hypotheses. Numerical examples favoring our technique over earlier ones complete this article. AMS Subject Classification: 65F08, 37F50, 65N12.


Introduction
Let X, Y be Banach spaces and Ω ⊂ X be a nonempty open set. By B(X, Y) we denote the space of bounded linear operators from X into Y. Let also U(w, d) be the open ball centered at w ∈ X with radius d > 0, and Ū(w, d) be its closure.
Many problems from diverse disciplines such as Mathematics, Optimization, Mathematical Programming, Chemistry, Biology, Physics, Economics, Statistics, Engineering and other areas [?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?] can be reduced to finding a solution x* of the equation H(x) = 0, where H : Ω −→ Y is a continuous operator. Since a unique solution x* of equation (??) in a neighborhood of some initial data x0 can be obtained in closed form only in special cases, researchers construct iterative methods which generate a sequence converging to x*. The most widely used iterative method is Newton's, defined for each n = 0, 1, 2, . . . by x(n+1) = x(n) − H′(x(n))−1 H(x(n)). The order of convergence is an important concern when dealing with iterative methods. In general, the computational cost increases as the convergence order increases. That is why researchers and practitioners have developed iterative methods that on the one hand avoid the computation of derivatives and on the other hand achieve a high order of convergence.
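Newton's iteration stated above can be sketched in a few lines for a finite-dimensional system; the test function, Jacobian and starting point below are hypothetical illustrations, not taken from this article:

```python
import numpy as np

def newton(H, dH, x0, tol=1e-12, max_iter=50):
    """Newton's method: x_{n+1} = x_n - H'(x_n)^{-1} H(x_n)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        # Solve the linear system H'(x) s = H(x) instead of inverting H'(x).
        step = np.linalg.solve(dH(x), H(x))
        x = x - step
        if np.linalg.norm(step) < tol:
            break
    return x

# Hypothetical example: H(x) = (x1^2 - 2, x2^2 - 3), root (sqrt(2), sqrt(3)).
H = lambda x: np.array([x[0] ** 2 - 2.0, x[1] ** 2 - 3.0])
dH = lambda x: np.diag([2.0 * x[0], 2.0 * x[1]])
root = newton(H, dH, [1.0, 1.0])
```

Solving the linear system at each step, rather than forming the inverse, is the standard numerically stable way to realize the iteration.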
We consider the multi-step iterative method (??), defined for each n = 0, 1, 2, . . . The semi-local convergence of method (??) was given in [?]. It is well known that, in general, as the convergence order increases the convergence region decreases. To avoid this problem, we introduce a center-Lipschitz-type condition that helps us determine a region, at least as small as before, containing the iterates {un}. This way the resulting Lipschitz constants are at least as small, and a tighter convergence analysis is obtained. In earlier works the order of convergence was shown using Taylor expansions and conditions reaching up to the derivative of H of order k + 1, although these derivatives do not appear in the method.
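Method (??) itself is elided above; as a hedged illustration of how a k-step method of this family trades derivative evaluations for order, the sketch below freezes the derivative at the outer iterate and reuses it for k inner corrections (the scheme, test function and constants are assumptions for illustration only):

```python
import numpy as np

def multistep(H, dH, x0, k=3, tol=1e-12, max_iter=50):
    """Generic k-step frozen-derivative scheme (illustrative only):
    the Jacobian is evaluated once per outer step and reused k times."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        J = dH(x)              # derivative frozen at the outer iterate
        y = x.copy()
        for _ in range(k):     # k inner corrections reuse the same J
            y = y - np.linalg.solve(J, H(y))
        if np.linalg.norm(y - x) < tol:
            return y
        x = y
    return x

# Hypothetical scalar test written as a 1-D system: H(x) = x^3 - 8, root 2.
H = lambda x: np.array([x[0] ** 3 - 8.0])
dH = lambda x: np.array([[3.0 * x[0] ** 2]])
root = multistep(H, dH, [3.0])
```

Freezing the derivative is what lets such methods raise the order without computing higher derivatives, which is exactly why conditions on the (k + 1)-st derivative in earlier analyses are undesirable.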
As an academic example: obviously ϕ(x) is not bounded on Ω, so the convergence of method (??) is not guaranteed by the analyses in [?, ?, ?].
The rest of the article is organized as follows: Section 2 contains the local convergence analysis, Section 3 the semi-local convergence analysis, and Section 4 the numerical examples, followed by concluding remarks.

Local convergence
Let L0 > 0, L > 0 and L1 ≥ 1 be parameters. Define the scalar quadratic polynomial p by (2.1). The discriminant D of p is positive, so p has real roots s1 and s2 with 0 < s1 < s2 by Descartes' rule of signs. Define also the parameters below. Notice that γ ∈ (0, 1], since p(s1) = 0. The local convergence analysis of method (??) uses the conditions (A):
(a1) H : Ω −→ Y is a differentiable operator in the sense of Fréchet and there exists x* ∈ Ω such that H(x*) = 0 and H′(x*)−1 ∈ B(Y, X).
(a5) There exists a set Ω1 ⊆ Ω. Based on the preceding conditions and notation, we can show a local convergence result for method (??).
THEOREM 2.1 Under the conditions (A), further assume that u0 ∈ B(x*, s1) − {x*}. Then lim n−→∞ un = x*, and the following estimates hold. Moreover, the point x* is the unique solution of equation H(x) = 0 in the set Ω1 given in (a5).
REMARK 2.2 (a) In view of (a2) we can write the estimate below, so the second condition in (a3) can be dropped, and we choose accordingly. It follows from the definitions of s1 and rA that s1 < rA. That is, the radius of convergence s1 cannot be larger than the radius of convergence rA of Newton's method obtained by us in [?, ?, ?, ?, ?].
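In the cited works of the first author, the Newton radius referred to above has the closed form rA = 2/(2L0 + L), where L0 is the center-Lipschitz constant and L the restricted Lipschitz constant; taking that form as an assumption (the display is elided above), a small numerical sketch with hypothetical constants shows how smaller restricted-region constants enlarge the radius:

```python
def r_A(L0, L):
    """Newton convergence radius r_A = 2 / (2 L0 + L), assumed form."""
    return 2.0 / (2.0 * L0 + L)

# Hypothetical constants: L holds on the restricted region Omega_0,
# Lbar is the (larger) constant holding on all of Omega.
L0, L, Lbar = 2.0, 3.0, 4.0
r_restricted = r_A(L0, L)     # radius from restricted-region constants
r_full = r_A(L0, Lbar)        # radius from the full-region constant
```

Since rA is decreasing in each constant, replacing Lbar by the smaller L can only enlarge the radius, mirroring the inequality s̄1 ≤ s1 derived in part (c) of this remark.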
(c) The local convergence of method (??) was not studied in [?]. But if it were, call s̄1 the smallest positive solution of p̄(t) = 0, where L̄ and L̄1 are the constants for the conditions in (a3) holding on Ω. But we have L ≤ L̄ and L1 ≤ L̄1, since Ω0 ⊂ Ω. Hence, we have s̄1 ≤ s1. (2.25) Moreover, if strict inequality holds in (??) or (??), then s̄1 < s1. Furthermore, by (??), our error bounds are more precise than the ones using L̄, L̄1 and γ̄. Hence, we have expanded the applicability of method (??) in the local convergence case.
In a similar way, we improve the semi-local convergence analysis of method (??) given in [?]. This work is presented in the next section.

Semi-local convergence
We need the following auxiliary result on majorizing sequences for method (??).
LEMMA 3.1 Let K0 > 0, K > 0 and r_0^1 ≥ 0 be parameters. Denote by δ the unique root in the interval (0, 1) of the polynomial ϕ given by (??). Define the sequence {qn} for each n = 0, 1, 2, . . . and i = 1, 2, . . . , k − 1 by (??) and (??). Moreover, suppose that (??) holds. Then, the sequence {qn} is increasing, bounded from above by q** = r_0^1/(1 − δ), and converges to its unique least upper bound q* satisfying q1 ≤ q* ≤ q**.
Proof. Replace tn, s_n^i, L0, L, α in [?] by qn, r_n^i, K0, K, δ.
Next, we present the semi-local convergence analysis of method (??).
THEOREM 3.2 Let H : Ω −→ Y be a continuously differentiable operator in the sense of Fréchet and [·, ·; H] : Ω −→ B(X, Y) be a divided difference of order one of H. Suppose there exist x0 ∈ Ω and K0 > 0 such that (??) holds. Set Ω2 = Ω ∩ B(x0, 1/(2K0)). Moreover, suppose that (??) holds for each x, y, z, w ∈ Ω2, and that the hypotheses of Lemma 3.1 hold. Then, the sequence {un} generated by method (??) converges to a solution u* of equation H(x) = 0. Moreover, u* is the unique solution of equation H(x) = 0 in B(x0, q*).
for each x, y, z, w ∈ Ω, some L > 0 was used in [?] instead of (??). But we have K ≤ L, since Ω2 ⊆ Ω. Denote by q̄n, r̄_n^i the majorizing sequences used in [?], defined as the sequences qn, r_n^i but with K0 = L0 and K replaced by L. Then we have, by a simple induction argument, that qn ≤ q̄n (3.15) and q* ≤ q̄*. (3.20) Moreover, if K < L, then (??)–(??) hold as strict inequalities. Let us consider the set Ω3 given by (??). Notice that Ω3 ⊆ Ω2, so λ ≤ K. Then Ω3, (??) and λ can replace Ω2, (??) and K, respectively, in Theorem 3.2. Clearly, the majorizing sequence corresponding to {tn}, call it {t̃n}, is even tighter than {tn}. Hence, we have extended the applicability of method (??) in the semi-local convergence analysis too. These improvements are derived under the same conditions as in [?], since the computation of L is included in the computation of K as a special case. Examples where the new constants are smaller than the older ones can be found in the numerical section that follows and in [?, ?, ?, ?, ?].
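The recursion defining {qn} is elided in Lemma 3.1; as a hedged stand-in, the classical Newton-type majorizing recursion below (with hypothetical values of η, K0, K and L) illustrates the monotone comparison qn ≤ q̄n and q* ≤ q̄* discussed above — a smaller Lipschitz constant produces a tighter limit:

```python
def majorizing(eta, C0, C, n_steps=30):
    """Newton-type majorizing sequence (illustrative form only, not the
    elided recursion of Lemma 3.1):
        t_0 = 0,  t_1 = eta,
        t_{n+1} = t_n + C (t_n - t_{n-1})^2 / (2 (1 - C0 t_n)).
    Returns an approximation of the limit t*."""
    t_prev, t = 0.0, eta
    for _ in range(n_steps):
        t_prev, t = t, t + C * (t - t_prev) ** 2 / (2.0 * (1.0 - C0 * t))
    return t

# Hypothetical constants: K <= L on the restricted region, shared K0 = L0.
eta, K0, K, L = 0.1, 1.0, 1.5, 3.0
q_star = majorizing(eta, K0, K)      # limit with the tighter constant K
qbar_star = majorizing(eta, K0, L)   # limit with the larger constant L of [?]
```

Since every increment of the recursion is increasing in the constant C, an induction on n gives qn ≤ q̄n term by term, which is exactly the comparison argument invoked for (3.15) and (3.20).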

Numerical examples
We present the following examples to test the convergence criteria. Define the divided difference by (??). For points u = (u1, u2, u3)T, the Fréchet derivative is given by (??). Using the maximum row norm and (a3)–(a4), and since H′(x*) = diag(1, 1, 1), we can define the parameters for method (??). Then, we get by (??)–(??) and (??) that the convergence criteria are satisfied.
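The system defining H in this example is elided; a classical three-dimensional test problem used in this literature (an assumption here, not necessarily the authors' system) has solution x* = (0, 0, 0)T and H′(x*) = diag(1, 1, 1), which can be verified directly:

```python
import numpy as np

# Hypothetical test system on the unit ball (assumed, not from the article):
#   H(u) = (e^{u1} - 1, ((e - 1)/2) u2^2 + u2, u3)^T,  x* = (0, 0, 0)^T.
e = np.e

def H(u):
    return np.array([np.exp(u[0]) - 1.0,
                     0.5 * (e - 1.0) * u[1] ** 2 + u[1],
                     u[2]])

def dH(u):
    """Frechet derivative (Jacobian) of H."""
    return np.array([[np.exp(u[0]), 0.0, 0.0],
                     [0.0, (e - 1.0) * u[1] + 1.0, 0.0],
                     [0.0, 0.0, 1.0]])

x_star = np.zeros(3)
# At the solution the derivative is the identity: H'(x*) = diag(1, 1, 1),
# so Lipschitz parameters can be read off in the maximum row norm.
```

With this system the usual center and restricted Lipschitz constants work out to expressions in e (for instance L0 = e − 1 < L in the cited works), which is how examples of strict inequality between the old and new constants arise.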

Conclusion
Our idea of the restricted convergence region, in connection with the center-Lipschitz condition, was utilized to provide a local as well as a semilocal convergence analysis of method (??). Since we locate a region at least as small as in earlier works [?] containing the iterates, the new Lipschitz parameters are also at least as small. This technique leads to a finer convergence analysis (see also Remark ??, Remark ?? and the numerical examples). The novelty of the paper lies not only in the introduction of the new idea but also in the fact that the improvements are obtained using special cases of the Lipschitz parameters appearing in [?]. Hence, no work in addition to [?] is needed to arrive at these developments. This idea can be used along the same lines to extend the applicability of other iterative methods appearing in [?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?].