Banach spaces
Published in A First Course in Functional Analysis, 2017
Orr Moshe Shalit
The usual (norm) convergence in a normed space is sometimes called strong convergence in discussions where confusion might otherwise arise. It is clear from the definitions that strong convergence implies weak convergence, but the converse does not hold in general.
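As a standard illustration (not part of the excerpt above), the orthonormal basis $(e_n)$ of $\ell^2$ converges weakly to zero but not in norm:

```latex
% Counterexample: weak convergence without strong convergence.
% For every y in \ell^2, Bessel's inequality gives \sum_n |\langle e_n, y\rangle|^2 \le \|y\|^2,
% hence the inner products tend to zero, yet each e_n has norm one.
\[
  \langle e_n, y \rangle \xrightarrow[n\to\infty]{} 0 \quad \text{for all } y \in \ell^2,
  \qquad \text{while} \qquad \|e_n - 0\| = 1 \ \text{ for every } n,
\]
% so (e_n) converges weakly to 0 but does not converge strongly.
```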
Bounded perturbation resilience of a regularized forward-reflected-backward splitting method for solving variational inclusion problems with applications
Published in Optimization, 2023
It is well known that in infinite dimensional Hilbert spaces, strong convergence, that is, convergence in the norm, is more desirable than weak convergence. Motivated by Malitsky and Tam [6], Hieu et al. [21], Dixit et al. [8], and Censor et al. [15], the first purpose of the present paper is to propose a regularized FRBSM algorithm for solving VI (1) and prove its strong convergence. More precisely, we seek a solution of VI (1) that satisfies an additional requirement involving a γ-strongly monotone, Lipschitz continuous operator. Secondly, we prove that the regularized FRBSM has the bounded perturbation resilience property.
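For orientation, the following is a minimal sketch of the basic forward-reflected-backward iteration of Malitsky and Tam [6] applied to a variational inequality over a closed convex set C, where the resolvent reduces to the metric projection onto C. It is not the regularized FRBSM of the excerpt; the affine operator, box constraint, and step size are illustrative assumptions.

```python
import numpy as np

def project_box(x, lo, hi):
    """Euclidean projection onto the box [lo, hi]^n (the resolvent of the
    normal-cone operator of the box)."""
    return np.clip(x, lo, hi)

def frb_vi(A, x0, lam, project, iters=500):
    """Basic forward-reflected-backward iteration (Malitsky and Tam [6]) for
    the variational inequality: find x in C with <A(x), y - x> >= 0 for all
    y in C.  This is the plain (weakly convergent) scheme, not the regularized
    FRBSM of the excerpt; lam should satisfy lam < 1/(2L) for an L-Lipschitz
    monotone A."""
    x_prev = x0.copy()
    x = project(x0 - lam * A(x0))          # one forward-backward step to start
    for _ in range(iters):
        # reflected forward step uses A at the current and the previous iterate
        x_next = project(x - 2.0 * lam * A(x) + lam * A(x_prev))
        x_prev, x = x, x_next
    return x

# Toy problem (illustrative assumption): A(x) = M x + q with M positive
# definite, feasible set C = [0, 1]^n.
rng = np.random.default_rng(0)
n = 5
B = rng.standard_normal((n, n))
M = B @ B.T + np.eye(n)                     # symmetric positive definite
q = rng.standard_normal(n)
L = np.linalg.norm(M, 2)                    # Lipschitz constant of A
A = lambda x: M @ x + q

sol = frb_vi(A, np.zeros(n), lam=0.4 / L,
             project=lambda z: project_box(z, 0.0, 1.0))
print(sol)
```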
Strong convergence of two regularized relaxed extragradient schemes for solving the split feasibility and fixed point problem with multiple output sets
Published in Applicable Analysis, 2023
Adeolu Taiwo, Simeon Reich, Chinedu Izuchukwu
For some other algorithms proposed for solving the SFFPP, see [23, 24]. However, we note that the aforementioned algorithms proposed for solving the SFFPP involve the computation of four metric projections per iteration. Even in the case where the projections onto the feasible sets C and Q can be computed easily, computing four metric projections per iteration will increase the computational costs of the algorithm. Thus it is advantageous to reduce the number of projections per iteration. The authors of [25] worked in this direction, and by using a regularized relaxed extragradient algorithm, they proved a weak convergence theorem. However, generally speaking, in infinite dimensional Hilbert spaces, strong convergence, that is, convergence in the norm, is more desirable than weak convergence. Therefore our purpose in the present paper is to propose regularized relaxed algorithms that will converge strongly to a solution of the SFFPP.
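To see why metric projections dominate the per-iteration cost, here is a hedged sketch of the classical CQ algorithm for the plain split feasibility problem (find x in C with Ax in Q), which uses two projections per iteration. It is not the regularized relaxed extragradient schemes of the excerpt, and it does not cover the multiple-output-sets setting; the matrix and the sets C and Q below are illustrative assumptions.

```python
import numpy as np

def cq_algorithm(A, proj_C, proj_Q, x0, iters=1000):
    """Classical CQ algorithm for the split feasibility problem: find x in C
    with A x in Q.  Each iteration is a gradient step on
    (1/2)||A x - P_Q(A x)||^2 followed by a projection onto C, so the cost per
    iteration is one projection onto C and one onto Q."""
    gamma = 1.0 / np.linalg.norm(A, 2) ** 2   # step size in (0, 2/||A||^2)
    x = x0.copy()
    for _ in range(iters):
        Ax = A @ x
        x = proj_C(x - gamma * A.T @ (Ax - proj_Q(Ax)))
    return x

# Toy data (illustrative assumptions): C = closed unit ball, Q = nonnegative orthant.
rng = np.random.default_rng(1)
A = rng.standard_normal((4, 6))
proj_C = lambda z: z / max(1.0, np.linalg.norm(z))   # projection onto unit ball
proj_Q = lambda y: np.maximum(y, 0.0)                # projection onto R^4_+
x = cq_algorithm(A, proj_C, proj_Q, x0=rng.standard_normal(6))
print(x, A @ x)
```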
Weak convergence for variational inequalities with inertial-type method
Published in Applicable Analysis, 2022
Yekini Shehu, Olaniyi S. Iyiola
Our proposed method in this paper gives weak convergence results in infinite dimensional Hilbert space. There exist strongly convergent methods in the literature for solving the variational inequality problem in infinite dimensional Hilbert space (see, for example, [10,12,17,18,47–50]). These methods use ideas of viscosity terms, Halpern iterations and hybrid methods. It has been shown numerically in [47] that viscosity and Halpern-type strongly convergent methods outperform hybrid methods. Nonetheless, the proposed viscosity and Halpern-type strongly convergent methods involve an iterative parameter that is both diminishing and non-summable. These conditions on the iterative parameter make the viscosity and Halpern-type strongly convergent methods slower than our proposed method in this paper in terms of the number of iterations and CPU time.
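As a minimal sketch of the parameter conditions mentioned above (a toy nonexpansive map is assumed; this is not any of the cited methods), a Halpern iteration anchors each step at a fixed point u with a parameter a_k that is diminishing and non-summable, for example a_k = 1/(k+2):

```python
import numpy as np

def halpern(T, u, x0, iters=2000):
    """Halpern iteration x_{k+1} = a_k * u + (1 - a_k) * T(x_k) with the
    anchor parameter a_k = 1/(k+2): diminishing (a_k -> 0) and non-summable
    (sum_k a_k = infinity), the two conditions referred to in the excerpt."""
    x = x0.copy()
    for k in range(iters):
        a_k = 1.0 / (k + 2)
        x = a_k * u + (1.0 - a_k) * T(x)
    return x

# Toy nonexpansive map (illustrative assumption): T = projection onto the
# half-space {x : <c, x> <= b}.  The Halpern iterates converge strongly to
# the fixed point of T closest to the anchor u.
c = np.array([1.0, 1.0]); b = 1.0
def T(x):
    v = c @ x - b
    return x - max(v, 0.0) * c / (c @ c)

u = np.array([3.0, 0.0])
print(halpern(T, u, x0=np.zeros(2)))   # approaches (2, -1), the projection of u
```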