Boukrouche, Mahdi; Tarzia, Domingo - In: Computational Optimization and Applications 53 (2012) 2, pp. 375-393
First, let $u_g$ be the unique solution of an elliptic variational inequality with source term $g$. We establish, in the general case, the error estimate between $u_3(\mu)=\mu u_{g_1}+(1-\mu)u_{g_2}$ and $u_4(\mu)=u_{\mu g_1+(1-\mu)g_2}$ for $\mu\in[0,1]$. Secondly, we consider a...