Faculty Publications

Permanent URI for this community: https://idr.nitk.ac.in/handle/123456789/18736

Publications by NITK Faculty

Search Results

Now showing 1 - 8 of 8
  • Item
    Ball Convergence of Multipoint Methods for Non-linear Systems
    (Springer Science and Business Media Deutschland GmbH, 2021) Argyros, I.K.; George, S.; Erappa, S.M.
    We study multipoint methods using only the first derivative. Earlier studies use derivatives of order higher than three that do not actually appear in the methods. Moreover, Lipschitz constants are used to find error estimates not presented in earlier papers. Numerical examples complete this paper. © 2021, Springer Nature Singapore Pte Ltd.
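    The abstract does not reproduce the schemes themselves. As a minimal sketch (function names and the test system are illustrative, not taken from the paper), a representative multipoint method for a nonlinear system is the two-step Newton-type iteration, which evaluates only the first derivative (the Jacobian):

```python
import numpy as np

def two_step_newton(F, J, x, tol=1e-12, max_iter=50):
    """Two-step (multipoint) Newton-type iteration for F(x) = 0.

    Each sweep solves two linear systems with the Jacobian only;
    no higher-order derivatives are evaluated.
    """
    for _ in range(max_iter):
        y = x - np.linalg.solve(J(x), F(x))       # predictor: Newton step at x
        x_new = y - np.linalg.solve(J(y), F(y))   # corrector: Newton step at y
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Illustrative system: x0^2 + x1^2 = 1 and x0 = x1,
# with solution (1/sqrt(2), 1/sqrt(2))
F = lambda x: np.array([x[0]**2 + x[1]**2 - 1.0, x[0] - x[1]])
J = lambda x: np.array([[2*x[0], 2*x[1]], [1.0, -1.0]])
root = two_step_newton(F, J, np.array([1.0, 0.5]))
```

    Each sweep reuses the same Jacobian-solve machinery twice, which is what gives such schemes a higher convergence order than plain Newton without any higher derivatives.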
  • Item
    Extended convergence of Gauss-Newton's method and uniqueness of the solution
    (SINUS Association, 2018) Argyros, I.K.; Cho, Y.J.; George, S.
    The aim of this paper is to extend the applicability of the Gauss-Newton method for solving nonlinear least squares problems using our new idea of restricted convergence domains. The new technique uses tighter Lipschitz functions than in earlier papers, leading to a tighter ball convergence analysis. © 2018, SINUS Association. All rights reserved.
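    For context, the underlying Gauss-Newton iteration for minimizing a residual norm ||r(x)||^2 solves the normal equations with the first derivative (Jacobian) of the residual at each step. A minimal sketch follows; the fitting problem and names are illustrative assumptions, not from the paper:

```python
import numpy as np

def gauss_newton(r, J, x, tol=1e-10, max_iter=100):
    """Gauss-Newton iteration for min ||r(x)||^2.

    Each step solves the normal equations J^T J dx = -J^T r,
    using only the Jacobian of the residual r.
    """
    for _ in range(max_iter):
        Jx, rx = J(x), r(x)
        dx = np.linalg.solve(Jx.T @ Jx, -Jx.T @ rx)
        x = x + dx
        if np.linalg.norm(dx) < tol:
            break
    return x

# Illustrative zero-residual fit: model y = a * exp(b * t),
# data generated by a = 1, b = log(2)
t = np.array([0.0, 1.0, 2.0])
y = np.array([1.0, 2.0, 4.0])
r = lambda p: p[0] * np.exp(p[1] * t) - y
J = lambda p: np.column_stack([np.exp(p[1] * t),
                               p[0] * t * np.exp(p[1] * t)])
p = gauss_newton(r, J, np.array([0.5, 0.5]))
```

    The convergence domain of this iteration is exactly what the paper's restricted-domain technique enlarges.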
  • Item
    Enlarging the ball convergence for the modified Newton method to solve equations with solutions of multiplicity under weak conditions
    (Global Science Press, 2018) Argyros, I.K.; George, S.
    The objective of this paper is to enlarge the ball of convergence and improve the error bounds of the modified Newton method for solving equations with solutions of multiplicity under weak conditions. © 2018 Global-Science Press.
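    The modified Newton method in question is the classical scheme x_{n+1} = x_n - m f(x_n)/f'(x_n) for a root of known multiplicity m. A minimal sketch (the function names and example are illustrative, not taken from the paper):

```python
def modified_newton(f, fprime, x, m, tol=1e-12, max_iter=100):
    """Modified Newton method for a root of known multiplicity m:

        x_{n+1} = x_n - m * f(x_n) / f'(x_n)

    The factor m restores quadratic convergence at a multiple root,
    where plain Newton slows to linear convergence.
    """
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break  # residual small enough; also avoids 0/0 at the root
        x -= m * fx / fprime(x)
    return x

# Illustrative example: f(x) = (x - 2)^3 has a root of multiplicity 3 at x = 2
f = lambda x: (x - 2.0) ** 3
fp = lambda x: 3.0 * (x - 2.0) ** 2
root = modified_newton(f, fp, 3.0, m=3)
```

    The paper's contribution concerns how large a ball around such a multiple root guarantees convergence of this iteration, under weaker conditions than before.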
  • Item
    Local convergence of a novel eighth order method under hypotheses only on the first derivative
    (Tusi Mathematical Research Group (TMRG), 2019) Argyros, I.K.; George, S.; Erappa, S.M.
    We expand the applicability of the eighth order iterative method studied by Jaiswal in order to approximate a locally unique solution of an equation in a Banach space setting. We provide a local convergence analysis using only hypotheses on the first Fréchet derivative. Moreover, we provide computable convergence radii, error bounds, and uniqueness results. Numerical examples computing the radii of the convergence balls, as well as examples where earlier results cannot be applied to solve equations but our results can, are also given in this study. © 2019 Khayyam Journal of Mathematics.
  • Item
    Ball comparison for three optimal eighth order methods under weak conditions
    (Babes-Bolyai University, 2019) Argyros, I.K.; George, S.
    We consider three optimal eighth order methods for solving nonlinear equations. In earlier studies, Taylor expansions and hypotheses reaching up to the eighth derivative are used to prove the convergence of these methods. These hypotheses restrict the applicability of the methods. In our study we use hypotheses only on the first derivative. Numerical examples illustrating the theoretical results are also presented in this study. © 2019, Babes-Bolyai University.
  • Item
    Ball convergence of a novel bi-parametric iterative scheme for solving equations
    (International Publications, 2020) Argyros, I.K.; George, S.
    The aim of this article is to establish a ball convergence result for a bi-parametric iterative scheme for solving equations involving Banach space-valued operators. In contrast to earlier approaches in the less general setting of k-dimensional Euclidean space, where hypotheses on the seventh derivative are used, we use only hypotheses on the first derivative. Hence, we extend the applicability of the method. Moreover, the radius of convergence as well as error bounds on the distances are given based on Lipschitz-type functions. Numerical examples are given to test our conditions. These examples show that earlier convergence conditions are not satisfied but ours are. © 2020, International Publications. All rights reserved.
  • Item
    Direct comparison between two third convergence order schemes for solving equations
    (MDPI AG, 2020) Regmi, S.; Argyros, I.K.; George, S.
    We provide a comparison between two schemes for solving equations on a Banach space. Comparisons between schemes of the same convergence order have been given using numerical examples, which can go in favor of either scheme. However, we do not know in advance, under the same set of conditions, which scheme has the largest ball of convergence, tighter error bounds, or the best information on the location of the solution. We present a technique that allows us to achieve this objective. Numerical examples are also given to further justify the theoretical results. Our technique can be used to compare other schemes of the same convergence order. © 2020 by the authors. Licensee MDPI, Basel, Switzerland.
  • Item
    Ball Comparison Between Four Fourth Convergence Order Methods Under the Same Set of Hypotheses for Solving Equations
    (Springer, 2021) Argyros, I.K.; George, S.
    There is a plethora of techniques used to generate iterative methods, but the convergence order is typically determined by assuming the existence of higher order derivatives of the operator involved. Moreover, these techniques do not provide estimates on error distances or results on the uniqueness of the solution based on Lipschitz or Hölder type conditions. Hence, the usefulness of these schemes is very restricted. We deal with these challenges using only the first derivative, which is the only derivative actually appearing in these schemes, and under the same set of conditions. Moreover, we provide a computable ball comparison between these schemes; this is how we extend these methods under weaker conditions. Numerical experiments are conducted to find the convergence balls and test the criteria of convergence. Different sets of criteria, usually based on the fifth derivative, are otherwise needed in the ball convergence of fourth order methods. These methods are then compared using numerical examples. But we do not know: whether the results of those comparisons remain true if the examples change; the largest radii of convergence; or computable error estimates on ‖xn − x∗‖ and uniqueness results. We conclude that DSNM is the best among these four methods. Examples are used to compare the results. Our ideas can be utilized to make comparisons between other methods. © 2021, The Author(s), under exclusive licence to Springer Nature India Private Limited part of Springer Nature.