[1] Abubakar, A.B., Kumam, P., Mohammed, H. and Sitthithakerngkiet, K. A modified Fletcher-Reeves conjugate gradient method for monotone nonlinear equations with some applications, Mathematics, 7 (745) (2019), 1–25.
[2] Ahmed, K., Waziri, M.Y. and Halilu, A.S. On two symmetric Dai-Kou type schemes for constrained monotone equations with image recovery application, EURO J. Comput. Optim. 11 (2023), 100057, 1–32.
[3] Ahmed, K., Waziri, M.Y., Halilu, A.S. and Murtala, S. Sparse signal reconstruction via Hager-Zhang-type schemes for constrained system of nonlinear equations, Optimization, 73 (6) (2024), 1949–1980.
[4] Ahmed, K., Waziri, M.Y., Halilu, A.S., Murtala, S. and Sabi’u, J. Another Hager-Zhang type method via singular value study for constrained monotone equations with application, Numer. Algor. 96 (4) (2024), 1583–1623.
[5] Ahmed, K., Waziri, M.Y., Murtala, S., Halilu, A.S. and Sabi’u, J. On a scaled symmetric Dai-Liao-type scheme for constrained system of nonlinear equations with applications, J. Optim. Theory Appl. 200 (2024), 669–702.
[6] Al-Baali, M. Numerical experience with a class of self-scaling quasi-Newton algorithms, J. Optim. Theory Appl. 96 (3) (1998), 533–553.
[7] Althobaiti, A., Sabi’u, J., Emadifar, H., Junsawang, P. and Sahoo, S.K. A scaled Dai-Yuan projection-based conjugate gradient method for solving monotone equations with applications, Symmetry, 14 (1401) (2023), 1–28.
[8] Andrei, N. Eigenvalues versus singular values study in conjugate gradient algorithms for large-scale unconstrained optimization, Optim. Methods Softw. 32 (3) (2017), 534–551.
[9] Andrei, N. A Dai-Liao conjugate gradient algorithm with clustering of eigenvalues, Numer. Algor. 77 (2018), 1273–1282.
[10] Andrei, N. A double parameter scaled BFGS method for unconstrained optimization, J. Comput. Appl. Math. 332 (2018), 26–44.
[11] Andrei, N. A double-parameter scaling Broyden-Fletcher-Goldfarb-Shanno method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization, J. Optim. Theory Appl. 178 (2018), 191–218.
[12] Babaie-Kafaki, S. and Aminifard, Z. Two parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length, Numer. Algor. 82 (2019), 1345–1357.
[13] Biggs, M.C. Minimization algorithms making use of non-quadratic properties of the objective function, J. Inst. Math. Appl. 8 (1971), 315–327.
[14] Broyden, C.G. A class of methods for solving nonlinear simultaneous equations, Math. Comput. 19 (1965), 577–593.
[15] Broyden, C.G. The convergence of a class of double-rank minimization algorithms, J. Inst. Math. Appl. 6 (1970), 76–90.
[16] Byrd, R.H., Liu, D.C. and Nocedal, J. On the behavior of Broyden’s class of quasi-Newton methods, SIAM J. Optim. 2 (1992), 533–557.
[17] Byrd, R. and Nocedal, J. A tool for the analysis of quasi-Newton methods with application to unconstrained minimization, SIAM J. Numer. Anal. 26 (1989), 727–739.
[18] Dai, Y.H. and Kou, C.X. A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search, SIAM J. Optim. 23 (2013), 296–320.
[19] Dai, Y.H. and Liao, L.Z. New conjugacy conditions and related nonlinear conjugate gradient methods, Appl. Math. Optim. 43 (1) (2001), 87–101.
[20] Dai, Y.H. and Yuan, Y. A nonlinear conjugate gradient method with a strong global convergence property, SIAM J. Optim. 10 (1999), 177–182.
[21] Dennis, J.E. and Schnabel, R.B. Numerical methods for unconstrained optimization and nonlinear equations, Englewood Cliffs, NJ: Prentice-Hall, 1983.
[22] Ding, Y., Xiao, Y. and Li, J. A class of conjugate gradient methods for convex constrained monotone equations, Optimization, 66 (12) (2017), 2309–2328.
[23] Dolan, E.D. and Moré, J.J. Benchmarking optimization software with performance profiles, Math. Program. 91 (2002), 201–213.
[24] Figueiredo, M., Nowak, R. and Wright, S.J. Gradient projection for sparse reconstruction: Application to compressed sensing and other inverse problems, IEEE J. Sel. Top. Signal Process. 1 (4) (2007), 586–597.
[25] Fletcher, R. and Reeves, C. Function minimization by conjugate gradients, Comput. J. 7 (1964), 149–154.
[26] Fletcher, R. A new approach to variable metric algorithms, Comput. J. 13 (1970), 317–322.
[27] Fletcher, R. Practical methods of optimization, Volume 1: Unconstrained optimization, 2nd ed., Wiley, New York, 1997.
[28] Gao, P., Zheng, W., Wang, T., Li, Y. and Li, F. Signal recovery with constrained monotone nonlinear equations, J. Appl. Anal. Comput. 13 (4) (2023), 2006–2025.
[29] Gill, P.E. and Leonard, M.W. Reduced-Hessian quasi-Newton methods for unconstrained optimization, SIAM J. Optim. 12 (2001), 209–237.
[30] Goldfarb, D. A family of variable-metric methods derived by variational means, Math. Comput. 24 (1970), 23–26.
[31] Hager, W.W. and Zhang, H. A survey of nonlinear conjugate gradient methods, Pacific J. Optim. 2 (2006), 35–58.
[32] Halilu, A.S., Majumder, A., Waziri, M.Y., Ahmed, K. and Awwal, A.M. Motion control of the two joint planar robotic manipulators through accelerated Dai-Liao method for solving system of nonlinear equations, Eng. Comput. 39 (5) (2021), 1–39.
[33] Hestenes, M.R. and Stiefel, E.L. Methods of conjugate gradients for solving linear systems, J. Research Nat. Bur. Standards, 49 (1952), 409–436.
[34] Ivanov, B., Milovanović, G.V. and Stanimirović, P.S. Accelerated Dai-Liao projection method for solving systems of monotone nonlinear equations with application to image deblurring, J. Global Optim. 85 (2023), 377–420.
[35] Kaporin, I.E. New convergence results and preconditioning strategies for the conjugate gradient methods, Numer. Linear Alg. Appl. 1(2) (1994), 179–210.
[36] Kiri, A.I., Waziri, M.Y. and Ahmed, K. A modified Liu-Storey scheme for nonlinear systems with an application to image recovery, Iran. J. Numer. Anal. Optim. 13 (1) (2023), 38–58.
[37] Kratzer, D., Parter, S.V. and Steuerwalt, M. Block splittings for the conjugate gradient method, Comput. Fluids, 11 (1983), 255–279.
[38] La Cruz, W. A spectral algorithm for large-scale systems of nonlinear monotone equations, Numer. Algor. 76 (2017), 1109–1130.
[39] La Cruz, W., Martínez, J.M. and Raydan, M. Spectral residual method without gradient information for solving large-scale nonlinear systems of equations: Theory and experiments, Technical Report RT-04-08, 2005.
[40] Liu, J.K., and Li, S.J. A projection method for convex constrained monotone nonlinear equations with applications, Comput. Math. Appl. 70 (10) (2015), 2442–2453.
[41] Liu, Y. and Storey, C. Efficient generalized conjugate gradient algorithms, Part 1: Theory, J. Optim. Theory Appl. 69 (1991), 129–137.
[42] Narushima, Y. and Yabe, H. A survey of sufficient descent conjugate gradient methods for unconstrained optimization, SUT J. Math. 50(2) (2014), 167–203.
[43] Nocedal, J. Theory of algorithms for unconstrained optimization, Acta Numer. 1 (1992), 199–242.
[44] Nocedal, J. and Wright, S.J. Numerical optimization, Springer, New York, 2006.
[45] Oren, S.S. and Luenberger, D.G. Self-scaling variable metric (SSVM) algorithms, part I: criteria and sufficient conditions for scaling a class of algorithms, Manag. Sci. 20 (1974), 845–862.
[46] Oren, S.S. and Luenberger, D.G. Self-scaling variable metric (SSVM) algorithms, part I: criteria and sufficient conditions for scaling a class of algorithms, Manag. Sci. 20 (5) (1974), 845–862.
[47] Oren, S.S. and Spedicato, E. Optimal conditioning of self-scaling variable metric algorithms, Math. Program. 10 (1) (1976), 70–90.
[48] Ortega, J.M. and Rheinboldt, W.C. Iterative solution of nonlinear equations in several variables, New York: Academic Press, 1970.
[49] Pang, J.S. Inexact Newton methods for the nonlinear complementarity problem, Math. Program. 36 (1986), 54–71.
[50] Polak, E. and Ribière, G. Note sur la convergence de méthodes de directions conjuguées, Rev. Française Informat. Recherche Opérationnelle, 3 (16) (1969), 35–43.
[51] Polyak, B.T. The conjugate gradient method in extreme problems, USSR Comp. Math. Math. Phys. 9 (1969), 94–112.
[52] Powell, M.J.D. Some global convergence properties of a variable metric algorithm for minimization without exact line search, In: Cottle, R.W., Lemke, C.E. (eds.) Nonlinear Programming, SIAM-AMS Proceedings, SIAM, Philadelphia. 9 (1976), 53–72.
[53] Shanno, D.F. Conditioning of quasi-Newton methods for function minimization, Math. Comput. 24 (1970), 647–656.
[54] Solodov, M.V. and Svaiter, B.F. A globally convergent inexact Newton method for systems of monotone equations, in: M. Fukushima, L. Qi (Eds.), Reformulation: Nonsmooth, Piecewise Smooth, Semismooth and Smoothing Methods, Kluwer Academic Publishers. (1998), 355–369.
[55] Sun, W. and Yuan, Y.X. Optimization theory and methods: Nonlinear programming, Springer, New York, 2006.
[56] Waziri, M.Y., Ahmed, K. and Halilu, A.S. A modified Dai-Kou-type method with applications to signal reconstruction and blurred image restoration, Comput. Appl. Math. 41(232) (2022), 1–33.
[57] Waziri, M.Y., Ahmed, K., Halilu, A.S. and Sabi’u, J. Two new Hager-Zhang iterative schemes with improved parameter choices for monotone nonlinear systems and their applications in compressed sensing, RAIRO Oper. Res. 56 (1) (2021), 239–273.
[58] Waziri, M.Y., Ahmed, K. and Sabi’u, J. A Dai-Liao conjugate gradient method via modified secant equation for system of nonlinear equations, Arab. J. Math. 9 (2020), 443–457.
[59] Waziri, M.Y., Ahmed, K., Sabi’u, J. and Halilu, A.S. Enhanced Dai-Liao conjugate gradient methods for systems of monotone nonlinear equations, SeMA J. 78 (2020), 15–51.
[60] Winther, R. Some superlinear convergence results for the conjugate gradient method, SIAM J. Numer. Anal. 17 (1980), 14–17.
[61] Xiao, Y., Wang, Q. and Hu, Q. Non-smooth equations based method for ℓ1-norm problems with applications to compressed sensing, Nonlinear Anal. Theory Methods Appl. 74 (11) (2011), 3570–3577.
[62] Xiao, Y. and Zhu, H. A conjugate gradient method to solve convex constrained monotone equations with applications in compressive sensing, J. Math. Anal. Appl. 405 (1) (2013), 310–319.
[63] Yin, J., Jian, J., Jiang, X., Liu, M. and Wang, L. A hybrid three-term conjugate gradient projection method for constrained nonlinear monotone equations with applications, Numer. Algor. 88 (2021), 389–418.
[64] Zhou, W.J. and Li, D.H. Limited memory BFGS methods for nonlinear monotone equations, J. Comput. Math. 25 (2007), 89–96.