[1] Abdi, A., Nabi, R.M., Sardasht, M. and Mahmood, R. Multiclass classifiers for stock price prediction: A comparison study, J. Harbin Inst. Technol. 54(3) (2022), 32–39.
[2] Angluin, D. and Laird, P. Learning from noisy examples, Mach. Learn. 2 (1988), 343–370.
[3] Bertsimas, D., Dunn, J., Pawlowski, C. and Zhuo, Y.D. Robust classification, INFORMS J. Optim. 1(1) (2019), 2–34.
[4] Biggio, B., Nelson, B. and Laskov, P. Support vector machines under adversarial label noise, Asian Conference on Machine Learning (2011), 97–112.
[5] Blanco, V., Japón, A. and Puerto, J. Robust optimal classification trees under noisy labels, Adv. Data Anal. Classif. 16(1) (2022), 155–179.
[6] Blanco, V., Japón, A. and Puerto, J. A mathematical programming approach to SVM-based classification with label noise, Comput. Ind. Eng. 172 (2022), 108611.
[7] Chen, Z., Song, A., Wang, Y., Huang, X. and Kong, Y. A noise rate estimation method for image classification with label noise, J. Phys. Conf. Ser. 2433(1) (2023), 012039.
[8] Cortes, C. and Vapnik, V.N. Support-vector networks, Mach. Learn. 20(3) (1995), 273–297.
[9] Demšar, J. Statistical comparisons of classifiers over multiple data sets, J. Mach. Learn. Res. 7 (2006), 1–30.
[10] Deng, N., Tian, Y. and Zhang, C. Support vector machines: Optimization based theory, algorithms, and extensions, CRC Press, 2012.
[11] Ding, S., Zhang, N., Zhang, X. and Wu, F. Twin support vector machine: Theory, algorithm and applications, Neural Comput. Appl. 28(11) (2017), 3119–3130.
[12] Ding, S., Zhao, X., Zhang, J., Zhang, X. and Xue, Y. A review on multi-class TWSVM, Artif. Intell. Rev. 52(2) (2019), 775–801.
[13] Duan, Y. and Wu, O. Learning with auxiliary less-noisy labels, IEEE Trans. Neural Netw. Learn. Syst. 28(7) (2017), 1716–1721.
[14] Duda, R.O., Hart, P.E. and Stork, D.G. Pattern Classification, John Wiley & Sons, 2012.
[15] Ekambaram, R., Fefilatyev, S., Shreve, M., Kramer, K., Hall, L.O. and Goldgof, D.B. Active cleaning of label noise, Pattern Recognit. 51 (2016), 463–480.
[16] Grant, M., Boyd, S. and Ye, Y. CVX: MATLAB software for disciplined convex programming, version 2.0 beta, 2013.
[17] Hassani, S.F., Eskandari, S. and Salahi, M. CInf-FS: An efficient infinite feature selection method using K-means clustering to partition large feature spaces, Pattern Anal. Appl. (2023), 1–9.
[18] Iman, R.L. and Davenport, J.M. Approximations of the critical region of the Friedman statistic, Commun. Stat. Theory Methods 9(6) (1980), 571–595.
[19] Jayadeva, Khemchandani, R. and Chandra, S. Twin support vector machines for pattern classification, IEEE Trans. Pattern Anal. Mach. Intell. 29(5) (2007), 905–910.
[20] Jimenez-Castano, C., Alvarez-Meza, A. and Orozco-Gutierrez, A. Enhanced automatic twin support vector machine for imbalanced data classification, Pattern Recognit. 107 (2020), 107442.
[21] Keerthi, S.S., Shevade, S.K., Bhattacharyya, C. and Murthy, K.R.K. Improvements to Platt's SMO algorithm for SVM classifier design, Neural Comput. 13(3) (2001), 637–649.
[22] Kshirsagar, A.P. and Shakkeera, L. Recognizing abnormal activity using multiclass SVM classification approach in tele-health care, IoT with Smart Systems: Proceedings of ICTIS 2021, Springer Singapore, 2 (2022), 739–750.
[23] Lachenbruch, P.A. Discriminant analysis when the initial samples are misclassified, Technometrics 8(4) (1966), 657–662.
[24] Lachenbruch, P.A. Note on initial misclassification effects on the quadratic discriminant function, Technometrics 21(1) (1979), 129–132.
[25] McLachlan, G.J. Asymptotic results for discriminant analysis when the initial samples are misclassified, Technometrics 14(2) (1972), 415–422.
[26] Nasiri, J.A. and Mir, A.M. An enhanced KNN-based twin support vector machine with stable learning rules, Neural Comput. Appl. 32 (2020), 12949–12969.
[27] Okamoto, S. and Yugami, N. An average-case analysis of the k-nearest neighbor classifier for noisy domains, 15th International Joint Conference on Artificial Intelligence (IJCAI) (1997), 238–245.
[28] Platt, J. Fast Training of Support Vector Machines using Sequential Minimal Optimization, MIT Press, 1998.
[29] Sahleh, A., Salahi, M. and Eskandari, S. Multi-class nonparallel support vector machine, Prog. Artif. Intell. (2023), 1–15.
[30] Tanveer, M., Rajani, T., Rastogi, R., Shao, Y.H. and Ganaie, M.A. Comprehensive review on twin support vector machines, Ann. Oper. Res. (2022), 1–46.
[31] Thulasidasan, S., Bhattacharya, T., Bilmes, J., Chennupati, G. and Mohd-Yusof, J. Combating label noise in deep learning using abstention, arXiv preprint arXiv:1905.10964 (2019).
[32] Tian, Y. and Qi, Z. Review on: twin support vector machines, Ann. Data Sci. 1 (2014), 253–277.
[33] Tian, Y., Qi, Z., Ju, X., Shi, Y. and Liu, X. Nonparallel support vector machines for pattern classification, IEEE Trans. Cybern. 44(7) (2014), 1067–1079.
[34] Vapnik, V.N. The Nature of Statistical Learning Theory, Springer, New York, 1995.
[35] Vapnik, V.N. Statistical Learning Theory, John Wiley & Sons, New York, 1998.
[36] Witoonchart, P. and Chongstitvatana, P. Application of structured support vector machine backpropagation to a convolutional neural network for human pose estimation, Neural Networks 92 (2017), 39–46.
[37] Wolsey, L.A. and Nemhauser, G.L. Integer and combinatorial optimization, John Wiley & Sons, Vol. 55, 1999.
[38] Xiao, H., Biggio, B., Nelson, B., Xiao, H., Eckert, C. and Roli, F. Support vector machines under adversarial label contamination, Neurocomputing 160 (2015), 53–62.