Show simple item record

dc.contributor.author    Becerra-Suarez, F.L.    es_PE
dc.contributor.author    Borrero-Ramírez, A.G.    es_PE
dc.contributor.author    Valencia-Castillo, E.    es_PE
dc.contributor.author    Forero, M.G.    es_PE
dc.date.accessioned    2026-02-23T14:24:31Z
dc.date.available    2026-02-23T14:24:31Z
dc.date.issued    2025
dc.identifier.uri    http://hdl.handle.net/20.500.14074/9817
dc.description.abstract    Neural networks have become a fundamental tool for solving complex problems, from image processing and speech recognition to time series prediction and large-scale data classification. However, traditional neural architectures suffer from interpretability problems due to their opaque representations and lack of explicit interaction between linear and nonlinear transformations. To address these limitations, Kolmogorov–Arnold Networks (KAN) have emerged as a mathematically grounded approach capable of efficiently representing complex nonlinear functions. Based on the principles established by Kolmogorov and Arnold, KAN offer an alternative to traditional architectures, mitigating issues such as overfitting and lack of interpretability. Despite their solid theoretical basis, practical implementations of KAN face challenges, such as optimal function selection and computational efficiency. This paper provides a systematic review that goes beyond previous surveys by consolidating the diverse structural variants of KAN (e.g., Wavelet-KAN, Rational-KAN, MonoKAN, Physics-KAN, Linear Spline KAN, and Orthogonal Polynomial KAN) into a unified framework. In addition, we emphasize their mathematical foundations, compare their advantages and limitations, and discuss their applicability across domains. From this review, three main conclusions can be drawn: (i) spline-based KAN remain the most widely used due to their stability and simplicity, (ii) rational and wavelet-based variants provide greater expressivity but introduce numerical challenges, and (iii) emerging approaches such as Physics-KAN and automatic basis selection open promising directions for scalability and interpretability. These insights provide a benchmark for future research and practical implementations of KAN.    es_PE
dc.format    application/pdf    es_PE
dc.language.iso    eng    es_PE
dc.publisher    Multidisciplinary Digital Publishing Institute (MDPI).    es_PE
dc.relation.ispartof    https://www.scopus.com/pages/publications/105018970508    es_PE
dc.relation.ispartof    urn:issn:22277390    es_PE
dc.relation.ispartof    Mathematics 2025; 13(19): 3128    es_PE
dc.rights    info:eu-repo/semantics/openAccess    es_PE
dc.rights.uri    http://creativecommons.org/licenses/by/4.0/    es_PE
dc.subject    Kolmogorov-Arnold Networks    es_PE
dc.subject    neural networks    es_PE
dc.subject    interpretability    es_PE
dc.subject    machine learning    es_PE
dc.subject    deep learning    es_PE
dc.subject    function approximation    es_PE
dc.subject    computational efficiency    es_PE
dc.subject    nonlinear functions    es_PE
dc.title    Mathematical Generalization of Kolmogorov-Arnold Networks (KAN) and Their Variants.    es_PE
dc.type    info:eu-repo/semantics/article    es_PE
dc.type.version    info:eu-repo/semantics/publishedVersion    es_PE
dc.subject.ocde    https://purl.org/pe-repo/ocde/ford#1.01.02    es_PE
dc.identifier.doi    https://doi.org/10.3390/math13193128    es_PE
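For context on the mathematical foundation the abstract refers to (this note is illustrative and not part of the catalog record): the Kolmogorov–Arnold representation theorem states that any continuous multivariate function on a bounded domain can be expressed as a finite composition of continuous univariate functions and addition, which is the starting point for all KAN variants surveyed in the article.

```latex
% Kolmogorov–Arnold representation theorem (illustrative statement).
% Any continuous f : [0,1]^n \to \mathbb{R} can be written as
f(x_1, \dots, x_n) \;=\; \sum_{q=0}^{2n} \Phi_q\!\left( \sum_{p=1}^{n} \varphi_{q,p}(x_p) \right),
% where \Phi_q and \varphi_{q,p} are continuous univariate functions.
% KAN variants (spline-, wavelet-, rational-based, etc.) differ mainly in
% how the univariate functions \varphi_{q,p} are parameterized and learned.
```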


Files in this item


This item appears in the following collection(s)


Except where otherwise noted, this item's license is described as info:eu-repo/semantics/openAccess
Universidad Nacional de Cajamarca

Av. Atahualpa 1050, Cajamarca - Perú | Tel. (+51) 076-599220

All content on repositorio.unc.edu.pe is available under a Creative Commons License

repositorio@unc.edu.pe