BEGIN:VCALENDAR
VERSION:2.0
PRODID:-// - ECPv6.14.2//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-ORIGINAL-URL:https://iis.memphis.edu
X-WR-CALDESC:Events for 
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:America/Chicago
BEGIN:DAYLIGHT
TZOFFSETFROM:-0600
TZOFFSETTO:-0500
TZNAME:CDT
DTSTART:20250309T020000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0500
TZOFFSETTO:-0600
TZNAME:CST
DTSTART:20251102T020000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=America/Chicago:20250829T090000
DTEND;TZID=America/Chicago:20250829T170000
DTSTAMP:20260410T041437Z
CREATED:20250819T150016Z
LAST-MODIFIED:20250819T150016Z
UID:1666-1756458000-1756486800@iis.memphis.edu
SUMMARY:Efficient training of Kolmogorov-Arnold Networks (KANs)
DESCRIPTION:Title: \nEfficient training of Kolmogorov-Arnold Networks (KANs) – methods\, benchmarks\, and applications \n \nAbstract: \nKANs are nonlinear regression models with a specific architecture that is based on a composition of functions. They branched off from Kolmogorov's proof that any continuous multivariate function can be exactly represented by a specific composition of continuous univariate functions [1]. The exact form of the representation is a universal approximator and has been extensively studied since the 1950s\, e.g. [2\,3]. Approximate forms acquired various names – models or networks – and have been studied since the 1990s\, when their power was first discovered [4]. KANs have been used for machine-learning (ML) applications since the 2000s [5]\, but remained largely unnoticed until May 2024\, when a paper preprint by a team from MIT was posted online [6]. \nThe immense recent growth in popularity of KANs has led to a significant number of preprints\, most of which demonstrate their superior accuracy when compared to traditional neural networks – multilayer perceptrons (MLPs). However\, employing traditional training methods\, which are used for other ML models\, leads to larger training times than for MLPs. \nIn this talk\, a lightweight training method for KANs\, first proposed in 2020 for piecewise-linear underlying functions [7] and generalised to an arbitrary basis representation in 2023 [8]\, will be presented. The method is based on the Kaczmarz algorithm. Efficient implementations of KANs (in C#\, C++\, and MATLAB) will be shown that significantly outcompete MLPs both in terms of accuracy and training time – e.g. 4–10 minutes for KANs vs. 4–8 hours for MLPs on datasets with 25 inputs and 10 million records. Furthermore\, aspects related to deep KANs\, parallel implementation of the training\, and uncertainty quantification for KANs will be discussed [9]. \nReferences: \n[1] A. N. Kolmogorov\, Dokl. Akad. Nauk SSSR\, 114(5):953–956\, 1957. \n[2] G. G. Lorentz\, Am. Math. Mon.\, 69(6):469–485\, 1962. \n[3] D. A. Sprecher\, Trans. Am. Math. Soc.\, 115(3):340–355\, 1965. \n[4] V. Kurkova\, Neural Netw.\, 5(3):501–506\, 1992. \n[5] B. Igelnik\, N. Parikh\, IEEE Trans. Neural Netw.\, 14(4):725–733\, 2003. \n[6] Z. Liu et al.\, arXiv:2404.19756\, 2024. \n[7] A. Polar\, M. Poluektov\, Eng. Appl. Artif. Intell.\, 99:104137\, 2021. \n[8] M. Poluektov\, A. Polar\, arXiv:2305.08194\, 2023. \n[9] A. Polar\, M. Poluektov\, arXiv:2104.01714\, 2021. \n \nBio: \nMikhail Poluektov is currently appointed as a Lecturer (Assistant Professor) in Mathematics at the University of Dundee (UK). His research focuses on computational and applied mathematics covering a large range of models and methods. In particular\, his recent research includes fictitious-domain and multiscale methods for non-linear partial differential equations\, as well as approximation-theory methods. His work has been published in journals such as Computer Methods in Applied Mechanics and Engineering. Prior to his current appointment\, Dr Poluektov held a Senior Research Fellow position at the University of Warwick (UK). Dr Poluektov obtained a PhD from the Eindhoven University of Technology (Netherlands).
URL:https://iis.memphis.edu/event/efficient-training-of-kolmogorov-arnold-networks-kans/
LOCATION:FBCE 381\, Fogelman College of Business & Economics\, Memphis\, TN\, 38112\, United States
END:VEVENT
END:VCALENDAR