Please use this identifier to cite or link to this item: https://er.knutd.edu.ua/handle/123456789/31595
Title: Machine learning with ANN as important technology for new-age literary studies
Authors: Krasniuk, Svitlana
Keywords: literary studies
machine learning
Issue Date: Oct-2025
Publisher: Ark Reprints
Citation: Krasniuk Svitlana. Machine learning with ANN as important technology for new-age literary studies / S. Krasniuk // Science: development and factors its influence : scientific collection «InterConf» proceedings of the 6th International Scientific and Practical Conference (Amsterdam, October 6-8, 2025). – Amsterdam : Ark Reprints, 2025. – № 266. – P. 86–89.
Abstract: The use of artificial neural networks (ANNs) within the framework of machine learning in literary studies demonstrates significant potential for updating and improving research methods. ANNs make it possible to analyze voluminous text corpora, identify hidden patterns, classify genres, conduct stylometric studies, and perform deep semantic analysis. They automate routine stages of working with text, allowing researchers to focus on more creative and interpretative tasks. In the future, the integration of ANNs into the humanities will contribute not only to increasing the accuracy and efficiency of scientific research, but also to the formation of a new scientific paradigm in which technologies and classical humanities methods interact for a fuller disclosure of literary processes and cultural phenomena.
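The stylometric genre classification mentioned in the abstract can be illustrated with a minimal, hypothetical sketch: two simple stylometric features (average word length and type-token ratio) feed a single sigmoid unit, the smallest possible artificial neural network, trained on a toy two-genre corpus. All texts, feature choices, and hyperparameters below are illustrative assumptions, not material from the paper; a real literary-studies pipeline would use large corpora, richer features, and deeper networks.

```python
# Minimal illustrative sketch (not the paper's method): stylometric
# features plus a single sigmoid unit for toy two-genre classification.
import math
import random

def stylometric_features(text):
    """Two simple stylometric features: average word length and
    type-token ratio (a crude measure of lexical diversity)."""
    words = text.lower().split()
    avg_len = sum(len(w) for w in words) / len(words)
    ttr = len(set(words)) / len(words)
    return [avg_len, ttr]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_unit(samples, labels, epochs=2000, lr=0.1):
    """Train one sigmoid unit by stochastic gradient descent on log-loss."""
    random.seed(0)
    w = [random.uniform(-0.5, 0.5) for _ in range(len(samples[0]))]
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - y  # gradient of log-loss w.r.t. the pre-activation
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    """Return genre label 0 or 1 for a feature vector."""
    return 1 if sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b) >= 0.5 else 0

# Toy corpus: genre 0 uses plain diction, genre 1 uses ornate diction,
# so average word length alone nearly separates the classes.
texts = [
    "the cat sat on the mat",
    "we go to the sea",
    "magnificent architectural compositions illuminate extraordinary metaphysical imagination",
    "labyrinthine phantasmagorical narratives interweave melancholic recollections",
]
labels = [0, 0, 1, 1]
features = [stylometric_features(t) for t in texts]
w, b = train_unit(features, labels)
```

After training, `predict(w, b, stylometric_features(t))` recovers the toy labels; the point is only that routine classification steps can be delegated to such a model, freeing the researcher for interpretation.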
URI: https://er.knutd.edu.ua/handle/123456789/31595
Faculty: Institute of Law and Modern Technologies
Department: Department of Philology and Translation (FP)
Appears in Collections: Department of Philology and Translation (FP)
Materials of scientific conferences and seminars

Files in This Item:
File (Size, Format):
Title_Amsterdam ТЕЗИ S. Krasniuk.pdf (2,53 MB, Adobe PDF)
Content_Amsterdam ТЕЗИ S. Krasniuk.pdf (238,88 kB, Adobe PDF)
Amsterdam ТЕЗИ S. Krasniuk 86-89.pdf (252,24 kB, Adobe PDF)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.