A VIRTUAL ASSISTANT BASED ON TRANSFORMER MODEL USING A VECTOR DATABASE
DOI: https://doi.org/10.24867/30BE03Blagojevic

Keywords: semantic search, transformer models, virtual assistants, natural language processing

Abstract
This paper presents the implementation of a virtual assistant based on a transformer model, a vector database, and advanced artificial intelligence techniques. The system is specialized on a dataset related to the Faculty of Technical Sciences in Novi Sad. Its goal is to make it easier for prospective students of technical sciences to obtain information about the faculty.
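The retrieval step behind such an assistant can be illustrated in a few lines: documents and the user's question are embedded as vectors (by a transformer model in the actual system), and the vector database returns the documents whose embeddings are closest to the query, typically by cosine similarity. The sketch below uses tiny placeholder vectors in place of real transformer embeddings; the dimensions and example documents are illustrative only, not taken from the paper.

```python
import numpy as np

def cosine_top_k(query_vec, doc_vecs, k=1):
    """Return indices of the k documents most similar to the query,
    ranked by cosine similarity (what a vector database computes at scale)."""
    q = query_vec / np.linalg.norm(query_vec)
    d = doc_vecs / np.linalg.norm(doc_vecs, axis=1, keepdims=True)
    sims = d @ q                      # cosine similarity to each document
    return np.argsort(-sims)[:k]      # indices, most similar first

# Toy 3-dimensional "embeddings" standing in for transformer outputs.
docs = np.array([
    [0.9, 0.1, 0.0],   # e.g. a passage about enrollment deadlines
    [0.1, 0.8, 0.2],   # e.g. a passage about tuition fees
    [0.0, 0.2, 0.9],   # e.g. a passage about dormitories
])
query = np.array([0.85, 0.15, 0.05])  # a question about enrollment

best = cosine_top_k(query, docs, k=1)
print(best)  # index of the closest passage
```

In a production system the placeholder vectors would come from a sentence-embedding transformer, and the brute-force similarity scan would be replaced by the approximate nearest-neighbor index of the vector database.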
Published: 2025-03-03
Section: Electrotechnical and Computer Engineering