CAISY: Chatbot using Artificial Intelligence and Sequential Model with YAML Dataset
Manoj Kumar M V1, Prajwal J M2, Shyamanth Kashyap3, Rahul G4, Pavan R Nargund5

1Manoj Kumar M V, Department of Information Science and Engineering, Nitte Meenakshi Institute of Technology, Yelahanka, Bengaluru (Karnataka), India.
2Prajwal J M, Department of Information Science and Engineering, Nitte Meenakshi Institute of Technology, Yelahanka, Bengaluru (Karnataka), India.
3Shyamanth Kashyap, Department of Information Science and Engineering, Nitte Meenakshi Institute of Technology, Yelahanka, Bengaluru (Karnataka), India.
4Rahul G, Department of Information Science and Engineering, Nitte Meenakshi Institute of Technology, Yelahanka, Bengaluru (Karnataka), India.
5Pavan R Nargund, Department of Information Science and Engineering, Nitte Meenakshi Institute of Technology, Yelahanka, Bengaluru (Karnataka), India.
Manuscript received on 21 May 2019 | Revised Manuscript received on 11 June 2019 | Manuscript Published on 27 June 2019 | PP: 110-116 | Volume-8 Issue-1C May 2019 | Retrieval Number: A10210581C19/2019©BEIESP
© The Authors. Blue Eyes Intelligence Engineering and Sciences Publication (BEIESP). This is an open access article under the CC-BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)

Abstract: This paper presents a method for constructing a software chatbot named “CAISY”. CAISY learns to answer queries in any domain on which it has been trained, using word embeddings, a Sequence-to-Sequence (Seq2Seq) model, and Long Short-Term Memory (LSTM) neural networks. CAISY responds to test queries within a maximum delay of 5 milliseconds. A notable feature of CAISY is that it can be applied to any domain for which training data is available. As a demonstration, this paper presents CAISY trained on a normal-conversation dataset. CAISY has been optimized by varying hyperparameters to minimize the loss value.
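To illustrate the architecture the abstract describes, the following is a minimal sketch of a Seq2Seq encoder-decoder built from an embedding layer and LSTM layers and trained on question/answer pairs loaded from a YAML file. The framework (Keras), layer sizes, file name, and YAML layout are assumptions for illustration only and are not taken from the paper.

```python
# Minimal sketch (assumed, not from the paper): word embedding + Seq2Seq
# with LSTM, trained on question/answer pairs read from a YAML dataset.
import yaml
import numpy as np
from tensorflow.keras.layers import Input, Embedding, LSTM, Dense
from tensorflow.keras.models import Model
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences

# Hypothetical YAML layout: a list of {question, answer} mappings.
with open("conversations.yml") as f:
    pairs = yaml.safe_load(f)
questions = [p["question"] for p in pairs]
answers = ["<start> " + p["answer"] + " <end>" for p in pairs]

tokenizer = Tokenizer(filters="")
tokenizer.fit_on_texts(questions + answers)
vocab_size = len(tokenizer.word_index) + 1

enc_in = pad_sequences(tokenizer.texts_to_sequences(questions), padding="post")
dec_full = pad_sequences(tokenizer.texts_to_sequences(answers), padding="post")
dec_in, dec_out = dec_full[:, :-1], dec_full[:, 1:]

embed_dim, units = 128, 256  # example hyperparameters, tuned against the loss

# Encoder: embed the query and keep only the final LSTM states.
encoder_inputs = Input(shape=(None,))
enc_emb = Embedding(vocab_size, embed_dim, mask_zero=True)(encoder_inputs)
_, state_h, state_c = LSTM(units, return_state=True)(enc_emb)

# Decoder: generate the reply conditioned on the encoder states.
decoder_inputs = Input(shape=(None,))
dec_emb = Embedding(vocab_size, embed_dim, mask_zero=True)(decoder_inputs)
dec_seq, _, _ = LSTM(units, return_sequences=True, return_state=True)(
    dec_emb, initial_state=[state_h, state_c])
outputs = Dense(vocab_size, activation="softmax")(dec_seq)

model = Model([encoder_inputs, decoder_inputs], outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.fit([enc_in, dec_in], np.expand_dims(dec_out, -1), epochs=50)
```

At inference time such a model is typically wrapped in separate encoder and decoder sub-models that decode one token at a time; the exact decoding and optimization procedure used by CAISY is described in the full paper.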
Keywords: Word Embedding, Sequence-to-Sequence (Seq2Seq), Long Short-Term Memory (LSTM), Hyperparameters.
Scope of the Article: Artificial Intelligence