FAQ chatbot web framework for response comparisons and performance analysis
Saved in:

Main Author: | |
---|---|
Other Authors: | |
Format: | Final Year Project |
Language: | English |
Published: | Nanyang Technological University, 2020 |
Subjects: | |
Online Access: | https://hdl.handle.net/10356/138036 |
Institution: | Nanyang Technological University |
Summary: | The objective of this implementation is to develop a chatbot framework that provides a modular platform for a given set of question-answer matching engines. This framework aims to provide an all-round input solution for assessing response accuracy and comparing performance with existing chatbot services. Chatbots serve purposes ranging from FAQs and helpdesks to interactive smart home systems. There are currently no web platforms that provide chatbot analysis: because each chatbot service serves its own unique purpose, there is no benchmark chatbot model against which comparisons can be made.
The challenge in this implementation is to deliver a proof of concept of a modular chatbot comparison service in which chatbot solutions can be segregated into their respective categories and tested against one another to determine their level of correctness at an optimal query-matching time. This platform aims to help chatbot engineers improve the quality of their services and better understand the needs of end users.
For the context of this implementation, the chatbot category targeted is FAQs. The project works closely with the Ministry of Social and Family Development’s Baby Bonus scheme as the basis for the FAQ chatbot service. Using Ask Jamie, a production-grade chatbot service developed by GovTech Singapore, as a benchmark, the implemented framework makes comparisons with Nanyang Technological University MICL Lab’s QA matching engine and various other open-source chatbot engines.
This report describes my approach to developing a full-stack application with a ReactJS frontend and a NodeJS backend. The frontend consists of a single dashboard with three components. The first component offers three input methods: text, speech, and audio file. The second component includes a collection of four chatbot services: Ask Jamie, Dialogflow, MICL, and a self-implemented text classification model. The last component showcases a batch-query feature for comparative analysis.
The backend consists of five application programming interfaces that connect the following services: the MICL QA model, the Ask Jamie API, a Dialogflow API client, the text classification model, and the AISG speech transcription API.
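A backend like this typically fans one query out to each connected engine and records every answer alongside its query-matching time. The sketch below is an assumed illustration of that comparison pattern in NodeJS, not the report's actual code; the adapter names and response shapes are hypothetical stand-ins for the real Ask Jamie, Dialogflow, MICL, and text-classification clients.

```javascript
// Dispatch one query to every registered chatbot adapter and record
// each engine's answer together with its response latency, so the
// results can be compared side by side on the dashboard.
async function compareEngines(query, engines) {
  const results = [];
  for (const [name, ask] of Object.entries(engines)) {
    const start = Date.now();
    const answer = await ask(query); // each adapter wraps one backend API
    results.push({ engine: name, answer, latencyMs: Date.now() - start });
  }
  return results;
}

// Hypothetical stand-ins for the real service adapters.
const engines = {
  askJamie:   async (q) => `Ask Jamie answer to: ${q}`,
  dialogflow: async (q) => `Dialogflow answer to: ${q}`,
};

compareEngines('How do I apply for Baby Bonus?', engines)
  .then((rows) => rows.forEach((r) => console.log(r.engine, r.latencyMs, r.answer)));
```

Keeping each engine behind a uniform `ask(query)` adapter is what makes the platform modular: a new chatbot service can be compared by registering one more adapter, with no change to the comparison logic.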
This report also covers the miscellaneous tasks of pre-processing raw data sets and constructing training and testing inputs with Natural Language Processing (NLP) tools. To support the functionality of each application component, system architecture designs and dialog maps are included for better visualization. |
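The pre-processing and train/test construction mentioned above can be sketched as follows. This is a minimal illustration under assumed steps (normalise, tokenise, drop stop words, then split the question-answer pairs); the stop-word list, the 80/20 ratio, and the sample FAQ pairs are illustrative, not taken from the report.

```javascript
// Illustrative stop-word list; a real pipeline would use a fuller one.
const STOP_WORDS = new Set(['a', 'an', 'the', 'is', 'to', 'of', 'and']);

// Normalise a raw FAQ question: lowercase, strip punctuation,
// tokenise on whitespace, and drop stop words.
function preprocess(text) {
  return text
    .toLowerCase()
    .replace(/[^a-z0-9\s]/g, ' ')
    .split(/\s+/)
    .filter((token) => token && !STOP_WORDS.has(token));
}

// Split question-answer pairs into training and testing sets
// (assumed 80/20 ratio).
function trainTestSplit(pairs, trainRatio = 0.8) {
  const cut = Math.floor(pairs.length * trainRatio);
  return { train: pairs.slice(0, cut), test: pairs.slice(cut) };
}

// Hypothetical Baby Bonus-style FAQ pairs for demonstration.
const faqPairs = [
  { q: preprocess('What is the Baby Bonus cash gift?'), a: 'cash-gift' },
  { q: preprocess('How do I open a Child Development Account?'), a: 'cda-open' },
];
console.log(trainTestSplit(faqPairs, 0.5));
```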