Everything you wanted to know about ChatGPT: Components, capabilities, applications, and opportunities
Date
2024
Publisher
John Wiley & Sons Ltd
Abstract
Conversational Artificial Intelligence (AI) and Natural Language Processing have advanced significantly with OpenAI's creation of the Generative Pre-trained Transformer-based chatbot, ChatGPT. ChatGPT uses deep learning techniques such as the transformer architecture and self-attention mechanisms to emulate human conversation and produce coherent, contextually appropriate replies. Because the model depends chiefly on patterns discovered in its training data, it can produce incorrect or illogical conclusions. In the context of open-domain chats, we investigate the components, capabilities, constraints, and potential applications of ChatGPT, along with future opportunities. We begin by describing the components of ChatGPT, followed by a definition of chatbots, and present a new taxonomy that classifies them as rule-based, retrieval-based, generative, or hybrid chatbots. Next, we describe the capabilities and constraints of ChatGPT. Finally, we present potential applications of ChatGPT and future research opportunities. The results show that ChatGPT, a transformer-based chatbot model, relies on stacked transformer decoder blocks to produce coherent responses.
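The self-attention mechanism the abstract refers to can be illustrated with a minimal sketch of scaled dot-product attention; the dimensions, random projection matrices, and function names below are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of scaled dot-product self-attention (illustrative only;
# shapes and weights are arbitrary, not from the paper under review).
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, wq, wk, wv):
    """x: (seq_len, d_model); wq/wk/wv: (d_model, d_k) projections."""
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = q @ k.T / np.sqrt(k.shape[-1])   # pairwise token affinities
    weights = softmax(scores, axis=-1)        # each row sums to 1
    return weights @ v                        # context-weighted values

rng = np.random.default_rng(0)
d_model, d_k, seq_len = 8, 4, 3
x = rng.normal(size=(seq_len, d_model))
out = self_attention(x,
                     rng.normal(size=(d_model, d_k)),
                     rng.normal(size=(d_model, d_k)),
                     rng.normal(size=(d_model, d_k)))
print(out.shape)  # one (seq_len, d_k) context vector per input token
```

Each output row is a mixture of the value vectors of all tokens, weighted by how strongly the query token attends to each of them; transformer models stack many such layers (with multiple heads) to build contextual representations.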
Description
Heidari, Arash/0000-0003-4279-8551
Keywords
ChatGPT, conversational artificial intelligence, deep learning, generative pre-trained transformer, large language models, natural language processing, self-attention mechanisms
Citation
0
WoS Q
N/A
Scopus Q
Q3