Friday April 10, 2026 12:15pm - 2:15pm GMT+07

Authors - Shylaja P, Jayasudha J S
Abstract - Question Answering (QA) is one of the most popular and widely used applications of NLP. A QA system is an information retrieval system that attempts to find the correct answer to a question from a given paragraph of text. Transformers have been widely used for QA tasks due to their contextual embeddings, attention mechanism, and support for transfer learning to specialized tasks. Transformer-based models can be easily fine-tuned on large datasets and deliver state-of-the-art performance on large question-answering datasets. The proposed approach compares the performance of transformer-based models on a small-sized dataset. We incorporated an answer formation layer alongside the transformers to comprehend contextual, syntactic, and semantic information from small-sized datasets. We developed a set of rules, organized by question category, that transform answer-option phrases into semantically and syntactically coherent sentences based on the question. We evaluated the SBERT transformer models all-mpnet-base-v2, all-MiniLM-L6-v2, and all-distilroberta-v1 on the answer-formatted data and observed an increase in accuracy. Answer formation rules applied to the noun phrases of small-sized datasets can help transformers learn contextual knowledge about the options in a QA sample. Adding the answer formation stage to samples of the SciQ data raised accuracy from 86% to 92% with the all-MiniLM-L6-v2 model.
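The answer formation idea described in the abstract can be sketched as a small rule-based transformation. The following is a minimal illustrative sketch, not the authors' actual rule set: the function name, the single "What is/are X" rule, and the fallback pattern are all assumptions for illustration. The real system presumably covers many more question categories.

```python
import re

def form_answer_sentence(question: str, option: str) -> str:
    """Hypothetical sketch: turn an option phrase into a declarative
    sentence keyed to the question's category, so that a sentence-level
    model (e.g. SBERT) sees a full, contextually coherent sentence
    rather than a bare noun phrase."""
    q = question.strip().rstrip("?")
    # Example rule for "What is/are X?" questions:
    # rewrite as "<Option> is/are X."
    m = re.match(r"(?i)^what\s+(is|are)\s+(.+)$", q)
    if m:
        verb, rest = m.group(1).lower(), m.group(2)
        return f"{option[0].upper() + option[1:]} {verb} {rest}."
    # Fallback for uncategorized questions (assumed behavior).
    return f"The answer is {option}."
```

For instance, the question "What is the process by which plants make food?" with the option "photosynthesis" would be rewritten as "Photosynthesis is the process by which plants make food." Each candidate sentence could then be embedded and scored against the supporting paragraph by cosine similarity.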
Paper Presenter
Shylaja P
Virtual Room C Bangkok, Thailand

