Precise Text Summarization Using Deep Learning and NLP
RONGALA RAJESH, MEDAVARTHI VEERENDRA
Assistant Professor, MCA Final Semester, Master of Computer Applications, Sanketika Vidya Parishad Engineering College, Vishakhapatnam, Andhra Pradesh, India.
ABSTRACT
Text summarization has become essential in an era of information overload from sources such as social media, news articles, emails, and text messages. Using recent developments in Deep Learning and Natural Language Processing, this project builds a model that efficiently condenses large amounts of text into clear, concise summaries that improve comprehension and save time. The project focuses on abstractive text summarization using sequence-to-sequence and self-attention models such as BART, PEGASUS, and T5 from the Transformer family, alongside LSTM-based and convolutional neural network architectures. The proposed system design extensively integrates the BART encoder-decoder framework, utilizing its bidirectional encoder and auto-regressive decoder to enhance summarization performance. After the code implementation, which includes text preprocessing and tokenization, the model is ready for sequential data processing. Sophisticated optimization techniques are applied to improve model performance, and strategies such as padding are employed to ensure consistent input sequence lengths. Testing outcomes validate the efficacy of BART's attention mechanisms, demonstrating the significance of the research as a breakthrough in precise and context-aware text summarization.
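The abstract does not reproduce the implementation, but the sketch below illustrates the kind of BART-based abstractive summarization pipeline it describes, using the Hugging Face Transformers library. The `facebook/bart-large-cnn` checkpoint, the padding/truncation settings, and the beam-search generation parameters shown here are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of an abstractive summarization pipeline with BART.
# Assumes the Hugging Face "transformers" library; the checkpoint name and
# generation hyperparameters are illustrative choices, not the paper's own.
from transformers import BartTokenizer, BartForConditionalGeneration

model_name = "facebook/bart-large-cnn"  # assumed checkpoint
tokenizer = BartTokenizer.from_pretrained(model_name)
model = BartForConditionalGeneration.from_pretrained(model_name)

def summarize(text: str) -> str:
    # Tokenization with truncation and padding yields consistent input
    # sequence lengths, mirroring the padding strategy the abstract mentions.
    inputs = tokenizer(
        text,
        max_length=1024,
        truncation=True,
        padding="max_length",
        return_tensors="pt",
    )
    # The BART decoder generates the summary auto-regressively; beam search
    # is one common decoding choice (an assumption here).
    summary_ids = model.generate(
        inputs["input_ids"],
        attention_mask=inputs["attention_mask"],
        num_beams=4,
        max_length=150,
        min_length=40,
        length_penalty=2.0,
        early_stopping=True,
    )
    return tokenizer.decode(summary_ids[0], skip_special_tokens=True)

if __name__ == "__main__":
    article = (
        "Text summarization condenses long documents into short summaries. "
        "Transformer encoder-decoder models such as BART are widely used "
        "for the abstractive variant of this task."
    )
    print(summarize(article))
```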