Ensemble Learning with Transformer Models for Sentiment Analysis on Cryptocurrency Tweets
- Create Date 19 March 2026
- Last Updated 21 March 2026
Dr. Y. Mohammed Iqbal, S. Mohamed Fawaz, Dr. S. Peerbasha, Dr. M. Mohamed Surputheen, Dr. M. Rajakumar
Department of Computer Science, Jamal Mohamed College, Affiliated to Bharathidasan University, Tiruchirappalli, Tamil Nadu, India
Abstract- Cryptocurrency markets are strongly influenced by public sentiment shared on social media platforms, especially Twitter, where opinions and reactions can rapidly affect market behaviour. Accurate sentiment analysis of cryptocurrency-related tweets is therefore important for market analysis and decision support systems. This study presents a comparative analysis of three transformer-based language models—RoBERTa, DeBERTa, and FinBERT—for multi-class sentiment classification of cryptocurrency tweets. To ensure reliable evaluation, experiments are conducted using three train–test split configurations: 60–40, 70–30, and 80–20. The dataset is systematically cleaned, normalized, labelled, and balanced using preprocessing techniques to support consistent experimentation. Model performance is evaluated using accuracy, precision, recall, F1-score, specificity, and ROC-AUC metrics. In addition to individual model evaluation, a weighted soft-voting ensemble framework is proposed to combine the probabilistic outputs of all models. Experimental results demonstrate that the ensemble approach consistently outperforms the individual models across all evaluation settings, achieving an F1-score of 85.82% and an accuracy of 86.00%, with the best results obtained using the 80–20 split. These findings indicate that ensemble learning improves prediction stability, reliability, and generalization for cryptocurrency sentiment analysis.
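The abstract's weighted soft-voting scheme combines the per-class probability outputs of the three models by a weighted average and predicts the class with the highest averaged probability. The paper's actual weights are not given in this excerpt, so the following is a minimal NumPy sketch with illustrative, equal weights; the function name and the toy probability matrices are hypothetical.

```python
import numpy as np

def weighted_soft_vote(prob_matrices, weights):
    """Combine per-model class-probability matrices by a weighted average.

    prob_matrices: list of (n_samples, n_classes) arrays, one per model.
    weights: one non-negative weight per model (normalized internally).
    Returns the predicted class index for each sample.
    """
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                          # normalize weights to sum to 1
    stacked = np.stack(prob_matrices)        # (n_models, n_samples, n_classes)
    avg = np.tensordot(w, stacked, axes=1)   # weighted average over the model axis
    return avg.argmax(axis=1)                # highest averaged probability wins

# Illustrative example: three models, two tweets, three sentiment classes
# (negative=0, neutral=1, positive=2); probabilities are made up.
roberta = np.array([[0.1, 0.2, 0.7], [0.6, 0.3, 0.1]])
deberta = np.array([[0.2, 0.2, 0.6], [0.5, 0.4, 0.1]])
finbert = np.array([[0.1, 0.3, 0.6], [0.7, 0.2, 0.1]])
preds = weighted_soft_vote([roberta, deberta, finbert], weights=[1.0, 1.0, 1.0])
print(preds)  # -> [2 0]
```

With equal weights this reduces to plain probability averaging; unequal weights let stronger individual models (e.g. the best-performing transformer on a validation split) contribute more to the final vote.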
Keywords: Cryptocurrency, Sentiment Analysis, Transformer Models, RoBERTa, DeBERTa, FinBERT, Ensemble Learning.