SENTIMENT ANALYSIS OF TEXT USING BERT AND DISTILBERT MODELS

Authors

  • Marianna Prytula, Ivan Franko National University of Lviv
  • Igor Olenych, Ivan Franko National University of Lviv

Abstract

The work describes the fine-tuning of pre-trained BERT and DistilBERT transformer models for binary sentiment classification of user comments in the Ukrainian language. The models were evaluated using the accuracy, precision, recall, and F1-score metrics. An overall accuracy of 91%, obtained for both models, demonstrates the significant potential of these models for sentiment analysis tasks.
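As an illustration of the approach described in the abstract, the following is a minimal sketch (not the authors' code) of fine-tuning a pre-trained DistilBERT checkpoint for binary sentiment classification with the Hugging Face Transformers library and reporting accuracy, precision, recall, and F1-score. The checkpoint name, file paths, and hyperparameters are assumptions for illustration only.

```python
# Minimal sketch: fine-tune DistilBERT for binary sentiment classification.
# Assumptions: a multilingual checkpoint for Ukrainian text and CSV files
# with "text" (user comment) and "label" (0 = negative, 1 = positive) columns.
import numpy as np
from datasets import load_dataset
from sklearn.metrics import accuracy_score, precision_recall_fscore_support
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

MODEL_NAME = "distilbert-base-multilingual-cased"  # illustrative checkpoint

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=2)

dataset = load_dataset("csv", data_files={"train": "train.csv", "test": "test.csv"})

def tokenize(batch):
    # Tokenize comments to fixed-length input IDs for the transformer.
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)

dataset = dataset.map(tokenize, batched=True)

def compute_metrics(eval_pred):
    # Accuracy, precision, recall, and F1-score, as reported in the paper.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(labels, preds, average="binary")
    return {"accuracy": accuracy_score(labels, preds),
            "precision": precision, "recall": recall, "f1": f1}

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out", num_train_epochs=3,
                           per_device_train_batch_size=16,
                           evaluation_strategy="epoch"),
    train_dataset=dataset["train"],
    eval_dataset=dataset["test"],
    compute_metrics=compute_metrics,
)
trainer.train()
print(trainer.evaluate())
```

The same sketch applies to BERT by swapping in a BERT checkpoint; DistilBERT is a smaller, distilled variant that trains and runs faster while typically retaining most of BERT's accuracy.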

Published

2024-06-09