Optimizing Language Models through Dataset-Specific Post-Training: A Focus on Financial Sentiment Analysis
Hui Do Jung, Jae Heon Kim, Beakcheol Jang, Journal of Internet Computing and Services, Vol. 25, No. 1, pp. 57-67, Feb. 2024
DOI: 10.7472/jksii.2024.25.1.57
Keywords: BERT, FinBERT, Financial Sentiment Analysis, Post-training, Pre-training Dataset
Cite this article
[APA Style]
Jung, H. D., Kim, J. H., & Jang, B. (2024). Optimizing Language Models through Dataset-Specific Post-Training: A Focus on Financial Sentiment Analysis. Journal of Internet Computing and Services, 25(1), 57-67. DOI: 10.7472/jksii.2024.25.1.57.
[IEEE Style]
H. D. Jung, J. H. Kim, B. Jang, "Optimizing Language Models through Dataset-Specific Post-Training: A Focus on Financial Sentiment Analysis," Journal of Internet Computing and Services, vol. 25, no. 1, pp. 57-67, 2024. DOI: 10.7472/jksii.2024.25.1.57.
[ACM Style]
Hui Do Jung, Jae Heon Kim, and Beakcheol Jang. 2024. Optimizing Language Models through Dataset-Specific Post-Training: A Focus on Financial Sentiment Analysis. Journal of Internet Computing and Services 25, 1 (2024), 57-67. DOI: 10.7472/jksii.2024.25.1.57.