Title | Rethinking Loss Functions for Fact Verification |
Author | Yuta Mukobara1, Yutaro Shigeto2,3, Masashi Shimbo2,3 |
Affiliation | 1Tokyo Institute of Technology, 2Chiba Institute of Technology, 3RIKEN AIP |
Date | 2024/03 |
Conference(Abbreviated Title) | The 18th Conference of the European Chapter of the Association for Computational Linguistics (EACL) |
Vol., No., Page | None, None, 432–442
Citation Example | Y. Mukobara, Y. Shigeto, and M. Shimbo, in Proceedings of EACL, pp. 432–442 (2024). |
I presented a paper titled "Rethinking Loss Functions for Fact Verification" at EACL 2024, an international conference in the field of natural language processing.
This research focuses on improving predictive accuracy in multi-class classification problems by revising loss functions.
We investigated loss functions for fact verification in the FEVER shared task. Standard cross-entropy loss fails to capture the heterogeneity between FEVER's verdict classes. To address this, we developed two task-specific objective functions. Our results demonstrated that the proposed functions outperformed standard cross-entropy loss. Moreover, combining these functions with simple class weighting effectively mitigated data imbalance in the training set, further enhancing performance.
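As a concrete illustration of the class-weighting idea mentioned above, the following is a minimal sketch of class-weighted cross-entropy for a three-way verdict classifier. The weights and logits are hypothetical values for illustration, not those used in the paper.

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of logits."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def weighted_cross_entropy(logits, target, class_weights):
    """Cross-entropy loss with per-class weights.

    Upweighting rare classes (e.g. one of the FEVER verdict classes
    that is underrepresented in training) counteracts data imbalance.
    The weights here are illustrative, e.g. inverse class frequency.
    """
    probs = softmax(logits)
    return -class_weights[target] * math.log(probs[target])

# Hypothetical example: three verdict classes, upweighting class 2.
logits = [2.0, 1.0, 0.5]
weights = [1.0, 1.0, 2.0]
loss = weighted_cross_entropy(logits, target=2, class_weights=weights)
```

With uniform weights this reduces to standard cross-entropy; increasing a class's weight scales its contribution to the loss proportionally, so misclassifying rare-class examples is penalized more heavily during training.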
A separate blog post detailing this presentation is available; be sure to check it out.
Blog post
Other Links
[Stair Lab | GitHub (Source Code)]