2024/03: Presented "Rethinking Loss Functions for Fact Verification" at EACL 2024

Title: Rethinking Loss Functions for Fact Verification
Authors: Yuta Mukobara (1), Yutaro Shigeto (2,3), Masashi Shimbo (2,3)
Affiliations: (1) Tokyo Institute of Technology; (2) Chiba Institute of Technology; (3) RIKEN AIP
Date: 2024/03
Conference (abbreviated title): The 18th Conference of the European Chapter of the Association for Computational Linguistics (EACL)
Vol., No., Pages: N/A, N/A, 432–442
Citation example: Y. Mukobara, Y. Shigeto, and M. Shimbo, in Proceedings of EACL, pp. 432–442 (2024).

I presented a paper titled "Rethinking Loss Functions for Fact Verification" at EACL 2024, an international conference in the field of natural language processing.

This research focuses on improving predictive accuracy in multi-class classification by redesigning the loss function.

We investigated loss functions for fact verification in the FEVER shared task. Standard cross-entropy loss fails to capture the heterogeneity between FEVER's verdict classes. To address this, we developed two task-specific objective functions. Our results demonstrated that the proposed functions outperformed standard cross-entropy loss. Moreover, combining these functions with simple class weighting effectively mitigated data imbalance in the training set, further enhancing performance.
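As a rough illustration of the simple class-weighting mentioned above (not the paper's proposed objective functions), the sketch below applies inverse-frequency class weights to standard cross-entropy over the three FEVER verdict classes. The class counts and the weighting scheme are assumptions made for the example.

```python
# Minimal sketch: class-weighted cross-entropy for the three FEVER verdict
# classes (SUPPORTS, REFUTES, NOT ENOUGH INFO). This illustrates the simple
# class-weighting baseline discussed above, not the paper's proposed losses.
import torch
import torch.nn as nn

# Hypothetical class counts from an imbalanced training set (assumed values).
class_counts = torch.tensor([80_000.0, 30_000.0, 35_000.0])

# Inverse-frequency weights, normalized so they average to 1 (one common choice).
weights = class_counts.sum() / (len(class_counts) * class_counts)

criterion = nn.CrossEntropyLoss(weight=weights)

# logits: (batch, 3) scores from a verdict classifier; labels: gold class ids.
logits = torch.randn(4, 3)
labels = torch.tensor([0, 2, 1, 0])
loss = criterion(logits, labels)
print(loss.item())
```

With this weighting, rarer classes contribute more to the loss per example, which is one common way to counter training-set imbalance.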

A separate blog post with more details on this presentation is available; be sure to check it out.
Blog post

Other Links
[Stair Lab | GitHub (Source Code)]

- (Presentation) Peer-reviewed, English, Presentation