[Paper Review] BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

A review of the BERT paper, published by Google in 2019.
Source: https://arxiv.org/abs/1810.04805

BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
We introduce a new language representation model called BERT, which stands for Bidirectional Encoder Representations from Transformers. Unlike recent language representation models, BERT is designed to pre-train deep bidirectional representations from unlabeled text by jointly conditioning on both left and right context in all layers.

2024. 2. 13.