[Pre-training of Deep Bidirectional Transformers for Language Understanding] BERT Paper Review

https://arxiv.org/abs/1810.04805

BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
We introduce a new language representation model called BERT, which stands for Bidirectional Encoder Representations from Transformers. Unlike recent language representation models, BERT is designed to ..