Quite some time has passed since I was accepted, but I want to record my studies going forward, so I am writing this acceptance review. The selection process consisted of a document screening → first coding test → second coding test. The first coding test combined a simple coding test with several questions on deep learning knowledge; since the coding problems in that round were not especially difficult, the deep learning knowledge questions were likely the deciding factor. The second coding test had eight problems and felt like a proper coding test, though the difficulty was still lower than a typical job-hunting coding test, so I was able to work through it smoothly and finished with seven solved. After the tests I received my final acceptance, and starting September 19th I begin a roughly five-month journey into NLP.
Title: Lifelong Language Learning Using Pretrained Adapters
Authors: Moon-Gi Cho, Gyeong-Moon Park
Conference: KCC 2022 (Korea Computer Congress 2022)
Keywords: Adapters, Lifelong Language Learning, NLP, Transformers
Awards: Excellence Award, Undergraduate Division
Title: Continual Sequence Generation with Adaptive Compositional Modules
Authors: Yanzhe Zhang, Xuezhi Wang, Diyi Yang
Conference: ACL 2022
Link: https://arxiv.org/pdf/2203.10652.pdf
Keywords: Transformer, GPT-2, NLP, Continual Learning, Sequence generation, Adapter, Mixup, LAMOL
Title: Parameter-Efficient Transfer Learning for NLP
Authors: Neil Houlsby, Andrei Giurgiu, Stanislaw Jastrzebski, Bruna Morrone, Quentin de Laroussilhe, Andrea Gesmundo, Mona Attariyan, Sylvain Gelly
Conference: ICML 2019 (PMLR)
Link: https://arxiv.org/pdf/1902.00751.pdf
Keywords: Transformer, BERT, Adapter, AdapterHub, Multitask learning, Continual learning, Transfer learning
Title: BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension
Authors: Mike Lewis, Yinhan Liu, Naman Goyal, Marjan Ghazvininejad, Abdelrahman Mohamed, Omer Levy, Ves Stoyanov, Luke Zettlemoyer
Conference: ACL 2020
Link: https://arxiv.org/pdf/1910.13461.pdf
Keywords: Transformer, Pretext task, BERT, BART, Denoising autoencoder
Title: Learning to Solve NLP Tasks in an Incremental Number of Languages
Authors: Giuseppe Castellucci, Simone Filice, Danilo Croce, Roberto Basili
Conference: ACL 2021
Link: https://aclanthology.org/2021.acl-short.106.pdf
Keywords: Lifelong Learning, Lifelong Language Learning, Knowledge Distillation, Multilingual
Title: Lifelong Language Knowledge Distillation
Authors: Yung-Sung Chuang, Shang-Yu Su, Yun-Nung Chen
Conference: EMNLP 2020
Link: https://arxiv.org/abs/2010.02123
Keywords: Lifelong Learning, Lifelong Language Learning, Knowledge Distillation, LAMOL, L2KD
Title: Learn Continually, Generalize Rapidly: Lifelong Knowledge Accumulation for Few-shot Learning
Authors: Xisen Jin, Bill Yuchen Lin, Mohammad Rostami, Xiang Ren
Conference: EMNLP 2021
Link: https://arxiv.org/abs/2104.08808
Keywords: Lifelong Learning, Few-shot Learning, Lifelong Language Learning, BART-adapter, Hypernetwork, BiHNet-Reg
Title: Continual Learning for Text Classification with Information Disentanglement Based Regularization
Authors: Yufan Huang, Yanzhe Zhang, Jiaao Chen, Xuezhi Wang, Diyi Yang
Conference: NAACL 2021
Link: https://arxiv.org/abs/2104.05489
Keywords: Continual Learning, Lifelong Language Learning, Text Classification, Information Disentanglement, IDBR
Title: Adapting BERT for Continual Learning of a Sequence of Aspect Sentiment Classification Tasks
Authors: Zixuan Ke, Hu Xu, Bing Liu
Conference: NAACL 2021
Link: https://arxiv.org/abs/2112.03271
Keywords: Capsule Network (CapsNet), Dynamic Routing, Continual Learning, Aspect Sentiment Classification, Lifelong Language Learning, BERT-adapter, B-CL