[U]Stage-NLP

  • 7W Retrospective
  • (Lecture 10) Advanced Self-supervised Pre-training Models
  • (Lecture 09) Self-supervised Pre-training Models
  • (Lecture 08) Transformer (2)
  • (Lecture 07) Transformer (1)
  • 6W Retrospective
  • (Lecture 06) Beam Search and BLEU Score
  • (Lecture 05) Sequence to Sequence with Attention
  • (Lecture 04) LSTM and GRU
  • (Lecture 03) Recurrent Neural Network and Language Modeling
  • (Lecture 02) Word Embedding
  • (Lecture 01) Intro to NLP, Bag-of-Words
  • [Required Assignment 4] Preprocessing for NMT Model
  • [Required Assignment 3] Subword-level Language Model
  • [Required Assignment 2] RNN-based Language Model
  • [Optional Assignment] BERT Fine-tuning with transformers
  • [Required Assignment] Data Preprocessing