๐Ÿšดโ€โ™‚๏ธ
TIL
  • MAIN
  • : TIL?
  • : WIL
  • : Plan
  • : Retrospective
    • 21Y
      • Wait a moment!
      • 9M 2W
      • 9M1W
      • 8M4W
      • 8M3W
      • 8M2W
      • 8M1W
      • 7M4W
      • 7M3W
      • 7M2W
      • 7M1W
      • 6M5W
      • 1H
    • ์ƒˆ์‚ฌ๋žŒ ๋˜๊ธฐ ํ”„๋กœ์ ํŠธ
      • 2ํšŒ์ฐจ
      • 1ํšŒ์ฐจ
  • TIL : ML
    • Paper Analysis
      • BERT
      • Transformer
    • Boostcamp 2st
      • [S]Data Viz
        • (4-3) Seaborn ์‹ฌํ™”
        • (4-2) Seaborn ๊ธฐ์ดˆ
        • (4-1) Seaborn ์†Œ๊ฐœ
        • (3-4) More Tips
        • (3-3) Facet ์‚ฌ์šฉํ•˜๊ธฐ
        • (3-2) Color ์‚ฌ์šฉํ•˜๊ธฐ
        • (3-1) Text ์‚ฌ์šฉํ•˜๊ธฐ
        • (2-3) Scatter Plot ์‚ฌ์šฉํ•˜๊ธฐ
        • (2-2) Line Plot ์‚ฌ์šฉํ•˜๊ธฐ
        • (2-1) Bar Plot ์‚ฌ์šฉํ•˜๊ธฐ
        • (1-3) Python๊ณผ Matplotlib
        • (1-2) ์‹œ๊ฐํ™”์˜ ์š”์†Œ
        • (1-1) Welcome to Visualization (OT)
      • [P]MRC
        • (2๊ฐ•) Extraction-based MRC
        • (1๊ฐ•) MRC Intro & Python Basics
      • [P]KLUE
        • (5๊ฐ•) BERT ๊ธฐ๋ฐ˜ ๋‹จ์ผ ๋ฌธ์žฅ ๋ถ„๋ฅ˜ ๋ชจ๋ธ ํ•™์Šต
        • (4๊ฐ•) ํ•œ๊ตญ์–ด BERT ์–ธ์–ด ๋ชจ๋ธ ํ•™์Šต
        • [NLP] ๋ฌธ์žฅ ๋‚ด ๊ฐœ์ฒด๊ฐ„ ๊ด€๊ณ„ ์ถ”์ถœ
        • (3๊ฐ•) BERT ์–ธ์–ด๋ชจ๋ธ ์†Œ๊ฐœ
        • (2๊ฐ•) ์ž์—ฐ์–ด์˜ ์ „์ฒ˜๋ฆฌ
        • (1๊ฐ•) ์ธ๊ณต์ง€๋Šฅ๊ณผ ์ž์—ฐ์–ด ์ฒ˜๋ฆฌ
      • [U]Stage-CV
      • [U]Stage-NLP
        • 7W Retrospective
        • (10๊ฐ•) Advanced Self-supervised Pre-training Models
        • (09๊ฐ•) Self-supervised Pre-training Models
        • (08๊ฐ•) Transformer (2)
        • (07๊ฐ•) Transformer (1)
        • 6W Retrospective
        • (06๊ฐ•) Beam Search and BLEU score
        • (05๊ฐ•) Sequence to Sequence with Attention
        • (04๊ฐ•) LSTM and GRU
        • (03๊ฐ•) Recurrent Neural Network and Language Modeling
        • (02๊ฐ•) Word Embedding
        • (01๊ฐ•) Intro to NLP, Bag-of-Words
        • [ํ•„์ˆ˜ ๊ณผ์ œ 4] Preprocessing for NMT Model
        • [ํ•„์ˆ˜ ๊ณผ์ œ 3] Subword-level Language Model
        • [ํ•„์ˆ˜ ๊ณผ์ œ2] RNN-based Language Model
        • [์„ ํƒ ๊ณผ์ œ] BERT Fine-tuning with transformers
        • [ํ•„์ˆ˜ ๊ณผ์ œ] Data Preprocessing
      • Mask Wear Image Classification
        • 5W Retrospective
        • Report_Level1_6
        • Performance | Review
        • DAY 11 : HardVoting | MultiLabelClassification
        • DAY 10 : Cutmix
        • DAY 9 : Loss Function
        • DAY 8 : Baseline
        • DAY 7 : Class Imbalance | Stratification
        • DAY 6 : Error Fix
        • DAY 5 : Facenet | Save
        • DAY 4 : VIT | F1_Loss | LrScheduler
        • DAY 3 : DataSet/Lodaer | EfficientNet
        • DAY 2 : Labeling
        • DAY 1 : EDA
        • 2_EDA Analysis
      • [P]Stage-1
        • 4W Retrospective
        • (10๊ฐ•) Experiment Toolkits & Tips
        • (9๊ฐ•) Ensemble
        • (8๊ฐ•) Training & Inference 2
        • (7๊ฐ•) Training & Inference 1
        • (6๊ฐ•) Model 2
        • (5๊ฐ•) Model 1
        • (4๊ฐ•) Data Generation
        • (3๊ฐ•) Dataset
        • (2๊ฐ•) Image Classification & EDA
        • (1๊ฐ•) Competition with AI Stages!
      • [U]Stage-3
        • 3W Retrospective
        • PyTorch
          • (10๊ฐ•) PyTorch Troubleshooting
          • (09๊ฐ•) Hyperparameter Tuning
          • (08๊ฐ•) Multi-GPU ํ•™์Šต
          • (07๊ฐ•) Monitoring tools for PyTorch
          • (06๊ฐ•) ๋ชจ๋ธ ๋ถˆ๋Ÿฌ์˜ค๊ธฐ
          • (05๊ฐ•) Dataset & Dataloader
          • (04๊ฐ•) AutoGrad & Optimizer
          • (03๊ฐ•) PyTorch ํ”„๋กœ์ ํŠธ ๊ตฌ์กฐ ์ดํ•ดํ•˜๊ธฐ
          • (02๊ฐ•) PyTorch Basics
          • (01๊ฐ•) Introduction to PyTorch
      • [U]Stage-2
        • 2W Retrospective
        • DL Basic
          • (10๊ฐ•) Generative Models 2
          • (09๊ฐ•) Generative Models 1
          • (08๊ฐ•) Sequential Models - Transformer
          • (07๊ฐ•) Sequential Models - RNN
          • (06๊ฐ•) Computer Vision Applications
          • (05๊ฐ•) Modern CNN - 1x1 convolution์˜ ์ค‘์š”์„ฑ
          • (04๊ฐ•) Convolution์€ ๋ฌด์—‡์ธ๊ฐ€?
          • (03๊ฐ•) Optimization
          • (02๊ฐ•) ๋‰ด๋Ÿด ๋„คํŠธ์›Œํฌ - MLP (Multi-Layer Perceptron)
          • (01๊ฐ•) ๋”ฅ๋Ÿฌ๋‹ ๊ธฐ๋ณธ ์šฉ์–ด ์„ค๋ช… - Historical Review
        • Assignment
          • [ํ•„์ˆ˜ ๊ณผ์ œ] Multi-headed Attention Assignment
          • [ํ•„์ˆ˜ ๊ณผ์ œ] LSTM Assignment
          • [ํ•„์ˆ˜ ๊ณผ์ œ] CNN Assignment
          • [ํ•„์ˆ˜ ๊ณผ์ œ] Optimization Assignment
          • [ํ•„์ˆ˜ ๊ณผ์ œ] MLP Assignment
      • [U]Stage-1
        • 1W Retrospective
        • AI Math
          • (AI Math 10๊ฐ•) RNN ์ฒซ๊ฑธ์Œ
          • (AI Math 9๊ฐ•) CNN ์ฒซ๊ฑธ์Œ
          • (AI Math 8๊ฐ•) ๋ฒ ์ด์ฆˆ ํ†ต๊ณ„ํ•™ ๋ง›๋ณด๊ธฐ
          • (AI Math 7๊ฐ•) ํ†ต๊ณ„ํ•™ ๋ง›๋ณด๊ธฐ
          • (AI Math 6๊ฐ•) ํ™•๋ฅ ๋ก  ๋ง›๋ณด๊ธฐ
          • (AI Math 5๊ฐ•) ๋”ฅ๋Ÿฌ๋‹ ํ•™์Šต๋ฐฉ๋ฒ• ์ดํ•ดํ•˜๊ธฐ
          • (AI Math 4๊ฐ•) ๊ฒฝ์‚ฌํ•˜๊ฐ•๋ฒ• - ๋งค์šด๋ง›
          • (AI Math 3๊ฐ•) ๊ฒฝ์‚ฌํ•˜๊ฐ•๋ฒ• - ์ˆœํ•œ๋ง›
          • (AI Math 2๊ฐ•) ํ–‰๋ ฌ์ด ๋ญ์˜ˆ์š”?
          • (AI Math 1๊ฐ•) ๋ฒกํ„ฐ๊ฐ€ ๋ญ์˜ˆ์š”?
        • Python
          • (Python 7-2๊ฐ•) pandas II
          • (Python 7-1๊ฐ•) pandas I
          • (Python 6๊ฐ•) numpy
          • (Python 5-2๊ฐ•) Python data handling
          • (Python 5-1๊ฐ•) File / Exception / Log Handling
          • (Python 4-2๊ฐ•) Module and Project
          • (Python 4-1๊ฐ•) Python Object Oriented Programming
          • (Python 3-2๊ฐ•) Pythonic code
          • (Python 3-1๊ฐ•) Python Data Structure
          • (Python 2-4๊ฐ•) String and advanced function concept
          • (Python 2-3๊ฐ•) Conditionals and Loops
          • (Python 2-2๊ฐ•) Function and Console I/O
          • (Python 2-1๊ฐ•) Variables
          • (Python 1-3๊ฐ•) ํŒŒ์ด์ฌ ์ฝ”๋”ฉ ํ™˜๊ฒฝ
          • (Python 1-2๊ฐ•) ํŒŒ์ด์ฌ ๊ฐœ์š”
          • (Python 1-1๊ฐ•) Basic computer class for newbies
        • Assignment
          • [์„ ํƒ ๊ณผ์ œ 3] Maximum Likelihood Estimate
          • [์„ ํƒ ๊ณผ์ œ 2] Backpropagation
          • [์„ ํƒ ๊ณผ์ œ 1] Gradient Descent
          • [ํ•„์ˆ˜ ๊ณผ์ œ 5] Morsecode
          • [ํ•„์ˆ˜ ๊ณผ์ œ 4] Baseball
          • [ํ•„์ˆ˜ ๊ณผ์ œ 3] Text Processing 2
          • [ํ•„์ˆ˜ ๊ณผ์ œ 2] Text Processing 1
          • [ํ•„์ˆ˜ ๊ณผ์ œ 1] Basic Math
    • ๋”ฅ๋Ÿฌ๋‹ CNN ์™„๋ฒฝ ๊ฐ€์ด๋“œ - Fundamental ํŽธ
      • ์ข…ํ•ฉ ์‹ค์Šต 2 - ์บ๊ธ€ Plant Pathology(๋‚˜๋ฌด์žŽ ๋ณ‘ ์ง„๋‹จ) ๊ฒฝ์—ฐ ๋Œ€ํšŒ
      • ์ข…ํ•ฉ ์‹ค์Šต 1 - 120์ข…์˜ Dog Breed Identification ๋ชจ๋ธ ์ตœ์ ํ™”
      • ์‚ฌ์ „ ํ›ˆ๋ จ ๋ชจ๋ธ์˜ ๋ฏธ์„ธ ์กฐ์ • ํ•™์Šต๊ณผ ๋‹ค์–‘ํ•œ Learning Rate Scheduler์˜ ์ ์šฉ
      • Advanced CNN ๋ชจ๋ธ ํŒŒํ—ค์น˜๊ธฐ - ResNet ์ƒ์„ธ์™€ EfficientNet ๊ฐœ์š”
      • Advanced CNN ๋ชจ๋ธ ํŒŒํ—ค์น˜๊ธฐ - AlexNet, VGGNet, GoogLeNet
      • Albumentation์„ ์ด์šฉํ•œ Augmentation๊ธฐ๋ฒ•๊ณผ Keras Sequence ํ™œ์šฉํ•˜๊ธฐ
      • ์‚ฌ์ „ ํ›ˆ๋ จ CNN ๋ชจ๋ธ์˜ ํ™œ์šฉ๊ณผ Keras Generator ๋ฉ”์ปค๋‹ˆ์ฆ˜ ์ดํ•ด
      • ๋ฐ์ดํ„ฐ ์ฆ๊ฐ•์˜ ์ดํ•ด - Keras ImageDataGenerator ํ™œ์šฉ
      • CNN ๋ชจ๋ธ ๊ตฌํ˜„ ๋ฐ ์„ฑ๋Šฅ ํ–ฅ์ƒ ๊ธฐ๋ณธ ๊ธฐ๋ฒ• ์ ์šฉํ•˜๊ธฐ
    • AI School 1st
    • ํ˜„์—… ์‹ค๋ฌด์ž์—๊ฒŒ ๋ฐฐ์šฐ๋Š” Kaggle ๋จธ์‹ ๋Ÿฌ๋‹ ์ž…๋ฌธ
    • ํŒŒ์ด์ฌ ๋”ฅ๋Ÿฌ๋‹ ํŒŒ์ดํ† ์น˜
  • TIL : Python & Math
    • Do It! ์žฅ๊ณ +๋ถ€ํŠธ์ŠคํŠธ๋žฉ: ํŒŒ์ด์ฌ ์›น๊ฐœ๋ฐœ์˜ ์ •์„
      • Relations - ๋‹ค๋Œ€๋‹ค ๊ด€๊ณ„
      • Relations - ๋‹ค๋Œ€์ผ ๊ด€๊ณ„
      • ํ…œํ”Œ๋ฆฟ ํŒŒ์ผ ๋ชจ๋“ˆํ™” ํ•˜๊ธฐ
      • TDD (Test Driven Development)
      • template tags & ์กฐ๊ฑด๋ฌธ
      • ์ •์  ํŒŒ์ผ(static files) & ๋ฏธ๋””์–ด ํŒŒ์ผ(media files)
      • FBV (Function Based View)์™€ CBV (Class Based View)
      • Django ์ž…๋ฌธํ•˜๊ธฐ
      • ๋ถ€ํŠธ์ŠคํŠธ๋žฉ
      • ํ”„๋ก ํŠธ์—”๋“œ ๊ธฐ์ดˆ๋‹ค์ง€๊ธฐ (HTML, CSS, JS)
      • ๋“ค์–ด๊ฐ€๊ธฐ + ํ™˜๊ฒฝ์„ค์ •
    • Algorithm
      • Programmers
        • Level1
          • ์†Œ์ˆ˜ ๋งŒ๋“ค๊ธฐ
          • ์ˆซ์ž ๋ฌธ์ž์—ด๊ณผ ์˜๋‹จ์–ด
          • ์ž์—ฐ์ˆ˜ ๋’ค์ง‘์–ด ๋ฐฐ์—ด๋กœ ๋งŒ๋“ค๊ธฐ
          • ์ •์ˆ˜ ๋‚ด๋ฆผ์ฐจ์ˆœ์œผ๋กœ ๋ฐฐ์น˜ํ•˜๊ธฐ
          • ์ •์ˆ˜ ์ œ๊ณฑ๊ทผ ํŒ๋ณ„
          • ์ œ์ผ ์ž‘์€ ์ˆ˜ ์ œ๊ฑฐํ•˜๊ธฐ
          • ์ง์‚ฌ๊ฐํ˜• ๋ณ„์ฐ๊ธฐ
          • ์ง์ˆ˜์™€ ํ™€์ˆ˜
          • ์ฒด์œก๋ณต
          • ์ตœ๋Œ€๊ณต์•ฝ์ˆ˜์™€ ์ตœ์†Œ๊ณต๋ฐฐ์ˆ˜
          • ์ฝœ๋ผ์ธ  ์ถ”์ธก
          • ํฌ๋ ˆ์ธ ์ธํ˜•๋ฝ‘๊ธฐ ๊ฒŒ์ž„
          • ํ‚คํŒจ๋“œ ๋ˆ„๋ฅด๊ธฐ
          • ํ‰๊ท  ๊ตฌํ•˜๊ธฐ
          • ํฐ์ผ“๋ชฌ
          • ํ•˜์ƒค๋“œ ์ˆ˜
          • ํ•ธ๋“œํฐ ๋ฒˆํ˜ธ ๊ฐ€๋ฆฌ๊ธฐ
          • ํ–‰๋ ฌ์˜ ๋ง์…ˆ
        • Level2
          • ์ˆซ์ž์˜ ํ‘œํ˜„
          • ์ˆœ์œ„ ๊ฒ€์ƒ‰
          • ์ˆ˜์‹ ์ตœ๋Œ€ํ™”
          • ์†Œ์ˆ˜ ์ฐพ๊ธฐ
          • ์†Œ์ˆ˜ ๋งŒ๋“ค๊ธฐ
          • ์‚ผ๊ฐ ๋‹ฌํŒฝ์ด
          • ๋ฌธ์ž์—ด ์••์ถ•
          • ๋ฉ”๋‰ด ๋ฆฌ๋‰ด์–ผ
          • ๋” ๋งต๊ฒŒ
          • ๋•…๋”ฐ๋จน๊ธฐ
          • ๋ฉ€์ฉกํ•œ ์‚ฌ๊ฐํ˜•
          • ๊ด„ํ˜ธ ํšŒ์ „ํ•˜๊ธฐ
          • ๊ด„ํ˜ธ ๋ณ€ํ™˜
          • ๊ตฌ๋ช…๋ณดํŠธ
          • ๊ธฐ๋Šฅ ๊ฐœ๋ฐœ
          • ๋‰ด์Šค ํด๋Ÿฌ์Šคํ„ฐ๋ง
          • ๋‹ค๋ฆฌ๋ฅผ ์ง€๋‚˜๋Š” ํŠธ๋Ÿญ
          • ๋‹ค์Œ ํฐ ์ˆซ์ž
          • ๊ฒŒ์ž„ ๋งต ์ตœ๋‹จ๊ฑฐ๋ฆฌ
          • ๊ฑฐ๋ฆฌ๋‘๊ธฐ ํ™•์ธํ•˜๊ธฐ
          • ๊ฐ€์žฅ ํฐ ์ •์‚ฌ๊ฐํ˜• ์ฐพ๊ธฐ
          • H-Index
          • JadenCase ๋ฌธ์ž์—ด ๋งŒ๋“ค๊ธฐ
          • N๊ฐœ์˜ ์ตœ์†Œ๊ณต๋ฐฐ์ˆ˜
          • N์ง„์ˆ˜ ๊ฒŒ์ž„
          • ๊ฐ€์žฅ ํฐ ์ˆ˜
          • 124 ๋‚˜๋ผ์˜ ์ˆซ์ž
          • 2๊ฐœ ์ดํ•˜๋กœ ๋‹ค๋ฅธ ๋น„ํŠธ
          • [3์ฐจ] ํŒŒ์ผ๋ช… ์ •๋ ฌ
          • [3์ฐจ] ์••์ถ•
          • ์ค„ ์„œ๋Š” ๋ฐฉ๋ฒ•
          • [3์ฐจ] ๋ฐฉ๊ธˆ ๊ทธ๊ณก
          • ๊ฑฐ๋ฆฌ๋‘๊ธฐ ํ™•์ธํ•˜๊ธฐ
        • Level3
          • ๋งค์นญ ์ ์ˆ˜
          • ์™ธ๋ฒฝ ์ ๊ฒ€
          • ๊ธฐ์ง€๊ตญ ์„ค์น˜
          • ์ˆซ์ž ๊ฒŒ์ž„
          • 110 ์˜ฎ๊ธฐ๊ธฐ
          • ๊ด‘๊ณ  ์ œ๊ฑฐ
          • ๊ธธ ์ฐพ๊ธฐ ๊ฒŒ์ž„
          • ์…”ํ‹€๋ฒ„์Šค
          • ๋‹จ์†์นด๋ฉ”๋ผ
          • ํ‘œ ํŽธ์ง‘
          • N-Queen
          • ์ง•๊ฒ€๋‹ค๋ฆฌ ๊ฑด๋„ˆ๊ธฐ
          • ์ตœ๊ณ ์˜ ์ง‘ํ•ฉ
          • ํ•ฉ์Šน ํƒ์‹œ ์š”๊ธˆ
          • ๊ฑฐ์Šค๋ฆ„๋ˆ
          • ํ•˜๋…ธ์ด์˜ ํƒ‘
          • ๋ฉ€๋ฆฌ ๋›ฐ๊ธฐ
          • ๋ชจ๋‘ 0์œผ๋กœ ๋งŒ๋“ค๊ธฐ
        • Level4
    • Head First Python
    • ๋ฐ์ดํ„ฐ ๋ถ„์„์„ ์œ„ํ•œ SQL
    • ๋‹จ ๋‘ ์žฅ์˜ ๋ฌธ์„œ๋กœ ๋ฐ์ดํ„ฐ ๋ถ„์„๊ณผ ์‹œ๊ฐํ™” ๋ฝ€๊ฐœ๊ธฐ
    • Linear Algebra(Khan Academy)
    • ์ธ๊ณต์ง€๋Šฅ์„ ์œ„ํ•œ ์„ ํ˜•๋Œ€์ˆ˜
    • Statistics110
  • TIL : etc
    • [๋”ฐ๋ฐฐ๋Ÿฐ] Kubernetes
    • [๋”ฐ๋ฐฐ๋Ÿฐ] Docker
      • 2. ๋„์ปค ์„ค์น˜ ์‹ค์Šต 1 - ํ•™์ŠตํŽธ(์ค€๋น„๋ฌผ/์‹ค์Šต ์œ ํ˜• ์†Œ๊ฐœ)
      • 1. ์ปจํ…Œ์ด๋„ˆ์™€ ๋„์ปค์˜ ์ดํ•ด - ์ปจํ…Œ์ด๋„ˆ๋ฅผ ์“ฐ๋Š”์ด์œ  / ์ผ๋ฐ˜ํ”„๋กœ๊ทธ๋žจ๊ณผ ์ปจํ…Œ์ด๋„ˆํ”„๋กœ๊ทธ๋žจ์˜ ์ฐจ์ด์ 
      • 0. ๋“œ๋””์–ด ์ฐพ์•„์˜จ Docker ๊ฐ•์˜! ์™•์ดˆ๋ณด์—์„œ ๋„์ปค ๋งˆ์Šคํ„ฐ๋กœ - OT
    • CoinTrading
      • [๊ฐ€์ƒ ํ™”ํ ์ž๋™ ๋งค๋งค ํ”„๋กœ๊ทธ๋žจ] ๋ฐฑํ…Œ์ŠคํŒ… : ๊ฐ„๋‹จํ•œ ํ…Œ์ŠคํŒ…
    • Gatsby
      • 01 ๊นƒ๋ถ ํฌ๊ธฐ ์„ ์–ธ
  • TIL : Project
    • Mask Wear Image Classification
    • Project. GARIGO
  • 2021 TIL
    • CHANGED
    • JUN
      • 30 Wed
      • 29 Tue
      • 28 Mon
      • 27 Sun
      • 26 Sat
      • 25 Fri
      • 24 Thu
      • 23 Wed
      • 22 Tue
      • 21 Mon
      • 20 Sun
      • 19 Sat
      • 18 Fri
      • 17 Thu
      • 16 Wed
      • 15 Tue
      • 14 Mon
      • 13 Sun
      • 12 Sat
      • 11 Fri
      • 10 Thu
      • 9 Wed
      • 8 Tue
      • 7 Mon
      • 6 Sun
      • 5 Sat
      • 4 Fri
      • 3 Thu
      • 2 Wed
      • 1 Tue
    • MAY
      • 31 Mon
      • 30 Sun
      • 29 Sat
      • 28 Fri
      • 27 Thu
      • 26 Wed
      • 25 Tue
      • 24 Mon
      • 23 Sun
      • 22 Sat
      • 21 Fri
      • 20 Thu
      • 19 Wed
      • 18 Tue
      • 17 Mon
      • 16 Sun
      • 15 Sat
      • 14 Fri
      • 13 Thu
      • 12 Wed
      • 11 Tue
      • 10 Mon
      • 9 Sun
      • 8 Sat
      • 7 Fri
      • 6 Thu
      • 5 Wed
      • 4 Tue
      • 3 Mon
      • 2 Sun
      • 1 Sat
    • APR
      • 30 Fri
      • 29 Thu
      • 28 Wed
      • 27 Tue
      • 26 Mon
      • 25 Sun
      • 24 Sat
      • 23 Fri
      • 22 Thu
      • 21 Wed
      • 20 Tue
      • 19 Mon
      • 18 Sun
      • 17 Sat
      • 16 Fri
      • 15 Thu
      • 14 Wed
      • 13 Tue
      • 12 Mon
      • 11 Sun
      • 10 Sat
      • 9 Fri
      • 8 Thu
      • 7 Wed
      • 6 Tue
      • 5 Mon
      • 4 Sun
      • 3 Sat
      • 2 Fri
      • 1 Thu
    • MAR
      • 31 Wed
      • 30 Tue
      • 29 Mon
      • 28 Sun
      • 27 Sat
      • 26 Fri
      • 25 Thu
      • 24 Wed
      • 23 Tue
      • 22 Mon
      • 21 Sun
      • 20 Sat
      • 19 Fri
      • 18 Thu
      • 17 Wed
      • 16 Tue
      • 15 Mon
      • 14 Sun
      • 13 Sat
      • 12 Fri
      • 11 Thu
      • 10 Wed
      • 9 Tue
      • 8 Mon
      • 7 Sun
      • 6 Sat
      • 5 Fri
      • 4 Thu
      • 3 Wed
      • 2 Tue
      • 1 Mon
    • FEB
      • 28 Sun
      • 27 Sat
      • 26 Fri
      • 25 Thu
      • 24 Wed
      • 23 Tue
      • 22 Mon
      • 21 Sun
      • 20 Sat
      • 19 Fri
      • 18 Thu
      • 17 Wed
      • 16 Tue
      • 15 Mon
      • 14 Sun
      • 13 Sat
      • 12 Fri
      • 11 Thu
      • 10 Wed
      • 9 Tue
      • 8 Mon
      • 7 Sun
      • 6 Sat
      • 5 Fri
      • 4 Thu
      • 3 Wed
      • 2 Tue
      • 1 Mon
    • JAN
      • 31 Sun
      • 30 Sat
      • 29 Fri
      • 28 Thu
      • 27 Wed
      • 26 Tue
      • 25 Mon
      • 24 Sun
      • 23 Sat
      • 22 Fri
      • 21 Thu
      • 20 Wed
      • 19 Tue
      • 18 Mon
      • 17 Sun
      • 16 Sat
      • 15 Fri
      • 14 Thu
      • 13 Wed
      • 12 Tue
      • 11 Mon
      • 10 Sun
      • 9 Sat
      • 8 Fri
      • 7 Thu
      • 6 Wed
      • 5 Tue
      • 4 Mon
      • 3 Sun
      • 2 Sat
      • 1 Fri
  • 2020 TIL
    • DEC
      • 31 Thu
      • 30 Wed
      • 29 Tue
      • 28 Mon
      • 27 Sun
      • 26 Sat
      • 25 Fri
      • 24 Thu
      • 23 Wed
      • 22 Tue
      • 21 Mon
      • 20 Sun
      • 19 Sat
      • 18 Fri
      • 17 Thu
      • 16 Wed
      • 15 Tue
      • 14 Mon
      • 13 Sun
      • 12 Sat
      • 11 Fri
      • 10 Thu
      • 9 Wed
      • 8 Tue
      • 7 Mon
      • 6 Sun
      • 5 Sat
      • 4 Fri
      • 3 Tue
      • 2 Wed
      • 1 Tue
    • NOV
      • 30 Mon
Powered by GitBook
On this page
  • 1. Seq2Seq with attention Encoder-decoder architecture Attention mechanism
  • Seq2Seq Model
  • Seq2Seq Model with Attention
  • Different Attention Mechanisms
  • Attention is Great
  • ์‹ค์Šต
  • Encoder
  • Seq2Seq ๋ชจ๋ธ ๊ตฌ์ถ•

Was this helpful?

  1. TIL : ML
  2. Boostcamp 2st
  3. [U]Stage-NLP

(05๊ฐ•) Sequence to Sequence with Attention

210908

1. Seq2Seq with Attention: Encoder-Decoder Architecture and the Attention Mechanism

Seq2Seq Model

์•ž์„œ ๋ฐฐ์šด RNN์˜ ๊ตฌ์กฐ ์ค‘ Many to Many์— ํ•ด๋‹นํ•˜๋Š” ๋ชจ๋ธ์ด๋‹ค. ๋ณดํ†ต ์ž…๋ ฅ์€ word ๋‹จ์œ„์˜ ๋ฌธ์žฅ์ด๊ณ  ์ถœ๋ ฅ๋„ ๋™์ผํ•˜๋‹ค.

์ด ๋•Œ, ์ž…๋ ฅ ๋ฌธ์žฅ์„ ๋ฐ›๋Š” ๋ชจ๋ธ์„ ์ธ์ฝ”๋”๋ผ๊ณ  ํ•˜๊ณ  ํ•˜๋‚˜ํ•˜๋‚˜ ๋‹ต์„ ๋‚ด๋†“๋Š” ๋ถ€๋ถ„์„ ๋””์ฝ”๋”๋ผ๊ณ  ํ•œ๋‹ค. ์ธ์ฝ”๋”์™€ ๋””์ฝ”๋”๋Š” ์„œ๋กœ ๋‹ค๋ฅธ RNN ๋ชจ๋ธ์ด๋‹ค. ๊ทธ๋ž˜์„œ ํŒŒ๋ผ๋ฏธํ„ฐ๋ฅผ ๊ณต์œ ํ•˜๊ฑฐ๋‚˜ ํ•˜์ง€ ์•Š๋Š”๋‹ค. (์ธ์ฝ”๋”์™€ ๋””์ฝ”๋” ๊ฐ๊ฐ์€ ๋‚ด๋ถ€์ ์œผ๋กœ ๊ณต์œ ํ•œ๋‹ค)

๋˜ํ•œ, ๋‚ด๋ถ€ ๊ตฌ์กฐ๋ฅผ ์ž์„ธํžˆ ๋ณด๋ฉด LSTM์„ ์ฑ„์šฉํ•œ ๊ฒƒ์„ ์•Œ ์ˆ˜ ์žˆ๋‹ค. ์ธ์ฝ”๋”์˜ ๋งˆ์ง€๋ง‰ ๋‹จ์–ด๊นŒ์ง€ ์ฝ์€ ํ›„ ์ƒ์„ฑ๋˜๋Š” ๋งˆ์ง€๋ง‰ ์Šคํ…์˜ Hidden state๋Š” ๋””์ฝ”๋”์˜ h0๋กœ์„œ์˜ ์—ญํ• ์„ ํ•œ๋‹ค. ์ด hidden state๋Š” ์ž…๋ ฅ์— ๋Œ€ํ•œ ์ •๋ณด๋ฅผ ์ž˜ ๊ฐ€์ง€๊ณ  ์žˆ๋‹ค๊ณ  ๋ณผ ์ˆ˜ ์žˆ๊ณ  ์ด๋ฅผ ๋ฐ”ํƒ•์œผ๋กœ ๋””์ฝ”๋”์—์„œ ์‚ฌ์šฉํ•œ๋‹ค๊ณ  ๋ณผ ์ˆ˜ ์žˆ๋‹ค.

The decoder starts running when a <Start> token, also called <SoS> (Start of Sentence), is fed in, and the decoder RNN keeps generating until an <End> token, also called <EoS> (End of Sentence), is produced.

Hidden state์˜ ํฌ๊ธฐ๋Š” ์ฒ˜์Œ์— ๊ณ ์ •ํ•˜๊ธฐ ๋•Œ๋ฌธ์— ์•„๋ฌด๋ฆฌ ์งง์€ ๋ฌธ์žฅ์ด๋ผ๋„ hidden dimension๋งŒํผ์˜ ์ •๋ณด๋ฅผ ์ €์žฅํ•ด์•ผ ํ•˜๊ณ , ์•„๋ฌด๋ฆฌ ๊ธด ๋ฌธ์žฅ์ด๋ผ๋„ hidden dimnesion ๋งŒํผ์œผ๋กœ ์ •๋ณด๋ฅผ ์••์ถ•ํ•ด์•ผ ํ•œ๋‹ค.

๋˜, LSTM์ด Long Term Dependency๋ฅผ ํ•ด๊ฒฐํ–ˆ๋‹ค๊ณ  ํ•˜๋”๋ผ๋„ ํ›จ์”ฌ ์ด์ „์— ๋‚˜ํƒ€๋‚œ ์ •๋ณด๋Š” ๋ณ€์งˆ๋˜๊ฑฐ๋‚˜ ์†Œ์‹ค๋œ๋‹ค. ๊ทธ๋ž˜์„œ ๋ฌธ์žฅ์ด ๊ธธ๋‹ค๋ณด๋ฉด ์ฒซ๋ฒˆ์งธ ๋‹จ์–ด์— ๋Œ€ํ•œ ์ •๋ณด๊ฐ€ ์ ๊ธฐ ๋•Œ๋ฌธ์— ๋””์ฝ”๋”์˜ ์‹œ์ž‘๋ถ€ํ„ฐ ํ’ˆ์งˆ์ด ๋‚˜๋น ์ง€๋Š” ๋ฌธ์ œ๊ฐ€ ๋ฐœ์ƒํ•œ๋‹ค. ์ด์— ๋Œ€ํ•œ ํ…Œํฌ๋‹‰์œผ๋กœ "I go home" ์œผ๋กœ ์ž…๋ ฅํ•˜๋Š” ๊ฒƒ์ด ์•„๋‹Œ "home go I"๋กœ ์ž…๋ ฅํ•ด์„œ ๋ฌธ์žฅ์˜ ์ดˆ๋ฐ˜ ์ •๋ณด๋ฅผ ์ž˜ ์œ ์ง€ํ•  ์ˆ˜ ์žˆ๋„๋ก ํ•œ๋‹ค.

๋””์ฝ”๋”๋Š” ์ธ์ฝ”๋”์—์„œ ๋งˆ์ง€๋ง‰์œผ๋กœ ๋‚˜์˜จ hIdden state๋ฅผ h0์œผ๋กœ ์‚ฌ์šฉํ•˜์ง€๋งŒ ์ด๊ฒƒ๋งŒ์„ ์‚ฌ์šฉํ•˜์ง€ ์•Š๋Š”๋‹ค. ์ธ์ฝ”๋”์˜ ๊ฐ time step์—์„œ ๋‚˜์˜จ hidden state๋ฅผ ๋ชจ๋‘ ์ œ๊ณต๋ฐ›๊ณ  ์ด ์ค‘ ์„ ๋ณ„์ ์œผ๋กœ ์‚ฌ์šฉํ•ด์„œ ์˜ˆ์ธก์— ๋„์›€์„ ์ฃผ๋Š” ํ˜•ํƒœ๋กœ ํ™œ์šฉํ•œ๋‹ค. ์ด๊ฒƒ์ด attention ๋ชจ๋“ˆ์˜ ๊ธฐ๋ณธ์ ์ธ ์•„์ด๋””์–ด์ด๋‹ค.

Seq2Seq Model with Attention

hidden state๊ฐ€ 4๊ฐœ์˜ ์ฐจ์›์œผ๋กœ ๊ตฌ์„ฑ๋˜์—ˆ๊ณ  ํ”„๋ž‘์Šค์–ด๋ฅผ ์˜์–ด๋กœ ๋ณ€ํ™˜ํ•˜๋Š” ๊ณผ์ •์„ ์˜ˆ์‹œ๋กœ ๋“  ์ด๋ฏธ์ง€์ด๋‹ค. ๋‹ค์Œ๊ณผ ๊ฐ™์€ ์ˆœ์„œ๋กœ ๊ตฌ์„ฑ๋œ๋‹ค.

  • ์ธ์ฝ”๋”์—์„œ ์ž…๋ ฅ๋ณ„๋กœ hidden state๊ฐ€ ์ƒ์„ฑ๋˜๋ฉฐ ์ตœ์ข… hidden state๊ฐ€ ๋””์ฝ”๋”์— ์ œ๊ณต๋œ๋‹ค.

  • ๋””์ฝ”๋”๋Š” h0์™€ <sos> ํ† ํฐ์„ ๊ฐ€์ง€๊ณ  ์ฒซ๋ฒˆ์งธ h state๋ฅผ ์ƒ์„ฑํ•œ๋‹ค.

  • ์ฒซ๋ฒˆ์งธ h state๋Š” ์ธ์ฝ”๋”์˜ ๊ฐ๊ฐ์˜ h state์™€ ๋‚ด์ ์„ ํ•˜๊ฒŒ ๋œ๋‹ค.

    • ๋‚ด์ ์„ ํ•œ๋‹ค๋Š” ๊ฒƒ์€ ์œ ์‚ฌ๋„๋ฅผ ๋น„๊ตํ•˜๊ฒ ๋‹ค๋Š” ์˜๋ฏธ.

  • ์ดํ›„, ๊ฐ ์œ ์‚ฌ๋„๋ฅผ sofrmaxํ•œ ๊ฐ’์„ ๊ฐ€์ค‘์น˜๋กœ ์–ป๊ฒŒ๋œ๋‹ค.

  • ์ด ๋•Œ attention output ๋ฒกํ„ฐ๋Š” ๊ฐ€์ค‘ํ‰๊ท ๋œ ๋ฒกํ„ฐ์ด๋ฉฐ context ๋ฒกํ„ฐ๋ผ๊ณ ๋„ ๋ถ€๋ฅธ๋‹ค.

  • ์ดํ›„ ๋””์ฝ”๋”๋Š” ๋””์ฝ”๋”์˜ h state์™€ attention output ์„ concat ํ•˜๋ฉฐ ์˜ˆ์ธก๊ฐ’์„ ๋ฐ˜ํ™˜ํ•˜๊ฒŒ๋œ๋‹ค.

  • ๋งˆ์ฐฌ๊ฐ€์ง€๋กœ, ๋””์ฝ”๋”์˜ ๋‘๋ฒˆ์งธ step์—์„œ๋„ ๋™์ผํ•œ ๋ฉ”์ปค๋‹ˆ์ฆ˜์ด ์ ์šฉ๋œ๋‹ค.

  • <eos> ํ† ํฐ์ด ๋‚˜์˜ฌ๋•Œ๊นŒ์ง€ ์ž‘๋™๋œ๋‹ค.
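
A minimal sketch of one such decoder step with dot-product attention, assuming encoder hidden states of shape (S_L, B, d_h) and a decoder state of shape (B, d_h); the function and tensor names here are illustrative, not the lecture's code:

import torch
import torch.nn.functional as F

def attention_step(dec_hidden, enc_hiddens):
  # dec_hidden: (B, d_h), the decoder's hidden state at the current step
  # enc_hiddens: (S_L, B, d_h), one hidden state per encoder time step
  scores = torch.einsum("bd,sbd->bs", dec_hidden, enc_hiddens)  # dot-product similarities, (B, S_L)
  weights = F.softmax(scores, dim=-1)                           # attention weights, (B, S_L)
  context = torch.einsum("bs,sbd->bd", weights, enc_hiddens)    # weighted average = context vector, (B, d_h)
  concat = torch.cat([dec_hidden, context], dim=-1)             # fed to the output layer, (B, 2*d_h)
  return concat, weights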

์ •๋ฆฌํ•˜๋ฉด RNN์˜ ๋””์ฝ”๋”๋Š” 1) ๋‹ค์Œ ๋‹จ์–ด๋ฅผ ์˜ˆ์ธกํ•˜๊ณ  2) ์ธ์ฝ”๋”๋กœ๋ถ€ํ„ฐ ํ•„์š”๋กœ ํ•˜๋Š” ์ •๋ณด๋ฅผ ์ทจ์‚ฌ์„ ํƒํ•˜๋„๋ก, ํ•™์Šต์ด ์ง„ํ–‰๋œ๋‹ค. ์—ญ์ „ํŒŒ์— ๊ด€์ ์—์„œ๋„, Attention ๋ฒกํ„ฐ๊ฐ€ ๋‹ค์‹œ ์„ ํƒ๋  ์ˆ˜ ์žˆ๋„๋ก ์ธ์ฝ”๋”์˜ hidden state๊ฐ€ ๊ฐฑ์‹ ๋œ๋‹ค. ์ธ์ฝ”๋”์˜ h state๊ฐ€ ๊ฐฑ์‹ ๋˜๋ฏ€๋กœ ๋‹น์—ฐํžˆ ๋””์ฝ”๋”์˜ h state๋„ ๊ฐฑ์‹ ๋œ๋‹ค.

ํ•™์Šต์„ ํ•  ๋•Œ์—๋Š” ๋””์ฝ”๋”์˜ ๊ฐ ํƒ€์ž„์Šคํ…์˜ ์˜ˆ์ธก๊ฐ’์ด ๋ฌด์—‡์ด๋“  ๊ฐ„์— Ground Truth ๊ฐ’์„ ๋„ฃ์–ด์ฃผ๊ฒŒ ๋˜์ง€๋งŒ ์ถ”๋ก ์„ ํ•  ๋•Œ์—๋Š” ์ด์ „ ํƒ€์ž„์Šคํ…์˜ ์˜ˆ์ธก๊ฐ’์„ ๋‹ค์Œ ํƒ€์ž„์Šคํ…์˜ ์ž…๋ ฅ๊ฐ’์œผ๋กœ ๋„ฃ์–ด์ฃผ๊ฒŒ ๋œ๋‹ค.

  • ์ด๋ ‡๊ฒŒ ํ•™์Šต ์ค‘์— ์ž…๋ ฅ์„ Ground Truth๋กœ ๋„ฃ์–ด์ฃผ๋Š” ๋ฐฉ๋ฒ•์„ Teacher Forcing ์ด๋ผ๊ณ  ํ•œ๋‹ค.

  • ๋ฌผ๋ก , ํ•™์Šต์€ ์ž˜ ๋˜์ง€๋งŒ ์‹ค์ œ๋กœ ์šฐ๋ฆฌ๊ฐ€ ์ ์šฉํ•ด์•ผ ํ•˜๋Š” ๋ฌธ์ œ๋Š” Teacher Forcing ๊ณผ๋Š” ๊ดด๋ฆฌ๊ฐ€ ์žˆ๋‹ค. ๊ทธ๋ž˜์„œ ์ด๋ฅผ ์„ž์–ด์„œ ์‚ฌ์šฉํ•˜๋Š” ๋ฐฉ๋ฒ•์ด ๋‚˜์™”๋Š”๋ฐ, ํ•™์Šต ์ดˆ๋ฐ˜์—๋Š” ๋น ๋ฅธ ํ•™์Šต์„ ์œ„ํ•ด์„œ ์ด๋ฅผ ์ ์šฉํ–ˆ๋‹ค๊ฐ€, ํ•™์Šต์ด ์–ด๋А ์ •๋„ ๋˜๊ณ ๋‚˜์„œ๋Š” ์ ์šฉํ•˜์ง€ ์•Š๋„๋ก ํ•˜๋Š” ๋ฐฉ๋ฒ•๋„ ์กด์žฌํ•œ๋‹ค.

Different Attention Mechanisms

์ด์ „์—๋Š” ์œ ์‚ฌ๋„๋ฅผ ๊ตฌํ•˜๊ธฐ ์œ„ํ•ด ๋‚ด์ ์„ ์‚ฌ์šฉํ–ˆ๋Š”๋ฐ, ๋‚ด์  ์ด์™ธ์—๋„ ๋‹ค์–‘ํ•œ ๋ฐฉ๋ฒ•์œผ๋กœ attention์„ ๊ตฌ์„ฑํ•˜๋Š” ๋ฐฉ๋ฒ•์„ ์•Œ์•„๋ณด๋„๋ก ํ•œ๋‹ค.

  • h_t: the hidden vector coming from the decoder

  • h_s: the hidden vector of each source word from the encoder

๊ทธ๋ƒฅ ๋‚ด์ ์„ ํ•  ์ˆ˜๋„ ์žˆ์ง€๋งŒ generalized dot product ๋ผ๋Š” attention ๋ฐฉ๋ฒ•๋„ ์žˆ๋‹ค.

  • W๋Š” ๋Œ€๊ฐํ–‰๋ ฌ์˜ ๋ชจ์–‘์ด๋‹ค. ๊ฐ dimension ๋ณ„๋กœ ์ ์šฉํ•˜๋Š” ๊ฐ€์ค‘์น˜์˜ ์—ญํ• ์„ ํ•œ๋‹ค.

๋˜, concat ํ•˜๋Š” ๋ฐฉ๋ฒ•์ด ์žˆ๋Š”๋ฐ, ์ด์ „์˜ ๋‚ด์ ๋“ค๊ณผ๋Š” ๋‹ค๋ฅธ ๋ฐฉ๋ฒ•์ด๋‹ค. ์œ ์‚ฌ๋„๋ฅผ ๋‚ด์ ์ด ์•„๋‹ˆ๋ผ ์‹ ๊ฒฝ๋ง์„ ํ†ตํ•ด์„œ ๊ตฌํ•˜๋Š” ๋ฐฉ๋ฒ•์ด๋‹ค.

์—ฌ๊ธฐ์„œ W2์— ํ•ด๋‹นํ•˜๋Š” ๋ถ€๋ถ„์ด vaT v_a^T vaTโ€‹๊ฐ€ ๋œ๋‹ค.

  • It can be built as a 2-layer neural network.
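
Collected as formulas in the h_t / h_s notation above (this follows the Luong et al. formulation; W_a and v_a are the learnable parameters):

$$
\mathrm{score}(h_t, \bar{h}_s) =
\begin{cases}
h_t^\top \bar{h}_s & \text{(dot)} \\
h_t^\top W_a \bar{h}_s & \text{(general)} \\
v_a^\top \tanh\left(W_a [h_t; \bar{h}_s]\right) & \text{(concat)}
\end{cases}
$$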

์ด์ „์˜ attention์€ ํŒŒ๋ผ๋ฏธํ„ฐ๊ฐ€ ํ•„์š”์—†๋Š” ๋‚ด์  ์—ฐ์‚ฐ์˜ ๋ชจ๋“ˆ์ด์—ˆ๋Š”๋ฐ, ํŒŒ๋ผ๋ฏธํ„ฐ๊ฐ€ ํ•„์š”ํ•œ ํ•™์Šต์ด ๋˜๋ฉด์„œ ์ข€ ๋” ์ตœ์ ํ™” ํ•  ์ˆ˜ ์žˆ๊ฒŒ๋œ๋‹ค.

Attention is Great

  • ๋””์ฝ”๋”์˜ ๋งค ์Šคํ…๋งˆ๋‹ค ํŠน์ • ์ •๋ณด๋ฅผ ์ œ๊ณตํ•˜๋ฉด์„œ ์„ฑ๋Šฅ์ด ๋งค์šฐ ํ–ฅ์ƒ๋˜์—ˆ๋‹ค.

  • attention์„ ํ•˜๋ฉด์„œ ๊ธด ๋ฌธ์žฅ์˜ ๋ฒˆ์—ญ์ด ์–ด๋ ค์šด ์ , bottleneck problem์„ ํ•ด๊ฒฐํ–ˆ๋‹ค.

  • ์—ญ์ „ํŒŒ ๊ณผ์ •์—์„œ ๋””์ฝ”๋” ์Šคํ…๊ณผ ์ธ์ฝ”๋” ์Šคํ…์„ ๊ฑฐ์ณ๊ฐ€๋ฉด์„œ ๋งค์šฐ ๊ธด ํƒ€์ž„์Šคํ…์„ ์ง€๋‚˜๊ฒŒ๋˜๊ณ  ์ด ๋•Œ gradient ์†Œ์‹ค ๋˜๋Š” ์ฆํญ ๋ฌธ์ œ๊ฐ€ ๋ฐœ์ƒํ•  ์ˆ˜ ์žˆ๊ฒŒ๋˜๋Š”๋ฐ attention์„ ์‚ฌ์šฉํ•˜๋ฉด์„œ gradient๊ฐ€ ์ง์ ‘์ ์œผ๋กœ ์ „๋‹ฌํ•  ์ˆ˜ ์žˆ๋Š” ๋ฐฉ๋ฒ•์ด ์ถ”๊ฐ€๋˜๋ฉด์„œ gradient๊ฐ€ ๋ณ€์งˆ์—†์ด ์ „๋‹ฌ๋  ์ˆ˜ ์žˆ๊ฒŒ๋˜์—ˆ๋‹ค.

  • ํฅ๋ฏธ๋กœ์šด ํ•ด์„๊ฐ€๋Šฅ์„ฑ์„ ์ œ๊ณตํ•ด์ค€๋‹ค.

    • attention์„ ์กฐ์‚ฌํ•ด์„œ h state๊ฐ€ ๊ฐ ๋‹จ์–ด์˜ ์–ด๋–ค ๋ถ€๋ถ„์— ์ง‘์ค‘ํ–ˆ๋Š”์ง€ ๊ด€์ฐฐํ•  ์ˆ˜ ์žˆ๊ฒŒ๋˜์—ˆ๋‹ค.

์‹ค์Šต

๋งค ์‹ค์Šต๋งˆ๋‹ค ๋™์ผํ•œ ๋ถ€๋ถ„์ด ํ•ต์‹ฌ ํด๋ž˜์Šค๋งŒ ๋‹ค๋ฃน๋‹ˆ๋‹ค.

Encoder

import torch
import torch.nn as nn
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

embedding_size = 256
hidden_size = 512
num_layers = 2
num_dirs = 2
dropout = 0.1

class Encoder(nn.Module):
  def __init__(self):
    super(Encoder, self).__init__()

    # vocab_size is a global defined by the data setup in the notebook.
    self.embedding = nn.Embedding(vocab_size, embedding_size)
    self.gru = nn.GRU(
        input_size=embedding_size,
        hidden_size=hidden_size,
        num_layers=num_layers,
        bidirectional=True if num_dirs > 1 else False,
        dropout=dropout
    )
    # Projects the concatenated (forward, backward) final hidden states back to hidden_size.
    self.linear = nn.Linear(num_dirs * hidden_size, hidden_size)

  def forward(self, batch, batch_lens):  # batch: (B, S_L), batch_lens: (B)
    # d_w: word embedding size
    batch_emb = self.embedding(batch)  # (B, S_L, d_w)
    batch_emb = batch_emb.transpose(0, 1)  # (S_L, B, d_w)

    # batch_lens must be sorted in descending order (pack_padded_sequence default).
    packed_input = pack_padded_sequence(batch_emb, batch_lens)

    h_0 = torch.zeros((num_layers * num_dirs, batch.shape[0], hidden_size))  # (num_layers*num_dirs, B, d_h) = (4, B, d_h)
    packed_outputs, h_n = self.gru(packed_input, h_0)  # h_n: (4, B, d_h)
    outputs = pad_packed_sequence(packed_outputs)[0]  # outputs: (S_L, B, 2d_h)

    # h_n[-2] / h_n[-1]: forward / backward hidden states of the last layer.
    forward_hidden = h_n[-2, :, :]
    backward_hidden = h_n[-1, :, :]
    hidden = self.linear(torch.cat((forward_hidden, backward_hidden), dim=-1)).unsqueeze(0)  # (1, B, d_h)

    return outputs, hidden
  • 3, 4๊ฐ•์— ๋“ฑ์žฅํ•œ ์ธ์ฝ”๋”์™€ ๋™์ผํ•˜๋‹ค. ๋‹ค๋งŒ, ๋‚ด๋ถ€ ์ธ์ž๊ฐ€ ์‚ด์ง ๋‹ฌ๋ผ์„œ ์ถ”๊ฐ€๋œ ์ฝ”๋“œ๊ฐ€ ์žˆ๋‹ค. ์—ฌ๊ธฐ์„œ๋Š”, layer์˜ ์ˆ˜๊ฐ€ 2๊ฐœ์ด๊ณ  ๋ฐฉํ–ฅ๋„ ์–‘๋ฐฉํ–ฅ์ด๋‹ค.

    • ๊ทธ๋ž˜์„œ hidden state์˜ 3์ฐจ์› ๊ฐœ์ˆ˜๊ฐ€ 1์—์„œ 4๋กœ ์ฆ๊ฐ€ํ–ˆ๋‹ค.

    • ๋˜ํ•œ, layer๊ฐ€ 2๊ฐœ์ด๋ฏ€๋กœ forward_hidden ์„ ์ฒซ๋ฒˆ์งธ layer๋กœ, backward_hidden ์„ ๋‘๋ฒˆ์งธ layer๋กœ ์ •ํ–ˆ๊ณ  ์‹ค์ œ hidden state๋ฅผ ๋ฐ˜ํ™˜ํ•  ๋•Œ๋Š” ์ด ๋‘˜์€ cat ํ•ด์„œ ๋ฐ˜ํ™˜ํ–ˆ๋‹ค.

๋””์ฝ”๋”๋Š” ์ด์ „๊ณผ ๋™์ผํ•˜๋ฏ€๋กœ ์ƒ๋žตํ•œ๋‹ค.

Seq2Seq ๋ชจ๋ธ ๊ตฌ์ถ•

import random

class Seq2seq(nn.Module):
  def __init__(self, encoder, decoder):
    super(Seq2seq, self).__init__()

    self.encoder = encoder
    self.decoder = decoder

  def forward(self, src_batch, src_batch_lens, trg_batch, teacher_forcing_prob=0.5):
    # src_batch: (B, S_L), src_batch_lens: (B), trg_batch: (B, T_L)

    _, hidden = self.encoder(src_batch, src_batch_lens)  # hidden: (1, B, d_h)

    input_ids = trg_batch[:, 0]  # (B), the <sos> tokens
    batch_size = src_batch.shape[0]
    outputs = torch.zeros(trg_max_len, batch_size, vocab_size)  # (T_L, B, V); trg_max_len, vocab_size are globals

    for t in range(1, trg_max_len):
      decoder_outputs, hidden = self.decoder(input_ids, hidden)  # decoder_outputs: (B, V), hidden: (1, B, d_h)

      outputs[t] = decoder_outputs
      _, top_ids = torch.max(decoder_outputs, dim=-1)  # top_ids: (B), greedy predictions

      # Teacher forcing: feed the ground truth with probability teacher_forcing_prob,
      # otherwise feed the model's own prediction from the previous step.
      input_ids = trg_batch[:, t] if random.random() < teacher_forcing_prob else top_ids

    return outputs
  • encoder์˜ output์€ ์‚ฌ์šฉํ•˜์ง€ ์•Š๋Š” ๋ชจ์Šต.

  • ๋˜ํ•œ, decoder์˜ output์€ encoder์ฒ˜๋Ÿผ ํ•œ๋ฒˆ์— ๋‚˜์˜ค์ง€ ์•Š์œผ๋ฏ€๋กœ for๋ฌธ์œผ๋กœ ์ž‘๋™์‹œํ‚จ๋‹ค. ๊ทธ๋ž˜์„œ ์ด๋ฅผ ๋‹ด์•„์ฃผ๊ธฐ ์œ„ํ•œ outputs๋ฅผ ์„ ์–ธํ•ด์ค€๋‹ค.

๋‚ด๊ฐ€ ๊ทธ๋ฆฐ ๊ธฐ๋ฆฐ ๊ทธ๋ฆผ