(Lecture 08) Sequential Models - Transformer

210812


Sequences are hard to model because real-world sequences are often irregular: trimmed, with elements omitted, or permuted.

Transformer

Unlike an RNN, the Transformer has no recurrent part; instead it relies on a mechanism called attention.

Because the Transformer is at its core an encoding scheme, it is not limited to NMT problems; it can also be applied to image classification, detection, DALL-E, and more.

  • NMT is short for Neural Machine Translation.

  • DALL-E takes a text prompt and produces a matching image.

A task such as translating French into English is called sequence-to-sequence.

The input and output sequences can differ in length and in domain, and everything is handled by a single model. An RNN given three words runs recursively three times; a Transformer, in contrast, can encode 100 or 1,000 words in a single pass.

  • Of course, generation still happens one word at a time; the single-pass property applies to encoding.

What we need to understand is the following:

  • How N words are processed at once in the encoder

  • What information the encoder and decoder exchange

  • How the decoder is able to generate output

The third point will get less coverage in the lecture for time reasons.

The new ingredient is self-attention.

Suppose we are given three vectors.

The encoder maps each of these vectors to a new vector.

The important point is that x1 is not mapped to z1 in isolation: the remaining vectors x2 and x3 are taken into account too. In other words, when converting the i-th x into the i-th z, all of the other x vectors are considered, which is why we say there are dependencies between them.

In contrast, the feed-forward step that follows simply pushes each vector through the network independently, so it has no such dependencies.

Now suppose the sequence consists of the two words "Thinking" and "Machine". To explain self-attention with them, consider the following sentence:

The animal didn't cross the street because it was too tired.

Here "it" cannot be interpreted purely as a standalone word; we have to work out how the word interacts with the other words in the sentence.

The Transformer learns these relationships with the other words and, in the end, learns that "it" is most strongly related to "animal".

The self-attention block produces three vectors for every word; having three vectors effectively means there are three separate neural networks (projections).

  • The three vectors are the Query, the Key, and the Value.

So when the vector x1 comes in, it is used to create q1, k1, and v1, and through these the encoder transforms x1 into a different vector.

If the words "Thinking" and "Machine" are input, these three vectors are created for each word. Next, a score is computed: the query vector of the word being encoded is dotted with the key vectors of all the words, including itself.

  • This measures how similar the word is to each of the other words.

  • Taking the dot product lets the model learn by itself how much the word should interact with each of the other words.

  • This is exactly what attention means: to encode a given word, the model has to figure out which of the other words it should interact with most.

The scores are then divided by 8. That number comes from the dimension of the key vectors: here the key vectors are 64-dimensional, and the scores are divided by the square root of 64, which is 8.

  • This keeps the score values in a reasonable range so they do not blow up.

The scaled scores then go through a softmax and are multiplied with the value vectors.

  • In short, the weight on each value vector is obtained by taking the dot product of the query vector with the key vectors, dividing by the square root of the key dimension, and applying a softmax; the resulting attention weights are then multiplied with the value vectors (this is summarized in the formula below).
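
Written out, this is the standard scaled dot-product attention formula, where $d_k$ is the key dimension:

$$
\mathrm{Attention}(Q, K, V) = \mathrm{softmax}\left(\frac{QK^\top}{\sqrt{d_k}}\right)V
$$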

Important points here:

  • The query and key vectors are dotted together, so they must always have the same dimension.

  • The value vectors only enter through the weighted sum, so their dimension may differ.

  • The dimension of the encoded output vector is the same as the dimension of the value vector.

    • In multi-head attention this changes again.

Now think of it as matrix operations. With two words and a 4-dimensional embedding, the input X is a (2, 4) matrix.

  • The number of rows corresponds to the number of words; the embedding dimension seems to just be shown as 4 here for illustration.

Multiplying X by the weight matrices W gives Q, K, and V.

Then the queries and keys are multiplied, the result goes through a softmax, and that is used to take the weighted sum of the value vectors.

In code this takes only a line or two.
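
A minimal sketch of what those one or two lines look like in PyTorch; the toy shapes below are my own choice, not the lecture's:

```python
import torch
import torch.nn.functional as F

# toy setup: 2 words, embedding dim 4, query/key/value dim 3 (arbitrary small sizes)
x = torch.randn(2, 4)                                   # X: (num_words, embedding_dim)
W_q, W_k, W_v = (torch.randn(4, 3) for _ in range(3))   # projection weights (random here)

Q, K, V = x @ W_q, x @ W_k, x @ W_v        # each (2, 3)
scores = Q @ K.T / K.shape[-1] ** 0.5      # scaled dot products, (2, 2)
attention = F.softmax(scores, dim=-1)      # how much each word attends to the others
Z = attention @ V                          # weighted sum of the value vectors, (2, 3)
```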

Why does the Transformer work so well?

With a conventional model like a CNN, once the input is fixed the output is fixed too. With a Transformer, even when the input word is fixed, its encoding can change depending on the surrounding words, so it is a far more flexible model.

  • As a result it can produce a much wider range of outputs.

There is a computational cost, though.

  • With n words, an n×n attention map has to be built.

    • An RNN with a sequence of a thousand steps just runs a thousand times (meaning that, given enough time, running it is not a problem). A Transformer, however, has to process all N words at once, so it eats up a lot of memory.

    • On the other hand, if you can pay that cost, you get a more flexible and better-performing model.

If several such vectors (heads) are produced for a single word, this is called multi-headed attention (MHA).

Say we produce 8 of these vectors.

Here, however, the input and output sizes must match, and with 8 output vectors the concatenated output is 8 times too large.

So an extra matrix is multiplied in to bring the output back to the right size.

  • For example, if the concatenated result is a (10, 80) matrix, it is multiplied by an (80, 10) matrix to get a (10, 10) matrix.

In practice, however, it is not actually implemented this way; the details are covered in the code practice session.
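
A rough sketch of how multi-head attention is commonly implemented in practice (my assumption about the usual trick, not the course code): instead of running 8 separate small attentions and concatenating, the model projects once to d_model and splits that across heads, so each head attends over a slice of the embedding.

```python
import torch
import torch.nn.functional as F

def multi_head_attention(x, W_q, W_k, W_v, W_o, num_heads):
    """x: (seq_len, d_model); each W_* is a (d_model, d_model) weight matrix."""
    seq_len, d_model = x.shape
    d_head = d_model // num_heads

    def split_heads(t):
        # (seq_len, d_model) -> (num_heads, seq_len, d_head)
        return t.view(seq_len, num_heads, d_head).transpose(0, 1)

    Q, K, V = split_heads(x @ W_q), split_heads(x @ W_k), split_heads(x @ W_v)

    scores = Q @ K.transpose(-2, -1) / d_head ** 0.5   # (num_heads, seq_len, seq_len)
    attn = F.softmax(scores, dim=-1)
    Z = attn @ V                                       # (num_heads, seq_len, d_head)

    # concatenate the heads back together and project down to d_model
    Z = Z.transpose(0, 1).reshape(seq_len, d_model)
    return Z @ W_o
```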

We feed the N words in as a sequence, but the Transformer itself does not actually use any ordering information, which is why positional encoding is needed. It is apparently built with a pre-defined method (so what exactly is that method? heh).

The left figure in the slides shows the positional encoding used in the past; more recently the version on the right is used.
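
The lecture does not say which pre-defined method is used; assuming it is the sinusoidal encoding from "Attention Is All You Need", a minimal sketch looks like this:

```python
import torch

def positional_encoding(seq_len, d_model):
    """Sinusoidal positional encoding (assumes d_model is even)."""
    pos = torch.arange(seq_len, dtype=torch.float).unsqueeze(1)    # (seq_len, 1)
    i = torch.arange(0, d_model, 2, dtype=torch.float)             # even dimensions
    angles = pos / 10000 ** (i / d_model)                          # (seq_len, d_model // 2)

    pe = torch.zeros(seq_len, d_model)
    pe[:, 0::2] = torch.sin(angles)   # even indices get sin
    pe[:, 1::2] = torch.cos(angles)   # odd indices get cos
    return pe                         # added elementwise to the word embeddings
```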

Anyway, once attention is done the Z vectors are produced, and they then pass through a LayerNorm step. The lecture does not explain what that is, so I looked it up myself; LN is easiest to explain by contrasting it with BN.

First, BN computes the mean and variance of each feature across the batch and normalizes each feature in the batch.

LN, by contrast, computes the mean and variance over the features of each individual input and normalizes each input in the batch.

Even just looking at the picture makes it easy to understand!
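
A quick PyTorch contrast of the two (the shapes here are arbitrary):

```python
import torch
import torch.nn as nn

x = torch.randn(32, 64)       # (batch of 32 inputs, 64 features each)

bn = nn.BatchNorm1d(64)       # per-feature statistics across the 32 samples
ln = nn.LayerNorm(64)         # per-sample statistics across the 64 features

out_bn = bn(x)                # each feature column is normalized over the batch
out_ln = ln(x)                # each input row is normalized over its own features
```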

์ธ์ฝ”๋”๋Š” ๊ฒฐ๊ตญ ๋””์ฝ”๋”์—๊ฒŒ ํ‚ค์™€ ๋ฐธ๋ฅ˜๋ฅผ ๋ณด๋‚ด๊ฒŒ ๋œ๋‹ค.

  • ์ถœ๋ ฅํ•˜๊ณ ์ž ํ•˜๋Š” ๋‹จ์–ด๋“ค์— ๋Œ€ํ•ด attention map์„ ๋งŒ๋“œ๋ ค๋ฉด ์ธํ’‹์— ํ•ด๋‹นํ•˜๋Š” ๋‹จ์–ด๋“ค์˜ ํ‚ค๋ฒกํ„ฐ์™€ ๋ฐธ๋ฅ˜๋ฒกํ„ฐ๊ฐ€ ํ•„์š”ํ•˜๊ธฐ ๋•Œ๋ฌธ์ด๋‹ค.

์ธํ’‹์€ ํ•œ๋ฒˆ์— ์ž…๋ ฅ๋ฐ›์ง€๋งŒ ์ถœ๋ ฅ์€ ํ•œ ๋‹จ์–ด์”ฉ ๋””์ฝ”๋”์— ๋„ฃ์–ด์„œ ์ถœ๋ ฅํ•˜๊ฒŒ ๋œ๋‹ค.

๋””์ฝ”๋”์—์„œ๋Š” ์ธ์ฝ”๋”์™€ ๋‹ฌ๋ฆฌ ์ˆœ์ฐจ์ ์œผ๋กœ ๊ฒฐ๊ณผ๋ฅผ ๋งŒ๋“ค์–ด๋‚ด์•ผ ํ•ด์„œ self-attention์„ ๋ณ€ํ˜•ํ•˜๊ฒŒ๋œ๋‹ค. ๋ฐ”๋กœ masking์„ ํ•ด์ฃผ๋Š” ๊ฒƒ. ์ธ์ฝ”๋”๋Š” ์ž…๋ ฅ์ˆœ์„œ๊ฐ€ ์ด๋ฏธ ์ •ํ•ด์ ธ์žˆ๊ธฐ ๋•Œ๋ฌธ์— decoder์ž…์žฅ์—์„œ๋Š” i๋ฒˆ์งธ ๋‹จ์–ด๊ฐ€ ๋ฌด์—‡์ธ์ง€ ์˜ˆ์ธกํ•˜๊ธฐ๊ฐ€ ์‰ฌ์›Œ์ง€๊ธฐ ๋•Œ๋ฌธ์— ์ด๋Ÿฌํ•œ ๋งˆ์Šคํ‚น์„ ํ•ด์ค€๋‹ค.

Encoder-decoder attention is the link between the two: the decoder supplies only the query vectors, while the key and value vectors it uses are the ones produced by the encoder.

As a side note, I discussed the following with a peer.

Is positional encoding really essential?

In language models, positional encoding is usually close to essential; without it the model still works, just so-so. But in deep autoregressive models (a Transformer-style model), with a large dataset and a deep model, the data itself ends up supplying the weights with ordering information even though no explicit order is given. In the end, the paper "Language Modeling with Deep Transformers" even reports that leaving out the positional encoding improved performance.

Vision Transformer

Self-attention is no longer used only on sequences of words; it is now applied to images as well.

Only the encoder is used, and the vector coming out of the encoder is fed directly into the classification head.

The difference is that, whereas language feeds in a sentence as a sequence of words, an image is first split into a number of patches, each patch is passed through a Linear Layer, and the results are fed in as if they were one input sequence (a small sketch follows below).

  • Positional encoding is included as well, of course.
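
A rough sketch of the patch-splitting idea; the 224×224 image and 16×16 patches follow the usual ViT setup and are my assumption, not the lecture's numbers:

```python
import torch
import torch.nn as nn

img = torch.randn(1, 3, 224, 224)             # (batch, channels, height, width)
patch, d_model = 16, 768                      # 16x16 patches, as in the ViT paper

# cut the image into non-overlapping 16x16 patches and flatten each patch
patches = img.unfold(2, patch, patch).unfold(3, patch, patch)        # (1, 3, 14, 14, 16, 16)
patches = patches.permute(0, 2, 3, 1, 4, 5).reshape(1, 14 * 14, -1)  # (1, 196, 3*16*16)

to_token = nn.Linear(3 * patch * patch, d_model)   # the "Linear Layer" from the note
tokens = to_token(patches)                         # (1, 196, 768), like a word sequence
# positional encodings (and, in ViT, a [CLS] token) are added before the encoder
```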

DALL-E

Given a sentence, it produces a matching image. It uses only the decoder part of the Transformer.

Reference: Language Modeling with Deep Transformers