12 Tue


[AI School 1st] Week 6 DAY 2

Linear Regression (Linear Models for Regression)

Source:

Linear Basis Function Models

  • The simplest model

    • Extended with nonlinear functions of x as follows.

      Basis function:

  • A few basis functions (standard forms are sketched after this list)

    • Polynomial basis functions

    • Gaussian basis functions

    • Sigmoid basis functions

      From left to right: polynomial, Gaussian, and sigmoid basis functions
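
For reference, a sketch of the standard forms in PRML-style notation (the symbols $\mathbf{w}$, $\boldsymbol{\phi}$, $M$, $\mu_j$, $s$ are my notation, not necessarily the lecture's):

$$
y(\mathbf{x}, \mathbf{w}) = w_0 + \sum_{j=1}^{M-1} w_j\,\phi_j(\mathbf{x}) = \mathbf{w}^{\mathrm T}\boldsymbol{\phi}(\mathbf{x}), \qquad \phi_0(\mathbf{x}) = 1
$$

$$
\phi_j(x) = x^{\,j} \ \text{(polynomial)}, \qquad
\phi_j(x) = \exp\!\left(-\frac{(x-\mu_j)^2}{2s^2}\right) \ \text{(Gaussian)}, \qquad
\phi_j(x) = \sigma\!\left(\frac{x-\mu_j}{s}\right),\ \ \sigma(a) = \frac{1}{1+e^{-a}} \ \text{(sigmoid)}
$$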

Maximum Likelihood and Least Squares

  • Target t with additive Gaussian noise

  • Distribution of t

    • When the sum-of-squares loss is used, the optimal prediction of t for a new x is the conditional expectation of t (previous lecture)

      When t follows the above distribution, the conditional expectation is

  • Finding the optimal w by maximum likelihood estimation

    • Likelihood function

    • Log-likelihood function

    • Gradient vector with respect to w

      Design matrix

    • Deriving the normal equations (see the sketch after this list)

      Expanding,

    • The optimal value of β can also be obtained from the log-likelihood above by taking the partial derivative with respect to β

  • Geometric interpretation

    • Review span, range, and projection (linear algebra)

    • Projection with respect to the matrix A
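
A sketch of the standard maximum-likelihood results in the same notation, assuming Gaussian noise with precision $\beta$ and a design matrix $\boldsymbol{\Phi}$ with $\Phi_{nj} = \phi_j(\mathbf{x}_n)$:

$$
t = y(\mathbf{x}, \mathbf{w}) + \epsilon, \qquad p(t \mid \mathbf{x}, \mathbf{w}, \beta) = \mathcal{N}\!\left(t \mid y(\mathbf{x}, \mathbf{w}),\ \beta^{-1}\right), \qquad \mathbb{E}[t \mid \mathbf{x}] = y(\mathbf{x}, \mathbf{w})
$$

$$
\ln p(\mathbf{t} \mid \mathbf{w}, \beta) = \frac{N}{2}\ln\beta - \frac{N}{2}\ln(2\pi) - \beta E_D(\mathbf{w}), \qquad E_D(\mathbf{w}) = \frac{1}{2}\sum_{n=1}^{N}\left\{t_n - \mathbf{w}^{\mathrm T}\boldsymbol{\phi}(\mathbf{x}_n)\right\}^2
$$

$$
\nabla_{\mathbf{w}} \ln p = \beta \sum_{n=1}^{N}\left\{t_n - \mathbf{w}^{\mathrm T}\boldsymbol{\phi}(\mathbf{x}_n)\right\}\boldsymbol{\phi}(\mathbf{x}_n)^{\mathrm T} = 0
\;\Rightarrow\;
\mathbf{w}_{\mathrm{ML}} = \left(\boldsymbol{\Phi}^{\mathrm T}\boldsymbol{\Phi}\right)^{-1}\boldsymbol{\Phi}^{\mathrm T}\mathbf{t} = \boldsymbol{\Phi}^{\dagger}\mathbf{t} \quad \text{(normal equations)}
$$

So maximizing the log-likelihood with respect to $\mathbf{w}$ is the same as minimizing the sum-of-squares error, and $\boldsymbol{\Phi}^{\dagger}$ is the Moore-Penrose pseudo-inverse of the design matrix.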

์˜จ๋ผ์ธ ํ•™์Šต (Sequential Learning)

  • ๋ฐ์ดํ„ฐ์˜ ์‚ฌ์ด์ฆˆ๊ฐ€ ๋„ˆ๋ฌด ํฌ๋ฉด ๊ณ„์‚ฐ์ด ์–ด๋ ค์›€ -> ์—ฌ๋Ÿฌ ๋Œ€์•ˆ์ด ์กด์žฌ, ๊ทธ ์ค‘ ํ•˜๋‚˜

  • ๊ฐ–๊ณ ์žˆ๋Š” ํ•™์Šต๋ฐ์ดํ„ฐ๋ฅผ ์กฐ๊ธˆ ๋‚˜๋ˆ ์„œ ์กฐ๊ธˆ์”ฉ ์—…๋ฐ์ดํ„ฐ ์ง„ํ–‰

  • ๋ฐ์ดํ„ฐ๊ฐ€ ์•„๋ฌด๋ฆฌ ํฌ๋”๋ผ๋„ ์–ด๋А์ •๋„ ๋ชจ๋ธ ํ•™์Šต ๊ฐ€๋Šฅ

  • ๊ทธ ์ค‘ ๋งŽ์ด ์“ฐ์ด๋Š” Stochastic gradient decent

    • ์ œ๊ณฑํ•ฉ ์—๋Ÿฌํ•จ์ˆ˜์ธ ๊ฒฝ์šฐ,

  • ์‹œ๊ฐ„์€ ๋งŽ์ด ๊ฑธ๋ฆฌ๋”๋ผ๋„, ๋ฉ”๋ชจ๋ฆฌ์— ๋Œ€ํ•œ ๋ถ€๋‹ด์€ โ†“
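
A minimal NumPy sketch of this sequential update for the sum-of-squares error, using the rule $\mathbf{w} \leftarrow \mathbf{w} + \eta\,(t_n - \mathbf{w}^{\mathrm T}\boldsymbol{\phi}_n)\,\boldsymbol{\phi}_n$; the toy data, polynomial basis, learning rate, and epoch count are illustrative choices, not from the lecture:

```python
import numpy as np

def sgd_least_squares(Phi, t, eta=0.1, n_epochs=500, seed=0):
    """Sequential (stochastic) least squares: one data point per update."""
    rng = np.random.default_rng(seed)
    N, M = Phi.shape
    w = np.zeros(M)
    for _ in range(n_epochs):
        for n in rng.permutation(N):        # visit the data points in random order
            error = t[n] - Phi[n] @ w       # prediction error for a single sample
            w += eta * error * Phi[n]       # gradient step on that sample's squared error
    return w

# Toy usage: noisy sine data with a degree-3 polynomial basis (illustrative only)
x = np.linspace(0, 1, 50)
t = np.sin(2 * np.pi * x) + 0.1 * np.random.default_rng(1).standard_normal(50)
Phi = np.vander(x, N=4, increasing=True)    # columns: 1, x, x^2, x^3
print(sgd_least_squares(Phi, t))
```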

Regularized Least Squares

  • The simplest form of the regularized error function (sketched after this list)

    The amount of regularization is controlled by λ

  • Generalized regularization

    • Thinking of it with a constraint attached (expressed as a constrained minimization problem),
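
The standard forms, for reference (same notation as above; the exponent $q$ and the constraint bound $\eta$ follow the usual textbook presentation):

$$
E(\mathbf{w}) = \frac{1}{2}\sum_{n=1}^{N}\left\{t_n - \mathbf{w}^{\mathrm T}\boldsymbol{\phi}(\mathbf{x}_n)\right\}^2 + \frac{\lambda}{2}\,\mathbf{w}^{\mathrm T}\mathbf{w}
\;\;\Rightarrow\;\;
\mathbf{w} = \left(\lambda\mathbf{I} + \boldsymbol{\Phi}^{\mathrm T}\boldsymbol{\Phi}\right)^{-1}\boldsymbol{\Phi}^{\mathrm T}\mathbf{t}
$$

$$
\frac{1}{2}\sum_{n=1}^{N}\left\{t_n - \mathbf{w}^{\mathrm T}\boldsymbol{\phi}(\mathbf{x}_n)\right\}^2 + \frac{\lambda}{2}\sum_{j=1}^{M}|w_j|^q
\;\;\Longleftrightarrow\;\;
\min_{\mathbf{w}} E_D(\mathbf{w}) \ \text{ subject to } \ \sum_{j=1}^{M}|w_j|^q \le \eta
$$

Here $q = 2$ is the quadratic regularizer above, and $q = 1$ tends to give sparse solutions.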

Bias-Variance Decomposition

  • ๋ชจ๋ธ ๊ณผ์ ํ•ฉ์— ๋Œ€ํ•œ ์ด๋ก ์ ์ธ ๋ถ„์„

  • ์ œ๊ณฑํ•ฉ ์†์‹คํ•จ์ˆ˜๊ฐ€ ์ฃผ์–ด์กŒ์„๋•Œ์˜ ์ตœ์  ์˜ˆ์ธก๊ฐ’

  • ์†์‹คํ•จ์ˆ˜์˜ ๊ธฐ๋Œ“๊ฐ’

    ์ œํ•œ๋œ ๋ฐ์ดํ„ฐ์…‹๋งŒ ์•Œ์•„์„œ๋Š” ์ตœ์  ์˜ˆ์ธก๊ฐ’ h(x)๋ฅผ ์•Œ์ˆ˜ ์—†๋‹ค.

    ๋”ฐ๋ผ์„œ ๋ชจ๋ธ์˜ ๋ถˆํ™•์‹ค์„ฑ์„ ํ‘œํ˜„ํ•˜๊ธฐ ์œ„ํ•ด์„œ๋Š” ๋ฒ ์ด์ง€์•ˆ/๋นˆ๋„์ฃผ์˜ ๋ฐฉ๋ฒ•์ด ์žˆ์Œ

    • ๋ฒ ์ด์ง€์•ˆ ๋ฐฉ๋ฒ•์€ ๋ชจ๋ธ ํŒŒ๋ผ๋ฏธํ„ฐ w์˜ ์‚ฌํ›„ํ™•๋ฅ ๋ถ„ํฌ๋ฅผ ๊ณ„์‚ฐ

    • ๋นˆ๋„์ฃผ์˜ ๋ฐฉ๋ฒ•์€ w์˜ ์ ์ถ”์ •๊ฐ’์„ ๊ตฌํ•˜๊ณ , ์—ฌ๋Ÿฌ ๋ฐ์ดํ„ฐ์…‹์— ๋Œ€ํ•ด ๋ฐœ์ƒํ•˜๋Š” ํ‰๊ท ์ ์ธ ์†์‹ค์„ ๊ณ„์‚ฐํ•˜๋Š” ๊ฐ€์ƒ ์‹คํ—˜์„ ํ†ตํ•ด ์ ์ถ”์ •๊ฐ’์˜ ๋ถˆํ™•์‹ค์„ฑ์„ ํ•ด์„โ—

/ ๋นˆ๋„์ฃผ์˜ ๋ฐฉ๋ฒ•...

  • ํŠน์ • ๋ฐ์ดํ„ฐ ์…‹ D์— ๋Œ€ํ•œ ์†์‹ค

  • ์†์‹ค ํ•จ์ˆ˜์˜ ๊ธฐ๋Œ“๊ฐ’

  • ์—ฌ๋Ÿฌ ๊ฐœ(L๊ฐœ)์˜ ๋ฐ์ดํ„ฐ์…‹์ด ์ฃผ์–ด์กŒ์„ ๋–„, ์ด ๊ฐ’๋“ค์˜ ํ‰๊ท ?

    ๋”ฐ๋ผ์„œ,

    • ์ž์œ ๋„ โ†‘, ๋ณต์žก๋„ โ†‘, var โ†‘, bias^2 โ†“ (var, bias : trade-off)

    • ๋ชจ๋ธํ•™์Šต์— ์ ์ ˆํ•œ ๋ชจ๋ธ๋ณต์žก๋„(์ž์œ ๋„)๋ฅผ ๊ฐ€์งˆ์ˆ˜ ์žˆ๋„๋ก ํ•ด์•ผ ์ข‹์€ ๋ชจ๋ธ(์ƒˆ๋กœ์šด ๋ฐ์ดํ„ฐ์— ๋Œ€ํ•ด ๋„ˆ๋ฌด ๊ณผ์ ํ•ฉ๋˜์ง€ ์•Š์€ ๊ฒฐ๊ณผ๋ฅผ ๋‚ผ ์ˆ˜ ์žˆ๋Š” ๋ชจ๋ธ)โ—

๋ฒ ์ด์ง€์•ˆ ์„ ํ˜•ํšŒ๊ท€(Bayesian Linear Regression)

  • As seen above, the frequentist approach struggles to express the model's uncertainty❗ With Bayesian linear regression, the uncertainty can be handled far more cleanly❗

  • Prior over the parameters w

  • Likelihood

  • Posterior

    (using Bayes' theorem for Gaussian distributions)

  • Log of the posterior

  • The Bayesian approach is a more general and more powerful methodology than the frequentist one❗

  • The distribution of the predictions can be obtained❗

  • Predictive Distribution (standard form sketched after this list)

    • Given a new input x, predict t

    • Applying the previous results,
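
The standard Gaussian results, for reference, assuming a zero-mean isotropic prior with precision $\alpha$ (my assumption for the sketch; the lecture may use a general $\mathbf{m}_0$, $\mathbf{S}_0$):

$$
p(\mathbf{w}) = \mathcal{N}\!\left(\mathbf{w} \mid \mathbf{0},\ \alpha^{-1}\mathbf{I}\right), \qquad
p(\mathbf{w} \mid \mathbf{t}) = \mathcal{N}\!\left(\mathbf{w} \mid \mathbf{m}_N,\ \mathbf{S}_N\right)
$$

$$
\mathbf{m}_N = \beta\,\mathbf{S}_N\boldsymbol{\Phi}^{\mathrm T}\mathbf{t}, \qquad
\mathbf{S}_N^{-1} = \alpha\mathbf{I} + \beta\,\boldsymbol{\Phi}^{\mathrm T}\boldsymbol{\Phi}
$$

$$
p(t \mid \mathbf{x}, \mathbf{t}, \alpha, \beta) = \mathcal{N}\!\left(t \mid \mathbf{m}_N^{\mathrm T}\boldsymbol{\phi}(\mathbf{x}),\ \sigma_N^2(\mathbf{x})\right), \qquad
\sigma_N^2(\mathbf{x}) = \frac{1}{\beta} + \boldsymbol{\phi}(\mathbf{x})^{\mathrm T}\mathbf{S}_N\,\boldsymbol{\phi}(\mathbf{x})
$$

As $\alpha \to 0$ (an infinitely broad prior), $\mathbf{m}_N$ reduces to the maximum-likelihood solution, i.e., the normal equations.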

[Statistics 110]

Current part: [2 / 34]

Lecture 2 - Solving Problems by Interpretation and the Axioms of Probability (Story Proofs, Axioms of Probability)

Number of ways to split 10 people into two teams of 4 and 6

This equals the number of ways to choose 4 out of 10, ${10 \choose 4}$, because once 4 people are chosen the remaining 6 are determined automatically. It is also equal to the number of ways to choose 6 out of 10, so even without a formal proof we can see conceptually that ${10 \choose 4} = {10 \choose 6}$.

Number of ways to split 10 people into two teams of 5 and 5

This looks like the number of ways to choose 5 out of 10, ${10 \choose 5}$, but it must be multiplied by 1/2: choosing one group of 5 automatically determines the other group of 5, so each split is counted twice (once from each side). Multiplying by 1/2 removes the double counting.

In this way, always look at the situation carefully and judge whether anything is being double counted.
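
Plugging in the numbers as a check:

$$
{10 \choose 4} = {10 \choose 6} = 210, \qquad \frac{1}{2}{10 \choose 5} = \frac{252}{2} = 126
$$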

With replacement, order not considered: ${n+k-1 \choose k}$

To check that this count is correct, consider a few cases.

  • When k = 0: ${n-1 \choose 0} = 1$

    • Choosing nothing gives exactly one possibility, consistent with 0! = 1

  • When k = 1: ${n \choose 1} = n$

    • Choosing 1 out of n gives n possibilities

  • When n = 2: ${k+1 \choose k} = {k+1 \choose 1} = k+1$

The number of ways to put k indistinguishable particles into n distinguishable boxes

  • ${n+k-1 \choose k}$

  • When n = 4, k = 6 (worked out after this list)

    • 3 / 0 / 2 / 1

    • ooo||oo|o => k circles (o) and n-1 bars (|)

    • This is the same as choosing positions for the k dots among n+k-1 positions

    • Once the positions of the dots are fixed, the positions of the separators are determined automatically

  • When n = 2 => the coin-flipping situation

    • If heads and tails are equally likely, two flips give 4 (equally likely) outcomes

    • But if a single coin is flipped twice and the two flips are treated as completely indistinguishable, there are effectively only 3 outcomes (?)
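
The worked count for the n = 4, k = 6 example above:

$$
{n+k-1 \choose k} = {4+6-1 \choose 6} = {9 \choose 6} = {9 \choose 3} = 84
$$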

Story Proofs

  • Proof by interpretation (a "story")

    • e.g., the statement from the beginning that choosing 4 out of 10 and choosing 6 out of 10 give the same count

    • We did not compare factorials to show it

  • $n{n-1 \choose k-1} = k{n \choose k}$

    • Right side: out of n people, choose the k who will join the club, then pick 1 representative among them

    • But the same count can be told as a different story

    • Left side: out of n people, pick the representative first, then choose the remaining k-1 members from the other n-1 people

  • ${m+n \choose k} = \sum^k_{j=0} {m \choose j}{n \choose k-j}$ : Vandermonde's identity

    • Proving this algebraically would require factorials or the binomial theorem; instead, read both sides as a story (a quick numerical check follows this list)

    • Left side: choosing k out of m+n

    • Right side: split the m+n into a group of m and a group of n, then pick k in total. (If 0 are chosen from the m group, all k come automatically from the n group; if 1 is chosen from the m group, k-1 come from the n group, and so on.)
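
A quick numerical spot check of the identity (the values of m, n, and k are arbitrary):

```python
from math import comb

# Vandermonde's identity: C(m+n, k) == sum_j C(m, j) * C(n, k-j)
m, n, k = 5, 7, 4
lhs = comb(m + n, k)
rhs = sum(comb(m, j) * comb(n, k - j) for j in range(k + 1))
print(lhs, rhs)   # 495 495
assert lhs == rhs
```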

Probability

So far we have solved problems assuming that every outcome is equally likely and that the number of outcomes is finite, and defined probability under those assumptions (the "naive" definition).

The general, non-naive definition of probability is as follows.

  • A probability space consists of two components, S and P.

  • S is the sample space: the space of all possible outcomes of the experiment.

  • P is a function (though not a function like f(x) = x + 3); it takes an event as its input. The domain of P is the set of subsets of S.

  • For a subset A of S, P(A) is a number between 0 and 1. The two axioms that define P are as follows.

    • $P(\varnothing) = 0$, $P(S) = 1$

      • When S is the set of all possible outcomes, $\varnothing$ is the impossible event that can never occur.

      • $P\left(\bigcup^{\infty}_{n=1} A_n\right) = \sum^{\infty}_{n=1} P(A_n) \quad \text{if } A_n \text{ is disjoint from } A_m \ (m \neq n)$

    • Every definition and rule of probability is derived from these two axioms.
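
For example, the complement rule follows directly from these two axioms:

$$
S = A \cup A^{c},\quad A \cap A^{c} = \varnothing
\;\Rightarrow\;
1 = P(S) = P(A) + P(A^{c})
\;\Rightarrow\;
P(A^{c}) = 1 - P(A)
$$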

Additional notes (Linear Models for Regression)

  • The w that maximizes the log-likelihood is exactly the w that minimizes the sum-of-squares error function❗

  • The optimal w (normal equations) is written with the Moore-Penrose pseudo-inverse of the design matrix.

  • The pseudo-inverse exists when all columns of the design matrix are linearly independent❗ (This does not always hold, but it does in many cases, and the data can be adjusted so that it holds.)

  • Writing the error function with an explicit bias parameter shows that the bias compensates for the difference between the average of the targets and the weighted average of the basis function values❗

  • For regularized least squares, differentiating the regularized error with respect to w and rearranging gives the optimal w; in the constraint form, it reads as finding the solution that minimizes the error while satisfying the constraint.

  • In Bayesian linear regression, assuming an isotropic prior covariance and substituting into m_N recovers the normal equations in the appropriate limit.

โœจ๊ด€๋ จ ์‹ค์Šต
โœจ ๊ณต๋ถ€ํ•˜๋ฉด์„œ ์ฐธ๊ณ ํ•œ ์‚ฌ์ดํŠธ
https://github.com/sujiny-tech/k-digital-training-AI-dev/blob/main/Machine-Learning-basics/Linear%20Models%20for%20Regression.md