16 Sat

TIL


[인공지능을 위한 선형대수 (Linear Algebra for AI)] CHAPTER 2. Linear Systems and Linear Transformations

Linear Equations and Linear Systems

Linear equation

  • $a_1x_1 + a_2x_2 + \cdots + a_nx_n = b$

  • Here the $a_i$ are called coefficients, $b$ is the constant, and the $x_i$ are the unknowns (variables) we have to solve for.

This can also be written as

  • $a^T x = b$

  • By convention, scalars are written in lowercase and vectors in bold lowercase.

  • Matrices are written in uppercase.

Linear system

  • A set of linear equations (also called a system of simultaneous equations).

Example of a linear system

$$60x_1 + 5.5x_2 + 1x_3 = 66 \\ 65x_1 + 5.0x_2 + 0x_3 = 74 \\ 55x_1 + 6.0x_2 + 1x_3 = 78$$

These three simultaneous equations can be written in the following form, called the matrix equation.

$$\begin{bmatrix} 60 & 5.5 & 1 \\ 65 & 5.0 & 0 \\ 55 & 6.0 & 1 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} 66 \\ 74 \\ 78 \end{bmatrix}$$

Or it can be expressed row by row:

$$a_1^T x = 66, \quad a_2^T x = 74, \quad a_3^T x = 78$$

Identity Matrix

A matrix $B$ such that $AB = BA = A$ is called the identity matrix and is written $I$. It is only defined for square matrices.

Inverse Matrix

$$A^{-1}A = AA^{-1} = I_n$$

For a $2 \times 2$ matrix, the inverse can be obtained directly from the following formula.

$$A^{-1} = \frac{1}{ad-bc} \begin{bmatrix} d & -b \\ -c & a \end{bmatrix}$$
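
A quick numpy check of this formula (a minimal sketch; the entries 1, 2, 3, 4 are an arbitrary invertible example):

import numpy as np

# 2x2 inverse via the closed-form formula, compared against numpy
a, b, c, d = 1.0, 2.0, 3.0, 4.0
A = np.array([[a, b],
              [c, d]])
A_inv_formula = (1 / (a*d - b*c)) * np.array([[ d, -b],
                                              [-c,  a]])
print(A_inv_formula)
print(np.linalg.inv(A))   # same values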

For $3 \times 3$ and larger matrices there is no closed-form formula like the $2 \times 2$ case, but there are algorithmic procedures (e.g. Gaussian elimination) that compute the inverse.

A square matrix can satisfy the inverse relation on both sides; a rectangular matrix can satisfy it on one side only (a left or right inverse).

Solving a linear system with the inverse matrix

$$Ax = b$$

$$\begin{bmatrix} 60 & 5.5 & 1 \\ 65 & 5.0 & 0 \\ 55 & 6.0 & 1 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} 66 \\ 74 \\ 78 \end{bmatrix}$$

$$A^{-1} = \begin{bmatrix} 0.0870 & 0.0087 & -0.0870 \\ -1.1304 & 0.0870 & 1.1304 \\ 2.0000 & -1.0000 & -1.0000 \end{bmatrix}$$

$$x = A^{-1}b = \begin{bmatrix} 0.0870 & 0.0087 & -0.0870 \\ -1.1304 & 0.0870 & 1.1304 \\ 2.0000 & -1.0000 & -1.0000 \end{bmatrix} \begin{bmatrix} 66 \\ 74 \\ 78 \end{bmatrix} = \begin{bmatrix} -0.4 \\ 20 \\ -20 \end{bmatrix}$$

The meaning of this solution: we found an $x$ that fits the given data,

(life-span) = -0.4(weight) + 20(height) - 20(is_smoking)
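
A minimal numpy sketch verifying this worked example, reusing the matrix and target vector from above:

import numpy as np

# coefficient matrix and target vector from the worked example above
A = np.array([[60, 5.5, 1],
              [65, 5.0, 0],
              [55, 6.0, 1]])
b = np.array([66, 74, 78])

x = np.linalg.inv(A) @ b   # x = A^{-1} b
print(x)                   # approximately [-0.4, 20, -20]
print(A @ x)               # reproduces b = [66, 74, 78]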

์—ญํ–‰๋ ฌ์ด ์กด์žฌํ•˜์ง€ ์•Š์€ ํ–‰๋ ฌ A for Ax = b

  • ์—ญํ–‰๋ ฌ์ด ์กด์žฌํ•  ๋•Œ์˜ ๊ทผ์€ ํ•˜๋‚˜๋กœ ํŠน์ •๋œ๋‹ค.

  • ์—ญํ–‰๋ ฌ์ด ์กด์žฌํ•˜์ง€ ์•Š์„ ๋•Œ์˜ ๊ทผ์€ ๋ฌด์ˆ˜ํžˆ ๋งŽ๊ฑฐ๋‚˜ ์กด์žฌํ•˜์ง€ ์•Š๋Š”๋‹ค.

  • ad-bc๊ฐ€ 0์ด ๋  ๋•Œ ์—ญํ–‰๋ ฌ์ด ์กด์žฌํ•˜์ง€ ์•Š์œผ๋ฉฐ ์ด๋ฅผ A์˜ ํŒ๋ณ„์ž ๋˜๋Š” det A ๋ผ๊ณ  ํ•œ๋‹ค.

  • a : b = c : d ์˜ ๊ด€๊ณ„๊ฐ€ ๋งŒ์กฑํ•˜๋ฉด ์—ญํ–‰๋ ฌ์ด ์กด์žฌํ•˜์ง€ ์•Š๋Š”๋‹ค.

์ง์‚ฌ๊ฐํ–‰๋ ฌ A in Ax = b

๋ฐฉ์ •์‹์˜ ๊ฐœ์ˆ˜๊ฐ€ m๊ฐœ, ๋ณ€์ˆ˜๊ฐ€ n๊ฐœ ์ผ ๋•Œ

  • m < n (๋ณ€์ˆ˜๊ฐ€ ๋” ๋งŽ์„ ๋•Œ) : ๋ฌดํ•œํžˆ ๋งŽ์€ ํ•ด๋‹ต์ด ์กด์žฌํ•œ๋‹ค. (under-determined system)

  • m > n (๋ณ€์ˆ˜๊ฐ€ ๋” ์ ์„ ๋•Œ) : ์™„๋ฒฝํžˆ ๋งŒ์กฑํ•˜๋Š” ํ•ด๋‹ต์€ ์กด์žฌํ•˜์ง€ ์•Š๋Š”๋‹ค (over-determined system)

  • ๊ทธ๋Ÿฌ๋‚˜ ๋จธ์‹ ๋Ÿฌ๋‹์—์„œ๋Š” m > n ์˜ ๊ฒฝ์šฐ๋”๋ผ๋„ ์ตœ๋Œ€ํ•œ ๋ชจ๋“  ์ ์„ ์ง€๋‚˜๊ฐ€๋Š” ๊ฒƒ์ฒ˜๋Ÿผ ๋ณด์ด๊ฒŒ ํ•˜๋Š” ๋ฐฉ์ •์‹์„ ๊ตฌํ•  ์ˆ˜ ์žˆ๋‹ค.
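
A minimal sketch of that idea with np.linalg.lstsq; the data points here are made up for illustration and roughly follow the line y = 2x:

import numpy as np

# over-determined system: 5 equations (data points), 2 unknowns (slope, intercept)
A = np.array([[1., 1.],
              [2., 1.],
              [3., 1.],
              [4., 1.],
              [5., 1.]])
b = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# least squares: minimize ||Ax - b|| instead of solving exactly
x, residuals, rank, sv = np.linalg.lstsq(A, b, rcond=None)
print(x)   # roughly [2.0, 0.0]: best-fit slope and intercept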

Practice

from numpy.linalg import solve
x = solve(A, b)
x

๋‹ค์Œ๊ณผ ๊ฐ™์ด ์‰ฝ๊ฒŒ ๊ตฌํ•  ์ˆ˜ ์žˆ๋Š”๋ฐ, ์ด ๋•Œ๋Š” ์—ญํ–‰๋ ฌ์„ ์ด์šฉํ•˜์—ฌ ๊ตฌํ•˜์ง€๋Š” ์•Š๋Š”๋‹ค. ์™œ๋ƒํ•˜๋ฉด

3x = 6 ์ด๋ผ๋Š” ์‹์—์„œ, x๋Š” ๋‹น์—ฐํžˆ 2์ง€๋งŒ ์—ญํ–‰๋ ฌ์„ ๊ตฌํ•˜๋Š” ๊ณผ์ •์œผ๋กœ ํ’€์ดํ•œ๋‹ค๊ณ  ํ•  ๋•Œ,

x = 3โˆ’16 3^{-1}6 3โˆ’16 ์ด ๋˜๋ฉฐ ์ด ๋•Œ 3์˜ ์—ญ์ˆ˜๋Š” 0.33333.. ์˜ ํ˜•ํƒœ๋ฅผ ๊ฐ€์ง€๊ฒŒ๋œ๋‹ค. ๊ฒฐ๊ตญ ์ด๋Ÿฌํ•œ ํ’€์ด๋Š” ์ปดํ“จํ„ฐ์˜ ์‹ค์ˆ˜ ํ‘œํ˜„์˜ ํ•œ๊ณ„๋•Œ๋ฌธ์— ์•„์ฃผ ์ ์€ ์˜ค์ฐจ๋ฅผ ๋ฐœ์ƒํ•˜๊ฒŒ ๋˜๋Š”๋ฐ, ์ด๊ฒƒ์ด ํ–‰๋ ฌ๊ฐ„์˜ ์—ญํ–‰๋ ฌ์—์„œ๋„ ๋ฐœ์ƒํ•˜๋ฏ€๋กœ ์—ญํ–‰๋ ฌ์„ ์ด์šฉํ•˜์—ฌ ๊ตฌํ•˜์ง€๋Š” ์•Š๋Š”๋‹ค.
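
A minimal sketch comparing the two routes on a random system (the 500x500 size is arbitrary); the explicit-inverse route typically leaves a slightly larger residual:

import numpy as np

rng = np.random.default_rng(0)
A = rng.random((500, 500))
b = rng.random(500)

x_inv   = np.linalg.inv(A) @ b      # via the explicit inverse
x_solve = np.linalg.solve(A, b)     # via LU factorization, no explicit inverse

# residual ||Ax - b|| of each approach
print(np.linalg.norm(A @ x_inv   - b))
print(np.linalg.norm(A @ x_solve - b))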

Practice I

A simple numpy array

import numpy as np
# column vector
c = np.array([1,2,3])
print(c.shape)

# obtaining a particular entry
print (c[0])
(3,)
1

2-D numpy array : vector

# row vector
r = np.array([ [1,2,3] ])
print (r.shape)
(1, 3)

์ƒ‰์ธ

# obtaining a particular entry
print (r[0,1])
2

np.zeros, np.ones, np.full, np.random.random

# creating a matrix with all zeros
a = np.zeros((2,2))
print (a)
# creating a matrix with all ones
b = np.ones((2,2))
print (b)
             
# creating a matrix filled with the same constant
c = np.full((2,2), 7)
print (c)
             
# creating a matrix with random values
d = np.random.random((2,2))
print (d)
[[ 0.  0.]
 [ 0.  0.]]
[[ 1.  1.]
 [ 1.  1.]]
[[7 7]
 [7 7]]
[[ 0.93589863  0.19331487]
 [ 0.14309097  0.43003853]]

2-D numpy array : matrix

# creating a matrix
A=np.array([[1,2],[3,4],[5,6]])
print (A)
[[1 2]
 [3 4]
 [5 6]]
# creating another matrix
B=np.array([[11,12,13,14],[15,16,17,18]])
B
array([[11, 12, 13, 14],
       [15, 16, 17, 18]])

Transpose

# transpose a matrix
A.T
array([[1, 3, 5],
       [2, 4, 6]])

Matrix multiplication

# matrix-matrix multiplication
np.dot(A,B)
array([[ 41,  44,  47,  50],
       [ 93, 100, 107, 114],
       [145, 156, 167, 178]])

An invalid matrix multiplication (shapes do not match)

# matrix-matrix multiplication 
# size should match!
np.dot(B,A)
---------------------------------------------------------------
ValueError                    Traceback (most recent call last)
<ipython-input-30-1c2410a4aca9> in <module>()
      1 # matrix-matrix multiplication
      2 # size should match!
----> 3 np.dot(B,A)

ValueError: shapes (2,4) and (3,2) not aligned: 4 (dim 1) != 3 (dim 0)
# coefficient matrix A and a vector b
A=np.array([[60, 5.5, 1],[65, 5.0, 0],[55, 6.0, 1]])
b=np.array([66, 70, 78])

Identity matrix: np.eye

# identity matrix 
eye3 = np.eye(3)
eye3
array([[ 1.,  0.,  0.],
       [ 0.,  1.,  0.],
       [ 0.,  0.,  1.]])

Computing the inverse: numpy.linalg.inv

# computing an inverse
from numpy.linalg import inv
A_inv = inv(A)
A_inv
array([[ 0.08695652,  0.00869565, -0.08695652],
       [-1.13043478,  0.08695652,  1.13043478],
       [ 2.        , -1.        , -1.        ]])

An incorrect matrix product: using * instead of dot multiplies element-wise.

# wrong matrix multiplication
A*A_inv
array([[  5.21739130e+00,   4.78260870e-02,  -8.69565217e-02],
       [ -7.34782609e+01,   4.34782609e-01,   0.00000000e+00],
       [  1.10000000e+02,  -6.00000000e+00,  -1.00000000e+00]])

The correct matrix product

# correct matrix multiplication
A.dot(A_inv)
array([[ 1.,  0.,  0.],
       [ 0.,  1.,  0.],
       [ 0.,  0.,  1.]])

Matrix-vector product

# solution of a linear system
x=A_inv.dot(b)
x
array([ -0.43478261,  19.65217391, -16.        ])

numpy.linalg.solve

# a better way to solve the same linear system
from numpy.linalg import solve
x = solve(A,b)
x
array([ -0.43478261,  19.65217391, -16.        ])

Note that solve does not compute the inverse explicitly, because going through the inverse introduces a small numerical error.

Linear Combination

A linear combination is built from vectors $v_1, \dots, v_p$ and scalars $c_1, \dots, c_p$; the scalars are called weights. A weight can be any real number, and 0 is of course allowed.

From the matrix equation to the vector equation

Splitting the matrix into its columns, $Ax = b$ can be written as a vector equation, $x_1 a_1 + x_2 a_2 + x_3 a_3 = b$, where $a_1, a_2, a_3$ are the columns of $A$, as the sketch below verifies numerically.
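
A minimal numpy sketch of this column view, reusing the matrix and solution from the worked example above:

import numpy as np

# Ax equals the linear combination of A's columns weighted by the entries of x
A = np.array([[60, 5.5, 1],
              [65, 5.0, 0],
              [55, 6.0, 1]])
x = np.array([-0.4, 20.0, -20.0])

print(A @ x)                                          # matrix-vector product
print(x[0]*A[:, 0] + x[1]*A[:, 1] + x[2]*A[:, 2])     # same result, column by column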

Span

The span is the set of every vector that can be produced as a linear combination of the given "material" vectors with arbitrary weights.

In the (omitted) figure the material vectors are $v_1$ and $v_2$; the span is the set of all points reachable from these two vectors with any choice of weights.

What if there is only one material vector?

  • The span is a line.

What if there are three material vectors?

  • The span is the 3-D version of a parallelogram (a parallelepiped-shaped region).

Existence of a solution

If the target vector b lies inside the span of the three material vectors, then weights satisfying the equation exist; a quick rank check of this is sketched below.

Instead of viewing the system as a pile of scalar equations, we can interpret it geometrically as a single vector equation.

Viewed entry by entry it is a set of linear equations requiring several separate multiplications (figure omitted),

but it can also be computed vector by vector, splitting the matrix into columns (figure omitted).

Since $(Ax)^T = x^T A^T$, a computation over column vectors can always be converted into one over row vectors.

Not only inner products but also outer products can be computed by splitting matrices into vectors.

This way of decomposing matrix computations is important when solving machine-learning problems.

For example, a table of 100 people with 50 features can be expressed using 100 person column vectors and 50 feature row vectors, as sketched below.
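
A minimal sketch of that kind of decomposition: a matrix product written as a sum of outer products of columns and rows (the sizes 4, 3, 5 are arbitrary):

import numpy as np

# A @ B equals the sum of outer products (k-th column of A) x (k-th row of B)
rng = np.random.default_rng(0)
A = rng.random((4, 3))
B = rng.random((3, 5))

outer_sum = sum(np.outer(A[:, k], B[k, :]) for k in range(3))
print(np.allclose(A @ B, outer_sum))   # True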

Linear Independence and Linear Dependence

When a solution exists (b lies in the span), we can ask: is the solution x unique, or are there many of them?

Linear independence and linear dependence answer this question.

What does it mean for there to be several solutions?

It means, for example, that v3 already lies in the span that v1 and v2 can build on their own.

Linear independence

When a new vector is added, if it does not already lie in the existing span (so the span actually grows), the vectors are linearly independent.

Four vectors in 3-D space are always linearly dependent (similar in spirit to the pigeonhole principle).

So when four vectors are given in 3-D space, a solution, if one exists, is never unique: there are infinitely many, as the rank check below illustrates.

With two vectors in 3-D space it is case by case: they may be dependent or independent.
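
A minimal rank check for the four-vectors-in-3-D claim (the particular vectors are made up):

import numpy as np

# any four vectors in R^3 form a 3x4 matrix whose rank is at most 3,
# so the columns are always linearly dependent
V = np.array([[1., 0., 0., 1.],
              [0., 1., 0., 1.],
              [0., 0., 1., 1.]])
print(np.linalg.matrix_rank(V))                # 3
print(np.linalg.matrix_rank(V) < V.shape[1])   # True -> linearly dependent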

Linear independence via the homogeneous equation

  • The equation obtained from Ax = b by setting b to 0 regardless of its actual value, i.e. Ax = 0.

  • b is replaced by the zero vector 0 (written in bold because it is a vector).

  • We never have to ask whether this equation has a solution: setting every weight x_i to 0 always produces the zero vector, so at least one solution always exists.

  • The real question is whether that is the only solution or whether more exist.

  • The all-zero solution is called the trivial solution; a solution with at least one nonzero entry is called a nontrivial solution. A small numerical check is sketched below.
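
A minimal sketch of checking for a nontrivial solution with numpy (the matrix below is made up, with its second row a multiple of the first):

import numpy as np

# Ax = 0 has a nontrivial solution exactly when rank(A) < number of columns.
# A right-singular vector for a zero singular value gives such a solution.
A = np.array([[1., 2., 3.],
              [2., 4., 6.]])        # second row = 2 * first row
print(np.linalg.matrix_rank(A))     # 1 < 3 columns -> nontrivial solutions exist

U, s, Vt = np.linalg.svd(A)
x = Vt[-1]                          # a direction in the null space
print(A @ x)                        # approximately the zero vector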

Two definitions are equivalent

  • Take a vector whose coefficient in the nontrivial solution is nonzero and express it in terms of the other vectors; then check whether it lies in the span of the rest.

  • If it does, a nontrivial solution exists, and adding that vector does not enlarge the span: $\mathrm{Span}\{v_1, v_2, \dots, v_{n-1}\} = \mathrm{Span}\{v_1, v_2, \dots, v_n\}$

What if b is nonzero?

  • Then we must start from the origin and actually arrive at b.

Conclusion

When does Ax = b have exactly one solution?

  • When b can be reached by only one parallelogram construction, i.e. only one combination of the material vectors.

  • That is precisely the case in which the vectors are linearly independent.

Basis and Dimension of a Subspace

Span and Subspace

  • Subset: simply a subset of the space.

  • Subspace: similar to a span. A subset that is closed under linear combinations is called a subspace.

  • Every subspace can be expressed as the span of some set of material vectors.

Basis of a Subspace

Given a plane, if two linearly independent (non-redundant) vectors span that plane, those vectors are called a basis of the plane.

Non-Uniqueness of Basis

A basis is not unique: the same subspace can be described by many different bases.

Dimension of Subspace

Although the basis itself is not unique, the number of basis vectors is always the same; this number is the dimension of the subspace.

The standard basis vectors have length 1 and are mutually perpendicular, one along each coordinate axis.

Column Space of Matrix

The span of a matrix's columns satisfies the subspace conditions, so it is called the column space.

Matrix with Linearly Dependent Columns

All three columns could be candidates for a basis, but since they are not linearly independent, only two of them form a basis of the column space.

Rank of Matrix

The rank of a matrix is the dimension of its column space, i.e. the number of linearly independent columns.

No matter how large the matrix is, if its columns follow a pattern (are linearly dependent on each other), the rank stays small, as the sketch below shows.
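
A minimal illustration of that last point (the 100x50 size echoes the earlier people-by-features example; the pattern is made up):

import numpy as np

# a 100 x 50 matrix whose columns are all multiples of a single column
# has rank 1 no matter how large it is
base = np.arange(1, 101, dtype=float)                   # one 100-dimensional column
A = np.column_stack([k * base for k in range(1, 51)])   # 50 patterned columns
print(A.shape)                      # (100, 50)
print(np.linalg.matrix_rank(A))     # 1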

Linear Transformation

Transformation

When x is transformed into y we write $T: x \mapsto y$, i.e. $y = T(x)$.

  • domain: the set of inputs.

  • co-domain: the set in which the outputs are allowed to live.

  • image: the output value. If f(1) = 3, the image of 1 is 3.

  • range: the set of values in the co-domain that are actually hit by some input.

  • A mapping is a function only when every x has exactly one image.

Linear Transformation

A transformation is linear when it satisfies $f(ax + by) = af(x) + bf(y)$.

  • A function with a bias term, such as $f(x) = 3x + 2$, does not satisfy this:

    • $3 \cdot 1 + 4 \cdot 2 = 11$

    • $a = 3,\ x = 1,\ b = 4,\ y = 2$

    • $f(ax + by) = f(11) = 35$

    • $af(x) + bf(y) = 3f(1) + 4f(2) = 15 + 32 = 47 \neq 35$

  • However, rewriting it as $\begin{bmatrix} 3 & 2 \end{bmatrix} \begin{bmatrix} x \\ 1 \end{bmatrix}$ (appending a constant 1 to the input) does satisfy linearity:

    • $3x + 4y = 3\begin{bmatrix} 1 \\ 1 \end{bmatrix} + 4\begin{bmatrix} 2 \\ 1 \end{bmatrix} = \begin{bmatrix} 11 \\ 7 \end{bmatrix}$

    • $\begin{bmatrix} 3 & 2 \end{bmatrix} \begin{bmatrix} 11 \\ 7 \end{bmatrix} = 47$

    • $\begin{bmatrix} 3 & 2 \end{bmatrix} \cdot 3\begin{bmatrix} 1 \\ 1 \end{bmatrix} + \begin{bmatrix} 3 & 2 \end{bmatrix} \cdot 4\begin{bmatrix} 2 \\ 1 \end{bmatrix} = 47$

Matrix of Linear Transformation

1) T is a linear transformation from 2-D real space to 3-D real space.

2) The images of the standard basis vectors, T(e1) and T(e2), are known (they were given in an omitted figure).

With these two clues we can completely pin down the transformation T.

Using linearity together with the images of the basis vectors, we can state exactly which transformation T is.

=> Any transformation that satisfies linearity can always be written purely as a matrix multiplied by the input vector.

In general, a linear transformation T from N dimensions to M dimensions is always represented as a matrix-vector product.

The matrix of such a transformation T is $M \times N$, with the images of the basis vectors as its columns: $A = \begin{bmatrix} T(e_1) & T(e_2) & \cdots & T(e_N) \end{bmatrix}$. This A is called the standard matrix of the linear transformation T.
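
A minimal numpy sketch of building a standard matrix from the basis-vector images (the particular values of T(e1) and T(e2) below are made up):

import numpy as np

# assumed images of the standard basis vectors under some T: R^2 -> R^3
T_e1 = np.array([1., 2., 3.])      # hypothetical T(e1)
T_e2 = np.array([4., 5., 6.])      # hypothetical T(e2)

A = np.column_stack([T_e1, T_e2])  # 3 x 2 standard matrix of T
x = np.array([2., -1.])
print(A @ x)                       # equals 2*T(e1) - 1*T(e2) by linearity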

Linear Transformation with Neural Networks

A linear transformation is what turns square grid paper into parallelogram grid paper.

Affine Layer in Neural Networks

A layer with a bias term, like y = 3x + 5, is called an affine layer; even then, simply appending a vector of 1s to the input lets us express it as a purely linear transform.
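
A minimal sketch of that augmentation trick for y = 3x + 5:

import numpy as np

# y = 3x + 5 is affine, not linear, but appending a constant 1 to the input
# turns it into a single matrix multiplication: y = [3 5] @ [x, 1]
W_aug = np.array([[3., 5.]])

for x in [0.0, 1.0, 2.0]:
    x_aug = np.array([x, 1.0])
    print(W_aug @ x_aug)   # same as 3*x + 5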

Onto and One-to-One Functions

ONTO and ONE-TO-ONE

  • ONTO (surjective): co-domain = range.

    • The dimension of the domain must be at least that of the co-domain (a necessary condition).

    • A linear transformation from 2-D to 3-D can never be onto.

    • Encoding: the input vector is larger (dimension is reduced).

    • Decoding: the output vector is larger (dimension is expanded).

  • ONE-TO-ONE (injective): distinct inputs map to distinct outputs.

    • Each value in the range corresponds to exactly one input.

    • A linear transformation from 3-D to 2-D can never be one-to-one.

    • If the columns of T's standard matrix (the images of the basis vectors) are linearly independent, T is one-to-one, as the sketch below checks.
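
A minimal rank-based check of that condition (the two matrices below are made-up examples of a 2-D to 3-D map and a 3-D to 2-D map):

import numpy as np

# one-to-one  <=>  the columns of the standard matrix are linearly independent,
# i.e. rank == number of columns
A_32 = np.array([[1., 0.],
                 [0., 1.],
                 [1., 1.]])       # maps R^2 -> R^3
A_23 = np.array([[1., 0., 1.],
                 [0., 1., 1.]])   # maps R^3 -> R^2
print(np.linalg.matrix_rank(A_32) == A_32.shape[1])   # True  -> one-to-one
print(np.linalg.matrix_rank(A_23) == A_23.shape[1])   # False -> not one-to-one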

Neural Network Example

(Ignoring the non-linear activation for now.)

  • T1 maps 3-D to 2-D, so it is not one-to-one.