


23 Sun


Last updated 4 years ago


2021 Dev-Matching: Machine Learning Assignment Test

Prologue

์˜ค๋Š˜์€ 8์‹œ๊ฐ„์ด๋‚˜ ์ง„ํ–‰ํ•˜๋Š” ๋จธ์‹ ๋Ÿฌ๋‹ ํ…Œ์ŠคํŠธ์ด๋‹ค. ์‚ฌ์‹ค ์ข€ ๋” ๋งŽ์€ ์ค€๋น„๋ฅผ ํ•˜๊ณ  ์‹ถ์—ˆ๋Š”๋ฐ, ์–ผ๋งˆ ์ค€๋น„๋ฅผ ๋ชปํ•ด์„œ ์ž์‹ ๊ฐ์ด ์—†์—ˆ๋‹ค. ๋‚ด๊ฐ€ ํ•  ์ˆ˜ ์žˆ์„ ๊ฑฐ๋ผ๋Š” ์ƒ๊ฐ๋„ ๋ชปํ–ˆ๊ณ  ๊ทธ๋ƒฅ ๊ฒฝํ—˜์‚ผ์•„ ํ•ด๋ณด๋‹ค๊ฐ€ ์–ด๋ ค์šฐ๋ฉด ํฌ๊ธฐํ•˜์ž๋ผ๋Š” ๋งˆ์ธ๋“œ๋ฅผ ๊ฐ€์ง€๊ณ  ์‹œ์ž‘ํ–ˆ๋‹ค.

๊ทธ๋ฆฌ๊ณ , ์˜ํ™” ๊ฐ™์€ ์ผ์€ ๋ฒŒ์–ด์ง€์ง€ ์•Š๋Š”๋‹ค. ์—ญ์‹œ๋‚˜ ์–ด๋ ค์› ๊ณ  ๋งŽ์€ ๊ฒ€์ƒ‰์œผ๋กœ ์กฐ๊ทธ๋งˆํ•œ ๋ฌธ์ œ๋“ค์„ ํ•ด๊ฒฐํ•จ์— ์žˆ์–ด ๋‹ต๋‹ตํ–ˆ๋‹ค. ์•„๋‹ˆ ์‹œ๊ฐ„์ด ์ด๋งŒํผ์ด๋‚˜ ์ง€๋‚ฌ๋Š”๋ฐ ์•„์ง ์ด๊ฒƒ๋ฐ–์— ๋ชปํ–ˆ๋‹ค๊ณ ? ๊ฒฐ๊ตญ ๋‹ต๋‹ตํ•œ ๋งˆ์Œ์„ ๋ชป์ด๊ธฐ๊ณ  ํฌ๊ธฐํ–ˆ๋‹ค. ๊ทธ๋ฆฌ๊ณ  1์‹œ๊ฐ„ ๋ฐ˜์„ ์žค๋‹ค.

์ž๊ณ  ์ผ์–ด๋‚˜์„œ ๋ชปํ•ด๋„ ๋˜๋‹ˆ๊นŒ ํ•œ๋ฒˆ ํ•  ์ˆ˜ ์žˆ๋Š”๋ฐ ๊นŒ์ง€ ํ•ด๋ณด์ž ๋ผ๋Š” ๋งˆ์ธ๋“œ๋กœ ๋‹ค์‹œ ์‹œ์ž‘ํ–ˆ๋‹ค. 10์‹œ๋ถ€ํ„ฐ ์‹œ์ž‘์ธ ๊ณผ์ œ๋ฅผ 2์‹œ๋ถ€ํ„ฐ ๋‹ค์‹œ ์‹œ์ž‘ํ–ˆ๋‹ค. 4์‹œ๊ฐ„๋งŒ์— ํ•  ์ˆ˜ ์žˆ์„๊นŒ?

Problem

It was a 7-category classification problem: person, horse, house, dog, guitar, giraffe, and elephant. A dataset of 1,698 images was provided, and each image sat inside a folder named after the English word for its class.

Deliverables

  • Code annotated with explanations

  • A csv file with the results on the test data

Implementation

Machine Learning Assignment Test

Unzip the archives containing the image files.

!unzip test.zip
!rm -r train
!unzip train.zip

Import the required libraries.

import os
import numpy as np
import matplotlib.pyplot as plt
import pandas as pd
import cv2
from sklearn.utils import shuffle
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
import tensorflow as tf
from tensorflow.keras.models import Model
from tensorflow.keras.utils import to_categorical
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, Conv2D, MaxPool2D, Activation, Flatten, Dense
from tensorflow.keras.losses import categorical_crossentropy
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.callbacks import ModelCheckpoint
from tensorflow.keras.models import load_model

์ด๋ฏธ์ง€์˜ ํฌ๊ธฐ๋ฅผ ์ •์˜ํ•ฉ๋‹ˆ๋‹ค. ๊ทธ๋ฆฌ๊ณ  ์ด๋ฏธ์ง€์˜ ๊ฐ ๋ผ๋ฒจ๋“ค์„ ์ •์ˆ˜์™€ ๋งค์นญํ•ฉ๋‹ˆ๋‹ค

image_width = 227
image_height = 227

class_names = ['dog', 'elephant', 'giraffe', 'guitar', 'horse', 'house', 'person',]
class_names_label = {class_name:i for i, class_name in enumerate(class_names)}
num_classes = len(class_names)

train_path = './train'

Load the images from the train folder into a list. Each image's label is determined by the name of the folder it lives in.

images = []
labels = []

for dirname, _, filenames in os.walk(train_path):
  category = dirname.replace('./train/','')
  if category == './train':
    continue
  label = class_names_label[category]
  for filename in filenames:
    path = os.path.join(dirname, filename)
    image = cv2.imread(path)
    image = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)
    images.append(image)
    labels.append(label)

images = np.array(images, dtype = 'float32')
labels = np.array(labels, dtype = 'int32')   

๋ฐ์ดํ„ฐ์…‹์„ ์…”ํ”Œํ•˜๊ณ  ์„ฑ๋Šฅ ์ธก์ •์„ ์œ„ํ•ด ํ…Œ์ŠคํŠธ ๋ฐ์ดํ„ฐ์…‹์„ 20% ๋ถ„๋ฆฌํ•ฉ๋‹ˆ๋‹ค

images, labels = shuffle(images, labels, random_state=25)
images = images / 255.0
train_images, test_images, train_labels, test_labels = train_test_split(images, labels, test_size=0.2)

We now have 1,358 training examples and 340 test examples, for a total of 1,698.

n_train = train_labels.shape[0]
n_test = test_labels.shape[0]

print ("Number of training examples: {}".format(n_train))
print ("Number of test examples: {}".format(n_test))
Number of training examples: 1358
Number of test examples: 340

Let's build a simple CNN by hand and check its performance, additionally holding out 20% of the training data as a validation set.

model = Sequential([
    Conv2D(32, (3, 3), activation = 'relu', input_shape = (image_height, image_width, 3)), 
    MaxPool2D(2,2),
    Conv2D(64, (3, 3), activation = 'relu'),
    MaxPool2D(2,2),
    Conv2D(128, (3, 3), activation = 'relu'),
    MaxPool2D(2,2),
    Flatten(),
    Dense(256, activation=tf.nn.relu),
    Dense(num_classes, activation=tf.nn.softmax)
])
model.compile(optimizer = 'adam', loss = 'sparse_categorical_crossentropy', metrics=['accuracy'])
history = model.fit(train_images, train_labels, batch_size=64, epochs=20, validation_split = 0.2)
Epoch 1/20
17/17 [==============================] - 38s 240ms/step - loss: 3.6401 - accuracy: 0.1889 - val_loss: 1.9124 - val_accuracy: 0.2243
Epoch 2/20
17/17 [==============================] - 2s 110ms/step - loss: 1.8678 - accuracy: 0.2314 - val_loss: 1.8660 - val_accuracy: 0.2279
Epoch 3/20
17/17 [==============================] - 2s 110ms/step - loss: 1.7636 - accuracy: 0.2779 - val_loss: 1.7660 - val_accuracy: 0.3125
Epoch 4/20
17/17 [==============================] - 2s 111ms/step - loss: 1.6425 - accuracy: 0.3904 - val_loss: 1.6835 - val_accuracy: 0.3566
Epoch 5/20
17/17 [==============================] - 2s 110ms/step - loss: 1.4164 - accuracy: 0.4867 - val_loss: 1.7457 - val_accuracy: 0.3382
Epoch 6/20
17/17 [==============================] - 2s 110ms/step - loss: 1.0570 - accuracy: 0.6507 - val_loss: 1.6682 - val_accuracy: 0.3676
Epoch 7/20
17/17 [==============================] - 2s 109ms/step - loss: 0.7076 - accuracy: 0.8003 - val_loss: 1.8565 - val_accuracy: 0.3787
Epoch 8/20
17/17 [==============================] - 2s 109ms/step - loss: 0.4653 - accuracy: 0.8563 - val_loss: 2.2181 - val_accuracy: 0.3824
Epoch 9/20
17/17 [==============================] - 2s 111ms/step - loss: 0.2103 - accuracy: 0.9435 - val_loss: 2.4412 - val_accuracy: 0.4007
Epoch 10/20
17/17 [==============================] - 2s 112ms/step - loss: 0.0872 - accuracy: 0.9852 - val_loss: 3.1015 - val_accuracy: 0.3787
Epoch 11/20
17/17 [==============================] - 2s 110ms/step - loss: 0.0549 - accuracy: 0.9883 - val_loss: 3.1400 - val_accuracy: 0.3676
Epoch 12/20
17/17 [==============================] - 2s 112ms/step - loss: 0.0546 - accuracy: 0.9836 - val_loss: 3.5330 - val_accuracy: 0.3860
Epoch 13/20
17/17 [==============================] - 2s 111ms/step - loss: 0.0174 - accuracy: 0.9989 - val_loss: 3.5670 - val_accuracy: 0.4118
Epoch 14/20
17/17 [==============================] - 2s 110ms/step - loss: 0.0107 - accuracy: 1.0000 - val_loss: 3.6782 - val_accuracy: 0.4301
Epoch 15/20
17/17 [==============================] - 2s 110ms/step - loss: 0.0028 - accuracy: 1.0000 - val_loss: 3.9017 - val_accuracy: 0.4375
Epoch 16/20
17/17 [==============================] - 2s 111ms/step - loss: 0.0016 - accuracy: 1.0000 - val_loss: 4.0851 - val_accuracy: 0.4375
Epoch 17/20
17/17 [==============================] - 2s 111ms/step - loss: 9.3226e-04 - accuracy: 1.0000 - val_loss: 4.1403 - val_accuracy: 0.4412
Epoch 18/20
17/17 [==============================] - 2s 111ms/step - loss: 7.6645e-04 - accuracy: 1.0000 - val_loss: 4.1752 - val_accuracy: 0.4412
Epoch 19/20
17/17 [==============================] - 2s 111ms/step - loss: 5.6912e-04 - accuracy: 1.0000 - val_loss: 4.2507 - val_accuracy: 0.4412
Epoch 20/20
17/17 [==============================] - 2s 111ms/step - loss: 5.1643e-04 - accuracy: 1.0000 - val_loss: 4.3206 - val_accuracy: 0.4449

Performance on the training data is very good, but performance on the validation data is nowhere near as high.

def plot_accuracy_loss(history):
    fig = plt.figure(figsize=(10,5))

    plt.subplot(221)
    plt.plot(history.history['accuracy'],'bo--', label = "accuracy")
    plt.plot(history.history['val_accuracy'], 'ro--', label = "val_accuracy")
    plt.title("train_acc vs val_acc")
    plt.ylabel("accuracy")
    plt.xlabel("epochs")
    plt.legend()

    plt.subplot(222)
    plt.plot(history.history['loss'],'bo--', label = "loss")
    plt.plot(history.history['val_loss'], 'ro--', label = "val_loss")
    plt.title("train_loss vs val_loss")
    plt.ylabel("loss")
    plt.xlabel("epochs")

    plt.legend()
    plt.show()
plot_accuracy_loss(history)

The graphs show that validation performance stops improving after roughly 5 epochs.

test_loss = model.evaluate(test_images, test_labels)
11/11 [==============================] - 1s 45ms/step - loss: 5.0795 - accuracy: 0.4118

Test accuracy is around 0.4 (between 0.35 and 0.45 across several runs).
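Since validation loss stops improving after a few epochs while training accuracy keeps climbing, stopping early would avoid the wasted epochs. The notebook doesn't do this, but the patience logic (which is what Keras's `EarlyStopping` callback implements) can be sketched in plain Python; `early_stop_epoch` is a hypothetical helper, not part of the original code:

```python
def early_stop_epoch(val_losses, patience=3):
    """Return the 0-indexed epoch at which training would stop:
    the first epoch where val_loss has not improved for `patience`
    epochs in a row. Returns the last epoch if it never triggers."""
    best = float('inf')
    wait = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best, wait = loss, 0
        else:
            wait += 1
            if wait >= patience:
                return epoch
    return len(val_losses) - 1

# First 10 val_loss values from the run above: the minimum (1.6682)
# is at epoch 5, so with patience=3 training stops at epoch 8.
curve = [1.9124, 1.8660, 1.7660, 1.6835, 1.7457,
         1.6682, 1.8565, 2.2181, 2.4412, 3.1015]
print(early_stop_epoch(curve))  # 8
```

In Keras this corresponds to passing `tf.keras.callbacks.EarlyStopping(monitor='val_loss', patience=3)` in the `callbacks` argument of `model.fit`.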

Image Augmentation

๋ฐ์ดํ„ฐ์…‹์˜ ๊ฐฏ์ˆ˜๊ฐ€ ๋งŽ์•„์งˆ์ˆ˜๋ก ๋ชจ๋ธ์ด ์˜ค๋ฒ„ํ”ผํŒ…๋  ๊ฐ€๋Šฅ์„ฑ์ด ์ค„์–ด๋“ค๋ฉฐ, ์ด๋Š” ์„ฑ๋Šฅ์˜ ๊ฐœ์„ ์š”์†Œ๊ฐ€ ๋ฉ๋‹ˆ๋‹ค. ๋ฐ์ดํ„ฐ์…‹์˜ ๊ฐฏ์ˆ˜๋ฅผ ์ฆ๊ฐ€์‹œํ‚ค๊ธฐ ์œ„ํ•ด ImageDataGenerator ๋ฅผ ์ด์šฉํ•ฉ๋‹ˆ๋‹ค.

The dataset currently holds 1,698 images. We grow it by generating augmented copies of each image: with 3 generated per original, the total becomes 1698 × 4 = 6792. (1698 × 2 = 3396, 1698 × 3 = 5094, 1698 × 4 = 6792, 1698 × 5 = 8490)

  • ๋” ๋งŽ์€ ์ด๋ฏธ์ง€๋Š” colab์—์„œ ๋ฆฌ์†Œ์Šค ๋ถ€์กฑ์œผ๋กœ ์ž‘๋™ํ•˜์ง€ ์•Š์•„์„œ ์ถ”๊ฐ€๋กœ 2๊ฐœ๊นŒ์ง€๋งŒ ์ƒ์„ฑํ–ˆ์Šต๋‹ˆ๋‹ค.(์ด 3๊ฐœ)

from tensorflow.keras.preprocessing.image import ImageDataGenerator, array_to_img, img_to_array, load_img

imageGenerator = ImageDataGenerator(
    rotation_range=20,
    width_shift_range=0.1,
    height_shift_range=0.1,
    brightness_range=[0.8, 1.2],
    horizontal_flip=True,
)

for dirname, _, filenames in os.walk(train_path):
  category = dirname.replace('./train/','')
  if category == './train':
    continue
  for filename in filenames:
    img = load_img(os.path.join(dirname, filename))
    x = img_to_array(img) 
    x = x.reshape((1, ) + x.shape)  
    i = 0
    for batch in imageGenerator.flow(x,batch_size = 1,
                                     save_to_dir = os.path.join(train_path, category),
                                     save_format ='jpg'):
        i += 1
        if i == 2: 
            break

The augmentation settings are as follows.

  • rotation_range = 20

    • Sets the random rotation range to 20 degrees.

  • width_shift_range = 0.1

  • height_shift_range = 0.1

    • Sets the horizontal and vertical shift fraction to 0.1.

  • brightness_range = [0.8, 1.2]

    • Varies the brightness from -20% to +20%.

  • horizontal_flip=True

    • Randomly flips images horizontally.

train_path = './train'
images = []
labels = []

for dirname, _, filenames in os.walk(train_path):
  category = dirname.split('/')[-1]
  if category == 'train':
    continue
  label = class_names_label[category]
  for filename in filenames:
    path = os.path.join(dirname, filename)
    image = cv2.imread(path)
    image = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)
    images.append(image)
    labels.append(label)

images = np.array(images, dtype = 'float32')
labels = np.array(labels, dtype = 'int32')   


images, labels = shuffle(images, labels, random_state=25)
images = images / 255.0
train_images, test_images, train_labels, test_labels = train_test_split(images, labels, test_size=0.2)
n_train = train_labels.shape[0]
n_test = test_labels.shape[0]

print ("Number of training examples: {}".format(n_train))
print ("Number of test examples: {}".format(n_test))
Number of training examples: 3989
Number of test examples: 998
model.compile(optimizer = 'adam', loss = 'sparse_categorical_crossentropy', metrics=['accuracy'])
history = model.fit(train_images, train_labels, batch_size=64, epochs=20, validation_split = 0.2)
Epoch 1/20
50/50 [==============================] - 8s 153ms/step - loss: 2.2086 - accuracy: 0.4334 - val_loss: 1.4567 - val_accuracy: 0.4900
Epoch 2/20
50/50 [==============================] - 5s 109ms/step - loss: 1.1767 - accuracy: 0.5928 - val_loss: 1.3364 - val_accuracy: 0.5426
Epoch 3/20
50/50 [==============================] - 5s 110ms/step - loss: 0.6962 - accuracy: 0.7762 - val_loss: 1.4134 - val_accuracy: 0.5564
Epoch 4/20
50/50 [==============================] - 6s 111ms/step - loss: 0.3223 - accuracy: 0.9011 - val_loss: 1.9013 - val_accuracy: 0.5501
Epoch 5/20
50/50 [==============================] - 6s 111ms/step - loss: 0.1151 - accuracy: 0.9695 - val_loss: 2.3870 - val_accuracy: 0.5313
Epoch 6/20
50/50 [==============================] - 6s 110ms/step - loss: 0.0513 - accuracy: 0.9889 - val_loss: 2.5908 - val_accuracy: 0.5602
Epoch 7/20
50/50 [==============================] - 6s 112ms/step - loss: 0.0282 - accuracy: 0.9938 - val_loss: 2.8788 - val_accuracy: 0.5238
Epoch 8/20
50/50 [==============================] - 6s 111ms/step - loss: 0.0187 - accuracy: 0.9968 - val_loss: 3.5123 - val_accuracy: 0.5125
Epoch 9/20
50/50 [==============================] - 6s 111ms/step - loss: 0.0382 - accuracy: 0.9899 - val_loss: 3.8634 - val_accuracy: 0.4561
Epoch 10/20
50/50 [==============================] - 6s 112ms/step - loss: 0.1146 - accuracy: 0.9645 - val_loss: 2.8725 - val_accuracy: 0.5088
Epoch 11/20
50/50 [==============================] - 6s 111ms/step - loss: 0.0543 - accuracy: 0.9849 - val_loss: 2.9627 - val_accuracy: 0.5276
Epoch 12/20
50/50 [==============================] - 6s 111ms/step - loss: 0.0163 - accuracy: 0.9961 - val_loss: 3.2457 - val_accuracy: 0.4987
Epoch 13/20
50/50 [==============================] - 6s 112ms/step - loss: 0.0150 - accuracy: 0.9974 - val_loss: 3.4321 - val_accuracy: 0.5238
Epoch 14/20
50/50 [==============================] - 6s 112ms/step - loss: 0.0058 - accuracy: 0.9990 - val_loss: 3.7935 - val_accuracy: 0.5288
Epoch 15/20
50/50 [==============================] - 6s 111ms/step - loss: 6.9671e-04 - accuracy: 1.0000 - val_loss: 3.8625 - val_accuracy: 0.5401
Epoch 16/20
50/50 [==============================] - 6s 112ms/step - loss: 2.6550e-04 - accuracy: 1.0000 - val_loss: 3.9490 - val_accuracy: 0.5414
Epoch 17/20
50/50 [==============================] - 6s 112ms/step - loss: 1.7575e-04 - accuracy: 1.0000 - val_loss: 3.9936 - val_accuracy: 0.5439
Epoch 18/20
50/50 [==============================] - 6s 112ms/step - loss: 1.4344e-04 - accuracy: 1.0000 - val_loss: 4.0374 - val_accuracy: 0.5439
Epoch 19/20
50/50 [==============================] - 6s 112ms/step - loss: 1.1886e-04 - accuracy: 1.0000 - val_loss: 4.0765 - val_accuracy: 0.5439
Epoch 20/20
50/50 [==============================] - 6s 112ms/step - loss: 1.0780e-04 - accuracy: 1.0000 - val_loss: 4.1103 - val_accuracy: 0.5439
plot_accuracy_loss(history)
test_loss = model.evaluate(test_images, test_labels)
32/32 [==============================] - 1s 25ms/step - loss: 3.8832 - accuracy: 0.5341

Test performance has improved:

  • Simple CNN: 0.41

  • With data augmentation: 0.53

VGG16 Model

The classifier implemented so far is shallow. To get a better one, we use VGG, a model provided by tensorflow. The features extracted by the VGG model are then passed through fully connected Dense layers to produce the output.

from keras.applications.vgg16 import VGG16
from keras.preprocessing import image
from keras.applications.vgg16 import preprocess_input

model = VGG16(weights='imagenet', include_top=False)
train_features = model.predict(train_images)
test_features = model.predict(test_images)
n_train, x, y, z = train_features.shape
n_test, x, y, z = test_features.shape
numFeatures = x * y * z
model2 = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape = (x, y, z)),
    tf.keras.layers.Dense(50, activation=tf.nn.relu),
    tf.keras.layers.Dense(num_classes, activation=tf.nn.softmax)
])

model2.compile(optimizer = 'adam', loss = 'sparse_categorical_crossentropy', metrics=['accuracy'])

history2 = model2.fit(train_features, train_labels, batch_size=64, epochs=15, validation_split = 0.2)
Epoch 1/15
50/50 [==============================] - 1s 16ms/step - loss: 1.8851 - accuracy: 0.4079 - val_loss: 0.6769 - val_accuracy: 0.7581
Epoch 2/15
 1/50 [..............................] - ETA: 0s - loss: 0.5036 - accuracy: 0.8438
KeyboardInterrupt: the training run was interrupted manually at this point.
plot_accuracy_loss(history)  # note: history2 was never assigned because fit was interrupted, so the earlier history is plotted
test_loss = model2.evaluate(test_features, test_labels)
32/32 [==============================] - 0s 3ms/step - loss: 0.3025 - accuracy: 0.8938

์„ฑ๋Šฅ์ด ๋น„์•ฝ์ ์œผ๋กœ ์ƒ์Šนํ•œ ๊ฒƒ์„ ๋ณผ ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค.

  • Simple 0.41

  • Data Augmentation 0.53

  • VGG16 + Data Augmentation 0.89

Ensemble, ์•™์ƒ๋ธ”

์•™์ƒ๋ธ” ๋Ÿฌ๋‹์€ ์—ฌ๋Ÿฌ๊ฐœ์˜ ๋ถ„๋ฅ˜๊ธฐ๋ฅผ ํ†ตํ•ด ํ•™์Šตํ•˜๊ณ  ํ•™์Šตํ•œ ๊ฒฐ๊ณผ๋ฅผ ์ข…ํ•ฉ(ํ‰๊ท  ๋˜๋Š” ์ตœ๋นˆ๊ฐ’)ํ•˜์—ฌ ์ตœ์ข… ๊ฒฐ๊ณผ๊ฐ’์„ ๊ฒฐ์ •ํ•˜๋Š” ๊ฒƒ์„ ๋งํ•ฉ๋‹ˆ๋‹ค.

์ด๋•Œ, ์•™์ƒ๋ธ” ๋Ÿฌ๋‹์„ ์šด์šฉํ•˜๋Š” ๋ฐฉ๋ฒ•์€ ์„ธ ๊ฐ€์ง€๊ฐ€ ์žˆ์Šต๋‹ˆ๋‹ค.

  • ์„œ๋กœ ๋‹ค๋ฅธ ๋ถ„๋ฅ˜๊ธฐ๋กœ ๋™์ผํ•œ ๋ฐ์ดํ„ฐ์…‹์— ๋Œ€ํ•ด ๋‹ค๋ฅธ ๊ฒฐ๊ณผ๋ฅผ ์–ป๋Š”๋‹ค

  • ์„œ๋กœ ๋‹ค๋ฅธ ๋ฐ์ดํ„ฐ๋กœ ๋™์ผํ•œ ๋ถ„๋ฅ˜๊ธฐ์— ๋Œ€ํ•ด ๋‹ค๋ฅธ ๊ฒฐ๊ณผ๋ฅผ ์–ป๋Š”๋‹ค

  • ๋˜๋Š”, ์ด ๋‘˜์„ ๋‘˜ ๋‹ค ์šด์šฉํ•œ๋‹ค.

์—ฌ๊ธฐ์„œ๋Š” 10๊ฐœ์˜ ์„œ๋กœ ๋‹ค๋ฅธ ๋ถ„๋ฅ˜๊ธฐ๋กœ ๋™์ผํ•œ ๋ฐ์ดํ„ฐ์…‹์— ๋Œ€ํ•ด ๋‹ค๋ฅธ ๊ฒฐ๊ณผ๋ฅผ ์–ป์Šต๋‹ˆ๋‹ค. ๋ถ„๋ฅ˜๊ธฐ๊ฐ€ ์„œ๋กœ ๋‹ค๋ฅด๋‹ค๋Š” ๋œป์€ Dense Layer์—์„œ ์ž…๋ ฅ๊ฐ’์„ ๋‹ค๋ฅด๊ฒŒ ์ž…๋ ฅ๋ฐ›๋Š”๋‹ค๋Š” ๋œป์ด๋ฉฐ ์ด๋Š” VGG16์„ ๊ฑฐ์นœ ํŠน์ง•๋“ค์ด ๋žœ๋ค์œผ๋กœ Dropout๋œ 10๊ฐ€์ง€์˜ ์ž…๋ ฅ์„ ๋ฐ›๋Š”๋‹ค๋Š” ๋ง๊ณผ ๊ฐ™์Šต๋‹ˆ๋‹ค.

n_estimators = 10
max_samples = 0.8

max_samples *= n_train        # use 80% of the training set per bootstrap sample
max_samples = int(max_samples)
models = list()
# each estimator gets a randomly sized hidden layer (50–100 units)
random = np.random.randint(50, 100, size = n_estimators)

for i in range(n_estimators):
    
    model3 = tf.keras.Sequential([ tf.keras.layers.Flatten(input_shape = (x, y, z)),
                                    tf.keras.layers.Dense(random[i], activation=tf.nn.relu),
                                    tf.keras.layers.Dense(num_classes, activation=tf.nn.softmax)
                                ])
    
    model3.compile(optimizer = 'adam', loss = 'sparse_categorical_crossentropy', metrics=['accuracy'])
    
    models.append(model3)

histories = []

for i in range(n_estimators):
    # bootstrap: draw max_samples indices (with replacement) from the training set
    train_idx = np.random.choice(len(train_features), size = max_samples)
    histories.append(models[i].fit(train_features[train_idx], train_labels[train_idx], batch_size=64, epochs=10, validation_split = 0.1))
Epoch 1/10
45/45 [==============================] - 1s 9ms/step - loss: 1.9823 - accuracy: 0.3059 - val_loss: 1.0176 - val_accuracy: 0.7063
Epoch 2/10
45/45 [==============================] - 0s 6ms/step - loss: 0.9019 - accuracy: 0.7247 - val_loss: 0.5121 - val_accuracy: 0.8875
Epoch 3/10
45/45 [==============================] - 0s 7ms/step - loss: 0.3701 - accuracy: 0.9178 - val_loss: 0.3479 - val_accuracy: 0.9094
Epoch 4/10
45/45 [==============================] - 0s 7ms/step - loss: 0.2173 - accuracy: 0.9613 - val_loss: 0.2734 - val_accuracy: 0.9281
Epoch 5/10
45/45 [==============================] - 0s 6ms/step - loss: 0.1204 - accuracy: 0.9880 - val_loss: 0.2201 - val_accuracy: 0.9375
Epoch 6/10
45/45 [==============================] - 0s 6ms/step - loss: 0.0818 - accuracy: 0.9940 - val_loss: 0.1895 - val_accuracy: 0.9406
Epoch 7/10
45/45 [==============================] - 0s 6ms/step - loss: 0.0501 - accuracy: 0.9982 - val_loss: 0.1696 - val_accuracy: 0.9406
Epoch 8/10
45/45 [==============================] - 0s 7ms/step - loss: 0.0344 - accuracy: 0.9996 - val_loss: 0.1632 - val_accuracy: 0.9312
Epoch 9/10
45/45 [==============================] - 0s 6ms/step - loss: 0.0235 - accuracy: 0.9998 - val_loss: 0.1513 - val_accuracy: 0.9563
Epoch 10/10
45/45 [==============================] - 0s 6ms/step - loss: 0.0191 - accuracy: 0.9993 - val_loss: 0.1467 - val_accuracy: 0.9625
(output truncated: the remaining 9 estimators train similarly, each finishing 10 epochs with a validation accuracy roughly between 0.91 and 0.96)
predictions = []
for i in range(n_estimators):
    predictions.append(models[i].predict(test_features))
    
predictions = np.array(predictions)
predictions = predictions.sum(axis = 0)
pred_labels = predictions.argmax(axis=1)
print("Accuracy : {}".format(accuracy_score(test_labels, pred_labels)))
Accuracy : 0.8937875751503006
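
The cell above uses soft voting: the 10 softmax outputs are summed (equivalent to averaging) before the argmax. Since results can also be combined by mode, a hard-voting variant, where each estimator casts one vote per sample, could be sketched as follows. The random array here is only a stand-in for the real `(n_estimators, n_samples, num_classes)` `predictions` array:

```python
import numpy as np

# dummy stand-in for the (n_estimators, n_samples, num_classes) softmax outputs
rng = np.random.default_rng(0)
predictions = rng.random((10, 5, 3))

# hard voting: each estimator votes for its argmax class, per sample
votes = predictions.argmax(axis=2)                 # shape (n_estimators, n_samples)
pred_labels = np.array([np.bincount(votes[:, s], minlength=3).argmax()
                        for s in range(votes.shape[1])])
print(pred_labels.shape)  # → (5,)
```

Note that `np.bincount(...).argmax()` breaks ties in favor of the lower class index; with only 10 voters and 3 classes, ties do occur.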

์„ฑ๋Šฅ์ด ์กฐ๊ธˆ ๋” ์ƒ์Šนํ•œ ๊ฒƒ์„ ๋ณผ ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค.

  • Simple 0.41

  • Data Augmentation 0.53

  • VGG16 + Data Augmentation 0.89

  • VGG16 + Data Augmentation + Ensemble 0.90 (0.89~0.91)

ํ…Œ์ŠคํŠธ ๋ฐ์ดํ„ฐ ์ ์šฉํ•˜๊ธฐ

test_path = './test/0'
test_images = []

for dirname, _, filenames in os.walk(test_path):
  category = dirname.split('/')[-1]  # not used below; the test images are unlabeled
  for filename in filenames:
    path = os.path.join(dirname, filename)
    image = cv2.imread(path)
    image = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)
    test_images.append(image)

test_images = np.array(test_images, dtype = 'float32')
test_features = model.predict(test_images)  # same VGG16 feature extractor used for training
predictions = []
for i in range(n_estimators):
    predictions.append(models[i].predict(test_features))
    
predictions = np.array(predictions)
predictions = predictions.sum(axis = 0)
pred = predictions.argmax(axis=1)

๋ชจ๋ธ์˜ ๊ฒฐ๊ณผ๊ฐ’์„ ๋ฐ์ดํ„ฐํ”„๋ ˆ์ž„์œผ๋กœ ๋งŒ๋“  ํ›„ csv ํŒŒ์ผ๋กœ ์ €์žฅํ•ฉ๋‹ˆ๋‹ค.

def write_preds(pred, fname):
    pd.DataFrame({"answer value": pred}).to_csv(fname, index=False, header=True)

write_preds(pred, "result.csv")

๊ฒฐ๊ณผํŒŒ์ผ๋กœ ์–ป์€ csv ํŒŒ์ผ์„ ๋‹ค์šด๋กœ๋“œ ํ•ฉ๋‹ˆ๋‹ค.

from google.colab import files
files.download('result.csv')

์—ํ•„๋กœ๊ทธ

์˜ค ์ž˜ ํ’€์€๊ฑฐ ๊ฐ™์€๋ฐ?? ์„ฑ๋Šฅ๋„ ์ข‹๊ณ !! ๊ฒฐ๊ณผ๋Š”?? ์ด๋ผ๊ณ  ํ•œ๋‹ค๋ฉด ์‹œ๊ฐ„์ด ๋ถ€์กฑํ–ˆ๋‹ค. ๋‚ด๊ฐ€ ์ž์ดˆํ•œ ์ผ์ด๊ธฐ์— ์‚ด์ง ์•„์‰ฝ๊ธด ํ•˜๋‹ค. ์ž ์„ ์•ˆ์žค์œผ๋ฉด ์ข‹์€ ์„ฑ์ ์„ ๋‚ด์ง€ ์•Š์•˜์„๊นŒ ์ƒ๊ฐ. ์•„์‰ฌ์šด ๋งˆ์Œ์— .ipynb ํŒŒ์ผ์ด๋ผ๋„ ์ œ์ถœํ–ˆ๋‹ค. ๋ฆฌ๋”๋ณด๋“œ ์„ฑ์ ์€ 0์ ๋„ ์•„๋‹ˆ๋‹ค. ๊ทธ๋ƒฅ ๋ฏธ์ œ์ถœ.

๊ฑฐ์˜ ๋‹ค ํ–ˆ๋Š”๋ฐ ๋งˆ์ง€๋ง‰์— ๋ณ€์ˆ˜๋ช…์„ ํ—ท๊ฐˆ๋ฆฌ๊ฒŒ ์จ์„œ ํ•œ์ฐธ ๋™์•ˆ ๋ง‰ํžŒ ์—๋Ÿฌ๋ฅผ ํ•ด๊ฒฐํ•˜์ง€ ๋ชปํ–ˆ๋‹ค. ๋๋‚˜๊ณ  30๋ถ„์ด ์ง€๋‚œ ํ›„์—์•ผ ์—๋Ÿฌ๋ฅผ ํ•ด๊ฒฐํ–ˆ๋‹ค.

ํฐ ์•„์‰ฌ์›€์€ ์žˆ์ง€๋งŒ, ๊ทธ๋ž˜๋„ ์ด๋Ÿฐ ๊ณผ์ œ๋ฅผ ์ฒ˜์Œ ํ•ด๋ณด๋Š”๋ฐ ๋‚˜๋ฆ„ ์ž˜ ํ•œ๊ฒƒ ๊ฐ™๋‹ค. ์ตœ๊ทผ์— ๋ฐฐ์šด ์•™์ƒ๋ธ”๊ณผ Image Augmentation์„ ํ™œ์šฉํ•  ์ˆ˜ ์žˆ์–ด์„œ ์ข‹์•˜๋‹ค. ๊ทธ๋ฆฌ๊ณ  VGG16 ๋ชจ๋ธ๋„ ์ฒ˜์Œ ์‚ฌ์šฉํ•ด๋ดค๋‹ค. ๊ทธ๋ฆฌ๊ณ  ์„ฑ๋Šฅ๋„ ๊ฝค ์ค€์ˆ˜ํ•ด์„œ ์ข‹์•˜๋‹ค.

๋‹ค์Œ์—๋Š” ํ•œ๋ฒˆ์— ๊ธด ์ฝ”๋“œ๋ฅผ ์“ธ ์ˆ˜ ์žˆ์„ ์ •๋„๋กœ(๊ทธ๋งŒํผ ๊ฒ€์ƒ‰์„ ์ ๊ฒŒ ํ•  ์ •๋„๋กœ) ์ต์ˆ™ํ•ด์ง€๊ณ  ๋ฐ•ํ•™ํ•ด์ง€๊ณ  ์‹ถ๋‹ค! ๋‹ค์Œ Dev-matching ์ „์— ์ทจ์—…์„ ํ• ๊ฒƒ์ด๊ธฐ ๋•Œ๋ฌธ์— ๋‹ค์Œ ์‹œํ—˜์„ ๋ชป๋ณด๋Š” ๊ฒƒ์ด ์•„์‰ฝ์ง€๋งŒ!

๋†€๋ž€์ (?)์ด ๋ช‡ ๊ฐœ ์žˆ์–ด์„œ ๋„์ ์ด๊ณ  ๊ฐ„๋‹ค

  • ์ƒ๊ฐ๋ณด๋‹ค ๋ฆฌ๋”๋ณด๋“œ 98์  ์ด์ƒ์ด ์ˆ˜๋‘๋ฃฉ ํ–ˆ๋‹ค. 100์ ๋„ ๋ง‰ํŒ๊ฐ€์„œ๋Š” 10๋ช… ์ด์ƒ์€ ์žˆ๋˜ ๊ฒƒ ๊ฐ™๋‹ค.

    • ์ด๋ฏธ 4์‹œ๊ฐ„์ด ์ง€๋‚œ ์‹œ์ ์— 100์ ์ด 3๋ช…์ด์—ˆ๋‹ค.

    • ๋‚˜๋Š” ๋ช‡์ ์ด์—ˆ์„๊นŒ ใ… ใ… 

    • ๊ทธ๋ž˜๋„ ๊ณ ๋“์  ์ˆœ์œ„์—๋Š” ๋ชป๋“ค์—ˆ์„ ๊ฒƒ ๊ฐ™๋‹ค. ์ด๋ฏธ ํ…Œ์ŠคํŠธ๋ž‘ ๊ฒ€์ฆ ๋ฐ์ดํ„ฐ ์„ฑ๋Šฅ์ด 90์ ๋Œ€์ด๋‹ˆ๊นŒ?

  • private test data์— ๋Œ€ํ•œ 100์ ์€ ํ•œ๋ช…์ด์—ˆ๋‹ค. ๊ทผ๋ฐ ์ด ํ•œ๋ช…์ด ๋ฆฌ๋”๋ณด๋“œ์—์„œ 100์ ์€ ์•„๋‹ˆ์—ˆ๋˜ ๊ฒƒ ๊ฐ™๋‹ค.

    • ์—ญ์‹œ ์šด๋นจ(?)

  • colab์—์„œ ์•™์ƒ๋ธ” ๊ตฌํ˜„ํ•  ๋•Œ RAM์ด ๋„ˆ๋ฌด ๋ถ€์กฑํ•˜๋‹ค. ์ด๊ฒƒ ๋•Œ๋ฌธ์— ์žก์•„๋จน์€ ์‹œ๊ฐ„์ด 1์‹œ๊ฐ„์€ ๋„˜๋Š” ๊ฒƒ ๊ฐ™๋‹ค. ์ž๊พธ ์„ธ์…˜์ด ์ดˆ๊ธฐํ™”๋ผ์„œ ๋„ˆ๋ฌด ํž˜๋“ค์—ˆ๋‹ค.

    • ๋ถ„๋ฅ˜๊ธฐ ๊ฐฏ์ˆ˜๋Š” 10๊ฐœ๋กœ ๋งŽ์ด ์„ค์ •ํ–ˆ๋Š”๋ฐ ๊ทธ๋Œ€์‹  epoch๋ฅผ 10๊นŒ์ง€ ๋ฐ–์— ๋ชปํ–ˆ๋‹ค.

      • epoch = 12๋„ ์„ธ์…˜์ด ์ดˆ๊ธฐํ™”๋๋‹ค ใ… ใ… 

  • Image Augmentation์„ ๋Œ๋ฆด ๋•Œ๋„ ๋งŽ์€ ๋ฐ์ดํ„ฐ๋ฅผ ์ฒ˜๋ฆฌํ•˜๊ธฐ์—๋Š” ๋„ˆ๋ฌด ๋ถ€์กฑํ–ˆ๋‹ค.

    • ๋งŒ ์žฅ ์ •๋„๋ฅผ ๋Œ๋ ธ๋Š”๋ฐ ๋Œ์ง€ ๋ชปํ–ˆ๋‹ค.

    • ํ•ฉ์˜ ๋ณธ๊ฒƒ์€ 4,500์žฅ ์ •๋„...

  • ์ƒ๊ฐ๋ณด๋‹ค ๋ฌธ์ œ๊ฐ€ ์‰ฌ์› ๋‹ค.

    • ๋‚œ ๋ชจ๋ธ ๊ตฌํ˜„ํ•˜๋Š” ์ฝ”๋“œ๋ฅผ ์ด๋ฒˆ์— ์ฒ˜์Œ ์ž‘์„ฑํ•ด๋ดค๋‹ค. ๋ฌผ๋ก  ๊ตฌ๊ธ€๋ง๊ณผ ๊ฐ์ข… ์ฝ”๋“œ๋ฅผ ์ทจํ•ฉํ•œ ๊ฒƒ์ด๊ธด ํ•œ๋ฐ.. ๋ˆ„๊ตฌ๋Š” ์•ˆ๊ทธ๋Ÿด๊นŒ?

    • ๋” ๋งŽ์ด ๊ณต๋ถ€ํ•ด์•ผ๊ฒ ๋‹ค. ํ™”์ดํŒ…!

์ „์ฒด ํ›ˆ๋ จ ๋ฐ์ดํ„ฐ
dog ํด๋” ์•ˆ ์ด๋ฏธ์ง€