Learning Algorithm
Learning with the Gradient Descent Method
Randomly choose an initial solution, then repeatedly update it by a small step in the direction of the negative gradient of the error, until the error stops decreasing.
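A minimal sketch of this procedure, assuming a loss whose gradient we can evaluate and a fixed learning rate (the function and parameter names here are illustrative, not from the source):

```python
import numpy as np

def gradient_descent(grad_fn, w_init, lr=0.1, n_steps=100):
    """Start from a (randomly chosen) initial solution and
    repeatedly step against the gradient of the loss."""
    w = np.asarray(w_init, dtype=float)
    for _ in range(n_steps):
        w = w - lr * grad_fn(w)  # move in the direction of steepest descent
    return w

# Toy example: minimize f(w) = (w - 3)^2, whose gradient is 2 * (w - 3).
w_star = gradient_descent(lambda w: 2 * (w - 3), w_init=np.array([0.0]))
print(w_star)  # approaches [3.]
```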
Differentiation of Composite Functions (the Chain Rule)
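Backpropagation is repeated application of the chain rule to the layered composition of functions that a network computes. For a composition of two functions:

```latex
z = f(y), \quad y = g(x)
\quad\Longrightarrow\quad
\frac{dz}{dx} = \frac{dz}{dy} \cdot \frac{dy}{dx} = f'(g(x)) \, g'(x)
```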
Error Backpropagation
- Hard-limit activations (e.g., the step function) are not differentiable, so gradient-based learning cannot be applied to them; a differentiable activation is required.
- Basic Idea
- Given input-target pairs and the output of the neural network,
- Adjust the weights so as to minimize the error (a standard error definition is sketched below)
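The source does not spell out the error function, but the standard choice in this derivation is the squared error over the $m$ output units, with targets $t_{nk}$ and network outputs $o_k$:

```latex
E_n = \frac{1}{2} \sum_{k=1}^{m} \left( t_{nk} - o_k \right)^2
```

The factor 1/2 is purely for convenience: it cancels the 2 produced when the square is differentiated.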
Case 1: Weights between output and hidden layer
For the $n$-th training example $D_n = (x_{n1}, x_{n2}, \dots, x_{nd},\; t_{n1}, t_{n2}, \dots, t_{nm})$ with $d$ inputs and $m$ targets:
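Assuming the squared error above and sigmoid output units, i.e. $o_k = \sigma(net_k)$ with $net_k = \sum_j w_{jk} h_j$ (the notation $w_{jk}$ for the weight from hidden unit $j$ to output unit $k$, and the sigmoid choice, are assumptions, not stated in the source), the chain rule gives the gradient for an output-layer weight:

```latex
\frac{\partial E_n}{\partial w_{jk}}
  = \frac{\partial E_n}{\partial o_k}
    \cdot \frac{\partial o_k}{\partial net_k}
    \cdot \frac{\partial net_k}{\partial w_{jk}}
  = -\,(t_{nk} - o_k) \cdot o_k (1 - o_k) \cdot h_j
```

The gradient-descent update is then $w_{jk} \leftarrow w_{jk} + \eta\,(t_{nk} - o_k)\,o_k(1 - o_k)\,h_j$ with learning rate $\eta$; the reusable factor $\delta_k = (t_{nk} - o_k)\,o_k(1 - o_k)$ is the "error" that the algorithm propagates backward.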