
Deep Learning (Activation Function, Softmax, Hidden Units, Output Units)

by UKHYUN22 2021. 12. 23.
Activation Function
" Non-linearity Function " ์ด๋ผ๊ณ  ๋ถˆ๋ฆฐ๋‹ค.
Weighted sum์„ ํ•œ ๋ฒˆ ๋” ์ฒ˜๋ฆฌํ•ด์ฃผ๋Š” ์—ญํ• ์„ ํ•œ๋‹ค.
 
Activation Functions
Sigmoid function: squashes the net value into the range 0 to 1.
Hyperbolic tangent function: takes values from -1 to +1.
ReLU function: takes the max of the net value and 0. (All three are sketched below.)
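
As a quick illustration, here is a minimal NumPy sketch of the three functions above; the function names and sample inputs are my own, not from the post:

```python
import numpy as np

def sigmoid(z):
    # Squashes any real net value into (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

def tanh(z):
    # Squashes any real net value into (-1, +1).
    return np.tanh(z)

def relu(z):
    # Takes the max of the net value and 0.
    return np.maximum(z, 0.0)

z = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(sigmoid(z))  # all values in (0, 1)
print(tanh(z))     # all values in (-1, +1)
print(relu(z))     # negative net values clipped to 0
```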
 

Softmax Function
Used when you want to express scores over multiple categories probabilistically.
A probability must lie between 0 and 1, but a net value by itself is never a probability.
So we take the exponential of each score, which makes everything positive, and divide by the sum of the exponentials over all nodes.
That way no value can come out greater than 1, and the values sum to 1. If the raw score values are passed through the softmax function,
a probabilistic interpretation becomes possible.
 
Hidden Unit
: ๊ฑฐ์˜ ๋Œ€๋ถ€๋ถ„ ReLU๋ฅผ ์‚ฌ์šฉํ•˜๊ณ  ํŠน์ •ํ•œ ๊ฒฝ์šฐ LReLU์™€ PReLU๋ฅผ ์‚ฌ์šฉํ•œ๋‹ค.
: Hyper Tagzent๋ฅผ ๋งŽ์ด ์‚ฌ์šฉํ•˜๋Š” ๊ฒฝ์šฐ๋„ ์žˆ๋‹ค. recurent networtk๋ฅผ ์‚ฌ์šฉํ•˜๋Š” ๊ฒฝ์šฐ์—..!!!
: ์ผ๋ฐ˜์ ์ธ Network์—์„œ๋Š” ReLU function์„ ๋งŽ์ด ์‚ฌ์šฉํ•œ๋‹ค.
 
Output Unit
: regression์„ ํ•˜๋Š” ๊ฒฝ์šฐ activation function์„ ์•ˆํ•˜๋Š” ๊ฒฝ์šฐ๊ฐ€ ๋งŽ๋‹ค.
-> identity function์„ ์‚ฌ์šฉํ•ด์„œ -๋ฌดํ•œ๋Œ€์—์„œ +๋ฌดํ•œ๋Œ€์˜ ๊ฐ’์œผ๋กœ ์„ธํŒ…ํ•  ์ˆ˜ ์žˆ๋‹ค.
Tanh
: -1์—์„œ +1 ์‚ฌ์ด๋กœ ๊ฐ’์„ ๋‹จ์ •ํ•˜๊ณ  ์‹ถ์„ ๋•Œ
Softmax
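
As a rough side-by-side sketch of these three output-unit choices; the sample net values are invented for illustration, not from the post:

```python
import numpy as np

def softmax(scores):
    exps = np.exp(scores - np.max(scores))
    return exps / np.sum(exps)

net = np.array([1.5, -0.3, 0.8])  # net values at the output layer

# Regression head: identity function, output unbounded in (-inf, +inf).
regression_out = net

# Tanh head: output confined to (-1, +1).
bounded_out = np.tanh(net)

# Softmax head: outputs form a probability distribution over classes.
class_probs = softmax(net)

print(regression_out, bounded_out, class_probs, sep="\n")
```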