I asked @ilyasut how to set neural network init. He accidentally replied with a poem: You want to be on the edge of chaos Too small, and the init will be too stable, with vanishing gradients Too large, and you'll be unstable, due to exploding gradients You want to be on the edge

Apr 8, 2019 · 8:16 PM UTC

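Not part of the thread, but the poem's point can be checked numerically. Below is a minimal numpy sketch (the function name and parameters are my own, not from any library): a random signal is pushed through a deep stack of linear layers with i.i.d. Gaussian weights. Too small a weight scale and the signal (and, by symmetry, the backward gradient) vanishes; too large and it explodes; a scale of 1/sqrt(width) sits near the critical "edge".

```python
import numpy as np

def signal_norm_after_depth(scale, width=256, depth=50, seed=0):
    """Propagate a unit-norm random input through `depth` linear layers
    with i.i.d. Gaussian weights of std = scale / sqrt(width), and
    return the norm of the final activation."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(width)
    x /= np.linalg.norm(x)  # start from a unit-norm signal
    for _ in range(depth):
        W = rng.standard_normal((width, width)) * scale / np.sqrt(width)
        x = W @ x
    return np.linalg.norm(x)

# scale < 1: vanishing (norm shrinks by ~scale per layer)
small = signal_norm_after_depth(0.5)
# scale = 1: near the critical point, norm stays O(1)
critical = signal_norm_after_depth(1.0)
# scale > 1: exploding (norm grows by ~scale per layer)
large = signal_norm_after_depth(1.5)
```

For a purely linear stack the gradient is the same product of weight matrices traversed in reverse, so the same scale argument governs both forward signals and backward gradients; nonlinearities shift the critical scale (e.g. the factor of 2 in He init for ReLU) but not the qualitative picture.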
Replying to @gdb @ilyasut
@TheGregYang Looks like someone took a bit of inspiration from your work!
Replying to @gdb @ilyasut
Perfect. Can we get a book of these? 😎
Replying to @gdb @ilyasut
Balance on the edge of two infinities
Replying to @gdb @ilyasut
... Two roads diverged in a wood, and I, I took the one less traveled by, And that has made all the difference. ("The Road Not Taken")
Replying to @gdb @ilyasut
@m_aggrey I just wanted to say this is the best birthday gift you will ever receive
Replying to @gdb @ilyasut
Prof. Bengio mentioned in a talk once that interesting things happen when things are close to an edge (he was referring to ReLU activations that are positive and close to 0)
Replying to @gdb
Nice @ilyasut. We teach DL with rap at @Stanford too. Hmu for a feat. Hey yo bro, check our flow, It's not slow, it goes faster than YOLO, Ends up beating AlphaGo We're backpropping ur network Minimising ur net worth, Just look at ur loss landscape, You vanished & can't escape