A good analysis of the embeddings we use in the OpenAI Five model: neuro.cs.ut.ee/the-use-of-em…. After analyzing the rest of the model, the author concludes: “All the strategy and tactics must lie in one place – 1024-unit LSTM.” Pretty remarkable what LSTMs (invented in 1997!) can do at scale.

Sep 11, 2018 · 10:17 PM UTC

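For readers who haven't seen the recurrence behind that "1024-unit LSTM," here is a minimal numpy sketch of a single LSTM step with a 1024-dimensional hidden state. The input size, initialization, and unroll length are illustrative placeholders, not the actual Five architecture:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One standard LSTM step. The four gates are stacked in z as
    [input, forget, candidate, output] blocks of size `hidden`."""
    hidden = h.shape[0]
    z = W @ x + U @ h + b                  # (4*hidden,) pre-activations
    i = sigmoid(z[0:hidden])               # input gate
    f = sigmoid(z[hidden:2 * hidden])      # forget gate
    g = np.tanh(z[2 * hidden:3 * hidden])  # candidate cell update
    o = sigmoid(z[3 * hidden:4 * hidden])  # output gate
    c_new = f * c + i * g                  # cell state carries long-term memory
    h_new = o * np.tanh(c_new)             # hidden state is the per-step output
    return h_new, c_new

# Illustrative sizes only; the tweet's model keeps a 1024-unit hidden state.
hidden, inp = 1024, 64
rng = np.random.default_rng(0)
W = rng.standard_normal((4 * hidden, inp)) * 0.01
U = rng.standard_normal((4 * hidden, hidden)) * 0.01
b = np.zeros(4 * hidden)

h = np.zeros(hidden)
c = np.zeros(hidden)
for _ in range(16):  # unroll over a short sequence of observations
    x = rng.standard_normal(inp)
    h, c = lstm_step(x, h, c, W, U, b)
print(h.shape)  # (1024,)
```

The point of the quote is that this single recurrent state is the only place where information persists across timesteps, so any long-horizon strategy the model learns has to be encoded in those 1024 numbers.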
(Also, "at scale" is a very relative term. After two doublings, our latest Five model is 4096-units, and consumes roughly the same amount of compute per second as a honeybee.)
Replying to @gdb
Love this article! I'm curious now, ARE the weights shared between the five agents?