Two mechanisms for publishing GPT-2 (and hopefully future powerful models): staged release and partnership-based sharing.
We're:
1. Releasing the 345M model
2. Sharing the 1.5B model with partners working on countermeasures (please apply!) openai.com/blog/better-langu…
Wonder if this is in response to the release of the OpenWebTextCorpus: skylion007.github.io/OpenWeb…
Nope! The team has been working super hard on this since the GPT-2 launch, and we just reached the point where we felt confident enough to take the next step (with a minor internal push to get it done by ICLR: iclr.cc/).
May 5, 2019 · 7:22 PM UTC


