President & Co-Founder @OpenAI

Joined July 2010
Longest-ever episode of the web series Two Minute Papers (11 minutes 30 seconds, despite the name) on OpenAI Five's victory over the Dota 2 world champions: youtube.com/watch?v=tfb6aEUM…
Winter is coming again — someone fine-tuned GPT-2 to generate alternate Game of Thrones endings. buff.ly/30HutaE
Very impressed by the work done by our second class of OpenAI Scholars! Check out their projects here: openai.com/blog/openai-schol…
"Starting out in a new field is a bit like learning a new language. At first everything feels foreign and overwhelming, and it's important to press on through that feeling." — @mcleavey's advice on getting started in deep learning resonates with me! deeplearning.ai/blog/deep-le…
The P is for Pokémon.
OMG I added colors and flavor text so *in theory* GPT-2 is first creating a description of a Pokémon and then drawing what it thinks it looks like based on that description. Take that StyleGAN!
Surprisingly mesmerizing: nitter.vloup.ch/SamBouiss/status…. Cool to see PPO solve a game that's simple enough to anticipate what move it should make at each step.
@OpenAI here’s my snake trained for ~100000 frames, 5 epochs per frame with PPO
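For context, PPO's core idea is a clipped surrogate objective that keeps each policy update close to the policy that collected the data. A minimal numpy sketch of that objective (the function name and numbers are illustrative, not the snake bot's actual training code):

```python
import numpy as np

def ppo_clip_objective(ratio, advantage, eps=0.2):
    """PPO's clipped surrogate objective (to be maximized).

    ratio:     pi_new(a|s) / pi_old(a|s) for each sampled action
    advantage: estimated advantage for each action
    eps:       clip range (0.2 is the commonly used default)
    """
    unclipped = ratio * advantage
    clipped = np.clip(ratio, 1 - eps, 1 + eps) * advantage
    # Taking the minimum means a step that moves the policy too far
    # gets no extra credit beyond the clip boundary.
    return np.minimum(unclipped, clipped).mean()

adv = np.array([1.0])
print(ppo_clip_objective(np.array([1.1]), adv))  # small step: fully rewarded
print(ppo_clip_objective(np.array([2.0]), adv))  # large step: clipped at 1 + eps
```

The clipping is what lets PPO safely run multiple optimization epochs over the same batch of frames, as in the "5 epochs" setup above.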
The @BBCPhilharmonic will soon be playing a piece composed in collaboration with MuseNet
Thrilled to work with the talented and creative @Robert_Laidlow setting up #MuseNet as a part of the composition process. Can't wait to hear the performance!
Congrats @gralston! YC is in great hands.
I'm delighted @gralston is taking over as President of YC! He is YC at its best, and I'm excited to see where he takes things. blog.ycombinator.com/geoff-r…
Replying to @jerrick93
The program starts with teaching deep learning from the beginning. So usually better to apply for a full-time role if you're already doing deep learning! (We'll consider applications from people doing other forms of ML though.)
Our Fellows program trains people from other fields to make world-class machine learning contributions within 6 months. Kinda crazy that this is possible: nitter.vloup.ch/OpenAI/status/11… Applications are open for the next class of Fellows. Acceptances on a rolling basis, so apply today!
In just 6 months, our Fellows went from beginner to producing world-class ML contributions. Their accomplishments: MuseNet, analysis of bias/variance of gradients, implementations of very large-scale transformers, and learning adaptive weight sparsity: openai.com/blog/openai-fello…
Great podcast from @mcleavey on MuseNet and AI ethics: towardsdatascience.com/opena…
OpenAI robotics over the years:
Give AI a hand 👏
Very encouraging to see others engaging with the ethics of their NLP releases:
Replying to @Thom_Wolf
This release was a great occasion to discuss our ethics and values: open-source, education & community. We wrote a blog post discussing it. Tutorial: medium.com/huggingface/how-t… Code: github.com/huggingface/trans… Demo: convai.huggingface.co Ethics & values: medium.com/huggingface/ethic…
Replying to @RahelJhirad @OpenAI
OpenAI office in SF!
Please do! Many people currently working at OpenAI applied multiple times before getting an offer. One advantage is we get to see your growth over time.
The Scholars program is really inspiring — always amazing to see what people coming from other fields can accomplish in 3 months of training. Come see what's been built:
On Tues 5/14, we'll be hosting Scholars Demo Day. See what's been accomplished in the past 3 months: from intrinsic motivation for robotic tasks to fine-tuning GPT-2 for question answering. Join us in person! Request an invite: bit.ly/2vKlFlZ
Looks really cool. Great work!
Yesterday I launched Talk to Transformer, a site where you can try out @OpenAI's new text-generating language model on your own custom text. talktotransformer.com/ One example is below. @AlecRad @gdb
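Demos like this sample from a language model's next-token distribution, and a common trick for keeping the output coherent is top-k sampling: only the k most likely tokens are candidates at each step. A minimal numpy sketch (illustrative only, not the site's actual code):

```python
import numpy as np

def top_k_sample(logits, k, rng=None):
    """Sample a token index from the k highest-probability logits."""
    rng = rng or np.random.default_rng()
    logits = np.asarray(logits, dtype=float)
    top = np.argsort(logits)[-k:]                 # indices of the k largest logits
    probs = np.exp(logits[top] - logits[top].max())  # softmax over just those k
    probs /= probs.sum()
    return int(rng.choice(top, p=probs))

# With k=1 this reduces to greedy decoding (always the argmax):
print(top_k_sample([0.2, 3.0, 0.1], k=1))  # → 1
```

Larger k makes the output more varied; smaller k makes it more repetitive but safer.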
Replying to @SkyLi0n @OpenAI
Nope! Team's been working super hard on this since GPT-2 launch, and we just reached the point where we felt confident enough to take a next step (with a minor internal push to get it done by ICLR: iclr.cc/).