President & Co-Founder @OpenAI

Joined July 2010
This will be the first time an AI has attempted to play the world champions in an esports game. Excited and nervous to see what happens!
We’re going to San Francisco! We will face @OpenAI Five for their final live event on April 13th. Read: facebook.com/notes/og/openai… #DreamOG
Let's find out :)
Replying to @gdb
We have intense arguments amongst the team when it comes to OpenAI for Dota. I for one believe in humans. Bring it on ☠️
We'll continue to work on Dota 2 as an environment for AI development!
Super excited to see OpenAI Five play against the Dota world champions, @OGesports. Will be a unique event showing a taste of what's to come with AI + humans. If you're in the Bay Area (or wanna visit!), definitely request an invite to attend in person:
On April 13th, we'll host OpenAI Five Finals, a live event with surprises in store. One highlight will be playing against @OGesports, the Dota 2 world champions. Attend in person or watch on Twitch: openai.com/blog/openai-five-…
We don't do co-ops per se, but the Fellows program may be of interest: jobs.lever.co/openai/f5c8d70… (openai.com/blog/openai-fello…)!
Love working with data at massive scale and having end-to-end ownership over your work? Apply to become a data engineer at OpenAI: jobs.lever.co/openai/3558cf3… With results like GPT-2, unsupervised learning is now possible — and curated datasets are key for continued progress.
Energy-based models have many appealing properties (e.g. spend more compute to improve samples — reminiscent of the human creative process), but people have largely given up on training them. We've made some exciting progress:
Progress towards stable and scalable training of energy-based models: 💻Blog: openai.com/blog/energy-based… 📝Paper: s3-us-west-2.amazonaws.com/o… 🔤Code: sites.google.com/view/igebm
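The "spend more compute to improve samples" property comes from how energy-based models generate samples: an iterative MCMC procedure where every extra step refines the sample further. A minimal sketch of the idea (a hypothetical toy example, not the released code), using Langevin dynamics on a one-dimensional quadratic energy:

```python
# Langevin-dynamics sampling from a toy energy-based model.
# Running more steps spends more compute and pulls samples
# closer to the model's low-energy modes.
import math
import random

def grad_energy(x):
    # Gradient of the toy energy E(x) = 0.5 * (x - 2)^2, minimized at x = 2.
    return x - 2.0

def langevin_sample(steps, step_size=0.1, seed=0):
    rng = random.Random(seed)
    x = rng.gauss(0.0, 1.0)  # initialize from noise
    for _ in range(steps):
        # Gradient descent on the energy plus injected Gaussian noise.
        x += -step_size * grad_energy(x) + math.sqrt(2 * step_size) * rng.gauss(0.0, 1.0)
    return x
```

With enough steps, samples from chains like this concentrate around the energy minimum (here x = 2); stopping early yields rougher samples, which is the compute/quality trade-off the tweet alludes to.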
Been working with a codebase that uses Unicode variable names in Python 3. It's great for when I'm reading (policies are named `π` rather than `pi`!), but makes writing and debugging more difficult (kinda difficult to type `π`...). On the whole, pretty worthwhile tradeoff!
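For illustration (a hypothetical snippet, not from the codebase in question), Python 3 allows Unicode identifiers per PEP 3131, so a policy really can be named `π` in source:

```python
# Python 3 permits Unicode identifiers (PEP 3131).
import math

def entropy(π):
    """Shannon entropy of a discrete policy π (a list of probabilities)."""
    return -sum(p * math.log(p) for p in π if p > 0)

print(entropy([0.5, 0.5]))  # uniform policy over two actions
```

Readable on screen, but as the tweet notes, `π` is not on most keyboards, which is what makes writing and debugging such code awkward.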
Also interesting: after playing with the small model, people viscerally feel that GPT-2 could be used to generate fake news — old.reddit.com/r/MachineLearning…
old.reddit.com/r/MachineLearning… --- r/ML gets the small GPT2 to write imaginary news. I didn't expect the small model to be that coherent.
Replying to @davidiach
My older brother! He spent a few years trying to get me to start, and I finally agreed to do his workout routine over the holidays in 2017 if he'd drop it afterwards. I loved it and have been hooked since.
Replying to @revlismas @sketch
That's a metal Husafell! It's a hollow metal box which you can fill up with weights. In my event, it was loaded up to a total of 200 pounds.
Thanks, will do my very best!!
Replying to @gdb
Good luck @gdb. Technique, adrenaline, then will to win. Looking forward to hearing of PBs.
Weighed in at exactly 200 (my target weight)! Looking forward to competing tomorrow.
Replying to @TheGregYang
Agree!
Replying to @etzioni
Both are important! Look at GPT-2 for instance — that's a general-purpose architectural improvement (i.e. the Transformer) run at massive scale. One interesting point from the essay is that scale gets a bad rap — doing the reverse isn't a good way of fixing the problem!
Replying to @bilaltwovec
Agreed!