Energy-based models have many appealing properties (e.g. spending more compute to improve samples, reminiscent of the human creative process), but people have largely given up on training them. We've made some exciting progress:
Progress towards stable and scalable training of energy-based models:
💻Blog: openai.com/blog/energy-based…
📝Paper: s3-us-west-2.amazonaws.com/o…
🔤Code: sites.google.com/view/igebm
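For context on "spend more compute to improve samples": energy-based models are commonly sampled with Langevin dynamics, where each additional gradient step refines the sample further. A minimal sketch of that idea; the quadratic energy, step sizes, and step counts here are illustrative assumptions, not details from the paper:

```python
import numpy as np

def energy(x):
    # Illustrative energy function: quadratic bowl centered at 2.0.
    # Low energy = "good" sample under the model.
    return 0.5 * np.sum((x - 2.0) ** 2)

def grad_energy(x):
    # Gradient of the quadratic energy above.
    return x - 2.0

def langevin_sample(x0, steps, step_size=0.1, noise_scale=0.01, seed=0):
    # Langevin dynamics: gradient descent on the energy plus small
    # Gaussian noise at each step. More steps = more compute = lower energy.
    rng = np.random.default_rng(seed)
    x = x0.copy()
    for _ in range(steps):
        x = x - step_size * grad_energy(x) + noise_scale * rng.normal(size=x.shape)
    return x

x0 = np.zeros(3)
coarse = langevin_sample(x0, steps=5)     # little compute: rough sample
refined = langevin_sample(x0, steps=100)  # more compute: refined sample
# The refined sample ends up at lower energy than the coarse one.
print(energy(coarse), energy(refined))
```

Spending more sampling steps trades compute for sample quality, which is the property the post highlights.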
Mar 21, 2019 · 4:22 PM UTC