Replying to @gdb
We need a motivational version, one that meaningfully helps move people past their roadblocks in life. That would be the largest impact on humankind ever, next to chlorinated water and sewers. This is the level of impact we need.
Replying to @gdb
This is crazy!
Replying to @gdb
4.5B words × 1.4 tokens/word × $0.06/1,000 tokens × 30 days = $1.2M in MRR. Not bad.
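As a sanity check on that back-of-the-envelope estimate (the word count, token ratio, and per-token price are all the replier's assumptions, not confirmed figures), a quick Python sketch; note the product as written comes out closer to $11M/month, so the $1.2M figure only holds if a small share of that usage is billed at full price.

```python
# Back-of-envelope check of the revenue estimate in the reply above.
# All inputs are the replier's assumptions, not confirmed OpenAI numbers.
words_per_day = 4.5e9        # ~4.5 billion generated words per day
tokens_per_word = 1.4        # rough tokens-per-word ratio
price_per_1k_tokens = 0.06   # assumed $/1,000 tokens (Davinci-tier pricing)
days_per_month = 30

tokens_per_day = words_per_day * tokens_per_word
revenue_per_day = tokens_per_day / 1000 * price_per_1k_tokens
revenue_per_month = revenue_per_day * days_per_month

print(f"tokens/day:    {tokens_per_day:,.0f}")       # ~6.3 billion
print(f"revenue/day:   ${revenue_per_day:,.0f}")     # ~$378,000
print(f"revenue/month: ${revenue_per_month:,.0f}")   # ~$11.3M if every token
                                                     # were billed at that rate
```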
Replying to @gdb
At this rate, when does human-generated language become the minority of all text?
Replying to @gdb
Tired of being on the waitlist 😒
Replying to @gdb @latticecut
So when you scrape data for training, how do you know it wasn't generated by GPT-3? #justasking
Replying to @gdb
Yeah well, I generate that many dead skin cells per day!
Replying to @gdb @tszzl
How could we exclude all of this content when assembling the training dataset for GPT-4?
Replying to @gdb
Hi @gdb, I'm studying Python and natural language processing, and I'd be glad for the opportunity to use the API.