AI supercomputing — the art of coaxing mind-boggling numbers of machines to perform their part of a single precisely-orchestrated computation — presents both the hardest & most rewarding technical challenge I've ever worked on.

Feb 15, 2022 · 4:37 PM UTC

Replying to @gdb
How many machines?
Replying to @gdb
You are not alone Greg! Potentially also some of the most important, impactful, and rewarding work for building a better future faster.
Replying to @gdb @paulg
That's a feat you should be proud of!
Replying to @gdb
Ours does. It already has seven memory slots, works regardless of the language, works offline, and communicates in natural language. Most importantly, we don't use neural networks; they will not lead to the creation of AI. We apply a different approach here.
Replying to @gdb
I had this opportunity back at Samsung, optimising for arm64-based devices, and it was really wholesome! There are some great, underrated technical leaps in this domain.
Replying to @gdb @paulg
Now do cats
Replying to @gdb
Supercomputers won't help. They're useful for neural networks, but not for creating AGI. We at Mind-Simulation have already passed the first stage of AGI on simple equipment and achieved what you're describing. Idea plus implementation is what makes AGI succeed.
Replying to @gdb
Sounds like some next-level machine learning stuff; the future seems more than just promising.
Replying to @gdb
How do you motivate yourself when you've exhausted every idea you have and the problem won't go away?