This is just the beginning. Thinking we can corral LLM tech is ignorance at its finest.
Omg omg omg float16 in WebGPU finally works now and it's ~fast. Even on a $700 laptop with just 8 GB RAM. Fully local, fully controllable, web-based AI is here (in beta/Chrome Canary). Yes, it works offline.
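(A minimal sketch of what "float16 in WebGPU" means in practice: checking for and requesting the "shader-f16" feature, which at the time was behind a flag in Chrome Canary. Assumes a browser context with navigator.gpu exposed and WebGPU type definitions such as @webgpu/types available.)

```ts
// Sketch: request a WebGPU device with float16 shader support enabled.
async function getF16Device(): Promise<GPUDevice | null> {
  const adapter = await navigator.gpu?.requestAdapter();
  if (!adapter || !adapter.features.has("shader-f16")) {
    return null; // float16 shaders not supported on this adapter
  }
  // With the feature granted, WGSL shaders can declare `enable f16;`
  // and do half-precision math, roughly halving model memory use.
  return adapter.requestDevice({ requiredFeatures: ["shader-f16"] });
}
```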
Replying to @RSnake
Bard disagrees.

Sep 5, 2023 · 3:27 PM UTC