Personal AI Agents

For some reason, it doesn't feel like my own AI Agent if I am making API calls to intelligence hosted on someone else's server. I believe in a future where my AI Agent runs locally on my device, or on a computer that I own and control.

Even today, it's possible to run some smaller language models at the edge and in the browser, and these models will continue to improve. George Hotz even said on the Lex Fridman podcast that he believes these language models will get smaller, but will start looping on themselves to think more and will have retrievers built in.
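In-browser inference depends on WebGPU support, which is what projects like Web LLM build on. Here is a minimal, illustrative helper for checking that before trying to load a model; the function name is my own, not part of any library:

```javascript
// Hypothetical helper: decide whether this browser can plausibly run a
// local LLM. WebGPU is exposed as navigator.gpu; if it's absent, an
// in-browser model runtime has nothing to run on.
function canRunLocalModel(nav) {
  return typeof nav !== "undefined" && !!nav.gpu;
}

// In a page you would call it with the real navigator object:
//   if (canRunLocalModel(navigator)) { /* load the model */ }
```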

Running these models at the edge will also reduce costs and improve data privacy. With data privacy being Netonomy's number one mission, we will continue to explore how best to implement AI at the edge.

I am using Web LLM to help bring this vision to life. I have built a chat interface that runs Vicuna and RedPajama locally in the browser, and I am looking to integrate and fine-tune Falcon 7B as well.
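As a rough sketch of what the chat interface does, the core of it is just maintaining an OpenAI-style message list and handing it to the in-browser engine. The helper below is my own illustrative code, not Web LLM's API; the commented usage and model ID are assumptions that may differ between Web LLM versions:

```javascript
// Pure helper: build the chat-style message list to send to the model.
// The system prompt here is illustrative, not the one used in the app.
function buildChatMessages(history, userInput) {
  return [
    { role: "system", content: "You are a private assistant running locally." },
    ...history,
    { role: "user", content: userInput },
  ];
}

// Hedged usage sketch (browser only, requires WebGPU and the Web LLM
// package; the exact API surface is an assumption here):
//
//   import { CreateMLCEngine } from "@mlc-ai/web-llm";
//   const engine = await CreateMLCEngine("RedPajama-INCITE-Chat-3B-v1-q4f32_1");
//   const reply = await engine.chat.completions.create({
//     messages: buildChatMessages(history, "Hello"),
//   });
```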

Since these models are still small, it is hard to use them as full AI Agents, but they will get better over time. I can't wait to connect my locally running AI Agent, with a retriever, to my decentralized web node. And since it runs locally, it can also have access to my bitcoin wallet!
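To make the retriever idea concrete, here is a tiny in-memory sketch of the kind of similarity search that could sit between a local model and data stored in a decentralized web node. Everything here is illustrative; a real setup would use a local embedding model rather than precomputed vectors:

```javascript
// Cosine similarity between two equal-length embedding vectors.
function cosineSimilarity(a, b) {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Return the k documents whose vectors are most similar to the query.
// `docs` is an array of { id, vector } objects; vectors are assumed to
// come from the same embedding model as the query vector.
function retrieve(queryVec, docs, k = 2) {
  return docs
    .map((d) => ({ ...d, score: cosineSimilarity(queryVec, d.vector) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, k);
}
```

The retrieved documents would then be prepended to the model's context before answering, which is the usual retrieval-augmented pattern.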

(Screenshot: loading an LLM in the browser for a private AI chat)