Our LLM techstack and its limitations

For our hackathon, we chose to build our LLM on a private-gpt instance and GPT4All. However, we ran into some limitations.
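
For reference, querying a local GPT4All model from Python looks roughly like this. This is only a sketch: the model file name is a placeholder, not the exact model we ran.

```python
# Minimal sketch using the GPT4All Python bindings.
# The model file name below is a placeholder, not the one we actually used.
from gpt4all import GPT4All

model = GPT4All("orca-mini-3b.gguf")  # loads (or downloads) a local model file
response = model.generate("Explain BIP-39 in one sentence.", max_tokens=200)
print(response)
```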

When we initially prompted our LLM, its responses included non-Bitcoin-related topics. For example, when we first asked the LLM about BIP-39, it returned information about a "business intelligence platform", which was definitely off the mark.

We looked into disconnecting PrivateGPT from its general knowledge base, but unfortunately we didn't find a way to do that. So we created an "on topic" filter as a workaround instead. It worked pretty well, as you can see in our tests here. The photo below depicts the initial keywords we allowed.
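
The filter itself is nothing fancy: it checks an incoming prompt against a keyword allowlist before the prompt ever reaches the model. A minimal sketch of the idea, with illustrative keywords standing in for the real list in the photo:

```python
# "On topic" filter sketch. The keywords below are illustrative examples;
# the real allowlist is the one shown in the photo.
BITCOIN_KEYWORDS = {
    "bitcoin", "btc", "bip-39", "lightning", "satoshi",
    "utxo", "mnemonic", "seed phrase", "mempool",
}

def is_on_topic(prompt: str) -> bool:
    """Return True if the prompt mentions at least one allowed keyword."""
    text = prompt.lower()
    return any(keyword in text for keyword in BITCOIN_KEYWORDS)

def guarded_query(prompt: str, llm_query) -> str:
    """Forward only on-topic prompts to the underlying LLM."""
    if not is_on_topic(prompt):
        return "Sorry, I can only answer Bitcoin-related questions."
    return llm_query(prompt)
```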

It's not the best solution, but it works for the purposes of our hackathon. One future consideration might be llama2, as suggested by Starbuilder.