Running Local LLMs With Internet Access Using Docker Compose

I believe that LLMs have the potential to be incredibly useful, but the lack of privacy in online services is a major concern to me. Anything put into an online service may be fed back into training in an attempt to make the service as good as possible. This can lead, and has led, to information that should be private leaking out. One such example is AI coding assistants outputting API keys. An easy way around these privacy concerns is to run the LLM locally. If you haven't looked into doing this it might seem a bit challenging, but a lot of work has been put into making it fairly easy. We'll use Docker Compose and existing tools to set up a locally running LLM that is even able to access web results to inform its answers.
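To give a sense of where this is headed, here is a minimal sketch of the kind of Compose file such a setup might use. The specific tools (Ollama as the model runtime, Open WebUI as the chat frontend) are assumptions on my part, common choices for this kind of stack, and the port and volume names are illustrative:

```yaml
# Sketch only: a common local-LLM stack, not necessarily the exact
# one this article builds. Ollama serves models on port 11434;
# Open WebUI provides a browser chat UI on top of it.
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama   # persist downloaded models

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"            # chat UI at http://localhost:3000
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    depends_on:
      - ollama

volumes:
  ollama:
```

With something like this running via `docker compose up -d`, everything stays on your machine, and web-search integration can typically be layered on from the frontend's settings.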
Logan Roberts
November 3, 2024
6 Minutes