Unfortunately, I think most businesses will still prefer an AI solution hosted by a company like OpenAI over maintaining their own, so there's still going to be a need for these large data centers. But I do hope more people realize that hosting your own LLM isn't that difficult, and it doesn't cost you your privacy.
Anyone running a newer MacBook Pro can install Ollama and run it with just a few commands:
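Something like this, assuming you have Homebrew installed (you can also just download the app from ollama.com, which starts the server for you; `llama3.2` is just one example model):

```shell
# Install Ollama and start the local server
brew install ollama
ollama serve &

# Pull a model and open an interactive chat at the terminal
# (the model is downloaded on first run)
ollama run llama3.2
```

Everything stays on your machine; nothing is sent to a third party.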
Then you can use it at the terminal, but it also exposes an HTTP API, so with a couple more commands you can put a web front end on it, or with a bit more effort wire it into a new or existing app, service, or system.
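To give a sense of how simple the API side is, here's a minimal sketch in Python using only the standard library. It hits Ollama's `/api/generate` endpoint on the default local port (11434); the model name is just an example, and this assumes the server from the previous step is running:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local port

def build_payload(prompt, model="llama3.2"):
    """JSON body for Ollama's /api/generate endpoint.

    stream=False asks for a single complete response instead of
    a stream of partial chunks.
    """
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def ask(prompt, model="llama3.2"):
    """Send a prompt to the local Ollama server and return the reply text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_payload(prompt, model),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

With `ask("Why is the sky blue?")` working, dropping a web front end or an existing service on top is mostly just plumbing.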