I installed Llama 3, the LLM from Meta, on my MacBook M2 and tested it locally. You can download Ollama from https://ollama.com/ and install it on your machine to run the model. I used the 8B model, which is a 4.7 GB download. My machine's 12-core CPU rarely came under heavy load, and a few gigabytes of RAM were enough to chat with the model in the terminal with just me as the only user. Here is my first talk with Meta's Llama 3.
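Getting to that point only takes a couple of terminal commands. As a rough sketch, the standard Ollama CLI workflow looks like this (the `llama3:8b` tag is the 8B build mentioned above):

```sh
# Pull the 8B Llama 3 model through Ollama (~4.7 GB download)
ollama pull llama3:8b

# Start an interactive chat session in the terminal;
# type a prompt and press Enter, /bye exits the session
ollama run llama3:8b
```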
Obsidian Or Notion, A Talk with LLAMA
