ALAgrApHY · Apr 23, 2024

I use Llama 3 locally through Ollama and it works wonders, even on CPU! However, I'm still waiting for a Llama-3-uncensored version!
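For anyone curious, here's a minimal sketch of what local use can look like through Ollama's Python client (this assumes Ollama is installed and running, and that the model was fetched beforehand with `ollama pull llama3`; the prompt is just an example):

```python
import ollama  # official Ollama Python client (pip install ollama)

# Send a single chat message to the locally served llama3 model.
response = ollama.chat(
    model="llama3",
    messages=[{"role": "user", "content": "Summarize what Ollama does in one sentence."}],
)

# Print the model's reply text.
print(response["message"]["content"])
```

The same thing works from the terminal with `ollama run llama3`, no GPU required, just slower on CPU.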