June 2025
-
Using Ollama models within LM Studio's GUI
AI, LLM, Ollama, Linux, LM Studio
April 2025
-
Integrating Ollama with GitHub Copilot alternatives for AI-powered code completion using local LLMs.
AI, LLM, Ollama, Linux, Copilot
-
Testing which LLMs my NVIDIA GeForce RTX 4060 Ti can run locally through Ollama.
AI, LLM, Ollama, Linux