LLM inference in C/C++
C++ llama ggml
:robot: The free, Open Source alternative to OpenAI, Claude and others. Self-hosted and local-first. Drop-in replacement for OpenAI, running on consumer-grade hardware. No GPU required. Runs gguf, transformers, diffusers and many more model architectures. Features: Generate Text, Audio, Video, Images, Voice Cloning, Distributed, P2P inference (a minimal API sketch follows this listing)
Go kubernetes api ai libp2p distributed llm llama gpt4all gemma stable-diffusion musicgen image-generation text-generation rerank mistral mamba rwkv llama3 audio-generation tts
A self-hosted, offline, ChatGPT-like chatbot. Powered by Llama 2. 100% private, with no data leaving your device. New: Code Llama support!
TypeScript self-hosted ai llm llama gpt-4 chatgpt openai gpt llama2 codellama gpt4all code-llama llama-2 llama-cpp localai llamacpp
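The second entry above (LocalAI, per its description and the localai tag) bills itself as a drop-in replacement for the OpenAI API. A minimal client-side sketch of what that implies, assuming a server already running and listening on http://localhost:8080; the base URL, model name, and prompt are illustrative assumptions, not values taken from the listing:

```go
// Minimal sketch: POST a chat request to an OpenAI-compatible
// /v1/chat/completions endpoint such as the one LocalAI exposes.
// URL, model name, and prompt below are placeholders for illustration.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"io"
	"log"
	"net/http"
)

type chatMessage struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

type chatRequest struct {
	Model    string        `json:"model"`
	Messages []chatMessage `json:"messages"`
}

func main() {
	// Assumed local endpoint; adjust to wherever the server is listening.
	url := "http://localhost:8080/v1/chat/completions"

	reqBody, err := json.Marshal(chatRequest{
		Model: "gpt-4", // hypothetical model alias configured on the server
		Messages: []chatMessage{
			{Role: "user", Content: "Say hello from a local model."},
		},
	})
	if err != nil {
		log.Fatal(err)
	}

	resp, err := http.Post(url, "application/json", bytes.NewReader(reqBody))
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	body, err := io.ReadAll(resp.Body)
	if err != nil {
		log.Fatal(err)
	}
	// Raw JSON response in the OpenAI chat-completions format.
	fmt.Println(string(body))
}
```

Because the request and response shapes follow the OpenAI chat-completions format, existing OpenAI client libraries can usually be pointed at the local base URL instead of api.openai.com.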