Based on models such as SparkTTS and OrpheusTTS, providing high-quality Chinese speech synthesis and voice cloning.
A local and uncensored AI entity.
Analyze how "surprised" LLMs are when reading a piece of text
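The standard way to quantify how "surprised" a model is by a token is surprisal: the negative log-probability the model assigned to it. A minimal pure-Python sketch of that calculation (the per-token probabilities below are made-up placeholders, not real LLM output):

```python
import math

def surprisal(prob: float) -> float:
    """Surprisal in bits: -log2 of the probability the model assigned to the token."""
    return -math.log2(prob)

# Hypothetical per-token probabilities from a language model.
token_probs = {"the": 0.25, "cat": 0.05, "transmogrified": 0.0001}

for token, p in token_probs.items():
    print(f"{token}: {surprisal(p):.2f} bits")
# A token the model finds predictable (p = 0.25) costs 2 bits;
# a rare one (p = 0.0001) costs over 13.
```

Averaging surprisal over a whole text gives its per-token cross-entropy under the model, which is the usual summary statistic for this kind of analysis.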
Run Llama 3.3, DeepSeek-R1, Phi-4, Gemma 3, Mistral Small 3.1, and other state-of-the-art language models locally with scorching-fast performance. Inferno provides an intuitive CLI and an OpenAI/Ollama-compatible API, putting the inferno of AI innovation directly in your hands.
Dolphin 3.0 🐬: Versatile AI for coding, math, and more
A general RAG search chatbot with SoTA RAG techniques such as HyDE, hybrid retrieval with BM25 + RRF, and cross-encoder reranking. Evaluated on the BEIR SciFact dataset, comparing all the pipelines tried along the way.
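Reciprocal Rank Fusion (RRF), mentioned above, merges ranked lists from different retrievers (e.g. BM25 and dense retrieval) by scoring each document as a sum of 1/(k + rank) across lists. A minimal sketch, using k = 60 from the original RRF paper and placeholder document IDs:

```python
from collections import defaultdict

def rrf_fuse(rankings: list[list[str]], k: int = 60) -> list[str]:
    """Fuse ranked lists: score(d) = sum over lists of 1 / (k + rank_of_d)."""
    scores: dict[str, float] = defaultdict(float)
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] += 1.0 / (k + rank)
    # Highest fused score first.
    return sorted(scores, key=scores.get, reverse=True)

bm25_ranking = ["doc_a", "doc_b", "doc_c"]
dense_ranking = ["doc_b", "doc_d", "doc_a"]
print(rrf_fuse([bm25_ranking, dense_ranking]))
```

Because RRF only uses ranks, not raw scores, it needs no score normalization between BM25 and the dense retriever, which is why it is a common default for hybrid retrieval.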
Runpod-LLM provides ready-to-use container scripts for running large language models (LLMs) easily on RunPod.
The fastest, most efficient library for running GGUF models with maximum throughput and zero-config hardware optimization.
High-performance Local LLM benchmarking and inference toolkit for Edge CPUs. Features automated profiling for GGUF models, RAM/KV-cache footprint analysis, and optimized llama.cpp execution.
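The KV-cache footprint such a toolkit reports can be estimated analytically: each layer caches one key and one value vector per KV head per token. A sketch of that arithmetic (the Llama-style hyperparameters below are illustrative assumptions, not measurements from the toolkit):

```python
def kv_cache_bytes(n_layers: int, n_kv_heads: int, head_dim: int,
                   seq_len: int, bytes_per_elem: int = 2) -> int:
    """KV-cache size: 2 (K and V) * layers * KV heads * head_dim * tokens * dtype bytes."""
    return 2 * n_layers * n_kv_heads * head_dim * seq_len * bytes_per_elem

# Illustrative Llama-style config: 32 layers, 8 KV heads (GQA), head_dim 128,
# a 4096-token context, fp16 cache (2 bytes per element).
size = kv_cache_bytes(n_layers=32, n_kv_heads=8, head_dim=128,
                      seq_len=4096, bytes_per_elem=2)
print(f"{size / 2**20:.0f} MiB")  # → 512 MiB
```

Quantizing the cache (e.g. 8-bit elements) or using fewer KV heads via grouped-query attention shrinks this linearly, which is why both matter so much on edge CPUs.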
Gemma 3: Google's multimodal, multilingual, long context LLM.
This is a Discord Bot designed to help noobs in my Discord server chat about Dying Light modding.
Using the Large Language and Vision Assistant (LLaVA) for scene understanding on Meta Quest 3 (VR).
Simple LLM interface based on terminal + GUI option
FastAPI semantic search + custom entity detection platform.
Setting up a local inference environment with llama.cpp and PyTorch, with CUDA support, using Hugging Face transformers and outlines for structured generation.
This repository demonstrates how to use outlines and llama-cpp-python for structured JSON generation with streaming output, integrating llama.cpp for local model inference and outlines for schema-based text generation.