
Oobabooga

GitHub - oobabooga/text-generation-webui

Introduction:
A Gradio web UI for Large Language Models. Supports transformers, GPTQ, AWQ, EXL2, llama.cpp (GGUF), Llama models.
Oobabooga Product Information

What is Oobabooga?

A Gradio web UI for Large Language Models. Supports transformers, GPTQ, AWQ, EXL2, llama.cpp (GGUF), Llama models.

Oobabooga's Core Features

Multiple model backends: Transformers, llama.cpp (through llama-cpp-python), ExLlamaV2, AutoGPTQ, AutoAWQ, GPTQ-for-LLaMa, QuIP#.

Dropdown menu for quickly switching between different models.

Large number of extensions (built-in and user-contributed), including Coqui TTS for realistic voice outputs, Whisper STT for voice inputs, translation, multimodal pipelines, vector databases, Stable Diffusion integration, and a lot more.

Chat with custom characters.

Precise chat templates for instruction-following models, including Llama-2-chat, Alpaca, Vicuna, Mistral.
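To illustrate what such a chat template does, here is a minimal sketch using the well-known Alpaca prompt format; the helper name `build_alpaca_prompt` is illustrative only and not part of the web UI's API.

```python
# Sketch of an instruction-following chat template, using the Alpaca
# prompt format as an example. The web UI applies templates like this
# automatically when you select the matching instruction template.
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n### Response:\n"
)

def build_alpaca_prompt(instruction: str) -> str:
    """Fill the Alpaca template with a user instruction."""
    return ALPACA_TEMPLATE.format(instruction=instruction)

prompt = build_alpaca_prompt("Summarize what a LoRA adapter is.")
print(prompt)
```

Using the wrong template for a model (e.g. sending raw text to a Vicuna fine-tune) is a common cause of poor output, which is why per-model templates matter.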

LoRA: train new LoRAs with your own data, load/unload LoRAs on the fly for generation.

Transformers library integration: load models in 4-bit or 8-bit precision through bitsandbytes, use llama.cpp with Transformers samplers (llamacpp_HF loader), and run CPU inference in 32-bit precision using PyTorch.

OpenAI-compatible API server with Chat and Completions endpoints.
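A hedged sketch of calling that Chat Completions endpoint with only the Python standard library follows; the base URL and port 5000 reflect the project's documented defaults at the time of writing, so adjust them to match your own launch flags.

```python
# Build an OpenAI-style chat request for the web UI's local API
# (started with the --api flag). Uses only the standard library.
import json
import urllib.request

def build_chat_request(messages, base_url="http://127.0.0.1:5000"):
    """Build an urllib Request for the /v1/chat/completions endpoint."""
    payload = {
        "messages": messages,   # OpenAI-style [{"role": ..., "content": ...}]
        "max_tokens": 200,
        "temperature": 0.7,
    }
    return urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request([{"role": "user", "content": "Hello!"}])
# To actually send it, the server must be running with the API enabled:
# with urllib.request.urlopen(req) as resp:
#     reply = json.load(resp)["choices"][0]["message"]["content"]
```

Because the endpoint mirrors OpenAI's schema, existing OpenAI client libraries can also be pointed at the local base URL instead of hand-building requests.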

FAQ from Oobabooga