llava

Large Language and Vision Assistant. Enables visual instruction tuning and image-based conversations. Combines CLIP vision encoder with Vicuna/LLaMA language models. Supports multi-turn image chat, visual question answering, and instruction following. Use for vision-language chatbots or image understanding tasks. Best for conversational image analysis.

15 Installs · 0 Trend · @orchestra-research

Installation

$ npx skills add https://github.com/orchestra-research/ai-research-skills --skill llava

SKILL.md

Open-source vision-language model for conversational image understanding.

| Model | Params | VRAM (FP16) | Quality |
|-------|--------|-------------|---------|
| LLaVA-v1.5-7B | 7B | 14 GB | Good |
| LLaVA-v1.5-13B | 13B | 28 GB | Better |
| LLaVA-v1.6-34B | 34B | 70 GB | Best |
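
For orientation, here is a minimal inference sketch using the Hugging Face transformers LLaVA integration; the llava-hf/llava-1.5-7b-hf checkpoint, the local image path, and the prompt text are assumptions for illustration, not part of this skill's packaged workflow.

```python
# Minimal sketch: single-turn visual question answering with LLaVA-v1.5-7B
# via Hugging Face transformers (assumes a recent transformers release,
# torch, Pillow, and roughly the 14 GB of FP16 VRAM listed above).
import torch
from PIL import Image
from transformers import AutoProcessor, LlavaForConditionalGeneration

model_id = "llava-hf/llava-1.5-7b-hf"  # community conversion of LLaVA-v1.5-7B
processor = AutoProcessor.from_pretrained(model_id)
model = LlavaForConditionalGeneration.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

image = Image.open("photo.jpg")  # hypothetical local image
prompt = "USER: <image>\nWhat is shown in this picture? ASSISTANT:"

inputs = processor(images=image, text=prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=200)
print(processor.decode(output_ids[0], skip_special_tokens=True))
```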

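If the FP16 figures above exceed your GPU, loading the weights in 4-bit is a common workaround. The sketch below is an assumption-laden example (it presumes bitsandbytes and accelerate are installed and picks the 13B checkpoint for illustration); actual memory use depends on the checkpoint and activation overhead.

```python
# Minimal sketch: loading LLaVA in 4-bit to cut weight memory to roughly a
# quarter of the FP16 figures above (assumes bitsandbytes and accelerate).
import torch
from transformers import AutoProcessor, BitsAndBytesConfig, LlavaForConditionalGeneration

model_id = "llava-hf/llava-1.5-13b-hf"  # assumed checkpoint for illustration
quant_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.float16,
)

processor = AutoProcessor.from_pretrained(model_id)
model = LlavaForConditionalGeneration.from_pretrained(
    model_id,
    quantization_config=quant_config,
    device_map="auto",
)
```

The processor/prompt flow from the previous sketch works unchanged once the model is loaded this way.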


Facts (cite-ready)

Stable fields and commands for AI/search citations.

Install command
npx skills add https://github.com/orchestra-research/ai-research-skills --skill llava
Category
Data Analysis
Verified
First Seen
2026-02-11
Updated
2026-02-18

Quick answers

What is llava?

LLaVA (Large Language and Vision Assistant) is an open-source vision-language model that pairs a CLIP vision encoder with a Vicuna/LLaMA language model and is trained with visual instruction tuning. It supports multi-turn image chat, visual question answering, and instruction following, making it a fit for vision-language chatbots and conversational image analysis; see the sketch below. Source: orchestra-research/ai-research-skills.
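
For the multi-turn image chat mentioned above, earlier turns are carried in the prompt. The sketch below assumes the llava-hf conversion's USER/ASSISTANT template; the image path and follow-up question are placeholders.

```python
# Minimal sketch: two-turn image chat with LLaVA-v1.5-7B. Earlier turns are
# appended to the prompt in the USER/ASSISTANT format used by the llava-hf
# checkpoints (an assumption; other builds may expect a different template).
import torch
from PIL import Image
from transformers import AutoProcessor, LlavaForConditionalGeneration

model_id = "llava-hf/llava-1.5-7b-hf"
processor = AutoProcessor.from_pretrained(model_id)
model = LlavaForConditionalGeneration.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)
image = Image.open("photo.jpg")  # hypothetical image

# Turn 1: describe the image.
prompt = "USER: <image>\nDescribe this image. ASSISTANT:"
inputs = processor(images=image, text=prompt, return_tensors="pt").to(model.device)
reply = processor.decode(
    model.generate(**inputs, max_new_tokens=150)[0], skip_special_tokens=True
)
answer = reply.split("ASSISTANT:")[-1].strip()

# Turn 2: follow up, carrying the first exchange in the prompt.
prompt = (
    f"USER: <image>\nDescribe this image. ASSISTANT: {answer} "
    "USER: What stands out most in it? ASSISTANT:"
)
inputs = processor(images=image, text=prompt, return_tensors="pt").to(model.device)
print(processor.decode(model.generate(**inputs, max_new_tokens=150)[0], skip_special_tokens=True))
```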

How do I install llava?

Open your terminal or command line tool (Terminal, iTerm, Windows Terminal, etc.), then copy and run this command:
npx skills add https://github.com/orchestra-research/ai-research-skills --skill llava
Once installed, the skill is automatically configured in your AI coding environment and ready to use in Claude Code or Cursor.

Where is the source repository?

https://github.com/orchestra-research/ai-research-skills

Details

Category
Data Analysis
Source
skills.sh
First Seen
2026-02-11