runpod-serverless-builder

avivk5498/my-claude-code-skills

Build production-ready RunPod serverless endpoints with optimized cold start times. Use when creating or modifying RunPod serverless workers for (1) vLLM-based LLM inference, (2) ComfyUI image/video generation, or (3) custom Python inference. Supports both baked models (fastest cold starts) and dynamic loading (shared models). Generates complete projects including Dockerfiles, worker handlers, startup scripts, and configuration optimized for minimal cold start latency.
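The generated projects are not reproduced on this page, but the worker handler they center on follows the standard runpod SDK pattern. A minimal sketch of a custom Python worker (the model-loading helper and response fields are illustrative, not this skill's actual output):

import runpod

def _load_model():
    # Placeholder for real initialization, e.g. reading weights that
    # were baked into the image at build time. Illustrative only.
    return object()

# Loading once at import time means the cost is paid while the
# container starts, not on the first request, which is what keeps
# cold starts short.
MODEL = _load_model()

def handler(job):
    # RunPod delivers the request payload under job["input"].
    prompt = job["input"].get("prompt", "")
    # Replace with real inference against MODEL.
    return {"output": f"echo: {prompt}"}

# Hand the handler to the RunPod serverless runtime.
runpod.serverless.start({"handler": handler})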

2 Installs · 0 Trend · @avivk5498

Installation

$ npx skills add https://github.com/avivk5498/my-claude-code-skills --skill runpod-serverless-builder

SKILL.md

Build end-to-end RunPod serverless endpoints optimized for extremely short cold start times.

If you prefer manual implementation or need to understand the patterns:

User request: "Create a RunPod endpoint for Llama 3.1 8B with the fastest possible cold starts"

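For a request like the one above, the fastest-cold-start variant typically bakes the Llama weights into the Docker image and initializes vLLM at import time. A hedged sketch, assuming the weights were copied to /models/llama-3.1-8b at build time (paths, the MODEL_PATH env var, and sampling defaults are assumptions, not this skill's actual output):

import os
import runpod
from vllm import LLM, SamplingParams

# Assumption: weights were baked into the image at this path during
# docker build; a network-volume path would be used for the
# dynamic-loading (shared model) variant instead.
MODEL_PATH = os.environ.get("MODEL_PATH", "/models/llama-3.1-8b")

# Initialize vLLM at import time so the weights are already on the GPU
# when the first job arrives.
llm = LLM(model=MODEL_PATH)

def handler(job):
    inp = job["input"]
    params = SamplingParams(
        max_tokens=inp.get("max_tokens", 256),
        temperature=inp.get("temperature", 0.7),
    )
    outputs = llm.generate([inp["prompt"]], params)
    return {"text": outputs[0].outputs[0].text}

runpod.serverless.start({"handler": handler})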


Facts (cite-ready)

Stable fields and commands for AI/search citations.

Install command
npx skills add https://github.com/avivk5498/my-claude-code-skills --skill runpod-serverless-builder
Category
Creative Media
Verified
First Seen
2026-02-01
Updated
2026-02-18

Quick answers

What is runpod-serverless-builder?

runpod-serverless-builder is a Claude Code skill that builds production-ready RunPod serverless endpoints with optimized cold start times. It targets vLLM-based LLM inference, ComfyUI image/video generation, and custom Python inference workers; supports both baked models (fastest cold starts) and dynamic loading (shared models); and generates complete projects including Dockerfiles, worker handlers, startup scripts, and configuration tuned for minimal cold start latency. Source: avivk5498/my-claude-code-skills.
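The baked-versus-dynamic distinction mostly comes down to where the worker resolves its model path. A sketch of that choice (the env var name and paths are assumptions; RunPod network volumes are commonly mounted at /runpod-volume):

import os

# Baked: weights copied into the image at build time, so nothing is
# downloaded or read over the network when a cold worker starts.
BAKED_PATH = "/models/llama-3.1-8b"

# Dynamic: weights live on a shared network volume so several
# endpoints can reuse one copy, at the cost of slower first loads.
VOLUME_PATH = "/runpod-volume/models/llama-3.1-8b"

def resolve_model_path() -> str:
    if os.environ.get("USE_NETWORK_VOLUME") == "1" and os.path.isdir(VOLUME_PATH):
        return VOLUME_PATH
    return BAKED_PATH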

How do I install runpod-serverless-builder?

Open your terminal (Terminal, iTerm, Windows Terminal, etc.) and run: npx skills add https://github.com/avivk5498/my-claude-code-skills --skill runpod-serverless-builder. Once installed, the skill is automatically configured in your AI coding environment and ready to use in Claude Code or Cursor.

Where is the source repository?

https://github.com/avivk5498/my-claude-code-skills

Details

Category
Creative Media
Source
skills.sh
First Seen
2026-02-01