LLM Prompt Optimizer
Optimize prompts through systematic analysis and refinement for better LLM outputs
SKILL.md
The LLM Prompt Optimizer skill systematically analyzes and refines prompts to maximize the quality, accuracy, and relevance of large language model outputs. It applies evidence-based optimization techniques including structural improvements, context enrichment, constraint calibration, and output format specification.
This skill goes beyond basic prompt writing by leveraging understanding of how different LLMs process instructions, their attention patterns, and their response tendencies. It helps you transform underperforming prompts into high-yield instructions that consistently produce the results you need.
Whether you are building production AI systems, conducting research, or simply want better ChatGPT responses, this skill ensures your prompts are optimized for your specific model and use case.
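As a rough illustration of the four techniques named above (structural improvements, context enrichment, constraint calibration, and output format specification), the sketch below assembles a vague instruction into a structured prompt. This is a minimal sketch, not the skill's actual implementation; the `PromptSpec` and `optimize` names, the field names, and the section headings are hypothetical choices made for the example.

```python
# Hypothetical sketch of the optimization passes described above.
# Not the skill's implementation; all names here are invented for illustration.
from dataclasses import dataclass, field
from typing import List


@dataclass
class PromptSpec:
    """Raw task plus the optional pieces the optimizer fills in."""
    task: str                                              # the underperforming instruction
    context: List[str] = field(default_factory=list)       # context enrichment
    constraints: List[str] = field(default_factory=list)   # constraint calibration
    output_format: str = ""                                # output format specification


def optimize(spec: PromptSpec) -> str:
    """Assemble a structured prompt from the spec.

    "Structural improvement" here simply means a fixed section order
    (task -> context -> constraints -> format), so the model reads the
    instruction before the supporting material.
    """
    sections = [f"## Task\n{spec.task.strip()}"]
    if spec.context:
        sections.append("## Context\n" + "\n".join(f"- {c}" for c in spec.context))
    if spec.constraints:
        sections.append("## Constraints\n" + "\n".join(f"- {c}" for c in spec.constraints))
    if spec.output_format:
        sections.append(f"## Output format\n{spec.output_format.strip()}")
    return "\n\n".join(sections)


if __name__ == "__main__":
    vague = PromptSpec(task="Summarize this support ticket.")
    refined = PromptSpec(
        task="Summarize this support ticket for an on-call engineer.",
        context=["The reader has not seen the ticket thread."],
        constraints=["Maximum 3 sentences.", "Do not speculate about root cause."],
        output_format="A JSON object with keys 'summary' and 'severity'.",
    )
    print(optimize(vague))
    print("---")
    print(optimize(refined))
```

Comparing the two printed prompts shows the intended effect: the refined version tells the model who the output is for, what it may not do, and exactly what shape the answer should take, which is the kind of transformation the skill automates.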
Citable Information
Stable fields and commands prepared for search and AI citation.
- Install command: npx skills add https://github.com/jmsktm/claude-settings --skill LLM Prompt Optimizer
- Category: Developer Tools
- Authentication: —
- Listed: 2026-02-12
- Updated: 2026-02-18
Quick Answers
What is LLM Prompt Optimizer?
A skill that optimizes prompts through systematic analysis and refinement for better LLM outputs. Source: jmsktm/claude-settings.
How do I install LLM Prompt Optimizer?
Open your terminal or command-line tool (Terminal, iTerm, Windows Terminal, etc.), then copy and run the following command: npx skills add https://github.com/jmsktm/claude-settings --skill LLM Prompt Optimizer. Once installation completes, the skill is automatically configured in your AI coding environment and can be used in Claude Code or Cursor.
Where is the source code for this Skill?
https://github.com/jmsktm/claude-settings
Details
- Category: Developer Tools
- Source: user
- Listed: 2026-02-12