# What is daft-distributed-scaling?
Scale Daft workflows to distributed Ray clusters. Invoke when optimizing performance or handling large data. Source: eventual-inc/daft.
Install the daft-distributed-scaling AI skill in your development environment from the command line (see the installation steps below).
| Strategy  | API               | Use Case                            | Pros/Cons                                                              |
| --------- | ----------------- | ----------------------------------- | ---------------------------------------------------------------------- |
| Shuffle   | `repartition(N)`  | Light data (e.g. file paths), joins | Global balance; high memory usage (materializes data).                  |
| Streaming | `into_batches(N)` | Heavy data (images, tensors)        | Low memory (streaming); high scheduling overhead if batches are too small. |
## Light Data: Repartitioning

Best for distributing file paths evenly across workers before heavy reads.
```shell
npx skills add https://github.com/eventual-inc/daft --skill daft-distributed-scaling
```
1. Open your terminal or command-line tool (Terminal, iTerm, Windows Terminal, etc.).
2. Copy and run this command: `npx skills add https://github.com/eventual-inc/daft --skill daft-distributed-scaling`
3. Once installed, the skill will be automatically configured in your AI coding environment and ready to use in Claude Code, Cursor, or OpenClaw.
https://github.com/eventual-inc/daft