72 Hours to Full Pipeline: Building the AniKuku AI Animation Engine
Recently, we conducted a 72-hour intensive sprint with one goal: to transform AniKuku from a "demo-ready" prototype into a "production-ready" engine capable of running full episodes. This article reviews how we connected script parsing, asset generation, timeline editing, and automated rendering into a reusable pipeline—and the product decisions that were validated along the way.
Why the 72-Hour Stress Test?
- Proving End-to-End Viability: Brands and studios are asking: "Can we produce a consistent 2-minute episode quickly?" We needed to prove that the entire workflow—not just isolated demos—is feasible.
- Validating our Tech Stack: We needed to ensure that our stack—Next.js 15, React 19, Drizzle/Postgres, OpenAI-compatible LLMs, and the grsai generation endpoints (Nano Banana / Sora2)—could work together seamlessly under pressure.
- Aligning the Team: By focusing on a single creation path, we unified product, design, engineering, and operations around a practical, shared workflow.
What We Accomplished in 72 Hours
- Script-to-Shot Analysis: The LLM now parses raw scripts into a detailed shot list, tagging characters, scenes, and props. These shots are automatically placed onto the timeline as a "to-do" list.
- Consistent Asset Management: Character and scene definitions are stored in Cloudflare R2. Every generated prompt, version number, and preview image is traceable, ensuring that switching styles doesn't mean losing existing progress.
- Automated Visuals & Voiceovers: Using grsai's Nano Banana model, we can batch-generate storyboards with customizable styles. Voiceovers are generated via Text-to-Speech (TTS) and automatically aligned with the shot duration.
- Resilient Timeline Editing: Built on React 19, our collaborative timeline flags pacing issues and duration overflows. It supports partial re-rendering, undo, and per-shot replacement to prevent "edit fatigue."
- One-Click Rendering: Once shots are locked, Sora2 generates the final clips, merges them with audio, and outputs the result to R2 for sharing or downloading. Failed segments can be retried individually without affecting the whole project.
- Production Infrastructure: We integrated BetterAuth (Magic Links + OAuth), Resend for emails, Stripe for payments, and Drizzle migration scripts to ensure the platform is ready for private beta testing and monetization.
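To make the script-to-shot step concrete, here is a minimal sketch of how LLM output might be normalized into timeline "to-do" entries. The `Shot` shape, field names, and the 4-second default duration are assumptions for illustration, not AniKuku's actual schema:

```typescript
// Hypothetical shot record; the real Drizzle schema will differ.
type ShotStatus = "todo" | "generating" | "done" | "failed";

interface Shot {
  index: number;
  description: string;
  characters: string[]; // tagged by the LLM during script parsing
  scene: string;
  props: string[];
  durationSec: number;
  status: ShotStatus;
}

// Normalize raw, possibly incomplete LLM JSON into timeline entries,
// filling defaults so every shot lands on the timeline as a "to-do".
function toTimeline(raw: Array<Partial<Shot>>): Shot[] {
  return raw.map((s, i) => ({
    index: i,
    description: s.description ?? "",
    characters: s.characters ?? [],
    scene: s.scene ?? "unknown",
    props: s.props ?? [],
    durationSec: s.durationSec ?? 4, // assumed default draft length
    status: "todo",
  }));
}
```

Defaulting missing fields up front keeps downstream steps (voice alignment, rendering) from having to handle partially-parsed shots.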
The End-to-End Workflow
- Upload Script: Automatically receive a shot list, character cards, and scene cards.
- Select Style: Choose a preset or define custom prompts to batch-generate storyboards.
- Voice Alignment: Input dialogue to generate AI voices that automatically sync with each shot's timing.
- Fine-Tuning: Adjust pacing on the timeline, replace specific shots, and ensure the episode fits the ~2-minute target.
- Render & Export: Generate the final video and export it as a shareable link or a download package.
- Asset Persistence: All generated assets are saved to your library for reuse in future episodes or social media clips.
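The fine-tuning step above hinges on one simple check: does the episode fit the ~2-minute target? A sketch of that pacing check, with a hypothetical 10% tolerance (the real thresholds are a product decision, not documented here):

```typescript
// Assumed targets for illustration only.
const TARGET_SEC = 120; // ~2-minute episode
const TOLERANCE = 0.1;  // hypothetical 10% overflow allowance

// Sum per-shot durations and flag when the cut runs long.
function checkPacing(durationsSec: number[]): { totalSec: number; overflow: boolean } {
  const totalSec = durationsSec.reduce((sum, d) => sum + d, 0);
  return { totalSec, overflow: totalSec > TARGET_SEC * (1 + TOLERANCE) };
}
```

A check like this can run on every timeline edit, so overflow warnings appear while you are still adjusting pacing rather than after a render.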
Key Decisions & Lessons Learned
- Speed vs. Quality: We default to 1K "draft" storyboards to prioritize pacing and flow. High-quality 2K renders are reserved for the final output or specific focal shots.
- Enforced Consistency: Characters and scenes are bound to prompt templates and reference images. This significantly reduces the "visual drift" where faces or settings change between shots.
- Atomic Reliability: Shot generation states are persisted (via Postgres + Drizzle). If a specific generation fails, it can be retried as a single unit without resetting the entire timeline.
- Transparent Costs: Usage is tracked by "shot credits," integrated with team seat billing. Stripe handles payments, while Resend manages invoices and notifications.
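The "atomic reliability" decision is easiest to see in code. Below, an in-memory store stands in for the Postgres + Drizzle shot-state table (an assumption for illustration; the real persistence layer differs), showing how a failed generation can be retried as a single unit without touching succeeded shots:

```typescript
type GenState = "pending" | "succeeded" | "failed";

// In-memory stand-in for the persisted shot-state table.
class ShotStateStore {
  private states = new Map<number, GenState>();
  set(id: number, s: GenState): void { this.states.set(id, s); }
  get(id: number): GenState { return this.states.get(id) ?? "pending"; }
  failedIds(): number[] {
    const ids: number[] = [];
    this.states.forEach((s, id) => { if (s === "failed") ids.push(id); });
    return ids;
  }
}

// Run one generation and persist its outcome; errors mark the shot
// failed instead of aborting the whole timeline.
function generateShot(id: number, gen: (id: number) => boolean, store: ShotStateStore): void {
  try {
    store.set(id, gen(id) ? "succeeded" : "failed");
  } catch {
    store.set(id, "failed");
  }
}

// Retry only the failed shots; everything else is left untouched.
function retryFailed(gen: (id: number) => boolean, store: ShotStateStore): void {
  for (const id of store.failedIds()) generateShot(id, gen, store);
}
```

Because each shot's state is persisted independently, a crash or model failure mid-episode costs one retry, not a full re-render.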
Value for Every Creator
- Brands & Marketing Teams: Rapidly test storyboards for 2-minute "mini-animations" and repurpose assets for vertical social ads.
- Animation Studios: Use AI to generate "living storyboards" to validate pacing and art direction before committing to expensive manual production.
- Solo Creators: Manage the entire lifecycle from script to screen alone. Build a personal asset library and sell successful templates to the community.
What’s Next?
- Template Marketplace: Introduce screenplay templates, style packs, and promotional asset kits to lower the barrier for new users.
- Team Collaboration: Enable multiple users to edit the timeline and comment on shots simultaneously—ideal for brand/agency workflows.
- Usage Analytics: Provide dashboards for shot generation time, costs, and success rates to help teams manage their budgets effectively.
- Broader Model Support: Expand our OpenAI-compatible API layer to support more image and video models for diverse artistic styles.
The 72-hour stress test proved that AniKuku is no longer just "cool technology"—it’s a production-ready tool. If you’re ready to see your script come to life, join our private beta and help us make the future of animation faster and more accessible.