Weavy

Overview

The same prompt. Different AI models. Very different outcomes.

This exploration started as a simple curiosity: how do different AI image models interpret the same intent? Using Weavy’s node-based canvas, I treated image generation as a design experiment rather than a one-click output—observing how models make creative decisions and how workflows shape results.

Prompting Across Models

AI Exploration · Image Generation

I began with a single prompt and ran it across multiple AI models within Weavy. By keeping the input consistent, I could focus entirely on the differences in interpretation—how each model handled composition, realism, lighting, and mood.

Rather than optimizing for a “perfect” image, the goal was to understand how each model thinks visually and where their strengths naturally emerge.

Designing the Workflow

Node-Based Systems · Iteration

What stood out most was how Weavy’s node-based structure changed the way I approached experimentation. Instead of generating images in isolation, I could chain prompts, models and edits together into a visible workflow.

This made exploration feel intentional. I could branch ideas, tweak inputs and refine outputs without starting over—turning AI generation into a repeatable creative system rather than a series of disconnected attempts.

From Image to Motion

Prompting · Reference Conditioning · Video Generation

I explored how a single reference image could be used to guide motion-based generation. Using Weavy, I combined a prompt with a still image to generate a short cinematic video through the Pixverse model.

The prompt focused on capturing movement, perspective, and emotion—a first-person, selfie-style view of a dog snowboarding—while the reference image anchored visual consistency in character, outfit, and environment.

Reflections and Learnings

This experiment shifted how I think about AI’s role in design.

  • Models have personalities.
    Each model brought its own creative bias to the same prompt. Choosing a model felt less technical and more like choosing a visual collaborator.

  • Prompting is design intent in text form.
    Small changes in wording led to big shifts in mood and composition. Writing prompts felt closer to art direction than instruction.

  • Workflows matter more than outputs.
    The ability to see and reuse a workflow made experimentation more thoughtful and less random. Process became a design asset.

  • AI works best as a thinking partner.
    The most interesting moments weren’t the polished results, but the unexpected outputs that challenged assumptions and sparked new ideas.

Overall, this exploration reinforced my belief that AI is most powerful when designers stay in control—using it to explore, iterate, and learn rather than to shortcut creative thinking.

Let's collaborate and create something magical ✨

© 2026 Made with ♥️ by Ruchika Kankaria
