Happy Dragon

Project
AI-powered concept film demonstrating character consistency and mixed-style world-building.
Objective
To test the capabilities of Weavy's node-based workflow and the newly released Nano Banana model by creating a coherent short film with consistent characters, controlled scenes and a blend of photorealistic and animated styles.
Role
I conceived, wrote and produced the entire film, leading the creative direction, narrative development, visual design, AI workflow engineering, motion graphics, sound design and the final edit.

Happy Dragon is a short AI-generated concept film created to explore the emerging strengths of multi-stage generative workflows. I used the film as an opportunity to benchmark Nano Banana's image consistency and Weavy's sequencing logic in a real narrative context.

Creative exploration

The project began as an investigation into whether an AI pipeline could reliably maintain character identity and scene continuity across multiple shots. I developed a simple narrative about a spirited dragon to anchor the test, then used design-thinking principles to break the story into modular beats that could be controlled and iterated through Weavy's node structure. This approach allowed me to test variations quickly while preserving a narrative through-line.

Using Nano Banana, I generated both photorealistic and stylised animated interpretations of the same characters to assess cross-style consistency. I experimented with controlled prompt structures and iterative refinements, evaluating the model's ability to interpret physical detail and maintain object shapes, especially for product-style shots where precision was essential.

Technical workflow

Once the visual assets were generated, I refined each frame using image-editing techniques before shifting into video creation. I used Kling for image-to-video transformations, testing how well motion could be applied without breaking the visual identity established in earlier stages. This step was critical to understanding how different AI models handled temporal coherence.

I then assembled the full sequence with motion graphics, sound design, SFX and manual polishing. This final pass ensured the film felt cohesive despite being the output of several distinct AI tools. The end result revealed the strengths and limitations of a multi-model workflow and demonstrated how creative control can be layered over generative outputs.

The film performed strongly on LinkedIn, earning around 50,000 impressions and 900 reactions and reinforcing the growing interest in practical AI filmmaking techniques. More importantly, the project validated that a hybrid workflow can support consistent characters, blended visual styles and controlled product shots, establishing a blueprint for rapid AI concept development and narrative prototyping.

Hello Person Ltd
Strategy - Creative - Design - AI