Dog & Bear

Project
Dog and Bear, an AI-animated short film.
Objective
To explore how far generative tools could be pushed to create a coherent, character-driven animation with consistent visuals, fluid storytelling and a unified audio identity.
Role
I led the full end-to-end creative process, including narrative development, visual direction, character design, shot creation, motion and editing. I also oversaw all AI workflows, spanning imagery, video generation, audio composition and final production.
Dog and Bear was an experimental short film created to test the limits of AI video generation. The goal was to understand how consistent character design, structured storytelling and multi-tool orchestration could be combined to produce a cohesive animated narrative.
Creative foundations
The project began with extensive ideation in Perplexity AI, where I explored alternative story angles before refining them into a tight narrative arc. Perplexity AI was also used to produce structured shot descriptions, allowing me to establish a cinematic blueprint that could be translated directly into visual prompts. To anchor the aesthetic, I built a dedicated Perplexity AI space and trained it with creative-direction inputs that ensured a stable stylistic language across every scene.
With the narrative and visual framework defined, I developed detailed character sheets in Midjourney. These became the foundation for consistent character portrayal, serving as reference images for every shot prompt. This approach ensured that Dog and Bear remained visually recognisable throughout, even as different tools generated frames and sequences. It also provided a reliable starting point for compositing, animation flow and emotional continuity.

Production workflow
Each visual asset was upscaled and processed through Luma AI and Kling AI using image-to-video techniques. This allowed the static concepts to evolve into motion while retaining the core artistic identity of the characters. The generated sequences were then further enhanced with 4K upscaling in Topaz Labs to achieve a polished, cinematic finish.
For audio, I built layered soundscapes and character themes in Udio, experimenting with different musical interpretations before cutting together the final composition. This provided an emotive structure that supported the tone of the film. The project concluded in CapCut, where I completed the colour grading, editing, transitions and overall composition to bring the full narrative together.
