MIXING AI WITH NARRATIVE 🖤
Never snows in Texas
Don’t let go…
Another experiment with artificial intelligence! I wanted to try something different and find a way to incorporate my current favorite tools into a small short I wrote a while ago. The idea is to push how I already shoot and tell stories even further—not to replace it by any means!
I have no interest in simply generating art. My emotional connection to what I do runs too deep. I’m just curious to see how far I can take it with what’s possible today!
.
Written & Directed by: Mateo Mejia
DP & Gaffer 🎥: Logan
Key grip: Limbo
Cast:
@starletfrancis
@currentlystuck_ @mateocaptures
.
Breakdown:
We shot this project on the @blackmagicnewsofficial URSA Mini 12K and the Full-Frame 6K, using a set of @meike_global lenses and a few @intellytech and @aputure.lighting lights! For the therapist's narration, I used a voice-generation tool called @elevenlabsio . I trained it on a random clip of a 90s news reporter and fed it the script I wrote!
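(The voice training itself happens inside ElevenLabs' app, but for anyone curious what the text-to-speech step looks like programmatically, here is a minimal sketch against ElevenLabs' public REST endpoint. The voice ID, model choice, and sample line are placeholders, not the actual project settings.)

```python
# Minimal sketch: send a line of script to ElevenLabs' text-to-speech REST
# endpoint and save the narration as an MP3. VOICE_ID and the model choice
# are placeholders/assumptions, not the settings used in the actual short.
import requests

API_KEY = "YOUR_ELEVENLABS_API_KEY"   # from your ElevenLabs account
VOICE_ID = "YOUR_CLONED_VOICE_ID"     # hypothetical: the cloned 90s-reporter voice

url = f"https://api.elevenlabs.io/v1/text-to-speech/{VOICE_ID}"
payload = {
    "text": "You've been holding on to this for a long time, haven't you?",
    "model_id": "eleven_multilingual_v2",  # assumption: any current TTS model works
}
headers = {"xi-api-key": API_KEY, "Content-Type": "application/json"}

resp = requests.post(url, json=payload, headers=headers)
resp.raise_for_status()

with open("therapist_line.mp3", "wb") as f:
    f.write(resp.content)  # the endpoint returns raw audio bytes
```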
The rest of the dialogue was performed by me, and the voice of the girl was done by @starletfrancis . For this project, I experimented with how well AI could generate water in a frame I had already shot. I used various stills and manipulated them with tools like @hailuoai_official , @luma_ai and @runwayapp . It took so many attempts to get it right, but I feel the results turned out even better because of the lighting setup we created.
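(For the curious: the water shots were image-to-video passes over stills I had already shot. Below is a rough sketch of what one of those passes might look like through Runway's official `runwayml` Python SDK; the method and model names are my best recollection of that SDK and may differ from the current API, and the image URL and prompt are placeholders, not my actual inputs.)

```python
# Rough sketch of animating water into an already-shot still, assuming
# Runway's official `runwayml` Python SDK. Method and model names are
# from memory and may differ; the URL and prompt are placeholders.
import time
from runwayml import RunwayML

client = RunwayML()  # assumes RUNWAYML_API_SECRET is set in the environment

task = client.image_to_video.create(
    model="gen3a_turbo",  # assumption: a current image-to-video model name
    prompt_image="https://example.com/still_frame.jpg",
    prompt_text="water slowly rising around the subject, lighting unchanged",
)

# Poll until the generation finishes, then inspect the result.
while True:
    task = client.tasks.retrieve(task.id)
    if task.status in ("SUCCEEDED", "FAILED"):
        break
    time.sleep(10)

print(task.status, getattr(task, "output", None))
```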
To achieve the slow-to-fast effect, I rescaled the footage and adjusted the frame rate using @topazlabs . Toward the end, in the wide shot of me holding the actress, I expanded the frame using the new video expansion tool from @runwayapp !
^ A new era of sampling—this time, for film. :)
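(A quick aside on the slow-to-fast retime above: Topaz Video AI is a GUI app, but as a rough open-source analogue of that step, and plainly not my Topaz workflow, ffmpeg's minterpolate filter can synthesize in-between frames so the slowed portion stays smooth, then a setpts ramp snaps it back to speed. The timestamps below are placeholders to tune per shot.)

```python
# Rough open-source analogue of the retime step (not the Topaz workflow):
# ffmpeg's minterpolate filter synthesizes in-between frames, then a
# setpts ramp plays the clip slow and snaps back to normal speed.
import subprocess

# 1) Motion-interpolate the source up to 120 fps so a 4x slowdown stays smooth.
subprocess.run([
    "ffmpeg", "-i", "shot.mp4",
    "-vf", "minterpolate=fps=120:mi_mode=mci",
    "interpolated.mp4",
], check=True)

# 2) Slow the first 2 seconds to quarter speed, then resume normal speed.
#    The 0-2s split is a placeholder; audio is dropped for simplicity.
subprocess.run([
    "ffmpeg", "-i", "interpolated.mp4",
    "-filter_complex",
    "[0:v]trim=0:2,setpts=4*PTS[slow];"
    "[0:v]trim=2,setpts=PTS-STARTPTS[fast];"
    "[slow][fast]concat=n=2:v=1:a=0[out]",
    "-map", "[out]", "ramped.mp4",
], check=True)
```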
Nothing is gonna change my world…
This project was an experiment I’ve wanted to try for a long time! If you’ve been following me, you know I’ve been diving into artificial intelligence and how to fit these advancements into my filmmaking process. This video was a test to see what I could create with no budget—just using these tools to enhance what I already do.
I shot it on the @blackmagicnewsofficial Full Frame with a @meike_global 35mm lens at f/1.4 and one @intellytech 2x4 softbox as a key light. I shot every scene twice: once with me in frame and once without, to make it easier to manipulate the background. I used tools like @runwayapp , @luma_ai , @hailuoai_official and generative fill in @photoshop to add effects like fire and explosions. Then I brought everything into @davinciresolvestudios , used Magic Mask to rotoscope myself back in, and color-graded to blend it all together.
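(Shooting each scene twice is what makes the background manipulation tractable. Resolve's Magic Mask is what I actually used, but as an illustration of why a clean plate helps, here is a classic difference-matte sketch in Python with OpenCV; the filenames and threshold are placeholders: subtract the empty plate from the frame with the subject, and the big differences mark the subject.)

```python
# Illustration of why a clean plate helps (this is difference matting,
# not Resolve's Magic Mask): subtracting the empty background from the
# frame with the subject yields a rough matte you can refine by hand.
import cv2
import numpy as np

plate = cv2.imread("clean_plate.jpg")     # the take without the subject
frame = cv2.imread("with_subject.jpg")    # the matching take with the subject

# Absolute per-channel difference; large values are where the subject is.
diff = cv2.absdiff(frame, plate)
gray = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)

# Threshold into a binary matte; 40 is an arbitrary starting point to tune.
_, matte = cv2.threshold(gray, 40, 255, cv2.THRESH_BINARY)

# Morphological open/close to knock out noise and fill small holes.
kernel = np.ones((5, 5), np.uint8)
matte = cv2.morphologyEx(matte, cv2.MORPH_OPEN, kernel)
matte = cv2.morphologyEx(matte, cv2.MORPH_CLOSE, kernel)

cv2.imwrite("rough_matte.png", matte)
```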
I’ve learned that AI tools are getting much better at creating VFX and tweaking images you’ve already shot. I’m not about generating entire films on a computer—I’m about enhancing the way I already make films, pushing them further, and doing things I’ve dreamed of but couldn’t afford to try.
This isn’t perfect, but I’m happy with it. I know how I can prep better next time—like spending more time on lighting and fine-tuning the details—but not bad for a first attempt!
Can’t predict the expiration date for these sorta things…
What a moment in my life to have been able to make a project like this.