We are witnessing an incredible evolution of text-to-image systems such as Stable Diffusion and MidJourney. Many of them are becoming easily accessible to everyone through apps, Discord channels, websites, or even the same source code developers use. They have fired the imagination of thousands of people, who use them to generate the most diverse things, giving rise to a universe of spectacular digital creations.

In parallel, Avatar: The Way of Water, James Cameron’s masterpiece, one of the most eagerly awaited films ever and the second instalment in one of the most successful sagas of all time, is being released in cinemas. Key to this success is the wonderful setting of Pandora, a planet rich in plants, colours, and majestic, incredible creatures. Right from the first teaser trailer, it was clear that this second chapter would catapult viewers back into this breathtaking world.

During the Christmas holidays, Lucia Pifferi and I asked ourselves: “What would Pandora look like as imagined by MidJourney?”, “Is it possible to create an alternative version of the teaser trailer for Avatar: The Way of Water using MidJourney?”, and “Can these systems be used to create trailer concepts from simple text descriptions?”.
To answer these questions, we described the main scenes of the official teaser trailer of Avatar: The Way of Water in words and fed those descriptions, one by one, to MidJourney as prompts.
All the generated images were then combined into a teaser trailer in the same style as the original. The video can be found at the bottom of the article.
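As an aside, stills like these can be assembled into a video with standard tooling. The article does not say which tool was used; the sketch below is one possible approach, building an ffmpeg command for a hypothetical folder of frames named `scene_01.png`, `scene_02.png`, and so on (both the naming scheme and the timing are assumptions for illustration).

```python
from pathlib import Path

def build_ffmpeg_command(image_dir: str, fps: int = 1, out: str = "teaser.mp4") -> list[str]:
    """Build an ffmpeg command that turns numbered stills into a video.

    Assumes frames are saved as scene_01.png, scene_02.png, ... —
    an illustrative naming scheme, not necessarily the one used here.
    """
    pattern = str(Path(image_dir) / "scene_%02d.png")
    return [
        "ffmpeg",
        "-framerate", str(fps),  # hold each still for 1/fps seconds
        "-i", pattern,           # numbered image-sequence input
        "-c:v", "libx264",       # widely supported H.264 encoder
        "-pix_fmt", "yuv420p",   # pixel format most players accept
        out,
    ]

cmd = build_ffmpeg_command("frames", fps=1)
print(" ".join(cmd))
```

Running the printed command (with ffmpeg installed) produces an MP4 that holds each generated image for one second; a real trailer cut would of course vary the timing and add music.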
Below are the individual scenes as regenerated by MidJourney. For each one we show the (shortened) caption we used during generation alongside the original image taken from the trailer.