After years of flirting with it and embellishing it, is technology about to overwhelm storytelling?
IMAX, an immersive screen format designed in the sixties, was used largely by museums and for natural history documentaries. Lawyer-turned-entrepreneur Richard Gelfond, now chief executive officer, came on board after an investment banking firm he ran with a partner acquired IMAX in 1994. The technology was wonderful to watch movies on, but it was clunky and expensive, and it was almost impossible to get mainstream studios and theatre chains to use it. In 2001, an in-house algorithm that allowed regular films to be reformatted for IMAX helped push costs down. That is around the time theatres began digitising and filmmakers started using IMAX equipment to shoot their films.
Films like Harry Potter and the Goblet of Fire (2005), Happy Feet (2006) and Transformers (2007) made the IMAX format less of an oddity. James Cameron’s science fiction superhit Avatar (2009) mainstreamed it completely. The 163 IMAX screens then in operation brought in roughly $250 million (about 9 per cent) of the almost $3 billion the film grossed globally. By 2023, the technology was available on 1,705 screens around the world — less than one per cent of all screens. Yet it brought in over 3.5 per cent (over a billion dollars) of the global box office. That is because people are willing to pay three to five times the price of a regular ticket to watch an “IMAX” film.
Jonathan and Christopher Nolan’s vision for Interstellar (2014) involved travel through time and space via a wormhole and a black hole. There was no evidence of what these would look like, no image. It took astrophysicist and Nobel Prize winner Kip Thorne and award-winning visual effects firm DNEG to figure that out, successfully. Interstellar grossed over $705 million at the global box office, won praise from scientific journals, and also won the London-based DNEG, a subsidiary of Mumbai-based Prime Focus, one of its seven Academy Awards. The others include those for Dune (part one), Tenet and Inception. Namit Malhotra, founder of Prime Focus and CEO of DNEG, says that if a filmmaker can dream it, his firm can deliver it.
To these advancements in storytelling technology, at both the retail and the production end, add AI, or artificial intelligence. The world, India included, is awash with startups attempting to use it in storytelling. Some are basic, like Bengaluru-based NeuralGarage, which offers VisualDub. The generative AI-based tool syncs dubbing to the facial and body movements of characters, ensuring that a video dubbed, for example, from Telugu to Korean or English to Hindi doesn’t appear odd. Brands like Britannia and Ultratech have used it to release ad films across India’s diverse consumer markets. Earlier this year, NeuralGarage tied up with digital cinema distribution firm UFO Moviez to pitch its technology to studios. In an age when streaming services and theatres look for pan-Indian and pan-global stories (and, therefore, more languages), a tool that cuts costs by a third or more is welcome.
There are others like AyeVee, an app from Chennai-based Asiaville Interactive, which has just patented a technology that marries gaming and fiction. One of its two 45-minute shows, Who Killed Kavitha, is a video whodunit that allows you to analyse clues, check out suspects, carry out handwriting analysis and so on. ScriptGPT, a generative AI tool developed in-house by Zee Entertainment, helps figure out character arcs, stories, plots and twists that could attract more audiences and boost ratings. Dozens of other similar apps are being developed to aid, embellish and assist storytelling.
All this tech-fuelled creativity — whether at the screening and filmmaking end or the writing and script development end — dovetails with the massive disruption the business is seeing. Streaming, which took off globally in 2016, is chipping away at clusters of audiences. Global box-office collections fell from $42 billion in 2017 to $34 billion in 2023. It now takes something exceptional, an Oppenheimer, a Barbie or a Pathaan, to get people back into the theatre. That means you need a spectacle, visual effects or a star that the bulk of the audience simply cannot enjoy on the small screen, and a screen that makes it worth their while to step out. Much of the action on the tech front, then, is a good sign.
But the question remains — when does technology stop being a support for the story and become the story itself? The decision about how many and which visual effects, tools and gizmos a story needs ultimately lies with both the audience and the filmmaker. “No technology, no tool, no superstar means anything. Ultimately, people care only about the story being told. If the director wants to tell a story about going into time and space, he cannot tell it without the tools needed to show that. It is about servicing the need of the story,” reckons Mr Malhotra.
Think about it. Oppenheimer and Dune are not necessarily spectacle films. They are stories about extraordinary, real or imagined events, told with atmosphere. Mission: Impossible, Furiosa and Brahmastra combine story and spectacle. Both types of films have succeeded, whereas many recent Marvel films and special effects-heavy ensembles that are thin on storyline have not.
This is true for both — technologies that use generative AI and those that don’t. Add another twist for “cognitively-enabled” technology like generative AI: it needs to be trained, and that can happen only on original creative work from the past. As writers, filmmakers and actors rush to legally protect their work from being used to train AI, its creative abilities will be limited to hallucinating. That happens when generative AI does not get enough data to “generate”, or when it is trained on poor-quality videos and stories, leading to subpar output. For now, AI’s role will be limited to enhancing the viewing experience, cutting costs, improving efficiencies and maybe creating the odd AI-based character.
twitter.com/vanitakohlik