An angry political leader shouts a message of hate to an adoring crowd. A child cries over the massacre of her family. Emaciated men in prison uniforms starve at the edge of death because of their identity. As you read each sentence, specific images likely appear in your mind, seared into your memory and our collective consciousness through documentaries, textbooks, news media and museum visits.
We understand the importance of historic images like these – images we must learn from in order to move forward – in large part because they captured something real about the world when we were not present to see it with our own eyes.
As archival producers for documentaries and co-directors of the Archival Producers Alliance, we are deeply concerned about what could happen when we can no longer trust that such images reflect reality. We are not alone: before this year's Academy Awards, Variety reported that the Motion Picture Academy is considering requiring contenders to disclose the use of generative AI.
While such disclosure may be important for feature films, it is critical for documentaries. In the spring of 2023, we began seeing synthetic images and audio used in the historical documentaries we were working on. Absent any transparency standards, we fear this blending of the real and the unreal will degrade the nonfiction genre and its indispensable role in our shared history.
In February 2024, OpenAI previewed its new text-to-video platform, Sora, with a clip called “Historical footage of California during the gold rush.” The video was convincing: a rushing stream full of the promise of riches. Blue skies and rolling hills. A thriving town. Men on horseback. It looked like a Western in which the good guy wins and rides off into the sunset. It looked authentic, but it was fake.
OpenAI presented “Historical footage of California during the gold rush” to show how Sora, officially released in December 2024, creates videos from user prompts using AI that “understands and simulates reality.” But that clip is not reality. It is a haphazard blend of imagery both real and imagined by Hollywood, along with the historical biases of the industry and its archives. Sora, like other generative programs such as Runway and Luma Dream Machine, is trained on internet content and other digital material. As a result, these platforms simply recycle the limitations of online media, doubtless amplifying its biases. Having watched the clip, we understand how audiences could be fooled. Film is powerful that way.
Some in the film world have greeted the arrival of generative AI tools with open arms. We and others see them as something deeply worrisome on the horizon. If our belief in the veracity of images is destroyed, even powerful films that use no AI-generated material may lose their force and their claim to the truth.
Transparency, something akin to the food labeling that informs consumers about what goes into the things they eat, could be a small step forward. But no regulation requiring the disclosure of AI appears to be just over the hill, riding in to save us.
Generative AI companies promise a world in which anyone can create audiovisual material. That is deeply troubling when applied to representations of history. The proliferation of synthetic images makes the job of documentarians and researchers – safeguarding the integrity of primary source material, digging through archives, presenting history accurately – ever more urgent. It is human work that cannot be replicated or replaced. One need only look at this year's Academy Award-nominated documentary “Sugarcane” to see the power of careful research, accurate archival imagery and well-reported personal narrative to expose hidden history – in this case, the abuse of First Nations children at Canadian residential schools.
The speed with which new AI models are released and new content is produced makes the technology impossible to ignore. While it may be fun to use these tools to imagine and experiment, the result is not a true work of documentation – of humans bearing witness. It is only a remix.
In response, we need robust AI media literacy for our industry and the general public. At the Archival Producers Alliance, we have published a set of guidelines – endorsed by more than 50 industry organizations – for the responsible use of generative AI in documentaries, practices that our colleagues have begun integrating into their work. We have also issued a call for case studies of AI use in documentary film. Our goal is to help the film industry ensure that documentaries deserve that title and that the collective memory they document will be protected.
We are not living in a classic Western; no one is coming to save us from the threat of unregulated generative AI. We must work individually and collectively to preserve the integrity and diverse perspectives of our real history. Accurate visual records not only document what happened in the past, they help us understand it, learn its details and – perhaps most important in this historical moment – believe it.
When we can no longer look to the past and trust what we see of what came before, the future we share risks becoming little more than a random remix as well.
Rachel Antell, Stephanie Jenkins and Jennifer Petrucelli are co-directors of the Archival Producers Alliance.