The battle over artificial intelligence has reached a boiling point in Western culture, with Hollywood screenwriters and actors taking a stand against the role of generative machine learning in shaping our future stories. Fran Drescher, the SAG-AFTRA president best known for her role in “The Nanny,” has been at the forefront of this fight, highlighting how digital streaming platforms are eroding wages, conditions, and careers across the industry. At the heart of the strikers’ demands is a plea for studio bosses to acknowledge that the work of machines should not be passed off as that of human creators.
Director Christopher Nolan, known for films such as “Inception” and “The Dark Knight,” sees parallels between the atomic bomb developed by the subject of his latest film, J. Robert Oppenheimer, and the potential dangers of unleashing AI on the world without careful consideration. He cautions against portraying technology as all-powerful, since doing so absolves individuals of responsibility for their actions, and emphasizes the need to recognize the limitations of AI.
The Automated Culture Wars have also ignited debates around AI-generated music and literature. Algorithms are now capable of distilling existing works into something that purportedly surpasses human creations. However, the recent Guardian Essential Report suggests that Australians are divided on their comfort levels with these developments. The report reveals that Australians are more accepting of AI-generated visual art but are less convinced when it comes to music, scripts, novels, and news reporting. There is a clear generational divide, with younger respondents being more accepting of automation but still agreeing that AI-generated content should be clearly labeled.
While the Hollywood strikers are often seen as the canaries in the coal mine, they are more like frogs trying to escape a boiling pot. Generative technology has already displaced workers across numerous industries, reducing their labor to replicable processes controlled by AI. Whether it’s film scripts or automated checkouts, AI thrives on the labor of real people, yet those people receive no reward from, and have no control over, the output it generates.
Advocates of AI often emphasize its potential benefits, such as energy efficiency, improved healthcare, and increased accessibility. Far less attention is paid to the risks. Sam Altman, chief executive of OpenAI, the company behind ChatGPT, admits to being “a little bit scared” of his own technology and believes society has a limited amount of time to figure out how to handle it. Australians share this concern, prompting the Albanese government to initiate a review of AI regulation in the country.
Edward Santow, a former human rights commissioner, and his team at the Human Technology Institute argue that any regulation of AI must be built on a solid foundation. They suggest that updating privacy laws should be the starting point, as personal information is the fuel for AI. Australia’s privacy laws have not been substantially updated since the Privacy Act of 1988, leaving the regulatory infrastructure ill-equipped for the digital age of surveillance capitalism. There have been attempts at reform, but resistance from media outlets claiming a right to operate without legal consequences has hindered progress.
Public opinion, however, is ahead of the political process, with strong cross-partisan support for tighter rules on data collection. This support could serve as the basis for a modern-day nonproliferation treaty for data. The public recognizes the need to protect their privacy and prevent the unchecked proliferation of personal information.
Drawing parallels between AI and the atomic bomb, as Christopher Nolan’s film “Oppenheimer” invites, may not be the most fitting metaphor for this moment. AI is not a singular event with devastating consequences, but a slow burn that is gradually consuming jobs and culture. If we are to reach for a mid-20th-century figure as a metaphor for AI, Friedrich Hayek, the Austrian economist known for his anti-regulation stance, may be a better fit. Hayek’s ideas laid the foundation for the neoliberal consensus, which privileges self-regulating systems over human control. This ideology has allowed global corporations to dominate both governments and citizens.
Ultimately, it is the human flaws and shared experiences that give art its true expression. AI may be capable of creating more efficient and concise works, but it is the imperfections and depth of human creations that spark meaningful conversations and shape culture. As we navigate the future of AI, we must not lose sight of the importance of human creativity and connection.