Warning: This article contains spoilers about several 2024 Academy Award nominated films.

With the Academy Awards fast approaching, many are probably watching and re-watching the nominees, attempting to predict which will be recognized and which may be “snubbed.” But some of us are interested in something else. Many of this year’s nominees grapple with diverse subjects within science and technology, whether historical or fictional. How accurately do they portray these scientific topics?

“Oppenheimer” and the atomic bomb

Dominating box offices as one of the biggest movies of the year, Christopher Nolan’s “Oppenheimer” clearly did a lot right. Based on Kai Bird and Martin Sherwin’s book “American Prometheus: The Triumph and Tragedy of J. Robert Oppenheimer,” the story follows Oppenheimer as he leads the Manhattan Project, a top-secret effort by physicists, engineers and military officials to develop the world’s first atomic bomb. From both a historical and a scientific perspective, the movie is impressively accurate. From the initial concerns that a nuclear bomb might set fire to the entire atmosphere to the relationship between Oppenheimer and Albert Einstein at Princeton University, the central plotlines in the film are all grounded in fact.

According to Bird himself, the film didn’t leave out any major historical scenes, but some details were excluded, including the devastating effects of the Trinity test in the New Mexico desert and the aftermath of the Hiroshima and Nagasaki bombings. Chris Griffith, a Manhattan Project historian, described a few other aspects of the project that Nolan got right and wrong. Because it centers on Los Alamos, the film largely leaves out the contributions of the project’s other sites, like Hanford, Washington, and Oak Ridge, Tennessee. While Los Alamos was essentially the center of the Manhattan Project, the other sites produced the plutonium and enriched uranium used in the bombs. Though they are briefly referenced in the film’s marbles scene, Griffith argues that the exclusion of these sites may leave viewers without an accurate grasp of the true scale of the project.

Another, albeit minor, aspect changed for the screen was the Trinity test explosion itself. While Nolan got the timing right (and, notably, used practical effects rather than CGI to maintain as much accuracy as possible), Griffith expresses disappointment at not seeing the characteristic mushroom cloud that should have followed the light and sound. In addition, Australian nuclear experts have said that the film leaves out any reference to or depiction of the heat wave that should have immediately followed the explosion. One of the researchers, Dr. Kirrily Rule, went so far as to say that the science of the project was ineffectively conveyed to the audience throughout the film, leaving them with a sense that it’s “too hard.” She argues this establishes a disconnect between the viewers and the subject matter.

“As a physicist watching the movie, I think they could have been much clearer on the science involved,” she said.

Nevertheless, whether the film oversimplifies or overcomplicates the exact science behind the Manhattan Project, it does an effective job of communicating the project’s moral and political implications, and it leaves viewers excited to learn more.

Resurrection and “self” in “Poor Things” 

A testament to womanhood and self-discovery, “Poor Things,” directed by Greek filmmaker Yorgos Lanthimos, has become another fan favorite this year. The film explores the psychological development of Bella Baxter, a Frankenstein-like creation of her father figure Dr. Godwin Baxter, as she grows into both her own body and the world around her. From the original Frankenstein to our current cultural craze over zombies, reanimation and resurrection are themes that popular culture has explored for decades, reflecting our society’s long-standing fascination with the ability to overcome death. Within science itself, many researchers have devoted enormous effort to finding ways to extend lifespans, revive patients who have died and, like Dr. Baxter’s duck-goats and dog-chickens, create “chimeras” by mixing and matching the most interesting or strongest attributes of various species.

Side-stepping any obvious ethical issues, however, is there any actual scientific evidence that Dr. Baxter’s creations could theoretically one day be possible? “Poor Things” makes it look so easy: pluck a brain out of one body, open the brain case and set it in. Voilà, a new human retaining all motor function and mental capabilities. In reality, while there have been quite a few attempts at studying brain tissue revival and even full head and brain transplants, there hasn’t been any definitive success. While Bella is the amalgamation of a prenatal brain in an adult woman’s body, some researchers, like Sergio Canavero, have studied the possibility of transplanting older brains from their elderly bodies into younger, healthier ones. Laboratory trials have attempted such transplants in mice and monkeys, but the brain’s connection to the spinal cord complicates things: severing and replacing a head is one thing, but no technology yet exists that can fully reconnect the brain and spinal cord (at least in humans). And even if a full brain transplant became possible, there is no concrete understanding of how the brain would continue to develop once inside its new body.

In “Poor Things,” since Bella’s body stays the same, the film focuses on exploring the character’s psychological progression. Perhaps unsurprisingly, however, its depiction of her mental development is not entirely scientifically accurate. The French philosopher René Descartes argued that the mind and body are completely separate and that each could operate independently of the other, a theory clearly at play in the film. Many contemporary psychologists, however, believe that the mind and body are intrinsically connected. Regarding children’s development, psychologist Philippe Rochat asserts that the body plays a major role in the mind’s development, and that over the course of one’s life the two work together to inform an individual’s “sense of self.” Nevertheless, while Bella herself may be a work of fiction, if the incredible slew of medical advancements of the last century is any indication, human brain transplants may be commonplace in another hundred years.

“The Creator” and artificial intelligence 

With tools like ChatGPT, facial recognition software and virtual assistants (like Siri or Alexa) becoming increasingly widespread, many have expressed concerns regarding how much power and autonomy these technologies really have, and whether they may be developed to the point at which they have minds of their own. Director Gareth Edwards’ most recent film, “The Creator,” explores these themes to an extreme. The story follows Joshua, a soldier in a deadly war between humans and robots, whose mission is to find and eliminate a mysterious artificial intelligence (AI) weapon that the robots have been developing. Upon discovering that the weapon is in fact a young “synthetic” girl, and in the process learning more about his deceased wife, Joshua is forced to confront his previous assumptions about the inhumanity and danger of AI and to rethink his beliefs about what constitutes humanity.

The film does an effective job of creating a story that evokes an emotional response from the audience, interestingly more so toward the machines than toward the human characters. Film critics have acknowledged this and drawn connections between this story and that of the iconic film “Blade Runner.” In both stories, attention is drawn to how the robots are “more human than human.” This sentiment is evident in several scenes in “The Creator” that demonstrate the cruelty humans inflict on the robots and those protecting them.

Yet does the film perpetuate inaccuracies and stereotypes regarding the essence of artificial intelligence and its potential to equal or even surpass humans? Perhaps. The movie takes place in 2070, 15 years after robots detonated a nuclear bomb in Los Angeles, killing millions of people. This is a theme popular culture has latched onto in recent years: whether AI might reach a point where it can overrun humanity. Many experts, however, have sought to assuage these concerns, maintaining that while AI is becoming more sophisticated every day, it is highly unlikely that it will ever “take over the world,” at least in the way movies like “The Creator” portray it. Robots have been trained to do everything from playing chess to writing novels, but so far none have been able to truly exhibit (rather than be programmed to mimic) emotion, creativity or intuition, all defining characteristics of humans. Some researchers, however, believe that in the next few years AI systems could reach a point where, if granted enough access to global systems like the stock markets or the military, and depending on what they were programmed to do, they could get out of control. As with any technology, though, the risk lies less with artificial intelligence itself than with how we choose to use it.

A version of this article appeared on pg. 12 of the March 7, 2024 print edition of the Daily Nexus.
