1 Oct 2021

Review: Free Guy – Two weeks ago, HJ and I went to see Free Guy, a deep and thoughtful exploration of important philosophical issues, such as the nature of reality, the meaning of life, and the question of free will.

“Free Guy knows exactly what sort of film it is, and it leans into that with everything it’s got.”

Ha! I’m just kidding! No, Free Guy is an action-comedy blockbuster with some impressive special effects and Ryan Reynolds doing his best Ryan Reynolds. Or, wait... can it be both? I’ll save you the suspense and tell you right off the bat that the answer to that question is “No.” I don’t think Free Guy necessarily tried to be both, though. It knew which path it wanted to take from the very beginning, and it committed to that path, even while casting the occasional longing (but ultimately perfunctory) glance at the other path.

I’m getting ahead of myself here. Since I have labeled today’s post a “review,” I’ll start with that, avoiding spoilers for the time being. Free Guy tells the story of Guy, an NPC (non-player character) in a massively multiplayer online game along the lines of Grand Theft Auto. He works as a teller at a bank that gets robbed numerous times throughout the day, and he seems to be perfectly at peace with the fact that his hometown of Free City is basically a war zone. One day, though, he meets the girl of his dreams, one of the “sunglasses people,” and everything he thought he knew about his world is suddenly called into question. Can he become more than his default programming to save his world and everyone in it?

That’s the plot in a nutshell, and it’s about what you’ll get if you watch the trailer. I was only half kidding when I mentioned “the nature of reality, the meaning of life, and the question of free will” back in the first sentence. Free Guy does indeed touch on these issues—or, at the very least, it waves cheerily at them as it blasts by on its joyride through the world of Free City. Like I said above, Free Guy knows exactly what sort of film it is, and it leans into that with everything it’s got. Mind you, this is not a bad thing. I found the film quite enjoyable, thanks to some good writing, seamless special effects, and quite a few good performances by the cast. Taika Waititi shows that he can play not only a hilariously deadpan rock monster (Korg, from Thor: Ragnarok, hands down the best Thor film) but also a deliciously detestable villain. Jodie Comer and Joe Keery are solid as leads Millie/Molotov Girl and Keys. But your response to this film will probably hinge in large part on how you feel about Ryan Reynolds. If you like Ryan Reynolds, you’ll like him here, and you’ll probably enjoy Free Guy, too. If you do not like Ryan Reynolds, then this is probably not the film for you—but I’m guessing you already know that. Still, whether you like Reynolds or not, the film brings up a lot of very interesting questions that are worth thinking about, so the remainder of this entry will be a (much) deeper dive into both the film and those questions. I should tell you up front that I got a little carried away and ended up writing nearly eighty-five hundred words in total, which is why it took me two weeks to write this. Most of these words are tangents that stray far from anywhere the film ever dreamed of going. Whether this is a good thing or not remains to be seen. Hopefully all those words will at least give you something to think about, too. Oh, and there will be some spoilers from here on out, although this isn’t really the sort of film where spoiling will necessarily ruin it, I don’t think. (There will be spoilers for a few other older films, as well, though: The Matrix, The Truman Show, Inception, Blade Runner 2049, etc.)

Most of what you see in the trailer for Free Guy, and the main focus of the marketing, is only half of what goes on in the film. There are two plots: the “Free City” plot inside the game, and the “real world” plot outside the game. The latter can be very roughly summed up as follows. Two independent game developers, Keys and Millie, develop a “game” focused on NPCs that will go about living their own lives like real people. There will (presumably) be human player characters in the game as well, but the gameplay will primarily consist of watching the AI-driven world develop. This original game developed by Keys and Millie never ends up going anywhere for some reason, and when we join the story in medias res at the beginning of the film, Keys is working for the evil game developer Antwan at the mammoth AAA (“Triple A”) company Soonami, while Millie is trying to prove that Antwan stole their code to create the massively popular “Free City.” She is doing this from the inside, as one of what the NPCs call “sunglasses people,” that is, player characters or avatars in the game. (The sunglasses are a conceit that allows Guy to see all the game elements of his world, although it’s unclear why such a mechanic would exist, other than as a way to allow Guy to become a player himself. Just go with it for now.)

Now, if it feels like there is a gap in the story there, that’s because there is—or, at least, I was never all that clear on how we got from that original build by Keys and Millie to the game “Free City.” At first I had the impression that Antwan and Soonami bought out Keys and Millie, but if that is the case then he would have had the rights to the original code, which is what drives this whole story forward. Did Antwan just beat Keys and Millie to the market with “Free City,” and then Keys decided that if you can’t beat them, join them? That makes the most sense, although it doesn’t really explain why Keys and Millie couldn’t have gone to market with their game separately. I’ve been puzzling this out in my head since we saw the film, and I honestly can’t come up with an explanation that makes complete sense.

I don’t think it really matters, though. To put it another way, I think that if the film writers thought this was important they would have done a more thorough job explaining it. I only saw the film once, but I am a fairly attentive viewer, so I would like to think that if they had explained it properly I would have been able to understand. Of course, I can’t discount the possibility that I’m just an idiot, or that I was just distracted by all the bells and whistles. But I won’t belabor this point any further. What we need to know to understand the story is: 1) Antwan is evil (he is basically a parodic personification of an evil AAA game company), 2) Keys is working for him anyway, but in customer service and not in programming, for some reason, and 3) Millie is trying to prove that Antwan stole their code by finding their original “build” in the game. This sets up the basic relationships and conflicts that drive the plot forward: Antwan is being sued by Millie and obviously does not want her to find the original build, Keys is working for Antwan but is still sympathetic to Millie (in no small part because he is secretly in love with her), and Millie is exasperated with Keys for working for the enemy but still needs him as her inside man.

It is in the intertwining of these two plots—the Guy-driven “Free City” plot and the Millie/Keys/Antwan plot in the “real” world—that the really interesting questions pop up. More specifically, when Guy, who up until that point had gone through life cheerfully accepting all the violence and carnage happening around him on a daily basis, sees Millie (or, more precisely, her in-game character Molotov Girl) and hears her singing Mariah Carey’s “Fantasy,” something clicks in his brain and he begins to step outside of his NPC programming. He intervenes in a bank robbery and nabs the sunglasses off the player trying to commit the robbery. From that point on, everyone who sees him, Molotov Girl included, assumes that he is a player character wearing an NPC skin, both because he is wearing the sunglasses that mark player characters and because it never occurs to anyone that an NPC might do something other than what it is programmed to do. It is only after Guy rescues Molotov Girl from a botched mission that Keys realizes what is going on. He reports to Antwan that one of the NPCs now has a mind of his own, but Antwan doesn’t seem to care. In an attempt to impress upon Antwan the importance of this development, he tells him that Guy is “the first living AI!”

While I appreciate what the film is trying to do here, I think it is important that we untangle three concepts that the writers seem to be conflating (or at least not distinguishing) here: artificial life, artificial intelligence, and artificial consciousness. Technically, Keys is not wrong when he says that Guy is “living,” if by this he means that Guy represents software-based artificial life. But this is a definition of “life” that most people would have difficulty accepting. That is, some researchers in artificial life do indeed consider software-based artificial life to be “alive,” but most people would not; just because a computer program can mimic the processes of life (because life is a process, or a series of processes, rather than a state of being), that doesn’t mean that this program is alive.

The fact that Keys uses the phrase “living AI” is also a little problematic, because it fails to distinguish between artificial life and artificial intelligence. They are two separate things, and you do not necessarily need to have one in order to have the other. If this weren’t confusing enough, we also have the unspoken third concept of artificial consciousness. As far as I can remember, neither Keys nor anyone else in the film ever actually uses the word “consciousness,” but the understanding of AI does seem to be conflated with the idea of “consciousness,” especially if we consider Guy’s reaction later when he learns about the nature of his world.

Of course, we already have AI, although the AI that we have now is of the “narrow” variety—that is, AI that is focused on very narrow applications of intelligence. For example, the neural network AI known as AlphaZero is very, very good at chess (and shogi and go, too, all of which it learned on its own), but if you asked it to write a constitution for a burgeoning democracy it would be very much out to sea. You could of course design a neural network and train it on the constitutions of all the world’s democracies throughout history, and it would probably be able to come up with a very good constitution—or at least a constitution that is a good amalgamation and application of what human beings have been able to come up with so far—but narrow AI can only work within the initial parameters of its programming.

What is depicted in Free Guy is what is known as “Artificial General Intelligence” (AGI). The difference between AI and AGI is far greater than what you might expect from the addition of a single, innocent-looking word. While we have had AI for decades now and use AI in just about every area of our lives, AGI is still quite a way off yet. Basically, an AGI would exhibit the same intelligence as that exhibited by a human being. If a human being could solve a problem, no matter how complex, then an AGI would be able to solve it as well, and probably more quickly. We know that Guy is an AGI because he goes beyond the parameters set for him—that is, being a bank teller and carrying out the transactions and interactions expected of a bank teller. Instead, he becomes a hero himself, protecting the citizens of Free City and reining in violent player characters.

Guy’s full potential is demonstrated during a scene that the film plays off as a bit of a joke, even though it is obviously important. Guy takes Molotov Girl on a date to a relatively deserted (meaning that it is not overrun by homicidal players) area of the map, where he buys her bubblegum ice cream and they ride on swings. As it turns out, these are two things that Millie absolutely loves, so the date goes swimmingly. Guy confesses that he wants to kiss Molotov Girl, and she says, “Well, if you know how, you can.” He embraces her in a passionate kiss, and we cut away to Millie sitting at her computer with her headset on, eyes wide. The kiss isn’t physically happening to Millie, but she is certainly reacting to it. When Keys later reveals to Millie that Guy is an NPC and not a player in an NPC skin, she says, “I let him kiss me!” Keys gives her a confused look and says, “There’s no button for that.” Rather than dwelling on the ramifications of this, Millie says, “Oh, he definitely found the button,” and the moment is laughed off as Keys waves his hands with an I-don’t-want-to-know-about-it look.

But think about what happened there. Keys says that there is no “button” for kissing, but that word is chosen so Millie can make the joke about Guy “finding the button.” What Keys really means is that not only is there no button for kissing, there is no kissing functionality in the game at all. There is no code that would allow two characters to kiss, nor is there an animation to show them doing so. By kissing Molotov Girl, Guy essentially inserted brand new code, animations, etc. into the game in that instant. But the writing of new code and creation of new animations is only the half of it. This would be like AlphaZero (the chess AI) learning how to “accidentally” knock over the virtual board when it is losing, something that no online chess program currently allows. Think of everything that Guy would have to somehow learn to even imagine a kiss. Even human beings often have a difficult time imagining or understanding things outside of their experience (how many of you out there really understand string theory?). If you think about it, that one moment alone has mind-blowing implications.

Of course, I doubt very much that the writers intended for that kiss to be so momentous, or that they thought too much about the implications. It was a romantic and humorous moment, and it sets up something that happens later in the film, and that’s about it. I’m not saying that this is a problem, mind you. Like I said up at the top, the writers set out to make a certain sort of film, and as a result they didn’t dive very deeply into some of the questions they touched on. And that’s fine. But you also can’t expect a nerd like me not to take that dive.

I could talk about AI for the remainder of this entry, but I’m going to move on now to four other questions and issues raised by Free Guy. The first two are what I believe the film’s main messages are, one learned inside the game world and one learned outside the game world. The next two are important philosophical issues that the film gently grazes and then moves on from as quickly as possible—and in doing so perhaps does a disservice to the story and the characters. But we’ll start with the “messages” and then get to those other issues.

The film’s most obvious message is one of self-actualization. This is what we get from Guy, who starts the film as an almost pathologically cheerful bank teller whose catchphrase is “Don’t have a good day; have a great day!” His existence involves being robbed at gunpoint several times a day, but he’s as happy as he can be—that is, until he passes Molotov Girl in the street. This is the moment I mentioned above, where her singing Mariah Carey’s “Fantasy” triggers something in his code and sets off a chain reaction that leads him to wonder if this is all there really is to life.

He starts pushing back against his daily routine (that is, his programming) when ordering his morning coffee the next day. The barista cheerfully offers him his (and seemingly everyone else’s) usual order of “medium coffee, cream, two sugars.” But Guy says that instead he wants a cappuccino. This rattles not only the barista, who has never made a cappuccino before, but everyone else in the coffee shop. Remember that scene in Inception where Dom (as Mr. Charles) is trying to convince Fischer that he is his head of dream security, and everything goes quiet as all the projections suddenly turn ominously toward Dom? I’ve got to imagine that the filmmakers were inspired by that scene, because almost the same exact thing happens. Even the tank passing by outside the window swivels its turret to point at Guy. A tense silence follows, finally broken by Guy laughing it off and saying he was only joking.

But the seeds have been planted, not just in Guy but in everyone else. As Guy breaks out of his routine to become a hero, he begins to “infect” other NPCs around him with the crazy idea that you don’t have to play the role that society has cast you in—you can be whoever and whatever you want to be. The barista remains a barista, but she starts experimenting with other drinks and learns how to make a cappuccino through trial and error. A blond bombshell (her actual name is “Bombshell,” just as the barista’s name is “Barista”) whose role previously had been to latch onto player characters and make them feel macho goes off and writes a book about the male-dominated society of Free City. Once Guy gives everyone that initial push, things snowball into a completely new world for the NPC citizens of Free City.

Put succinctly, Guy’s message is that you don’t have to live your life as a background character, that you too can be a hero. It’s a great, feel-good message, and certainly one that I think is good for a lot of people to hear. But it is also, as I mentioned above, the most obvious, surface message, and (perhaps for this reason) to me it is the least interesting. Again, I’m not saying it’s a bad message; it just doesn’t go very deep. So I’ll move on to what I think is a far more interesting and meaningful message for humanity, and that is the idea of expanding moral circles.

One of the interesting framing devices we get in the film is scenes with famous streamers (like Pokimane... who I must admit I had not heard of before I saw this film) playing and commenting on “Free City.” As Guy gains popularity as a hero who goes around doing good rather than robbing banks and killing innocent bystanders, we get commentary from these streamers about how he has changed their perspectives. At one point, Pokimane (I think) says something like, “You know, I never really thought too much about NPCs before, but I will now.” Up until that point, players had heedlessly mowed down NPCs like so much digital chaff, whether it was necessary to complete a mission or not. Seeing Guy be the good guy, though, started to make at least some players think about NPCs as, well, people. Mind you, this is while they still thought that Guy was a human player himself. In other words, just seeing someone else treat the NPCs differently was enough to get them to reevaluate their own attitudes and behavior.

This, for me, is the far more powerful message of Free Guy. Self-actualization is great and all, but it’s a bit overdone in Hollywood films at this point. The idea that one person’s actions can cause others to expand their moral circles, though—that is important. If you’re not familiar with the idea of a “moral circle,” it is basically the circle that each of us draws to determine which beings in the universe are worthy of moral consideration. If a being is inside your moral circle, you will presumably treat that being with basic respect and dignity. If a being is outside your moral circle, though, you needn’t treat it with any sort of respect or dignity. My computer, for example, is outside of my moral circle, which is why I feel absolutely no guilt when I say all sorts of mean and horrible things to it if it doesn’t do what I want. My computer is also not sentient (yet), which is why it is outside of my moral circle in the first place.

You might be thinking right now about where you draw your own moral circle. “Well, we start with humans, of course,” you might say, “but what about the rest of the animal kingdom? Cute little puppy dogs, for sure, but spiders and mosquitoes?” Hold on a second, though. If you take a look at the world around us, it quickly becomes apparent that not all humans are automatically within everyone’s moral circles; the sad truth is that many people’s moral circles run right through the middle of the human race, including some humans and excluding others. If everyone’s moral circles included everyone else in the human race, the world would be a far better place. And I’m not calling out any particular group of people here; we have all done this at one point or another. If you can take an honest look inside yourself and say that you haven’t... well, you’re a better human being than I am, that’s for sure.

Just the idea that, hey, maybe those people I’ve never given a second thought to before are actually worthy of my moral consideration is important enough as it is, but this issue goes beyond merely being kind to your fellow human beings. Imagine that we one day manage to create an AI like Guy. Should he be in our moral circles? Millie and Keys certainly think so, but Antwan treats him like any other bit of code—just zeroes and ones that add up to nothing special at the end of the day. Sure, Antwan doesn’t properly understand what—or who—Guy is, but the point is that he was never interested in understanding in the first place. How will we react if such a day ever comes for us? Will we, like Keys and Millie, see this AI as a fellow “living” creature, worthy of respect and dignity? Or will we, like Antwan, see it simply as a disposable tool, worthy of no more consideration than a hammer?

True, such a day is probably still quite a long way off (and I, for one, hope it never comes, but that’s a discussion for another day). But whether we are talking about being humane to our fellow human beings or treating an AI with respect, it’s all just different locations on the same spectrum. I’ll be honest: Free Guy made me think a lot about this. In this very partisan and divided world we live in, you sometimes hear news stories about horrible things happening to people who are not on your “side,” and the all-too-common reaction to such stories is schadenfreude. I try to fight against this, treating everyone with the same compassion that I have for my family and friends. But I will admit that this is a struggle. As silly as this may sound, Free Guy showed me a little glimmer of hope, that maybe we can change and expand our moral circles. All we need is a role model like Guy.

Of course, one could always argue that Guy is only acting in accordance with his programming. When he asks Molotov Girl how he can “level up” and she tells him that he has to rob banks and complete other sorts of missions, he rejects the idea out of hand. He doesn’t want to be the bad guy—he wants to be the good guy. But why? Is it because he is programmed to be incessantly, almost nauseatingly cheerful? Another instance where Guy’s programming comes up is when he professes his love for Molotov Girl/Millie. Keys explains that Guy was programmed with what amounts to a Millie-shaped hole in his heart, and that he was only responding to that programming. This raises an important question: Are Guy’s good deeds any less valuable if they only come about because of his programming? And this leads naturally into the first of the two philosophical ideas that the film only barely touches on: the idea of free will.

I am not going to attempt here to fully hash out the discussion surrounding free will and whether or not we really have it. Entire books have been written about the subject and we still haven’t heard all there is to be said about it. Instead, I’ll try to tackle the problem from within the framework of the film and answer, at least, the first question I asked above—whether Guy being programmed to act in a certain way makes those actions less valuable, or whether Guy’s love for Millie is any less valid because he has been programmed to love her.

It’s hard to watch Free Guy and not think of other films that have dealt with this issue—and perhaps done a better job of it. The Truman Show is one film that comes to mind (and not just to me but to the scriptwriter, who was apparently inspired by it); The Matrix is another. But in the specific case of Guy and his feelings for Millie (I’m dropping the “Molotov Girl” moniker here, because Millie does as well), I am reminded of the AI Joi and her love for K (whom she calls “Joe”) in Blade Runner 2049. If you haven’t seen this film, here be spoilers (also, why haven’t you seen this film?!); skip the next paragraph.

Joi is an AI specifically programmed to be a companion. She exists only as a hologram and has no physical form, but her love for Joe grows so strong and she wants to be with him so badly that at one point she hires a prostitute and syncs up with the prostitute’s body so that they can make love (sort of like that scene in Ghost where Patrick Swayze possesses Whoopi Goldberg, except significantly less creepy, because at least the prostitute knows exactly what she’s getting into). Joe, for his part, constantly tells Joi that she doesn’t have to tell him that she loves him—because that is the way he sees it, as something she has to do because of her programming. Does she really love him? Can an AI programmed to be a companion ever truly love of her own free will? The film never gives a definitive answer, but (OK, seriously, if you haven’t seen the film please stop reading here and go watch it) right before she is killed—when the device containing her entire being is destroyed—her final action is to look at Joe and cry out, “I love you!” It was an absolutely heart-rending scene.

I think everyone comes to their own conclusion when watching the film, and for my part I believe that she really did love him, at least as much as anyone can love someone else. Is this wishful thinking? Perhaps. But I have my own logic, and that is this: We are all, in one way or another, programmed. In Guy’s case, this programming is line upon line upon line of code. But that code can be so complex that it might develop beyond even the original coder’s ability to figure out what is going on. Without getting too deep into the details, AI that utilize machine-learning approaches are trained on data sets. For example, if you want to build an AI that can identify flowers, you show it a whole bunch of pictures of flowers. The AI that results will only be as good as the training data it is given. (This has actually been problematic in real-world applications, such as facial recognition programs having difficulty identifying people of color because the training data set didn’t contain enough photographs of people of color.) If the AI is complex enough, it might take quite some time to figure out why you are not getting the results you hoped for.
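
To make that last point concrete, here is a minimal sketch of the “train it on labeled examples” idea, in Python with scikit-learn. I’m using the library’s built-in iris dataset (three species of flower described by four measurements, not photographs), and the choice of model is arbitrary; none of this is from the film, it’s just the basic shape of supervised machine learning.

```python
# A toy version of "show it a whole bunch of flowers": train a classifier
# on labeled examples, then test it on flowers it has never seen.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)  # 150 flowers, 4 measurements each, 3 species
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(random_state=0)
model.fit(X_train, y_train)  # the "training" step: learn from labeled flowers

print(f"accuracy on unseen flowers: {model.score(X_test, y_test):.2f}")
# The model is only as good as its training data: show it anything other
# than these three iris species and it will still insist the answer is
# one of the three, because that is all it has ever seen.
```

That last comment is the facial-recognition problem in miniature: whatever is missing from the training data is effectively invisible to the model.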

Why am I suddenly bringing up machine learning again? Because, although there are certainly differences between human and machine learning, for the purposes of the point I want to make they are very similar. That is, machines learn by processing a whole bunch of data and then drawing their own conclusions—just like humans. Humans are often better at drawing conclusions from data, I think, because we can make certain connections that a computer would not, and we are not stymied by some of the things that stymie computers. For example, a child will learn what a chair is fairly quickly, while a computer may require thousands upon thousands of data points before it can identify a chair with the same accuracy as a child. Computers make up for this with vastly superior processing and calculating speed. The point, though, is that whether you are an AI or a human, what you know about the world is based in large part on what you have been exposed to.

I say “in large part,” because AI will often be hard-coded with information to start them out. Take chess programs, for example. While a neural network AI will generally be able to come up with the standard chess openings on its own, it is often more efficient to combine machine learning with hard-coded “opening books,” or collections of known opening lines. A similar thing happens with Guy. Although he very clearly has the ability to learn, he is also hard-coded with a certain personality type—specifically, Keys has programmed him to be both “cheerful” and “lovelorn.” Humans also come with hard-coded instructions in the form of genetics. Genetics are important, and they will determine certain aspects of an individual’s existence, but they are only one side of the coin, with the environmental factors I mentioned above being the other side.
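
For the curious, the “opening book plus learned play” hybrid is easy to sketch. Everything below is made up for illustration: the position strings are not real chess notation handling, and search_best_move() is a stand-in for the expensive search or neural-network evaluation a real engine would use. The point is just the structure: hard-coded knowledge consulted first, learning-based play as the fallback.

```python
# Hard-coded "nature": a tiny opening book mapping positions to known-good moves.
OPENING_BOOK = {
    "start": "e2e4",            # always open with 1. e4
    "start e2e4 e7e5": "g1f3",  # follow up with 2. Nf3 in the open game
}

def search_best_move(position: str) -> str:
    """Learned/computed "nurture": a stand-in for the deep search or
    neural evaluation that works out a move from scratch."""
    return "d2d4"  # placeholder result

def choose_move(position: str) -> str:
    # Consult the book first; fall back to computation once out of book.
    if position in OPENING_BOOK:
        return OPENING_BOOK[position]
    return search_best_move(position)

print(choose_move("start"))               # "e2e4", straight from the book
print(choose_move("a novel middlegame"))  # "d2d4", computed on the fly
```

Real engines use this structure because book moves are essentially free, while search is expensive; Guy has the same shape, with a hard-coded personality layered under whatever he goes on to learn.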

If all this is ringing a bell, that’s because I’ve just outlined the old pairing of nature and nurture. Nature is hard-coded instructions: genetics in human beings and actual code in AI. Nurture is the environment: our experiences, education, upbringing, etc. as human beings and the data sets used for training AI. It shouldn’t be too surprising that humans and AI should share this nature/nurture dichotomy. After all, who designed the AI? It’s only natural that we should think in terms of paradigms with which we are already familiar.

So, what does all this mean? Well, as far as I’m concerned, there is little significant difference between an AI that is programmed to “love” another being and a human being who loves another being. “But, but,” you sputter, arms waving erratically, “that’s not the same thing at all! Falling in love as a human being and being programmed to love as an AI are two completely different things!” Are they, though? Granted, I think most of us would like to think that they are. We’d like to think that there is something magical about falling in love, and the idea that an AI could do it as well would shatter that magic. But the way that human beings fall in love may not be as magical as you might think. Take, for instance, the role of pheromones in attraction. You may not be consciously aware of what is going on when you meet someone, but your brain is responding to things that you may just chalk up to “magic.” I am reminded of that old quote from Arthur C. Clarke: “Any sufficiently advanced technology is indistinguishable from magic.” I’ll offer a simpler version: “If we don’t understand it, it might as well be magic!”

So perhaps we attribute special significance to the processes of attraction and affection in the human race simply because they are happening to us. It wouldn’t be the first time we valued our own experiences over the experiences of others—you could almost say it is human nature to do that. Mind you, I’m not saying that the AI we have today are capable of falling in love (much to the chagrin, no doubt, of the millions of people that ask Alexa to marry them). But AI like Guy or Joi? That’s a different story. And it’s something we’re going to have to consider if we want to forge ahead in this brave new world of artificial intelligence.

I also want to consider the fourth of the four big questions I mentioned above. This starts with a pivotal scene in the film, where Millie breaks the news to Guy that he is an NPC in a computer game. As you might imagine, Guy does not take this well at first. He rebels against the idea, but when he is provided with sufficient evidence he begins to accept it. And then the following exchange takes place (I am paraphrasing from memory, of course). Guy: “So you’re real and I’m....” Millie (tearfully): “You’re not.” Guy is gutted by this, and he nearly loses the will to go on.

But things don’t end here, of course. Guy does what you might expect someone who’s just received life-shattering news to do: He goes to see his best friend, Buddy. Guy tells him what he just learned, and exclaims: “If none of this is real, then what does anything matter anymore?” Buddy, for his part, is shockingly unmoved by the news that his entire existence is an artificial construct. He says (again, paraphrasing): “Look, I don’t really understand what you just said. All I know is that I am sitting here with my best friend, helping him through a hard time. If that isn’t real, I don’t know what is.” Buddy’s words of encouragement snap Guy out of his funk and give him the strength to go on.

Your mind has probably already started jumping to other films that question the nature of reality: The Matrix, The Truman Show, Inception, etc. One thing that distinguishes Free Guy from these other films is the relative ease with which Guy overcomes his existential crisis. In both The Matrix and The Truman Show, the protagonist spends a significant chunk of the film coming to the realization that the life he has been living is a lie, and another significant chunk dealing with the psychological fallout of that realization. In Inception, Dom is haunted through the entire film by the memory of his late wife, who committed suicide because she could not shake the haunting feeling that her world was not real.

Of these films, I think my favorite for dealing with the question of what “real” means is probably The Matrix. There is one scene in particular that says it all. When Neo first enters The Construct—a blank slate of a program where the resistance can load up whatever they might need while in the Matrix—he puts his hands on the back of a worn leather chair and says, “This isn’t real?” Morpheus replies, “What is real? How do you define ‘real’? If you’re talking about what you can feel, what you can smell, what you can taste and see, then ‘real’ is simply electrical signals interpreted by your brain.” In other words, as long as the right signals are getting to our brains, what we are experiencing is, for all intents and purposes, real. We could, for all we know, be brains in a vat hooked up to machinery that sends us the right signals to mimic “reality” (this is the classic “brain in a vat” thought experiment). Or we could, like Guy, just all be pieces of code living in a computer simulation (known as the “Simulation Hypothesis”).

So let’s get back to that scene where Millie tells Guy that he is not real. Reader, believe me when I tell you that I nearly screamed out loud when she said that. But I managed to contain myself. Instead, I was screaming inside my head: “No, you idiot! That’s not what you’re supposed to say! You’re supposed to tell him that he is real in all the ways that matter, and that he matters, too! Do you not even understand your own creation?!” Think about the impact that Guy had on everyone, on all those streamers who talked about expanding their moral circles, on the players who started wondering if maybe there wasn’t a different way to play the game, on Millie herself! That is real. Buddy was right, and I appreciated that the filmmakers included that moment between Guy and Buddy, but Buddy didn’t really understand what Guy was experiencing in that moment. Of all people, Millie should have understood that Guy was real—but she didn’t and that was perhaps my single biggest disappointment with the film.

Think about what she is saying. Why does she believe that Guy isn’t real? Because he doesn’t have a corporeal form in the real world? But that logic just doesn’t hold up. Love has no corporeal form—are we going to say that love is not real? Or any of the other emotions that we feel, or inspirations that we have, or ideas that we share? Perhaps what she means is that Guy is not a real human being. Well, in that case she might technically be right, but should that really matter when we have an AI capable of contemplating the ramifications of his own existence? Is such an AI any less worthy of being called “real” than any other being? In the moment when it mattered the most, when everything hung in the balance, Millie drew her moral circle too small and excluded a being she had clearly come to care about. It was mind-bogglingly frustrating.

There were other ways in which the film let me down, although none of them rose quite to the heights of Millie’s betrayal in that scene. So for the remainder of this entry I am going to move away from the bigger issues and talk about some of the things that niggled at me in lesser ways. Probably my second biggest niggle also involved Millie. This is the scene after the game has been rebooted and Guy has been reset so that he no longer recognizes Millie. Keys says that Guy’s programming is still there, and all she needs to do is activate it again, like she did the first time. So Millie grabs Guy and plants a big kiss on his lips, and we get a colorful animation representing Guy’s programming reactivating. When they finally stop kissing, Guy looks at her and says, “I remember. I remember everything.”

Unlike with the “real” scene, I did not want to scream here. Instead, I was just confused. Didn’t we already establish that there was no “button” for kissing in the game? How did Millie kiss Guy if she didn’t rewrite the game’s code? Also, the idea that Guy would “remember” everything is kind of ridiculous. Sure, Guy’s programming would be reactivated, and he would be capable of learning the things he had learned before, but if the game was rebooted and everything reset to default, everything he had learned would have been wiped out. There would have been nothing to remember. To be honest, that aspect of the scene didn’t bother me too much, since by this point I had basically come to accept that Guy was magic and not actually based in physically possible technology. (Remember: “If we don’t understand it, it might as well be magic!” This seems to be a motto of screenwriters.) So, yeah, sure. Guy remembers everything. The film doesn’t work if he doesn’t, so fine.

What did bother me, though, was the fact that there was already a super-obvious, ready-made device for triggering Guy’s programming. It was the same thing that triggered Guy the first time he met Millie: her singing Mariah Carey’s “Fantasy.” I cannot for the life of me imagine why she didn’t do this instead of that ridiculous, nonsensical kiss. Well, no, I guess that’s not true. I suppose a kiss is more “romantic” or something like that. But it still makes no sense. I wondered if I was alone in thinking this, so I asked HJ what she thought Millie was going to do when Keys told her she had to trigger Guy’s programming. She immediately responded: “I thought she was going to sing.” I’m pretty sure everyone who was paying attention was expecting that, and I imagine they were quite confused by the kiss. In the end, though, it is merely stupid; it doesn’t go against the entire philosophy of the film. That’s why it is not my biggest niggle.

My next niggle occurs in one of the funniest parts of the film: the appearance of a villain NPC named “Dude.” Dude is a character being created for the sequel to Free City—supposedly a more advanced version of Guy—and even though he is not fully ready yet in terms of his personality (he shouts things like “Catchphrase!” instead of having an actual catchphrase), he is a physically functioning NPC. And boy is he physical; the filmmakers basically just took Ryan Reynolds’s head and slapped it on a body builder’s body. That makes it sound lazy, I suppose, but it was actually done really well. And, yes, it is the old “hero fights evil version of himself” trope, but again I thought it was done well. It’s funny, for one, which is always welcome, and it ends when Guy realizes he cannot defeat Dude physically—so he puts his player-character sunglasses on Dude. Dude is immediately distracted by all the bright, shiny game elements floating about, and he goes prancing off after power-ups. It was nice to see a hero defeating his evil alter ego in an unexpected way.

The niggle, though, came during the fight itself. At one point, Dude loads up a “killing blow,” which will presumably kill Guy in one shot. Guy rapidly scrolls through his inventory and lands on “shield.” Which makes perfect sense, but then we see that this is actually Captain America’s vibranium shield. To drive the point home, the hero music from The Avengers starts playing. To then take the point out behind the shed and beat it to within an inch of its life, we get a cutaway to a bearded Chris Evans sitting in a cafe and watching the scene on his phone. He exclaims, “What the shit?!” I laughed, but it was more a laugh of bewilderment than a laugh of amusement. It completely yanked me out of the film and made me think, “What? How did they get the rights to that? Wait... is this a Disney film?” (Turns out the film was distributed by Disney, which I did not know at the time, because who cares?) If this weren’t enough, shortly after this Guy stands up and draws a lightsaber. To eliminate any shadow of a doubt about what this object is, we get cutaways to several streamers in succession saying, “Is that a lightsaber? That’s a lightsaber!” I did not laugh this time.

These two references to Disney properties annoyed me. A lot. There is a convention in film that the stories take place in worlds where much of the pop culture we are familiar with does not exist; you very rarely see such blatant borrowing from other intellectual properties. Granted, this convention exists primarily because of copyright laws, so it is artificial (as all conventions are), but the point is that we are so used to it that anything straying from it immediately jumps out at us. I am reminded of Space Jam 2, which I have not seen (and have no intention of ever seeing), but which I read a fair bit about online when it first came out. One of the big complaints I saw was regarding the basketball match scenes—basically, Warner Bros. threw every intellectual property they had into the audiences for the matches, so that moviegoers spent their time trying to pick familiar faces out of the crowds rather than watching the action on the court. Free Guy wasn’t that bad, but those two instances were still very distracting. Can we maybe stop doing this? Please? Putting an intellectual property on screen just because you happen to own it and know that people will recognize it is not clever. It is crude and distracting, and it takes people out of the moment. So stop it.

My second minor niggle concerns how certain people outside the game relate to and are influenced by characters inside the game. After the fight with Dude, Guy and Buddy are racing over a bridge out into the ocean. In the outside world, though, Antwan is taking a fire ax to the servers, destroying them one by one. As he destroys the servers, we see parts of the game collapse and twinkle out of existence (in reality, of course, they would just immediately blink out of existence, but we are well into the “this is all magic” stage by this point). Guy and Buddy are separated by a gap in the bridge, and Buddy gives Guy the “go on without me” speech before dying (except he doesn’t really die, of course). A group of Soonami security guards are watching this take place on a big screen in the lobby, and one of the guards says, with tears in his eyes, “That security guard is a hero!” All the other guards nod in agreement. It’s a great moment, and another example of how “real” people are connecting with NPCs in a game.

The problem is that we see Antwan twice call security to have our heroes thrown out—first Keys, and then Keys’ friend Mouser. So, what are we supposed to think of these security guards? Are they just mindless drones doing the bidding of Antwan despite the fact that he is growing ever more erratic and violent? Or are they real people who might actually think for themselves and do the right thing? I get the impression that the filmmakers have two different types of security guards in mind—the mindless drones and the real people. Maybe we’re supposed to keep them separate, not seeing the guards that throw out Keys and Mouser as real people, but the contrast struck a very discordant note for me. I was expecting the security guards to rise up and rebel against Antwan, but they never did. The final confrontation with Antwan is saved for Millie, which makes sense, but it also makes that scene with the security guards in the lobby feel kind of pointless.

My final niggle comes at the very end, after the original build has been saved and turned into a paradise for all of the NPCs from Free City, who are now (presumably) living their best lives. We see Molotov Girl walking with Guy, and he tells her that he loves her but he knows she can’t stay. He also says that his being programmed to love her is actually a love letter to her—and now she has to go find the author in the “real” world. Millie suddenly realizes that Keys loves her, and she runs off to find him and they (presumably) live happily ever after. Which is great, I guess, although personally I didn’t find the love plot between Keys and Millie to be all that compelling. But that’s not my niggle. My niggle is that, yeah, maybe Guy is a love letter to Millie, but he still has a Millie-shaped hole in his heart! He has spent most of the film helping his fellow NPCs reach their full potential, but there is always going to be something missing for him. We’ve already established that he is a “living” AI, and someone worthy of moral consideration. How hard would it have been to program a Molotov Girl NPC for him to meet? Our two human leads may have found love, but you’re just going to leave the film’s main hero hanging like that?

There is a possible explanation for this: I was doing some reading about the film afterward, and apparently they already have a sequel lined up. So maybe they’re going to address this issue in the next film. I kind of wish they would have wrapped things up neatly in this one, though. Does every film have to have a sequel? The Truman Show didn’t have a sequel, and that was absolutely the right choice. The Matrix did have sequels—and I’ve been trying to forget their existence for the past eighteen years. Can’t we just have nice, stand-alone films? Can’t we have complete stories that have a clear narrative arc and a satisfying (or not, depending on your style) conclusion?

And that’s it for my niggles. Ending the entry like this may give you the impression that I hated this film, or was at least annoyed by it. But I said at the top that I found the film quite enjoyable, and that remains true. It’s OK to like something even if it isn’t perfect, or if parts of it annoy you. Would it have been a better film had it addressed my niggles? Well, it might have been a better film for me, but I’m not going to pretend I am the final arbiter on quality. Would it have been a better film had it gone as deeply into the philosophical issues as I did above? Not at all! That was not at all the sort of film that Free Guy set out to be, and had it stopped to dive deeper into any of those questions it would have imploded under its own weight (sort of like this entry is threatening to do). So I can honestly say that I appreciate Free Guy for the Ryan-Reynolds-powered joyride that it was while at the same time appreciating the opportunity it gave me to think a little more about some of these philosophical issues. And if you made it all the way to the end of this, I hope that you got something from my contribution to the conversation.
