
Studying brain waves while watching trailers can help predict a film’s success


As an industry built around making money by guessing what the general population wants to watch, Hollywood has been trying to get into our minds for years.

An experiment at Northwestern University’s Kellogg School of Management may have cracked the best way to do this: by literally reading moviegoers’ brain waves.


In a study published in the Journal of Consumer Research, 122 moviegoers had their brains monitored with electroencephalogram (EEG) technology, in which electrical activity in the brain is read using electrodes attached to the scalp. While wired up, participants watched trailers for various movies as scientists tracked their engagement levels.

What the experiment showed is that the more engaged people were watching a trailer, the more money the resulting film tended to make at the box office.

The superhero movie X-Men: Days of Future Past earned the highest “neural similarity” score and, of the movie trailers shown, also grossed the most money in theaters. The film Mr. Peabody & Sherman, meanwhile, scored the lowest and, in the real world, earned only a fraction of the X-Men box office.

As neuroscience and business professor Moran Cerf told Digital Trends, the idea of measuring engagement in content in a person’s brain has historically been challenging because engagement can be measured in multiple ways.

“Our contribution was in the usage of multiple brains simultaneously to measure the content,” Cerf said. “Instead of looking at one brain and trying to interpret it, we said that the one thing that is common to engaging content is that it, well, engages the brain. And more so, it does that in a way that takes over our brains regardless of who we are and what state we are in. Our measure was testing how similar people’s brains are when they watched the content. What we saw is that the more similar the brains are when they view specific content, the more it is later remembered, the more people say it was relevant, engaging, that time passed rapidly for them, and the more they are interested in it.”
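The study’s exact computation isn’t spelled out here, but the idea Cerf describes, scoring content by how alike different viewers’ brain responses are, is commonly approximated by inter-subject correlation: average the pairwise correlation of each viewer’s EEG trace recorded during the same clip. A minimal sketch of that idea (the function name, array shapes, and toy data below are illustrative assumptions, not the paper’s method):

```python
import numpy as np

def neural_similarity(signals: np.ndarray) -> float:
    """Average pairwise Pearson correlation across viewers.

    signals: array of shape (n_viewers, n_samples), one EEG trace
    per viewer, recorded while they watch the same trailer.
    """
    n = signals.shape[0]
    corr = np.corrcoef(signals)        # n x n correlation matrix
    upper = np.triu_indices(n, k=1)    # each distinct viewer pair once
    return float(corr[upper].mean())

# Toy illustration: an "engaging" trailer drives a shared response
# in every viewer; a "dull" one leaves each brain doing its own thing.
rng = rng = np.random.default_rng(0)
shared = rng.standard_normal(1000)
engaged = shared + 0.5 * rng.standard_normal((5, 1000))
disengaged = rng.standard_normal((5, 1000))
print(neural_similarity(engaged) > neural_similarity(disengaged))
```

The score is higher when the same stimulus-driven signal dominates every viewer’s recording, which matches the quote’s claim that engaging content “takes over our brains regardless of who we are.”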

Of course, presumably no one in Hollywood has ever set out to make an unengaging trailer, but the work offers some fascinating insights. For instance, the neural similarity method made it possible to pinpoint peak moments of engagement, and the investigators found that trailers whose peaks fall within the first 16-21 seconds correspond to the movies with the greatest ticket sales when they arrive in theaters.

Ultimately, it’s unlikely that theaters are going to start handing out EEG readers alongside popcorn anytime soon, but Cerf noted that similar methodology has a wide range of use cases.

“This technique now can be used for any type of content, to measure engagement in various types of modalities,” he said. “Since all we measure is similarity between brains, it actually doesn’t matter what the content is the brain processes.”

Luke Dormehl
I'm a UK-based tech writer covering Cool Tech at Digital Trends. I've also written for Fast Company, Wired, the Guardian…