Watching a Movie? Here’s What’s Going On Behind the Scenes

This is a blog about what’s happening on your computer while you’re watching a movie.

When I’m watching a movie and it starts buffering, I get anxious.

And then I start to wonder: what’s happening behind the scenes? What, exactly, is my computer doing right now?

I’ve been wondering this for years, and never found a good answer. Until now!

I wrote that the other day and showed it to some friends at work. They suggested adding more detail about what goes on inside the streaming protocol (HLS).

A few hours later, I had this:

I’m really excited about this blog! I’ve been wanting to write a technical blog for some time. I’ll be writing about what’s happening on your computer while you’re watching a movie.

Some of the topics that I have planned are:

How does Netflix choose which movie to stream?

How does Netflix keep track of where you paused?

How does Netflix know if your internet connection is too slow to watch a movie?

How does Netflix charge you when you pause and resume watching a movie?

Do all of the above, but with Amazon Prime Video instead.

The most striking thing about this subject is how much goes on inside your computer as you watch a movie.

When you’re watching a movie, it’s easy to forget that every frame is being decompressed and then repainted to your screen upwards of 24 times per second. That need for speed has pushed both hardware and software engineers to optimize every possible aspect of video rendering; anything less would leave you with a stuttery, unwatchable mess.
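To put some numbers on that, here’s a quick back-of-the-envelope sketch in Python. The 1080p resolution, 24-bit color, and 24 fps figures are illustrative assumptions, not measurements from any particular movie or player; the point is how tight the per-frame time budget is, and how enormous the data would be without compression.

```python
# Back-of-the-envelope numbers for video rendering.
# Resolution, color depth, and frame rate are illustrative assumptions.

FPS = 24                      # typical film frame rate
frame_budget_ms = 1000 / FPS  # time available to decode and paint one frame

width, height, bytes_per_pixel = 1920, 1080, 3  # 1080p, 24-bit color
raw_frame_bytes = width * height * bytes_per_pixel

# The bitrate you'd need if you shipped raw, uncompressed frames:
raw_mbps = raw_frame_bytes * 8 * FPS / 1_000_000

print(f"Per-frame budget: {frame_budget_ms:.1f} ms")        # ~41.7 ms
print(f"Raw frame size:   {raw_frame_bytes / 1e6:.1f} MB")  # ~6.2 MB
print(f"Raw bitrate:      {raw_mbps:,.0f} Mbps")            # ~1,194 Mbps
```

A typical compressed stream is about two orders of magnitude smaller than that raw figure, which is exactly why decoding it fast enough is such hard work.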

But just how do they manage this? How do engineers maximize performance while minimizing power consumption, so that playing a movie doesn’t completely drain your battery? The answer lies in two major components: hardware acceleration and adaptive video playback.
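The details vary by player, but the heart of adaptive playback can be sketched in a few lines: measure your recent network throughput, then pick the highest-bitrate rendition of the video that safely fits under it. The bitrate ladder and the 0.8 safety margin below are made-up illustrative values, not any real service’s numbers.

```python
# A minimal sketch of adaptive bitrate selection, assuming a hypothetical
# ladder of renditions. Real players also weigh buffer level, screen size, etc.

RENDITIONS_KBPS = [235, 560, 1050, 2350, 4300, 9000]  # hypothetical ladder

def choose_rendition(measured_kbps: float, safety: float = 0.8) -> int:
    """Pick the highest rendition that fits within a fraction of throughput."""
    budget = measured_kbps * safety
    viable = [r for r in RENDITIONS_KBPS if r <= budget]
    # Fall back to the lowest rendition rather than stalling playback.
    return max(viable) if viable else min(RENDITIONS_KBPS)

print(choose_rendition(16_000))  # healthy 16 Mbps link -> 9000 kbps
print(choose_rendition(3_000))   # congested link       -> 2350 kbps
```

The safety margin is the interesting design choice here: throughput estimates are noisy, so players deliberately undershoot to keep the buffer from draining.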

Hollywood movies have entire teams of people working on the computer graphics, animation, and other special effects. The end result is visually stunning, but you rarely think about what’s going on in the background to make it happen.

This amazing blog post by Cameron Kaiser, a software engineer at Wolfram Research, gives us a behind-the-scenes look at what happens when you watch a movie. It shows how much data each frame of a movie takes up, and how that data is divided between the video and audio streams.

Looking at some of his calculations, it seems there’s plenty of headroom left for streaming video: the HD movie clip he uses in his example averages only 9 megabits per second, while the average broadband internet speed in the US is around 16 Mbps.
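It’s worth checking what those numbers mean in practice. Assuming a two-hour runtime (my assumption, not a figure from his post):

```python
# Sanity-checking the bitrate headroom. The two-hour runtime is an assumption.
bitrate_mbps = 9           # the clip's average bitrate, from the post
broadband_mbps = 16        # the quoted average US broadband speed
runtime_s = 2 * 60 * 60    # assumed two-hour movie

total_gb = bitrate_mbps * runtime_s / 8 / 1000  # megabits -> gigabytes
print(f"Two hours at {bitrate_mbps} Mbps is about {total_gb:.1f} GB")   # ~8.1 GB
print(f"Headroom: {broadband_mbps / bitrate_mbps:.1f}x the bitrate")    # ~1.8x
```

That roughly 1.8x headroom is what lets a player keep its buffer full even when throughput dips for a moment.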

Shannon O’Donnell, an M.I.T.-trained mathematician who is now a film editor, wrote a separate post after watching the movie “Elysium” with her husband, Chris. She was fascinated by the scene in which Matt Damon’s character, Max Da Costa, is introduced to the exoskeleton suit he will later use to break into his former employer’s space station headquarters.

What Ms. O’Donnell noticed was how much happens on screen before Max puts on the suit, and how quickly crucial information is conveyed to the audience: that he has a serious spinal injury, for example, and that his ex-boss’s bodyguard is a robot named Kruger.

Ms. O’Donnell analyzed how much screen time passed before each new plot point was delivered and found that, overall, it took just 1 minute and 47 seconds before the action came to a halt so the story could be explained to moviegoers who had not seen any of the trailers for “Elysium.” By then, she wrote: “The viewer knows:

Who this guy is (Max)

What he wants (to get into Elysium)

What stands in his way.”

Screens are the most common way we interact with computers, tablets, and smartphones. In fact, the screen has been so fundamental to our interaction that you could say there’s a screen paradigm: the idea that all interaction with a computer happens through displays of text and graphics on a screen, manipulated through some kind of pointing device (mouse, touchpad, touchscreen).

But it wasn’t always like this! In fact, the first computers had no screens at all, and the screen has only become more prominent as time has gone on. Here I’ll go over how things came to be this way.

Hardware

The most prominent piece of hardware used to make a movie is the camera. At its most basic, a camera is just an enclosed box with a lens on one side and film on the other. The lens focuses light from the scene onto the film, and when you develop the film, you’re left with an image.

This is a modern-day descendant of the camera obscura, Latin for “dark chamber”: an enclosed box or room that projects an image of the outside world onto a surface so that you can trace it. (A related drawing aid, the camera lucida or “light chamber,” was patented by William Hyde Wollaston in 1806; it used a prism rather than an enclosed box.)

But how do we get from projecting images onto paper to fixing them on celluloid? To answer that question, we need to learn about lenses.
