
Explaining the Internet Through the Lifecycle of a YouTube Video


By Greg Bryan | Mar 14, 2024


Last week, we wrapped up a five-part podcast special that literally explains how the internet works.

This series describes precisely how data moves around the world, covering the basics of the internet, transport networks, data centers, the cloud, and WANs along the way.

At the end of each episode, I thought it would be fun to relate what we talked about to a real-world example: the lifecycle of a YouTube video.

To paint the full picture, I've compiled each piece of the story here. Keep reading to find out how a video goes from one camera to millions of screens around the world.

Step 1: Understand That the Internet Is a Network of Computers


The first step is for our content creator—we'll call him Gary—to film himself doing something, like demonstrating a guitar riff, skateboarding, or a particularly complicated origami.

The sights and sounds of this video are converted by Gary’s camera and microphone into 1s and 0s and saved to his computer.

After any edits are made, Gary uploads the video to his YouTube account. His computer breaks the video into packets, each with an IP address directing it to the server space YouTube has allocated for him in a local Google data center.
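To make the packetization idea concrete, here's a minimal Python sketch of what "breaking the video into packets" looks like conceptually. The chunk size and destination address are illustrative (203.0.113.10 is a documentation address); real uploads use TCP/IP with much smaller segments and far richer headers.

# Conceptual sketch of packetization -- illustrative only.
def packetize(data: bytes, dest_ip: str, chunk_size: int = 1400) -> list[dict]:
    """Split raw video bytes into numbered packets addressed to one destination."""
    packets = []
    for seq, offset in enumerate(range(0, len(data), chunk_size)):
        packets.append({
            "seq": seq,                # sequence number used for reassembly later
            "dest_ip": dest_ip,        # where this packet should be delivered
            "payload": data[offset:offset + chunk_size],
        })
    return packets

video_bytes = b"\x00" * 10_000                              # stand-in for Gary's edited video file
packets = packetize(video_bytes, dest_ip="203.0.113.10")    # hypothetical ingest server address
print(len(packets), "packets ready to send")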

Just to clarify, YouTube is owned by Google, and although Google’s real corporate name is Alphabet, we tend to refer to it as Google.

After Gary’s local internet service provider (ISP) connects with Google using its autonomous system number (ASN), the packets are on their way to Google’s servers.

Depending on where Gary lives, this journey could take the packets on a simple path between only a few providers, or a complex path through several different internet backbone providers’ networks.
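To picture "simple path versus complex path," here's a toy comparison of two hypothetical AS-level routes. The AS64xxx numbers come from the range reserved for documentation examples; AS15169 is Google's real ASN.

# Hypothetical AS-level paths -- not real routing data.
simple_path = ["AS64500 (Gary's ISP)", "AS15169 (Google)"]
complex_path = ["AS64500 (Gary's ISP)", "AS64501 (regional carrier)",
                "AS64502 (backbone provider)", "AS15169 (Google)"]

for name, path in [("simple", simple_path), ("complex", complex_path)]:
    print(f"{name} path: {' -> '.join(path)} ({len(path) - 1} inter-network hops)")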

Once it arrives at the data center, Gary’s video is indexed for search and again separated into packets to be spread around the YouTube network.

During this journey, the packets will get new IP addresses based on their destinations, and various internet providers will exchange traffic with each other through IP peering or transit agreements. The packets may wind up taking different paths before being reassembled into a video.

Viewers around the world can now watch it without the latency they would experience if the video had only been stored in Gary’s local data center.

Once a viewer sees the new video and hits play, it starts the journey back from the data center to their device.

The video is again split into packets from its home on a server in YouTube’s nearest data center, and each packet is given an IP address directing it to the viewer’s ISP and then on to the viewer’s device.

Once again, the packets will take a seemingly random journey, possibly across several different networks, before being reassembled on the viewer’s device!
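Continuing the packetize() sketch from the upload step above, reassembly on the viewer's device amounts to putting out-of-order packets back in sequence. This is a conceptual illustration, not how a real TCP stack does it.

import random

received = packets.copy()          # the packets from the earlier sketch
random.shuffle(received)           # simulate out-of-order arrival over different paths

# Reassemble by sorting on sequence number and concatenating the payloads.
reassembled = b"".join(p["payload"] for p in sorted(received, key=lambda p: p["seq"]))
assert reassembled == video_bytes  # the viewer ends up with the same bits Gary uploaded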

Step 2: Tunnel Deeper Into the Physical Transport Networks


Now let’s start over and watch the scenario unfold through a different lens.

Gary films himself doing something else, like demonstrating how to do a wheelie on a mountain bike, draw a dragon, or make the perfect soft-boiled egg. He edits the video and uploads it to YouTube.

The bits leave his house and travel through wires that are either buried or hanging on poles in his neighborhood until they get to the first aggregation point, the central office, somewhere in Gary’s town or even neighborhood.

There, the bits leave their ISP’s network and hop onto a regional middle mile provider’s network—in this case, maybe one traveling along a railroad line toward the nearest major metropolitan area.

Once the packets reach the big city, they are routed to an internet exchange point (IXP), located in a huge, windowless building the size of a warehouse, where dozens of carriers interconnect. Just down the street is another massive building owned by Google that houses YouTube servers.

Let’s assume that Gary is a pretty popular influencer, so subscribers are immediately alerted and start streaming the video.

Copies of the packets then go to YouTube’s servers all over the world, first across terrestrial links between IXPs, then hitting submarine cable stations and traveling across the sea, where they then do the whole journey in reverse to get to the fans downloading the videos. 

Step 3: Add Geography Into the Mix


At the end of our data center episode, we shift our focus to where the video actually goes.

In this example, let’s say the content creator is me, and I upload a video explaining the internet to my YouTube account.

I happen to live in western Loudoun County in Northern Virginia, which is conveniently down the street from Data Center Alley. Because I am so close, my incumbent telco broadband provider most certainly has fiber directly into a neutral data center, such as one of the many Equinix facilities.

If I lived farther away or had an alternative provider, my video might first go to a neutral data center to get passed onto a backbone provider’s network before finding Google.

However, my broadband provider likely has a cross connect or is in a meet-me room where they can pass my traffic directly to Google. Google is then likely to store my video in their own data center right down the street.

My video is now sitting on a server full of other YouTube videos in a rack somewhere down a long hallway inside one of Google’s two Loudoun data centers.  

From there, Google will determine where it needs to cache my video, based on the geographic distribution of my viewership, and likely send it around the world via the transport network backbone to its other data centers. 
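That placement decision can be imagined as a simple popularity check per region. The regions, numbers, and threshold below are invented for illustration; Google's actual cache-placement logic is far more sophisticated and not public.

# Hypothetical viewership counts by region -- illustrative numbers only.
views_by_region = {"us-east": 120_000, "europe-west": 45_000,
                   "asia-southeast": 30_000, "south-america": 900}

CACHE_THRESHOLD = 10_000   # assumed cutoff: replicate wherever demand justifies a local copy

cache_sites = [region for region, views in views_by_region.items() if views >= CACHE_THRESHOLD]
print("Replicate the video to:", cache_sites)   # viewers elsewhere fetch from a farther site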

Step 4: Explore the Path to the Cloud


Let's go back to Gary to explain his relationship with the cloud.

This time, Gary films 1.5 hours of footage for his 30-minute show. Rather than tie up local bandwidth, he stores this video on a paid service like Google Cloud.
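In practice, parking raw footage in the cloud is a single API call. Here's a minimal sketch using the google-cloud-storage Python client; the bucket and file names are hypothetical, and it assumes Google Cloud credentials are already configured locally.

from google.cloud import storage    # pip install google-cloud-storage

client = storage.Client()
bucket = client.bucket("garys-raw-footage")          # hypothetical bucket name
blob = bucket.blob("episode-42/raw-footage.mp4")     # hypothetical object path

blob.upload_from_filename("raw-footage.mp4")         # push the 1.5 hours of footage to the cloud
print(f"Uploaded to gs://{bucket.name}/{blob.name}")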

He uses a software-as-a-service tool like Final Cut Pro hosted in the cloud to do his editing, leaving the original footage on the cloud server and saving the edited version as a new file there.

During the editing process, Gary splices in some stock footage (content-as-a-service, really) from a subscription service with copyright-free images. He does the same for music, using a service where creators can select stock music with the proper license for their show type.

Finally, as in other scenarios, Gary sends his video off to be stored on Google’s servers and cached based on his viewership.

Step 5: Take a Walk on the Corporate Side


Our final episode in the series focuses on corporate networks, so I have to alter our use case a bit. (Because corporate employees don't watch YouTube during work hours, right?)

This time around, let’s imagine that HR has asked some would-be actors to stay late and record a short video about how to properly fill out a TPS report.

After editing together the amazing acting and appalling royalty-free corporate music, HR uploads the video to their cloud provider’s storage nearest to the HQ where this is all going down.

The video goes through their SD-WAN device in the office, which selects the best-performing path between that site and the nearest data center. At that data center, the WAN connects to their cloud provider’s network, and the video lands on a server dedicated to that company.
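Stripped of vendor features, that path-selection step is just a comparison of link health. A toy sketch with made-up link names and metrics:

# Hypothetical link measurements an SD-WAN appliance might collect.
paths = [
    {"name": "MPLS",        "latency_ms": 18, "loss_pct": 0.0},
    {"name": "broadband-1", "latency_ms": 35, "loss_pct": 0.2},
    {"name": "LTE-backup",  "latency_ms": 80, "loss_pct": 1.5},
]

# Pick the path with the lowest combined penalty, weighting packet loss heavily.
best = min(paths, key=lambda p: p["latency_ms"] + p["loss_pct"] * 100)
print("Send the upload over:", best["name"])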

That video then gets encrypted and sent across the WAN to all the various data centers around the country or the world.
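Conceptually, that encryption step is symmetric encryption with a shared key. Here's a minimal sketch using Python's cryptography library; in a real deployment, IPsec or TLS tunnels on the WAN usually handle this and keys come from a key-management system. The file name is hypothetical.

from cryptography.fernet import Fernet   # pip install cryptography

key = Fernet.generate_key()              # in practice, distributed via a key-management system
cipher = Fernet(key)

with open("tps-training.mp4", "rb") as f:            # hypothetical file name
    ciphertext = cipher.encrypt(f.read())            # encrypt before it leaves the HQ data center

# ...ciphertext crosses the WAN to each regional data center...

plaintext = cipher.decrypt(ciphertext)               # each receiving site decrypts its local copy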

After each local office downloads the video over the WAN from the nearest data center, it runs a training seminar for its employees, who then must take a quiz to make sure they were paying attention.

Remote workers watch a live stream of the training session, connected via some secure tunneling over the internet—protecting those TPS trade secrets. 


Greg Bryan

Greg is Senior Manager, Enterprise Research at TeleGeography. He's spent the last decade and a half at TeleGeography developing many of our pricing products and reports about enterprise networks. He is a frequent speaker at conferences about corporate wide area networks and enterprise telecom services. He also hosts our podcast, TeleGeography Explains the Internet.

Connect with Greg