YouTube Case Study


System Specifications
  • Bandwidth (Why High?): Streaming video is among the most bandwidth-intensive operations online. Delivering countless minutes of video to users across the globe requires enormous bandwidth to keep playback seamless.
  • CPU (Why High?): Transcoding uploaded videos into formats suited to individual monitors and mobile devices demands large amounts of computing power.
  • Disk (Why High?): YouTube hosts a virtually unlimited supply of homemade movies, music videos, clips, and other footage, and storing it all takes an enormous, continually growing amount of disk space.
  • RAM (Why High?): Streaming video to viewers requires a great deal of RAM, especially when viewers expect to watch a complete video without long waits for buffering or conversion.
  • Scalability (Why High?): YouTube has deployed massive resources in its server architecture and hosting solutions so that it can scale smoothly with heavy, fluctuating traffic.

Overview
Three former PayPal employees, Chad Hurley, Steve Chen, and Jawed Karim, founded YouTube in February 2005. Located in San Bruno, California, YouTube is the undisputed leader in online video. Users worldwide gather to share and watch original videos through YouTube’s simple interface and through their own websites, mobile devices, blogs, email, etc. Using Adobe Flash Video technology, YouTube is host to a wide variety of user-generated video content, including movie clips, TV clips, and music videos, as well as amateur content such as video blogging and short original videos.

Originally funded by an $11.5 million investment from Sequoia Capital, YouTube was purchased by Google, Inc. in November 2006 for $1.65 billion.

In May of 2005, six months before the scheduled launch, YouTube offered the public a beta test of the site. Almost a year later, the company reported that more than 65,000 new videos were being uploaded every day, and that the site was receiving 100 million video views per day.

Currently, its market share is around 43%, with more than 14 billion videos viewed in May 2010, according to data published by comScore. Each minute, approximately 24 hours’ worth of new video is uploaded, and 75% of that material comes from outside the United States. As for bandwidth, it is estimated that in 2007 YouTube consumed as much as the entire Internet did in 2000. YouTube ranks behind only Google and Facebook as the third most visited site on the web.

In 2008, YouTube reached an agreement with MGM, Lions Gate Entertainment, and CBS that allowed the studios to post full-length films and television episodes, interspersed with advertisements, in a section for U.S. viewers called “Shows.” The move was intended to compete in the wake of Hulu’s success, and in 2009 YouTube launched a version of “Shows” for UK viewers offering around 4,000 full-length programs from more than 60 partners. In January 2010, YouTube also introduced an online film rental service, currently available only to users in the United States.

YouTube launched a new website design on March 31, 2010, with the purpose of increasing user dwell time and, in the words of Google product manager Shiva Rajaraman, to “step back and remove the confusion.”

YouTube’s audience is currently described as “nearly double the prime-time audience of all three major US television networks combined.” In 2008, the site received a George Foster Peabody Award for serving as a “Speakers’ Corner” that both embodied and promoted democracy.

YouTube has roughly 1.8 million servers supporting its data and processes, its user interface is available in 34 languages, and its bandwidth usage is approximately 25 petabytes.

YouTube’s Data Center Strategy
YouTube’s founders started out on managed hosting, getting by on credit cards, but they soon ran into scaling difficulties as well as limits on control of their hardware and networking, so they graduated to a colocation arrangement. The new arrangement allowed them to customize everything and negotiate their own contracts.

With five or six data centers plus a CDN, videos could be stored in and served from any data center, while the most popular content lived in the CDN.

Platform
YouTube uses Apache, Python, Linux (SuSE), MySQL, Psyco (a dynamic Python-to-C compiler), and lighttpd, which serves video in place of Apache.
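To illustrate how Psyco was typically enabled in a Python 2 service, here is a minimal sketch; the hot_path function is a hypothetical stand-in for CPU-bound code, not anything from YouTube’s codebase:

    # Enable Psyco if it is installed; otherwise fall back to the
    # plain interpreter. psyco.full() specializes Python functions
    # into machine code at runtime.
    try:
        import psyco
        psyco.full()
    except ImportError:
        pass  # Psyco unavailable: run unoptimized

    def hot_path(values):
        # Tight numeric loops like this benefit most from Psyco.
        total = 0
        for v in values:
            total += v * v
        return total

    print(hot_path(range(1000)))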

Databases
In YouTube’s early years, MySQL stored metadata such as users, tags, and descriptions, served off a monolithic RAID 10 volume with 10 disks. The database evolved from a single server to a single master with multiple read slaves, then to partitioning, before settling on a sharding approach. The read-slave stage introduced replica lag: because the master is multi-threaded and the slaves are not, slaves can fall significantly behind the master.
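A minimal sketch of that single-master, read-replica pattern makes the source of the lag visible (class and method names here are illustrative assumptions, not YouTube’s code): writes funnel through one multi-threaded master, while each single-threaded slave replays them at its own pace.

    import random

    class ReplicatedDB(object):
        """Single master for writes, several read slaves that may lag."""

        def __init__(self, master, replicas):
            self.master = master      # multi-threaded; takes every write
            self.replicas = replicas  # single-threaded appliers; can fall behind

        def write(self, sql, params=()):
            return self.master.execute(sql, params)

        def read(self, sql, params=()):
            # Reads are spread across replicas; results may be slightly stale.
            return random.choice(self.replicas).execute(sql, params)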

By dividing the data into two clusters, a video-watch pool and a general cluster, YouTube was able to prioritize its traffic, giving the most popular function (watching videos) the most resources. Because social networking on YouTube is less important, that traffic was routed to the general cluster, which was inherently less capable.
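In pseudocode terms, the split might look like the following sketch (the pool names and query labels are assumptions for illustration, not YouTube’s actual routing logic):

    # Route database traffic by priority: video-watch queries get the
    # well-provisioned watch pool, everything else the general cluster.
    WATCH_QUERIES = set(["get_video_metadata", "get_stream_location"])

    def route(query_name, watch_pool, general_pool):
        if query_name in WATCH_QUERIES:
            return watch_pool    # top-priority function gets the most resources
        return general_pool      # social features tolerate less capacity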

Now YouTube can scale its database almost arbitrarily: it has reduced replica lag to zero, cut its hardware needs by 30%, and improved cache locality through database partitioning.
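Sharding is what makes the near-arbitrary scaling possible: all of a user’s rows live on exactly one shard, so each query touches a single small database whose working set stays in cache. A minimal sketch of such shard routing (the hashing scheme is an assumption for illustration):

    def shard_for(user_id, shards):
        # All of a user's data lives on one shard, so lookups hit a
        # single small database and its cache stays hot for that user.
        return shards[hash(user_id) % len(shards)]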

YouTube’s Video Serving Architecture
YouTube’s main costs are bandwidth, hardware, and power consumption, largely because each video is served by more than one machine as part of a mini-cluster. Clustering provides both speed and redundancy.

Also, because the most popular content is moved to a CDN (content delivery network), it is replicated in multiple locations, increasing the chance that a copy sits close to the user and can be fetched quickly.

Less popular content, roughly 1-20 views per day, is served from YouTube’s own servers in various colocation sites.
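Putting the two tiers together, the serving decision can be sketched as follows (the threshold, hostnames, and helper are assumptions for illustration; the actual placement logic is not public):

    POPULARITY_THRESHOLD = 20  # approx. views/day dividing CDN from colo tiers

    def serving_host(video_id, views_per_day):
        if views_per_day > POPULARITY_THRESHOLD:
            # Popular: replicated across CDN edge locations near viewers.
            return "cdn.example.com"
        # Long tail (roughly 1-20 views/day): one of a handful of
        # colo-site mini-clusters, echoing the five or six data centers.
        return "colo%d.example.com" % (hash(video_id) % 6)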

Sources

Allen, Katie. “YouTube launches UK TV section with more than 60 partners.” The Guardian. http://www.guardian.co.uk/media/2009/nov/19/youtube-uk-full-length-shows (accessed September 4, 2010).

Amazon.com. “Youtube.com Site Info.” Alexa. http://www.alexa.com/siteinfo/youtube.com (accessed September 3, 2010).

Carter, Lewis. “Web could collapse as video demand soars.” The Telegraph. http://www.telegraph.co.uk/news/uknews/1584230/Web-could-collapse-as-video-demand-soars.html (accessed September 4, 2010).

Chapman, Glenn. “YouTube redesigns website to keep viewers captivated.” AFP.com. http://www.google.com/hostednews/afp/article/ALeqM5jfGfKKsiwbxNv8XoUbm8ZlRZZWyw (accessed September 4, 2010).

“comScore Releases May 2010 U.S. Online Video Rankings.” comScore, Inc. http://www.comscore.com/Press_Events/Press_Releases/2010/6/comScore_Releases_May_2010_U.S._Online_Video_Rankings (accessed September 4, 2010).

Graham, Jefferson. “Video websites pop up, invite postings.” USA Today. http://www.usatoday.com/tech/news/techinnovations/2005-11-21-video-websites_x.htm (accessed September 3, 2010).

Helft, Miguel. “Venture Firm Shares a YouTube Jackpot.” The New York Times. http://www.nytimes.com/2006/10/10/technology/10payday.html (accessed September 3, 2010).

Ho, Rodney. “Peabody honors CNN, TMC.” The Atlanta Journal-Constitution. http://www.ajc.com/services/content/printedition/2009/04/02/peabody0402.html (accessed September 4, 2010).

Hoff, Todd. “YouTube Architecture.” High Scalability. http://highscalability.com/youtube-architecture (accessed September 3, 2010).

Google. “Timeline.” YouTube. http://www.youtube.com/t/press_timeline (accessed September 3, 2010).

Richmond, Shane. “YouTube user uploading two days of video every minute.” The Telegraph. http://www.telegraph.co.uk/technology/google/8536634/YouTube-users-uploading-two-days-of-video-every-minute.html (accessed September 4, 2010).

Shiels, Maggie. “YouTube turns to movie rental business.” BBC. http://news.bbc.co.uk/2/hi/8471635.stm (accessed September 4, 2010).

Stone, Brad. “MGM to Post Full Films on YouTube.” The New York Times. http://www.nytimes.com/2008/11/10/business/media/10mgm.html?ref=technology (accessed September 4, 2010).

Weil, Nancy. “It’s Official: Google Buys YouTube.” PCWorld. http://www.pcworld.com/article/127436/its_official_google_buys_youtube.html (accessed September 3, 2010).

“Youtube Office in San Bruno California.” Fresh Pics. http://freshpics.blogspot.com/2009/08/youtube-office-in-san-bruno-california.html (accessed September 3, 2010).
 




