Facebook Works To Increase Traffic Flow Without Additional Servers
Michael Harper for redOrbit.com – Your Universe Online
Last month, Facebook, in partnership with a handful of other tech giants, announced its intention to deliver Internet access to the farthest regions of the world. At the time, CEO Mark Zuckerberg posted a blog entry explaining the philosophy behind connecting the world, but said little about the technology that would make it happen.
Today Facebook has released a 70-page white paper explaining how the social networking giant has been able to scale its website to handle the massive traffic generated by its more than one billion users each day.
Together with Ericsson, MediaTek, Nokia, Opera, Qualcomm, and Samsung, Facebook launched the coalition Internet.org, a group that prides itself on openness. The group has said it is influenced by the Open Compute Project, another coalition that counts Facebook as a member. Through the Open Compute Project, Facebook has made the power and water usage of its data centers publicly available.
“Many of the technologies we outline in this paper have been released as open source software for the community to use and improve,” reads the online-released paper.
“At Facebook, we are building solutions to unprecedented scaling and connectivity challenges that other companies will start to experience as more people from diverse geographies, network connections and devices share more types of content and make new connections,” the paper continues.
Facebook has plenty of experience in making its website available to large groups of people. Thanks to tools it calls “HipHop for PHP” and the “HipHop Virtual Machine” (HHVM), the social giant claims it can handle 500 percent more traffic on the same number of servers.
Facebook also claims its users share some 4.75 billion items on the site every day, including comments, photos, status updates and more. Users click Facebook’s iconic “Like” button 4.5 billion times each day and send each other 10 million messages daily. All this activity requires some clever engineering when it comes to serving that data to users on desktops and mobile devices.
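To put those daily figures in perspective, a quick back-of-the-envelope calculation (not from the white paper) converts them into average per-second rates. This assumes traffic is spread evenly across the day, which real-world load never is; peak rates would be considerably higher.

```python
# Back-of-the-envelope conversion of Facebook's reported daily
# activity into average per-second rates. Assumes perfectly even
# traffic, which real-world load is not.
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

daily_figures = {
    "items shared": 4.75e9,
    "Like clicks": 4.5e9,
    "messages sent": 10e6,
}

for name, per_day in daily_figures.items():
    per_second = per_day / SECONDS_PER_DAY
    print(f"{name}: ~{per_second:,.0f} per second on average")
```

Even averaged out, the shares and likes alone work out to tens of thousands of write operations per second, which is the kind of sustained load the paper's scaling techniques are built to absorb.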
This kind of functionality is important for a project as large as the one Internet.org is looking to take on. As mentioned in the initial announcement, part of the challenge of connecting some five billion people to the Internet is making sure servers can not only run efficiently, but also scale to handle far more traffic than they currently do. The coalition also plans to find ways to make the Internet less data intensive so mobile users aren’t left paying hefty data fees.
Facebook also has these mobile users covered with a tool it has built called Air Traffic Control. As explained in the Internet.org white paper, this tool allows engineers at Facebook HQ in Menlo Park, California, to test the performance of its mobile app under a variety of simulated network conditions, such as low connectivity or heavy bandwidth loads. Using this data, the social network has been able to optimize the content it delivers and make it as efficient and streamlined as possible.
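The idea behind that kind of testing can be sketched with a simple model. The function, profile numbers and payload size below are all illustrative assumptions, not details from the white paper; Facebook's actual tool shapes real device traffic rather than computing estimates like this. The sketch just shows why the same payload that feels instant on Wi-Fi can take most of a minute on a 2G-like link.

```python
# Illustrative sketch of what a network-condition simulator measures:
# estimated time to deliver a payload over links with different
# bandwidth and latency. All numbers below are assumed for the example.

def transfer_time_seconds(payload_bytes, bandwidth_bps, latency_s, round_trips=1):
    """Rough delivery time: connection round trips plus serialization delay."""
    return round_trips * latency_s + (payload_bytes * 8) / bandwidth_bps

# Assumed link profiles: (bandwidth in bits/s, one-way latency in seconds)
profiles = {
    "2G-like": (50_000, 0.5),
    "3G-like": (1_000_000, 0.1),
    "Wi-Fi":   (20_000_000, 0.02),
}

payload = 300_000  # an assumed 300 KB page payload

for name, (bandwidth, latency) in profiles.items():
    t = transfer_time_seconds(payload, bandwidth, latency, round_trips=3)
    print(f"{name}: ~{t:.1f} s to deliver {payload // 1000} KB")
```

Under these assumptions the 2G-like profile needs roughly 50 seconds for a payload Wi-Fi delivers in a fraction of a second, which is exactly the gap that motivates trimming and compressing content for low-connectivity regions.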
Facebook also discusses the work of Ericsson and Qualcomm in the paper. The two companies are said to have deployed network technologies that make the wireless Internet not only more efficient but also robust enough to handle the additional load.