
Improving the Internet with IPFS – Part 1

This is a three-part blog series written with the aim of explaining IPFS and the need for it. In this part, we start with the basics and try to understand the problems our current internet paradigm faces. In the next blog, I explain how IPFS tries to tackle these problems, and in the final blog we talk about the various applications of IPFS and how you can use it and contribute to it.

IPFS stands for “InterPlanetary File System”. The concept of IPFS is just as cool as its name. Dubbed the “Base for the Internet of tomorrow”, IPFS aims for a new paradigm where the world wide web is decentralized as opposed to today’s centralized web. To understand IPFS and the purpose for which it is being developed, we first need to know the shortcomings of the current structure the Internet is based on.

Staggering Data-

By 2005, humanity had gathered around 130 exabytes of data. If all of it were to be printed in books, we would need to plant all of the available land on Earth with trees and then convert it all to paper. By 2015 this figure was 7,900 exabytes, and predictions say that the data on the internet will rise to 175 zettabytes by 2025.

To visualize these numbers, here are two examples from IDC-

  • If you stored 175ZB on DVDs, the stack of DVDs would be long enough to circle the Earth 222 times.
  • If you attempted to download 175ZB at the average internet connection speed, it would take you 1.8 billion years!
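To see where a figure like 1.8 billion years comes from, here is a back-of-the-envelope calculation. The ~25 Mbit/s average connection speed below is an assumption on my part (roughly in line with global averages at the time of the IDC report); IDC's exact figure is not stated here.

```python
# Back-of-the-envelope: how long would 175 ZB take to download?
# Assumption: ~25 Mbit/s average connection speed (illustrative).

total_bytes = 175e21                  # 175 zettabytes, in decimal bytes
speed_bytes_per_sec = 25e6 / 8        # 25 Mbit/s -> bytes per second
seconds = total_bytes / speed_bytes_per_sec
years = seconds / (365.25 * 24 * 3600)
print(f"{years / 1e9:.1f} billion years")  # -> 1.8 billion years
```

With those assumptions the arithmetic lands almost exactly on IDC's 1.8-billion-year figure, which suggests a similar average speed was behind their estimate.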

Hence, managing data at this scale is no easy feat, and the Internet was not designed to handle such huge loads anyway. (Remember that the internet was first designed as a military project.)

Inefficient and Expensive-

HTTP downloads a file from a single computer at a time instead of fetching pieces from multiple computers simultaneously. As a result, when many users request the same data, the serving machine becomes congested and speeds drop. With the number of Internet users growing rapidly every day, this is already becoming a severe problem.
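The congestion effect can be sketched with a toy model: a single origin server whose fixed upload bandwidth is shared evenly among its clients. The numbers below are illustrative assumptions, not measurements:

```python
# Toy model: one HTTP origin server, fixed upload capacity shared by N clients.
# All figures are illustrative assumptions.

SERVER_BANDWIDTH_MBPS = 1000   # assumed 1 Gbit/s origin uplink
FILE_SIZE_MBIT = 800           # a 100 MB file is 800 megabits

def download_seconds(num_clients: int) -> float:
    """Time each client waits when the uplink is split evenly."""
    per_client_mbps = SERVER_BANDWIDTH_MBPS / num_clients
    return FILE_SIZE_MBIT / per_client_mbps

for n in (1, 10, 100, 1000):
    print(f"{n:>5} clients -> {download_seconds(n):>8.1f} s each")
```

Every extra client slows everyone down linearly, whereas in a peer-to-peer design each new downloader can also become an additional source of the same pieces.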

The web also holds a lot of copies of the same data. As we have previously seen, data handling at this scale already causes problems, and duplicate data just makes it worse. Another inefficiency of the web is that it does not preserve our history well: according to Mike Ashenfelder, the average lifespan of a webpage is just 100 days before it is buried under newer content.
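One way to see why duplicates are even detectable is that identical content always produces the identical hash digest, so byte-for-byte copies can be recognized and stored once. A minimal sketch using Python's standard hashlib (SHA-256 here is just an illustration; it is not a description of IPFS's own identifier format):

```python
import hashlib

def content_id(data: bytes) -> str:
    """Derive an identifier from the content itself, not its location."""
    return hashlib.sha256(data).hexdigest()

a = content_id(b"the same cat video")
b = content_id(b"the same cat video")   # a byte-for-byte duplicate
c = content_id(b"a different cat video")

print(a == b)  # True: duplicates share one ID, so one stored copy suffices
print(a == c)  # False: different content gets a different ID
```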

Centralization- 

Today’s web is centralized, which means a central authority or authorities have control over it. The internet is an impartial tool which has boosted development and innovation in all fields of life – but increasing consolidation of control threatens that progress and limits opportunities.

Because the current structure of the internet is centralized, it has single points of failure. These can be abused to disable access to the Internet for a large region, or even an entire country. This is precisely what happened in Egypt during the protests of 2011, when the government shut down the country's internet access in an attempt to quell the uprising. Likewise, during natural disasters or other emergencies, the centralization of the web can hamper rescue operations if the local servers are down.

These are massive problems and will bottleneck progress if left unchecked. Thankfully, IPFS might work as an almost ideal solution to these problems. In the next blog, we will see the solutions that IPFS has to offer. Stay Tuned!