The team behind the open source InterPlanetary File System (IPFS) announced this week that a significantly faster version of the peer-to-peer hypermedia protocol is now generally available.
IPFS is a distributed file system that uses a global namespace to connect all computing devices. The fundamental difference between IPFS and other distributed file systems is its decentralized network of node operators, each holding a portion of the overall data, which creates a highly resilient system for storing and sharing files. Any node on the network can serve a file by its content address, and IT teams can locate and request content from any node using a distributed hash table (DHT).
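The core idea of serving files by content address can be sketched in a few lines: the address of a block is derived from a hash of its bytes, so any node holding the block can serve it and the requester can verify integrity by re-hashing. This is a minimal illustration of the principle only; real IPFS addresses are multihash-encoded CIDs, not bare SHA-256 hex digests, and lookups go through the DHT rather than a local dictionary.

```python
import hashlib

class ContentStore:
    """Toy content-addressed block store (illustrative, not IPFS's CID format)."""

    def __init__(self):
        self._blocks = {}

    def put(self, data: bytes) -> str:
        """Store a block and return its content address (hash of the bytes)."""
        address = hashlib.sha256(data).hexdigest()
        self._blocks[address] = data
        return address

    def get(self, address: str) -> bytes:
        """Fetch a block by address and verify it matches before returning."""
        data = self._blocks[address]
        assert hashlib.sha256(data).hexdigest() == address
        return data

store = ContentStore()
addr = store.put(b"hello ipfs")
# The address depends only on the content, so any node can serve it
# and any requester can check it received exactly what it asked for.
```

Because the address is a pure function of the content, it does not matter which node answers the request; that property is what lets IPFS treat every operator on the network as a potential source for the same file.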
Molly Mackinlay, project lead for IPFS and a senior product manager at Protocol Labs, which provides protocols, systems and tools to improve how the internet works, said the latest 0.5 update to IPFS significantly improves content routing performance and adds support for the Transport Layer Security (TLS) protocol.
The update also doubles the speed at which files can be added to the IPFS network, in addition to performance improvements made to the core file transfer mechanism.
Finally, the mechanism for looking up mutable links, known as IPNS, is also faster, and an experimental pubsub transport can speed up record distribution.
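The mutable-link idea behind those records can be sketched as follows: immutable blocks are addressed by hash, while a mutable record maps a stable name to the latest hash, with a sequence number so peers keep only the newest record. This is illustrative only; real IPNS records are cryptographically signed and distributed over the DHT (or, experimentally in 0.5, over pubsub), and the names and helpers below are invented for the sketch.

```python
import hashlib

blocks = {}    # content address -> immutable data
records = {}   # stable name -> (sequence number, latest content address)

def publish(name: str, data: bytes) -> str:
    """Store new content and point the stable name at it."""
    address = hashlib.sha256(data).hexdigest()
    blocks[address] = data
    seq = records.get(name, (0, None))[0] + 1  # newer record wins
    records[name] = (seq, address)
    return address

def resolve(name: str) -> bytes:
    """Follow the mutable link to the current content."""
    _, address = records[name]
    return blocks[address]

publish("my-site", b"version 1")
publish("my-site", b"version 2")
# resolve("my-site") now returns the newest content: the immutable
# blocks never change, only the record pointing at them does.
```

The sequence number is what makes fast, loosely consistent distribution (such as over pubsub) safe: a peer that receives records out of order simply discards any record older than the one it already holds.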
Mackinlay said organizations such as Microsoft and Netflix are adopting IPFS to drive a number of distributed computing use cases. Netflix, for example, has been experimenting with IPFS as a way to distribute container images that is more efficient than pulling them from a central repository.
IPFS won’t replace the need for traditional file systems so much as it creates an opportunity to employ a file system that makes distributed computing on a global scale more feasible. Microsoft, for example, uses IPFS as the content-addressed storage layer to power its take on a decentralized identity system. In fact, Mackinlay noted IPFS has now been deployed on more than 100,000 nodes around the globe.
It’s too early to say how broadly IPFS might be employed. Mackinlay said the community is still growing as IT organizations discover that IPFS makes tasks ranging from fetching data to watching a video more efficient. As is the case with most open source projects, the adoption of IPFS is more likely to be driven by developers from the bottom up rather than IT managers from the top down. Developers trying to find a way to more efficiently distribute container images could, in theory, employ IPFS without anyone else in the organization even knowing.
Regardless of how IPFS finds its way into the enterprise, the need for more efficient ways to distribute files and content has become apparent. While data may be more distributed than ever, the laws of physics have not been suspended. As such, IT teams need to find a way to leverage IT infrastructure resources that are physically located as close as possible to the point where content is consumed. On one level, of course, that’s the role of a traditional content delivery network (CDN). However, given the cost of employing CDNs to distribute massive amounts of data, it might not come as a surprise that the open source community is exploring a much less expensive alternative.