Fog Computing Presents a New Way of Providing Computing Power Where It’s Needed

As the world struggles to embrace cloud computing, a new trend known as “fog computing” could be taking shape: an approach designed to bring cloud computing and services to the edge of the network, closer to where they’re needed.

Instead of data having to move across the open internet to centralized cloud servers, fog allows services to be hosted at the network edge, or even on devices such as set-top boxes, access points, or anything else that is networked and has computing power.

In a recent Wall Street Journal article, technology columnist Christopher Mims notes that many companies are now looking into ways for smart devices and sensors from the “Internet of Things” to talk to one another rather than route information up to the cloud. This approach can help reduce service latency, provide predictability, improve the user’s experience and quality of service, and even make new services possible.

One of the “things” that could already benefit from fog computing is air travel, where a jetliner’s engine can generate half a terabyte of data during a single flight. Rather than send this raw data to a cloud, the plane could transmit it to a computing device on board to be analyzed in real time, and only send data outward when, perhaps, a part of an engine might need maintenance.
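To make that idea concrete, here is a minimal sketch in Python of the kind of on-board filtering described above, assuming invented field names and thresholds (a real system would use manufacturer limits): readings are examined on a device in the plane, and only the handful that suggest a maintenance issue ever leave the aircraft.

```python
# Hypothetical sketch of edge-side filtering: analyze engine sensor
# readings locally and forward only those that suggest maintenance.
# Field names and thresholds are illustrative, not real engine specs.

from dataclasses import dataclass
from typing import Iterable, List

@dataclass
class EngineReading:
    timestamp: float          # seconds since start of flight
    exhaust_temp_c: float     # exhaust gas temperature, Celsius
    vibration_mm_s: float     # vibration velocity, mm/s

# Illustrative limits only.
MAX_EXHAUST_TEMP_C = 650.0
MAX_VIBRATION_MM_S = 7.5

def filter_for_maintenance(readings: Iterable[EngineReading]) -> List[EngineReading]:
    """Return only the readings worth transmitting off the aircraft."""
    return [
        r for r in readings
        if r.exhaust_temp_c > MAX_EXHAUST_TEMP_C
        or r.vibration_mm_s > MAX_VIBRATION_MM_S
    ]

if __name__ == "__main__":
    flight_data = [
        EngineReading(0.0, 610.0, 3.2),
        EngineReading(1.0, 662.5, 3.4),   # over-temperature event
        EngineReading(2.0, 615.0, 8.1),   # excessive vibration
    ]
    # Only these few records would leave the plane, not the raw half-terabyte.
    for alert in filter_for_maintenance(flight_data):
        print(f"t={alert.timestamp}s temp={alert.exhaust_temp_c}C "
              f"vib={alert.vibration_mm_s}mm/s")
```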

And in office environments, services like AeroFS are already making use of companies’ underutilized internal LAN bandwidth to sync and share files faster than a service like Dropbox, which can be limited by external bandwidth.

AeroFS CEO and co-founder Yuri Sagalov wrote in a blog post, “Although few of us have 100mbit or 1Gbit Internet connectivity, most of us have 100mbit/1Gbit routers and switches in our offices and homes, and even our Wi-Fi connections are often faster than the available bandwidth of the ISP they ultimately connect to. This means that if you need to share data with someone a few floors above you, a few desks away from you, or in the office down the street, using the public cloud is usually the wrong solution.”
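A rough back-of-the-envelope calculation, using illustrative link speeds rather than figures from the article, shows why that matters: syncing a file directly over a gigabit office LAN can be an order of magnitude faster than round-tripping it through a cloud service over a typical ISP connection.

```python
# Back-of-the-envelope comparison of syncing a file over a local LAN
# versus round-tripping it through a public cloud service.
# All link speeds are illustrative; real throughput is lower due to overhead.

FILE_SIZE_GB = 2.0                    # hypothetical file to share

LAN_MBPS = 1000.0                     # gigabit office LAN
ISP_UP_MBPS = 20.0                    # sender's uplink to the ISP
ISP_DOWN_MBPS = 100.0                 # recipient's downlink

def transfer_seconds(size_gb: float, link_mbps: float) -> float:
    """Time to move size_gb gigabytes over a link of link_mbps megabits/s."""
    size_megabits = size_gb * 8 * 1000   # GB -> megabits (decimal units)
    return size_megabits / link_mbps

lan_time = transfer_seconds(FILE_SIZE_GB, LAN_MBPS)
cloud_time = (transfer_seconds(FILE_SIZE_GB, ISP_UP_MBPS)      # upload leg
              + transfer_seconds(FILE_SIZE_GB, ISP_DOWN_MBPS)) # download leg

print(f"Direct over LAN:     {lan_time:.0f} s")
print(f"Via a cloud service: {cloud_time:.0f} s")
```

With these assumed numbers, the direct LAN transfer takes about 16 seconds while the cloud round trip takes roughly 16 minutes, which is the gap Sagalov is pointing to.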

Fog computing can be thought of not only as a way of providing services more immediately, but also as a way of bypassing the wider internet, whose speeds are largely dependent on carriers.

To avoid network bottlenecks, Google and Facebook are among several companies looking into alternate means of Internet access, such as balloons and drones. But smaller organizations could create a fog out of whatever devices are already around them to establish closer, quicker connections to compute resources.

Cisco, which has invested $1 billion to be a pioneer in building next-generation “Internet of Everything” services, is specifically seeking out new ways to incorporate fog computing into service delivery.

While the hosting industry certainly doesn’t need any more marketing terms, fog computing seems like an appropriate term to describe a version of cloud computing that is closer to its users and even more distributed into a wispy vapor.

There will certainly still be a place for more centralized and aggregated cloud computing, but as sensors move into more things and data grows at an enormous rate, a new approach to hosting applications will be needed.

Fog computing, which could make inventive use of existing devices, may be the right approach to hosting an important new set of applications.

About the Author

David Hamilton is a Toronto-based technology journalist who has written for the National Post and other news outlets. He has covered the hosting industry internationally for the Web Host Industry Review with particular attention to innovative hosting solutions and the issues facing the industry. David is a graduate of Queen’s University and the Humber College School of Media Studies.
