#1 Posted : 30 August 2019 13:03:41(UTC)

Rank: Newbie

Groups: Registered
Joined: 08/02/2019(UTC)
Posts: 0

As companies start using IoT more and more, edge computing is helping them gain more real-time analytics for faster decision-making.

In enterprise IoT architectures, cloud computing can play a significant role in organizing, storing and analyzing the variety of data collected from IoT sensors. However, as companies start to use IoT for a broader array of purposes, including the management of mission-critical or industrial applications, there will be a growing need for real-time analytics and faster decision-making to occur directly on the IoT device, or somewhere closer to the device.

For example, an autonomous car or a critical piece of equipment in a partially automated factory may require an instant, almost zero-latency response to detected conditions, as human lives or a business's livelihood may hang in the balance. To enable necessary action to be taken more quickly, perhaps autonomously, companies deploying IoT for such purposes may increasingly turn to the concept of edge computing.

Generally, edge computing is the concept of processing data on the connected device, or as close to that device as possible, but the term has no universally agreed-upon definition.
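The idea of processing data on or near the device can be sketched in a few lines. The following Python snippet is purely illustrative (the threshold, field names, and function are hypothetical, not from any product mentioned here): an edge node aggregates raw sensor readings locally and forwards only significant events upstream, instead of streaming every sample to a data center.

```python
# Hypothetical sketch of edge-side processing: aggregate locally,
# forward only the events worth sending to the cloud.
# All names and thresholds here are illustrative.

TEMP_THRESHOLD_C = 80.0  # hypothetical alarm threshold

def process_at_edge(readings):
    """Summarize readings locally; return (summary, events to send upstream)."""
    events = []
    for sample in readings:
        if sample["temp_c"] > TEMP_THRESHOLD_C:
            # the device can react immediately, with near-zero latency
            events.append({"sensor": sample["sensor"], "temp_c": sample["temp_c"]})
    summary = {
        "count": len(readings),
        "max_temp_c": max(s["temp_c"] for s in readings),
    }
    return summary, events

readings = [
    {"sensor": "s1", "temp_c": 71.2},
    {"sensor": "s1", "temp_c": 84.5},
    {"sensor": "s2", "temp_c": 66.0},
]
summary, events = process_at_edge(readings)
print(summary)  # {'count': 3, 'max_temp_c': 84.5}
print(events)   # [{'sensor': 's1', 'temp_c': 84.5}]
```

In this toy version, three raw samples become one compact summary plus a single alert, which is the basic bandwidth-and-latency trade edge computing makes.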

Phil Ressler, CEO of IoT sensor platform company Sixgill, said, “Edge computing is the notion of processing data and conducting analytics on the endpoint device itself, or somewhere else at the edge of the enterprise network, as opposed to sending the data out to a data center.”

David Linthicum, chief cloud strategist at Cisco Systems, shares that view. “By eliminating the distance and time it takes to send data to centralized sources, we can improve the speed and performance of data transport, as well as devices and applications on the edge,” he wrote in a blog post.

Telecom service provider AT&T put a slightly different spin on it in a 2017 white paper, describing edge computing as “placement of processing and storage capabilities near the perimeter (i.e., “edge”) of a provider’s network.” That definition suggests processing occurring not on the enterprise premises but at a location in the carrier’s network, such as a 5G cell tower close to an enterprise customer, a take that befits AT&T’s mission to serve enterprises as a provider of managed services. (AT&T, according to a Light Reading report, is planning to test enterprise use cases for edge computing later this year. The same outlet reports that Verizon recently tested edge computing on its 5G network in Houston and determined that it cut latency in half.)
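A back-of-the-envelope calculation shows why proximity alone cuts latency, independent of any vendor's results. Assuming light travels through fiber at roughly 200 km per millisecond (about two-thirds of its vacuum speed) and using hypothetical distances, propagation delay scales directly with how far the processing sits from the device:

```python
# Illustrative arithmetic only: round-trip propagation delay in fiber.
# Real latency also includes routing, queuing, and processing time.
SPEED_IN_FIBER_KM_PER_MS = 200.0  # ~2/3 the speed of light in vacuum

def round_trip_ms(distance_km):
    """Round-trip propagation delay over fiber, in milliseconds."""
    return 2 * distance_km / SPEED_IN_FIBER_KM_PER_MS

cloud_rtt = round_trip_ms(1500)  # hypothetical distant regional data center
edge_rtt = round_trip_ms(15)     # hypothetical nearby cell tower / edge site
print(f"cloud: {cloud_rtt:.1f} ms, edge: {edge_rtt:.2f} ms")
```

With those assumed distances, the distant data center costs 15 ms of round-trip propagation before any processing even begins, while the nearby edge site costs 0.15 ms, which is why moving compute closer to the device pays off for latency-sensitive applications.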

Sixgill’s Ressler acknowledged, “There’s not a binary definition of where the edge begins and where the edge ends.”

There are other terms floating around in the edge computing stew, such as edge cloud, which is similar to AT&T’s definition; fog computing, which Cisco coined to describe a standardized model for edge computing; and mist computing, the new-age-sounding notion that pushes processing all the way out to the data source, within a sensor, for instance.

If all of this sounds a lot like the decades-old concept of distributed computing, of which ARPANET and networked Ethernet systems were early examples, there’s a reason for that. Edge computing is one of the latest applications of distributed computing philosophy and practice, according to Eric Simone, CEO of ClearBlade, a company that has been talking up the benefits of edge computing for several years.
