
What Made Cloud Computing Differ From Edge Computing?

By Jyoti Nigania | Apr 10, 2019

Every conversation today around digital transformation or the internet of things (IoT), no matter the industry, tends to include a discussion about where applications will be hosted. The cloud is often presented as a good option: just send all the data to the cloud for analysis.
Others might suggest a newer concept, edge computing, as an important breakthrough that will power applications and deliver results that cloud computing could never achieve. In reality, these computing architectures are complementary, each with an important role to perform. My company provides software for industrial customers, some of which runs on the edge and some of which more typically runs in the cloud. This gives me a good perspective on both environments and on why you would choose one over the other.

Cloud computing has been around much longer, and most people have a basic idea of what it is. Simply put, cloud computing involves remote data centers full of computers, connected via the internet, that offer compute power to anyone for a cost per unit.
Edge computing, on the other hand, has just recently become a common term. It's used to highlight an opposite approach to the cloud, especially in terms of IoT. Fundamentally, edge computing is the idea of running applications as physically close as possible to where the data is generated. Consider, for example, a vehicle that instantaneously calculates fuel economy based on data from speed and fuel consumption sensors. The computer performing that calculation in the vehicle could be correctly labeled as an edge computing device.
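To make the vehicle example concrete, here is a minimal sketch of the kind of calculation such an edge device might run on board. It is an illustration only; the sensor readings and function name are hypothetical, not from any real vehicle system.

# Minimal sketch of an edge calculation: instantaneous fuel economy
# computed on board from hypothetical speed and fuel-flow sensor readings.

def fuel_economy_kmpl(speed_kmh: float, fuel_flow_lph: float) -> float:
    """Return instantaneous fuel economy in km per litre.

    speed_kmh: vehicle speed from the speed sensor, in km/h.
    fuel_flow_lph: fuel consumption from the flow sensor, in litres/hour.
    """
    if fuel_flow_lph <= 0:
        return float("inf")  # coasting or engine off: no fuel being burned
    return speed_kmh / fuel_flow_lph

# Example readings the in-vehicle computer might see on each sampling cycle.
print(fuel_economy_kmpl(90.0, 6.0))    # 15.0 km/l
print(fuel_economy_kmpl(120.0, 10.0))  # 12.0 km/l

The point is that the result is produced right where the data originates, with no network round trip at all.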

However, the concept of edge computing is not a new idea. In fact, companies in the automation space where I work have been performing control and analysis at the equipment level, in process plants, factories, mines, oil fields and more, for some time. A common example that my company offers is a distributed control system: a network of special-purpose computers that operate on site, monitor data from thousands of sensors measuring the temperatures, pressures and flows of processes, and generate actions to keep those processes operating safely and optimally. This is essentially edge computing before the term became common.
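As a simplified illustration of what this kind of equipment-level computing does (not the actual logic of any real control system), an on-site node might continuously compare each sensor reading against its safe operating limits and decide on a corrective action locally. The sensor names and limits below are made up.

# Simplified illustration of equipment-level monitoring: compare each
# sensor reading against its safe operating limits and choose an action
# locally, without any round trip to a remote data center.

SAFE_LIMITS = {            # hypothetical per-sensor (low, high) bounds
    "reactor_temp_c": (20.0, 350.0),
    "line_pressure_bar": (1.0, 40.0),
    "feed_flow_m3h": (0.5, 120.0),
}

def check_reading(sensor: str, value: float) -> str:
    low, high = SAFE_LIMITS[sensor]
    if value < low:
        return f"{sensor}: {value} below {low}, open control valve"
    if value > high:
        return f"{sensor}: {value} above {high}, trigger shutdown sequence"
    return f"{sensor}: {value} within limits, no action"

print(check_reading("reactor_temp_c", 365.0))
print(check_reading("line_pressure_bar", 22.5))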

Enter the age of IoT. Industries outside of manufacturing are now putting sensors on their equipment and processes, from individual rooms within buildings to parking meters and lights in cities to wearable devices on people, and cellular networks exist to move the data around. So a new question has arisen: where should you do the computing that turns data into actionable insights? Cloud evangelists say it should all be done in the cloud. But there can often be good reasons to perform computing and analysis locally through edge computing. I believe both approaches have strengths and weaknesses, and examining them in light of the application at hand will usually drive a clear choice.

Edge computing may be the better option under certain conditions, such as in the following situations:
There is not enough reliable network bandwidth to send the data to the cloud.
Though the industry has an immense focus on cybersecurity, there may be security and privacy concerns about sending the data over public networks or storing it in the cloud. With edge computing, data is retained locally.
The communication networks connecting to the cloud are not robust or reliable enough to be dependable.
Applications need rapid data sampling or must calculate results with a minimum amount of delay.

Let's look at an example of edge computing that my company provides for the oil and gas industry. In the oil patch, communications from oil wells may be wireless and can be intermittent, so specialized computers called remote terminal units (RTUs) are used to ingest sensor data and perform local control functions on the well. When communications are down, they store the data internally; when communications return, they send the data to remote systems for reporting and analysis.
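That store-and-forward behaviour can be sketched roughly as follows. This is a simplified model, not an RTU's actual firmware; the buffer class, the link-status flag and the upload callback are all hypothetical stand-ins.

# Rough sketch of RTU-style store-and-forward: buffer readings locally
# while the wireless link is down, then flush the backlog when it returns.
from collections import deque

class StoreAndForwardBuffer:
    def __init__(self):
        self.backlog = deque()          # readings held while offline

    def record(self, reading, link_up: bool, upload):
        self.backlog.append(reading)    # always keep the data locally first
        if link_up:
            while self.backlog:         # flush everything buffered so far
                upload(self.backlog.popleft())

# Hypothetical usage: `upload` would send a reading to the remote system.
buf = StoreAndForwardBuffer()
buf.record({"well": 7, "pressure_bar": 18.2}, link_up=False, upload=print)
buf.record({"well": 7, "pressure_bar": 18.4}, link_up=True, upload=print)

Local control keeps running throughout; only the reporting and analysis wait for the link to come back.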

Conversely, the cloud might be a better option because of the following factors:
Cloud processing power is nearly limitless. Any analysis tool can be deployed at any time.
The form factor and environmental limits of some applications could increase the cost of edge computing and make the cloud more cost-effective.
The data set may be large. Running many applications in the cloud, with the ability to pull in other data sources, can allow those applications to begin self-learning, which can lead to better results; this is the "big data" idea many of us have heard about.
Results may need to be widely distributed and viewed on a variety of platforms. The cloud can be accessed from anywhere on multiple devices.

Let's look at the vehicle example again. The cruise control needs to be highly reliable, secure and responsive, so it runs at the edge, inside the vehicle. However, a monitoring application for a fleet of vehicles, one that pulls performance data to schedule maintenance or navigation information to calculate routes, should run in the cloud, where vast amounts of data on multiple vehicles can be accessed and analyzed.
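By contrast with the on-board calculation shown earlier, the fleet-level analysis is a natural fit for the cloud, where data from many vehicles can be pooled. A rough sketch of that aggregation step, with made-up record fields and a made-up flagging rule, might look like this:

# Rough sketch of a cloud-side fleet calculation: pool per-vehicle data
# and flag vehicles whose fuel economy has drifted well below the fleet average.
from statistics import mean

fleet = [  # made-up records uploaded from individual vehicles
    {"vehicle_id": "A1", "avg_kmpl": 14.8},
    {"vehicle_id": "B2", "avg_kmpl": 11.2},
    {"vehicle_id": "C3", "avg_kmpl": 15.5},
]

fleet_avg = mean(v["avg_kmpl"] for v in fleet)
flagged = [v["vehicle_id"] for v in fleet if v["avg_kmpl"] < 0.85 * fleet_avg]

print(f"Fleet average: {fleet_avg:.1f} km/l")
print("Schedule maintenance check for:", flagged)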

Similarly, in the oil and gas industry, some of the data an RTU sends about well performance or equipment health may never be used inside the RTU itself. But applications in the cloud or in remote data centers could use that data to model the entire oil field and direct actions to recover the maximum amount of oil from it.

Both edge and cloud computing will continue to have important roles in the foreseeable future. In fact, I expect software development efforts and research endeavors will connect the two worlds more seamlessly. So do not worry: there will be plenty of applications for both sets of evangelists in the future.

Source: HOB