Fog to Edge Computing in "IoT aware" Smart Grid Architectures

KDM Force developed a Fog to Edge Computing model to connect the participants in an energy system through an ecosystem of Internet of Things (IoT) devices and actuators.

The Client was a major Distribution System Operator ("DSO") - one of the world's leading integrated electricity and gas companies, with operations in 34 countries across 5 continents. It had two goals: making energy distribution more efficient, and leveraging Internet of Things (IoT) technologies to optimize network management, reduce faults, and improve field workforce operations, network resilience, and predictive analysis based on the collected data.


KDM Force partnered with Ericsson, which was already working for the Client, to set up a Proof of Concept (PoC) demonstrating that continuous monitoring and anomaly detection could be combined with autonomous alert generation and automated counter-action triggering (including the creation of support service tickets), with seamless provision of role-based information and analytics to users.

Electrical grids are continuously undergoing large-scale expansion to meet ever-increasing power demands. As a result, monitoring and protecting such infrastructures is subject to more stringent requirements than, say, those faced by an energy supplier. Traditional Cloud Computing deployments therefore looked insufficient to meet the analytics and computational demands of the Client's real-time alerting.

These mission-critical requirements suggested looking instead into a Fog to Edge Computing model that would push data validation closer to the requester, in order to fulfill the processing and computational objectives.

Fog to Edge Computing: The Solution


The project needed to demonstrate that Fog to Edge Computing algorithms could interplay with the core Cloud Computing support, showing the feasibility of a new breed of real-time, low-latency operations and automated workflows to meet quality-of-service ("QoS") levels more efficiently.

KDM Force designed a low-latency, high-availability Fog to Edge Computing architecture that pushed "intelligence" and computation capabilities on site, near the devices that collected the data and near the actuators.


This approach made it possible to crunch through data at a faster pace than if the data were held in a central location, and to trigger actions and reactions more effectively.

The PoC solution was implemented on a loosely coupled architecture that operated at several levels:

  • At the Cloud level (representing the core of the solution), the PoC was deployed on the AWS PaaS (AWS IoT).
  • The Fog nodes (representing the primary electrical equipment in the substations) and the Edge nodes (representing secondary electrical equipment) were both based on the EdgeX Foundry architecture (an early release of AWS Greengrass was also tested while scouting Edge Computing technologies).
  • On the hardware side, for the Edge nodes, the choice fell on the “Gateway 5000” from Dell, a very robust industrial IoT gateway.
  • The PoC’s software architecture was implemented as a microservices layer deployed on AWS, both to speed up the development process and to rely on the reliable horizontal and vertical scalability of Amazon’s PaaS.
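To illustrate the role of an Edge node in this kind of architecture, the sketch below shows the pre-filtering a gateway might apply before forwarding measurements to a fog node. This is a hypothetical example, not the PoC's actual code: the `Reading` type, the glitch criterion, and the window size are all illustrative assumptions.

```python
import statistics
from dataclasses import dataclass


@dataclass
class Reading:
    """One sample from a substation sensor (illustrative shape)."""
    sensor_id: str
    value: float  # e.g. line current in amperes


def edge_prefilter(readings, window=5):
    """Discard obviously faulty samples, then downsample each window of
    readings to its median, reducing uplink traffic to the fog tier."""
    valid = [r.value for r in readings if r.value >= 0]  # drop sensor glitches
    return [
        statistics.median(valid[i:i + window])
        for i in range(0, len(valid), window)
    ]
```

Filtering at the edge like this keeps raw, noisy telemetry off the network, so the fog node receives only compact, already-cleaned values to decide on.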
Fog to Edge Computing Diagram

As soon as the PoC started running, data were collected by the edge nodes and sent to the fog computing nodes, where they were analyzed in real time to take decisions on premises. Aggregated data were routed to the cloud for trend analysis and predictive maintenance purposes.
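A minimal sketch of that split, assuming a simple overcurrent threshold: the fog node takes the on-premises decision immediately, while only a compact summary is forwarded to the cloud tier. The threshold value, the action name, and the summary fields are illustrative assumptions, not details from the PoC.

```python
from statistics import mean

OVERLOAD_THRESHOLD = 400.0  # amperes; illustrative limit for a feeder line


def fog_process(batch):
    """Decide locally whether to react, then return an aggregate summary
    destined for the cloud tier (trending / predictive maintenance)."""
    alerts = [v for v in batch if v > OVERLOAD_THRESHOLD]
    local_action = "trip_breaker" if alerts else "none"  # taken on premises
    summary = {
        "samples": len(batch),
        "mean": round(mean(batch), 1),
        "peak": max(batch),
        "alerts": len(alerts),
    }
    return local_action, summary
```

The key design point is that the latency-critical decision (`local_action`) never waits on a cloud round trip; only the non-urgent aggregate leaves the substation.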

As a plus, the PoC solution was also integrated with Salesforce Service Cloud (representing the customer service and support platform) in order to demonstrate the autonomous activation of incident management operations and to simulate on-site maintenance processes.
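The automated ticket creation step can be sketched as a function that turns a detected anomaly into a case payload for the support platform. The field names below follow the general shape of a Salesforce Case but are illustrative assumptions; the real integration would go through the Service Cloud API.

```python
from datetime import datetime, timezone


def build_service_ticket(substation_id, anomaly, severity="High"):
    """Assemble a support-case payload from an anomaly detected at a
    substation; field names are illustrative, not the PoC's schema."""
    return {
        "Subject": f"Anomaly detected at substation {substation_id}",
        "Description": anomaly,
        "Priority": severity,
        "Origin": "IoT Gateway",
        "DetectedAt": datetime.now(timezone.utc).isoformat(),
    }
```

Triggering this from the fog node means a field-service ticket exists before any human has looked at a dashboard, which is what "autonomous activation of incident management" amounts to in practice.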


Fog to Edge Computing: The Results

The PoC’s results showed the suitability and viability of a Fog to Edge Computing model that pushes data validation closer to the requester. These results were confirmed through comparative metrics against a traditional data-center or Cloud Computing approach.

The results demonstrated the superiority of Fog to Edge Computing over its core Cloud Computing counterpart. In the near future, this approach will be the reference model whenever real data must be collected and immediate counter-actions initiated.