KDM FORCE designed and implemented a fully functional Proof of Concept ("PoC") to demonstrate how an Internet of Things ("IoT") ecosystem of devices and actuators can connect participants in an energy system by pushing computational capacity from the core to the Edge.
The Client is a primary Distribution System Operator ("DSO"), a branch of one of the world’s leading integrated electricity and gas companies, with operations in 34 countries across 5 continents. At the time of the project, the Client was looking for ways to make energy distribution more efficient and to develop new optimization mechanisms leveraging IoT technologies, with the aim of improving network management, reducing faults, optimizing field workforce operations, increasing network resilience and enabling predictive analysis based on collected data.
In this scenario Ericsson, while working for the Client, partnered with KDM FORCE to set up a PoC demonstrating continuous monitoring and anomaly detection, combined with autonomous alert generation and automated counter-action triggering (including the creation of support service tickets), along with the seamless provision of role-based information and analytics to users.
Electrical grids are continuously undergoing large-scale expansion to meet ever-increasing power demands. As a result, monitoring and protecting such infrastructures imposes more stringent requirements than, say, those faced by an energy supplier. Traditional Cloud Computing deployments therefore looked insufficient to meet the analytics and computational demands of the Client's real-time alerting. These mission-critical requirements suggested instead a Fog to Edge Computing model that pushes data validation closer to the requester in order to fulfill the processing and computational objectives.
The project needed to demonstrate how Fog to Edge Computing algorithms could interplay with core Cloud Computing support, thus showing the feasibility of a new breed of real-time, low-latency operations and automated workflows that meet quality of service ("QoS") levels more efficiently.
KDM FORCE designed a low-latency, high-availability architecture that pushed “intelligence” and computation capabilities on site, near the devices collecting the data and near the actuators. This approach allowed data to be processed at a faster pace than if it were held in a central location, and actions and reactions to be triggered more effectively.
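As a simple illustration of this pattern, the sketch below shows how an edge node might evaluate incoming sensor readings against a local rule and trigger an actuator without a round trip to the cloud. The threshold, sensor identifiers and actuator call are hypothetical placeholders, not the actual PoC code.

```python
from dataclasses import dataclass
import time

@dataclass
class Reading:
    sensor_id: str
    value: float       # e.g. line current in amperes
    timestamp: float

# Hypothetical threshold; the real rules were defined with the Client's domain experts.
OVERCURRENT_THRESHOLD_A = 400.0

def trip_breaker(sensor_id: str) -> None:
    # Placeholder for the actuator command issued by the gateway (e.g. a relay/Modbus call).
    print(f"[EDGE] Tripping breaker associated with {sensor_id}")

def evaluate(reading: Reading) -> bool:
    """Return True if a local counter-action was triggered."""
    if reading.value > OVERCURRENT_THRESHOLD_A:
        trip_breaker(reading.sensor_id)
        return True
    return False

if __name__ == "__main__":
    sample = Reading("feeder-12", 415.2, time.time())
    evaluate(sample)  # reacts locally, with no round trip to the central cloud
```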
The PoC solution was implemented on a loosely coupled architecture that operated at several levels. At the Cloud level (representing the core of the solution), the PoC was deployed on the AWS PaaS (AWS IoT). The Fog nodes (representing the primary electrical equipment in the substations) and the Edge nodes (representing secondary electrical equipment) were both based on the EdgeX Foundry architecture (an early release of AWS Greengrass was also tested while scouting Edge Computing technologies). On the hardware side, for the Edge nodes, the choice fell on Dell's “Gateway 5000”, a very robust industrial IoT gateway. The PoC’s software architecture was implemented as a microservices layer deployed on AWS in order to speed up the development process and to rely on the horizontal and vertical scalability of Amazon’s PaaS.
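Edge and fog nodes typically reach AWS IoT over MQTT with mutual TLS. The sketch below illustrates that interaction using the paho-mqtt client; the endpoint, topic, certificate paths and payload fields are hypothetical stand-ins for the values actually provisioned in the PoC.

```python
import json
import ssl
import time
import paho.mqtt.client as mqtt

# Hypothetical AWS IoT endpoint and topic; real values came from the Client's AWS account.
ENDPOINT = "example-ats.iot.eu-west-1.amazonaws.com"
TOPIC = "substation/feeder-12/telemetry"

# paho-mqtt 1.x constructor; paho-mqtt 2.x additionally requires a CallbackAPIVersion argument.
client = mqtt.Client(client_id="edge-gateway-01")
client.tls_set(
    ca_certs="AmazonRootCA1.pem",    # hypothetical certificate/key file names
    certfile="device.pem.crt",
    keyfile="private.pem.key",
    tls_version=ssl.PROTOCOL_TLSv1_2,
)
client.connect(ENDPOINT, port=8883)
client.loop_start()

# Publish one aggregated telemetry sample to the cloud for trending and predictive maintenance.
payload = {"sensor_id": "feeder-12", "current_a": 312.7, "ts": time.time()}
client.publish(TOPIC, json.dumps(payload), qos=1)

client.loop_stop()
client.disconnect()
```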
As soon as the PoC started running, data was collected by the edge nodes and sent to the fog computing nodes, where it was analyzed in real time to make decisions on premises. Aggregated data was routed to the cloud for trend analysis and predictive maintenance purposes. In addition, the PoC solution was integrated with the Salesforce Service Cloud (representing the customer service and support platform) in order to demonstrate the autonomous activation of incident management operations and to simulate on-site maintenance processes.
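The Service Cloud integration can be pictured as a small call that opens a Case whenever the fog layer flags an anomaly. The sketch below uses the simple-salesforce library; the credentials, field values and triggering context are hypothetical and only illustrate the kind of incident-management automation the PoC simulated.

```python
from simple_salesforce import Salesforce

# Hypothetical credentials; the PoC used the Client's Service Cloud org.
sf = Salesforce(
    username="poc-integration@example.com",
    password="********",
    security_token="********",
)

# Open a Case when the fog layer flags an anomaly, simulating autonomous incident management.
case = sf.Case.create({
    "Subject": "Overcurrent anomaly on feeder-12",
    "Description": "Fog node detected current above threshold; breaker tripped at the edge.",
    "Origin": "Web",
    "Priority": "High",
})
print("Created Case with id:", case["id"])
```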
The PoC demonstrated the feasibility of the proposed framework. The results showed the suitability and viability of a Fog to Edge Computing model that pushes data validation closer to the requester, and were confirmed through comparative metrics against a traditional data center or Cloud Computing approach. They clearly demonstrated the superiority of the Fog to Edge Computing model over its core Cloud counterpart, and how this approach will become, in the near future, the reference one whenever data must be collected in real time and immediate counter-actions must be initiated.