Edge Computing

Edge computing is a model that shifts part of an application's computation and data processing from the cloud to the edge of the network, where devices and sensors are located. By handling data close to its source, edge computing can reduce latency, cut bandwidth use, and improve the responsiveness and security of applications that need real-time or near-real-time data, such as IoT, AR/VR, gaming, and autonomous vehicles. Developers need to learn how to build and deploy applications that run on edge devices and how to integrate them with cloud services. They also need to weigh the challenges and risks of edge computing, including reliability, scalability, and security.
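The edge-to-cloud integration pattern described above often boils down to aggregating raw sensor data locally and forwarding only a compact summary upstream. A minimal sketch of that idea in Python, assuming a hypothetical `upload_to_cloud` ingestion step (any real deployment would replace it with a call to an actual cloud endpoint):

```python
from statistics import mean

def summarize_readings(readings, threshold):
    """Aggregate raw sensor readings at the edge, so only a compact
    summary (plus any anomalous samples) travels to the cloud."""
    anomalies = [r for r in readings if r > threshold]
    return {
        "count": len(readings),
        "mean": mean(readings),
        "max": max(readings),
        "anomalies": anomalies,  # only outliers are forwarded in full
    }

def upload_to_cloud(summary):
    # Placeholder: a real edge app would POST this to a cloud
    # ingestion endpoint (hypothetical, not specified in this note).
    print(f"uploading summary: {summary}")

if __name__ == "__main__":
    samples = [21.5, 21.7, 22.0, 35.2, 21.6]  # e.g. temperature readings
    upload_to_cloud(summarize_readings(samples, threshold=30.0))
```

Sending five numbers here costs little, but at real sensor rates this local reduction is exactly where the bandwidth and latency savings come from.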
