Edge computing is a broad term for any application deployment architecture that hosts applications or data geographically closer to users than a conventional cloud or data center would.
The big idea behind edge computing is that bringing workloads closer to end users reduces network latency and improves network reliability, both of which are key considerations at a time when applications in realms like IoT, machine learning, and big data depend on ultra-fast data movement.
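To make the latency claim concrete, here is a minimal sketch that compares average round-trip time against an edge endpoint and a distant central endpoint. The two URLs (`edge.example.com`, `central.example.com`) are placeholders, not real services; point them at two deployments of your own application that serve the same content.

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

// measure issues several GET requests to url and returns the
// average round-trip time.
func measure(url string, samples int) (time.Duration, error) {
	var total time.Duration
	for i := 0; i < samples; i++ {
		start := time.Now()
		resp, err := http.Get(url)
		if err != nil {
			return 0, err
		}
		resp.Body.Close()
		total += time.Since(start)
	}
	return total / time.Duration(samples), nil
}

func main() {
	// Placeholder endpoints: substitute your own edge and central
	// deployments serving the same content.
	endpoints := map[string]string{
		"edge (nearby PoP)":        "https://edge.example.com/ping",
		"central (distant region)": "https://central.example.com/ping",
	}
	for label, url := range endpoints {
		avg, err := measure(url, 5)
		if err != nil {
			fmt.Printf("%s: error: %v\n", label, err)
			continue
		}
		fmt.Printf("%s: avg round trip %v over 5 requests\n", label, avg)
	}
}
```

Run against a real deployment, the edge endpoint should show the shorter average, and the gap generally widens with geographic distance. With those gains in mind, a few practices help when designing applications for the edge: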
- Be hardware-agnostic
- Understand device vs. cloud edge
- Extend the cloud
- Test for the edge (see the sketch after this list)
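As one example of testing for the edge, the sketch below wraps a local test server with artificial delay to approximate a constrained edge link, then asserts that a request still completes within a latency budget. The 50 ms delay and 200 ms budget are illustrative assumptions, not measured values; substitute figures observed on your own network.

```go
package main

import (
	"net/http"
	"net/http/httptest"
	"testing"
	"time"
)

// withDelay simulates a slow edge link by sleeping before the
// wrapped handler responds.
func withDelay(d time.Duration, next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		time.Sleep(d)
		next.ServeHTTP(w, r)
	})
}

func TestRequestWithinLatencyBudget(t *testing.T) {
	backend := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		w.Write([]byte("ok"))
	})
	// Assumed figures: 50ms of injected link delay, 200ms budget.
	srv := httptest.NewServer(withDelay(50*time.Millisecond, backend))
	defer srv.Close()

	start := time.Now()
	resp, err := http.Get(srv.URL)
	if err != nil {
		t.Fatalf("request failed: %v", err)
	}
	resp.Body.Close()

	if elapsed := time.Since(start); elapsed > 200*time.Millisecond {
		t.Errorf("request took %v, over the 200ms budget", elapsed)
	}
}
```

Injecting the delay in a handler wrapper keeps the test self-contained; larger suites can apply the same idea with traffic-shaping tools, but the budget-assertion pattern stays the same.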