What is an edge server?

STL Partners has been writing on the topic of edge computing, both broadly speaking and from a telecoms industry perspective, for a number of years. For more information on our edge computing insights and services, check out our Edge Computing Hub or send an email directly to [email protected]

Edge servers are servers (compute resources) that run processing at an edge location, which can be anywhere along the edge spectrum – usually from the on-premises edge to the regional edge. Edge node is another frequently used term, which can refer either to a broader set of compute resources (i.e. including end-devices too) or to a cluster of edge servers.

The nature of an edge server differs across the different types of edge, depending on the use case and where the edge compute resource is deployed.


CDN edge server or regional edge

As seen in our edge computing investments webinar, most capital being deployed today is going into data centre facilities at the regional edge. This is being driven in part by CDN providers looking to further distribute their locations, and by increased demand for data centres at a more local level. For example, in the U.S., latency remains significant even in Tier 2 cities as large as Austin, Texas. These facilities can be almost as large as a hyperscale data centre, so the edge nodes are likely to be standard data centre servers.

Network edge server

At the network edge, most edge nodes will reside in data centre-like environments, particularly in the next few years, given that telcos will largely leverage existing data centres in their network. However, as edge computing expands into deeper parts of the network, such as deployments at base stations, the environment will be different to that of a traditional data centre. For example, in a smaller-scale deployment, cooling will be delivered differently than in a hyperscale data centre with thousands of servers. In some cases, edge could be deployed alongside small cell infrastructure, which means the edge server would likely be standalone and need to be ruggedised, since it will not have an enclosure to ‘live in.’

On-premises edge server

At the enterprise or on-premises edge, i.e. edge computing at a factory, shopping centre, office space, etc., edge servers can take different shapes or forms. Some edge deployments will be in on-premises data centres and take the form of a standard data centre server. However, in industrial edge deployments in particular, you may have a single edge device running workloads, for example at an oil rig. Given the harsh environment, this would need to be ruggedised. In retail, the requirements are totally different: retailers have limited space to install an enclosure for the edge node and need equipment that can be hidden from view as much as possible. Lastly, telcos and OEMs are exploring adapting existing customer premises equipment used for networking to host non-networking applications. These could be enterprise CPE boxes, Wi-Fi gateways or programmable logic controllers in industrial settings.

Device edge node

STL Partners defines the device edge as either edge compute residing on the end-device (e.g. a smart camera) or a separate small device attached to the end-device. One example of this is asset monitoring: manufacturers are attaching small edge nodes to their customers’ assets to monitor their condition and use the analytics to provide new services. These edge nodes are less likely to be ‘servers’ per se; they tend to take the form of a small computer or simply additional processing hardware installed on the end-device.
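
To make this concrete, below is a minimal, purely illustrative Python sketch of a device edge node doing condition-based monitoring; the threshold, sample readings and function names are hypothetical rather than drawn from any specific vendor. The point it demonstrates is that raw data is processed locally and only a small summary travels upstream.

# Hypothetical sketch of a device edge node doing condition-based monitoring.
# Raw sensor readings are processed locally and only a compact summary is
# sent upstream, which is the main reason to place compute on or next to the asset.
from statistics import mean

VIBRATION_ALERT_MM_S = 7.1  # assumed alert threshold, illustrative only

def summarise_window(readings_mm_s: list) -> dict:
    """Reduce a window of raw vibration readings to a compact summary."""
    return {
        "mean_mm_s": round(mean(readings_mm_s), 2),
        "peak_mm_s": max(readings_mm_s),
        "alert": max(readings_mm_s) > VIBRATION_ALERT_MM_S,
    }

def send_upstream(summary: dict) -> None:
    # Placeholder: in practice this would publish to the manufacturer's
    # monitoring service; printed here to keep the sketch runnable.
    print("upstream:", summary)

if __name__ == "__main__":
    window = [2.4, 2.6, 2.5, 8.3, 2.7]  # mm/s, made-up sample data
    send_upstream(summarise_window(window))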

Get in touch to receive our free 20-page Edge Insights pack

Key trends in edge hardware

The nature of edge servers is evolving. Some of the trends we are seeing, particularly in data centre servers, may extend into edge servers, whereas others are still open questions.

1. Hardware-as-a-service

One of the key factors in why cloud took off is its attractive “as-a-Service” business model, which allows customers to spread costs over time in an OPEX-based model, rather than pay CAPEX up-front to build IT infrastructure. To replicate the advantages of the cloud commercial model at the edge, we are seeing the growth of Hardware-as-a-Service models, in which the customer pays for the server via a recurring fee. This can be a subscription model, consumption-based pricing or a managed IT services fee. HPE, with its GreenLake portfolio, is one of the earliest proponents; others in the industry, such as Lenovo, Dell and AWS, have jumped on the trend too.
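
As a back-of-the-envelope illustration of the commercial trade-off, the two models can be compared as below; the figures are invented purely for the arithmetic and are not vendor prices.

# Illustrative CAPEX vs Hardware-as-a-Service comparison with made-up figures.
CAPEX_UPFRONT = 20_000.0   # assumed one-off price of an edge server
HAAS_MONTHLY_FEE = 650.0   # assumed recurring subscription fee
TERM_MONTHS = 36           # evaluation period

haas_total = HAAS_MONTHLY_FEE * TERM_MONTHS
print(f"CAPEX up-front:        {CAPEX_UPFRONT:>10,.0f}")
print(f"HaaS over {TERM_MONTHS} months:    {haas_total:>10,.0f}")
# HaaS may cost more in total over the term, but it spreads spend over time
# and can scale with demand, which is the cloud-like behaviour described above.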

2. COTS vs. specialised hardware

One of the challenges for edge computing solution providers and infrastructure developers is determining the processing capabilities within the edge server. Use cases that involve heavy visual data processing or image rendering will need GPUs (graphics processing units). Others that need high-performance computing or low-latency, high-throughput processing may require specialised hardware accelerators, such as FPGAs (field-programmable gate arrays) or ASICs (application-specific integrated circuits). However, for an edge data centre operator or anyone designing a blueprint for edge infrastructure, it is harder to achieve economies of scale with specialised hardware than with COTS (commercial off-the-shelf) CPUs.
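
As a rough illustration of that design decision, the choice might be sketched as below; the mapping is a deliberate simplification for explanation, not a formal selection rule, and real decisions also weigh cost, power, space and supply chain.

# Simplified, illustrative mapping from workload profile to processor type.
def pick_processor(needs_graphics: bool,
                   needs_deterministic_low_latency: bool,
                   fixed_function_at_scale: bool) -> str:
    if needs_graphics:
        return "GPU"       # heavy visual processing or image rendering
    if fixed_function_at_scale:
        return "ASIC"      # fixed workload at volumes that justify custom silicon
    if needs_deterministic_low_latency:
        return "FPGA"      # reprogrammable accelerator for low-latency pipelines
    return "COTS CPU"      # general-purpose, easiest to standardise and scale

print(pick_processor(needs_graphics=False,
                     needs_deterministic_low_latency=True,
                     fixed_function_at_scale=False))  # -> FPGA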

3. Storage and compute convergence

In some use cases, edge servers need to be as small as possible, for example if the server is attached to an existing asset to monitor its performance (condition-based monitoring). Converged infrastructure allows the same hardware to be used for both storage and compute, simplifying a deployment by avoiding the need for separate hardware for each. We are starting to see an extension of this in computational storage. This moves compute even closer to storage to reduce the amount of data that needs to travel between the two, which would be particularly beneficial for ultra-low latency use cases.
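
A simple sketch of why this matters for data movement follows; the data volume and filter ratio are assumptions chosen only to illustrate the effect.

# Illustrative estimate of data movement saved by filtering at the storage
# layer (computational storage) instead of shipping raw data to the host.
RAW_DATA_GB_PER_HOUR = 500.0   # assumed raw data generated at the edge site
RELEVANT_FRACTION = 0.02       # assumed share of data the application needs

moved_without = RAW_DATA_GB_PER_HOUR                   # ship everything
moved_with = RAW_DATA_GB_PER_HOUR * RELEVANT_FRACTION  # ship filtered results only
print(f"Moved per hour without computational storage: {moved_without:.0f} GB")
print(f"Moved per hour with computational storage:     {moved_with:.0f} GB")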

4. White box CPE

In telecoms, some operators see their edge computing opportunity in changing the nature of customer premises equipment (CPE). The industry has been opening up previously vendor-locked CPE by disaggregating the software (network services) from the underlying infrastructure. For example, Verizon worked with ADVA to create its universal CPE platform, allowing customers to use COTS hardware and select network services from multiple vendors. We have covered this at length in a previous report, SDN / NFV: Early Telco Leaders in the Enterprise Market. The next step is for these same boxes to run non-networking functions and become edge compute platforms in their own right by adding an IaaS layer. For example, a bank’s retail branch can use the CPE to run its branch networking services, but also to process workloads for enterprise applications, such as analytics for video security, IT security, access management, etc.
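
To make the idea of CPE acting as a small IaaS platform concrete, here is a deliberately simplified sketch in which one box hosts both network services and enterprise applications; the workload names and core counts are hypothetical.

# Hypothetical white box CPE with a fixed pool of vCPUs shared between
# network services (VNFs) and non-networking enterprise workloads.
CPE_VCPUS = 8

workloads = {
    "sd-wan":            2,  # network service
    "firewall":          1,  # network service
    "video-analytics":   3,  # enterprise application
    "access-management": 1,  # enterprise application
}

assert sum(workloads.values()) <= CPE_VCPUS, "over-committed CPE"
for name, vcpus in workloads.items():
    print(f"{name:<18} -> {vcpus} vCPU(s)")
print(f"spare capacity: {CPE_VCPUS - sum(workloads.values())} vCPU(s)")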

