What is the ideal place for my workloads? Should we move to the edge? Which edge computing model fits my business needs? What are the critical factors to take into account before deciding to move to the edge? These are some of the top questions that developers and IT decision makers are considering as they develop their edge strategy. Understanding the different types of edge computing models is an important step before you build your strategy.
There are ten factors to consider while evaluating your edge strategy:
1. Business need
Business needs always drive the solution. Before starting with the design considerations, we want to ask the question: Why? A well-defined customer need is articulated from an end-user perspective, for example, preventing failure by predicting preventive maintenance needs for a manufacturing floor machine, or delivering a mobile application transaction response within milliseconds. The business need translates into crucial requirements and is a lead influencer in identifying the type of workloads required to achieve the functionality. It defines the quantitative and qualitative business outcomes desired, such as customer satisfaction, cost savings, agility, operational resilience, staff productivity, and other tangible results.
2. Workload type
A collection of resources that performs a business function is called a workload. Business application hosting, data sharing, backup, storage, mission-critical applications, development, and testing are common workload types. Some workloads are well suited to on-premises deployment, while others fit cloud regions or the edge. Real-time or near real-time control-loop workloads that require quick decision making are better suited to the edge. Entire workloads do not have to move there; a workload can also be refactored so that certain functions run at the edge while the rest remain as cloud services.
3. Performance
Each application has specific performance requirements: there is often a dominant characteristic and one or more good-to-have attributes. For example, one application might require very low latency, while another might have significant compute and storage needs. Typically, edge computing resources are placed close to the consumption point to deliver low latency. For an application that requires massive compute, it can be more cost-effective to move the edge processing to a location farther from the consumption point; however, this increases latency. Suppose an application requires both low latency and massive compute. A larger edge computing resource capacity can be deployed close to the consumption point to meet the latency need, but this increases cost. Alternatively, a network with lower latency, such as a 5G wireless network, can compensate for the distance between the consumption point and the edge computing resources. Finding the right balance is vital, as location and capacity both influence performance.
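The trade-off described above can be sketched as a simple screening exercise. The placement options, latency figures, and costs below are purely illustrative assumptions, not measurements or vendor pricing:

```python
# Hypothetical edge placement options: every number here is an
# illustrative assumption, not a measurement.
options = [
    # (name, network_latency_ms, compute_capacity, monthly_cost_usd)
    ("on-site appliance",       2, "small", 1_500),
    ("metro edge + 5G network", 8, "large", 2_200),
    ("regional cloud",         40, "large",   900),
]

def meets_requirements(option, max_latency_ms, min_capacity):
    """Check a placement option against the application's dominant needs."""
    _, latency, capacity, _ = option
    sizes = {"small": 0, "large": 1}
    return latency <= max_latency_ms and sizes[capacity] >= sizes[min_capacity]

# An application needing both low latency and large compute: filter the
# options that satisfy both, then pick the cheapest viable one.
viable = [o for o in options
          if meets_requirements(o, max_latency_ms=10, min_capacity="large")]
cheapest = min(viable, key=lambda o: o[3])
```

In this assumed scenario, only the metro-edge option paired with a low-latency 5G path satisfies both the latency and compute requirements, which mirrors the balancing act the paragraph describes.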
4. Capacity
Each application has underlying resource requirements. For example, a particular function in an embedded device might require limited capacity, while a video analytics backend application might require scalable GPU resources. It is essential to consider both current and future needs for the quantity and type of computing resources. A small edge appliance might serve the immediate need but not scale for the future, in which case the choice of edge computing model might change. High availability, fault tolerance, and scaling are important factors to consider along with capacity.
5. Connectivity
Applications running on edge computing resources connect to multiple systems, including sensors and backend systems, so connectivity is an essential factor in enabling seamless communication and operations. An integrated connectivity model provides a plug-and-play experience without depending on the customer's own connectivity channel. At the same time, customers might choose a private network instead of public network transport. Similarly, wireless options eliminate dependencies such as network ports or legacy interfaces. A connectivity model that provides shorter network paths is an attractive option, and application dependencies can also influence connectivity choices.
6. Physical factors
Physical considerations such as space, power, cooling, and environmental conditions can play a crucial role in the edge type decision. These conditions might prevent deploying an edge computing appliance at the customer site, even if it qualified as the best option on other factors; for example, placing a compute resource in a harsh environment, such as an area with extreme heat, is rarely a good idea. As capacity needs grow, it is crucial to consider the physical factors required to scale. Additionally, aesthetics and noise can prevent compute from being deployed close to the consumption point.
7. Operations and management
In an agile DevOps environment, the ability to develop, update, validate, manage, and operate a solution is equally important. With a distributed application architecture, where some services live at the edge and others in the cloud region, developing services in the region and moving workloads to the edge can be very efficient. The ability to update and enhance services without truck rolls improves agility and customer experience. Operations and management can be a vital differentiating factor when edge computing is offered as a managed service under a shared responsibility model.
8. Security and Compliance
Applications on edge computing resources receive and process data sent from sensors and backend systems. Regardless of the application's criticality, security and data compliance, both in transit and at rest, are critical and have become table stakes for any operation. For edge appliances, security is a challenge from both a physical access and an interface security perspective. For regulated industries such as healthcare, complying with HIPAA regulations requires careful thought about physical security. Consuming edge computing as a managed service mitigates some of the physical security challenges.
9. Business Model
The business model describes how the customer will maximize the edge investment and drive return on investment. The customer can choose among multiple business models, including build, buy, and partner: they can take advantage of a fully managed service, or use infrastructure, platform, or software components to create a custom solution. Customers must understand the current costs associated with the business process. The business model for the edge initiative can then be developed based on the available options and the customer's preference for capital expense (CapEx) or operating expense (OpEx). Usage-based cost models are prevalent and can make projects viable without the initial investment that some edge computing models require.
10. Total cost of ownership
Edge computing can be implemented in different models with different cost structures, which makes calculating the total cost of ownership (TCO) necessary. Edge gateway hardware might be inexpensive, but managing it and performing truck rolls can be costly, and as needs grow it might be more expensive to scale with the required type of computing resources. The TCO should account for the initial investment if hardware infrastructure is required, operational and management costs, service fees, process change costs, and other dependent requirement costs, such as power, cooling, and physical safety needs. Once the TCO is understood, it supports building the return on investment (ROI) model for the preferred edge options.
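The TCO components listed above can be sketched as simple arithmetic. All figures below are hypothetical assumptions used only to show the shape of the calculation:

```python
# Hypothetical TCO/ROI sketch: every dollar figure is an illustrative
# assumption, not vendor pricing.

def total_cost_of_ownership(initial_hardware, annual_operations,
                            annual_service_fees, process_change,
                            annual_dependencies, years):
    """One-time costs plus recurring costs over the evaluation period."""
    one_time = initial_hardware + process_change
    recurring = (annual_operations + annual_service_fees
                 + annual_dependencies) * years
    return one_time + recurring

def simple_roi(annual_benefit, tco, years):
    """Return on investment expressed as a fraction of TCO."""
    return (annual_benefit * years - tco) / tco

# Example: a small edge appliance evaluated over 3 years (assumed numbers).
tco = total_cost_of_ownership(
    initial_hardware=20_000,    # appliance purchase
    annual_operations=5_000,    # management and truck rolls
    annual_service_fees=3_000,  # connectivity and support contracts
    process_change=4_000,       # one-time process and training cost
    annual_dependencies=2_000,  # power, cooling, physical safety
    years=3,
)
roi = simple_roi(annual_benefit=25_000, tco=tco, years=3)
```

Changing the evaluation period or shifting costs from CapEx (the one-time terms) to OpEx (the recurring terms) lets the same calculation compare a purchased appliance against a usage-based managed service.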
In summary, each of these factors is important in selecting the ideal edge computing model and can influence a customer’s cost-benefit analysis. A balanced trade-off is vital in making the final decision. In some instances, specific aspects such as space and performance become non-negotiable, and we might have to choose the next best alternative. In other cases, a combination of factors will provide the best option.
Mobile edge computing strikes a perfect balance between many of these factors and can address the customer’s needs for edge computing.
Learn more about how Verizon professional services can help you build the ideal edge architecture to help you meet your business needs.