Edge devices (and edge computing) are the future. Although this may sound a little cliché, it is the truth. The edge computing industry is growing as quickly as technology can support it, and it looks like we will need it to.
IoT global market
The IoT (Internet of Things) industry alone will have put 15 billion new IoT devices into operation by the year 2020, according to a recent Forbes article titled “10 Charts That Will Challenge Your Perspective of IoT’s growth”. IoT devices are not the only edge devices we have to deal with, as the total number of connected edge devices also includes security devices, phones, sensors, retail sales devices, and industrial and home automation devices.
The IoT (Internet of Things) industry alone will have put 15 billion new IoT devices into operation by the year 2020
The sheer number of devices brings possible security and bandwidth implications into perspective. The amount of data that will need to be transferred and processed by all of these devices will be massive. Business owners and automation engineers alike need to consider how all of this data and processing will be handled.
Ever-expanding edge devices market
As the number of edge devices in the marketplace rises, along with their use among consumers and businesses, central server architectures will no longer be able to handle the data from all of these devices. We are talking about hundreds of billions and even trillions of devices. According to a study by IHS Markit researchers, there were 245 million CCTV cameras worldwide. One has to imagine there are at least 25% as many access control devices (61.25 million devices), based on a $344 million market cap also calculated by IHS Markit’s researchers.
If all the other edge devices mentioned earlier are considered, then one can see that trying to route them all through servers for processing is going to become difficult if it hasn’t already. Arguably it already has, as evidenced by the popularity of cloud-based solutions among businesses that already use a lot of edge devices or process a lot of information on a constant basis.
The question is this: is cloud computing the most effective and efficient solution as the IoT industry grows and edge devices become so numerous? My belief is that it is not. Take the example of a $399 USD device just larger than a pack of cards that runs a CPU benchmarked at the same level as a mid-size desktop. This device has 8GB of RAM and 64GB of eMMC storage built in, plus a GPU that can comfortably support a 4K signal at 60Hz, with support for NVMe SSDs for add-on storage. This would have been unbelievable five years ago.
As the price of edge computing goes down, which it has done dramatically over the last 10 years (as can be seen with my recent purchase), the price of maintaining a central server that can perform the processing required by all of the new devices being introduced to the world (due to the low cost of entry for edge device manufacturers) keeps going up. This guarantees a point at which it will be less expensive for businesses and consumers alike to do the bulk of their processing at the edge as opposed to in central server architectures.
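That break-even argument can be sketched as a toy calculation. Every number below (the per-device hardware price, the monthly running cost, the monthly central processing cost) is a hypothetical assumption for illustration only, not a figure from any vendor:

```python
# Hypothetical break-even sketch: after how many months does a one-off
# edge hardware purchase beat an ongoing per-device central processing fee?
# All cost figures are illustrative assumptions.

EDGE_DEVICE_COST = 399.0        # one-off cost per edge device (USD)
EDGE_RUNNING_PER_MONTH = 3.0    # assumed power/maintenance per device (USD/month)
CENTRAL_COST_PER_DEVICE = 15.0  # assumed central processing + bandwidth (USD/month)

def total_cost(devices: int, months: int, edge: bool) -> float:
    """Total cost of ownership over a period, for edge or central processing."""
    if edge:
        return devices * (EDGE_DEVICE_COST + EDGE_RUNNING_PER_MONTH * months)
    return devices * CENTRAL_COST_PER_DEVICE * months

def breakeven_months() -> int:
    """First month at which edge processing is cheaper per device."""
    month = 1
    while total_cost(1, month, edge=True) >= total_cost(1, month, edge=False):
        month += 1
    return month

print(breakeven_months())
```

Under these made-up numbers the edge device pays for itself within a few years; the point is only that as the hardware price drops, the crossover month moves earlier.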
Cloud computing is now being overtaken by edge computing, the method of processing data at the edge of the network in the devices themselves
There is a plethora of articles discussing and detailing the opposition between the two sides of the computing technology coin: cloud computing and edge computing. The gist is that “cloud computing” was the hot new buzzword three years ago and is now being overtaken by “edge computing.” The truth is that cloud computing is simply a central server architecture hosted at someone else’s location.
Edge computing is going to be a necessary development in the technology industry
Edge computing is the method of processing data at the edge of the network (in the devices themselves), requiring fewer resources at a central location. There is certainly a use case for both; however, the shift to edge computing among the general public and small to mid-sized businesses will not be a surprise to those players who have been paying attention.
One article titled, “Next Big Thing In Cloud Computing Puts Amazon And Its Peers On The Edge” by Investor’s Business Daily takes the stance that edge computing is going to completely displace centralised cloud computing and even coins the phrase, “Cloud computing, decentralised” to explain edge computing.
It reflects the stance that most technology experts seem to be taking, including, according to the same article, Amazon Web Services’ VP of Technology, Marco Argenti. We know that edge computing is going to be a necessary development in the technology industry, and it is happening as I write this, and quickly at that.
Cost efficiency of edge processing
As time goes on, the intersection between the prices of network bandwidth, edge processing, and maintaining powerful central servers will make edge processing the most efficient and cost-effective way to maintain a scalable network in any environment, including datacenters.
Owning a central server or utilising edge computing become the better options
As it currently stands, most residential users can only achieve a 1Gbps WAN (internet) connection, and small to medium-sized businesses can’t get much more, but in my experience often get much less. When more than 1Gbps needs to be processed, cloud computing becomes very expensive, at which point owning a central server or utilising edge computing become the better options.
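To make that bandwidth constraint concrete, here is a rough sketch of how many cameras a 1Gbps uplink can carry when raw video is streamed to the cloud for processing versus when edge devices process locally and send only event metadata. Both per-camera bitrates are hypothetical assumptions:

```python
# Rough uplink sketch: cameras supported on a 1 Gbps WAN link when
# streaming raw video to the cloud vs. sending edge-processed events.
# Both bitrates are illustrative assumptions.

LINK_GBPS = 1.0
RAW_STREAM_MBPS = 8.0       # assumed per-camera bitrate for a raw 1080p stream
EVENT_STREAM_MBPS = 0.05    # assumed per-camera metadata/event traffic after edge processing

def max_cameras(per_camera_mbps: float, link_gbps: float = LINK_GBPS) -> int:
    """How many cameras fit on the uplink at a given per-camera bitrate."""
    return round(link_gbps * 1000 / per_camera_mbps)

print(max_cameras(RAW_STREAM_MBPS))    # cloud processing: stream everything
print(max_cameras(EVENT_STREAM_MBPS))  # edge processing: stream only events
```

Under these assumed bitrates the same link supports on the order of a hundred cameras when everything is shipped to the cloud, versus tens of thousands when the heavy processing stays at the edge.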
Then look at total cost of ownership: when edge computing costs less than maintaining central server architectures, edge computing becomes the single best option. So, I’ll say it again: edge devices (and edge computing) are the future.