Telecommunications companies are laying fiber optic cables to our homes, and banks and insurance companies are provisioning high-density links between geographically dispersed data centers. As we stream audio and video over reliable 4G networks, it's easy to forget that, with ever-growing demands on network bandwidth, response-time latency remains a real challenge for IoT development.
"We're seeing a trend where there is just too much data and too many endpoints coming in," said Sean Bowen, CEO of Push Technology Ltd., a London-based company that develops software to increase performance of Web and mobile apps.
And that is the crux of the problem. Developers who write IoT applications -- whether it's an application running on a NodeBot or software managing the dashboard of a luxury car -- are often tempted to upload every last piece of information to the server, worried that some esoteric bit of data might become pivotally important at some point in the future.
As a result, IoT devices often counter-productively record every metric possible, sending that data to the server and flooding a network. Testing goes well in a controlled environment, but when applications are tested in the wild, the network becomes a big pitfall. "Across America, there are so many areas that don't have fiber optics of any sort. It's changing gradually, but there will always be places where Internet connections will not be great," said Bowen.
The tenets of strategic IoT development
Software architects and app developers who specialize in IoT should adopt certain strategies to avoid network latency. "We need a solution that can accommodate and be sympathetic to high latency and poor performance," said Peter Hughes, lead engineer at Push Technology. "Networks are getting overloaded. People need to be more intelligent and more efficient about what they build."
There are plenty of tools and technologies that help compress or reduce the amount of data that applications send across the wire, but Hughes suggested that organizations embark on a much simpler approach initially. "The No. 1 avenue for improving latency is to send less data," he said.
It's straightforward advice that can pay significant dividends. "Everyone is sending large amounts of JSON down the wire to every consumer. At large scale, that is a lot of bandwidth, and the more bytes you send, the greater the latency."
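One way to act on "send less data" is to transmit only the fields that have changed since the last update, rather than a full JSON snapshot on every tick. The sketch below illustrates the idea with hypothetical sensor readings and field names; it is an assumption about payload shape, not any specific product's protocol.

```python
import json

# Hypothetical sensor snapshots; field names are illustrative only.
previous = {"device_id": "pump-17", "temp_c": 41.2, "rpm": 1180, "status": "ok"}
current  = {"device_id": "pump-17", "temp_c": 41.3, "rpm": 1180, "status": "ok"}

def delta(prev, curr):
    """Return only the fields that changed since the last transmission."""
    return {k: v for k, v in curr.items() if prev.get(k) != v}

# Compare the bytes that would actually go over the wire.
full_payload  = json.dumps(current).encode("utf-8")
delta_payload = json.dumps(delta(previous, current)).encode("utf-8")

print(len(full_payload), len(delta_payload))  # the delta is a fraction of the size
```

The trade-off is that the receiver must keep the last known state per device to reassemble the full picture, so a periodic full snapshot is still worth sending as a safety net.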
Organizations that stream audio and video or engage in high-frequency interactions with users might want to bring a data architect into the ranks to ensure that the right design patterns are being used and that superfluous data is not sent back and forth across the network.
Even IoT devices that use smaller loads can benefit from the following advice:
- Ensure that only necessary data goes over the network
- Be cautious about extending the scope of the data sent across the network
- Audit existing payloads regularly and strip out superfluous fields
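The checklist above can be enforced mechanically with an explicit whitelist of the fields the backend actually needs, so that scope creep has to be a deliberate decision rather than an accident. This is a minimal sketch; the field names and the logging approach are assumptions for illustration.

```python
# Explicit whitelist: the only fields the server is known to need.
REQUIRED_FIELDS = {"device_id", "timestamp", "temp_c"}

def prepare_payload(reading: dict) -> dict:
    """Keep only the required fields; log anything else instead of sending it."""
    extra = set(reading) - REQUIRED_FIELDS
    if extra:
        # Superfluous data stays off the network; surface it for review.
        print(f"dropping unneeded fields: {sorted(extra)}")
    return {k: v for k, v in reading.items() if k in REQUIRED_FIELDS}

reading = {"device_id": "sensor-9", "timestamp": 1700000000,
           "temp_c": 22.5, "debug_trace": "...", "raw_adc": 512}
payload = prepare_payload(reading)
```

Widening `REQUIRED_FIELDS` then becomes a reviewable code change, which is exactly the caution about extending scope that the second bullet calls for.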
Latency reduction strategies
Another strategy that has improved performance is to store data locally on devices and only download information when those devices return to the base -- for example, when a drone lands back at its headquarters. This is a good option for monitoring data and metrics gathering that isn't required for real-time operations. Alternatively, companies can upload stored data at non-peak times during which network latency is at a minimum.
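The store-and-forward pattern described above can be sketched as a local buffer that accumulates non-critical metrics and drains only when the device is docked or during an assumed off-peak window. The docking check and the off-peak hours here are illustrative assumptions, not a prescribed policy.

```python
from collections import deque

class StoreAndForwardBuffer:
    """Buffer non-critical metrics locally; upload when docked or off-peak.

    The off-peak window (02:00-04:59 local time) is an assumed quiet period.
    """

    OFF_PEAK_HOURS = range(2, 5)

    def __init__(self):
        self._buffer = deque()

    def record(self, metric: dict) -> None:
        # No network traffic at collection time; just store locally.
        self._buffer.append(metric)

    def should_flush(self, docked: bool, hour: int) -> bool:
        return docked or hour in self.OFF_PEAK_HOURS

    def flush(self, send) -> int:
        """Drain the buffer through `send`, a caller-supplied upload callable."""
        count = 0
        while self._buffer:
            send(self._buffer.popleft())
            count += 1
        return count

buf = StoreAndForwardBuffer()
buf.record({"battery_pct": 88})
buf.record({"battery_pct": 86})
if buf.should_flush(docked=True, hour=14):
    uploaded = buf.flush(send=print)  # swap print for the real uploader
```

A production version would also bound the buffer and persist it to flash so that a power loss does not discard the accumulated metrics.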
Software developers can easily get spoiled by the high-quality networks on which most of their IoT development is performed, but the real world diverges significantly from controlled test environments. To avoid the frustrating surprises that under-performing networks can thrust upon IoT devices, it's best to think about the network first, and design applications that operate optimally, even when latency is a problem and network bandwidth is at a premium.
Which strategies do you employ to reduce latency in your IoT development? Let us know.