Mesh networks open IoT's rich last-mile data seams for mining

By Kishore Jethanandani

Mesh networks (and the alternative star-topology networks that connect devices to routers) make it possible to mine data in the IoT's last mile. By interconnecting mobile devices, mesh networks can funnel data from sensors to local gateways or the cloud for analysis and decision-making.

Wired and wireless telecom networks do not reach the distant regions, or the nooks and crannies, where information-rich seams of data await mining. Mines, oilfields, ocean-going vessels, electricity grids, emergency-response sites such as wildfires, and farms are among the information-rich data sources rarely tapped for analytics and decision-making, in real time or otherwise.

Where telecom coverage is available, it does not necessarily reach all assets. Data generated by sensors embedded in equipment on factory floors, underground water pipes in cities, or inventory in warehouses cannot readily access cellular or wired networks.

A typical remote field site is an oil exploration and production operation in California with dispersed wells, where ten operators gathered data on tank levels, gas flows, and well-head pressures. Now, with a mesh network, operating managers can access this data anywhere and respond to alerts in real time.

Onsite mesh networks are deployed for microscopic monitoring of equipment to watch for losses such as energy leakages. Refineries are labyrinths of pipes with relief valves that release pressure to avoid blow-ups. One refinery in Singapore had one thousand valves to monitor manually. These valves do not always shut tightly enough, or fall due for maintenance, and gases trickle out. Over time, the losses add up. Acoustic signals can pinpoint otherwise unnoticeable leaks; mesh networks transmit the data to databases, where any deviation from the normal pattern prompts action to stop the losses.
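
As a rough illustration of that pattern-deviation step, the sketch below scores each valve's latest acoustic reading against its own history and flags outliers. The valve IDs, window size, and threshold are illustrative assumptions, not any vendor's API.

```python
# Minimal sketch: flag valves whose latest acoustic amplitude deviates
# sharply from their own history. All names and thresholds are assumptions.
import statistics

def detect_leaks(readings: dict[str, list[float]], z_threshold: float = 3.0) -> list[str]:
    """Return the IDs of valves whose newest reading is a statistical outlier."""
    leaking = []
    for valve_id, history in readings.items():
        if len(history) < 10:
            continue  # too little history to establish a baseline
        baseline, latest = history[:-1], history[-1]
        mean = statistics.mean(baseline)
        spread = statistics.stdev(baseline) or 1e-9  # avoid division by zero
        if (latest - mean) / spread > z_threshold:
            leaking.append(valve_id)
    return leaking
```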

The prospects for on-premise mesh network adoption have improved with the emergence of Bluetooth Smart and beacons. With Bluetooth Smart technology, an IP layer is built on top of the data layer for ease of connecting devices. Beacons are publicly available for anyone to use for building networks.

We spoke to Rob Chandhok, the President and Chief Operating Officer of San Francisco-based Helium Systems Incorporated, to understand his company's approach to mining the data in the IoT's last mile. Helium's current solutions target the healthcare industry, in particular its refrigeration and air conditioning equipment. "Hospitals have hundreds of refrigerators to store medicines which are likely to be damaged if the doors are inadvertently left open," Rob Chandhok explained to us.

The touchstone of Helium's technology is its programmable sensors, embedded with a choice of scripts capable of rudimentary arithmetic such as calculating the difference in temperature between two rooms. As a result, the sensors generate more data than would be possible with the investment in dumb hardware alone. Helium uses a star topology for the last-mile network, connected to the cloud, which hosts a platform for analytical solutions. The entire system is configurable, from the sensor to the cloud, to generate data for the desired thresholds and alerts or analytical models.

"The architecture is intended to optimize processes across the system," Rob Chandhok told us. He illustrated with the example of pressure differences: germs are less likely to enter a room if the internal pressure is higher than the external pressure.
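
A minimal sketch of the kind of on-sensor arithmetic described above might look like the following; the function and payload names are hypothetical, not Helium's scripting interface.

```python
# Hypothetical on-sensor script: compute a differential between two readings
# and flag when a positive-pressure room loses its margin. Names and units
# are illustrative assumptions.
def check_differential(internal: float, external: float, min_gap: float) -> dict:
    """Report the difference between two readings plus an alert flag."""
    gap = internal - external
    return {
        "difference": gap,
        "alert": gap < min_gap,  # pressure margin too small: germs can enter
    }

# Internal pressure should stay measurably above external pressure.
payload = check_differential(internal=1013.6, external=1012.9, min_gap=0.5)
```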

Configurable sensors help tailor a solution to the desired outcome. Vaccine potency is greatest if the temperature stays in the range of 2-8 degrees centigrade (35.6-46.4 degrees Fahrenheit). By contrast, cell cultures are rendered useless, and thousands of dollars lost, if the temperature falls outside the range of 37 (plus or minus 0.5) degrees centigrade.
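
Those two bands translate naturally into a small threshold table, sketched below with the ranges from the article; the configuration format itself is an assumption for illustration.

```python
# Safe temperature bands per payload type, in degrees centigrade. The ranges
# come from the article; the table layout is an illustrative assumption.
SAFE_RANGES_C = {
    "vaccine": (2.0, 8.0),         # potency is greatest between 2-8 °C
    "cell_culture": (36.5, 37.5),  # 37 °C plus or minus 0.5
}

def out_of_range(payload: str, temp_c: float) -> bool:
    low, high = SAFE_RANGES_C[payload]
    return not (low <= temp_c <= high)

assert out_of_range("vaccine", 9.2)            # e.g. a door left open
assert not out_of_range("cell_culture", 37.1)  # within tolerance
```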

In the hospitality industry, the purpose is to improve customer service by keeping temperatures in a range that minimizes discomfort. Guests do not have to wait for air-conditioning to bring temperatures to the desired levels, which vary by region and season.

The roster of solutions expands as Helium learns more about its clients' problems. In the course of working with customers in hospitals, Helium was made aware of the routine underutilization of beds. Speaking of future plans, Rob Chandhok said, "We can improve the rate of utilization of beds in hospitals with automatic and real-time tracking with infrared sensors." Currently, nurses manually record the occupancy of beds, usually with a lag. "Nurses are unable to keep pace as patients are moved after short intervals before and after operations," Rob Chandhok explained.

For a field site application, we spoke to Weft's Ganesh Sivaraman, Director of Product Management, as well as Erin O'Bannon, Director of Marketing, about the company's supply chain solutions. Weft uses mesh networks to determine the location and condition of cargo on ships, their expected time of arrival, the inflow of cargo once ships are docked at ports, and the extent of port congestion. "The mesh network brings near real-time visibility to the state of flow of the cargo," Ganesh Sivaraman told us. However, he clarified that the current applications do not afford tracking cargo at the pallet level or following its flow through the supply chain. "We use predictive analytics, using proprietary and third-party data sources, to help clients time the deployment of trucks to pick up the cargo with minimal delay," Erin O'Bannon told us. "With visibility, clients can anticipate delays, which lets them plan for alternative routes for trucks to shorten delivery times or switch to air transportation if the gain in time is worth the cost," she explained.

Mesh networks will evolve from vertical solutions to an array of horizontal ones. Home automation, for example, will likely be linked with fire prevention services and with homeowners' connected cars. Left to themselves, analytics companies could create duplicative infrastructure. We spoke to Shilpi Kumar of Filament, a company specializing in mesh connectivity for industries, to understand how this evolution will shape the architecture of last-mile IoT networks. "Decentralized mesh infrastructure-as-a-service serves the needs of multiple analytics companies with network policies enforced by blockchain-based contracts," Shilpi Kumar, Product Development, told us. "The interlinking of mesh networks with a secure overlay prepares the way for exchanges between devices in an ecosystem, such as vehicles paying for parking automatically," Shilpi Kumar revealed.

Mesh networks expand the universe of the Internet of Things by making remote data sources accessible. They also raise the level of granularity of data sources that are nominally reachable with existing networks. As a result, these mesh networks expand the array of opportunities for optimizing business processes.

Fog Computing: Bringing cloud vapors to the IoT fields

Sensor data creates a need for the local analytics that fog computing serves

By Kishore Jethanandani

Fog computing has arrived as a distinct class of customized solutions catering to local analytical needs in the physical ecologies that constitute the Internet of Things. Sensors at field sites like wind farms stream vast volumes of data that can be processed expeditiously only in the vicinity, and the resulting actionable intelligence is intuitive to local decision-makers. Cloud analytics, by contrast, delays data flows and their processing far too long, and the data loses its value.

Bringing analytics close to sensor data

The configuration and customization of fog computing solutions address a heterogeneous mix of speed, size, and intelligence needs. An illustrative case is Siemens' gas turbines, each with five thousand embedded sensors pouring data into databases. Data aggregated locally helps to compare performance across gas turbines, and this happens in the moment, as sensors stream live data that can be analyzed and acted on instantaneously.
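
A toy version of that local aggregation step appears below: summarize each turbine's recent stream at the site, then compare turbines against the site average. The feed names and the 90 percent tolerance are assumptions, not Siemens' schema.

```python
# Sketch of site-level aggregation: compare each turbine's live feed to the
# fleet average computed locally. Field names and tolerance are assumptions.
from statistics import mean

def flag_underperformers(feeds: dict[str, list[float]],
                         tolerance: float = 0.9) -> list[str]:
    """Flag turbines whose mean output trails the site average by over 10%."""
    per_turbine = {tid: mean(samples) for tid, samples in feeds.items()}
    site_average = mean(per_turbine.values())
    return [tid for tid, m in per_turbine.items() if m < tolerance * site_average]
```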

An entirely different situation is intelligent traffic lights that sense the light beams of incoming ambulances and clear the way for them, while alerting vehicles ahead to reroute or slow down before the traffic chokes. In this case, the data analysis spans a region.

Time is of the tactical essence for the users of information generated by sensors and connected devices. A typical application tracks the use of high-value assets such as jet engines, where a breakdown could have a spiral effect on the scheduling of flights. Sensors generate data every second or millisecond that needs to be captured and analyzed to predict equipment failures in the moment. The volumes of data are inevitably massive, and delays in processing intolerable. Fog analytics slashes the time delays that are inescapable with cloud computing by parsing the data locally.
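
One way to picture that local parsing, sketched under assumed per-second samples and a made-up drift threshold: hold a short rolling window in memory at the edge and alert on a sustained rise, instead of round-tripping raw data to the cloud.

```python
# Edge-side sketch: keep the last minute of readings in memory and flag a
# sustained upward drift locally. Window size and limit are assumptions.
from collections import deque

window = deque(maxlen=60)  # last 60 one-second vibration readings

def ingest(sample: float, drift_limit: float = 0.25) -> bool:
    """Return True when the recent trend suggests an impending failure."""
    window.append(sample)
    if len(window) < window.maxlen:
        return False  # still filling the window
    samples = list(window)
    early, late = samples[:30], samples[30:]
    return (sum(late) / 30) - (sum(early) / 30) > drift_limit
```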

Users have expressed keen interest in gathering and analyzing data generated by sensors but have reservations about the technology and its ability to serve their needs. A study completed by Dimensional Research in March 2015 found that eighty-six percent of respondents reported that faster and more flexible analytics would increase their return on investment. The lack of conviction about analytics technologies is palpable in the fact that eighty-three percent of the respondents collect data but only eight percent capture and analyze it in time to make critical decisions.

The value of data

We spoke to Syed Hoda, Chief Marketing Officer of ParStream, a company that offers an analytics database and a platform for real-time analytics on data volumes as large as petabytes of IoT data, to understand how new breakthroughs in technology help to extract value from it.

ParStream's technology helps companies gain efficiencies from IoT data, which is event-specific. The productivity of wind turbines, as measured by electricity generated, is higher when their velocity is proportionate to the speed of the wind, which is possible when their blades do not buck the wind direction. "By analyzing data, at once, companies can get better at generating actionable insights, and receive faster answers to what-if questions to take advantage of more opportunities to increase productivity," Syed Hoda told us.

ParStream slashes data processing time with edge processing, at the gateway level, rather than aggregating data centrally. It stores and analyzes data on wind turbines, for example, at the wind farm itself. Numerical calculation routines, embedded in local databases, process arrays of live data streams, instead of individual tables, to flexibly adjust to computation needs.
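
The array-oriented idea can be approximated in a few lines of numpy, shown below purely as an analogy: one numerical routine sweeps across every stream's window at once rather than iterating row by row. This illustrates the concept only; it is not ParStream's engine.

```python
# Analogy for array processing: apply one routine across all live streams at
# once instead of row-by-row. Synthetic data; not ParStream's implementation.
import numpy as np

rng = np.random.default_rng(0)
# rows = turbines, columns = the last 600 samples from each live stream
streams = rng.normal(loc=1.5, scale=0.1, size=(32, 600))

window_means = streams.mean(axis=1)  # one vectorized pass over all streams
window_peaks = streams.max(axis=1)
anomalous = np.flatnonzero(window_peaks > window_means + 0.5)
```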

Unstructured data and quality of service

We spoke to three experts, who preferred to remain anonymous, employed by a leading company in fog computing about the state of the technical and commercial viability of IoT data-based analytics. They do not believe that impromptu learning from streaming data flowing from devices in the Internet of Things is yet feasible. In their view, the IoT affords only preconfigured inquiries, such as comparing current data to historical experience for purposes like distinguishing an employee from an intruder.

In their view, analytics in local regions encompass applications that need unstructured data, such as image data for face recognition, which are usable only with a consistent quality of service. In a shared environment like the Internet of Things, the diversity of demands on a network is potentially detrimental to service quality. They believe that new processors afford the opportunity to dedicate the processing of individual data streams to specific cores to achieve the desired quality of service.
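
In operating-system terms, that dedication amounts to pinning each stream's worker to one core. The sketch below uses Linux's processor-affinity call; the stream IDs and worker layout are assumptions for illustration.

```python
# Sketch of per-stream core dedication: pin each worker process to one core
# so streams do not contend. Linux-only call; stream setup is an assumption.
import os
from multiprocessing import Process

def consume_stream(stream_id: int, core: int) -> None:
    os.sched_setaffinity(0, {core})  # bind this worker to a single core
    # ...read and process only this stream's packets here...

if __name__ == "__main__":
    cores = os.cpu_count() or 1
    workers = [Process(target=consume_stream, args=(sid, sid % cores))
               for sid in range(4)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
```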

Fog computing applications have become user-friendly, as devices with intuitive controls for functions like admitting visitors or overseeing an elevator are more widely available. The three experts confirmed that solutions for several verticals have been tested, found financially and operationally workable, and are ready for deployment.

Edge intelligence

Another approach to edge intelligence uses Java virtual machines and applets for intelligence gathering and for executing controls. We spoke to Kenneth Lowe, Device Integration Manager at Gemalto, about using the SensorLogic Platform's edge intelligence for critical applications like regulating the temperature in a data center. "Edge intelligence sends out an alert when the temperature rises above a threshold that is potentially damaging to the machines while allowing you to take action locally and initiate cooling, or in the worst case, shut the system down without waiting for a response from the cloud," Kenneth Lowe told us. "The SensorLogic Agent, a device-agnostic software element, is compiled into the Java applet that resides on the M2M module itself. As sensors detect an event, the Agent decides on how to respond, process the data locally, or send it to the cloud for an aggregated view," Kenneth Lowe explained.
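
The decision logic Lowe describes fits a simple tiered rule. The agent itself is a Java applet; the Python sketch below only mirrors the escalation path, with made-up thresholds and action names.

```python
# Mirrors the escalation Lowe describes: act locally on dangerous readings,
# forward routine ones to the cloud. Thresholds and actions are assumptions.
WARN_C, CRITICAL_C = 30.0, 40.0

def on_temperature(reading_c: float) -> str:
    if reading_c >= CRITICAL_C:
        return "shutdown"        # worst case: act now, don't wait on the cloud
    if reading_c >= WARN_C:
        return "start_cooling"   # local actuation plus an alert
    return "forward_to_cloud"    # routine data joins the aggregated view
```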

Java virtual machines help bring analytics from the cloud to the edge, not only to gateways but all the way to devices. We spoke to Steve Stover, Senior Director of Product Management at Predixion Software, which deploys analytic models on devices, on gateways, and in the cloud. Distributing analytics intelligence to devices and gateways lets solutions function in small or large footprints and in disconnected or connected communication environments.

"We can optimize wind turbine performance in real time by performing predictive analytics on data from multiple sensors embedded on the individual turbine in a wind farm," Steve Stover told us. "Orderly shutdowns prompted by predictive analytics running on the gateway at the edge of the wind farm help to avoid costly failures that could have a cascading effect," he added.

Similarly, analytics on the cloud can compare the performance of wind farms across regions for purposes of deciding investment levels in regional clusters of wind farms.

Fog computing expands the spectrum of analytics market opportunities by addressing needs across footprints of varied sizes. The geographical contexts, use cases, and dimensions of applications are more differentiated with fog computing.

Previously published by All Analytics of UBM TechWeb

Knowing the unknown by digging deep

By Kishore Jethanandani

Deep learning, the popular name for neural network algorithms, is a lot like solving a crossword puzzle: the unknowns in gargantuan data stores are knowable only by their relationships with the known. Unsupervised deep learning goes further and does not presume, at the outset, any knowledge of the interdependencies in the data.

Supervised deep learning is analogous to searching for an undersea destination like an oil well with the knowledge of the coastline alone. It reads the known relationships in the geophysical data in the layers underneath the seashore to reach, progressively, the oil well. Unsupervised learning first establishes whether a relationship exists between the contours of the coastline and the subterranean topography.

We spoke to Dr. Charles H. Martin, a long-time expert in machine learning and the founder of Calculation Consulting, about the prospects for enterprise applications of supervised and unsupervised deep learning. "Many in the business world recognize the vast potential of applications of deep learning, and the technology has matured for widespread adoption," Dr. Martin surmised. "The most hospitable culture for machine learning is scientific and open to recurring experimentation with ideas and evolving business models; the legacy enterprise fixation on engineering and static processes is a barrier to its progress," Dr. Martin underscored.

Unstructured data abounds, and the familiar methods of analyzing it with categories and correlations do not necessarily exist. The size and variety of such databases can elude modeling. These unstructured databases hold valuable information, such as social media conversations about brands, video from traffic cameras, sensor data from factory equipment, or trading data from exchanges, where finding insight is akin to finding a needle in a haystack. Deep learning algorithms find the brand value in positive and negative remarks on social media, the elusive fugitive in the traffic-camera video, the failing equipment in the factory data, or the investment opportunity in the trading data.

"Unsupervised deep learning helps in detecting patterns and hypothesis formulation while supervised deep learning is for hypothesis testing and deeper exploration," Dr. Martin concluded. "Unsupervised deep learning has proved to be useful for fraud detection and oil exploration: anomalies in the data point to cybercrime and oil respectively," he explained. "The prediction of corporate performance using granular data such as satellite imagery of traffic in the parking lots of retail companies is an example of the second generation of supervised deep learning," Dr. Martin revealed.
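
That division of labor can be sketched with off-the-shelf tools: an unsupervised model surfaces anomalies with no labels at all, and a supervised model then tests a hypothesis on labeled cases. The scikit-learn choices and synthetic data below are illustrative assumptions, not anything Dr. Martin's clients use.

```python
# Unsupervised pattern detection followed by a supervised test, using
# scikit-learn on synthetic data. Models and labels are assumptions.
import numpy as np
from sklearn.ensemble import IsolationForest, RandomForestClassifier

rng = np.random.default_rng(42)
transactions = rng.normal(size=(1000, 8))  # stand-in feature vectors

# Unsupervised: surface outliers (e.g. suspected fraud) without any labels.
is_anomaly = IsolationForest(random_state=0).fit_predict(transactions) == -1

# Supervised: once analysts label cases, test the hypothesis directly.
labels = is_anomaly.astype(int)  # placeholder for analyst-confirmed labels
classifier = RandomForestClassifier(random_state=0).fit(transactions, labels)
```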

Early detection of illnesses from medical imaging is one category of problems that deep learning is well suited to address. Citing the example of COPD (Chronic Obstructive Pulmonary Disease), Dave Sullivan, the CEO of Ersatz Labs, a cloud-based deep learning company based in San Francisco, told us, “the imaging data shows nodules and not all of them indicate COPD. It is hard for even a trained eye to tell one from another. Deep learning techniques evolve as they are calibrated and recalibrated (trained) on vast volumes of data gathered in the past, and they learn to distinguish with a high degree of accuracy for individual cases.”

Clarifai has democratized access to its deep learning technology with its API, which allows holders of data to analyze it and benefit from the insights. We spoke to Matthew Zeiler, the CEO and Founder of Clarifai, to understand how its partners use the technology. One of them is France-based i-nside.com, a healthcare company, which employs smartphones to conduct routine examinations of the mouth, ear, and throat to generate data for diagnosis. "In developing countries where doctors are scarce, the analysis of the data points to therapies that are reliable," Zeiler told us. "In developed countries, the analysis of the data supports the judgment of doctors, and they have reported satisfactory results," Zeiler added.

The enterprise is not the only place where deep learning has found a home; consumer applications like Google Now, Microsoft's Cortana, and Google's Assistant are available in the market. People are often anxious and distracted, at work or play, when they cannot keep track of critical events that could affect them or their families. Home surveillance watches pets, the return of young children from school, elderly relatives falling, the arrival of critical packages, and more. What matters is an alert on an unusual event. Camio uses the camera of a handheld phone or another home device, such as a computer, to capture video of happenings at home. When something irregular happens, an IFTTT integration sends alerts.

Deep learning mimics the neurons of the brain to sift out meaningful relationships otherwise lost in the clutter of humongous streams and data stores. Machines can do this faster when the correlations are known. When they are unknown, unsupervised learning helps discern the patterns before deciding whether to invest time in deeper investigations.