Zeroing in on Stuxnet-like cyber adversaries

by Kishore Jethanandani

Cyber defense is on high alert against assaults by unknown and elusive threats akin to Stuxnet, the worm that hit Iranian nuclear facilities. Firewalls — designed for known, signature-based malware — are routinely breached.

Zero-day exploits

Alternative approaches exist for protecting networks, AI-enabled services, and applications against elusive zero-day cyber attacks, but adversaries have found ways to subvert them. Preventive methodologies, which eliminate vulnerabilities at the time of software development, require a management transformation before they can be implemented.

SDN controllers are the big brothers of networks. They receive data from sensors on unusual activities in every corner of virtualized networks. When unusual activity is detected, SDN controllers prompt actuators to take action against the threat.
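As a rough illustration of that detect-and-respond loop, here is a minimal sketch; the report schema, thresholds, and quarantine-rule format are invented for illustration and do not correspond to any particular controller's API.

```python
from dataclasses import dataclass

@dataclass
class AnomalyReport:
    """Event a network sensor might send to the SDN controller (illustrative schema)."""
    source_ip: str
    destination_ip: str
    bytes_out: int
    reason: str

class SdnController:
    """Minimal sketch of the detect-and-respond loop described above."""

    def __init__(self, byte_threshold: int = 10_000_000):
        self.byte_threshold = byte_threshold
        self.quarantine_rules = []  # stands in for flow rules pushed to switches (the actuators)

    def handle_report(self, report: AnomalyReport) -> None:
        # Decide whether the reported activity warrants action.
        if report.bytes_out > self.byte_threshold or report.reason == "port_scan":
            self.quarantine(report.source_ip)

    def quarantine(self, ip: str) -> None:
        # In a real deployment this would program switches or firewalls.
        rule = {"match": {"src_ip": ip}, "action": "drop"}
        self.quarantine_rules.append(rule)
        print(f"quarantined {ip}: {rule}")

controller = SdnController()
controller.handle_report(AnomalyReport("10.0.0.7", "203.0.113.9", 50_000_000, "bulk_exfil"))
```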

Finding zero-day threats, however, is a formidable challenge. Virtualized networks generate a torrent of software components with unpatched bugs — unknown vulnerabilities that hackers can exploit unnoticed. IoT networks and connected devices are adding another wave of software to the mix. According to a recent survey by cybersecurity firm Herjavec Group, 70% of security operation centers see unknown, hidden, and emerging threats as their most challenging problem, and the capability they most want to acquire is threat detection (62%).

Zero-day attacks pinpoint specific bugs and leave only small traces of their footprints. When detected, their polymorphic, chameleon-like characteristics let them morph into new, unknown versions. Network perimeters, as a result, are chronically porous.

Zero-day vulnerabilities, usually discovered accidentally during routine maintenance, peaked at 4,958 in 2014 and declined to 3,986 in 2016, according to a Symantec report. Product development processes that incorporate security at the outset are believed to be responsible for the fall.

Law enforcement was initially able to foil zero-day attacks by listening to conversations among cybercriminals over the darknet. Hackers have since closed this source of information.

“Cybercriminals construct their private networks to prevent law enforcement from listening to their conversations,” said Mike Spanbauer, vice president of research strategy at NSS Labs Inc. A research study by NSS Labs on breach detection systems found that five of the seven products tested missed malware that evades firewalls or advanced malware like zero-day threats, and their average effectiveness was 93%. The shortfall of 7% leaves the entire network at risk.

Living off the land

The story is no different when cybercriminals are inside a virtualized network. They can blend into the network by acquiring credentials from the network’s administration, which is called “living off the land” in the cybersecurity world. Service providers are prone to decrypting data — as illustrated by a recent FTC case — when they move it across transportation layers, providing an opportunity for intruders to sniff out credentials. Intruders then use remote control protocols — meant for legitimate purposes such as load balancing — to maliciously control multiple VNFs.

Opportunities for deception abound in virtualized networks. For example, cybercriminals can masquerade as trusted authorities — such as those responsible for quality of service — to gain access to confidential information of unsuspecting victims across the network. They can spin up virtual machines, recreated from snapshots or images, and inject the stolen identities of trusted authorities to ward off any suspicion of malicious activity.

Hackers can exploit inconsistencies created unknowingly in the interdependent systems of virtual networks. The data network, for example, is governed by the policies of the management network, which the SDN controller executes. Adversaries can maliciously insert fake policies into the management network, and the SDN controller unwittingly implements them.

Artificial Intelligence

In this shadowy cybersecurity world, artificial intelligence is widely touted as a means to find the clues to lurking malware. Chris Morales, head of security analytics at Vectra, said his company specializes in tracking cyber threats inside networks by analyzing data from packet headers to find patterns in communication between devices and their relationships.

“We focus on the duration, timing, frequency, and volume of network traffic,” he said. “Data on a sequence of activities points to hidden risks. An illustrative sequence that is a telltale sign of malicious activity is an outside connection initiating largely outbound data transfers and small inbound requests, together with sweeps of ports and IP addresses searching for databases and file servers, followed by attempts at administrative access.”
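A minimal sketch of how such a sequence might be flagged from flow records follows; the flow tuples, thresholds, and port list are illustrative assumptions, not Vectra's method.

```python
from collections import defaultdict

# Toy flow records: (src, dst, dst_port, bytes_out, bytes_in)
flows = [
    ("10.0.0.5", "198.51.100.2", 443, 9_000_000, 12_000),   # large outbound transfer
    ("10.0.0.5", "10.0.1.10", 1433, 400, 200),              # database probe
    ("10.0.0.5", "10.0.1.11", 1433, 400, 200),
    ("10.0.0.5", "10.0.1.12", 3389, 300, 150),              # remote-admin port probe
]

ADMIN_PORTS = {22, 3389, 5985}

def suspicious_hosts(flows, exfil_bytes=5_000_000, sweep_targets=3):
    """Flag sources that combine bulk outbound data, host sweeps and admin-access attempts."""
    out_bytes = defaultdict(int)
    targets = defaultdict(set)
    admin_probes = defaultdict(int)
    for src, dst, port, b_out, b_in in flows:
        out_bytes[src] += b_out
        targets[src].add(dst)
        if port in ADMIN_PORTS:
            admin_probes[src] += 1
    return [src for src in out_bytes
            if out_bytes[src] > exfil_bytes
            and len(targets[src]) >= sweep_targets
            and admin_probes[src] > 0]

print(suspicious_hosts(flows))  # ['10.0.0.5']
```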

Artificial intelligence, however, is not a panacea, as machine-learning algorithms have chinks that cybercriminals can exploit with their own learning algorithms. AI-augmented malware tweaks its malicious code as it learns about the detection of its earlier versions. Cybercriminals can also fool the defending algorithms by feeding them subtly misleading data (adversarial learning), such as pictures of effeminate men that the model then mistakenly labels as female.
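A toy illustration of the adversarial idea, with made-up weights and features rather than any real detector: for a simple linear scorer, nudging the input against the sign of the weights is enough to flip its verdict.

```python
import numpy as np

# A toy linear "detector": score > 0 means "malicious". Weights and features are invented.
w = np.array([0.8, -0.3, 0.5])
b = -0.2

def score(x):
    return float(w @ x + b)

x = np.array([0.9, 0.1, 0.6])          # a sample the detector flags
print(score(x))                         # positive -> flagged

# Adversarial tweak: push each feature against the gradient of the score
# (for a linear model the gradient is just w), keeping the change small.
epsilon = 0.6
x_adv = x - epsilon * np.sign(w)
print(score(x_adv))                     # now below 0 -> slips past the detector
```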

As the cybersecurity arms race spirals ad infinitum, some industry experts are taking a step back to consider an entirely different course of action. “Hackers essentially reverse engineer code to find flaws in software and determine how to exploit them. By adopting methodologies like the secure development lifecycle (SDLC), software developers can use analytics tools to detect errors and eliminate them at the outset,” said Bill Horne, vice president and general manager at Intertrust Technologies Corporation.

Deep Instinct’s Shimon Noam Oren, head of Cyber Intelligence, had an altogether different take on the matter. His company’s data and analytical model are designed to track unknown unknowns while current models can at best detect known unknowns.

“Data on the behavior of malware limits the training of current algorithms to known threats,” he said. “Binary sequences, the most basic level of computer programming, account for all raw data and the infinite number of combinations that are possible. Some of these sequences represent current threats, and others are possibilities open to adversaries.

“Current predictive modeling techniques in security are linear while Deep Instinct’s model is non-linear, which affords greater flexibility for the machine to autonomously simulate and predict unknown unknowns extrapolating from data on existing threats as if solving a crossword puzzle.”
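To make the idea of learning over raw binary data concrete, here is a minimal sketch, and emphatically not Deep Instinct's model: synthetic byte sequences are reduced to byte histograms and fed to a small non-linear classifier. The data, the histogram features, and the scikit-learn model are all illustrative assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

def byte_histogram(seq: bytes) -> np.ndarray:
    """Represent a raw binary sequence as a normalized 256-bin byte histogram."""
    counts = np.bincount(np.frombuffer(seq, dtype=np.uint8), minlength=256)
    return counts / max(len(seq), 1)

# Synthetic stand-ins: "benign" sequences skew toward low byte values,
# "malicious" ones toward high values -- purely to make the sketch runnable.
benign = [bytes(rng.integers(0, 128, 512, dtype=np.uint8)) for _ in range(200)]
malicious = [bytes(rng.integers(96, 256, 512, dtype=np.uint8)) for _ in range(200)]

X = np.array([byte_histogram(s) for s in benign + malicious])
y = np.array([0] * len(benign) + [1] * len(malicious))

# A small non-linear model over the raw-byte representation.
clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0).fit(X, y)
print(clf.score(X, y))
```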

The most likely scenario for the future is that improved software development methodologies will slow the rate of increase of vulnerabilities from the current breakneck speed. Zero-defect software is improbable in this environment. Ever more sophisticated AI engines will build defenses against the remaining hidden threats.

A version of this article was previously published by Light Reading’s Telco Transformation

5G: Customized services and apps at the edge

Computing at the edge

by Kishore Jethanandani

5G radio technologies at the edge

5G’s raison d’être is to quickly provision real-time services and applications across end-to-end networks to customers and businesses in various locations. While that seems to be a tall order, the technologies are coalescing to deliver on the 5G promise of faster, better, and new services.

Adoption of 5G applications

While 5G applications are currently in the pilot stages, their adoption is expected to accelerate as a result of the demonstration effect from the 2018 Winter Olympics early next year in South Korea.

Barry Graham, director of product management at the TM Forum, asserts that it’s not all about the future. The industry has prioritized and enhanced the broadband aspect of 5G, which is more familiar to service providers. Mission-critical applications such as autonomous cars are further away on the horizon, while massive machine-type communications are even farther down the road, but the market size for the latter is expected to be in the trillions of dollars. When it is deployed, machine-type communication will be spread over hundreds of potential applications, and customized versions will be attractive because of the prospect of higher profitability.

“Internet of Everything applications are characterized by heterogeneity and a diversity of platforms,” Graham said. “They will benefit the most from a robust and pervasive cloud-native environment that can scale up and down.”

For a pervasive cloud-native environment, “the industry needs standards for the communication between the edge and the mobile edge when traffic does not have to be directed to the central cloud,” according to Graham.

“While making the best of their existing investments, CIOs should begin to plan for the Internet of Everything for use cases such as micropayments,” he concluded.

Edge Cloud

The edge cloud, enlarged with the incorporation of a cloudified RAN, will be transformed to meet the individual performance needs of 5G-enabled applications. Services can be tailor-made for customers and delivered in real time by placing all or most of the elements for service composition — such as VNFs, virtualized resources, microservices, management and orchestration software, and a cloud-native infrastructure that includes SaaS, IaaS, PaaS, and Cloud-RAN — in close proximity to customers at the edge.

The unknown at the moment is the economic feasibility of the edge clouds. “Clouds are centralized for a reason; their economic returns are known. For edge clouds, network operators and the industry must now work hand-in-hand to develop financially feasible use-cases together,” said Franz Seiser, vice president of Core Network and Services at Deutsche Telekom.

A step change in the expected performance of 5G undergirds the confidence to adapt to market changes as they happen. Compared to the latency of 15-20 milliseconds for 4G LTE, 5G latency will be below 4 milliseconds for broadband and as low as 0.5 milliseconds for mission-critical applications. 5G is expected to deliver actual throughput of 500 Mbit/s to 5 Gbit/s, while 4G LTE’s throughput ranges from 6.5 Mbit/s to 26.3 Mbit/s.

Service providers as master orchestrators

Service providers will play a pivotal role in orchestrating all of the elements needed for customizing services, which they will source from the stakeholders in the ecosystem at the edge of their networks.

The managed service provider’s role is to build an end-to-end virtual network slice based on the needs of the customer, brokering resources from one entity to another while also ensuring that service-level agreements are met.

While network slicing has vast potential for enabling specific services and applications on the fly, service providers need to ensure that they have insight into the end-to-end framework including inter-slice management.

“ONAP is one potential candidate for a framework to manage cloudified networks, including network slicing,” Deutsche Telekom’s Seiser said.

Network slicing 

Network slicing provides the means to customize business solutions for verticals based on the performance needs spelled out by the SLAs signed with customers. “5G system aims to provide a flexible platform to enable new business cases and models… network slicing emerges as a promising future-proof framework to adhere by the technological and business needs of different industries,” according to a European Union document.

The keystone of network slicing and its design challenge is “network slicing needs to be designed from an end-to-end perspective spanning over several technical domains (e.g., core, transport, and access networks) and administrative domains (e.g., different mobile network operators) including management and orchestration plane,” according to the European Union document.
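As a simplified picture of what an end-to-end slice definition spans, the sketch below models a slice as SLA targets plus the technical domains it crosses; the field names and numbers are illustrative assumptions, not a 3GPP or ONAP schema.

```python
from dataclasses import dataclass, field

@dataclass
class SliceSLA:
    max_latency_ms: float
    min_throughput_mbps: float
    reliability: float  # e.g. 0.99999 for mission-critical slices

@dataclass
class NetworkSlice:
    name: str
    sla: SliceSLA
    domains: list = field(default_factory=lambda: ["access", "transport", "core"])
    tenants: list = field(default_factory=list)

# Illustrative slices mirroring the 5G service classes discussed above.
broadband = NetworkSlice("enhanced-broadband", SliceSLA(4.0, 500.0, 0.999))
robotics = NetworkSlice("factory-robotics", SliceSLA(0.5, 50.0, 0.99999),
                        tenants=["plant-operator"])

for s in (broadband, robotics):
    print(s.name, s.sla, s.domains)
```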

On the stretch from the core to the access networks, each operator shares network capacity with peers or multiple tenants for the delivery of one or more services from its portfolio. That sharing saves on capital expenditures, which need to be minimized in order to make investments in customized services more viable.

Industrial automation, such as collaborative robotics — which includes humans interacting with robots — is a vertical where network slicing is expected to gain acceptance.

“The latency and throughput requirements of collaborative robotics vary within a factory floor, or across an industrial region, and network slicing flexibly tailors network services for each of them,” said Harbor Research analyst Daniel Intolubbe-Chmil.

Network slicing end-to-end — from the core to customers’ facilities — will become possible with the reliability, flexibility and the desired throughput (collectively described as network determinism) that cloudified RAN networks will be able to deliver.

“In conjunction with mmWave technologies, they will support throughput comparable to Ethernet. Beamforming helps to achieve high reliability that maintains the quality of the signal end-to-end. Small cells lend greater flexibility and speed in deployment compared to Ethernet,” Intolubbe-Chmil said.

SK Telecom deployed its first cloud RAN in the third quarter of last year in order to support a rollout of LTE-A, which is pre 5G, while also preparing a base for its upcoming 5G platform, according to Neil Shah, research director at Counterpoint Research.

“Network operators are ready to scale cloud RANs as they have greater clarity on the right mix of macro cell sites and small cells controlled by cloud baseband units,” Shah said.

Virtual reality 

Virtual reality looks to be the leading application of choice in the 5G environment. The countdown for its mass use has started as the organizers of the 2018 Winter Olympics next year in South Korea engineer their stadiums for 5G-enabled virtual reality even before the 5G standards are finalized.

Sports fans will be delighted with 360-degree virtual reality that includes the option to view footage from their preferred angle, such as from a sportsperson’s head-worn camera or from a drone with a more panoramic view. They will also be able to switch streams from one sports arena to another with virtual reality.

“An edge cloud placed in the sports stadium, meeting the processing demand from VR streams flowing to hundreds of fans, lowers latency,” Seiser said. “Bandwidth is used efficiently when the data rate is reduced by delivering only the sliver of a VR stream needed to render content for the user’s field of view.”
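A back-of-the-envelope sketch of that bandwidth saving follows; the bitrate, field-of-view angles, and safety margin are assumptions for illustration, not measured figures.

```python
# Rough, assumed numbers for a viewport-only 360-degree VR stream.
full_sphere_mbps = 100.0          # bitrate of the full 360-degree stream
horizontal_fov_deg = 110.0        # headset field of view (horizontal)
vertical_fov_deg = 90.0           # headset field of view (vertical)

# Fraction of the sphere actually inside the viewer's field of view.
viewport_fraction = (horizontal_fov_deg / 360.0) * (vertical_fov_deg / 180.0)
margin = 1.3                      # extra tiles around the viewport to absorb head motion

viewport_mbps = full_sphere_mbps * viewport_fraction * margin
print(f"viewport share of the sphere: {viewport_fraction:.2%}")
print(f"delivered bitrate: {viewport_mbps:.1f} Mbit/s vs {full_sphere_mbps} Mbit/s full stream")
```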

With a constellation of services and applications at the mobile edge, 5G cloud-native architectures are converging to make widespread customization of services possible for service providers and their end customers.

Mesh networks open IOT’s rich last mile data seams for mining

By Kishore Jethanandani

Mesh networks (or the alternative star topology networks connecting devices to routers) afford the mining of data in IOT’s last mile. By interconnecting mobile devices, mesh networks can funnel data from sensors to local gateways or the cloud for analysis and decision-making.

Wired and wireless telecom networks do not reach the distant regions or the nooks and crannies for the mining of information-rich seams of data. Mining, oilfields, ocean-going vessels, electricity grids, emergency response sites like wildfires, and agriculture are some of the information-rich data sources rarely tapped for analytics and decision-making in real-time or otherwise.

Where telecom coverage is available, it does not necessarily reach all assets. Data generated by sensors embedded in equipment on factory floors, underground water pipes in cities, or inventory in warehouses cannot readily access cellular or wired networks.

A typical remote field site is an oil exploration and production operation in California with dispersed wells, where ten operators gathered data on tank levels, gas flows, and well-head pressures. Now, with a mesh network, operating managers can access this data anywhere and respond to alerts in real time.

Onsite mesh networks are deployed for microscopic monitoring of equipment to watch for losses such as energy leakages. Refineries are labyrinths of pipes with relief valves that release pressure to avoid blow-ups. One refinery in Singapore had one thousand valves to monitor manually. These valves do not necessarily shut tightly enough, or they need maintenance, and gases trickle out. Over time, the losses add up to a lot. Acoustic signals can pinpoint otherwise unnoticeable leakages and transmit the data via mesh networks to databases; any deviation from the normal pattern prompts action to stop the losses.
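A minimal sketch of that deviation-from-pattern check: compare the latest acoustic reading from a valve against its own baseline and flag large departures. The readings and threshold are invented for illustration.

```python
import statistics

# Hourly acoustic readings (arbitrary units) from a sensor clamped to one relief valve.
baseline = [0.8, 0.9, 0.85, 0.95, 0.9, 0.88, 0.92, 0.87]
latest = 1.6

mean = statistics.mean(baseline)
stdev = statistics.stdev(baseline)
z = (latest - mean) / stdev

# Flag any reading that departs sharply from the valve's normal acoustic pattern.
if z > 3:
    print(f"possible leak: reading {latest} is {z:.1f} standard deviations above normal")
else:
    print("within normal pattern")
```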

The prospects of on-premise mesh networks adoption have improved with the emergence of smart Bluetooth and beacons. With smart Bluetooth technology, an IP layer is built on top of the data layer for ease of connecting devices. Beacons are publicly available for anyone to use for building networks.

We spoke to Rob Chandhok, the President and Chief Operating Officer at San Francisco-based Helium Systems Incorporated, to understand his company’s approach to mining the data in IOT’s last mile. Helium’s current solutions target the healthcare industry and in particular its refrigeration and air conditioning equipment. “Hospitals have hundreds of refrigerators to store medicines which are likely to be damaged if the doors are inadvertently left open,” Rob Chandhok explained to us.

The touchstone of Helium’s technology is its programmable sensors, embedded with a choice of scripts capable of rudimentary arithmetic, like calculating the difference in temperature between two rooms. As a result, the sensors generate more data than would be possible with investment in dumb hardware alone. Helium uses a star topology for the last-mile network connected to the cloud, which hosts a platform for analytical solutions. The entire system is configurable, from the sensor to the cloud, for generating data for the desired thresholds and alerts or analytical models.

“The architecture is intended to optimize processes across the system,” Rob Chandhok told us. He illustrated with an example of the impact of pressure differences; germs are less likely to enter if the internal pressure is higher than the external pressure.

Configurable sensors help to tailor a solution to the desired outcome. Vaccine potency is greatest if the temperature stays in the 2-8 degrees centigrade range (35.6-46.4 F). By contrast, cell cultures are rendered useless, and thousands of dollars lost, if the temperature falls outside the range of 37 degrees centigrade (plus or minus 0.5).
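A minimal sketch of how such configurable checks might look on a scripted sensor; the profiles, limits, and two-room delta are illustrative, not Helium's actual scripts.

```python
# Illustrative, configurable threshold profiles of the kind described above.
PROFILES = {
    "vaccine_fridge": {"min_c": 2.0, "max_c": 8.0},
    "cell_culture_incubator": {"min_c": 36.5, "max_c": 37.5},
}

def check(profile: str, reading_c: float) -> str:
    limits = PROFILES[profile]
    if limits["min_c"] <= reading_c <= limits["max_c"]:
        return f"{profile}: {reading_c} C within range"
    return f"ALERT {profile}: {reading_c} C outside {limits['min_c']}-{limits['max_c']} C"

print(check("vaccine_fridge", 9.3))
print(check("cell_culture_incubator", 37.2))

# The same scripted sensors can also compute simple derived measures,
# for example the temperature difference between two rooms.
print(f"room delta: {21.5 - 19.0:.1f} C")
```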

In the hospitality industry, the purpose is to improve customer service by keeping temperatures in range to minimize discomfort. Guests do not have to wait until air-conditioning brings temperatures to the desired levels which vary by region and seasons.

The roster of solutions expands as Helium learns more about the clients’ problems. In the course of working with customers in hospitals, Helium was made aware of the routine underutilization of beds. Speaking of future plans, Rob Chandhok said, “We can improve the rate of utilization of beds in hospitals with automatic and real-time tracking with infrared sensors.”  Currently, nurses manually record the state of occupancy of beds usually with a lag. “Nurses are unable to keep pace as patients are moved after short intervals before and after operations,” Rob Chandhok explained.

For a field site application, we spoke to Weft’s Ganesh Sivaraman, Director of Product Management, as well as Erin O’Bannon, Director of Marketing, about its supply chain solutions. The company uses mesh networks to determine the location and condition of cargo on ships, their expected time of arrival, the inflow of cargo once ships are docked at ports, and the extent of port congestion. “The mesh network brings near real-time visibility to the state of flow of the cargo,” Ganesh Sivaraman told us. However, he clarified that its applications do not currently afford tracking of cargo at the pallet level or its flow through the supply chain. “We use predictive analytics, using proprietary and third-party data sources, to help clients time the deployment of trucks to pick up the cargo with minimal delay,” Erin O’Bannon told us. “With visibility, clients can anticipate delays, which lets them plan alternative routes for trucks to shorten delivery times, or switch to air transportation if the gain in time is worth the cost,” she explained.

Mesh networks will evolve from vertical solutions to an array of horizontal ones. Home automation, for example, will likely be linked with fire prevention services and with the connected cars of homeowners. Left to themselves, analytics companies could create duplicative infrastructure. We spoke to Shilpi Kumar of Filament, a company specializing in mesh connectivity for industries, to understand how this evolution will shape the architecture of last-mile IOT networks. “Decentralized mesh infrastructure-as-a-service serves the needs of multiple analytics companies with network policies enforced by blockchain-based contracts,” Shilpi Kumar, Product Development, told us. “The interlinking of mesh networks with a secure overlay prepares the way for exchanges between devices in an ecosystem, such as vehicles paying for parking automatically,” she added.

Mesh networks expand the universe of the Internet of Things by making remote data sources accessible. They also raise the level of granularity of data sources that are nominally reachable with existing networks. As a result, these mesh networks expand the array of opportunities for optimizing business processes.

Fog Computing: Bringing cloud vapors to the IOT fields

Sensor data creates needs for local analytics that fog computing serves

By Kishore Jethanandani

Fog computing has arrived as a distinct class of customized solutions catering to local analytical needs in the physical ecologies that constitute the Internet of Things. Sensors stream vast volumes of data from field sites like wind farms; the data can be processed expeditiously only in the vicinity, and its actionable intelligence is intuitive to local decision-makers. Cloud analytics, by contrast, delays data flows and their processing for far too long, and the data loses its value.

Bringing Analytics close to sensor data

The configuration and customization of fog computing solutions address a heterogeneous mix of speed, size, and intelligence needs. An illustrative case is Siemens’ gas turbines, each of which has five thousand embedded sensors pouring data into databases for storage. Data aggregated locally helps to compare performance across gas turbines, and this happens in the moment, as sensors stream live data that can be analyzed for instantaneous action.

An entirely different situation is intelligent traffic lights that sense the light beams of incoming ambulances and clear the way for them, while alerting vehicles ahead to reroute or slow down before they choke the traffic. In this case, the data analysis spans a region.

Time is of the tactical essence for the users of information generated by sensors and connected devices. A typical application is tracking the use of high-value assets such as jet engines; a breakdown could have a spiral effect on the scheduling of flights. Sensors generate data every second, or every millisecond, that needs to be captured and analyzed to predict equipment failures in the moment. The volumes of data are inevitably massive, and delays in processing intolerable. Fog analytics slashes the time delays that are inescapable with cloud computing by parsing the data locally.

Users have expressed keen interest in gathering and analyzing data generated by sensors but have reservations about the technology and its ability to serve their needs. A study completed by Dimensional Research in March 2015 found that eighty-six percent of respondents reported that faster and more flexible analytics would increase their return on investment. The lack of conviction about analytics technologies is palpable: eighty-three percent of the respondents are collecting data, but only eight percent capture and analyze it in time to make critical decisions.

The value of data

We spoke to Syed Hoda, Chief Marketing Officer of ParStream, a company that offers an analytics database and a platform for real-time analytics on IoT data volumes as large as petabytes, to understand how new breakthroughs in technology help to extract value from it.

ParStream’s technology helps companies gain efficiencies from IoT data which is event specific. The productivity of wind turbines, as measured by electricity generated, is higher when their velocity is proportionate to the speed of flow of winds which is possible when their blades do not buck the wind direction. “By analyzing data, at once, companies can get better at generating actionable insights, and receive faster answers to what-if questions to take advantage of more opportunities to increase productivity,” Syed Hoda told us.

ParStream slashes data processing time by processing at the edge, at the gateway level, rather than aggregating data centrally. It stores and analyzes data on wind turbines, for example, at the wind farm. Numerical calculation routines, embedded in local databases, process arrays of live data streams, instead of individual tables, to flexibly adjust to computation needs.
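To make the edge-aggregation idea concrete, here is a minimal sketch of a gateway summarizing a one-minute window of per-turbine readings before anything travels to the central cloud; the readings and summary fields are invented and are not ParStream's implementation.

```python
from collections import defaultdict
from statistics import mean

# Simulated one-minute window of per-second readings at a wind-farm gateway:
# (turbine_id, wind_speed_mps, power_kw)
readings = [
    ("T1", 9.8, 1450), ("T1", 10.1, 1500), ("T1", 9.9, 1470),
    ("T2", 9.7, 1100), ("T2", 10.0, 1120), ("T2", 9.9, 1110),
]

by_turbine = defaultdict(list)
for turbine, wind, power in readings:
    by_turbine[turbine].append((wind, power))

# Aggregate locally; only this compact summary would travel to the central cloud.
summary = {
    t: {"avg_wind_mps": round(mean(w for w, _ in rows), 2),
        "avg_power_kw": round(mean(p for _, p in rows), 1)}
    for t, rows in by_turbine.items()
}
print(summary)

# Local comparison across turbines on the same farm, of the kind mentioned above.
worst = min(summary, key=lambda t: summary[t]["avg_power_kw"])
print(f"lowest output this window: {worst}")
```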

Unstructured data and quality of service

We spoke to three experts, who preferred to remain anonymous, employed by a leading company in fog computing, about the state of the technical and commercial viability of IOT data-based analytics. They do not believe that impromptu learning from streaming data flowing from devices in the Internet of Things is feasible yet. In their view, the IOT affords only preconfigured inquiries, such as comparing current data to historical experience for purposes such as distinguishing an employee from an intruder.

In their view, analytics in local regions encompass applications that need unstructured data, such as image data for face recognition, which are usable only with a consistent quality of service. In a shared Internet of Things environment, the diversity of demands on a network is potentially detrimental to service quality. They believe that new processors afford the opportunity to dedicate the processing of individual data streams to specific cores to achieve the desired quality of service.

Fog computing applications have become user-friendly, as devices with intuitive controls for functions like admitting visitors or overseeing an elevator are more widely available. The three experts confirmed that solutions for several verticals have been tested and found to be financially and operationally workable and ready for deployment.

Edge Intelligence

Another approach to edge intelligence is using Java virtual machines and applets, for intelligence gathering, and for executing controls. We spoke to Kenneth Lowe, Device Integration Manager, at Gemalto’s SensorLogic Platform about using edge intelligence for critical applications like regulating the temperature in a data center. “Edge intelligence sends out an alert when the temperature rises above a threshold that is potentially damaging to the machines while allowing you to take action locally and initiate cooling, or in the worst case, shut the system down without waiting for a response from the cloud,” Kenneth Lowe told us. “The SensorLogic Agent, a device-agnostic software element, is compiled into the Java applet that resides on the M2M module itself.  As sensors detect an event, the Agent decides on how to respond, process the data locally, or send it to the cloud for an aggregated view,” Kenneth Lowe explained.
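The SensorLogic Agent itself is described as a Java applet on the M2M module; purely to illustrate the decide-locally-or-forward logic Lowe outlines, here is a minimal sketch with invented thresholds.

```python
COOLING_THRESHOLD_C = 30.0
SHUTDOWN_THRESHOLD_C = 40.0

def on_temperature_event(reading_c: float) -> str:
    """Illustrative mirror of the decide-locally-or-forward logic described above."""
    if reading_c >= SHUTDOWN_THRESHOLD_C:
        return "local action: emergency shutdown initiated"
    if reading_c >= COOLING_THRESHOLD_C:
        return "local action: cooling started, alert raised"
    # Routine readings are forwarded for an aggregated view in the cloud.
    return "forwarded to cloud for aggregation"

for temp in (24.0, 33.5, 41.2):
    print(temp, "->", on_temperature_event(temp))
```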

Java virtual machines help to bring analytics from the cloud to the edge, not only to gateways but all the way to devices. We spoke to Steve Stover, Senior Director of Product Management at Predixion Software, which deploys analytic models on devices, gateways, and in the cloud. Distributing analytics intelligence to devices and gateways lets them function in small or large footprints and in disconnected or connected communication environments.

“We can optimize wind turbine performance in real time by performing predictive analytics on data from multiple sensors embedded on the individual turbines in a wind farm,” Steve Stover told us. “Orderly shutdowns prompted by predictive analytics running on the gateway at the edge of the wind farm help to avoid costly failures that could have a cascading effect.”

Similarly, analytics on the cloud can compare the performance of wind farms across regions for purposes of deciding investment levels in regional clusters of wind farms.

Fog computing expands the spectrum of analytics market opportunities by addressing the needs of varied sizes of footprints. The geographical context, the use cases, and the dimensions of applications are more differentiated with fog computing.

Previously published by All Analytics of UBM Techweb

Cyber-detectives chase cyber-criminals armed with Big Data

by Kishore Jethanandani

Cyber-security in enterprises is caught in a dangerous time warp—the long-held assumption that the invaluable information assets of companies can be cordoned off within a perimeter, protected by firewalls, no longer holds. The perimeter is porous, with countless access points available to a mobile and distributed workforce and to partners’ networks with remote access rights to corporate data via the cloud.

Mobile endpoints and their use of the cloud for sharing corporate data have been found to be the most vulnerable conduit that cyber-criminals exploit for launching the most sophisticated attacks (advanced persistent threats) intended to steal intellectual property. The Ponemon Institute’s survey of cyber-security attacks over twenty-four months found that 71 percent of companies reported that endpoint security risks are the most difficult to mitigate. The use of multiple mobile devices to access the corporate network was reported to be the highest risk, with 60 percent saying so; another 50 percent considered the use of personal mobile devices for work-related activity the highest risk. The second most important class of IT risks was third-party cloud applications, with 66 percent reporting so. The third IT risk of greatest concern was advanced persistent threats.

In an environment of pervasive vulnerabilities, enterprises are learning to remain vigilant about anomalous behavior pointing to an impending attack from criminals. “Behavioral patterns that do not conform to the normal rhythm of daily activity, often concurrent with large volumes of traffic, are the hallmarks of a cyber-criminal,” Dr. Vincent Berk, CEO and co-founder of Flowtraq, a Big Data cyber-security firm that specializes in identifying behavioral patterns of cyber-criminals, told us.  “A tell-tale sign of an imminent cyber attack is unexpected network reconnaissance activity,” he informed us. Human beings need to correlate several clues emerging from the data analysis before drawing conclusions because criminals learn new ways to evade surveillance.
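As a simple illustration of spotting reconnaissance in flow data, the sketch below flags a host that touches an unusual number of distinct internal hosts within a short window; the log, window, and limit are invented, not Flowtraq's method.

```python
from collections import defaultdict

# Toy connection log: (timestamp_s, src, dst, dst_port)
events = [
    (0, "10.0.0.9", "10.0.2.1", 22), (1, "10.0.0.9", "10.0.2.2", 22),
    (2, "10.0.0.9", "10.0.2.3", 80), (3, "10.0.0.9", "10.0.2.4", 445),
    (4, "10.0.0.9", "10.0.2.5", 3389), (5, "10.0.0.3", "10.0.2.1", 443),
]

WINDOW_S = 60
FANOUT_LIMIT = 4   # distinct hosts contacted within the window before we flag

touched = defaultdict(set)
for ts, src, dst, port in events:
    if ts <= WINDOW_S:
        touched[src].add(dst)

for src, hosts in touched.items():
    if len(hosts) >= FANOUT_LIMIT:
        print(f"reconnaissance suspected from {src}: {len(hosts)} hosts in {WINDOW_S}s")
```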

Enterprises now recognize the importance of learning to recognize the “fingerprints” of cyber-criminals from their behavior. A 2014 survey by PricewaterhouseCoopers found that 20 percent of respondents see security information and event management tools as a priority, and an equal number see event correlation as a priority. These technologies help to recognize the behavioral patterns of cyber-criminals.

“Scalability of Big Data solutions to identify the behavior of cyber-criminals is the most daunting challenge,” Dr. Vincent Berk told us. “We extract data from routers and switches anywhere in the pathway of data flows in and out of the extended enterprise,” he explained. “The fluidity of enterprise networks today, with increasing virtualization and recourse to the cloud, makes it challenging to track them,” he said. “Additionally, mergers and acquisitions add to the complexity as more routers and switches have to be identified and monitored.”

Dr. Berk underscored the importance of avoiding false positives, which could lead to denial of access for legitimate users of the network and interruption of business activity. “Ideally, we want to monitor at a more granular level, including the patterns of activity on each device in use, and any departures from the norm, to avoid false positives,” he told us. The filter of human intelligence is still needed to isolate false positives.

“Granular monitoring is more accurate and has uncovered sophisticated intruders who hide inside virtual private networks (VPNs) or encrypted data flows,” Dr. Berk revealed. Often, these sophisticated attackers have been there for years, unnoticed. “The VPNs and the encryption are not cracked, but the data is analyzed to understand why they are in the network,” Dr. Berk explained.

Cyber-security will increasingly be a battle of wits between intruders and the victims. Big Data analysis notwithstanding, cyber-criminals will find new ways to elude their hunters. The data analysis will provide clues about the ever changing methods used by cyber-criminals and means to guard against their attacks. The quality of human intelligence on either side will determine who wins.

 

Infrared mobile devices: light under the cover of darkness

By Kishore Jethanandani

Consumer mobile devices are extending their reach into the enterprise, fulfilling more than the communication needs of distributed workforces as they are incorporated into business processes. A bevy of companies have launched infra-red mobile devices to supplement or compete with the far more expensive infra-red cameras that have historically been used for specialized, high-value applications.

Mass use of inexpensive infrared mobile devices in the enterprise meets a variety of operational needs to increase productivity and minimize risk. Impending equipment failures, indicated by cracks, are invisible to the human eye but can be detected by infrared devices. Additionally, intruders hiding in dark corners are spotted by infrared devices. Leaks and attendant energy losses, unnoticed by the naked eye, are visible to infrared devices.

Infrared devices have a unique ability to discern objects that the eye cannot. Intruders, for example, are detected by reading the differences in body and room temperature.
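A toy sketch of that temperature-difference detection on a thermal frame; the synthetic frame, room temperature, and thresholds are purely illustrative.

```python
import numpy as np

# Synthetic thermal frame in degrees C: a ~21 C room with a warmer, person-sized blob.
frame = np.full((24, 32), 21.0)
frame[10:16, 12:16] = 33.0   # body-temperature region

room_temp = np.median(frame)
hot_pixels = frame > room_temp + 8.0   # pixels clearly warmer than the room

if hot_pixels.sum() > 15:
    ys, xs = np.nonzero(hot_pixels)
    print(f"warm object detected around row {ys.mean():.0f}, col {xs.mean():.0f}")
else:
    print("no warm objects")
```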

Among the new entrants is an Israeli company, Opgal, which has implemented Therm-App for law enforcement agencies and will expand its markets to private security firms, construction, and more. In collaboration with Lumus, it is also offering an equivalent wearable option with the ability to display thermal images in the center of an eyeglass.

Market leader FLIR, which has a long history in infra-red cameras, has announced FLIRone, a camera that is reportedly going to be used with Apple mobile devices.

Scotland-based Pyreos has launched a low-power mobile device for applications such as sensing noxious gases in industrial plants.

Opgal expects to compete with the entrenched incumbent, FLIR, by “offering an open platform for applications development over the cloud, with the added benefits of much higher resolution (384*288) and depth of field than is possible with existing handheld devices, to have adequate room to play for continuous development of new applications,” Mr. Amit Mattatia, CEO of Opgal, told us.

Opgal has gained significant traction in law enforcement by helping police officers become more effective by piercing the veil of darkness with infrared. According to an FBI study, the largest number of deaths of law enforcement officers took place between midnight and 2 am. The fatalities happen when officers are in pursuit of fugitives outside their vehicles. Officers are prone to injury or to losing their way in the darkness, and are hard to find when they do. “Officers wear Opgal’s mobile device on their body, and their pathway is traced by infra-red and communicated to a remote officer on a local map, which can also be kept as a record for court proceedings,” Mr. Mattatia told us.

Experienced professionals in the industry, long-time users of infra-red cameras, expressed skepticism about the ability of mobile devices to progress beyond some rudimentary applications. “A building inspector, for example, can detect a problem such as a wet spot due to a leakage but the resolution of images captured with mobile devices is not sharp enough to find its cause or source,” Gregory Stockton of Stockton Infrared, a company based in Randleman, North Carolina, told us. “Opgal’s Therm-App is an exception with its high resolution and depth of field but its price at four thousand dollars is comparable to proven and integrated alternatives like the FLIR E-8,” Mr. Stockton remarked. “The utility of most other mobile devices will be to detect problems before professional alternatives are sought for diagnosis and solutions,” Mr. Stockton concluded.

The services of professionals are needed for their specialized skills in any one of the varying types of infrared imagery. “Broadly, the infrared imaging market is divided between the short-wave, mid-wave, and long-wave. The short-wave and mid-wave are largely confined to the military and require very large investments, while most professional imaging for the enterprise happens in the long-wave,” Mr. John Gutierrez of Infrared Thermal Imaging Services told us. “It takes a trained eye to detect the source of problems by interpreting infrared images, such as those showing gas leaks in buildings, and the data on temperature differences and air movement,” Mr. Gutierrez explained. According to him, none of this can be done by a lay user of mobile devices with infrared cameras, but their widespread availability raises awareness of thermal imaging and the solutions possible with it.

The last word on the interplay of mobile devices and independent infrared cameras can hardly be said at this point in time. Mobile devices have been undergoing rapid transformation with improving capabilities in image capture and prolific applications development. Infrared cameras are more robust as hardware but users are more likely to be sensitive to their price as they weigh the alternative of a mobile device. One thing is certain—the market for thermal imaging will not be the same.

 

The post was previously published by the now defunct Mobility Hub of UBM Techweb

 

 

On the M2M road to pervasive intelligence

By Kishore Jethanandani

Machine-to-Machine (M2M) devices, the tiny radios and sensors that feed bits of data from the activities of objects such as moving trucks, have a whole lot more value when the data from each of them is aggregated to draw intelligence. That intelligence takes the form of performance metrics, like the expected time of arrival for trucks, or cargo monitoring, such as temperatures in the cabins of vehicles carrying perishables.

Driving safety performance has improved significantly as accident rates are lowered with detailed monitoring of driver behavior. Risky driving is predicted from real-time data, aggregated from a variety of sensors, on acceleration, brake use, maneuvers, and the turns most likely to cause accidents. Drivers are cautioned when their behavior risks an accident. Studies undertaken by Trimble show accident rates were lowered by 45% and corresponding insurance and related costs cut by 50%.
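A toy sketch of turning such sensor-derived counts into a driver alert; the event types, weights, and threshold are invented, not Trimble's scoring.

```python
# Counts of risky events in a short telematics window (illustrative).
events = {
    "hard_brakes": 3,        # decelerations beyond a g-force threshold
    "sharp_turns": 2,
    "rapid_accels": 1,
    "speeding_minutes": 4,
}
weights = {"hard_brakes": 3.0, "sharp_turns": 2.0, "rapid_accels": 1.5, "speeding_minutes": 0.5}

risk = sum(weights[k] * v for k, v in events.items())
print(f"risk score: {risk}")
if risk > 10:
    print("alert driver: risky driving pattern detected")
```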

Where multimodal transportation is involved, data aggregated from the whole gamut of transportation modes uncovers opportunities for business gains that would otherwise not be available. Portvision, for example, gained visibility into the movement of vessels by taking advantage of AIS (the Automatic Identification System), which is otherwise used for collision avoidance by the Coast Guard. The data can tell, in real time, when a vessel is close enough to the shore to benefit from switching to cellular communications. Ships tend to spend a great deal of time close to the shore before their cargo is unloaded, and can significantly lower their use of the more expensive satellite communications.
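A minimal sketch of the shore-proximity decision from an AIS position report; the coordinates and the assumed cellular range are illustrative.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

CELL_RANGE_KM = 20.0                       # assumed usable coastal cellular range
shore_tower = (29.3013, -94.7977)          # illustrative coordinates near a port
vessel = (29.25, -94.70)                   # position reported over AIS

distance = haversine_km(*vessel, *shore_tower)
link = "cellular" if distance <= CELL_RANGE_KM else "satellite"
print(f"{distance:.1f} km from shore -> use {link}")
```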

Telecom carriers, specialist M2M solution providers, strategic alliances, and M2M marketplaces are coming together to interconnect sensors, devices, and networks so that data can flow seamlessly to central databases, where it can be parsed for information. A global alliance of KPN, NTT DoCoMo, Rogers Communications, SingTel, Telefónica (through its Telefónica Digital unit), Telstra, and Vimpelcom will use a distinctive SIM card and a common web interface, and will centrally manage M2M devices. They will all collaborate with Jasper Wireless to manage the M2M network. Verizon acquired Hughes Telematics in June 2012 for its fleet management solutions. Orange acquired Data and Mobile in January 2012 for its fleet management software. Masternaut entered into a strategic alliance with Telefónica for joint marketing of M2M and related fleet management solutions. Deutsche Telekom created a global M2M marketplace where thousands of vendors can find selling opportunities and customers can comparison shop.

Transportation and Logistics leads as the target market for Communication Service Providers’ M2M services. A survey conducted by Informa found that 52% of carriers are targeting this industry.

Real-time M2M powered fleet management enables new services, cuts costs and speeds up regulatory compliance. In times of disasters or just adverse weather, vehicles can be tangled in accidents, stranded or rerouted. Software, such as that available from Agile Access Control, provides data on the status of fleets in real-time for contingency planning.

Data for compliance with hours-of-service regulations for drivers was not trustworthy when it was collected manually, prone as that process was to errors and omissions. With M2M-enabled fleet management software, like Qualcomm’s hours-of-service application, the tracking of hours of service is automated.

Rules governed by the FMCSA (Federal Motor Carrier Safety Administration) obligate drivers to prepare a daily DVIR (Driver Vehicle Inspection Report) and submit it to headquarters for tracking maintenance or safety issues. With M2M-enabled software like Cadec Global’s, the tedium is cut and processing of the data happens continually in real time.

In the supply chain, the devil of underperformance is in the details of minute operational data. The data is impossible to track manually with any degree of accuracy. M2M will help to keep the devil at bay.

A version of this article was previously published by Innovation Generation hosted by UBM Techweb

Augmented Reality: Where Mind and Matter Meet

Augmented reality is a looking glass into the reified world, which it juxtaposes with the living reality as seen and felt. It enables us to superimpose data, imagery, and video onto the real world we see. Like the mythical blind men and the elephant, we misconstrue the world when we see it only partially, from our own angle of view. Augmented reality virtually completes the picture in an attempt to see the whole.
