July 12, 2021

University of Wollongong gives users real-time edge with Jetson-powered solutions

Closed-circuit television (CCTV) cameras are deployed practically everywhere around the world for operations, safety, security, and other purposes. While they generate a wealth of data, most of it goes unused because of privacy regulations and concerns. 


The Digital Living Lab (DLL) at the University of Wollongong has overcome the privacy challenge by using edge computing to process video feeds close to the source, outputting only the necessary data while extracting the maximum potential of the feed. Doing so allows organisations to deploy an intelligent sensing platform wherever needed and opens a world of opportunities. 

Converting data into knowledge and actions helps cities, communities and organisations to address key social, economic, health, safety, and environmental issues. 

Applications include monitoring of air quality and traffic flows of pedestrians, bicycles and vehicles in a city; identification and tracking of wildlife; detection of culvert blockage for stormwater management and flash flood early warnings; and detection of anti-social behaviours in public transportation. 

Need for real-time analysis of live video feeds 

As part of the Australian university's SMART Infrastructure Facility, DLL develops and delivers IoT solutions for a wide range of applications for smart cities and communities. 

It undertakes the research and development of next-generation smart edge computing devices and sensors that embed ethical and privacy-compliant artificial intelligence (AI). Built on open standards, these solutions are interoperable with existing infrastructure. 

DLL works on a variety of projects with a common need for real-time analysis of live video feeds. These projects involve tasks such as object detection, image segmentation, pose estimation, optical character recognition, object tracking, and ultra-low latency video analytics pipelines. 

“We needed a platform that is able to run AI on the edge in real time. After looking at the different options, the NVIDIA Jetson platform seemed to be the best choice, especially when we took into consideration the whole ecosystem available. We have been using it since 2018 and each improvement of the platform unlocked new applications,” said Dr Johan Barthelemy, Lecturer at SMART Infrastructure Facility, University of Wollongong. 

He added that being able to develop the solution on a workstation powered by an NVIDIA GPU, then port it seamlessly to a Jetson, was probably one of the most compelling features of the stack. 

AI at the edge 

Designed to enable AI at the edge, the NVIDIA Jetson platform includes small form-factor, high-performance computer modules and a stack of software, SDKs, services, and products to speed up development. 

Compatible with the same AI software and cloud-native workflows used across other NVIDIA platforms, it delivers the performance and power-efficiency organisations need to build software-defined intelligent machines at the edge. 

DLL uses NVIDIA’s transfer learning toolkit (TLT) and DeepStream for most of its workflows to retrain and deploy optimised state-of-the-art AI models. Jetson also offers the appropriate performance/power consumption ratio for deploying the model in the field – running on batteries and performing AI computation in real time. When DeepStream is not required, TensorRT allows easy deployment of models optimised for Jetson. 
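For the TensorRT deployment path described above, a model exported to ONNX is typically converted into an optimised engine on the target device using TensorRT's bundled `trtexec` tool. A hedged sketch of that step follows; the file names are placeholders, not the lab's actual models:

```shell
# Build a TensorRT engine from an ONNX model, enabling FP16 precision,
# which suits Jetson-class GPUs. Engines are tuned to the local GPU,
# so this is run on the target Jetson itself rather than the workstation.
trtexec --onnx=detector.onnx \
        --saveEngine=detector.engine \
        --fp16
```

The resulting `.engine` file can then be loaded by the application at startup for low-latency inference in the field.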

In some cases, after optimisation, DLL is able to run four models in parallel on a single Jetson Xavier NX for object detection, classification, tracking, and optical character recognition. 

According to Barthelemy, this kind of performance would not be achievable with traditional, unoptimised models. 

For models that need to run on an NVIDIA Tesla T4-powered workstation, the lab runs benchmarks to compare CPU, GPU, and different inference runtimes. TensorRT consistently came out first in frames per second processed, as well as in frames per second per watt and frames per second per dollar. 
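The kind of throughput benchmark described above can be sketched in a few lines of Python. Here the inference call is a placeholder standing in for whichever runtime is under test (CPU, GPU, or TensorRT); the real code would invoke the actual inference session instead:

```python
import time

def benchmark_fps(infer, frames, warmup=5):
    """Return frames per second achieved by an inference callable."""
    for f in frames[:warmup]:          # warm-up runs, excluded from timing
        infer(f)
    start = time.perf_counter()
    for f in frames:
        infer(f)
    elapsed = time.perf_counter() - start
    return len(frames) / elapsed

# Placeholder "model": a real benchmark would call a TensorRT,
# ONNX Runtime, or framework inference session here.
def dummy_infer(frame):
    return sum(frame) % 256

frames = [[i % 256] * 640 for i in range(100)]
fps = benchmark_fps(dummy_infer, frames)
print(f"{fps:.1f} frames/s")
```

Running the same harness against each runtime, and dividing the resulting FPS by measured power draw or hardware cost, yields the FPS-per-watt and FPS-per-dollar ratios the lab compares.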

For training, the lab uses multiple publicly available datasets, data collected from the field, and synthetic data generated with applications developed in-house on Unity and Unreal Engine. 

Depending on application requirements, inferencing is done on an NVIDIA T4 or on a Jetson TX2, Xavier NX, Xavier AGX, or Nano 4GB. Real-time video is processed at the edge on Jetson or T4. 

“There is a jungle of different software, frameworks and platforms available to develop AI-based solutions. This usually leads to compatibility issues and a lot of time being spent on solving those. TLT also allows us to speed up the training and take full advantage of multi-GPU infrastructure,” said Barthelemy. 

Access to live information 

With the Jetson platform, DLL is able to provide its partners with access to a rich source of live information. 

One application helps to improve security of pipeline infrastructure in the Australian Outback by detecting third-party intrusion. 

Another detects contamination of recycling bins, improving recycling and providing a better living environment for the community. 

DLL has also developed an application for Transport for NSW and Sydney Trains that detects anti-social behaviours, such as people fighting, on the public transport network. 

Yet another application looks at early detection of culvert blockage which can lower the impact of flash floods. 

Focus on the science 

The lab is currently focused on developing an autonomous, solar-powered, NVIDIA Jetson-based intelligent remote sensing solution that can be deployed in the city, in the harsh environment of the Australian Outback to monitor remote infrastructure such as pipelines, or even in wintry Antarctica to monitor the impact of climate change. 

Barthelemy is pleased with the capabilities of NVIDIA’s edge computing solutions, which have helped expedite research and development and produce solutions that make the world a better place for all. 

“The NVIDIA stack and the software-hardware integration (code once, deploy anywhere) allows us to focus on the science and quickly prototype, validate and deploy new smart AI-based applications at the edge,” he said.
