SeeByte has been awarded a contract by the U.K.'s Defence Science and Technology Laboratory (Dstl) to create an advanced Deep Neural Network (DNN) framework that will provide enhanced situational awareness capabilities for passive sensor suites.
Future Active Protection Systems (APS), and specifically Modular Integrated Protection Systems (MIPS), are likely to incorporate passive sensor subsystems as a crucial element of their sensing suite.
With advanced image processing techniques, passive imaging sensors could provide additional enhanced situational awareness capabilities such as object detection, identification and tracking, image segmentation, and range estimation, whilst also providing their core APS function.
SeeByte’s multi-task DNN framework, developed under Phase 2 of the Defence and Security Accelerator (DASA) ‘The Advanced Vision 2020 and Beyond’ competition, will provide semantic image segmentation, object detection, and depth estimation (bearing and range) outputs for monocular Electro-Optic/Infrared (EO/IR) sensors. EO/IR sensors are a key military capability used for surveillance, reconnaissance, target acquisition, threat warning, target detection and more. SeeByte’s previous experience with multi-task DNN architectures has demonstrated that it is possible to substantially compress DNN model size and complexity without a drop in performance.
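The article does not describe SeeByte's architecture, but the multi-task idea it names (one network, three outputs) is commonly built as a shared feature backbone feeding separate task heads. The toy NumPy sketch below illustrates that structure only: the backbone, head weights, and image sizes are all hypothetical placeholders, not SeeByte's design.

```python
import numpy as np

rng = np.random.default_rng(0)

H, W, FEAT, NUM_CLASSES = 8, 8, 16, 3  # toy image and feature sizes


def shared_backbone(image, w):
    # Shared feature extraction: one dense layer with ReLU over flattened pixels.
    # A real system would use convolutional layers; this stands in for them.
    return np.maximum(image.reshape(-1) @ w, 0.0)


def multi_task_forward(image, params):
    """Run one image through the shared backbone and three task heads."""
    feat = shared_backbone(image, params["backbone"])
    h, w = image.shape
    # Head 1: semantic segmentation -- per-pixel class logits.
    seg_logits = (feat @ params["seg"]).reshape(h, w, NUM_CLASSES)
    # Head 2: object detection -- one (x, y, w, h, score) box per image.
    box = feat @ params["det"]
    # Head 3: depth estimation -- a per-pixel range map.
    depth = (feat @ params["depth"]).reshape(h, w)
    return seg_logits, box, depth


# Randomly initialised weights, shared backbone + three small heads.
params = {
    "backbone": rng.normal(size=(H * W, FEAT)) * 0.1,
    "seg": rng.normal(size=(FEAT, H * W * NUM_CLASSES)) * 0.1,
    "det": rng.normal(size=(FEAT, 5)) * 0.1,
    "depth": rng.normal(size=(FEAT, H * W)) * 0.1,
}

image = rng.normal(size=(H, W))
seg, box, depth = multi_task_forward(image, params)
print(seg.shape, box.shape, depth.shape)  # (8, 8, 3) (5,) (8, 8)
```

Sharing the backbone across tasks is also what makes the compression claim plausible: most of the parameters are paid for once rather than three times.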
In later phases, SeeByte will address the scarcity of imagery datasets containing relevant target objects by using Generative Adversarial Networks (GANs) to inject synthetic objects into real imagery.
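The injection step of such a pipeline is essentially compositing: a generated object patch is blended into a real frame using an alpha mask. The sketch below shows that compositing step only, with a hard-coded patch standing in for a GAN generator's output; the function name and sizes are illustrative assumptions, not details from the programme.

```python
import numpy as np

rng = np.random.default_rng(1)


def inject_object(background, patch, mask, top, left):
    """Alpha-blend a synthetic object patch into a real image at (top, left).

    `mask` is 1.0 where the object is opaque and 0.0 where the real
    background should show through.
    """
    out = background.copy()
    ph, pw = patch.shape
    region = out[top:top + ph, left:left + pw]
    out[top:top + ph, left:left + pw] = mask * patch + (1.0 - mask) * region
    return out


# Stand-in for a GAN generator's output: a bright square object, fully opaque.
patch = np.full((4, 4), 0.9)
mask = np.ones((4, 4))

# Stand-in for a real (e.g. infrared) frame: dim random clutter.
background = rng.uniform(0.0, 0.2, size=(16, 16))

augmented = inject_object(background, patch, mask, top=6, left=6)
print(augmented.shape)  # (16, 16)
```

The augmented frame can then be labelled automatically (the object's position is known by construction), which is what makes this an attractive answer to small real-world datasets.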