Patent attributes
Systems and methods generate an adversarial attack on a black-box object detection algorithm of a sensor by first obtaining an initial training data set from the black-box object detection algorithm: the algorithm performs object detection on initial input data, and its output forms the initial training data set. A substitute model is then trained with the initial training data set such that the substitute model's output replicates the black-box algorithm's output on that data. The details of operation of the black-box object detection algorithm are unknown, while the details of operation of the substitute model are known. The substitute model is used to perform the adversarial attack, which refers to identifying adversarial input data for which the black-box object detection algorithm will fail to perform accurate detection.
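What the abstract describes is, in effect, a transfer-style black-box attack: query the unknown detector to label data, fit a known substitute model to those labels, then craft adversarial inputs against the substitute and present them to the black box. The following is a minimal sketch of that flow, assuming PyTorch and simplifying the detection task to classification; `black_box_detect`, `train_substitute`, and `craft_adversarial` are illustrative names introduced here, not terms from the patent.

```python
# Sketch of the substitute-model transfer attack described above (assumptions:
# the black box is queryable as `black_box_detect`, a hypothetical callable
# returning class labels for a batch of images; the substitute is a small
# classifier standing in for a full object detector).
import torch
import torch.nn as nn
import torch.optim as optim


def build_training_set(black_box_detect, initial_inputs):
    """Query the black-box algorithm to label the initial input data."""
    with torch.no_grad():
        targets = black_box_detect(initial_inputs)  # black-box output = training targets
    return initial_inputs, targets


def train_substitute(substitute, inputs, targets, epochs=10, lr=1e-3):
    """Fit the substitute so its output replicates the black-box output."""
    opt = optim.Adam(substitute.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(substitute(inputs), targets)
        loss.backward()
        opt.step()
    return substitute


def craft_adversarial(substitute, inputs, targets, eps=0.03):
    """White-box FGSM step on the known substitute; the perturbed inputs are
    then shown to the black box in the hope that the attack transfers."""
    x = inputs.clone().detach().requires_grad_(True)
    loss = nn.CrossEntropyLoss()(substitute(x), targets)
    loss.backward()
    return (x + eps * x.grad.sign()).clamp(0.0, 1.0).detach()
```

In use, the outputs of `craft_adversarial` would be fed back to the black-box detector; inputs on which its detections degrade are the "adversarial input data" the abstract refers to.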