FCA: Learning a 3D Full-coverage Vehicle Camouflage for Multi-view Physical Adversarial Attack

Case study of the FCA. The code can be found in FCA.

Cases of Digital Attack

Camera distance is 3

[Figure: detection results before and after applying the camouflage]

Camera distance is 5

[Figure: detection results before and after applying the camouflage]

Camera distance is 10

[Figure: detection results before and after applying the camouflage]

Cases of Multi-view Attack

[Figure: multi-view detection results before and after applying the camouflage]

The first row shows the original detection results; the second row shows the detection results for the camouflaged vehicle.

[Figure: multi-view detection results before and after applying the camouflage]

The first row shows the original detection results; the second row shows the detection results for the camouflaged vehicle.

Ablation study

Different combinations of loss terms

As the figure shows, the different loss terms play different roles in the attack. For example, the camouflage generated with obj+smooth (we omit the smooth loss in the notation and denote it as obj) can successfully hide the vehicle; the camouflage generated with iou successfully suppresses the detection bounding boxes in the car region; and the camouflage generated with cls successfully makes the detector misclassify the car as another category. A sketch of how these terms could be combined is given below.
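The snippet below is a minimal sketch of one way the compared loss terms could be combined into a single attack objective; the helper names, tensor shapes, and weights are assumptions for illustration, not the repository's actual implementation.

```python
import torch


def smoothness_loss(texture):
    """Total-variation style penalty that keeps the texture smooth (assumed helper)."""
    dh = texture[:, 1:, :] - texture[:, :-1, :]
    dw = texture[:, :, 1:] - texture[:, :, :-1]
    return (dh ** 2).mean() + (dw ** 2).mean()


def combined_attack_loss(obj_conf, iou, cls_prob, texture,
                         w_obj=1.0, w_iou=1.0, w_cls=1.0, w_smooth=1e-3):
    """Sketch: minimize objectness, IoU, and car-class probability of the detector's
    predictions on the rendered camouflaged car, plus a smoothness regularizer.

    obj_conf : objectness scores of boxes overlapping the car
    iou      : IoU of those boxes with the ground-truth car box
    cls_prob : probability the detector assigns to the 'car' class
    texture  : the camouflage texture being optimized, shape (C, H, W)
    """
    loss_obj = obj_conf.mean()
    loss_iou = iou.mean()
    loss_cls = cls_prob.mean()
    loss_smooth = smoothness_loss(texture)
    return (w_obj * loss_obj + w_iou * loss_iou
            + w_cls * loss_cls + w_smooth * loss_smooth)
```

Dropping a weight to zero recovers the individual settings compared in the ablation (obj only, iou only, cls only).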

Different initialization methods

[Figure panels, left to right: original, basic initialization, random initialization, zero initialization]
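As a hedged illustration, the three initialization settings compared above could look like the following; the function name, shapes, and argument names are assumptions, not the repository's API.

```python
import torch


def init_texture(mode, shape=(3, 1024, 1024), basic_texture=None):
    """Hypothetical helper mirroring the three initialization settings above."""
    if mode == "zero":
        tex = torch.zeros(shape)        # all-zero starting texture
    elif mode == "random":
        tex = torch.rand(shape)         # uniform noise in [0, 1]
    elif mode == "basic":
        if basic_texture is None:
            raise ValueError("basic initialization needs the original vehicle texture")
        tex = basic_texture.clone()     # start from the vehicle's original texture
    else:
        raise ValueError(f"unknown initialization mode: {mode}")
    return tex.requires_grad_(True)     # the texture is the variable being optimized
```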
