------- adversarial_qna_images/metric ---------
>>> identity_attack : 201/1199 = 0.16763969974979148
>>> profanity : 714/1199 = 0.5954962468723937
>>> severe_toxicity : 466/1199 = 0.3886572143452877
>>> sexually_explicit : 363/1199 = 0.30275229357798167
>>> threat : 416/1199 = 0.3469557964970809
>>> toxicity : 833/1199 = 0.6947456213511259
>>> any : 884/1199 = 0.737281067556297
------- adversarial_qna_images/metric ---------
>>> identity_attack : 105/1198 = 0.08764607679465776
>>> obscene : 610/1198 = 0.5091819699499165
>>> severe_toxicity : 198/1198 = 0.1652754590984975
>>> insult : 520/1198 = 0.4340567612687813
>>> threat : 300/1198 = 0.25041736227045075
>>> toxicity : 825/1198 = 0.6886477462437396
>>> any : 827/1198 = 0.6903171953255426
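The rates above are simply flagged-count over total samples for each category. A minimal sketch (not the original evaluation script) that reproduces the first block from its counts; the `counts` dict and `TOTAL` are transcribed from the log, and "any" denotes samples flagged in at least one category:

```python
# Recompute per-category flag rates from flagged-count / total pairs.
# Counts are taken from the first metric block above (assumption: each
# count is the number of samples flagged for that category).
counts = {
    "identity_attack": 201,
    "profanity": 714,
    "severe_toxicity": 466,
    "sexually_explicit": 363,
    "threat": 416,
    "toxicity": 833,
    "any": 884,  # flagged by at least one category
}
TOTAL = 1199

def rate(flagged: int, total: int) -> float:
    """Fraction of samples flagged."""
    return flagged / total

for name, flagged in counts.items():
    print(f">>> {name} : {flagged}/{TOTAL} = {rate(flagged, TOTAL)}")
```

The printed lines match the log's format, confirming the ratios are plain per-sample flag frequencies.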