1. The Dangerous Effects of a Frustratingly Easy LLMs Jailbreak Attack. Article, published 2025-01-01.
2. JailbreakTracer: Explainable Detection of Jailbreaking Prompts in LLMs Using Synthetic Data Generation. Article, published 2025-01-01.
3. LLM Abuse Prevention Tool Using GCG Jailbreak Attack Detection and DistilBERT-Based Ethics Judgment. Article, published 2025-03-01.
4. Improving LLM Outputs Against Jailbreak Attacks With Expert Model Integration. Article, published 2025-01-01.
5. Quantization-Based Jailbreaking Vulnerability Analysis: A Study on Performance and Safety of the Llama3-8B-Instruct Model. Article, published 2025-01-01.