Beginner

Attacking and Defending Prompt Injection

Bootcamp: Rapid Threat Modeling with GenAI & LLMs - June 6-7, 2024. Only 12 seats left - Secure your spot!
Learning Path
AI & LLM Security
Ideal for
Developer
Security Engineer
1
Hours
5
Lessons
3
Cloud Labs

In this course, students will learn about one of the most prevalent and pernicious vulnerabilities in the GenAI and LLM Security landscape, Prompt Injection. Prompt Injection arises when a system inadvertently processes user input as part of its command or execution context, potentially allowing an attacker to manipulate the AI into executing unintended actions or disclosing sensitive information.

This vulnerability is particularly relevant in scenarios where AI models, including chatbots, virtual assistants, and other interactive systems, accept and act upon user-generated prompts.Prompt Injection is a key class of vulnerabilities in the OWASP Top 10 for Large Language Models.

It is a vulnerability that has an impact on other vulnerabilities like Excessive Agency, Training Data Poisoning, Overreliance and Sensitive Information Disclosure. It is a vulnerability that is simple to exploit and hard to detect and fix.In this course we're going to look Prompt Injection from both an attack and defense perpsective. We're going to deploy private LLMs and perform Prompt Injection Attacks against them.

We're also going to look at a history of Prompt Injection attacks against real-world applications.Finally we'll be exploring the defense against Prompt Injection. We'll explore the reasons for the probablistic nature of LLMs being a reason for why it is hard to fix Prompt Injection. We'll explore a very interesting library and approach that can help mitigate and deter against Prompt Injections.

You might also like these courses

Or explore these Learning Paths

Labs

LLM Prompt Injection - Attack

LLM Prompt Injection - Sensitive Data Exposure

LLM Guard

Hands-on. Defensive. Bleeding-Edge.

There's no other training platform that does all three. Except AppSecEngineer.
Get Our Newsletter
Get Started
X
FOLLOW APPSECENGINEER
CONTACT

Contact Support

help@appsecengineer.com

1603 Capitol Avenue,
Suite 413A #2898,
Cheyenne, Wyoming 82001,
United States

Copyright AppSecEngineer © 2023