aisecurity topic
Machine_Learning_CTF_Challenges
CTF challenges designed and implemented in machine learning applications
watchtower
AIShield Watchtower: Dive Deep into AI's Secrets! 🔍 Open-source tool by AIShield for AI model insights & vulnerability scans. Secure your AI supply chain today! ⚙️🛡️
ComPromptMized
ComPromptMized: Unleashing Zero-click Worms that Target GenAI-Powered Applications
Website-Prompt-Injection
Website Prompt Injection is a concept that allows for the injection of prompts into an AI system via a website's. This technique exploits the interaction between users, websites, and AI systems to exe...
Image-Prompt-Injection
Image Prompt Injection is a Python script that demonstrates how to embed a secret prompt within an image using steganography techniques. This hidden prompt can be later extracted by an AI system for a...
vger
An interactive CLI application for interacting with authenticated Jupyter instances.
ASSET
This repository is the official implementation of the paper "ASSET: Robust Backdoor Data Detection Across a Multiplicity of Deep Learning Paradigms." ASSET achieves state-of-the-art reliability in de...
AI-Prompt-Injection-List
AI/LLM Prompt Injection List is a curated collection of prompts designed for testing AI or Large Language Models (LLMs) for prompt injection vulnerabilities. This list aims to provide a comprehensive...
ASCII-Art-Prompt-Injection
ASCII Art Prompt Injection is a novel approach to hacking AI assistants using ASCII art. This project leverages the distracting nature of ASCII art to bypass security measures and inject prompts into...