hallucinations topic
awesome-trustworthy-deep-learning
A curated list of trustworthy deep learning papers, updated daily.
agent-ci
Deploy once. Continuously improve your AI agents in production.
UHGEval
[ACL 2024] A user-friendly evaluation framework with an eval suite and benchmarks, including UHGEval, HaluEval, and HalluQA.
awesome-hallucination-detection
List of papers on hallucination detection in LLMs.
llm_aided_ocr
Enhance Tesseract OCR output for scanned PDFs by applying Large Language Model (LLM) corrections.
Cognitive-Mirage-Hallucinations-in-LLMs
Repository for the paper "Cognitive Mirage: A Review of Hallucinations in Large Language Models"
DCR-consistency
DCR-Consistency: Divide-Conquer-Reasoning for Consistency Evaluation and Improvement of Large Language Models
sac3
Official repo for SAC3: Reliable Hallucination Detection in Black-Box Language Models via Semantic-aware Cross-check Consistency
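The cross-check idea behind SAC3 — sampling multiple answers to the original and paraphrased questions, then flagging low agreement as a sign of hallucination — can be sketched roughly as below. This is a minimal illustration, not SAC3's actual implementation: the `ask` callable, the exact-match agreement score, and the `threshold` value are all assumptions, and the real method also cross-checks across different models.

```python
from collections import Counter

def cross_check_consistency(ask, question, paraphrases, n_samples=3, threshold=0.6):
    """Rough consistency check: query `ask` (a hypothetical LLM callable)
    several times on the question and its paraphrases, then measure how
    often the answers agree. Low agreement suggests a hallucinated answer."""
    answers = []
    for q in [question] + list(paraphrases):
        for _ in range(n_samples):
            # Normalize answers so trivial casing/whitespace differences
            # do not count as disagreement.
            answers.append(ask(q).strip().lower())
    # Agreement: fraction of samples matching the most common answer.
    most_common, count = Counter(answers).most_common(1)[0]
    agreement = count / len(answers)
    return {
        "answer": most_common,
        "agreement": agreement,
        "likely_hallucination": agreement < threshold,
    }
```

A model that answers consistently across rephrasings scores high agreement; one whose answers flip between samples falls below the threshold and is flagged. Exact-match agreement is the crudest possible choice here; a semantic similarity measure would be closer in spirit to the "semantic-aware" check the repo describes.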
Hallucination-Attack
An attack that induces hallucinations in LLMs.
dtt-multi-branch
Code for "Controlling Hallucinations at Word Level in Data-to-Text Generation" (C. Rebuffel, M. Roberti, L. Soulier, G. Scoutheeten, R. Cancelliere, P. Gallinari)