Story

Ask HN: LLM Poisoning Resources

totallygeeky Friday, January 16, 2026

I'm sure this will get some pushback, but I was wondering if anyone had resources on how to integrate traps/tar pits into websites, llm prompting via hidden text, pushing bad data to llms and the like.

I have found a few different types of recommended approaches, such as:

- https://hiddenlayer.com/innovation-hub/novel-universal-bypas...

- tHe SpONgeBoB MetHOd

- https://rnsaffn.com/poison3/

I'm looking for more or some guidance on how to combine methods to really create something noxious.

3 0
Read on Hacker News