LLM failure modes

Common ways large language models produce incorrect, unsafe, or unreliable outputs, such as hallucinated facts, prompt injection, jailbreaks, sycophancy, and malformed or truncated structured output.