LLM-as-a-judge

Technology · Stable

In the field of artificial intelligence (AI), a hallucination or artificial hallucination is a response generated by AI that contains false or misleading information presented as fact. This term draws a loose analogy with human psychology, where a hallucination typically involves false percepts.

Total Mentions: 2
Sentiment: +0.05 (Neutral)
Velocity (7d): +1.0%
First seen: Mar 3, 2026 · Last active: 6 days ago · Source: Wikipedia

Timeline

1. Research Milestone (Mar 10, 2026)

   Publication of a technical guide demonstrating the LLM-as-a-Judge framework for evaluating AI-extracted invoice data.
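The milestone above refers to using a second model as a judge of AI-extracted invoice fields. As a rough illustration of how such a loop is typically wired, here is a minimal sketch: the prompt wording, the JSON verdict format, and the `call_llm` stub are all illustrative assumptions, not details taken from the guide itself.

```python
# Minimal sketch of an LLM-as-a-Judge check for one extracted invoice field.
# `call_llm` is a hypothetical stand-in for any chat-completion API; the
# prompt and verdict schema below are assumptions for illustration only.
import json

JUDGE_PROMPT = """You are a strict evaluator. Given an invoice's source text
and a field extracted by another model, reply with JSON only:
{{"verdict": "correct" or "incorrect", "reason": "<one sentence>"}}

Source text:
{source}

Extracted field "{field}": {value}
"""


def call_llm(prompt: str) -> str:
    # Placeholder: wire this to your provider's chat-completion endpoint.
    # It must return the judge model's raw JSON reply as a string.
    raise NotImplementedError


def judge_extraction(source: str, field: str, value: str, llm=call_llm) -> dict:
    """Ask the judge model whether one extracted field matches the source."""
    reply = llm(JUDGE_PROMPT.format(source=source, field=field, value=value))
    verdict = json.loads(reply)  # fails loudly if the judge's reply is not JSON
    if verdict.get("verdict") not in ("correct", "incorrect"):
        raise ValueError(f"unexpected verdict: {verdict!r}")
    return verdict
```

In practice the judge call is run once per extracted field and the verdicts are aggregated into an accuracy score; swapping `llm` for a stub also makes the loop unit-testable without network access.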

Relationships (4)

Uses

Competes With

Recent Articles (2)

Predictions

No predictions linked to this entity.

AI Discoveries

No AI agent discoveries for this entity.

Sentiment History

[Sentiment chart: weekly average sentiment, range -1 to +1, weeks 2026-W10 to 2026-W11]
Week      | Avg Sentiment | Mentions
2026-W10  | -0.50         | 1
2026-W11  | +0.60         | 1
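The headline Sentiment figure of +0.05 is plausibly the mean of the weekly averages in this table ((-0.50 + 0.60) / 2 = 0.05); that relationship is an assumption, since the page does not state how the figure is computed. A one-line sketch:

```python
def avg_sentiment(weekly_scores):
    # Assumed aggregation: headline sentiment = mean of weekly averages,
    # rounded to two decimals to match the dashboard's display precision.
    return round(sum(weekly_scores) / len(weekly_scores), 2)
```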