Talan.tech

Bing Chat (legacy)

by Microsoft

The original Bing Chat (codenamed Sydney). Subject of defamation incidents (Marks v. Microsoft).

Relevant industries: Legal, Media, Engineering

Risk Score: 45/100 (Elevated) · 21+ incidents · Legal 100 · Safety 0 · Privacy 54 · Regulatory 60 · Security 0

Risk Score

45/100
Elevated Risk

Apr 27, 2026

Risk Score Breakdown

Legal Risk

Court cases & lawsuits

100/100

Safety Risk

Incidents & harm events

0/100

Privacy Risk

Breaches & GDPR actions

54/100

Regulatory Risk

FTC, EU enforcement

60/100

Security Risk

CVEs & vulnerabilities

0/100

Incident Timeline

21 total incidents · showing 5 most recent

Apr 2026

LOW · Data Breach · ACTIVE
The Hacker News: Researchers Uncover 73 Fake VS Code Extensions Delivering GlassWorm v2 Malware

Researchers found 73 suspected fake Visual Studio Code extensions on the Open VSX repository linked to the GlassWorm v2 info-stealing campaign, potentially exposing developers who installed them to malware.

#hackernews #security #breach

Apr 2026

LOW · Data Breach · ACTIVE
The Hacker News: Tropic Trooper Uses Trojanized SumatraPDF and GitHub to Deploy AdaptixC2

A malware campaign targeting Chinese-speaking users used a trojanized SumatraPDF installer to deploy AdaptixC2 and gain remote access via VS Code tunnels. The incident description does not indicate a confirmed breach of Microsoft Bing Chat itself.

#hackernews #security #breach

Apr 2026

LOW · Data Breach · ACTIVE
The Hacker News: UNC6692 Impersonates IT Help Desk via Microsoft Teams to Deploy SNOW Malware

A threat group (UNC6692) impersonated IT help desk staff in Microsoft Teams to trick users into installing custom SNOW malware. Users targeted through Teams social engineering may be affected.

#hackernews #security #breach

Apr 2026

LOW · Data Breach · ACTIVE
The Hacker News: Project Glasswing Proved AI Can Find the Bugs. Who's Going to Fix Them?

Microsoft and other large tech companies were given early access to Anthropic’s Project Glasswing, an AI model that can discover software vulnerabilities at scale, rather than the model being released publicly. The incident raises questions about how quickly identified bugs are disclosed and fixed.

#hackernews #security #breach

Apr 2026

MEDIUM · Court Case · ACTIVE · 2:26-cv-00339
Court Case: Semantic Engines LLC v. Microsoft Corporation

Semantic Engines LLC sued Microsoft Corporation in the U.S. District Court for the Eastern District of Texas (Case No. 2:26-cv-00339). The specific claims have not yet been made public.

Court: District Court, E.D. Texas · #courtlistener #lawsuit #court-case

Frequently Asked Questions

What is Bing Chat (legacy)'s AI risk score?

Bing Chat (legacy) has an AI Risk Score of 45/100 (Elevated Risk). This score is calculated from 21+ documented public incidents across legal, safety, privacy, regulatory, and security categories.

Is Bing Chat (legacy) safe to use?

Bing Chat (legacy) by Microsoft has an elevated risk profile based on public data. Organizations should review the full incident list and conduct their own due diligence. This score does not constitute legal advice.

Does Bing Chat (legacy) have lawsuits?

Yes. Our public records show 1 court case for Bing Chat (legacy): Semantic Engines LLC v. Microsoft Corporation.

How is the AI Risk Score calculated?

Scores are weighted across 5 categories: Legal (25%), Safety (25%), Privacy (20%), Regulatory (15%), Security (15%). Each incident is scored by severity and type, then decayed based on age. Active lawsuits and fatal incidents do not decay.
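The weighted aggregation described above can be sketched in a few lines of Python. The category weights and the page's category scores are taken from this report; the exponential decay with a one-year half-life is a hypothetical illustration of "decayed based on age," not Talan.tech's published formula.

```python
import math

# Category weights from the FAQ (Legal 25%, Safety 25%, Privacy 20%,
# Regulatory 15%, Security 15%).
WEIGHTS = {
    "legal": 0.25,
    "safety": 0.25,
    "privacy": 0.20,
    "regulatory": 0.15,
    "security": 0.15,
}

def decayed(score, age_days, half_life_days=365, exempt=False):
    """Decay an incident score by age. Active lawsuits and fatal
    incidents are exempt from decay. The 365-day half-life is an
    assumed parameter for illustration."""
    if exempt:
        return score
    return score * 0.5 ** (age_days / half_life_days)

def overall(category_scores):
    """Weighted sum of per-category scores (each 0-100), rounded."""
    return round(sum(WEIGHTS[c] * s for c, s in category_scores.items()))

# The per-category scores shown on this page:
scores = {"legal": 100, "safety": 0, "privacy": 54, "regulatory": 60, "security": 0}
print(overall(scores))  # 45, matching the headline 45/100
```

Note that the weighted sum of this page's category scores (25 + 0 + 10.8 + 9 + 0 = 44.8) rounds to the displayed 45/100, so the headline score is consistent with the stated weights.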

Stay ahead of AI risk

Get alerts when Bing Chat (legacy) risk score changes

New lawsuits, breaches, and regulatory actions — delivered to your inbox.
