Bing Chat (legacy)
by Microsoft
The original Bing Chat (codenamed Sydney), associated with defamation incidents (Marks v. Microsoft).
Risk Score: 45/100 (Elevated) · 21+ incidents · Legal 100 · Safety 0 · Privacy 54 · Regulatory 60 · Security 0
Risk Score as of Apr 27, 2026
Risk Score Breakdown
Legal Risk (court cases & lawsuits): 100/100
Safety Risk (incidents & harm events): 0/100
Privacy Risk (breaches & GDPR actions): 54/100
Regulatory Risk (FTC, EU enforcement): 60/100
Security Risk (CVEs & vulnerabilities): 0/100
Incident Timeline
21 total incidents · showing 5 most recent
Apr 2026
Researchers found 73 suspected fake Visual Studio Code extensions on the Open VSX repository linked to the GlassWorm v2 info-stealing campaign, potentially exposing developers who installed them to malware.
Apr 2026
A malware campaign targeting Chinese-speaking users used a trojanized SumatraPDF installer to deploy AdaptixC2 and gain remote access via VS Code tunnels. The incident description does not indicate a confirmed breach of Microsoft Bing Chat itself.
Apr 2026
A threat group (UNC6692) impersonated IT help desk staff in Microsoft Teams to trick users into installing custom SNOW malware. Users targeted through Teams social engineering may be affected.
Apr 2026
Anthropic’s Project Glasswing, an AI model that can discover software vulnerabilities at scale, was given early access to Microsoft and other large tech companies instead of being released publicly. The incident raises questions about how quickly identified bugs are disclosed and fixed.
Apr 2026
Semantic Engines LLC sued Microsoft Corporation in the U.S. District Court for the Eastern District of Texas (Case No. 2:26-cv-00339). The specific claims have not yet been made public.
Frequently Asked Questions
What is Bing Chat (legacy)'s AI risk score?
Bing Chat (legacy) has an AI Risk Score of 45/100 (Elevated Risk). This score is calculated from 21+ documented public incidents across legal, safety, privacy, regulatory, and security categories.
Is Bing Chat (legacy) safe to use?
Bing Chat (legacy) by Microsoft has an elevated risk profile based on public data. Organizations should review the full incident list and conduct their own due diligence. This score does not constitute legal advice.
Does Bing Chat (legacy) have lawsuits?
Yes. Our public records show 1 court case for Bing Chat (legacy): Semantic Engines LLC v. Microsoft Corporation.
How is the AI Risk Score calculated?
Scores are weighted across 5 categories: Legal (25%), Safety (25%), Privacy (20%), Regulatory (15%), Security (15%). Each incident is scored by severity and type, then decayed based on age. Active lawsuits and fatal incidents do not decay.
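The weighting described above can be sketched in a few lines of Python. The category weights come from the text; the exponential decay form, its half-life, and the per-category cap at 100 are assumptions introduced for illustration, not the site's published methodology.

```python
from dataclasses import dataclass

# Category weights as stated in the FAQ: Legal 25%, Safety 25%,
# Privacy 20%, Regulatory 15%, Security 15%.
WEIGHTS = {
    "legal": 0.25,
    "safety": 0.25,
    "privacy": 0.20,
    "regulatory": 0.15,
    "security": 0.15,
}

@dataclass
class Incident:
    category: str        # one of the WEIGHTS keys
    severity: float      # severity on a 0-100 scale
    age_years: float     # time elapsed since the incident
    no_decay: bool = False  # active lawsuits / fatal incidents never decay

def decayed_severity(inc: Incident, half_life: float = 2.0) -> float:
    """Age-decayed severity; the 2-year half-life is an assumed parameter."""
    if inc.no_decay:
        return inc.severity
    return inc.severity * 0.5 ** (inc.age_years / half_life)

def risk_score(incidents: list[Incident]) -> float:
    """Weighted sum of per-category totals, each capped at 100."""
    per_cat = {c: 0.0 for c in WEIGHTS}
    for inc in incidents:
        per_cat[inc.category] += decayed_severity(inc)
    return sum(WEIGHTS[c] * min(total, 100.0) for c, total in per_cat.items())
```

Plugging in the category scores shown on this page (Legal 100, Privacy 54, Regulatory 60, others 0) reproduces the headline figure: 0.25·100 + 0.20·54 + 0.15·60 = 44.8, which rounds to the displayed 45/100.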