fix picture placement in local llm tutorials

oxeo0 2025-06-01 15:01:16 +02:00
parent 0e31fbdfb8
commit 68a6d47a1c

@@ -22,6 +22,8 @@ The vast amount of sensitive user data stored can have devastating consequences
 **Assume all conversations with online chatbots can be public at any time.**
+![](5.png)
 ### Claude 4 Opus contacting authorities
 On 22nd May 2025, a researcher from Anthropic posted a tweet talking about their latest model release.
@@ -34,8 +36,6 @@ After facing severe backlash from users, Sam Bowman deleted his post. ([archived
 If you want to learn more, [Sam Bent](../index.md#wall-of-fame-as-of-may-2025) made a [YouTube video](https://www.youtube.com/watch?v=apvxd7RODDI) on this situation.
-![](5.png)
 ## **Privacy LLM frontends**
 A partial solution to those problems could be a service that aggregates multiple model APIs and anonymizes their users. A bit like [searxng](https://github.com/searxng/searxng) does for search engines.