diff --git a/opsec/openwebuilocalllms/40.png b/opsec/openwebuilocalllms/40.png
new file mode 100644
index 0000000..c075650
Binary files /dev/null and b/opsec/openwebuilocalllms/40.png differ
diff --git a/opsec/openwebuilocalllms/41.png b/opsec/openwebuilocalllms/41.png
new file mode 100644
index 0000000..cd65674
Binary files /dev/null and b/opsec/openwebuilocalllms/41.png differ
diff --git a/opsec/openwebuilocalllms/42.png b/opsec/openwebuilocalllms/42.png
new file mode 100644
index 0000000..ab71585
Binary files /dev/null and b/opsec/openwebuilocalllms/42.png differ
diff --git a/opsec/openwebuilocalllms/43.png b/opsec/openwebuilocalllms/43.png
new file mode 100644
index 0000000..93f626e
Binary files /dev/null and b/opsec/openwebuilocalllms/43.png differ
diff --git a/opsec/openwebuilocalllms/44.png b/opsec/openwebuilocalllms/44.png
new file mode 100644
index 0000000..9d5f74b
Binary files /dev/null and b/opsec/openwebuilocalllms/44.png differ
diff --git a/opsec/openwebuilocalllms/45.png b/opsec/openwebuilocalllms/45.png
new file mode 100644
index 0000000..026dcd4
Binary files /dev/null and b/opsec/openwebuilocalllms/45.png differ
diff --git a/opsec/openwebuilocalllms/46.png b/opsec/openwebuilocalllms/46.png
new file mode 100644
index 0000000..91b71b9
Binary files /dev/null and b/opsec/openwebuilocalllms/46.png differ
diff --git a/opsec/openwebuilocalllms/47.png b/opsec/openwebuilocalllms/47.png
new file mode 100644
index 0000000..7fc1ab7
Binary files /dev/null and b/opsec/openwebuilocalllms/47.png differ
diff --git a/opsec/openwebuilocalllms/48.png b/opsec/openwebuilocalllms/48.png
new file mode 100644
index 0000000..27e945c
Binary files /dev/null and b/opsec/openwebuilocalllms/48.png differ
diff --git a/opsec/openwebuilocalllms/49.png b/opsec/openwebuilocalllms/49.png
new file mode 100644
index 0000000..7828da5
Binary files /dev/null and b/opsec/openwebuilocalllms/49.png differ
diff --git a/opsec/openwebuilocalllms/index.html b/opsec/openwebuilocalllms/index.html
index 1eb2fa5..3426c66 100644
--- a/opsec/openwebuilocalllms/index.html
+++ b/opsec/openwebuilocalllms/index.html
@@ -1263,18 +1263,141 @@ Only output the translation, nothing else.
If you encounter issues with hardware acceleration on ollama, check:
+LLMs typically have a fairly distant knowledge cutoff, meaning that if you ask about recent developments in a rapidly changing field, they usually can't answer you directly.
+The models presented here generally have knowledge up to late 2023/early 2024, since they were trained around that time.
+
+With Open WebUI, models can search the web for up-to-date information on recent topics.
+The model is first asked to come up with search queries about the question being asked. The queries are sent to a traditional search engine (like DuckDuckGo, Bing, Google, or a SearXNG instance). A webdriver running on the backend visits each result and aggregates the content into a vector database, which the model then uses to enhance its response.
+In this section, we'll configure this feature to work entirely over Tor, maintaining server-side anonymity.
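+The flow described above can be sketched in Python. Everything here (the toy vector DB, the function names, the keyword-overlap "ranking") is illustrative only, not Open WebUI's actual API:

```python
# Illustrative sketch of the search-augmented flow; none of these names
# come from Open WebUI's codebase.
class ToyVectorDB:
    """Stand-in for a real embedding store; ranks by keyword overlap."""
    def __init__(self):
        self.docs = []

    def add(self, text):
        self.docs.append(text)

    def query(self, question):
        words = set(question.lower().split())
        return max(self.docs,
                   key=lambda d: len(words & set(d.lower().split())),
                   default="")

def answer_with_search(question, generate_queries, search, fetch, llm):
    db = ToyVectorDB()
    for q in generate_queries(question):   # 1. model proposes search queries
        for url in search(q):              # 2. each query hits a search engine
            db.add(fetch(url))             # 3. backend webdriver fetches each hit
    context = db.query(question)           # 4. retrieved text augments the prompt
    return llm(f"Context: {context}\nQuestion: {question}")
```

A real implementation ranks documents by embedding similarity rather than keyword overlap, but the overall shape (generate queries, fetch results, retrieve, augment) is the same.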
+Here's the output from Gemma 3 without search capability:
+
+And here are the same questions answered with DuckDuckGo search over Tor:
+
+To start, we need to know the IP address of the host on the docker0 interface.
+
oxeo@andromeda:~$ ip a show dev docker0
+3: docker0: mtu 1500 qdisc noqueue state DOWN group default
+ link/ether 3a:1c:1f:86:47:f0 brd ff:ff:ff:ff:ff:ff
+ inet 172.17.0.1/16 brd 172.17.255.255 scope global docker0
+ valid_lft forever preferred_lft forever
+In my case it's 172.17.0.1.
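+If you want to grab that address programmatically (for scripting the later steps), here's a small sketch that parses the sample line from the output above; in practice you'd feed it the real output of ip a show dev docker0:

```python
# Extract the docker0 IPv4 address from `ip a` output.
# `sample` is the relevant line from the output shown above.
import re

sample = "inet 172.17.0.1/16 brd 172.17.255.255 scope global docker0"
match = re.search(r"inet (\d+\.\d+\.\d+\.\d+)/\d+", sample)
if match:
    print(match.group(1))  # 172.17.0.1
```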
+
+
+Now, we'll make Tor listen on this interface with an HTTP CONNECT proxy (since Open WebUI's search feature doesn't support SOCKS5). Add the following line at the top of the /etc/tor/torrc file:
HTTPTunnelPort 172.17.0.1:9080
+Remember to replace the IP with the one you got in the previous step.
+
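+For context, HTTPTunnelPort speaks the plain HTTP CONNECT protocol: a client opens a TCP connection to 172.17.0.1:9080 and sends a request like the one built below, after which the proxy tunnels the raw (usually TLS) traffic. This is a sketch of the wire format, not Open WebUI code:

```python
# Build the HTTP CONNECT request a client sends to Tor's HTTPTunnelPort
# before tunnelling traffic through it.
def connect_request(host: str, port: int) -> bytes:
    return (f"CONNECT {host}:{port} HTTP/1.1\r\n"
            f"Host: {host}:{port}\r\n"
            f"\r\n").encode()

print(connect_request("duckduckgo.com", 443).decode())
```

The proxy replies with an HTTP status line (e.g. "HTTP/1.1 200 OK") and from then on simply relays bytes in both directions.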
+
+Restart Tor and check that it's listening on the desired interface:
oxeo@andromeda:~$ sudo systemctl restart tor
+oxeo@andromeda:~$ sudo ss -tulp
+Netid State Recv-Q Send-Q Local Address:Port Peer Address:Port Process
+udp UNCONN 0 0 0.0.0.0:bootpc 0.0.0.0:* users:(("dhclient",pid=490,fd=7))
+tcp LISTEN 0 4096 127.0.0.1:9050 0.0.0.0:* users:(("tor",pid=1572,fd=6))
+tcp LISTEN 0 4096 127.0.0.1:3000 0.0.0.0:* users:(("docker-proxy",pid=13793,fd=7))
+tcp LISTEN 0 128 0.0.0.0:ssh 0.0.0.0:* users:(("sshd",pid=522,fd=3))
+tcp LISTEN 0 4096 172.17.0.1:9080 0.0.0.0:* users:(("tor",pid=1572,fd=7))
+tcp LISTEN 0 4096 127.0.0.1:11434 0.0.0.0:* users:(("docker-proxy",pid=13708,fd=7))
+tcp LISTEN 0 128 [::]:ssh [::]:* users:(("sshd",pid=522,fd=4))
+
+
+
+We also need to adjust the ~/openwebui-stack/docker-compose.yml file.
+Add three environment variables telling Open WebUI which proxy to use for HTTP and HTTPS connections. The open-webui container configuration should now look like this:
+
open-webui:
+ image: ghcr.io/open-webui/open-webui:main
+ container_name: open-webui
+ volumes:
+ - open-webui:/app/backend/data
+ depends_on:
+ - ollama
+ ports:
+ - 127.0.0.1:3000:8080 # Remove "127.0.0.1:" to access from LAN
+ environment:
+ - 'OLLAMA_BASE_URL=http://ollama:11434'
+ - 'WEBUI_SECRET_KEY='
+ - 'HTTP_PROXY=http://host.docker.internal:9080'
+ - 'HTTPS_PROXY=http://host.docker.internal:9080'
+ - 'NO_PROXY=ollama'
+ extra_hosts:
+ - host.docker.internal:host-gateway
+ restart: unless-stopped
+The host.docker.internal domain resolves from within the container to the host's address. This allows the open-webui container to reach the HTTP CONNECT proxy exposed by the Tor daemon.
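+To see how these variables are interpreted inside the container, note that Python's urllib (like most HTTP clients, e.g. requests) resolves HTTP_PROXY, HTTPS_PROXY, and NO_PROXY from the environment; the values below mirror the docker-compose.yml above:

```python
# Demonstrate how HTTP_PROXY / HTTPS_PROXY / NO_PROXY are resolved.
import os
import urllib.request

os.environ["HTTP_PROXY"] = "http://host.docker.internal:9080"
os.environ["HTTPS_PROXY"] = "http://host.docker.internal:9080"
os.environ["NO_PROXY"] = "ollama"

proxies = urllib.request.getproxies()
print(proxies["http"])   # http://host.docker.internal:9080
print(proxies["https"])  # http://host.docker.internal:9080

# Requests to "ollama" bypass the proxy, so Open WebUI still reaches
# the ollama container directly instead of trying to route it via Tor:
print(bool(urllib.request.proxy_bypass("ollama")))  # True
```

This is why NO_PROXY=ollama matters: without it, Open WebUI would try to send its ollama API calls through the Tor proxy too.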
+
+
+Once that's done, we can restart the containers and return to the Open WebUI administrator settings once again.
oxeo@andromeda:~$ docker compose down; docker compose up -d
+
+
+This time, click on the Web Search tab.
+
+Here, enable the feature and select duckduckgo as the search engine. It's also possible to use a SearXNG instance, but it must have JSON output enabled (which is not the default).
+You should also enable Trust Proxy Environment so that every search query and every visited website is proxied through the Tor proxy we set up earlier.
+
+And that's it!
+Now go back to the chat interface, click the Web Search toggle, and send your question.
+
+If the search engine and browser backend are working, you should see the search queries the model came up with.
+
+After the model gives you an answer, you can see the sources it used.
+