My (Docker-based) configuration:
Linux > Docker Container > Nvidia Runtime > Open WebUI > Ollama > Llama 3.1
Docker: https://docs.docker.com/engine/install/
Nvidia Runtime for docker: https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/latest/install-guide.html
Open WebUI: https://docs.openwebui.com/
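With Docker and the NVIDIA Container Toolkit installed (per the links above), the whole stack can be brought up with Open WebUI's bundled-Ollama image. A minimal sketch, using the default ports and volume names from the Open WebUI docs; adjust to taste:

```shell
# Start Open WebUI with Ollama bundled in, with GPU access via the Nvidia runtime
docker run -d -p 3000:8080 --gpus all \
  -v ollama:/root/.ollama \
  -v open-webui:/app/backend/data \
  --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:ollama

# Pull Llama 3.1 inside the container
docker exec -it open-webui ollama pull llama3.1
```

Open WebUI is then reachable at http://localhost:3000, and the model shows up in its model picker once the pull finishes.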
Miniflux can be installed as a PWA (progressive web app).
For YouTube: a self-hosted Invidious instance, or one of the public Invidious instances.
https://github.com/iv-org/invidious