I've seen some guides showing how to pass a GPU through to a gaming qube. However, it's not straightforward, it may not work on some setups, and most importantly there are heavy performance drawbacks.
My use case relies on running custom LLMs locally. The data can't leave my house; everything has to stay on my own network.
Would I be better off setting up a dedicated local AI/GPU server that my other machines query over the LAN (rough sketch of what I mean below)?
Or would using my Thunderbolt 4 port to hook up an external GPU work better?
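
For context, if I went the server route, this is roughly how I'd expect a qube to talk to the GPU box. A minimal sketch, assuming the server runs something like Ollama with its default HTTP API on port 11434; the IP address, model name, and prompt are just placeholders:

```python
# Minimal sketch: query an LLM server on the LAN from a Qubes AppVM.
# Assumes a GPU box at 192.168.1.50 running Ollama on its default
# port 11434 with a model already pulled -- all names/addresses here
# are placeholders for illustration.
import requests

SERVER = "http://192.168.1.50:11434"  # GPU server on the local network

resp = requests.post(
    f"{SERVER}/api/generate",
    json={
        "model": "llama3",              # whatever custom model the server hosts
        "prompt": "Summarize this document...",
        "stream": False,                # return one JSON object instead of a stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])          # generated text; nothing leaves the LAN
```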