SelfHostLLM - Calculate GPU memory for self-hosted LLM inference.
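The estimate behind such a calculator typically combines model weights, the KV cache, and a fixed runtime overhead. Below is a minimal sketch of that arithmetic; the function name, parameters, and default overhead are illustrative assumptions, not SelfHostLLM's actual implementation.

```python
def estimate_gpu_memory_gb(
    num_params_b: float,      # model size in billions of parameters
    bytes_per_param: float,   # 2 for fp16/bf16, 1 for int8, 0.5 for 4-bit
    num_layers: int,          # transformer layer count
    hidden_size: int,         # model hidden dimension
    context_len: int,         # tokens held in the KV cache
    batch_size: int = 1,      # concurrent sequences
    kv_bytes: float = 2.0,    # bytes per KV cache element (fp16)
    overhead_gb: float = 1.0, # rough allowance for CUDA context/buffers
) -> float:
    # Weights: every parameter stored once at the chosen precision.
    weights_gb = num_params_b * 1e9 * bytes_per_param / 1e9
    # KV cache: 2 tensors (K and V) per layer, per token, per hidden dim.
    kv_gb = (2 * num_layers * hidden_size * context_len
             * batch_size * kv_bytes) / 1e9
    return weights_gb + kv_gb + overhead_gb

# Example: a 7B model in fp16 with a 4096-token context
# (Llama-2-7B-like shape: 32 layers, hidden size 4096)
print(round(estimate_gpu_memory_gb(7, 2, 32, 4096, 4096), 1))  # → 17.1
```

This is a back-of-the-envelope sketch under stated assumptions; real usage also depends on the serving framework, attention implementation, and fragmentation.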