LLM Memory Calculator 🌍
Estimate the GPU memory needed to run inference with any LLM
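The kind of estimate such a calculator performs can be sketched with the common rule of thumb that model weights dominate inference memory: parameter count times bytes per parameter, plus a multiplier for the KV cache, activations, and CUDA context. Everything below (the function name, the default dtype, and the 1.2x overhead factor) is an illustrative assumption, not the calculator's actual method.

```python
def estimate_inference_memory_gb(num_params_billion: float,
                                 bytes_per_param: float = 2.0,
                                 overhead_factor: float = 1.2) -> float:
    """Rough GPU memory estimate (in GiB) for LLM inference.

    num_params_billion: model size in billions of parameters
    bytes_per_param: 4.0 (fp32), 2.0 (fp16/bf16), 1.0 (int8), 0.5 (int4)
    overhead_factor: assumed multiplier covering KV cache, activations,
                     and CUDA context (1.2 is a rough, hypothetical default)
    """
    weights_gib = num_params_billion * 1e9 * bytes_per_param / 1024**3
    return weights_gib * overhead_factor

# Example: a 7B-parameter model in fp16 needs roughly 15-16 GiB
print(round(estimate_inference_memory_gb(7, bytes_per_param=2.0), 1))
```

Quantizing to int8 or int4 (lowering `bytes_per_param`) shrinks the estimate proportionally, which is why quantization is the usual way to fit a large model on a smaller GPU.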