Mixed-precision quantizations of Latitude Games' 12B models, using the NVFP4+4/6 scheme for the MLP layers and FP8_DYNAMIC for the self-attention layers.