Can't you just lower the LoRA weights yourself? This "normalization" simply makes everything weaker anyway. I don't think it's really equivalent in effect, at least not in my experience. (You can see ...
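For what it's worth, here's a minimal sketch of why the two need not be equivalent. It assumes "normalization" means rescaling each layer's LoRA delta to a fixed norm (an assumption for illustration; the actual feature may work differently): a single global weight multiplier preserves the relative strength between layers, while per-layer normalization changes it.

```python
import torch

# Hypothetical LoRA deltas for two layers with very different magnitudes.
# The layer names and the "rescale to a fixed norm" rule below are
# assumptions for illustration, not taken from any specific implementation.
deltas = {
    "attn.to_q": torch.randn(8, 8) * 5.0,   # strong layer
    "attn.to_v": torch.randn(8, 8) * 0.1,   # weak layer
}

def scale_globally(deltas, weight):
    """Lower the LoRA weight: multiply every layer's delta by one factor.
    The ratio of strengths between layers is preserved."""
    return {k: v * weight for k, v in deltas.items()}

def normalize_per_layer(deltas, target_norm=1.0):
    """One plausible reading of "normalization": rescale each layer's
    delta to a fixed norm. This changes the ratio between layers."""
    return {k: v * (target_norm / v.norm()) for k, v in deltas.items()}

def ratio(d):
    return (d["attn.to_q"].norm() / d["attn.to_v"].norm()).item()

print(f"original ratio:  {ratio(deltas):.1f}")                        # ~50
print(f"global 0.5x:     {ratio(scale_globally(deltas, 0.5)):.1f}")   # still ~50
print(f"normalized:      {ratio(normalize_per_layer(deltas)):.1f}")   # ~1
```

So under that reading, lowering the weight yourself only makes the whole LoRA uniformly weaker, whereas normalization redistributes influence across layers, which would explain seeing different results in practice.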