The official LLaVA-v1.5-7B model scores 1362 on MME Perception with the newest code (commit 69b7b5e).

But it scores 1497 under commit 027e38c.

I made sure the LLaVA code itself was consistent between these two experiments, so why does this happen?
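One way to narrow this down is to diff the two commits, restricted to the evaluation code, before assuming the model or data changed. A minimal sketch of the technique is below; against a real LLaVA checkout you would run something like `git diff --stat 027e38c 69b7b5e -- llava/eval` (the `llava/eval` path is an assumption about where the scoring code lives). Here a throwaway repo with two commits stands in for the real one:

```shell
# Build a throwaway repo with two commits to demonstrate the diff.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email you@example.com
git config user.name you

# Commit A: "old" scoring code (stands in for 027e38c)
mkdir eval
echo "old scoring" > eval/mme.py
git add -A
git commit -qm "commit A"
old=$(git rev-parse HEAD)

# Commit B: "new" scoring code (stands in for 69b7b5e)
echo "new scoring" > eval/mme.py
git add -A
git commit -qm "commit B"
new=$(git rev-parse HEAD)

# Any file listed here means the evaluation code differs between the commits.
changed=$(git diff --name-only "$old" "$new" -- eval)
echo "$changed"
```

If the diff over the evaluation path is empty, the discrepancy more likely comes from something outside the repo, such as dependency versions or decoding settings.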