You used 73 datasets to train Ovis1.5-Gemma2-9B-S3, but only 71 datasets to train Ovis1.5-Llama3-8B-S3. Why is this the case? Is it a typo or is there another reason?
The difference in the number of datasets used for training Ovis1.5-Gemma2-9B-S3 and Ovis1.5-Llama3-8B-S3 is not a typo. The 9B version was trained after the 8B version; in the interim, we constructed new datasets, and these additional datasets were included in the training of the 9B model.