Hi
I am passing a fixed seed via the booster config and also setting "seed_per_iteration": True.
I am training with the iterator method on a single GPU.
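A rough sketch of the setup (synthetic data, generic objective/metric, and helper names like `NumpyBatchIter` / `train_once` are placeholders, not my exact pipeline; the seed handling and iterator usage match what I described, assuming a post-3.0 xgboost):

```python
import numpy as np
import xgboost as xgb


class NumpyBatchIter(xgb.DataIter):
    """Feeds in-memory numpy batches through the DataIter interface."""

    def __init__(self, batches):
        self._batches = batches  # list of (X, y) tuples
        self._pos = 0
        super().__init__()

    def next(self, input_data):
        if self._pos == len(self._batches):
            return False  # no more batches
        X, y = self._batches[self._pos]
        input_data(data=X, label=y)
        self._pos += 1
        return True

    def reset(self):
        self._pos = 0


def train_once(seed, batches, dvalid, num_rounds=500):
    """Train one booster and return its per-round test metric history."""
    dtrain = xgb.QuantileDMatrix(NumpyBatchIter(batches))
    params = {
        "device": "cuda",              # single GPU
        "tree_method": "hist",
        "seed": seed,                  # same seed for both runs
        "seed_per_iteration": True,
        "objective": "binary:logistic",   # placeholder objective
        "eval_metric": "logloss",         # placeholder metric
    }
    evals_result = {}
    xgb.train(
        params,
        dtrain,
        num_boost_round=num_rounds,
        evals=[(dvalid, "test")],
        evals_result=evals_result,
        verbose_eval=False,
    )
    return evals_result["test"]["logloss"]
```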
When I train two models with identical seeds, the test metrics diverge after a certain boosting round. The training metrics show the same pattern, with a smaller difference.
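For concreteness, a sketch of how the divergence can be seen: train twice with the same seed and find the first round where the recorded test metric differs (synthetic data here only to keep the snippet self-contained):

```python
rng = np.random.default_rng(0)
batches = [
    (rng.standard_normal((10_000, 20)).astype(np.float32),
     rng.integers(0, 2, 10_000).astype(np.float32))
    for _ in range(4)
]
X_valid = rng.standard_normal((5_000, 20)).astype(np.float32)
y_valid = rng.integers(0, 2, 5_000).astype(np.float32)
dvalid = xgb.DMatrix(X_valid, label=y_valid)

hist_a = train_once(seed=42, batches=batches, dvalid=dvalid)
hist_b = train_once(seed=42, batches=batches, dvalid=dvalid)

# First boosting round where the two metric histories disagree (None if identical).
first_diff = next(
    (i for i, (a, b) in enumerate(zip(hist_a, hist_b)) if a != b), None
)
print("first divergent round:", first_diff)
```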
This is with xgboost built from source, at a commit newer than 3.0.5.
Is this a known issue? Thanks.
