The second criterion, splitting by expected gain, employs a local one-dimensional quadratic model (surrogate) fitted along a single coordinate.
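The idea behind such a surrogate can be sketched as follows (a minimal illustration, not the MCS reference implementation; the function name quadratic_surrogate_min and the sample points are hypothetical): fit a parabola exactly through three samples taken along one coordinate and use its vertex as the predicted minimizer, whose predicted value relative to the best sample gives the expected gain of splitting along that coordinate.

    import numpy as np

    def quadratic_surrogate_min(x, f):
        """Fit a parabola a*t**2 + b*t + c through three samples (x[i], f[i])
        along one coordinate; return the model minimizer and its predicted value."""
        a, b, c = np.polyfit(x, f, deg=2)   # exact interpolation: 3 points, degree 2
        if a <= 0:                          # concave or linear model: no interior minimum
            i = int(np.argmin(f))
            return x[i], f[i]
        t = -b / (2.0 * a)                  # vertex of the parabola
        return t, a * t**2 + b * t + c

    # Example: samples of f(t) = (t - 1.3)**2 along one coordinate.
    ts = np.array([0.0, 1.0, 2.0])
    fs = (ts - 1.3) ** 2
    t_new, f_pred = quadratic_surrogate_min(ts, fs)
    print(t_new, f_pred)                    # -> 1.3, 0.0 (up to rounding)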
The algorithm is guaranteed to converge to the global minimum in the long run (i.e. as the number of function evaluations and the search depth tend to infinity), provided the objective function is continuous in a neighbourhood of the global minimizer.
This follows from the fact that every box eventually becomes arbitrarily small, so the spacing between samples tends to zero as the number of function evaluations tends to infinity.
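In symbols, this is a standard density argument (stated here as an illustration, not quoted from the MCS paper): if the sample points $x_1, x_2, \ldots$ become dense in the search region, there is a subsequence $x_{k_j} \to x^*$ converging to the global minimizer $x^*$, and continuity of $f$ at $x^*$ gives

    \min_{i \le k_j} f(x_i) \;\le\; f(x_{k_j}) \;\longrightarrow\; f(x^*),

so the best function value found converges to the global minimum $f(x^*)$.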
MCS is designed to be implemented in an efficient recursive manner with the aid of trees.
With this approach, the amount of memory required is independent of the problem dimensionality, since the sampling points are not stored explicitly.
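A minimal sketch of such tree-based bookkeeping in Python (the node layout, BoxNode, and recover_point are illustrative assumptions; the actual MCS code organizes its box records differently): each node stores only the coordinate that was fixed at its split and the value it was fixed to, so a sample point occupies constant memory per node and is reconstructed by walking the path back to the root.

    class BoxNode:
        """Node of the box-splitting tree. Rather than a full coordinate
        vector, a node records only the split coordinate and its value."""
        def __init__(self, coord=None, value=None, parent=None):
            self.coord = coord        # index of the coordinate fixed here (None at root)
            self.value = value        # value assigned to that coordinate
            self.parent = parent
            self.children = []

        def split(self, coord, value):
            child = BoxNode(coord, value, parent=self)
            self.children.append(child)
            return child

    def recover_point(node, base_point):
        """Reconstruct the sample point of `node`: walk toward the root,
        letting the entry nearest the leaf win for each coordinate."""
        x = list(base_point)
        seen = set()
        while node is not None and node.coord is not None:
            if node.coord not in seen:
                x[node.coord] = node.value
                seen.add(node.coord)
            node = node.parent
        return x

    # Usage: two splits starting from a base point at the origin.
    root = BoxNode()
    a = root.split(coord=0, value=0.5)
    b = a.split(coord=2, value=-1.0)
    print(recover_point(b, base_point=[0.0, 0.0, 0.0]))   # -> [0.5, 0.0, -1.0]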