Let G be any graph with n vertices, whose degrees in decreasing order are d1 ≥ d2 ≥ d3 ≥ … ≥ dn, and let m be the largest index i for which di ≥ i − 1. The splittance of G then equals half of the quantity m(m − 1) + (dm+1 + … + dn) − (d1 + … + dm), and G is a split graph if and only if this quantity is zero.
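A minimal sketch of this calculation in Python (the function name splittance and the representation of the input as a list of vertex degrees are choices made here for illustration, not part of the source):

```python
def splittance(degrees):
    """Splittance of a graph, computed from its degree sequence alone."""
    if not degrees:
        return 0
    d = sorted(degrees, reverse=True)  # d1 >= d2 >= ... >= dn
    # m is the largest index i (1-based) with di >= i - 1; it exists since d1 >= 0.
    m = max(i for i in range(1, len(d) + 1) if d[i - 1] >= i - 1)
    # Half of m(m - 1) + (dm+1 + ... + dn) - (d1 + ... + dm); this quantity
    # is always a non-negative even integer, so integer division is exact.
    return (m * (m - 1) + sum(d[m:]) - sum(d[:m])) // 2


print(splittance([2, 2, 2, 2]))  # degree sequence of the 4-cycle: splittance 1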
If the splittance is nonzero, the graph can be made into a split graph by computing m as above, adding all missing edges between pairs of the m vertices of maximum degree, and removing all edges between pairs of the remaining vertices.
As a consequence, the splittance, together with a sequence of edge additions and removals that realizes it, can be computed in linear time.[1]
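The following Python sketch carries out this transformation directly (the name split_graph_edits and the adjacency-dictionary input format are assumptions of the example, and it makes no attempt at the linear running time mentioned above):

```python
def split_graph_edits(adj):
    """Edge additions and deletions that turn a graph into a split graph.

    `adj` maps each vertex to the set of its neighbours. The m vertices of
    highest degree are completed into a clique and the edges among the
    remaining vertices are removed, as described above.
    """
    # Vertices in order of decreasing degree, and m as in the degree-sequence formula.
    order = sorted(adj, key=lambda v: len(adj[v]), reverse=True)
    m = max(i for i in range(1, len(order) + 1)
            if len(adj[order[i - 1]]) >= i - 1)
    clique_part, independent_part = order[:m], order[m:]
    # Missing edges among the top m vertices are added; existing edges among
    # the remaining vertices are removed.
    additions = [(u, v) for i, u in enumerate(clique_part)
                 for v in clique_part[i + 1:] if v not in adj[u]]
    deletions = [(u, v) for i, u in enumerate(independent_part)
                 for v in independent_part[i + 1:] if v in adj[u]]
    return additions, deletions


# The 4-cycle a-b-c-d has splittance 1: one chord is added, nothing removed.
c4 = {"a": {"b", "d"}, "b": {"a", "c"}, "c": {"b", "d"}, "d": {"a", "c"}}
adds, dels = split_graph_edits(c4)
print(len(adds) + len(dels))  # 1
```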
The splittance of a graph has been used in parameterized complexity as a parameter for describing the efficiency of algorithms in terms of how far the input graph is from being a split graph.
For instance, graph coloring is fixed-parameter tractable under this parameter: graphs of bounded splittance can be optimally colored in linear time.