In-Place Merge Sort (This description is really simple, I understood it right away! But does this really count as MergeSort? It seems closer to insertion sort.)
If you want to avoid the extra space required by a scratch array, you can still use the merge sort algorithm, but move the data around within the original array instead. This version follows the recursive formulation, and the only change is in the merge logic.
private static void inPlaceSort ( Comparable[] x, int first, int last )
The recursive base case, as usual, is the array of length one (or zero): first>=last.
Otherwise, compute a mid-point (mid=(first+last)/2;), and then recursively sort the data from first up through mid, and from mid+1 up through last.
inPlaceSort (x, first, mid);
inPlaceSort (x, mid+1, last);
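Putting those pieces together, a minimal sketch of the whole recursive routine might look like the following; the base case and mid-point come directly from the description above, and the merge step is filled in below.

private static void inPlaceSort ( Comparable[] x, int first, int last )
{
    // Base case: a segment of length one (or zero) is already sorted
    if ( first >= last )
        return;

    // Split the segment and sort each half recursively
    int mid = (first + last) / 2;
    inPlaceSort (x, first, mid);
    inPlaceSort (x, mid+1, last);

    // ... merge x[first..mid] with x[mid+1..last] in place, as shown below ...
}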
The slightly tricky part is the merge logic. As usual, you have a subscript into the left array segment (x[left]) and a subscript into the right array segment (x[right]):
int left = first;
int right = mid + 1;
Comparable tmp;

// One extra check: can we SKIP the merge entirely?
if ( x[mid].compareTo(x[right]) <= 0 )
    return;

while ( left <= mid && right <= last )
{
    // Select from left: no change, just advance left
    if ( x[left].compareTo(x[right]) <= 0 )
        left++;
    // Select from right: rotate x[left..right] and correct the subscripts
    else
    {
        tmp = x[right];                               // Will move to x[left]
        System.arraycopy(x, left, x, left+1, right-left);
        x[left] = tmp;
        // EVERYTHING has moved up by one
        left++;
        mid++;
        right++;
    }
}
// Whatever remains in x[right..last] is already in place
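As a quick usage sketch (the driver below is my own illustration, not part of the original notes), an Integer[] can be passed directly because Integer implements Comparable:

public static void main ( String[] args )
{
    Integer[] data = { 5, 2, 9, 1, 7, 3, 8, 4, 6 };
    inPlaceSort (data, 0, data.length - 1);
    System.out.println(java.util.Arrays.toString(data));
    // Prints: [1, 2, 3, 4, 5, 6, 7, 8, 9]
}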
Complexity analysis:
The recursive structure remains the same, so you still have the lg(n) recursive levels of processing, but the data movement is markedly different. In the scratch-array implementation, the data movement generated by the merge is O(n) per level: for the recursive formulation, a straight copy to generate the subarrays, followed by an O(n) merge of those sorted data back into the parameter array; for the iterative formulation, a pure O(n) merge of the data back and forth between the two arrays.
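For comparison, here is a minimal sketch of the scratch-array merge being described; the helper name scratchMerge and its exact layout are my own assumptions rather than the course's code, but the copy-out-once, merge-back-once pattern is the point.

private static void scratchMerge ( Comparable[] x, int first, int mid, int last )
{
    // Copy the whole segment out once: O(n) data movement
    Comparable[] scratch = new Comparable[last - first + 1];
    System.arraycopy(x, first, scratch, 0, scratch.length);

    int left = 0;                    // Subscript into the left half of scratch
    int right = mid - first + 1;     // Subscript into the right half of scratch
    int out = first;                 // Subscript back into the parameter array

    // Merge back into x: each element moves exactly once more, so O(n) overall
    while ( left <= mid - first && right < scratch.length )
        x[out++] = ( scratch[left].compareTo(scratch[right]) <= 0 )
                     ? scratch[left++] : scratch[right++];
    while ( left <= mid - first )
        x[out++] = scratch[left++];
    while ( right < scratch.length )
        x[out++] = scratch[right++];
}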
The in-place version, though, has markedly different best- and worst-case situations. The code explicitly handles the very best case (if the end of the left half is already in position with respect to the beginning of the right half, return immediately), and even just shy of the best case (say, one item out of position), each merge performs only O(1) data movement. Over the lg(n) levels of the recursive execution tree there are 1+2+4+8+... method invocations, roughly 2n in total, so O(1) work per invocation means the whole sort is O(n).
In the worst case, however, within each invocation every element of the right side needs to be rotated over to the left side, requiring O(n²) data movements (thanks to the well-known series 1+2+3+4+5+..., which sums to n(n+1)/2). Even in the average case, one can expect to select from the right side about half the time, again giving O(n²) data movements.
The observed time behavior of the in-place merge sort is significantly faster than the classical O(n²) algorithms (Selection Sort and Insertion Sort), but (for random and reversed data) it is still extremely well fit by a quadratic trend line in Excel.
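If you want to reproduce that trend yourself, a rough timing harness along the following lines (the sizes, the reversed input, and the doubling check are my own choices) makes the quadratic growth easy to spot: doubling n should roughly quadruple the running time.

public static void timeInPlaceSort ()
{
    for ( int n = 1000; n <= 32000; n *= 2 )
    {
        // Reversed data: close to the worst case for the in-place merge
        Integer[] reversed = new Integer[n];
        for ( int i = 0; i < n; i++ )
            reversed[i] = n - i;

        long start = System.nanoTime();
        inPlaceSort (reversed, 0, n - 1);
        long elapsed = System.nanoTime() - start;

        System.out.printf("n = %6d   time = %10.3f ms%n", n, elapsed / 1e6);
    }
}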
Bottom line: memory is cheap, use the scratch array!