We consider the problem of partitioning n randomly chosen integers between 1 and 2^m into two subsets such that the discrepancy, the absolute value of the difference of their sums, is minimized. A partition is called perfect if the optimum discrepancy is 0 when the sum of all n integers in the original set is even, or 1 when the sum is odd. Parameterizing the random problem in terms of κ = m/n, we prove that the problem has a phase transition at κ = 1, in the sense that for κ < 1, there are many perfect partitions with probability tending to 1 as n → ∞, while for κ > 1, there are no perfect partitions with probability tending to 1. Moreover, we show that this transition is first-order in the sense that the derivative of the so-called entropy is discontinuous at κ = 1. We also determine the finite-size scaling window about the transition point: κ_n = 1 − (2n)^(−1) log₂ n + λ_n/n, by showing that the probability of a perfect partition tends to 1, 0, or some explicitly computable p(λ) ∈ (0, 1), depending on whether λ_n tends to −∞, ∞, or λ ∈ (−∞, ∞), respectively. For λ_n → −∞ fast enough, we show that the number of perfect partitions is Gaussian in the limit. For λ_n → ∞, we prove that with high probability the optimum partition is unique, and that the optimum discrepancy is Θ(2^(λ_n)). Within the window, i.e., if |λ_n| is bounded, we prove that the optimum discrepancy is bounded. Both for λ_n → ∞ and within the window, we find the limiting distribution of the (scaled) discrepancy. Finally, both for the integer partitioning problem and for the continuous partitioning problem, we find the joint distribution of the k smallest discrepancies above the scaling window.
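The objects studied above can be made concrete with a small sketch, not taken from the paper itself: a standard subset-sum dynamic program computes the optimum discrepancy exactly, and a partition is perfect precisely when that optimum equals the parity of the total. The instance generator below, with illustrative parameters n = 20, m = 10 (so κ = m/n = 0.5 < 1, the regime where perfect partitions abound with high probability), is our own choice for demonstration.

```python
import random

def min_discrepancy(items):
    """Minimum |sum(A) - sum(B)| over all two-way partitions of `items`,
    computed exactly by enumerating achievable subset sums (fine for
    small instances; the set of sums can grow exponentially in m)."""
    total = sum(items)
    reachable = {0}  # subset sums achievable so far
    for x in items:
        reachable |= {s + x for s in reachable}
    # A subset with sum s gives discrepancy |total - 2s|.
    return min(abs(total - 2 * s) for s in reachable)

def is_perfect(items):
    """A partition is perfect when the optimum discrepancy is 0 for an
    even total and 1 for an odd total, i.e. equals sum(items) % 2."""
    return min_discrepancy(items) == sum(items) % 2

# Illustrative random instance (parameters are our assumption):
# n integers drawn uniformly from {1, ..., 2^m}, kappa = m/n = 0.5 < 1.
random.seed(0)
n, m = 20, 10
items = [random.randint(1, 2**m) for _ in range(n)]
print(min_discrepancy(items), is_perfect(items))
```

For κ > 1 (e.g. m = 2n) the same generator typically yields no perfect partition, matching the transition described above.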