In this paper, we present a novel approach to parametric density estimation from given samples. The samples are represented as a Dirac mixture density, which allows analytic optimization techniques to be applied. The method is based on minimizing a distance measure between the cumulative distribution of the parametric approximation and the empirical cumulative distribution function (EDF) of the given samples, where the EDF is given by the integral of the Dirac mixture. Since this minimization problem cannot, in general, be solved directly, a progression technique is applied. Simulations demonstrate improved performance compared to iterative maximum likelihood approaches.
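To make the core idea concrete, the following is a minimal illustrative sketch, not the paper's actual method: it fits a single Gaussian by minimizing a Cramér–von Mises-type squared distance between the model CDF and the EDF (the integral of the Dirac mixture placed on the samples). A coarse grid search stands in for the analytic and progressive optimization described above; all variable names and the choice of distance evaluation points are assumptions for illustration.

```python
import numpy as np
from math import erf

# Draw synthetic samples from a known Gaussian for demonstration.
rng = np.random.default_rng(0)
samples = np.sort(rng.normal(loc=2.0, scale=1.5, size=400))
n = len(samples)

# EDF evaluated at the sorted samples: the integral of the Dirac
# mixture that represents the sample set.
edf = (np.arange(1, n + 1) - 0.5) / n

def gaussian_cdf(x, mu, sigma):
    """CDF of a Gaussian, elementwise over x."""
    z = (x - mu) / (sigma * np.sqrt(2.0))
    return 0.5 * (1.0 + np.vectorize(erf)(z))

def cdf_distance(mu, sigma):
    """Squared distance between model CDF and EDF at the samples."""
    return np.sum((gaussian_cdf(samples, mu, sigma) - edf) ** 2)

# Coarse grid search as a simple stand-in for the analytic /
# progressive optimization used in the paper.
mus = np.linspace(0.0, 4.0, 81)
sigmas = np.linspace(0.5, 3.0, 51)
_, mu_hat, sigma_hat = min(
    (cdf_distance(m, s), m, s) for m in mus for s in sigmas
)
print(mu_hat, sigma_hat)
```

With enough samples, the minimizer of this CDF-level distance lands close to the true parameters, illustrating why matching cumulative distributions is a sensible estimation criterion.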