In this paper, the estimation of conditional densities between continuous random variables from noisy samples is considered. The conditional densities are modeled as heteroscedastic Gaussian mixture densities, allowing for a closed-form solution of Bayesian inference with full densities. The main contributions of this paper are an improved generalization quality of the estimates through the introduction of a superficial regularizer, the consideration of model uncertainty relative to local data densities by means of adaptive covariances, and the proposal of an efficient distance-based estimation algorithm. This algorithm corresponds to an iterative nested optimization scheme that optimizes hyper-parameters, component placement, and mixture weights. The obtained solutions are sparse, smooth, and generalize well, as benchmark experiments, e.g., in nonlinear filtering, show.
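To make the model class concrete, the following is a minimal sketch of evaluating a heteroscedastic Gaussian mixture conditional density p(y | x) in one dimension. All component parameter functions and numbers here are illustrative placeholders chosen for this sketch, not the estimates produced by the paper's algorithm.

```python
import math

def gaussian_pdf(y, mean, var):
    """Univariate normal density N(y; mean, var)."""
    return math.exp(-(y - mean) ** 2 / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

def conditional_density(y, x, components):
    """p(y | x) as a weighted sum of Gaussian components whose means and
    variances depend on x (heteroscedastic mixture).

    components: list of (weight, mean_fn, var_fn) with weights summing to 1.
    """
    return sum(w * gaussian_pdf(y, mean_fn(x), var_fn(x))
               for w, mean_fn, var_fn in components)

# Two placeholder components; the variance grows away from x = 0,
# loosely mimicking adaptive covariances that widen where data is sparse.
components = [
    (0.6, lambda x: math.sin(x), lambda x: 0.1 + 0.05 * x ** 2),
    (0.4, lambda x: 0.5 * x,     lambda x: 0.2 + 0.05 * x ** 2),
]

p = conditional_density(0.0, 1.0, components)
```

Because each slice p(· | x) is itself a Gaussian mixture, it can serve directly as a likelihood or transition density in closed-form Bayesian filtering steps.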