Deep generative models have revolutionized the modeling of complex, high-dimensional probability distributions, opening new avenues for challenging scientific inference problems. For Bayesian inference, which is ubiquitous in cosmology, these models offer flexible new approaches to posterior sampling. In this talk, I will explore two key applications. First, I will present an analysis of galaxy clustering data, in which normalizing flows trained on features derived from mock galaxy catalogs enable the inference of cosmological parameters from observations. Next, I will show how diffusion models can perform effective source separation in cosmic microwave background analysis while simultaneously enabling robust cosmological inference. Finally, I will discuss the challenges posed by scarce training data and the potential of scientific foundation models to address them, illustrated by multiple physics pretraining approaches for physical surrogate modeling with transformers.
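The simulation-based inference idea behind the galaxy-clustering application can be sketched in miniature: draw parameters from a prior, run a (here, toy) simulator to produce mock summaries, and fit a conditional density for the parameter given the data. This sketch uses a conditional Gaussian with a linear mean as a drastically simplified stand-in for a normalizing flow; the simulator, prior, and all numerical choices are illustrative assumptions, not the talk's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "simulator": a data summary x that depends linearly on the
# parameter theta plus observational noise (illustrative assumption).
def simulate(theta):
    return 2.0 * theta + rng.normal(0.0, 0.5, size=theta.shape)

# 1) Draw parameters from a uniform prior and simulate mock summaries,
#    playing the role of features from mock galaxy catalogs.
theta = rng.uniform(-1.0, 1.0, size=10_000)
x = simulate(theta)

# 2) Fit a conditional Gaussian q(theta | x) -- a toy stand-in for a
#    normalizing flow: mean linear in x, constant spread.
A = np.vstack([x, np.ones_like(x)]).T
coef, *_ = np.linalg.lstsq(A, theta, rcond=None)
sigma = (theta - A @ coef).std()

# 3) Evaluate the approximate posterior at an "observed" summary.
x_obs = 0.8
post_mean = coef[0] * x_obs + coef[1]
print(f"posterior mean ~ {post_mean:.2f}, spread ~ {sigma:.2f}")
```

A real analysis replaces the linear-Gaussian family with a normalizing flow so that q(theta | x) can capture non-Gaussian, multimodal posteriors, but the train-on-mocks / evaluate-on-observations structure is the same.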