We present a model of the electron thermal conductivity of a laser-produced plasma. The model, supported by Vlasov-Fokker-Planck simulations, predicts that laser absorption reduces conductivity by forcing electrons out of a Maxwell-Boltzmann equilibrium, which results in the depletion of both low-velocity bulk electrons and high-velocity tail electrons. We show that both the bulk and tail electrons approximately follow super-Gaussian distributions, but with distinct exponents that each depend on the laser intensity and wavelength through the parameter α=Zv_{E}^{2}/v_{T}^{2}. For a value of α=0.5, tail depletion reduces the thermal conductivity to half its zero-intensity value. We present our results as simple analytic fits that can be readily implemented in any radiation-hydrodynamics code or used to correct the local limit of nonlocal conduction models.
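The super-Gaussian shapes described above can be illustrated with a short sketch. The exponent fit used here is the earlier Matte et al. form m(α) = 2 + 3/(1 + 1.66/α^0.724), included purely as a stand-in for the bulk electrons; the paper's own fits, with distinct bulk and tail exponents, are what a production implementation should use.

```python
import math

def super_gaussian_exponent(alpha):
    """Bulk super-Gaussian exponent m(alpha).

    Uses the classic Matte et al. fit m = 2 + 3/(1 + 1.66/alpha**0.724)
    as a stand-in; the paper provides its own (different) fits with
    separate bulk and tail exponents.
    alpha = Z * v_E**2 / v_T**2 is the laser heating parameter.
    """
    if alpha <= 0.0:
        return 2.0  # zero intensity -> Maxwellian (m = 2)
    return 2.0 + 3.0 / (1.0 + 1.66 / alpha**0.724)

def super_gaussian(v, v_m, m):
    """Isotropic 3D super-Gaussian f(v) = C_m * exp(-(v/v_m)**m),
    normalized so that 4*pi * integral of f(v) v^2 dv equals 1."""
    c_m = m / (4.0 * math.pi * v_m**3 * math.gamma(3.0 / m))
    return c_m * math.exp(-(v / v_m)**m)
```

For α = 0.5 this stand-in fit gives m ≈ 2.8, a noticeably flatter-topped distribution than the Maxwellian (m = 2), consistent with the depletion of low-velocity bulk electrons noted above.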
DOI: http://dx.doi.org/10.1103/PhysRevE.108.045205