Problems in suppressing cooling flows in clusters of galaxies by global heat conduction
ABSTRACT
I use a simple analytical model to show that global heat conduction models cannot significantly suppress cluster cooling flows. I build a static medium in which heat conduction globally balances radiative cooling, and then perturb it. I show that a perturbation extending over a large fraction of the cooling flow region, and with an amplitude of ∼10 per cent, will grow to the non-linear regime within a Hubble time. Such perturbations are plausible in clusters, which frequently experience mergers and/or active galactic nucleus (AGN) activity. This result strengthens previous findings that no steady solution exists for a constant heat conduction coefficient.
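For concreteness, the static configuration referred to above amounts to a local balance between conductive heating and radiative cooling. The following is a minimal sketch of that balance, assuming a Spitzer-type conductivity; the notation ($\kappa_0$, $n_e$, $n_p$, $\Lambda$) is generic and not taken from the paper itself:
\[
  % Sketch of the assumed static balance (not the paper's exact formulation):
  % the divergence of the conductive heat flux offsets radiative losses everywhere.
  \nabla \cdot \left( \kappa \nabla T \right) = n_e n_p \Lambda(T),
  \qquad
  \kappa = \kappa_0 \, T^{5/2} \quad \text{(Spitzer-type conductivity)} ,
\]
where $n_e$ and $n_p$ are the electron and proton number densities and $\Lambda(T)$ is the radiative cooling function. Perturbing such an equilibrium and asking whether the perturbation's growth time is shorter than a Hubble time is the stability test the abstract describes.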