Detection of hydrogen depends on geometry of illuminating source
5 September 2013
Reservoirs of neutral hydrogen gas in galaxies provide the raw fuel for star formation in the Universe. Over the past decade several groups have used studies of the 21-cm wavelength absorption of radio emission in clouds of atomic hydrogen (HI) to claim that the density of HI gas is anti-correlated with the extent of the radio emission from the centre of the host galaxy. In a new study, researchers based at CAASTRO’s node at The University of Sydney and at CSIRO have shown that this apparent anti-correlation can instead be explained by simple geometric arguments.
These earlier studies claimed that smaller radio sources could arise in a denser absorbing medium, where the radio-emitting plasma jets are either trapped (the "frustration" scenario) or have yet to reach their full extent (the "youth" scenario). This interpretation of the observational data, however, relies on assuming a common gas temperature for every galaxy in order to convert the measured absorption strength into a gas density.
By considering the factors that determine the observed depth of the absorption, the team found that it need not be driven by a trend in the gas density, which relies on the common-temperature assumption, but can arise simply because the intercepted area of background radio emission changes with source size. Their model consisted of a double-lobed radio source in which the lobe sizes, shapes and separations could be varied, and it also allowed the density and "clumpiness" of the absorbing medium to change. The observed data were best reproduced by lobes resembling one of the two Fanaroff & Riley classifications (FR I and FR II) and a non-uniformly dense absorbing medium. A peak in the detection of 21-cm absorption was found for radio emission extending over 100 – 1000 parsecs, suggesting that this is the typical size of an absorbing cloud.
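The geometric argument can be illustrated with a toy Monte Carlo: scatter absorbing clouds of a fixed size across a field, and ask how often they cover enough of a background source, as a function of the source's projected size, to register as a 21-cm absorption detection. This is a sketch of the covering-factor idea only, not the paper's actual model; all sizes, counts and thresholds below are illustrative assumptions.

```python
import math
import random

def detection_rate(source_pc, cloud_pc=300.0, n_clouds=30,
                   field_pc=5000.0, cover_threshold=0.3,
                   n_trials=2000, seed=42):
    """Fraction of random sightlines in which clumpy HI clouds cover
    at least `cover_threshold` of a background source of the given
    projected size. All parameter values are illustrative assumptions."""
    rng = random.Random(seed)
    R = source_pc / 2.0   # source radius (pc)
    r = cloud_pc / 2.0    # cloud radius (pc)
    detections = 0
    for _ in range(n_trials):
        covered = 0.0
        for _ in range(n_clouds):
            # Random cloud position projected onto the field
            cx = rng.uniform(-field_pc / 2, field_pc / 2)
            cy = rng.uniform(-field_pc / 2, field_pc / 2)
            if math.hypot(cx, cy) < R + r:
                # Crude covering fraction: each intersecting cloud
                # blocks at most its own area's share of the source disc
                covered += min(1.0, (r / R) ** 2)
        if covered >= cover_threshold:
            detections += 1
    return detections / n_trials

# Detection rate peaks when the source size is comparable to the
# assumed cloud size: tiny sources rarely intersect a cloud, while
# very extended sources are only fractionally covered.
rates = {s: detection_rate(s) for s in (30, 300, 3000, 10000)}
```

Running this shows the qualitative "resonance" described above: the detection rate is highest when the source extent matches the assumed cloud size, and drops for sources much smaller or much larger.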