
Hanbury Brown and Twiss effect

(HBT effect)
(EMR interference observed as correlated intensity fluctuations)

The Hanbury Brown and Twiss effect (HBT effect) is an electromagnetic radiation (EMR) effect producing a correlation between the intensities from an astronomical object such as a star sensed at separated locations. It is the underlying mechanism of intensity interferometers and was worked out during their development. The original derivation was based on classical wave theory, treating the light from a star as waves emitted by the star as a whole. Considering light as particles (photons) seemed to invalidate the theory, but the actual interferometers did function, and further quantum-theory analysis indicated they should, representing another case where the strangeness of quantum particle behavior effectively reproduces wave behavior. The underlying quantum effect is photon bunching, an instance of the non-intuitive behavior of bosons: photons reaching the separated receivers are part of the same quantum system and interfere even though they come from widely separated regions of the star.
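As a brief sketch of the classical result (a standard textbook relation, stated here as background rather than as this entry's own derivation): for chaotic light, the normalized intensity correlation g^(2) measured between the receivers is tied to the first-order field correlation g^(1) by the Siegert relation, which is why correlating intensities recovers the same fringe-visibility information an amplitude interferometer measures, and hence the star's angular size.

\[
  g^{(2)}(\tau) \;=\; 1 + \left| g^{(1)}(\tau) \right|^{2}
\]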

This quantum analysis qualifies as an early instance of quantum optics, the study of optical effects due to EMR's quantum nature, generally dealing with effects that classical optics does not explain; in this case, though, the result was first predicted by a classical wave treatment. The HBT effect does require a source whose emission has a particular kind of randomness: thermal emission has this quality, while some non-thermal emission (such as coherent laser light) does not.
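A minimal numerical sketch of that distinction (illustrative only; the field model, sample counts, and function names below are assumptions, not part of this entry): a chaotic "thermal" field built from many independent random phasors shows bunching, with a normalized zero-delay intensity correlation near 2, while a coherent field of constant amplitude gives 1.

```python
# Sketch: compare zero-delay intensity correlation g2(0) for chaotic vs coherent light.
# Chaotic light bunches (g2(0) ~ 2); coherent light does not (g2(0) = 1).
import numpy as np

rng = np.random.default_rng(0)

n_samples = 200_000   # independent field realizations (illustrative choice)
n_emitters = 100      # independent radiators making up the chaotic field

# Chaotic field: sum of many emitters with random phases (approaches a Gaussian field)
phases = rng.uniform(0.0, 2.0 * np.pi, size=(n_samples, n_emitters))
field_chaotic = np.exp(1j * phases).sum(axis=1) / np.sqrt(n_emitters)

# Coherent field: fixed amplitude and phase
field_coherent = np.ones(n_samples, dtype=complex)

def g2_zero(field):
    """Normalized zero-delay intensity correlation <I^2> / <I>^2."""
    intensity = np.abs(field) ** 2
    return (intensity ** 2).mean() / intensity.mean() ** 2

print("chaotic  g2(0) ~", round(g2_zero(field_chaotic), 3))   # ~ 2 (photon bunching)
print("coherent g2(0) ~", round(g2_zero(field_coherent), 3))  # = 1 (no bunching)
```

The excess correlation above 1 is what an intensity interferometer exploits; a source without that randomness provides nothing to correlate, which is why the HBT effect needs thermal-like emission.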


(theory, interferometry)
Further reading:
https://en.wikipedia.org/wiki/Hanbury_Brown_and_Twiss_effect
http://home.thep.lu.se/~anders/ATP_slides/100526-BosonInterferometry.pdf
https://inis.iaea.org/collection/NCLCollectionStore/_Public/29/003/29003804.pdf
https://ui.adsabs.harvard.edu/abs/2016SPIE.9907E..0MD/abstract

Referenced by pages:
intensity interferometer
Narrabri Stellar Intensity Interferometer (NSII)
