(Actually, this is from Astro Nutshell two weeks ago)
Each week I work with first-year grad students Marta and Becky on "order of magnitude" problems at the blackboard. I put that in quotes because we tend to do many more scaling arguments than true OoM. The idea is for them to draw on what they've picked up in class and apply it to common problems that arise in astronomy.
This week we asked
How many photons per second per cm$^2$ (photon number flux) do you receive from a star as a function of its temperature $T$, radius $R$ and distance away $d$?
Having such an equation would be extremely handy for observation planning. When determining the feasibility of a new project, observers tend to start with a statement of the expected signal-to-noise ratio (SNR) for an observation of an astrophysical object. In the limit of a large expected number of photons, the signal is the number of photons, $S = N_\gamma$, and the noise can be approximated as ${\rm Noise} = \sqrt{N_\gamma}$, so ${\rm SNR} = \sqrt{N_\gamma}$. This week's question therefore comes down to, "What is $N_\gamma$ for a star of a given temperature, radius and distance?"
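In code, the photon-limited SNR statement is a one-liner. Here is a minimal sketch; the function name and the example numbers are mine, not from the post:

```python
import math

def snr_photon_limited(photon_rate, exposure):
    """Photon-limited SNR: signal S = N_gamma, noise ~ sqrt(N_gamma),
    so SNR = sqrt(N_gamma).

    photon_rate : photons per second collected
    exposure    : exposure time in seconds
    """
    n_gamma = photon_rate * exposure
    return math.sqrt(n_gamma)

# e.g. 100 photons/s for 100 s gives N_gamma = 1e4, so SNR = 100
print(snr_photon_limited(100.0, 100.0))
```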
We start with Wien's Law, which states that the wavelength at which a star's (blackbody's) emission peaks is inversely proportional to the star's temperature
$\lambda_{\rm max} \sim \frac{1}{T}$ (1)

This is Astro 101. The flux level at this peak wavelength can be evaluated using the blackbody (Planck) function, which is given by
$F_\lambda(T) = \frac{2hc^2}{\lambda^5} \frac{1}{\exp\left(\frac{h c}{\lambda k_{\rm B} T}\right) - 1}$ (2)
This gives the energy per unit time (power), per unit area, per unit wavelength, per unit solid angle, as a function of temperature and wavelength. We evaluate this at $\lambda_{\rm max}$, and approximate the total flux, which is an integral over all wavelengths, as a box of height $F_{\lambda_{\rm max}}$ and width 100 nm (a standard observing bandpass). We also need to multiply by the solid angle subtended by a star of radius $R$ at a distance $d$, which is $R^2/d^2$. This leads to
$F_{\rm tot} \sim \frac{T^5 \Delta\lambda}{e^{\rm const} - 1} R^2 d^{-2}$ (3)
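These two pieces, the Planck function (with its $\exp(\cdot)-1$ denominator) and the box estimate, can be evaluated directly. A minimal sketch in SI units; the function names are mine, not from the post:

```python
import math

H = 6.62607015e-34   # Planck constant [J s]
C = 2.99792458e8     # speed of light [m/s]
K_B = 1.380649e-23   # Boltzmann constant [J/K]
WIEN_B = 2.8978e-3   # Wien displacement constant [m K]

def planck_lambda(wavelength, T):
    """Blackbody specific intensity B_lambda [W m^-3 sr^-1] (Eq. 2)."""
    x = H * C / (wavelength * K_B * T)
    # expm1(x) = exp(x) - 1, accurate even for small x
    return (2 * H * C**2 / wavelength**5) / math.expm1(x)

def box_flux(T, R, d, bandpass=100e-9):
    """Box approximation to the total flux [W m^-2] (Eq. 3):
    peak intensity x 100 nm bandpass x solid angle R^2/d^2."""
    lam_max = WIEN_B / T          # Wien's Law (Eq. 1)
    return planck_lambda(lam_max, T) * bandpass * (R / d)**2
```

Note that at $\lambda_{\rm max}$ the argument of the exponential is $hc/(b\,k_{\rm B})$, where $b$ is the Wien constant, so the $e^{\rm const}-1$ factor in Equation 3 really is a temperature-independent constant.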
Since $\lambda_{\rm max} \sim 1/T$, differentiating gives $|\Delta \lambda| \sim 1/T^2$ (wow, check out that calculus sleight of hand! However, the same scaling falls out of actually doing the integral over $d\lambda$). Finally, the energy per photon near $\lambda_{\rm max}$ is $E_{\lambda_{\rm max}} = h c / \lambda_{\rm max} \sim T$. Dividing Equation 3 by the energy per photon and replacing $\Delta \lambda$, we get the flux of photons
$F_\gamma \sim T^5 T^{-2} T^{-1} R^2 d^{-2}$
$F_\gamma \sim T^2 R^2 d^{-2}$

Increasing the temperature or radius of the star results in more flux, which should seem fairly intuitive: hotter, bigger stars emit more photons. Also, there's the familiar inverse-square law with distance. Evaluating for the Sun at 10 pc results in
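The $T^2$ scaling can be checked numerically by following the recipe above: Planck peak times $\Delta\lambda \propto 1/T^2$ times solid angle, divided by the peak photon energy. The `photon_flux` helper below is my own sketch (with an arbitrary bandwidth normalization, which cancels in the ratio):

```python
import math

H, C, K_B = 6.62607015e-34, 2.99792458e8, 1.380649e-23
WIEN_B = 2.8978e-3  # Wien displacement constant [m K]

def photon_flux(T, R, d):
    """Photon number flux from the box estimate, up to a constant factor."""
    lam = WIEN_B / T                                  # Eq. 1
    x = H * C / (lam * K_B * T)                       # = hc/(b k_B), T-independent
    b_lam = (2 * H * C**2 / lam**5) / math.expm1(x)   # Eq. 2 at the peak, ~ T^5
    dlam = lam**2                                     # ~ 1/T^2 (arbitrary norm.)
    e_photon = H * C / lam                            # peak photon energy, ~ T
    return b_lam * dlam * (R / d)**2 / e_photon

# Doubling T at fixed R and d should multiply the photon flux by 2^2 = 4
ratio = photon_flux(2 * 5772, 1.0, 1.0) / photon_flux(5772, 1.0, 1.0)
print(round(ratio, 6))  # should be ~4
```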
$F_{\gamma,\odot} = [5\times10^{4}\ {\rm photons\ s^{-1}\ cm^{-2}}]\ T^2 R^2 d^{-2}$

(please check my math on this!) Here $T$ and $R$ are in solar units and $d$ is in units of 10 pc.
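Since the post asks for a check: plugging the Sun at 10 pc into the box approximation directly (100 nm bandpass, solid angle $R^2/d^2$ with no factor of $\pi$) gives a few $\times 10^3$ photons s$^{-1}$ cm$^{-2}$; including the $\pi$ in the solid angle of the stellar disk brings it closer to $10^4$. That lands within roughly an order of magnitude of the quoted prefactor, which is the spirit of the exercise. A sketch:

```python
import math

H, C, K_B = 6.62607015e-34, 2.99792458e8, 1.380649e-23
WIEN_B = 2.8978e-3      # Wien displacement constant [m K]
R_SUN = 6.957e8         # solar radius [m]
PC = 3.0857e16          # parsec [m]

def photon_flux_cm2(T, R, d, bandpass=100e-9):
    """Photon number flux [photons s^-1 cm^-2], box approximation:
    Planck peak x 100 nm bandpass x (R/d)^2, over the peak photon energy."""
    lam = WIEN_B / T
    b_lam = (2 * H * C**2 / lam**5) / math.expm1(H * C / (lam * K_B * T))
    flux = b_lam * bandpass * (R / d)**2        # [W m^-2]
    e_photon = H * C / lam                      # [J]
    return flux / e_photon / 1e4                # per m^2 -> per cm^2

print(f"{photon_flux_cm2(5772, R_SUN, 10 * PC):.1e}")  # a few x 1e3
```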
For an M dwarf with 1/5 the Sun's radius and roughly half the temperature, at 10 pc, we would receive 100 times fewer photons, most of them down near 1 micron in the near infrared. An A-type star like Vega, with twice the Sun's radius and twice the temperature, will emit 16 times as many photons, most of them in the ultraviolet.
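Both comparisons are just the $T^2 R^2$ factor relative to the Sun at the same distance; as a quick arithmetic check:

```python
# Photon flux relative to the Sun at fixed d: (T/T_sun)^2 * (R/R_sun)^2
m_dwarf = 0.5**2 * 0.2**2   # half the temperature, 1/5 the radius
vega = 2**2 * 2**2          # twice the temperature, twice the radius
print(round(m_dwarf, 6), vega)  # 0.01 16
```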
I hope you find this scaling relationship as handy as I do!