sfr_slug: Bayesian Inference of Star Formation Rates¶
The slugpy.sfr_slug module computes posterior probabilities on star formation rates given a set of star formation rates estimated using the “point mass estimate” (i.e., the estimate you would get for a fully sampled stellar population) for the SFR based on the ionizing, FUV, or bolometric luminosity. It is implemented as a wrapper around bayesphot: Bayesian Inference for Stochastic Stellar Populations, so for details on how the calculation is performed see the bayesphot documentation.
Getting the Default Library¶
The sfr_slug module requires a precomputed library of slug simulations to use as a “training set” for its calculations. Due to its size, the default library is not included in the slug git repository. Instead, it is provided for download from the SLUG data products website. Download the two files SFR_SLUG_integrated_phot.fits and SFR_SLUG_integrated_prop.fits and save them in the sfr_slug directory of the main repository. If you do not do so, and do not provide your own library when you attempt to use sfr_slug, you will be prompted to download the default library.
Basic Usage¶
The sfr_slug/sfr_slug_example.py file in the repository provides an example of how to use sfr_slug. Usage is simple, as the functionality is all implemented through a single class, sfr_slug. The required steps are as follows:
Import the library and instantiate an
sfr_slug
object (see Full Documentation of slugpy.sfr_slug for full details):

from slugpy.sfr_slug import sfr_slug
sfr_estimator = sfr_slug()
This creates an sfr_slug object, using the default simulation library, $SLUG_DIR/sfr_slug/SFR_SLUG. If you have another library of simulations you’d rather use, you can use the libname
keyword to the sfr_slug
constructor to select it.
Specify your filter(s), for example:
sfr_estimator.add_filters('QH0')
The add_filters
method takes as an argument a string or list of strings specifying the filters on which your point mass SFR estimates are based. You can have more than one set of filters active at a time (just by calling add_filters
more than once), and then specify which set of filters you’re using for any given calculation.
Specify your priors, for example:
sfr_estimator.priors = 'schechter'
The priors
property specifies the assumed prior probability distribution on the star formation rate. It can be either None
(in which case all simulations in the library are given equal prior probability), an array with as many elements as there are simulations in the library giving the prior for each one, a callable that takes a star formation rate as input and returns the prior for it, or a string whose value is either “flat” or “schechter”. The two strings specify, respectively, a prior distribution that is either flat in log SFR or follows the Schechter function SFR distribution from Bothwell et al. (2011):

\[\frac{dp}{d\log\mathrm{SFR}} \propto \left(\frac{\mathrm{SFR}}{\mathrm{SFR}_*}\right)^{\alpha} e^{-\mathrm{SFR}/\mathrm{SFR}_*}\]

with \(\alpha = -0.51\) and \(\mathrm{SFR}_* = 9.2\,M_\odot\,\mathrm{yr}^{-1}\).
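If you would rather pass an explicit callable than the string shortcut, an equivalent prior can be sketched in plain numpy. This is an illustrative assumption, not the library's internal implementation: the function name is hypothetical, and the (N, nphys) input convention follows the bayesphot description in the full documentation below.

```python
import numpy as np

# Hypothetical sketch of a callable prior equivalent to priors = 'schechter'.
# Following the bayesphot convention, the callable receives an array of
# physical properties of shape (N, nphys); here log SFR is the only
# physical dimension. Constants are the Bothwell et al. (2011) values.
ALPHA = -0.51     # Schechter slope
SFR_STAR = 9.2    # Schechter cutoff SFR, Msun/yr

def schechter_prior(physprop):
    sfr = 10.0**physprop[:, 0]   # convert log SFR -> SFR
    return (sfr / SFR_STAR)**ALPHA * np.exp(-sfr / SFR_STAR)
```

Assigning this callable, sfr_estimator.priors = schechter_prior, should then behave like setting the string value.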
Generate the posterior probability distribution of SFR via:
logSFR, pdf = sfr_estimator.mpdf(logSFR_in, logSFRphoterr = logSFR_err)
The argument logSFR_in
can be a float or an array specifying one or more point mass estimates of the SFR in your chosen filter. If you are using two or more filters, logSFR_in
must be an array whose trailing dimension matches the number of filters. If you have added two or more filter sets, you need to specify which one you want to use via the filters
keyword. The optional argument logSFRphoterr
can be used to provide errors on the photometric SFRs. Like logSFR_in
, it can be a float or an array.
The sfr_slug.mpdf
method returns a tuple of two quantities. The first is a grid of log SFR values, and the second is the posterior probability distribution at each value of log SFR. If the input consisted of multiple photometric SFRs, the output will contain posterior probabilities for each input. The output grid will be created automatically by default, but all aspects of it (shape, size, placement of grid points) can be controlled by keywords – see Full Documentation of slugpy.sfr_slug.
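Once you have the grid and PDF in hand, summary statistics follow from ordinary numpy operations. The sketch below uses a toy Gaussian posterior standing in for the real mpdf output, so it runs without the simulation library:

```python
import numpy as np

# Toy stand-in for the (logSFR, pdf) pair returned by sfr_slug.mpdf
logSFR = np.linspace(-6.0, 0.0, 128)
dx = logSFR[1] - logSFR[0]
pdf = np.exp(-0.5 * ((logSFR + 3.0) / 0.4)**2)
pdf /= pdf.sum() * dx                 # normalized, as with norm=True

# Posterior median and 68% credible interval from the cumulative distribution
cdf = np.cumsum(pdf) * dx
median = np.interp(0.5, cdf, logSFR)
lo68, hi68 = np.interp([0.16, 0.84], cdf, logSFR)
```

The same recipe applies to each row of the PDF array when multiple photometric SFRs were supplied.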
Full Documentation of slugpy.sfr_slug¶

class
slugpy.sfr_slug.
sfr_slug
(libname=None, detname=None, filters=None, bandwidth=0.1, ktype='gaussian', priors=None, sample_density='read', reltol=0.001, abstol=1e-10, leafsize=16)[source]¶ A class that can be used to estimate the PDF of true star formation rate from a set of input point mass estimates of the star formation rate.
 Properties
 priors : array, shape (N) | callable | ‘flat’ | ‘schechter’ | None
 prior probability on each data point; interpretation depends on the type passed; array, shape (N): values are interpreted as the prior probability of each data point; callable: the callable must take as an argument an array of shape (N, nphys), and return an array of shape (N) giving the prior probability at each data point; None: all data points have equal prior probability; the values ‘flat’ and ‘schechter’ use priors p(log SFR) ~ constant and p(log SFR) ~ SFR^alpha exp(-SFR/SFR_*), respectively, where alpha = -0.51 and SFR_* = 9.2 Msun/yr are the values measured by Bothwell et al. (2011)
 bandwidth : ‘auto’ | array, shape (M)
 bandwidth for kernel density estimation; if set to ‘auto’, the bandwidth will be estimated automatically

__init__
(libname=None, detname=None, filters=None, bandwidth=0.1, ktype='gaussian', priors=None, sample_density='read', reltol=0.001, abstol=1e-10, leafsize=16)[source]¶  Initialize an sfr_slug object.
 Parameters
 libname : string
 name of the SLUG model to load; if left as None, the default is $SLUG_DIR/sfr_slug/SFR_SLUG
 detname : string
 name of a SLUG model run with the same parameters but no stochasticity; used to establish the nonstochastic photometry to SFR conversions; if left as None, the default is libname_DET
 filters : iterable of stringlike
 list of filter names to be used for inference
 bandwidth : ‘auto’ | float | array, shape (M)
 bandwidth for kernel density estimation; if set to ‘auto’, the bandwidth will be estimated automatically; if set to a float, the same bandwidth is used in all dimensions
 ktype : string
 type of kernel to be used in density estimation; allowed values are ‘gaussian’ (default), ‘epanechnikov’, and ‘tophat’; only Gaussian can be used with error bars
 priors : array, shape (N) | callable | None
 prior probability on each data point; interpretation depends on the type passed; array, shape (N): values are interpreted as the prior probability of each data point; callable: the callable must take as an argument an array of shape (N, nphys), and return an array of shape (N) giving the prior probability at each data point; None: all data points have equal prior probability
 sample_density : array, shape (N) | callable | ‘auto’ | ‘read’ | None
 the density of the data samples at each data point; this need not match the prior density; interpretation depends on the type passed; array, shape (N): values are interpreted as the density of data sampling at each sample point; callable: the callable must take as an argument an array of shape (N, nphys), and return an array of shape (N) giving the sampling density at each point; ‘auto’: the sample density will be computed directly from the data set; note that this can be quite slow for large data sets, so it is preferable to specify this analytically if it is known; ‘read’: the sample density is to be read from a numpy save file whose name matches that of the library, with the extension _density.npy added; None: data are assumed to be uniformly sampled
 reltol : float
 relative error tolerance; errors on all returned probabilities p will satisfy either abs(p_est - p_true) <= reltol * p_est OR abs(p_est - p_true) <= abstol, where p_est is the returned estimate and p_true is the true value
 abstol : float
 absolute error tolerance; see above
 leafsize : int
 number of data points in each leaf of the KD tree
 Returns
 Nothing
 Raises
 IOError, if the library cannot be found
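The reltol/abstol acceptance criterion above can be expressed as a small predicate. This is a sketch of the tolerance test as stated, not the library's internal code:

```python
def within_tol(p_est, p_true, reltol=1e-3, abstol=1e-10):
    # Accept p_est if it meets either the relative or the absolute tolerance
    err = abs(p_est - p_true)
    return err <= reltol * p_est or err <= abstol
```

The absolute tolerance matters for very small probabilities, where a relative criterion alone would force needless refinement.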

__weakref__
¶ list of weak references to the object (if defined)

add_filters
(filters)[source]¶ Add a set of filters to use for SFR estimation
 Parameters
 filters : iterable of stringlike
 list of filter names to be used for inference
 Returns
 nothing

filters
()[source]¶ Returns list of all available filters
 Parameters:
 None
 Returns:
 filters : list of strings
 list of available filter names

logL
(logSFR, logSFRphot, logSFRphoterr=None, filters=None)[source]¶ This function returns the natural log of the likelihood function evaluated at a particular log SFR and set of log luminosities
 Parameters:
 logSFR : float or arraylike
 float or array giving values of the log SFR; for an array, the operation is vectorized
 logSFRphot : float or arraylike, shape (nfilter) or (..., nfilter)
 float or array giving the SFR inferred from photometry using a deterministic conversion; for an array, the operation is vectorized over the leading dimensions
 logSFRphoterr : float or arraylike, shape (nfilter) or (..., nfilter)
 float or array giving photometric SFR errors; for a multidimensional array, the operation is vectorized over the leading dimensions
 filters : listlike of strings
 list of photometric filters used for the SFR estimation; if left as None, and only 1 set of photometric filters has been defined for the sfr_slug object, that set will be used by default
 Returns:
 logL : float or arraylike
 natural log of the likelihood function

mcmc
(photprop, photerr=None, mc_walkers=100, mc_steps=500, mc_burn_in=50, filters=None)[source]¶ This function returns a sample of MCMC walkers for log SFR
 Parameters:
 photprop : arraylike, shape (nfilter) or (..., nfilter)
 array giving the photometric values; for a multidimensional array, the operation is vectorized over the leading dimensions
 photerr : arraylike, shape (nfilter) or (..., nfilter)
 array giving photometric errors; for a multidimensional array, the operation is vectorized over the leading dimensions
 mc_walkers : int
 number of walkers to use in the MCMC
 mc_steps : int
 number of steps in the MCMC
 mc_burn_in : int
 number of steps to consider “burn-in” and discard
 filters : listlike of strings
 list of photometric filters to use; if left as None, and only 1 set of photometric filters has been defined for the sfr_slug object, that set will be used by default
 Returns
 samples : array
 array of sample points returned by the MCMC
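The sample array returned by mcmc can be summarized with ordinary numpy statistics. The sketch below uses synthetic Gaussian draws standing in for the flattened post-burn-in walker output, so it runs without the simulation library:

```python
import numpy as np

# Synthetic stand-in for the flattened post-burn-in sample array of log SFR
rng = np.random.default_rng(42)
samples = rng.normal(loc=-2.5, scale=0.3, size=100 * 450)

# Point estimate and 68% credible interval from the sample distribution
med = np.median(samples)
p16, p84 = np.percentile(samples, [16, 84])
```
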

mpdf
(logSFRphot, logSFRphoterr=None, ngrid=128, qmin=None, qmax=None, grid=None, norm=True, filters=None)[source]¶ Returns the marginal probability of log SFR for one or more input sets of photometric properties. Output quantities are computed on a grid of values, in the same style as meshgrid
 Parameters:
 logSFRphot : float or arraylike
 float or array giving the log SFR inferred from photometry using a deterministic conversion; if the argument is an array, the operation is vectorized over it
 logSFRphoterr : arraylike, shape (nfilter) or (..., nfilter)
 array giving photometric errors; for a multidimensional array, the operation is vectorized over the leading dimensions
 ngrid : int
 number of points in the output log SFR grid
 qmin : float
 minimum value in the output log SFR grid
 qmax : float
 maximum value in the output log SFR grid
 grid : array
 set of values defining the grid of SFR values at which to evaluate; if set, overrides ngrid, qmin, and qmax
 norm : bool
 if True, returned pdf’s will be normalized to integrate to 1
 filters : listlike of strings
 list of photometric filters to use; if left as None, and only 1 set of photometric filters has been defined for the sfr_slug object, that set will be used by default
 Returns:
 grid_out : array
 array of log SFR values at which the PDF is evaluated
 pdf : array
 array of marginal posterior probabilities at each point of the output grid, for each input photometric value; the leading dimensions match the leading dimensions produced by broadcasting the leading dimensions of logSFRphot and logSFRphoterr together, while the trailing dimensions match the dimensions of the output grid
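The output-shape rule just described can be checked with plain numpy broadcasting. The input shapes here are hypothetical, chosen only to make the broadcast visible:

```python
import numpy as np

# Hypothetical input shapes: leading dimensions of the photometric inputs
# broadcast together, then the grid dimension is appended to form the
# shape of the returned pdf array.
logSFRphot = np.zeros((5, 1))      # 5 x 1 point mass estimates
logSFRphoterr = np.zeros((3,))     # 3 error values
lead = np.broadcast(logSFRphot, logSFRphoterr).shape  # broadcast leading dims
ngrid = 128
pdf_shape = lead + (ngrid,)        # grid dimension appended last
```
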