ASCL.net

Astrophysics Source Code Library

Making codes discoverable since 1999

Browsing Codes

Results 1701-1800 of 3450 (3361 ASCL, 89 submitted)

[ascl:1805.025] GLACiAR: GaLAxy survey Completeness AlgoRithm

GLACiAR (GaLAxy survey Completeness AlgoRithm) estimates the completeness and selection functions in galaxy surveys. Tailored for multiband imaging surveys aimed at searching for high-redshift galaxies through the Lyman Break technique, the code can nevertheless be applied broadly. GLACiAR generates artificial galaxies that follow Sérsic profiles with different indexes and with customizable size, redshift and spectral energy distribution properties, adds them to input images, and measures the recovery rate.
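
The injection-recovery idea that GLACiAR automates can be sketched in a few lines; the snippet below is a generic illustration using astropy's Sersic2D model, not GLACiAR's own interface, and all values are arbitrary.

    import numpy as np
    from astropy.modeling.models import Sersic2D

    rng = np.random.default_rng(42)
    image = rng.normal(0.0, 1.0, (200, 200))      # stand-in for a survey image (pure noise)

    # inject an artificial Sersic-profile galaxy at a random position
    x0, y0 = rng.uniform(20, 180, 2)
    galaxy = Sersic2D(amplitude=5.0, r_eff=3.0, n=1.5, x_0=x0, y_0=y0, ellip=0.3, theta=0.5)
    yy, xx = np.mgrid[0:200, 0:200]
    image += galaxy(xx, yy)

    # crude "recovery": a bright peak near the injected position
    cut = image[int(y0) - 3:int(y0) + 4, int(x0) - 3:int(x0) + 4]
    recovered = cut.max() > 5.0
    print("recovered:", recovered)   # repeating over many draws gives a completeness estimate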

[ascl:1805.026] PySE: Python Source Extractor for radio astronomical images

PySE finds and measures sources in radio telescope images. It is run with several options, such as the detection threshold (a multiple of the local noise), grid size, and the forced clean beam fit, followed by a list of input image files in standard FITS or CASA format. From these, PySE produces a list of found sources; information such as the calculated background image, source lists in different formats (e.g. text, region files importable in DS9), and other data may be saved. PySE can be integrated into a pipeline; it was originally written as part of the LOFAR Transient Detection Pipeline (TraP, ascl:1412.011).

[ascl:1805.027] MontePython 3: Parameter inference code for cosmology

MontePython 3 provides numerous ways to explore parameter space using Monte Carlo Markov Chain (MCMC) sampling, including Metropolis-Hastings, Nested Sampling, Cosmo Hammer, and a Fisher sampling method. This improved version of the Monte Python (ascl:1307.002) parameter inference code for cosmology offers new ingredients that improve the performance of Metropolis-Hastings sampling, speeding up convergence and offering significant time improvement in difficult runs. Additional likelihoods and plotting options are available, as are post-processing algorithms such as importance sampling and adding derived parameters.

[ascl:1805.028] SP_Ace: Stellar Parameters And Chemical abundances Estimator

SP_Ace (Stellar Parameters And Chemical abundances Estimator) estimates the stellar parameters Teff, log g, [M/H], and elemental abundances. It employs 1D stellar atmosphere models in Local Thermodynamic Equilibrium (LTE). The code is highly automated and suitable for analyzing the spectra of large spectroscopic surveys with low or medium spectral resolution (R = 2000-20,000). A web service for calculating these values with the software is also available.

[ascl:1805.029] DeepMoon: Convolutional neural network trainer to identify moon craters

DeepMoon trains a convolutional neural net using data derived from a global digital elevation map (DEM) and catalog of craters to recognize craters on the Moon. The TensorFlow-based pipeline code is divided into three parts. The first generates a set of images of the Moon randomly cropped from the DEM, with corresponding crater positions and radii. The second trains a convnet using these data, and the third validates the convnet's predictions.

[ascl:1805.030] PyCBC: Gravitational-wave data analysis toolkit

PyCBC analyzes data from gravitational-wave laser interferometer detectors, finds signals, and studies their parameters. It contains algorithms that can detect coalescing compact binaries and measure the astrophysical parameters of detected sources. PyCBC was used in the first direct detection of gravitational waves by LIGO and is used in the ongoing analysis of LIGO and Virgo data.
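
As a taste of the toolkit, the snippet below generates a compact-binary waveform with PyCBC's standard waveform interface; the approximant and parameter values are arbitrary examples.

    from pycbc.waveform import get_td_waveform

    hp, hc = get_td_waveform(approximant="SEOBNRv4",
                             mass1=30.0, mass2=25.0,   # component masses in solar masses
                             delta_t=1.0 / 4096,       # time step in seconds
                             f_lower=20.0)             # starting frequency in Hz
    print(len(hp), "samples of the plus polarization")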

[ascl:1805.031] CubiCal: Suite for fast radio interferometric calibration

CubiCal implements several accelerated gain solvers which exploit complex optimization for fast radio interferometric gain calibration. The code can be used for both direction-independent and direction-dependent self-calibration. CubiCal is implemented in Python and Cython, and multiprocessing is fully supported.

A successor to CubiCal, QuartiCal (ascl:2305.006), is available.

[ascl:1805.032] PyCCF: Python Cross Correlation Function for reverberation mapping studies

PyCCF emulates a Fortran program written by B. Peterson for use with reverberation mapping. The code cross correlates two light curves that are unevenly sampled using linear interpolation and measures the peak and centroid of the cross-correlation function. In addition, it is possible to run Monte Carlo iterations using flux randomization and random subset selection (RSS) to produce cross-correlation centroid distributions to estimate the uncertainties in the cross correlation results.
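
The interpolated cross-correlation method that PyCCF implements can be illustrated with a generic NumPy sketch (this is not PyCCF's API; the array names are placeholders):

    import numpy as np

    def iccf(t1, f1, t2, f2, lags):
        """Cross-correlation coefficient of two unevenly sampled light curves at trial lags."""
        r = []
        for lag in lags:
            f2_shifted = np.interp(t1, t2 + lag, f2)   # linearly interpolate curve 2 onto curve 1's times
            r.append(np.corrcoef(f1, f2_shifted)[0, 1])
        return np.array(r)

    # lags = np.linspace(-50.0, 50.0, 201)
    # r = iccf(t_continuum, f_continuum, t_line, f_line, lags)
    # peak lag: lags[r.argmax()]; the centroid is computed from points above ~0.8 * r.max()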

[ascl:1806.001] feets: feATURE eXTRACTOR FOR tIME sERIES

feets characterizes and analyzes light-curves from astronomical photometric databases for modelling, classification, data cleaning, outlier detection and data analysis. It uses machine learning algorithms to determine the numerical descriptors that characterize and distinguish the different variability classes of light-curves; these range from basic statistical measures such as the mean or standard deviation to complex time-series characteristics such as the autocorrelation function. The library is not restricted to the astronomical field and could also be applied to any kind of time series. This project is a derivative work of FATS (ascl:1711.017).
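
A minimal use of the library, following its documented FeatureSpace/extract interface (the light curve and the chosen features are toy examples):

    import numpy as np
    import feets

    time = np.linspace(0, 100, 500)
    magnitude = 15.0 + 0.1 * np.sin(2 * np.pi * time / 3.2) + np.random.normal(0, 0.01, 500)

    fs = feets.FeatureSpace(only=["Mean", "Std", "Amplitude"])   # restrict to a few features
    features, values = fs.extract(time=time, magnitude=magnitude)
    print(dict(zip(features, values)))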

[ascl:1806.002] BHDD: Primordial black hole binaries code

BHDD (BlackHolesDarkDress) simulates primordial black hole (PBH) binaries that are clothed in dark matter (DM) halos. The software uses N-body simulations and analytical estimates to follow the evolution of PBH binaries formed in the early Universe.

[ascl:1806.003] pyZELDA: Python code for Zernike wavefront sensors

pyZELDA analyzes data from Zernike wavefront sensors dedicated to high-contrast imaging applications. This modular software was originally designed to analyze data from the ZELDA wavefront sensor prototype installed in VLT/SPHERE; simple configuration files allow it to be extended to support several other instruments and testbeds. pyZELDA also includes simple simulation tools to measure the theoretical sensitivity of a sensor and to compare it to other sensors.

[ascl:1806.004] WiseView: Visualizing motion and variability of faint WISE sources

WiseView renders image blinks of Wide-field Infrared Survey Explorer (WISE) coadds spanning a multi-year time baseline in a browser. The software allows for easy visual identification of motion and variability for sources far beyond the single-frame detection limit, a key threshold not surmounted by many studies. WiseView transparently gathers small image cutouts drawn from many terabytes of unWISE coadds, facilitating access to this large and unique dataset. Users need only input the coordinates of interest and can interactively tune parameters including the image stretch, colormap and blink rate. WiseView was developed in the context of the Backyard Worlds: Planet 9 citizen science project, and has enabled hundreds of brown dwarf candidate discoveries by citizen scientists and professional astronomers.

[ascl:1806.005] Indri: Pulsar population synthesis toolset

Indri models the population of single (not in binary or hierarchical systems) neutron stars. Given a starting distribution of parameters (birth place, velocity, magnetic field, and period), the code moves a set of stars through time (by evolving the spin period and magnetic field) and space (by propagating them through the Galactic potential). Upon completion of the evolution, a set of observables is computed (radio flux, position, dispersion measure) and compared with a radio survey such as the Parkes Multibeam Survey. The models' parameters are optimised using the Markov Chain Monte Carlo technique.

[ascl:1806.006] QE: Quantum opEn-Source Package for Research in Electronic Structure, Simulation, and Optimization

Quantum ESPRESSO (opEn-Source Package for Research in Electronic Structure, Simulation, and Optimization) is an integrated suite of codes for electronic-structure calculations and materials modeling at the nanoscale. It is based on density-functional theory, plane waves, and pseudopotentials. QE performs ground-state calculations such as self-consistent total energies, forces, stresses and Kohn-Sham orbitals, Car-Parrinello and Born-Oppenheimer molecular dynamics, and quantum transport such as ballistic transport, coherent transport from maximally localized Wannier functions, and Kubo-Greenwood electrical conductivity. It can also determine spectroscopic properties and examine time-dependent density functional perturbations and electronic excitations, and has a wide range of other functions.

[ascl:1806.007] PyAMOR: AMmOnia data Reduction

PyAMOR models spectra of low level ammonia transitions (between (J,K)=(1,1) and (5,5)) and derives parameters such as intrinsic linewidth, optical depth, and rotation temperature. For low S/N or low spectral resolution data, the code uses cross-correlation between a model and a regridded spectrum (e.g. 10 times smaller channel width) to find the velocity, then fixes it and runs the minimization process. For high S/N data, PyAMOR runs with the velocity as a free parameter.

[ascl:1806.008] gsf: galactic structure finder

gsf applies Gaussian Mixture Models in the stellar kinematic space of normalized angular momentum and binding energy on NIHAO high resolution galaxies to separate the stars into multiple components. The gsf analysis package assumes that the simulation snapshot has been pre-processed with a halo finder. It is based on pynbody (ascl:1305.002) and the scikit-learn Python package for machine learning; after loading, orienting, and transforming a simulation snapshot to physical units, it runs the clustering algorithm and computes the gravitational forces by direct N-body summation over all the particles in the given halo.
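
The kind of decomposition gsf performs can be sketched with scikit-learn's Gaussian mixture model in the kinematic space described above; this generic example is not gsf's own interface and uses random stand-in data.

    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)
    # columns: normalized angular momentum jz/jc and normalized binding energy (stand-in values)
    X = np.column_stack([rng.normal(0.6, 0.3, 5000),
                         rng.uniform(-1.0, 0.0, 5000)])

    gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0).fit(X)
    labels = gmm.predict(X)   # component assignment per star (e.g., disk, bulge, halo)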

[ascl:1806.009] GLASS: Parallel, free-form gravitational lens modeling tool and framework

GLASS models strong gravitational lenses. It produces an ensemble of possible models that fit the observed input data and conform to certain constraints specified by the user. GLASS makes heavy use of the numerical routines provided by the numpy and scipy packages as well as the linear programming package GLPK. This latter package, and its Python interface, is provided with GLASS and installs automatically in the GLASS build directory.

[ascl:1806.010] SpaghettiLens: Web-based gravitational lens modeling tool

SpaghettiLens allows citizen scientists to model gravitational lenses collaboratively; the software should also be easily adaptable to any other, reasonably similar problem. It lets volunteers execute a computer intensive task that cannot be easily executed client side and relies on citizen scientists collaborating. SpaghettiLens makes survey data available to citizen scientists, manages the model configurations generated by the volunteers, stores the resulting model configuration, and delivers the actual model. A model can be shared and discussed with other volunteers and revised, and new child models can be created, resulting in a branching version tree of models that explore different possibilities. Scientists can choose a collection of models; discussion among volunteers and scientists prunes the tree to determine which models will receive further analysis.

[ascl:1806.011] P2DFFT: Parallelized technique for measuring galactic spiral arm pitch angles

P2DFFT is a parallelized version of 2DFFT (ascl:1608.015). It isolates and measures the spiral arm pitch angle of galaxies. The code allows direct input of FITS images, offers the option to output inverse Fourier transform FITS images, and generates idealized logarithmic spiral test images of a specified size that have 1 to 6 arms with pitch angles of -75 degrees to 75 degrees. Further, it can output Fourier amplitude versus inner radius and pitch angle versus inner radius for each Fourier component (m = 0 to m = 6), and calculates the Fourier amplitude weighted mean pitch angle across m = 1 to m = 6 versus inner radius.

[ascl:1806.012] WDEC: White Dwarf Evolution Code

WDEC (White Dwarf Evolution Code), written in Fortran, offers a fast and fairly easy way to produce models of white dwarfs. The code evolves hot (~100,000 K) input models down to a chosen effective temperature by relaxing the models to be solutions of the equations of stellar structure. The code can also be used to obtain g-mode oscillation modes for the models.

[ascl:1806.013] SpS: Single-pulse Searcher

Human-made interference that mimics the behavior of celestial radio pulses is a major challenge when searching for millisecond-timescale radio pulses from sources such as pulsars and fast radio bursts, since genuine pulses are vastly outnumbered by spurious candidates. Single-pulse Searcher (SpS) reduces the presence of radio interference when processing standard output from radio single-pulse searches and produces diagnostic plots useful for selecting good candidates. The modular software allows modifications for specific search characteristics. LOTAAS Single-pulse Searcher (L-SpS) is an implementation of different features of the software (such as a machine-learning approach) developed for a particular study: the LOFAR Tied-Array All-Sky Survey (LOTAAS).

[ascl:1806.014] pile-up: Monte Carlo simulations of star-disk torques on hot Jupiters

The pile-up gnuplot script generates a Monte Carlo simulation with a selectable number of randomized drawings (1000 by default, ~1 min on a modern laptop). For each realization, the script calculates the torque acting on a hot Jupiter around a young, solar-type star as a function of the star-planet distance. The total torque on the planet is composed of the disk torque in the type II migration regime (that is, the planet is assumed to have opened up a gap in the disk) and of the stellar tidal torque. The model has four free parameters, which are drawn from a normal or lognormal distribution: (1) the disk's gas surface density at 1 astronomical unit, (2) the magnitude of tidal dissipation within the star, (3) the disk's alpha viscosity parameter, and (4) the mean molecular weight of the gas in the disk midplane. For each realization, the total torque is screened for a distance at which it becomes zero. If present, this distance represents a tidal migration barrier at which the planet stops migrating. This location is added to a histogram on top of the main torque-over-distance panel, and the realization is counted as one case that contributes to the overall survival rate of hot Jupiters. Finally, the script generates an output file (PDF by default) and prints the hot Jupiter survival rate for the assumed parameterization of the star-planet-disk system.

[ascl:1806.015] DirectDM-mma: Dark matter direct detection

The Mathematica code DirectDM takes the Wilson coefficients of relativistic operators that couple DM to the SM quarks, leptons, and gauge bosons and matches them onto a non-relativistic Galilean invariant EFT in order to calculate the direct detection scattering rates. A Python implementation of DirectDM is also available (ascl:1806.016).

[ascl:1806.016] DirectDM-py: Dark matter direct detection

DirectDM, written in Python, takes the Wilson coefficients of relativistic operators that couple DM to the SM quarks, leptons, and gauge bosons and matches them onto a non-relativistic Galilean invariant EFT in order to calculate the direct detection scattering rates. A Mathematica implementation of DirectDM is also available (ascl:1806.015).

[ascl:1806.017] RadFil: Radial density profile builder for interstellar filaments

RadFil is a radial density profile building and fitting tool for interstellar filaments. The software uses an image array and (in most cases) a boolean mask array that delineates the boundary of the filament to build and fit a radial density profile for the filaments.

[ascl:1806.018] OMEGA: One-zone Model for the Evolution of GAlaxies

OMEGA (One-zone Model for the Evolution of GAlaxies) calculates the global chemical evolution trends of galaxies. From an input star formation history, it uses SYGMA (ascl:1806.019) to create, as a function of time, multiple simple stellar populations with different masses, ages, and initial compositions. OMEGA offers several prescriptions for modeling the star formation efficiency and the evolution of galactic inflows and outflows. OMEGA is part of the NuGrid (ascl:1610.015) chemical evolution package.

[ascl:1806.019] SYGMA: Modeling stellar yields for galactic modeling

SYGMA (Stellar Yields for Galactic Modeling Applications) follows the ejecta of simple stellar populations as a function of time to model the enrichment and feedback from simple stellar populations. It is the basic building block of the galaxy code One-zone Model for the Evolution of GAlaxies (OMEGA, ascl:1806.018) and is part of the NuGrid Python Chemical Evolution Environment (NuPyCEE, ascl:1610.015). Stellar yields of AGB and massive stars are calculated with the same nuclear physics and are provided by the NuGrid collaboration.

[ascl:1806.020] exoinformatics: Compute the entropy of a planetary system's size-ordering

exoinformatics computes the entropy of a planetary system's size ordering using three different entropy methods: tally-scores, integral path, and change points.

[ascl:1806.021] LASR: Linear Algorithm for Significance Reduction

LASR removes stellar variability in the light curves of δ-Scuti and similar stars. It subtracts oscillations from a time series by minimizing their statistical significance in frequency space.

[ascl:1806.022] Keras: The Python Deep Learning library

Keras is a high-level neural networks API written in Python and capable of running on top of TensorFlow, CNTK, or Theano. It focuses on enabling fast experimentation.
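
A minimal example of the Sequential API (a small fully connected classifier trained on random placeholder data):

    import numpy as np
    from keras.models import Sequential
    from keras.layers import Dense

    x = np.random.random((1000, 20))
    y = np.random.randint(2, size=(1000, 1))

    model = Sequential([Dense(64, activation="relu", input_shape=(20,)),
                        Dense(1, activation="sigmoid")])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    model.fit(x, y, epochs=5, batch_size=32, verbose=0)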

[ascl:1806.023] Spheral++: Coupled hydrodynamical and gravitational numerical simulations

Spheral++ provides a steerable parallel environment for performing coupled hydrodynamical and gravitational numerical simulations. Hydrodynamics and gravity are modeled using particle-based methods (SPH and N-Body). It uses an Adaptive Smoothed Particle Hydrodynamics (ASPH) algorithm, provides a total energy conserving compatible hydro mode, and performs fluid and solid material modeling and damage and fracture modeling in solids.

[ascl:1806.024] RMextract: Ionospheric Faraday Rotation calculator

RMextract calculates ionospheric Faraday rotation for a given epoch, location, and line of sight. This Python code extracts the TEC, vTEC, Earth magnetic field, and rotation measures from GPS and WMM data for radio interferometry observations.

[ascl:1806.025] BRATS: Broadband Radio Astronomy ToolS

BRATS (Broadband Radio Astronomy ToolS) provides tools for the spectral analysis of broad-bandwidth radio data and legacy support for narrowband telescopes. It can fit models of spectral ageing on small spatial scales, offers automatic selection of regions based on user parameters (e.g. signal to noise), and automatic determination of the best-fitting injection index. It includes statistical testing, including Chi-squared, error maps, confidence levels and binning of model fits, and can map spectral index as a function of position. It also provides the ability to reconstruct sources at any frequency for a given model and parameter set, subtract any two FITS images and output residual maps, easily combine and scale FITS images in the image plane, and resize radio maps.

[ascl:1806.026] BWED: Brane-world extra dimensions

Braneworld-extra-dimensions places constraints on the size of the AdS5 radius of curvature within the Randall-Sundrum brane-world model in light of the near-simultaneous detection of the gravitational wave event GW170817 and its optical counterpart, the short γ-ray burst event GRB170817A. The code requires a (supplied) patch to the Montepython cosmological MCMC sampler (ascl:1805.027) to sample the posterior distribution of the 4-dimensional parameter space in VBV17 and obtain constraints on the parameters.

[ascl:1806.027] fcmaker: Creating ESO-compliant finding charts for Observing Blocks on p2

fcmaker creates astronomical finding charts for Observing Blocks (OBs) on the p2 web server from the European Southern Observatory (ESO). It automates the creation of ESO-compliant finding charts for Service Mode and/or Visitor Mode OBs at the Very Large Telescope (VLT). The design of the fcmaker finding charts, based on an intimate knowledge of VLT observing procedures, is fine-tuned to best support night time operations. As an automated tool, fcmaker also allows observers to independently check visually, for the first time, the observing sequence coded inside an OB. This includes, for example, the signs of telescope and position angle offsets.

[ascl:1806.028] PyMUSE: VLT/MUSE data analyzer

PyMUSE analyzes VLT/MUSE datacubes. The package is optimized to extract 1-D spectra of arbitrary spatial regions within the cube and also for producing images using photometric filters and customized masks. It is intended to provide the user the tools required for a complete analysis of a MUSE data set.

[ascl:1806.029] EXO-NAILER: EXOplanet traNsits and rAdIal veLocity fittER

EXO-NAILER (EXOplanet traNsits and rAdIal veLocity fittER) efficiently fits exoplanet transit lightcurves, radial velocities (RVs) or both. The code handles data taken with different instruments. For RVs, a different center-of-mass velocity can be fitted for each instrument to account for offsets between them; if jitter is included, a different jitter term can also be fitted for each instrument. For transits, a different photometric jitter can be fitted to each instrument as can different limb-darkening coefficients and different transit depths. In addition to general options that need to be set, EXO-NAILER also requires that photometry and radial velocity options be defined for each instrument.

[ascl:1806.030] foxi: Forecast Observations and their eXpected Information

Using information theory and Bayesian inference, the foxi Python package computes a suite of expected utilities given futuristic observations in a flexible and user-friendly way. foxi requires a set of n-dim prior samples for each model and one set of n-dim samples from the current data, and can calculate the expected ln-Bayes factor between models, the decisiveness between models and its maximum-likelihood averaged equivalent, the decisivity, and the expected Kullback-Leibler divergence (i.e., the expected information gain of the futuristic dataset). The package offers flexible inputs and is designed either for all-in-one script calculation or for an initial cluster run followed by local-machine post-processing, which should make large jobs manageable subject to available resources; it also includes features such as LaTeX tables and plot-making for post-data analysis visuals and ease of presentation.

[ascl:1806.031] ASPIC: Accurate Slow-roll Predictions for Inflationary Cosmology

Aspic, written in modern Fortran, computes various observable quantities used in cosmology from definite single field inflationary models. It provides an efficient, extendable, and accurate way of comparing theoretical inflationary predictions with cosmological data and supports many (~70) models of inflation. The Hubble flow functions, observable quantities up to second order in the slow-roll approximation, are in direct correspondence with the spectral index, the tensor-to-scalar ratio and the running of the primordial power spectrum. The ASPIC library also provides the field potential, its first and second derivatives, the energy density at the end of inflation, the energy density at the end of reheating, and the field value (or e-fold value) at which the pivot scale crossed the Hubble radius during inflation. All these quantities are computed in a way which is consistent with the existence of a reheating phase.

[ascl:1806.032] pwv_kpno: Modeling atmospheric absorption

pwv_kpno provides models for the atmospheric transmission due to precipitable water vapor (PWV) at user specified sites. Atmospheric transmission in the optical and near-infrared is highly dependent on the PWV column density along the line of sight. The pwv_kpno package uses published SuomiNet data in conjunction with MODTRAN models to determine the modeled, time-dependent atmospheric transmission between 3,000 and 12,000 Å. By default, models are provided for Kitt Peak National Observatory (KPNO). Additional locations can be added by the user for any of the hundreds of SuomiNet locations worldwide.

[ascl:1807.001] POLARIS: POLArized RadIation Simulator

POLARIS (POLArized RadIation Simulator) simulates the intensity and polarization of light emerging from analytical astrophysical models as well as complex magneto-hydrodynamic simulations on various grids. This 3D Monte-Carlo continuum radiative transfer code is written in C++ and is capable of performing dust heating, dust grain alignment, line radiative transfer, and synchrotron simulations to calculate synthetic intensity and polarization maps. The code makes use of a full set of physical quantities (density, temperature, velocity, magnetic field distribution, and dust grain properties as well as different sources of radiation) as input.

[ascl:1807.002] Warpfield: Winds And Radiation Pressure: Feedback Induced Expansion, colLapse and Dissolution

Warpfield (Winds And Radiation Pressure: Feedback Induced Expansion, colLapse and Dissolution) calculates shell dynamics and shell structure simultaneously for isolated massive clouds (≥10^5 M⊙). This semi-analytic 1D feedback model scans a large range of physical parameters (gas density, star formation efficiency, and metallicity) to estimate escape fractions of ionizing radiation f_esc,i, the minimum star formation efficiency ε_min required to drive an outflow, and recollapse time-scales for clouds that are not destroyed by feedback.

[ascl:1807.003] PyAutoLens: Strong lens modeling

PyAutoLens models and analyzes galaxy-scale strong gravitational lenses. This automated module suite simultaneously models the lens galaxy's light and mass while reconstructing the extended source galaxy on an adaptive pixel-grid. Source-plane discretization is amorphous, adapting its clustering and regularization to the intrinsic properties of the lensed source. The lens's light is fitted using a superposition of Sersic functions, allowing PyAutoLens to cleanly deblend its light from the source. Bayesian model comparison is used to automatically choose the complexity of the light and mass models. PyAutoLens provides accurate light, mass, and source profiles inferred for data sets representative of both existing Hubble imaging and future Euclid wide-field observations.

[ascl:1807.004] ARKCoS: Radial kernel convolution on the sphere

ARKCoS (Accelerated radial kernel convolution on the sphere) efficiently convolves pixelated maps on the sphere with radially symmetric kernels with compact support. It performs the convolution along isolatitude rings in Fourier space and integrates in longitudinal direction in pixel space. The computational costs scale linearly with the kernel support, making the method most beneficial for convolution with compact kernels. Typical applications include CMB beam smoothing, symmetric wavelet analyses, and point-source filtering operations. The software is written in C++/CUDA and provides two independent code paths to do the necessary computation either on conventional hardware (CPUs), or on graphics processing units (GPUs).

[ascl:1807.005] MAPPINGS V: Astrophysical plasma modeling code

MAPPINGS V is an update of the MAPPINGS code (ascl:1306.008) and provides new cooling function computations for optically thin plasmas based on the greatly expanded atomic data of the CHIANTI 8 database. The number of cooling and recombination lines has been expanded from ~2000 to over 80,000, and temperature-dependent spline-based collisional data have been adopted for the majority of transitions. The expanded atomic data set provides improved modeling of both thermally ionized and photoionized plasmas; the code is now capable of predicting detailed X-ray spectra of nonequilibrium plasmas over the full nonrelativistic temperature range, increasing its utility in cosmological simulations, in modeling cooling flows, and in generating accurate models for the X-ray emission from shocks in supernova remnants.

[ascl:1807.006] pyqz: Emission line code

pyqz computes the values of log(Q) [the ionization parameter] and 12+log(O/H) [the oxygen abundance, either total or in the gas phase] for a given set of strong emission lines fluxes from HII regions. The log(Q) and 12+log(O/H) values are interpolated from a finite set of diagnostic line ratio grids computed with the MAPPINGS V code (ascl:1807.005). The grids used by pyqz are chosen to be flat, without wraps, to decouple the influence of log(Q) and 12+log(O/H) on the emission line ratios.

[ascl:1807.007] HII-CHI-mistry: Oxygen abundance and ionization parameters for optical emission lines

HII-CHI-mistry calculates the oxygen abundance for gaseous nebulae ionized by massive stars using optical collisionally excited emission lines. This code takes the extinction-corrected emission line fluxes and, based on a χ2 minimization on a grid of photoionization models, determines chemical abundances (O/H, N/O) and ionization parameters. An ultraviolet version of this Python code, HII-CHI-mistry-UV (ascl:1807.008), is also available.

[ascl:1807.008] HII-CHI-mistry_UV: Oxygen abundance and ionization parameters for ultraviolet emission lines

HII-CHI-mistry_UV derives oxygen and carbon abundances using the ultraviolet (UV) lines emitted by the gas phase ionized by massive stars. The code first fixes C/O using ratios of appropriate emission lines and, in a second step, calculates O/H and the ionization parameter from carbon lines in the UV. An optical version of this Python code, HII-CHI-mistry (ascl:1807.007), is also available.

[ascl:1807.009] HELIOS: Radiative transfer code for exoplanetary atmospheres

HELIOS, a radiative transfer code, is constructed for studying exoplanetary atmospheres. The model atmospheres of HELIOS are one-dimensional and plane-parallel, and the equation of radiative transfer is solved in the two-stream approximation with non-isotropic scattering. Though HELIOS can be used alone, the opacity calculator HELIOS-K (ascl:1503.004) can be used with it to provide the molecular opacities.

[ascl:1807.010] THOR: Global Circulation Model for planetary atmospheres

THOR solves the three-dimensional nonhydrostatic Euler equations. The code implements an icosahedral grid, which avoids the pole problem of latitude-longitude grids, where converging meridians lead to increasingly smaller time steps; irregularities in the grid are smoothed using spring dynamics. THOR is designed to run on graphics processing units (GPUs) and is part of the open-source Exoclimes Simulation Platform.

[ascl:1807.011] nfield: Stochastic tool for QFT on inflationary backgrounds

nfield uses a stochastic formalism to compute the IR correlation functions of quantum fields during cosmic inflation in n-field dimensions. This is a necessary 1-loop resummation of the correlation functions to render them finite. The code supports the implementation of n-numbers of coupled test fields (energetically sub-dominant) as well as non-test fields.

[ascl:1807.012] AngPow: Fast computation of accurate tomographic power spectra

AngPow computes the auto (z1 = z2) and cross (z1 ≠ z2) angular power spectra between redshift bins (i.e. Cℓ(z1,z2)). The algorithm is based on an expansion in the Chebyshev polynomial basis and on the Clenshaw-Curtis quadrature method. AngPow is flexible and can handle any user-defined power spectra, transfer functions, bias functions, and redshift selection windows. The code is fast enough to be embedded inside programs exploring large cosmological parameter spaces through the Cℓ(z1,z2) comparison with data.

[ascl:1807.013] CLASSgal: Relativistic cosmological large scale structure code

CLASSgal computes large scale structure observables; it includes all relativistic corrections and computes both the power spectrum Cℓ(z1,z2) and the corresponding correlation function ξ(θ, z1, z2) of the matter density and the galaxy number fluctuations in linear perturbation theory. These quantities contain the full information encoded in the large scale matter distribution at the level of linear perturbation theory for Gaussian initial perturbations. CLASSgal is a modified version of CLASS (ascl:1106.020).

[ascl:1807.014] SPEGID: Single-Pulse Event Group IDentification

SPEGID (Single-Pulse Event Group IDentification) identifies astrophysical pulse candidates as trial single-pulse event groups (SPEGs) by first applying Density Based Spatial Clustering of Applications with Noise (DBSCAN) on trial single-pulse events and then merging the clusters that fall within the expected DM (Dispersion Measure) and time span of astrophysical pulses. SPEGID also calculates the peak score for each SPEG in the S/N versus DM space to identify the expected peak-like shape in the signal-to-noise (S/N) ratio versus DM curve of astrophysical pulses. Additionally, SPEGID groups SPEGs that appear at a consistent DM and therefore are likely emitted from the same source. After running SPEGID, periocity.py can be used to find (or verify) the underlying periodicity among a group of SPEGs (i.e., astrophysical pulse candidates).

[ascl:1807.015] CAESAR: Compact And Extended Source Automated Recognition

CAESAR extracts and parameterizes both compact and extended sources from astronomical radio interferometric maps. The processing pipeline is a series of stages that can run on multiple cores and processors. After local background and rms map computation, compact sources are extracted with flood-fill and blob finder algorithms, processed (selection + deblending), and fitted using a 2D gaussian mixture model. Extended source search is based on a pre-filtering stage, allowing image denoising, compact source removal and enhancement of diffuse emission, followed by a final segmentation. Different algorithms are available for image filtering and segmentation. The outputs delivered to the user include source fitted and shape parameters, regions and contours. Written in C++, CAESAR is designed to handle the large-scale surveys planned with the Square Kilometer Array (SKA) and its precursors.

[ascl:1807.016] MIDLL: Markwardt IDL Library

The Markwardt IDL Library contains routines for curve fitting and function minimization, including MPFIT (ascl:1208.019), statistical tests, and non-linear optimization (TNMIN); graphics programs, including plotting three-dimensional data as a cube and fixed- or variable-width histograms; adaptive numerical integration (Quadpack), Chebyshev approximation and interpolation, and other mathematical tools; many ephemeris and timing routines; and array and set operations, such as computing the fast product of a large array, efficiently inserting or deleting elements in an array, and performing set operations on numbers and strings, along with many other useful and varied routines.

[ascl:1807.017] ZBARYCORR: Barycentric redshift calculator

ZBARYCORR determines the barycentric redshift (zB) for a given star. It calculates the positions and velocities of solar system objects, applies the rotation, precession, nutation, and polar motion of the Earth, applies the stellar motion using the Markwardt library (ascl:1807.016), Shapiro delay, and light-travel term, and finally calculates the quantity zB—the barycentric correction independent of the measured redshift. A Python wrapper, BARYCORR (ascl:1807.018), is available.

[ascl:1807.018] BARYCORR: Python interface for barycentric RV correction

BARYCORR is a Python interface for ZBARYCORR (ascl:1807.017); it requires the measured redshift and returns the corrected barycentric velocity and time correction.

[ascl:1807.019] GLS: Generalized Lomb-Scargle periodogram

The Lomb-Scargle periodogram is a common tool in the frequency analysis of unequally spaced data equivalent to least-squares fitting of sine waves. GLS is a solution for the generalization to a full sine wave fit, including an offset and weights (χ2 fitting). Compared to the Lomb-Scargle periodogram, GLS is superior as it provides more accurate frequencies, is less susceptible to aliasing, and gives a much better determination of the spectral intensity.
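
At each trial frequency the generalized periodogram is equivalent to a weighted least-squares fit of y(t) = a cos(ωt) + b sin(ωt) + c; a generic NumPy sketch of the per-frequency power (not the GLS package's own code) is:

    import numpy as np

    def gls_power(t, y, yerr, freq):
        w = 1.0 / yerr**2
        chi2_ref = np.sum(w * (y - np.average(y, weights=w))**2)   # chi^2 of a constant-only fit
        omega = 2.0 * np.pi * freq
        A = np.column_stack([np.cos(omega * t), np.sin(omega * t), np.ones_like(t)])
        coef, *_ = np.linalg.lstsq(A * np.sqrt(w)[:, None], y * np.sqrt(w), rcond=None)
        chi2 = np.sum(w * (y - A @ coef)**2)
        return 1.0 - chi2 / chi2_ref   # normalized power at this frequency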

[ascl:1807.020] wdmerger: Simulate white dwarf mergers with CASTRO

wdmerger simulates binary white dwarf mergers (and related events) in CASTRO (ascl:1105.010) and provides useful information on the viability of mergers of white dwarfs as a progenitor for Type Ia supernovae.

[ascl:1807.021] POWER: Python Open-source Waveform ExtractoR

POWER (Python Open-source Waveform ExtractoR) monitors the status and progress of numerical relativity simulations and post-processes the data products of these simulations to compute the gravitational wave strain at future null infinity.

[ascl:1807.022] PUMA: Low-frequency radio catalog cross-matching

PUMA (Positional Update and Matching Algorithm) cross-matches low-frequency radio catalogs using a Bayesian positional probability with spectral matching criteria. The code reliably finds the correct spectral indices of sources and recovers ionospheric offsets. PUMA can be used to facilitate all-sky cross-matches with further constraints applied for other science goals.

[ascl:1807.023] DAMOCLES: Monte Carlo line radiative transfer code

The Monte Carlo code DAMOCLES models the effects of dust, composed of any combination of species and grain size distributions, on optical and NIR emission lines emitted from the expanding ejecta of a late-time (> 1 yr) supernova. The emissivity and dust distributions follow smooth radial power-law distributions; any arbitrary distribution can be specified by providing the appropriate grid. DAMOCLES treats a variety of clumping structures as specified by a clumped dust mass fraction, volume filling factor, clump size and clump power-law distribution, and the emissivity distribution may also initially be clumped. The code has a large number of variable parameters ranging from 5 dimensions in the simplest models to > 20 in the most complex cases.

[ascl:1807.024] TBI: Three-Body Integration

Three-Body Integration performs numerical n-body simulations for mapping conditions for close approaches for the relevant parameter space of configurations and mass values of two white dwarfs and a third star. Low tertiary masses of 0.1M⊙ can be studied, and the collision probability can be estimated with good confidence for the case of nearly equal mass white dwarfs.

[ascl:1807.025] NRPy+: Code generator for Numerical Relativity

NRPy+ (Python-based Code generation for Numerical Relativity and Beyond) generates highly-optimized C code from complex tensorial expressions input in Einstein-like notation. NRPy+ uses SymPy as its computer algebra system backend. It is part of the NRPy+/SENR numerical relativity code package for solving Einstein's equations of general relativity to model compact objects at about 1/100 the cost in memory of more traditional, AMR-based numerical relativity codes, thus allowing desktop computers to be used for gravitational wave astrophysics.

[ascl:1807.026] SENR: Simple, Efficient Numerical Relativity

SENR (Simple, Efficient Numerical Relativity) provides the algorithmic framework that combines the C codes generated by NRPy+ (ascl:1807.025) into a functioning numerical relativity code. It is part of the numerical relativity code package SENR/NRPy+. The package extends previous implementations of the BSSN reference-metric formulation to a much broader class of curvilinear coordinate systems, making it suitable for modeling physical configurations with approximate or exact symmetries, such as modeling black hole dynamics.

[ascl:1807.027] kplr: Tools for working with Kepler data using Python

kplr provides a lightweight Pythonic interface to the catalog of planet candidates (Kepler Objects of Interest [KOIs]) in the NASA Exoplanet Archive and the data stored in the Barbara A. Mikulski Archive for Space Telescopes (MAST). kplr automatically supports loading Kepler data using pyfits (ascl:1207.009) and supports two types of data: light curves and target pixel files.
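
Typical usage follows the package's documented client interface; the KOI number below is just an example.

    import kplr

    client = kplr.API()
    koi = client.koi(952.01)                                   # fetch Kepler Object of Interest 952.01
    print(koi.koi_period)                                      # catalog orbital period in days
    lightcurves = koi.get_light_curves(short_cadence=False)    # associated light curve files from MAST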

[ascl:1807.028] ktransit: Exoplanet transit modeling tool in python

The routines in ktransit create and fit a transiting planet model. The underlying model is a Fortran implementation of the Mandel & Agol (2002) limb darkened transit model. The code calculates a full orbital model and eccentricity can be allowed to vary; radial velocity data can also be calculated via the model and included in the fit.

[ascl:1807.029] EVEREST: Tools for de-trending stellar photometry

EVEREST (EPIC Variability Extraction and Removal for Exoplanet Science Targets) removes instrumental noise from light curves with pixel level decorrelation and Gaussian processes. The code, written in Python, generates the EVEREST catalog and offers tools for accessing and interacting with the de-trended light curves. EVEREST exploits correlations across the pixels on the CCD to remove systematics introduced by the spacecraft's pointing error. For K2, it yields light curves with precision comparable to that of the original Kepler mission. Interaction with the EVEREST catalog is available via the command line and through the Python interface. Though written for K2, EVEREST can be applied to additional surveys, such as the TESS mission, to correct for instrumental systematics and enable the detection of low signal-to-noise transiting exoplanets.

[ascl:1807.030] ASP: Ames Stereo Pipeline

ASP (Ames Stereo Pipeline) provides fully automated geodesy and stereogrammetry tools for processing stereo imagery captured from satellites (around Earth and other planets), robotic rovers, aerial cameras, and historical imagery, with and without accurate camera pose information. It produces cartographic products, including digital elevation models (DEMs), ortho-projected imagery, 3D models, and bundle-adjusted networks of cameras. ASP's data products are suitable for science analysis, mission planning, and public outreach.

[ascl:1807.031] xGDS: Exploration Ground Data Systems

xGDS (Exploration Ground Data Systems) synthesizes real world data (from sensors, robots, ROVs, mobile devices, etc) and human observations into rich, digital maps and displays for analysis, decision making, and collaboration. xGDS processes and maps data (including video) in real-time during operations and uses it to support live role-based geolocated note taking. Notes can be used to search for and display important data. The software enables real-time analysis of data, permitting one to make inferences and plan new data collection operations while still in the field.

[ascl:1807.032] SSMM: Slotted Symbolic Markov Modeling for classifying variable star signatures

SSMM (Slotted Symbolic Markov Modeling) reduces time-domain stellar variable observations to classify stellar variables. The method can be applied to both folded and unfolded data, and does not require time-warping for waveform alignment. Written in Matlab, the performance of the supervised classification code is quantifiable and consistent, and the rate at which new data is processed is dependent only on the computational processing power available.

[ascl:1807.033] LSC: Supervised classification of time-series variable stars

LSC (LINEAR Supervised Classification) trains a number of classifiers, including random forest and K-nearest neighbor, to classify variable stars and compares the results to determine which classifier is most successful. Written in R, the package includes anomaly detection code for testing the application of the selected classifier to new data, thus enabling the creation of highly reliable data sets of classified variable stars.

[submitted] 3D texturized model of MARS (MOLA) regions

This Matlab tool generates a 3D model (WRL format, texturized with a false-color height map) of a defined region of the Martian surface. The user specifies the region of interest (by latitude and longitude), the resolution of the MOLA DTMs to be used (with a minimum on-ground pixel size of 468 m), and a scale factor applied to the surface heights to improve the visibility of features through bumping or shadowing effects.

[ascl:1808.001] Barycorrpy: Barycentric velocity calculation and leap second management

barycorrpy (BCPy) is a Python implementation of Wright and Eastman's 2014 code (ascl:1807.017) that calculates precise barycentric corrections well below the 1 cm/s level. This level of precision is required in the search for 1 Earth mass planets in the Habitable Zones of Sun-like stars by the Radial Velocity (RV) method, where the maximum semi-amplitude is about 9 cm/s. BCPy was developed for the pipeline for the next-generation Doppler spectrometers Habitable-zone Planet Finder (HPF) and NEID. An automated leap second management routine improves upon the one available in Astropy. It checks for and downloads a new leap second file before converting from the UT time scale to TDB. The code also includes a converter for JDUTC to BJDTDB.
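
A short example of the package's documented get_BC_vel entry point (the target coordinates, epoch, and observatory are arbitrary):

    from barycorrpy import get_BC_vel

    result = get_BC_vel(JDUTC=2458000.5,
                        ra=26.0213645867, dec=-15.9395557246,   # ICRS coordinates in degrees
                        obsname="CTIO",                         # observatory name resolved by Astropy
                        ephemeris="de430")
    print(result[0])   # barycentric velocity correction in m/s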

[ascl:1808.002] rsigma: Resonant disturbance

rsigma calculates the resonant disturbing function, R(sigma), for a massless particle in an arbitrary orbit perturbed by a planet in circular orbit. This function defines the strength of the resonance (its semi-amplitude) and the location of the stable equilibrium points (the minima). It depends on the variable sigma called critical angle and on the particle's orbital elements a, e, i and the argument of the perihelion. R(sigma) is numerically calculated and the code is valid for arbitrary eccentricities and inclinations, including retrograde orbits.

[ascl:1808.003] CPF: Corral Pipeline Framework

Corral generates astronomical pipelines. Data processing pipelines, which chain together processes that transform raw data into valuable information via data reduction and analysis, represent an important slice of the astronomical software library. Written in Python, Corral features a Model-View-Controller design pattern on top of an SQL relational database capable of handling custom data models, processing stages, and communication alerts. It also provides automatic quality and structural metrics based on unit testing. The Model-View-Controller provides concept separation between the user logic and the data models, delivering at the same time multi-processing and distributed computing capabilities.

[ascl:1808.004] ImPlaneIA: Image Plane Approach to Interferometric Analysis

Aperture masking interferometric data analysis involves measuring phases and amplitudes of fringes formed by interference between holes in the pupil mask. These fringe observables can be measured by computing an analytic model of the point spread function and fitting the relevant set of spatial frequencies directly in the image plane, without recourse to numerical Fourier transforms. The ImPlaneIA pipeline converts aperture masking images to fringe observables by fitting fringes in the image plane, calibrates data from a target of interest with one or more point source calibrators, and contains some basic model-fitting routines. The pipeline can accept different mask geometries, instruments, and observing modes.

[ascl:1808.005] hfof: Friends-of-Friends via spatial hashing

hfof is a 3-d friends-of-friends (FoF) cluster finder with Python bindings based on a fast spatial hashing algorithm that identifies connected sets of points where the point-wise connections are determined by a fixed spatial distance. This technique sorts particles into fine cells sufficiently compact to guarantee their cohabitants are linked, and uses locality sensitive hashing to search for neighboring (blocks of) cells. Tests on N-body simulations of up to a billion particles show speed increases of up to a factor of 20 compared with FoF via trees; the method consistently completes in less than the time of a k-d tree construction, giving it an intrinsic advantage over tree-based methods.
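
The spatial-hashing idea can be illustrated with a toy cell-hash friends-of-friends linker (plain Python/NumPy; hfof's actual implementation is far more sophisticated and faster):

    import numpy as np
    from collections import defaultdict

    def toy_fof(points, b):
        """Label 3-d points into groups linked by pairwise distance < b, using cells of size b."""
        cells = np.floor(points / b).astype(int)
        buckets = defaultdict(list)
        for i, c in enumerate(map(tuple, cells)):
            buckets[c].append(i)
        parent = list(range(len(points)))
        def find(i):
            while parent[i] != i:
                parent[i] = parent[parent[i]]
                i = parent[i]
            return i
        neighbors = [(dx, dy, dz) for dx in (-1, 0, 1) for dy in (-1, 0, 1) for dz in (-1, 0, 1)]
        for c, members in buckets.items():
            for off in neighbors:                              # links can only reach adjacent cells
                for j in buckets.get((c[0] + off[0], c[1] + off[1], c[2] + off[2]), []):
                    for i in members:
                        if i < j and np.sum((points[i] - points[j])**2) < b * b:
                            parent[find(i)] = find(j)
        return np.array([find(i) for i in range(len(points))])  # group label per point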

[ascl:1808.006] FIPS: An OpenGL based FITS viewer

FIPS is a cross-platform FITS viewer with a responsive user interface. Unlike other FITS viewers, FIPS uses GPU hardware via OpenGL to provide functionality such as zooming, panning, and level adjustments. OpenGL 2.1 and later is supported. FIPS supports all 2D image formats except floating point formats on OpenGL 2.1; FITS image extensions have basic, limited support.

[ascl:1808.007] 2DSF: Vectorized Structure Function Algorithm

The vectorized physical domain structure function (SF) algorithm calculates the velocity anisotropy within two-dimensional molecular line emission observations. The vectorized approach is significantly faster than brute force iterative algorithms and is very efficient for even relatively large images. Furthermore, unlike frequency domain algorithms which require the input data to be fully integrable, this algorithm, implemented in Python, has no such requirements, making it a robust tool for observations with irregularities such as asymmetric boundaries and missing data.
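
A compact, vectorized second-order structure function along the image axes conveys the physical-domain approach (generic NumPy, not the published algorithm itself, which also handles arbitrary lag vectors):

    import numpy as np

    def structure_function(v, max_lag):
        """SF(lag) = <|v(x+r) - v(x)|^2> for pixel lags along the x and y axes; NaNs are ignored."""
        lags = np.arange(1, max_lag + 1)
        sf = np.empty(len(lags))
        for k, lag in enumerate(lags):
            dx = v[:, lag:] - v[:, :-lag]
            dy = v[lag:, :] - v[:-lag, :]
            diffs = np.concatenate([dx.ravel(), dy.ravel()])
            sf[k] = np.nanmean(diffs**2)      # missing data propagate as NaN and are skipped
        return lags, sf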

[ascl:1808.008] PyMieDap: Python Mie Doubling Adding Program

PyMieDAP (Python Mie Doubling Adding Program) performs light scattering computations with Mie scattering and radiative transfer computations with all orders of scattering, taking into account the polarization of the scattered light. Full planet modeling at any phase angle is possible. With the included subpackage exopy, it is also possible to simulate systems with a star, a planet, and a possible moon.

[ascl:1808.009] py-sdm: Support Distribution Machines

py-sdm (Support Distribution Machines) is a Python implementation of nonparametric nearest-neighbor-based estimators for divergences between distributions for machine learning on sets of data rather than individual data points. It treats points of sets of data as samples from some unknown probability distribution and then statistically estimates the distance between those distributions, such as the KL divergence, the closely related Rényi divergence, L2 distance, or other similar distances.

[ascl:1808.010] hi_class: Horndeski in the Cosmic Linear Anisotropy Solving System

hi_class implements Horndeski's theory of gravity in the modern Cosmic Linear Anisotropy Solving System (ascl:1106.020). It can be used to compute any cosmological observable at the level of background or linear perturbations, such as cosmological distances, cosmic microwave background, matter power and number count spectra (including relativistic effects). hi_class can be readily interfaced with Monte Python (ascl:1307.002) to test Gravity and Dark Energy models.

[ascl:1808.011] Robbie: Radio transients and variables detection workflow

Robbie automates cataloging sources, finding variables, and identifying transients in the image domain. It works in a batch processing paradigm with a modular design so components can be swapped out or upgraded to adapt to different input data while retaining a consistent and coherent methodological approach. Robbie is based on commonly used and open software, including AegeanTools (ascl:1212.009) and STILTS/TOPCAT (ascl:1101.010).

[ascl:1809.001] LEMON: Differential photometry pipeline

LEMON is a differential-photometry pipeline, written in Python, that determines the changes in the brightness of astronomical objects over time and compiles their measurements into light curves. This code makes it possible to completely reduce thousands of FITS images of time series in a matter of only a few hours, requiring minimal user interaction.

[ascl:1809.002] PCCDPACK: Polarimetry with CCD

PCCDPACK analyzes polarimetry data. The set of routines is written in CL-IRAF (including compiled Fortran codes) and analyzes dozens of point objects simultaneously on the same CCD image. A subpackage, specpol, is included to analyze spectropolarimetry data.

[ascl:1809.003] PASTA: Python Astronomical Stacking Tool Array

PASTA performs median stacking of astronomical sources. Written in Python, it can filter sources, provide stack statistics, generate Karma annotations, format source lists, and read information from stacked Flexible Image Transport System (FITS) images. PASTA was originally written to examine polarization stack properties and includes a Monte Carlo modeler for obtaining true polarized intensity from the observed polarization of a stack. PASTA is also useful as a generic stacking tool, even if polarization properties are not being examined.

[ascl:1809.004] VBBINARYLENSING: Microlensing light-curve computation

VBBinaryLensing forward models gravitational microlensing events using the advanced contour integration method; it supports single and binary lenses. The lens map is inverted on a collection of points on the source boundary to obtain a corresponding collection of points on the boundaries of the images from which the area of the images can be recovered by use of Green’s theorem. The code takes advantage of a number of techniques to make contour integration much more efficient, including using a parabolic correction to increase the accuracy of the summation, introducing an error estimate on each arc of the boundary to enable defining an optimal sampling, and allowing the inclusion of limb darkening. The code is written as a C++ library and wrapped as a Python package, and can be called from either C++ or Python.
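
A minimal call through the Python wrapper, following the library's documented BinaryMag2 interface (all parameter values are arbitrary):

    import VBBinaryLensing

    VBBL = VBBinaryLensing.VBBinaryLensing()
    VBBL.Tol = 1e-3                   # accuracy goal for the contour integration
    s, q = 1.0, 0.02                  # lens separation (in Einstein radii) and mass ratio
    y1, y2 = 0.05, 0.03               # source position relative to the lens
    rho = 0.01                        # source angular radius in Einstein radii
    magnification = VBBL.BinaryMag2(s, q, y1, y2, rho)
    print(magnification)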

[ascl:1809.005] perfectns: "Perfect" dynamic and standard nested sampling for spherically symmetric likelihoods and priors

perfectns performs dynamic nested sampling and standard nested sampling for spherically symmetric likelihoods and priors, and analyses the samples produced. The spherical symmetry allows the nested sampling algorithm to be followed "perfectly" - i.e. without implementation-specific errors or correlations between samples. It is intended for use in research into the statistical properties of nested sampling, and to provide a benchmark for testing the performance of nested sampling software packages used for practical problems - which rely on numerical techniques to produce approximately uncorrelated samples.

[ascl:1809.006] spops: Spinning black-hole binary population synthesis

spops is a database of population synthesis simulations of spinning black-hole binary systems, together with a Python module to query it. Data are obtained with the startrack and precession (ascl:1611.004) numerical codes to consistently evolve binary stars from formation to gravitational-wave detection. spops allows quick exploration of the interplay between stellar physics and black-hole spin dynamics.

[ascl:1809.007] surfinBH: Surrogate final black hole properties for mergers of binary black holes

surfinBH predicts the final mass, spin and recoil velocity of the remnant of a binary black hole merger. Trained directly against numerical relativity simulations, these models are extremely accurate, reproducing the results of the simulations at the same level of accuracy as the simulations themselves. Fits such as these play a crucial role in waveform modeling and tests of general relativity with gravitational waves, performed by LIGO.

[ascl:1809.008] PyQSOFit: Python code to fit the spectrum of quasars

The Python QSO fitting code (PyQSOFit) measures spectral properties of quasars. Based on Shen's IDL version, this code decomposes different components in the quasar spectrum, e.g., host galaxy, power-law continuum, Fe II component, and emission lines. In addition, it can run Monte Carlo iterations using flux randomization to estimate the uncertainties.

[ascl:1809.009] NEBULA: Radiative transfer code of ionized nebulae at radio wavelengths

NEBULA performs the radiative transfer of the 3He+ hyperfine transition, radio recombination lines (RRLs), and free-free continuum emission through a model nebula. The model nebula is composed of only H and He within a three-dimensional Cartesian grid with arbitrary density, temperature, and ionization structure. The 3He+ line is assumed to be in local thermodynamic equilibrium (LTE), but non-LTE effects and pressure broadening from electron impacts can be included for the RRLs. All spectra are broadened by thermal and microturbulent motions.

[ascl:1809.010] Isca: Idealized global circulation modeling

Isca provides a framework for the idealized modeling of the global circulation of planetary atmospheres at varying levels of complexity and realism. Though Isca is an outgrowth of models designed for Earth's atmosphere, it may readily be extended into other planetary regimes. Various forcing and radiation options are available. At the simple end of the spectrum a Held-Suarez case is available. An idealized grey radiation scheme, a grey scheme with moisture feedback, a two-band scheme and a multi-band scheme are also available, all with simple moist effects and astronomically-based solar forcing. At the complex end of the spectrum the framework provides a direct connection to comprehensive atmospheric general circulation models.

[ascl:1809.011] qp: Quantile parametrization for probability distribution functions

qp manipulates parametrizations of 1-dimensional probability distribution functions, as suitable for photo-z PDF compression. The code helps determine a parameterization for storing a catalog of photo-z PDFs that balances the available storage resources against the accuracy of the photo-z PDFs and science products reconstructed from the stored parameters.
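
The underlying idea of quantile parametrization can be shown with plain NumPy/SciPy (an illustration of the concept, not qp's API): store the redshift locations of a fixed set of quantiles, then rebuild the PDF by interpolating the CDF.

    import numpy as np
    from scipy.stats import norm

    quantiles = np.linspace(0.01, 0.99, 15)                  # the stored quantile levels
    # suppose the true photo-z PDF is a Gaussian at z = 0.6 with sigma = 0.05
    locations = norm.ppf(quantiles, loc=0.6, scale=0.05)     # compressed representation: 15 numbers

    zgrid = np.linspace(0.3, 0.9, 200)
    cdf = np.interp(zgrid, locations, quantiles)             # reconstruct the CDF ...
    pdf = np.gradient(cdf, zgrid)                            # ... and differentiate to approximate the PDF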

[ascl:1809.012] nestcheck: Nested sampling calculations analysis

Nestcheck analyzes nested sampling runs and estimates numerical uncertainties on calculations using them. The package can load results from a number of nested sampling software packages, including MultiNest (ascl:1109.006), PolyChord (ascl:1502.011), dynesty (ascl:1809.013) and perfectns (ascl:1809.005), and offers the flexibility to add input functions for other nested sampling software packages. Nestcheck utilities include error analysis, diagnostic tests, and plots for nested sampling calculations.

[ascl:1809.013] dynesty: Dynamic Nested Sampling package

dynesty is a Dynamic Nested Sampling package for estimating Bayesian posteriors and evidences. dynesty samples from a given distribution when provided with a loglikelihood function, a prior_transform function (that transforms samples from the unit cube to the target prior), and the dimensionality of the parameter space.
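
Standard usage looks like the following (dynesty's documented interface; the three-dimensional Gaussian problem is just a toy example):

    import numpy as np
    import dynesty

    ndim = 3

    def loglike(x):
        return -0.5 * np.sum((x / 0.1)**2)        # toy Gaussian log-likelihood

    def prior_transform(u):
        return 10.0 * u - 5.0                     # map the unit cube onto a flat prior on [-5, 5]

    sampler = dynesty.NestedSampler(loglike, prior_transform, ndim)
    sampler.run_nested()
    results = sampler.results                     # posterior samples, weights, and evidence estimate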

[ascl:1809.014] stepped_luneburg: Stack-based ray tracing code to model a stepped Luneburg lens

stepped_luneburg investigates the scattered light properties of a Luneburg lens approximated as a series of concentric shells with discrete refractive indices. The optical Luneburg lens has promising applications for low-cost, continuous all-sky monitoring to obtain transit light curves of bright, nearby stars. This code implements a stack-based algorithm that tracks all reflected and refracted rays generated at each optical interface of the lens as described by Snell's law. The Luneburg lens model parameters, such as number of lens layers, the power-law that describes the refractive indices, the number of incident rays, and the initial direction of the incident wavefront can be altered to optimize lens performance. The stepped_luneburg module can be imported within the Python environment or used with scripting, and it is accompanied by two other modules, enc_int and int_map, that help the user to determine the resolving power of the lens and the strength of scattered light haloes for the purpose of quality assessment.

[ascl:1809.015] MrMoose: Multi-Resolution Multi-Object/Origin Spectral Energy distribution fitting procedure

MrMoose (Multi-Resolution Multi-Object/Origin Spectral Energy) fits user-defined models onto a set of multi-wavelength data using a Bayesian framework. The code can handle blended sources, large variation in resolution, and even upper limits consistently. It also generates a series of outputs allowing for a quick interpretation of the results. The code uses emcee (ascl:1303.002), and saves the emcee sampler object, thus allowing users to transfer the output to a personal graphical interface.
