STELA Toolkit is a fully documented Python package for interpolating gappy/irregular, noisy light curves using Gaussian Processes, enabling the computation of a wide range of time-domain and frequency-domain data products. STELA supports standard Fourier frequency-resolved products such as power spectra, cross spectra, lag spectra, and coherence, as well as lags via the Cross-Correlation Function (CCF), interpolated with GPs or traditional linear interpolation.
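The core idea — conditioning a Gaussian Process on irregularly sampled points to fill gaps — can be sketched in a few lines. This is a generic GP posterior-mean computation with a squared-exponential kernel, not the STELA Toolkit API; the function name and parameters are illustrative.

```python
import numpy as np

def gp_interpolate(t, y, t_new, length=0.5, noise=0.05):
    """Posterior mean of a GP with a squared-exponential kernel at new
    times t_new (a generic sketch of GP light-curve interpolation)."""
    t, y, t_new = map(np.asarray, (t, y, t_new))
    K = np.exp(-0.5 * ((t[:, None] - t[None, :]) / length) ** 2)
    K_new = np.exp(-0.5 * ((t_new[:, None] - t[None, :]) / length) ** 2)
    # Solve (K + sigma^2 I) alpha = y, then predict as K_new @ alpha
    alpha = np.linalg.solve(K + noise**2 * np.eye(len(t)), y)
    return K_new @ alpha

# Irregularly sampled sine with a gap between t = 0.4 and t = 0.8
t_obs = np.array([0.0, 0.2, 0.4, 0.8, 1.0])
y_obs = np.sin(t_obs)
pred = gp_interpolate(t_obs, y_obs, [0.6])[0]  # fill the gap
```

Once the light curve is interpolated onto a regular grid in this way, standard FFT-based power and cross spectra become applicable.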
DPMhalo (Descriptive Parametric Model) generates profiles of gaseous halos (pressure, electron density, and metallicity) as functions of radius, halo mass, and redshift. The code assumes spherically symmetric, volume-filling warm/hot gas, and enables mock observations of the circumgalactic medium (CGM), group halos, and clusters across a number of wavebands including X-ray, sub-millimeter/millimeter, radio, and ultraviolet (UV).
PIRATES (Polarimetric Image Reconstruction AI for Tracing Evolved Structures) uses machine learning to perform image reconstruction. It uses MCFOST (ascl:2207.023) to generate models, then uses those models to build, train, iteratively fit, and evaluate PIRATES performance.
torchmfbd carries out multi-object multi-frame blind deconvolution (MOMFBD) of point-like or extended objects, and is especially tailored to solar images. The code is built on PyTorch and provides a high-level interface for adding observations, defining phase diversity channels, and adding regularization. It can deal with spatially variant PSFs either by mosaicking the images or by defining a spatially variant PSF. torchmfbd supports smooth solutions and solutions based on the ℓ1 penalization of the isotropic undecimated wavelet transform of the object, and regularizations are easily extendable. The code also includes an API and a configuration file.
SPIBACK (SPIral arms & Bar bACKward integrations) generates Milky Way models through the backward integration method. The code allows users to plot the 2D local velocity space distribution at the Sun's position, as well as median Galactocentric radial velocity maps across the area of the Galactic disk probed with Gaia DR3. This can be done for different bar and spiral arms parameters. The parameters are set by default to be those of the "fiducial model" and can be adjusted as needed.
HYDRAD (HYDrodynamics and RADiation) computes solutions to field-aligned hydrodynamic equations in coronal loops. The code models a broad variety of phenomena, including multi-species plasma confined to full-length, magnetic flux tubes of arbitrary geometrical and cross-section variation in the field-aligned direction; solar flares driven by non-thermal electrons and Alfvén waves, and the non-thermal equilibrium response of the chromosphere; and coronal rain formed by condensations in thermal non-equilibrium where the adaptive grid is required to fully resolve and track multiple steep transition regions. HYDRAD also models ultracold, strongly coupled laboratory plasmas composed of weakly-ionized strontium. The code, written in C++, is modular in its structure; new capabilities can be added in a relatively straightforward way and handled robustly by the numerical scheme. HYDRAD is also intended to be fairly undemanding of computational resources, though its needs do depend strongly on the particular nature of each model run.
EXP performs and analyzes N-body simulations using biorthogonal and orthogonal expansions. The package also supports time series analysis of expansion coefficients using multivariate Singular Spectrum Analysis (mSSA) to discover new dynamical correlations, separate signal from noise, and visualize these in two- and three-dimensional renderings. EXP's object-oriented design enforces minimal consistency while retaining flexibility.
P-CORONA models the intensity and polarization of coronal atomic lines in any given three dimensional (3D) model of the solar corona. It takes into account the scattering of anisotropic radiation as well as the symmetry-breaking effects arising from the influence of magnetic fields, through the Hanle and Zeeman effects, and from non-radial solar wind velocities. The code solves the statistical equilibrium equations for the elements of the atomic density matrix corresponding to the multi-level atomic model under consideration, assuming complete frequency redistribution. The calculations are carried out assuming an optically thin plasma, with the emergent Stokes profiles resulting from the integration of the local emission coefficients along the line-of-sight in the chosen 3D coronal model. P-CORONA incorporates HDF5 input/output functionality and includes a graphical user interface.
Capivara implements a spectral-based segmentation method for Integral Field Unit (IFU) data cubes. The code uses hierarchical clustering in the spectral domain, grouping similar spectra to improve the signal-to-noise ratio without compromising astrophysical similarity among regions, and leverages advanced matrix operations via torch for GPU acceleration.
spherimatch performs efficient cross-matching and self-matching of astronomical catalogs in spherical coordinates. Designed for use in astrophysics, where data is naturally distributed on the celestial sphere, the package enables fast matching with an algorithmic complexity of O (N log N). It supports Friends-of-Friends (FoF) group identification and duplicate removal in spherical coordinates, and integrates easily with common data processing tools such as pandas.
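Cross-matching on the sphere reduces to a 3D distance cut once coordinates are mapped to unit vectors: matching within an angle θ is equivalent to a chord-distance cut of 2 sin(θ/2). The sketch below shows that equivalence with a brute-force O(N²) loop for clarity; spherimatch itself achieves O(N log N) with a tree-based search, and the function names here are illustrative.

```python
import math

def radec_to_xyz(ra_deg, dec_deg):
    """Unit vector on the celestial sphere for (RA, Dec) in degrees."""
    ra, dec = math.radians(ra_deg), math.radians(dec_deg)
    return (math.cos(dec) * math.cos(ra), math.cos(dec) * math.sin(ra), math.sin(dec))

def cross_match(cat1, cat2, tol_arcsec):
    """Return index pairs (i, j) closer than tol_arcsec on the sky."""
    # Angular tolerance -> equivalent 3D chord-length tolerance
    chord_tol = 2.0 * math.sin(math.radians(tol_arcsec / 3600.0) / 2.0)
    pairs = []
    for i, (ra1, dec1) in enumerate(cat1):
        p1 = radec_to_xyz(ra1, dec1)
        for j, (ra2, dec2) in enumerate(cat2):
            if math.dist(p1, radec_to_xyz(ra2, dec2)) <= chord_tol:
                pairs.append((i, j))
    return pairs

# A pair 0.5 arcsec apart matches at 1 arcsec tolerance; a distant star does not
matches = cross_match([(10.0, 20.0)],
                      [(10.0, 20.0 + 0.5 / 3600.0), (50.0, -30.0)], 1.0)
```

Replacing the inner loop with a k-d tree over the unit vectors gives the O(N log N) scaling the package advertises.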
PCM-HiPT (Planetary Climate Model for High Pressures and Temperatures) simulates the thermal structure of dense, hot terrestrial exoplanet atmospheres. This 1D line-by-line radiative-convective model uses a high-resolution spectral grid and HITRAN-based absorption data to model radiative energy transfer with high accuracy at elevated pressures and temperatures (>1000 K). PCM-HiPT extends the PCM_LBL model (ascl:2504.003), originally developed for early Mars conditions; its modifications allow PCM-HiPT to capture complex atmospheric structures, including detached convective zones and stable lower-atmosphere layers driven by shortwave absorption.
SPAN (SPectral ANalysis) is a cross-platform graphical user interface (GUI) application for extracting, manipulating, and analyzing astronomical spectra. It is optimized for the study of galaxy spectra across the near-ultraviolet (NUV) to near-infrared (NIR) atmospheric windows.
SPAN extracts 1D spectra from FITS images and datacubes, performs spectral processing (e.g., Doppler correction, continuum modelling, denoising), and supports analyses such as line-strength measurements, stellar and gas kinematics, and stellar population studies, using both built-in routines and the widely adopted pPXF algorithm (ascl:1210.002) for full spectral fitting.
It runs on Windows, Linux, macOS, and Android, and features an intuitive, task-oriented interface. The goal of SPAN is to unify essential tools for modern spectral analysis into a single, user-friendly application that offers a flexible and accessible environment while maintaining scientific accuracy.
CosmoWAP (Cosmology with Wide-separation, relAtivistic and Primordial non-Gaussian contributions) analyzes Fourier power spectra and bispectra with wide-separation, relativistic, and primordial non-Gaussian effects in large-scale structure cosmology. The analytical expressions are derived in Mathematica using MathWAP (ascl:2507.019) routines and can be exported as .py files; CosmoWAP then takes these expressions and implements them for a given cosmology and set of survey parameters.
MathWAP contains Mathematica notebooks that compute the Fourier power spectrum and bispectrum, including contributions (up to second order) from wide-separation (WS) and relativistic (GR) effects as well as primordial non-Gaussianity (PNG). Outputs are stored from Mathematica as .json files in mathematica_expr. The read_mathematica notebook can be used to convert from Mathematica to Python formatting for use by CosmoWAP (ascl:2507.020).
SOAP (Spherical Overdensity and Aperture Processor) computes halo and galaxy properties from SWIFT simulations after they have been post-processed with a subhalo finder. The Python package takes a subhalo catalogue as input and calculates a wide array of properties for each object. SOAP offers parallel processing via mpi4py for efficient handling of large datasets, and allows for consistent property calculation across multiple halo finders. It supports various halo definitions, including spherical overdensities and fixed physical apertures, providing flexibility for diverse observational comparisons. The package is compatible with both dark matter-only and full hydrodynamic simulations, producing HDF5 catalogues that integrate with the swiftsimio package for seamless unit handling.
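The spherical-overdensity definition itself is simple: it is the radius within which the mean enclosed density first drops below Δ times a reference density (e.g. 200 ρ_crit). A minimal sketch, with illustrative names and equal-mass particles (not SOAP's implementation):

```python
import math

def spherical_overdensity_radius(radii, particle_mass, rho_crit, delta=200.0):
    """Largest radius at which the mean enclosed density still exceeds
    delta * rho_crit, for particles at the given radii from the centre."""
    r_so = 0.0
    for n, r in enumerate(sorted(radii), start=1):
        mean_density = 3.0 * n * particle_mass / (4.0 * math.pi * r**3)
        if mean_density >= delta * rho_crit:
            r_so = r
    return r_so

# Toy halo: 20 equal-mass particles at radii 0.1, 0.2, ..., 2.0
r200 = spherical_overdensity_radius([0.1 * i for i in range(1, 21)], 1.0, 1.0)
```

Fixed physical apertures, SOAP's other supported definition, simply replace the density criterion with a fixed radius cut.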
QUIDS is a Python package for generating synthetic Stokes Q and U polarization maps using 3D dust density and Galactic Magnetic Field (GMF) shell data. It integrates polarized emission over log-spaced spherical shells, with polarization angles derived from GMF models such as UF23 or JF12. The goal is to explore whether small-scale structures in the GMF and dust distribution can reconstruct or preserve the large-scale polarization patterns observed across the sky. This package is particularly relevant for modeling and probing how local Galactic features contribute to or interfere with global polarization signals.
ysoisochrone handles isochrones for young stellar objects (YSOs) and uses them to derive stellar masses and ages via a Bayesian inference approach. The code estimates stellar masses, ages, and associated uncertainties by comparing the stellar effective temperature, bolometric luminosity, and their uncertainties with different stellar evolutionary models, including those specifically developed for YSOs. ysoisochrone also supports user-provided evolutionary tracks.
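The Bayesian comparison step amounts to weighting each point of a model grid by a Gaussian likelihood in (Teff, L) and averaging the track masses with those weights. A toy sketch under that assumption — the mini-grid values and function names below are entirely hypothetical, not ysoisochrone's tracks or API:

```python
import math

# Hypothetical mini-grid of track points: (Teff [K], log L/Lsun, mass [Msun])
grid = [
    (3500.0, -0.8, 0.3),
    (3800.0, -0.5, 0.5),
    (4200.0, -0.2, 0.8),
    (4600.0,  0.1, 1.1),
]

def bayesian_mass(teff_obs, teff_err, logl_obs, logl_err):
    """Posterior-weighted mass: Gaussian likelihood over the model grid."""
    weights = []
    for teff, logl, _ in grid:
        chi2 = ((teff - teff_obs) / teff_err) ** 2 \
             + ((logl - logl_obs) / logl_err) ** 2
        weights.append(math.exp(-0.5 * chi2))
    total = sum(weights)
    return sum(w * m for w, (_, _, m) in zip(weights, grid)) / total

m_star = bayesian_mass(3800.0, 100.0, -0.5, 0.1)
```

The same weights, accumulated over the age dimension of a real grid, yield the age posterior and its uncertainty.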
Spectool processes astronomical spectral data, offering a collection of common spectral analysis algorithms. The toolkit includes functions for spectral resampling, spectral flattening, radial velocity measurements, spectral convolution broadening, among others. Each function in the package is implemented independently, allowing users to select and utilize the desired features as needed. Spectool's functions have simple and intuitive interfaces, ensuring ease of use for various data sets and analysis tasks.
Spinifex is a pure-Python tool for ionospheric corrections in radio astronomy, e.g., computing total electron content and rotation measures. The code is in part a re-write of RMextract (ascl:1806.024). All existing features of RMextract have been re-implemented, but spinifex is not directly backwards compatible with RMextract.
nGIST (new Galaxy Integral-field Spectroscopy Tool) analyzes modern galaxy integral field spectroscopic (IFS) data. Borne out of the need for a robust but flexible analysis pipeline for an influx of MUSE and other galaxy IFS data, the code is the continuation of the archived GIST pipeline (ascl:1907.025). It improves memory and parallelization management and deals better with longer optical wavelength ranges and sky residuals that are particularly problematic at redder wavelengths (>7000 Angstrom). Performance improvements include memory and parallelization optimization, and smaller and more convenient output files. nGIST can create continuum-only cubes and offers better handling of cube variance and better bias estimation for stellar kinematics, and includes a pPXF-based emission line fitter and an updated version of Mapviewer for a quick-look interface to view results.
SAUSERO (Software to AUtomatize in a Simple Environment the Reduction of Osiris+) processes raw science frames to address noise, cosmetic defects, and pixel heterogeneity, preparing them for photometric analysis for OSIRIS+ (Gran Telescopio Canarias). Correcting these artifacts is a critical prerequisite for reliable scientific analysis. The software applies observation-specific reduction steps, ensuring optimized treatment for different data types. Developed with a focus on simplicity and efficiency, SAUSERO streamlines the reduction pipeline, enabling researchers to obtain calibrated data ready for photometric studies.
Sapphire++ (Simulating astrophysical plasmas and particles with highly relativistic energies in C++) numerically solves the Vlasov–Fokker–Planck equation for astrophysical applications. It employs a numerical algorithm based on a spherical harmonic expansion of the distribution function, expressing the Vlasov–Fokker–Planck equation as a system of partial differential equations governing the evolution of the expansion coefficients. The code uses the discontinuous Galerkin method in conjunction with implicit and explicit time stepping methods to compute these coefficients, providing significant flexibility in its choice of spatial and temporal accuracy.
arctic_weather analyzes meteorological data recorded from High Arctic weather stations (called Inuksuit) deployed on coastal mountains north of 80 degrees latitude on Ellesmere Island, Canada from 2006 through 2009, along with clear-sky fractions from horizon-viewing sky-monitoring cameras. The code calculates solar and lunar elevations, allowing correlation of the polar night with the development of prevailing thermal inversion conditions in winter, and statistical comparison with other optical/infrared observatory sites.
COBRA (Cosmology with Optimally factorized Bases for Rapid Approximation) rapidly computes large-scale structure observables, separating scale dependence from cosmological parameters in the linear matter power spectrum while also minimizing the number of necessary basis terms. This enables direct and efficient computation of derived and nonlinear observables. Moreover, the dependence on cosmological parameters is efficiently approximated using radial basis function interpolation. COBRA opens a new window for efficient computations of higher loop and higher order correlators involving multiple powers of the linear matter power spectra. The resulting factorization can also be utilized in clustering, weak lensing, and CMB analyses.
show_cube displays the results of reducing, aligning, and combining near-infrared integral field spectroscopy with the Gemini Observatory NIFS (Near-infrared Integral Field Spectrometer) instrument. Image slices are extracted from the raw data frames to make the input datacube. The code site also provides a tarfile containing all the raw NIFS FITS-format files for the observations of the high-redshift radio galaxies 3C230, 3C294, and 4C+41.17; observations of the last are reported together with line strengths using the MAPPINGS III (ascl:1306.008) shock models.
Coniferest implements anomaly detection algorithms and interactive active learning tools. The centerpiece of the package is an Isolation Forest algorithm, which operates by constructing random decision trees. Coniferest also offers two modified versions for active learning: AAD Forest and Pineforest. The AAD Forest modifies the Isolation Forest by reweighting its leaves based on responses from human experts, providing a faster alternative to the ad_examples package. Pineforest employs a filtering algorithm that builds and dismantles trees with each new human-machine iteration step. The Coniferest package provides a user-friendly interface for conducting interactive human-machine sessions; the code has been used for anomaly detection with a particular focus on light-curve data from large time-domain surveys.
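The Isolation Forest principle — anomalies are separated from the bulk of the data in fewer random splits — can be demonstrated in one dimension. This is a bare sketch of the idea with illustrative names, not Coniferest's implementation (which builds full multi-dimensional trees and, in its active-learning variants, reweights or rebuilds them from expert feedback):

```python
import random

def isolation_depth(x, points, rng, max_depth=15):
    """Number of random splits needed to isolate x among the points."""
    pts, depth = list(points), 0
    while len(pts) > 1 and depth < max_depth:
        lo, hi = min(pts), max(pts)
        if lo == hi:
            break
        split = rng.uniform(lo, hi)
        # Keep only the points on the same side of the split as x
        pts = [p for p in pts if (p <= split) == (x <= split)]
        depth += 1
    return depth

def average_depth(x, points, n_trees=200, seed=0):
    """Average over many random trees; anomalies get shallow depths."""
    rng = random.Random(seed)
    return sum(isolation_depth(x, points, rng) for _ in range(n_trees)) / n_trees

data = [0.05 * i for i in range(20)] + [10.0]  # tight cluster plus one outlier
```

The anomaly score is a monotonic function of this average depth: the outlier at 10.0 isolates far faster than a point inside the cluster.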
Tayph analyzes high-resolution spectroscopic time-series observations of close-in exoplanets using a cross-correlation technique. The tool can be applied to transit observations of hot Jupiters made with echelle spectrographs at optical wavelengths. In particular, it can be applied to pipeline-reduced observations by HARPS, HARPS-N, ESPRESSO, CARMENES and to a certain extent UVES, with minimal interaction required. Tayph works on observations made with other instruments, provided the user provides these according to a specific format, and can also be used in conjunction with Molecfit (ascl:1501.013).
Nii-C implements a framework for automatic parallel tempering Markov Chain Monte Carlo. Parameters that ensure an efficient parallel tempering process are set by a control system during the initial stages of sampling. The autotuned parameters consist of two parts: the temperature ladders of all parallel tempering Markov chains, and the proposal distributions for all model parameters across all chains. Written in C, Nii-C supersedes the Python code Nii (ascl:2111.010). Nii-C is parallelized using the message-passing interface protocol to optimize the efficiency of parallel sampling, facilitating rapid convergence in the sampling of high-dimensional and multimodal distributions as well as fast execution. The code can be used to trace complex distributions due to its high sampling efficiency and quick execution speed.
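The mechanics of parallel tempering are compact: each chain runs Metropolis updates at its own inverse temperature β, and adjacent chains periodically propose to swap states with acceptance probability min(1, exp((β₁ − β₂)(logp₂ − logp₁))). A two-chain sketch in plain Python (Nii-C itself is C, MPI-parallel, and autotunes the ladder and proposals; everything named here is illustrative):

```python
import math
import random

def pt_step(states, betas, log_post, rng, step=0.8):
    """One iteration: within-chain Metropolis updates, then a swap proposal."""
    for k, beta in enumerate(betas):
        prop = states[k] + rng.gauss(0.0, step)
        if math.log(rng.random()) < beta * (log_post(prop) - log_post(states[k])):
            states[k] = prop
    # Replica-swap acceptance between the cold (betas[0]) and hot chain
    d = (betas[0] - betas[1]) * (log_post(states[1]) - log_post(states[0]))
    if math.log(rng.random()) < d:
        states[0], states[1] = states[1], states[0]
    return states

# Sample a standard normal; the hot chain (beta = 0.3) aids mode-hopping
log_post = lambda x: -0.5 * x * x
rng = random.Random(42)
states, betas = [0.0, 0.0], [1.0, 0.3]
cold = []
for _ in range(3000):
    states = pt_step(states, betas, log_post, rng)
    cold.append(states[0])
```

Only the β = 1 chain's samples are kept; the hot chain exists purely to ferry the sampler across low-probability barriers.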
The Opacity Wizard performs easy and fast visualizations of opacity and abundance data for exoplanet and brown dwarf atmospheres. It was designed to be used by observers studying these substellar objects as a way to explore which molecules are most important for a given planet and predict where the absorption features of those molecules will lie. Opacity Wizard provides an iPython notebook with widgets for choosing a pressure, temperature, metallicity, and molecule list, and then creates plots of the mixing ratios and opacities.
SysSimPyMMEN infers the minimum-mass extrasolar nebula (MMEN), a power-law profile for the minimum mass in disk solids required to form the existing exoplanets if they formed in their present locations. Designed to work with the SysSim clustered planetary system models (ascl:2507.001) that characterize the underlying occurrence and intra-system correlations of multi-planet systems, SysSimPyMMEN can also be applied to any other planetary system.
SysSimPyPlots loads, plots, and visualizes the simulated catalogs generated by ExoplanetsSysSim (ascl:2507.001), a comprehensive forward modeling framework for studying planetary systems based on the Kepler mission. In particular, it is designed to work with the SysSim clustered planetary system models (ascl:2507.003) that characterize the underlying occurrence and intra-system correlations of multi-planet systems. Unlike the SysSim codebase, which is written in Julia, SysSimPyPlots is written almost entirely in Python 3.
SysSimExClusters provides a comprehensive forward modelling framework for studying planetary systems in conjunction with ExoplanetsSysSim (ascl:2507.001). It includes several statistical models for describing the intrinsic planetary systems, their architectures, and the correlations within multi-planet systems using the Kepler population of exoplanet candidates.
LSCS (Lightweight Space Coronagraph Simulator) simulates realistic high-contrast space imaging instruments in their linear regime of small wavefront perturbations about the nominal dark hole. The code can be used for testing high-order wavefront sensing and control as well as post-processing algorithms. It models broadband images with sensor noise, wavefront drift, actuators drift, and residual effects from low-order wavefront sensing, and supports a model of the Roman Space Telescope Hybrid Lyot Coronagraph based on its FALCO (ascl:2304.004, ascl:2304.005) model. The LSCS package provides an example of dark hole maintenance using an Extended Kalman Filter and Electric Field Conjugation.
The ExoplanetsSysSim.jl package generates populations of planetary systems, simulates observations of those systems with a transit survey, and facilitates comparisons of simulated and observed catalogs of planetary systems. Critically, ExoplanetsSysSim accounts for intrinsic correlations in the sizes and orbital periods of planets within a planetary system.
This Physics-Informed Neural Network (PINN) code solves the modified Teukolsky equation under realistic astrophysical conditions. It embeds domain-specific physics (spin-weighted curvature perturbations, quasi-normal mode (QNM) boundary conditions, and attenuation dynamics) directly into the training loss function. Applied to data from the GW190521 event, the model infers complex QNM frequencies (ω = 0.2917 − 0.0389i) and learns an attenuation coefficient α = 0.04096, corresponding to a 14.4% decay rate. The code demonstrates strong predictive performance, reducing mean squared error by 50.3% (MSE = 0.2537 vs. 0.5310) compared to Bayesian baselines and achieving a positive R² score. It further reveals non-trivial r-t coupling and gravitational memory effects, which standard exponential decay models fail to capture. This PINN-based implementation provides a computationally efficient tool for environmental modeling in gravitational wave astrophysics and offers a path forward for black hole spectroscopy beyond vacuum assumptions.
Procoli extracts profile likelihoods in cosmology. It wraps MontePython (ascl:1805.027), the fast sampler written specifically for CLASS (ascl:1106.020); all likelihoods available for use with MontePython are hence immediately available. Procoli uses a simulated-annealing optimizer to find the global maximum-likelihood value as well as the maximum-likelihood points along the profile of any user-specified parameter.
CAMEL (Cosmological Analysis with Minuit Exploration of the Likelihood) performs cosmological parameters estimations using best fits, Monte-Carlo Markov Chains, and profile-likelihoods. Widely used in Planck satellite data analysis, by default it employs CLASS (ascl:1106.020) to compute all relevant cosmological quantities, but any other Boltzmann solver can easily be plugged in.
pinc ("profiles in cosmology") computes profile likelihoods in cosmology; it can also determine the (boundary-corrected) confidence intervals with the graphical construction. The code uses a simulated annealing scheme and interfaces with MontePython (ascl:1805.027). pinc consists of three short scripts; these automatically set the relevant parameters in MontePython, submit the minimization chains, and analyze the results.
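The profile-likelihood construction common to Procoli, CAMEL, and pinc is: for each fixed value of the parameter of interest, minimize the χ² (maximize the likelihood) over all remaining parameters. A toy sketch using a grid scan over a single nuisance parameter — real pipelines use simulated annealing in many dimensions, and all names here are illustrative:

```python
def profile_chi2(theta_grid, nuisance_grid, chi2):
    """Profile chi^2: minimize over the nuisance at each theta."""
    return [min(chi2(t, nu) for nu in nuisance_grid) for t in theta_grid]

# Toy correlated Gaussian likelihood with best fit at (theta, nu) = (1, 2);
# analytically the profile is 0.75 * (theta - 1)^2
chi2 = lambda t, nu: (t - 1.0) ** 2 + (nu - 2.0) ** 2 + (t - 1.0) * (nu - 2.0)
theta_grid = [0.0, 0.5, 1.0, 1.5, 2.0]
nuisance_grid = [0.01 * i for i in range(401)]  # nu in [0, 4]
profile = profile_chi2(theta_grid, nuisance_grid, chi2)
```

Confidence intervals then follow from the graphical construction: read off where the profile crosses Δχ² = 1 (or the boundary-corrected threshold).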
OK Binaries is a tool for identifying suitable calibration binaries from the Washington Double Star (WDS) Sixth Orbit Catalog. It calculates orbital positions at any epoch, propagates uncertainties using Monte Carlo sampling, and generates orbit plots. The web app includes automated daily updates of binary positions and a searchable interface with filters for position, magnitude, separation, and other orbital parameters. OK Binaries can be used online, as a standalone offline browser app, or via the command line.
CLUES (CLustering UnsupErvised with Sequencer) analyzes spectral and IFU data. This fully interpretable clustering tool uses machine learning to classify and reduce the effective dimensionality of data sets. It combines multiple unsupervised clustering methods with multiscale distance measures using Sequencer (ascl:2105.006) to find representative end-member spectra that can be analyzed with detailed mineralogical modeling and follow-up observations. CLUES has been used on Spitzer IRS data and debris disk science, and can be applied to other high-dimensional spectral data sets, including mineral spectroscopy in general areas of astrophysics and remote sensing.
Bjet_MCMC automatically models multiwavelength spectral energy distributions of blazars, using a one-zone synchrotron self-Compton (SSC) model with or without the addition of an external inverse-Compton process from the thermal emission of the nucleus. The code also contains manual fitting functionalities for multi-zone SSC modeling. Bjet_MCMC is built as an MCMC Python wrapper around the C++ code Bjet.
pynchrotron implements synchrotron emission from cooling electrons. It removes the dependence on GSL, which was originally relied on for quick computation of the synchrotron kernel; the relevant code has been ported from GSL, written directly in Python, and accelerated with numba. pynchrotron also includes an astromodels (ascl:2506.019) function for direct use in 3ML (ascl:2506.018).
Astromodels defines models for likelihood or Bayesian analysis of astrophysical data. Though designed for analysis in the spectral domain, it can also be used as a toolbox containing functions of any variable. Astromodels is not a modeling package; it provides the tools to build a model as complex as one needs. A separate package such as 3ML (ascl:2506.018) is needed to fit the model to the data.
The Multi-Mission Maximum Likelihood framework (3ML) provides a common high-level interface and model definition for coherent and intuitive modeling of sources using all the available data, no matter their origin. Astrophysical sources are observed by different instruments at different wavelengths with unprecedented quality, and each instrument and data type has its own ad hoc software and handling procedure. 3ML's architecture is based on plug-ins; the package uses the official software of each instrument under the hood, thus guaranteeing that 3ML always uses the best possible methodology to deal with the data of each instrument. Though Maximum Likelihood is in the name for historical reasons, 3ML is an interface to several Bayesian inference algorithms, such as MCMC and nested sampling, as well as likelihood optimization algorithms.
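The plug-in pattern that makes this work is worth seeing in miniature: each instrument wraps its own data handling and exposes one method returning a log-likelihood for the shared model, and the joint fit simply sums them. This is an illustrative sketch of the architectural idea, not the actual 3ML plugin API:

```python
class Plugin:
    """Each instrument's plugin hides its data format behind one method."""
    def get_log_like(self, model_value):
        raise NotImplementedError

class GaussianPlugin(Plugin):
    """Toy 'instrument' with a single Gaussian-distributed measurement."""
    def __init__(self, datum, sigma):
        self.datum, self.sigma = datum, sigma
    def get_log_like(self, model_value):
        return -0.5 * ((self.datum - model_value) / self.sigma) ** 2

def joint_log_like(plugins, model_value):
    # The joint fit across instruments sums the per-plugin log-likelihoods
    return sum(p.get_log_like(model_value) for p in plugins)

plugins = [GaussianPlugin(1.0, 1.0), GaussianPlugin(3.0, 1.0)]
```

Because each plugin owns its likelihood, heterogeneous data (counts, spectra, images) combine without the framework knowing anything instrument-specific.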
Hydromass analyzes galaxy cluster mass profiles from X-ray and/or Sunyaev-Zel’dovich observations. It provides a global Bayesian framework for deprojection and mass profile reconstruction, including mass model fitting, forward fitting with parametric and polytropic models, and non-parametric log-normal mixture reconstruction. Hydromass easily loads public X-COP data products and applies reconstruction tools directly within a Jupyter notebook.
SBI++ is a complete methodology based on simulation-based (likelihood-free) inference, customized for astronomical applications. Specifically, the code retains a fast inference speed of ~1 sec per object for objects within the observational training set distribution, and additionally permits parameter inference for objects outside the trained noise and data distributions at ~1 min per object. The package includes scripts for training and implementing SBI++ and depends on sbi (ascl:2306.002).
Octofitter performs Bayesian inference against a wide variety of exoplanet and binary star data. It is highly modular and allows users to easily adjust priors, change parameterizations, and specify arbitrary functional relations between the parameters of one or more planets. Octofitter further supplies tools for examining model outputs, including prior and posterior predictive checks and simulation-based calibration.
M_-M_K- converts absolute 2MASS Ks-band magnitude (or a distance and a Ks-band magnitude) into an estimate of the stellar mass using the empirical relation derived from the resolved photometry and orbits of astrometric binaries. The code requires scalar values for K, distance, and corresponding uncertainties. M_-M_K- outputs errors based on the relationship's scatter and errors in the provided distance and Ks magnitude.
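The conversion from apparent magnitude and distance to absolute magnitude, and the Monte Carlo propagation of the input uncertainties, can be sketched directly; the distance-modulus formula is standard, while the function names and the sampling setup below are illustrative (the empirical mass-M_Ks relation itself, with its fitted coefficients, is what the code supplies).

```python
import math
import random

def absolute_mag(m_ks, dist_pc):
    """Absolute magnitude via the distance modulus: M = m - 5 log10(d/10pc)."""
    return m_ks - 5.0 * math.log10(dist_pc / 10.0)

def absolute_mag_mc(m_ks, m_err, dist_pc, dist_err, n=20000, seed=1):
    """Monte Carlo propagation of magnitude and distance uncertainties."""
    rng = random.Random(seed)
    draws = [absolute_mag(rng.gauss(m_ks, m_err), rng.gauss(dist_pc, dist_err))
             for _ in range(n)]
    mean = sum(draws) / n
    std = (sum((d - mean) ** 2 for d in draws) / n) ** 0.5
    return mean, std

# Ks = 10.0 +/- 0.02 mag at 100 +/- 5 pc
mks_mean, mks_std = absolute_mag_mc(10.0, 0.02, 100.0, 5.0)
```

Each Monte Carlo draw of M_Ks would then be pushed through the empirical relation, with the relation's intrinsic scatter added, to build the mass posterior.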
OCSVM-Transit-Detection is a One-Class Support Vector Machine (SVM) model that detects exoplanet transit events. One-class SVMs fit data and make predictions faster than simple CNNs, and do not require specialized equipment such as Graphics Processing Units (GPUs). The code uses a Gaussian kernel to compute a nonlinear decision boundary. After training, OCSVM-Transit-Detection requires that lightcurves classified as containing a transit have features very similar to the lightcurves in the training dataset, thus limiting misclassifications.
The Python wrapper pyTPCI couples newer versions of the hydrodynamics code PLUTO (ascl:1010.045) and the gas microphysics code CLOUDY (ascl:9910.001) to self-consistently simulate escaping atmospheres in 1D. Following TPCI (ascl:2506.012), on which pyTPCI is based, CLOUDY is modified to read in depth-dependent wind velocities, and to output useful physical quantities (including mass density, number density, and mean molecular weight as a function of depth).
The PLUTO CLOUDY Interface (TPCI) combines the PLUTO (ascl:1010.045) and CLOUDY (ascl:9910.001) simulation codes to simulate hydrodynamic evolution under irradiation from a source. The code solves the photoionization and chemical network of the 30 lightest elements. By combining an equilibrium photoionization solver with a general MHD code, TPCI provides an advanced simulation tool applicable to a variety of astrophysical problems.
easyCHEM calculates chemical equilibrium abundances (including condensation) and adiabatic gradients by minimizing the Gibbs free energy. Ancillary outputs are the atmospheric adiabatic temperature gradient and mean molar mass. Because easyCHEM incorporates the dgesv routine from LAPACK (ascl:2104.020) for fast matrix inversion, external math libraries are not required.
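Of the ancillary outputs, the mean molar mass follows directly from the equilibrium mole fractions: μ = Σᵢ xᵢ Mᵢ. A minimal sketch (the formula and species masses are standard; the function name and the example mixture are illustrative, not easyCHEM's interface):

```python
def mean_molar_mass(mole_fractions, molar_masses):
    """Mean molar mass mu = sum_i x_i * M_i for mole fractions x_i."""
    assert abs(sum(mole_fractions.values()) - 1.0) < 1e-8  # fractions sum to 1
    return sum(x * molar_masses[sp] for sp, x in mole_fractions.items())

masses = {"H2": 2.016, "He": 4.0026}  # g/mol
mu = mean_molar_mass({"H2": 0.85, "He": 0.15}, masses)  # H2-He mixture
```

In easyCHEM the mole fractions themselves come out of the Gibbs minimization, so μ updates self-consistently with temperature and pressure.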
GRIP (Generic data Reduction for nulling Interferometry Package) reduces nulling data with enhanced statistical self-calibration methods from any nulling interferometric instrument within a single and consistent framework. The toolbox self-calibrates null depth measurements by fitting a model of the instrumental perturbations to histograms of data. The model is generated using a simulator of the instrument built into the package for the main operating nullers or provided by the user. GRIP handles baseline discrimination and spectral dispersion and features several optimization strategies, including least squares, maximum likelihood, and MCMC with emcee (ascl:1303.002), and works on GPUs using the cupy library.
DART-Vetter distinguishes planetary candidates from false positives detected in any transiting survey, and is tailored for photometric data collected from space-based missions. The Convolutional Neural Network is trained on Kepler and TESS Threshold Crossing Events (TCEs), and processes only light curves folded on the period of the relative signal. DART-Vetter has a simple and compact architecture; it is lightweight enough to be executed on personal laptops.
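The preprocessing step — folding a light curve on the candidate period so that every transit lands at the same phase — is simple enough to show directly. A generic sketch (illustrative names; not DART-Vetter's code):

```python
def fold(times, period, t0=0.0):
    """Phase-fold observation times on a candidate period.
    Phase is ((t - t0) / P) mod 1, so repeated transits overlap."""
    return [((t - t0) / period) % 1.0 for t in times]

phases = fold([0.0, 1.5, 3.25], period=1.0)
```

The folded, binned curve is what the CNN sees: a single stacked transit shape rather than a long sparse time series.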
excalibuhr is an end-to-end pipeline, designed for VLT/CRIRES+, that extracts high-resolution spectra. The package preprocesses raw calibration files, including darks, flats, and lamp frames, and can trace spectral orders on 2D detector images. It applies calibrations to science frames, can remove the sky background by nodding subtraction, and combines frames per nodding position. excalibuhr can also extract 1D spectra and perform wavelength and flux calibration.
Gen TSO estimates signal-to-noise ratios for transit/eclipse depths through an interactive graphical interface, similar to the JWST Exposure Time Calculator (ETC). This interface leverages the ETC by combining its noise simulator, Pandeia, with additional exoplanet resources from the NASA Exoplanet Archive, the Gaia DR3 catalog, and the TrExoLiSTS database of JWST programs. Gen TSO calculates S/Ns for all JWST instruments for the spectroscopic time-series modes available as of the Cycle 4 GO call. It also simulates target acquisition on the science targets or, when needed, on nearby stellar targets.
VBMicrolensing performs efficient computation in gravitational microlensing events using the advanced contour integration method, supporting single, binary and multiple lenses. It calculates magnification by single, binary and multiple lenses, centroid of the images generated by single and binary lenses, and critical curves and caustics of binary and multiple lenses. It also computes complete light curves including several higher order effects, such as limb darkening of the source, binary source, parallax, xallarap, and circular and elliptic orbital motion.
VBMicrolensing is written as a C++ library and wrapped as a Python package; the code can be called from either C++ or Python. This package encompasses VBBinaryLensing (ascl:1809.004), which is at the basis of several platforms for microlensing modeling. VBBinaryLensing will still be available as a legacy software, but will no longer be maintained.
TESS-cont quantifies the flux fraction coming from nearby stars in the TESS photometric aperture of any observed target. The package identifies the main contaminant Gaia DR2/DR3 sources, quantifies their individual and total flux contributions to the aperture, and determines whether any of these stars could be the origin of the observed transit and variability signals. Written in Python, TESS-cont is based on building the pixel response functions (PRFs) of nearby Gaia sources and computing their flux distributions across the TESS Target Pixel Files (TPFs) or Full Frame Images (FFIs).
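The core bookkeeping — weighting each star's flux by the fraction of its PRF that falls inside the aperture, then comparing the target's contribution to the total — can be sketched in a few lines. The star list and weights below are made-up numbers for illustration, not TESS-cont's actual API or PRF model:

```python
import numpy as np

# Hypothetical star list: index 0 is the target, the rest are nearby sources.
fluxes  = np.array([1000.0, 150.0, 30.0, 5.0])   # stellar fluxes (arbitrary units)
in_aper = np.array([0.85, 0.40, 0.10, 0.02])     # fraction of each star's PRF in the aperture

contrib = fluxes * in_aper                       # flux each star deposits in the aperture
total   = contrib.sum()
# Fraction of the aperture flux NOT coming from the target:
contamination = 1.0 - contrib[0] / total
```

In TESS-cont the per-star aperture weights come from Gaia positions/magnitudes and the TESS pixel response functions evaluated on the TPF or FFI pixel grid.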
SMART (Spectral Modeling Analysis and RV Tool) forward models spectral data. The method works best in spectral orders with both strong telluric absorption features, for accurate wavelength calibration, and sufficient structure in the stellar spectrum to distinguish it from the telluric absorption. The code uses Markov Chain Monte Carlo (MCMC) methods to determine stellar parameters such as effective temperature, surface gravity, and rotational velocity, as well as calibration factors, including continuum and wavelength corrections, the instrumental line-spread function (LSF), and the strength of telluric absorption. SMART has been used with high-resolution near-infrared spectrometers such as Keck/NIRSPEC, SDSS/APOGEE, and Gemini/IGRINS, among others, and with medium-resolution spectrometers, including Keck/OSIRIS and Keck/NIRES.
The MAGIC (Microlensing Analysis Guided by Intelligent Computation) PyTorch framework efficiently and accurately infers the microlensing parameters of binary events with realistic data quality. The code divides binary microlensing parameters into two groups, which are inferred separately with different neural networks. The neural controlled differential equation handles light curves with irregular sampling and large data gaps. MAGIC can achieve fractional uncertainties of a few percent on the binary mass ratio and separation, and can locate the degenerate solutions even when large data gaps are introduced. As irregular samplings are common in astronomical surveys, this code may be useful for other time series studies.
Cumulative Time Dilation (CTD) calculates and plots the total time dilation experienced by a point (Earth) located at the center of a spherical mass-energy distribution. There are both analytical and numerical solutions for two different descriptions of how gravity acts across cosmological distances. The calculations are done for universes filled with a single energy type (dark energy; matter, including dark matter; or radiation) as well as the concordance model.
Hibridon solves the close-coupled equations which occur in the quantum treatment of inelastic atomic and molecular collisions. Gas-phase scattering, photodissociation, collisions of atoms and/or molecules with flat surfaces, and bound states of weakly-bound complexes can be treated.
The AIRI (AI for Regularization in radio-interferometric Imaging) algorithms are Plug-and-Play (PnP) algorithms propelled by learned regularization denoisers and endowed with robust convergence guarantees. The (unconstrained) AIRI algorithm is built on a Forward-Backward optimization algorithmic backbone enabling handling soft data-fidelity terms. AIRI's primary application is to solve large-scale high-resolution high-dynamic range inverse problems for RI in radio astronomy, more specifically 2D planar monochromatic intensity imaging.
The SCATTERING code solves the coupled equations for a given scattering system, provides the scattering S-matrix elements, and calculates the state-to-state cross-sections. Its approach is different from codes such as MOLSCAT (ascl:1206.004) or Hibridon (ascl:2505.020), as SCATTERING solves coupled equations in the body-fixed (BF) frame, where the coupling matrix exhibits a predominantly block-diagonal structure with blocks interconnected by centrifugal terms. This significantly reduces computational time and memory requirements.
TD-CARMA estimates cosmological time delays by modeling observed and irregularly sampled light curves as realizations of a continuous autoregressive moving average (CARMA) process, using MultiNest (ascl:1109.006) for Bayesian inference. TD-CARMA accounts for heteroskedastic measurement errors and microlensing, an additional source of independent extrinsic long-term variability in the source brightness.
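A full CARMA(p,q) likelihood is beyond a short sketch, but its simplest special case, the CAR(1) (damped random walk) process, can be simulated exactly at irregular time stamps, which illustrates why CARMA models suit gappy quasar light curves. This is an independent illustration, not TD-CARMA code:

```python
import numpy as np

def simulate_car1(t, tau, sigma, mean=0.0, seed=0):
    """Exact simulation of a CAR(1) (damped random walk) process at
    arbitrary, irregular times t; tau is the relaxation time and sigma
    the long-term standard deviation of the process."""
    rng = np.random.default_rng(seed)
    x = np.empty(len(t))
    x[0] = mean + sigma * rng.standard_normal()
    for i in range(1, len(t)):
        dt = t[i] - t[i - 1]
        rho = np.exp(-dt / tau)   # autocorrelation decays over the gap
        x[i] = (mean + rho * (x[i - 1] - mean)
                + sigma * np.sqrt(1 - rho**2) * rng.standard_normal())
    return x

# Irregular sampling with large gaps poses no problem for the exact update above
t = np.sort(np.random.default_rng(1).uniform(0, 100, 300))
lc = simulate_car1(t, tau=20.0, sigma=0.3)
```

Because the conditional update is exact for any time step, no interpolation onto a regular grid is needed — the same property that makes CARMA likelihoods tractable for irregular data.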
iSLAT (the interactive Spectral-Line Analysis Tool) provides an interactive interface for the visualization, exploration, and analysis of molecular spectra. Synthetic spectra are made using a simple slab model; the code uses molecular data from HITRAN. iSLAT has been tested on spectra at infrared wavelengths as observed at different resolving powers (R = 700-90,000) with JWST-MIRI, Spitzer-IRS, VLT-CRIRES, and IRTF-ISHELL.
CETRA (Cambridge Exoplanet Transit Recovery Algorithm) detects transits by performing a linear transit search whose results are then phase-folded in a periodic signal search, using a physically motivated transit model to improve detection sensitivity. Implemented with NVIDIA's CUDA platform, the code outperforms traditional methods such as Box Least Squares and Transit Least Squares in both sensitivity and speed. It can also be used to identify transits that are not periodic in the input light curve. CETRA is designed to be run on detrended light curves.
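CETRA's GPU-accelerated search with a physical transit model is not reproduced here, but the phase-folding step at its heart can be illustrated with a crude box-search toy in NumPy (all names and numbers below are illustrative, not CETRA's implementation):

```python
import numpy as np

def fold(time, period, t0=0.0):
    """Phase-fold a time array on a trial period (phase in [0, 1))."""
    return ((time - t0) / period) % 1.0

# Toy light curve: 1% deep box transits at a 3.2-day period plus white noise
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 30, 2000))
flux = 1.0 + 1e-4 * rng.standard_normal(t.size)
flux[fold(t, 3.2) < 0.02] -= 0.01

def folded_depth(time, flux, period, nbins=50):
    """Depth of the deepest phase bin after folding on a trial period."""
    bins = np.minimum((fold(time, period) * nbins).astype(int), nbins - 1)
    means = np.array([flux[bins == b].mean() for b in range(nbins)])
    return np.nanmax(np.median(flux) - means)

# Scan trial periods; the true period aligns the transits into one bin
periods = np.linspace(2.5, 4.0, 151)
depths = [folded_depth(t, flux, p) for p in periods]
best = periods[int(np.argmax(depths))]
```

At the correct trial period the individual transits stack into a single deep phase bin, which is why folding boosts the detectability of signals too shallow to find in a single event.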
afterglowpy models Gamma-ray burst afterglows. It computes synchrotron radiation from an external shock and is capable of handling both structured jets and off-axis observers. The code provides fully trans-relativistic shock evolution through a constant density medium, on-the-fly integration over the equal-observer-time slices of the shock surface, and includes an approximate prescription for jet spreading. afterglowpy has been calibrated to the BoxFit code (ascl:2306.059) and produces similar light curves for top hat jets (within 50% when the same parameters are used) both on- and off-axis.
tBilby is a trans-dimensional Bayesian inference tool based on the Bilby (ascl:1901.011) inference package. It provides tools and examples to facilitate trans-dimensional Bayesian inference and offers a high degree of flexibility in constructing models and defining priors. tBilby seeks to further develop trans-dimensional Bayesian inference.
The 1D cloud model code ExoLyn solves the transport equation of cloud particles and vapor under cloud condensation rates that are self-consistently calculated from thermodynamics. It can be combined with optool (ascl:2104.010) to calculate solid opacities and with petitRADTRANS (ascl:2207.014) to generate transmission or emission spectra. The code balances physical consistency with computational efficiency, opening the possibility of joint retrieval of exoplanets' gas and cloud components. ExoLyn has been designed to study cloud formation across a variety of planets, such as hot Jupiters, sub-Neptunes, and self-luminous planets.
Eclipsoid provides a general framework allowing rotational deformation to be modeled in transits, occultations, phase curves, transmission spectra and more of bodies in orbit around each other, such as an exoplanet orbiting a host star. It is an extension of jaxoplanet (ascl:2504.028).
Exo-MerCat generates a catalog of known and candidate exoplanets, collecting and selecting the most precise measurement for all interesting planetary and orbital parameters contained in exoplanet databases. It retrieves a common name for the planet target, linking its host star name with the preferred identifier in the most well-known stellar databases, and accounts for the presence of multiple aliases for the same target. The code standardizes the output and notation differences and homogenizes the data in a VO-aware way. Exo-MerCat also provides a graphical user interface to filter data based on the user's constraints and generate automatic plots that are commonly used in the exoplanetary community.
BEM predicts the radius of exoplanets based on their planetary and stellar parameters. The code uses the random forests machine learning algorithm to derive reliable radii, especially for planets between 4 R⊕ and 20 R⊕ for which the error is under 25%. BEM computes error bars for the radius predictions and can also create diagnostic plots.
The Aeolus library, written in Python, analyzes and plots climate model output using modules to work with 3D general circulation models of planetary atmospheres. The code provides various functions tailored to exoplanet research, e.g., in the context of tidally-locked exoplanets. Generic (planet-independent constants) and basic constants of the Earth atmosphere are also provided. Aeolus can store model-specific variable and coordinate names in one container, which can be passed to various functions, and can also calculate the synthetic transmission spectrum.
Jitter predicts radial-velocity (RV) jitter due to stellar oscillations and granulation, in terms of various sets of fundamental stellar properties. The code can also be used to set a prior for the jitter term as a component when modeling the Keplerian orbits of the exoplanets.
jnkepler models photometric and radial velocity data of multi-planet systems via N-body integration. Built with JAX, it leverages automatic differentiation for efficient computation of model gradients. This enables seamless integration with gradient-based optimizers and Hamiltonian Monte Carlo methods, including the No-U-Turn Sampler (NUTS) in NumPyro (ascl:2505.005). jnkepler is particularly suited for efficiently sampling from multi-planet posteriors involving a larger number of parameters and strong degeneracy.
The lightweight probabilistic programming library NumPyro provides a NumPy backend for Pyro (ascl:2110.016). It relies on JAX for automatic differentiation and JIT compilation to GPU/CPU. The code focuses on providing a flexible substrate for users to build on, including Pyro Primitives, inference algorithms with a particular focus on MCMC algorithms such as Hamiltonian Monte Carlo, and distribution classes, constraints and bijective transforms. NumPyro also provides effect-handlers that can be extended to implement custom inference algorithms and inference utilities.
Eureka! reduces and analyzes exoplanet time-series observations; though particularly focused on JWST data, it also handles HST observations. Starting with raw, uncalibrated FITS files, it reduces time-series data to precise exoplanet transmission and emission spectra. The code can perform flat-fielding, unit conversion, background subtraction, and optimal spectral extraction. It can generate a time series of 1D spectra for spectroscopic observations and a single light curve of flux versus time for photometric observations. Eureka! can also fit light curves with noise and astrophysical models using different optimization or sampling algorithms and is able to display the planet spectrum in figure and table form.
pyGCG provides a graphical user interface for viewing and classifying NIRISS-WFSS data products. Though originally designed for use by the GLASS-JWST collaboration, this software has been tested against the data products from the PASSAGE collaboration as well. pyGCG allows users to interactively browse a selection of reduced data products with the option of also writing classifications to a table.
SWIFTGalaxy analyzes particles belonging to individual simulated galaxies. The code provides a software abstraction of simulated galaxies produced by the SWIFT smoothed particle hydrodynamics code (ascl:1805.020) and extends the SWIFTSimIO module. SWIFTGalaxy inherits from and extends the functionality of the SWIFTDataset. It understands the output of halo finders and therefore which particles belong to a galaxy and its integrated properties. The particles occupy a coordinate frame that is enforced to be consistent, such that particles loaded on-the-fly will, for example, match rotations and translations of particles already in memory. Intuitive masking of particle datasets is also enabled. Finally, SWIFTGalaxy provides utilities that make working in cylindrical and spherical coordinate systems more convenient.
speclib provides a lightweight Python interface for loading, manipulating, and analyzing stellar spectra and model grids. The code can load a spectral grid into memory and linearly interpolate between temperature grid points to generate component spectra. speclib includes utilities for photometric synthesis, spectral resampling, and SED construction using stellar spectral libraries.
This modular Python-based pipeline provides tools for computing background cosmological quantities and Fourier-space power spectra for multiple tracers of large-scale structure, such as galaxies and 21cm intensity maps. It is designed for multitracer Fisher forecasting both in the linear regime and on nonlinear scales using HALOFIT. The pipeline enables forecasts of cosmological parameters such as f_NL, fσ8, and tracer bias parameters. Its flexible architecture includes independently callable modules for the Hubble parameter, comoving distance, growth functions, matter power spectrum, transfer functions, and cross-power spectrum combinations. The code supports both theoretical survey design and nonlinear parameter estimation, making it suitable for a wide range of cosmological analyses.
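The Fisher-forecasting step can be illustrated generically: for Gaussian-distributed data the Fisher matrix is F_ij = Σ_k (∂m_k/∂θ_i)(∂m_k/∂θ_j)/σ_k², and 1σ parameter forecasts follow from the square roots of the diagonal of its inverse. A minimal sketch with a toy power-spectrum model (not this pipeline's API):

```python
import numpy as np

def fisher_matrix(model, theta0, sigma, eps=1e-5):
    """Gaussian Fisher matrix with model derivatives taken by
    central finite differences about the fiducial parameters theta0."""
    theta0 = np.asarray(theta0, dtype=float)
    derivs = []
    for i in range(len(theta0)):
        step = np.zeros_like(theta0)
        step[i] = eps
        derivs.append((model(theta0 + step) - model(theta0 - step)) / (2 * eps))
    D = np.array(derivs)                        # shape (n_params, n_data)
    return D @ np.diag(1.0 / sigma**2) @ D.T

# Toy "power spectrum" P(k) = A * k^n with fiducial A = 1, n = -1.5
k = np.linspace(0.05, 0.5, 20)
model = lambda th: th[0] * k ** th[1]
F = fisher_matrix(model, [1.0, -1.5], sigma=0.05 * np.ones_like(k))
marginal_errors = np.sqrt(np.diag(np.linalg.inv(F)))   # marginalized 1-sigma forecasts
```

Multitracer forecasts simply stack the auto- and cross-spectra of all tracers into one data vector before building F.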
infrared_comparison compares the downwelling infrared radiation, or sky spectral brightness, of arctic/antarctic astronomical observing sites with that of the best mid-latitude mountain sites. The code site provides a tarfile of Fourier-transform spectra from 3.3 to 20 microns, obtained near Eureka, on Ellesmere Island, Canada, along with meteorological data. The code can compare these, via an atmospheric thermal-inversion model, to reported values for the South Pole and for mid-latitude sites, such as Maunakea.
arctic_mass_dimm reduces data from the Multi-Aperture Scintillation Sensor (MASS) and Differential Image Motion Monitor (DIMM) obtained at the Polar Environment Atmospheric Research Laboratory (PEARL), reporting seeing conditions and comparing them to those at other observatories. The code site provides a tarfile of all MASS and DIMM data obtained near Eureka, on Ellesmere Island, Canada, in 2011/12, along with associated meteorological data. The code employs a simple two-component atmospheric model to allow comparison of PEARL to mid-latitude sites such as Maunakea.
allsky performs photometry of Polaris with the Polar Environment Atmospheric Research Laboratory (PEARL) All-Sky Camera (PASI) to report transparency measurements, with comparison to conditions at other observatories worldwide. The code site provides a tarfile of PASI data obtained near Eureka, on Ellesmere Island, Canada, during the dark periods of 2008/09 and 2009/10, along with associated meteorological data. The code employs a simple atmospheric thermal-inversion model, with a power-law fit to ice-crystal attenuation, allowing direct comparison of PEARL dark-time photometric-sky statistics to those of mid-latitude sites such as Maunakea.
astromorph performs automatic classification of astronomical objects based on their morphology using self-supervised machine learning. Written in Python, the pipeline is an implementation, for astronomical images in FITS-format files, of the Bootstrap Your Own Latents (BYOL; Grill et al. 2020) method, which does not require labelling of the training data.
SHELLFISH (SHELL Finding In Spheroidal Halos) finds the splashback shells of individual halos within cosmological simulations. It uses a command line toolchain to produce human-readable catalogs. It requires a configuration file that describes the layout of the particle snapshots and halo catalog and which halos to measure the splashback shell for; once that is provided, SHELLFISH takes care of the rest. It supports numerous particle catalog types, including gotetra, Gadget-2, and Bolshoi, all text column-based halo catalogs, and consistent-trees merger trees.
JOFILUREN analyzes and de-noises scientific data and is useful for studying and reducing the physical effects of particle noise in particle-mesh computer simulations. It uses wavelets, which can efficiently remove noise from cosmological, galaxy and plasma N-body simulations. Written in Fortran, the code is portable and can be included in grid-based N-body codes. JOFILUREN can also be applied for removing noise from standard data, such as signals and images.
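The idea behind wavelet denoising — transform, shrink small coefficients that are presumed to be noise, transform back — can be shown with a one-level Haar transform and soft thresholding. JOFILUREN itself is a Fortran code with a more sophisticated multiresolution scheme, so this NumPy sketch is only illustrative:

```python
import numpy as np

def haar_denoise(x, threshold):
    """One-level Haar wavelet soft-threshold denoising.

    x must have even length; threshold is applied to the detail band only.
    """
    a = (x[0::2] + x[1::2]) / np.sqrt(2)     # approximation (smooth) coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)     # detail (fluctuation) coefficients
    d = np.sign(d) * np.maximum(np.abs(d) - threshold, 0.0)  # soft threshold
    y = np.empty_like(x)
    y[0::2] = (a + d) / np.sqrt(2)           # inverse Haar transform
    y[1::2] = (a - d) / np.sqrt(2)
    return y

rng = np.random.default_rng(0)
signal = np.sin(np.linspace(0, 4 * np.pi, 256))
noisy = signal + 0.2 * rng.standard_normal(256)
clean = haar_denoise(noisy, threshold=0.2 * np.sqrt(2))
```

A smooth signal concentrates in the approximation band while white noise spreads evenly over both, so thresholding the detail band removes noise with little bias to the signal; multi-level transforms extend the same idea across scales.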
Vela.jl performs Bayesian pulsar timing and noise analysis. It supports narrowband and wideband TOAs along with most commonly used pulsar timing models. The code provides an independent, efficient, and parallelized implementation of the full nonlinear pulsar timing and noise model and includes a Python binding (pyvela). One-time operations such as data file input, clock corrections, and solar system ephemeris computations are performed by pyvela with the help of the PINT (ascl:1902.007) pulsar timing package.
DMCalc estimates the Dispersion Measure (DM) of wide-band pulsar data in psrfits format. It uses PSRCHIVE (ascl:1105.014) tools to get ToAs and then uses TEMPO2 (ascl:1210.015) for DM fitting. A median absolute deviation (MAD) based ToA rejection algorithm is implemented in the code to remove large outlier ToAs using Huber Regression. Although the code has been used for analyzing uGMRT wide-band data, DMCalc can in principle be used for any pulsar dataset.
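The MAD-based rejection step can be sketched independently of DMCalc: estimate a robust standard deviation from the median absolute deviation and flag ToA residuals beyond a few robust sigmas. The 1.4826 factor converts a MAD to a Gaussian-equivalent sigma; all numbers below are illustrative:

```python
import numpy as np

def mad_clip(residuals, nsigma=5.0):
    """Return a boolean mask that keeps points within nsigma robust
    standard deviations of the median; sigma is estimated from the MAD,
    so the estimate is insensitive to the outliers being flagged."""
    med = np.median(residuals)
    mad = np.median(np.abs(residuals - med))
    sigma = 1.4826 * mad                 # MAD -> Gaussian-equivalent sigma
    return np.abs(residuals - med) <= nsigma * sigma

rng = np.random.default_rng(0)
toa_resid = rng.normal(0.0, 1.0, 500)    # simulated ToA residuals (e.g., microseconds)
toa_resid[[10, 200, 321]] += 50.0        # inject three gross outliers
keep = mad_clip(toa_resid)               # outliers are flagged; inliers survive
```

Using the median and MAD rather than the mean and standard deviation keeps the threshold stable even when the outliers themselves would corrupt a classical sigma estimate.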
TempoNest performs a Bayesian analysis of pulsar timing data, which allows for the robust determination of the non-linear pulsar timing solution simultaneously with a range of additional stochastic parameters. This includes both red spin noise and dispersion measure variations using either power law descriptions of the noise, or through a model-independent method that parameterizes the power at individual frequencies in the signal. It uses the Bayesian inference tool MultiNest (ascl:1109.006) to explore the joint parameter space, while using Tempo2 (ascl:1210.015) as a means of evaluating the timing model. TempoNest allows for the analysis of additional stochastic signals beyond the white noise described by the TOA error bars that may be present in the data.
The highly optimized Kotekan framework processes streaming data. Written in C/C++, it is primarily designed for use on radio telescopes and was originally developed for the CHIME project. It is similar to radio projects such as GNUradio (ascl:2504.029) or Bifrost (ascl:1711.021), though it has a greater focus on efficiency and throughput. Kotekan is conceptually straightforward: data is carried through the system in a series of ring buffer objects, connected by processing blocks that manipulate the data, and optional metadata structures can be passed alongside the streaming data.
The GNU Radio toolkit provides signal processing blocks to implement software radios. A software radio performs signal processing in software instead of using dedicated integrated circuits in hardware. The benefit is that since software can be easily replaced in the radio system, the same hardware can be used to create many kinds of radios for many different communications standards. GNU Radio can be used with readily-available low-cost external RF hardware to create software-defined radios and to simulate wireless communications.
jaxoplanet is a functional-programming-forward implementation of many features from the exoplanet and starry packages built on top of JAX (ascl:2111.002). It includes fast and robust implementations of many exoplanet-specific operations, including solving Kepler’s equation, and computing limb-darkened light curves. jaxoplanet has first-class support for hardware acceleration using GPUs and TPUs, and integrates seamlessly with modeling tools such as NumPyro (ascl:2505.005) and Flax (ascl:2504.026).
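jaxoplanet's Kepler solver is implemented in JAX with careful numerics; the underlying operation — inverting E − e·sin(E) = M for the eccentric anomaly E — can be sketched with a plain NumPy Newton iteration (an illustration, not jaxoplanet's algorithm):

```python
import numpy as np

def kepler_solve(M, e, tol=1e-12, max_iter=50):
    """Solve Kepler's equation E - e*sin(E) = M for the eccentric anomaly E
    by Newton iteration (elliptical case, 0 <= e < 1)."""
    M = np.asarray(M, dtype=float)
    E = M + e * np.sin(M)                 # reasonable starting guess
    for _ in range(max_iter):
        # Newton step: f(E) = E - e*sin(E) - M, f'(E) = 1 - e*cos(E)
        dE = (E - e * np.sin(E) - M) / (1.0 - e * np.cos(E))
        E -= dE
        if np.max(np.abs(dE)) < tol:
            break
    return E

M = np.linspace(0, 2 * np.pi, 100)        # mean anomalies over one orbit
E = kepler_solve(M, e=0.3)                # eccentric anomalies
```

In jaxoplanet the solver is additionally differentiable end-to-end, which is what allows Kepler's equation to sit inside gradient-based inference with NumPyro.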
picasso makes predictions for the thermodynamic properties of the gas in massive dark matter halos from gravity-only cosmological simulations. It combines an analytical model of gas properties as a function of gravitational potential with a neural network predicting the parameters of said model. Written in Python, it combines an implementation of the gas model based on JAX (ascl:2111.002) and Flax (ascl:2504.026), and models that have been pre-trained to reproduce gas properties from hydrodynamic simulations.
Flax provides a flexible end-to-end user experience for JAX users; its NNX is a simplified API that creates, inspects, debugs, and analyzes neural networks in JAX. It has first class support for Python reference semantics, enabling users to express their models using regular Python objects. Flax NNX is an evolution of the previous Flax Linen API.
RFIClean excises periodic RFI (broadband as well as narrow-band) in the Fourier domain, and then mitigates narrow-band spectral line RFI as well as broadband bursty time-domain RFI using robust statistics. Primarily designed to efficiently search and mitigate periodic RFI from GMRT time-domain data, RFIClean has evolved to mitigate any spiky (in time or frequency) RFI as well, and from any SIGPROC filterbank format data file. RFIClean uses several modules from SIGPROC (ascl:1107.016) to handle the filterbank format I/O.
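The Fourier-domain idea — periodic RFI shows up as outlier Fourier bins, which can be identified with robust statistics and removed before inverse transforming — can be illustrated with a simplistic NumPy sketch. This is not RFIClean's implementation, which replaces rather than zeros the affected bins and operates on filterbank data:

```python
import numpy as np

def excise_periodic_rfi(series, nsigma=5.0):
    """Zero Fourier bins whose amplitude is a robust outlier, then
    inverse transform (a deliberately simplistic sketch)."""
    spec = np.fft.rfft(series)
    amp = np.abs(spec)
    med = np.median(amp[1:])                       # ignore the DC bin
    mad = np.median(np.abs(amp[1:] - med))
    bad = amp > med + nsigma * 1.4826 * mad        # robust outlier threshold
    bad[0] = False                                 # never remove the mean level
    spec[bad] = 0.0
    return np.fft.irfft(spec, n=len(series))

rng = np.random.default_rng(0)
n = 4096
t = np.arange(n)
# White noise plus a strong periodic interferer at exactly bin 256
data = rng.standard_normal(n) + 5.0 * np.sin(2 * np.pi * 256 * t / n)
clean = excise_periodic_rfi(data)
```

A periodic interferer concentrates its power in a handful of bins, so removing those bins suppresses it while leaving the broadband signal of interest largely untouched.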
PDQ predicts the positions on the sky of high-redshift quasars that should provide photons that are both acausal and uncorrelated. The predicted signal-to-noise ratios are calculated at a frame rate sufficient for random-number-generation input to a loophole-free Bell test, and are calibrated against a public archival dataset of four pairs of highly separated bright stars observed simultaneously (and serendipitously) at 17 Hz with that same instrumentation from 2019 to 2021.
AstroPT trains astronomical large observation models using imagery data. The code follows a saturating log-log scaling law similar to that of textual models, and the models' performance on downstream tasks, as measured by linear probing, improves with model size up to the parameter saturation point. Other modalities can be folded into the AstroPT model, and the use of a causally trained autoregressive transformer enables integration with the wider deep learning FOSS community.
MultiREx generates synthetic transmission spectra of exoplanets. This tool extends the functionalities of the TauREx (ascl:2209.015) framework, enabling the mass production of spectra and observations with added noise. Though the package was originally conceived to train machine learning models in the identification of biosignatures in noisy spectra, it can also be used for other purposes.