ASCL.net

Astrophysics Source Code Library

Making codes discoverable since 1999

Browsing Codes

[submitted] SOAP: A Python Package for Calculating the Properties of Galaxies and Halos Formed in Cosmological Simulations

Modern large-scale cosmological hydrodynamic simulations require robust tools capable of analysing their data outputs in a parallel and efficient manner. We introduce SOAP (Spherical Overdensity and Aperture Processor), a Python package designed to compute halo and galaxy properties from SWIFT simulations that have been post-processed with a subhalo finder. SOAP takes a subhalo catalogue as input and calculates a wide array of properties for each object. It offers parallel processing via mpi4py for efficient handling of large datasets and allows for consistent property calculation across multiple halo finders. SOAP supports various halo definitions, including spherical overdensities and fixed physical apertures, providing flexibility for diverse observational comparisons. The package is compatible with both dark matter-only and full hydrodynamic simulations, producing HDF5 catalogues that integrate with the swiftsimio package for seamless unit handling.
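
As a rough illustration of the spherical-overdensity measurement described above (this is not SOAP's API, and the particle data below are synthetic), a halo mass such as M_200crit can be obtained by accumulating particle mass outward until the mean enclosed density drops below 200 times the critical density:

```python
# Illustrative sketch (not SOAP's API): a spherical-overdensity mass from
# particle radii and masses around a halo centre.
import numpy as np

def m200_crit(radii, masses, rho_crit):
    """Return (R200c, M200c) given particle radii [Mpc] and masses [Msun]."""
    order = np.argsort(radii)
    r = radii[order]
    m_enc = np.cumsum(masses[order])
    # Mean enclosed density at each particle radius
    rho_mean = m_enc / (4.0 / 3.0 * np.pi * r**3)
    inside = np.where(rho_mean >= 200.0 * rho_crit)[0]
    if len(inside) == 0:
        return 0.0, 0.0
    i = inside[-1]                      # outermost radius still above threshold
    return r[i], m_enc[i]

rng = np.random.default_rng(1)
radii = rng.uniform(0.01, 2.0, 10000)   # toy particle distribution
masses = np.full(10000, 1e9)
# rho_crit ~ 2.8e11 h^2 Msun/Mpc^3; ~1.27e11 for h ~ 0.68
print(m200_crit(radii, masses, rho_crit=1.27e11))
```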

[submitted] QUIDS: Q/U Integrated Dust Shells

QUIDS is a Python package for generating synthetic Stokes Q and U polarization maps using 3D dust density and Galactic Magnetic Field (GMF) shell data. It integrates polarized emission over log-spaced spherical shells, with polarization angles derived from GMF models such as UF23 or JF12. The goal is to explore whether small-scale structures in the GMF and dust distribution can reconstruct or preserve the large-scale polarization patterns observed across the sky. This package is particularly relevant for modeling and probing how local Galactic features contribute to or interfere with global polarization signals.
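
The shell integration can be sketched in a few lines of numpy; this is an illustration only, not the QUIDS interface, and both the dust profile and the GMF-derived polarization angles below are placeholders:

```python
# Minimal sketch (not the QUIDS API): integrate Stokes Q/U along a sightline
# over log-spaced radial shells, with the polarization angle psi taken from a
# (placeholder) magnetic-field model.
import numpy as np

def integrate_qu(n_dust, psi, p0=0.2):
    """Sum polarized emission over shells.

    n_dust : dust density sampled at each shell along the sightline
    psi    : polarization angle [rad] at each shell (B-field dependent)
    p0     : intrinsic polarization fraction
    """
    q = p0 * np.sum(n_dust * np.cos(2.0 * psi))
    u = p0 * np.sum(n_dust * np.sin(2.0 * psi))
    return q, u

shells = np.logspace(-1, 1, 64)              # log-spaced shell radii [kpc]
n_dust = np.exp(-shells / 3.0)               # toy exponential dust profile
psi = 0.3 * np.sin(shells)                   # placeholder GMF-derived angles
print(integrate_qu(n_dust, psi))
```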

[ascl:2507.018] ysoisochrone: Python package to estimate masses and ages for young stellar objects

ysoisochrone handles isochrones for young stellar objects (YSOs) and uses them to derive stellar masses and ages with a Bayesian inference approach. The code estimates stellar masses, ages, and the associated uncertainties by comparing each object's effective temperature, bolometric luminosity, and their uncertainties with different stellar evolutionary models, including those specifically developed for YSOs. ysoisochrone also supports user-supplied evolutionary tracks.
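
A minimal sketch of the grid-based Bayesian comparison described above (not ysoisochrone's actual interface): a Gaussian likelihood over a toy grid of evolutionary-track points in (Teff, Lbol), reduced to posterior estimates of mass and age:

```python
# Illustrative only; the grid values are placeholders, not a real track set.
import numpy as np

# Placeholder evolutionary-track grid: (mass [Msun], age [Myr], Teff [K], Lbol [Lsun])
grid = np.array([
    [0.5, 1.0, 3800.0, 0.9],
    [0.5, 3.0, 3850.0, 0.5],
    [1.0, 1.0, 4300.0, 2.5],
    [1.0, 3.0, 4400.0, 1.4],
])

teff_obs, teff_err = 4350.0, 100.0
lbol_obs, lbol_err = 1.6, 0.3

chi2 = ((grid[:, 2] - teff_obs) / teff_err) ** 2 + ((grid[:, 3] - lbol_obs) / lbol_err) ** 2
post = np.exp(-0.5 * chi2)
post /= post.sum()

mass = np.sum(post * grid[:, 0])           # posterior-mean mass
age = np.sum(post * grid[:, 1])            # posterior-mean age
print(f"mass ~ {mass:.2f} Msun, age ~ {age:.1f} Myr")
```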

[ascl:2507.017] spectool: Spectral data processing and analysis toolkit

Spectool processes astronomical spectral data, offering a collection of common spectral analysis algorithms. The toolkit includes functions for spectral resampling, spectral flattening, radial velocity measurement, and spectral convolution broadening, among others. Each function in the package is implemented independently, allowing users to select and use only the features they need. Spectool's functions have simple and intuitive interfaces, ensuring ease of use across various data sets and analysis tasks.

[ascl:2507.016] spinifex: Ionospheric corrections

Spinifex is a pure Python toolkit for ionospheric corrections in radio astronomy, e.g., computing total electron content and rotation measures. The code is in part a rewrite of RMextract (ascl:1806.024). All existing features of RMextract have been re-implemented, but spinifex is not directly backwards compatible with RMextract.

[ascl:2507.015] nGIST: The new Galaxy Integral-field Spectroscopy Tool

nGIST (new Galaxy Integral-field Spectroscopy Tool) analyzes modern galaxy integral field spectroscopic (IFS) data. Borne out of the need for a robust but flexible analysis pipeline for the influx of MUSE and other galaxy IFS data, the code is the continuation of the archived GIST pipeline (ascl:1907.025). It improves memory and parallelization management, produces smaller and more convenient output files, and deals better with longer optical wavelength ranges and the sky residuals that are particularly problematic at redder wavelengths (>7000 Angstrom). nGIST can create continuum-only cubes, offers better handling of cube variance and better bias estimation for stellar kinematics, and includes a pPXF-based emission line fitter and an updated version of Mapviewer as a quick-look interface for viewing results.

[ascl:2507.014] SAUSERO: Software to AUtomatize in a Simple Environment the Reduction of Osiris+

SAUSERO (Software to AUtomatize in a Simple Environment the Reduction of Osiris+) processes raw OSIRIS+ (Gran Telescopio Canarias) science frames to address noise, cosmetic defects, and pixel heterogeneity, preparing them for photometric analysis. Correcting these artifacts is a critical prerequisite for reliable scientific analysis. The software applies observation-specific reduction steps, ensuring optimized treatment for different data types. Developed with a focus on simplicity and efficiency, SAUSERO streamlines the reduction pipeline, enabling researchers to obtain calibrated data ready for photometric studies.

[ascl:2507.013] Sapphire++: Interaction of charged particles with a background plasma simulator

Sapphire++ (Simulating astrophysical plasmas and particles with highly relativistic energies in C++) numerically solves the Vlasov–Fokker–Planck equation for astrophysical applications. It employs a numerical algorithm based on a spherical harmonic expansion of the distribution function, expressing the Vlasov–Fokker–Planck equation as a system of partial differential equations governing the evolution of the expansion coefficients. The code uses the discontinuous Galerkin method in conjunction with implicit and explicit time stepping methods to compute these coefficients, providing significant flexibility in its choice of spatial and temporal accuracy.

[ascl:2507.012] arctic_weather: High Arctic meteorological conditions analyzer

arctic_weather analyzes meteorological data recorded by High Arctic weather stations (called Inuksuit) deployed on coastal mountains north of 80 degrees latitude on Ellesmere Island, Canada, from 2006 through 2009, along with clear-sky fractions from horizon-viewing sky-monitoring cameras. The code calculates solar and lunar elevations, allowing the polar night to be correlated with the development of prevailing wintertime thermal inversion conditions and enabling statistical comparison with other optical/infrared observatory sites.
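
The solar and lunar elevation step can be reproduced with astropy; the sketch below is only an illustration of that calculation (arctic_weather's own implementation may differ), and the station coordinates are placeholders for a High Arctic site:

```python
import numpy as np
from astropy.coordinates import EarthLocation, AltAz, get_sun, get_body
from astropy.time import Time
import astropy.units as u

# Placeholder coordinates for a High Arctic station on Ellesmere Island
site = EarthLocation(lat=80.0 * u.deg, lon=-86.0 * u.deg, height=600 * u.m)
times = Time("2008-01-15 00:00:00") + np.arange(0, 24, 1) * u.hour
frame = AltAz(obstime=times, location=site)

sun_alt = get_sun(times).transform_to(frame).alt
moon_alt = get_body("moon", times, location=site).transform_to(frame).alt
print(sun_alt.max(), moon_alt.max())   # peak elevations over the day
```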

[ascl:2507.011] COBRA: Optimal Factorization of Cosmological Observables

COBRA (Cosmology with Optimally factorized Bases for Rapid Approximation) rapidly computes large-scale structure observables, separating scale dependence from cosmological parameters in the linear matter power spectrum while also minimizing the number of necessary basis terms. This enables direct and efficient computation of derived and nonlinear observables. Moreover, the dependence on cosmological parameters is efficiently approximated using radial basis function interpolation. COBRA opens a new window for efficient computation of higher-loop and higher-order correlators involving multiple powers of the linear matter power spectrum. The resulting factorization can also be utilized in clustering, weak lensing, and CMB analyses.
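
A toy numpy sketch of the factorization idea, not COBRA's implementation: the linear power spectrum on a parameter grid is split into a small set of k-dependent basis functions (here from an SVD) and parameter-dependent weights, which are then interpolated with radial basis functions:

```python
# P(k, theta) ~ sum_i w_i(theta) * b_i(k); the training spectra are placeholders.
import numpy as np
from scipy.interpolate import RBFInterpolator

k = np.logspace(-3, 0, 200)
thetas = np.linspace(0.25, 0.35, 20)[:, None]           # one "cosmological" parameter

# Placeholder training spectra: amplitude and slope vary smoothly with theta
train = np.array([(t * 1e4) * k ** (-1.5 - t) * np.exp(-k) for t in thetas[:, 0]])

U, S, Vt = np.linalg.svd(train, full_matrices=False)
n_basis = 4
basis = Vt[:n_basis]                                    # b_i(k)
weights = train @ basis.T                               # w_i(theta) on the grid

interp = RBFInterpolator(thetas, weights)               # w_i at new parameter values
theta_new = np.array([[0.3125]])
p_approx = interp(theta_new) @ basis                    # reconstructed P(k, theta_new)
print(p_approx.shape)                                   # (1, 200)
```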

[ascl:2507.010] show_cube: Show reduced spectra for Gemini NIFS

show_cube displays the results of reducing, aligning, and combining near-infrared integral field spectroscopy taken with the Gemini Observatory NIFS (Near-infrared Integral Field Spectrometer) instrument. Image slices are extracted from the raw data frames to make the input datacube. The code site also provides a tarfile containing all the raw NIFS FITS-format files for the observations of the high-redshift radio galaxies 3C230, 3C294, and 4C+41.17, the last of which is reported together with line strengths computed using the MAPPINGS III (ascl:1306.008) shock models.

[ascl:2507.009] Coniferest: Python package for active anomaly detection

Coniferest implements anomaly detection algorithms and interactive active learning tools. The centerpiece of the package is an Isolation Forest algorithm, which operates by constructing random decision trees. Coniferest also offers two modified versions for active learning: AAD Forest and Pineforest. The AAD Forest modifies the Isolation Forest by reweighting its leaves based on responses from human experts, providing a faster alternative to the ad_examples package. Pineforest employs a filtering algorithm that builds and dismantles trees with each new human-machine iteration step. The Coniferest package provides a user-friendly interface for conducting interactive human-machine sessions; the code has been used for anomaly detection with a particular focus on light-curve data from large time-domain surveys.
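
The active-anomaly-detection loop can be illustrated with scikit-learn's IsolationForest; the sketch below does not reproduce Coniferest's classes or its AAD/Pineforest reweighting, and the expert response is simulated:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
features = rng.normal(size=(1000, 5))               # e.g. light-curve features
features[:5] += 6.0                                 # a handful of true anomalies

forest = IsolationForest(random_state=0).fit(features)
scores = forest.score_samples(features)             # lower = more anomalous

labelled = {}                                       # index -> expert label
for _ in range(3):                                  # a few human-machine iterations
    order = np.argsort(scores)                      # most anomalous first
    top = next(i for i in order if i not in labelled)
    labelled[top] = top < 5                         # stand-in for the expert's answer
    print(f"object {top}: expert says {'anomaly' if labelled[top] else 'bogus'}")
    # Coniferest's active learners would reweight or rebuild trees here using `labelled`
```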

[ascl:2507.008] tayph: Cross-correlation analysis of high resolution spectroscopy

Tayph analyzes high-resolution spectroscopic time-series observations of close-in exoplanets using a cross-correlation technique. The tool can be applied to transit observations of hot Jupiters made with echelle spectrographs at optical wavelengths. In particular, it can be applied to pipeline-reduced observations from HARPS, HARPS-N, ESPRESSO, CARMENES, and to a certain extent UVES, with minimal interaction required. Tayph also works on observations made with other instruments, provided the data are supplied in a specific format, and can be used in conjunction with Molecfit (ascl:1501.013).
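
The underlying cross-correlation technique amounts to Doppler-shifting a template (or line mask) over a grid of velocities and correlating it with each spectrum. The sketch below is an illustration of that idea, not tayph's interface; the line mask and injected velocity are synthetic:

```python
import numpy as np

c_kms = 299792.458

def ccf(wave, flux, mask_wave, mask_depth, velocities):
    """Weighted-mask cross-correlation over a grid of velocity shifts [km/s]."""
    out = np.zeros(len(velocities))
    for i, v in enumerate(velocities):
        shifted = mask_wave * (1.0 + v / c_kms)          # Doppler-shift the mask
        w = np.interp(wave, shifted, mask_depth)         # onto the data wavelength grid
        out[i] = np.sum(flux * w) / np.sum(w)
    return out

wave = np.linspace(500.0, 510.0, 4000)                   # nm
mask_wave = np.linspace(499.0, 511.0, 6000)
mask_depth = 0.5 * np.exp(-0.5 * ((mask_wave - 505.0) / 0.02) ** 2)   # one toy line
flux = 1.0 - np.interp(wave, mask_wave * (1.0 + 12.0 / c_kms), mask_depth)  # injected at +12 km/s
velocities = np.arange(-50.0, 50.0, 1.0)
cc = ccf(wave, flux, mask_wave, mask_depth, velocities)
print(velocities[np.argmin(cc)])                         # deepest CCF near +12 km/s
```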

[ascl:2507.007] Nii-C: Automatic parallel tempering Markov Chain Monte Carlo framework

Nii-C implements a framework for automatic parallel tempering Markov Chain Monte Carlo. Parameters that ensure an efficient parallel tempering process are set by a control system during the initial stages of a sampling run. The autotuned parameters consist of two parts: the temperature ladder of the parallel tempering Markov chains, and the proposal distributions for all model parameters across all chains. Written in C, Nii-C supersedes the Python code Nii (ascl:2111.010). Nii-C is parallelized using the message-passing interface (MPI) protocol to optimize the efficiency of parallel sampling, which facilitates rapid convergence when sampling high-dimensional and multimodal distributions as well as fast code execution. The code can be used to trace complex distributions owing to its high sampling efficiency and quick execution speed.
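
Parallel tempering itself is simple to sketch; the Python illustration below is not Nii-C (which is written in C, MPI-parallel, and autotunes its temperature ladder and proposals), but it shows the within-chain updates and the adjacent-chain swap move:

```python
import numpy as np

rng = np.random.default_rng(0)

def log_post(x):                       # toy bimodal target
    return np.logaddexp(-0.5 * np.sum((x - 3.0) ** 2), -0.5 * np.sum((x + 3.0) ** 2))

betas = 1.0 / np.array([1.0, 2.0, 4.0, 8.0])       # fixed temperature ladder (1/T)
states = [rng.normal(size=2) for _ in betas]
logps = [log_post(s) for s in states]

for step in range(5000):
    # Within-chain Metropolis updates at each temperature
    for j, beta in enumerate(betas):
        prop = states[j] + 0.5 * rng.normal(size=2)
        lp = log_post(prop)
        if np.log(rng.uniform()) < beta * (lp - logps[j]):
            states[j], logps[j] = prop, lp
    # Propose a swap between a random adjacent pair of chains
    j = rng.integers(len(betas) - 1)
    if np.log(rng.uniform()) < (betas[j] - betas[j + 1]) * (logps[j + 1] - logps[j]):
        states[j], states[j + 1] = states[j + 1], states[j]
        logps[j], logps[j + 1] = logps[j + 1], logps[j]

print(states[0])   # a sample from the cold (beta = 1) chain
```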

[ascl:2507.006] OW: Opacity Wizard

The Opacity Wizard performs easy and fast visualization of opacity and abundance data for exoplanet and brown dwarf atmospheres. It was designed for observers studying these substellar objects as a way to explore which molecules are most important for a given planet and to predict where the absorption features of those molecules will be. Opacity Wizard provides an iPython notebook with widgets for choosing a pressure, temperature, metallicity, and molecule list, and then creates plots of the mixing ratios and opacities.

[ascl:2507.005] SysSimPyMMEN: Infer the minimum-mass extrasolar nebula

SysSimPyMMEN infers the minimum-mass extrasolar nebula (MMEN), a power-law profile for the minimum mass in disk solids required to form the existing exoplanets if they formed in their present locations. Designed to work with the SysSim clustered planetary system models (ascl:2507.003) that characterize the underlying occurrence and intra-system correlations of multi-planet systems, SysSimPyMMEN can also be applied to any other planetary system.

[ascl:2507.004] SysSimPyPlots: Functions for plotting galleries of systems

SysSimPyPlots loads, plots, and visualizes the simulated catalogs generated by ExoplanetsSysSim (ascl:2507.001), a comprehensive forward modeling framework for studying planetary systems based on the Kepler mission. In particular, it is designed to work with the SysSim clustered planetary system models (ascl:2507.003) that characterize the underlying occurrence and intra-system correlations of multi-planet systems. Unlike the SysSim codebase, which is written in Julia, SysSimPyPlots is written almost entirely in Python 3.

[ascl:2507.003] SysSimExClusters: Clustered planetary system model for SysSim

SysSimExClusters provides a comprehensive forward modelling framework for studying planetary systems in conjunction with ExoplanetsSysSim (ascl:2507.001). It includes several statistical models for describing the intrinsic planetary systems, their architectures, and the correlations within multi-planet systems using the Kepler population of exoplanet candidates.

[ascl:2507.002] LSCS: High-contrast space telescopes simulator

LSCS (Lightweight Space Coronagraph Simulator) simulates realistic high-contrast space imaging instruments in their linear regime of small wavefront perturbations about the nominal dark hole. The code can be used for testing high-order wavefront sensing and control as well as post-processing algorithms. It models broadband images with sensor noise, wavefront drift, actuators drift, and residual effects from low-order wavefront sensing, and supports a model of the Roman Space Telescope Hybrid Lyot Coronagraph based on its FALCO (ascl:2304.004, ascl:2304.005) model. The LSCS package provides an example of dark hole maintenance using an Extended Kalman Filter and Electric Field Conjugation.

[ascl:2507.001] ExoplanetsSysSim: Exoplanet System Simulation

The ExoplanetsSysSim.jl package generates populations of planetary systems, simulates observations of those systems with a transit survey, and facilitates comparisons of simulated and observed catalogs of planetary systems. Critically, ExoplanetsSysSim accounts for intrinsic correlations in the sizes and orbital periods of planets within a planetary system.

[submitted] spherimatch: A Python package for cross-matching and self-matching in spherical coordinates

spherimatch is a Python package for efficient cross-matching and self-matching of astronomical catalogs in spherical coordinates. Designed for use in astrophysics, where data are naturally distributed on the celestial sphere, the package enables fast matching with an algorithmic complexity of O(N log N). It supports Friends-of-Friends (FoF) group identification and duplicate removal in spherical coordinates, and integrates easily with common data processing tools such as pandas.
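
One common way to achieve O(N log N) spherical matching is to place unit vectors in a k-d tree and convert the angular tolerance to a chord length; the sketch below illustrates that idea with scipy and is not spherimatch's internals or API:

```python
import numpy as np
from scipy.spatial import cKDTree

def radec_to_xyz(ra_deg, dec_deg):
    ra, dec = np.radians(ra_deg), np.radians(dec_deg)
    return np.column_stack([np.cos(dec) * np.cos(ra), np.cos(dec) * np.sin(ra), np.sin(dec)])

def crossmatch(ra1, dec1, ra2, dec2, tol_arcsec=1.0):
    """Return, for each source in catalog 1, the index of its match in catalog 2 (or -1)."""
    xyz1, xyz2 = radec_to_xyz(ra1, dec1), radec_to_xyz(ra2, dec2)
    chord = 2.0 * np.sin(np.radians(tol_arcsec / 3600.0) / 2.0)   # angle -> chord length
    dist, idx = cKDTree(xyz2).query(xyz1, distance_upper_bound=chord)
    return np.where(np.isfinite(dist), idx, -1)

ra2 = np.array([10.0, 20.0, 30.0])
dec2 = np.array([-5.0, 0.0, 5.0])
ra1 = ra2 + np.array([0.0001, 0.5, 0.0]) / 3600.0                 # second source offset by 0.5"
dec1 = dec2.copy()
print(crossmatch(ra1, dec1, ra2, dec2, tol_arcsec=0.2))           # -> [0 -1 2]
```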

[submitted] Modified Teukolsky Framework for Environmentally-Coupled Black Hole Ringdown: A Physics-Informed Neural Network Approach for Improved Gravitational Wave Analysis

We develop a Physics-Informed Neural Network (PINN) code to solve the modified Teukolsky equation under realistic astrophysical conditions. The code embeds domain-specific physics—spin-weighted curvature perturbations, quasi-normal mode (QNM) boundary conditions, and attenuation dynamics—directly into the training loss function. Applied to data from the GW190521 event, the model accurately infers complex QNM frequencies (ω = 0.2917 − 0.0389i) and learns an attenuation coefficient α = 0.04096, corresponding to a 14.4% decay rate. The code demonstrates strong predictive performance, reducing mean squared error by 50.3% (MSE = 0.2537 vs. 0.5310) compared to Bayesian baselines, and achieving a positive R² score. It further reveals non-trivial r–t coupling and gravitational memory effects, which standard exponential decay models fail to capture. This PINN-based implementation establishes a computationally efficient and accurate tool for environmental modeling in gravitational wave astrophysics and offers a path forward for black hole spectroscopy beyond vacuum assumptions.

[ascl:2506.025] Procoli: 1D profile likelihood extractor

Procoli extracts profile likelihoods in cosmology. It wraps MontePython (ascl:1805.027), the fast sampler written specifically for CLASS (ascl:1106.020); all likelihoods available for use with MontePython are hence immediately available. Procoli uses a simulated-annealing optimizer to find the global maximum-likelihood value as well as the maximum-likelihood points along the profile of any user-specified parameter.
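
The profile-likelihood construction itself is straightforward to sketch: fix the parameter of interest on a grid and minimize chi-square over the remaining parameters. The toy example below uses scipy's simulated-annealing-style global optimizer and is not Procoli's MontePython interface:

```python
import numpy as np
from scipy.optimize import dual_annealing

def chi2(params):                               # toy two-parameter likelihood
    a, b = params
    return (a - 1.0) ** 2 / 0.04 + (b - 2.0) ** 2 / 0.09 + 2.0 * (a - 1.0) * (b - 2.0)

profile = []
for a_fixed in np.linspace(0.5, 1.5, 11):
    # Minimize over the nuisance parameter b with the parameter of interest held fixed
    res = dual_annealing(lambda b: chi2((a_fixed, b[0])), bounds=[(-5.0, 5.0)])
    profile.append((a_fixed, res.fun))

for a_fixed, chi2_min in profile:
    print(f"a = {a_fixed:.2f}  chi2_profile = {chi2_min:.3f}")
```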

[ascl:2506.024] CAMEL: Cosmological parameters estimator

CAMEL (Cosmological Analysis with Minuit Exploration of the Likelihood) performs cosmological parameter estimation using best fits, Monte-Carlo Markov Chains, and profile likelihoods. Widely used in Planck satellite data analysis, by default it employs CLASS (ascl:1106.020) to compute all relevant cosmological quantities, but any other Boltzmann solver can easily be plugged in.

[ascl:2506.023] pinc: Compute profile likelihoods in cosmology

pinc ("profiles in cosmology") computes profile likelihoods in cosmology; it can also determine the (boundary-corrected) confidence intervals with the graphical construction. The code uses a simulated annealing scheme and interfaces with MontePython (ascl:1805.027). pinc consists of three short scripts; these automatically set the relevant parameters in MontePython, submit the minimization chains, and analyze the results.

[submitted] OK Binaries Interactive Catalog

OK Binaries is a tool for identifying suitable calibration binaries from the Washington Double Star (WDS) Sixth Orbit Catalog. It calculates orbital positions at any epoch, propagates uncertainties using Monte Carlo sampling, and generates orbit plots. The web app includes automated daily updates of binary positions and a searchable interface with filters for position, magnitude, separation, and other orbital parameters. OK Binaries can be used online, as a standalone offline browser app, or via the command line.

[ascl:2506.022] CLUES: Clustering tool for analyzing spectral data

CLUES (CLustering UnsupErvised with Sequencer) analyzes spectral and IFU data. This fully interpretable clustering tool uses machine learning to classify and reduce the effective dimensionality of data sets. It combines multiple unsupervised clustering methods with multiscale distance measures using Sequencer (ascl:2105.006) to find representative end-member spectra that can be analyzed with detailed mineralogical modeling and follow-up observations. CLUES has been used on Spitzer IRS data and for debris disk science, and can be applied to other high-dimensional spectral data sets, including mineral spectroscopy, in astrophysics and remote sensing more broadly.

[ascl:2506.021] Bjet_MCMC: Model multiwavelength spectral energy distributions of blazars

Bjet_MCMC automatically models multiwavelength spectral energy distributions of blazars, considering a one-zone synchrotron-self-Compton (SSC) model with or without the addition of an external inverse-Compton component from the thermal emission of the nucleus. The code also contains manual fitting functionality for multi-zone SSC modeling. Bjet_MCMC is built as an MCMC Python wrapper around the C++ code Bjet.

[ascl:2506.020] pynchrotron: Synchrotron emission from cooling electrons

pynchrotron implements synchrotron emission from cooling electrons. It removes the dependence on GSL, which was originally relied on for quick computation of the synchrotron kernel; the kernel has been ported from GSL, written directly in Python, and accelerated with numba. pynchrotron also includes an astromodels (ascl:2506.019) function for direct use in 3ML (ascl:2506.018).
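
The kernel in question is the standard first synchrotron function, F(x) = x ∫_x^∞ K_{5/3}(t) dt. A direct (slow) scipy evaluation is shown below for reference; pynchrotron instead ships a fast numba-compiled implementation ported from GSL:

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import kv

def synchrotron_F(x):
    """F(x) = x * integral_x^inf K_{5/3}(t) dt, evaluated by brute force."""
    integral, _ = quad(lambda t: kv(5.0 / 3.0, t), x, np.inf)
    return x * integral

for x in (0.1, 1.0, 3.0, 10.0):
    print(x, synchrotron_F(x))
```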

[ascl:2506.019] astromodels: Spatial and spectral models for astrophysics

Astromodels defines models for likelihood or Bayesian analysis of astrophysical data. Though designed for analysis in the spectral domain, it can also be used as a toolbox containing functions of any variable. Astromodels is not a modeling package; it provides the tools to build a model as complex as one needs. A separate package such as 3ML (ascl:2506.018) is needed to fit the model to the data.

[ascl:2506.018] 3ML: Framework for multi-wavelength/multi-messenger analysis

The Multi-Mission Maximum Likelihood framework (3ML) provides a common high-level interface and model definition for coherent and intuitive modeling of sources using all the available data, no matter their origin. Astrophysical sources are observed by different instruments at different wavelengths with unprecedented quality, and each instrument and data type has its own ad-hoc software and handling procedures. 3ML's architecture is based on plug-ins; the package uses the official software of each instrument under the hood, thus guaranteeing that 3ML always uses the best available methodology for each instrument's data. Though Maximum Likelihood is in the name for historical reasons, 3ML also interfaces to several Bayesian inference algorithms, such as MCMC and nested sampling, as well as likelihood optimization algorithms.

[ascl:2506.017] hydromass: Hydrostatic mass profile reconstruction

Hydromass analyzes galaxy cluster mass profiles from X-ray and/or Sunyaev-Zel’dovich observations. It provides a global Bayesian framework for deprojection and mass profile reconstruction, including mass model fitting, forward fitting with parametric and polytropic models, and non-parametric log-normal mixture reconstruction. Hydromass easily loads public X-COP data products and applies reconstruction tools directly within a Jupyter notebook.
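
The hydrostatic-equilibrium relation that underlies this kind of mass reconstruction can be evaluated directly from density and temperature profiles; the numpy sketch below uses toy profiles and is not hydromass's API:

```python
# M(<r) = -(k_B T r / (G mu m_p)) * (dln n_e/dln r + dln T/dln r)
import numpy as np

G = 6.674e-11          # m^3 kg^-1 s^-2
k_B = 1.381e-23        # J/K
m_p = 1.673e-27        # kg
mu = 0.6               # mean molecular weight
Msun = 1.989e30
Mpc = 3.086e22         # m

r = np.logspace(-2, 0.3, 100) * Mpc                    # 10 kpc .. 2 Mpc
n_e = 1e3 * (1.0 + (r / (0.2 * Mpc)) ** 2) ** -1.5     # toy beta-model density [m^-3]
T = 5e7 * (r / Mpc) ** -0.1                            # toy temperature profile [K]

dlnn = np.gradient(np.log(n_e), np.log(r))
dlnT = np.gradient(np.log(T), np.log(r))
M_hse = -k_B * T * r / (G * mu * m_p) * (dlnn + dlnT)
print(f"M(<1 Mpc) ~ {np.interp(Mpc, r, M_hse) / Msun:.2e} Msun")
```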

[ascl:2506.016] SBI++: Simulation-based (likelihood-free) inference for astronomical applications

SBI++ is a complete methodology based on simulation-based (likelihood-free) inference that is customized for astronomical applications. Specifically, the code retains a fast inference speed of ~1 sec per object within the noise and data distribution of the observational training set, and additionally permits parameter inference outside of that distribution at ~1 min per object. The package includes scripts for training and implementing SBI++ and depends on sbi (ascl:2306.002).

[ascl:2506.015] Octofitter: Bayesian inference against exoplanet and binary star data

Octofitter performs Bayesian inference against a wide variety of exoplanet and binary star data. It is highly modular and allows users to easily adjust priors, change parameterizations, and specify arbitrary functional relations between the parameters of one or more planets. Octofitter further supplies tools for examining model outputs, including prior and posterior predictive checks and simulation-based calibration.

[ascl:2506.014] M_-M_K-: Estimate masses and uncertainties from M_Ks (2MASS Ks + distance)

M_-M_K- converts an absolute 2MASS Ks-band magnitude (or a distance and an apparent Ks-band magnitude) into an estimate of the stellar mass using an empirical relation derived from the resolved photometry and orbits of astrometric binaries. The code requires scalar values for Ks, distance, and the corresponding uncertainties. M_-M_K- outputs errors based on the relation's scatter and the errors in the provided distance and Ks magnitude.
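
The two steps described above, forming an absolute magnitude from the distance modulus and propagating errors through an empirical relation by Monte Carlo, can be sketched as follows; the polynomial coefficients and scatter below are placeholders, not the published relation the code uses:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100000

ks, ks_err = 8.50, 0.02          # apparent 2MASS Ks magnitude
dist, dist_err = 25.0, 0.5       # distance [pc]

ks_samp = rng.normal(ks, ks_err, n)
d_samp = rng.normal(dist, dist_err, n)
M_Ks = ks_samp - 5.0 * np.log10(d_samp / 10.0)         # distance modulus

# Placeholder relation: log10(M/Msun) as a polynomial in (M_Ks - 7.5)
coeffs = [-0.64, -0.10, 0.0, 0.0]                      # hypothetical coefficients
x = M_Ks - 7.5
log_mass = sum(c * x ** i for i, c in enumerate(coeffs))
log_mass += rng.normal(0.0, 0.02, n)                   # placeholder intrinsic scatter

mass = 10.0 ** log_mass
print(f"M = {mass.mean():.3f} +/- {mass.std():.3f} Msun")
```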

[ascl:2506.013] OCSVM-Transit-Detection: One-Class SVM model for exoplanet transit detection

This One-Class Support Vector Machine (SVM) model detects exoplanet transit events. One-class SVMs fit data and make predictions faster than simple CNNs, and do not require specialized equipment such as Graphics Processing Units (GPUs). The code uses a Gaussian kernel to compute a nonlinear decision boundary. After training, OCSVM-Transit-Detection requires that light curves classified as containing a transit have features very similar to the light curves in the training dataset, thus limiting misclassifications.
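
Setting up a one-class SVM with a Gaussian (RBF) kernel takes only a few lines in scikit-learn; the sketch below uses toy stand-ins for light-curve features rather than the code's actual training data:

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
# Toy feature vectors (e.g. transit depth, duration) for confirmed-transit light curves
transit_like = rng.normal(loc=[0.01, 2.0], scale=[0.002, 0.3], size=(200, 2))

model = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05).fit(transit_like)

new = np.array([[0.011, 2.1],      # resembles the training transits
                [0.30, 12.0]])     # very unlike the training set
print(model.predict(new))          # +1 = consistent with transits, -1 = outlier
```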

[ascl:2506.012] pyTPCI: Python version of The Pluto-Cloudy Interface

The Python wrapper pyTPCI couples newer versions of the hydrodynamics code PLUTO (ascl:1010.045) and the gas microphysics code CLOUDY (ascl:9910.001) to self-consistently simulate escaping atmospheres in 1D. Following TPCI (ascl:2506.011), on which pyTPCI is based, CLOUDY is modified to read in depth-dependent wind velocities and to output useful physical quantities (including mass density, number density, and mean molecular weight as a function of depth).

[ascl:2506.011] TPCI: The PLUTO CLOUDY interface

The PLUTO CLOUDY Interface (TPCI) combines the PLUTO (ascl:1010.045) and CLOUDY (ascl:9910.001) simulation codes to simulate hydrodynamic evolution under irradiation from a source. The code solves the photoionization and chemical network of the 30 lightest elements. By combining an equilibrium photoionization solver with a general MHD code, TPCI provides an advanced simulation tool applicable to a variety of astrophysical problems.

[ascl:2506.010] easyCHEM: Chemical abundances in exoplanet atmospheres calculator

easyCHEM calculates chemical equilibrium abundances (including condensation) and adiabatic gradients by minimization of the Gibbs free energy. Ancillary outputs include the atmospheric adiabatic temperature gradient and the mean molar mass. Because easyCHEM incorporates the dgesv routine from LAPACK (ascl:2104.020) for fast matrix inversion, external math libraries are not required.

[ascl:2506.009] GRIP: Generic data Reduction for nulling Interferometry Package

GRIP (Generic data Reduction for nulling Interferometry Package) reduces nulling data from any nulling interferometric instrument within a single, consistent framework, using enhanced statistical self-calibration methods. The toolbox self-calibrates null depth measurements by fitting a model of the instrumental perturbations to histograms of the data. The model is generated either by a simulator of the instrument built into the package for the main operating nullers or by one provided by the user. GRIP handles baseline discrimination and spectral dispersion, features several optimization strategies, including least squares, maximum likelihood, and MCMC with emcee (ascl:1303.002), and works on GPUs using the cupy library.

[ascl:2506.008] DART-Vetter: Convolutional Neural Network to distinguish planetary transits from false positives

DART-Vetter distinguishes planetary candidates from false positives detected in any transiting survey and is tailored for photometric data collected by space-based missions. The Convolutional Neural Network is trained on Kepler and TESS Threshold Crossing Events (TCEs), and processes only light curves folded on the period of the corresponding signal. DART-Vetter has a simple and compact architecture and is lightweight enough to be executed on personal laptops.

[ascl:2506.007] excalibuhr: High-resolution spectral data reduction

excalibuhr is an end-to-end pipeline, designed for VLT/CRIRES+, that extracts high-resolution spectra. The package preprocesses raw calibration files, including darks, flats, and lamp frames, and can trace spectral orders on 2D detector images. It applies calibrations to science frames, can remove the sky background by nodding subtraction, and combines frames per nodding position. excalibuhr can also extract 1D spectra and perform wavelength and flux calibration.

[ascl:2506.006] Gen TSO: Graphical interface to simulate JWST exoplanet time-series observations

Gen TSO estimates signal-to-noise ratios for transit/eclipse depths through an interactive graphical interface, similar to the JWST Exposure Time Calculator (ETC). This interface leverages the ETC by combining its noise simulator, Pandeia, with additional exoplanet resources from the NASA Exoplanet Archive, the Gaia DR3 catalog, and the TrExoLiSTS database of JWST programs. Gen TSO calculates S/Ns for all JWST instruments for the spectroscopic time-series modes available as of the Cycle 4 GO call. It also simulates target acquisition on the science targets or, when needed, on nearby stellar targets.

[ascl:2506.005] VBMicrolensing: Microlensing computations for single, binary, and multiple lenses

VBMicrolensing performs efficient computation in gravitational microlensing events using the advanced contour integration method, supporting single, binary and multiple lenses. It calculates magnification by single, binary and multiple lenses, centroid of the images generated by single and binary lenses, and critical curves and caustics of binary and multiple lenses. It also computes complete light curves including several higher order effects, such as limb darkening of the source, binary source, parallax, xallarap, and circular and elliptic orbital motion.

VBMicrolensing is written as a C++ library and wrapped as a Python package; the code can be called from either C++ or Python. This package encompasses VBBinaryLensing (ascl:1809.004), which is at the basis of several platforms for microlensing modeling. VBBinaryLensing will still be available as legacy software, but will no longer be maintained.

[ascl:2506.004] TESS-cont: TESS contamination tool

TESS-cont quantifies the flux fraction coming from nearby stars in the TESS photometric aperture of any observed target. The package identifies the main contaminant Gaia DR2/DR3 sources, quantifies their individual and total flux contributions to the aperture, and determines whether any of these stars could be the origin of the observed transit and variability signals. Written in Python, TESS-cont is based on building the pixel response functions (PRFs) of nearby Gaia sources and computing their flux distributions across the TESS Target Pixel Files (TPFs) or Full Frame Images (FFIs).

[ascl:2506.003] SMART: Forward-modeling framework for spectroscopic data

SMART (Spectral Modeling Analysis and RV Tool) forward models spectral data. The method works best in spectral orders with both strong telluric absorption features, for accurate wavelength calibration, and sufficient structure in the stellar spectrum to distinguish it from the telluric absorption. The code uses Markov Chain Monte Carlo (MCMC) methods to determine stellar parameters such as effective temperature, surface gravity, and rotational velocity, as well as calibration factors, including continuum and wavelength corrections, the instrumental line-spread function (LSF), and the strength of telluric absorption. SMART has been used with Keck/NIRSPEC, SDSS/APOGEE, and Gemini/IGRINS high-resolution near-infrared spectrometers, among others, and with medium-resolution spectrometers, including Keck/OSIRIS and Keck/NIRES.

[ascl:2506.002] MAGIC: Automatic analysis of realistic microlensing light curves

The MAGIC (Microlensing Analysis Guided by Intelligent Computation) PyTorch framework efficiently and accurately infers the microlensing parameters of binary events with realistic data quality. The code divides binary microlensing parameters into two groups, which are inferred separately with different neural networks. A neural controlled differential equation handles light curves with irregular sampling and large data gaps. MAGIC can achieve fractional uncertainties of a few percent on the binary mass ratio and separation, and can locate degenerate solutions even when large data gaps are introduced. As irregular sampling is common in astronomical surveys, this code may also be useful for other time series studies.

[ascl:2506.001] CTD: Cumulative Time Dilation

Cumulative Time Dilation (CTD) calculates and plots the total time dilation experienced by a point (Earth) located at the center of a spherical mass-energy distribution. There are both analytical and numerical solutions for two different descriptions of how gravity acts across cosmological distances. The calculations are done for universes filled with a single energy type (dark energy; matter, including dark matter; or radiation) as well as the concordance model.

[ascl:2505.020] Hibridon: Time-independent non-reactive quantum scattering calculations

Hibridon solves the close-coupled equations which occur in the quantum treatment of inelastic atomic and molecular collisions. Gas-phase scattering, photodissociation, collisions of atoms and/or molecules with flat surfaces, and bound states of weakly-bound complexes can be treated.

[ascl:2505.019] AIRI: Algorithms for computational imaging

The AIRI (AI for Regularization in radio-interferometric Imaging) algorithms are Plug-and-Play (PnP) algorithms propelled by learned regularization denoisers and endowed with robust convergence guarantees. The (unconstrained) AIRI algorithm is built on a Forward-Backward optimization backbone that enables the handling of soft data-fidelity terms. AIRI's primary application is to solve large-scale, high-resolution, high-dynamic-range inverse problems in radio-interferometric imaging, more specifically 2D planar monochromatic intensity imaging.
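
The plug-and-play forward-backward structure can be written schematically as a gradient step on the data-fidelity term followed by a denoiser in place of the proximal operator. In the sketch below the denoiser is a trivial soft-threshold standing in for AIRI's learned networks, and the measurement operator is a toy matrix rather than a radio-interferometric one:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 64, 48
Phi = rng.normal(size=(m, n)) / np.sqrt(m)      # toy measurement operator
x_true = np.zeros(n)
x_true[::8] = 1.0                               # sparse "sky"
y = Phi @ x_true + 0.01 * rng.normal(size=m)    # noisy measurements

def denoise(z, strength=0.05):
    """Placeholder for a learned denoiser: simple soft-thresholding."""
    return np.sign(z) * np.maximum(np.abs(z) - strength, 0.0)

gamma = 1.0 / np.linalg.norm(Phi, 2) ** 2       # step size below 2/L
x = np.zeros(n)
for _ in range(200):
    grad = Phi.T @ (Phi @ x - y)                # gradient of the data-fidelity term
    x = denoise(x - gamma * grad)               # forward (gradient) + backward (denoiser)

print(np.round(x[:16], 2))
```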
