ASCL.net

Astrophysics Source Code Library

Making codes discoverable since 1999

Browsing Codes

Results 3251-3500 of 3437 (3348 ASCL, 89 submitted)

[ascl:2307.027] CosmicFish: Cosmology forecasting tool

CosmicFish obtains expected bounds on cosmological parameters for a wide range of models and observables for cosmological forecasting. The package includes a Fortran library to produce Fisher matrices, a Python library that performs operations on the produced Fisher matrices, and a full set of plotting utilities. It works with many models, including CAMB (ascl:1102.026) and MGCAMB (ascl:1106.013), and can interface with any Boltzmann solver. The user can choose within a wide range of possible cosmological observables, including cosmic microwave background, weak lensing tomography, galaxy clustering, and redshift drift. CosmicFish is easy to customize; it provides a flexible package system and users can produce their own analyses and plotting pipelines following the default Python apps.

[ascl:2307.028] TidalPy: Moon and exoplanet tidal heating and dynamics estimator

TidalPy performs semi-analytic calculations of tidal dissipation and subsequent orbit-spin evolution for rocky and icy worlds. It can be used as a black box, in which an Object-Oriented Programming (OOP) scheme performs many calculations with very little user input, making it easy to get started with the package, or as a toolbox, as it contains many efficient functions for calculations relevant to tides and thermal-orbital coupling, which can be quickly imported and used in custom scripts. In general, TidalPy's toolbox (functional) scheme provides much higher performance, flexibility, and extensibility than the OOP scheme. It also makes assumptions more visible to the user. The downside is that the user may need to be more familiar with the underlying physics.

[ascl:2307.029] SIMPLE: Intensity map generator

SIMPLE (Simple Intensity Map Producer for Line Emission) generates intensity maps that include observational effects such as noise, anisotropic smoothing, sky subtraction, and masking. Written in Python, it is based on a lognormal simulation of galaxies and random assignment of luminosities to these galaxies and generates mock intensity maps that can be used to study survey systematics and calculate covariance matrices of power spectra. The code is modular, allowing its components to be used independently.

[ascl:2307.030] SAMUS: Simulator of Asteroid Malformation Under Stress

SAMUS (Simulator of Asteroid Malformation Under Stress) simulates the deformation of minor bodies, assuming that they are homogeneous incompressible fluid masses. The bodies are initialized as ellipsoids and the Navier-Stokes equations are iteratively solved to investigate the deformation of the body over time. The software is modular and allows for user-defined output functions, size, and trajectories. Structured as a single large class, SAMUS can store variables and handle arbitrary function calls, which eases debugging and investigation, especially for lengthy high-fidelity simulation runs.

[ascl:2307.031] HilalPy: Analysis tool for lunar crescent visibility criterion

HilalPy analyzes lunar crescent visibility criteria. Written in Python, the code draws on more than 8000 lunar crescent visibility records extracted from the literature and from lunar crescent observation websites, and applies descriptive statistics, contradiction rate percentages, and regression analysis to predict the visibility of a lunar crescent.

[ascl:2307.032] AmpF: Amplification factor for solar lensing

AmpF numerically calculates the amplification factor for solar lensing. The input parameters are the gravitational-wave frequency and the source angular position with respect to the solar center; the code outputs are the amplification factor and its geometrical-optics limit. AmpF accepts variables for several attributes, and the overall amplitude of the lensing potential can be changed as needed. The method has been implemented in both C and Python.

[ascl:2307.033] Imber: Doppler imaging tool for modeling stellar and substellar surfaces

Imber simulates spectroscopic and photometric observations with both a gridded numerical simulation and analytical model. Written in Python, it is specifically designed to predict Extremely Large Telescope instrument (such as ELT/METIS and TMT/MODHIS) Doppler imaging performance, and has also been applied to existing, archival observations of spectroscopy and photometry.

[ascl:2307.034] Guacho: 3D uniform mesh parallel HD/MHD code for astrophysics

Guacho is a 3D hydrodynamical/magnetohydrodynamical code suited for astrophysical fluids. The hydrodynamic equations are evolved with a number of approximate Riemann solvers. Guacho includes various modules to deal with different cooling regimes, and a radiation transfer module based on a Monte Carlo ray tracing method. The code can run sequentially or in parallel with MPI.

[ascl:2307.035] binary_c: Stellar population synthesis software framework

The binary_c software framework models the evolution of single, binary and multiple stars, including stellar evolution and nucleosynthesis. Stellar evolution includes wind mass loss, rotation, thermal pulses, magnetic braking, pre-main sequence evolution, supernovae and kicks, and neutron stars; binary-star evolution includes mass transfer, gravitational-wave losses, tides, novae, circumbinary discs, and merging stars. binary_c natively includes nucleosynthesis, and, as it is designed for stellar population calculations, it is lightweight and versatile. binary_c works in standalone, virtual and HPC environments, and its support software contains tools for development and data analysis. A version in Python, binary_c-python (ascl:2307.036), is also available.

[ascl:2307.036] binary_c-python: Stellar population synthesis tool and interface to binary_c

binary_c-python provides a manager for and interface to the binary_c framework (ascl:2307.035), and rapidly evolves individual systems and populations of stars. It provides functions such as data processing tools and initial distribution functions for stellar properties. binary_c-python also includes tools to run large grids of (binary) stellar systems on servers or distributed systems.

[ascl:2307.037] WDMWaveletTransforms: Fast forward and inverse WDM wavelet transforms

WDMWaveletTransforms implements the fast forward and inverse WDM wavelet transforms in Python from both the time and frequency domains. The frequency domain transforms are inherently faster and more accurate. The wavelet domain->frequency domain and frequency domain->wavelet domain transforms are nearly exact numerical inverses of each other for a variety of inputs tested, including Gaussian random noise. WDMWaveletTransforms has both command line and Python interfaces.

[ascl:2307.038] WarpX: Time-based electromagnetic and electrostatic Particle-In-Cell code

WarpX is an advanced electromagnetic & electrostatic Particle-In-Cell code. It supports many features including Perfectly-Matched Layers (PML), mesh refinement, and the boosted-frame technique. A highly-parallel and highly-optimized code, WarpX can run on GPUs and multi-core CPUs, includes load balancing capabilities, and scales to the largest supercomputers.

[ascl:2307.039] adiabatic-tides: Tidal stripping of dark matter (sub)haloes

adiabatic-tides evaluates the tidal stripping of dark matter (sub)haloes in the adiabatic limit. It exactly reproduces the remnant of an NFW halo that is exposed to a slowly increasing isotropic tidal field and approximately reproduces the remnant for an anisotropic tidal field. adiabatic-tides also predicts the asymptotic mass loss limit for orbiting subhaloes and differently concentrated host-haloes with and without baryonic components, and can be used to improve predictions of dark matter annihilation.

[ascl:2307.040] pycrires: Data reduction pipeline for VLT/CRIRES+

pycrires runs the CRIRES+ recipes of EsoRex. The pipeline organizes the raw data, creates SOF and configuration files, runs the calibration and science recipes, and creates plots of the images and extracted spectra. Additionally, it corrects remaining inaccuracies in the wavelength solution and the spectrum curvature. pycrires also provides dedicated routines for the extraction, calibration, and detection of spatially-resolved objects such as directly imaged planets.

[ascl:2307.041] EFTCAMB: Effective Field Theory with CAMB

EFTCAMB patches the public Einstein-Boltzmann solver CAMB (ascl:1102.026) to implement the Effective Field Theory approach to cosmic acceleration. It can be used to investigate the effect of different EFT operators on linear perturbations and to study perturbations in any specific DE/MG model that can be cast into the EFT framework. To interface EFTCAMB with cosmological data sets, it is equipped with a modified version of CosmoMC (ascl:1106.025), EFTCosmoMC, to create a bridge between the EFT parametrization of the dynamics of perturbations and observations.

[ascl:2307.042] LIMpy: Line Intensity Mapping in Python

LIMpy models and analyzes multi-line intensity maps of CII (158 µm), OIII (88 µm), and CO (1-0) to CO (13-12) transitions. It can be used as an analytic model for star formation rate, to simulate line intensity maps based on halo catalogs, and to calculate the power spectrum from simulated maps and the cross-correlated signal between two separate lines. Among other things, LIMpy can also create multi-line luminosity models and determine the multi-line intensity power spectrum.

[ascl:2307.043] EAGLES: Estimating AGes from Lithium Equivalent widthS

EAGLES (Estimating AGes from Lithium Equivalent widthS) implements an empirical model that predicts the lithium equivalent width (EW) of a star as a function of its age and effective temperature. The code computes the age probability distribution for a star with a given EW and Teff, subject to an age probability prior that may be flat in age or flat in log age. Data for more than one star can be entered; EAGLES then treats these as a cluster and determines the age probability distribution for the ensemble. The code produces estimates of the most probable age, uncertainties, and the median age, along with output files consisting of probability plots, best-fit isochrone plots, and tables of the posterior age probability distribution(s).
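The posterior computation described above can be sketched generically: a grid over age, a likelihood comparing the observed EW with a model prediction, and a prior that is flat in age or in log age. In the sketch below, `ew_model` is a hypothetical placeholder for the empirical EW(age, Teff) relation that EAGLES actually fits; only the Bayesian bookkeeping is illustrated.

```python
import numpy as np

def age_posterior(ew_obs, ew_err, teff, ew_model, log_prior=False):
    """Grid posterior P(age | EW, Teff) with a Gaussian likelihood.

    ew_model(age_myr, teff) is a placeholder for an empirical EW(Li)
    prediction; EAGLES uses its own fitted relation.
    """
    ages = np.logspace(0, 4, 2000)                     # 1 Myr -- 10 Gyr grid
    like = np.exp(-0.5 * ((ew_obs - ew_model(ages, teff)) / ew_err) ** 2)
    prior = 1.0 / ages if log_prior else np.ones_like(ages)  # flat in log(age) or in age
    post = like * prior
    post /= np.trapz(post, ages)                       # normalize over the grid
    return ages, post

# For a cluster, multiply the individual likelihoods of all members on the
# same age grid before applying the prior, as described in the abstract.
```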

[ascl:2307.044] RUBIS: Fast centrifugal deformation program for stellar and planetary models

The centrifugal deformation program RUBIS (Rotation code Using Barotropy conservation over Isopotential Surfaces) takes an input 1D model (with spherical symmetry) and returns its deformed version by applying a conservative rotation profile specified by the user. To perform the deformation, the code needs only the density as a function of radial distance from the reference model, in addition to the surface pressure to be imposed; this lightness is made possible by preserving the relation between density and pressure when going from the 1D to the 2D structure. By solving Poisson's equation in spheroidal rather than spherical coordinates whenever a discontinuity is present, RUBIS can deform both stellar and planetary models, thereby dealing with potential discontinuities in the density profile.

[ascl:2307.045] NAVanalysis: Normalized Additional Velocity analysis

NAVanalysis studies the non-baryonic, or non-Newtonian, contribution to galaxy rotation curves straight from a given data sample. Conclusions on the radial profile of a given model can be drawn without individual galaxy fits to provide an efficient sample comparison. The method can be used to eliminate model parameter regions, find the most probable parameter regions, and uncover trends not easy to find from standard fits. Further, NAVanalysis can compare different approaches and models.

[ascl:2307.046] HAYASHI: Halo-level AnalYsis of the Absorption Signal in HI

HAYASHI (Halo-level AnalYsis of the Absorption Signal in HI) computes the number of absorption features of the 21cm forest using a semianalytic formalism. It includes the enhancement of the signal due to the presence of substructures within minihalos and supports non-standard cosmologies with impact in the large scale structure, such as warm dark matter and primordial black holes. HAYASHI is written in Python3 and uses the cosmological computations package Colossus (ascl:1501.016).

[ascl:2307.047] GWDALI: Gravitational wave parameter estimation

GWDALI focuses on parameter estimation of gravitational waves generated by compact binary coalescence (CBC). The software employs both Gaussian (Fisher Matrix) and beyond-Gaussian methods to approximate the likelihood of gravitational wave events. GWDALI also addresses the challenges posed by Fisher Matrices with zero determinants. Additionally, the beyond-Gaussian approach incorporates the Derivative Approximation for Likelihoods (DALI) algorithm, enabling a more reliable estimation process.

[ascl:2307.048] NaMaster: Unified pseudo-Cl framework

NaMaster computes full-sky angular cross-power spectra of masked, spin-0 and spin-2 fields with an arbitrary number of known contaminants using a pseudo-Cl (aka MASTER) approach. The code also implements E/B-mode purification and offers both full-sky and flat-sky modes. NaMaster is available as a C library, Python module, and standalone program.
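A minimal pseudo-Cl sketch using NaMaster's Python module (pymaster) might look like the following; the maps here are random stand-ins, and the exact call signatures should be checked against the package documentation.

```python
import numpy as np
import healpy as hp
import pymaster as nmt

nside = 64
npix = hp.nside2npix(nside)
rng = np.random.default_rng(0)

m = rng.normal(size=npix)                 # stand-in spin-0 map (e.g. temperature)
mask = np.ones(npix)
mask[: npix // 4] = 0.0                   # crude mask; real analyses use apodized masks

f0 = nmt.NmtField(mask, [m])              # masked spin-0 field
b = nmt.NmtBin.from_nside_linear(nside, nlb=8)   # bandpowers of width 8

cl = nmt.compute_full_master(f0, f0, b)   # decoupled (MASTER) auto-spectrum
ells = b.get_effective_ells()
```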

[ascl:2307.049] reMASTERed: Calculate contributions to pseudo-Cl for maps with correlated masks

reMASTERed reconstructs ensemble-averaged pseudo-$C_\ell$ to effectively exact precision, with significant improvements over traditional estimators for cases where the map and mask are correlated. The code can compute the results given an arbitrary map and mask; it can also compute the results in the ensemble average for certain types of threshold masks.

[ascl:2307.050] νHawkHunter: Forecasting of PBH neutrinos

νHawkHunter explores the prospects of detecting neutrinos produced by the evaporation of primordial black holes in ground-based experiments. It makes use of neutrino fluxes from Hawking radiation computed with BlackHawk (ascl:2012.020). νHawkHunter can also be used for Diffuse Supernova Neutrino Background or similar studies by replacing the signal fluxes with the appropriate ones.

[ascl:2307.051] WeakLensingQML: Quadratic Maximum Likelihood estimator applied to Weak Lensing

WeakLensingQML implements the Quadratic Maximum Likelihood (QML) estimator, applies it to simulated cosmic shear data, and compares the results to a Pseudo-Cl implementation. The package computes and saves data files needed in later stages, such as the fiducial cosmic shear power spectrum used in the analysis, the sky mask, and an analytic version of the QML's covariance matrix. The core of the package implements a conjugate-gradient approach for the quadratic estimator and is parallelized for maximum performance. The code relies on the Eigen linear algebra package and the HealPix spherical harmonic transform library. A post-processing script analyzes the results and compares the QML's estimates with those from the Pseudo-Cl estimator; it then produces an array of plots highlighting the results.

[ascl:2307.052] EVo: Thermodynamic magma degassing model

EVo calculates the speciation and volume of a volcanic gas phase erupting in equilibrium with its parent magma. Models can be run to calculate the gas phase in equilibrium with a melt at a single pressure, or the melt can be decompressed from depth, rising to the surface as a closed-system case. Single-pressure and decompression calculations can be run for OH, COH, SOH, COHS and COHSN systems. EVo can calculate gas phase weight and volume fraction within the system, gas phase speciation as mole fraction or weight fraction across numerous compounds, and the volatile content of the melt at each pressure. It also calculates melt density, fO2 of the system, and more. EVo can be set up using either melt volatile contents or a set amount of atomic volatiles, the latter being preferable for conducting experiments over a wide range of fO2 values.

[ascl:2307.053] EVolve: Growth and evolution of volcanically-derived atmospheres

EVolve calculates the chemical composition and surface pressure of a 1D atmosphere on a rocky planet that is being produced by volcanic activity as it grows over time. Once the initial volatile content of the planet's mantle and the composition and resultant surface pressure of any pre-existing atmosphere are set, the volcanic degassing model EVo (ascl:2307.052) calculates the amount and speciation of any volcanic gases released into the atmosphere over each time step. Atmospheric processing is calculated using FastChem (ascl:1804.025); thermochemical equilibrium is assumed, so the final chemical composition of the atmosphere is calculated according to the pre-set surface temperature.

[submitted] backtrack: fit relative motion of candidate direct imaging sources with background proper motion and parallax

Directly imaged planet candidates (high contrast point sources near bright stars) are often validated, among other supporting lines of evidence, by comparing their observed motion against the projected motion of a background source due to the proper motion of the bright star and the parallax motion due to the Earth's orbit. Often, the "background track" is constructed assuming an interloping point source is at infinity and has no proper motion itself, but this assumption can fail, producing false positive results, for crowded fields or insufficient observing time-baselines (e.g. Nielsen et al. 2017). `backtrack` is a tool for constructing background proper motion and parallax tracks for validation of high contrast candidates. It can produce classical infinite distance, stationary background tracks, but was constructed in order to fit finite distance, non-stationary tracks using nested sampling (and can be used on clusters). The code sets priors on parallax based on the relations in Bailer-Jones et al. 2021 that are fit to Gaia eDR3 data, and are therefore representative of the galactic stellar density. The public example currently reproduces the results of Nielsen et al. 2017 and Wagner et al. 2022, demonstrating that the motion of HD 131399A "b" is fit by a finite distance, non-stationary background star, but the code has been tested and validated on proprietary datasets. The code is open source, available on github, and additional contributions are welcome.

[ascl:2307.054] LEFTfield: Forward modeling of cosmological density fields

LEFTfield forward models cosmological matter density fields and biased tracers of large-scale structure. The model, written in C++ code, is centered around classes encapsulating scalar, vector, and tensor grids. It includes the complete bias expansion at any order in perturbations and captures general expansion histories without relying on the EdS approximation; however, the latter is also implemented and results in substantially smaller computational demands. LEFTfield includes a subset of the nonlinear higher-derivative terms in the bias expansion of general tracers.

[ascl:2307.055] plan-net: Bayesian neural networks for exoplanetary atmospheric retrieval

plan-net uses machine learning with an ensemble of Bayesian neural networks for atmospheric retrieval; this approach yields greater accuracy and more robust uncertainties than a single model. A new loss function for BNNs learns correlations between the model outputs. Performance is improved by incorporating domain-specific knowledge into the machine learning models and provides additional insight by inferring the covariance of the retrieved atmospheric parameters.

[ascl:2307.056] HELA: Random Forest retrieval for exoplanet atmospheres

HELA performs atmospheric retrieval on exoplanet atmospheres using a Random Forest algorithm. The code has two stages: training (which includes testing), and predicting. It requires a training set that matches the format of the data to be analyzed, with the same number of points and a sample spectrum for each parameter. The number of trees used and the number of jobs are editable. The HELA package includes a training set and data as examples.

[ascl:2307.057] species: Atmospheric characterization of directly imaged exoplanets

species (spectral characterization and inference for exoplanet science) provides a coherent framework for spectral and photometric analysis of directly imaged exoplanets and brown dwarfs which builds on publicly-available data and models from various resources. species contains tools for grid and free retrievals using Bayesian inference, synthetic photometry, interpolating a variety of atmospheric and evolutionary model grids (including the possibility to add a custom grid), color-magnitude and color-color diagrams, empirical spectral analysis, spectral and photometric calibration, and analysis of emission lines.

[ascl:2307.058] APOLLO: Radiative transfer and atmosphere spectroscopic retrieval for exoplanets

APOLLO forward models the radiative transfer of light through a planetary (or brown dwarf) atmosphere; it also forward models transit and emission spectra and retrieves atmospheric properties of extrasolar planets. The code has two operational modes: one to compute a planetary spectrum given a set of parameters, and one to retrieve those parameters based on an observed spectrum. The package uses emcee (ascl:1303.002) to find the best fit to a spectrum for a given parameter set. APOLLO is modular and offers many options that may be turned on and off, including the type of observations, a flexible molecular composition, multiple cloud prescriptions, multiple temperature-pressure profile prescriptions, multiple priors, and continuum normalization.
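Since the retrieval mode is described as using emcee, the general sampling pattern looks like the sketch below; the forward model, priors, and parameter count are placeholders, not APOLLO's internals.

```python
import numpy as np
import emcee

def log_prior(theta):
    # placeholder: flat priors inside an arbitrary box
    return 0.0 if np.all(np.abs(theta) < 10) else -np.inf

def log_likelihood(theta, wave, flux, err, forward_model):
    model = forward_model(theta, wave)        # e.g. a transit or emission spectrum
    return -0.5 * np.sum(((flux - model) / err) ** 2)

def log_prob(theta, wave, flux, err, forward_model):
    lp = log_prior(theta)
    if not np.isfinite(lp):
        return -np.inf
    return lp + log_likelihood(theta, wave, flux, err, forward_model)

def run_retrieval(wave, flux, err, forward_model, ndim=5, nwalkers=32, nsteps=5000):
    p0 = 1e-3 * np.random.randn(nwalkers, ndim)
    sampler = emcee.EnsembleSampler(nwalkers, ndim, log_prob,
                                    args=(wave, flux, err, forward_model))
    sampler.run_mcmc(p0, nsteps, progress=True)
    return sampler.get_chain(discard=nsteps // 2, flat=True)
```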

[ascl:2307.059] orbitN: Symplectic integrator for near-Keplerian planetary systems

orbitN generates accurate and reproducible long-term orbital solutions for near-Keplerian planetary systems with a dominant mass M0. The code focuses on hierarchical systems without close encounters but can be extended to include additional features. Among other features, the package includes M0's quadrupole moment, a lunar contribution, and post-Newtonian corrections (1PN) due to M0 (fast symplectic implementation). To reduce numerical roundoff errors, orbitN features Kahan compensated summation.

[ascl:2307.060] MBASC: Multi-Band AGN-SFG Classifier

MBASC (Multi-Band AGN-SFG Classifier) classifies sources as Active Galactic Nuclei (AGNs) and Star Forming Galaxies (SFGs). The algorithm is based on the light gradient-boosting machine ML technique. MBASC can use a wide range of multi-wavelength data and redshifts to predict a classification for sources.

[ascl:2307.061] connect: COsmological Neural Network Emulator of CLASS using TensorFlow

connect (COsmological Neural Network Emulator of CLASS using TensorFlow) emulates cosmological parameters using neural networks. This includes both sampling of training data and training of the actual networks using the TensorFlow library. connect aids in cosmological parameter inference by immensely speeding up the process, which is achieved by substituting the cosmological Einstein-Boltzmann solver codes, needed for every evaluation of the likelihood, with a neural network with a 10² to 10³ times faster evaluation time. The code requires CLASS (ascl:1106.020) and Monte Python (ascl:1307.002) if iterative sampling is used.
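As a rough illustration of the emulation idea (not connect's own architecture or training pipeline), a small TensorFlow network mapping cosmological parameters to a spectrum-like output can be set up as follows, with randomly generated stand-in training data:

```python
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(0)
X = rng.uniform(size=(10000, 6))                       # stand-in sampled cosmological parameters
Y = np.sin(10 * X) @ rng.normal(size=(6, 500))         # stand-in for Boltzmann-solver output spectra

model = tf.keras.Sequential([
    tf.keras.layers.Dense(512, activation="relu", input_shape=(X.shape[1],)),
    tf.keras.layers.Dense(512, activation="relu"),
    tf.keras.layers.Dense(Y.shape[1]),                 # emulated spectrum
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, Y, epochs=50, batch_size=256, validation_split=0.1)

# A trained emulator replaces the Boltzmann-solver call inside the likelihood,
# turning each evaluation into a single fast forward pass.
cl_emulated = model.predict(X[:1])
```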

[ascl:2307.062] FABADA: Non-parametric noise reduction using Bayesian inference

FABADA (Fully Adaptive Bayesian Algorithm for Data Analysis) performs non-parametric noise reduction using Bayesian inference. It iteratively evaluates possible smoothed models of the data to estimate the underlying signal that is statistically compatible with the noisy measurements. Iterations stop based on the evidence E and the χ2 statistic of the last smooth model, and the expected value of the signal is computed as a weighted average of the smooth models. Though FABADA was written for astronomical data, such as spectra (1D) or images (2D), it can be used as a general noise reduction algorithm for any one- or two-dimensional data; the only requisite of the input data is an estimation of its associated variance.

[ascl:2308.001] MOOG_SCAT: Scattering version of the MOOG Line Transfer Code

MOOG_SCAT, a redevelopment of the LTE radiative transfer code MOOG (ascl:1202.009), contains modifications that allow for the treatment of isotropic, coherent scattering in stars. MOOG_SCAT employs a modified form of the source function and solves radiative transfer with a short characteristics approach and an accelerated lambda iteration scheme.

[ascl:2308.002] FLATW'RM: Finding flares in Kepler data using machine-learning tools

FLATW'RM (FLAre deTection With Ransac Method) detects stellar flares in light curves using a classical machine-learning method. The code tries to find a rotation period in the light curve and splits the data into detection windows. The light curve sections are fit with the robust fitting algorithm RANSAC (Random sample consensus); outlier points (flare candidates) above the pre-set detection level are marked for each section. For the given detection window, only those flare candidates that have at least a given number of consecutive points (three by default) are kept and marked as flares. When using FLATW'RM, the code's output should be checked to determine whether changes to the default settings are needed to account for light curve noise, data sampling frequency, and scientific needs.
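The RANSAC-plus-consecutive-points logic described above can be illustrated with a generic sketch using scikit-learn's RANSACRegressor; this is a conceptual stand-in, not FLATW'RM's implementation.

```python
import numpy as np
from sklearn.linear_model import RANSACRegressor

def flag_flare_candidates(time, flux, detection_sigma=3.0, min_consecutive=3):
    """Robustly fit one detection window, then keep positive outliers that
    form runs of at least `min_consecutive` consecutive points."""
    ransac = RANSACRegressor()
    ransac.fit(time.reshape(-1, 1), flux)
    resid = flux - ransac.predict(time.reshape(-1, 1))
    sigma = np.std(resid[ransac.inlier_mask_])
    candidate = resid > detection_sigma * sigma

    flare = np.zeros_like(candidate)
    run_start = None
    for i, c in enumerate(np.append(candidate, False)):   # sentinel flushes last run
        if c and run_start is None:
            run_start = i
        elif not c and run_start is not None:
            if i - run_start >= min_consecutive:
                flare[run_start:i] = True
            run_start = None
    return flare
```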

[ascl:2308.003] SIMBI: 3D relativistic gas dynamics code

SIMBI simulates heterogeneous relativistic gas dynamics up to 3D for special relativistic hydrodynamics and up to 2D for Newtonian hydrodynamics. It supports user-defined mesh expansion and contraction and density, momentum, and energy density terms outside of the grid; the code also supports source terms in the Euler equations and source terms at the boundaries. Boundary conditions, which include periodic, reflecting, outflow, and inflow boundaries, are given as an array of strings. If an inflow boundary condition is set but no inflow boundary source terms are given, SIMBI switches to outflow boundary conditions to prevent crashes. The code can track a single passive scalar and insert an immersed boundary, which is impermeable by default. SIMBI uses the Cython framework to blend together C++, CUDA, HIP, and Python.

[ascl:2308.004] AstroPhot: Fitting everything everywhere all at once in astronomical images

AstroPhot quickly extracts detailed information from complex astronomical data for individual images or large survey programs. It fits models for sky, stars, galaxies, PSFs, and more in a principled chi^2 forward optimization, recovering Bayesian posterior information and covariance of all parameters. The code optimizes forward models on CPU or GPU, across images that are large, multi-band, multi-epoch, rotated, dithered, and more. Models are optimized together, thus handling overlapping objects and including the covariance between parameters (including PSF and galaxy parameters). AstroPhot includes several optimization algorithms, including Levenberg-Marquardt, Gradient descent, and No-U-Turn MCMC sampling.

[ascl:2308.005] FastSpecFit: Fast spectral synthesis and emission-line fitting of DESI spectra

FastSpecFit models the observed-frame optical spectroscopy and broadband photometry of extragalactic targets using physically grounded stellar continuum and emission-line templates. The code handles data from the Dark Energy Spectroscopic Instrument (DESI) Survey, which is amassing spectrophotometry for an unprecedented 40 million extragalactic targets, although the algorithms are general enough to accommodate other upcoming, massively multiplexed spectroscopic surveys. FastSpecFit extracts nearly 800 observed- and rest-frame quantities from each target, including light-weighted ages and stellar velocity dispersions based on the underlying stellar continuum; line-widths, velocity shifts, integrated fluxes, and equivalent widths for nearly 40 rest-frame ultraviolet, optical, and near-infrared emission lines arising from both star formation and active galactic nuclear activity; and K-corrections and rest-frame absolute magnitudes and colors. Moreover, FastSpecFit is designed with speed and parallelism in mind, enabling it to deliver robust model fits to tens of millions of targets.

[ascl:2308.006] Nemo: Millimeter-wave map filtering and Sunyaev-Zel'dovich galaxy cluster and source detection

Nemo detects millimeter-wave Sunyaev-Zel'dovich galaxy clusters and compact sources. Originally developed for the Atacama Cosmology Telescope project, the code is capable of analyzing the next generation of deep, wide multifrequency millimeter-wave maps that will be produced by experiments such as the Simons Observatory. Nemo provides several modules for analyzing ACT/SO data in addition to the command-line programs provided in the package.

[ascl:2308.007] DiskMINT: Disk Model For INdividual Targets

DiskMINT (Disk Model for INdividual Targets) models individual disks and derives robust disk mass estimates. Built on RADMC-3D (ascl:1202.015) for continuum (and gas line) radiative transfer, the code includes a reduced chemical network to determine the C18O emission. DiskMINT has a Python3 module that generates a self-consistent 2D disk structure satisfying VHSE (Vertical Hydrostatic Equilibrium). It also contains Fortran code implementing the reduced chemical network with the main chemical processes necessary for C18O modeling: isotopologue-selective photodissociation, and the grain-surface chemistry in which the conversion of CO to CO2 ice is the main reaction.

[ascl:2308.008] Rapster: Rapid population synthesis for binary black hole mergers in dynamical environments

Rapster (RAPid cluSTER evolution) models binary black hole population synthesis and the evolution of star clusters based on simple, yet realistic prescriptions. The code can generate large populations of dynamically formed binary black holes. Rapster uses SEVN (ascl:2206.019) to model the initial black hole mass spectrum and PRECESSION (ascl:1611.004) to model the mass, spin, and gravitational recoil of merger remnants.

[ascl:2308.009] caput: Utilities for building radio astronomy data analysis pipelines

Caput (Cluster Astronomical Python Utilities) contains utilities for handling large datasets on computer clusters. Written with radio astronomy in mind, the package provides an infrastructure for building, managing and configuring pipelines for data processing. It includes modules for dynamically importing and utilizing mpi4py, in-memory mock-ups of h5py objects, and infrastructure for running data analysis pipelines on computer clusters. Caput features a generic container for holding self-documenting datasets in memory with straightforward syncing to h5py files, and offers specialization for holding time stream data. Caput also includes tools for MPI-parallel analysis and routines for converting between different time representations, dealing with leap seconds, and calculating celestial times.

[ascl:2308.010] BCemu: Model baryonic effects in cosmological simulations

BCemu provides emulators to model the suppression in the power spectrum due to baryonic feedback processes. These emulators are based on the baryonification model, in which gravity-only N-body simulation results are manipulated to include the impact of baryonic feedback processes. The package also offers a three-parameter baryonification model in two forms: the first assumes all three parameters are independent of redshift, while the second allows the parameters to be redshift dependent.

[ascl:2308.011] glmnet: Lasso and elastic-net regularized generalized linear models

glmnet efficiently fits the entire lasso or elastic-net regularization path for linear regression (gaussian), multi-task gaussian, logistic and multinomial regression models (grouped or not), Poisson regression and the Cox model. The algorithm uses cyclical coordinate descent in a path-wise fashion.
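For the Gaussian case, the elastic-net objective combines L1 and L2 penalties; scikit-learn's coordinate-descent implementation can serve as a quick stand-in illustration of the same idea (it is not glmnet itself, and its `alpha`/`l1_ratio` parameters correspond only roughly to glmnet's lambda and alpha).

```python
import numpy as np
from sklearn.linear_model import ElasticNet, enet_path

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))
beta_true = np.zeros(50)
beta_true[:5] = [3.0, -2.0, 1.5, 0.5, 4.0]             # sparse ground truth
y = X @ beta_true + 0.5 * rng.normal(size=200)

# Single fit at one penalty strength
fit = ElasticNet(alpha=0.1, l1_ratio=0.9).fit(X, y)
print("non-zero coefficients:", np.sum(fit.coef_ != 0))

# The full regularization path, analogous to glmnet's lambda sequence
alphas, coefs, _ = enet_path(X, y, l1_ratio=0.9, n_alphas=100)
```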

[ascl:2308.012] KeplerFit: Keplerian velocity distribution model fitter

KeplerFit fits a Keplerian velocity distribution model to position-velocity (PV) data to obtain an estimate of the enclosed mass. The code extracts the scales of the pixels in both directions, spatial and spectral, then extracts the most extreme velocity at each position; this returns two arrays of positions and velocities. KeplerFit then models the extracted PV data and returns a set of the best-fit parameters, the standard deviations in each of the parameters, and the total residual of the fit.
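A generic version of the final fitting step, fitting v(r) = sqrt(GM/r) plus a systemic velocity to extreme-velocity points from a PV diagram, might look like the sketch below (synthetic data and a simplified model, not KeplerFit's code).

```python
import numpy as np
from scipy.optimize import curve_fit

GM_SUN_AU_KMS2 = 887.2   # G * M_sun / (1 au), in km^2 s^-2

def kepler_velocity(offset_au, mass_msun, v_sys_kms):
    """v(r) = sqrt(G M / r), signed by which side of the source the offset is on."""
    v = np.sqrt(GM_SUN_AU_KMS2 * mass_msun / np.abs(offset_au))
    return np.sign(offset_au) * v + v_sys_kms

# hypothetical extreme-velocity points extracted from a PV diagram
offsets_au = np.concatenate([np.linspace(-300, -20, 15), np.linspace(20, 300, 15)])
velocities = kepler_velocity(offsets_au, 1.5, 4.0) \
    + 0.3 * np.random.default_rng(1).normal(size=offsets_au.size)

popt, pcov = curve_fit(kepler_velocity, offsets_au, velocities,
                       p0=[1.0, 0.0], bounds=([0, -50], [100, 50]))
mass_msun, v_sys = popt
mass_err, v_sys_err = np.sqrt(np.diag(pcov))   # enclosed mass and its uncertainty
```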

[ascl:2308.013] Driftscan: Drift scan telescope analysis

Driftscan simulates and analyzes transit radio interferometers, with a particular focus on 21cm cosmology. Given a design of a telescope, it generates a set of products used to analyze data from it and simulate timestreams. Driftscan also constructs a filter to extract cosmological 21 cm emission from astrophysical foregrounds, such as our galaxy and radio point sources, and estimates the 21cm power spectrum using an optimal quadratic estimator.

[ascl:2308.014] velocileptors: Velocity-based Lagrangian and Eulerian PT expansions of redshift-space distortions

velocileptors computes the real- and redshift-space power spectra and correlation functions of biased tracers using 1-loop perturbation theory (with effective field theory counter terms and up to cubic biasing) as well as the real-space pairwise velocity moments. It provides simple computation of the power spectrum wedges or multipoles, and uses a reduced set of parameters for computing the most common case of the redshift-space power spectrum. In addition, velocileptors offers two "direct expansion" modules available in LPT and EPT.

[ascl:2308.015] FishLSS: Fisher forecasting for Large Scale Structure surveys

FishLSS computes the Fisher information matrix for a set of observables and model parameters. It can model the redshift-space power spectrum of any biased tracer of the CDM+baryon field and the post-reconstruction galaxy power spectrum. The code also models the projected cross-correlation of galaxies with the CMB lensing convergence, the projected galaxy power spectrum, and the CMB lensing convergence power spectrum. FishLSS requires pyFFTW (ascl:2109.009), velocileptors (ascl:2308.014), and CLASS (ascl:1106.020).

[ascl:2309.001] TRES: TRiple Evolution Simulation package

TRES simulates hierarchical triple systems with stellar and planetary components, including stellar evolution, stellar winds, tides, general relativistic effects, mass transfer, and three-body dynamics. It combines stellar evolution and interactions with three-body dynamics in a self-consistent way. The code includes the effects of common-envelope evolution, circularized stable mass transfer, tides, gravitational wave emission and up-to-date stellar evolution through SeBa (ascl:1201.003). Other stellar evolution codes, such as SSE (ascl:1303.015), can also be used. TRES is written in the AMUSE (ascl:1107.007) software framework.

[ascl:2309.002] UBHM: Uncertainty quantification of black hole mass estimation

Uncertain_blackholemass predicts virial black hole masses using a neural network model and quantifies their uncertainties. The scripts retrieve data and run feature extraction and uncertainty quantification for regression. They can be used separately or deployed to existing machine learning methods to generate prediction intervals for the black hole mass predictions.

[ascl:2309.003] Swiftbat: Utilities for handing BAT instrument data from the Neil Gehrels Swift Observatory

Swiftbat retrieves, analyzes, and displays data from NASA's Swift spacecraft, especially data from the Swift Burst Alert Telescope (BAT). All BAT data are available from the Swift data archive; however, a few routines in this library use data access methods not available to the general public and thus are useful only to Swift team members. The package also installs a command-line program 'swinfo' that provides Swift Information such as what the MET (onboard-clock) time is, where Swift was pointing, and whether a specific source was above the horizon and/or in the field of view.

[ascl:2309.004] GWSim: Mock gravitational waves event generator

GWSim generates mock gravitational wave (GW) events corresponding to different binary black hole (BBH) population models. It can incorporate scenarios of GW mass models, GW spin distributions, the merger rate, and the cosmological parameters. GWSim generates samples of binary compact objects for a fixed amount of observation time, duty cycle, and configuration of the detector network; the universe created by the code is uniform in comoving volume.

[ascl:2309.005] DeepGlow: Neural network emulator for BOXFIT

The feed-forward neural network DeepGlow emulates BOXFIT (ascl:2306.059) simulation data of gamma-ray burst (GRB) afterglows. The package provides an easy interface to generate GRB afterglow spectra and light curves mimicking those generated through BOXFIT with high accuracy. The code used to generate the training data and to train the neural networks is also included.

[ascl:2309.006] CoLFI: Cosmological Likelihood-Free Inference

CoLFI (Cosmological Likelihood-Free Inference) estimates parameters directly from observational data sets using neural density estimators (NDEs); it is a fully ANN-based framework that differs from traditional Bayesian inference. The package contains three NDEs that are used to estimate parameters: an artificial neural network (ANN), a mixture density network (MDN), and a mixture neural network (MNN). CoLFI can learn the conditional probability density using samples generated by models, and the posterior distribution can be obtained for given observational data.

[ascl:2309.007] MATRIX: Multi-phAse Transits Recovery from Injected eXoplanets toolkit

The injection-recovery MATRIX (Multi-phAse Transits Recovery from Injected eXoplanets) Toolkit creates grids of scenarios with a set of periods, radii, and epochs of synthetic transiting exoplanet signals in a provided light curve. Typical injection-recovery executions consist of 2-dimensional scenarios, where only one epoch (random or hardcoded) was used for each period and radius, which may reduce accuracy. MATRIX performs multi-phase analyses needing only a few parameters in a configuration file and running one line of code.

[ascl:2309.008] PI: Plages Identification

Plages Identification identifies solar plages from Ca II K photographic observations irrespective of noise level, brightness, and other image properties. The code provides an efficient, reliable method for identifying solar plages. The output of the algorithm is an image highlighting the plages and the calculated plage index. Plages Identification is also deployed as a webapp, allowing users to experiment with different hyperparameters and visualize their impact on the output image in real time.

[submitted] LOFAR H5plot

Calibration solutions for the LOFAR radio telescope are stored in a 5-dimensional (time, frequency, station, polarisation and direction in the sky) HDF5 table. H5plot is a GUI application focussing on interactive visual inspection of these calibration solutions.

[submitted] qmatch: Some astronomical image matching programs

Matching stars in astronomical images is an essential step in data reduction. This work includes several matching programs implemented in Python: simple matching, fast matching, and triangle matching. For two catalogs with m and n objects, the simple method has a time and space complexity of O(m*n) but is fast when n or m is small. The time complexity of the fast method is O(m log m + n log n). The triangle method works between rotated and scaled images. All methods are used in pipelines and work well. The package is published on PyPI under the name 'qmatch'.
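As an illustration of the complexity claim for the fast method, a KD-tree gives an O(n log n) build and O(log n) queries; the sketch below is a generic nearest-neighbour matcher, not the qmatch implementation.

```python
import numpy as np
from scipy.spatial import cKDTree

def fast_match(cat1, cat2, radius=2.0):
    """Nearest-neighbour matching of two (x, y) catalogs within `radius` pixels.

    Building the tree costs O(n log n) and each query O(log n), the same
    complexity class as the "fast matching" described above.
    """
    tree = cKDTree(cat2)
    dist, idx = tree.query(cat1, distance_upper_bound=radius)
    matched = np.isfinite(dist)                 # unmatched entries come back as inf
    return np.flatnonzero(matched), idx[matched]

rng = np.random.default_rng(0)
cat2 = rng.uniform(0, 4096, size=(5000, 2))
cat1 = cat2[:1000] + rng.normal(scale=0.3, size=(1000, 2))   # shifted subset
i1, i2 = fast_match(cat1, cat2)
```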

[submitted] A pseudo GUI with pyplot

Working with a GUI, or adding interaction to plotting, helps a lot in data analysis, but the common GUI toolkits for Python are OS-dependent, while manually adding interactive code is complex. This work introduces a pseudo-GUI tool that adds buttons and checkers to a plot and assigns callback functions to them. A remaining limitation is that the documentation is currently in Chinese; an English version is planned for the next release. The program is published on PyPI and can be installed with 'pip install pltgui'.

[submitted] INSPECTA: INtegrated SDHDF Processing Engine in C for Telescope data Analysis

INSPECTA (formerly sdhdfProc) is a software package to read, manipulate and process radio astronomy data in Spectral-Domain Hierarchical Data Format (SDHDF). It is available as part of the 'sdhdf_tools' repository.

[ascl:2309.009] pymcspearman: Monte carlo calculation of Spearman's rank correlation coefficient with uncertainties

pymcspearman is a python implementation of MCSpearman (ascl:1504.008) and calculates Spearman's rank correlation coefficient for data, using bootstrapping and/or perturbation to estimate the uncertainties on the correlation coefficient. This software project has migrated (and expanded) to pymccorrelation (ascl:2309.010).

[ascl:2309.010] pymccorrelation: Correlation coefficients with uncertainties

pymccorrelation calculates correlation coefficients for data, using bootstrapping and/or perturbation to estimate the uncertainties on the correlation coefficient and p-value. The code supports Pearson's r, Spearman's rho, and Kendall's tau. Calculations of Kendall's tau additionally support censored data. This code supersedes and expands the deprecated code pymcspearman (ascl:2309.009).
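The bootstrap-plus-perturbation idea can be sketched generically as follows (a conceptual stand-in using scipy, not the pymccorrelation API).

```python
import numpy as np
from scipy import stats

def mc_spearman(x, y, xerr, yerr, nboot=10000, seed=0):
    """Bootstrap + measurement-perturbation estimate of Spearman's rho."""
    rng = np.random.default_rng(seed)
    n = len(x)
    rhos = np.empty(nboot)
    for i in range(nboot):
        idx = rng.integers(0, n, n)                      # bootstrap resample
        xp = x[idx] + rng.normal(0, xerr[idx])           # perturb within errors
        yp = y[idx] + rng.normal(0, yerr[idx])
        rhos[i], _ = stats.spearmanr(xp, yp)
    return np.percentile(rhos, [16, 50, 84])             # median and 1-sigma interval

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 50)
y = 2 * x + rng.normal(0, 2, 50)
lo, med, hi = mc_spearman(x, y, 0.5 * np.ones(50), 0.5 * np.ones(50), nboot=2000)
```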

[ascl:2309.011] PCOSTPD: Periodogram Comparison for Optimizing Small Transiting Planet Detection

The Periodogram Comparison for Optimizing Small Transiting Planet Detection R code compares two periodogram algorithms for detecting transiting exoplanets: the Box-fitting Least Squares (BLS) and the Transit Comb Filter (TCF). It calculates the False Alarm Probability (FAP) based on extreme value theory and signal-to-noise ratio (SNR) metrics to quantify periodogram peak significance. The comparison approach is aimed at optimizing the detection of small transiting planets in future transiting exoplanet surveys. The code can be extended for comparing any set of periodograms.
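The package itself is written in R; as a language-neutral illustration of the BLS side of the comparison, astropy's BoxLeastSquares recovers an injected transit as shown below (the TCF periodogram and the extreme-value FAP calculation are not reproduced here).

```python
import numpy as np
from astropy.timeseries import BoxLeastSquares

rng = np.random.default_rng(2)
t = np.arange(0, 27.4, 2.0 / 60 / 24)                    # ~27 d baseline, 2-min cadence
flux = 1 + 1e-4 * rng.normal(size=t.size)
in_transit = (t % 3.5) < 0.1                             # injected 3.5 d transit, 0.1 d long
flux[in_transit] -= 5e-4

bls = BoxLeastSquares(t, flux)
periodogram = bls.autopower(0.1)                         # trial duration of 0.1 d
best = np.argmax(periodogram.power)
print("recovered period:", periodogram.period[best])
```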

[ascl:2309.012] StarbugII: JWST PSF photometry for crowded fields

The python photometry suite StarbugII provides accurate photometry on point-like sources embedded in complex diffuse emissions. The tool has a simple modular interface with a wide range of photometric routines including embedded source detection, aperture and PSF photometry, diffuse background emission estimation, catalog matching and artificial star testing. The core is built around Photutils (ascl:1609.011).

[ascl:2309.013] maszcal: Mass calibrations for thermal-SZ clusters

maszcal calibrates the observable-mass relation for galaxy clusters, with a focus on the thermal Sunyaev-Zeldovich signal's relation to mass. maszcal explicitly models baryonic matter density profiles, differing from most previous approaches that treat galaxy clusters as purely dark matter. To do this, it uses a generalized Navarro-Frenk-White (GNFW) density to represent the baryons, while using the more typical NFW profile to represent dark matter.

[ascl:2309.014] fitScalingRelation: Fit galaxy cluster scaling relations using MCMC

fitScalingRelation fits galaxy cluster scaling relations using orthogonal or bisector regression and MCMC, taking into account errors on both variables and intrinsic scatter. Although it is geared toward fitting galaxy cluster scaling relations of all kinds, it can be used for any regression problem with errors on both variables and intrinsic scatter.

[ascl:2309.015] bskit: Bispectra from cosmological simulation snapshots

bskit, built upon the nbodykit (ascl:1904.027) simulation analysis package, measures density bispectra from snapshots of cosmological N-body or hydrodynamical simulations. It can measure auto or cross bispectra in a user-specified set of triangle bins (that is, triplets of 3-vector wavenumbers). Several common sets of bins are also implemented, including all triangle bins for specified k_min and k_max, equilateral triangles between specified k_min and k_max, isosceles triangles, and squeezed isosceles triangles.

[ascl:2309.016] PEREGRINE: Gravitational wave parameter inference with neural ratio estimation

PEREGRINE performs full parameter estimation on gravitational wave signals. Using an internal Truncated Marginal Neural Ratio Estimation (TMNRE) algorithm and building upon the swyft (ascl:2302.016) code to efficiently access marginal posteriors, PEREGRINE conducts a sequential simulation-based inference approach to support the analysis of both transient and continuous gravitational wave sources. The code can fully reconstruct the posterior distributions for all parameters of spinning, precessing compact binary mergers using waveform approximants.

[ascl:2309.017] ChEAP: Chemical Evolution Analytic Package

ChEAP (Chemical Evolution Analytic Package) implements an analytic solution for the chemical evolution model of the Galaxy that extends the instantaneous recycling approximation with the contribution of Type Ia SNe. The code works for different prescriptions of the delay time distributions (DTDs), including the single and double degenerate scenarios, and allows the inclusion of an arbitrary number of pristine gas infalls. The required functions are contained in the CheapTools.py file, which is imported as a Python library. ChEAP also includes code to illustrate, with a random-parameter chemical evolution model, the accuracy of this analytic solution compared to one using numerical integration.

[ascl:2309.018] Sprout: Moving mesh finite volume hydro code

The finite volume hydro code Sprout uses a simple expanding Cartesian grid to track outflows over several orders of magnitude of expansion. It captures shocks whether they are aligned or misaligned with the grid, and provides second-order convergence for smooth flows. The code's expanding mesh capability reduces numerical diffusion drastically for outflows, especially when the analytic nature of the bulk flow is known beforehand. Sprout can be used to study fluid instabilities in expanding flows, such as in SN explosions and jets; it resolves fine fluid structures at small length scales and expands the mesh gradually as the structures grow.

[ascl:2309.019] FRISBHEE: FRIedmann Solver for Black Hole Evaporation in the Early-universe

FRISBHEE (FRIedmann Solver for Black Hole Evaporation in the Early-universe) solves the Friedmann-Boltzmann equations for primordial black holes + SM radiation + BSM models. Considering the collapse of density fluctuations as the PBH formation mechanism, the code handles monochromatic and extended mass and spin distributions. FRISBHEE can return the full evolution of the PBH, SM, and Dark Radiation comoving energy densities, together with the evolution of the PBH mass and spin as a function of the log10 of the scale factor, and can determine the relic abundance in the case of Dark Matter produced from BH evaporation for monochromatic and extended distributions.

[ascl:2309.020] PlanetSlicer: Orange-slice algorithm for fitting brightness maps to phase curves

PlanetSlicer fits brightness maps to phase curves using the "orange-slice" method and works both for self-luminous objects and those that diffuse reflected light assuming Lambertian reflectance. In both cases, the model supposes that a spherical object can be divided into slices of constant brightness (or albedo) which may be integrated to yield the total flux observed, given the angles of observation. The package contains two key functions: toPhaseCurve and fromPhaseCurve; the former integrates the brightness for each slice to calculate the observed total flux from the object, given the longitude of observation. The latter does the opposite, estimating the brightness of the slices from a set of observed total flux (the phase curve).
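The forward direction (toPhaseCurve) for a self-luminous body can be sketched as a numerical integration over longitude slices weighted by foreshortening; normalization conventions and the reflected-light case are handled differently in PlanetSlicer itself, so treat this as a conceptual sketch with hypothetical function names.

```python
import numpy as np

def to_phase_curve(slice_brightness, obs_longitudes, n_fine=3600):
    """Integrate per-slice brightness into an observed phase curve.

    Each longitude slice has constant brightness; its contribution is weighted
    by the visibility/foreshortening factor max(0, cos(delta_longitude)).
    """
    n_slices = len(slice_brightness)
    phi = np.linspace(0, 2 * np.pi, n_fine, endpoint=False)
    slice_of = (phi // (2 * np.pi / n_slices)).astype(int)   # slice index of each longitude
    brightness = np.asarray(slice_brightness)[slice_of]
    flux = []
    for phi_obs in np.atleast_1d(obs_longitudes):
        weight = np.clip(np.cos(phi - phi_obs), 0, None)     # visible hemisphere only
        flux.append(np.sum(brightness * weight) * (2 * np.pi / n_fine))
    return np.array(flux)

curve = to_phase_curve([1.0, 0.2, 0.5, 0.8], np.linspace(0, 2 * np.pi, 50))
```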

[ascl:2310.001] celerite2: Fast and scalable Gaussian Processes in one dimension

celerite2 is a re-write of celerite (ascl:1709.008), an algorithm for fast and scalable Gaussian Process (GP) Regression in one dimension. celerite2 improves numerical stability and integration with various machine learning frameworks. The implementation includes interfaces in Python and C++, with full support for PyMC (ascl:1610.016) and JAX (ascl:2111.002).
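A minimal usage sketch based on the documented Python interface follows; the kernel choice and parameter values are arbitrary, and exact argument names should be confirmed against the celerite2 documentation.

```python
import numpy as np
import celerite2
from celerite2 import terms

t = np.sort(np.random.default_rng(3).uniform(0, 10, 100))
yerr = 0.1 * np.ones_like(t)
y = np.sin(t) + yerr * np.random.default_rng(4).normal(size=t.size)

# A stochastically driven, damped harmonic oscillator kernel
kernel = terms.SHOTerm(sigma=1.0, rho=2.0, Q=0.3)

gp = celerite2.GaussianProcess(kernel, mean=0.0)
gp.compute(t, yerr=yerr)                 # scalable factorization of the covariance
print("log likelihood:", gp.log_likelihood(y))

t_pred = np.linspace(0, 10, 500)
mu, var = gp.predict(y, t=t_pred, return_var=True)   # GP prediction on a fine grid
```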

[ascl:2310.002] lcsim: Light curve simulation code

lcsim creates artificial light curves using two algorithms. The first simulates Gaussian distributed light curves following a specific power spectral density (PSD) freely selectable by the user. The second algorithm simulates light curves following a specific PSD and matching a specific probability density function (PDF). The package provides methods to resample the simulated light curves and add "observational" noise. Furthermore, the package provides an interface to a SQLite3-based database to store and access the simulations.
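The first algorithm described above is commonly implemented following Timmer & König (1995): draw Gaussian Fourier amplitudes scaled by the PSD and inverse-transform. Whether lcsim follows exactly this recipe and normalization is an assumption; the sketch below shows the standard approach.

```python
import numpy as np

def simulate_psd_lightcurve(n, dt, psd_func, seed=0):
    """Timmer & Koenig (1995)-style Gaussian light curve with a prescribed PSD."""
    rng = np.random.default_rng(seed)
    freqs = np.fft.rfftfreq(n, dt)[1:]                   # skip the zero frequency
    amp = np.sqrt(psd_func(freqs) / 2.0)
    re = amp * rng.normal(size=freqs.size)
    im = amp * rng.normal(size=freqs.size)
    if n % 2 == 0:
        im[-1] = 0.0                                     # Nyquist bin must be real
    spectrum = np.concatenate(([0.0], re + 1j * im))
    return np.fft.irfft(spectrum, n=n)

# red-noise example: P(f) ~ f^-2
lc = simulate_psd_lightcurve(4096, 1.0, lambda f: f ** -2.0)
```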

[ascl:2310.003] wwz: Weighted wavelet z-transform code

wwz provides a python3 implementation of the Foster weighted wavelet z-transform, a wavelet-based method for periodicity analysis of unevenly sampled data.

[ascl:2310.004] q3dfit: PSF decomposition and spectral analysis for JWST-IFU spectroscopy

q3dfit performs PSF decomposition and spectral analysis for high dynamic range JWST IFU observations, allowing the user to create science-ready maps of relevant spectral features. The software takes advantage of the spectral differences between quasars and their host galaxies for maximal-contrast subtraction of the quasar point-spread function (PSF) to reveal and characterize the faint extended emission of the host galaxy. Host galaxy emission is carefully fit with a combination of stellar continuum, emission and absorption of dust and ices, and ionic and molecular emission lines.

[ascl:2310.005] DustPyLib: A library of DustPy extensions

The DustPyLib library contains auxiliary modules for the dust evolution software DustPy (ascl:2207.016), which simulates the evolution of dust and gas in protoplanetary disks. DustPyLib includes interfaces to radiative transfer codes and modules with extensions to the DustPy defaults.

[ascl:2310.006] MAGPy-RV: Gaussian Process regression pipeline with MCMC parameter searching

MAGPy-RV (Modelling stellar Activity with Gaussian Processes in Radial Velocity) models data with Gaussian Process regression and affine invariant Monte Carlo Markov Chain parameter searching. Developed to model intrinsic, quasi-periodic variations induced by the host star in radial velocity (RV) surveys for the detection of exoplanets and the accurate measurements of their orbital parameters and masses, it now includes a variety of kernels and models and can be applied to any timeseries analysis. MAGPy-RV includes publication level plotting, efficient posterior extraction, and export-ready LaTeX results tables. It also handles multiple datasets at once and can model offsets and systematics from multiple instruments. MAGPy-RV requires no external dependencies besides basic python libraries and corner (ascl:1702.002).

[ascl:2310.007] zCluster: Measure photometric redshifts for galaxy clusters

zCluster measures galaxy cluster photometric redshifts using data from broadband photometry in large public surveys, given a priori knowledge of the cluster position. The code retrieves and uses redshift probability distributions in order to create a projected two-dimensional density map of a targeted galaxy cluster, which is later convolved with a Gaussian kernel to smooth the map. zCluster also produces photometric redshift estimates and galaxy density maps for any point in the sky using the included zField tool.

[ascl:2310.008] clfd: Clean folded data

clfd (clean folded data) implements GPU-accelerated smart interference removal algorithms to be used on folded pulsar search and pulsar timing data. The code converts each source profile to a small set of representative features, flagging outliers in the resulting feature space. clfd further visualizes the outlier flagging process, as well as the resulting two-dimensional time-frequency mask that is applied to the clean archive. The code provides access to cleaning algorithms that were initially developed for the High Time Resolution Universe (HTRU) survey which found several pulsars.

[ascl:2310.009] IQRM-APOLLO: Clean narrow-band RFI using Inter-Quartile Range Mitigation (IQRM) algorithm

IQRM-APOLLO cleans narrow-band radio frequency interference (RFI) using the Inter-Quartile Range Mitigation (IQRM) algorithm. By masking this interference, the code reduces the number of false positive pulsar candidates and increases sensitivity for pulsar detection. The IQRM algorithm is an outlier detection algorithm that is both non-parametric and robust to the presence of trends in time series data. Using short-duration data blocks, IQRM-APOLLO computes a spectral statistic that correlates with the presence of RFI, removing high outliers from the input signal.
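A bare-bones illustration of inter-quartile-range outlier flagging is shown below; the full IQRM algorithm is more elaborate (it is designed to remain robust to trends across the band), so this is only a conceptual sketch, not the IQRM-APOLLO code.

```python
import numpy as np

def iqr_flag(channel_stat, threshold=3.0):
    """Flag channels whose statistic is a high outlier by the IQR rule."""
    q1, q3 = np.percentile(channel_stat, [25, 75])
    cutoff = q3 + threshold * (q3 - q1)
    return channel_stat > cutoff

rng = np.random.default_rng(5)
per_channel_std = rng.normal(1.0, 0.05, 1024)    # stand-in spectral statistic
per_channel_std[[100, 101, 512]] *= 5.0          # channels hit by narrow-band RFI
mask = iqr_flag(per_channel_std)                 # True where the channel should be masked
```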

[ascl:2310.010] riptide: Pulsar searching with the Fast Folding Algorithm

riptide implements the Fast Folding Algorithm (FFA) to identify periodic signals from time series data. In order to identify faint pulsars, the code provides access to a library of functions and classes for processing dedispersed radio signals. The FFA approaches the theoretical optimum for sensitivity to periodic signals regardless of pulse period and duty cycle.

[ascl:2310.011] AI-Feynman: Symbolic regression algorithm

AI-Feynman fits analytical expressions to data sets via symbolic regression, mapping the target variable to different features supplied in the data array. Using a neural network with constraints on the number of parameters utilized, the code provides the ability to obtain analytical expressions for normalized features that are used to predict a Pareto-optimal target. AI-Feynman is robust in handling noisy data, recursively generating multidimensional symbolic expressions that match data from an unknown function.

[ascl:2310.012] GRIZZLY: 1D radiative transfer code

GRIZZLY simulates reionization using a 1D radiative transfer scheme. The code enables the efficient exploration of the parameter space for evaluating 21cm brightness temperature fluctuations near the cosmic dawn. GRIZZLY builds upon the BEARS algorithm for generating simulated reionization maps with density and velocity fields, which are useful for profiling dark matter halos and cosmological density fields.

[ascl:2311.001] wcpy: Wavelength Calibrator

The graphical user interface Wavelength Calibrator facilitates wavelength calibration. Although developed for astronomical data reduction, it can also be used in any place where wavelength calibration is needed.

[ascl:2311.002] VCAL-SPHERE: Hybrid pipeline for reduction of VLT/SPHERE data

VCAL-SPHERE, for VIP-based Calibration of VLT/SPHERE data, is a versatile pipeline for high-contrast imaging of exoplanets and circumstellar disks. The pipeline covers all steps of data reduction, including raw calibration, pre-processing and post-processing (i.e., modeling and subtraction of the stellar halo), for the IFS, IRDIS-DBI and IRDIS-CI modes (and combinations thereof) of the VLT instrument SPHERE. The three main steps of the reduction correspond to different modules, where the first follows the recommended EsoRex (ascl:1504.003) workflow and associated recipes with occasional inclusion of VIP (ascl:1603.003) routines (e.g., for PCA-based sky subtraction), while the other two stages fully rely on the VIP toolbox. Although the default parameters of the pipeline should yield a good reduction in most cases, these can be tuned using JSON parameter files for each stage of the pipeline for optimal reduction of specific datasets.

[ascl:2311.003] Special-Blurring: Compare quantum-spacetime foam models to GRB localizations

The IDL code Special-Blurring compares models of quantum-foam-induced blurring with the full dataset of gamma-ray burst localizations available from the NASA High Energy Astrophysics Science Research Archive (as of 1 November 2022). This includes GRB221009A, which was especially bright and detected at extremely high-energy TeV gamma-rays. An upper limit of the parameter alpha (giving the maximal strength of quantum blurring) can be entered; this parameter scales the blurring model (called "Phi"), which operates much like "seeing" from the ground in the optical, and the resulting calculations are plotted against the observations.

[submitted] CRPropa 3.2

The landscape of high- and ultra-high-energy astrophysics has changed in the last decade, largely due to the inflow of data collected by large-scale cosmic-ray, gamma-ray, and neutrino observatories. At the dawn of the multimessenger era, the interpretation of these observations within a consistent framework is important to elucidate the open questions in this field. CRPropa 3.2 is a Monte Carlo code for simulating the propagation of high-energy particles in the Universe. This version represents a major leap forward, significantly expanding the simulation framework and opening up the possibility for many more astrophysical applications. This includes, among others: efficient simulation of high-energy particles in diffusion-dominated domains, self-consistent and fast modelling of electromagnetic cascades with an extended set of channels for photon production, and studies of cosmic-ray diffusion tensors based on updated coherent and turbulent magnetic-field models. Furthermore, several technical updates and improvements are introduced with the new version, such as: enhanced interpolation, targeted emission of sources, and a new propagation algorithm (Boris push). The detailed description of all novel features is accompanied by a discussion and a selected number of example applications.

[submitted] spectroflat

Spectroflat is a generic Python calibration library for spectro-polarimetric data. It can be plugged into existing Python-based data reduction pipelines or used as a standalone calibration and performance analysis tool.
It includes smile distortion correction and flat field extraction.

[submitted] atlas-fit

atlas-fit is a Python tool to amend the results of [spectroflat] with a calibration against a solar atlas; that is, data for wavelength calibration and continuum correction are generated from flat field information and selected solar atlases.

[ascl:2311.004] KvW: Modified Kwee–Van Woerden method for eclipse minimum timing with reliable error estimates

The KvW code applies the Kwee–van Woerden (KvW) method for eclipse or transit minimum timing, with an improved error calculation that avoids the underestimated errors in minimum times that may appear in the original method. This is particularly the case for low-noise eclipse or transit light curves from space or from modern ground instrumentation. The code requires an input light curve of near-equidistant points that contains only the eclipse, without any off-eclipse points, and is available in Python and IDL. Both implementations return an eclipse minimum time with its error and provide optional text output and plots, as well as several levels of debug information.
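
For orientation, the following minimal Python sketch implements the symmetry search at the heart of the classic Kwee–van Woerden estimator: reflect the light curve about trial minimum times and fit a parabola to the resulting sum of squared branch differences. It assumes near-equidistant, eclipse-only input as described above, uses illustrative function and argument names, and omits the package's improved error calculation, so it is not the KvW code itself.

import numpy as np

def kvw_minimum_sketch(time, flux, n_trial=25):
    # Trial minimum times spanning the central half of the eclipse window
    t_mid = 0.5 * (time[0] + time[-1])
    half = 0.25 * (time[-1] - time[0])
    trials = np.linspace(t_mid - half, t_mid + half, n_trial)
    s = np.empty(n_trial)
    for i, t0 in enumerate(trials):
        mirrored = np.interp(2.0 * t0 - time, time, flux)   # reflect the curve about t0
        s[i] = np.sum((flux - mirrored) ** 2)               # branch asymmetry statistic
    a, b, _ = np.polyfit(trials, s, 2)                      # parabola through s(t0)
    return -b / (2.0 * a)                                   # vertex = estimated minimum time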

[ascl:2311.005] NEOexchange: Target and Observation Manager for the Solar System

The NEOexchange web portal and Target and Observation Manager ingests solar system objects, including Near-Earth Object (NEO) candidates from the Minor Planet Center, schedules observations on the Las Cumbres Observatory global telescope network and reduces, displays, and analyzes the resulting data. NEOexchange produces calibrated photometry from the imaging data and uses Source Extractor (ascl:1010.064) and SCAMP (ascl:1010.063) to perform object detection and astrometric fitting and calviacat (ascl:2207.015) to perform photometric calibration against photometric catalogs. It also has the ability to perform image registration and subtraction using SWARP (ascl:1010.068) and HOTPANTS (ascl:1504.004) and image stacking, alignment, and faint feature detection using gnuastro (ascl:1801.009).

[ascl:2311.006] MONDPMesh: Particle-mesh code for Milgromian dynamics

MONDPMesh provides a particle-mesh method to calculate the time evolution of a system of point masses under modified gravity, namely the AQUAL formalism. This is done by transforming the Poisson equation for the potential into a system of four linear PDEs and solving these using fast Fourier transforms. The accelerations on the point masses are calculated from this potential, and the system is propagated using Leapfrog integration. The time complexity of the code is O(N⋅p⋅log(p)) for p pixels and N particles, which is the same as for a Newtonian particle-mesh code.
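
The Leapfrog propagation mentioned above follows the standard kick-drift-kick pattern; a minimal sketch is shown below, where accel_func is a stand-in (an assumption, not MONDPMesh's actual interface) for the acceleration interpolated from the mesh potential.

import numpy as np

def leapfrog_step(pos, vel, accel_func, dt):
    # Kick-drift-kick leapfrog update applied to all particles at once
    vel_half = vel + 0.5 * dt * accel_func(pos)            # half kick
    pos_new = pos + dt * vel_half                          # full drift
    vel_new = vel_half + 0.5 * dt * accel_func(pos_new)    # half kick
    return pos_new, vel_new

# Toy usage with a placeholder harmonic acceleration field
pos = np.array([[1.0, 0.0, 0.0]])
vel = np.array([[0.0, 1.0, 0.0]])
pos, vel = leapfrog_step(pos, vel, lambda x: -x, dt=0.01)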

[ascl:2311.007] tensiometer: Test a model until it breaks

Tensiometer provides non-Gaussian tension estimators that extend GetDist (ascl:1910.018) capabilities to test the level of agreement or disagreement between different posterior distributions by using kernel density estimates. The code has been used to study the level of internal agreement between different measurements of the clustering of cosmological structures from the Dark Energy Survey and the Planck satellite.

[ascl:2311.008] IQRM: IQRM interference flagging algorithm for radio pulsar and transient searches

IQRM implements the Inter-Quartile Range Mitigation (IQRM) interference flagging algorithm for radio pulsar and transient searches. This module provides only the algorithm that infers a channel mask from some spectral statistic that measures the level of RFI contamination in a time-frequency data block. It should be useful as a reference implementation to developers who wish to integrate IQRM into an existing pipeline or search code.
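
As a rough illustration of the inter-quartile-range idea behind the algorithm, the minimal Python sketch below flags channels whose per-channel statistic lies far above the upper quartile. The function name, threshold value, and use of the raw statistic (rather than the lagged differences the published IQRM algorithm operates on) are illustrative assumptions, not the package's API.

import numpy as np

def iqr_channel_mask(stat, threshold=3.0):
    # stat: 1D array of a per-channel spectral statistic (e.g. standard deviation);
    # channels above Q3 + threshold * IQR are flagged as RFI-contaminated.
    q1, q3 = np.percentile(stat, [25, 75])
    return stat > q3 + threshold * (q3 - q1)

# Toy usage: inject narrow-band RFI into channel 10 of a (nchan, nsamp) block
rng = np.random.default_rng(0)
block = rng.normal(size=(64, 1024))
block[10] += 5.0
print(np.flatnonzero(iqr_channel_mask(block.std(axis=1))))   # -> [10]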

[ascl:2311.009] Hi-COLA: Cosmological large-scale structure simulator for Horndeski theories

Hi-COLA runs fast approximate N-body simulations of non-linear structure formation in reduced Horndeski gravity (Horndeski theories with luminal gravitational waves). It is generic with respect to the reduced Horndeski class. Given an input Lagrangian, Hi-COLA's front-end dynamically constructs the appropriate field equations and consistently solves for the cosmological background, linear growth, and screened fifth force of that theory. This is passed to the back-end, which runs a hybrid N-body simulation at significantly reduced computational and temporal cost compared to traditional N-body codes. By analyzing the particle snapshots, one can study the formation of structure through statistics such as the matter power spectrum.

[ascl:2311.010] FPFS: Fourier Power Function Shapelets

FPFS (Fourier Power Function Shapelets) is a fast, accurate shear estimator for the shear responses of galaxy shape, flux, and detection. Utilizing leading-order perturbations of shear (a vector perturbation) and image noise (a tensor perturbation), the code determines shear and noise responses for both measurements and detections. Unlike methods that distort each observed galaxy repeatedly, the software employs analytical shear responses of select basis functions, including the Shapelets basis and peak basis. FPFS is efficient and can process approximately 1,000 galaxies within a single CPU second, and maintains a multiplicative shear estimation bias below 0.5% even amidst blending challenges.

[ascl:2311.011] PIPPIN: Polarimetric Differential Imaging (PDI) pipeline for NACO data

PIPPIN (PDI pipeline for NACO data) reduces the polarimetric observations made with the VLT/NACO instrument. It applies the Polarimetric Differential Imaging (PDI) technique to distinguish the polarized, scattered light from the (largely) un-polarized, stellar light. As a result, circumstellar dust can be uncovered. PIPPIN appropriately handles various instrument configurations, including half-wave plate and de-rotator usage, Wollaston beam-splitter, and wiregrid observations. As part of the PDI reduction, PIPPIN performs various levels of corrections for instrumental polarization and crosstalk.

[ascl:2311.012] CosmoLattice: Lattice simulator of scalar and gauge field dynamics in an expanding universe

CosmoLattice performs lattice simulations of field dynamics in an expanding universe. The code can simulate the dynamics of interacting scalar field theories, Abelian U(1) gauge theories, and non-Abelian SU(2) gauge theories, either in flat spacetime or an expanding FLRW background, including the case of self-consistent expansion sourced by the fields themselves. It can also compute gravitational waves sourced by U(1) Abelian Gauge fields. The CosmoLattice platform can implement any system of dynamical equations suitable for discretization on a lattice, as it introduces its own language describing fields and operations between them, and hence can implement new libraries to solve arbitrary field problems (related or not to cosmology).

[ascl:2311.013] pygwb: Lightweight python stochastic GWB analysis pipeline

pygwb analyzes laser interferometer data and designs a gravitational wave background (GWB) search pipeline. Its modular and flexible codebase is tailored to current ground-based interferometers such as LIGO Hanford, LIGO Livingston, and Virgo, but can be generalized to other configurations. It is based on GWpy (ascl:1912.016) and bilby (ascl:1901.011) for optimal integration with widely-used gravitational wave data analysis tools. pygwb also includes a set of scripts to analyze data and perform large-scale searches on a high-performance computing cluster efficiently.

[ascl:2311.014] FASMA: Stellar spectral analysis package

FASMA delivers the atmospheric stellar parameters (effective temperature, surface gravity, metallicity, microturbulence, macroturbulence, and rotational velocity) based on the spectral synthesis technique. This technique relies on the comparison of synthetic spectra with observations to yield the best-fit parameters under a χ2 minimization process. FASMA also delivers chemical abundances of 13 elements. Written in Python, the code is wrapped around MOOG (ascl:1202.009), which calculates the synthetic spectra. FASMA includes two grids of models in MOOG-readable format, Kurucz and MARCS, that cover the parameter space for both dwarf and giant stars with a metallicity limit of -5.0 dex.

[ascl:2311.015] nemiss: Neutrino emission from hydrocode data

nemiss calculates neutrino emission from an astrophysical jet. nemiss works as part of the PLUTO-nemiss-rlos pipeline. PLUTO (ascl:1010.045) produces a hydrodynamical jet. Then, nemiss calculates beamed neutrino emission at each eligible cell along a given direction in space. Finally, rlos (ascl:1811.009) produces a synthetic neutrino image of the jet along the given direction, taking into consideration the finite nature of the speed of light.

[ascl:2311.016] RoSSBi3D: Finite volume code for protoplanetary disk evolution study

The numerical code RoSSBi3D (Rotating Systems Simulation Code for Bi-fluids) is designed for the study of protoplanetary disks in 2D and 3D. It is a finite volume code that is second order in time, features self-gravity (in 2D), and uses an exact Riemann solver to account for discontinuities. This FORTRAN 90 code solves the fully compressible inviscid Euler, continuity, and energy conservation equations in polar coordinates for an ideal gas orbiting a central object. Solid particles are treated as a pressureless fluid and interact with the gas through aerodynamic forces. The code runs on high performance computers thanks to the MPI standard (CPU).

[submitted] prodimopy: Python tools for the radiation thermo-chemical code ProDiMo.

prodimopy is an open-source Python package to read, analyze, and plot modeling results of the radiation thermo-chemical disk code ProDiMo (PROtoplanetary DIsk MOdel, https://prodimo.iwf.oeaw.ac.at). It also includes tools to run ProDiMo in 1D slab model mode, to run simple ProDiMo model grids, and to interface ProDiMo with 1D and 2D disk codes (i.e., to use input structure from hydrodynamic models).

prodimopy can also be used independently of ProDiMo (no ProDiMo installation is required) and hence is also useful to extract information from already available ProDiMo models (e.g. as input for other codes) or for model comparison.

[ascl:2312.001] smops: A sub-band model FITS image interpolator

smops interpolates input sub-band model FITS images, such as those produced by WSClean (ascl:1408.023), into more finely channelized sub-band model FITS images, thus generating model images at a higher frequency resolution. It is a Python-based command line tool. For example, given input model FITS images initially created from sub-dividing a given bandwidth into four, smops can subdivide that bandwidth further, resulting in more finely channelized model images, to a specified frequency resolution. This smooths out the stepwise behavior of models across frequency, which can improve the results of self-calibration with such models.
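
The sketch below illustrates the kind of per-pixel interpolation across frequency that such refinement involves; the polynomial scheme, function name, and assumed array shapes are illustrative assumptions rather than smops' exact implementation.

import numpy as np

def refine_subband_models(coarse_freqs, coarse_models, fine_freqs, order=2):
    # coarse_models: (n_coarse, ny, nx) sub-band model images at coarse_freqs.
    # Fit a low-order polynomial in frequency to every pixel, then evaluate
    # it on the finer frequency grid to smooth the stepwise behaviour.
    n_coarse, ny, nx = coarse_models.shape
    coeffs = np.polyfit(coarse_freqs, coarse_models.reshape(n_coarse, -1), order)
    fine_flat = np.vander(fine_freqs, order + 1) @ coeffs      # (n_fine, ny*nx)
    return fine_flat.reshape(len(fine_freqs), ny, nx)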

[ascl:2312.002] PROSPECT: Profile likelihood for frequentist cosmological inference

PROSPECT infers cosmological parameters using profile likelihoods. It constructs an approximate profile likelihood from an MCMC and optimizes it using simulated annealing, a gradient-free stochastic optimization algorithm. It employs an automatic tuning of the step size parameter and binned covariance matrices from the MCMC to achieve efficient optimizations of the profile likelihood.
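
The basic profile-likelihood construction can be illustrated in a few lines: fix the parameter of interest on a grid and maximize the likelihood over the remaining parameters with a gradient-free optimizer. The sketch below uses scipy's dual_annealing as a stand-in annealer with invented argument names; PROSPECT's automatic step-size tuning and use of binned MCMC covariances are not shown.

import numpy as np
from scipy.optimize import dual_annealing

def profile_loglike(loglike, theta_grid, nuisance_bounds):
    # For each fixed theta, maximize loglike over the nuisance parameters
    profile = []
    for theta in theta_grid:
        res = dual_annealing(lambda nu: -loglike(theta, nu), bounds=nuisance_bounds)
        profile.append(-res.fun)
    return np.array(profile)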

[ascl:2312.003] BUQO: Bayesian Uncertainty Quantification by Optimization

BUQO solves large-scale imaging inverse problems. It leverages probability concentration phenomena and the underlying convex geometry to formulate the Bayesian hypothesis test as a convex problem that is then efficiently solved by using scalable optimization algorithms. This allows scaling to high-resolution and high-sensitivity imaging problems that are computationally unaffordable for other Bayesian computation approaches.

[ascl:2312.004] DENSe: Bayesian density estimation for Poisson data

DENSe enables Bayesian non-parametric inferences of densities of Poisson data counts. Its framework of stateless methods is written in Python, although it relies on NIFTy (ascl:1302.013, ascl:1903.008) for the heavy lifting. DENSe utilizes all available information in the data by modeling the inherent correlation structure using a Matérn kernel. The inference of the density from count data can be written in a single line of python code. The fitting method takes a multidimensional numpy array as input and returns multidimensional arrays of the same dimensions encoding the density field.

[ascl:2312.005] LyaCoLoRe: Generate simulated Lyman alpha forest spectra

LyaCoLoRe uses CoLoRe (ascl:2111.009) simulations to generate simulated Lyman alpha forest spectra. The code takes the output files from CoLoRe as an input, carries out several stages of processing, and produces realistic skewers of transmitted flux fraction as an output. The repository includes tools to tune the parameters within LyaCoLoRe's transformation, and to measure the 1D power spectrum of output skewers quickly.

[ascl:2312.006] SolarAxionFlux: Solar axion flux calculator for different solar models and opacity codes

SolarAxionFlux quantifies systematic differences and statistical uncertainties in the calculation of the solar axion flux from axion-photon and axion-electron interactions. Determining the limitations of these calculations can be used to identify potential improvements and help determine axion model parameters more accurately.

[ascl:2312.007] CosmoLED: Cosmo code for Large Extra Dimension (LED) black holes

CosmoLED computes Hawking evaporation from black holes and sets constraints on the fraction of dark matter in black holes. Based on ExoCLASS (ascl:1106.020), the code provides a DarkAges_LED module and C codes in class_LED to compute the evolution and energy deposition functions from LED black holes. Though CosmoLED is designed for large extra dimension black holes, it can also be used to study 4D black holes.

[ascl:2312.008] CompressedFisher: Library for testing Fisher forecasts

The CompressedFisher library tests whether Fisher forecasts using simulated components are converged. The library contains tools to compute standard Fisher estimates, estimate the level of bias due to the finite number of simulations, and compute the compressed Fisher information. Typical usage of CompressedFisher requires two ensembles of simulations: one set of simulations is given at the fiducial parameters (𝜃) to estimate the covariance matrix. The second is a set of simulated derivatives; these can either be in the form of realizations of the derivatives themselves or simulations evaluated at a set of points in the neighborhood of the fiducial point that the code can use to estimate the derivatives.
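
For reference, the standard simulation-based Fisher estimate that such forecasts start from can be written compactly as below; this is a sketch with assumed array shapes, not the package's API, and the finite-simulation debiasing and compressed estimator that CompressedFisher provides are omitted.

import numpy as np

def fisher_from_sims(fiducial_sims, derivative_means):
    # fiducial_sims: (n_sims, n_data) realizations at the fiducial parameters
    # derivative_means: (n_params, n_data) mean derivatives of the data vector
    cov = np.cov(fiducial_sims, rowvar=False)        # estimated data covariance
    cov_inv = np.linalg.inv(cov)                     # note: no finite-sample debiasing here
    return derivative_means @ cov_inv @ derivative_means.T   # Fisher matrix F_ij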

[ascl:2312.009] GravSphere: Jeans modeling code

The non-parametric Jeans code GravSphere models discrete data and can be used to model dark matter distributions in galaxies. It can also recover the density ρ(r) and velocity anisotropy β(r) of spherical stellar systems, assuming only that they are in a steady state. Real or mock data are prepared by using the included binulator.py code; the repository also includes many examples for exploring GravSphere's capabilities.

[ascl:2312.010] FORECAST: Realistic astronomical image and galaxy survey generator

FORECAST generates realistic astronomical images and galaxy surveys by forward modeling the output snapshot of any hydrodynamical cosmological simulation. It exploits the snapshot by constructing a lightcone centered on the observer's position; the code computes the observed fluxes of each simulated stellar element, modeled as a Single Stellar Population (SSP), in any chosen set of pass-band filters, including k-correction, IGM absorption, and dust attenuation. These fluxes are then used to create an image on a grid of pixels, to which observational features such as background noise and PSF blurring can be added. FORECAST provides customizable options for filters, size of the field of view, and survey parameters, thus allowing the synthetic images to be tailored for specific research requirements.

[ascl:2312.011] PhotochemPy: 1-D photochemical model of rocky planet atmospheres

PhotochemPy finds the steady-state chemical composition of an atmosphere or evolves atmospheres through time. Given inputs such as the stellar UV flux and atmospheric temperature structure, the code creates a photochemical model of a planet's atmosphere. PhotochemPy is a distant fork of Atmos (ascl:2106.039). It provides a Python wrapper to Fortran source code but can also be used exclusively in Fortran.

[ascl:2312.012] PulsarX: Pulsar searching

The folding pipeline PulsarX searches for pulsars. The code includes radio frequency interference mitigation, de-dispersion, folding, and parameter optimization, and supports both psrfits and filterbank data formats. The toolset has two implementations of the folding pipelines; one uses a brute-force de-dispersion algorithm, and the other an algorithm that becomes more efficient than the brute-force de-dispersion algorithm as the number of candidates increases. PulsarX is appropriate for large-scale pulsar surveys.

[ascl:2312.013] 21cmEMU: 21cmFAST summaries emulator

21cmEMU emulates 21cmFAST (ascl:1102.023) summary statistics, among them the 21-cm power spectrum, 21-cm global brightness temperature, IGM spin temperature, and neutral fraction. It also emulates the Thomson scattering optical depth and UV luminosity functions. With 21cmFAST installed, parameters can be supplied directly to 21cmEMU, which can then be used for, for example, analytic calculations of the Thomson optical depth τe and UV luminosity functions. The code is included as an alternative simulator in 21cmMC (ascl:1608.017).

[ascl:2312.014] GRFolres: Extension to GRChombo for modified gravity simulations

GRFolres performs simulations in modified theories of gravity. It is based on GRChombo (ascl:2306.039) and inherits all of the capabilities of the main GRChombo code, which makes use of the Chombo library (ascl:1202.008) for adaptive mesh refinement. The code implements the 4∂ST theory of modified gravity and the cubic Horndeski theory in (3+1)-dimensional numerical relativity. GRFolres can be used for stable gauge evolution, solving the modified energy and momentum constraints for initial conditions, and monitoring the constraint violation and calculating the energy densities associated with the different scalar terms in the action. It can also extract data for the tensor and scalar gravitational waveforms.

[ascl:2312.015] SUNBIRD: Neural-network-based models for galaxy clustering

SUNBIRD trains neural-network-based models for galaxy clustering. It also incorporates pre-trained emulators for different summary statistics, including the galaxy two-point correlation function, density-split clustering statistics, and the void-galaxy cross-correlation function. These models have been trained on mock galaxy catalogs and calibrated to work for specific samples of galaxies. SUNBIRD implements routines with PyTorch to train new neural-network emulators.

[ascl:2312.016] The Farmer: Photometry routines for deep multi-wavelength galaxy surveys

The Farmer contains photometry routines geared towards deep, multi-wavelength galaxy surveys. It fits simple parametric surface brightness profiles provided by The Tractor (ascl:1604.008) to measure precision photometry even in deeply crowded fields when provided with a suitable high resolution detection image. The Farmer has been used to build a number of galaxy survey catalogs, including COSMOS2020, SHELA, and H20.

[ascl:2312.017] LimberJack.jl: Auto-differentiable methods for cosmology

LimberJack.jl performs cosmological analyses of two-point auto- and cross-correlation measurements from galaxy clustering, CMB lensing, and weak lensing data. Written in Julia, it obtains gradients for its outputs faster than traditional finite difference methods, making the code greatly synergistic with gradient-based sampling methods such as Hamiltonian Monte Carlo. LimberJack.jl can efficiently explore parameter spaces with hundreds of dimensions.

[ascl:2312.018] PyMsOfa: Python package for the Standards of Fundamental Astronomy (SOFA) service

PyMsOfa accesses the International Astronomical Union’s SOFA library (ascl:1403.026) from Python. It offers a wrapper package based on a foreign function library for Python (ctypes), a wrapper with the foreign function interface for Python calling C code (cffi), and a package written in pure Python based on the SOFA subroutines. PyMsOfa is suitable for the astrometric detection of habitable planets of the Closeby Habitable Exoplanet Survey (CHES) mission and for the frontier themes of black holes and dark matter related to astrometric calculations and other fields.

[ascl:2312.019] Rainbow: Simultaneous multi-band light curve fitting

Rainbow is a black-body parametric model for transient light curves. It uses the Bazin function as a model for the bolometric flux evolution and a logistic function for the temperature evolution; it provides seven fit parameters and a goodness of fit (reduced χ2) and is well-suited for transient objects. Also included is RainbowRisingFit, suitable for rising transient objects, which offers six fit parameters and is based on a rising sigmoid bolometric flux and a sigmoid temperature evolution. Both models are implemented in the light-curve processing toolbox (ascl:2107.001) for Python.
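
The two ingredients named above can be written down directly; the parameterization below (function names, argument names, and sign conventions) is a plausible sketch rather than Rainbow's exact implementation, which combines the bolometric envelope with a black body at the evolving temperature to predict each band.

import numpy as np

def bazin(t, amplitude, t0, tau_rise, tau_fall):
    # Bazin envelope: exponential decline modulated by a sigmoidal rise
    return amplitude * np.exp(-(t - t0) / tau_fall) / (1.0 + np.exp(-(t - t0) / tau_rise))

def sigmoid_temperature(t, t0, t_color, T_min, T_max):
    # Smooth transition between an early (hot) and late (cool) temperature
    return T_min + (T_max - T_min) / (1.0 + np.exp((t - t0) / t_color))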

[ascl:2312.020] ProPane: Image warping and stacking utilities

The ProPane package provides key utilities for warping between different WCS systems, such as propaneWarp for warping individual frames. It also contains functions for creating large stacks of many warped frames (returned as objects of class ProPane, a name roughly meant to suggest many panes of glass being stacked together). The package uses the wcslib C library (ascl:1108.003) for projections (all legal projections are supported) via the Rwcs package, and the threaded CImg C++ library via the imager package for image warping. ProPane also contains functions converted from older (deprecated) Rwcs- and ProFound-related (ascl:1804.006) functions.

[ascl:2312.021] PyRaTE: Non-LTE spectral lines simulations

PyRaTE (Python Radiative Transfer Emission) post-processes astrochemical simulations. This multilevel radiative transfer code uses the escape probability method to calculate the population densities of the species under consideration. The code can handle all projection angles and geometries and can also be used to produce mock observations of the Goldreich-Kylafis effect. PyRaTE is written in Python; it uses a parallel strategy and relies on the YT analysis toolkit (ascl:1011.022), mpi4py, and numba.

[ascl:2312.022] C2-Ray: Time-dependent photo-ionization calculations

C2-Ray calculates spherical symmetric time-dependent photo-ionization in 1D with the source at the origin for hydrogen only. The code is explicitly photon-conserving and uses an analytical relaxation solution for the ionization rate equations for each time step, thus enabling integration of the equation of transfer along a ray with fewer cells and time steps than previous methods. It is suitable for coupling radiative transfer to gas and N-body dynamics methods on fixed or adaptive grids. C2-Ray is not parallelized but contains an MPI module for compatibility with the 3D version (C2-Ray3Dm).

[ascl:2312.023] C2-Ray3Dm: 3D version of C2-Ray for multiple sources, hydrogen only

C2-Ray3Dm performs time-dependent photo-ionization calculations for 3D multiple sources, and for hydrogen only. Based on C2-Ray (ascl:2312.022), it runs under both MPI and OpenMP. The length of subroutines has been reduced to make the code more manageable and easier to read.

[ascl:2312.024] C2-Ray3Dm1D_Helium: Hydrogen + helium version of C2-Ray

C2-Ray3Dm1D_Helium is the hydrogen + helium version of the radiative transfer photo-ionization code C2-Ray. It combines the 1D and 3D versions of the code.

[ascl:2312.025] pyC2Ray: Python interface to C2Ray with GPU acceleration

pyC2Ray updates C2-Ray (ascl:2312.022), an astrophysical radiative transfer code used to simulate the Epoch of Reionization (EoR). pyC2Ray includes a new raytracing method, ASORA, developed for GPUs, and provides a Python interface for customizable use of the code. The core features of C2-Ray, written in Fortran90, are wrapped using f2py as a Python extension module, while the raytracing library ASORA is implemented in C++ using CUDA. Both are native Python C-extensions and can be directly accessed from any Python script.

[ascl:2312.026] CloudFlex: Small-scale structure observational signatures modeling

CloudFlex models observational signatures associated with the small-scale structure of the circumgalactic medium. It populates cool gas structures in the CGM as a complex of cloudlets using a Monte Carlo method. Various parameters can be set to describe the structure of the cloudlet complexes, including cloudlet mass, density, velocity, and size. Functionality exists for generating the observational signatures of sightlines piercing these cloudlet complexes, borrowing heavily from the Trident code (ascl:1612.019).

[ascl:2312.027] galclaim: GALaxy Chance of Local Alignment algorIthM

galclaim identifies associations between astrophysical transient sources and host galaxies. This association is made by estimating the chance alignment between a given transient sky localization and nearby galaxies. The code can be used with various catalogs, including Pan-STARRS, HSC, AllWISE, and GLADE. galclaim also pre-checks for nearby bright galaxies using the RC3 catalog (https://heasarc.gsfc.nasa.gov/w3browse/all/rc3.html). When a nearby galaxy is found, a warning is raised and the properties of the galaxy are saved in a dedicated output file. The package can create plots displaying the computed p-values for the found objects for each transient and each catalog; plots are stored in the result/plots directory.

[ascl:2312.028] SAGE: Stellar Activity Grid for Exoplanets

SAGE corrects the time-dependent impact of stellar activity on transmission spectra. It uses a pixelation approach to model the stellar surface with spots and faculae, while accounting for limb-darkening and rotational line-broadening. The code can be used to evaluate stellar contamination for F to M-type hosts, test various spot sizes and locations, and quantify the impact of limb-darkening. SAGE can also retrieve the properties and distribution of active regions on the stellar surface from photometric monitoring, and connect the photometric variability to the stellar contamination of transmission spectra.

[ascl:2312.029] RRLFE: Metallicity calibrations for RR Lyrae variable stars

RRLFE generates and applies calibrations for retrieving [Fe/H] from low-res spectra of RR Lyrae variable stars. The code can generate a metallicity calibration anew, from real or synthetic spectra; it can also apply a metallicity calibration to low-resolution (R ~2000) RR Lyrae spectra spanning 3911 to 4950 angstroms.

[ascl:2312.030] matvis: Fast matrix-based visibility simulator
Kittiwisit, Piyanat; Murray, Steven G.; Garsden, Hugh; Bull, Philip; Cain, Christopher; Parsons, Aaron R.; Sipple, Jackson; Abdurashidova, Zara; Adams, Tyrone; Aguirre, James E.; Alexander, Paul; Ali, Zaki S.; Baartman, Rushelle; Balfour, Yanga; Beardsley, Adam P.; Berkhout, Lindsay M.; Bernardi, Gianni; Billings, Tashalee S.; Bowman, Judd D.; Bradley, Richard F.; Burba, Jacob; Carey, Steven; Carilli, Chris L.; Chen, Kai-Feng; Cheng, Carina; Choudhuri, Samir; DeBoer, David R.; de Lera Acedo, Eloy; Dexter, Matt; Dillon, Joshua S.; Dynes, Scott; Eksteen, Nico; Ely, John; Ewall-Wice, Aaron; Fagnoni, Nicolas; Fritz, Randall; Furlanetto, Steven R.; Gale-Sides, Kingsley; Gehlot, Bharat Kumar; Ghosh, Abhik; Glendenning, Brian; Gorce, Adelie; Gorthi, Deepthi; Greig, Bradley; Grobbelaar, Jasper; Halday, Ziyaad; Hazelton, Bryna J.; Hewitt, Jacqueline N.; Hickish, Jack; Huang, Tian; Jacobs, Daniel C.; Josaitis, Alec; Julius, Austin; Kariseb, MacCalvin; Kern, Nicholas S.; Kerrigan, Joshua; Kim, Honggeun; Kohn, Saul A.; Kolopanis, Matthew; Lanman, Adam; La Plante, Paul; Liu, Adrian; Loots, Anita; Ma, Yin-Zhe; MacMahon, David H. E.; Malan, Lourence; Malgas, Cresshim; Malgas, Keith; Marero, Bradley; Martinot, Zachary E.; Mesinger, Andrei; Molewa, Mathakane; Morales, Miguel F.; Mosiane, Tshegofalang; Neben, Abraham R.; Nikolic, Bojan; Devi Nunhokee, Chuneeta; Nuwegeld, Hans; Pascua, Robert; Patra, Nipanjana; Pieterse, Samantha; Qin, Yuxiang; Rath, Eleanor; Razavi-Ghods, Nima; Riley, Daniel; Robnett, James; Rosie, Kathryn; Santos, Mario G.; Sims, Peter; Singh, Saurabh; Storer, Dara; Swarts, Hilton; Tan, Jianrong; Thyagarajan, Nithyanandan; van Wyngaarden, Pieter; Williams, Peter K. G.; Xu, Zhilei; Zheng, Haoxuan

matvis simulates radio interferometric visibilities at the necessary scale with both CPU and GPU implementations. It is matrix-based and applicable to wide field-of-view instruments such as the Hydrogen Epoch of Reionization Array (HERA) and the Square Kilometre Array (SKA), as it does not make any approximations of the visibility integral (such as the flat-sky approximation). The only approximation made is that the sky is a collection of point sources, which is valid for sky models that intrinsically consist of point sources, but is an approximation for diffuse sky models. The matvis matrix-based algorithm is fast and scales well to large numbers of antennas. The code supports both CPU and GPU implementations as drop-in replacements for each other and also supports both dense and sparse sky models.
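
The un-approximated point-source sum can be phrased as a matrix product, which is what makes a matrix-based formulation fast; the sketch below (with assumed argument names and no primary-beam factors, which matvis does include) illustrates the idea.

import numpy as np

def point_source_visibilities(antpos, source_dirs, fluxes, freq_hz):
    # antpos: (n_ant, 3) antenna positions [m]; source_dirs: (n_src, 3) unit vectors
    # towards the sources; fluxes: (n_src,) [Jy].  V = Z diag(S) Z^H, where Z holds
    # the per-antenna geometric phases.
    c = 299792458.0
    phases = np.exp(-2j * np.pi * freq_hz / c * (antpos @ source_dirs.T))  # (n_ant, n_src)
    return (phases * fluxes) @ phases.conj().T                             # (n_ant, n_ant)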

[ascl:2312.031] AM3: Astrophysical Multi-Messenger Modeling

AM3 simulates lepto-hadronic interactions in astrophysical environments. It solves the time-dependent partial differential equations for the energy spectra of electrons, positrons, protons, neutrons, photons, and neutrinos, as well as charged secondaries (pions and muons), immersed in an isotropic magnetic field. The emission of photons and charged secondaries in electromagnetic and hadronic interactions feeds back into the interaction rates in a time-dependent manner, therefore capturing non-linear effects including electromagnetic cascades. AM3 is computationally efficient, making it possible to scan vast source parameter spaces and fit observational data, and has been deployed to explain multi-wavelength observations of blazars, gamma-ray bursts, and tidal disruption events.

[ascl:2312.032] gaia_tools: Tools for working with Gaia and related data sets

gaia_tools contains codes for working with the ESA/Gaia data and related data sets (APOGEE, GALAH, LAMOST DR2, and RAVE). Written in Python, it includes tools to read catalogs, perform cross-matching, read RVS or XP spectra, and query the Gaia archive. gaia_tools also contains various matching recipes, such as matching APOGEE or APOGEE-RC to Gaia DR2, and RAVE to TGAS (taking into account the epoch difference).

[ascl:2312.033] RADIS: Fast line-by-line code for high-resolution infrared molecular spectra

RADIS resolves spectra with millions of lines within seconds on a single CPU and can be GPU-accelerated. It supports HITRAN, HITEMP, and ExoMol out of the box (with auto-download), and is therefore particularly suitable for computing cross-sections or transmission spectra at high temperature. RADIS includes equilibrium calculations for all species, and non-LTE calculations for CO2 and CO.

[ascl:2312.035] SubGen: Fast subhalo sampler

SubGen generates Monte-Carlo samples of dark matter subhaloes. It fully describes the joint distribution of subhaloes in final mass, infall mass, and radius; it can be used to predict derived distributions involving combinations of these quantities, including the universal subhalo mass function, the subhalo spatial distribution, the gravitational lensing profile, the dark matter annihilation radiation profile and boost factor. SubGen works only for CDM subhaloes; for an extension of the code to also work with WDM subhaloes, see SubGen2 (ascl:2312.036).

[ascl:2312.036] SubGen2: Subhalo population generator

The SubGen2 subhalo population generator works for both CDM and WDM of arbitrary DM particle mass. It can be used to generate a population of subhaloes according to the joint distribution of subhalo bound mass, infall mass and halo-centric distance in a halo of a given mass. SubGen2 is an extension to SubGen (ascl:2312.035), which works only for CDM subhaloes.

[submitted] NE2001p: A Native Python Implementation of the NE2001 Galactic Electron Density Model

NE2001p is a fully Python-based implementation of the NE2001 Galactic electron density model. NE2001p forward models the dispersion and scattering of compact radio sources, including pulsars, fast radio bursts, AGNs, and masers, and predicts the distances of radio sources that lack independent distance measures.

[submitted] BSAVI: Bayesian Sample Visualizer for Cosmological Likelihoods

BSAVI (Bayesian Sample Visualizer) is a tool to aid likelihood analysis of model parameters where samples from a distribution in the parameter space are used as inputs to calculate a given observable. For example, selecting a range of samples will allow you to easily see how the observables change as you traverse the sample distribution. At the core of BSAVI is the Observable object, which contains the data for a given observable and instructions for plotting it. It is modular, so you can write your own function that takes the parameter values as inputs, and BSAVI will use it to compute observables on the fly. It also accepts tabular data, so if you have pre-computed observables, simply import them alongside the dataset containing the sample distribution to start visualizing.

[ascl:2401.001] tomso: TOols for Models of Stars and their Oscillations

tomso loads and saves input and output files for and from stellar evolution and oscillation codes. The functions are bundled together in modules that correspond with a specific stellar evolution code, stellar oscillation code, or file format. tomso supports the FGONG format and various input/output files for ADIPLS (ascl:1109.002), GYRE (ascl:1308.010), MESA (ascl:1010.083), and STARS (ascl:1107.008). tomso's main purpose is to provide a compact interface for manipulating input and output data in these formats and simplify research that uses them.

[ascl:2401.002] Rayleigh: Pseudo-spectral MHD

The 3-D convection code Rayleigh enables study of dynamo behavior in spherical geometry. It evolves the incompressible and anelastic MHD equations in spherical geometry using a pseudo-spectral approach. Rayleigh employs spherical harmonics in the horizontal direction and Chebyshev polynomials in the radial direction and has undergone extensive accuracy testing.

[ascl:2401.003] LUNA: Forward model luna simulator

LUNA generates dynamically accurate lightcurves from a planet-moon pair, analytically accounting for shadow overlaps, stellar limb darkening, and planet-moon dynamical motion. The code takes transit timing/duration variations and ingress/egress asymmetries into consideration not only for the planet, but also the moon. LUNA was designed to be analytical and dynamical and to incorporate limb darkening (including non-linear laws) and account for all orbital elements, including eccentricity and longitude of the ascending node. Because the software is precise and analytic, LUNA is a highly potent tool for exomoon detection.

[ascl:2401.004] pyPETaL: A Pipeline for Estimating AGN Time Lags

pyPETAL produces cross-correlation functions, discrete correlation functions, and mean time lags from multi-band AGN time-series data, combining multiple different codes (including pyCCF (ascl:1805.032), pyZDCF, PyROA (ascl:2107.012), and JAVELIN (ascl:1010.007)) used for active galactic nuclei (AGN) reverberation mapping (RM) analysis into a unified pipeline. This pipeline also implements outlier rejection using Damped Random Walk Gaussian process fitting, and detrending through the LinMix algorithm. pyPETAL implements a weighting scheme for all lag-producing modules, mitigating aliasing in peaks of time lag distributions between light curves. pyPETAL scales to any combination of internal code modules, supporting a variety of computational workflows.

[ascl:2401.005] CosmosCanvas: Useful color maps for different astrophysical properties

CosmosCanvas creates perception-based color maps for different astrophysical properties such as spectral index and velocity fields. Three tutorials demonstrate how to use python code to exploit and adjust the boundaries in these divergent colour schemes. Intended to work with human physiology, each tutorial offers at least one default scheme that is monotonic in value both as a redundancy for supporting data information and an aid for colour blind viewers. This library relies on Gilles Ferrand's colourspace library.

[ascl:2401.006] LoSoTo: LOFAR solutions tool

LoSoTo (LOFAR Solution Tool) performs a variety of operations on H5parm data, which is based on the HDF5 format; for example, it isolates direction-independent systematic effects, which can then be transferred to the target field. Subsets of data can be selected for each operation using lists of axes values, regular expressions, or intervals. The LoSoTo package stores solutions in arrays organized in a hierarchical fashion; this provides flexibility and preserves performance. The code can, for example, extract Faraday rotation from RR/LL phase solutions or a rotation matrix, clip solutions around the median, and calculate the ionospheric structure function. LoSoTo includes an outlier flagging procedure, normalizes solutions to a given value, and offers an advanced plotting routine, among many other operations.

[ascl:2401.007] deal.II: Finite element library

deal.II computes solutions to partial differential equations (PDEs) using adaptive finite elements. The code provides an interface for processing PDEs accessible to both laptops and supercomputers, and has been used to investigate the local and global waveform effects of gravitational waves by numerical simulation. deal.II supports massively parallel computing of very large linear systems of equations and provides access to triangulation of various geometries of the simulation domain.

[ascl:2401.008] DARC: Dirac Atomic R-matrix Codes

DARC (Dirac Atomic R-matrix Codes) enables the study of continuum processes for a general atomic system. The suite of programs calculates electron-atom or electron-ion collision cross-sections. In addition, the programs include code for bound-state and photoionization calculations.

[ascl:2401.009] Harmonic: Learnt harmonic mean estimator

harmonic learns an approximate harmonic mean estimator (referred to as a "learnt harmonic mean estimator") from posterior distribution samples to compute the marginal likelihood required for Bayesian model selection. Using a large number of independent Markov chain Monte Carlo (MCMC) chains from another package such as emcee (ascl:1303.002), harmonic uses importance sampling to learn a new target distribution in order to optimize an approximate harmonic estimator while minimizing its variance.
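
The underlying estimator can be summarized in a few lines: over posterior samples θ_i, the reciprocal evidence is estimated as the mean of φ(θ_i)/(L(θ_i)π(θ_i)) for some normalized target density φ. The sketch below assumes pre-computed log arrays and a user-supplied target; what harmonic actually provides is the machinery to learn φ so that the estimator's variance stays under control, which this sketch omits.

import numpy as np

def log_evidence_harmonic(log_like, log_prior, log_target):
    # Re-targeted harmonic mean: 1/z ~ mean[ phi(theta_i) / (L(theta_i) pi(theta_i)) ]
    log_ratio = log_target - log_like - log_prior
    log_inv_z = np.logaddexp.reduce(log_ratio) - np.log(len(log_ratio))  # stable log-mean-exp
    return -log_inv_z   # estimate of the log evidence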

[ascl:2401.010] SYSNet: Neural Network modeling of imaging systematics in galaxy surveys

The feed-forward neural network SYSNet models the relationship between imaging maps, such as stellar density, and the observed galaxy density field in order to mitigate systematic effects and make robust galaxy clustering measurements. The cost function is mean squared error with an L2 regularization term, and the optimization algorithm is Adaptive Moment Estimation (ADAM).

[ascl:2401.011] ostrich: Surrogate modeling using PCA and Gaussian process interpolation

Ostrich emulates surrogate models for complex and expensive functions using Principal Component Analysis (PCA) to decompose a signal, then interpolate the PCA weights over the parameters θ using a Gaussian Process interpolator. The code is trained on samples from the expensive functions, recreating and interpolating between those training samples with reduced computational cost, and recalculating for each use.
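
The PCA-plus-Gaussian-process pattern described above can be sketched with scikit-learn in a few lines; the class name and interface here are illustrative assumptions, not Ostrich's actual API.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.gaussian_process import GaussianProcessRegressor

class PcaGpEmulator:
    # Decompose training signals into principal components and interpolate
    # the component weights over the parameters theta with one GP per component.
    def __init__(self, n_components=5):
        self.pca = PCA(n_components=n_components)
        self.gps = []

    def fit(self, params, signals):            # params: (n, d), signals: (n, m)
        weights = self.pca.fit_transform(signals)
        self.gps = [GaussianProcessRegressor().fit(params, w) for w in weights.T]
        return self

    def predict(self, params):                 # params: (k, d) -> (k, m) signals
        weights = np.column_stack([gp.predict(params) for gp in self.gps])
        return self.pca.inverse_transform(weights)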

[ascl:2401.012] baryon-sweep: Outlier rejection algorithm for JWST/NIRSpec IFS data

baryon-sweep produces a robust outlier rejection while simultaneously preserving the signal of the science target. The code works as a standalone solution or as a supplement to the current pipeline software. baryon-sweep creates the 2D pixel mask and mask layers, processes the sky (non-science target) spaxels, and creates a post-processed cube ready for use.

[ascl:2401.013] SolarKAT: Solar imaging pipeline for MeerKAT

SolarKAT mitigates solar interference in MeerKAT data and recovers the visibilities rather than discarding them; this solar imaging pipeline takes 1GC calibrated data in Measurement Set format as input. Written in Python, the pipeline employs solar tracking, subtraction, and peeling techniques to enhance data quality by significantly reducing solar radio interference. This is achieved while preserving the flux measurements in the main field. SolarKAT is versatile and can be applied to general radio astronomy observations and solar radio astronomy; additionally, generated solar images can be used for weather forecasting. SolarKAT is deployed in Stimela (ascl:2305.007). It is based on existing radio astronomy software, including CASA (ascl:1107.013), breizorro (ascl:2305.009), WSclean (ascl:1408.023), Quartical (ascl:2305.006), and Astropy (ascl:1304.002).

[ascl:2401.014] LoRD: Locate Reconnection Distribution

LoRD (Locate Reconnection Distribution) identifies the locations and structures of 3D magnetic reconnection within discrete magnetic field data. The toolkit contains three main functions; the first, ARD (Analyze Reconnection Distribution) locates the grids undergoing reconnection without null points and also recognizes the local configurations of reconnection sites. ANP (Analyze Null Points) locates and classifies the 3D null points, and APNP (Analyze Projected Null Points) analyzes the 2D neutral points projected on a plane near a cell. LoRD is written in Matlab and the toolkit contains demo scripts.

[ascl:2401.015] maskfill: Fill in masked values in an image

maskfill inward extrapolates edge pixels just outside masked regions, using iterative median filtering and the full information contained in the edge pixels. This provides seamless transitions between masked pixels and good pixels, and allows high fidelity reconstruction of gaps in continuous narrow features. An image and a mask are the only required inputs.
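
A crude version of the inward-extrapolation idea can be written as an iterative median filter, as below; maskfill itself is more careful about using only the information in the edge pixels, so treat this purely as an illustration of the iteration, with assumed function and argument names.

import numpy as np
from scipy.ndimage import generic_filter

def iterative_median_fill(image, mask, max_iter=100):
    # Repeatedly replace masked pixels by the median of their valid 3x3 neighbours;
    # pixels whose neighbourhoods are still entirely masked remain NaN until a later pass.
    filled = np.where(mask, np.nan, image).astype(float)
    for _ in range(max_iter):
        if not np.isnan(filled).any():
            break
        med = generic_filter(filled, np.nanmedian, size=3, mode="nearest")
        filled = np.where(np.isnan(filled), med, filled)
    return filled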

[ascl:2401.016] CRR: Convex Ridge Regularizer

CRR (Convex Ridge Regularizer) takes the gradient of regularizers that are the sum of convex-ridge functions and parameterizes them using a neural network that has a single hidden layer with increasing and learnable activation functions. The neural network is trained within a few minutes as a multistep Gaussian denoiser, and offers improvements for denoising and image reconstruction over other methods with similar reliability.

[ascl:2401.017] QuantifAI: Radio interferometric imaging reconstruction with scalable Bayesian uncertainty quantification

QuantifAI reconstructs radio interferometric images using scalable Bayesian uncertainty quantification relying on data-driven (learned) priors. It relies on the convex accelerated optimization algorithms in CRR (ascl:2401.016) and is built on top of PyTorch. QuantifAI also includes MCMC algorithms for posterior sampling.

[ascl:2401.018] tidalspin: Constrain black hole spins using relativistic tidal forces properties

tidalspin uses a Bayesian approach to infer posterior distributions of a black hole's parameters (mass and spin) in an observed tidal disruption event, given a prior estimate of the black hole’s mass (e.g., from a galactic scaling relationship, or the tidal disruption event’s observed properties). These posterior distributions will only utilize the properties of tidal forces in their inference. tidalspin can be applied to the population of tidal disruption events already discovered.

[ascl:2401.019] StructureFunction: Bayesian estimation of the AGN structure function for Poisson data

StructureFunction determines the X-ray structure function of a population of Active Galactic Nuclei (AGN) for which two-epoch X-ray observations are available, separated by a rest-frame time interval. The calculation of the X-ray structure function is Bayesian. The sampling of the likelihood uses Stan (ascl:1801.003) for statistical modeling and high-performance statistical computation.

[ascl:2401.020] escatter: Electron scattering in Python

escatter.py performs Monte Carlo simulations of electron scattering events. The code was developed to better understand the emission lines from the interacting supernova SN 2021adxl, specifically the blue excess seen in the Hα 6563A emission line. escatter follows a photon that was formed in a thin interface between the supernova ejecta and surrounding material as it travels radially outwards through the dense material, scattering electrons outwards until it reaches an optically thin region, and plots a histogram of the emergent photons.

[ascl:2402.001] NMMA: Nuclear Multi Messenger Astronomy framework

NMMA probes nuclear physics and cosmology with multimessenger analysis. This fully featured, Bayesian multi-messenger pipeline targets joint analyses of gravitational-wave and electromagnetic data (focusing on the optical). Using bilby (ascl:1901.011) as the back-end, the software uses a variety of samplers to sample these data sets. NMMA uses chiral effective field theory based neutron star equations of state when performing inference, and is also capable of estimating the Hubble Constant.

[ascl:2402.002] Rfits: FITS file manipulation in R

Rfits reads and writes FITS images, tables, and headers. Written in R, Rfits works with all types of compressed images, and both ASCII and binary tables. It uses CFITSIO (ascl:1010.001) for all low level FITS IO, so in general should be as fast as other CFITSIO-based software. For images, Rfits offers fully featured memory mapping and on-the-fly subsetting (by pixel and coordinate) and sparse pixel sampling, allowing for efficient inspection of very large (larger than memory) images.

[ascl:2402.003] Rwcs: World coordinate system transforms in R

Rwcs offers access to all the projection and distortion systems of WCSLIB (ascl:1108.003) in R. This can be used directly for, for example, pixel lookups, or for higher level general distortion and projection.

[ascl:2402.004] CCBH-Numerics: Cosmologically-coupled-black-holes formation mass numerics

CCBH-Numerics (previously called CCBH-PLPP) computes the probability of the existence of a single cosmologically coupled black hole (BH) with a formation mass below a specified threshold for given observational data of binary black holes (BBHs) from gravitational waves. The code uses the unbiased population of BBHs, as given by the power-law-plus-peak (PLPP) profile, as the observational input, and assumes that the detected BBHs are formed from stellar evolution, not primordial BHs. CCBH-Numerics also works with individual data from BBHs and for NSBH pairs as well.

[ascl:2402.005] MGPT: Modified Gravity Perturbation Theory code

MGPT (Modified Gravity Perturbation Theory) computes 2-point statistics for the LCDM model, DGP, and Hu-Sawicki f(R) gravity. Written in C, the code can be easily modified to include other models. Specifically, it computes the SPT matter power spectrum, the SPT Lagrangian-biased tracers power spectrum, and the CLPT matter correlation function. MGPT also computes the CLPT Lagrangian-biased tracers correlation function and a set of Q and R functions from which other statistics, such as the leading-order bispectrum, can be constructed.

[submitted] GalMOSS: A package for GPU-accelerated Galaxy Profile Fitting

We introduce GalMOSS, a Python-based, Torch-powered tool for two-dimensional fitting of galaxy profiles. By seamlessly enabling GPU parallelization, GalMOSS meets the high computational demands of large-scale galaxy surveys, placing galaxy profile fitting in the LSST era. It incorporates widely used profiles such as the Sérsic, exponential disk, Ferrer, King, Gaussian, and Moffat profiles, and allows for the easy integration of more complex models. Tested on 8,289 galaxies from the Sloan Digital Sky Survey (SDSS) g-band with a single NVIDIA A100 GPU, GalMOSS completed classical Sérsic profile fitting in about 10 minutes. Benchmark tests show that GalMOSS achieves computational speeds about six times faster than those of default implementations.

[submitted] TAT: Timing Analysis Toolkit for high-energy pulsar astrophysics

The TAT-pulsar (Timing Analysis Toolkit for Pulsars) package is a specialized toolkit designed for handling the scientific intricacies of pulsar timing. It provides a suite of Python-based utilities and scripts that facilitate the analysis, processing, and visualization of pulsar data. By leveraging observational data from pulsars, along with the associated physical processes and statistical characteristics, TAT-pulsar integrates a series of useful tools and data analysis scripts specifically developed for both isolated pulsars and binary systems. This enables swift analysis and the detailed presentation of timing properties in the high-energy pulsar field. Developed and implemented completely independently from other pulsar timing software such as Stingray (ascl:1608.001) and PINT (ascl:1902.007), TAT-pulsar serves as a valuable cross-checking and supplementary tool for data analysis.

[ascl:2402.006] polarizationtools: Polarization analysis and simulation tools in python

polarizationtools converts, analyzes, and simulates polarization data. The different python scripts (1) convert Stokes parameters into linear polarization parameters with proper treatment of the uncertainties and vice versa; (2) shift electric vector position angle (EVPA) data points in time series to account for the 180 degrees ambiguity; (3) identify rotations of the EVPA e.g. in blazar polarization monitoring data according to various rotation definitions; and (4) simulate polarization time series as a random walk in the Stokes Q-U plane.
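
The first of the conversions listed above amounts to the standard relations p = sqrt(Q^2 + U^2)/I and EVPA = 0.5·arctan2(U, Q); a minimal sketch (with assumed function and argument names, and without the uncertainty propagation that the package handles) is:

import numpy as np

def stokes_to_linear_pol(I, Q, U):
    # Linear polarization fraction and electric vector position angle [degrees]
    p = np.hypot(Q, U) / I
    evpa = 0.5 * np.degrees(np.arctan2(U, Q))
    return p, evpa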

[ascl:2402.007] ECLIPSR: Automatically find individual eclipses in light curves, determine ephemerides, and more

ECLIPSR fully and automatically analyzes space-based light curves to find eclipsing binaries and provide some first order measurements, such as the binary star period and eclipse depths. It provides a recipe to find individual eclipses using the time derivatives of the light curves, including eclipses in light curves of stars where the dominating variability is, for example, pulsations. Since the algorithm detects each eclipse individually, even light curves containing only one eclipse can (in principle) be successfully analyzed and classified. ECLIPSR can find eclipsing binaries among both pulsating and non-pulsating stars in a homogeneous and quick manner and can process large numbers of light curves in a reasonable amount of time. The output includes, among other things, the individual eclipse markers, the period and time of first (primary) eclipse, and a score between 0 and 1 indicating the likelihood that the analyzed light curve is that of an eclipsing binary.

[ascl:2402.008] star_shadow: Analyze eclipsing binary light curves, find eccentricity, and more

star_shadow automatically analyzes space-based light curves of eclipsing binaries and provides a measurement of eccentricity, among other parameters. It measures the timings of eclipses using the time derivatives of the light curves, using a model of orbital harmonics obtained from an initial iterative prewhitening of sinusoids. Since the algorithm extracts the harmonics from the rest of the sinusoidal variability, eclipse timings can be measured even in the presence of other (astrophysical) signals, thus determining the orbital eccentricity automatically from the light curve along with information about the other variability present. The output includes, but is not limited to, a sinusoid-plus-linear model of the light curve, the orbital period, the eccentricity, the argument of periastron, and the inclination.

[ascl:2402.009] SkyLine: Generate mock line-intensity maps

SkyLine generates mock line-intensity maps (both in 3D and 2D) in a lightcone from a halo catalog, accounting for the evolution of clustering and astrophysical properties, and observational effects such as spectral and angular resolutions, line-interlopers, and galactic foregrounds. Using a given astrophysical model for the luminosity of each line, the code paints the signal for each emitter and generates the map, adding coherently all contributions of interest. In addition, SkyLine can generate maps with the distribution of Luminous Red Galaxies and Emitting Line Galaxies.

[ascl:2402.010] 2cosmos: Monte Python modification for two independent instances of CLASS

2cosmos is a modification of Monte Python (ascl:1307.002) and allows the user to write likelihood modules that can request two independent instances of CLASS (ascl:1106.020) and separate dictionaries and structures for all cosmological and nuisance parameters. The intention is to be able to evaluate two independent cosmological calculations and their respective parameters within the same likelihood. This is useful for evaluating a likelihood using correlated datasets (e.g. mutually exclusive subsets of the same dataset for which one wants to take into account all correlations between the subsets).

[submitted] cbeam: a coupled-mode propagator for slowly-varying waveguides

cbeam is a Python/Julia package which models the propagation of guided light through slowly-varying few-mode waveguides using the coupled-mode theory (CMT). When compared with more general numerical methods for waveguide simulation, such as the finite-differences beam propagation method (FD-BPM), numerical implementations of the CMT can be much more computationally efficient. cbeam also provides a Pythonic class structure to define waveguides, with simple classes for directional couplers and photonic lanterns already provided. Finally, cbeam doubles as a finite-element eigenmode solver.

[submitted] KCWIKit: KCWI Post-Processing and Improvements

KCWIKit extends the official KCWI DRP (ascl:2301.019) with a variety of stacking tools and DRP improvements. The software offers masking and median filtering scripts to be used while running the KCWI DRP, and a step-by-step KCWI_DRP implementation for finer control over the reduction process. Once the DRP has finished, KCWIKit can be used to stack the output cubes via the Montage package. Various functions cross-correlate and mosaic the constituent cubes and the final stacked cubes are WCS corrected. Helper functions can then be used to deproject the stacked cube into lower-dimensional representations should the user desire.

[submitted] BTSbot: Automated Identification and Follow-up of Bright Transients with Deep Learning

BTSbot is a multi-modal convolutional neural network designed for real-time identification of bright extragalactic transients in Zwicky Transient Facility (ZTF) data. BTSbot provides a bright transient score for individual ZTF detections using their image data and 25 extracted features. BTSbot eliminates the need for daily visual inspection of new transients by automatically identifying and requesting spectroscopic follow-up observations of new bright transient candidates. BTSbot recovers all bright transients in our test split and performs on par with human experts in terms of identification speed (on average, ∼1 hour quicker than scanners).

[submitted] Poke: An open-source ray-based physical optics platform

Integrated optical models allow for accurate prediction of the as-built performance of an optical instrument. Optical models are typically composed of a separate ray trace and diffraction model to capture both the geometrical and physical regimes of light, and these models are typically spread across open-source and commercial software packages that do not interface with each other directly. To bridge the gap between ray trace models and diffraction models, we have built an open-source optical analysis platform in Python called Poke that uses commercial ray tracing APIs and open-source physical optics engines to simultaneously model scalar wavefront error, diffraction, and polarization. Poke operates by storing ray data from a commercial ray tracing engine in a Python object, from which physical optics calculations can be made. We present an introduction to using Poke and highlight the capabilities of two new propagation physics modules that add to the utility of existing scalar diffraction models. Gaussian Beamlet Decomposition is a ray-based approach to diffraction modeling that allows us to integrate physical optics models with ray trace models to directly capture the influence of ray aberrations in diffraction simulations. Polarization Ray Tracing is a ray-based method of vector field propagation that can diagnose the polarization aberrations in optical systems.
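
One small building block of the Polarization Ray Tracing mentioned above is the computation of Fresnel coefficients at each surface. The sketch below is not Poke's interface; it evaluates the s- and p-polarized reflection coefficients and the resulting diattenuation for a single fold mirror with an assumed complex refractive index.

import numpy as np

def fresnel_reflection(n, theta_i):
    """s- and p-polarized amplitude reflection coefficients (air -> medium n)."""
    cos_i = np.cos(theta_i)
    cos_t = np.sqrt(1 - (np.sin(theta_i) / n) ** 2 + 0j)
    r_s = (cos_i - n * cos_t) / (cos_i + n * cos_t)
    r_p = (n * cos_i - cos_t) / (n * cos_i + cos_t)
    return r_s, r_p

n_al = 1.4 + 7.6j                  # assumed approximate index of aluminum near 633 nm
theta = np.deg2rad(45.0)           # fold-mirror incidence angle
r_s, r_p = fresnel_reflection(n_al, theta)

# Diattenuation quantifies the polarization aberration introduced by the surface
diattenuation = (abs(r_s) ** 2 - abs(r_p) ** 2) / (abs(r_s) ** 2 + abs(r_p) ** 2)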

[submitted] fkpt: code to compute LCDM and modified gravity perturbation theory using fk-kernels

fkpt computes the one-loop redshift-space power spectrum for tracers in LCDM and modified gravity theories using "fk" kernels. Hu-Sawicki f(R) is the only modified gravity model implemented so far, but extending the code to other models is straightforward.
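
For context only, the sketch below evaluates the tree-level (Kaiser) limit of the redshift-space tracer power spectrum; fkpt computes the full one-loop spectrum with fk kernels, so this is just the leading-order ingredient, and the bias, growth rate, and toy linear spectrum are assumptions.

import numpy as np

b, f = 1.8, 0.78                       # assumed linear bias and growth rate
k = np.logspace(-2, 0, 50)             # wavenumbers [h/Mpc]
p_lin = 2e4 * k / (1 + (k / 0.02) ** 2) ** 1.4   # toy linear P(k), not a real model

def p_redshift_space(k, mu, p_lin, b, f):
    """Kaiser formula: tree-level redshift-space power spectrum of a tracer."""
    return (b + f * mu ** 2) ** 2 * p_lin

p_s_mu0 = p_redshift_space(k, 0.0, p_lin, b, f)   # transverse to the line of sight
p_s_mu1 = p_redshift_space(k, 1.0, p_lin, b, f)   # along the line of sight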

[submitted] MINDS: a hybrid pipeline for the reduction of JWST/MIRI-MRS data

The MINDS package contains a hybrid pipeline for the reduction of JWST MIRI-MRS data, based on the jwst pipeline and routines from the VIP package. The pipeline was developed by the MINDS (MIRI mid-INfrared Disk Survey) GTO team to compensate for some of the known weaknesses of the official jwst pipeline and to improve the quality of the spectra extracted from MIRI-MRS data; it does so by leveraging the capabilities of VIP, a large data reduction package used in the field of high-contrast imaging.
The front end of the pipeline is a highly automated Jupyter notebook. Parameters are typically set in one cell at the beginning of the notebook, and the rest of the notebook can be run without further modification. The Jupyter notebook format provides flexibility, enhanced visibility of intermediate and final results, more straightforward troubleshooting, and the possibility for users to easily incorporate additional code to further analyze or exploit their results.
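
A hypothetical example of the kind of parameter cell placed at the top of such a notebook; the variable names and paths below are illustrative and do not reflect the actual MINDS configuration.

from pathlib import Path

# --- user settings (edit this cell only) ---
raw_dir      = Path("/data/jwst/miri_mrs/uncal")    # input _uncal.fits files
output_dir   = Path("/data/jwst/miri_mrs/reduced")  # where products are written
bands        = ["1A", "1B", "1C"]                   # MRS sub-bands to reduce
do_bkg_sub   = True                                 # apply background subtraction
extract_src  = True                                 # extract the point-source spectrum
# -------------------------------------------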

[ascl:2403.001] Pynkowski: Minkowski functionals and other higher order statistics

Pynkowski computes Minkowski Functionals and other higher order statistics of input fields, as well as their expected values for different kinds of fields. This package supports Minkowski functionals, and maxima and minima distributions. Supported input formats include scalar HEALPix maps such as those used by healpy (ascl:2008.022) and polarization HEALPix maps in the SO(3) formalism. Pynkowski also supports various theoretical fields, including Gaussian (e.g., CMB Temperature or the initial density field), Chi squared (e.g., CMB polarization intensity), and spin 2 maps in the SO(3) formalism.
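
As a generic illustration (not Pynkowski's API), the first Minkowski functional V0, the area fraction above a threshold, can be measured on a Gaussian HEALPix map with healpy and compared with its Gaussian expectation V0(ν) = erfc(ν/√2)/2; the map resolution and power spectrum below are arbitrary choices.

import numpy as np
import healpy as hp
from scipy.special import erfc

nside = 128
cl = 1.0 / (np.arange(1, 3 * nside) ** 2.5)        # toy angular power spectrum
gaussian_map = hp.synfast(np.concatenate([[0.0], cl]), nside)
normalized = (gaussian_map - gaussian_map.mean()) / gaussian_map.std()

thresholds = np.linspace(-3, 3, 13)
v0_measured = [(normalized > nu).mean() for nu in thresholds]
v0_expected = erfc(thresholds / np.sqrt(2)) / 2    # Gaussian-field prediction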

[ascl:2403.002] DistClassiPy: Distance-based light curve classification

DistClassiPy uses different distance metrics to classify objects such as light curves. It provides state-of-the-art performance for time-domain astronomy, and offers lower computational requirements and improved interpretability over traditional methods such as Random Forests, making it suitable for large datasets. DistClassiPy allows fine-tuning based on scientific objectives by selecting appropriate distance metrics and features, which enhances its performance and improves classification interpretability.
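
The general idea of distance-based classification can be sketched with a nearest-centroid scheme and an interchangeable SciPy distance metric; this illustrates the concept only and is not DistClassiPy's actual interface, and the mock features and class names are invented for the example.

import numpy as np
from scipy.spatial.distance import cdist

def fit_centroids(features, labels):
    """Compute one centroid (median feature vector) per class."""
    classes = np.unique(labels)
    centroids = np.array([np.median(features[labels == c], axis=0) for c in classes])
    return classes, centroids

def predict(features, classes, centroids, metric="canberra"):
    """Assign each object to the class whose centroid is nearest under `metric`."""
    distances = cdist(features, centroids, metric=metric)
    return classes[np.argmin(distances, axis=1)]

# Usage with mock light-curve features (e.g., amplitude, period, skewness):
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3)) + np.repeat([[0, 0, 0], [2, 2, 2]], 50, axis=0)
y = np.repeat(["RRLyr", "EB"], 50)
classes, centroids = fit_centroids(X, y)
print(predict(X[:5], classes, centroids, metric="euclidean"))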

[ascl:2403.003] kinematic_scaleheight: Infer the vertical distribution of clouds in the solar neighborhood

kinematic_scaleheight kinematically estimates the vertical distribution of clouds in the Galactic plane using three methods: the least squares analysis of Crovisier (1978), an updated least squares analysis that uses a modern Galactic rotation model, and a Bayesian model sampled via MCMC as described in Wenger et al. (2024).
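
The Bayesian/MCMC approach can be sketched in a simplified form: inferring an exponential scale height from mock cloud heights with emcee. This is not the code's actual model, which works with kinematic distances and a Galactic rotation model; the mock data and flat prior bounds are assumptions.

import numpy as np
import emcee

rng = np.random.default_rng(42)
z_obs = rng.exponential(scale=100.0, size=200) * rng.choice([-1, 1], 200)  # mock heights [pc]

def log_prob(theta, z):
    """Log-posterior for p(z) = exp(-|z|/H) / (2H) with a flat prior on H."""
    (H,) = theta
    if not 1.0 < H < 1000.0:
        return -np.inf
    return np.sum(-np.abs(z) / H - np.log(2.0 * H))

ndim, nwalkers = 1, 16
p0 = 100.0 + rng.normal(0, 5, (nwalkers, ndim))
sampler = emcee.EnsembleSampler(nwalkers, ndim, log_prob, args=(z_obs,))
sampler.run_mcmc(p0, 2000, progress=False)
H_samples = sampler.get_chain(discard=500, flat=True)[:, 0]
print(f"H = {np.median(H_samples):.0f} +/- {np.std(H_samples):.0f} pc")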
