ASCL.net

Astrophysics Source Code Library

Making codes discoverable since 1999

Browsing Codes

Results 1-250 of 3643 (3551 ASCL, 92 submitted)

[submitted] Spectroscopic Analysis of O and B-Type Stars, Neutron Stars, and White Dwarfs Using SDSS Data and Astroquery

This project presents a comprehensive spectroscopic analysis of O and B-type stars, neutron stars, and white dwarfs, with a focus on the detection of helium (He) and oxygen (O) in stellar atmospheres. By leveraging data from the Sloan Digital Sky Survey (SDSS) and utilizing tools such as Astropy, Astroquery, and Specutils, the project aims to identify key spectral lines of helium and oxygen, as well as the formation of heliox (OHe) molecules. The methodology involves querying SDSS for relevant spectral data, filtering and analyzing it based on stellar classification, and visualizing the results using advanced techniques. The findings contribute to the understanding of stellar evolution, chemical processes, and the role of these elements in various stellar classes. Additionally, the project incorporates interactive data exploration with Aladin Lite and Simbad, offering a robust framework for future astrophysical research.
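
The SDSS query step described above can be sketched with astroquery's SDSS module; a minimal example (the target coordinates and search radius are placeholders, not taken from the project):

    # Minimal sketch of an SDSS spectral query with astroquery (coordinates are placeholders).
    from astropy import coordinates as coords
    import astropy.units as u
    from astroquery.sdss import SDSS

    pos = coords.SkyCoord("02h08m13.6s +09d45m10s", frame="icrs")        # hypothetical target
    matches = SDSS.query_region(pos, radius=5 * u.arcsec, spectro=True)  # objects with SDSS spectra
    spectra = SDSS.get_spectra(matches=matches)                          # list of FITS HDULists

    # Each HDUList holds the coadded spectrum in HDU 1 (flux vs. log10 wavelength).
    flux = spectra[0][1].data["flux"]
    loglam = spectra[0][1].data["loglam"]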

[submitted] Analysis and Super-Resolution of Astronomical Data from FITS Files of NGC 0628

This notebook provides a comprehensive approach for analyzing and visualizing astronomical data from FITS (Flexible Image Transport System) files, focusing on moment maps derived from molecular line emissions within the galaxy NGC 0628. The analysis involves applying various image processing techniques to handle corrupted pixels, reconstruct images, and enhance the quality of moment maps. The notebook also demonstrates how to simulate super-resolution to improve the spatial resolution of the data. By utilizing Gaussian filtering, median filtering, and contrast enhancement, the approach improves the clarity and precision of the data, making it suitable for detailed astrophysical studies. This tool serves as an efficient method for processing and visualizing large-scale astronomical datasets for further analysis and scientific interpretation.
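
The filtering steps described above can be sketched with astropy and scipy; this is a concept illustration, not the notebook's own code, and the file name is hypothetical:

    # Concept sketch: clean a moment map from a FITS file with median and Gaussian
    # filtering, then upsample it (a simple stand-in for super-resolution).
    import numpy as np
    from astropy.io import fits
    from scipy import ndimage

    with fits.open("ngc0628_mom0.fits") as hdul:       # hypothetical file name
        mom0 = hdul[0].data.astype(float)

    mom0 = np.nan_to_num(mom0)                          # replace corrupted/blank pixels
    mom0 = ndimage.median_filter(mom0, size=3)          # suppress isolated bad pixels
    smooth = ndimage.gaussian_filter(mom0, sigma=1.5)   # mild smoothing before upsampling
    hires = ndimage.zoom(smooth, 2, order=3)            # cubic-interpolation upsampling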

[ascl:2411.030] NEMESISPY: Modeling exoplanet spectra

NEMESISPY infers the atmospheric properties of exoplanets, such as chemical composition, using spectroscopic data. The package calculates radiative transfer using the correlated-k approximation and provides tools for parametric atmospheric modelling. NEMESISPY is a Python implementation of the well-established Fortran NEMESIS library (ascl:2210.009), which has been applied to atmospheric retrievals of both solar system planets and exoplanets employing numerous different observing geometries.

[ascl:2411.029] IcyDwarf: Coupled geophysical-geochemical-orbital evolution model of icy worlds

IcyDwarf calculates the coupled physical-chemical evolution of an icy dwarf planet or moon. The code calculates the thermal evolution of an icy planetary body (moon or dwarf planet), with no chemistry, but with rock hydration, dehydration, hydrothermal circulation, core cracking, tidal heating, and porosity; the depth of cracking and a bulk water:rock ratio by mass in the rocky core are also computed. It also calculates whether cryovolcanism is possible by the exsolution of volatiles from cryolavas. IcyDwarf also determines the equilibrium fluid and rock chemistries resulting from water-rock interaction in subsurface oceans in contact with a rocky core, up to 200°C and 1000 bar.

[ascl:2411.028] SMINT: Structure Model INTerpolator

SMINT (Structure Model INTerpolator) provides a user-friendly interface for obtaining posterior distributions on the H/He or H2O mass fraction of a planet. The parameters of the planet of interest are input along with specifications of the priors to be used. SMINT returns publication-ready plots presenting the joint parameter constraints obtained from interpolating the interior model grid of interest, as well as confidence intervals for each parameter.

[ascl:2411.027] DarkMatters: Multi-frequency emissions from Dark Matter annihilation and decay

DarkMatters calculates multi-frequency and multi-messenger emissions from WIMP annihilation and decay. This can be done both for standard channels and custom models, with the ability to produce surface brightnesses and integrated fluxes as well as maps in FITS format to compare to actual data. DarkMatters uses an accelerated ADI solver, similar to that of GALPROP (ascl:1010.028), for electron diffusion with an innovative sparse matrix approach. Additionally, there is the option to use a Green's function approximate solution (implemented in both C++ and Python).

[ascl:2411.026] DustPOL-py: Numerical modeling of dust polarization

The numerical modeling code DustPOL-py calculates the multi-wavelength polarization degree of absorption and thermal dust emission based on Radiative Torque alignment (RAT-A), Magnetically enhanced RAT (MRAT) and Radiative Torque Disruption (RAT-D). The code saves the output files (wavelength and degree of polarization) for further analysis and is idealized for the diffuse ISM, molecular clouds, and star-forming regions; it also predicts the polarization spectrum for one or two dust layers. A web-interface GUI for DustPOL-py is also available.

[ascl:2411.025] DAMSPI: DArk Matter SPIkes in EAGLE simulations

DArk Matter SPIkes (DAMSPI) analyzes dark matter spikes around Intermediate Mass Black Holes (IMBHs) in the Milky Way. It extracts an IMBH catalog with the corresponding dark matter spike parameters from EAGLE simulations to probe a potential gamma-ray signal from dark matter self-annihilation. The catalog includes, among others, the coordinates, mass, formation redshift, and spike parameters for each individual IMBH.

[ascl:2411.024] jaxspec: X-ray spectra Bayesian analysis

jaxspec performs statistical inference on X-ray spectra. It loads an X-ray spectrum (in the OGIP standard), defines a spectral model from the implemented components, and calculates the best parameters using state-of-the-art Bayesian approaches. The code is built on top of JAX (ascl:2111.002) to provide just-in-time compilation and automatic differentiation of the spectral models, enabling the use of sampling algorithms. jaxspec is written in pure Python and is not dependent on HEASoft (ascl:1408.004).

[ascl:2411.023] mochi_class: Modelling Optimization to Compute Horndeski in CLASS

mochi_class extends the hi_class code (ascl:1808.010), itself a patch to the Einstein-Boltzmann solver CLASS (ascl:1106.020). It replaces the α-functions with a stable basis to ensure stability and takes general functions of time as input, including the dark energy equation of state or its normalized background energy density. mochi_class provides a stability test that checks for mathematical (classical) instabilities in the scalar field fluctuations, and also includes a GR approximation scheme, among other new capabilities.

[ascl:2411.022] HIILines: Analytical ionized ISM emission line model

HIILines analytically models lines emitted by the ionized interstellar medium (ISM). It covers the [OIII], [OII], Hα, and Hβ lines. The strength of HIILines is its high computational efficiency. It can be used to interpret galaxy spectroscopic survey measurements assuming a one-zone picture and to design and forecast galaxy line emission measurements. HIILines also performs post-processing of hydrodynamical galaxy formation simulations for ISM emission lines.

[ascl:2411.021] McFine: Multi-component hyperfine fitting tool

McFine performs complex, multi-component hyperfine spectra fitting in astronomical data. It turns line intensities into gas conditions using a fully automated Bayesian method. Written in Python, the code uses Markov chain Monte Carlo (MCMC) to characterize model degeneracies. It handles local thermodynamic equilibrium (LTE) and radiative-transfer (RT) models and can fit individual spectra and data cubes; given a data cube, it can also use the neighboring information to attempt a better fit. McFine also fits the minimum number of distinct components to avoid overfitting.

[ascl:2411.020] Diagnose: Spectral classification code

The spectral classification code Diagnose assigns one of four classifications (star, galaxy, quasar, or unknown) to each source and returns a redshift estimate for the galaxies and quasars and a velocity estimate for the stars. The code uses a chi-squared minimization for linear combinations of principal component templates to determine a best-fit spectral classification and redshift estimate. It computes three best-fit chi-squared values: one for stellar type and velocity, one for galaxy type and redshift, and one for a quasar and redshift. Diagnose then compares the best fit of these three reduced chi-squared values to the second best fit and evaluates the difference against a statistical threshold.
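
The decision logic described above can be sketched as follows (a schematic illustration, not Diagnose's own API; the threshold value is arbitrary):

    # Schematic of the classification decision described above (not Diagnose's actual API).
    def classify(chi2_star, chi2_galaxy, chi2_quasar, threshold=0.01):
        """Return the best-fit class, or 'unknown' if the best and second-best
        reduced chi-squared values are too close to call."""
        fits = sorted([("star", chi2_star), ("galaxy", chi2_galaxy), ("quasar", chi2_quasar)],
                      key=lambda item: item[1])
        best, runner_up = fits[0], fits[1]
        if runner_up[1] - best[1] < threshold:   # difference below the statistical threshold
            return "unknown"
        return best[0]

    print(classify(1.05, 1.30, 2.10))  # -> 'star'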

[ascl:2411.019] unicorn: Full 3D-HST grism pipeline

The Unicorn pipeline produces data products from the 3D-HST grism survey of four CANDELS fields. It extracts interlaced 2D and 1D spectra for all objects in the Skelton et al. (2014) photometric catalogs. It then fits the 2D spectra and multi-band photometry to determine redshifts and emission line strengths. Unicorn is built on threedhst (ascl:2411.018) and has been superseded by grizli (ascl:1905.001).

[ascl:2411.018] threedhst: 3D-HST grism analysis software

threedhst reduces WFC3 grism exposures. It is essentially a wrapper around aXe (ascl:1109.016) and produces a catalog and other useful files; extracted 1D spectra are placed in a single file, and 2D spectra are in individual files. The code produces an HTML table with thumbnails of the direct images, 1D, and 2D spectra and supports the pipeline Unicorn (ascl:2411.019), which produces data products from the 3D-HST grism survey of four CANDELS fields. threedhst has been superseded by Grizli (ascl:1905.001).

[ascl:2411.017] CLASS LVDM: Cosmological model of Lorentz invariance violation in gravity and dark matter

CLASS LVDM modifies the CLASS code (ascl:1106.020) to incorporate the cosmological model of Lorentz invariance violation (LV) in gravity and dark matter. Compared to the usual CLASS code, it contains four new parameters: alpha, beta, and lambda characterize LV in the gravity sector, and Y characterizes LV in the dark matter sector.

[ascl:2411.016] fits_warp: Warp catalogs and images to dedistort the effects of the ionosphere

fits_warp smoothly removes the distorting effect of the ionosphere and restores sources to their reference positions in both the catalog and image domain. Image warping uses pixel offsets derived from a catalog of cross-matched sources. Though initially written for low-frequency radio astronomy, fits_warp can be used to de-distort any image distorted by some vector field which is sampled by some sparse pierce-points.

[ascl:2411.015] atlas-fit: Python tool to fit solar spectra to a known atlas

atlas-fit amends the results of spectroflat (ascl:2411.014) with calibration against a solar atlas. Data for wavelength calibration and continuum correction are generated from flat field information and selected solar atlases. The atlas-fit package provides two tools: one to generate a list of lines from the atlas and data to use for finding a wavelength solution (dispersion), and another to amend the calibration results from the spectroflat library.

[ascl:2411.014] spectroflat: Generic Python calibration library for spectro-polarimetric data

Spectroflat flat fields spectro-polarimetric data. It can be plugged into existing Python-based data reduction pipelines or used as a standalone calibration and performance analysis tool. The code includes smile distortion correction and flat field extraction. The library expects the spatial domain on the vertical axis and the spectral domain on the horizontal axis. Spectroflat does not include any file reading/writing routines and expects numpy arrays as input.

[ascl:2411.013] NE2001p: Python implementation of the NE2001 Galactic electron density model

NE2001p is a fully Python implementation of the NE2001 Galactic electron density model. The code forward models the dispersion and scattering of compact radio sources, including pulsars, fast radio bursts, AGNs, and masers, and the model predicts the distances of radio sources that lack independent distance measures.

[ascl:2411.012] BSAVI: Bayesian SAmple VIsualizer for cosmological likelihoods

BSAVI (Bayesian Sample Visualizer) aids likelihood analysis of model parameters where samples from a distribution in the parameter space are used as inputs to calculate a given observable. For example, selecting a range of samples will allow you to easily see how the observables change as you traverse the sample distribution. At the core of BSAVI is the Observable object, which contains the data for a given observable and instructions for plotting it. It is modular, so you can write your own function that takes the parameter values as inputs, and BSAVI will use it to compute observables on the fly. It also accepts tabular data, so if you have pre-computed observables, simply import them alongside the dataset containing the sample distribution to start visualizing. Though BSAVI was developed for use in theoretical cosmology, it can be customized to fit a wide range of visualization needs.

[ascl:2411.011] MMLPhoto-z: Cross-modal contrastive learning method for estimating photo-z of quasars

MMLPhoto-z estimates the photo-z of quasars using a cross-modal contrastive learning approach. This method employs adversarial training and contrastive loss functions to promote the mutual conversion between multi-band photometric data features (magnitude, color) and photometric image features, while extracting modality-invariant features. MMLPhoto-z can also be applied to tasks like photo-z estimation for galaxies with missing magnitudes. Overall, this method proves effective in enhancing the photo-z estimation across diverse datasets and conditions.

[ascl:2411.010] ReverseDiff: Reverse mode automatic differentiation for Julia

ReverseDiff implements methods to take gradients, Jacobians, Hessians, and higher-order derivatives of native Julia functions (or any callable object) using reverse mode automatic differentiation (AD). While performance can vary depending on the functions you evaluate, the algorithms implemented by ReverseDiff generally outperform non-AD algorithms in both speed and accuracy.

[ascl:2411.009] pycosmicstar: PYthon cosmic STar formAtion Rate

Pycosmicstar studies the star formation history for different cosmological models. The package contains two abstract classes, cosmology and structureabstract. The class cosmology is passed as a parameter to the classes that implement structureabstract; this design exploits polymorphism, since the modeling of structures and star formation is not strongly dependent on the cosmology. Pycosmicstar can generate a new cosmological class that implements the methods of the abstract class cosmology, which is useful for studying, for example, the role of dark energy in the cosmic star formation rate evolution.

[ascl:2411.008] Astrocats: Construct astronomical catalogs

Astrocats enables astronomers to construct their own curated catalogs of astronomical data with the intention of producing shareable catalogs of that data in human-readable formats. Astrocats is used by several existing open astronomy catalogs, including the Open Supernova Catalog, Open TDE Catalog, Open Nova Catalog, and the Open Black Hole Catalog.

[ascl:2411.007] EFTofPNG: Effective Field Theory of Post-Newtonian Gravity

EFTofPNG (Effective Field Theory of Post-Newtonian Gravity) performs high precision computations in the effective field theory of post-Newtonian (PN) Gravity, including spins. Written in Mathematica, it provides computer-algebra tools to derive analytical input for gravitational-wave source modelling relevant to current observatories. EFTofPNG has been used to derive all currently known spin-dependent conservative interaction potentials in the post-Newtonian (PN) approximation to General Relativity (GR).

[ascl:2411.006] HBSGSep: Hierarchical Bayesian Star-Galaxy Separations

HBSGSep (Hierarchical Bayesian Star-Galaxy Separations) classifies stars and galaxies photometrically by fitting templates and hierarchically learning their prior weights. The hierarchical Bayesian algorithms are unsupervised and do not use a training set nor are priors set in advance of running the algorithms; the priors for the templates are inferred from the data themselves.

[ascl:2411.005] GAz: Genetic Algorithm for photometric redshift estimation

GAz calculates photometric redshifts for low redshift galaxies. It finds optimal polynomial forms to fit to data. It explores the very large space of high order polynomials while only requiring optimization of a small number of terms. Tested with the 2SLAQ LRG data set, GAz generalizes well to various data sets and redshift ranges.

[ascl:2411.004] DarkRayNet: Simulation tool for indirect Dark Matter searches

DarkRayNet uses recurrent neural networks (RNNs) to quickly simulate antiproton, antideuteron, proton, and helium cosmic ray (CR) spectra at Earth for an extensive range of parameters. The corresponding neural networks are trained on GALPROP (ascl:1010.028) simulations. For antideuterons, the spectra can be predicted both for a signal from dark matter annihilation and for secondary antideuteron emission.

[ascl:2411.003] PyMerger: Einstein Telescope binary black hole merger detector

PyMerger detects binary black hole mergers from the Einstein Telescope based on a Deep Residual Neural Network (ResNet) model; the model was trained on combined data from all three proposed sub-detectors of ET (TSDCD). The model achieved high BBH detection rates. Though not trained on BNS and BHNS mergers, PyMerger successfully detected 11,477 BNS and 323 BHNS mergers in ET-MDC, indicating its potential for broader applicability.

[ascl:2411.002] flashcurve: Fast generation of adaptive-binning light curves with Fermi-LAT data

flashcurve estimates the necessary time windows for adaptive-binning light curves in Fermi-LAT data using raw photon data. Gamma-ray fluxes measured by the Fermi-LAT satellite are extremely variable. Gamma-ray light curves produced by flashcurve, which uses deep learning, employ adaptive bin sizes to optimally retrieve information about the source dynamics and to combine gamma-ray observations in a multi-messenger perspective.

[ascl:2411.001] Mosaic: Multibeamformed Observation Simulation And Interferometry Characterization

Mosaic characterizes the beam shape and generates efficient tilings for multi-beam observations. It consists of an interferometric pattern simulator and characterizer, an optimized tiling generator, and a beamforming weights calculator. It is used in the filter-banking beamformer of the MeerKAT telescope; more than 200 pulsars have been discovered in the multiple-beam observations supported by Mosaic.

[submitted] Finalflash

Finalflash is a Python package designed for primary beam corrections of uGMRT radio interferometric images. The software uses frequency-dependent beam models and FITS file handling to improve the accuracy of radio astronomical data. It is open source and available under the MIT License. The code is hosted at https://github.com/arpan-52/Finalflash.

[submitted] Gradus.jl

Extensible spacetime agnostic general relativistic ray-tracing (GRRT): Gradus.jl is a suite of tools related to tracing geodesics and calculating observational signatures of accreting compact objects. Gradus.jl requires only a specification of the non-zero metric components of a chosen spacetime in order to solve the geodesic equation and compute a wide variety of trajectories and orbits. Various algorithms for calculating physical quantities are implemented generically, so they may be used with different classes of spacetime with minimal effort.

[ascl:2410.020] Falcon-DM: N-body code for inspirals in DM spikes

Falcon-DM simulates intermediate mass ratio inspirals in DM spikes. This lightweight N-body code is written in C++ and is specifically tuned for simulating IMRIs embedded in dark matter (DM) spikes. It features a 2nd order Drift-Kick-Drift integrator using the symplectic HOLD scheme and symmetrized, individual time-steps for accurate time-integration. Falcon-DM also offers post-Newtonian (PN) effects up to PN2.5 using the auxiliary velocity algorithm.

[ascl:2410.019] Heracles: Harmonic-space statistics on the sphere

Heracles manages harmonic-space statistics on the sphere. It takes catalogs of positions and function values on the sphere and turns them into angular power spectra and mixing matrices. Heracles is both a Python library, to be used in notebooks or data processing pipelines, and a tool for running measurements from the command line using a configuration file.

[ascl:2410.018] fastPTA: Constraining power of PTA configurations forecaster

fastPTA forecasts the sensitivity of future Pulsar Timing Array (PTA) configurations and assesses constraints on Stochastic Gravitational Wave Background (SGWB) parameters. The code can generate mock PTA catalogs with noise levels compatible with current and future PTA experiments. These catalogs can then be used to perform Fisher forecasts or MCMC simulations.

[ascl:2410.017] SSOF: Data-driven models for extremely precise radial velocity (EPRV) spectra

StellarSpectraObservationFitting (SSOF) measures radial velocities and creates data-driven models (with fast, physically-motivated Gaussian Process regularization) for the time-variable spectral features for both the telluric transmission and stellar spectrum measured by Extremely Precise Radial Velocity (EPRV) spectrographs (while accounting for the wavelength-dependent instrumental line-spread function). Written in Julia, SSOF provides two methods for estimating the uncertainties on the RVs and model scores based on the photon uncertainties in the original data. For quick estimates of the uncertainties, the code looks at the local curvature of the likelihood space; the second method for estimating errors is via bootstrap resampling.

[ascl:2410.016] Gaspery: Radial velocity (RV) observing strategies

Gaspery uses the Fisher Information Matrix (FIM) to evaluate different radial velocity (RV) observing strategies; this assists observational exoplanet astronomers in constructing the observing strategy that maximizes information (or minimizes uncertainty) on the RV semi-amplitude K. The code is flexible and generalizable, however, and can maximize information on any free parameter from any model, given a time series support (x-axis).

[ascl:2410.015] Kamodo: Space weather data access, interpolation, and visualization

Kamodo provides access to, interpolation of, and visualization of space weather models and data. The code allows model developers to represent simulation results as mathematical functions which may be manipulated directly. As the software does not generate model outputs, users must acquire the desired model outputs before these outputs can be functionalized by the software. Kamodo handles unit conversion transparently and supports interactive science discovery through Jupyter notebooks with minimal coding.
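
The functional approach described above can be illustrated with a minimal sketch of Kamodo's expression-registration pattern (the expression is arbitrary, and exact usage should be checked against the Kamodo documentation):

    # Minimal illustration of Kamodo's functional interface (the expression is arbitrary).
    import numpy as np
    from kamodo import Kamodo

    k = Kamodo(f="x**2 - x - 1")       # register a function from a symbolic expression
    print(k.f(3))                       # evaluate it like an ordinary Python function -> 5
    print(k.f(np.array([1.0, 2.0])))    # vectorized evaluation over an array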

[ascl:2410.014] CloudCovErr.jl: Debias fluxes and improve error bar estimates for photometry on structured backgrounds

CloudCovErr.jl debiases fluxes and improves error bar estimates for photometry on top of structured filamentary backgrounds. It first estimates the covariance matrix of the residuals from a previous photometric model and then computes corrections to the estimated flux and flux uncertainties. Using an infilling technique to estimate the background and its uncertainty dramatically improves flux and flux uncertainty estimates for stars in images of fields with significant nebulosity.

[ascl:2410.013] ARK: 3D hydrodynamics code for the study of convective problems

ARK implements Computational Fluid Dynamics applications, such as Euler and all-Mach regime solvers, on a Cartesian grid with MPI+Kokkos. It provides a performance-portable Kokkos implementation for compressible hydrodynamics and performs simulations of convection without any Boussinesq or anelastic approximation. It adapts an all-Mach number scheme into a well-balanced scheme for gravity, which preserves arbitrary discrete equilibrium states up to machine precision. The low-Mach correction in the numerical flux allows ARK to be more precise in the low-Mach regime; the code is well suited for studying highly stratified and high-Mach convective flows.

[ascl:2410.012] Exo-REM: 1D self-consistent radiative-equilibrium model for exoplanetary atmospheres

The 1D radiative-equilibrium model Exo-REM simulates young gas giants far from their star and brown dwarfs. Fluxes are calculated using the two-stream approximation assuming hemispheric closure. The radiative-convective equilibrium is solved assuming that the net flux (radiative + convective) is conservative. The conservation of flux over the pressure grid is solved iteratively using a constrained linear inversion method. Rayleigh scattering from H2, He, and H2O, as well as absorption and scattering by clouds (calculated from extinction coefficient, single scattering albedo, and asymmetry factor interpolated from precomputed tables for a set of wavelengths and particle radii), are also taken into account.

[ascl:2410.011] DGEM: 3D dust continuum radiative transfer code for method comparison

DGEM compares different computation methods for three-dimensional dust continuum radiative transfer. This simple code is based on mcpolar, translated to C++, and refactored to realize and compare radiative transfer techniques, namely Monte Carlo, Quasi-Monte-Carlo, and the Directions Grid Enumeration Method (DGEM). DGEM uses precalculated photon propagation directions instead of random ones to speed up the calculation process. The code also offers a gnuplot script for plotting the resulting images.

[ascl:2410.010] lensitbiases: rFFT-based flat-sky CMB lensing tools

lensitbiases provides rFFT-based N1 lensing bias calculations and tests. It is tuned for TT, P-only, or MV (GMV) like quadratic estimators. It performs rFFT-based N1 and N1 matrix calculations in ~O(ms) time per lensing multipole for a Planck-like configuration, which allows on-the-fly evaluation of the bias. It requires 5 rFFTs of moderate size per L for N1 TT, 20 for PP, and 45 for MV or GMV. lensitbiases is not particularly efficient for low lensing L's, since in this case one must use large boxes.

[ascl:2410.009] DIRTY: 3D dust radiative transfer for dusty astrophysical sources

DIRTY (DustI Radiative Transfer, Yeah!) computes the radiative transfer and dust emission from arbitrary distributions of dust illuminated by arbitrary distributions of sources (usually stars). It uses Monte Carlo methods to solve the radiative transfer problem in full 3D including non-equilibrium and equilibrium thermal dust emission. Like other similar models, DIRTY is computationally intensive; as a result, it is written in C++.

[ascl:2410.008] solar-vSI: Calculate solar antineutrino spectra

solar-vSI performs Monte Carlo integration of multi-body phase space efficiently. The calculation of solar antineutrino spectra from 8B decay requires the integration of five-body phase space. Though there is no simple analytical approach to this problem, recursive relations can be used to facilitate numerical evaluations.

[ascl:2410.007] measure_extinction: Measure interstellar dust extinction using pair method

measure_extinction measures extinction due to dust absorbing photons or scattering photons out of the line-of-sight. Extinction applies to the case of a star seen behind a foreground screen of dust. This package provides the tools to measure dust extinction curves using observations of two effectively identical stars, differing only in that one is seen through more dust than the other.
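
The pair method can be illustrated with a small worked example (all magnitudes below are invented for illustration; this is a concept sketch, not measure_extinction's API):

    # Concept sketch of the pair method: the extinction curve follows from comparing
    # a reddened star to an unreddened star of the same spectral type.
    import numpy as np

    wavelength = np.array([0.44, 0.55, 0.70])       # microns (B, V, R; illustrative)
    mag_reddened = np.array([13.2, 12.3, 11.7])     # hypothetical reddened-star magnitudes
    mag_comparison = np.array([10.9, 10.5, 10.2])   # same spectral type, little dust

    # Color excess relative to V at each wavelength:
    excess = (mag_reddened - mag_reddened[1]) - (mag_comparison - mag_comparison[1])
    ebv = excess[0]                                  # E(B-V) from the first (B) band
    print(excess / ebv)                              # normalized curve E(lambda-V)/E(B-V)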

[ascl:2410.006] forcepho: Generative modeling galaxy photometry for JWST

Forcepho infers the fluxes and shapes of galaxies from astronomical images. It models the appearance of multiple sources in multiple bands simultaneously and compares to observed data via a likelihood function. Gradients of this likelihood allow for efficient maximization of the posterior probability or sampling of the posterior probability distribution via Hamiltonian Monte Carlo. The model intrinsic galaxy shapes and positions are shared across the different bands, but the fluxes are fit separately for each band. Forcepho does not perform detection; initial locations and (very rough) parameter estimates must be supplied by the user.

[ascl:2410.005] BayeSED: Bayesian SED synthesis and analysis of galaxies and AGNs

BayeSED implements full Bayesian interpretation of spectral energy distributions (SEDs) of galaxies and AGNs. It performs Bayesian parameter estimation using posterior probability distributions (PDFs) and Bayesian SED model comparison using Bayesian evidence. Its latest version, BayeSED3, supports various built-in SED models and can emulate other SED models using machine learning techniques.

[ascl:2410.004] iPIC3D: Multi-scale plasma simulations

iPIC3D performs kinetic plasma simulations at magnetohydrodynamics time scales. This three-dimensional parallel code uses the implicit Particle-in-Cell method; implicit integration in time of the Vlasov–Maxwell system removes the numerical stability constraints. Written in C++, iPIC3D can be run with CUDA acceleration and supports MPI, OpenMP, and multi-node multi-GPU simulations.

[ascl:2410.003] vortex-p: Helmholtz-Hodge and Reynolds decomposition algorithm for particle-based simulations

vortex-p analyzes the velocity fields of astrophysical simulations of different natures (for example, SPH, moving-mesh, and meshless), usually spanning many orders of magnitude in the scales involved. The code performs Helmholtz-Hodge decomposition (HHD); that is, it can decompose the velocity field into a solenoidal and an irrotational/compressive part. vortex-p internally uses an AMR representation of the velocity field and can, in principle, capture the full dynamical range of the simulation. The package can also perform Reynolds decomposition (i.e., the decomposition of the velocity field into a bulk and a turbulent part). This is achieved by means of a multi-scale filtering of the velocity field, where the filtering scale around each point is determined by the local flow properties. vortex-p expands the vortex (ascl:2206.001) code, which had been coupled to the outputs of the MASCLET code, to a fully stand-alone tool capable of working with the outcomes of a broad range of simulation methods.

[ascl:2410.002] pysymlog: Symmetric (signed) logarithm scale for Python plots

pysymlog provides utilities for binning, normalizing colors, wrangling tick marks, and other tasks, in symmetric logarithm space. For numbers spanning positive and negative values, the code works in log scale with a transition through zero, down to some threshold. This is useful for representing data that span many scales, as in standard log space, but that include zero or even negative values. pysymlog provides convenient functions for creating 1D and 2D histograms with symmetric log bins, generating logspace-like arrays through zero, and managing matplotlib major and minor ticks in symlog space, as well as bringing symmetric log scaling functionality to plotly.
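
The symmetric-log concept can be illustrated with matplotlib's built-in symlog scale (this shows the underlying scaling only, not pysymlog's own API):

    # Concept illustration with matplotlib's built-in symlog scale: values spanning
    # negative, zero, and positive are shown in log scale with a linear transition
    # region around zero controlled by `linthresh`.
    import numpy as np
    import matplotlib.pyplot as plt

    x = np.linspace(-1e3, 1e3, 2001)
    fig, ax = plt.subplots()
    ax.plot(x, x)
    ax.set_xscale("symlog", linthresh=1.0)   # linear within |x| < 1, logarithmic outside
    ax.set_yscale("symlog", linthresh=1.0)
    plt.show()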

[submitted] RadioSunPy: A Robust Preprocessing Pipeline for RATAN-600 Solar Radio Observations Data

This paper introduces RadioSunPy, an open-source Python package developed for accessing, visualizing, and analyzing multi-band radio observations of the Sun from the RATAN-600 solar complex. The advancement of observational technologies and software for processing and visualizing spectro-polarimetric microwave data obtained with the RATAN-600 radio telescope opens new opportunities for studying the physical characteristics of solar plasma at the levels of the chromosphere and corona. These layers remain difficult to detect in the ultraviolet and X-ray ranges. The development of these methods allows for more precise investigation of the fine structure and dynamics of the solar atmosphere, thereby deepening our understanding of the processes occurring in these layers. The obtained data can also be utilized for diagnosing solar plasma and forecasting solar activity. However, using RATAN-600 data requires extensive data processing and familiarity with the RATAN-600 instrument. The package offers comprehensive data processing functionalities, including direct access to raw data, essential processing steps such as calibration and quiet Sun normalization, and tools for analyzing solar activity. This includes automatic detection of local sources, identification of these sources with NOAA (National Oceanic and Atmospheric Administration) active regions, and determination of parameters for local sources and active regions. By streamlining data processing workflows, RadioSunPy enables researchers to investigate the fine structure and dynamics of the solar atmosphere more efficiently, contributing to advancements in solar physics and space weather forecasting.

[submitted] ysoisochrone: A Python package to estimate masses and ages for YSOs

ysoisochrone is a Python3 package that handles isochrones for young stellar objects (YSOs) and utilizes them to derive stellar masses and ages. Its primary method is a Bayesian inference approach, and the Python code builds on the IDL version developed in Pascucci et al. (2016). The code estimates the stellar masses, ages, and associated uncertainties by comparing the stellar effective temperature, bolometric luminosity, and their uncertainties with different stellar evolutionary models, including those specifically developed for YSOs. User-developed evolutionary tracks can also be utilized when provided in the specific format described in the code documentation.

[submitted] Kete: Solar System survey tools

The kete tools are intended to enable the simulation of all-sky surveys of solar system objects. This includes multi-body physics orbital dynamics, thermal and optical modeling of the objects, as well as field-of-view and light delay corrections. These tools, in conjunction with the Minor Planet Center's (MPC) database of known asteroids, can be used not only to plan surveys but also to predict what objects are visible for existing or past surveys.

The primary goal for kete is to enable a set of tools that can operate on the entire MPC catalog at once, without having to do queries on specific objects. It has been used to simulate over 10 years of survey time for the NEO Surveyor mission using 10 million main-belt and near-Earth asteroids.

[ascl:2410.001] GalCraft: Building integral-field spectrograph data cubes of the Milky Way

GalCraft creates mock integral-field spectroscopic (IFS) observations of the Milky Way and other hydrodynamical/N-body simulations. It conducts all the procedures from inputting data and spectral templates to the output of IFS data cubes in FITS format. The produced mock data cubes can be analyzed in the same way as real IFS observations by many methods, particularly codes like Voronoi binning (ascl:1211.006), pPXF (ascl:1210.002), line-strength indices, or a combination of them (e.g., the GIST pipeline, ascl:1907.025). The code is implemented using Python-native parallelization. GalCraft will be particularly useful for directly comparing the Milky Way with other MW-like galaxies in terms of kinematics and stellar population parameters and ultimately linking the Galactic and extragalactic to study galaxy evolution.

[ascl:2409.020] pyRRG: Weak lensing shape measurement code

pyRRG measures the 2nd and 4th order moments using a TinyTim model to correct for PSF distortions. The code is invariant to the number of exposures and orientation of the drizzle images. pyRRG uses a machine learning algorithm to automatically classify stars and galaxies; this can also be done manually if greater accuracy is needed.

[ascl:2409.019] Padé: Protoplanetary disk turbulence simulator

Padé simulates protoplanetary disk hydrodynamics in cylindrical coordinates. Written in Fortran90, it is a finite-difference code, and the compact 4th-order standard Padé scheme is used for spatial differencing. Padé differentiation is known to have spectral-like resolving power. The z direction can be periodic or non-periodic. A 4th-order Runge-Kutta scheme is used for time advancement. Padé implements a version of the FARGO technique to eliminate the time-step restriction imposed by Keplerian advection, and capturing of shocks that are not too strong can be done by using artificial bulk viscosity.

[ascl:2409.018] PySR: High-Performance Symbolic Regression in Python and Julia

PySR performs Symbolic Regression; it uses machine learning to find an interpretable symbolic expression that optimizes some objective. Over a period of several years, PySR has been engineered from the ground up to be (1) as high-performance as possible, (2) as configurable as possible, and (3) easy to use. PySR is developed alongside the Julia library SymbolicRegression.jl, which forms the powerful search engine of PySR. Symbolic regression works best on low-dimensional datasets, but one can also extend these approaches to higher-dimensional spaces by using "Symbolic Distillation" of Neural Networks. Here, one essentially uses symbolic regression to convert a neural net to an analytic equation. Thus, these tools simultaneously present an explicit and powerful way to interpret deep neural networks.
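
A minimal example in PySR's scikit-learn-style interface (the target function and hyperparameters are illustrative):

    # Minimal PySR example; the toy target and hyperparameters are illustrative.
    import numpy as np
    from pysr import PySRRegressor

    X = np.random.randn(100, 2)
    y = 2.5 * np.cos(X[:, 0]) + X[:, 1] ** 2   # toy relation to recover

    model = PySRRegressor(
        niterations=40,
        binary_operators=["+", "*"],
        unary_operators=["cos"],
    )
    model.fit(X, y)
    print(model)   # prints the Pareto front of discovered equations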

[ascl:2409.017] WISE2MBH: Mass of supermassive black holes estimator

WISE2MBH uses infrared cataloged data from the Wide-field Infrared Survey Explorer (WISE) to estimate the mass of supermassive black holes (SMBH). It implements a Monte Carlo approach for error propagation, considering mean photometric errors from WISE magnitudes, errors in fits of scaling relations used and scatter of those relations, if available.

[ascl:2409.016] PyExoCross: Molecular line lists post-processor

PyExoCross, a Python adaptation of ExoCross (ascl:1803.014), post-processes molecular line lists generated by ExoMol, HITRAN, and HITEMP and other similar initiatives. It generates absorption and emission spectra and other properties, including partition functions, specific heats, and cooling functions, based on molecular line lists. The code also calculates cross sections with four line profiles: Doppler, Gaussian, Lorentzian, and Voigt. PyExoCross can convert data format between ExoMol and HITRAN, and supports importing and exporting line lists in the ExoMol and HITRAN/HITEMP formats.
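
The Voigt profile mentioned above, the convolution of a Gaussian (Doppler) and a Lorentzian (pressure) profile, can be evaluated directly with SciPy; this is a concept sketch, not PyExoCross's API:

    # Concept sketch of a Voigt line profile evaluated with SciPy (not PyExoCross's API).
    import numpy as np
    from scipy.special import voigt_profile

    nu = np.linspace(-5.0, 5.0, 501)    # frequency offset from line center (arbitrary units)
    sigma = 0.5                          # Gaussian (Doppler) width
    gamma = 0.3                          # Lorentzian (pressure) half-width
    profile = voigt_profile(nu, sigma, gamma)   # normalized so it integrates to 1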

[ascl:2409.015] GASTLI: GAS gianT modeL for Interiors

GASTLI (GAS gianT modeL for Interiors) calculates interior structure models for gas giant exoplanets. The code computes mass-radius curves, thermal evolution curves, and interior composition retrievals to fit an interior structure model to your mass, radius, age, and, if available, atmospheric metallicity data. GASTLI can also plot the results, including internal and atmospheric profiles, a pressure-temperature diagram, mass-radius relations, and thermal evolution curves.

[ascl:2409.014] symbolic_pofk: Precise symbolic emulators of the linear and nonlinear matter power spectrum

symbolic_pofk provides simple Python functions and a Fortran90 routine for precise symbolic emulations of the linear and non-linear matter power spectra and for the conversion σ8 ↔ As as a function of cosmology. These can be easily copied, pasted, and modified to other languages. Outside of a tested k range, the fit includes baryons by default; however, this can be switched off.

[ascl:2409.013] planetMagFields: Routines to plot magnetic fields of planets in our solar system

planetMagFields accesses and analyzes information about magnetic fields of planets in our solar system and visualizes them in both 2D and 3D. The code provides access to properties of a planet, such as dipole tilt, Gauss coefficients, and computed radial magnetic field at surface, and has methods to plot the field and write a vts file for 3D visualization. planetMagFields can be used to produce both 2D and 3D visualizations of a planetary field; it also provides the option of potential extrapolation.

[ascl:2409.012] AMReX: Software framework for block structured AMR

The software framework AMReX is designed for building massively parallel block-structured adaptive mesh refinement (AMR) applications. Key features of AMReX include C++ and Fortran interfaces; 1-, 2- and 3-D support; and support for cell-centered, face-centered, edge-centered, and nodal data. The framework also supports hyperbolic, parabolic, and elliptic solves on hierarchical adaptive grid structure, optional subcycling in time for time-dependent PDEs, and parallelization via flat MPI, OpenMP, hybrid MPI/OpenMP, or MPI/MPI, and parallel I/O. AMReX supports the plotfile format with AmrVis, VisIt (ascl:1103.007), ParaView (ascl:1103.014), and yt (ascl:1011.022).

[ascl:2409.011] ClassiPyGRB: Swift/BAT GRB visualizer and classifier

ClassiPyGRB downloads, processes, visualizes, and classifies GRBs in the Swift/BAT database. Users can query light curves for any GRB and use tools to preprocess the data, including noise/duration reduction and interpolation. The package provides a set of facilities and tutorials for classifying GRBs based on their light curves using a method based on a dimensionality reduction of the data using t-Distributed Stochastic Neighbour Embedding (TSNE); results are visualized using a Graphical User Interface (GUI). ClassiPyGRB also plots and animates the results of the TSNE analysis for a deeper hyperparameter grid search.

[ascl:2409.010] BeyonCE: Beyond Common Eclipsers

BeyonCE (Beyond Common Eclipsers) explores the large parameter space of eclipsing disc systems. The fitting code reduces the parameter space encompassed by the transit of circumsecondary disc (CSD) systems with azimuthally symmetric, non-uniform optical-depth profiles to constrain the size and orientation of discs with a complex sub-structure. BeyonCE does this by rejecting disc geometries that do not reproduce the measured gradients within their light curves.

[ascl:2409.009] resonances: Mean-motion resonances in Solar system and other planetary systems identifier

resonances identifies mean-motion resonances of small bodies. It uses the REBOUND integrator (ascl:1110.016) and automatically identifies two-body and three-body mean-motion resonances in the Solar system. The package can be used for other possible planetary systems, including exoplanets. resonances accurately differentiates different types of resonances (pure, transient, uncertain) and provides an interface for mass tasks, such as finding resonant areas in a planetary system. The software can also plot time series and periodograms.

[ascl:2409.008] cloudyfsps: Python interface between FSPS and Cloudy

cloudyfsps is a Python interface between FSPS (ascl:1010.043) and Cloudy (ascl:9910.001). It compiles FSPS models for use as ionizing sources (Stellar SED grids) within Cloudy and generates Cloudy input files, single-parameter or grids of parameters. It runs Cloudy models in parallel and formats the output, which is nebular continuum and nebular line emission, for FSPS input and for explorative manipulation and plotting within Python. cloudyfsps includes pre-packaged plots for BPT diagrams (NII, SII, OI, OII) with observed data from HII regions and SDSS galaxies, and also provides comparisons with MAPPINGS III (ascl:1306.008) models.

[ascl:2409.007] Stardust: Composite template fitting software

Stardust extracts galaxy properties by fitting their multiwavelength data to a set of linearly combined templates. This Python package brings three different families of templates together: 1.) UV+Optical emission from dust unobscured stellar light; 2.) AGN heated dust in the MIR; and 3.) IR dust reprocessed stellar light in the NIR-FIR. Stardust's template fitting does not rely on energy balance. As a result, the total luminosity of dust obscured and dust unobscured stellar light do not rely on each other, and it is possible to fit objects such as SMGs where the energy balance approach might not be applicable.

[ascl:2409.006] PICASSO: Inpainter for point-sources for synchrotron and dust polarization

PICASSO (Python Inpainter for Cosmological and AStrophysical SOurces) provides a suite of inpainting methodologies to reconstruct holes on images (128x128 pixels) extracted from a HEALPIX map. Three inpainting techniques are included; these are divided into two main groups: diffusive-based methods (Nearest-Neighbors), and learning-based methods that rely on training DCNNs to fill the missing pixels with the predictions learned from a training data-set (Deep-Prior and Generative Adversarial Networks). PICASSO also provides scripts for projecting from full sky HEALPIX maps to flat thumbnails images, performing inpainting on GPUs and parallel inpainting on multiple processes, and for projecting from flat images to HEALPIX. Pretrained models are also included.

[ascl:2409.005] MCMole3D: Statistical model for galactic molecular clouds

MCMole3D (Monte-Carlo MOlecular Line Emission) simulates the 3D molecular cloud emission in the Milky Way. In particular, it can simulate both the unpolarized and polarized emission coming from the first rotational line of Carbon Monoxide (CO, J=1-0). MCMole3D seeks to compare the simulated emission with that observed by full sky surveys from the Planck satellite.

[ascl:2409.004] FGCluster: ForeGround Clustering

FGCluster runs spectral clustering onto Healpix maps for parametric foreground removal, using a map encoding the feature to cluster as input. Pixel similarity is given by the geometrical affinity of each pixel on the sphere. FGCluster can also take an uncertainty map as an input, in which case the adjacency is modified in such a way that the pixel similarity also accounts for the statistical significance given by the pixel values in a map and their uncertainties.

[ascl:2409.003] SUSHI: Semi-blind Unmixing with Sparsity for Hyperspectral Images

SUSHI (Semi-blind Unmixing with Sparsity for Hyperspectral Images) performs non-stationary unmixing of hyperspectral images. The typical use case is to map physical parameters such as temperature and redshift from a model with multiple components using data from hyperspectral images. Applying a spatial regularization provides more robust results for voxels with low signal-to-noise ratio. The code has been used in X-ray astronomy, but the method can be applied to any integral field unit (IFU) data cube.

[ascl:2409.002] UltraDark: Cosmological scalar fields simulator

UltraDark.jl simulates cosmological scalar fields. Written in Julia, it is inspired by PyUltraLight (ascl:1810.009) and designed to be simple to use and extend. It solves a non-interacting scalar field Gross-Pitaevskii equation coupled to Poisson's equation for gravitational potential. The scalar field describes scalar dark matter in models including ultralight dark matter, fuzzy dark matter, axion-like particles and the like. It also describes an inflaton field in the reheating epoch of the early universe.

[ascl:2409.001] DarsakX: X-ray telescope design and imaging performance analyzer

Written in Python, DarsakX is used to design and analyze the imaging performance of a multi-shell X-ray telescope with an optical configuration similar to Wolter-1 optics for astronomical purposes. It can also assess the impact of figure error on the telescope's imaging performance and optimize the optical design to improve angular resolution for wide-field telescopes. By default, DarsakX uses DarpanX (ascl:2101.015) to calculate the mirror's reflectivity.

[ascl:2408.015] SAQQARA: Stochastic gravitational wave background analysis

SAQQARA analyzes stochastic gravitational wave background signals. This Simulation-based Inference (SBI) library is built on top of the swyft code (ascl:2302.016), which implements neural ratio estimation to efficiently access marginal posteriors for all parameters of interest. Simulation-based inference combined with implicit marginalization (over nuisance parameters) has been shown to be well suited for SGWB data analysis.

[ascl:2408.014] 21cmFirstCLASS: Generate initial conditions at recombination

21cmFirstCLASS extends 21cmFAST (ascl:1102.023) and interfaces with CLASS (ascl:1106.020) to generate initial conditions at recombination that are consistent with the input cosmological model. These initial conditions can be set during the time of recombination, allowing one to compute the 21cm signal (and its spatial fluctuations) throughout the dark ages, as well as in the subsequent cosmic dawn and reionization epochs, just as in the standard 21cmFAST. 21cmFirstCLASS tracks both the CDM density field δc and the baryon density field δb. In addition, the user interface in 21cmFirstCLASS has been improved and allows one to easily plot the 21cm power spectrum while including noise from the output of 21cmSense (ascl:1609.013).

[ascl:2408.013] GRBoondi: AMR-based code to evolve generalized Proca fields on arbitrary fixed backgrounds

GRBoondi simulates generalized Proca fields on arbitrary analytic fixed backgrounds; it is based on the publicly available 3+1D numerical relativity code GRChombo (ascl:2306.039). GRBoondi reduces the prerequisite knowledge of numerical relativity and GRChombo in the numerical studies of generalized Proca theories. The main steps to perform a study are inputting the additions to the equations of motion beyond the base Proca theory; GRBoondi can then automatically incorporate the higher-order terms in the simulation. The code is written entirely in C++14 and uses hybrid MPI/OpenMP parallelism. GRBoondi inherits all of the capabilities of the main GRChombo code, which makes use of the Chombo library (ascl:1202.008) for adaptive mesh refinement.

[ascl:2408.012] RadioSED: Radio SED fitting for AGN

RadioSED uses nested sampling to perform a Bayesian analysis of radio SEDs constructed from radio flux density measurements obtained as part of large area surveys (or in some limited cases, as part of targeted followup campaigns). It is a pure Python implementation, and is essentially a wrapper around Bilby (ascl:1901.011), the Bayesian inference library. RadioSED uses dynesty (ascl:1809.013) to perform the sampling steps, though other samplers could also be used. Users can make use of a pre-defined set of models and surveys from which to draw flux density measurements, or they can define their own models and provide their own input flux density measurements. All flux density measurements are referenced against the RACS-LOW survey, and source names and IDs from the survey catalogue are used as identifiers.
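
The kind of nested-sampling fit that RadioSED wraps can be sketched generically with Bilby and dynesty on a toy power-law SED (illustrative only; this is not RadioSED's own interface, and all values are invented):

    # Generic sketch of a Bilby + dynesty fit of a power-law radio SED.
    import numpy as np
    import bilby

    freq = np.array([0.15, 0.887, 1.4, 5.5, 9.0])      # GHz, hypothetical survey points
    flux = 1.2 * (freq / 1.0) ** -0.7                   # toy flux densities in Jy
    sigma = 0.05 * flux                                 # assumed 5% uncertainties

    def power_law(nu, s0, alpha):
        return s0 * nu ** alpha

    likelihood = bilby.core.likelihood.GaussianLikelihood(freq, flux, power_law, sigma=sigma)
    priors = dict(s0=bilby.core.prior.Uniform(0, 10, "s0"),
                  alpha=bilby.core.prior.Uniform(-3, 3, "alpha"))

    result = bilby.run_sampler(likelihood=likelihood, priors=priors,
                               sampler="dynesty", nlive=250, outdir="outdir", label="sed")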

[ascl:2408.011] M_SMiLe: Magnification Statistics of Micro-Lensing

M_SMiLe computes an approximation of the probability of magnification for a lens system consisting of microlensing by compact objects within a galaxy cluster. It specifically focuses on the scenario where the galaxy cluster is strongly lensing a background galaxy whose images are sensitive to microlensing by compact objects, such as stars, within the cluster. The microlenses responsible for this effect are stars and stellar remnants, though exotic objects such as compact dark matter candidates (including PBHs and axion mini-halos) can contribute to this effect.

[ascl:2408.010] BELTCROSS2: Calculate the closest approaches of asteroids to meteoroid streams

BELTCROSS2 calculates the closest approaches of asteroids to the mean orbits of meteoroid streams. It is especially useful to check if an asteroid, which was observed to become active, passed through a meteoroid stream, and through which stream, a short time before the beginning of the activity. The basic characteristics of the closest encounter of the asteroid with the stream are provided by BELTCROSS2.

[ascl:2408.009] Cue: Nebular emission modeling

Cue interprets nebular emission across a wide range of ionizing conditions of galaxies. The software is a neural net emulator based on Cloudy (ascl:9910.001). It does not require a specific ionizing spectrum as a source, instead approximating the ionizing spectrum with a 4-part piece-wise power-law. Along with the flexible ionizing spectra, Cue allows freedom in [O/H], [N/O], [C/O], gas density, and total ionizing photon budget.

[ascl:2408.008] HaloFlow: Simulation-Based Inference (SBI) using forward modeled galaxy photometry

HaloFlow uses a machine learning approach to infer the host halo mass, Mh, and stellar mass, M∗, of galaxies using grizy band magnitudes, morphological properties quantifying characteristic size, concentration, and asymmetry, total measured satellite luminosity, and number of satellites.

[ascl:2408.007] LADDER: Learning Algorithm for Deep Distance Estimation and Reconstruction

LADDER (Learning Algorithm for Deep Distance Estimation and Reconstruction) reconstructs the “cosmic distance ladder” by analyzing sequential cosmological data; it can also be applied to other sequential datasets with associated covariance information. It uses the apparent magnitude data from the Pantheon Type Ia supernovae compilation, fully incorporating covariance information to accurately predict mean values and uncertainties. It offers model-independent consistency checks for datasets such as Baryon Acoustic Oscillations (BAO) and can calibrate high-redshift datasets such as Gamma Ray Bursts (GRBs) without assuming any underlying cosmological model. Additionally, LADDER serves as a model-independent mock catalog generator for forecast-based cosmological studies.

[ascl:2408.006] SonAD: Sonification of astronomical data

SonAD extends the Astronify software (ascl:2408.005) to sonify a spatially distributed dataset. The package contains scripts to convert images into scatterplots and sonifications. The reproduce_image.py script takes an image file and reproduces it as a scatterplot by converting the input image to grayscale, extracting pixel values and generating scatter data based on these values, and then plotting the scatter data to create a visual representation of the image. The sonifications script converts the scatterplot data into an audio series and adjusts the note spacing and sonification range to customize the auditory representation. SonAD accepts images in PNG and JPG formats.

[ascl:2408.005] Astronify: Astronomical data sonification

Astronify contains tools for sonifying astronomical data, specifically data series. Data series sonification takes a data table and maps one column to time, and one column to pitch. This technique is commonly used to sonify light curves, where observation time is scaled to listening time and flux is mapped to pitch. While Astronify’s sonification uses the columns “time” and “flux” by default, any two columns can be supplied and a sonification created.
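
A minimal sketch of this basic pattern, following the Astronify documentation (the light curve values are invented; check the documentation for current call signatures):

    # Basic Astronify usage pattern on a toy light curve (values are illustrative).
    import numpy as np
    from astropy.table import Table
    from astronify.series import SoniSeries

    data = Table({"time": np.arange(0, 10, 0.1),
                  "flux": np.random.normal(1.0, 0.05, 100)})   # toy light curve

    soni = SoniSeries(data)   # "time" and "flux" columns are used by default
    soni.sonify()             # map flux values to pitches
    soni.play()               # or soni.write("lightcurve.wav")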

[ascl:2408.004] Sailfish: GPU-accelerated grid-based astrophysics gas dynamics code

Sailfish simulates accreting binary systems, including binary protostars, post-AGN stellar binaries, mass-transferring X-ray binaries, and double black hole systems. The binary components are "on the grid" rather than excised, and are evolved according to the Kepler two-body problem, modified to account for gravitational wave losses or self-consistent forcing from the orbiting gas. The solvers are shock-capturing and are second order accurate in space and time. Gravity is fully Newtonian. Thermodynamics can be treated using a gamma-law equation of state with a blackbody cooling term, or in the locally isothermal approximation, in which the gas temperature is set to a constant times the local free-fall speed. Sailfish is fully Cartesian and has extensive diagnostic capabilities to facilitate accurate calculations of gas-driven orbital evolution or the extraction of electromagnetic disk signatures. The code is extremely efficient, reaching more than one billion zone updates per second on an NVIDIA A100 GPU, enabling extremely high resolution of complex flows around the binary components.

[submitted] AntabGMVA: A Python tool for managing GMVA metadata

Global mm-VLBI Array (GMVA) observations are accompanied by a lot of metadata (i.e., the so-called 'ANTAB' files) that contain the system temperature (Tsys) and the gain values of the individual GMVA antennas. These data are required for the amplitude calibration of GMVA data, which is an essential part of the data reduction. Unfortunately, Tsys measurements in the ANTAB files are not perfect, and there are almost always erroneous values in some of the ANTAB files (particularly in the VLBA data). These can lead to incorrect results in the amplitude calibration and thus need to be corrected with proper data inspection and treatment. However, every GMVA station provides the ANTAB file in its own data format, which makes the examination tricky. AntabGMVA was designed to resolve these issues and allows GMVA users to manage the GMVA ANTAB files easily and efficiently. Using AntabGMVA, one can extract, inspect, visualize, and correct the Tsys data from the ANTAB files and finally generate one single ANTAB file which includes all the final products.

[ascl:2408.003] SHARC: SHArpened Dimensionality Reduction and Classification

SHARC (SHArpened Dimensionality Reduction and Classification) performs local gradient clustering-based sharpened dimensionality reduction (SDR) using neural network projections and uses these projections to make classifications. The library also contains functions for finding the optimal SDR parameters and for consolidating classification results obtained through multiple classifiers. It requires pySDR (ascl:2408.002). SHARC provides accurate and physically insightful classification of astronomical objects based on their broadband colors.

[ascl:2408.002] pySDR: Wrapper for sharpened dimensionality reduction code

pySDR performs local gradient clustering-based sharpened dimensionality reduction (SDR). The library uses the C++ LGCDR_v1 code as its backend.

[ascl:2408.001] SDR: Sharpened Dimensionality Reduction

Sharpened dimensionality reduction (SDR) sharpens original data before dimensionality reduction to create visually segregated sample clusters suitable for user-guided labeling. Each distinct cluster can then be labeled and used to further analyze an otherwise unlabeled data set. Written in C++, SDR scales well with large high-dimensional data.

[ascl:2407.020] Package-X: Calculate Feynman loop integrals

Package‑X instantly solves one loop Feynman integrals in full generality. Written in Mathematica and extensively tested and adopted, the package computes dimensionally regulated one-loop integrals with up to four distinct propagators of arbitrarily high rank, calculates traces of Dirac matrices in d dimensions for closed fermion loops, or carries out Dirac algebra for open fermion lines. Package‑X also generates analytic results for any kinematic configuration (e.g., at zero external momentum or physical threshold) for real masses and external invariants, provides analytic expressions for UV-divergent, IR-divergent and finite parts either separately or all together, and computes discontinuities across cuts of one-loop integrals, among other tasks.

[ascl:2407.019] hipipe: VLT/HiRISE reduction pipeline

The High-Resolution Imaging and Spectroscopy of Exoplanets (HiRISE) instrument at the Very Large Telescope (VLT) combines the exoplanet imager SPHERE with the high-resolution spectrograph CRIRES using single-mode fibers. HiRISE has been designed to enable the characterization of known, directly-imaged planetary companions in the H band at a spectral resolution on the order of R = λ/∆λ = 140 000. The hipipe package is a custom python pipeline used to reduce the HiRISE data and produce high-level science products that can be used for astrophysical interpretation.

[ascl:2407.018] pony3d: Efficient island-finding tool for radio spectral line imaging

pony3d statistically identifies islands of contiguous emission inside a three-dimensional volume. Its primary functionality is the rapid and reliable creation of masks for the deconvolution of radio interferometric spectral line emission. It has been designed to run on the output of the wsclean imager (ascl:1408.023), where the individual FITS image per frequency plane enables a high degree of parallelism, but it can work on any image set provided this criterion is met. Single-channel island rejection is offered, along with 3D mask dilation and boxcar averaging. pony3d is also a prototype source-finding and extraction tool.

[ascl:2407.017] photGalIMF: Stellar mass and luminosity evolution calculator

The photGalIMF code calculates the evolution of stellar mass and luminosity for a galaxy model, based on the PARSEC stellar evolution model (ascl:1502.005). It requires input lists specifying the age, mass, metallicity, and initial mass function (IMF) of single stellar populations. These input parameters can be provided by the companion galaxy chemical simulation code GalIMF (ascl:1903.010), which generates realistic sets of inputs.

[submitted] ELISA: Efficient Library for Spectral Analysis in High-Energy Astrophysics

ELISA is a Python library designed for efficient spectral modeling and robust statistical inference. With a user-friendly interface, ELISA streamlines the spectral analysis workflow.

The modeling framework of ELISA is flexible, allowing users to construct complex models by combining models of ELISA and XSPEC, as well as custom models. Parameters across different model components can also be linked. The models can be fitted to the spectral datasets using either Bayesian or maximum likelihood approaches. For Bayesian fitting, ELISA incorporates advanced Markov Chain Monte Carlo (MCMC) algorithms, including the No-U-Turn Sampler (NUTS), nested sampling, and affine-invariant ensemble sampling, to tackle the posterior sampling problem. For maximum likelihood estimation (MLE), ELISA includes two robust algorithms: the Levenberg-Marquardt algorithm and the Migrad algorithm from Minuit. The computation backend is based on Google's JAX, a high-performance numerical computing library, which can reduce the runtime for fitting procedures like MCMC, thereby enhancing the efficiency of analysis.

After fitting, goodness-of-fit assessment can be done with a single function call, which automatically conducts posterior predictive checks and leave-one-out cross-validation for Bayesian models, or parametric bootstrap for MLE. These methods offer greater accuracy and reliability than traditional fit-statistic/dof measures, and thus better model discovery capability. For comparing multiple candidate models, ELISA provides robust Bayesian tools such as the Widely Applicable Information Criterion (WAIC) and the Leave-One-Out Information Criterion (LOOIC), which are more reliable than AIC or BIC. Thanks to the object-oriented design, collecting the analysis results should be simple. ELISA also provides visualization tools to generate ready-for-publication figures.

ELISA is an open-source project and community contributions are welcome and greatly appreciated.

[ascl:2407.016] Heimdall: GPU accelerated transient detection pipeline for radio astronomy

Heimdall uses direct, tree, and sub-band dedispersion algorithms on massively parallel computing architectures (GPUs) to speed up real-time detection of radio pulsars and other transient events.

[submitted] Flash-X: A Performance Portable, Multiphysics Simulation Software Instrument

Flash-X simulates physical phenomena in several scientific domains, primarily those involving compressible or incompressible reactive flows, using Eulerian adaptive mesh and particle techniques. It derives some of its solvers from and is a descendant of FLASH (ascl:1010.082). Flash-X has a new framework that relies on abstractions and asynchronous communications for performance portability across a range of heterogeneous hardware platforms, including exascale machines. It also includes new physics capabilities, such as the Spark general relativistic magnetohydrodynamics (GRMHD) solver, and supports interoperation with the AMReX mesh framework, the HYPRE linear solver package, and the Thornado neutrino radiation hydrodynamics package, among others.

[ascl:2407.015] AstroCLIP: Multimodal contrastive pretraining for astronomical data

AstroCLIP performs contrastive pre-training between two different kinds of astronomical data modalities (multi-band imaging and optical spectra) to yield a meaningful embedding space which captures physical information about galaxies and is shared between both modalities. The embeddings can be used as the basis for competitive zero- and few-shot learning on a variety of downstream tasks, including similarity search, redshift estimation, galaxy property prediction, and morphology classification.

[ascl:2407.014] PFFT: Parallel fast Fourier transforms

PFFT computes massively parallel, fast Fourier transformations on distributed memory architectures. PFFT can be understood as a generalization of FFTW-MPI (ascl:1201.015) to multidimensional data decomposition; in fact, using PFFT is very similar to FFTW. The library is written in C and MPI; a Fortran interface is also available.

[ascl:2407.013] cola_halo: Parallel cosmological N-body simulator

cola_halo generates hundreds of realizations on the fly. This parallel cosmological N-body simulation code generates random Gaussian initial conditions using 2LPTIC (ascl:1201.005), time-evolves N-body particles with colacode (ascl:1602.021), and finds dark-matter halos with the Friends-of-Friends code (ascl:2407.012).

[ascl:2407.012] Fof: Friends-of-friends code to find groups

Fof uses the friends-of-friends method to find groups. A particle belongs to a friends-of-friends group if it is within some linking length of any other particle in the group. After all such groups are found, those with fewer than a specified minimum number of group members are rejected. The program takes input files in the TIPSY (ascl:1111.015) binary format and produces a single ASCII output file called fof.grp. This output file is in the TIPSY array format and contains the group number to which each particle belongs. A group number of zero means that the particle does not belong to a group. The fof.grp file can be read in by TIPSY and used to color each particle by group number to visualize the groups. Simulations with periodic boundary conditions can also be handled by fof by specifying the period in each dimension on the command line.
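
The linking-length grouping described above can be sketched in a few lines of Python. This is an illustration of the friends-of-friends method itself, using scipy's k-d tree and a simple union-find structure, not Fof's TIPSY-based input and output:

# Minimal friends-of-friends grouping: particles closer than a linking length are
# linked, and connected sets of links form groups; small groups are discarded.
import numpy as np
from scipy.spatial import cKDTree

def friends_of_friends(pos, linking_length, min_members=8):
    tree = cKDTree(pos)                    # pass boxsize= for periodic boundaries
    parent = np.arange(len(pos))           # union-find forest

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    for i, j in tree.query_pairs(linking_length):
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[rj] = ri

    roots = np.array([find(i) for i in range(len(pos))])
    _, group = np.unique(roots, return_inverse=True)
    counts = np.bincount(group)
    # group number 0 means "not in a group", mirroring the fof.grp convention
    return np.where(counts[group] >= min_members, group + 1, 0)

pos = np.random.default_rng(1).random((5000, 3))    # toy particle positions
groups = friends_of_friends(pos, linking_length=0.02)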

[ascl:2407.011] bigfile: A reproducible massively parallel IO library for hierarchical data

bigfile stores data from cosmology simulations from HPC systems and beyond. It provides a hierarchical structure of data columns via File, Dataset and Column. A Column stores a two dimensional table. Numerical typed columns are supported; attributes can be attached to a Column and both numerical attributes and string attributes are supported. Type casting is performed on-the-fly if read/write operations request a different data type than the file has stored.

[ascl:2407.010] UFalcon: Ultra Fast Lightcone

UFalcon rapidly post-processes N-body code output into signal maps for many different cosmological probes. The package is able to produce maps of weak-lensing convergence, linear-bias galaxy over-density, cosmic microwave background (CMB) lensing convergence and the integrated Sachs-Wolfe temperature perturbation given a set of N-body lightcones. It offers high flexibility for lightcone construction, such as user-specific survey-redshift ranges, redshift distributions and single-source redshifts. UFalcon also computes the galaxy intrinsic alignment signal, which can be treated as an additive component to the cosmological signal.

[ascl:2407.009] ATM: Asteroid Thermal Modeling

ATM (Asteroid Thermal Modeling) models asteroid flux measurements to estimate an asteroid's size, surface temperature distribution, and emissivity, and creates model spectral energy distributions for the different thermal models. After downloading lookup tables for relevant models, it can also fit observations of asteroids.

[ascl:2407.008] RealSim: Statistical observational realism for synthetic images from galaxy simulations

RealSim generates survey-realistic synthetic images of galaxies from hydrodynamical simulations of galaxy formation and evolution. The main functionality of this code inserts "idealized" simulated galaxies into Sloan Digital Sky Survey (SDSS) images in such a way that the statistics of sky brightness, resolution, and crowding are matched between simulated galaxies and observed galaxies in the SDSS. The suite accepts idealized synthetic images in calibrated AB surface brightnesses and rebins them to the desired redshift and CCD angular scale; RealSim can add Poisson noise, if desired, by adopting generic values of photometric calibrations in survey fields. Images produced by the suite can be inserted into real image fields to incorporate real skies, PSF degradation, and contamination by neighboring sources in the field of view. The RealSim methodology can be applied to any existing galaxy imaging survey.

[ascl:2407.007] GRDzhadzha: Evolve matter on curved spacetimes

GRDzhadzha evolves matter on curved spacetimes with an analytic time and space dependence. Written in C++14, it uses hybrid MPI/OpenMP parallelism to achieve good performance. The code is based on publicly available 3+1D numerical relativity code GRChombo (ascl:2306.039) and inherits all of the capabilities of the main GRChombo code, which uses the Chombo library for adaptive mesh refinement.

[ascl:2407.006] provabgs: SED modeling tools for PROVABGS

provabgs infers full posterior distributions of galaxy properties for galaxies in the DESI Bright Galaxy Survey using state-of-the-art Bayesian spectral energy distribution (SED) modeling of DESI spectroscopy and photometry. provabgs includes a state-of-the-art stellar population synthesis (SPS) model based on a non-parametric prescription for the star formation history, a metallicity history that varies over the age of the galaxy, and a flexible dust prescription. It has a neural network emulator for the SPS model that enables accelerated inference. Full posteriors of the 12 SPS parameters can be derived in ~10 minutes. The emulator is currently designed for galaxies from 0 < z < 0.6. provabgs also includes a Bayesian inference pipeline that is based on zeus (ascl:2008.010).

[ascl:2407.005] BaCoN: BAyesian COsmological Network

BaCoN (BAyesian COsmological Network) trains and tests Bayesian Convolutional Neural Networks in order to classify dark matter power spectra as being representative of different cosmologies, as well as to compute the classification confidence. It supports the following theories: LCDM, wCDM, f(R), DGP, and a randomly generated class. Additional cosmologies can be easily added.

[ascl:2407.004] Forklens: Deep learning weak lensing shear

Forklens measures weak gravitational lensing signals using a deep-learning method. It measures galaxy shapes (shear) and corrects for the smearing of the point spread function (PSF, an effect of the atmosphere, the optical instrument, or both). It contains a custom CNN architecture with two input branches, fed with the observed galaxy image and PSF image, which predicts several features of the galaxy, including shape, magnitude, and size. Simulation in the code is built directly upon GalSim (ascl:1402.009).

[ascl:2407.003] pycosie: Python analysis code used on Technicolor Dawn

pycosie is analysis code used for Technicolor Dawn (TD), a Gadget-3 derived cosmological radiative SPH simulation suite. Its analyses complement those done with TD and other analysis software in its suite. pycosie creates power spectra from generated Lyman-alpha forest spectra, links absorbers to potential host galaxies, grids gas information for each galaxy, and reads specific output files from software such as Rockstar (ascl:1210.008) and SKID (ascl:1102.020).

[ascl:2407.002] pyFAT: Python Fully Automated TiRiFiC

Python Fully Automated TiRiFiC (pyFAT) wraps around the tilted ring fitting code (TiRiFiC, ascl:1208.008) to fully automate the process of fitting simple tilted ring models to line emission cubes. pyFAT is the successor to the IDL/GDL FAT (ascl:1507.011) code and offers improved handling and fitting as well as several new features. PyFAT fits simple rotationally symmetric discs with asymmetric warps and surface brightness distributions, providing a base model that can be used in TiRiFiC to explore large scale motions. pyFAT delivers much more control over the fitting procedure, which is made possible by the new modular setup and the use of omegaconf for the input and default settings.

[ascl:2407.001] MAKEE: MAuna Kea Echelle Extraction

MAKEE (MAuna Kea Echelle Extraction) reduces data from the HIRES and ESI instruments at Keck Observatory. It is optimized for the spectral extraction of single, unresolved point sources and is designed to run non-interactively using a set of default parameters. Taking the raw HIRES FITS files as input, the code determines the position (or trace) of each echelle order, defines the object and background extraction boundaries, optimally extracts a spectrum for each order, and computes wavelength calibrations. MAKEE produces FITS format "spectral images" (each row is a separate echelle order spectrum) and the data values are in arbitrary (relative) flux units. MAKEE will reduce data from all HIRES formats, including the single CCD format, the single CCD with Red and UV cross dispersers, and the current 3 CCD system. It can handle a variety of pixel binnings, including 1x1, 1x2, 1x4 (column x row).

[ascl:2406.029] WinNet: Flexible, multi-purpose, single-zone nuclear reaction network

WinNet, a single-zone nuclear reaction network, calculates many different nucleosynthesis processes, including the r-process, νp-process, and explosive nucleosynthesis, among many others. It reads in a user-defined file with runtime parameters, then chooses the evolution mode, which is dependent on temperature. The temperature, density, and neutrino quantities are updated, after which the reaction network equations are solved numerically. If convergence is not achieved, the step size is halved and the iteration is repeated. Once convergence is reached, the output is generated and the time is evolved; the final output, such as the final abundances and mass fractions, is written.

[submitted] Exovetter

Exovetter is an open-source, pip-installable Python package which calculates metrics on high cadence time series photometry to distinguish between exoplanet transit signals and false positives. The package standardizes the implementation of metrics developed for the TESS, Kepler, and K2 missions such as Odd-Even, Multiple Event Statistic, and Centroid Offset (see “Planetary Candidates Observed by Kepler. VIII.”, Thompson et al. 2018). Metrics can be run individually or together as part of a pipeline. Exovetter also includes several visualizations to further evaluate the transits and metrics.

[ascl:2406.030] AutoPhOT: Rapid publication-quality photometry of transients

AutoPhOT (AUTOmated Photometry Of Transients) produces publication-quality photometry of transients quickly. Written in Python 3, this automated pipeline's capabilities include aperture and PSF-fitting photometry, template subtraction, and calculation of limiting magnitudes through artificial source injection. AutoPhOT is also capable of calibrating photometry against either survey catalogs (e.g., SDSS, PanSTARRS) or a custom set of local photometric standards.

[ascl:2406.028] Redback: Bayesian inference package for fitting electromagnetic transients

Redback provides end-to-end interpretation and parameter estimation of electromagnetic transients. Using data downloaded by the code or provided by the user, the code processes the data into a homogeneous transient object. Redback implements several different types of electromagnetic transients models, ranging from simple analytical models to numerical surrogates, fits models implemented in the package or provided by the user, and plots lightcurves. The code can also be used as a tool to simulate realistic populations without having to fit anything, as models are implemented as functions and can be used to simulate populations. Redback uses Bilby (ascl:1901.011) for sampling and can easily switch samplers and likelihoods.

[ascl:2406.027] phi-GPU: Parallel Hermite Integration on GPU

The phi-GPU (Parallel Hermite Integration on GPU) high-order N-body parallel dynamic code uses the fourth-order Hermite integration scheme with hierarchical individual block time-steps and incorporates external gravity. The software runs directly on GPUs, using NVIDIA hardware and CUDA code. It creates numerical simulations and can be used to study galaxy and star cluster evolution.

[ascl:2406.026] Faceted-HyperSARA: Parallel faceted imaging in radio interferometry

Faceted-HyperSARA images radio-interferometric wideband intensity data. Written in MATLAB, the library offers a collection of utility functions and scripts covering the workflow from data extraction from a radio-interferometric measurement set (MS Table) to the reconstruction of a wideband intensity image over the field of view and frequency range of interest. The code achieves high-precision imaging from large data volumes and supports data dimensionality reduction via visibility gridding and estimation of the effective noise level when reliable noise estimates are not available. Faceted-HyperSARA also corrects the w-term via w-projection and incorporates available compact Fourier models of the direction-dependent effects (DDEs) in the measurement operator.

[ascl:2406.025] PowerSpecCovFFT: FFTLog-based computation of non-Gaussian analytic covariance of galaxy power spectrum multipoles

PowerSpecCovFFT computes the non-Gaussian (regular trispectrum and its shot noise) part of the analytic covariance matrix of the redshift-space galaxy power spectrum multipoles using an FFTLog-based method. The galaxy trispectrum is based on a tree-level standard perturbation theory but with a slightly different galaxy bias expansion. The code computes the non-Gaussian covariance of the power spectrum monopole, quadrupole, hexadecapole, and their cross-covariance up to kmax ~ 0.4 h/Mpc.

[ascl:2406.024] GRINN: Gravity Informed Neural Network for studying hydrodynamical systems

GRINN (Gravity Informed Neural Network) solves the coupled set of time-dependent partial differential equations describing the evolution of self-gravitating flows in one, two, and three spatial dimensions. It is based on physics informed neural networks (PINNs), which are mesh-free and offer a fundamentally different approach to solving such partial differential equations. GRINN has solved for the evolution of self-gravitating, small-amplitude perturbations and long-wavelength perturbations and, when modeling 3D astrophysical flows, provides accuracy on par with finite difference (FD) codes with an improvement in computational speed.

[ascl:2406.023] AARD: Automatic detection of solar active regions

This Python code automatically detects solar active regions (AR). Based on morphological operations and region growing, it uses synoptic magnetograms from SOHO/MDI and SDO/HMI and calculates the parameters that characterize each AR, including the latitude and longitude of the flux-weighted centroid of the two polarities and of the whole AR, the area and flux of each polarity, and the initial and final dipole moments.

[ascl:2406.022] phazap: Low-latency identification of strongly lensed signals

Phazap post-processes gravitational-wave (GW) parameter estimation data to obtain the phases and polarization state of the signal at a given detector and frequency. It is used for low-latency identification of strongly lensed gravitational waves via their phase consistency by measuring their distance in the detector phase space. Phazap builds on top of the IGWN conda environment, which includes the standard GW packages LALSuite (ascl:2012.021) and bilby (ascl:1901.011), and can be applied beyond lensing to test possible deviations in the phase evolution from modified theories of gravity and constrain GW birefringence.

[ascl:2406.021] photochem: Chemical model of planetary atmospheres

Photochem models the photochemistry and climate of a planet's atmosphere. It takes inputs such as the stellar UV flux and atmospheric temperature structure to find the steady-state chemical composition of an atmosphere, or to evolve atmospheres through time. Photochem also contains 1-D climate models and a chemical equilibrium solver.

[ascl:2406.020] LeHaMoC: Leptonic-Hadronic Modeling Code for high-energy astrophysical sources

LeHaMoC simulates high-energy astrophysical sources. It simulates the behavior of relativistic pairs, protons interacting with magnetic fields, and photons in a spherical region. The package contains numerous physical processes, including synchrotron emission and self-absorption, inverse Compton scattering, photon-photon pair production, and adiabatic losses. It also includes proton-photon pion production, proton-photon (Bethe-Heitler) pair production, and proton-proton collisions. LeHaMoC can model expanding spherical sources with a variable magnetic field strength. In addition, three types of external radiation fields can be defined: grey body or black body, power-law, and tabulated.

[ascl:2406.019] MBE: Magnification bias estimation

Magnification bias estimation estimates magnification bias for a galaxy sample with a complex photometric selection for the example of SDSS BOSS. The code works for CMASS and the LOWZ, z1 and z3 samples. A template for applying the approach to other surveys is included; requirements include a galaxy catalog that provides magnitudes (used for photometric selection) and the exact conditions used for the photometric selection.

[ascl:2406.018] SuperLite: Spectral synthesis code for interacting transients

SuperLite produces synthetic spectra for astrophysical transient phenomena affected by circumstellar interaction. It uses Monte Carlo methods and multigroup structured opacity calculations for semi-implicit, semirelativistic radiation transport in high-velocity shocked outflows, and can reproduce spectra of typical Type Ia, Type IIP, and Type IIn supernovae. SuperLite also generates high-quality spectra that can be compared with observations of transient events, including superluminous supernovae, pulsational pair-instability supernovae, and other peculiar transients.

[ascl:2406.017] ytree: yt-based merger-tree code

ytree reads and works with merger tree data from multiple formats. An extension of yt (ascl:1011.022), which can analyze snapshots from cosmological simulations, ytree can be thought of as the yt of merger trees. ytree's online documentation lists supported formats; support for additional formats can be added since, in principle, any type of tree-like data in which an object has one or more ancestors and a single descendant can be supported.
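
A minimal usage sketch, following the pattern in ytree's documentation; the file path is a placeholder and the available field names depend on the merger-tree format being loaded:

import ytree

arbor = ytree.load("tree_0_0_0.dat")        # placeholder path to a supported merger-tree file
tree = arbor[0]                             # first tree in the arbor
print(tree["mass"])                         # root halo mass
for node in tree["prog"]:                   # walk the main progenitor line
    print(node["redshift"], node["mass"])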

[ascl:2406.016] BiaPy: Bioimage analysis pipeline builder

BiaPy provides deep-learning workflows for a large variety of image analysis tasks, including 2D and 3D semantic segmentation, instance segmentation, object detection, image denoising, single image super-resolution, self-supervised learning and image classification. Though developed specifically for bioimages, it can be used for watershed-based instance segmentation for friends-of-friends proto-haloes.

[ascl:2406.015] FLORAH: Galaxy merger tree generator with machine learning

FLORAH generates the assembly history of halos using a recurrent neural network and normalizing flow model. The machine-learning framework can be used to combine multiple generated networks that are trained on a suite of simulations with different redshift ranges and mass resolutions. Depending on the training, the code recovers key properties, including the time evolution of mass and concentration, and galaxy stellar mass versus halo mass relation and its residuals. FLORAH also reproduces the dependence of clustering on properties other than mass, and is a step towards a machine learning-based framework for planting full merger trees.

[ascl:2406.014] EVA: Excess Variability-based Age

EVA (Excess Variability-based Age) computes the VarX values and VarX90 ages for a given list of stars. The package retrieves information from Gaia, performs basic var90 calculations, then calculates the age of the group in a given band or overall (by combining all three bands). EVA then analyzes and plots the results.

[ascl:2406.013] AAD: ALeRCE Anomaly Detector

The ALeRCE anomaly detector cross-validates six anomaly detection algorithms for three classes (transient, periodic, and stochastic) of anomalous sources within the Zwicky Transient Facility (ZTF) data stream using the ALeRCE light curve features. A machine and deep learning-based framework is used for anomaly detection. For each class, a distinct anomaly detection model is constructed using only information about the known objects (i.e., inliers) for training. An anomaly score is computed from the probabilities that the light curve corresponds to a transient, stochastic, or periodic nature.

[ascl:2406.012] QMC: Quadratic Monte Carlo

Quadratic Monte Carlo generates ensembles of models and confines fitness landscapes without relying on linear stretch moves; it works very efficiently for the ring potential and the Rosenbrock density. The method is general and can be implemented in any existing MC software, requiring only a few lines of code.

[ascl:2406.011] CTC: Color transformations calculator

Color transformations calculator determines the magnitude of a galaxy in a needed photometric band, given its color and magnitude in the original band. It supports various optical and near-infrared surveys, including SDSS, DECaLS, DELVE, UKIDSS, VHS, and VIKING, and provides conversions for both total and aperture magnitudes with apertures of 1.5", 2", or 3" diameters. The source code, useful for performing bulk calculations, is available in Python and IDL; the calculator is also offered as a web service.
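
Schematically, such a transformation amounts to adding a color-dependent correction to the original magnitude. The sketch below uses made-up polynomial coefficients purely for illustration; the calculator's actual relations and coefficients are survey-specific:

# Schematic color transformation with hypothetical coefficients:
# m_target = m_orig + a0 + a1*color + a2*color**2
import numpy as np

def transform_magnitude(mag_orig, color, coeffs=(0.02, -0.12, 0.005)):
    a0, a1, a2 = coeffs                      # illustrative values only
    return mag_orig + a0 + a1 * color + a2 * color**2

# e.g., estimate another survey's r-band magnitude from SDSS r and g-r color
r_sdss = np.array([17.3, 18.1, 19.4])
g_minus_r = np.array([0.45, 0.62, 0.80])
r_target = transform_magnitude(r_sdss, g_minus_r)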

[ascl:2406.010] PRyMordial: Precise computations of BBN within and beyond the Standard Model

PRyMordial offers fast and precise evaluation of both the Big Bang Nucleosynthesis (BBN) light-element abundances and the effective number of relativistic degrees of freedom. It can be used within and beyond the Standard Model. The package calculates Neff and helium-4, deuterium, helium-3 and lithium-7 abundances. PRyMordial corrects for QED plasma effects, neutron lifetime, and incomplete neutrino decoupling, and includes an optional module that re-elaborates all the ODE systems of the code in Julia.

[ascl:2406.009] CBiRd: Bias tracers In Redshift space

CBiRd (Code for Bias tracers In Redshift space) provides correlators in the Effective Field Theory of Large-Scale Structure (EFTofLSS) in a ready-to-use pipeline for cosmological analysis of galaxy-redshift surveys data. It provides a core calculation package (C++BiRd), a Python implementation of a Taylor expansion of the power spectrum around a reference cosmology for efficient evaluation (TBiRd), and libraries to correct for observational systematics. CBiRd also provides MCMC samplers (MCBiRd) for a power spectrum and bispectrum analysis of galaxy-redshift surveys data based on emcee (ascl:1303.002), and can provide an earlybird pass to explore the cosmos with LSS surveys.

[ascl:2406.008] sphereint: Integrate data on a grid within a sphere

sphereint calculates the numerical volume in a sphere. It provides a weight for each grid position based on whether or not it is in (weight = 1), out (weight = 0), or partially in (weight in between 0 and 1) a sphere of a given radius. A cubic cell is placed around each grid position and the volume of the cell in the sphere (assuming a flat surface in the cell) is calculated and normalized by the cell volume to obtain the weight.
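
A simplified Python illustration of this weighting scheme is given below; it estimates each cell's in-sphere fraction by subsampling the cell rather than by the flat-surface approximation the package uses, but it produces the same kind of 0-to-1 weights:

# Per-cell sphere weights by subcell sampling: 1 inside, 0 outside, fractional on the boundary.
import numpy as np

def cell_weights(grid_positions, radius, cell_size, nsub=4):
    offsets = (np.arange(nsub) + 0.5) / nsub - 0.5           # subcell centers in [-0.5, 0.5)
    ox, oy, oz = np.meshgrid(offsets, offsets, offsets, indexing="ij")
    sub = np.stack([ox, oy, oz], axis=-1).reshape(-1, 3) * cell_size
    weights = np.empty(len(grid_positions))
    for k, center in enumerate(grid_positions):
        r = np.linalg.norm(center + sub, axis=1)
        weights[k] = np.mean(r <= radius)                     # fraction of the cell inside the sphere
    return weights

# toy grid: unit cells centered on integer coordinates
x = np.arange(-10, 11)
grid = np.array(np.meshgrid(x, x, x, indexing="ij")).reshape(3, -1).T
w = cell_weights(grid, radius=8.0, cell_size=1.0)
volume = w.sum() * 1.0**3                                     # ~ (4/3) * pi * 8**3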

[ascl:2406.007] CARDiAC: Anisotropic Redshift Distributions in Angular Clustering

CARDiAC (Code for Anisotropic Redshift Distributions in Angular Clustering) computes the impact of anisotropic redshift distributions on a wide class of angular clustering observables. It supports auto- and cross-correlations of galaxy samples and cosmic shear maps, including galaxy-galaxy lensing. The anisotropy can be present in the mean redshift and/or width of Gaussian distributions, as well as in the fraction of galaxies in each component of multi-modal distributions. Templates of these variations can be provided by the user or simulated internally within the code.

[ascl:2406.006] anzu: Measurements and emulation of Lagrangian bias models for clustering and lensing cross-correlations

The anzu package offers two independent codes for hybrid Lagrangian bias models in large-scale structure. The first code measures the hybrid "basis functions"; the second takes measurements of these basis functions and constructs an emulator to obtain predictions from them at any cosmology (within the bounds of the training set). anzu is self-contained; given a set of N-body simulations used to build emulators, it measures the basis functions. Alternatively, given measurements of the basis functions, anzu should in principle be useful for constructing a custom emulator.

[ascl:2406.005] Lenser: Measure weak gravitational flexion

Lenser estimates weak gravitational lensing signals, particularly flexion, from real survey data or realistically simulated images. Lenser employs a hybrid of image moment analysis and an Analytic Image Modeling (AIM) analysis. In addition to extracting flexion measurements by fitting a (modified Sérsic) model to a single image of a galaxy, Lenser can do multi-band, multi-epoch fitting. In multi-band mode, Lenser fits a single model to multiple postage stamps, each representing an exposure of a single galaxy in a particular band.

[ascl:2406.004] candl: Differentiable likelihood framework for analyzing CMB power spectrum measurements

candl (CMB Analysis With A Differentiable Likelihood) analyzes CMB power spectrum measurements using a differentiable likelihood framework. It is compatible with JAX (ascl:2111.002), though JAX is optional, allowing for fast and easy computation of gradients and Hessians of the likelihoods, and candl provides interface tools for working with other cosmology software packages, including Cobaya (ascl:1910.019) and MontePython (ascl:1805.027). The package also provides auxiliary tools for common analysis tasks, such as generating mock data, and supports the analysis of primary CMB and lensing power spectrum data.
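
The payoff of a differentiable likelihood can be illustrated with a toy example; this is generic JAX usage on a made-up band-power model, not candl's interface. Once the likelihood is written in JAX, gradients and Hessians, and hence Fisher-type error estimates, follow from one-line transformations:

# Toy Gaussian band-power likelihood written in JAX; grad and Hessian come for free.
import jax
import jax.numpy as jnp

def log_like(params, ell, data_dl, cov_inv):
    """Hypothetical model D_ell = A * (ell / 1000)**n with a Gaussian likelihood."""
    amp, tilt = params
    model = amp * (ell / 1000.0) ** tilt
    resid = data_dl - model
    return -0.5 * resid @ cov_inv @ resid

ell = jnp.arange(50.0, 2500.0, 50.0)
truth = jnp.array([3000.0, -1.0])
data_dl = truth[0] * (ell / 1000.0) ** truth[1]
cov_inv = jnp.eye(ell.size) / 100.0**2

grad = jax.grad(log_like)(truth, ell, data_dl, cov_inv)      # zero at the truth
hess = jax.hessian(log_like)(truth, ell, data_dl, cov_inv)   # ~ negative Fisher matrix
fisher = -hess                                                # parameter covariance ~ inv(fisher)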

[ascl:2406.003] SMART: Spectral energy distribution (SED) fitter

SMART (Spectral energy distributions Markov chain Analysis with Radiative Transfer models) implements a Bayesian Markov chain Monte Carlo (MCMC) method to fit the ultraviolet to millimeter spectral energy distributions (SEDs) of galaxies exclusively with radiative transfer models. The models constitute four types of pre-computed libraries, which describe the starburst, active galactic nucleus (AGN) torus, host galaxy and polar dust components.

[ascl:2406.002] SRF: Scaling Relations Finder

Scaling Relations Finder finds the scaling relations between magnetic field properties and observables for a model of galactic magnetic fields. It uses observable quantities as input: the galaxy rotation curve, the surface densities of the gas, stars and star formation rate, and the gas temperature to create galactic dynamo models. These models can be used to estimate parameters of the random and mean components of the magnetic field, as well as the gas scale height, root-mean-square velocity and the correlation length and time of the interstellar turbulence, in terms of the observables.

[ascl:2406.001] GAStimator: Python MCMC gibbs-sampler with adaptive stepping

GAStimator implements a Python MCMC Gibbs-sampler with adaptive stepping. The code is simple, robust, stable, and well suited to high-dimensional problems with many degrees of freedom and very sharp likelihood features. It has been used extensively for kinematic modeling of molecular gas in galaxies, but is fully general and may be used for any problem MCMC methods can tackle.

[ascl:2405.025] CosmoPower: Machine learning-accelerated Bayesian inference

CosmoPower develops Bayesian inference pipelines that leverage machine learning to solve inverse problems in science. While the emphasis is on building algorithms to accelerate Bayesian inference in cosmology, the implemented methods allow for their application across a wide range of scientific fields. CosmoPower provides neural network emulators of matter and Cosmic Microwave Background power spectra, which can replace Boltzmann codes such as CAMB (ascl:1102.026) or CLASS (ascl:1106.020) in cosmological inference pipelines, to source the power spectra needed for two-point statistics analyses. This provides orders-of-magnitude acceleration to the inference pipeline and integrates naturally with efficient techniques for sampling very high-dimensional parameter spaces.

[ascl:2405.024] ndcube: Multi-dimensional contiguous and non-contiguous coordinate-aware arrays

ndcube manipulates, inspects, and visualizes multi-dimensional contiguous and non-contiguous coordinate-aware data arrays. A sunpy (ascl:1401.010) affiliated package, it combines data, uncertainties, units, metadata, masking, and coordinate transformations into classes with unified slicing and generic coordinate transformations and plotting and animation capabilities. ndcube handles data of any number of dimensions and axis types (e.g., spatial, temporal, and spectral) whose relationship between the array elements and the real world can be described by World Coordinate System (WCS) translations.

[ascl:2405.023] raccoon: Radial velocities and Activity indicators from Cross-COrrelatiON with masks

raccoon implements the cross-correlation function (CCF) method. It builds weighted binary masks from a stellar spectrum template, computes the CCF of stellar spectra with a mask, and derives radial velocities (RVs) and activity indicators from the CCF. raccoon is mainly implemented in Python 3; it also uses some Fortran subroutines that are called from Python.
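
Conceptually, the mask CCF is a weighted sum of the spectrum evaluated at Doppler-shifted line positions for each trial radial velocity. The following sketch illustrates the idea with a toy spectrum and mask; it is not raccoon's implementation:

# Schematic CCF with a weighted binary mask: shift the mask lines for each trial RV
# and sum the spectrum at the shifted positions, weighted by the mask weights.
import numpy as np

C_KMS = 299792.458

def ccf(wave, flux, mask_centers, mask_weights, rv_grid):
    out = np.empty(rv_grid.size)
    for k, rv in enumerate(rv_grid):
        shifted = mask_centers * (1.0 + rv / C_KMS)
        out[k] = np.sum(mask_weights * np.interp(shifted, wave, flux))
    return out

# toy spectrum with two absorption lines and a matching mask
wave = np.linspace(5000.0, 5010.0, 4000)
flux = 1.0 - 0.5 * np.exp(-0.5 * ((wave - 5003.0) / 0.05) ** 2) \
           - 0.3 * np.exp(-0.5 * ((wave - 5007.0) / 0.05) ** 2)
mask_centers = np.array([5003.0, 5007.0])
mask_weights = np.array([0.5, 0.3])
rv_grid = np.linspace(-30.0, 30.0, 301)                 # km/s
profile = ccf(wave, flux, mask_centers, mask_weights, rv_grid)
rv_best = rv_grid[np.argmin(profile)]                   # minimum of the absorption-line CCF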

[ascl:2405.022] blackthorn: Spectra from right-handed neutrino decays

blackthorn generates spectra of dark matter annihilations into right-handed (RH) neutrinos or into particles that result from their decay. These spectra include photons, positrons, and neutrinos. The code provides support for varied RH-neutrino masses ranging from MeV to TeV by incorporating hazma, PPPC4DMID, and HDMSpectra models to compute dark matter annihilation cross sections and mediator decay widths. blackthorn also computes decay branching fractions and partial decay widths.

[ascl:2405.021] PALpy: Python positional astronomy library interface

PALpy provides a Python interface to PAL, the positional Astronomy Library (ascl:1606.002), which is written in C. All arguments modified by the C API are returned and none are modified. The one routine that is different is palObs, which returns a simple dict that can be searched using standard Python. The keys to the dict are the short names and the values are another dict with keys name, long, lat and height.

[ascl:2405.020] tapify: Multitaper spectrum for time-series analysis

tapify implements a suite of multitaper spectral estimation techniques for analyzing time series data. It supports analysis of both evenly and unevenly sampled time series data. The multitaper statistic tackles the problems of bias and consistency, which makes it an improvement over the classical periodogram for evenly sampled data and the Lomb-Scargle periodogram for uneven sampling. In basic statistical terms, this estimator provides a confident look at the properties of a time series in the frequency or Fourier domain.
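
For an evenly sampled series, the multitaper estimate amounts to averaging periodograms computed with orthogonal Slepian (DPSS) tapers. Below is a minimal sketch using scipy, illustrative only and not tapify's API:

# Minimal multitaper power spectrum for an evenly sampled series.
import numpy as np
from scipy.signal.windows import dpss

def multitaper_psd(x, dt, NW=4, K=7):
    n = x.size
    tapers = dpss(n, NW, Kmax=K)                 # K orthogonal tapers, shape (K, n)
    freqs = np.fft.rfftfreq(n, dt)
    spectra = np.abs(np.fft.rfft(tapers * x, axis=1)) ** 2
    return freqs, dt * spectra.mean(axis=0)      # average the K eigenspectra

rng = np.random.default_rng(2)
dt = 1.0
t = np.arange(2048) * dt
x = np.sin(2 * np.pi * 0.1 * t) + rng.normal(size=t.size)
freqs, psd = multitaper_psd(x, dt)               # peak near f = 0.1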

[ascl:2405.019] coronagraph: Python noise model for directly imaging exoplanets

coronagraph provides a Python noise model for directly imaging exoplanets with a coronagraph-equipped telescope. Based on the original IDL code for this coronagraph model, coronagraph_noise (ascl:2405.018), the Python version has been expanded in a few key ways. Most notably, the Telescope, Planet, and Star objects used for reflected light coronagraph noise modeling can now be used for transmission and emission spectroscopy noise modeling, making this model a general purpose exoplanet noise model for many different types of observations.

[ascl:2405.018] coronagraph_noise: Coronagraph noise modeling routines

coronagraph_noise simulates coronagraph noise. Written in IDL, the code includes a generalized coronagraph routine and simulators for the WFIRST Shaped Pupil Coronagraph in both spectroscopy and imaging modes. Functions available include stellar and planetary flux functions, planet photon and zodiacal light count rates, planet-star flux ratio, and clock-induced-charge count rate, among others. coronagraph_noise also includes routines to smooth a plot by convolving with a Gaussian profile, to convolve a spectrum with a given instrument resolution, and to degrade a spectrum specified at high spectral resolution to a lower resolution. A Python implementation of coronagraph_noise, coronagraph (ascl:2405.019), is also available.

[ascl:2405.017] AFINO: Automated Flare Inference of Oscillations

AFINO (Automated Flare Inference of Oscillations) finds oscillations in time series data using a Fourier-based model comparison approach. The code analyzes the data and generates a results file in either JSON or Pickle format, which contains numerous properties of the data and analysis, and a summary plot.

[ascl:2405.016] ABBHI: Autoregressive binary black hole inference

autoregressive-bbh-inference, written in Python, models the distributions of binary black hole masses, spins, and redshifts to identify physical features appearing in these distributions without the need for strongly-parametrized population models. This allows not only agnostic study of the “known unknowns” of the black hole population but also reveals the “unknown unknowns," the unexpected and impactful features that may otherwise be missed by the standard building-block method.

[ascl:2405.015] sunbather: Escaping exoplanet atmospheres and transit spectra simulator

sunbather simulates the upper atmospheres of exoplanets and their observational signatures. The code constructs 1D Parker wind profiles using p-winds (ascl:2111.011), simulates them with Cloudy (ascl:9910.001), and postprocesses the output with a custom radiative transfer module to predict the transmission spectra of exoplanets.

[ascl:2405.014] EF-TIGRE: Effective Field Theory of Interacting dark energy with Gravitational REdshift

EF-TIGRE (Effective Field Theory of Interacting dark energy with Gravitational REdshift) constrains interacting Dark Energy/Dark Matter models in the Effective Field Theory framework through Large Scale Structures observables. In particular, the observables include the effect of gravitational redshift, a distortion of time from galaxy clustering. This generates a dipole in the correlation function which is detectable with two distinct populations of galaxies, thus making it possible to break degeneracies among parameters of the EFT description.

[ascl:2405.013] LTdwarfIndices: Variable brown dwarf identifier

LTdwarfIndices studies spectral indices to determine whether one or more brown dwarfs are photometric variable candidates. For a single brown dwarf, it analyzes a given set of indices and outputs the number of graphs in which the object appears in the variable area, whether it is a variable or non-variable candidate, and, optionally, an index-index or histogram plot. Using another code module, LTdwarfIndices can also analyze a set of sample indices for many brown dwarfs.

[ascl:2405.012] fitramp: Likelihood-based jump detection

fitramp fits a ramp to a series of nondestructive reads and detects and rejects jumps. The software performs likelihood-based jump detection for detectors read out up-the-ramp; it uses the entire set of reads to compute likelihoods. The code compares the χ2 value of a fit with and without a jump for every possible jump location. fitramp can fit ramps with and without fitting the reset value (the pedestal), and fit and mask jumps within or between groups of reads. It can also compute the bias of ramp fitting.
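
The core idea of comparing the fit with and without a jump can be illustrated with ordinary least squares. This is a conceptual sketch only; fitramp itself uses the full read covariance and likelihood rather than this simplified chi-squared:

# Fit a linear ramp to the reads with and without a jump at each possible location
# and compare the resulting chi-squared values.
import numpy as np

def best_jump(reads, times, sigma=1.0):
    n = reads.size

    def chi2(design):
        coef, *_ = np.linalg.lstsq(design, reads, rcond=None)
        return np.sum((reads - design @ coef) ** 2) / sigma**2

    base = np.column_stack([np.ones(n), times])              # pedestal + slope
    chi2_nojump = chi2(base)
    chi2_jump = np.array([chi2(np.column_stack([base, times >= times[k]]))
                          for k in range(1, n)])              # step term at read k
    k_best = 1 + np.argmin(chi2_jump)
    return chi2_nojump, chi2_jump[k_best - 1], k_best          # compare the two chi2 values

times = np.arange(20.0)
reads = 5.0 * times + np.random.default_rng(3).normal(scale=2.0, size=20)
reads[12:] += 40.0                                             # simulated cosmic-ray jump at read 12
print(best_jump(reads, times, sigma=2.0))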

[ascl:2405.011] DirectSHT: Direct spherical harmonic transform

DirectSHT performs direct spherical harmonic transforms for point sets on the sphere. Given a set of points, defined by arrays of theta and phi (in radians) and weights, it provides the spherical harmonic transform coefficients alm. JAX (ascl:2111.002) can be used to speed up the computation; the code will automatically fall back to numpy if JAX is not present. The code is much faster when run on GPUs. When they are available and JAX is installed, the code automatically distributes computation and memory across them.
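
The underlying operation is the brute-force sum a_lm = Σ_i w_i Y*_lm(θ_i, φ_i) over the weighted points. Below is a small, unoptimized illustration with scipy, not DirectSHT's accelerated implementation:

# Direct (brute-force) spherical harmonic transform of a weighted point set.
import numpy as np
from scipy.special import sph_harm

def direct_sht(theta, phi, weights, lmax):
    """theta: colatitude [rad], phi: longitude [rad]. Returns dict {(l, m): a_lm}."""
    alm = {}
    for l in range(lmax + 1):
        for m in range(-l, l + 1):
            # scipy's sph_harm takes (m, l, azimuth, colatitude)
            ylm = sph_harm(m, l, phi, theta)
            alm[(l, m)] = np.sum(weights * np.conj(ylm))
    return alm

rng = np.random.default_rng(4)
npts = 1000
theta = np.arccos(rng.uniform(-1.0, 1.0, npts))        # points uniform on the sphere
phi = rng.uniform(0.0, 2.0 * np.pi, npts)
weights = np.ones(npts) / npts
alm = direct_sht(theta, phi, weights, lmax=8)          # a_00 ~ 1 / sqrt(4*pi)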

[ascl:2405.010] riddler: Type Ia supernovae spectral time series fitter

riddler automates the fitting of type Ia supernova spectral time series. The code is composed of a series of neural networks trained to emulate radiative transfer simulations from TARDIS (ascl:1402.018). Emulated spectra are then fit to observations using nested sampling implemented in UltraNest (ascl:1611.001) to estimate the posterior distributions of model parameters and evidences.

[ascl:2405.009] morphen: Astronomical image analysis and processing functions

morphen performs image analysis, multi-Sersic image fitting and decomposition, and radio interferometric self-calibration, measuring basic image morphology and photometry. The code provides a state-of-the-art Python-based image fitting implementation based on the Sersic function. Geared, though not exclusively, toward radio astronomy, morphen's tools are written in pure Python but are also integrated with CASA (ascl:1107.013) in order to work with common casatasks as well as WSClean (ascl:1408.023).

[ascl:2405.008] i-SPin: Multicomponent Schrodinger-Poisson systems with self-interactions

i-SPin simulates 3-component Schrodinger systems with and without gravity and with and without self-interactions while obeying SO(3) symmetry. The code allows the user to input desired parameters, along with initial conditions for the Schrodinger fields. Its three function modules then perform the main (drift-kick-drift) steps of the algorithm, track the fractional changes in total mass and spin in the system, and then plot results. The default plots are mass and spin density projections along with total mass and spin fractional changes.

[ascl:2405.007] GauPro: R package for Gaussian process modeling

GauPro fits a Gaussian process regression model to a dataset. A Gaussian process (GP) is a commonly used model in computer simulation. It assumes that the distribution of any set of points is multivariate normal. A major benefit of GP models is that they provide uncertainty estimates along with their predictions.
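
To illustrate the idea (GauPro itself is an R package; the generic Python sketch below uses a squared-exponential kernel and is not its interface), conditioning the multivariate normal on the training points yields both a predictive mean and an uncertainty estimate:

# Generic Gaussian process regression with a squared-exponential kernel.
import numpy as np

def sq_exp_kernel(a, b, lengthscale=1.0, variance=1.0):
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_predict(x_train, y_train, x_test, noise=1e-3, **kw):
    K = sq_exp_kernel(x_train, x_train, **kw) + noise * np.eye(x_train.size)
    Ks = sq_exp_kernel(x_test, x_train, **kw)
    Kss = sq_exp_kernel(x_test, x_test, **kw)
    alpha = np.linalg.solve(K, y_train)
    mean = Ks @ alpha                                      # posterior mean
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)              # posterior covariance
    return mean, np.sqrt(np.clip(np.diag(cov), 0, None))   # mean and 1-sigma uncertainty

x_train = np.array([0.0, 1.0, 2.5, 4.0])
y_train = np.sin(x_train)
x_test = np.linspace(0.0, 4.0, 50)
mean, sigma = gp_predict(x_train, y_train, x_test, lengthscale=1.0)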

[ascl:2405.006] ICPertFLRW: Cactus Code thorn for initial conditions

ICPertFLRW, a Cactus code (ascl:1102.013) thorn, provides as initial conditions an FLRW metric perturbed with the comoving curvature perturbation Rc in the synchronous comoving gauge. Rc is defined as a sum of sinusoidals (20 in each x, y, and z direction) whose amplitude, wavelength, and phase shift are all parameters in param.ccl. While the metric and extrinsic curvature only have first order scalar perturbations, the energy density is computed exactly in full from the Hamiltonian constraint, hence vector and tensor perturbations are initially present at higher order. These are then passed to the CT_Dust thorn to be evolved.

[ascl:2405.005] pySPEDAS: Python-based Space Physics Environment Data Analysis Software

pySPEDAS (Python-based Space Physics Environment Data Analysis Software) supports multi-mission, multi-instrument retrieval, analysis, and visualization of heliophysics time series data. A Python implementation of SPEDAS (ascl:2405.001), it supports most of the capabilities of SPEDAS; it can load heliophysics data sets from more than 30 space-based and ground-based missions, provides coordinate transforms, interpolation routines, and unit conversions, and offers interactive access to numerous data sets. pySPEDAS also creates multi-mission, multi-instrument figures, includes field and wave analysis tools, and performs magnetic field modeling, among other functions.

[ascl:2405.004] pyADfit: Nested sampling approach to quasi-stellar object (QSO) accretion disc fitting

pyADfit models accretion discs around astrophysical objects. The code provides functions to calculate physical quantities related to accretion disks and perform parameter estimation using observational data. The accretion disc model is the alpha-disc model while the parameter estimation can be performed with Nessai (ascl:2405.002), Raynest (ascl:2405.003), or CPnest (ascl:2205.021).

[ascl:2405.003] raynest: Parallel nested sampling based on ray

raynest, written in Python, computes Bayesian evidences and probability distributions using parallel chains.

[ascl:2405.002] nessai: Nested sampling with artificial intelligence

nessai performs nested sampling for Bayesian Inference and incorporates normalizing flows. It is designed for applications where the Bayesian likelihood is computationally expensive. nessai uses PyTorch and also supports the use of bilby (ascl:1901.011).

[ascl:2405.001] SPEDAS: Space Physics Environment Data Analysis System

The SPEDAS (Space Physics Environment Data Analysis Software) framework supports multi-mission data ingestion, analysis and visualization for the Space Physics community. It standardizes the retrieval of data from distributed repositories, the scientific processing with a powerful set of legacy routines, the quick visualization with full output control and the graph creation for use in papers and presentations. SPEDAS includes a GUI for ease of use by novice users, works on multiple platforms, and though based on IDL, can be used with or without an IDL license. The framework supports plugin modules for multiple projects such as THEMIS, MMS, and WIND, and provides interfaces for software modules developed by the individual teams of those missions. A Python implementation of the framework, PySPEDAS (ascl:2405.005), is also available.

[submitted] Swiftest

Swiftest is a software package designed to model the long-term dynamics of systems of bodies in orbit around a dominant central body, such as a planetary system around a star or a satellite system around a planet. The main body of the program is written in Modern Fortran, taking advantage of the object-oriented capabilities included with Fortran 2003 and the parallel capabilities included with Fortran 2008 and Fortran 2018. Swiftest also includes a Python package that allows the user to quickly generate input, run simulations, and process output from the simulations. Swiftest uses a NetCDF output file format, which makes data analysis with the Swiftest Python package a streamlined and flexible process for the user. Building off a strong legacy, including its predecessors Swifter and Swift, Swiftest takes the next step in modeling the dynamics of planetary systems by improving the performance and ease of use of the software, and by introducing a new collisional fragmentation model. Currently, Swiftest includes the four main symplectic integrators included in its predecessors: WHM, RMVS, HELIO, and SyMBA. In addition, Swiftest also contains the Fraggle model for generating products of collisional fragmentation.

[submitted] PypeIt-NIRSPEC: A PypeIt Module for Reducing Keck/NIRSPEC High Resolution Spectra

We present a module built into the PypeIt Python package to reduce high resolution Y, J, H, K, and L band spectra from the W. M. Keck Observatory NIRSPEC spectrograph. This data reduction pipeline is capable of spectral extraction, wavelength calibration, and telluric correction of data taken before and after the 2018 detector upgrade, all in a single package. The procedure for reducing data is thoroughly documented in an expansive tutorial.

[submitted] BFast

A fast GPU-based bispectrum estimator implemented using JAX.

[ascl:2404.030] RhoPop: Small-planet populations identifier

RhoPop identifies compositionally distinct populations of small planets (R ≲ 2 R⊕). It employs mixture models in a hierarchical framework and the dynesty (ascl:1809.013) nested sampler for parameter and evidence estimates. RhoPop includes a density-mass grid of water-rich compositions from water mass fraction (WMF) 0-1.0 and a grid of volatile-free rocky compositions over a core mass fraction (CMF) range of 0.006-0.95. Both grids were calculated using the ExoPlex mass-radius-composition calculator (ascl:2404.029).

[ascl:2404.029] ExoPlex: Thermodynamically self-consistent mass-radius-composition calculator

ExoPlex is a thermodynamically self-consistent mass-radius-composition calculator. Users input a bulk molar composition and a mass or radius, and ExoPlex will calculate the resulting radius or mass. Additionally, it will produce the planet's core mass fraction, interior mineralogy and the pressure, adiabatic temperature, gravity and density profiles as a function of depth.

[ascl:2404.028] binary_precursor: Light curve model of supernova precursors powered by compact object companions

binary_precursor models light curves of supernova (SN) precursors powered by a pre-SN outburst accompanying accretion onto a compact object companion. Though it is only one of the possible models, it is useful for interpretations of (bright) SN precursors highly exceeding the Eddington limit of massive stars, which are observed in a fraction of SNe with dense circumstellar matter (CSM) around the progenitor. It offers a number of editable parameters, including compact object mass, progenitor mass, progenitor radii, and opacity. Initial CSM velocity can be normalized by the progenitor escape velocity (xi parameter), and the CSM mass, ionization temperature, and binary separation can also be specified.

[ascl:2404.027] s2fft: Differentiable and accelerated spherical transforms

S2FFT computes Fourier transforms on the sphere and rotation group using JAX (ascl:2111.002) or PyTorch. It leverages autodiff to provide differentiable transforms, which are also deployable on hardware accelerators (e.g., GPUs and TPUs). More specifically, S2FFT provides support for spin spherical harmonic and Wigner transforms (for both real and complex signals), with support for adjoint transformations where needed, and comes with different optimisations (precompute or not) that one may select depending on available resources and desired angular resolution L.

[ascl:2404.026] LEO-vetter: Automated vetting for TESS planet candidates

LEO-vetter automatically vets transit signals found in light curve data. Inspired by the Kepler Robovetter (ascl:2012.006), LEO-vetter computes vetting metrics to be compared to a series of pass-fail thresholds. If a signal passes all tests, it is considered a planet candidate (PC). If a signal fails at least one test, it may be either an astrophysical false positive (FP; e.g., eclipsing binary, nearby eclipsing signal) or false alarm (FA; e.g., systematic, stellar variability). Pass-fail thresholds can be changed to suit individual research purposes, and LEO-vetter produces vetting reports for manual inspection of signals. Flux-level vetting can be applied to any light curve dataset (such as Kepler, K2, and TESS), including light curves with mixes of cadences, while pixel-level vetting has been implemented for TESS.

[ascl:2404.025] stringgen: Scattering based cosmic string emulation

stringgen creates emulations of cosmic string maps with statistics similar to those of a single (or small ensemble) of reference simulations. It uses wavelet phase harmonics to calculate a compressed representation of these reference simulations, which may then be used to synthesize new realizations with accurate statistical properties, e.g., 2- and 3-point correlations, skewness, kurtosis, and Minkowski functionals.

[ascl:2404.024] pAGN: AGN disk model equations solver

Written in Python, pAGN solves AGN disk model equations. The code is highly customizable and, with the correct inputs, provides a fully evolved AGN disk model through parametric 1D curves for key disk parameters such as temperature and density. pAGN can be used to study migration torques in AGN disks, simulations of compact object formation inside gas disks, and comparisons with new, more complex models of AGN disks.

[ascl:2404.023] mhealpy: Object-oriented healpy wrapper with support for multi-resolution maps

mhealpy extends the functionalities of the HEALPix (ascl:1107.018) wrapper healpy (ascl:2008.022) to handle single and multi-resolution maps (a.k.a. multi-order coverage maps or MOC maps). In addition to creating and analyzing MOC maps, it supports arithmetic operations, adaptive grids, resampling of existing multi-resolution maps, and plotting, among other functions, and reads and writes to FITS, which enables sharing spatial information for multiwavelength and multimessenger analyses.

[ascl:2404.022] jetsimpy: Hydrodynamic model of gamma-ray burst jet and afterglow

jetsimpy creates hydrodynamic simulations of relativistic blastwaves with tabulated angular energy and Lorentz factor profiles and efficiently models gamma-ray burst afterglows. It supports ISM, wind, and mixed external density profiles, and computes synthetic afterglow light curves, apparent superluminal motion, sky maps, and Gaussian-equivalent image sizes. Additionally, users can add their own emissivity model by defining a lambda function in a C++ source file, allowing the package to be used for more complicated models such as synchrotron self-absorption.

[ascl:2404.021] cudisc: CUDA-accelerated 2D code for protoplanetary disc evolution simulations

cuDisc simulates the evolution of protoplanetary discs in both the radial and vertical dimensions, assuming axisymmetry. The code performs 2D dust advection-diffusion, dust coagulation/fragmentation, and radiative transfer. A 1D evolution model is also included, with the 2D gas structure calculated via vertical hydrostatic equilibrium. cuDisc requires an NVIDIA GPU.

[ascl:2404.020] NbodyIMRI: N-body solver for intermediate-mass ratio inspirals of black holes and dark matter spikes

NbodyIMRI uses N-body simulations to study Dark Matter-dressed intermediate-mass ratio inspirals (IMRI) and extreme mass ratio inspiral (EMRI) systems. The code calculates all BH-BH forces and BH-DM forces directly while neglecting DM-DM pairwise interactions. This allows the code to scale up to very large numbers of DM particles in order to study stochastic processes like dynamical friction.

[ascl:2404.019] PySSED: Python Stellar Spectral Energy Distributions

PySSED (Python Stellar Spectral Energy Distributions) downloads and extracts data from multi-wavelength catalogs of astronomical objects and regions of interest and automatically processes the photometry into one or more stellar SEDs. It then fits those SEDs with stellar parameters. PySSED can be run directly from the command line or as a module within a Python environment. The package offers a wide variety of plots, including Hertzsprung–Russell diagrams of analyzed objects, angular separation between sources in specific catalogs, and two-dimensional offsets between cross-matches.

[ascl:2404.018] GPUniverse: Quantum fields in finite dimensional Hilbert spaces modeler

GPUniverse models quantum fields in finite dimensional Hilbert spaces with Generalised Pauli Operators (GPOs) and overlapping degrees of freedom. In addition, the package can simulate sets of qubits that are only quasi-independent (i.e., the Pauli algebras of different qubits have small but non-zero anti-commutators), which is useful for validating analytical results for holographic versions of the Weyl field.

[ascl:2404.017] pyilc: Needlet ILC in Python

pyilc implements the needlet internal linear combination (NILC) algorithm for CMB component separation in pure Python; it also implements harmonic-space ILC. The code can also perform Cross-ILC, where the covariance matrices are computed only from independent splits of the maps. In addition, pyilc includes an inpainting code, diffusive_inpaint, that diffusively inpaints a masked region with the mean of the unmasked neighboring pixels.
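
As background on the ILC step itself: in any domain (pixels, harmonic coefficients, or needlet scales) the ILC weights minimize the output variance subject to unit response to the CMB, i.e. w is proportional to C^-1 a and normalized so that w.a = 1. The short NumPy sketch below illustrates that formula on toy multi-frequency maps; it is a generic illustration only and does not use pyilc's own interface.

import numpy as np

# Toy multi-frequency data: a common CMB signal plus independent noise per channel.
rng = np.random.default_rng(0)
n_freq, n_pix = 4, 100000
noise_levels = np.array([1.0, 0.8, 0.9, 1.2])
cmb = rng.normal(size=n_pix)
maps = cmb + noise_levels[:, None] * rng.normal(size=(n_freq, n_pix))

a = np.ones(n_freq)                # CMB spectral response (unity in thermodynamic units)
C = np.cov(maps)                   # empirical channel-channel covariance
Cinv_a = np.linalg.solve(C, a)
w = Cinv_a / (a @ Cinv_a)          # minimum-variance weights with w . a = 1

ilc_map = w @ maps                 # ILC-cleaned map
print(w, ilc_map.std())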

[ascl:2404.016] MLTPC: Machine Learning Telescope Pointing Correction

The Machine Learning Telescope Pointing Correction code trains and tests machine learning models for correcting telescope pointing. Using historical APEX data from 2022, including pointing corrections, weather conditions, the position and rotation of the secondary mirror, pointing offsets observed during pointing scans, and the position of the Sun, among other quantities, the code treats the data in two different ways to test which factors are most likely to account for pointing errors.

[ascl:2404.015] EBWeyl: Compute the electric and magnetic parts of the Weyl tensor

EBWeyl computes the electric and magnetic parts of the Weyl tensor, Eαβ and Bαβ, using a 3+1 slicing formulation. The module provides a Finite Differencing class with 4th (default) and 6th order backward, centered, and forward schemes. Periodic boundary conditions are used by default; otherwise, a combination of the three schemes is available. It also includes a Weyl class that computes, for a given metric, the variables of the 3+1 formalism, the spatial Christoffel symbols, the spatial Ricci tensor, the electric and magnetic parts of the Weyl tensor projected along the normal to the hypersurface and the fluid flow, the Weyl scalars, and invariant scalars. EBWeyl can also compute the determinant and inverse of a 3x3 or 4x4 matrix at every position of a data box.
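
For orientation, a 4th-order centered first-derivative stencil of the kind referred to above has the standard form f'(x) ~ [-f(x+2h) + 8f(x+h) - 8f(x-h) + f(x-2h)] / (12h). The short NumPy sketch below is a generic illustration of that stencil with periodic boundaries, not EBWeyl's own finite-differencing class.

import numpy as np

def d_dx_4th_centered(f, dx):
    """4th-order centered first derivative along axis 0, periodic boundaries."""
    return (-np.roll(f, -2, axis=0) + 8 * np.roll(f, -1, axis=0)
            - 8 * np.roll(f, 1, axis=0) + np.roll(f, 2, axis=0)) / (12.0 * dx)

# Quick check on f(x) = sin(x): the derivative should be cos(x); error scales as dx**4.
x = np.linspace(0, 2 * np.pi, 64, endpoint=False)
dx = x[1] - x[0]
err = np.max(np.abs(d_dx_4th_centered(np.sin(x), dx) - np.cos(x)))
print(f"max error: {err:.2e}")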

[ascl:2404.014] astroNN: Deep learning for astronomers with Tensorflow

astroNN creates neural networks for deep learning using Keras for model and training prototyping while taking advantage of Tensorflow's flexibility. It contains tools for use with APOGEE, Gaia, and LAMOST data, though it is primarily designed to apply neural networks to APOGEE spectral analysis and to predict luminosity from spectra using Gaia parallax data, with reasonable uncertainties from a Bayesian neural network. astroNN can handle 2D and 2D colored images, and the package contains custom loss functions and layers, compatible with Tensorflow or Keras with a Tensorflow backend, to deal with incomplete labels. The code contains demos for implementing a Bayesian neural network with dropout variational inference for reasonable uncertainty estimation, as well as other neural networks.

[ascl:2404.013] Meanoffset: Photometric image alignment with row and column means

Meanoffset performs astronomical image alignment. The code uses the means of the rows and columns of an original image for alignment and finds the optimal offset corresponding to the maximum similarity by comparing different offsets between images. The similarity is evaluated by the standard deviation of the quotient divided by the means. The code is fast and robust.
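
One plausible reading of this scheme, sketched below in generic NumPy (not the package's actual code), is to scan integer row offsets and keep the one that makes the ratio of the two row-mean profiles most uniform; the column offset would be found the same way using axis=0 means.

import numpy as np

def best_row_offset(img_a, img_b, max_shift=20):
    """Vertical offset of img_b relative to img_a chosen so that the ratio of the
    two row-mean profiles is most uniform (assumes positive, non-zero backgrounds)."""
    prof_a = img_a.mean(axis=1)             # mean of each row
    prof_b = img_b.mean(axis=1)
    best_offset, best_score = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        if s >= 0:
            a, b = prof_a[s:], prof_b[:prof_b.size - s]
        else:
            a, b = prof_a[:s], prof_b[-s:]
        ratio = a / b
        score = ratio.std() / ratio.mean()  # relative scatter of the quotient
        if score < best_score:
            best_offset, best_score = s, score
    return best_offset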

[ascl:2404.012] EffectiveHalos: Matter power spectrum and cluster counts covariance modeler

EffectiveHalos provides models of the real-space matter power spectrum, based on a combination of the Halo Model and Effective Field Theory, which are 1% accurate up to k = 1 h/Mpc, across a range of cosmologies, including those with massive neutrinos. It can additionally compute accurate halo count covariances (including a model of halo exclusion), both alone and in combination with the matter power spectrum.

[ascl:2404.011] BayeSN: NumPyro implementation of BayeSN

BayeSN performs hierarchical Bayesian SED modeling of Type Ia supernova light curves. This probabilistic optical-NIR SED model analyzes the population distribution of physical properties and provides cosmology-independent distance estimates for individual SNe. BayeSN is built with NumPyro and JAX (ascl:2111.002) and provides support for GPU acceleration.

[ascl:2404.010] Panphasia: Create cosmological and resimulation initial conditions

Panphasia computes a very large realization of a Gaussian white noise field. The field has a hierarchical structure based on an octree geometry with 50 octree levels fully populated. The code sets up Gaussian initial conditions for cosmological simulations and resimulations of structure formation. Panphasia provides an easy way to publish the linear phases used to set up cosmological simulation initial conditions; publishing phases enriches the literature and makes it easier to reproduce and extend published simulation work.

[ascl:2404.009] superABC: Cosmological constraints from SN light curves using Approximate Bayesian Computation

The superABC sampling method obtains cosmological constraints from supernova light curves using Approximate Bayesian Computation (ABC) without any likelihood assumptions. It provides an interface to two forward model simulations, SNCosmo (ascl:1611.017) and SNANA (ascl:1010.027), for supernova cosmology.
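
For context, the core of any ABC rejection scheme is simple: draw parameters from the prior, forward-simulate, and keep draws whose summary statistics land within a tolerance of the observed summaries. The sketch below is a generic toy version of that loop in NumPy, not superABC's interface to SNCosmo or SNANA.

import numpy as np

rng = np.random.default_rng(1)

# "Observed" data: 100 draws from a Gaussian with unknown mean mu_true.
mu_true = 0.3
data = rng.normal(mu_true, 1.0, size=100)
obs_summary = data.mean()                   # summary statistic of the data

def simulate(mu):
    """Forward model: simulate a dataset of the same size and summarize it."""
    return rng.normal(mu, 1.0, size=data.size).mean()

# ABC rejection: keep prior draws whose simulated summary is within epsilon of the data.
prior_draws = rng.uniform(-2, 2, size=20000)
epsilon = 0.05
accepted = np.array([mu for mu in prior_draws
                     if abs(simulate(mu) - obs_summary) < epsilon])
print(f"posterior mean ~ {accepted.mean():.2f} from {accepted.size} accepted draws")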

[ascl:2404.008] LensIt: CMB lensing delensing tools

LensIt enables CMB lensing and CMB delensing using the flat-sky approximation. The package can find the maximum posterior estimate of CMB lensing deflection maps from temperature and/or polarization maps and perform Wiener filtering of masked CMB data, allowing for inhomogeneous noise and including lensing deflections, using a multigrid preconditioner. It contains fast and accurate simulation libraries for lensed CMB skies, and standard quadratic estimator lensing reconstruction tools. LensIt also includes CMB internal delensing tools, including internal delensing bias calculation for temperature and/or polarization maps.

[ascl:2404.007] WignerFamilies: Compute families of wigner symbols with recurrence relations

WignerFamilies generates families of Wigner 3j and 6j symbols by recurrence relation. These exact methods are orders of magnitude more efficient than strategies such as prime factorization for problems which require every non-trivial symbol in a family, and are very useful for large quantum numbers. WignerFamilies is thread-safe and very fast, beating the standard Fortran routine DRC3JJ from SLATEC by a factor of 2-4.

[ascl:2404.006] PolyBin3D: Binned polyspectrum estimation for 3D large-scale structure

PolyBin3D estimates the binned power spectrum and bispectrum for 3D fields such as the distributions of matter and galaxies. For each statistic, two estimators are available: the standard (ideal) estimators, which do not take into account the mask, and window-deconvolved estimators. In the second case, the computation of a Fisher matrix is required; this depends on binning and the mask, but does not need to be recomputed for each new simulation. PolyBin3D supports GPU acceleration using JAX. It is a sister code to PolyBin (ascl:2307.020), which computes the polyspectra of data on the two-sphere, and is a modern reimplementation of the former Spectra-Without-Windows (ascl:2108.011) code.
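
For reference, the "ideal" estimator of the binned power spectrum of a periodic field is simply the shell-averaged FFT power. The generic NumPy sketch below illustrates that estimator; mask handling and window deconvolution, which PolyBin3D adds, are omitted, and the normalization convention is one common choice.

import numpy as np

def binned_power_spectrum(delta, box_size, n_bins=20):
    """Shell-averaged P(k) of a periodic density field delta on a cubic grid."""
    n = delta.shape[0]
    delta_k = np.fft.rfftn(delta) * (box_size / n) ** 3      # approximate continuum FT
    kf = 2 * np.pi / box_size                                # fundamental frequency
    kx = np.fft.fftfreq(n, d=1.0 / n) * kf
    kz = np.fft.rfftfreq(n, d=1.0 / n) * kf
    kmag = np.sqrt(kx[:, None, None]**2 + kx[None, :, None]**2 + kz[None, None, :]**2)
    power = np.abs(delta_k) ** 2 / box_size ** 3             # P(k) = |delta_k|^2 / V
    bins = np.linspace(kf, kmag.max(), n_bins + 1)
    which = np.digitize(kmag, bins)
    k_cen = 0.5 * (bins[1:] + bins[:-1])
    pk = np.array([power[which == i + 1].mean() if np.any(which == i + 1) else np.nan
                   for i in range(n_bins)])
    return k_cen, pk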

[ascl:2404.005] GalMOSS: GPU-accelerated galaxy surface brightness fitting via gradient descent

GalMOSS performs two-dimensional fitting of galaxy profiles. This Python-based, Torch-powered tool seamlessly enables GPU parallelization and meets the high computational demands of large-scale galaxy surveys. It incorporates widely used profiles such as the Sérsic, Exponential disk, Ferrer, King, Gaussian, and Moffat profiles, and allows for the easy integration of more complex models. Tested on over 8,000 galaxies from the Sloan Digital Sky Survey (SDSS) g-band with a single NVIDIA A100 GPU, GalMOSS completed classical Sérsic profile fitting in about 10 minutes. Benchmark tests show that GalMOSS achieves computational speeds that are significantly faster than those of default implementations.
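
To illustrate the general approach (gradient-descent fitting of a parametric surface-brightness profile with a GPU-capable tensor library), here is a small, self-contained PyTorch sketch fitting a 1D Sérsic profile. It is not GalMOSS's API, and it uses the common approximation b_n ~ 2n - 1/3.

import torch

def sersic(r, I_e, r_e, n):
    """Sersic profile I(r) = I_e * exp(-b_n * ((r / r_e)**(1/n) - 1)),
    with b_n ~ 2n - 1/3 (a common approximation, valid for n >~ 0.4)."""
    b_n = 2.0 * n - 1.0 / 3.0
    return I_e * torch.exp(-b_n * ((r / r_e) ** (1.0 / n) - 1.0))

# Synthetic 1D data drawn from a known profile plus noise.
r = torch.linspace(0.1, 10.0, 200)
data = sersic(r, 5.0, 2.0, 3.0) + 0.05 * torch.randn_like(r)

# Fit by gradient descent; log-parameters keep I_e, r_e, n positive.
log_params = torch.zeros(3, requires_grad=True)           # log of [I_e, r_e, n]
opt = torch.optim.Adam([log_params], lr=0.02)
for step in range(3000):
    opt.zero_grad()
    I_e, r_e, n = torch.exp(log_params)
    loss = torch.mean((sersic(r, I_e, r_e, n) - data) ** 2)
    loss.backward()
    opt.step()
print(torch.exp(log_params).detach())                     # should approach [5.0, 2.0, 3.0]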

[ascl:2404.004] TAT: Timing Analysis Toolkit for high-energy pulsar astrophysics

TAT-pulsar (Timing Analysis Toolkit for Pulsars) analyzes, processes, and visualizes pulsar data, thus handling the scientific intricacies of pulsar timing. By leveraging observational data from pulsars, along with the associated physical processes and statistical characteristics, the package integrates a suite of Python-based tools and data analysis scripts specifically developed for both isolated pulsars and binary systems. This enables swift analysis and the detailed presentation of timing properties in the high-energy pulsar field. Developed and implemented completely independently from other pulsar timing software such as Stingray (ascl:1608.001) and PINT (ascl:1902.007), TAT-pulsar serves as a valuable cross-checking and supplementary tool for data analysis.

[ascl:2404.003] KCWIKit: KCWI Post-Processing and Improvements

KCWIKit extends the official KCWI DRP (ascl:2301.019) with a variety of stacking tools and DRP improvements. The software offers masking and median filtering scripts to be used while running the KCWI DRP, and a step-by-step KCWI_DRP implementation for finer control over the reduction process. Once the DRP has finished, KCWIKit can be used to stack the output cubes via the Montage package. Various functions cross-correlate and mosaic the constituent cubes and the final stacked cubes are WCS corrected. Helper functions can then be used to deproject the stacked cube into lower-dimensional representations should the user desire.

[ascl:2404.002] PIPE: Extracting PSF photometry from CHEOPS data

PIPE (PSF Imagette Photometric Extraction) extracts PSF (point-spread function) photometry from data acquired by the space telescope CHEOPS (CHaracterisation of ExOPlanetS). Advantages of PSF photometry over standard aperture photometry include reduced sensitivity to contaminants such as background stars, cosmic ray hits, and hot/bad pixels. For CHEOPS, an additional advantage is that photometry can be extracted from an imagette, a small window around the target that is downlinked at a shorter cadence than the larger-sized subarray used for aperture photometry. These advantages make PIPE particularly well suited for targets brighter or fainter than the nominal G = 7-11 mag range of CHEOPS, i.e., where short-cadence imagettes are available (bright end) or when contamination becomes a problem (faint end). Within the nominal range, PIPE usually offers no advantage over the standard aperture photometry.

[ascl:2404.001] cbeam: Coupled-mode propagator for slowly-varying waveguides

cbeam models the propagation of guided light through slowly-varying few-mode waveguides using the coupled-mode theory (CMT). When compared with more general numerical methods for waveguide simulation, such as the finite-differences beam propagation method (FD-BPM), numerical implementations of the CMT can be much more computationally efficient. Written in Python and Julia, the package provides a Pythonic class structure to define waveguides, with simple classes for directional couplers and photonic lanterns already provided. cbeam also doubles as a finite-element eigenmode solver.

[submitted] obsplanning - a set of python utilities to aid in planning astronomical observations

Obsplanning is a suite of tools to help plan astronomical observations from ground-based observatories, for traditional single-site as well as multi-station (VLBI) observing. Conveniently determine observability of objects in the sky from your observatory, and produce plots to help you prepare for your observations over the course of a session. Celestial source coordinates (including solar system objects) can be queried or created, and transformed. Calibrator or reference sources can be selected by proximity, and slew order can be optimized to save valuable telescope time. Plots and visualizations can be easily made to chart source elevation and transits, source proximity to the Sun and Moon, concurrent 'up time' of sources at multiple sites (for VLBI or tandem observations), 'dark time' at a telescope site for a given year, finder plots made from real images (with options to query online databases), and more.

[ascl:2403.016] DensityFieldTools: Manipulating density fields and measuring power spectra and bispectra

The DensityFieldTools toolset manipulates density fields and measures power spectra and bispectra using a very simple interface. After loading a density field, it computes the power spectrum and the bispectrum for a desired binning. The bispectrum estimator also automatically computes the power spectrum for the chosen binning, to facilitate, for example, shot-noise subtraction. DensityFieldTools also provides a quick way to measure (cross-)power spectra directly from density fields.

[ascl:2403.015] CLASS-PT: Nonlinear perturbation theory extension of the Boltzmann code CLASS

CLASS-PT modifies the CLASS (ascl:1106.020) code to compute the non-linear power spectra of dark matter and biased tracers in one-loop cosmological perturbation theory, for both Gaussian and non-Gaussian initial conditions. CLASS-PT can be interfaced with the MCMC sampler MontePython (ascl:1805.027) using the custom-built likelihoods provided with the code.

[ascl:2403.014] OneLoopBispectrum: Computation of the one-loop bispectrum of galaxies in redshift space

OneLoopBispectrum computes the one-loop bispectrum of galaxies in redshift space. It computes and simplifies the bispectrum kernels using Mathematica; this is cosmology-independent. The code also computes the full and flattened bispectrum templates, given the pre-computed integration kernels. OneLoopBispectrum uses Mathematica to read in and combine the bispectrum templates, and Python to interpolate and extract the one-loop bispectrum.

[ascl:2403.013] URecon: Reconstruct initial conditions of N-Body simulations

URecon reconstructs the initial conditions of N-body simulations from late-time (e.g., z=0) density fields. This simple UNET architecture is implemented in TensorFlow and requires Pylians3 (ascl:2403.012) for measuring the power spectra of density fields. The package includes weights trained on Quijote fiducial-cosmology simulations.

[ascl:2403.012] Pylians3: Libraries to analyze numerical simulations in Python 3

Pylians3 (Python3 libraries for the analysis of numerical simulations) provides a Python 3 version of Pylians (ascl:1811.008), which analyzes numerical simulations (both N-body and hydrodynamic); parts of the codebase are also written in Cython and C. It computes density fields, power spectra, bispectra, and correlation functions, identifies voids, and populates halos with galaxies using an HOD. Pylians3 also applies HI+H2 corrections to the output of hydrodynamic simulations, makes 21cm maps, computes DLA column density distribution functions, and can plot density fields and make movies.

[ascl:2403.011] LtU-ILI: Robust machine learning in astro

LtU-ILI (Learning the Universe Implicit Likelihood Inference) performs machine learning parameter inference. Given labeled training data or a stochastic simulator, the LtU-ILI pipeline automatically trains state-of-the-art neural networks to learn the data-parameter relationship and produces robust, well-calibrated posterior inference. The package comes with a wide range of customizable complexity, including posterior-, likelihood-, and ratio-estimation methods for ILI with their sequential learning analogs, and various neural density estimators, including mixture density networks, conditional normalizing flows, and ResNet-like ratio classifiers. It offers fully customizable, exotic embedding networks, including CNNs and graph neural networks, and a unified interface for multiple ILI backends such as sbi, pydelfi, and lampe. LtU-ILI also handles multiple marginal and multivariate posterior coverage metrics, and offers Jupyter and command-line interfaces and a parallelizable configuration framework for efficient hyperparameter tuning and production runs.

[ascl:2403.010] FitCov: Fitted Covariance generation

FitCov estimates the covariance of two-point correlation functions in a way that requires fewer mocks than a standard mock-based covariance. Rather than using an analytically fixed correction to some terms that enter the jackknife covariance matrix, the code fits the correction to a mock-based covariance obtained from a small number of mocks. The fitted jackknife covariance remains unbiased, an improvement over other methods, performs well in terms of both precision (unbiased constraints) and accuracy (similar uncertainties), and requires significantly less computational power. In addition, FitCov can be easily implemented on top of the standard jackknife covariance computation.
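
For reference, the delete-one jackknife covariance that the fitted correction modifies has the standard form C = (N-1)/N * sum_i (xi_i - xi_mean)(xi_i - xi_mean)^T over the N jackknife realizations. Below is a generic NumPy sketch of that estimator; the fitted correction itself is specific to FitCov.

import numpy as np

def jackknife_covariance(xi_jk):
    """Delete-one jackknife covariance.

    xi_jk : array of shape (n_regions, n_bins), the correlation function
            measured with each jackknife region removed in turn.
    """
    n = xi_jk.shape[0]
    diff = xi_jk - xi_jk.mean(axis=0)
    # Jackknife prefactor (n - 1) / n instead of the usual 1 / (n - 1).
    return (n - 1) / n * diff.T @ diff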

[ascl:2403.009] pycorr: Two-point correlation function estimation

pycorr wraps two-point counter engines such as Corrfunc (ascl:1703.003) to estimate the correlation function. It supports theta (angular), s, s-mu, rp-pi binning schemes, analytical two-point counts with periodic boundary conditions, and inverse bitwise weights (in any integer format) and (angular) upweighting. It also provides MPI parallelization and jackknife estimate of the correlation function covariance matrix.
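
The Landy-Szalay estimator that such pair counts feed into is xi = (DD - 2DR + RR) / RR, with each count normalized by its number of possible pairs. A minimal generic NumPy version is sketched below; it is not pycorr's interface, which additionally handles weights, binning schemes, and MPI.

import numpy as np

def landy_szalay(dd, dr, rr, n_data, n_rand):
    """Landy-Szalay correlation function from raw pair counts per separation bin.

    dd, dr, rr : arrays of data-data, data-random, random-random pair counts.
    n_data, n_rand : numbers of data and random points (used to normalize counts).
    """
    dd_n = dd / (n_data * (n_data - 1) / 2.0)   # normalized DD
    dr_n = dr / (n_data * n_rand)               # normalized DR
    rr_n = rr / (n_rand * (n_rand - 1) / 2.0)   # normalized RR
    return (dd_n - 2.0 * dr_n + rr_n) / rr_n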

[ascl:2403.008] s4cmb: Systematics For Cosmic Microwave Background

s4cmb (Systematics For Cosmic Microwave Background) studies the impact of instrumental systematic effects on measurements by CMB experiments based on bolometric detector technology. s4cmb provides a unified framework to simulate raw data streams in the time domain (TODs) acquired by CMB experiments scanning the sky, and to inject realistic instrumental systematic effects into these simulations.

[ascl:2403.007] MINDS: Hybrid pipeline for the reduction of JWST/MIRI-MRS data

The MINDS hybrid pipeline is based on the JWST pipeline and routines from the VIP package (ascl:1603.003) for the reduction of JWST MIRI-MRS data. The pipeline compensates for some of the known weaknesses of the official JWST pipeline to improve the quality of the spectra extracted from MIRI-MRS data. This is done by leveraging the capabilities of VIP, another large data reduction package used in the field of high-contrast imaging.

The front end of the pipeline is a highly automated Jupyter notebook. Parameters are typically set in one cell at the beginning of the notebook, and the rest of the notebook can be run without any further modification. The Jupyter notebook format provides flexibility, enhanced visibility of intermediate and final results, more straightforward troubleshooting, and the possibility to easily incorporate additional codes by the user to further analyze or exploit their results.

[ascl:2403.006] fkpt: Compute LCDM and modified gravity perturbation theory using fk-kernels

fkpt computes the 1-loop redshift-space power spectrum for tracers in LCDM and modified gravity theories, using perturbation theory with "fk"-kernels. Though implemented for the Hu-Sawicki f(R) modified gravity model, it is straightforward to use for other models.

[ascl:2403.005] Poke: Polarization ray tracing and Gaussian beamlet module for Python

Poke (pronounced /poʊˈkeɪ/ or po-kay) uses commercial ray tracing APIs and open-source physical optics engines to simultaneously model scalar wavefront error, diffraction, and polarization to bridge the gap between ray trace models and diffraction models. It operates by storing ray data from a commercial ray tracing engine into a Python object, from which physical optics calculations can be made. Poke provides two propagation physics modules, Gaussian Beamlet Decomposition and Polarization Ray Tracing, that add to the utility of existing scalar diffraction models. Gaussian Beamlet Decomposition is a ray-based approach to diffraction modeling that integrates physical optics models with ray trace models to directly capture the influence of ray aberrations in diffraction simulations. Polarization Ray Tracing is a ray-based method of vector field propagation that can diagnose the polarization aberrations in optical systems.

[ascl:2403.004] BTSbot: Automated identification of supernovae with multi-modal deep learning

BTSbot automates real-time identification of bright extragalactic transients in Zwicky Transient Facility (ZTF) data. A multi-modal convolutional neural network, BTSbot provides a bright transient score to individual ZTF detections using their image data and 25 extracted features. The package eliminates the need for daily visual inspection of new transients by automatically identifying and requesting spectroscopic follow-up observations of new bright transient candidates. BTSbot recovers all bright transients in our test split and performs on par with human experts in terms of identification speed (on average, ∼1 hour quicker than scanners).

[ascl:2403.003] kinematic_scaleheight: Infer the vertical distribution of clouds in the solar neighborhood

kinematic_scaleheight uses MCMC methods to kinematically estimate the vertical distribution of clouds in the Galactic plane, including the least squares analysis of Crovisier (1978), an updated least squares analysis using a modern Galactic rotation model, and a Bayesian model sampled via MCMC as described in Wenger et al. (2024).

[ascl:2403.002] DistClassiPy: Distance-based light curve classification

DistClassiPy uses different distance metrics to classify objects such as light curves. It provides state-of-the-art performance for time-domain astronomy, and offers lower computational requirements and improved interpretability over traditional methods such as Random Forests, making it suitable for large datasets. DistClassiPy allows fine-tuning based on scientific objectives by selecting appropriate distance metrics and features, which enhances its performance and improves classification interpretability.

[ascl:2403.001] Pynkowski: Minkowski functionals and other higher order statistics

Pynkowski computes Minkowski Functionals and other higher order statistics of input fields, as well as their expected values for different kinds of fields. This package supports Minkowski functionals, and maxima and minima distributions. Supported input formats include scalar HEALPix maps such as those used by healpy (ascl:2008.022) and polarization HEALPix maps in the SO(3) formalism. Pynkowski also supports various theoretical fields, including Gaussian (e.g., CMB Temperature or the initial density field), Chi squared (e.g., CMB polarization intensity), and spin 2 maps in the SO(3) formalism.

[ascl:2402.010] 2cosmos: Monte Python modification for two independent instances of CLASS

2cosmos is a modification of Monte Python (ascl:1307.002) and allows the user to write likelihood modules that can request two independent instances of CLASS (ascl:1106.020) and separate dictionaries and structures for all cosmological and nuisance parameters. The intention is to be able to evaluate two independent cosmological calculations and their respective parameters within the same likelihood. This is useful for evaluating a likelihood using correlated datasets (e.g. mutually exclusive subsets of the same dataset for which one wants to take into account all correlations between the subsets).

[ascl:2402.009] SkyLine: Generate mock line-intensity maps

SkyLine generates mock line-intensity maps (both in 3D and 2D) in a lightcone from a halo catalog, accounting for the evolution of clustering and astrophysical properties, and observational effects such as spectral and angular resolutions, line-interlopers, and galactic foregrounds. Using a given astrophysical model for the luminosity of each line, the code paints the signal for each emitter and generates the map, adding coherently all contributions of interest. In addition, SkyLine can generate maps with the distribution of Luminous Red Galaxies and Emitting Line Galaxies.

[ascl:2402.008] star_shadow: Analyze eclipsing binary light curves, find eccentricity, and more

star_shadow automatically analyzes space-based light curves of eclipsing binaries and provides a measurement of eccentricity, among other parameters. It measures the timings of eclipses using the time derivatives of the light curves, using a model of orbital harmonics obtained from an initial iterative prewhitening of sinusoids. Since the algorithm extracts the harmonics from the rest of the sinusoidal variability, eclipse timings can be measured even in the presence of other (astrophysical) signals, and the orbital eccentricity is determined automatically from the light curve along with information about the other variability present. The output includes, but is not limited to, a sinusoid-plus-linear model of the light curve, the orbital period, the eccentricity, the argument of periastron, and the inclination.

[ascl:2402.007] ECLIPSR: Automatically find individual eclipses in light curves, determine ephemerides, and more

ECLIPSR fully and automatically analyzes space-based light curves to find eclipsing binaries and provide some first-order measurements, such as the binary star period and eclipse depths. It provides a recipe to find individual eclipses using the time derivatives of the light curves, including eclipses in light curves of stars where the dominant variability is, for example, pulsations. Since the algorithm detects each eclipse individually, even light curves containing only one eclipse can (in principle) be successfully analyzed and classified. ECLIPSR can find eclipsing binaries among both pulsating and non-pulsating stars in a homogeneous and quick manner and can process large numbers of light curves in a reasonable amount of time. The output includes, among other things, the individual eclipse markers, the period and time of the first (primary) eclipse, and a score between 0 and 1 indicating the likelihood that the analyzed light curve is that of an eclipsing binary.

[ascl:2402.006] polarizationtools: Polarization analysis and simulation tools in python

polarizationtools converts, analyzes, and simulates polarization data. The different Python scripts (1) convert Stokes parameters into linear polarization parameters, with proper treatment of the uncertainties, and vice versa; (2) shift electric vector position angle (EVPA) data points in time series to account for the 180-degree ambiguity; (3) identify rotations of the EVPA, e.g., in blazar polarization monitoring data, according to various rotation definitions; and (4) simulate polarization time series as a random walk in the Stokes Q-U plane.
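
The conversion in step (1) follows the standard relations p = sqrt(Q^2 + U^2) / I and EVPA = 0.5 * arctan2(U, Q). The short NumPy sketch below is a generic version with first-order error propagation, not the package's own functions.

import numpy as np

def stokes_to_linear(I, Q, U, dQ, dU):
    """Fractional linear polarization and EVPA (radians) with first-order errors.
    Ignores the uncertainty on I and any debiasing of p, for simplicity."""
    P = np.hypot(Q, U)                        # linearly polarized flux
    p = P / I                                 # fractional polarization
    evpa = 0.5 * np.arctan2(U, Q)             # electric vector position angle
    dP = np.sqrt((Q * dQ) ** 2 + (U * dU) ** 2) / P
    dp = dP / I
    devpa = 0.5 * np.sqrt((Q * dU) ** 2 + (U * dQ) ** 2) / P ** 2
    return p, dp, evpa, devpa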

[ascl:2402.005] MGPT: Modified Gravity Perturbation Theory code

MGPT (Modified Gravity Perturbation Theory) computes 2-point statistics for the LCDM model, DGP, and Hu-Sawicki f(R) gravity. Written in C, the code can be easily modified to include other models. Specifically, it computes the SPT matter power spectrum, the SPT Lagrangian-biased tracers power spectrum, and the CLPT matter correlation function. MGPT also computes the CLPT Lagrangian-biased tracers correlation function and a set of Q and R functions from which other statistics, such as the leading-order bispectrum, can be constructed.

[ascl:2402.004] CCBH-Numerics: Cosmologically-coupled-black-holes formation mass numerics

CCBH-Numerics (previously called CCBH-PLPP) computes the probability of the existence of a single cosmologically coupled black hole (BH) with a formation mass below a specified threshold, for given observational data of binary black holes (BBHs) from gravitational waves. The code uses the unbiased population of BBHs, as given by the power-law-plus-peak (PLPP) profile, as the observational input, and assumes that the detected BBHs are formed from stellar evolution, not from primordial BHs. CCBH-Numerics also works with individual data from BBHs and with NSBH pairs.

[ascl:2402.003] Rwcs: World coordinate system transforms in R

Rwcs offers access to all the projection and distortion systems of WCSLIB (ascl:1108.003) in R. This can be used directly for, for example, pixel lookups, or for higher level general distortion and projection.

[ascl:2402.002] Rfits: FITS file manipulation in R

Rfits reads and writes FITS images, tables, and headers. Written in R, Rfits works with all types of compressed images, and both ASCII and binary tables. It uses CFITSIO (ascl:1010.001) for all low level FITS IO, so in general should be as fast as other CFITSIO-based software. For images, Rfits offers fully featured memory mapping and on-the-fly subsetting (by pixel and coordinate) and sparse pixel sampling, allowing for efficient inspection of very large (larger than memory) images.

[ascl:2402.001] NMMA: Nuclear Multi Messenger Astronomy framework

NMMA probes nuclear physics and cosmology with multimessenger analysis. This fully featured, Bayesian multi-messenger pipeline targets joint analyses of gravitational-wave and electromagnetic data (focusing on the optical). Using bilby (ascl:1901.011) as the back-end, the software uses a variety of samplers to sample these data sets. NMMA uses chiral effective field theory based neutron star equations of state when performing inference, and is also capable of estimating the Hubble constant.

[ascl:2401.020] escatter: Electron scattering in Python

escatter.py performs Monte Carlo simulations of electron scattering events. The code was developed to better understand the emission lines from the interacting supernova SN 2021adxl, specifically the blue excess seen in the Hα 6563A emission line. escatter follows a photon that formed in a thin interface between the supernova ejecta and surrounding material as it travels radially outwards through the dense material, scattering off electrons until it reaches an optically thin region, and plots a histogram of the emergent photons.

[ascl:2401.019] StructureFunction: Bayesian estimation of the AGN structure function for Poisson data

StructureFunction determines the X-ray structure function of a population of Active Galactic Nuclei (AGN) for which two-epoch X-ray observations are available, separated by a rest-frame time interval. The calculation of the X-ray structure function is Bayesian. The sampling of the likelihood uses Stan (ascl:1801.003) for statistical modeling and high-performance statistical computation.

[ascl:2401.018] tidalspin: Constrain black hole spins using relativistic tidal forces properties

tidalspin uses a Bayesian approach to infer posterior distributions of a black hole's parameters (mass and spin) in an observed tidal disruption event, given a prior estimate of the black hole’s mass (e.g., from a galactic scaling relationship, or the tidal disruption event’s observed properties). These posterior distributions will only utilize the properties of tidal forces in their inference. tidalspin can be applied to the population of tidal disruption events already discovered.

[ascl:2401.017] QuantifAI: Radio interferometric imaging reconstruction with scalable Bayesian uncertainty quantification

QuantifAI reconstructs radio interferometric images using scalable Bayesian uncertainty quantification relying on data-driven (learned) priors. It relies on the convex accelerated optimization algorithms in CRR (ascl:2401.016) and is built on top of PyTorch. QuantifAI also includes MCMC algorithms for posterior sampling.

[ascl:2401.016] CRR: Convex Ridge Regularizer

CRR (Convex Ridge Regularizer) takes the gradient of regularizers that are the sum of convex-ridge functions and parameterizes them using a neural network that has a single hidden layer with increasing and learnable activation functions. The neural network is trained within a few minutes as a multistep Gaussian denoiser, and offers improvements for denoising and image reconstruction over other methods with similar reliability.

[ascl:2401.015] maskfill: Fill in masked values in an image

maskfill inward extrapolates edge pixels just outside masked regions, using iterative median filtering and the full information contained in the edge pixels. This provides seamless transitions between masked pixels and good pixels, and allows high-fidelity reconstruction of gaps in continuous narrow features. An image and a mask are the only required inputs.
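
The inward-fill idea can be pictured with a short generic sketch (not maskfill's implementation): repeatedly assign to each masked pixel that has at least one valid neighbor the median of its valid 3x3 neighbors, until no masked pixels remain.

import numpy as np

def iterative_median_fill(image, mask):
    """Fill masked pixels (mask == True) inward with the median of valid 3x3 neighbors."""
    filled = np.where(mask, np.nan, image.astype(float))
    while np.isnan(filled).any():
        nxt = filled.copy()
        for i, j in np.argwhere(np.isnan(filled)):
            window = filled[max(i - 1, 0):i + 2, max(j - 1, 0):j + 2]
            if np.isfinite(window).any():
                nxt[i, j] = np.nanmedian(window)    # median of already-valid neighbors
        if np.array_equal(np.isnan(nxt), np.isnan(filled)):
            break                                   # no progress (fully masked region)
        filled = nxt
    return filled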

[ascl:2401.014] LoRD: Locate Reconnection Distribution

LoRD (Locate Reconnection Distribution) identifies the locations and structures of 3D magnetic reconnection within discrete magnetic field data. The toolkit contains three main functions: ARD (Analyze Reconnection Distribution) locates the grids undergoing reconnection without null points and also recognizes the local configurations of reconnection sites; ANP (Analyze Null Points) locates and classifies the 3D null points; and APNP (Analyze Projected Null Points) analyzes the 2D neutral points projected on a plane near a cell. LoRD is written in Matlab and the toolkit contains demo scripts.

[ascl:2401.013] SolarKAT: Solar imaging pipeline for MeerKAT

SolarKAT mitigates solar interference in MeerKAT data and recovers the visibilities rather than discarding them; this solar imaging pipeline takes 1GC calibrated data in Measurement Set format as input. Written in Python, the pipeline employs solar tracking, subtraction, and peeling techniques to enhance data quality by significantly reducing solar radio interference. This is achieved while preserving the flux measurements in the main field. SolarKAT is versatile and can be applied to general radio astronomy observations and solar radio astronomy; additionally, generated solar images can be used for weather forecasting. SolarKAT is deployed in Stimela (ascl:2305.007). It is based on existing radio astronomy software, including CASA (ascl:1107.013), breizorro (ascl:2305.009), WSclean (ascl:1408.023), Quartical (ascl:2305.006), and Astropy (ascl:1304.002).

[ascl:2401.012] baryon-sweep: Outlier rejection algorithm for JWST/NIRSpec IFS data

baryon-sweep produces a robust outlier rejection while simultaneously preserving the signal of the science target. The code works as a standalone solution or as a supplement to the current pipeline software. baryon-sweep creates the 2D pixel mask and mask layers, processes the sky (non-science target) spaxels, and creates a post-processed cube ready for use.

[ascl:2401.011] ostrich: Surrogate modeling using PCA and Gaussian process interpolation

Ostrich builds surrogate models for complex and expensive functions, using Principal Component Analysis (PCA) to decompose a signal and then interpolating the PCA weights over the parameters θ with a Gaussian process interpolator. The code is trained on samples from the expensive functions, recreating and interpolating between those training samples at reduced computational cost, without recalculating the expensive function for each use.
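
The general PCA-plus-Gaussian-process emulation pattern can be illustrated with scikit-learn on made-up training arrays; this is a generic sketch of the technique, not Ostrich's own API.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.gaussian_process import GaussianProcessRegressor

# Assumed training set: theta has shape (n_samples, n_params) and
# signals has shape (n_samples, n_outputs), produced by the expensive function.
rng = np.random.default_rng(0)
theta = rng.uniform(0, 1, size=(200, 2))
x = np.linspace(0, 1, 50)
signals = np.sin(2 * np.pi * np.outer(theta[:, 0], x)) * theta[:, 1:2]

pca = PCA(n_components=5).fit(signals)                 # compress signals into a few PCA weights
weights = pca.transform(signals)

gp = GaussianProcessRegressor().fit(theta, weights)    # interpolate weights over the parameters

theta_new = np.array([[0.4, 0.7]])
signal_pred = pca.inverse_transform(gp.predict(theta_new))   # emulated signal at new parameters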

[ascl:2401.010] SYSNet: Neural Network modeling of imaging systematics in galaxy surveys

The feed-forward neural network SYSNet models the relationship between imaging maps (such as stellar density) and the observed galaxy density field in order to mitigate systematic effects and make robust galaxy clustering measurements. The cost function is the mean squared error with an L2 regularization term, and the optimization algorithm is Adaptive Moment estimation (ADAM).

[ascl:2401.009] Harmonic: Learnt harmonic mean estimator

harmonic learns an approximate harmonic mean estimator (referred to as a "learnt harmonic mean estimator") from posterior distribution samples to compute the marginal likelihood required for Bayesian model selection. Using a large number of independent Markov chain Monte Carlo (MCMC) chains from another package such as emcee (ascl:1303.002), harmonic uses importance sampling to learn a new target distribution in order to optimize an approximate harmonic estimator while minimizing its variance.
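
For orientation, the original harmonic mean identity estimates the reciprocal evidence as the posterior average of 1/L(θ); the learnt variant replaces this with the posterior average of φ(θ) / (L(θ) π(θ)) for a normalized target φ chosen to keep the variance under control. The toy NumPy/SciPy sketch below illustrates the re-targeted estimator on a 1D Gaussian problem whose evidence is known analytically; it is a generic illustration, not harmonic's API, and simply fits a Gaussian φ to the samples.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Toy conjugate problem: prior N(0, tau^2), likelihood N(x_obs | theta, sigma^2).
tau, sigma, x_obs = 2.0, 1.0, 1.5
s2 = 1.0 / (1.0 / tau**2 + 1.0 / sigma**2)            # posterior variance
mu = s2 * x_obs / sigma**2                            # posterior mean
z_true = stats.norm.pdf(x_obs, 0.0, np.sqrt(tau**2 + sigma**2))   # analytic evidence

# Stand-in for MCMC samples from the posterior.
theta = rng.normal(mu, np.sqrt(s2), size=50000)

# Importance target phi "learnt" from the samples (here: a Gaussian fit to them).
phi = stats.norm(theta.mean(), theta.std())

# Re-targeted harmonic mean: posterior average of phi / (likelihood * prior).
like = stats.norm.pdf(x_obs, theta, sigma)
prior = stats.norm.pdf(theta, 0.0, tau)
recip_z = np.mean(phi.pdf(theta) / (like * prior))
print(1.0 / recip_z, z_true)                          # the two should agree closely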

[ascl:2401.008] DARC: Dirac Atomic R-matrix Codes

DARC (Dirac Atomic R-matrix Codes) enables the study of continuum processes for a general atomic system. The suite of programs calculates electron-atom or electron-ion collision cross-sections. In addition, the programs include code for bound-state and photoionization calculations.

[ascl:2401.007] deal.II: Finite element library

deal.II computes solutions to partial differential equations (PDEs) using adaptive finite elements. The code provides an interface for processing PDEs accessible to both laptops and supercomputers, and has been used to investigate the local and global waveform effects of gravitational waves by numerical simulation. deal.II supports massively parallel computing of very large linear systems of equations and provides access to triangulation of various geometries of the simulation domain.

[ascl:2401.006] LoSoTo: LOFAR solutions tool

LoSoTo (LOFAR Solution Tool) performs a variety of operations on H5parm data, which is based on the HDF5 format; it isolates direction-independent systematic effects so that they can be transferred to the target field. Subsets of data can be selected for each operation using lists of axes values, regular expressions, or intervals. The LoSoTo package stores solutions in arrays organized in a hierarchical fashion; this provides flexibility and preserves performance. The code can, for example, extract Faraday rotation from RR/LL phase solutions or a rotation matrix, clip solutions around the median, and calculate the ionospheric structure function. LoSoTo includes an outlier flagging procedure, normalizes solutions to a given value, and offers an advanced plotting routine, among many other operations.

[ascl:2401.005] CosmosCanvas: Useful color maps for different astrophysical properties

CosmosCanvas creates perception-based color maps for different astrophysical properties such as spectral index and velocity fields. Three tutorials demonstrate how to use Python code to exploit and adjust the boundaries in these divergent colour schemes. Designed with human physiology in mind, each tutorial offers at least one default scheme that is monotonic in value, both as a redundancy for supporting the data information and as an aid for colour-blind viewers. This library relies on Gilles Ferrand's colourspace library.

[ascl:2401.004] pyPETaL: A Pipeline for Estimating AGN Time Lags

pyPETAL produces cross-correlation functions, discrete correlation functions, and mean time lags from multi-band AGN time-series data, combining multiple different codes (including pyCCF (ascl:1805.032), pyZDCF, PyROA (ascl:2107.012), and JAVELIN (ascl:1010.007)) used for active galactic nuclei (AGN) reverberation mapping (RM) analysis into a unified pipeline. This pipeline also implements outlier rejection using Damped Random Walk Gaussian process fitting, and detrending through the LinMix algorithm. pyPETAL implements a weighting scheme for all lag-producing modules, mitigating aliasing in peaks of time lag distributions between light curves. pyPETAL scales to any combination of internal code modules, supporting a variety of computational workflows.

[ascl:2401.003] LUNA: Forward model luna simulator

LUNA generates dynamically accurate lightcurves from a planet-moon pair, analytically accounting for shadow overlaps, stellar limb darkening, and planet-moon dynamical motion. The code takes transit timing/duration variations and ingress/egress asymmetries into consideration not only for the planet, but also the moon. LUNA was designed to be analytical and dynamical and to incorporate limb darkening (including non-linear laws) and account for all orbital elements, including eccentricity and longitude of the ascending node. Because the software is precise and analytic, LUNA is a highly potent tool for exomoon detection.

[ascl:2401.002] Rayleigh: Pseudo-spectral MHD

The 3-D convection code Rayleigh enables study of dynamo behavior in spherical geometry. It evolves the incompressible and anelastic MHD equations in spherical geometry using a pseudo-spectral approach. Rayleigh employs spherical harmonics in the horizontal direction and Chebyshev polynomials in the radial direction and has undergone extensive accuracy testing.
