pmwd simulates and models cosmological evolutionary history. The code includes reverse time integration in addition to traditional forward simulation, enabling symmetrical dynamics analysis using the adjoint method. The pmwd particle-mesh model supports fully-differentiable analytic, semi-analytic, and deep learning components in parallel. Based on JAX (ascl:2111.002), pmwd is optimized for GPU computation.
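pmwd's own interface is not reproduced here; as a minimal illustration of the underlying idea of differentiating backwards through a time integration (which the adjoint method makes practical for full particle-mesh runs), the JAX sketch below takes the gradient of a toy leapfrog integration with respect to its initial condition.

```python
# Minimal JAX sketch (not pmwd's API): reverse-mode differentiation through a
# toy leapfrog integrator, illustrating the kind of gradient flow that pmwd
# provides for full particle-mesh simulations via the adjoint method.
import jax
import jax.numpy as jnp

def leapfrog(x0, v0, n_steps=100, dt=0.01):
    """Integrate a particle in a harmonic potential and return the final position."""
    def step(carry, _):
        x, v = carry
        v = v - dt * x          # kick: acceleration of a harmonic potential
        x = x + dt * v          # drift
        return (x, v), None
    (x, v), _ = jax.lax.scan(step, (x0, v0), None, length=n_steps)
    return x

# Gradient of the final position with respect to the initial condition.
dfinal_dx0 = jax.grad(leapfrog)(1.0, 0.0)
print(dfinal_dx0)
```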
BADASS (Bayesian AGN Decomposition Analysis for SDSS Spectra) decomposes Sloan Digital Sky Survey (SDSS) spectra and fits Type 1 ("broad line") Active Galactic Nuclei (AGN) in the optical. The fitting process uses the Bayesian affine-invariant Markov-Chain Monte Carlo sampler emcee (ascl:1303.002) for robust parameter and uncertainty estimation, as well as autocorrelation analysis to assess parameter chain convergence. Out of the box, BADASS fits SDSS spectra and MaNGA IFU cube data; the code can be modified to fit user-input spectra from any instrument.
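The sketch below is not BADASS's own interface; it only illustrates the emcee ensemble sampler that BADASS builds on, fitting a single toy Gaussian emission line with flat priors.

```python
# Illustration of the emcee ensemble sampler that BADASS builds on (not
# BADASS's own interface): fit a single Gaussian emission line to a toy spectrum.
import numpy as np
import emcee

wave = np.linspace(6500.0, 6620.0, 200)
true = (10.0, 6563.0, 5.0)                        # amplitude, center, sigma
flux = true[0] * np.exp(-0.5 * ((wave - true[1]) / true[2])**2)
flux += np.random.normal(0.0, 0.5, wave.size)

def log_prob(theta):
    amp, cen, sig = theta
    if amp <= 0 or sig <= 0 or not (6500 < cen < 6620):
        return -np.inf                            # flat priors with hard bounds
    model = amp * np.exp(-0.5 * ((wave - cen) / sig)**2)
    return -0.5 * np.sum(((flux - model) / 0.5)**2)

ndim, nwalkers = 3, 32
p0 = np.array(true) + 1e-3 * np.random.randn(nwalkers, ndim)
sampler = emcee.EnsembleSampler(nwalkers, ndim, log_prob)
sampler.run_mcmc(p0, 2000)
samples = sampler.get_chain(discard=500, flat=True)
print(samples.mean(axis=0))                       # posterior means of the three parameters
```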
dask-ms constructs xarray datasets from CASA tables, thus providing a data access layer for Measurement Set v2.0 data. It supports the CASA Data Table System, Zarr and Apache Arrow formats, but abstracts them away from the developer at the xarray dataset level. It therefore serves as a basis for writing distributed PyData Radio Astronomy applications and supports writing variables back to the respective column in the Table. The intention behind dask-ms is to support the Measurement Set as a data source and sink for the purposes of writing parallel, distributed radio astronomy algorithms.
Stimela2 develops data reduction workflows and is a significant update of Stimela (ascl:2305.007). Though designed for radio astronomy data, it can be adapted for other data processing applications. Stimela2 represents workflows by linear, concise and intuitive YAML-format "recipes". Atomic data reduction tasks (binary executables, Python functions and code, and CASA tasks) are described by YAML-format "cab definitions" detailing each task's "schema" (inputs and outputs). Stimela2 provides a rich syntax for chaining tasks together, and encourages a high degree of modularity: recipes may be nested into other recipes, and configuration is cleanly separated from recipe logic. Tasks can be executed natively or in isolated environments using containerization technologies such as Apptainer. Stimela2 facilitates the deployment of scalable, distributed workflows by interfacing with the Slurm scheduler and the Kubernetes API, the latter allowing workflows to be readily deployed in the cloud.
Twinkle calculates and plots the stellar spectral energy distribution (SED) using empirical photometric data and stellar model grids. The code was originally created to help calculate the excess infrared (IR) flux from a star; the presence of an IR excess indicates dust orbiting the star. This dust likely results from the grinding and collisions of asteroids, influenced by a larger planetary object—pointing to the potential for finding planets. Twinkle quickly calculates the temperature and location of the dust to first order by fitting the assumed blackbody or modified blackbody function to the broadband excess emission.
Codex Africanus is a Radio Astronomy algorithms library. It presents radio astronomy algorithms to the user as modular functions accepting NumPy inputs and producing NumPy outputs. Internally, it uses Numba to accelerate these codes and Dask to parallelise and distribute them.
Colume (COLUMn to vOLUME) uses the statistical and spatial distribution of a column density map to infer a likely volume density distribution along each line of sight. This Python package incorporates all pre-processing (in particular re-sampling) functions needed to efficiently work on the column density maps. Colume's outputs are saved in Numpy format.
This project presents a comprehensive spectroscopic analysis of O and B-type stars, neutron stars, and white dwarfs, with a focus on the detection of helium (He) and oxygen (O) in stellar atmospheres. By leveraging data from the Sloan Digital Sky Survey (SDSS) and utilizing tools such as Astropy, Astroquery, and Specutils, the project aims to identify key spectral lines of helium and oxygen, as well as the formation of heliox (OHe) molecules. The methodology involves querying SDSS for relevant spectral data, filtering and analyzing it based on stellar classification, and visualizing the results using advanced techniques. The findings contribute to the understanding of stellar evolution, chemical processes, and the role of these elements in various stellar classes. Additionally, the project incorporates interactive data exploration with Aladin Lite and Simbad, offering a robust framework for future astrophysical research.
This notebook provides a comprehensive approach for analyzing and visualizing astronomical data from FITS (Flexible Image Transport System) files, focusing on moment maps derived from molecular line emissions within the galaxy NGC 0628. The analysis involves applying various image processing techniques to handle corrupted pixels, reconstruct images, and enhance the quality of moment maps. The notebook also demonstrates how to simulate super-resolution to improve the spatial resolution of the data. By utilizing Gaussian filtering, median filtering, and contrast enhancement, the approach improves the clarity and precision of the data, making it suitable for detailed astrophysical studies. This tool serves as an efficient method for processing and visualizing large-scale astronomical datasets for further analysis and scientific interpretation.
NEMESISPY infers the atmospheric properties of exoplanets, such as chemical composition, using spectroscopic data. The package calculates radiative transfer using the correlated-k approximation and supports parametric atmospheric modelling. NEMESISPY is a Python implementation of the well-established Fortran NEMESIS library (ascl:2210.009), which has been applied to the atmospheric retrievals of both solar system planets and exoplanets employing numerous different observing geometries.
IcyDwarf calculates the coupled physical-chemical evolution of an icy dwarf planet or moon. The code calculates the thermal evolution of an icy planetary body (moon or dwarf planet), with no chemistry, but with rock hydration, dehydration, hydrothermal circulation, core cracking, tidal heating, and porosity; the depth of cracking and a bulk water:rock ratio by mass in the rocky core are also computed. It also calculates whether cryovolcanism is possible by the exsolution of volatiles from cryolavas. IcyDwarf also determines the equilibrium fluid and rock chemistries resulting from water-rock interaction in subsurface oceans in contact with a rocky core, up to 200°C and 1000 bar.
SMINT (Structure Model INTerpolator) obtains posterior distributions on the H/He or H2O mass fraction of a planet through a user-friendly interface. The parameters of the planet of interest are input with specifications on the priors that should be used. SMINT returns publication-ready plots presenting the joint parameter constraints obtained from interpolating the interior models grid of interest as well as confidence intervals for each parameter.
DarkMatters calculates multi-frequency and multi-messenger emissions from WIMP annihilation and decay. This can be done both for standard channels and custom models, with the ability to produce surface brightnesses and integrated fluxes as well as maps in FITS format to compare to actual data. DarkMatters uses an accelerated ADI solver, similar to that used in GALPROP (ascl:1010.028), for electron diffusion with an innovative sparse matrix approach. Additionally, there is the option to use a Green's function approximate solution (implemented in both C++ and Python).
The numerical modeling code DustPOL-py calculates the multi-wavelength polarization degree of absorption and thermal dust emission based on Radiative Torque alignment (RAT-A), Magnetically enhanced RAT (MRAT) and Radiative Torque Disruption (RAT-D). The code saves the output files (wavelength and degree of polarization) for further analysis and is idealized for the diffuse ISM, molecular clouds, and star-forming regions; it also predicts the polarization spectrum for one- or two-dust layers. A web-interface GUI for DustPOL-py is also available.
DArk Matter SPIkes (DAMSPI) analyzes dark matter spikes around Intermediate Mass Black Holes (IMBHs) in the Milky Way. It extracts an IMBH catalog with the corresponding dark matter spike parameters from EAGLE simulations to probe a potential gamma-ray signal from dark matter self-annihilation. The catalog includes, among others, the coordinates, mass, formation redshift, and spike parameters for each individual IMBH.
jaxspec performs statistical inference on X-ray spectra. It loads an X-ray spectrum (in the OGIP standard), defines a spectral model from the implemented components, and calculates the best parameters using state-of-the-art Bayesian approaches. The code is built on top of JAX (ascl:2111.002) to provide just-in-time compilation and automatic differentiation of the spectral models, enabling the use of sampling algorithms. jaxspec is written in pure Python and is not dependent on HEASoft (ascl:1408.004).
mochi_class extends the hi_class code (ascl:1808.010), itself a patch to the Einstein-Boltzmann solver CLASS (ascl:1106.020). It replaces the α-functions with a stable basis to ensure stability and takes general functions of time as input, including the dark energy equation of state or its normalized background energy density. mochi_class provides a stability test that checks for mathematical (classical) instabilities in the scalar field fluctuations, and also includes a GR approximation scheme, among other new capabilities.
HIILines analytically models lines emitted by the ionized interstellar medium (ISM). It covers the [OIII], [OII], Hα, and Hβ lines. The strength of HIILines is its high computational efficiency. It can be used for interpreting galaxy spectroscopic survey measurements assuming a one-zone picture and for designing and forecasting galaxy line emission measurements. HIILines also performs post-processing of hydrodynamical galaxy formation simulations for ISM emission lines.
McFine performs complex, multi-component hyperfine spectra fitting in astronomical data. It turns line intensities into gas conditions using a fully automated Bayesian method. Written in Python, the code uses Markov chain Monte Carlo (MCMC) to characterize model degeneracies. It handles local thermodynamic equilibrium (LTE) and radiative-transfer (RT) models and can fit individual spectra and data cubes; given a data cube, it can also use the neighboring information to attempt a better fit. McFine also fits the minimum number of distinct components to avoid overfitting.
The spectral classification code Diagnose assigns one of four classifications (star, galaxy, quasar, or unknown) to each source and returns a redshift estimate for the galaxies and quasars and a velocity estimate for the stars. The code uses a chi-squared minimization for linear combinations of principal component templates to determine a best-fit spectral classification and redshift estimate. It computes three best-fit chi-squared values: one for stellar type and velocity, one for galaxy type and redshift, and one for a quasar and redshift. Diagnose then compares the best fit of these three reduced chi-squared values to the second best fit and evaluates the difference against a statistical threshold.
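As a schematic of the approach described above (not Diagnose's own code), the sketch below evaluates the chi-squared of a linear combination of principal-component templates at one trial redshift; scanning it over a redshift grid and over the star, galaxy, and quasar template sets yields the three best-fit values that are then compared.

```python
# Schematic of template fitting by chi-squared minimization (not Diagnose's code):
# best-fit linear combination of templates at a trial redshift via least squares.
import numpy as np

def chi2_at_redshift(wave, flux, ivar, templates, rest_wave, z):
    """Best-fit chi^2 for a linear combination of templates shifted to redshift z."""
    # Resample each rest-frame template onto the observed wavelength grid.
    A = np.vstack([np.interp(wave, rest_wave * (1 + z), t) for t in templates]).T
    W = np.sqrt(ivar)[:, None]
    coeffs, *_ = np.linalg.lstsq(W * A, np.sqrt(ivar) * flux, rcond=None)
    model = A @ coeffs
    return np.sum(ivar * (flux - model) ** 2)

# Scanning this over a z grid for the stellar, galaxy, and quasar template sets
# gives the three best-fit chi^2 values whose differences are tested against a
# statistical threshold.
```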
The Unicorn pipeline produces data products from the 3D-HST grism survey of four CANDELS fields. It extracts interlaced 2D and 1D spectra for all objects in the Skelton et al. (2014) photometric catalogs. It then fits the 2D spectra and multi-band photometry to determine redshifts and emission line strengths. Unicorn is built on threedhst (ascl:2411.018) and has been superseded by grizli (ascl:1905.001).
threedhst reduces WFC3 grism exposures. It is essentially a wrapper around aXe (ascl:1109.016) and produces a catalog and other useful files; extracted 1D spectra are placed in a single file, and 2D spectra are in individual files. The code produces an HTML table with thumbnails of the direct images, 1D, and 2D spectra and supports the pipeline Unicorn (ascl:2411.019), which produces data products from the 3D-HST grism survey of four CANDELS fields. threedhst has been superseded by Grizli (ascl:1905.001).
CLASS LVDM modifies the CLASS code (ascl:1106.020) to incorporate the cosmological model of Lorentz invariance violation (LV) in gravity and dark matter. Compared to the usual CLASS code, it contains four new parameters: alpha, beta, and lambda characterize LV in the gravity sector, and Y characterizes LV in the dark matter sector.
fits_warp smoothly removes the distorting effect of the ionosphere and restores sources to their reference positions in both the catalog and image domain. Image warping uses pixel offsets derived from a catalog of cross-matched sources. Though initially written for low-frequency radio astronomy, fits_warp can be used to de-distort any image distorted by some vector field which is sampled by some sparse pierce-points.
atlas-fit amends the results of spectroflat (ascl:2411.014) with calibration against a solar atlas. Data for wavelength calibration and continuum correction are generated from flat field information and selected solar atlases. The atlas-fit package provides two tools: one to generate a list of lines from the atlas and data to use for finding a wavelength solution (dispersion), and another to amend the calibration results from the spectroflat library.
Spectroflat flat-fields spectro-polarimetric data. It can be plugged into existing Python-based data reduction pipelines or used as a standalone calibration and performance analysis tool. The code includes smile distortion correction and flat field extraction. The library expects the spatial domain on the vertical axis and the spectral domain on the horizontal axis. Spectroflat does not include any file reading/writing routines and expects numpy arrays as input.
NE2001p is a fully Python implementation of the NE2001 Galactic electron density model. The code forward models the dispersion and scattering of compact radio sources, including pulsars, fast radio bursts, AGNs, and masers, and the model predicts the distances of radio sources that lack independent distance measures.
BSAVI (Bayesian Sample Visualizer) aids likelihood analysis of model parameters where samples from a distribution in the parameter space are used as inputs to calculate a given observable. For example, selecting a range of samples will allow you to easily see how the observables change as you traverse the sample distribution. At the core of BSAVI is the Observable object, which contains the data for a given observable and instructions for plotting it. It is modular, so you can write your own function that takes the parameter values as inputs, and BSAVI will use it to compute observables on the fly. It also accepts tabular data, so if you have pre-computed observables, simply import them alongside the dataset containing the sample distribution to start visualizing. Though BSAVI was developed for use in theoretical cosmology, it can be customized to fit a wide range of visualization needs.
MMLPhoto-z estimates the photo-z of quasars using a cross-modal contrastive learning approach. This method employs adversarial training and contrastive loss functions to promote the mutual conversion between multi-band photometric data features (magnitude, color) and photometric image features, while extracting modality-invariant features. MMLPhoto-z can also be applied to tasks like photo-z estimation for galaxies with missing magnitudes. Overall, this method proves effective in enhancing the photo-z estimation across diverse datasets and conditions.
ReverseDiff implements methods to take gradients, Jacobians, Hessians, and higher-order derivatives of native Julia functions (or any callable object) using reverse mode automatic differentiation (AD). While performance can vary depending on the functions you evaluate, the algorithms implemented by ReverseDiff generally outperform non-AD algorithms in both speed and accuracy.
Pycosmicstar studies the star formation history for different cosmological models. The package contains two abstract classes, cosmology and structureabstract; the cosmology class is passed as a parameter to the classes that implement structureabstract. This polymorphic design keeps the modeling of structures and star formation only weakly dependent on the cosmology, so a new cosmological class implementing the methods of the abstract cosmology class can be used to study, for example, the role of dark energy in the cosmic star formation rate evolution.
Astrocats enables astronomers to construct their own curated catalogs of astronomical data with the intention of producing shareable catalogs of that data in human-readable formats. Astrocats is used by several existing open astronomy catalogs, including the Open Supernova Catalog, Open TDE Catalog, Open Nova Catalog, and the Open Black Hole Catalog.
EFTofPNG (Effective Field Theory of Post-Newtonian Gravity) performs high precision computations in the effective field theory of post-Newtonian (PN) gravity, including spins. Written in Mathematica, it provides computer-algebra tools to derive analytical input for gravitational-wave source modelling relevant to current observatories. EFTofPNG has been used to derive all currently known spin-dependent conservative interaction potentials in the PN approximation to General Relativity (GR).
HBSGSep (Hierarchical Bayesian Star-Galaxy Separations) classifies stars and galaxies photometrically by fitting templates and hierarchically learning their prior weights. The hierarchical Bayesian algorithms are unsupervised and do not use a training set nor are priors set in advance of running the algorithms; the priors for the templates are inferred from the data themselves.
GAz calculates photometric redshifts for low redshift galaxies. It finds optimal polynomial forms to fit to data. It explores the very large space of high order polynomials while only requiring optimization of a small number of terms. Tested with the 2SLAQ LRG data set, GAz generalizes well to various data sets and redshift ranges.
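The snippet below is a toy illustration of the polynomial photo-z idea only, not GAz's algorithm for searching the space of polynomial forms: it fits redshift as a fixed quadratic in two colors by linear least squares.

```python
# Toy polynomial photo-z fit (not GAz's search over polynomial forms):
# redshift modeled as a quadratic function of two colors.
import numpy as np

rng = np.random.default_rng(0)
gr, ri = rng.uniform(0.5, 2.0, 500), rng.uniform(0.2, 1.0, 500)
z = 0.1 + 0.2 * gr + 0.15 * ri + 0.05 * gr * ri + rng.normal(0, 0.01, 500)

# Design matrix of polynomial terms; GAz's task is deciding which terms to keep.
X = np.column_stack([np.ones_like(gr), gr, ri, gr**2, ri**2, gr * ri])
coeffs, *_ = np.linalg.lstsq(X, z, rcond=None)
z_phot = X @ coeffs
print("rms error:", np.sqrt(np.mean((z_phot - z) ** 2)))
```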
DarkRayNet uses recurrent neural networks (RNNs) to quickly simulate antiproton, antideuteron, proton, and helium cosmic ray (CR) spectra at Earth for an extensive range of parameters. The corresponding neural networks are trained on GALPROP (ascl:1010.028) simulations. DarkRayNet can also simulate the cosmic ray fluxes for antideuterons; the spectra can be predicted for a signal from dark matter annihilation (DM Antideuterons) and for secondary emission (Secondary Antideuterons).
PyMerger detects binary black hole mergers from the Einstein Telescope based on a Deep Residual Neural Network (ResNet) model; the model was trained on combined data from all three proposed sub-detectors of ET (TSDCD). The model achieved high BBH detection rates. Though not trained on BNS and BHNS mergers, PyMerger successfully detected 11,477 BNS and 323 BHNS mergers in ET-MDC, indicating its potential for broader applicability.
flashcurve estimates the necessary time windows for adaptive-binning light curves in Fermi-LAT data using raw photon data. Gamma-ray fluxes measured by the Fermi-LAT satellite are extremely variable. Light curves produced by flashcurve, which uses deep learning, optimally use adaptive bin sizes to retrieve information about the source dynamics and to combine gamma-ray observations in a multi-messenger perspective.
Mosaic characterizes the beam shape and generates tilings for efficient multi-beam observations. It consists of an interferometric pattern simulator and characterizer, an optimized tiling generator, and a beamforming weights calculator. It is being used in the filter-banking beamformer of the MeerKAT telescope; more than 200 pulsars have been discovered from the multiple-beam observations supported by Mosaic.
Finalflash is a Python package designed for primary beam corrections of uGMRT radio interferometric images. The software uses frequency-dependent beam models and FITS file handling to improve the accuracy of radio astronomical data. It is open source and available under the MIT License. The code is hosted at https://github.com/arpan-52/Finalflash.
Gradus.jl is an extensible, spacetime-agnostic suite of tools for general relativistic ray-tracing (GRRT), tracing geodesics and calculating observational signatures of accreting compact objects. Gradus.jl requires only a specification of the non-zero metric components of a chosen spacetime in order to solve the geodesic equation and compute a wide variety of trajectories and orbits. Various algorithms for calculating physical quantities are implemented generically, so they may be used with different classes of spacetime with minimal effort.
Falcon-DM simulates intermediate mass ratio inspirals (IMRIs) in dark matter (DM) spikes. This lightweight N-body code is written in C++ and is specifically tuned for simulating IMRIs embedded in DM spikes. It features a 2nd order Drift-Kick-Drift integrator using the symplectic HOLD scheme and symmetrized, individual time-steps for accurate time integration. Falcon-DM also offers post-Newtonian (PN) effects up to PN2.5 using the auxiliary velocity algorithm.
Heracles manages harmonic-space statistics on the sphere. It takes catalogs of positions and function values on the sphere and turns them into angular power spectra and mixing matrices. Heracles is both a Python library, to be used in notebooks or data processing pipelines, and a tool for running measurements from the command line using a configuration file.
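The sketch below is not Heracles' API; it only illustrates the underlying computation with plain healpy: binning catalog positions into a HEALPix overdensity map and measuring its angular power spectrum.

```python
# Minimal healpy sketch of the underlying computation (not Heracles' API):
# catalog positions -> HEALPix overdensity map -> angular power spectrum.
import numpy as np
import healpy as hp

nside = 64
ra = np.random.uniform(0.0, 360.0, 100_000)          # toy catalog positions (deg)
dec = np.degrees(np.arcsin(np.random.uniform(-1.0, 1.0, 100_000)))

pix = hp.ang2pix(nside, ra, dec, lonlat=True)
counts = np.bincount(pix, minlength=hp.nside2npix(nside))
delta = counts / counts.mean() - 1.0                  # overdensity map

cl = hp.anafast(delta, lmax=3 * nside - 1)            # angular power spectrum
print(cl[:10])
```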
fastPTA forecasts the sensitivity of future Pulsar Timing Array (PTA) configurations and assesses constraints on Stochastic Gravitational Wave Background (SGWB) parameters. The code can generate mock PTA catalogs with noise levels compatible with current and future PTA experiments. These catalogs can then be used to perform Fisher forecasts or MCMC simulations.
StellarSpectraObservationFitting (SSOF) measures radial velocities and creates data-driven models, with fast, physically motivated Gaussian process regularization, of the time-variable spectral features of both the telluric transmission and the stellar spectrum measured by Extremely Precise Radial Velocity (EPRV) spectrographs, while accounting for the wavelength-dependent instrumental line-spread function. Written in Julia, SSOF provides two methods for estimating the uncertainties on the RVs and model scores based on the photon uncertainties in the original data: for quick estimates, the code looks at the local curvature of the likelihood space; the second method estimates errors via bootstrap resampling.
Gaspery uses the Fisher Information Matrix (FIM) to evaluate different radial velocity (RV) observing strategies; this assists observational exoplanet astronomers in constructing the observing strategy that maximizes information (or minimizes uncertainty) on the RV semi-amplitude K. The code is flexible and generalizable, however, and can maximize information on any free parameter from any model, given a time series support (x-axis).
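As a worked sketch of the Fisher-information idea (not Gaspery's interface), the example below compares the information on the semi-amplitude K delivered by two hypothetical observing strategies for a circular-orbit RV model.

```python
# Worked Fisher-information sketch (not Gaspery's interface): for a circular
# RV model v(t) = K sin(2*pi*t/P + phi), compare sigma_K for two strategies.
import numpy as np

def fisher_K(times, P=10.0, phi=0.3, sigma=1.0):
    """Fisher information on the semi-amplitude K for white noise of level sigma."""
    dv_dK = np.sin(2 * np.pi * times / P + phi)   # derivative of the model w.r.t. K
    return np.sum(dv_dK**2) / sigma**2

nightly = np.arange(20.0)                         # 20 consecutive nights
sparse = np.arange(0.0, 200.0, 10.0)              # 20 nights spread over 200 days
for name, t in [("nightly", nightly), ("sparse", sparse)]:
    print(name, "sigma_K ~", 1.0 / np.sqrt(fisher_K(t)))
```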
Kamodo provides access to, interpolation of, and visualization of space weather models and data. The code allows model developers to represent simulation results as mathematical functions which may be manipulated directly. As the software does not generate model outputs, users must acquire the desired model outputs before these outputs can be functionalized by the software. Kamodo handles unit conversion transparently and supports interactive science discovery through Jupyter notebooks with minimal coding.
CloudCovErr.jl debiases fluxes and improves error bar estimates for photometry on top of structured filamentary backgrounds. It first estimates the covariance matrix of the residuals from a previous photometric model and then computes corrections to the estimated flux and flux uncertainties. Using an infilling technique to estimate the background and its uncertainty dramatically improves flux and flux uncertainty estimates for stars in images of fields with significant nebulosity.
ARK implements computational fluid dynamics applications, such as Euler and all-Mach-regime solvers, on a Cartesian grid with MPI+Kokkos. It provides a performance-portable Kokkos implementation for compressible hydrodynamics and performs simulations of convection without any approximation of Boussinesq or anelastic type. It adapts an all-Mach-number scheme into a well-balanced scheme for gravity, which preserves arbitrary discrete equilibrium states up to machine precision. The low-Mach correction in the numerical flux allows ARK to be more precise in the low-Mach regime; the code is well suited for studying highly stratified and high-Mach convective flows.
The 1D radiative-equilibrium model Exo-REM simulates young gas giants far from their star and brown dwarfs. Fluxes are calculated using the two-stream approximation assuming hemispheric closure. The radiative-convective equilibrium is solved assuming that the net flux (radiative + convective) is conservative. The conservation of flux over the pressure grid is solved iteratively using a constrained linear inversion method. Rayleigh scattering from H2, He, and H2O, as well as absorption and scattering by clouds (calculated from extinction coefficient, single scattering albedo, and asymmetry factor interpolated from precomputed tables for a set of wavelengths and particle radii), are also taken into account.
DGEM compares different computation methods for three-dimensional dust continuum radiative transfer. This simple code is based on mcpolar, translated to C++, and refactored to realize and compare radiative transfer techniques, namely Monte Carlo, Quasi-Monte-Carlo, and the Directions Grid Enumeration Method (DGEM). DGEM uses precalculated directions of photon propagation instead of random ones to speed up the calculation. The code also offers a gnuplot script for plotting the resulting images.
lensitbiases performs rFFT-based N1 lensing bias calculations and tests. It is tuned for TT, P-only, or MV (GMV)-like quadratic estimators. It performs rFFT-based N1 and N1 matrix calculations in ~O(ms) time per lensing multipole for a Planck-like configuration, which allows on-the-fly evaluation of the bias. It computes five rFFTs of moderate size per L for N1 TT, 20 for PP, and 45 for MV or GMV. lensitbiases is not particularly efficient for low lensing L's, since in this case one must use large boxes.
DIRTY (DustI Radiative Transfer, Yeah!) computes the radiative transfer and dust emission from arbitrary distributions of dust illuminated by arbitrary distributions of sources (usually stars). It uses Monte Carlo methods to solve the radiative transfer problem in full 3D, including non-equilibrium and equilibrium thermal dust emission. Like other similar models, DIRTY is computationally intensive; as a result, it is written in C++.
solar-vSI performs Monte Carlo integration of multi-body phase space efficiently. The calculation of solar antineutrino spectra from 8B decay requires the integration of five-body phase space. Though there is no simple analytical approach to this problem, recursive relations can be used to facilitate numerical evaluations.
measure_extinction measures extinction due to dust absorbing photons or scattering photons out of the line-of-sight. Extinction applies to the case of a star seen behind a foreground screen of dust. This package provides the tools to measure dust extinction curves using observations of two effectively identical stars that differ only in that one is seen through more dust than the other.
Forcepho infers the fluxes and shapes of galaxies from astronomical images. It models the appearance of multiple sources in multiple bands simultaneously and compares to observed data via a likelihood function. Gradients of this likelihood allow for efficient maximization of the posterior probability or sampling of the posterior probability distribution via Hamiltonian Monte Carlo. The model intrinsic galaxy shapes and positions are shared across the different bands, but the fluxes are fit separately for each band. Forcepho does not perform detection; initial locations and (very rough) parameter estimates must be supplied by the user.
BayeSED implements full Bayesian interpretation of spectral energy distributions (SEDs) of galaxies and AGNs. It performs Bayesian parameter estimation using posterior probability distributions (PDFs) and Bayesian SED model comparison using Bayesian evidence. Its latest version, BayeSED3, supports various built-in SED models and can emulate other SED models using machine learning techniques.
iPIC3D performs kinetic plasma simulations at magnetohydrodynamics time scales. This three-dimensional parallel code uses the implicit Particle-in-Cell method; implicit integration in time of the Vlasov–Maxwell system removes the numerical stability constraints. Written in C++, iPIC3D can be run with CUDA acceleration and supports MPI, OpenMP, and multi-node multi-GPU simulations.
vortex-p analyzes the velocity fields of astrophysical simulations of different natures (for example, SPH, moving-mesh, and meshless) usually spanning many orders of magnitude in the scales involved. The code performs Helmholtz-Hodge decomposition (HHD); that is, it decomposes the velocity field into a solenoidal and an irrotational/compressive part. vortex-p internally uses an AMR representation of the velocity field and can, in principle, capture the full dynamical range of the simulation. The package can also perform Reynolds decomposition (i.e., the decomposition of the velocity field into a bulk and a turbulent part). This is achieved by means of a multi-scale filtering of the velocity field, where the filtering scale around each point is determined by the local flow properties. vortex-p expands the vortex (ascl:2206.001) code, which had been coupled to the outputs of the MASCLET code, to a fully stand-alone tool capable of working with the outcomes of a broad range of simulation methods.
pysymlog provides utilities for binning, normalizing colors, wrangling tick marks, and other tasks in symmetric logarithm space. For numbers spanning positive and negative values, the code works in log scale with a transition through zero, down to some threshold. This is useful for representing data that span many scales, as in standard log-space, but include zero or even negative values. pysymlog provides convenient functions for creating 1D and 2D histograms with symmetric log bins, generating logspace-like arrays through zero, and managing matplotlib major and minor ticks in symlog space, as well as bringing symmetric log scaling functionality to plotly.
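The example below uses only plain matplotlib and NumPy, not pysymlog's own functions, to illustrate the symmetric-log concept the package builds on: a symlog axis plus hand-built bins that are log-spaced on each side of zero with a linear region through zero.

```python
# Symmetric-log illustration with plain matplotlib/NumPy (not pysymlog itself).
import numpy as np
import matplotlib.pyplot as plt

data = np.random.standard_cauchy(10_000)          # heavy tails, both signs
linthresh = 1.0                                    # linear region around zero

# Symmetric log bins: log-spaced on each side of zero, linear through zero.
pos = np.logspace(0, 4, 20)
edges = np.concatenate([-pos[::-1], [0.0], pos])

fig, ax = plt.subplots()
ax.hist(data, bins=edges)
ax.set_xscale("symlog", linthresh=linthresh)
plt.show()
```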
RadioSunPy is an open-source Python package for accessing, visualizing, and analyzing multi-band radio observations of the Sun from the RATAN-600 solar complex. The advancement of observational technologies and software for processing and visualizing spectro-polarimetric microwave data obtained with the RATAN-600 radio telescope opens new opportunities for studying the physical characteristics of solar plasma at the levels of the chromosphere and corona, which remain difficult to detect in the ultraviolet and X-ray ranges. The development of these methods allows for more precise investigation of the fine structure and dynamics of the solar atmosphere, thereby deepening our understanding of the processes occurring in these layers. The obtained data can also be utilized for diagnosing solar plasma and forecasting solar activity. However, using RATAN-600 data requires extensive data processing and familiarity with the RATAN-600. The package offers comprehensive data processing functionalities, including direct access to raw data, essential processing steps such as calibration and quiet-Sun normalization, and tools for analyzing solar activity. This includes automatic detection of local sources, identifying them with NOAA (National Oceanic and Atmospheric Administration) active regions, and further determining parameters for local sources and active regions. By streamlining data processing workflows, RadioSunPy enables researchers to investigate the fine structure and dynamics of the solar atmosphere more efficiently, contributing to advancements in solar physics and space weather forecasting.
ysoisochrone is a Python3 package that handles isochrones for young stellar objects (YSOs) and uses them to derive stellar masses and ages. Its primary method is a Bayesian inference approach, and the Python code builds on the IDL version developed in Pascucci et al. (2016). The code estimates stellar masses, ages, and associated uncertainties by comparing the stellar effective temperature, bolometric luminosity, and their uncertainties with different stellar evolutionary models, including those specifically developed for YSOs. User-developed evolutionary tracks can also be used when provided in the specific format described in the code documentation.
The kete tools are intended to enable the simulation of all-sky surveys of solar system objects. This includes multi-body orbital dynamics, thermal and optical modeling of the objects, as well as field-of-view and light-delay corrections. In conjunction with the Minor Planet Center's (MPC) database of known asteroids, these tools can be used not only to plan surveys but also to predict which objects are visible in existing or past surveys.
The primary goal for kete is to enable a set of tools that can operate on the entire MPC catalog at once, without having to do queries on specific objects. It has been used to simulate over 10 years of survey time for the NEO Surveyor mission using 10 million main-belt and near-Earth asteroids.
GalCraft creates mock integral-field spectroscopic (IFS) observations of the Milky Way and other hydrodynamical/N-body simulations. It conducts all the procedures from inputting data and spectral templates to the output of IFS data cubes in FITS format. The produced mock data cubes can be analyzed in the same way as real IFS observations by many methods, particularly codes like Voronoi binning (ascl:1211.006), pPXF (ascl:1210.002), line-strength indices, or a combination of them (e.g., the GIST pipeline, ascl:1907.025). The code is implemented using Python-native parallelization. GalCraft will be particularly useful for directly comparing the Milky Way with other MW-like galaxies in terms of kinematics and stellar population parameters and ultimately linking the Galactic and extragalactic to study galaxy evolution.
pyRRG measures the 2nd and 4th order moments using a TinyTim model to correct for PSF distortions. The code is invariant to the number of exposures and the orientation of the drizzled images. pyRRG uses a machine learning algorithm to automatically classify stars and galaxies; this can also be done manually if greater accuracy is needed.
Padé simulates protoplanetary disk hydrodynamics in cylindrical coordinates. Written in Fortran90, it is a finite-difference code and the compact 4th-order standard Padé scheme is used for spatial differencing. Padé differentiation is known to have spectral-like resolving power. The z direction can be periodic or non-periodic. The 4th order Runge-Kutta is used for time advancement. Padé implements a version of the FARGO technique to eliminate the time-step restriction imposed by Keplerian advection, and capturing of shocks that are not too strong can be done by using artificial bulk viscosity.
PySR performs Symbolic Regression; it uses machine learning to find an interpretable symbolic expression that optimizes some objective. Over a period of several years, PySR has been engineered from the ground up to be (1) as high-performance as possible, (2) as configurable as possible, and (3) easy to use. PySR is developed alongside the Julia library SymbolicRegression.jl, which forms the powerful search engine of PySR. Symbolic regression works best on low-dimensional datasets, but one can also extend these approaches to higher-dimensional spaces by using "Symbolic Distillation" of Neural Networks. Here, one essentially uses symbolic regression to convert a neural net to an analytic equation. Thus, these tools simultaneously present an explicit and powerful way to interpret deep neural networks.
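A short usage sketch of PySR follows; the dataset and argument values are illustrative, not a recommended configuration.

```python
# Short PySR usage sketch: recover a symbolic expression for a toy dataset.
import numpy as np
from pysr import PySRRegressor

X = np.random.randn(200, 2)
y = 2.5 * np.cos(X[:, 0]) + X[:, 1] ** 2

model = PySRRegressor(
    niterations=40,
    binary_operators=["+", "-", "*", "/"],
    unary_operators=["cos", "exp"],
)
model.fit(X, y)
print(model.sympy())   # best-found expression as a SymPy object
```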
WISE2MBH uses infrared cataloged data from the Wide-field Infrared Survey Explorer (WISE) to estimate the mass of supermassive black holes (SMBH). It implements a Monte Carlo approach for error propagation, considering mean photometric errors from WISE magnitudes, errors in fits of scaling relations used and scatter of those relations, if available.
PyExoCross, a Python adaptation of ExoCross (ascl:1803.014), post-processes molecular line lists generated by ExoMol, HITRAN, and HITEMP and other similar initiatives. It generates absorption and emission spectra and other properties, including partition functions, specific heats, and cooling functions, based on molecular line lists. The code also calculates cross sections with four line profiles: Doppler, Gaussian, Lorentzian, and Voigt. PyExoCross can convert data format between ExoMol and HITRAN, and supports importing and exporting line lists in the ExoMol and HITRAN/HITEMP formats.
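The snippet below is a stand-alone illustration of the Voigt line profile named above, evaluated via the Faddeeva function in SciPy; it is not PyExoCross code.

```python
# Voigt profile via the Faddeeva function (illustration only, not PyExoCross).
import numpy as np
from scipy.special import wofz

def voigt(nu, nu0, alpha_d, gamma_l):
    """Voigt profile with Doppler HWHM alpha_d and Lorentzian HWHM gamma_l."""
    sigma = alpha_d / np.sqrt(2.0 * np.log(2.0))         # Doppler HWHM -> Gaussian sigma
    z = ((nu - nu0) + 1j * gamma_l) / (sigma * np.sqrt(2.0))
    return wofz(z).real / (sigma * np.sqrt(2.0 * np.pi))

nu = np.linspace(-5.0, 5.0, 1001)
profile = voigt(nu, 0.0, alpha_d=0.5, gamma_l=0.2)
print(profile.sum() * (nu[1] - nu[0]))                   # integrates to ~1
```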
GASTLI (GAS gianT modeL for Interiors) calculates interior structure models for gas giant exoplanets. The code computes mass-radius curves, thermal evolution curves, and interior composition retrievals to fit an interior structure model to your mass, radius, age, and, if available, atmospheric metallicity data. GASTLI can also plot the results, including internal and atmospheric profiles, a pressure-temperature diagram, mass-radius relations, and thermal evolution curves.
symbolic_pofk provides simple Python functions and a Fortran90 routine for precise symbolic emulations of the linear and non-linear matter power spectra and for the conversion σ8 ↔ As as a function of cosmology. These can be easily copied, pasted, and adapted to other languages. The fit includes baryons by default, though this can be switched off; the emulations are only tested within a limited k range.
planetMagFields accesses and analyzes information about magnetic fields of planets in our solar system and visualizes them in both 2D and 3D. The code provides access to properties of a planet, such as dipole tilt, Gauss coefficients, and computed radial magnetic field at surface, and has methods to plot the field and write a vts file for 3D visualization. planetMagFields can be used to produce both 2D and 3D visualizations of a planetary field; it also provides the option of potential extrapolation.
The software framework AMReX is designed for building massively parallel block-structured adaptive mesh refinement (AMR) applications. Key features of AMReX include C++ and Fortran interfaces; 1-, 2- and 3-D support; and support for cell-centered, face-centered, edge-centered, and nodal data. The framework also supports hyperbolic, parabolic, and elliptic solves on hierarchical adaptive grid structure, optional subcycling in time for time-dependent PDEs, and parallelization via flat MPI, OpenMP, hybrid MPI/OpenMP, or MPI/MPI, and parallel I/O. AMReX supports the plotfile format with AmrVis, VisIt (ascl:1103.007), ParaView (ascl:1103.014), and yt (ascl:1011.022).
ClassiPyGRB downloads, processes, visualizes, and classifies GRBs in the Swift/BAT database. Users can query light curves for any GRB and use tools to preprocess the data, including noise/duration reduction and interpolation. The package provides a set of facilities and tutorials for classifying GRBs based on their light curves using a method based on a dimensionality reduction of the data using t-Distributed Stochastic Neighbour Embedding (TSNE); results are visualized using a Graphical User Interface (GUI). ClassiPyGRB also plots and animates the results of the TSNE analysis for a deeper hyperparameter grid search.
BeyonCE (Beyond Common Eclipsers) explores the large parameter space of eclipsing disc systems. The fitting code reduces the parameter space encompassed by the transit of circumsecondary disc (CSD) systems with azimuthally symmetric, non-uniform optical-depth profiles to constrain the size and orientation of discs with a complex sub-structure. BeyonCE does this by rejecting disc geometries that do not reproduce the measured gradients within their light curves.
resonances identifies mean-motion resonances of small bodies. It uses the REBOUND integrator (ascl:1110.016) and automatically identifies two-body and three-body mean-motion resonance in the Solar system. The package can be used for other possible planetary systems, including exoplanets. resonances accurately differentiates different types of resonances (pure, transient, uncertain) and provides an interface for mass tasks, such as finding resonant areas in a planetary system. The software can also plot time series and periodograms.
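The sketch below uses plain REBOUND, not the resonances package interface, to show what identifying a mean-motion resonance involves: integrating a test asteroid and tracking a resonant argument with Jupiter (the asteroid's elements here are made up for illustration).

```python
# Plain REBOUND sketch (not the resonances package interface): track the 2:1
# resonant argument of a test asteroid with Jupiter.
import numpy as np
import rebound

sim = rebound.Simulation()
sim.units = ("yr", "AU", "Msun")
sim.add(m=1.0)                                   # Sun
sim.add(m=9.55e-4, a=5.204, e=0.049)             # Jupiter
sim.add(a=3.27, e=0.1)                           # test asteroid near the 2:1 resonance
sim.move_to_com()

times = np.linspace(0.0, 1e4, 500)
phi = []
for t in times:
    sim.integrate(t)
    jup, ast = sim.particles[1], sim.particles[2]
    # 2:1 resonant argument: 2*lambda_Jup - lambda_ast - varpi_ast
    phi.append((2 * jup.l - ast.l - ast.pomega) % (2 * np.pi))
# Libration of phi around a fixed value indicates resonance; circulation does not.
```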
cloudyfsps is a Python interface between FSPS (ascl:1010.043) and Cloudy (ascl:9910.001). It compiles FSPS models for use as ionizing sources (Stellar SED grids) within Cloudy and generates Cloudy input files, single-parameter or grids of parameters. It runs Cloudy models in parallel and formats the output, which is nebular continuum and nebular line emission, for FSPS input and for explorative manipulation and plotting within Python. cloudyfsps includes pre-packaged plots for BPT diagrams (NII, SII, OI, OII) with observed data from HII regions and SDSS galaxies, and also provides comparisons with MAPPINGS III (ascl:1306.008) models.
Stardust extracts galaxy properties by fitting their multiwavelength data to a set of linearly combined templates. This Python package brings three different families of templates together: 1.) UV+Optical emission from dust unobscured stellar light; 2.) AGN heated dust in the MIR; and 3.) IR dust reprocessed stellar light in the NIR-FIR. Stardust's template fitting does not rely on energy balance. As a result, the total luminosity of dust obscured and dust unobscured stellar light do not rely on each other, and it is possible to fit objects such as SMGs where the energy balance approach might not be applicable.
PICASSO (Python Inpainter for Cosmological and AStrophysical SOurces) provides a suite of inpainting methodologies to reconstruct holes on images (128x128 pixels) extracted from a HEALPIX map. Three inpainting techniques are included; these are divided into two main groups: diffusive-based methods (Nearest-Neighbors), and learning-based methods that rely on training DCNNs to fill the missing pixels with the predictions learned from a training data-set (Deep-Prior and Generative Adversarial Networks). PICASSO also provides scripts for projecting from full sky HEALPIX maps to flat thumbnails images, performing inpainting on GPUs and parallel inpainting on multiple processes, and for projecting from flat images to HEALPIX. Pretrained models are also included.
MCMole3D (Monte-Carlo MOlecular Line Emission) simulates the 3D molecular cloud emission in the Milky Way. In particular, it can simulate both the unpolarized and polarized emission coming from the first rotational line of Carbon Monoxide (CO, J=1-0). MCMole3D seeks to compare the simulated emission with that observed by full sky surveys from the Planck satellite.
FGCluster runs spectral clustering on HEALPix maps for parametric foreground removal, using as input a map encoding the feature to cluster. Pixel similarity is given by the geometrical affinity of each pixel on the sphere. FGCluster can also take an uncertainty map as input, in which case the adjacency is modified so that the pixel similarity also accounts for the statistical significance given by the pixel values in the map and their uncertainties.
SUSHI (Semi-blind Unmixing with Sparsity for hyperspectral images) performs non-stationary unmixing of hyperspectral images. The typical use case is to map physical parameters such as temperature and redshift from a model with multiple components using data from hyperspectral images. Applying a spatial regularization provides more robust results for voxels with a low signal-to-noise ratio. The code has been used in X-ray astronomy, but the method can be applied to any integral field unit (IFU) data cubes.
UltraDark.jl simulates cosmological scalar fields. Written in Julia, it is inspired by PyUltraLight (ascl:1810.009) and designed to be simple to use and extend. It solves a non-interacting scalar field Gross-Pitaevskii equation coupled to Poisson's equation for gravitational potential. The scalar field describes scalar dark matter in models including ultralight dark matter, fuzzy dark matter, axion-like particles and the like. It also describes an inflaton field in the reheating epoch of the early universe.
Written in Python, DarsakX is used to design and analyze the imaging performance of a multi-shell X-ray telescope with an optical configuration similar to Wolter-1 optics for astronomical purposes. It can also assess the impact of figure error on the telescope's imaging performance and optimize the optical design to improve angular resolution for wide-field telescopes. By default, DarsakX uses DarpanX (ascl:2101.015) to calculate the mirror's reflectivity.
SAQQARA analyzes stochastic gravitational wave background signals. This Simulation-based Inference (SBI) library is built on top of the swyft code (ascl:2302.016), which implements neural ratio estimation to efficiently access marginal posteriors for all parameters of interest. Simulation-based inference combined with implicit marginalization (over nuisance parameters) has been shown to be well suited for SGWB data analysis.
21cmFirstCLASS extends 21cmFAST (ascl:1102.023) and interfaces with CLASS (ascl:1106.020) to generate initial conditions at recombination that are consistent with the input cosmological model. These initial conditions can be set during the time of recombination, allowing one to compute the 21cm signal (and its spatial fluctuations) throughout the dark ages, as well as in the subsequent cosmic dawn and reionization epochs, just as in the standard 21cmFAST. 21cmFirstCLASS tracks both the CDM density field δc and the baryon density field δb. In addition, the user interface in 21cmFirstCLASS has been improved and allows one to easily plot the 21cm power spectrum while including noise from the output of 21cmSense (ascl:1609.013).
GRBoondi simulates generalized Proca fields on arbitrary analytic fixed backgrounds; it is based on the publicly available 3+1D numerical relativity code GRChombo (ascl:2306.039). GRBoondi reduces the prerequisite knowledge of numerical relativity and GRChombo in the numerical studies of generalized Proca theories. The main steps to perform a study are inputting the additions to the equations of motion beyond the base Proca theory; GRBoondi can then automatically incorporate the higher-order terms in the simulation. The code is written entirely in C++14 and uses hybrid MPI/OpenMP parallelism. GRBoondi inherits all of the capabilities of the main GRChombo code, which makes use of the Chombo library (ascl:1202.008) for adaptive mesh refinement.
RadioSED uses nested sampling to perform a Bayesian analysis of radio SEDs constructed from radio flux density measurements obtained as part of large area surveys (or in some limited cases, as part of targeted followup campaigns). It is a pure Python implementation, and is essentially a wrapper around Bilby (ascl:1901.011), the Bayesian inference library. RadioSED uses dynesty (ascl:1809.013) to perform the sampling steps, though other samplers could also be used. Users can make use of a pre-defined set of models and surveys from which to draw flux density measurements, or they can define their own models and provide their own input flux density measurements. All flux density measurements are referenced against the RACS-LOW survey, and source names and IDs from the survey catalogue are used as identifiers.
M_SMiLe computes an approximation of the probability of magnification for a lens system consisting of microlensing by compact objects within a galaxy cluster. It specifically focuses on the scenario where the galaxy cluster is strongly lensing a background galaxy and the compact objects, such as stars, are sensitive to this microlensing effect. The microlenses responsible for this effect are stars and stellar remnants, though exotic objects such as compact dark matter candidates (including PBHs and axion mini-halos) can contribute to this effect.
BELTCROSS2 calculates the closest approaches of an asteroid to the mean orbits of meteoroid streams. It is especially useful for checking whether an asteroid that was observed to become active passed through a meteoroid stream, and through which stream, a short time before the beginning of the activity. The basic characteristics of the closest encounter of the asteroid with the stream are provided by BELTCROSS2.
Cue interprets nebular emission across a wide range of ionizing conditions of galaxies. The software is a neural-net emulator of the photoionization code Cloudy (ascl:9910.001). It does not require a specific ionizing spectrum as a source, instead approximating the ionizing spectrum with a 4-part piece-wise power-law. Along with the flexible ionizing spectra, Cue allows freedom in [O/H], [N/O], [C/O], gas density, and total ionizing photon budget.
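As a minimal sketch of the kind of 4-part piece-wise power-law parameterization described above (the segment edges and slopes below are made up for illustration and are not Cue's actual choices):

```python
# NumPy sketch of a 4-part piece-wise power-law spectrum, kept continuous
# across segment edges (illustrative values only, not Cue's parameterization).
import numpy as np

def piecewise_powerlaw(wave, edges, slopes, norm=1.0):
    """F(lambda) ∝ lambda^slope within each segment, continuous at the edges."""
    flux = np.zeros_like(wave, dtype=float)
    scale = norm
    for lo, hi, s in zip(edges[:-1], edges[1:], slopes):
        in_seg = (wave >= lo) & (wave < hi)
        flux[in_seg] = scale * (wave[in_seg] / lo) ** s
        scale *= (hi / lo) ** s                   # carry the value across the edge
    return flux

wave = np.linspace(100.0, 911.9, 2000)            # wavelengths in Angstroms (illustrative)
spec = piecewise_powerlaw(wave, edges=[100.0, 228.0, 353.0, 504.0, 912.0],
                          slopes=[1.5, 0.8, 0.0, -0.7])
print(spec.max())
```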
HaloFlow uses a machine learning approach to infer the host halo mass, Mh, and stellar mass, M∗, using grizy-band magnitudes, morphological properties quantifying characteristic size, concentration, and asymmetry, total measured satellite luminosity, and the number of satellites.
LADDER (Learning Algorithm for Deep Distance Estimation and Reconstruction) reconstructs the “cosmic distance ladder” by analyzing sequential cosmological data; it can also be applied to other sequential datasets with associated covariance information. It uses the apparent magnitude data from the Pantheon Type Ia supernovae compilation, fully incorporating covariance information to accurately predict mean values and uncertainties. It offers model-independent consistency checks for datasets such as Baryon Acoustic Oscillations (BAO) and can calibrate high-redshift datasets such as Gamma Ray Bursts (GRBs) without assuming any underlying cosmological model. Additionally, LADDER serves as a model-independent mock catalog generator for forecast-based cosmological studies.
Sonification extends the Astronify software (ascl:2408.005) to sonify a spatially distributed dataset. The package contains scripts to convert images into scatterplots and sonifications. The reproduce_image.py script takes an image file and reproduces it as a scatterplot by converting the input image to grayscale, extracting pixel values and generating scatter data based on these values, and then plotting the scatter data to create a visual representation of the image. The sonifications script converts the scatterplot data into an audio series and adjusts the note spacing and sonification range to customize an auditory representation. Sonification accepts images in PNG and JPG formats.
Astronify contains tools for sonifying astronomical data, specifically data series. Data series sonification takes a data table and maps one column to time, and one column to pitch. This technique is commonly used to sonify light curves, where observation time is scaled to listening time and flux is mapped to pitch. While Astronify’s sonification uses the columns “time” and “flux” by default, any two columns can be supplied and a sonification created.
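A minimal usage sketch follows; the class and method names are taken as assumed from the Astronify documentation, and the light curve is synthetic.

```python
# Minimal Astronify usage sketch (class/method names as assumed from the docs):
# sonify a small synthetic light curve stored in an astropy Table.
import numpy as np
from astropy.table import Table
from astronify.series import SoniSeries

lc = Table({"time": np.arange(100, dtype=float),
            "flux": 1.0 + 0.1 * np.sin(np.arange(100) / 5.0)})

soni = SoniSeries(lc)        # columns "time" and "flux" are used by default
soni.sonify()                # map flux values to pitches
soni.play()                  # or soni.write("lightcurve.wav")
```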
Sailfish simulates accreting binary systems, including binary protostars, post-AGN stellar binaries, mass-transferring X-ray binaries, and double black hole systems. The binary components are "on the grid" rather than excised, and are evolved according to the Kepler two-body problem, modified to account for gravitational wave losses or self-consistent forcing from the orbiting gas. The solvers are shock-capturing and are second order accurate in space and time. Gravity is fully Newtonian. Thermodynamics can be treated using a gamma-law equation of state with a blackbody cooling term, or in the locally isothermal approximation, in which the gas temperature is set to a constant times the local free-fall speed. Sailfish is fully Cartesian and has extensive diagnostic capabilities to facilitate accurate calculations of gas-driven orbital evolution or the extraction of electromagnetic disk signatures. The code is extremely efficient, reaching more than one billion zone updates per second on an NVIDIA A100 GPU, enabling extremely high resolution of complex flows around the binary components.
Global mm-VLBI Array (GMVA) observations are accompanied by extensive metadata (the so-called 'ANTAB' files) that contain the system temperature (Tsys) and gain values of the individual GMVA antennas. These data are required for the amplitude calibration of GMVA data, which is an essential part of the data reduction. Unfortunately, Tsys measurements in the ANTAB files are not perfect, and there are almost always erroneous values in some of the files (particularly in the VLBA data). This can lead to incorrect results in the amplitude calibration and thus needs to be corrected with proper data inspection and treatment. However, every GMVA station provides its ANTAB file in its own data format, which makes the examination tricky. AntabGMVA was designed to resolve these issues and allows GMVA users to manage the GMVA ANTAB files easily and efficiently. Using AntabGMVA, one can extract, inspect, visualize, and correct the Tsys data from the ANTAB files and finally generate one single ANTAB file that includes all the final products.
SHARC (SHArpened Dimensionality Reduction and Classification) performs local gradient clustering-based sharpened dimensionality reduction (SDR) using neural network projections and uses these projections to make classifications. The library also contains functions for finding the optimal SDR parameters and for consolidating classification results obtained through multiple classifiers. It requires pySDR (ascl:2408.002). SHARC provides accurate and physically insightful classification of astronomical objects based on their broadband colors.
pySDR performs local gradient clustering-based sharpened dimensionality reduction (SDR). The library uses the C++ LGCDR_v1 code as its backend.
Sharpened dimensionality reduction (SDR) sharpens original data before dimensionality reduction to create visually segregated sample clusters that support user-guided labeling. Each distinct cluster can then be labeled and used to further analyze an otherwise unlabeled data set. Written in C++, SDR scales well with large high-dimensional data.