LeR calculates detectable rates of gravitational-wave events (both lensed and unlensed). Written in Python, it performs statistical simulation and forecasting of gravitational wave (GW) events and their rates. The code samples GW source properties, lens-galaxy attributes, and source redshifts, and can generate image properties such as source position, magnification, and time delay. The package also calculates detectable merger rates per year. Key features of LeR include efficient sampling, optimized SNR calculations, and systematic archiving of results. LeR is tailored to support both GW population study groups and GW lensing research groups by providing a comprehensive suite of tools for GW event analysis.
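LeR's own interfaces are not reproduced here; purely to illustrate the kind of rate estimate described above, the minimal numpy Monte Carlo sketch below (all distributions, the SNR model, and the intrinsic rate are placeholder assumptions) samples source properties, applies an SNR threshold, and scales the detectable fraction by an assumed intrinsic merger rate.

    import numpy as np

    rng = np.random.default_rng(0)
    n_samples = 100_000               # Monte Carlo draws from a toy source population
    rate_intrinsic = 1.0e5            # assumed intrinsic merger rate per year (placeholder)

    # Placeholder source properties: redshift and chirp mass.
    z = rng.uniform(0.0, 2.0, n_samples)
    mchirp = rng.uniform(10.0, 40.0, n_samples)      # solar masses

    # Toy SNR model standing in for a real detector-network calculation.
    snr = 8.0 * (mchirp / 25.0) ** (5.0 / 6.0) / (z + 0.1)

    detectable_fraction = np.mean(snr > 8.0)
    print(f"detectable rate ~ {rate_intrinsic * detectable_fraction:.0f} per year")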
R2D2 (Residual-to-Residual DNN series for high-Dynamic range imaging) performs synthesis imaging for radio interferometry. The R2D2 algorithm takes a hybrid structure between a Plug-and-Play (PnP) algorithm and a learned version of the well-known Matching Pursuit algorithm. Its reconstruction is formed as a series of residual images, iteratively estimated as outputs of iteration-specific Deep Neural Networks (DNNs), each taking the previous iteration’s image estimate and associated back-projected data residual as inputs. The primary application of the R2D2 algorithm is to solve large-scale high-resolution high-dynamic range inverse problems in radio astronomy, more specifically 2D planar monochromatic intensity imaging.
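The iterative structure described above can be summarized in a short sketch; the measurement operator, its adjoint, and the trained networks are placeholder callables rather than R2D2's actual interfaces.

    import numpy as np

    def r2d2_series(data, measure, back_project, dnns, image_shape):
        """Illustrative R2D2-style loop: each iteration-specific network predicts a
        residual image update from the current estimate and the back-projected
        data residual, and the updates accumulate into the reconstruction."""
        x = np.zeros(image_shape)
        for dnn in dnns:                                  # one trained DNN per iteration
            residual_image = back_project(data - measure(x))
            x = x + dnn(x, residual_image)                # add the predicted residual image
        return x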
SCONE (Supernova Classification with a Convolutional Neural Network) classifies supernovae (SNe) by type using multi-band photometry data (lightcurves) using a convolutional neural networks. SCONE takes in supernova (SN) photometry data in the format output by SNANA simulations, separated into two types of files: metadata and observation data. Photometric data is pre-processed via 2D Gaussian process regression, which smooths over irregular sampling rates between filters and also allows SCONE to be independent of the filter set on which it was trained.
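A minimal scikit-learn sketch (not SCONE's own implementation) of the 2D Gaussian-process step described above: flux is regressed over time and wavelength so that irregularly sampled multi-band photometry becomes a regular grid suitable for a CNN.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    # Irregularly sampled photometry: (time [days], filter central wavelength [nm]) -> flux
    X = np.array([[0.0, 480.0], [1.3, 620.0], [2.1, 480.0], [3.7, 754.0], [5.2, 620.0]])
    y = np.array([1.0, 1.4, 1.8, 1.1, 0.7])

    gp = GaussianProcessRegressor(kernel=RBF(length_scale=[3.0, 100.0]), alpha=1e-2)
    gp.fit(X, y)

    # Evaluate on a regular (time, wavelength) grid -> a 2D "heatmap" input image.
    t_grid, w_grid = np.meshgrid(np.linspace(0, 6, 32), np.linspace(450, 800, 32))
    heatmap = gp.predict(np.column_stack([t_grid.ravel(), w_grid.ravel()])).reshape(t_grid.shape)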
NRPyElliptic sets up initial data (ID) for numerical relativity (NR) using the same numerical methods employed for solving hyperbolic evolution equations. The code implements a hyperbolic relaxation method to solve complex nonlinear elliptic PDEs for NR ID. The hyperbolic PDEs are evolved forward in (pseudo)time, resulting in an exponential relaxation of the arbitrary initial guess to a steady state that coincides with the solution of the elliptic system. The package solves these equations on highly efficient numerical grids exploiting underlying symmetries in the physical scenario. NRPyElliptic is built in the NRPy+ (ascl:1807.025) framework, which facilitates the solution of hyperbolic PDEs on Cartesian-like, spherical-like, cylindrical-like, or bispherical-like numerical grids.
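A toy one-dimensional illustration of the hyperbolic relaxation idea (not NRPyElliptic's curvilinear, NRPy+-generated solver): the elliptic equation u'' = rho is obtained as the steady state of a damped wave equation evolved in pseudo-time from an arbitrary initial guess.

    import numpy as np

    n, length = 201, 1.0
    dx = length / (n - 1)
    x = np.linspace(0.0, length, n)
    rho = -np.pi**2 * np.sin(np.pi * x)     # exact solution of u'' = rho is sin(pi x)

    u = np.zeros(n)                         # arbitrary initial guess, u = 0 on the boundaries
    v = np.zeros(n)                         # pseudo-time "velocity" u_t
    c, eta = 1.0, 10.0                      # wave speed and damping parameter
    dt = 0.5 * dx / c                       # CFL-limited pseudo-time step

    for _ in range(40_000):
        lap = np.zeros(n)
        lap[1:-1] = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2
        a = c**2 * (lap - rho) - eta * v    # u_tt + eta*u_t = c^2*(u_xx - rho)
        v[1:-1] += dt * a[1:-1]
        u[1:-1] += dt * v[1:-1]

    print(np.max(np.abs(u - np.sin(np.pi * x))))   # small once the relaxation has converged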
Wavelets are a powerful mathematical tool whose most celebrated applications are the analysis, compression and de-noising of scientific data. JOFILUREN is a wavelet code designed for data analysis and de-noising applications. It is written in Fortran, is portable, and can also be used for studying and reducing the physical effects of particle noise in particle-mesh computer simulations. JOFILUREN is introduced in Paper I, is described in Paper II, and is further discussed in Paper III, as referenced below (a short generic wavelet de-noising sketch in Python follows the reference list).
* Paper I: Romeo A. B., Horellou C. and Bergh J. (2003), "N-Body Simulations with Two-Orders-of-Magnitude Higher Performance Using Wavelets", Monthly Notices of the Royal Astronomical Society 342, 337-344
https://ui.adsabs.harvard.edu/abs/2003MNRAS.342..337R/abstract
* Paper II: Romeo A. B., Horellou C. and Bergh J. (2004), "A Wavelet Add-On Code for New-Generation N-Body Simulations and Data De-Noising (JOFILUREN)", Monthly Notices of the Royal Astronomical Society 354, 1208-1222
https://ui.adsabs.harvard.edu/abs/2004MNRAS.354.1208R/abstract
* Paper III: Romeo A. B., Agertz O., Moore B. and Stadel J. (2008), "Discreteness Effects in ΛCDM Simulations: A Wavelet-Statistical View", The Astrophysical Journal 686, 1-12
https://ui.adsabs.harvard.edu/abs/2008ApJ...686....1R/abstract
Supplementary information is given in the readme file of JOFILUREN. A pedagogical introduction to wavelets and wavelet applications, containing several useful videos and lecture notes, is given here:
https://fy.chalmers.se/~romeo/RRY025/notes+videos/
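JOFILUREN itself is written in Fortran; the generic de-noising sketch promised above uses PyWavelets only to illustrate the idea, and the wavelet family, threshold rule, and noise model are illustrative choices rather than JOFILUREN's.

    import numpy as np
    import pywt

    rng = np.random.default_rng(1)
    t = np.linspace(0.0, 1.0, 1024)
    clean = np.sin(2 * np.pi * 5 * t)
    noisy = clean + 0.3 * rng.standard_normal(t.size)

    # Decompose, soft-threshold the detail coefficients, and reconstruct.
    coeffs = pywt.wavedec(noisy, "db4", level=5)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745      # noise estimate from the finest scale
    thresh = sigma * np.sqrt(2.0 * np.log(noisy.size))  # universal threshold
    denoised = pywt.waverec(
        [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]], "db4"
    )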
Gradus.jl traces geodesics and calculates observational signatures of accreting compact objects. The code requires only a specification of the non-zero metric components of a chosen spacetime to solve the geodesic equation and compute a wide variety of trajectories and orbits. Algorithms for calculating physical quantities are implemented generically, so they may be used with different classes of spacetime with minimal effort.
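Gradus.jl is written in Julia; purely to illustrate the "metric components in, trajectories out" idea, the following Python/SciPy sketch builds Christoffel symbols by finite-differencing a user-supplied metric (Schwarzschild here) and integrates the geodesic equation for a null ray. None of this reflects Gradus.jl's actual API.

    import numpy as np
    from scipy.integrate import solve_ivp

    def metric(x, M=1.0):
        """Only the non-zero Schwarzschild components in (t, r, theta, phi) are specified."""
        _, r, th, _ = x
        f = 1.0 - 2.0 * M / r
        return np.diag([-f, 1.0 / f, r**2, (r * np.sin(th))**2])

    def christoffel(x, eps=1.0e-6):
        """Gamma^a_bc from central finite differences of the metric alone."""
        ginv = np.linalg.inv(metric(x))
        dg = np.empty((4, 4, 4))                       # dg[d, a, b] = d_d g_ab
        for d in range(4):
            h = np.zeros(4)
            h[d] = eps
            dg[d] = (metric(x + h) - metric(x - h)) / (2.0 * eps)
        term = np.einsum("bdc->dbc", dg) + np.einsum("cdb->dbc", dg) - dg
        return 0.5 * np.einsum("ad,dbc->abc", ginv, term)

    def rhs(_, y):
        x, u = y[:4], y[4:]
        du = -np.einsum("abc,b,c->a", christoffel(x), u, u)
        return np.concatenate([u, du])

    # Photon at r = 10 M in the equatorial plane; u^t is fixed by the null condition.
    x0 = np.array([0.0, 10.0, np.pi / 2.0, 0.0])
    u_r, u_phi = -1.0, 0.08
    g = metric(x0)
    u_t = np.sqrt(-(g[1, 1] * u_r**2 + g[3, 3] * u_phi**2) / g[0, 0])
    sol = solve_ivp(rhs, (0.0, 30.0), np.concatenate([x0, [u_t, u_r, 0.0, u_phi]]),
                    rtol=1e-8, atol=1e-10)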
Colume (COLUMn to vOLUME) uses the statistical and spatial distribution of a column density map to infer a likely volume density distribution along each line of sight. The Python package incorporates all pre-processing (in particular re-sampling) functions needed to efficiently work on the column density maps. Colume's outputs are saved in Numpy format.
XGPaint, written in Julia, generates maps of extragalactic foregrounds, using astrophysical models designed to replicate the statistics of the millimeter sky. The code computes simulated galaxies from the Cosmic Infrared Background (CIB), radio galaxies, and contributions and distortions from the Sunyaev-Zeldovich (SZ) effect. XGPaint is multithreaded, and supports both HEALPix and Plate Carrée pixelizations.
The Blooming Tree (BT) algorithm identifies clusters, groups, and substructures from galaxy redshift surveys. Based on the hierarchical clustering method, it takes the projected binding energy as the linking length and provides three main approaches to trimming the hierarchical tree: 1) direct trimming (on binding energy, velocity dispersion, or eta); 2) the sigma plateau, used when no trimming threshold is specified; and 3) the blooming tree. The tool runs only from the terminal.
ExoSim 2 (Exoplanet Observation Simulator 2) simulates spectro-photometric observations of transiting exoplanets from space, ground, and sub-orbital platforms. It is a complete rewrite of ExoSim (ascl:2002.008); it is implemented in Python 3 and uses object-oriented design principles. The package follows a three-step workflow: the creation of focal planes, the production of Sub-Exposure blocks, and the generation of non-destructive reads (NDRs). ExoSim 2 has demonstrated consistency in estimating photon conversion efficiency, saturation time, and signal generation. The simulator has also been validated independently for instantaneous read-out and jitter simulation, and for astronomical signal representation.
FELINE (Find Emission LINEs) combines fully parallelized galaxy line-template matching with the matched-filter approach for individual emission features from LSDcat (ascl:1612.002). For the 3D matched filtering, the complete data cube is first median filtered to remove all continuum sources, and then cross-correlated with a template of an isolated emission feature in two spatial and one spectral dimension. FELINE then evaluates the likelihood in each spectrum of the cube for emission lines at the positions provided by a given redshift and a certain combination of typical emission features.
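A schematic numpy/scipy version of the pre-processing described above (median continuum removal followed by a 3D matched filter), with a random cube and an arbitrary Gaussian template standing in for real integral-field data; this is not FELINE's own implementation.

    import numpy as np
    from scipy.ndimage import median_filter, correlate

    cube = np.random.normal(size=(200, 50, 50))          # cube[wavelength, y, x], toy data
    continuum = median_filter(cube, size=(51, 1, 1))     # running median along the spectral axis
    residual = cube - continuum                          # continuum sources removed

    # Template: Gaussian line profile times a Gaussian spatial PSF, normalized.
    dl = np.arange(-5, 6)
    dy = np.arange(-3, 4)
    line = np.exp(-0.5 * (dl / 1.5) ** 2)
    psf = np.exp(-0.5 * (dy[:, None] ** 2 + dy[None, :] ** 2) / 1.2**2)
    template = line[:, None, None] * psf[None, :, :]
    template /= np.sqrt(np.sum(template**2))

    matched = correlate(residual, template, mode="constant")   # 3D matched-filter output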
Superbubble Finding Algorithm identifies superbubbles in HI column density maps of both observed and simulated galaxies using only two adjustable parameters. The algorithm takes an input galaxy column density image and returns the labels and basic measurements of the detected bubbles so that plotting and further analysis can be done. The algorithm includes an automated galaxy-background separation step to focus the analysis on the galactic disk. Functions to solve for the superbubble radii and the galactocentric locations of the superbubbles, and an external function to plot the detected bubbles, are also packaged in Superbubble Finding Algorithm.
GaMorNet classifies galaxies morphologically using a Convolutional Neural Network. The code does not need a large amount of training data, as it is trained on simulations and then transfer-learned on a small portion of real data, and can be applied to multiple datasets. The software has a misclassification rate of less than 5%. GaMorNet is written in Python and uses the Keras and TFLearn deep learning libraries to perform all of the machine learning operations; both libraries in turn use TensorFlow for their underlying tensor operations.
GaMPEN (Galaxy Morphology Posterior Estimation Network) estimates robust posteriors (i.e., values + uncertainties) for structural parameters of galaxies using a Bayesian machine learning framework. The code also automatically crops input images to an optimal size before structural parameter estimation. The package produces extremely well-calibrated (less than 5% deviation) predicted posteriors; these have been shown to be up to 60% more accurate compared to the uncertainties predicted by many light-profile fitting algorithms. Once trained, it takes GaMPEN less than a millisecond to perform a single model evaluation on a CPU. Thus, GaMPEN’s posterior prediction capabilities are ready for large galaxy samples expected from upcoming large imaging surveys, such as Rubin-LSST, Euclid, and NGRST.
StellarSpecModel interpolates the stellar spectral grid; provided with stellar parameters (Teff, FeH, logg), the package will return the corresponding stellar spectrum. It also generates and analyzes theoretical stellar spectral energy distributions (SEDs). StellarSpecModel includes functionality for both single and binary star systems, incorporating extinction models and the ability to handle photometric data in various filter bands.
LESSPayne performs semi-automatic analysis of echelle spectra of stars. It uses a neural network emulator to perform a full-spectrum fit that estimates stellar parameters, and performs automatic continuum normalization and equivalent-width fitting with theoretical masks. The code uses MOOG (ascl:1202.009) for spectrum synthesis fitting, ATLAS model atmosphere interpolation, and equivalent-width abundance determination. LESSPayne can also perform automatic abundance uncertainty analysis with error propagation and summary tables, and should be viewed as providing a high-quality initialization for an smhr file that reduces the time needed for a standard analysis.
S3Fit fits a spectrum and a multi-band photometric spectral energy distribution (SED) simultaneously to analyze observational data of galaxies. It improves on the moderate constraints on continuum-model properties that result from pure spectral fitting with limited wavelength coverage. The code supports multiple models with multiple components, and can handle complex systems with mixed contributions from an Active Galactic Nucleus (AGN) and its host galaxy in both continua and emission lines (e.g., narrow lines and broad outflow lines). The fitting strategy is optimized to enable an efficient solution of the best-fit results for several tens of parameters and model components. S3Fit is also extensible, allowing users to add functions and components such as new band filters, star formation history functions, emission lines, and new model types.
This library of synthetic X-ray spectra provides a tabulated version of the slim disk model for fitting tidal disruption events (TDEs). The library is created by ray-tracing stationary, general relativistic slim disks and consistently incorporating gravitational redshift, Doppler, and lensing effects.
NcorpiON integrates collisional and fragmenting systems of planetesimals or moonlets orbiting a central mass. It features a fragmentation model, based on crater scaling and ejecta models, that realistically simulates a violent impact. Written in C, the code detects collisions, computes mutual gravity, and can resolve a collision by fragmentation. The fast multipole expansions are implemented up to order six to allow for high precision in the mutual gravity computation.
Mini-chem solves chemical kinetics for gas giant atmospheric modeling. It is pared down from large chemical networks through the use of "net forward reaction tables"; this significantly reduces the number of reactions and species that must be evolved in the ODE solvers. The code's NCHO network consists of only 12 species with 10 reactions, making it a lightweight network that is easy to couple to large-scale 3D GCM models or to other models of interest (such as 1D or 2D kinetic modeling efforts). Mini-chem is written in Fortran and has three main parts: the input routine, the chemistry routines, and the stiff ODE solver.
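Mini-chem itself is Fortran with its own net-reaction tables; the toy SciPy example below only illustrates the general pattern of integrating a stiff kinetic system with an implicit ODE solver, using a single made-up reversible reaction.

    import numpy as np
    from scipy.integrate import solve_ivp

    # Toy net reaction A <-> B with very different forward and reverse rates (a stiff system).
    kf, kr = 1.0e3, 1.0e-1

    def rhs(_, n):
        net = kf * n[0] - kr * n[1]          # net forward rate
        return [-net, net]

    sol = solve_ivp(rhs, (0.0, 1.0), [1.0, 0.0], method="BDF", rtol=1e-8, atol=1e-12)
    print(sol.y[:, -1])                      # relaxes toward the equilibrium ratio kf/kr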
luas builds Gaussian processes (GPs) primarily for two-dimensional data sets. It uses different optimizations to make the application of GPs to 2D data sets possible within a reasonable timeframe. The code is implemented using Jax (ascl:2111.002), which helps calculate derivatives of the log-likelihood as well as permitting the code to be easily run on either CPU or GPU. luas can be used with popular inference frameworks such as NumPyro and PyMC. The package makes it easier to account for systematics correlated across two dimensions in data sets, in addition to being helpful for any other applications (e.g., interpolation).
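The 2D trick the entry alludes to can be illustrated with plain numpy (luas's own Jax-based API differs): when the covariance factors as a Kronecker product of per-axis kernels plus white noise, the log-likelihood only requires eigendecompositions of the two small kernels rather than of the full matrix. The kernels and data below are arbitrary examples.

    import numpy as np

    def kron_gp_loglike(Y, K_row, K_col, sigma):
        """Log-likelihood of vec(Y) ~ N(0, K_row (x) K_col + sigma^2 I), exploiting the
        Kronecker structure via eigendecompositions of the two per-axis kernels."""
        lam_r, Q_r = np.linalg.eigh(K_row)
        lam_c, Q_c = np.linalg.eigh(K_col)
        lam = np.outer(lam_r, lam_c) + sigma**2      # eigenvalues of the full covariance
        Yt = Q_r.T @ Y @ Q_c                         # data rotated into the joint eigenbasis
        return -0.5 * (np.sum(Yt**2 / lam) + np.sum(np.log(lam)) + Y.size * np.log(2 * np.pi))

    # Example: squared-exponential kernels along the time and wavelength axes.
    t = np.linspace(0, 1, 40)
    w = np.linspace(0, 1, 30)
    K_t = np.exp(-0.5 * (t[:, None] - t[None, :]) ** 2 / 0.1**2)
    K_w = np.exp(-0.5 * (w[:, None] - w[None, :]) ** 2 / 0.3**2)
    Y = np.random.default_rng(0).standard_normal((40, 30))
    print(kron_gp_loglike(Y, K_t, K_w, sigma=1.0))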
kpic_pipeline reduces data taken with the Keck Planet Imager and Characterizer (KPIC). Written in Python, the code processes high resolution spectroscopy data taken with KPIC to study exoplanet atmospheres; it processes and calibrates the data to enable spectroscopic model fitting. kpic_pipeline can reduce the observed data into 1D spectra for one given science target or can be used to reduce the full nightly data.
IsoFATE (Isotopic Fractionation via ATmospheric Escape) models mass fractionation resulting from diffusive separation in escaping planetary atmospheres and numerically computes atmospheric species abundance over time. The model is tuned to sub-Neptune sized planets with rocky cores of Earth-like bulk composition and primordial H/He atmospheres. F, G, K, and M type stellar fluxes are readily implemented. IsoFATE has two versions, the first of which simulates a ternary mixture of H, He, and D (deuterium); the second version is coupled to the magma ocean-atmosphere equilibrium chemistry model Atmodeller.
The IGRINS_transit data reduction pipeline takes high-resolution observations of transiting exoplanets with Gemini-S/IGRINS and produces cross-correlation detections of molecules in the exoplanet's atmosphere. IGRINS_transit removes low signal-to-noise orders, performs a secondary wavelength calibration, and uses a singular value decomposition (SVD) to separate out the signature of the transiting planet from the host star and telluric contamination.
GPS (Genesis Population Synthesis) develops population synthesis models. The code suite uses the Genesis database of planet formation models for small exoplanets (super-Earths and mini-Neptunes). Although the codebase focuses on the Genesis models, other models can easily be integrated with GPS. It computes the bulk compositions of the planets and simulates atmospheric loss and evolution to find the final states of the planets that can be observationally verified. GPS also offers tools to process and analyze the data from recent observations of small exoplanets in order to compare them with the models.
Gollum performs spectral visualization and analysis. It offers both a programmatic interface and a visual interface that help users analyze stellar and substellar spectra, with support included for a set of precomputed synthetic spectral model grids.
GEOCLIM.jl, written in Julia, replicates some features of the original GEOCLIM model written in Fortran. It also extends the original weathering formulations, WHAK (which ignores direct dependence on pCO2) and MAC (which includes direct pCO2 dependence). The code estimates global silicate weathering rates from gridded climatology. GEOCLIM.jl estimates weathering during periods of Earth history when the continental configuration was radically different, typically more than 100 million years ago, and includes functions to compute, for example, land/ocean fraction, area-weighted averages and sums, and land mass perimeter, among other values.
Given mass, radius, and equilibrium temperature, ExoMDN can deliver a full posterior distribution of mass fractions and thicknesses of each planetary layer. A machine-learning model for the interior characterization of exoplanets based on Mixture Density Networks (MDN), ExoMDN is trained on a large dataset of more than 5.6 million synthetic planets below 25 Earth masses. These synthetic planets consist of an iron core, a silicate mantle, a water and high-pressure ice layer, and a H/He atmosphere. ExoMDN uses log-ratio transformations to convert the interior structure data into a form that the MDN can easily handle.
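The log-ratio idea mentioned above can be sketched in a few lines; this shows a generic additive log-ratio transform and its inverse (not necessarily the exact variant ExoMDN uses), mapping mass fractions that must sum to one onto unconstrained coordinates the network can regress on.

    import numpy as np

    def alr(fractions, eps=1e-12):
        """Additive log-ratio transform: fractions on the simplex -> unconstrained values."""
        f = np.clip(np.asarray(fractions, dtype=float), eps, None)
        return np.log(f[:-1] / f[-1])

    def alr_inverse(z):
        """Inverse transform: unconstrained values -> fractions summing to one."""
        e = np.exp(np.append(z, 0.0))
        return e / e.sum()

    layers = [0.32, 0.55, 0.10, 0.03]      # core, mantle, water/ice, H/He (illustrative)
    print(alr_inverse(alr(layers)))        # recovers the original fractions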
DIA (Delta function Difference Imaging Code) provides a difference image analysis pipeline that employs a delta-function kernel; this is useful for reducing TESS Full Frame Images. DIA's scripts are available in both Python and IDL and are nearly identical in their outputs. Together, the scripts make a pipeline that cleans and aligns images, generates a master frame by combining all available images, performs image subtraction, generates light curves, and does a basic detrending to the light curves based on magnitude. DIA can also apply bias subtraction, flat fielding, background subtraction and align images to the first image in the list.
CROCODILE (CROss-COrrelation retrievals of Directly-Imaged self-Luminous Exoplanets) runs atmospheric retrievals of directly observed gas giant exoplanets by adopting adequate likelihood functions. The code makes use of petitRADTRANS (ascl:2207.014) and PyMultiNest (ascl:1606.005) and provides a statistical framework to interpret the photometry, low-resolution spectroscopy, and medium (and higher) resolution cross-correlation spectroscopy.
Bioverse assesses the diagnostic power of a statistical exoplanet survey of the properties of nearby terrestrial exoplanets via direct imaging or transit spectroscopy. It combines Gaia-based stellar samples with Kepler-derived exoplanet demographics and a mission simulator that enables exploration of a variety of observing, follow-up, and characterization strategies. The code contains a versatile module for population-level hypothesis testing supporting trade studies and survey optimization. Bioverse supports direct imaging or transit missions, and its modularity makes it adaptable to any mission concept that makes measurements on a sample of exoplanets.
ATMOSPHERIX reads t.fits files from the Canada-France-Hawaii Telescope's near-infrared spectropolarimeter SPIRou, processes the data to remove telluric/stellar contributions, and performs the correlation analysis for a given planet atmosphere template. The correlation function computes the correlation between the data and model for a grid of planet velocimetric semi-amplitude and systemic velocity. ATMOSPHERIX takes transmission spectroscopy into account and allows the user to inject a synthetic planet if desired.
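The velocity-grid correlation described above reduces to a simple double loop over (Kp, Vsys); the numpy sketch below uses random "residuals" and a single Gaussian line as stand-ins for real SPIRou data and an atmosphere template, and does not reproduce ATMOSPHERIX's actual interface.

    import numpy as np

    c_kms = 299_792.458
    wave = np.linspace(1600.0, 1605.0, 2000)                       # nm, illustrative
    phases = np.linspace(-0.02, 0.02, 20)                          # orbital phases near transit
    residuals = np.random.normal(size=(phases.size, wave.size))    # telluric/stellar-cleaned data
    template = np.exp(-0.5 * ((wave - 1602.5) / 0.02) ** 2)        # toy planetary spectrum

    kp_grid = np.linspace(100.0, 250.0, 31)                        # semi-amplitude, km/s
    vsys_grid = np.linspace(-50.0, 50.0, 41)                       # systemic velocity, km/s
    ccf = np.zeros((kp_grid.size, vsys_grid.size))
    for i, kp in enumerate(kp_grid):
        for j, vsys in enumerate(vsys_grid):
            v_planet = vsys + kp * np.sin(2.0 * np.pi * phases)    # planet RV at each phase
            for p, v in enumerate(v_planet):
                shifted = np.interp(wave, wave * (1.0 + v / c_kms), template)
                ccf[i, j] += np.dot(residuals[p], shifted)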
APPLESOSS (A Producer of ProfiLEs for SOSS) builds 2D spatial profiles for the first, second, and third diffraction orders for a NIRISS/SOSS GR700XD/CLEAR observation. The profiles are entirely data driven, retain a high level of fidelity to the original observations, and can be used as the specprofile reference file for ATOCA (ascl:2502.016). They can also be used as a PSF weighting for optimal extractions.
AESTRA (Auto-Encoding STellar Radial-velocity and Activity) uses deep learning for precise radial velocity measurements in the presence of stellar activity noise. The architecture combines a convolutional radial-velocity estimator and a spectrum auto-encoder. The input consists of a collection of hundreds or more of spectra of a single star, which span a variety of activity states and orbital motion phases of any potential planets. AESTRA does not require any prior knowledge about the star.
exoscene simulates direct images of exoplanetary systems. Written in Python, the package has three modules. These modules can determine a planet's relative astrometry ephemeris, its phase function, and flux ratio, compute the band-integrated irradiance of a star, and accurately resample an image model array to a detector array. exoscene also offers modeling and mapping functions and has additional capabilities.
This repository implements an optimized XGBoost-based framework for photometric classification of Type Ia supernovae, addressing class imbalance through PR-AUC and F1-score prioritization. The approach is designed for scalability in large-scale astronomical surveys such as LSST and ensures improved classification robustness compared to traditional metrics like ROC-AUC.
Spinifex is a pure Python toolkit for ionospheric corrections in radio astronomy, e.g., obtaining total electron content and rotation measures.
Deep-Transit detects transits using a deep learning based 2D object detection algorithm. The code determines the light curve and outputs the transiting candidates' bounding boxes and confidence scores. It has been trained for Kepler and TESS data, and can be extended to other photometric surveys and even ground-based observations. Deep-Transit also provides an interface for training new datasets.
ROCKE-3D (Resolving Orbital and Climate Keys of Earth and Extraterrestrial Environments with Dynamics) models the atmospheres and oceans of solar system and exoplanetary terrestrial planets. Written in Fortran, it is a three-dimensional General Circulation Model (GCM). ROCKE-3D requires Panoply, the SOCRATES radiation code and spectral files, and has several additional dependencies.
The spectools_ir suite analyzes medium/high-resolution IR molecular astronomical spectra. It has three main sub-modules (flux_calculator, slabspec, and slab_fitter) and also offers a sub-module (utils) with a few additional functions. Written with infrared medium/high-resolution molecular spectroscopy in mind, spectools_ir generally assumes spectra are in units of Jy and microns and uses information from the HITRAN molecular database. Some routines are more general, but users interested in other applications should proceed with caution.
NbodyGradient computes gradients of N-body integrations for Newtonian gravity and arbitrary N-body hierarchies. Developed for transit-timing analyses and written in Julia, NbodyGradient gives derivatives of the transit times with respect to the initial conditions, either masses and Cartesian coordinates/velocities or orbital elements.
The Hierarchical Semi-Sparse Cube (HiSS-Cube) framework provides highly parallel processing of combined multi-modal, multi-dimensional big data. The package builds a database on top of the HDF5 framework which supports parallel queries. A database index on top of HDF5 can be easily constructed in parallel, and the code supports efficient multi-modal big data combinations. The performance of HiSS-Cube is bounded by the I/O bandwidth and I/O operations per second of the underlying parallel file system; it scales linearly with the number of I/O nodes and can be extended to any kind of multidimensional data combination and information retrieval.
Spectool is a toolkit designed for processing astronomical spectral data, offering a collection of common spectral analysis algorithms. The package includes functions for spectral resampling, spectral flattening, radial velocity measurements, spectral convolution broadening, and more. Each function in the package is implemented independently, allowing users to select and utilize the desired features as needed. The functions are designed with simple and intuitive interfaces, ensuring ease of use for various data sets and analysis tasks.
hmvec is a pure Python/numpy vectorized general halo model and HOD code. It includes support for 3D power spectra involving NFW profiles, Battaglia electron density profiles, and galaxy HODs. It also supports 2D power spectra including tSZ, cosmic shear, galaxy-galaxy lensing, and CMB lensing. hmvec calculates a vectorized FFT for a given profile over all points in mass and redshift, using one double loop over mass and redshift to interpolate the profile Fourier transforms to the target wavenumbers; every other part of the code is vectorized.
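The vectorization pattern described above looks roughly like the toy numpy sketch below; an arbitrary truncated profile and a crude scale-radius scaling are used purely for illustration, and the radial transform is done by direct quadrature rather than hmvec's actual FFT machinery.

    import numpy as np

    masses = np.logspace(12, 15, 8)          # Msun, illustrative grid
    zs = np.linspace(0.0, 2.0, 5)
    r = np.logspace(-3, 1.5, 512)            # Mpc
    k_native = np.logspace(-2, 2, 256)       # wavenumbers where the transform is tabulated
    k_target = np.logspace(-1, 1, 64)        # wavenumbers actually needed downstream

    # Toy NFW-like profile with a crude scale radius depending on mass and redshift.
    rs = (0.1 * (masses[:, None] / 1e13) ** (1.0 / 3.0))[:, :, None] / (1.0 + zs[None, :, None])
    rho = 1.0 / ((r / rs) * (1.0 + r / rs) ** 2)                  # shape (Nm, Nz, Nr)

    # u(k) = 4*pi * int r^2 rho(r) sin(kr)/(kr) dr, vectorized over all masses and redshifts.
    kernel = 4.0 * np.pi * r**2 * np.sinc(k_native[:, None] * r[None, :] / np.pi)
    dr = np.gradient(r)
    uk = np.sum(rho[:, :, None, :] * kernel[None, None, :, :] * dr, axis=-1)   # (Nm, Nz, Nk)

    # One double loop over mass and redshift to interpolate onto the target wavenumbers.
    uk_target = np.empty((masses.size, zs.size, k_target.size))
    for i in range(masses.size):
        for j in range(zs.size):
            uk_target[i, j] = np.interp(k_target, k_native, uk[i, j])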
SZiFi (pronounced "sci-fi") implements the iterative multi-frequency matched filter (iMMF) galaxy cluster finding method. It can be used to detect galaxy clusters with mm intensity maps through their thermal Sunyaev-Zeldovich (tSZ) signal. As a novel feature, SZiFi can perform foreground deprojection via a spectrally constrained MMF or sciMMF, and can also be used for point source detection.
cosmocnc evaluates the number count likelihood of galaxy cluster catalogs. Fast Fourier Transform (FFT) convolutions are used to evaluate some of the likelihood integrals. The code supports three types of likelihoods (unbinned, binned, and an extreme value likelihood); it also supports the addition of stacked cluster data (e.g., stacked lensing profiles), which is modeled in a way consistent with the cluster catalog. The package produces mass estimates for each cluster in the sample, which are derived assuming the hierarchical model that is used to model the mass observables, and generates synthetic cluster catalogs for a given observational set-up. cosmocnc interfaces with the Markov chain Monte Carlo (MCMC) code Cobaya (ascl:1910.019), allowing for easy-to-run MCMC parameter estimation.
Sledgehamr (ScaLar fiEld Dynamics Getting solvEd witH Adaptive Mesh Refinement) simulates the dynamics of coupled scalar fields on a 3-dimensional mesh. Adaptive mesh refinement (AMR) can boost performance if spatially localized regions of the scalar field require high resolution. Sledgehamr is compatible with both GPU and CPU clusters, and, because it is AMReX-based (ascl:2409.012), offers a flexible and customizable framework. This framework enables various applications, such as the generation of gravitational wave spectra.
Based on oxkat (ascl:2009.003), polkat focuses on automating full polarization calibration and snapshot (i.e., second-scale) imaging of polarimetric radio data taken with the MeerKAT telescope. Accepting raw visibilities in Measurement Set format, polkat performs the necessary data editing, calibration (reference and self-calibration), and imaging to extract the complete polarization properties for user-defined target sources. Required software packages, including, but not limited to, CASA (ascl:1107.013), WSClean (ascl:1408.023), and QuartiCal (ascl:2305.006) are containerized with Apptainer/Singularity. polkat can be run locally or on high-performance computing that uses a slurm job scheduler; for the latter option, polkat will generate the necessary job submission files.
The Python code smhr (Spectroscopy Made Harder) wraps the MOOG spectral synthesis code (ascl:1202.009) to analyze high-resolution stellar spectra. It offers numerous analysis tools, including normalization of apertures, inverse variance-weighted stitching of overlapping apertures and/or sequential exposures. The code also provides Doppler measurement and correction, automatic measurement of EWs, and multiple methods for inferring stellar parameters; further, it measures elemental abundances from EWs or spectral synthesis and performs a rigorous uncertainty analysis. smhr can be run automatically (in batch mode) or interactively through a graphical user interface. Analyses can be saved to a single file for, for example, distribution to other spectroscopists or release with a publication.
legacypipe produces the image and catalog data products of the DESI Legacy Imaging Surveys (aka the Legacy Surveys). It can process individual exposures from many cameras, including the Dark Energy Camera on the Blanco telescope, the 90Prime camera on the Bok telescope, and the Mosaic3 camera on the Mayall telescope. The code can also process exposures from the Hyper Suprime-Cam on Subaru, the old SuprimeCam on Subaru, MegaCam on the Canada-France-Hawaii Telescope, and image products from the GALEX and WISE satellites. Legacypipe performs source detection, and then measurement via forward-modeling using The Tractor (ascl:1604.008). It generates coadded output images as well as catalogs, plus a variety of metrics useful for understanding the properties of the imaging.