ASCL.net

Astrophysics Source Code Library

Making codes discoverable since 1999

Welcome to the ASCL

The Astrophysics Source Code Library (ASCL) is a free online registry and repository for source codes of interest to astronomers and astrophysicists, including solar system astronomers, and lists codes that have been used in research that has appeared in, or been submitted to, peer-reviewed publications. The ASCL is indexed by the SAO/NASA Astrophysics Data System (ADS) and Web of Science and is citable by using the unique ascl ID assigned to each code. The ascl ID can be used to link to the code entry by prefacing the number with ascl.net (e.g., ascl.net/1201.001).


Most Recently Added Codes

2024 Oct 14

[submitted] DAXA: Traversing the X-ray desert by Democratising Archival X-ray Astronomy

We introduce a new, open-source, Python module for the acquisition and processing of archival data from many X-ray telescopes: Democratising Archival X-ray Astronomy (hereafter referred to as DAXA). Our software is built to increase access to, and use of, large archives of X-ray astronomy data, providing a unified, easy-to-use, Python interface to the disparate archives and processing tools. We provide this interface for the majority of X-ray telescopes launched within the last 30 years. This module enables much greater access to X-ray data for non-specialists, while preserving low-level control of processing for X-ray experts. It is useful for identifying relevant observations of a single object of interest, but it excels at creating multi-mission datasets for serendipitous or targeted studies of large samples of X-ray emitting objects. The management and organization of datasets is also made easier; DAXA archives can be version controlled and updated if new data become available. Once relevant observations are identified, the raw data can be downloaded (and optionally processed) through DAXA, or pre-processed event lists, images, and exposure maps can be downloaded if they are available. X-ray observations are perfectly suited to serendipitous discoveries and archival analyses, and with a decade-long 'X-ray desert' potentially on the horizon, archival data will take on even greater importance; enhanced access to those archives will be vital to the continuation of X-ray astronomy.

2024 Oct 13

[ascl:2410.002] pysymlog: Symmetric (signed) logarithm scale for Python plots

pysymlog provides utilities for binning, normalizing colors, wrangling tick marks, and other tasks, in symmetric logarithm space. For numbers spanning positive and negative values, the code works in log scale with a transition through zero, down to some threshold. This is useful for representing data that span many scales, as standard log-space does, but that include values of zero or even negative values. pysymlog provides convenient functions for creating 1D and 2D histograms and symmetric log bins, generating logspace-like arrays through zero, and managing matplotlib major and minor ticks in symlog space, as well as bringing symmetric log scaling functionality to plotly.
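
The core idea is a sign-preserving transform that is linear near zero and logarithmic far from it. The sketch below shows one common form of that mapping (the same idea behind matplotlib's built-in 'symlog' axis scale); it is an illustration of the concept, not pysymlog's own API, and the `linthresh` name is borrowed from matplotlib.

```python
import numpy as np

def symlog(x, linthresh=1e-3):
    """One common symmetric-log transform: odd in x, roughly linear
    within +/- linthresh, and logarithmic well outside it."""
    x = np.asarray(x, dtype=float)
    return np.sign(x) * np.log10(1.0 + np.abs(x) / linthresh)

# Values spanning negative and positive decades, passing through zero
vals = np.array([-1e2, -1e-3, 0.0, 1e-3, 1e2])
out = symlog(vals)
```

Because the transform is odd and finite at zero, histogram bins and axis ticks built in this space can span negative and positive values in a single consistent scale.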

2024 Oct 11

[submitted] RadioSunPy: A Robust Preprocessing Pipeline for RATAN-600 Solar Radio Observations Data

This paper introduces RadioSunPy, an open-source Python package developed for accessing, visualizing, and analyzing multi-band radio observations of the Sun from the RATAN-600 solar complex. The advancement of observational technologies and software for processing and visualizing spectro-polarimetric microwave data obtained with the RATAN-600 radio telescope opens new opportunities for studying the physical characteristics of solar plasma at the levels of the chromosphere and corona. These levels remain difficult to detect in the ultraviolet and X-ray ranges. The development of these methods allows for more precise investigation of the fine structure and dynamics of the solar atmosphere, thereby deepening our understanding of the processes occurring in these layers. The data can also be used for diagnosing solar plasma and forecasting solar activity. However, using RATAN-600 data requires extensive data processing and familiarity with the instrument. The package offers comprehensive data processing functionalities, including direct access to raw data, essential processing steps such as calibration and quiet Sun normalization, and tools for analyzing solar activity. This includes automatic detection of local sources, identifying them with NOAA (National Oceanic and Atmospheric Administration) active regions, and further determining parameters for local sources and active regions. By streamlining data processing workflows, RadioSunPy enables researchers to investigate the fine structure and dynamics of the solar atmosphere more efficiently, contributing to advancements in solar physics and space weather forecasting.

2024 Oct 10

[submitted] ysoisochrone: A Python package to estimate masses and ages for YSOs

ysoisochrone is a Python3 package that handles isochrones for young stellar objects (YSOs) and utilizes them to derive stellar masses and ages. Our primary method is a Bayesian inference approach, and the Python code builds on the IDL version developed in Pascucci et al. (2016). The code estimates stellar masses, ages, and associated uncertainties by comparing a star's effective temperature, bolometric luminosity, and their uncertainties with different stellar evolutionary models, including those specifically developed for YSOs. User-developed evolutionary tracks can also be utilized when provided in the specific format described in the code documentation.
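
The Bayesian grid approach described above can be sketched in a few lines: evaluate a Gaussian likelihood for the observed (Teff, Lbol) over a grid of model (mass, age) points and take posterior moments. The toy mass-age-to-observables relations below are invented for illustration only (they are not real evolutionary tracks), and none of this reflects ysoisochrone's actual API.

```python
import numpy as np

# Hypothetical toy model grid: Teff and log L as smooth functions of
# mass and age (NOT real evolutionary tracks; illustration only)
masses = np.linspace(0.1, 2.0, 50)           # Msun
ages = np.linspace(0.5, 10.0, 60)            # Myr
M, A = np.meshgrid(masses, ages, indexing="ij")
teff_grid = 3000.0 + 1500.0 * M - 20.0 * A   # K (toy relation)
logl_grid = -1.0 + 1.2 * M - 0.05 * A        # log Lsun (toy relation)

# Observed star with Gaussian uncertainties
teff_obs, teff_err = 4200.0, 100.0
logl_obs, logl_err = 0.2, 0.1

# Gaussian likelihood on the grid, flat prior over the grid points
chi2 = ((teff_grid - teff_obs) / teff_err) ** 2 \
     + ((logl_grid - logl_obs) / logl_err) ** 2
post = np.exp(-0.5 * chi2)
post /= post.sum()

# Posterior mean and standard deviation of the stellar mass
m_mean = np.sum(post * M)
m_std = np.sqrt(np.sum(post * (M - m_mean) ** 2))
```

With real tracks, the grid is irregular in (mass, age), so interpolation onto a common grid is an additional step before the likelihood evaluation.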

2024 Oct 08

[submitted] Kete

The kete tools are intended to enable the simulation of all-sky surveys of solar system objects. This includes multi-body physics orbital dynamics, thermal and optical modeling of the objects, as well as field of view and light delay corrections. These tools, in conjunction with the Minor Planet Center's (MPC) database of known asteroids, can be used not only to plan surveys but also to predict which objects are visible to existing or past surveys.

The primary goal of kete is to provide a set of tools that can operate on the entire MPC catalog at once, without having to do queries on specific objects. It has been used to simulate over 10 years of survey time for the NEO Surveyor mission using 10 million main-belt and near-Earth asteroids.

2024 Oct 02

[submitted] vortex-p: a Helmholtz-Hodge and Reynolds decomposition algorithm for particle-based simulations

Astrophysical turbulent flows display an intrinsically multi-scale nature, making their numerical simulation and the subsequent analyses of simulated data a complex problem. In particular, two fundamental steps in the study of turbulent velocity fields are the Helmholtz-Hodge decomposition (compressive+solenoidal; HHD) and the Reynolds decomposition (bulk+turbulent; RD). These problems are relatively simple to perform numerically for uniformly sampled data, such as that emerging from Eulerian, fixed-grid simulations; but their computation is remarkably more complex in the case of non-uniformly sampled data, such as that stemming from particle-based or meshless simulations. In this paper, we describe, implement and test vortex-p, a publicly available tool evolved from the vortex code, to perform both these decompositions on the velocity fields of particle-based simulations, whether from smoothed particle hydrodynamics (SPH), moving-mesh, or meshless codes. The algorithm relies on the creation of an ad-hoc adaptive mesh refinement (AMR) set of grids, on which the input velocity field is represented. HHD is then addressed by means of elliptic solvers, while for the RD we adapt an iterative, multi-scale filter. We perform a series of idealised tests to assess the accuracy, convergence and scaling of the code. Finally, we present some applications of the code to various SPH and meshless finite-mass (MFM) simulations of galaxy clusters performed with OpenGadget3, with different resolutions and physics, to showcase the capabilities of the code.
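
For the uniformly sampled case the abstract calls "relatively simple", the Helmholtz-Hodge decomposition reduces to a spectral projection: in Fourier space the compressive part of the velocity is k(k.v)/|k|^2 and the solenoidal part is the remainder. The 2D numpy sketch below illustrates that projection on a periodic grid; it is a conceptual illustration, not vortex-p's AMR elliptic-solver approach for particle data.

```python
import numpy as np

n = 64
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
X, Y = np.meshgrid(x, x, indexing="ij")

# Field with known parts: compressive grad(sin X + sin Y) = (cos X, cos Y)
# plus solenoidal (sin Y, -sin X), which is divergence-free
vx = np.cos(X) + np.sin(Y)
vy = np.cos(Y) - np.sin(X)

kx = np.fft.fftfreq(n, d=2 * np.pi / n) * 2 * np.pi  # integer wavenumbers
KX, KY = np.meshgrid(kx, kx, indexing="ij")
k2 = KX**2 + KY**2
k2[0, 0] = 1.0  # avoid 0/0; the mean mode has no compressive part

vxh, vyh = np.fft.fft2(vx), np.fft.fft2(vy)
kdotv = KX * vxh + KY * vyh          # k . v_hat (factors of i cancel)
vx_comp = np.real(np.fft.ifft2(KX * kdotv / k2))  # compressive projection
vy_comp = np.real(np.fft.ifft2(KY * kdotv / k2))
vx_sol = vx - vx_comp                # solenoidal remainder
vy_sol = vy - vy_comp
```

For the single Fourier modes used here the projection is exact to machine precision; the hard part that vortex-p addresses is representing a particle-sampled field on grids well enough for an equivalent (elliptic-solver) decomposition to be meaningful.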

2024 Oct 01

[ascl:2410.001] GalCraft: Building integral-field spectrograph data cubes of the Milky Way

GalCraft creates mock integral-field spectroscopic (IFS) observations of the Milky Way and other hydrodynamical/N-body simulations. It conducts all the procedures from inputting data and spectral templates to the output of IFS data cubes in FITS format. The produced mock data cubes can be analyzed in the same way as real IFS observations by many methods, particularly codes like Voronoi binning (ascl:1211.006), pPXF (ascl:1210.002), line-strength indices, or a combination of them (e.g., the GIST pipeline, ascl:1907.025). The code is implemented using Python-native parallelization. GalCraft will be particularly useful for directly comparing the Milky Way with other MW-like galaxies in terms of kinematics and stellar population parameters, and ultimately for linking Galactic and extragalactic observations to study galaxy evolution.

2024 Sep 30

[ascl:2409.020] pyRRG: Weak lensing shape measurement code

pyRRG measures the second- and fourth-order moments of galaxy images, using a TinyTim model to correct for PSF distortions. The code is invariant to the number of exposures and the orientation of the drizzled images. pyRRG uses a machine learning algorithm to automatically classify stars and galaxies; this can also be done manually if greater accuracy is needed.
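
The second-order moments mentioned above are weighted quadrupole moments of the surface brightness, from which an ellipticity is formed. The sketch below illustrates that standard quantity on a synthetic postage stamp; it is a generic illustration of weighted moments, not pyRRG's implementation (which iterates the centroid and applies PSF corrections).

```python
import numpy as np

def weighted_moments(img, sigma_w=5.0):
    """Gaussian-weighted second-order brightness moments and the
    ellipticity (e1, e2) built from them. One-pass centroid for brevity;
    real shape pipelines iterate the centroid and weight."""
    ny, nx = img.shape
    y, x = np.mgrid[0:ny, 0:nx].astype(float)
    xc = (x * img).sum() / img.sum()          # flux-weighted centroid
    yc = (y * img).sum() / img.sum()
    w = np.exp(-((x - xc) ** 2 + (y - yc) ** 2) / (2 * sigma_w**2))
    norm = (w * img).sum()
    qxx = (w * img * (x - xc) ** 2).sum() / norm
    qyy = (w * img * (y - yc) ** 2).sum() / norm
    qxy = (w * img * (x - xc) * (y - yc)).sum() / norm
    e1 = (qxx - qyy) / (qxx + qyy)
    e2 = 2 * qxy / (qxx + qyy)
    return e1, e2

# Elliptical Gaussian elongated along x: expect e1 > 0 and e2 ~ 0
ny = nx = 65
y, x = np.mgrid[0:ny, 0:nx].astype(float)
img = np.exp(-((x - 32) ** 2 / (2 * 4.0**2)
               + (y - 32) ** 2 / (2 * 2.0**2)))
e1, e2 = weighted_moments(img)
```

The weight function suppresses noisy outer pixels, which is why weighted (rather than unweighted) moments are used in practice; the weighting bias is what the higher-order (fourth-moment) corrections address.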

[ascl:2409.019] Padé: Protoplanetary disk turbulence simulator

Padé simulates protoplanetary disk hydrodynamics in cylindrical coordinates. Written in Fortran90, it is a finite-difference code, and the standard compact 4th-order Padé scheme is used for spatial differencing. Padé differentiation is known to have spectral-like resolving power. The z direction can be periodic or non-periodic. A 4th-order Runge-Kutta scheme is used for time advancement. Padé implements a version of the FARGO technique to eliminate the time-step restriction imposed by Keplerian advection, and shocks that are not too strong can be captured using artificial bulk viscosity.
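
The standard compact 4th-order Padé scheme couples neighboring derivative values through a tridiagonal system: (1/4) f'_{i-1} + f'_i + (1/4) f'_{i+1} = (3/4h)(f_{i+1} - f_{i-1}). The sketch below solves this on a periodic grid with a dense cyclic solve for clarity; it illustrates the numerical scheme only, and is unrelated to the Fortran90 code's implementation (which would use a periodic tridiagonal solver).

```python
import numpy as np

def pade_derivative_periodic(f, h):
    """First derivative via the standard compact 4th-order Pade scheme,
    (1/4) f'_{i-1} + f'_i + (1/4) f'_{i+1} = 3/(4h) (f_{i+1} - f_{i-1}),
    on a periodic grid. Dense solve for brevity."""
    n = len(f)
    A = np.eye(n)
    for i in range(n):
        A[i, (i - 1) % n] = 0.25   # periodic wrap of the tridiagonal
        A[i, (i + 1) % n] = 0.25
    rhs = 3.0 / (4.0 * h) * (np.roll(f, -1) - np.roll(f, 1))
    return np.linalg.solve(A, rhs)

n = 64
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
h = x[1] - x[0]
df = pade_derivative_periodic(np.sin(x), h)
err = np.max(np.abs(df - np.cos(x)))   # truncation error ~ h**4
```

The implicit coupling is what gives the scheme its spectral-like resolving power: for the same stencil width, its modified wavenumber tracks the exact one far better than an explicit 4th-order difference.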

[ascl:2409.018] PySR: High-Performance Symbolic Regression in Python and Julia

PySR performs Symbolic Regression; it uses machine learning to find an interpretable symbolic expression that optimizes some objective. Over a period of several years, PySR has been engineered from the ground up to be (1) as high-performance as possible, (2) as configurable as possible, and (3) easy to use. PySR is developed alongside the Julia library SymbolicRegression.jl, which forms the powerful search engine of PySR. Symbolic regression works best on low-dimensional datasets, but one can also extend these approaches to higher-dimensional spaces by using "Symbolic Distillation" of Neural Networks. Here, one essentially uses symbolic regression to convert a neural net to an analytic equation. Thus, these tools simultaneously present an explicit and powerful way to interpret deep neural networks.
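
At its core, symbolic regression searches a space of candidate expressions and scores each by how well it fits the data. The toy sketch below shows that idea with a fixed, hand-written candidate library; PySR's actual engine is far more sophisticated (an evolutionary search over expression trees in SymbolicRegression.jl), and this is not its API. For real use, PySR exposes a scikit-learn-style `PySRRegressor` with a `fit(X, y)` method.

```python
import numpy as np

# Toy symbolic regression: score a small library of candidate
# expressions against data drawn from y = x**2 + x, keep the best fit.
rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, 200)
y = x**2 + x

candidates = {
    "x": lambda x: x,
    "x**2": lambda x: x**2,
    "x**2 + x": lambda x: x**2 + x,
    "sin(x)": lambda x: np.sin(x),
    "exp(x)": lambda x: np.exp(x),
}

mse = {name: np.mean((f(x) - y) ** 2) for name, f in candidates.items()}
best = min(mse, key=mse.get)   # expression with the lowest error
```

A real search generates and mutates candidates rather than enumerating a fixed list, and trades off accuracy against expression complexity, which is what makes the recovered equations interpretable.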