ASCL.net

Astrophysics Source Code Library

Making codes discoverable since 1999

Browsing Codes

[ascl:1801.006] DecouplingModes: Passive modes amplitudes

DecouplingModes calculates the amplitude of the passive modes, which requires solving the Einstein equations on superhorizon scales sourced by the anisotropic stress from the magnetic fields (prior to neutrino decoupling), and the magnetic and neutrino stress (after decoupling). The code is available as a Mathematica notebook.

[ascl:1801.005] InitialConditions: Initial series solutions for perturbations in our Universe

InitialConditions finds the initial series solutions for perturbations in our Universe. This includes all scalar (1 adiabatic, 4 isocurvature and 2 magnetic modes), vector (1 vorticity mode, 1 magnetic mode), and tensor (1 gravitational wave mode and 1 magnetic mode) perturbations including terms up to second order in the neutrino mass. It can handle the standard species (cdm, baryons, photons), and two neutrino mass eigenstates (1 light, 1 heavy).

[ascl:1801.004] hh0: Hierarchical Hubble Constant Inference

hh0 is a Bayesian hierarchical model (BHM) that describes the full distance ladder, from nearby geometric-distance anchors through Cepheids to SNe in the Hubble flow. It does not rely on any of the underlying distributions being Gaussian, allowing outliers to be modeled and obviating the need for any arbitrary data cuts.

[ascl:1801.003] Stan: Statistical inference

Stan facilitates statistical inference at the frontiers of applied statistics and provides both a modeling language for specifying complex statistical models and a library of statistical algorithms for computing inferences with those models. These components are exposed through interfaces in environments such as R, Python, and the command line.

[ascl:1801.002] iWander: Dynamics of interstellar wanderers

iWander assesses the origin of interstellar small bodies such as asteroids and comets. It includes a series of databases and tools that can be used in general for studying the dynamics of interstellar vagabond objects (small bodies, interstellar spaceships, and even stars).

[ascl:1801.001] BANYAN_Sigma: Bayesian classifier for members of young stellar associations

BANYAN_Sigma calculates the membership probability that a given astrophysical object belongs to one of the 27 currently known young associations within 150 pc of the Sun, using Bayesian inference. This tool uses the sky position and proper motion measurements of an object, with optional radial velocity (RV) and distance (D) measurements, to derive a Bayesian membership probability. By default, the priors are adjusted such that a probability threshold of 90% will recover 50%, 68%, 82% or 90% of true association members, depending on which observables are input (only sky position and proper motion; with RV; with D; with both RV and D, respectively). The algorithm is implemented as a Python package and in IDL, and is also available as an interactive web page.
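
A minimal Python sketch of the Bayesian membership step described above (not part of BANYAN_Sigma itself): per-hypothesis log-likelihoods of the observables are combined with priors and normalized. The hypothesis set, likelihood values, and priors below are illustrative assumptions only; the real tool models each hypothesis in 6D position-velocity space.

    import numpy as np

    def membership_probabilities(log_likelihoods, priors):
        """Posterior probability of each hypothesis (associations plus field)."""
        log_post = np.log(priors) + np.asarray(log_likelihoods, dtype=float)
        log_post -= log_post.max()          # guard against under/overflow
        post = np.exp(log_post)
        return post / post.sum()

    # Illustrative numbers only: two associations plus the field hypothesis.
    print(membership_probabilities([-10.2, -12.5, -9.8], [0.01, 0.01, 0.98]))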

[submitted] A Neural Network for the Identification of Dangerous Planetesimals (Including scripts for data generation)

Two neural networks were designed to identify hazardous planetesimals; both were trained on object trajectories calculated in a cloud computing environment. The first is a fully-connected network trained on the orbital elements (OEs) of real and simulated planetesimals, while the second is a one-dimensional convolutional neural network trained on the Cartesian position coordinates of real and simulated planetesimals. Ultimately, the network trained on OEs performed better, identifying one-third of known potentially hazardous objects, including the three asteroids with the highest chance of impacting Earth (2009 FD, 1999 RQ36, 1950 DA) as established by NASA's Monte Carlo-based Sentry system.
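
The two architectures described above could look roughly like the following Keras sketch; the layer counts, widths, and input shapes are guesses for illustration and are not taken from the submitted code.

    from tensorflow.keras import layers, models

    # Fully-connected network trained on orbital elements (assumed 6 per object).
    oe_net = models.Sequential([
        layers.Input(shape=(6,)),
        layers.Dense(64, activation="relu"),
        layers.Dense(64, activation="relu"),
        layers.Dense(1, activation="sigmoid"),      # P(hazardous)
    ])

    # 1-D convolutional network trained on Cartesian position time series
    # (assumed 128 sampled epochs of x, y, z per object).
    cnn_net = models.Sequential([
        layers.Input(shape=(128, 3)),
        layers.Conv1D(32, kernel_size=5, activation="relu"),
        layers.MaxPooling1D(2),
        layers.Conv1D(64, kernel_size=5, activation="relu"),
        layers.GlobalAveragePooling1D(),
        layers.Dense(1, activation="sigmoid"),
    ])

    for net in (oe_net, cnn_net):
        net.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])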

[submitted] loci: Smooth Cubic Multivariate Local Interpolations

loci is a shared library for interpolations in up to 4 dimensions. It is written in C and can be used with C/C++, Python, and other languages. To calculate the coefficients of the cubic polynomial, only local values are used: the data itself and all combinations of first-order derivatives (in 2D: f_x, f_y, and f_xy). This is in contrast to splines, where the coefficients are calculated not from derivatives but from non-local data, which can lead to over-smoothing of the result.
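
A minimal one-dimensional illustration of the local-interpolation idea (loci itself is a C library working in up to 4 dimensions): a cubic on each interval is fixed entirely by the values and first derivatives at its two endpoints.

    import numpy as np

    def cubic_hermite(x, x0, x1, f0, f1, d0, d1):
        """Cubic on [x0, x1] from endpoint values f0, f1 and derivatives d0, d1."""
        h = x1 - x0
        t = (x - x0) / h
        h00 = 2*t**3 - 3*t**2 + 1
        h10 = t**3 - 2*t**2 + t
        h01 = -2*t**3 + 3*t**2
        h11 = t**3 - t**2
        return h00*f0 + h10*h*d0 + h01*f1 + h11*h*d1

    # Interpolate sin(x) between two grid points using only local data.
    x0, x1 = 0.0, 0.5
    print(cubic_hermite(0.25, x0, x1, np.sin(x0), np.sin(x1), np.cos(x0), np.cos(x1)))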

[ascl:1712.016] LgrbWorldModel: Long-duration Gamma-Ray Burst World Model

LgrbWorldModel is written in Fortran 90 and attempts to model the population distribution of the long-duration class of Gamma-Ray Bursts (LGRBs) as detected by NASA's now-defunct Burst And Transient Source Experiment (BATSE) onboard the Compton Gamma Ray Observatory (CGRO). It is assumed that the population distribution of LGRBs is well fit by a multivariate log-normal distribution. The best-fit parameters of the distribution are then found by maximizing the likelihood of the data observed by the BATSE detectors via a native, built-in Adaptive Metropolis-Hastings Markov Chain Monte Carlo (AMH-MCMC) sampler.

[ascl:1712.015] SgrbWorldModel: Short-duration Gamma-Ray Burst World Model

SgrbWorldModel, written in Fortran 90, models the population distribution of the short-duration class of Gamma-Ray Bursts (SGRBs) as detected by NASA's now-defunct Burst And Transient Source Experiment (BATSE) onboard the Compton Gamma Ray Observatory (CGRO). It is assumed that the population distribution of SGRBs is well fit by a multivariate log-normal distribution, whose differential cosmological rate of occurrence follows the star formation rate (SFR) convolved with a log-normal binary-merger delay-time distribution. The best-fit parameters of the model are then found by maximizing the likelihood of the data observed by the BATSE detectors via a native, built-in Adaptive Metropolis-Hastings Markov Chain Monte Carlo (AMH-MCMC) sampler that is part of the code. A model for the detection algorithm of the BATSE detectors is also provided.

[ascl:1712.014] QATS: Quasiperiodic Automated Transit Search

QATS detects transiting extrasolar planets in time-series photometry. It relaxes the usual assumption of strictly periodic transits by permitting a variable, but bounded, interval between successive transits.

[ascl:1712.013] photodynam: Photodynamical code for fitting the light curves of multiple body systems

Photodynam facilitates so-called "photometric-dynamical" modeling. The model is quite simple, and this is reflected in the code base: an N-body code provides coordinates, and the photometric code produces light curves based on those coordinates.

[ascl:1712.012] MadDM: Computation of dark matter relic abundance

MadDM computes dark matter relic abundance and dark matter nucleus scattering rates in a generic model. The code is based on the existing MadGraph 5 architecture and as such is easily integrable into any MadGraph collider study. A simple Python interface offers a level of user-friendliness characteristic of MadGraph 5 without sacrificing functionality. MadDM is able to calculate the dark matter relic abundance in models which include a multi-component dark sector, resonance annihilation channels and co-annihilations. The direct detection module of MadDM calculates spin independent / spin dependent dark matter-nucleon cross sections and differential recoil rates as a function of recoil energy, angle and time. The code provides a simplified simulation of detector effects for a wide range of target materials and volumes.

[ascl:1712.011] FBEYE: Analyzing Kepler light curves and validating flares

FBEYE, the "Flares By-Eye" detection suite, is written in IDL and analyzes Kepler light curves and validates flares. It works on any 3-column light curve that contains time, flux, and error. The success of flare identification is highly dependent on the smoothing routine, which may not be suitable for all sources.
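
As a loose illustration of smoothing-dependent candidate flagging (not FBEYE's IDL routine, which relies on by-eye validation), one could flag points that sit well above a running median of the light curve:

    import numpy as np
    from scipy.ndimage import median_filter

    def flag_candidates(flux, error, window=25, nsigma=3.0):
        """Flag points lying more than nsigma above a running median of the flux."""
        smooth = median_filter(flux, size=window, mode="nearest")
        return (flux - smooth) > nsigma * error

    # Example on a flat light curve with one injected flare-like excursion.
    flux = np.ones(200); flux[100:103] += 0.05
    print(np.where(flag_candidates(flux, np.full(200, 0.01)))[0])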

[ascl:1712.010] Flux Tube: Solar model

Flux Tube is a nonlinear, two-dimensional, numerical simulation of magneto-acoustic wave propagation in the photosphere and chromosphere of small-scale flux tubes with internal structure. Waves with realistic periods of three to five minutes are studied, after horizontal and vertical oscillatory perturbations are applied to the equilibrium model. Spurious reflections of shock waves from the upper boundary are minimized by a special boundary condition.

[ascl:1712.009] RODRIGUES: RATT Online Deconvolved Radio Image Generation Using Esoteric Software

RODRIGUES (RATT Online Deconvolved Radio Image Generation Using Esoteric Software) is a web-based radio telescope simulation and reduction tool. From a technical perspective, it is a web-based, parameterized Docker container scheduler with a result-set viewer.

[ascl:1712.008] CosApps: Simulate gravitational lensing through ray tracing and shear calculation

Cosmology Applications (CosApps) provides tools to simulate gravitational lensing using two different techniques, ray tracing and shear calculation. The tool ray_trace_ellipse calculates deflection angles on a grid for light passing a deflecting mass distribution. Using MPI, ray_trace_ellipse can calculate deflections in parallel across network-connected computers, such as a cluster. The program physcalc calculates the gravitational lensing shear using the relationship between convergence and shear, described by a set of coupled partial differential equations.

[ascl:1712.007] SFoF: Friends-of-friends galaxy cluster detection algorithm

SFoF is a friends-of-friends galaxy cluster detection algorithm that operates in either spectroscopic or photometric redshift space. The linking parameters, both transverse and along the line-of-sight, change as a function of redshift to account for selection effects.

[ascl:1712.006] Nyx: Adaptive mesh, massively-parallel, cosmological simulation code

The Nyx code solves the equations of compressible hydrodynamics on an adaptive grid hierarchy coupled with an N-body treatment of dark matter. The gas dynamics in Nyx use a finite-volume methodology on an adaptive set of 3-D Eulerian grids; dark matter is represented as discrete particles moving under the influence of gravity. Particles are evolved via a particle-mesh method, using a Cloud-in-Cell deposition/interpolation scheme. Both baryonic and dark matter contribute to the gravitational field. In addition, Nyx includes physics for accurately modeling the intergalactic medium; in the optically thin limit and assuming ionization equilibrium, the code calculates heating and cooling processes of the primordial-composition gas in an ionizing ultraviolet background radiation field.
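
A toy one-dimensional sketch of the Cloud-in-Cell deposition step mentioned above (this only illustrates the weighting; it is not Nyx code):

    import numpy as np

    def cic_deposit(positions, masses, n_cells, box_size):
        """Deposit masses onto a periodic 1-D grid, sharing each particle linearly
        between its two bracketing grid points (Cloud-in-Cell weighting)."""
        density = np.zeros(n_cells)
        dx = box_size / n_cells
        x = positions / dx
        i = np.floor(x).astype(int)        # left grid point
        frac = x - i                       # fractional distance past it
        np.add.at(density, i % n_cells, masses * (1.0 - frac))
        np.add.at(density, (i + 1) % n_cells, masses * frac)
        return density / dx                # mass per cell -> density

    rng = np.random.default_rng(0)
    print(cic_deposit(rng.uniform(0.0, 1.0, 10000), np.full(10000, 1e-4), 16, 1.0))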

[ascl:1712.005] draco: Analysis and simulation of drift scan radio data

draco analyzes transit radio data with the m-mode formalism. It is telescope agnostic, and is used as part of the analysis and simulation pipeline for the CHIME (Canadian Hydrogen Intensity Mapping Experiment) telescope. It can simulate time stream data from maps of the sky (using the m-mode formalism) and add gain fluctuations and correctly correlated instrumental noise (i.e. Wishart distributed). Further, it can perform various cuts on the data and make maps of the sky from data using the m-mode formalism.

[ascl:1712.004] Bitshuffle: Filter for improving compression of typed binary data

Bitshuffle rearranges typed binary data to improve compression; the algorithm is implemented in a Python/C package within the NumPy framework. The library can be used alongside HDF5 to compress and decompress datasets and is integrated through the dynamically loaded filters framework. Algorithmically, Bitshuffle is closely related to HDF5's Shuffle filter except that it operates at the bit level instead of the byte level. Arranging a typed data array into a matrix with the elements as the rows and the bits within the elements as the columns, Bitshuffle "transposes" the matrix, such that all the least significant bits are in a row, and so on. This transposition is performed within blocks of data roughly 8 kB long; it does not in itself compress the data, but rearranges it for more efficient compression. A compression library is necessary to perform the actual compression. This scheme has been used for compression of radio data in high performance computing.
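
The bit-level transposition can be illustrated with a short NumPy sketch (Bitshuffle's own implementation is vectorized C; this is only the idea):

    import numpy as np

    def bit_transpose(block):
        """Regroup a block of uint8 values so that bit k of every element is
        stored contiguously; this typically compresses better afterwards."""
        bits = np.unpackbits(block.reshape(-1, 1), axis=1)   # shape (n, 8)
        return np.packbits(bits.T)                           # 8 rows of same-weight bits

    data = np.arange(16, dtype=np.uint8)
    print(bit_transpose(data))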

[ascl:1712.003] Py-SPHViewer: Cosmological simulations using Smoothed Particle Hydrodynamics

Py-SPHViewer visualizes and explores N-body + Hydrodynamics simulations. The code interpolates the underlying density field (or any other property) traced by a set of particles, using the Smoothed Particle Hydrodynamics (SPH) interpolation scheme, thus producing not only beautiful but also useful scientific images. Py-SPHViewer enables the user to explore simulated volumes using different projections. Py-SPHViewer also provides a natural way to visualize (in a self-consistent fashion) gas dynamical simulations, which use the same technique to compute the interactions between particles.
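
A toy sketch of the SPH interpolation scheme mentioned above, evaluating a kernel-weighted density at a single point (Py-SPHViewer does this on pixel grids with adaptive smoothing lengths and compiled extensions):

    import numpy as np

    def cubic_spline_kernel(r, h):
        """Standard 3-D cubic spline SPH kernel with compact support 2h."""
        q = r / h
        w = np.where(q < 1.0, 1.0 - 1.5*q**2 + 0.75*q**3,
            np.where(q < 2.0, 0.25*(2.0 - q)**3, 0.0))
        return w / (np.pi * h**3)

    def sph_density(point, positions, masses, h):
        """Kernel-weighted density estimate at a single point."""
        r = np.linalg.norm(positions - point, axis=1)
        return np.sum(masses * cubic_spline_kernel(r, h))

    rng = np.random.default_rng(0)
    particles = rng.normal(size=(5000, 3))
    print(sph_density(np.zeros(3), particles, np.full(5000, 1.0 / 5000), h=0.2))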

[ascl:1712.002] MPI_XSTAR: MPI-based parallelization of XSTAR program

MPI_XSTAR parallelizes execution of multiple XSTAR runs using Message Passing Interface (MPI). XSTAR (ascl:9910.008), part of the HEASARC's HEAsoft (ascl:1408.004) package, calculates the physical conditions and emission spectra of ionized gases. MPI_XSTAR invokes XSTINITABLE from HEASoft to generate a job list of XSTAR commands for given physical parameters. The job list is used to make directories in ascending order, where each individual XSTAR is spawned on each processor and outputs are saved. HEASoft's XSTAR2TABLE program is invoked upon the contents of each directory in order to produce table model FITS files for spectroscopy analysis tools.
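
The spawning pattern described above, one XSTAR run per processor, can be sketched with mpi4py; the job-list filename below is hypothetical and this is not the MPI_XSTAR source:

    import subprocess
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    # Hypothetical job list: one XSTAR command line per row.
    with open("xstar_joblist.txt") as f:
        jobs = [line.strip() for line in f if line.strip()]

    # Round-robin: each rank runs every size-th job.
    for job in jobs[rank::size]:
        subprocess.run(job, shell=True, check=True)

    comm.Barrier()     # wait until all ranks have finished their jobs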

[ascl:1712.001] KDUtils: Kinematic Distance Utilities

The Kinematic Distance utilities (KDUtils) calculate kinematic distances and kinematic distance uncertainties. The package includes methods to calculate "traditional" kinematic distances as well as a Monte Carlo method to calculate kinematic distances and uncertainties.
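
A minimal sketch of a "traditional" kinematic distance under an assumed flat rotation curve (the package adds Monte Carlo uncertainties and more realistic rotation-curve choices); the R0 and V0 values below are illustrative assumptions:

    import numpy as np

    R0, V0 = 8.34, 240.0    # assumed Sun-Galactic-centre distance (kpc) and circular speed (km/s)

    def kinematic_distances(glon_deg, vlsr):
        """Near/far kinematic distances (kpc) for an inner-Galaxy sightline,
        given Galactic longitude (deg) and LSR velocity (km/s)."""
        l = np.radians(glon_deg)
        R = R0 * np.sin(l) * V0 / (V0 * np.sin(l) + vlsr)   # Galactocentric radius
        root = np.sqrt(np.maximum(R**2 - (R0 * np.sin(l))**2, 0.0))
        return R0 * np.cos(l) - root, R0 * np.cos(l) + root

    print(kinematic_distances(30.0, 50.0))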

[ascl:1711.024] NOD3: Single dish reduction software

NOD3 processes and analyzes maps from single-dish observations affected by scanning effects from clouds, receiver instabilities, or radio-frequency interference. Its “basket-weaving” tool combines orthogonally scanned maps into a final map that is almost free of scanning effects. A restoration tool for dual-beam observations reduces the noise by a factor of about two compared to the NOD2 version. Combining single-dish with interferometer data in the map plane ensures the full recovery of the total flux density.

[ascl:1711.023] HBT+: Subhalo finder and merger tree builder

HBT+ is a hybrid subhalo finder and merger tree builder for cosmological simulations. It comes as an MPI edition that can be run on distributed clusters or shared memory machines and is MPI/OpenMP parallelized, and also as an OpenMP edition that can be run on shared memory machines and is only OpenMP parallelized. This version is more memory efficient than the MPI branch on shared memory machines, and is more suitable for analyzing zoomed-in simulations that are difficult to balance on distributed clusters. Both editions support hydro simulations with gas/stars.

[ascl:1711.022] HBT: Hierarchical Bound-Tracing

HBT is a Hierarchical Bound-Tracing subhalo finder and merger tree builder, for numerical simulations in cosmology. It tracks haloes from birth and continues to track them after mergers, finding self-bound structures as subhaloes and recording their merger histories as merger trees.

[ascl:1711.021] Bifrost: Stream processing framework for high-throughput applications

Bifrost is a stream processing framework that eases the development of high-throughput processing CPU/GPU pipelines. It is designed for digital signal processing (DSP) applications within radio astronomy. Bifrost uses a flexible ring buffer implementation that allows different signal processing blocks to be connected to form a pipeline. Each block may be assigned to a CPU core, and the ring buffers are used to transport data to and from blocks. Processing blocks may be run on either the CPU or GPU, and the ring buffer will take care of memory copies between the CPU and GPU spaces.

[ascl:1711.020] MARXS: Multi-Architecture Raytrace Xray mission Simulator

MARXS (Multi-Architecture-Raytrace-Xraymission-Simulator) simulates X-ray observatories. Primarily designed to simulate X-ray instruments on astronomical X-ray satellites and sounding rocket payloads, it can also be used to ray-trace experiments in the laboratory. MARXS performs polarization Monte-Carlo ray-trace simulations from a source (astronomical or lab) through a collection of optical elements such as mirrors, baffles, and gratings to a detector.

[ascl:1711.019] SPIDERMAN: Fast code to simulate secondary transits and phase curves

SPIDERMAN calculates exoplanet phase curves and secondary eclipses with arbitrary surface brightness distributions in two dimensions. The code uses a geometrical algorithm to solve exactly for the area of the sections of the planet's disc that are occulted by the star. Approximately 1000 models can be generated per second in typical use, which makes Markov Chain Monte Carlo analyses practicable. The code is modular and allows comparison of the effects of multiple different brightness distributions for a dataset.
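
The geometric building block, the exact overlap area of two discs at a given centre-to-centre separation, can be written down directly; this sketch is not SPIDERMAN's code, which handles sections of the planetary disc with differing surface brightness:

    import numpy as np

    def occulted_area(d, r_star, r_planet):
        """Exact overlap area of two discs whose centres are separated by d."""
        if d >= r_star + r_planet:
            return 0.0                                   # no overlap
        if d <= abs(r_star - r_planet):
            return np.pi * min(r_star, r_planet)**2      # smaller disc fully inside
        a1 = r_star**2 * np.arccos((d**2 + r_star**2 - r_planet**2) / (2*d*r_star))
        a2 = r_planet**2 * np.arccos((d**2 + r_planet**2 - r_star**2) / (2*d*r_planet))
        a3 = 0.5 * np.sqrt((-d + r_star + r_planet) * (d + r_star - r_planet)
                           * (d - r_star + r_planet) * (d + r_star + r_planet))
        return a1 + a2 - a3

    print(occulted_area(0.95, 1.0, 0.1))    # partial ingress/egress geometry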

[ascl:1711.018] LExTeS: Link Extraction and Testing Suite

LExTeS (Link Extraction and Testing Suite) extracts hyperlinks from PDF documents, tests the extracted links to see which are broken, and tabulates the results. Though written to support a particular set of PDF documents, the dataset and scripts can be edited for use on other documents.

[ascl:1711.017] FATS: Feature Analysis for Time Series

FATS facilitates and standardizes feature extraction for time series data; it quickly and efficiently calculates a compilation of many existing light curve features. Users can characterize or analyze an astronomical photometric database, though this library is not necessarily restricted to the astronomical domain and can also be applied to any kind of time series data.

[ascl:1711.016] Thindisk: Protoplanetary disk model

Thindisk computes the line emission from a geometrically thin protoplanetary disk. It creates a datacube in FITS format that can be processed with a data reduction package (such as GILDAS, ascl:1305.010) to produce synthetic images and visibilities. These synthetic data can be compared with observations to determine the properties (e.g. central mass or inclination) of an observed disk. The disk is assumed to be in Keplerian rotation at a radius lower than the centrifugal radius (which can be set to a large value, for a purely Keplerian disk), and in infall with rotation beyond the centrifugal radius.

[ascl:1711.015] rac-2d: Thermo-chemical code for modeling water vapor formation in protoplanetary disks

rac-2d models the distribution of water vapor in protoplanetary disks. Given a distribution of gas and dust, rac-2d first solves the dust temperature distribution with a Monte Carlo method and then solves the gas temperature distribution and chemical composition. Although the geometry is symmetric with respect to rotation around the central axis and reflection about the midplane, the photon propagation is done in full three dimensions. After establishing the dust temperature distribution, the disk chemistry is evolved for 1 Myr; the heating and cooling processes are coupled with the chemistry, allowing the gas temperature to be evolved in tandem with the chemistry based on the heating and cooling rates.

[ascl:1711.014] Gammapy: Python toolbox for gamma-ray astronomy

Gammapy analyzes gamma-ray data and creates sky images, spectra and lightcurves, from event lists and instrument response information; it can also determine the position, morphology and spectra of gamma-ray sources. It is used to analyze data from H.E.S.S., Fermi-LAT, and the Cherenkov Telescope Array (CTA).

[ascl:1711.013] HO-CHUNK: Radiation Transfer code

HO-CHUNK calculates the radiative equilibrium temperature solution, thermal and PAH/VSG emission, scattering, and polarization in protostellar geometries. It is useful for computing spectral energy distributions (SEDs), polarization spectra, and images.

[ascl:1711.012] megaman: Manifold Learning for Millions of Points

megaman is a scalable manifold learning package implemented in Python. It has a front-end API designed to be familiar to users of scikit-learn, but harnesses the C++ Fast Library for Approximate Nearest Neighbors (FLANN) and the Locally Optimal Block Preconditioned Conjugate Gradient (LOBPCG) method, a sparse symmetric positive definite (SSPD) solver, to scale manifold learning algorithms to large data sets. It is designed for researchers and as such caches intermediary steps and indices to allow fast re-computation with new parameters.
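
A rough sketch of the scaling idea (a sparse eigenproblem solved with LOBPCG rather than a dense decomposition), assuming a k-nearest-neighbor graph built with scikit-learn instead of FLANN; this is not megaman's API:

    import numpy as np
    from scipy.sparse.csgraph import laplacian
    from scipy.sparse.linalg import lobpcg
    from sklearn.neighbors import kneighbors_graph

    X = np.random.default_rng(0).normal(size=(2000, 10))
    W = kneighbors_graph(X, n_neighbors=10, mode="connectivity", include_self=False)
    W = 0.5 * (W + W.T)                       # symmetrize the neighbor graph
    L = laplacian(W, normed=True)

    # Smallest eigenvectors of the sparse Laplacian via LOBPCG (no dense solve).
    guess = np.random.default_rng(1).normal(size=(L.shape[0], 4))
    vals, vecs = lobpcg(L, guess, largest=False, tol=1e-5, maxiter=500)
    embedding = vecs[:, 1:]                   # drop the near-constant eigenvector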

[ascl:1711.011] galkin: Milky Way rotation curve data handler

galkin is a compilation of kinematic measurements tracing the rotation curve of our Galaxy, together with a tool to treat the data. The compilation is optimized to Galactocentric radii between 3 and 20 kpc and includes the kinematics of gas, stars and masers in a total of 2780 measurements collected from almost four decades of literature. The user-friendly software provided selects, treats and retrieves the data of all source references considered. This tool is especially designed to facilitate the use of kinematic data in dynamical studies of the Milky Way with various applications ranging from dark matter constraints to tests of modified gravity.

[ascl:1711.010] galstreams: Milky Way streams footprint library and toolkit

galstreams provides a compilation of spatial information for known stellar streams and overdensities in the Milky Way and includes Python tools for visualizing them. ASCII tables are also provided for quick viewing of the streams' footprints using TOPCAT (ascl:1101.010). As of 2022, the library provides celestial, distance, proper motion, and radial velocity tracks for each stream (pm/vrad when available), stored as AstroPy (ascl:1304.002) SkyCoord objects, and each stream's (heliocentric) coordinate frame is realized as an AstroPy reference frame. The code offers polygon footprints, as well as poles (at the mid-point) and pole tracks in the heliocentric and Galactocentric (GSR) frames. It also offers angular momentum tracks in a heliocentric reference frame at rest with respect to the Galactic center, and provides uniformly reported stream lengths, end points and mid-points, heliocentric and Galactocentric mid-poles, track and discovery references, and an information flag denoting which of the 6D attributes (sky, distance, proper motions, and radial velocity) are available in the track object.

[ascl:1711.009] Lightning: SED Fitting Package

Lightning is a spectral energy distribution (SED) fitting procedure that quickly and reliably recovers star formation history (SFH) and extinction parameters. The SFH is modeled as discrete steps in time. The code consists of a fully vectorized inversion algorithm to determine SFH step intensities and combines this with a grid-based approach to determine three extinction parameters.

[ascl:1711.008] clustep: Initial conditions for galaxy cluster halo simulations

clustep generates a snapshot in GADGET-2 (ascl:0003.001) format containing a galaxy cluster halo in equilibrium; this snapshot can also be read in RAMSES (ascl:1011.007) using the DICE patch. The halo is made of a dark matter component and a gas component, with the latter representing the ICM. Each of these components follows a Dehnen density profile, with gamma=0 or gamma=1. If gamma=1, then the profile corresponds to a Hernquist profile.
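
For reference, the Dehnen density profile family used for both components can be written compactly; gamma=1 gives the Hernquist case (a sketch, not clustep's code):

    import numpy as np

    def dehnen_density(r, M, a, gamma):
        """Dehnen (1993) profile with total mass M, scale radius a, inner slope gamma."""
        return (3.0 - gamma) * M * a / (4.0 * np.pi * r**gamma * (r + a)**(4.0 - gamma))

    r = np.logspace(-2, 2, 5)
    print(dehnen_density(r, M=1.0, a=1.0, gamma=1.0))   # gamma=1: Hernquist profile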

[ascl:1711.007] galstep: Initial conditions for spiral galaxy simulations

galstep generates initial conditions for disk galaxy simulations with GADGET-2 (ascl:0003.001), RAMSES (ascl:1011.007) and GIZMO (ascl:1410.003), including a stellar disk, a gaseous disk, a dark matter halo and a stellar bulge. The first two components follow an exponential density profile, and the last two a Dehnen density profile with gamma=1 by default, corresponding to a Hernquist profile.

[ascl:1711.006] RGW: Goodman-Weare Affine-Invariant Sampling

RGW is a lightweight R-language implementation of the affine-invariant Markov Chain Monte Carlo sampling method of Goodman & Weare (2010). The implementation is based on the description of the python package emcee (ascl:1303.002).
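
A minimal Python sketch of one sweep of the Goodman & Weare stretch move that RGW implements in R (illustration only, not the RGW code):

    import numpy as np

    def stretch_move(walkers, log_post, a=2.0, rng=None):
        """One sweep of Goodman & Weare stretch moves; walkers is updated in place."""
        if rng is None:
            rng = np.random.default_rng()
        n, ndim = walkers.shape
        for k in range(n):
            j = rng.integers(n - 1)
            j = j if j < k else j + 1                    # complementary walker != k
            z = ((a - 1.0) * rng.random() + 1.0)**2 / a  # stretch factor, g(z) ~ 1/sqrt(z)
            proposal = walkers[j] + z * (walkers[k] - walkers[j])
            log_accept = (ndim - 1) * np.log(z) + log_post(proposal) - log_post(walkers[k])
            if np.log(rng.random()) < log_accept:
                walkers[k] = proposal
        return walkers

    # Example: sample a 2-D standard normal with 20 walkers.
    walkers = np.random.default_rng(0).normal(size=(20, 2))
    for _ in range(500):
        stretch_move(walkers, lambda x: -0.5 * np.sum(x**2))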

[ascl:1711.005] correlcalc: Two-point correlation function from redshift surveys

correlcalc calculates the two-point correlation function (2pCF) of galaxies/quasars using redshift surveys. It can be used for any assumed geometry or cosmology model. Using BallTree algorithms to reduce the computational effort for large datasets, it is a parallelized code suitable for running on clusters as well as personal computers. It takes the redshift (z), Right Ascension (RA), and Declination (DEC) of galaxies and random catalogs as inputs in the form of ASCII or FITS files. If a random catalog is not provided, it generates one of the desired size based on the input redshift distribution and a mangle polygon file (in .ply format) describing the survey geometry. It also calculates different realizations of the (3D) anisotropic 2pCF. Optionally, it makes HEALPix maps of the survey for visualization.
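
As an illustration of BallTree-based pair counting, here is a sketch of an isotropic 2pCF with the common Landy-Szalay estimator on 3-D Cartesian positions; this is not correlcalc's implementation, which works from RA/DEC/z catalogs:

    import numpy as np
    from sklearn.neighbors import BallTree

    def landy_szalay(data, randoms, bin_edges):
        """Isotropic 2pCF from 3-D positions using cumulative BallTree pair counts."""
        def pair_counts(a, b):
            tree = BallTree(b)
            cumulative = np.array([tree.query_radius(a, r, count_only=True).sum()
                                   for r in bin_edges])
            return np.diff(cumulative).astype(float)
        nd, nr = len(data), len(randoms)
        dd = pair_counts(data, data) / (nd * nd)        # ordered-pair normalizations
        rr = pair_counts(randoms, randoms) / (nr * nr)
        dr = pair_counts(data, randoms) / (nd * nr)
        return (dd - 2.0 * dr + rr) / rr

    rng = np.random.default_rng(0)
    data = rng.uniform(0.0, 100.0, size=(500, 3))       # toy positions (e.g. Mpc)
    randoms = rng.uniform(0.0, 100.0, size=(2500, 3))
    print(landy_szalay(data, randoms, np.linspace(2.0, 30.0, 8)))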

[ascl:1711.004] BayesVP: Full Bayesian Voigt profile fitting

BayesVP offers a Bayesian approach for modeling Voigt profiles in absorption spectroscopy. The code fits the absorption line profiles within specified wavelength ranges and generates posterior distributions for the column density, Doppler parameter, and redshifts of the corresponding absorbers. The code uses publicly available efficient parallel sampling packages to sample posterior and thus can be run on parallel platforms. BayesVP supports simultaneous fitting for multiple absorption components in high-dimensional parameter space. The package includes additional utilities such as explicit specification of priors of model parameters, continuum model, Bayesian model comparison criteria, and posterior sampling convergence check.
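
The Voigt profile at the heart of the fitting can be evaluated via the Faddeeva function; a sketch, not BayesVP's own code:

    import numpy as np
    from scipy.special import wofz

    def voigt_profile(x, sigma, gamma):
        """Voigt profile: Gaussian (std sigma) convolved with Lorentzian (HWHM gamma)."""
        z = (x + 1j * gamma) / (sigma * np.sqrt(2.0))
        return wofz(z).real / (sigma * np.sqrt(2.0 * np.pi))

    x = np.linspace(-5.0, 5.0, 11)
    print(voigt_profile(x, sigma=1.0, gamma=0.5))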

[ascl:1711.003] FTbg: Background removal using Fourier Transform

FTbg performs Fourier transforms on FITS images and separates low- and high-spatial frequency components by a user-specified cut. Both components are then inverse Fourier transformed back to image domain. FTbg can remove large-scale background/foreground emission in many astrophysical applications. FTbg has been designed to identify and remove Galactic background emission in Herschel/Hi-GAL continuum images, but it is applicable to any other (e.g., Planck) images when background/foreground emission is a concern.
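
A minimal sketch of the low/high spatial-frequency split described above, using NumPy FFTs on a toy image (FTbg itself operates on FITS images with a user-specified cut):

    import numpy as np

    def split_background(image, cut):
        """Split a 2-D image at a spatial-frequency cut (cycles/pixel);
        returns (low-frequency background, high-frequency residual)."""
        ft = np.fft.fft2(image)
        fy = np.fft.fftfreq(image.shape[0])[:, None]
        fx = np.fft.fftfreq(image.shape[1])[None, :]
        low = np.fft.ifft2(ft * (np.hypot(fx, fy) <= cut)).real
        return low, image - low

    img = np.random.default_rng(1).normal(size=(64, 64))
    background, residual = split_background(img, cut=0.05)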

[ascl:1711.002] inhomog: Biscale kinematical backreaction analytical evolution

The inhomog library provides Raychaudhuri integration of cosmological domain-wise average scale factor evolution using an analytical formula for kinematical backreaction Q_D evolution. The inhomog main program illustrates biscale examples. The library routine lib/Omega_D_precalc.c is callable by RAMSES (ascl:1011.007) using the RAMSES extension ramses-scalav.

[ascl:1711.001] SpcAudace: Spectroscopic processing and analysis package of Audela software

SpcAudace processes long-slit spectra with automated pipelines and performs astrophysical analysis of the resulting data. These powerful pipelines do all the required steps in one pass: standard preprocessing, masking of bad pixels, geometric corrections, registration, optimized spectrum extraction, wavelength calibration, and instrumental response computation and correction. Both high- and low-resolution long-slit spectra are managed for stellar and non-stellar targets. Many types of publication-quality figures can be easily produced: PDF and PNG plots or annotated time-series plots. Astrophysical quantities can be derived from individual spectra or large numbers of spectra with advanced functions, from line profile characteristics to equivalent widths and periodograms. More than 300 documented functions are available and can be used in Tcl scripts for automation. SpcAudace is based on the Audela open source software.

[ascl:1710.024] pred_loggs: Predicting individual galaxy G/S probability distributions

pred_loggs models the entire PGF probability density field, enabling iterative statistical modeling of upper limits and prediction of full G/S probability distributions for individual galaxies.

[ascl:1710.023] LIMEPY: Lowered Isothermal Model Explorer in PYthon

LIMEPY solves distribution function (DF) based lowered isothermal models. It solves Poisson's equation for the input parameters and offers fast solutions for isotropic and anisotropic, single- and multi-mass models; normalized DF values; density and velocity moments; and projected properties. It can also generate discrete samples.
