ASCL.net

Astrophysics Source Code Library

Making codes discoverable since 1999

Browsing Codes

Results 2101-2200 of 3556 (3462 ASCL, 94 submitted)

[ascl:2306.044] nuSpaceSim: Cosmic neutrino simulation

nuSpaceSim simulates upward-going extensive air showers caused by neutrino interactions with the atmosphere. It is an end-to-end modeling tool, from neutrino flux to space-based signal detection, for the design of sub-orbital and space-based neutrino detection experiments. This comprehensive suite of modeling packages accepts an experimental design input and then models the experiment's sensitivity to both the diffuse cosmogenic neutrino flux and astrophysical neutrino transient events, such as those postulated from binary neutron star (BNS) mergers. nuSpaceSim calculates the tau neutrino acceptance for the Optical Cherenkov technique; tau propagation is interpolated using included data tables from nupyprop. The simulation is parameterized by an input XML configuration file, with settings for detector characteristics and global parameters; nuSpaceSim also provides a Python API for programmatic access.

[ascl:2102.014] nway: Bayesian cross-matching of astronomical catalogs

nway is a source cross-matching tool for arbitrarily many astronomical catalogs. It features Bayesian match probabilities based on astronomical sky coordinates (RA, DEC) and can handle varying positional errors. nway can also incorporate additional prior information, such as the magnitude or color distributions of the sources to match, and works accurately and rapidly on both small areas and all-sky catalogs.

[ascl:2202.002] NWelch: Spectral analysis of time series with nonuniform observing cadence

NWelch uses Welch's method to estimate the power spectra, complex cross-spectrum, magnitude-squared coherence, and phase spectrum of bivariate time series with nonuniform observing cadence. For univariate time series, users can apply Welch's power spectrum estimator or compute a nonuniform fast Fourier transform-based periodogram. Options include tapering in the time domain and computing bootstrap false alarm levels. Users may choose standard 50%-overlapping Welch's segments or apply a custom-made segmentation scheme. NWelch was designed for Doppler planet searches but may be applied to any type of time series.
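
As a point of reference, the sketch below shows a standard Welch power spectrum estimate on a uniformly sampled series using SciPy; it illustrates the 50%-overlapping, tapered-segment scheme that NWelch generalizes to nonuniform cadence, and does not use NWelch's own API (all names and numbers here are illustrative).

    import numpy as np
    from scipy.signal import welch

    rng = np.random.default_rng(0)
    dt = 0.02                                  # uniform sampling interval [days]
    t = np.arange(0, 200, dt)
    y = 0.5 * np.sin(2 * np.pi * 0.7 * t) + rng.normal(0, 1.0, t.size)

    # 50%-overlapping, Hann-tapered segments, as in the standard Welch scheme
    freq, psd = welch(y, fs=1.0 / dt, window="hann", nperseg=2048, noverlap=1024)
    print(freq[np.argmax(psd)])                # recovers the injected ~0.7 cycles/day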

[ascl:1712.006] Nyx: Adaptive mesh, massively-parallel, cosmological simulation code

The Nyx code solves the equations of compressible hydrodynamics on an adaptive grid hierarchy coupled with an N-body treatment of dark matter. The gas dynamics in Nyx use a finite-volume methodology on an adaptive set of 3-D Eulerian grids; dark matter is represented as discrete particles moving under the influence of gravity. Particles are evolved via a particle-mesh method, using a Cloud-in-Cell deposition/interpolation scheme. Both baryonic and dark matter contribute to the gravitational field. In addition, Nyx includes physics for accurately modeling the intergalactic medium; in the optically thin limit and assuming ionization equilibrium, the code calculates heating and cooling processes of the primordial-composition gas in an ionizing ultraviolet background radiation field.

[ascl:2112.019] O'TRAIN: Optical TRAnsient Identification NEtwork

The O'TRAIN package identifies transients in astronomical images based on a Convolutional Neural Network (CNN). It works on images from different telescopes and, through the use of Docker, can be deployed on different operating systems. O'TRAIN uses image cutouts containing real and false transients provided by the user to train a CNN algorithm implemented with Keras. Built-in diagnostics help to characterize the accuracy of the training, and a trained model is used to classify any new cutouts.

[ascl:1408.019] O2scl: Object-oriented scientific computing library

O2scl is an object-oriented library for scientific computing in C++ useful for solving, minimizing, differentiating, integrating, interpolating, optimizing, approximating, analyzing, fitting, and more. Many classes operate on generic function and vector types; it includes classes based on GSL and CERNLIB. O2scl also contains code for computing the basic thermodynamic integrals for fermions and bosons, for generating almost all of the most common equations of state of nuclear and neutron star matter, and for solving the TOV equations. O2scl can be used on Linux, Mac and Windows (Cygwin) platforms and has extensive documentation.

[ascl:1608.012] OBERON: OBliquity and Energy balance Run on N-body systems

OBERON (OBliquity and Energy balance Run on N-body systems) models the climate of Earthlike planets under the effects of an arbitrary number and arrangement of other bodies, such as stars, planets and moons. The code, written in C++, simultaneously computes N-body motions using a 4th-order Hermite integrator, simulates climates using a 1D latitudinal energy balance model, and evolves the orbital spin of bodies using the equations of Laskar (1986a,b).

[ascl:1307.008] Obit: Radio Astronomy Data Handling

Obit is a group of software packages for handling radio astronomy data, especially interferometric and single dish OTF imaging. Obit is primarily an environment in which new data processing algorithms can be developed and tested, but it can also be used for production processing of a certain range of scientific problems. The package supports both prepackaged, compiled tasks and a Python interface to the major class functionality to allow rapid prototyping using Python scripts; it allows access to multiple disk-resident data formats, in particular either AIPS disk data or FITS files. Obit applications are interoperable with Classic AIPS, and the ObitTalk Python interface gives access to AIPS tasks as well as Obit libraries and tasks.

[submitted] ObsPlanner

A simple program for planning and managing astronomical observations, serving as an observational diary or log.

[submitted] obsplanning - a set of python utilities to aid in planning astronomical observations

Obsplanning is a suite of tools to help plan astronomical observations from ground-based observatories, for traditional single-site as well as multi-station (VLBI) observing. Conveniently determine observability of objects in the sky from your observatory, and produce plots to help you prepare for your observations over the course of a session. Celestial source coordinates (including solar system objects) can be queried or created, and transformed. Calibrator or reference sources can be selected by proximity, and slew order can be optimized to save valuable telescope time. Plots and visualizations can be easily made to chart source elevation and transits, source proximity to the Sun and Moon, concurrent 'up time' of sources at multiple sites (for VLBI or tandem observations), 'dark time' at a telescope site for a given year, finder plots made from real images (with options to query online databases), and more.

[ascl:1910.020] OCD: O'Connell Effect Detector using push-pull learning

OCD (O'Connell Effect Detector) detects eclipsing binaries that demonstrate the O'Connell Effect. This time-domain signature extraction methodology uses a supporting supervised pattern detection algorithm. The methodology maps stellar variable observations (time-domain data) to a new representation known as Distribution Fields (DF), the properties of which enable efficient handling of issues such as irregular sampling and multiple values per time instance. Using this representation, the code applies a metric learning technique directly on the DF space capable of specifically identifying the stars of interest; the metric is tuned on a set of labeled eclipsing binary data from the Kepler survey, targeting particular systems exhibiting the O'Connell Effect. This code is useful for large-scale data volumes such as those expected from next-generation telescopes such as LSST.

[ascl:1901.002] OCFit: Python package for fitting of O-C diagrams

OCFit fits and analyzes O-C diagrams using Genetic Algorithms and Markov chain Monte Carlo methods. The Monte Carlo method is used to obtain reliable estimates of the errors of the parameters. Unlike some other fitting routines, OCFit does not need any initial values of the fitted parameters. An intuitive graphic user interface is provided for ease of fitting, and nine common models of periodic O-C changes are included.

[ascl:1812.018] OctApps: Octave functions for continuous gravitational-wave data analysis

The OctApps library provides various functions, written in Octave, for performing searches for the weak signatures of continuous gravitational waves from rapidly-rotating neutron stars amidst the instrumental noise of the LIGO and Virgo detectors.

[ascl:1010.048] OCTGRAV: Sparse Octree Gravitational N-body Code on Graphics Processing Units

Octgrav is a very fast tree-code which runs on massively parallel Graphics Processing Units (GPUs) with the NVIDIA CUDA architecture. The algorithms are based on parallel-scan and sort methods. The tree construction and calculation of multipole moments are carried out on the host CPU, while the force calculation, which consists of tree walks and evaluation of interaction lists, is carried out on the GPU. In this way, a sustained performance of about 100 GFLOP/s and data transfer rates of about 50 GB/s are achieved. It takes about a second to compute forces on a million particles with an opening angle of $\theta \approx 0.5$.

To test the performance and feasibility, we implemented the algorithms in CUDA in the form of a gravitational tree-code which completely runs on the GPU. The tree construction and traverse algorithms are portable to many-core devices which have support for CUDA or OpenCL programming languages. The gravitational tree-code outperforms tuned CPU code during the tree construction and shows a performance improvement of more than a factor of 20 overall, resulting in a processing rate of more than 2.8 million particles per second.

The code has a convenient user interface and is freely available for use.

[ascl:2101.012] Octo-Tiger: HPX parallelized 3-D hydrodynamic code for stellar mergers

Octo-Tiger models mass transfer in binary systems using a Cartesian adaptive mesh refinement grid. It simulates the evolution of star systems based on a modified fast multipole method (FMM) on adaptive octrees. The code takes shock heating into account and uses the dual energy formalism with an ideal gas equation of state; it also conserves linear and angular momenta to machine precision. Octo-Tiger is implemented in C++ and is parallelized using the High Performance ParalleX (HPX) runtime system.

[ascl:1905.021] ODEPACK: Ordinary differential equation solver library

ODEPACK solves the initial value problem for ordinary differential equation systems. It consists of nine solvers: a basic solver called LSODE and eight variants of it, LSODES, LSODA, LSODAR, LSODPK, LSODKR, LSODI, LSOIBT, and LSODIS. The collection is suitable for both stiff and nonstiff systems. It includes solvers for systems given in explicit form, dy/dt = f(t,y), and also solvers for systems given in linearly implicit form, A(t,y) dy/dt = g(t,y). The ODEPACK solvers are written in standard Fortran and there are separate double and single precision versions. Each solver consists of a main driver subroutine having the same name as the solver and some number of subordinate routines. For each solver, there is also a demonstration program, which solves one or two simple problems in a somewhat self-checking manner.
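
For orientation, LSODA (one of the ODEPACK solvers, with automatic stiff/nonstiff method switching) is also exposed through SciPy's wrapper; the sketch below solves the classic stiff Robertson system that way rather than calling the Fortran drivers directly, and is only meant to illustrate the explicit-form case dy/dt = f(t,y).

    import numpy as np
    from scipy.integrate import solve_ivp

    def robertson(t, y):
        # Classic stiff chemical kinetics test problem, dy/dt = f(t, y)
        y1, y2, y3 = y
        return [-0.04 * y1 + 1e4 * y2 * y3,
                 0.04 * y1 - 1e4 * y2 * y3 - 3e7 * y2**2,
                 3e7 * y2**2]

    sol = solve_ivp(robertson, (0.0, 1e5), [1.0, 0.0, 0.0],
                    method="LSODA", rtol=1e-8, atol=1e-10)
    print(sol.y[:, -1])        # species abundances at t = 1e5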

[ascl:2211.018] ODNet: Asteroid occultation detection convolutional neural network

ODNet uses a convolutional neural network to examine frames of a given observation, using the flux of a targeted star along time, to detect occultations. This is particularly useful to reliably detect asteroid occultations for the Unistellar Network, which consists of 10,000 digital telescopes owned by citizen scientists and is regularly used to record asteroid occultations. ODNet is not costly in terms of computing power, opening the possibility of embedding the code on the telescope directly. ODNet's models were developed and trained using TensorFlow version 2.4.
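
As a rough illustration of the approach (and emphatically not ODNet's published architecture), a minimal Keras binary classifier for image cutouts might look like the following; the input shape, layer sizes, and training settings are assumptions chosen only to show the TensorFlow/Keras pattern.

    import tensorflow as tf
    from tensorflow.keras import layers, models

    model = models.Sequential([
        layers.Conv2D(16, 3, activation="relu", input_shape=(64, 64, 1)),  # hypothetical cutout size
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(1, activation="sigmoid"),   # occultation vs. no occultation
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    # model.fit(train_cutouts, train_labels, validation_split=0.2, epochs=20)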

[ascl:1810.010] ODTBX: Orbit Determination Toolbox

ODTBX (Orbit Determination Toolbox) provides orbit determination analysis, advanced mission simulation, and analysis for concept exploration, proposal, early design phase, and/or rapid design center environments. The core ODTBX functionality is realized through a set of estimation commands that incorporate Monte Carlo data simulation, linear covariance analysis, and measurement processing at a generic level; its functions and utilities are combined in a flexible architecture to allow modular development of navigation algorithms and simulations. ODTBX is written in Matlab and Java.

[ascl:2002.005] ODUSSEAS: Observing Dwarfs Using Stellar Spectroscopic Energy-Absorption Shapes

ODUSSEAS (Observing Dwarfs Using Stellar Spectroscopic Energy-Absorption Shapes) uses machine learning to derive the Teff and [Fe/H] of M dwarf stars by using their optical spectra obtained by different spectrographs with different resolutions. The software uses the measurement of the pseudo equivalent widths for more than 4000 stellar absorption lines and the machine learning Python package scikit-learn (https://scikit-learn.org/stable/) to predict the stellar parameters.
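
The general recipe, a feature vector of pseudo-equivalent widths mapped to (Teff, [Fe/H]) with a scikit-learn regressor, can be sketched as below; the arrays are random placeholders and the choice of a ridge regressor is an assumption for illustration, not necessarily the estimator ODUSSEAS uses.

    import numpy as np
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 4000))          # pseudo-EWs for ~4000 lines, 200 stars (placeholder)
    y = np.column_stack([rng.uniform(2700, 4000, 200),    # Teff [K] (placeholder)
                         rng.uniform(-0.8, 0.3, 200)])    # [Fe/H] (placeholder)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    reg = Ridge(alpha=1.0).fit(X_train, y_train)          # multi-output regression
    teff_pred, feh_pred = reg.predict(X_test).T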

[ascl:1601.004] Odyssey: Ray tracing and radiative transfer in Kerr spacetime

Odyssey is a GPU-based General Relativistic Radiative Transfer (GRRT) code for computing images and/or spectra in the Kerr metric, which describes the spacetime around a rotating black hole. Odyssey is implemented in CUDA C/C++. For flexibility, the namespace structure in C++ is used for different tasks; the two default tasks presented in the source code are the redshift of a Keplerian disk and the image of a Keplerian rotating shell at 340 GHz. Odyssey_Edu, an educational software package for visualizing the ray trajectories in the Kerr spacetime that uses Odyssey, is also available.

[ascl:1906.015] OIT: Nonconvex optimization approach to optical-interferometric imaging

In the context of optical interferometry, only undersampled power spectrum and bispectrum data are accessible, creating an ill-posed inverse problem for image recovery. Recently, a tri-linear model was proposed for monochromatic imaging, leading to an alternated minimization problem; in that work, only a positivity constraint was considered, and the problem was solved by an approximated Gauss–Seidel method.

The Optical-Interferometry-Trilinear code improves the approach on three fundamental aspects. First, the estimated image is defined as a solution of a regularized minimization problem, promoting sparsity in a fixed dictionary using either an l1 or a (re)weighted-l1 regularization term. Second, the resultant non-convex minimization problem is solved using a block-coordinate forward–backward algorithm. This algorithm is able to deal both with smooth and non-smooth functions, and benefits from convergence guarantees even in a non-convex context. Finally, the model and algorithm are generalized to the hyperspectral case, promoting a joint sparsity prior through an l2,1 regularization term.

[ascl:1806.018] OMEGA: One-zone Model for the Evolution of GAlaxies

OMEGA (One-zone Model for the Evolution of GAlaxies) calculates the global chemical evolution trends of galaxies. From an input star formation history, it uses SYGMA to create, as a function of time, multiple simple stellar populations with different masses, ages, and initial compositions. OMEGA offers several prescriptions for modeling the star formation efficiency and the evolution of galactic inflows and outflows. OMEGA is part of the NuGrid (ascl:1610.015) chemical evolution package.

[ascl:2212.020] Omega: Photon equations of motion

Omega solves the photon equations of motion in the environment surrounding a black hole. This black hole can be either Schwarzschild (nonrotating) or Kerr (rotating), at the choice of the user. The software offers numerous options, such as the geometrical setup of the accretion disk around the black hole (including no disk, band, slab, and wedge, among others), the spin parameter of the central black hole, and the thickness of the accretion disk. Other options that can be set include the azimuthal angle of the photon emission/reception, the poloidal angle of the photon emission/reception, and how far away from or close to the system to look.

[ascl:1907.010] OMNICAL: Redundant calibration code for low frequency radio interferometers

OMNICAL calibrates antennas in the redundant subset of the array. The code consists of two algorithms, a logarithmic method (logcal) and a linearized method (lincal). OMNICAL makes visibilities from physically redundant baselines agree with each other and also explicitly minimizes the variance within redundant visibilities.

[ascl:2403.014] OneLoopBispectrum: Computation of the one-loop bispectrum of galaxies in redshift space

OneLoopBispectrum computes the one-loop bispectrum of galaxies in redshift space. It computes and simplifies the bispectrum kernels using Mathematica; this is cosmology-independent. The code also computes the full and flattened bispectrum templates, given the pre-computed integration kernels. OneLoopBispectrum uses Mathematica to read in and combine the bispectrum templates, and Python to interpolate and extract the one-loop bispectrum.

[ascl:1904.024] OoT: Out-of-Transit Light Curve Generator

OoT (Out-of-Transit) calculates the light curves and radial velocity signals due to a planet orbiting a star. It explicitly models the effects of tides, orbital motion, relativistic beaming, and reflection of the star's light by the planet. The code can also be used to model secondary eclipses.

[ascl:2104.009] OpacityTool: Dust opacities for disk modeling

OpacityTool computes dust opacities for disc modelling; it includes a number of robust facts obtained from observations and theory and goes beyond astronomical silicates. It provides output files with κext(λ), κabs(λ), and κsca(λ) as functions of wavelength λ, and the six scattering matrix elements for randomly oriented particles, F11(λ,θ), F12(λ,θ), F22(λ,θ), F33(λ,θ), F34(λ,θ), and F44(λ,θ), as functions of wavelength and scattering angle θ.

This code is superseded by optool (ascl:2104.010).

[ascl:1604.001] OpenMHD: Godunov-type code for ideal/resistive magnetohydrodynamics (MHD)

OpenMHD is a Godunov-type finite-volume code for ideal/resistive magnetohydrodynamics (MHD). It is written in Fortran 90 and is parallelized by using MPI-3 and OpenMP. The code was originally developed for studying magnetic reconnection problems and has been made publicly available in the hope that others may find it useful.

[ascl:1502.002] OpenOrb: Open-source asteroid orbit computation software

OpenOrb (OOrb) contains tools for rigorously estimating the uncertainties resulting from the inverse problem of computing orbital elements using scarce astrometry. It uses the least-squares method and also contains both Monte-Carlo (MC) and Markov-Chain MC versions of the statistical ranging method. Ranging obtains sampled, non-Gaussian orbital-element probability-density functions and is optimized for cases where the amount of astrometry is scarce or spans a relatively short time interval.

[ascl:1911.003] OpenSPH: Astrophysical SPH and N-body simulations and interactive visualization tools

OpenSPH runs hydrodynamical and N-body simulations and was written for asteroid collisions and subsequent gravitational evolution. The code offers SPH and N-body solvers with several different equations of state and material rheologies. It is written in C++14 with a modular object-oriented design focused on extensibility and maintainability, and it can be used either as a library or as a standalone graphical program that allows the user to set up the problem in a convenient graphical node editor. The graphical program also offers real-time visualization of the simulation, diagnostics, and tools for analysis of the results.

[ascl:1509.009] OPERA: Objective Prism Enhanced Reduction Algorithms

OPERA (Objective Prism Enhanced Reduction Algorithms) automatically analyzes astronomical images using the objective-prism (OP) technique to register thousands of low resolution spectra in large areas. It detects objects in an image, extracts one-dimensional spectra, and identifies the emission line feature. The main advantages of this method are 1) it avoids the subjectivity inherent in the visual inspection used in past studies, and 2) it can obtain physical parameters without follow-up spectroscopy.

[ascl:1411.004] OPERA: Open-source Pipeline for Espadons Reduction and Analysis

OPERA (Open-source Pipeline for Espadons Reduction and Analysis) is an open-source collaborative software reduction pipeline for ESPaDOnS data. ESPaDOnS is a bench-mounted high-resolution echelle spectrograph and spectro-polarimeter designed to obtain a complete optical spectrum (from 370 to 1,050 nm) in a single exposure with a mode-dependent resolving power between 68,000 and 81,000. OPERA is fully automated, calibrates on two-dimensional images and reduces data to produce one-dimensional intensity and polarimetric spectra. Spectra are extracted using an optimal extraction algorithm. Though designed for CFHT ESPaDOnS data, the pipeline is extensible to other echelle spectrographs.

[submitted] Opik Collision Probability

The Opik method gives the mean probability of collision of a small body with a given planet. It is a statistical value valid for an orbit with given (a, e, i) and undefined argument of perihelion. In some cases, the planet can eject the small body from the solar system; in these cases, the program estimates the mean time for the ejection. The Opik method does not take into account perturbers other than the planet considered, so it only provides an idea of the timescales involved.

[ascl:2112.018] Optab: Ideal-gas opacity tables generator

Optab, written in Fortran90, generates ideal-gas opacity tables. It computes opacity based on user-provided chemical equilibrium abundances, and outputs mean opacities as well as monochromatic opacities, thus providing opacity tables that are consistent with one's equation of state for radiation hydrodynamics simulations. For convenience, Optab also provides interfaces for FastChem (ascl:1804.025) or TEA (ascl:1505.031) for computing chemical abundances.

[ascl:1803.013] optBINS: Optimal Binning for histograms

optBINS (optimal binning) determines the optimal number of bins in a uniform bin-width histogram by deriving the posterior probability for the number of bins in a piecewise-constant density model after assigning a multinomial likelihood and a non-informative prior. The maximum of the posterior probability occurs at a point where the prior probability and the joint likelihood are balanced. The interplay between these opposing factors effectively implements Occam's razor by selecting the simplest model that best describes the data.
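
A compact way to see the calculation: the relative log-posterior for M bins in the piecewise-constant model (following Knuth 2006) can be written down and maximized directly, as in this independent sketch, which is not optBINS's own implementation.

    import numpy as np
    from scipy.special import gammaln

    def log_posterior_nbins(data, M):
        # Relative log-posterior for M equal-width bins (Knuth 2006):
        # N*lnM + lnG(M/2) - M*lnG(1/2) - lnG(N + M/2) + sum_k lnG(n_k + 1/2)
        n, _ = np.histogram(data, bins=M)
        N = data.size
        return (N * np.log(M)
                + gammaln(M / 2.0) - M * gammaln(0.5)
                - gammaln(N + M / 2.0)
                + np.sum(gammaln(n + 0.5)))

    data = np.random.default_rng(2).normal(size=1000)
    best_M = max(range(2, 200), key=lambda M: log_posterior_nbins(data, M))
    print(best_M)   # optimal number of bins under this posterior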

[ascl:2104.010] OpTool: Command-line driven tool for creating complex dust opacities

Optool computes dust opacities and scattering matrices, for specific grain sizes or averaged over size distributions. It is derived from OpacityTool (ascl:2104.009) and implements the Distribution of Hollow Spheres (DHS) statistical method to approximate irregular and low porosity grains. Mie theory is available as a limiting case of DHS. It also implements the Tazaki Modified Mean Field Theory (MMF) to treat fractal and highly porous aggregates. The refractive index data for many astronomically relevant materials are compiled into the code, and external refractive index data can be used as well. A compact and intuitive command line interface makes it easy to construct complex particles on the fly. Available output formats are ASCII and FITS, including files directly readable by RADMC-3D (ascl:1202.015). A python interface to the FORTRAN program is included.

[ascl:2102.016] OPUS: Interoperable access to analysis and simulation codes

OPUS (Observatoire de Paris UWS System) provides interoperable access to analysis and simulation codes on local machines or work clusters. This job control system was developed using the micro-framework bottle.py, and executes jobs asynchronously to better manage jobs with a long execution duration. The software follows the proposed IVOA Provenance Data Model to capture and expose the provenance information of jobs and results.

[ascl:1310.001] ORAC-DR: Astronomy data reduction pipeline

ORAC-DR is a generic data reduction pipeline infrastructure; it includes specific data processing recipes for a number of instruments. It is used at the James Clerk Maxwell Telescope, United Kingdom Infrared Telescope, AAT, and LCOGT. This pipeline runs at the JCMT Science Archive hosted by CADC to generate near-publication quality data products; the code has been in use since 1998.

[ascl:1210.024] ORBADV: ORBital ADVection by interpolation

ORBADV adopts a ZEUS-like scheme to solve magnetohydrodynamic equations of motion in a shearing sheet. The magnetic field is discretized on a staggered mesh, and magnetic field variables represent fluxes through zone faces. The code uses orbital advection to ensure fast and accurate integration in a large shearing box.

[ascl:1702.001] ORBE: Orbital integrator for educational purposes

ORBE performs numerical integration of an arbitrary planetary system composed of a central star and up to 100 planets and minor bodies. ORBE calculates the orbital evolution of a system of bodies by means of the computation of the time evolution of their orbital elements. It is easy to use and is suitable for educational use by undergraduate students in the classroom as a first approach to orbital integrators.

[ascl:1307.016] orbfit: Orbit fitting software

Orbfit determines positions and orbital elements, and associated uncertainties, of outer solar system planets. The orbit-fitting procedure is greatly streamlined compared with traditional methods because acceleration can be treated as a perturbation to the inertial motion of the body. Orbfit quickly and accurately calculates orbital elements and ephemerides and their associated uncertainties for targets ≳ 10 AU from the Sun and produces positional estimates and uncertainty ellipses even in the face of the substantial degeneracies of short-arc orbit fits; the sole a priori assumption is that the orbit should be bound or nearly so.

[ascl:1106.015] OrbFit: Software to Determine Orbits of Asteroids

OrbFit is a software system allowing one to compute the orbits of asteroids starting from the observations, to propagate these orbits, and to compute predictions on the future (and past) position on the celestial sphere. It is a tool to be used to find a well known asteroid, to recover a lost one, to attribute a small group of observations, to identify two orbits with each other, to study the future (and/or past) close approaches to Earth, thus to assess the risk of an impact, and more.

[ascl:1804.009] orbit-estimation: Fast orbital parameters estimator

orbit-estimation tests and evaluates the Stäckel approximation method for estimating orbit parameters in galactic potentials. It relies on the approximation of the Galactic potential as a Stäckel potential, in a prolate confocal coordinate system, under which the vertical and horizontal motions decouple. By solving the Hamilton-Jacobi equations at the turning points of the horizontal and vertical motions, it is possible to determine the spatial boundary of the orbit, and hence calculate the desired orbit parameters.

[ascl:1910.009] orbitize: Orbit-fitting for directly imaged objects

orbitize fits the orbits of directly-imaged objects by packaging the Orbits for the Impatient (OFTI) algorithm and a parallel-tempered Markov Chain Monte Carlo (MCMC) algorithm into a consistent API. It accepts observations in three measurement formats, which can be mixed in the same input file, generates orbits, and plots the computed orbital parameters. orbitize offers numerous ways to visualize the data, including histograms, corner plots, and orbit plots. Generated orbits can be saved in HDF5 format for future use and analysis.

[ascl:2307.059] orbitN: Symplectic integrator for near-Keplerian planetary systems

orbitN generates accurate and reproducible long-term orbital solutions for near-Keplerian planetary systems with a dominant mass M0. The code focuses on hierarchical systems without close encounters but can be extended to include additional features. Among other features, the package includes M0's quadrupole moment, a lunar contribution, and post-Newtonian corrections (1PN) due to M0 (fast symplectic implementation). To reduce numerical roundoff errors, orbitN features Kahan compensated summation.
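
Kahan compensated summation, which the entry mentions as orbitN's roundoff-control technique, is a short, well-known algorithm; the sketch below is a textbook Python illustration rather than code taken from orbitN itself.

    def kahan_sum(values):
        total = 0.0
        c = 0.0                  # running compensation for lost low-order bits
        for x in values:
            y = x - c            # re-inject the part lost in the previous step
            t = total + y        # the large total swallows y's low-order digits ...
            c = (t - total) - y  # ... recover exactly what was lost
            total = t
        return total

    vals = [0.1] * 1_000_000
    print(sum(vals), kahan_sum(vals))   # the compensated sum stays much closer to 1e5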

[ascl:1409.007] ORBS: A reduction software for SITELLE and SpiOMM data

ORBS merges, corrects, transforms and calibrates interferometric data cubes and produces a spectral cube of the observed region for analysis. It is a fully automatic data reduction software for use with SITELLE (installed at the Canada-France-Hawaii Telescope) and SpIOMM (a prototype attached to the Observatoire du Mont Mégantic); these imaging Fourier transform spectrometers obtain a hyperspectral data cube which samples a 12-arcminute field of view into 4 million visible spectra. ORBS is highly parallelized; its core classes (ORB) have been designed to be used in a suite of software packages for data analysis (ORCS and OACS), data simulation (ORUS) and data acquisition (IRIS).

[ascl:1911.019] OrbWeaver: Galaxy/(sub)halo orbital processing tool

OrbWeaver extracts orbits from halo catalogs, enabling large statistical studies of their orbital parameters. The code is run in two stages. In the first run, a configuration file is used to modify the orbit host selection and the region around each orbit host used for the superset of orbiting halos. Each orbit host has an orbit forest (containing halos that passed within the region of interest); the code generates a pre-processed catalog which contains a superset of orbiting halos for each identified orbit host. The second run uses the file list generated in the first stage to create the orbit catalog, which is the final output.

[ascl:2001.009] ORCS: Analysis engine for SITELLE spectral cubes

ORCS (Outils de Réduction de Cubes Spectraux) is an analysis engine for SITELLE spectral cubes. The software extracts integrated spectra, fits the sinc emission lines, and recalibrates data in wavelength, astrometry and flux. ORCS offers a choice between a Bayesian and a classical fitting algorithm, and also provides automatic source detection and radial velocity correction.

[ascl:1304.012] ORIGAMI: Structure-finding routine in N-body simulation

ORIGAMI is a dynamical method of determining the morphology of particles in a cosmological simulation by checking for whether, and in how many dimensions, a particle has undergone shell-crossing. The code is written in C and makes use of the Delaunay tessellation calculation routines from the VOBOZ package (which relies on the Qhull package).

[ascl:2002.003] ORIGIN: detectiOn and extRactIon of Galaxy emIssion liNes

ORIGIN performs blind detection of faint emitters in MUSE datacubes. The algorithm is tuned to detect faint spatial-spectral emission signatures while allowing for a stable false detection rate over the data cube and, at the same time, providing an automated and reliable estimation of the purity. ORIGIN implements a nuisance removal part based on a continuum subtraction that combines a Discrete Cosine Transform and an iterative Principal Component Analysis, and a detection part based on the local maxima of Generalized Likelihood Ratio test statistics obtained for a set of spatial-spectral profiles of emission line emitters. In addition, it performs a purity estimation in which the proportion of true emission lines is estimated from the data itself: the distribution of the local maxima in the noise-only configuration is estimated from that of the local minima.

[ascl:1204.013] ORSA: Orbit Reconstruction, Simulation and Analysis

ORSA is an interactive tool for scientific grade Celestial Mechanics computations. Asteroids, comets, artificial satellites, solar and extra-solar planetary systems can be accurately reproduced, simulated, and analyzed. The software uses JPL ephemeris files for accurate planets positions and has a Qt-based graphical user interface. It offers an advanced 2D plotting tool and 3D OpenGL viewer and the standalone numerical library liborsa and can import asteroids and comets from all the known databases (MPC, JPL, Lowell, AstDyS, and NEODyS). In addition, it has an integrated download tool to update databases.

[ascl:2105.012] orvara: Orbits from Radial Velocity, Absolute, and/or Relative Astrometry

orvara (Orbits from Radial Velocity, Absolute, and/or Relative Astrometry) fits orbits of bright stars and their faint companions (exoplanets, brown dwarfs, white dwarfs, and low-mass stars). It can use any combination of radial velocity, relative astrometry, and absolute astrometry data and offers a variety of plots from the orbital fit, such as the radial velocity orbit over an extended time baseline, position angle between two companions, and a density plot of the predicted position at a chosen epoch. orvara can also check convergence of fitted parameters in the HDU1 extension, save the results from the fitted and inferred parameters from the HDU1 extension, and plot the results of a three-body or multiple-body fit.

[ascl:1908.012] oscode: Oscillatory ordinary differential equation solver

oscode solves oscillatory ordinary differential equations efficiently. It is designed to deal with equations of the form ẍ(t) + 2γ(t)ẋ(t) + ω²(t)x(t) = 0, where γ(t) and ω(t) can be given as explicit functions or sequence containers (Eigen::Vectors, arrays, std::vectors, lists) in C++ or as numpy.arrays in Python. oscode makes use of an analytic approximation of x(t) embedded in a stepping procedure to skip over long regions of oscillations, giving a reduction in computing time. The approximation is valid when the frequency changes slowly relative to the timescales of integration; it is therefore worth applying when this condition holds for at least some part of the integration range.
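
For comparison, the equation class oscode targets can be integrated by brute force with a conventional solver; the sketch below does so for the Airy-like case γ = 0, ω(t) = √t using SciPy, and is only a reference illustration, not oscode's stepping scheme or Python API.

    import numpy as np
    from scipy.integrate import solve_ivp

    gamma = lambda t: 0.0             # damping term, here zero
    omega = lambda t: np.sqrt(t)      # slowly varying frequency (Airy equation)

    def rhs(t, y):
        # First-order form of x'' + 2*gamma*x' + omega^2 * x = 0
        x, dx = y
        return [dx, -2.0 * gamma(t) * dx - omega(t)**2 * x]

    sol = solve_ivp(rhs, (1.0, 100.0), [1.0, 0.0], rtol=1e-8, atol=1e-10)
    print(sol.y[0, -1])   # x(100) from direct (expensive) integration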

[ascl:1710.021] OSIRIS Toolbox: OH-Suppressing InfraRed Imaging Spectrograph pipeline

OSIRIS Toolbox reduces data for the Keck OSIRIS instrument, an integral field spectrograph that works with the Keck Adaptive Optics System. It offers real-time reduction of raw frames into cubes for display and basic analysis. In this real-time mode, it takes about one minute for a preliminary data cube to appear in the “quicklook” display package. The reduction system also includes a growing set of final reduction steps including correction of telluric absorption and mosaicing of multiple cubes.

[ascl:2007.018] OSPEX: Object Spectral Executive

OSPEX (Object Spectral Executive) is an object-oriented interface for X-ray spectral analysis of solar data. The next generation of SPEX (ascl:2007.017), it reads and displays input data, selects and subtracts background, selects time intervals of interest, selects a combination of photon flux model components to describe the data, and fits those components to the spectrum in each time interval selected. During the fitting process, the response matrix is used to convert the photon model to the model counts to compare with the input count data. The resulting time-ordered fit parameters are stored and can be displayed and analyzed with OSPEX. The entire OSPEX session can be saved in the form of a script and the fit results stored in the form of a FITS file. Part of the SolarSoft (ascl:1208.013) package, OSPEX works with any type of data structured in the form of time-ordered count spectra; RHESSI, Fermi, SOXS, MESSENGER, Yohkoh, SMM, and SMART data analysis have all been implemented in OSPEX.

[ascl:2109.027] OSPREI: Sun-to-Earth (or satellite) CME simulator

OSPREI simulates the Sun-to-Earth (or satellite) behavior of CMEs. It is comprised of three separate models: ForeCAT, ANTEATR, and FIDO. ForeCAT uses the PFSS background to determine the external magnetic forces on a CME; ANTEATR takes the ForeCAT CME and propagates it to the final satellite distance, and outputs the final CME speed (both propagation and expansion), size, and shape (and their profiles with distance) as well as the arrival time and internal thermal and magnetic properties of the CME. FIDO takes the evolved CME from ANTEATR with the position and orientation from ForeCAT and passes the CME over a synthetic spacecraft. The relative location of the spacecraft within the CME determines the in situ magnetic field vector and velocity. It also calculates the Kp index from these values. OSPREI includes tools for creating figures from the results, including histograms, contour plots, and ensemble correlation plots, and new figures can be created using the results object that contains all the simulation data in an easily accessible format.

[ascl:1805.014] OSS: OSSOS Survey Simulator

Comparing properties of discovered trans-Neptunian Objects (TNOs) with dynamical models is impossible due to the observational biases that exist in surveys. The OSSOS Survey Simulator takes an intrinsic orbital model (from, for example, the output of a dynamical Kuiper belt emplacement simulation) and applies the survey biases, so the biased simulated objects can be directly compared with real discoveries.

[ascl:2401.011] ostrich: Surrogate modeling using PCA and Gaussian process interpolation

Ostrich builds surrogate models for complex and expensive functions, using Principal Component Analysis (PCA) to decompose a signal and then interpolating the PCA weights over the parameters θ with a Gaussian Process interpolator. The code is trained on samples from the expensive functions and then recreates and interpolates between those training samples at a greatly reduced computational cost, avoiding a full recalculation for each use.
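
The PCA-plus-Gaussian-process strategy described above can be sketched generically with scikit-learn (synthetic training data, one GP per retained component); this shows the technique, not Ostrich's own API or defaults.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(3)
    theta_train = rng.uniform(0.5, 2.0, size=(50, 1))      # training parameters
    x = np.linspace(0.0, 1.0, 200)
    signals = np.sin(2 * np.pi * theta_train * x)           # stand-in for the expensive model

    pca = PCA(n_components=5).fit(signals)
    weights = pca.transform(signals)                         # (50, 5) PCA weights

    # One Gaussian process per principal component, interpolating weight(theta)
    gps = [GaussianProcessRegressor(kernel=RBF()).fit(theta_train, w)
           for w in weights.T]

    def emulate(theta_new):
        w = np.array([gp.predict(np.atleast_2d(theta_new))[0] for gp in gps])
        return pca.inverse_transform(w.reshape(1, -1))[0]    # emulated signal

    approx = emulate([1.3])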

[ascl:2211.009] ovejero: Bayesian neural network inference of strong gravitational lenses

ovejero conducts hierarchical inference of strongly-lensed systems with Bayesian neural networks. It requires lenstronomy (ascl:1804.012) and fastell (ascl:9910.003) to run lens models with elliptical mass distributions. The code trains Bayesian Neural Networks (BNNs) to predict posteriors on strong gravitational lensing images and can integrate with forward modeling tools in lenstronomy to allow comparison between BNN outputs and more traditional methods. ovejero also provides hierarchical inference tools to generate population parameter estimates and unbiased posteriors on independent test sets.

[ascl:1611.011] OXAF: Ionizing spectra of Seyfert galaxies for photoionization modeling

OXAF provides a simplified model of Seyfert Active Galactic Nucleus (AGN) continuum emission designed for photoionization modeling. It removes degeneracies in the effects of AGN parameters on model spectral shapes and reproduces the diversity of spectral shapes that arise in physically-based models. OXAF accepts three parameters which directly describe the shape of the output ionizing spectrum: the energy of the peak of the accretion disk emission Epeak, the photon power-law index of the non-thermal X-ray emission Γ, and the proportion of the total flux which is emitted in the non-thermal component pNT. OXAF accounts for opacity effects where the accretion disk is ionized because it inherits the ‘color correction’ of OPTXAGNF, the physical model upon which OXAF is based.

[ascl:2009.003] oxkat: Semi-automated imaging of MeerKAT observations

oxkat semi-automatically performs calibration and imaging of data from the MeerKAT radio telescope. Taking as input raw visibilities in Measurement Set format, the entire processing workflow is covered, from flagging and reference calibration, to imaging and self-calibration, and (optionally) direction-dependent calibration. The oxkat scripts use Python, and draw on numerous existing radio astronomy packages, including CASA (ascl:1107.013), WSClean (ascl:1408.023), and CubiCal (ascl:1805.031), among others, that are containerized using Singularity. Submission scripts for slurm and PBS job schedulers are automatically generated where necessary, catering for HPC facilities that are commonly used for processing MeerKAT data.

[ascl:2111.011] p-winds: Python implementation of Parker wind models for planetary atmospheres

p-winds produces simplified, 1-D models of the upper atmosphere of a planet and performs radiative transfer to calculate observable spectral signatures. The scalable implementation of 1D models allows for atmospheric retrievals to calculate atmospheric escape rates and temperatures. In addition, the modular implementation allows for a smooth plugging-in of more complex descriptions to forward model their corresponding spectral signatures (e.g., self-consistent or 3D models).

[ascl:1806.011] P2DFFT: Parallelized technique for measuring galactic spiral arm pitch angles

P2DFFT is a parallelized version of 2DFFT (ascl:1608.015). It isolates and measures the spiral arm pitch angle of galaxies. The code allows direct input of FITS images, offers the option to output inverse Fourier transform FITS images, and generates idealized logarithmic spiral test images of a specified size that have 1 to 6 arms with pitch angles of -75 degrees to 75 degrees. Further, it can output Fourier amplitude versus inner radius and pitch angle versus inner radius for each Fourier component (m = 0 to m = 6), and calculates the Fourier amplitude weighted mean pitch angle across m = 1 to m = 6 versus inner radius.

[ascl:1402.030] P2SAD: Particle Phase Space Average Density

P2SAD computes the Particle Phase Space Average Density (P2SAD) in galactic haloes. The model for the calculation is based on the stable clustering hypothesis in phase space, the spherical collapse model, and tidal disruption of substructures. The multiscale prediction for P2SAD computed by this IDL code can be used to estimate signals sensitive to the small scale structure of dark matter distributions (e.g. dark matter annihilation). The code computes P2SAD averaged over the whole virialized region of a Milky-Way-size halo at redshift zero.

[ascl:1205.002] p3d: General data-reduction tool for fiber-fed integral-field spectrographs

p3d is a semi-automatic data-reduction tool designed to be used with fiber-fed integral-field spectrographs. p3d is a highly general and freely available tool based on IDL but can be used with full functionality without an IDL license. It is easily extended to include improved algorithms, new visualization tools, and support for additional instruments. It uses a novel algorithm for automatic finding and tracing of spectra on the detector, and includes two methods of optimal spectrum extraction in addition to standard aperture extraction. p3d also provides tools to combine several images, perform wavelength calibration, and flat-field the data.

[ascl:1105.002] PACCE: Perl Algorithm to Compute Continuum and Equivalent Widths

PACCE (Perl Algorithm to Compute continuum and Equivalent Widths) computes continuum and equivalent widths. PACCE is able to determine mean continuum and continuum at line center values, which are helpful in stellar population studies, and is also able to compute the uncertainties in the equivalent widths using photon statistics.

[ascl:1110.011] Pacerman: Polarisation Angle CorrEcting Rotation Measure ANalysis

Pacerman, written in IDL, calculates Faraday rotation measure maps from multi-frequency polarisation angle data. To solve the so-called n-π ambiguity problem, which arises because the observed polarisation angle is determined only up to additions of n times π (where n is an integer), the algorithm uses a global scheme: instead of solving the n-π ambiguity for each data point independently, Pacerman solves it for a high signal-to-noise region "democratically" and uses this information to assist computations in adjacent low signal-to-noise areas.
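
The underlying relation is that the polarisation angle follows χ(λ²) = χ₀ + RM·λ², with each measured angle known only modulo π; the numpy sketch below fits RM for a single pixel once the n·π offsets are resolved (here resolved by construction, since picking them consistently across a map is exactly the global step Pacerman performs).

    import numpy as np

    rng = np.random.default_rng(4)
    lam2 = np.array([0.018, 0.021, 0.025, 0.031])   # lambda^2 [m^2], illustrative bands
    chi_true = 0.3 + 40.0 * lam2                    # chi0 = 0.3 rad, RM = 40 rad/m^2
    chi_obs = np.mod(chi_true + rng.normal(0.0, 0.02, lam2.size), np.pi)  # angles known modulo pi

    # Resolve the n*pi ambiguity (known by construction in this toy example;
    # Pacerman determines these offsets consistently across the whole map).
    n = np.round((chi_true - chi_obs) / np.pi)
    chi = chi_obs + n * np.pi

    # Ordinary least squares for chi(lambda^2) = chi0 + RM * lambda^2
    A = np.column_stack([np.ones_like(lam2), lam2])
    chi0_fit, rm_fit = np.linalg.lstsq(A, chi, rcond=None)[0]
    print(rm_fit)                                   # recovered rotation measure [rad/m^2]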

[ascl:2212.013] PACMAN: Planetary Atmosphere, Crust, and MANtle geochemical evolution

PACMAN (Planetary Atmosphere, Crust, and MANtle geochemical evolution) runs a coupled redox-geochemical-climate evolution model. It runs Monte Carlo calculations over nominal parameter ranges, including number of iterations and number of cores for parallelization, which can be altered to reproduce different scenarios and sensitivity tests. Model outputs and corresponding input parameters are saved in separate files which are used to plot results; the user can choose which outputs to plot, including all successful outputs, nominal Earth outputs, waterworld false positives, desertworld false positives, and high CO2:H2O false positives. Among other functions, PACMAN contains functions for interpolating the pre-computed Outgoing Longwave Radiation (OLR) grid, the atmosphere-ocean partitioning grid, and the stratospheric water vapor grid, and for calculating Bond albedo and outgassing fluxes.

[ascl:1708.014] PACSman: IDL Suite for Herschel/PACS spectrometer data

PACSman provides an alternative for several reduction and analysis steps performed in HIPE (ascl:1111.001) on PACS spectroscopic data; it is written in IDL. Among the operations possible with it are transient correction, line fitting, map projection, and map analysis, and unchopped scan, chop/nod, and the decommissioned wavelength switching observation modes are supported.

[ascl:2404.024] pAGN: AGN disk model equations solver

Written in Python, pAGN solves AGN disk model equations. The code is highly customizable and, with the correct inputs, provides a fully evolved AGN disk model through parametric 1D curves for key disk parameters such as temperature and density. pAGN can be used to study migration torques in AGN disks, simulations of compact object formation inside gas disks, and comparisons with new, more complex models of AGN disks.

[ascl:2211.004] PAHDecomp: Decomposing the mid-IR spectra of extremely obscured galaxies

PAHDecomp models mid-infrared spectra of galaxies; it is based on the popular PAHFIT code (ascl:1210.009). In contrast to PAHFIT, this model decomposes the continuum into a star-forming component and an obscured nuclear component based on Bayesian priors on the shape of the star-forming component (using templates + prior on extinction), making this tool ideally suited for modeling the spectra of heavily obscured galaxies. PAHDecomp successfully recovers properties of Compact Obscured Nuclei (CONs) where the inferred nuclear optical depth strongly correlates with the surface brightness of HCN-vib emission in the millimeter. This is currently set up to run on the short low modules of Spitzer IRS data (5.2 - 14.2 microns) but will be ideal for JWST/MIRI MRS data in the future.

[ascl:1210.009] PAHFIT: Properties of PAH Emission

PAHFIT is an IDL tool for decomposing Spitzer IRS spectra of PAH emission sources, with a special emphasis on the careful recovery of ambiguous silicate absorption, and weak, blended dust emission features. PAHFIT is primarily designed for use with full 5-35 micron Spitzer low-resolution IRS spectra. PAHFIT is a flexible tool for fitting spectra, and you can add or disable features, compute combined flux bands, change fitting limits, etc., without changing the code.

PAHFIT uses a simple, physically-motivated model, consisting of starlight, thermal dust continuum in a small number of fixed temperature bins, resolved dust features and feature blends, prominent emission lines (which themselves can be blended with dust features), as well as simple fully-mixed or screen dust extinction, dominated by the silicate absorption bands at 9.7 and 18 microns. Most model components are held fixed or are tightly constrained. PAHFIT uses Drude profiles to recover the full strength of dust emission features and blends, including the significant power in the wings of the broad emission profiles. This means the resulting feature strengths are larger (by factors of 2-4) than are recovered by methods which estimate the underlying continuum using line segments or spline curves fit through fiducial wavelength anchors.
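
The Drude profile PAHFIT uses for dust features has a simple closed form (central wavelength λ0, fractional FWHM γ, and central intensity b, following Smith et al. 2007); the short function below is a reconstruction for illustration, not PAHFIT's IDL implementation.

    import numpy as np

    def drude(lam, lam0, frac_fwhm, central_intensity):
        # D(lam) = b * g^2 / ((lam/lam0 - lam0/lam)^2 + g^2)
        x = lam / lam0 - lam0 / lam
        return central_intensity * frac_fwhm**2 / (x**2 + frac_fwhm**2)

    lam = np.linspace(5.0, 15.0, 2000)               # wavelength grid [microns]
    feature = drude(lam, 7.7, 0.04, 1.0)             # an illustrative 7.7-micron feature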

[ascl:1606.002] PAL: Positional Astronomy Library

The PAL library is a partial re-implementation of Pat Wallace's popular SLALIB library, written in C under the GNU GPL license and layered on top of the IAU's SOFA library (or the BSD-licensed ERFA) where appropriate. PAL attempts to stick to the SLA C API where possible.

[ascl:2202.005] palettable: Color palettes for Python

Palettable is a library of color palettes for Python. The code is written in pure Python with no dependencies; it can be used to supply color maps for matplotlib plots, customize matplotlib plots, and to supply colors for a web application.
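
A minimal usage sketch with matplotlib follows, pulling a ColorBrewer palette both as a discrete color cycle and as a colormap; the palette name and the mpl_colors/mpl_colormap attributes used here are the standard palettable palette attributes, but verify them against the palettable documentation for your version.

    import numpy as np
    import matplotlib.pyplot as plt
    from palettable.colorbrewer.qualitative import Set1_9

    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3))

    # Discrete colors: a list of RGB tuples in [0, 1], usable as a prop cycle
    ax1.set_prop_cycle(color=Set1_9.mpl_colors)
    for k in range(5):
        ax1.plot(np.arange(10), (k + 1) * np.arange(10))

    # Continuous use: the same palette exposed as a matplotlib colormap object
    img = np.random.default_rng(5).random((32, 32))
    ax2.imshow(img, cmap=Set1_9.mpl_colormap)
    plt.show()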

[ascl:2405.021] PALpy: Python positional astronomy library interface

PALpy provides a Python interface to PAL, the Positional Astronomy Library (ascl:1606.002), which is written in C. All arguments modified by the C API are returned and none are modified. The one routine that is different is palObs, which returns a simple dict that can be searched using standard Python. The keys of the dict are the short names and the values are another dict with keys name, long, lat and height.

[ascl:2210.029] paltas: Simulation-based inference on strong gravitational lensing systems

paltas conducts simulation-based inference on strong gravitational lensing images. It builds on lenstronomy (ascl:1804.012) to create large datasets of strong lensing images with realistic low-mass halos, Hubble Space Telescope (HST) observational effects, and galaxy light from HST's COSMOS field. paltas also includes the capability to easily train neural posterior estimators of the parameters of the lensing system and to run hierarchical inference on test populations.

[ascl:1406.002] PAMELA: Optimal extraction code for long-slit CCD spectroscopy

PAMELA is an implementation of the optimal extraction algorithm for long-slit CCD spectroscopy and is well suited for time-series spectroscopy. It properly implements the optimal extraction algorithm for curved spectra, including on-the-fly cosmic ray rejection as well as proper calculation and propagation of the errors. The software is distributed as part of the Starlink software collection (ascl:1110.012).

[ascl:1805.021] PampelMuse: Crowded-field 3D spectroscopy

PampelMuse analyzes integral-field spectroscopic observations of crowded stellar fields and provides several subroutines to perform the individual steps of the data analysis. All analysis steps assume that the IFS data has been properly reduced and that all the instrumental artifacts have been removed. PampelMuse is designed to correctly handle IFS data regardless of which instrument was used to observe the data. In addition to the actual data, the software also requires an estimate of the variances for the analysis; optionally, it can use a bad pixel mask. The analysis relies on the presence of a reference catalogue, containing coordinates and magnitudes of the stars in and around the observed field.

[ascl:2212.008] panco2: Pressure profile measurements of galaxy clusters

panco2 extracts measurements of the pressure profile of the hot gas inside galaxy clusters from millimeter-wave observations. The extraction is performed by forward modeling the millimeter-wave signal of clusters and MCMC sampling of a posterior distribution for the parameters given the input data. Many characteristic features of millimeter-wave observations can be taken into account, such as filtering (both through PSF smearing and transfer functions), point source contamination, and correlated noise.

[ascl:1906.016] PandExo: Instrument simulations for exoplanet observation planning

PandExo generates instrument simulations of JWST’s NIRSpec, NIRCam, NIRISS, and MIRI and of HST WFC3 for planning exoplanet observations. It uses throughput calculations from STScI’s Exposure Time Calculator, Pandeia, and offers both an online tool and a Python package.

[ascl:2303.009] Pandora: Fast exomoon transit detection algorithm

Pandora searches for exomoons by employing an analytical photodynamical model that includes stellar limb darkening, full and partial planet-moon eclipses, and barycentric motion of planet and moon. The code can be used with nested samplers such as UltraNest (ascl:1611.001) or dynesty (ascl:1809.013). Pandora is fast, calculating 10,000 models and log-likelihood evaluations per second (give or take an order of magnitude, depending on parameters and data); this means that a retrieval with 250 million evaluations until convergence takes about 5 hours on a single core. For searches in large amounts of data, it is most efficient to assign one core per light curve.

[ascl:1511.009] Pangloss: Reconstructing lensing mass

Pangloss reconstructs all the mass within a light cone through the Universe. Understanding complex mass distributions like this is important for accurate time delay lens cosmography, and also for accurate lens magnification estimation. It aspires to use all available data in an attempt to make the best of all mass maps.

[ascl:2404.010] Panphasia: Create cosmological and resimulation initial conditions

Panphasia computes a very large realization of a Gaussian white noise field. The field has a hierarchical structure based on an octree geometry with 50 octree levels fully populated. The code sets up Gaussian initial conditions for cosmological simulations and resimulations of structure formation. Panphasia provides an easy way to publish the linear phases used to set up cosmological simulation initial conditions; publishing phases enriches the literature and makes it easier to reproduce and extend published simulation work.

[ascl:2105.020] PAP: PHANGS-ALMA pipeline

The PHANGS-ALMA pipeline processes data from radio interferometer observations. It uses CASA (ascl:1107.013), AstroPy (ascl:1304.002), and other affiliated packages to process data from calibrated visibilities to science-ready spectral cubes and maps. The PHANGS-ALMA pipeline offers a flexible alternative to the scriptForImaging script distributed by ALMA. The pipeline runs in two separate software environments: CASA 5.6 or 5.7 (staging, imaging and post-processing) and Python 3.6 or later (derived products) with modern versions of several packages.

[ascl:1103.008] Parallel HOP: A Scalable Halo Finder for Massive Cosmological Data Sets

Modern N-body cosmological simulations contain billions ($10^9$) of dark matter particles. These simulations require hundreds to thousands of gigabytes of memory, and employ hundreds to tens of thousands of processing cores on many compute nodes. In order to study the distribution of dark matter in a cosmological simulation, the dark matter halos must be identified using a halo finder, which establishes the halo membership of every particle in the simulation. The resources required for halo finding are similar to the requirements for the simulation itself. In particular, simulations have become too extensive to use commonly-employed halo finders, such that the computational requirements to identify halos must now be spread across multiple nodes and cores. Here we present a scalable-parallel halo finding method called Parallel HOP for large-scale cosmological simulation data. Based on the halo finder HOP, it utilizes MPI and domain decomposition to distribute the halo finding workload across multiple compute nodes, enabling analysis of much larger datasets than is possible with the strictly serial or previous parallel implementations of HOP. We provide a reference implementation of this method as a part of the toolkit yt, an analysis toolkit for Adaptive Mesh Refinement (AMR) data that includes complementary analysis modules. Additionally, we discuss a suite of benchmarks that demonstrate that this method scales well up to several hundred tasks and datasets in excess of $2000^3$ particles. The Parallel HOP method and our implementation can be readily applied to any kind of N-body simulation data and is therefore widely applicable. Parallel HOP is part of yt.

[ascl:1106.009] PARAMESH V4.1: Parallel Adaptive Mesh Refinement

PARAMESH is a package of Fortran 90 subroutines designed to provide an application developer with an easy route to extend an existing serial code which uses a logically cartesian structured mesh into a parallel code with adaptive mesh refinement (AMR). Alternatively, in its simplest use, and with minimal effort, it can operate as a domain decomposition tool for users who want to parallelize their serial codes, but who do not wish to use adaptivity.

The package builds a hierarchy of sub-grids to cover the computational domain, with spatial resolution varying to satisfy the demands of the application. These sub-grid blocks form the nodes of a tree data structure (a quad-tree in 2D or an oct-tree in 3D). Each grid block has a logically Cartesian mesh. The package supports 1, 2, and 3D models. PARAMESH is released under the NASA-wide open-source software license.
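
The block-tree structure described above can be sketched in a few lines; this toy quad-tree (2D) is illustrative only, with hypothetical names, and PARAMESH itself is Fortran 90.

class Block:
    """One sub-grid block; leaves of the tree carry the active mesh."""
    def __init__(self, center, size, level=0):
        self.center, self.size, self.level = center, size, level
        self.children = []            # empty for leaf blocks
        self.nx = self.ny = 8         # each block has its own small logically Cartesian mesh

    def refine(self):
        """Split this block into four half-size children (eight in 3D)."""
        cx, cy = self.center
        h = self.size / 4.0
        self.children = [Block((cx + dx, cy + dy), self.size / 2, self.level + 1)
                         for dx in (-h, h) for dy in (-h, h)]

root = Block(center=(0.5, 0.5), size=1.0)
root.refine()                         # resolution doubles where needed
root.children[0].refine()             # and again, only in one corner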

[ascl:1010.039] Parameter Estimation from Time-Series Data with Correlated Errors: A Wavelet-Based Method and its Application to Transit Light Curves

We consider the problem of fitting a parametric model to time-series data that are afflicted by correlated noise. The noise is represented by a sum of two stationary Gaussian processes: one that is uncorrelated in time, and another that has a power spectral density varying as $1/f^{\gamma}$. We present an accurate and fast [$O(N)$] algorithm for parameter estimation based on computing the likelihood in a wavelet basis. The method is illustrated and tested using simulated time-series photometry of exoplanetary transits, with particular attention to estimating the midtransit time. We compare our method to two other methods that have been used in the literature, the time-averaging method and the residual-permutation method. For noise processes that obey our assumptions, the algorithm presented here gives more accurate results for midtransit times and truer estimates of their uncertainties.
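
A brief sketch of a wavelet-domain likelihood for this kind of white-plus-$1/f^{\gamma}$ noise, using the PyWavelets package; the per-scale variance model below is a simplified assumption for illustration, not the paper's exact parameterization.

import numpy as np
import pywt

def wavelet_loglike(residuals, sigma_white, sigma_red, gamma=1.0, wavelet="db2"):
    """Gaussian log-likelihood with independent wavelet coefficients."""
    coeffs = pywt.wavedec(residuals, wavelet)          # [approx, detail_1, ..., detail_finest]
    loglike = 0.0
    for m, c in enumerate(coeffs[1:], start=1):        # detail coefficients, coarse to fine
        var = sigma_white**2 + sigma_red**2 * 2.0**(-gamma * m)   # assumed scale dependence
        loglike += -0.5 * np.sum(c**2 / var + np.log(2.0 * np.pi * var))
    return loglike

# Maximize this over the transit-model parameters (e.g., the midtransit time)
# together with the noise parameters (sigma_white, sigma_red, gamma).
resid = np.random.default_rng(0).standard_normal(1024)    # residuals of a trial model
print(wavelet_loglike(resid, sigma_white=1.0, sigma_red=0.5))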

[ascl:2009.008] Paramo: PArticle and RAdiation MOnitor

Paramo (PArticle and RAdiation MOnitor) numerically solves the Fokker-Planck kinetic equation, which is used to model the dynamics of a particle distribution function, with a robust implicit method that properly models acceleration processes and accounts for accurate cooling coefficients (e.g., radiative cooling with Klein-Nishina corrections). The numerical solution at every time step is used to calculate radiation processes, namely synchrotron and inverse Compton (IC) emission, with sophisticated numerical techniques, yielding the multi-wavelength spectral evolution of the system.
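
As a worked illustration of an implicit Fokker-Planck update (not Paramo's actual scheme), the sketch below advances the cooling term $\partial n/\partial t = \partial(\dot{\gamma} n)/\partial\gamma$ with a synchrotron-like $\dot{\gamma} \propto \gamma^2$ using backward Euler; the grid, cooling constant, and discretization are assumptions.

import numpy as np

def implicit_cool_step(n, gamma, dt, C=1e-3):
    """One backward-Euler step for dn/dt = d(gdot*n)/dgamma, with gdot = C*gamma^2."""
    gdot = C * gamma**2                      # cooling rate
    dg = np.gradient(gamma)                  # local bin widths
    n_new = np.empty_like(n)
    flux_in = 0.0                            # no particles flow in above the top bin
    # Sweep from the highest-energy bin downward: cooling moves particles to
    # lower gamma, so the implicit system is solved by simple back substitution.
    for i in range(len(n) - 1, -1, -1):
        n_new[i] = (n[i] + dt * flux_in / dg[i]) / (1.0 + dt * gdot[i] / dg[i])
        flux_in = gdot[i] * n_new[i]
    return n_new

gamma = np.logspace(1, 6, 200)               # Lorentz-factor grid
n = gamma**-2 * np.exp(-gamma / 1e5)         # toy injected distribution
for _ in range(100):
    n = implicit_cool_step(n, gamma, dt=0.1) # unconditionally stable update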

[ascl:2008.016] ParaMonte: Parallel Monte Carlo library

ParaMonte contains serial and parallel Monte Carlo routines for sampling mathematical objective functions of arbitrary dimension. It is used to sample the posterior distributions of Bayesian models in data science, machine learning, and scientific inference, and it unifies the automation of Monte Carlo simulations. ParaMonte is user friendly, accessible from multiple programming environments, including C, C++, Fortran, MATLAB, and Python, and offers high performance at runtime and scalability across many parallel processors.
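
A generic random-walk Metropolis sketch of the kind of sampling problem described above, written in plain Python/NumPy; it deliberately does not use ParaMonte's own API, and the target density is a toy example.

import numpy as np

def log_func(x):                              # toy target: standard 4-D Gaussian
    return -0.5 * np.dot(x, x)

def metropolis(log_func, ndim=4, nsteps=20_000, step=0.5, seed=0):
    rng = np.random.default_rng(seed)
    x, logp = np.zeros(ndim), log_func(np.zeros(ndim))
    chain = np.empty((nsteps, ndim))
    for i in range(nsteps):
        prop = x + step * rng.standard_normal(ndim)   # symmetric proposal
        logp_prop = log_func(prop)
        if np.log(rng.random()) < logp_prop - logp:   # Metropolis accept/reject
            x, logp = prop, logp_prop
        chain[i] = x
    return chain

samples = metropolis(log_func)
print(samples.mean(axis=0), samples.std(axis=0))      # roughly 0 and 1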

[ascl:1103.014] ParaView: Data Analysis and Visualization Application

ParaView is an open-source, multi-platform data analysis and visualization application. ParaView users can quickly build visualizations to analyze their data using qualitative and quantitative techniques. The data exploration can be done interactively in 3D or programmatically using ParaView's batch processing capabilities.

ParaView was developed to analyze extremely large datasets using distributed-memory computing resources. It can be run on supercomputers to analyze terascale datasets as well as on laptops for smaller data.
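
ParaView's batch capability is scripted through its paraview.simple module (run with the pvpython interpreter); a minimal rendering sketch, with the output filename as a placeholder, might look like this.

from paraview.simple import Sphere, Show, Render, SaveScreenshot

source = Sphere(ThetaResolution=64, PhiResolution=64)   # a simple built-in test source
Show(source)                                            # add it to the active view
Render()
SaveScreenshot("sphere.png")                            # write the rendered image to disk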

[ascl:1601.010] PARAVT: Parallel Voronoi Tessellation

PARAVT offers massively parallel computation of Voronoi tessellations (VT hereafter) in large data sets. The code is focused on astrophysical applications in which VT densities and neighbor lists are widely used. Several serial Voronoi tessellation codes exist, but no open-source parallel implementations were available to handle the large numbers of particles/galaxies in current N-body simulations and sky surveys. Parallelization is implemented with MPI, and the VT is computed using the Qhull library. The domain decomposition takes into account consistent boundary computation between tasks and supports periodic boundary conditions. In addition, the code computes neighbor lists, Voronoi densities, and Voronoi cell volumes for each particle, and can compute the density on a regular grid.
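
A serial illustration of the Voronoi quantities mentioned above (neighbor lists, cell volumes, and densities) using SciPy's Qhull bindings; this is not PARAVT's MPI implementation and ignores periodic boundaries.

import numpy as np
from scipy.spatial import Voronoi, ConvexHull

points = np.random.default_rng(1).random((500, 3))        # toy particle positions
vor = Voronoi(points)

# Neighbors: particles whose Voronoi cells share a face (ridge).
neighbors = {i: set() for i in range(len(points))}
for p, q in vor.ridge_points:
    neighbors[p].add(q)
    neighbors[q].add(p)

# Voronoi cell volume and density for particles with bounded cells.
volumes = np.full(len(points), np.nan)
for i, region_index in enumerate(vor.point_region):
    region = vor.regions[region_index]
    if region and -1 not in region:                        # skip unbounded cells
        volumes[i] = ConvexHull(vor.vertices[region]).volume
density = 1.0 / volumes                                    # one particle per cell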

[ascl:2007.014] PARS: Paint the Atmospheres of Rotating Stars

PARS (Paint the Atmospheres of Rotating Stars) quickly computes magnitudes and spectra of rotating stellar models. It uses the star's mass, equatorial radius, rotational speed, luminosity, and inclination as input; the models incorporate Roche mass distribution (where all mass is at the center of the star), solid body rotation, and collinearity of effective gravity and energy flux.
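
For context, the Roche model referred to above places all of the stellar mass at the center, so with solid-body rotation at angular velocity $\Omega$ the stellar surface is an equipotential of (a standard relation, not code from PARS)

$$\Phi_{\rm eff}(r,\theta) = -\frac{GM}{r} - \frac{1}{2}\,\Omega^{2} r^{2} \sin^{2}\theta, \qquad \mathbf{g}_{\rm eff} = -\nabla\Phi_{\rm eff},$$

and the collinearity assumption takes the local energy flux to be parallel to $\mathbf{g}_{\rm eff}$.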

[ascl:1502.005] PARSEC: PARametrized Simulation Engine for Cosmic rays

PARSEC (PARametrized Simulation Engine for Cosmic rays) is a simulation engine for fast generation of ultra-high energy cosmic ray data based on parameterizations of common assumptions about UHECR origin and propagation. Implemented are deflections in unstructured turbulent extragalactic fields, energy losses for protons due to photo-pion production and electron-pair production, as well as effects from the expansion of the universe. Additionally, a simple model to estimate propagation effects for iron nuclei is included. Deflections in the Galactic magnetic field are included using a matrix approach with precalculated lenses generated from backtracked cosmic rays. The PARSEC program is based on object-oriented programming paradigms, enabling users to extend the implemented models, and is steerable with a graphical user interface.

[ascl:1208.020] ParselTongue: AIPS Python Interface

ParselTongue is a Python interface to classic AIPS, Obit, and possibly other task-based data reduction packages. It serves as the software infrastructure for part of the ALBUS implementation. It allows you to run AIPS tasks and access AIPS headers and extension tables from Python. There is also support for running Obit tasks and accessing data in FITS files. Full access to the visibilities in AIPS UV data is also available.

[ascl:2110.008] ParSNIP: Parametrization of SuperNova Intrinsic Properties

ParSNIP learns generative models of transient light curves from large datasets of such light curves. It is designed to work with light curves in sncosmo format, using the lcdata package to handle large datasets. The code can be used for classifying transients, estimating cosmological distances, and identifying novel transients.

[ascl:2306.026] Parthenon: Portable block-structured adaptive mesh refinement framework

The Parthenon framework, derived from Athena++ (ascl:1912.005), handles massively-parallel, device-accelerated adaptive mesh refinement. It provides a device first/device resident approach, transparent packing of data across blocks (to reduce/hide kernel launch latency), and direct device-to-device communication via asynchronous, one-sided MPI communication to enable high performance. Parthenon uses an intermediate abstraction layer to hide complexity of device kernel launches, offers support for particles and abstract variable control via metadata tags, and has a flexible plug-in package system.

[ascl:1010.005] Particle module of Piernik MHD code

Piernik is a multi-fluid grid magnetohydrodynamic (MHD) code based on the Relaxing Total Variation Diminishing (RTVD) conservative scheme. The original code has been extended by the addition of dust described within the particle approximation: the dust is treated as a system of interacting particles, and the particles can interact with the gas, which is described as a fluid. Comparison between the test-problem results and the results of fluid simulations made with the Piernik code shows the most important differences between the fluid and particle approximations used to describe the dynamical evolution of dust under astrophysical conditions.
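
A toy sketch of the particle approximation for dust coupled to a gas flow through a simple linear drag law; the drag form, stopping time, and function names are illustrative assumptions, not Piernik's implementation.

import numpy as np

def advance_dust(x, v, gas_velocity, t_stop, dt):
    """Semi-implicit drag kick followed by a drift, for one time step."""
    u = gas_velocity(x)                                # gas velocity at the particle position
    v = (v + dt * u / t_stop) / (1.0 + dt / t_stop)    # implicit solve of dv/dt = (u - v)/t_stop
    x = x + dt * v
    return x, v

gas_velocity = lambda x: np.array([1.0, 0.0])          # uniform toy gas flow
x, v = np.zeros(2), np.zeros(2)                        # dust grain initially at rest
for _ in range(100):
    x, v = advance_dust(x, v, gas_velocity, t_stop=0.5, dt=0.1)
print(v)                                               # velocity approaches the gas velocity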

[ascl:2207.029] ParticleGridMapper: Particle data interpolator

ParticleGridMapper.jl interpolates particle data onto either a Cartesian (uniform) grid or an adaptive mesh refinement (AMR) grid in which each cell contains no more than one particle. The AMR grid can be trimmed with a user-defined maximum level of refinement. Three different interpolation schemes are supported: nearest grid point (NGP), smoothed-particle hydrodynamics (SPH), and meshless finite mass (MFM). The code is parallelized with multi-threading.
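
The simplest of the three schemes, nearest grid point, amounts to depositing each particle into the cell that contains it; the sketch below illustrates this on a uniform Cartesian grid in Python (ParticleGridMapper.jl itself is a Julia package, and these names are not its API).

import numpy as np

def ngp_deposit(positions, masses, ngrid, box_size):
    """Nearest-grid-point mass deposition onto a uniform ngrid^3 mesh."""
    cells = np.floor(positions / box_size * ngrid).astype(int)
    cells = np.clip(cells, 0, ngrid - 1)               # keep edge particles in the box
    field = np.zeros((ngrid, ngrid, ngrid))
    np.add.at(field, (cells[:, 0], cells[:, 1], cells[:, 2]), masses)
    return field / (box_size / ngrid) ** 3             # convert cell mass to density

pos = np.random.default_rng(2).random((10_000, 3))     # toy particles in a unit box
rho = ngp_deposit(pos, np.ones(10_000), ngrid=32, box_size=1.0)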

[ascl:1010.073] partiview: Immersive 4D Interactive Visualization of Large-Scale Simulations

In dense clusters a bewildering variety of interactions between stars can be observed, ranging from simple encounters to collisions and other mass-transfer encounters. With faster and special-purpose computers like GRAPE, the amount of data per simulation now exceeds 1 TB. Visualizing such data has become a complex 4D data-mining problem, combining space and time, and finding interesting events in these large datasets. We have recently started using the virtual reality simulator installed in the Hayden Planetarium at the American Museum of Natural History to tackle some of these problems. partiview is a program that enables you to visualize and animate particle data. partiview runs on relatively simple desktops and laptops, but is mostly compatible with its big brother VirDir.

[ascl:1809.003] PASTA: Python Astronomical Stacking Tool Array

PASTA performs median stacking of astronomical sources. Written in Python, it can filter sources, provide stack statistics, generate Karma annotations, format source lists, and read information from stacked Flexible Image Transport System (FITS) images. PASTA was originally written to examine polarization stack properties and includes a Monte Carlo modeler for obtaining true polarized intensity from the observed polarization of a stack. PASTA is also useful as a generic stacking tool, even if polarization properties are not being examined.
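
Median stacking itself is simple to sketch: cut out postage stamps around the source pixel positions and take a pixel-wise median. The helper below is a generic illustration, not PASTA's interface.

import numpy as np

def median_stack(image, xy_pixels, half_size=10):
    """Median-combine square cutouts centered on (x, y) pixel positions."""
    stamps = []
    for x, y in xy_pixels:
        x, y = int(round(x)), int(round(y))
        stamp = image[y - half_size:y + half_size + 1,
                      x - half_size:x + half_size + 1]
        if stamp.shape == (2 * half_size + 1, 2 * half_size + 1):
            stamps.append(stamp)                       # skip sources too close to an edge
    return np.median(stamps, axis=0)

image = np.random.default_rng(3).normal(size=(512, 512))    # toy noise image
stack = median_stack(image, [(100, 200), (300, 310), (400, 50)])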
