ASCL.net

Astrophysics Source Code Library

Making codes discoverable since 1999

Browsing Codes

Results 1-250 of 3434 (3345 ASCL, 89 submitted)

[ascl:2307.050] νHawkHunter: Forecasting of PBH neutrinos

νHawkHunter explores the prospects of detecting neutrinos produced by the evaporation of primordial black holes in ground-based experiments. It makes use of neutrino fluxes from Hawking radiation computed with BlackHawk (ascl:2012.020). νHawkHunter can also be used for Diffuse Supernova Neutrino Background or similar studies by replacing the signal fluxes with the appropriate ones.

[ascl:2306.006] β-SGP: Scaled Gradient Projection algorithm using β-divergence

β-SGP deconvolves an astronomical image with a known Point Spread Function, providing a means of restoring telescopic images degraded by issues ranging from atmospheric turbulence to instrumental aberrations. The code supports improved astrometry, deblending of overlapping sources, faint source detection, identification of point sources near bright extended objects, and other tasks. β-SGP generalizes the Scaled Gradient Projection (SGP) image deconvolution algorithm using β-divergence as a loss function to restore distorted stellar shapes.

[ascl:2202.003] Zwindstroom: Cosmological growth factors from fluid calculations

Zwindstroom computes background quantities and scale-dependent growth factors for cosmological models with free-streaming species, such as massive neutrinos. Following the earlier REPS code (ascl:1612.022), the code uses a Newtonian fluid approximation with external neutrino sound speed to close the Boltzmann hierarchy. Zwindstroom supports multi-fluid models with distinct transfer functions and sound speeds. A flexible Python interface facilitates interaction with CLASS (ascl:1106.020) through classy. There is also a Zwindstroom plugin for the cosmological initial conditions generator monofonIC (ascl:2008.024) that allows for higher-order LPT ICs for massive neutrino simulations in a single step.

[ascl:2106.033] ZWAD: Anomaly detection pipeline

ZWAD (ZTF anomaly detection pipeline) examines data and performs tailored feature extraction. The code then uses machine learning methods to search for outliers and identifies anomalies to be examined for validation by experts. Used with the SNAD ZTF data releases object viewer (ascl:2106.034), the infrastructure helps experts form global views of specific scientifically interesting candidates.

[ascl:2106.034] ztf-viewer: SNAD ZTF data releases object viewer

The SNAD ZTF DR4 object viewer enables quick expert investigation of objects within the public Zwicky Transient Facility (ZTF) data releases. The viewer allows visualization of raw and folded light curves and metadata, as well as cross-match information with the General Catalog of Variable Stars, the International Variable Stars Index, the ATLAS Catalog of Variable Stars, the ZTF Catalog of Periodic Variable Stars, the Transient Name Server, the Open Astronomy Catalogs, the OGLE III Catalog of Variable Stars, the Simbad Astronomical Data Base, Gaia DR2 distances (Bailer-Jones+, 2018), and Vizier. The viewer is also available for ZTF DR2 and ZTF DR3.

[ascl:1011.003] ZPEG: An Extension of the Galaxy Evolution Model PEGASE.2

Photometric redshifts are estimated on the basis of template scenarios with the help of the code ZPEG, an extension of the galaxy evolution model PEGASE.2, available on the PEGASE web site. The spectral energy distribution (SED) templates are computed for nine spectral types including starburst, irregular, spiral and elliptical. Dust, extinction and metal effects are coherently taken into account, depending on evolution scenarios. The sensitivity of results to adding near-infrared colors and IGM absorption is analyzed. A comparison with results of other models without evolution measures the evolution factor, which systematically increases the estimated photometric redshift values by $\Delta z > 0.2$ for z > 1.5. Moreover, we systematically check that the evolution scenarios match observational standard templates of nearby galaxies, implying an age constraint on the stellar population at z=0 for each type. Respecting this constraint makes it possible to significantly improve the accuracy of photometric redshifts by decreasing the well-known degeneracy problem. The method is applied to the HDF-N sample. From fits on SED templates by a $\chi^2$-minimization procedure, not only is the photometric redshift derived but also the corresponding spectral type and the formation redshift $z_{\rm for}$ when stars first formed. Early epochs of galaxy formation z > 5 are found from this new method and results are compared to faint galaxy count interpretations.

[ascl:2203.027] Zoobot: Deep learning galaxy morphology classifier

Zoobot classifies galaxy morphology with Bayesian convolutional neural networks (CNNs). Deep learning models were trained on volunteer classifications; these models were able to both learn from uncertain volunteer responses and predict full posteriors (rather than point estimates) for what volunteers would have said. The code reproduces and improves on the Galaxy Zoo DECaLS automated classifications, and can be fine-tuned for new tasks.

[ascl:2105.010] ZOGY: Python implementation of proper image subtraction

ZOGY performs optimal image subtraction; the code is designed specifically for the MeerLICHT and BlackGEM pipelines, but should also be applicable to images from other telescopes. The module accepts a new and a reference FITS image, runs SExtractor (ascl:1010.064) on them, and finds their WCS solution using Astrometry.net (ascl:1208.001). ZOGY then uses PSFex (ascl:1301.001) to infer the position-dependent PSFs of the images and SWarp (ascl:1010.068) to map the reference image to the new image, and performs optimal image subtraction. This produces the subtracted image, the significance image, the corrected significance image, and the PSF photometry image and associated error image. The inferred PSFs are also used to extract optimal photometry of all sources detected by SExtractor.

[ascl:2306.012] ZodiPy: Zodiacal emission simulations in timestreams or HEALPix for solar system observers

ZodiPy simulates the zodiacal emission in intensity that an arbitrary solar system observer is predicted to see given an interplanetary dust model, either in the form of timestreams or full-sky HEALPix maps. Written in Python, the code makes zodiacal emission simulations more accessible by providing a simple interface to existing models.

[ascl:1202.002] ZODIPIC: Zodiacal Cloud Image Synthesis

ZODIPIC synthesizes images of exozodiacal clouds. By default, ZODIPIC creates an image of the solar zodiacal cloud as seen from 10 pc, but it contains many parameters that are tweakable from the command line, making it a handy general-purpose model for optically-thin debris disks that yields both accurate images and photometric information simultaneously. Written in IDL, ZODIPIC includes dust with real optical constants, supports user-specified dust maps, and can compute images as seen through a linear polarizer.

[ascl:1511.022] ZInCo: Zoomed Initial Conditions

ZInCo manipulates existing initial conditions (ICs) compatible with GADGET-2/3 (ascl:0003.001), allowing different flavors of zoom-in simulations rather than producing new ICs from scratch. The code can manipulate initial conditions with multiple types of particles, unlike the vast majority of zoom-in IC codes available, preserving their properties and random field. This allows ZInCo to take advantage of other codes that produce ICs featuring a broad range of different cosmologies; it can also be used on existing ICs even in the unlikely case nothing is known about their properties. The code is written in C++ and parallelized using MPI.

[ascl:2306.017] Zeus21: Simulations of 21-cm at cosmic dawn

Zeus21 (Zippy Early-Universe Solver for 21-cm) captures the nonlocal and nonlinear physics of cosmic dawn to create an effective model for the 21-cm power spectrum and global signal. The code takes advantage of the approximate log-normality of the star-formation rate density (SFRD) during cosmic dawn to compute the 21-cm power spectrum analytically. It agrees with more expensive semi-numerical simulations to roughly 10% precision, but has negligible computational cost (of order seconds) and memory requirements by comparison. Zeus21 pairs well with data from HERA, but can be used for any 21-cm inference or prediction. Its capabilities include finding the 21-cm power spectrum (at a broad range of k and z), the global signal, IGM temperatures (Tk, Ts, Tcolor), neutral fraction xHI, Lyman-alpha fluxes, and the evolution of the SFRD, all across cosmic dawn z=5-35. It can also predict UVLFs for HST and JWST. Zeus21 can use three different astrophysical models, one of which emulates 21cmFAST (ascl:1102.023), and can vary the cosmology through CLASS (ascl:1106.020).

[ascl:2008.010] zeus: Lightning Fast MCMC

Zeus is a pure-Python implementation of the Ensemble Slice Sampling method. Ensemble Slice Sampling improves upon Slice Sampling by bypassing some of that method's difficulties; it also exploits an ensemble of parallel walkers, thus making it immune to linear correlations. Zeus offers fast and robust Bayesian inference and efficient Markov Chain Monte Carlo without hand-tuning. The code provides excellent performance in terms of autocorrelation time and convergence rate, can scale to multiple CPUs without any extra effort, and includes convergence diagnostics.
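
As a quick illustration of this hand-tuning-free interface, a minimal sketch following the package's documented EnsembleSampler/run_mcmc/get_chain pattern (the Gaussian target is a stand-in for a real posterior):

    import numpy as np
    import zeus

    # Toy log-probability: an isotropic Gaussian, standing in for a real posterior.
    def log_prob(x):
        return -0.5 * np.sum(x**2)

    nwalkers, ndim = 10, 2
    start = 0.1 * np.random.randn(nwalkers, ndim)  # initial walker positions

    sampler = zeus.EnsembleSampler(nwalkers, ndim, log_prob)
    sampler.run_mcmc(start, 1000)                  # 1000 iterations per walker
    chain = sampler.get_chain(flat=True)           # flattened posterior samples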

[ascl:1102.028] ZEUS-MP/2: Computational Fluid Dynamics Code

ZEUS-MP is a multiphysics, massively parallel, message-passing implementation of the ZEUS code. ZEUS-MP offers an MHD algorithm that is better suited for multidimensional flows than the ZEUS-2D module by virtue of modifications to the method of characteristics scheme first suggested by Hawley & Stone. This MHD module is shown to compare quite favorably to the TVD scheme described by Ryu et al. ZEUS-MP is the first publicly available ZEUS code to allow the advection of multiple chemical (or nuclear) species. Radiation hydrodynamic simulations are enabled via an implicit flux-limited radiation diffusion (FLD) module. The hydrodynamic, MHD, and FLD modules can be used, singly or in concert, in one, two, or three space dimensions. In addition, so-called 1.5D and 2.5D grids, in which the "half-D" denotes a symmetry axis along which a constant but nonzero value of velocity or magnetic field is evolved, are supported. Self-gravity can be included either through the assumption of a GM/r potential or through a solution of Poisson's equation using one of three linear solver packages (conjugate gradient, multigrid, and FFT) provided for that purpose. Point-mass potentials are also supported.

Because ZEUS-MP is designed for large simulations on parallel computing platforms, considerable attention is paid to the parallel performance characteristics of each module in the code. Strong-scaling tests involving pure hydrodynamics (with and without self-gravity), MHD, and RHD are performed in which large problems (256³ zones) are distributed among as many as 1024 processors of an IBM SP3. Parallel efficiency is a strong function of the amount of communication required between processors in a given algorithm, but all modules are shown to scale well on up to 1024 processors for the chosen fixed problem size.

[ascl:1306.014] ZEUS-2D: Simulation of fluid dynamical flows

ZEUS-2D is a hydrodynamics code based on ZEUS which adds a covariant differencing formalism and algorithms for compressible hydrodynamics, MHD, and radiation hydrodynamics (using flux-limited diffusion) in Cartesian, cylindrical, or spherical polar coordinates.

[ascl:1102.027] ZENO: N-body and SPH Simulation Codes

The ZENO software package integrates N-body and SPH simulation codes with a large array of programs to generate initial conditions and analyze numerical simulations. Written in C, the ZENO system is portable between Mac, Linux, and Unix platforms. It is in active use at the Institute for Astronomy (IfA), at NRAO, and possibly elsewhere.

Zeno programs can perform a wide range of simulation and analysis tasks. While many of these programs were first created for specific projects, they embody algorithms of general applicability and embrace a modular design strategy, so existing code is easily applied to new tasks. Major elements of the system include: structured data file utilities that facilitate basic operations on binary data, including import/export of ZENO data to other systems; snapshot generation routines that create particle distributions with various properties (systems with user-specified density profiles can be realized in collisionless or gaseous form, and multiple spherical and disk components may be set up in mutual equilibrium); and snapshot manipulation routines that permit the user to sift, sort, and combine particle arrays, translate and rotate particle configurations, and assign new values to data fields associated with each particle.

Simulation codes include both pure N-body and combined N-body/SPH programs. Pure N-body codes are available in both uniprocessor and parallel versions. SPH codes offer a wide range of options for gas physics, including isothermal, adiabatic, and radiating models. Snapshot analysis programs calculate temporal averages, evaluate particle statistics, measure shapes and density profiles, compute kinematic properties, and identify and track objects in particle distributions. Visualization programs generate interactive displays and produce still images and videos of particle distributions; the user may specify arbitrary color schemes and viewing transformations.

[ascl:1911.012] Zeltron: Explicit 3D relativistic electromagnetic Particle-In-Cell code

Zeltron is an explicit 3D relativistic electromagnetic Particle-In-Cell code suited for modeling particle acceleration in astrophysical plasmas. The code is efficiently parallelized with the Message Passing Interface, and can be run on a laptop computer or on multiple cores on current supercomputers. Zeltron takes into account the effect of the radiation reaction force on the motion of the particles; it assigns variable weights to the macro-particles to model particle density gradients, and does not strictly conserve the total energy. The code uses linear interpolation to deposit the charges and currents generated by each particle at the nodes of the computational grid, and computes the charge and current densities for Maxwell's equations. Zeltron contains a large set of analysis tools, including plasma density, particle spectrum, optically thin synchrotron and inverse Compton spectra, angular distributions, and stress-energy tensor.

[ascl:1512.016] ZeldovichRecon: Halo correlation function using the Zeldovich approximation

ZeldovichRecon computes the halo correlation function using the Zeldovich approximation. It includes three variants: 1.) zelrecon.cpp, which computes the various contributions to the correlation function; 2.) zelrecon_ctypes.cpp, which is designed to be called from Python using the ctypes library; and 3.) a version which implements the "ZEFT" formalism of "A Lagrangian effective field theory" [arxiv:1506.05264], including the alpha term described in that paper.
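
The ctypes variant can be driven from Python along these lines; this is a hedged sketch of the general ctypes calling pattern, in which the library name, symbol, and signature are illustrative assumptions rather than the code's documented interface:

    import ctypes
    import numpy as np

    # Load the compiled shared library (name and path are assumptions).
    lib = ctypes.CDLL("./zelrecon.so")

    # Declare the hypothetical C signature before calling it.
    lib.compute_xi.argtypes = [
        ctypes.POINTER(ctypes.c_double),  # input separations
        ctypes.POINTER(ctypes.c_double),  # output correlation function
        ctypes.c_int,                     # number of points
    ]
    lib.compute_xi.restype = None

    r = np.linspace(10.0, 150.0, 64)
    xi = np.zeros_like(r)
    lib.compute_xi(r.ctypes.data_as(ctypes.POINTER(ctypes.c_double)),
                   xi.ctypes.data_as(ctypes.POINTER(ctypes.c_double)),
                   r.size)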

[ascl:1605.016] zeldovich-PLT: Zel'dovich approximation initial conditions generator

zeldovich-PLT generates Zel'dovich approximation (ZA) initial conditions (i.e. first-order Lagrangian perturbation theory) for cosmological N-body simulations, optionally applying particle linear theory (PLT) corrections.
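
For reference, the ZA displaces particles from their Lagrangian positions q along a displacement field scaled by the linear growth factor, $x(q,t) = q + D(t)\,\Psi(q)$, where $D(t)$ is the linear growth factor and $\Psi(q)$ is derived from the gradient of the initial potential; the PLT corrections adjust this growth for the discreteness of the particle lattice.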

[ascl:2205.012] Zelda: Generate correlation functions and power spectra from a galaxy catalog

The Zelda command-line tool extracts correlation functions in velocity space from a galaxy catalog. Zelda is modular, extendable, and can be generalized to produce power spectra and to work in position space. Written in C, it was heavily inspired by the cosmological Boltzmann code CLASS (ascl:1106.020). Zelda is a parallel code using the OpenMP standard.

[ascl:1110.005] ZEBRA: Zurich Extragalactic Bayesian Redshift Analyzer

The current version of the Zurich Extragalactic Bayesian Redshift Analyzer (ZEBRA) combines and extends several of the classical approaches to produce accurate photometric redshifts down to faint magnitudes. In particular, ZEBRA uses the template-fitting approach to produce Maximum Likelihood and Bayesian redshift estimates based on: (1.) An automatic iterative technique to correct the original set of galaxy templates to best represent the SEDs of real galaxies at different redshifts; (2.) A training set of spectroscopic redshifts for a small fraction of the photometric sample; and (3.) An iterative technique for Bayesian redshift estimates, which extracts the full two-dimensional redshift and template probability function for each galaxy.

[ascl:1404.002] ZDCF: Z-Transformed Discrete Correlation Function

The cross-correlation function (CCF) is commonly employed in the study of AGN, where it is used to probe the structure of the broad line region by line reverberation, to study the continuum emission mechanism by correlating multi-waveband light curves, and to seek correlations between variability and other AGN properties. The z-transformed discrete correlation function (ZDCF) is a method for estimating the CCF of sparse, unevenly sampled light curves. Unlike the commonly used interpolation method, it does not assume that the light curves are smooth, and it provides errors on its estimates.
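
The z-transform in question is Fisher's transform of the correlation coefficient $r$, $z = \frac{1}{2}\ln\frac{1+r}{1-r}$, which makes the distribution of the binned correlation estimate approximately normal and thereby yields well-behaved errors on sparse data.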

[ascl:2310.007] zCluster: Measure photometric redshifts for galaxy clusters

zCluster measures galaxy cluster photometric redshifts using data from broadband photometry in large public surveys, given a priori knowledge of the cluster position. The code retrieves and uses redshift probability distributions in order to create a projected two-dimensional density map of a targeted galaxy cluster, which is later convolved with a Gaussian kernel to smooth the map. zCluster also produces photometric redshift estimates and galaxy density maps for any point in the sky using the included zField tool.

[ascl:1907.017] ZChecker: Zwicky Transient Facility moving target checker for short object lists

ZChecker finds, measures, and visualizes known comets in the Zwicky Transient Facility time-domain survey. Images of targets are identified using on-line ephemeris generation and survey metadata. The photometry of the targets is measured and the images are processed with temporal filtering to highlight morphological variations in time.

[ascl:1807.017] ZBARYCORR: Barycentric redshift calculator

ZBARYCORR determines the barycentric redshift (zB) for a given star. It calculates the positions and velocities of solar system objects, applies the rotation, precession, nutation, and polar motion of the Earth, applies the stellar motion using the Markwardt library (ascl:1807.016), Shapiro delay, and light-travel term, and finally calculates the quantity zB—the barycentric correction independent of the measured redshift. A Python wrapper, BARYCORR (ascl:1807.018), is available.
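
As a hedged note on how such a correction enters: redshifts compound multiplicatively rather than additively, so the barycenter-corrected redshift follows from $1 + z_{\rm true} = (1 + z_{\rm meas})(1 + z_B)$, which is why zB can be computed independently of the measured redshift.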

[ascl:1607.012] ZASPE: Zonal Atmospheric Stellar Parameters Estimator

ZASPE (Zonal Atmospheric Stellar Parameters Estimator) computes the atmospheric stellar parameters (Teff, log(g), [Fe/H] and vsin(i)) from echelle spectra via least squares minimization with a pre-computed library of synthetic spectra. The minimization is performed only in the spectral zones most sensitive to changes in the atmospheric parameters. The uncertainties and covariances computed by ZASPE assume that the principal source of error is the systematic mismatch between the observed spectrum and the synthetic one that produces the best fit. ZASPE requires a grid of synthetic spectra and can use any pre-computed library with minor modifications.

[ascl:1602.003] ZAP: Zurich Atmosphere Purge

ZAP (Zurich Atmosphere Purge) provides sky subtraction for integral field spectroscopy; its approach is based on principal component analysis (PCA) developed for the Multi Unit Spectrographic Explorer (MUSE) integral field spectrograph. ZAP employs filtering and data segmentation to enhance the inherent capabilities of PCA for sky subtraction. ZAP reduces sky emission residuals while robustly preserving the flux and line shapes of astronomical sources; this method works in a variety of observational situations from sparse fields with a low density of sources to filled fields in which the target source fills the field of view. With the inclusion of both of these situations the method is generally applicable to many different science cases and should also be useful for other instrumentation.

[ascl:1011.022] yt: A Multi-Code Analysis Toolkit for Astrophysical Simulation Data

yt is an open source, community-developed volumetric analysis and visualization toolkit. Originally designed for handling Enzo's (ascl:1010.072) structure adaptive mesh refinement (AMR) data, yt has been extended to work with numerous simulation methods and simulation codes including Orion, RAMSES (ascl:1011.007), and FLASH (ascl:1010.082). Analysis and visualization with yt are oriented around physically relevant quantities rather than quantities native to data representation on-disk or in-memory. yt can be used for projections, multivariate volume rendering, multi-dimensional histograms, halo finding, light cone generation and topologically-connected isocontour identification.

yt benefits from the contributions of a broad range of community members, and a full list of credits for the code can be found on the yt website or in the source repository.

[ascl:1203.010] Youpi: YOUr processing PIpeline

Youpi is a portable, easy to use web application providing high level functionalities to perform data reduction on scientific FITS images. Built on top of various open source reduction tools released to the community by TERAPIX (http://terapix.iap.fr), Youpi can help organize data, manage processing jobs on a computer cluster in real time (using Condor) and facilitate teamwork by allowing fine-grain sharing of results and data. Youpi is modular and comes with plugins which perform, from within a browser, various processing tasks such as evaluating the quality of incoming images (using the QualityFITS software package), computing astrometric and photometric solutions (using SCAMP), resampling and co-adding FITS images (using SWarp) and extracting sources and building source catalogues from astronomical images (using SExtractor). Youpi is useful for small to medium-sized data reduction projects; it is free and is published under the GNU General Public License.

[ascl:2208.025] Yonder: Data denoising and reconstruction

YONDER uses singular value decomposition to perform low-rank data denoising and reconstruction. It takes a tabular data matrix and an error matrix as input and returns a denoised version of the original dataset as output. The approach enables a more accurate data analysis in the presence of uncertainties. Consequently, this package can be used as a simple toolbox to perform astronomical data cleaning.
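
The underlying operation is a truncated SVD reconstruction. A generic sketch of the technique follows (plain NumPy, not YONDER's actual API; the rank here is an arbitrary choice):

    import numpy as np

    def lowrank_denoise(data, rank):
        # Truncated SVD: keep only the `rank` largest singular values.
        U, s, Vt = np.linalg.svd(data, full_matrices=False)
        return (U[:, :rank] * s[:rank]) @ Vt[:rank]

    noisy = np.random.randn(100, 20)           # stand-in tabular data matrix
    denoised = lowrank_denoise(noisy, rank=3)  # rank-3 approximation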

[ascl:1312.009] YODA: Yet another Object Detection Application

YODA, implemented in C++, performs object detection, photometry and star-galaxy classification on astronomical images. Developed specifically to cope with the multi-band imaging data common in modern extragalactic imaging surveys, it is modular and therefore easily adaptable to specific needs. YODA works under conditions of inhomogeneous background noise across the detection frame, and performs accurate aperture photometry in image sets not sharing a common coordinate system or pixel scale as is often the case in present-day extragalactic survey work.

[ascl:1403.012] YNOGKM: Time-like geodesics in the Kerr-Newmann Spacetime calculations

YNOGKM (Yun-Nan observatories geodesic in a Kerr-Newman spacetime for massive particles) performs fast calculation of time-like geodesics in the Kerr-Newman (K-N) spacetime; it is a direct extension of YNOGK (Yun-Nan observatories geodesic Kerr), which calculates null geodesics in a Kerr spacetime. The four Boyer-Lindquist coordinates and proper time are expressed as functions of a parameter p semi-analytically using Weierstrass' and Jacobi's elliptic functions and integrals. The elliptic integrals are computed by Carlson's elliptic integral method, which guarantees the fast speed of the code. The source Fortran file ynogkm.f90 contains the modules constants, rootfind, ellfunction, and blcoordinates.

[ascl:1305.008] YNOGK: Calculating null geodesics in the Kerr spacetime

YNOGK, written in Fortran, calculates null geodesics in the Kerr spacetime. It uses Weierstrass' and Jacobi's elliptic functions to express all coordinates and affine parameters as analytical and numerical functions of a parameter $p$, which is an integral value along the geodesic. Information about the turning points does not need to be specified in advance by the user, allowing applications such as imaging, the calculation of line profiles, and the observer-emitter problem to be treated as root-finding problems. Elliptic integrations are computed by Carlson's elliptic integral method, which allows fast computation.

[ascl:1908.022] YMW16: Electron-density model

YMW16 models the distribution of free electrons in the Galaxy, the Magellanic Clouds and the inter-galactic medium and can be used to estimate distances for real or simulated pulsars and fast radio bursts (FRBs) based on their position and dispersion measure. The Galactic model is based on 189 pulsars that have independently determined distances as well as dispersion measures, whereas simpler models are used for the electron density in the MC and the IGM.

[ascl:1306.016] Yaxx: Yet another X-ray extractor

Yaxx is a Perl script that facilitates batch data processing using Perl open source software and commonly available software such as CIAO/Sherpa, S-lang, SAS, and FTOOLS. For Chandra and XMM analysis it includes automated spectral extraction, fitting, and report generation. Yaxx can be run without climbing an extensive learning curve; even so, Yaxx is highly configurable and can be customized to support complex analysis. Yaxx uses template files and takes full advantage of the unique Sherpa / S-lang environment to make much of the processing user configurable. Although originally developed with an emphasis on X-ray data analysis, Yaxx evolved to be a general-purpose pipeline scripting package.

[ascl:2212.011] xwavecal: Wavelength calibrating echelle spectrographs

The xwavecal library automatically wavelength calibrates echelle spectrographs for high precision radial velocity work. The routines are designed to operate on data with extracted 1D spectra. The library provides a convenience function which returns a list of wavelengths from just a list of spectral feature coordinates (pixel and order) and a reference line list. The returned wavelengths are the wavelengths of the measured spectral features under the best fit wavelength model. xwavecal also provides line identification and spectral reduction utilities. The library is modular; each step of the wavelength calibration is a stage which can be disabled by removing the associated line in the config.ini file. Wavelength calibrating data which already have extracted spectra requires only the wavelength calibration stages, while using the full experimental pipeline means enabling the other data reduction stages, such as overscan subtraction.

[ascl:9910.008] XSTAR: A program for calculating conditions and spectra of photoionized gases

XSTAR is a command-driven, interactive, computer program for calculating the physical conditions and emission spectra of photoionized gases. It may be applied in a wide variety of astrophysical contexts. Stripped to essentials, its job may be described simply: A spherical gas shell surrounding a central source of ionizing radiation absorbs some of this radiation and reradiates it in other portions of the spectrum; XSTAR computes the effects on the gas of absorbing this energy, and the spectrum of reradiated light. The user supplies the shape and strength of the incident continuum, the elemental abundances in the gas, its density or pressure, and its thickness; the code can be directed to return any of a large number of derived quantities, including (but not limited to) the ionization balance and temperature, opacity tables, and emitted line and continuum fluxes.

[ascl:9910.005] XSPEC: An X-ray spectral fitting package

It has been over a decade since the first paper was published containing results determined using the general X-ray spectral-fitting program XSPEC. Since then XSPEC has become the most widely used program for this purpose, being the de facto standard for the ROSAT and the de jure standard for the ASCA and XTE satellites. Probably the most important features of XSPEC are the large number of theoretical models available and the facilities for adding new models.

[ascl:1805.016] xspec_emcee: XSPEC-friendly interface for the emcee package

XSPEC_EMCEE is an XSPEC-friendly interface for emcee (ascl:1303.002). It carries out MCMC analyses of X-ray spectra in the X-ray spectral fitting program XSPEC (ascl:9910.005). It can run multiple XSPEC processes simultaneously, speeding up the analysis, and can switch to parameterizing norm parameters in log space.
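
Sampling a strictly positive normalization in log space is a standard reparameterization; here is a generic illustration of the idea on a toy power-law spectrum (not xspec_emcee's actual interface):

    import numpy as np

    # Toy data, standing in for a real X-ray spectrum.
    energies = np.linspace(1.0, 10.0, 50)
    counts = 5.0 * energies**-1.7
    errors = np.full(50, 0.05)

    def log_like(params):
        log10_norm, slope = params   # the norm is sampled in log space
        norm = 10.0**log10_norm      # guarantees positivity; steps are scale-free
        model = norm * energies**(-slope)
        return -0.5 * np.sum(((counts - model) / errors)**2)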

[ascl:1207.008] xSonify: Sonification software

xSonify maps scientific data to acoustic sequences. Listening to data can help discover patterns in huge amounts of data. Written in Java, xSonify allows visually impaired people to examine numerical data for patterns. The data can be imported from local files or from remote databases via the Internet. Single results of measurements from spacecraft instruments can be selected by their corresponding variables in a specific time frame. The results are transformed into MIDI sequences which can be played with a selection of different instruments from a soundbank. Another software module enables xSonify to convert the sonified data into other sound formats to make it easier to archive and exchange the Sonification results with other scientists.

[submitted] Xsmurf - Measuring multifractal properties with the continuous wavelet transform modulus maxima (WTMM) method

Xsmurf is a software package written in C/Tcl/Tk that implements the continuous wavelet transform modulus maxima method, an image processing tool for measuring fractal and multifractal properties in experimental and simulation data.
Multifractal analysis is described on the following page: http://www.scholarpedia.org/article/Wavelet-based_multifractal_analysis

Xsmurf has been used in multiple applications in astrophysics, e.g.:
- analysis of solar magnetograms for characterizing complexity of evolving regions
- fractal/multifractal nature and anisotropic structure of Galactic atomic hydrogen (H I)
- analysis of simulation data (velocity field, ...) of turbulent flow

[ascl:1509.001] XSHPipelineManager: Wrapper for the VLT/X-shooter Data Reduction Pipeline

XSHPipelineManager provides a framework for reducing spectroscopic observations taken by the X-shooter spectrograph at the Very Large Telescope. This Python code wraps recipes developed by the European Southern Observatory and runs the full X-shooter data reduction pipeline. The code offers full flexibility in terms of which data reduction recipes to include and which calibration files to use. During the data reduction chain, restart files are saved, making it possible to restart at any step in the chain.

[ascl:2301.009] Xpol: Pseudo-Cl power spectrum estimator

Xpol computes angular power spectra based on cross-correlation between maps and covariance matrices. The code is written in C and is fully MPI parallelized in CPU and memory, using spherical harmonic transforms by s2hat (ascl:1110.013). It has been used to derive CMB and dust power spectra for Archeops and CMB, dust, CIB, SZ, and SZ-CIB power spectra for Planck, among others.
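
In the pseudo-$C_\ell$ approach, the spectrum measured on a masked sky is related to the true spectrum through a mode-coupling matrix that the estimator must invert, $\langle \tilde{C}_\ell \rangle = \sum_{\ell'} M_{\ell\ell'} C_{\ell'}$ (standard pseudo-$C_\ell$ notation, not a statement about Xpol's internal conventions).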

[ascl:1212.002] XPHOT: Estimation of properties of weak X-ray sources

XPHOT is an IDL implementation of a non-parametric method for estimating the apparent and intrinsic broad-band fluxes and absorbing X-ray column densities of weak X-ray sources. XPHOT is intended for faint sources with more than ∼5-7 counts but fewer than ∼100-300 counts, above which parametric spectral fitting methods will be superior. This method is similar to the long-standing use of color-magnitude diagrams in optical and infrared astronomy, with X-ray median energy replacing color index and X-ray source counts replacing magnitude. Though XPHOT was calibrated for thermal spectra characteristic of stars in young stellar clusters, recalibration should be possible for some other classes of faint X-ray sources such as extragalactic active galactic nuclei.

[ascl:1502.019] XPCell: Convective plasma cells simulator

XPCell simulates convective plasma cells. The program is implemented in two versions, one using GNUPLOT and the other OpenGL. XPCell offers a GUI for entering the parameters required by the program.

[ascl:2110.022] XookSuut: Model circular and noncircular flows on 2D velocity maps

XookSuut models circular and noncircular flows on resolved velocity maps. The code performs nonparametric fits to derive kinematic models without assuming analytical functions for the different velocity components of the models. It recovers the circular and radial motions in galaxies in dynamical equilibrium and can derive the noncircular motions induced by oval distortions, such as those produced by stellar bars. XookSuut explores the full N-dimensional parameter space to derive mean values of the parameters; this combined method efficiently recovers the constant parameters and the different kinematic components.

[ascl:1402.020] XNS: Axisymmetric equilibrium configuration of neutron stars

XNS solves for the axisymmetric equilibrium configuration of neutron stars in general relativity. It can model differentially rotating neutron stars with magnetic fields that are either purely toroidal, purely poloidal, or in the mixed twisted-torus configuration. Einstein's equations are solved using the XCFC approximation for the metric in spherical coordinates.

[ascl:1303.021] Xmatch: GPU Enhanced Astronomic Catalog Cross-Matching

Xmatch is a cross-platform, multi-GPU tool which allows for extremely fast cross-matching between two astronomical catalogs. It is capable of asynchronously managing multiple GPUs, making it ideal for workstation and cluster environments.

[ascl:1704.012] XID+: Next generation XID development

XID+ is a prior-based source extraction tool which carries out photometry in the Herschel SPIRE (Spectral and Photometric Imaging Receiver) maps at the positions of known sources. It uses a probabilistic Bayesian framework that provides a natural way to include prior information, and employs the Bayesian inference tool Stan to obtain the full posterior probability distribution on flux estimates.

[ascl:1511.004] Xgremlin: Interferograms and spectra from Fourier transform spectrometers analysis

Xgremlin is a hardware and operating system independent version of the data analysis program Gremlin used for Fourier transform spectrometry. Xgremlin runs on PCs and workstations that use the X11 window system, including cygwin in Windows. It is used to Fourier transform interferograms, plot spectra, perform phase corrections, perform intensity and wavenumber calibration, and find and fit spectral lines. It can also be used to construct synthetic spectra, subtract continua, compare several different spectra, and eliminate ringing around lines.

[ascl:1807.031] xGDS: Exploration Ground Data Systems

xGDS (Exploration Ground Data Systems) synthesizes real world data (from sensors, robots, ROVs, mobile devices, etc) and human observations into rich, digital maps and displays for analysis, decision making, and collaboration. xGDS processes and maps data (including video) in real-time during operations and uses it to support live role-based geolocated note taking. Notes can be used to search for and display important data. The software enables real-time analysis of data, permitting one to make inferences and plan new data collection operations while still in the field.

[ascl:2301.012] XGA: Efficient analysis of XMM observations

XGA (X-ray: Generate and Analyse) analyzes X-ray sources observed by the XMM-Newton space telescope. It is based around declaring different types of source and sample objects which correspond to real X-ray sources, finding all available data, and then insulating the user from the tedious generation and basic analysis of X-ray data products. XGA generates photometric products and spectra for individual sources, or whole samples, with just a few lines of code. Though XGA is not itself a pipeline, pipelines for complex analysis can be built on top of it. XGA provides an easy to use (and parallelized) Python interface with XMM's Science Analysis System (ascl:1404.004), as well as with XSPEC (ascl:9910.005). All XMM products and fit results are read into an XGA source storage structure, thus avoiding the need to leave a Python environment at any point during the analysis. This module also supports more complex analyses for specific object types such as the easy generation of scaling relations, the measurement of gas masses for galaxy clusters, and the PSF correction of images.

[ascl:1502.018] XFGLENSES: Gravitational lens visualizer

XFGL visualizes gravitational lenses. It has an XFORM GUI and is completely interactive with the mouse. It uses OpenGL for the simulations.

[ascl:1112.013] XEphem: Interactive Astronomical Ephemeris

XEphem is a scientific-grade interactive astronomical ephemeris package for UNIX-like systems. Written in C, X11 and Motif, it is easily ported to other systems. XEphem computes heliocentric, geocentric and topocentric information for all objects and has built-in support for all planets; the moons of Mars, Jupiter, Saturn, Uranus and Earth; the central meridian longitude of Mars and Jupiter; Saturn's rings; and Jupiter's Great Red Spot. It allows user-defined objects including stars, deepsky objects, asteroids, comets and Earth satellites; provides special efficient handling of large catalogs including Tycho, Hipparcos and GSC; displays data in configurable tabular formats in conjunction with several interactive graphical views; and displays a night-at-a-glance 24 hour graphic showing when any selected objects are up. It also displays 3-D stereo Solar System views that are particularly well suited for visualizing comet trajectories, quickly finds all close pairs of objects in the sky, and sorts and prints all catalogs with very flexible criteria for creating custom observing lists.

[ascl:1107.010] XDSPRES: CL-based package for Reducing OSIRIS Cross-dispersed Spectra

The CL-based package XDSPRES is a complete reduction facility for cross-dispersed spectra taken with the Ohio State Infrared Imager/Spectrometer, as installed at the SOAR telescope. This instrument provides spectra in the range between 1.2 µm and 2.35 µm in a single exposure, with resolving power of R ~ 1200. XDSPRES consists of two tasks, namely xdflat and doosiris. The former is a completely automated code for preparing normalized flat field images from raw flat field exposures. Doosiris provides a complete reduction pipeline that requires a minimum of user interaction. The user guide explains the general steps towards a fully reduced spectrum.

[ascl:1302.016] XDQSO: Photometic quasar probabilities and redshifts

XDQSO, written in IDL, calculates photometric quasar probabilities to mimic SDSS-III's BOSS quasar target selection, or photometric redshifts for quasars, in either three redshift ranges (z < 2.2; 2.2 ≤ z ≤ 3.5; z > 3.5) or arbitrary redshift ranges.

[ascl:1708.026] XDGMM: eXtreme Deconvolution Gaussian Mixture Modeling

XDGMM uses Gaussian mixtures to perform density estimation of noisy, heterogeneous, and incomplete data using extreme deconvolution (XD) algorithms, and is compatible with scikit-learn machine learning methods. It implements both the astroML and Bovy et al. (2011) algorithms, and extends the BaseEstimator class from scikit-learn so that cross-validation methods work. It allows the user to produce a conditioned model if values of some parameters are known.
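
A usage sketch based on its stated scikit-learn-style design (the exact keyword names are assumptions; consult the package documentation):

    import numpy as np
    from xdgmm import XDGMM

    X = np.random.randn(500, 2)                   # noisy observations
    Xerr = np.tile(0.1 * np.eye(2), (500, 1, 1))  # per-point covariance matrices

    model = XDGMM(n_components=3)  # scikit-learn-style estimator
    model.fit(X, Xerr)             # deconvolves the measurement noise while fitting
    samples = model.sample(100)    # draw from the underlying (denoised) density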

[ascl:1907.029] XDF-GAN: Mock astronomical survey generator

XDF-GAN generates mock galaxy surveys with a Spatial Generative Adversarial Network (SGAN)-like architecture. Mock galaxy surveys are generated from data that is preprocessed as little as possible (preprocessing is only a 99.99th percentile clipping). The outputs can also be tessellated together to create a very large survey, limited in size only by the RAM of the generation machine.

[ascl:1810.016] XCLASS: eXtended CASA Line Analysis Software Suite

XCLASS (eXtended CASA Line Analysis Software Suite) extends CASA (ascl:1107.013) with new functions for modeling interferometric and single dish data. It provides a tool for calculating synthetic spectra by solving the radiative transfer equation for an isothermal object in one dimension, taking into account the finite source size and dust attenuation. It also includes an interface for MAGIX (ascl:1303.009) to find the parameter set that most closely reproduces the data.

[ascl:1312.005] XAssist: Automatic analysis of X-ray astrophysics data

XAssist provides automation of X-ray astrophysics, specifically data reprocessing, source detection, and preliminary spatial, temporal and spectral analysis for each source with sufficient counts, with an emphasis on galaxies. It has been used for data from Chandra, ROSAT, XMM-Newton, and various other projects.

[ascl:2102.005] X-PSI: X-ray Pulse Simulation and Inference

X-PSI simulates rotationally-modified (pulsed) surface X-ray emission from neutron stars, taking into account relativistic effects on the emitted radiation. This can then be used to perform Bayesian statistical inference on real or simulated astronomical data sets. Model parameters of interest may include neutron star mass and radius (useful to constrain the properties of ultradense nuclear matter) or the system geometry and properties of the hot emitting surface-regions. To achieve this, X-PSI couples code for likelihood functionality (simulation) with existing open-source software for posterior sampling (inference).

[ascl:1601.019] WzBinned: Binned and uncorrelated estimates of dark energy EOS extractor

WzBinned extracts binned and uncorrelated estimates of dark energy equation of state w(z) using Type Ia supernovae Hubble diagram and other cosmological probes and priors. It can handle an arbitrary number of input distance modulus data (entered as an input file SNdata.dat) and various existing cosmological information.

[ascl:2310.003] wwz: Weighted wavelet z-transform code

wwz provides a Python 3 implementation of the Foster weighted wavelet z-transform, a wavelet-based method for periodicity analysis of unevenly sampled data.

[ascl:1909.011] WVTICs: SPH initial conditions using Weighted Voronoi Tesselations

WVTICs generates glass-like initial conditions for Smoothed Particle Hydrodynamics. Relaxation of the particle distribution is done using an algorithm based on Weighted Voronoi Tesselations; additional particle reshuffling can be enabled to improve over- and undersampled maxima/minima. The WVTICs package includes a full suite of analytical test problems.

[ascl:1211.003] WVT Binning: Spatially adaptive 2-D binning

WVT Binning is a spatially adaptive 2-dimensional binning algorithm designed to bin sparse X-ray data. It can handle background subtracted, exposure corrected data to produce intensity images, hardness ratio maps, or temperature maps. The algorithm is an extension of Cappellari & Copin's (2003) Voronoi binning code and uses Weighted Voronoi Tesselations (WVT) to produce a very compact binning structure with a constant S/N per bin. The bin size adjusts to the required resolution in single-pixel steps, which minimizes the scatter around the target S/N. The code is very versatile and can in principle be applied to any type of data. The user manual contains instructions on how to apply the WVT binning code to X-ray data and how to extend the algorithm to other problems.
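
The core accretion criterion, growing a bin until it reaches a target S/N, can be sketched generically; this illustrates only the stopping rule, not the full WVT algorithm with its geometric regularization (the names below are illustrative):

    import numpy as np

    def accrete_bin(signal, noise, order, target_sn):
        """Accumulate pixels, in the given order, until S/N reaches target_sn.

        signal, noise: per-pixel signal and 1-sigma noise arrays;
        order: pixel indices sorted by proximity to the bin seed (assumed given).
        """
        s_tot, var_tot, members = 0.0, 0.0, []
        for idx in order:
            s_tot += signal[idx]
            var_tot += noise[idx]**2
            members.append(idx)
            if s_tot / np.sqrt(var_tot) >= target_sn:
                break
        return members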

[ascl:1207.014] wvrgcal: Correction of atmospheric phase fluctuations in ALMA observations

wvrgcal is a command line front end to LibAIR, the atmospheric inference library for phase correction of ALMA data using water vapour radiometers, and is the user-facing application for calculating atmospheric phase correction from WVR data. wvrgcal outputs a CASA gain calibration table which can then be applied to the observed data in the usual way.

Note: wvrgcal has been incorporated into the NRAO CASA suite.

[ascl:2209.013] wsynphot: Synthetic photometry package using quantities

wsynphot provides a broad set of filters, indexed by observation facility, instrument, and wavelength range, together with functions for imaging stars to produce a filter curve showing the transmission of light at each wavelength value. It can create a filter curve object, plot the curve, and perform calculations on the filter curve object.

[ascl:1402.029] wssa_utils: WSSA 12 micron dust map utilities

wssa_utils contains utilities for accessing the full-sky, high-resolution maps of the WSSA 12 micron data release. Implementations in both Python and IDL are included. The code allows users to sample values at (longitude, latitude) coordinates of interest with ease, transparently mapping coordinates to WSSA tiles and performing interpolation. The wssa_utils software also serves to define a unique WSSA 12 micron flux at every location on the sky.

[ascl:1010.071] WSHAPE: Gravitational Softening and Adaptive Mass Resolution

Pairwise forces between particles in cosmological N-body simulations are generally softened to avoid hard collisions. Physically, this softening corresponds to treating the particles as diffuse clouds rather than point masses. For particles of unequal mass (and hence unequal softening length), computing the softened force involves a nontrivial double integral over the volumes of the two particles. We show that Plummer force softening is consistent with this interpretation of softening while spline softening is not. We provide closed-form expressions and numerical implementation for pairwise gravitational force laws for pairs of particles of general softening scales $\epsilon_1$ and $\epsilon_2$ assuming the commonly used cloud profiles: NGP, CIC, TSC, and PQS. Similarly, we generalize the Plummer force law to pairs of particles of general softenings. We relate our expressions to the Gaussian, Plummer and spline force softenings known from the literature. Our expressions allow possible inclusion of point-like particles such as stars or supermassive black holes.
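
For a single softening scale, the Plummer-softened pairwise force referred to above is $F(r) = G m_1 m_2\, r / (r^2 + \epsilon^2)^{3/2}$; the expressions provided here generalize this form to unequal scales $\epsilon_1 \neq \epsilon_2$ and to the NGP, CIC, TSC, and PQS cloud profiles.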

[ascl:1408.023] WSClean: Widefield interferometric imager

WSClean (w-stacking clean) is a fast generic widefield imager. It uses the w-stacking algorithm and can make use of the w-snapshot algorithm. It supports full-sky imaging and proper beam correction for homogeneous dipole arrays such as the MWA. WSClean allows Högbom and Cotton-Schwab cleaning, and can clean polarizations jointly. All operations are performed on the CPU; it is not specialized for GPUs.

[ascl:1304.004] Wqed: Lightcurve Analysis Suite

Wqed (pronounced "Wicked") is a set of tools developed by the Delaware Asteroseismic Research Center (DARC) to simplify the process of reducing time-series CCD data on variable stars. It does not provide tools to measure the brightness of stars in individual frames, focusing instead on what comes next:

    - selecting and removing data lost to cloud,
    - removing the effects of light cloud and seeing variations,
    - keeping track of what star a given data set refers to, and when that data was taken, and
    - performing barycentric corrections to data.

[ascl:2112.023] wpca: Weighted Principal Component Analysis in Python

wpca, written in Python, offers several implementations of Weighted Principal Component Analysis and uses an interface similar to scikit-learn's sklearn.decomposition.PCA. Implementations include a direct decomposition of a weighted covariance matrix to compute principal vectors followed by a weighted least squares optimization to compute principal components, as well as an iterative expectation-maximization approach that solves simultaneously for the principal vectors and principal components of weighted data. It also includes a standard non-weighted PCA implemented using the singular value decomposition, primarily for testing.
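
A usage sketch mirroring the scikit-learn-style interface described above (the weights keyword and its expected shape are assumptions; check the package documentation):

    import numpy as np
    from wpca import WPCA

    X = np.random.randn(200, 5)               # data matrix
    W = np.random.uniform(0.5, 1.0, X.shape)  # per-element weights, same shape as X

    pca = WPCA(n_components=2)                # mirrors sklearn.decomposition.PCA
    Y = pca.fit_transform(X, weights=W)       # components of the weighted data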

[ascl:1907.030] Wōtan: Stellar detrending methods

Wōtan provides free and open source algorithms to remove trends from time-series data automatically as an aid to search efficiently for transits in stellar light curves from surveys. The toolkit helps determine empirically the best tool for a given job, serving as a one-stop solution for various smoothing tasks.

[submitted] World Observatory

World Observatory visualizes S/N-versus-cost tradeoffs for large optical and near-infrared telescopes. Both mid-latitude and Arctic/Antarctic sites can be considered; the intent is a simple simulation to grow intuition for where major capital costs lie relative to key observatory design choices, and against expected scientific performance at various sites. User-defined unit costs for (a possibly "effective") roadway, enclosure, aperture, focal length, and adaptive optics can be scaled up for polar sites, and down for better seeing and lower sky brightness in K-band. Observatory models and results are immediately displayed side-by-side. Either point-source-detection S/N or recovery of bulge-to-total ratios in a simulated galaxy survey are divided by the total project cost, thus providing a universal metric.

[ascl:1204.014] WOMBAT: sWift Objects for Mhd BAsed on Tvd

WOMBAT (sWift Objects for Mhd BAsed on Tvd) is an astrophysical fluid code implementing a non-relativistic MHD TVD scheme; an extension for relativistic MHD has been added. The code operates on 1, 2, and 3D Eulerian meshes (Cartesian and cylindrical coordinates) with magnetic field divergence restriction controlled by a constrained transport (CT) scheme. The user can tune code performance to a given processor based on chip cache sizes. Proper settings yield significant speed-ups due to efficient cache reuse.

[ascl:1212.007] WOLF: FITS file processor

WOLF processes FITS files and generates photometry files, annotated JPGs, opacity maps, backgrounds, transient detections, and luminance-change detections. This software was used to process data for the Night Sky Live project.

[ascl:2011.012] wobble: Time-series spectra analyzer

wobble analyzes time-series spectra. It was designed with stabilized extreme precision radial velocity (EPRV) spectrographs in mind, but is highly flexible and extensible to a variety of applications. It takes a data-driven approach to deriving radial velocities and requires no a priori knowledge of the stellar spectrum or telluric features.

[ascl:1312.002] WND-CHARM: Multi-purpose image classifier

WND-CHARM quantitatively analyzes morphologies of galaxy mergers and associates galaxies by their morphology. It computes a large set (up to ~2700) of image features for each image based on the WND-CHARM algorithm. It can then split the images into training and test sets and classify them. The software extracts the image content descriptor from raw images, image transforms, and compound image transforms. The most informative features are then selected, and the feature vector of each image is used for classification and similarity measurement using Fisher discriminant scores and a variation of Weighted Nearest Neighbor analysis. WND-CHARM's results compare favorably to the performance of task-specific algorithms developed for tested datasets. The simple user interface allows researchers who are not knowledgeable in computer vision methods and have no background in computer programming to apply image analysis to their data.

[ascl:1204.001] WM-basic: Modeling atmospheres of hot stars

WM-basic is an easy-to-use interface to a program package which models the atmospheres of Hot Stars (and also SN and GN). The release comprises all programs required to calculate model atmospheres which especially yield ionizing fluxes and synthetic spectra. WM-basic is a native 32-bit application, conforming to the Multiple Documents Interface (MDI) standards for Windows XP/2000/NT/9x. All components of the program package have been compiled with Digital Visual Fortran V6.6(Pro) and Microsoft Visual C++.

[ascl:1812.001] WISP: Wenger Interferometry Software Package

WISP (Wenger Interferometry Software Package) is a radio interferometry calibration, reduction, imaging, and analysis package. WISP is a collection of Python code implemented through CASA (ascl:1107.013). Its generic and modular framework is designed to handle any continuum or spectral line radio interferometry data.

[ascl:1806.004] WiseView: Visualizing motion and variability of faint WISE sources

WiseView renders image blinks of Wide-field Infrared Survey Explorer (WISE) coadds spanning a multi-year time baseline in a browser. The software allows for easy visual identification of motion and variability for sources far beyond the single-frame detection limit, a key threshold not surmounted by many studies. WiseView transparently gathers small image cutouts drawn from many terabytes of unWISE coadds, facilitating access to this large and unique dataset. Users need only input the coordinates of interest and can interactively tune parameters including the image stretch, colormap and blink rate. WiseView was developed in the context of the Backyard Worlds: Planet 9 citizen science project, and has enabled hundreds of brown dwarf candidate discoveries by citizen scientists and professional astronomers.

[ascl:9910.007] WINGSPAN: A WINdows Gamma-ray SPectral Analysis program

WINGSPAN is a program written to analyze spectral data from the Burst and Transient Source Experiment (BATSE) on NASA's Compton Gamma-Ray Observatory. Data files in the FITS (BFITS) format are suitable for input into the program. WINGSPAN can be used to view and manipulate event time histories or count spectra, and also has the capability to perform spectral deconvolution via a standard forward folding model fitting technique (Levenberg-Marquardt algorithm). Although WINGSPAN provides many functions for data manipulation, the program was designed to allow users to easily plug in their own external IDL routines. These external routines have access to all data read from the FITS files, as well as selection intervals created in the main part of WINGSPAN (background intervals and model, etc).

[ascl:2109.013] WimPyDD: WIMP direct-detection rates predictor

WimPyDD calculates accurate predictions for the expected rates in WIMP direct-detection experiments within the framework of Galilean-invariant non-relativistic effective theory. The object-oriented customizable Python code handles different scenarios including inelastic scattering, WIMPs of arbitrary spin, and a generic velocity distribution of WIMPs in the Galactic halo.

[ascl:2112.010] WIMpy_NREFT: Dark Matter direct detection rates detector

WIMpy_NREFT (also known as WIMpy) calculates Dark Matter-Nucleus scattering rates in the framework of non-relativistic effective field theory (NREFT). It currently supports operators O1 to O11, as well as millicharged and magnetic dipole Dark Matter. It can be used to generate spectra for Xenon, Argon, Carbon, Germanium, Iodine and Fluorine targets. WIMpy_NREFT also includes functionality to calculate directional recoil spectra, as well as signals from coherent neutrino-nucleus scattering (including fluxes from the Sun, atmosphere and diffuse supernovae).

[ascl:2203.030] Wigglewave: Linearized governing equations solver

Wigglewave uses a finite difference method to solve the linearized governing equations for a torsional Alfvén wave propagating in a plasma with negligible plasma beta and in a force-free axisymmetric magnetic field with no azimuthal component, embedded in a high density divergent tube structure. Wigglewave is fourth-order in time and space, using a fourth-order central difference scheme for calculating spatial derivatives and a fourth-order Runge-Kutta (RK4) scheme for updating at each timestep. The solutions calculated are the perturbations to the velocity, v, and to the magnetic field, b. All variables are calculated over a uniform grid in radius r and height z.

[ascl:1010.084] WhiskyMHD: Numerical Code for General Relativistic Magnetohydrodynamics

Whisky is a code to evolve the equations of general relativistic hydrodynamics (GRHD) and magnetohydrodynamics (GRMHD) in 3D Cartesian coordinates on a curved dynamical background. It was originally developed by and for members of the EU Network on Sources of Gravitational Radiation and is based on the Cactus Computational Toolkit. Whisky can also implement adaptive mesh refinement (AMR) if compiled together with Carpet.

Whisky has grown from earlier codes such as GR3D and GRAstro_Hydro, but has been rewritten to take advantage of some of the latest research performed here in the EU. The motivation behind Whisky is to compute gravitational radiation waveforms for systems that involve matter. Examples would include the merger of a binary system containing a neutron star, which are expected to be reasonably common in the universe and expected to produce substantial amounts of radiation. Other possible sources are given in the projects list.

[ascl:1911.018] WhereWolf: Galaxy/(sub)Halo ghosting tool for N-body simulations

WhereWolf tracks (sub)haloes even if they have been lost by a halo finder in cosmological simulations and supplements halo catalogs such as VELOCIraptor (ascl:1911.020) with these recovered (sub)haloes. The code can improve measurements of the subhalo/halo mass function and present estimates of the distribution of radii at which subhaloes merge.

[ascl:2101.003] whereistheplanet: Predicting positions of directly imaged companions

whereistheplanet predicts the locations of directly imaged companions (mainly exoplanets and brown dwarfs) based on past orbital fits to the data. This tool helps coordinate follow-up observations to characterize their properties, as precise pointing of the instrument is often needed. It uses orbitize! (ascl:1910.009) as a backend. whereistheplanet is available as a Python API, a command line tool, and a web form at whereistheplanet.com.
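
A hedged sketch of the Python interface, assuming the documented predict_planet helper (companion-name strings, argument names, and return values may differ between versions):

    import whereistheplanet

    # Predict where beta Pic b should appear at a given epoch (MJD);
    # the helper reports predicted offsets, separation, and position angle.
    whereistheplanet.predict_planet("betpicb", time_mjd=59000.0)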

[ascl:1404.013] WFC3UV_GC: WFC3 UVIS geometric-distortion correction

WFC3UV_GC is an improved geometric-distortion solution for the Hubble Space Telescope UVIS channel of Wide Field Camera 3 for ten broad-band filters. The solution is made up of three parts:

1.) a 3rd-order polynomial to deal with the general optical distortion;
2.) a table of residuals that accounts for both chip-related anomalies and fine-structure introduced by the filter; and,
3.) a linear transformation to put the two chips into a convenient master frame.
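
Schematically, applying such a three-part solution to a raw pixel position proceeds as in the Python sketch below (the array and coefficient names are placeholders, not the WFC3UV_GC interface):

    import numpy as np

    def correct_position(x, y, cx, cy, resid_dx, resid_dy, lin):
        # 1) third-order bivariate polynomial for the global optical
        #    distortion; cx, cy hold the ten coefficients per axis
        terms = np.array([x**i * y**j for i in range(4) for j in range(4 - i)])
        xc, yc = cx @ terms, cy @ terms
        # 2) residual table sampled on a coarse pixel grid (nearest entry)
        iy = min(int(yc // 64), resid_dx.shape[0] - 1)
        ix = min(int(xc // 64), resid_dx.shape[1] - 1)
        xc += resid_dx[iy, ix]
        yc += resid_dy[iy, ix]
        # 3) linear transformation of each chip into the master frame
        a, b, c, d, x0, y0 = lin
        return a*xc + b*yc + x0, c*xc + d*yc + y0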

[ascl:2301.003] WF4Py: Gravitational waves waveform models in pure Python language

WF4Py implements frequency-domain gravitational wave waveform models in pure Python, thus enabling parallelization over multiple events at a time. Waveforms in WF4Py are built as classes; the functions take as input dictionaries containing the parameters of the events to analyze and provide Fourier-domain waveform models. All the waveforms are accurately checked against their implementation in LALSuite (ascl:2012.021) and are a core element of GWFAST (ascl:2212.001).

[ascl:1705.015] WeirdestGalaxies: Outlier Detection Algorithm on Galaxy Spectra

WeirdestGalaxies finds the weirdest galaxies in the Sloan Digital Sky Survey (SDSS) by using a basic outlier detection algorithm. It uses an unsupervised Random Forest (RF) algorithm to assign a similarity measure (or distance) between every pair of galaxy spectra in the SDSS. It then uses the distance matrix to find the galaxies that have the largest distance, on average, from the rest of the galaxies in the sample, and defines them as outliers.
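
The final selection step reduces to simple operations on the distance matrix; a toy numpy version, with random numbers standing in for the RF-derived distances:

    import numpy as np

    rng = np.random.default_rng(0)
    D = rng.random((500, 500))        # stand-in for the NxN distance matrix
    D = (D + D.T) / 2.0               # symmetrize
    np.fill_diagonal(D, 0.0)

    mean_dist = D.mean(axis=1)              # average distance to all others
    outliers = np.argsort(mean_dist)[-10:]  # indices of the ten "weirdest"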

[ascl:1010.069] WeightWatcher: Code to Produce Control Maps

WeightWatcher is a program that combines weight-maps, flag-maps and polygon data in order to produce control maps which can directly be used in astronomical image-processing packages like Drizzle, SWarp or SExtractor.

[ascl:1010.042] WeightMixer: Hybrid Cross-power Spectrum Estimation

This code, which requires HEALPix 2.x (ascl:1107.018), generates power spectrum estimators from WMAP 5-year maps and produces hybrid cross- and auto-power spectra and covariances from general foreground-cleaned maps. In addition, it can simulate combined maps or combinations of maps for individual detectors, perform MPI spherical transforms of arrays of maps, calculate coupling matrices, and more. The code includes all of LensPix (ascl:1102.025), the MPI framework used for the spherical transforms (based on HEALPix).

[ascl:1609.007] Weighted EMPCA: Weighted Expectation Maximization Principal Component Analysis

Weighted EMPCA performs principal component analysis (PCA) on noisy datasets with missing values. Estimates of the measurement error are used to weight the input data such that the resulting eigenvectors, when compared to classic PCA, are more sensitive to the true underlying signal variations rather than being pulled by heteroskedastic measurement noise. Missing data are simply limiting cases of weight = 0. The underlying algorithm is a noise weighted expectation maximization (EM) PCA, which has additional benefits of implementation speed and flexibility for smoothing eigenvectors to reduce the noise contribution.
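
For the leading eigenvector, the weighted EM iteration alternates between solving for per-observation coefficients and updating the eigenvector. A compact numpy sketch (single component only; not the packaged implementation):

    import numpy as np

    def wempca_first_vector(X, W, niter=25):
        # X: (nobs, nvar) mean-subtracted data; W: matching inverse-variance
        # weights, with W = 0 marking missing entries
        phi = np.random.default_rng(1).normal(size=X.shape[1])
        phi /= np.linalg.norm(phi)
        for _ in range(niter):
            c = (W * X @ phi) / (W @ phi**2)        # E-step: coefficients
            phi = (W * X).T @ c / (W.T @ c**2)      # M-step: eigenvector
            phi /= np.linalg.norm(phi)
        return phi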

[ascl:1504.007] WebbPSF: James Webb Space Telescope PSF Simulation Tool

WebbPSF provides a PSF simulation tool in a flexible and easy-to-use software package implemented in Python. Functionality includes support for spectroscopic modes of JWST NIRISS, MIRI, and NIRSpec, including modeling of slit losses and diffractive line spread functions.
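
Typical usage of the Python package, following its documented interface (the filter and options here are arbitrary examples):

    import webbpsf

    nc = webbpsf.NIRCam()            # instrument object
    nc.filter = 'F200W'              # select a filter
    psf = nc.calc_psf(fov_arcsec=2.0, oversample=4)  # returns a FITS HDUList
    psf.writeto('nircam_f200w_psf.fits', overwrite=True)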

[ascl:2307.051] WeakLensingQML: Quadratic Maximum Likelihood estimator applied to Weak Lensing

WeakLensingQML implements the Quadratic Maximum Likelihood (QML) estimator, applies it to simulated cosmic shear data, and compares the results to a Pseudo-Cl implementation. The package computes and saves relevant data files for later processing, such as the fiducial cosmic shear power spectrum used in the analysis and the sky mask, and computes an analytic version of the QML's covariance matrix. The core of the package implements a conjugate-gradient approach for the quadratic estimator and is parallelized for maximum performance. The code relies on the Eigen linear algebra package and the HEALPix spherical harmonic transform library. A post-processing script analyzes the results and compares the QML's estimates with those from the Pseudo-Cl estimator; it then produces an array of plots highlighting the results.
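
The expensive core of a QML estimator is repeatedly solving C x = d for a large covariance matrix C. A generic matrix-free conjugate-gradient pattern (illustrative only, using scipy rather than the package's Eigen-based solver):

    import numpy as np
    from scipy.sparse.linalg import LinearOperator, cg

    n = 1000
    rng = np.random.default_rng(3)
    A = rng.normal(size=(n, n))
    C = A @ A.T / n + np.eye(n)      # symmetric positive-definite stand-in
    d = rng.normal(size=n)           # data vector

    op = LinearOperator((n, n), matvec=lambda v: C @ v)  # matrix-free C.v
    x, info = cg(op, d)              # iteratively solves C x = d
    assert info == 0                 # 0 indicates convergence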

[ascl:2109.021] WeakLensingDeblending: Weak lensing fast simulations and analysis of blended objects

WeakLensingDeblending provides weak lensing fast simulations and analysis for the LSST Dark Energy Science Collaboration. It is used to study the effects of overlapping sources on shear estimation, photometric redshift algorithms, and deblending algorithms. Users can run their own simulations (of LSST and other surveys) or download the galaxy catalog and simulation outputs to use with their own code.

[ascl:2206.016] wdwarfdate: White dwarfs age calculator

wdwarfdate derives the Bayesian total age of a white dwarf from an effective temperature and a surface gravity. It runs a chain of models assuming single star evolution and estimates the following parameters and their uncertainties: total age of the object, mass and cooling age of the white dwarf, and mass and lifetime of the progenitor star.

[ascl:2007.013] wdtools: Spectroscopic analysis of white dwarfs

wdtools characterizes the atmospheric parameters of white dwarfs using spectroscopic data. The flagship class is the generative fitting pipeline (GFP), which fits ab initio theoretical models to observed spectra in a Bayesian framework using high-speed neural networks to interpolate synthetic spectra.

[ascl:2206.012] WDPhotTools: White Dwarf Photometric SED fitter and luminosity function builder

WDPhotTools generates color-color diagrams and color-magnitude diagrams in various photometric systems, plots cooling profiles from different models, and computes theoretical white dwarf luminosity functions based on the built-in or supplied models of the (1) initial mass function, (2) total stellar evolution lifetime, (3) initial-final mass relation, and (4) white dwarf cooling time. The software has three main parts: the formatters that handle the output models from various works in the formats in which they are downloaded; the photometric fitter that solves for the WD parameters based on the photometry, with or without distance and reddening; and the generator of the white dwarf luminosity function in bolometric magnitudes or in any of the photometric systems available from the atmosphere model.

[ascl:2307.037] WDMWaveletTransforms: Fast forward and inverse WDM wavelet transforms

WDMWaveletTransforms implements the fast forward and inverse WDM wavelet transforms in Python from both the time and frequency domains. The frequency domain transforms are inherently faster and more accurate. The wavelet domain->frequency domain and frequency domain->wavelet domain transforms are nearly exact numerical inverses of each other for a variety of inputs tested, including Gaussian random noise. WDMWaveletTransforms has both command line and Python interfaces.
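
A hedged round-trip sketch; the import path and function names below follow the package README and should be checked against the installed version:

    import numpy as np
    # assumed module layout and function names (per the README):
    from WDMWaveletTransforms.wavelet_transforms import (
        transform_wavelet_time, inverse_wavelet_time)

    Nf, Nt = 64, 256                         # frequency x time pixels
    data = np.random.default_rng(2).normal(size=Nf * Nt)  # Gaussian noise
    wave = transform_wavelet_time(data, Nf, Nt)   # time -> wavelet domain
    recon = inverse_wavelet_time(wave, Nf, Nt)    # wavelet -> time domain
    print(np.max(np.abs(recon - data)))      # should be near round-off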

[ascl:1807.020] wdmerger: Simulate white dwarf mergers with CASTRO

wdmerger simulates binary white dwarf mergers (and related events) in CASTRO (ascl:1105.010) and provides useful information on the viability of mergers of white dwarfs as a progenitor for Type Ia supernovae.

[ascl:1806.012] WDEC: White Dwarf Evolution Code

WDEC (White Dwarf Evolution Code), written in Fortran, offers a fast and fairly easy way to produce models of white dwarfs. The code evolves hot (~100,000 K) input models down to a chosen effective temperature by relaxing the models to be solutions of the equations of stellar structure. The code can also be used to obtain g-mode oscillation modes for the models.

[ascl:2004.004] WD: Wilson-Devinney binary star modeling

Wilson-Devinney binary star modeling code (WD) is a complete package for modeling binary stars and their eclipses and consists of two main modules. The LC module generates light and radial velocity curves, spectral line profiles, images, conjunction times, and timing residuals; the DC module handles differential corrections, performing parameter adjustment of light curves, velocity curves, and eclipse timings by the Least Squares criterion. WD handles eccentric orbits and asynchronous rotation, and can compute velocity curves (with proximity and eclipse effects). It offers options for detailed reflection and nonlinear (logarithmic law) limb darkening, adjustment of spot parameters, an optional provision for spots to drift over the surface, and can follow light curve development over large numbers of orbits. Absolute flux solutions allow Direct Distance Estimation (DDE), and there are improved solutions for ellipsoidal variables and for eclipsing binaries (EBs) with very shallow eclipses. Absolute flux solutions also can estimate temperatures of both EB components under suitable circumstances.

[ascl:1109.015] WCSTools: Image Astrometry Toolkit

WCSTools is a package of programs and a library of utility subroutines for setting and using the world coordinate systems (WCS) in the headers of the most common astronomical image formats, FITS and IRAF .imh, to relate image pixels to sky coordinates. In addition to dealing with image WCS information, WCSTools has extensive catalog search, image header manipulation, and coordinate and time conversion tasks. This software is all written in very portable C, so it should compile and run on any computer with a C compiler.

[ascl:1108.003] WCSLIB and PGSBOX

WCSLIB is a C library, supplied with a full set of Fortran wrappers, that implements the "World Coordinate System" (WCS) standard in FITS (Flexible Image Transport System). It also includes a PGPLOT-based routine, PGSBOX, for drawing general curvilinear coordinate graticules and a number of utility programs.

[ascl:2311.001] wcpy: Wavelength Calibrator

The graphical user interface Wavelength Calibrator facilitates wavelength calibration. Although developed for astronomical data reduction, it can also be used in any place where wavelength calibration is needed.

[ascl:2206.024] Wavetrack: Arbitrary time-evolving solar object recognition and tracking

Wavetrack recognizes and tracks CME shock waves, filaments, and other solar objects. The code creates base images by averaging a series of images a few minutes prior to the start of the eruption and constructs base difference images by subtracting base images from the current raw image of the sequence. This enhances the change in intensity caused by coronal bright fronts, omits static details, and reduces noise. Wavetrack then chooses an appropriate intensity interval and decomposes the base difference or running difference image with an A-Trous wavelet transform, where each wavelet coefficient is obtained by convolving the image array with a corresponding iteration of the wavelet kernel. When the maximum value of the wavelet coefficients for a connected set of pixels satisfies certain conditions, this region is considered a structure on the respective wavelet coefficient. Separate stand-alone object masks are obtained with a clustering algorithm, and objects are renumbered according to the number of the quadrant they belong to at each iteration.
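
The à trous decomposition itself is compact; a generic Python sketch with the usual B3-spline kernel (illustrative, not Wavetrack's code):

    import numpy as np
    from scipy.ndimage import convolve

    def atrous(image, nlevels):
        # starlet / a-trous decomposition: image = sum of planes + residual
        b3 = np.array([1., 4., 6., 4., 1.]) / 16.0   # B3-spline taps
        c = image.astype(float)
        planes = []
        for j in range(nlevels):
            k = np.zeros(4 * 2**j + 1)   # dilate the kernel with zeros
            k[::2**j] = b3
            smooth = convolve(convolve(c, k[None, :]), k[:, None])
            planes.append(c - smooth)    # wavelet coefficient plane w_j
            c = smooth
        return planes, c                 # detail planes and final residual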

[ascl:2307.038] WarpX: Time-based electromagnetic and electrostatic Particle-In-Cell code

WarpX is an advanced electromagnetic & electrostatic Particle-In-Cell code. It supports many features including Perfectly-Matched Layers (PML), mesh refinement, and the boosted-frame technique. A highly-parallel and highly-optimized code, WarpX can run on GPUs and multi-core CPUs, includes load balancing capabilities, and scales to the largest supercomputers.

[ascl:1807.002] Warpfield: Winds And Radiation Pressure: Feedback Induced Expansion, colLapse and Dissolution

Warpfield (Winds And Radiation Pressure: Feedback Induced Expansion, colLapse and Dissolution) calculates shell dynamics and shell structure simultaneously for isolated massive clouds (≥10^5 M⊙). This semi-analytic 1D feedback model scans a large range of physical parameters (gas density, star formation efficiency, and metallicity) to estimate escape fractions of ionizing radiation f_esc,I, the minimum star formation efficiency ε_min required to drive an outflow, and recollapse time-scales for clouds that are not destroyed by feedback.

[ascl:2207.019] walter: Predictor for the number of resolved stars in a given observation from RST

walter calculates the number density of stars detected in a given observation aiming to resolve a stellar population. The code also calculates the exposure time needed to reach certain population features, such as the horizontal branch, and provides an estimate of the crowding limit. walter was written with the expectation that such calculations will be very useful for planning surveys with the Nancy Grace Roman Space Telescope (RST, formerly WFIRST).

[ascl:2108.004] WaldoInSky: Anomaly detection algorithms for time-domain astronomy

WaldoInSky finds anomalous astronomical light curves and their analogs. The package contains four methods: an adaptation of the Unsupervised Random Forest for anomaly detection in light curves that operates on the light curve points and their power spectra; two manifold-learning methods (t-SNE and UMAP) that operate on the DMDT maps (image representations of the light curves), and that can be used to find analog light curves in the low-dimensional representation; and an Isolation Forest method for evaluating approaches of light curve pre-processing, before they are passed to the anomaly detectors. WaldoInSky also contains code for random sparsification of light curves.

[ascl:2301.021] WALDO: Waveform AnomaLy DetectOr

WALDO (Waveform AnomaLy DetectOr) flags possible anomalous Gravitational Waves from Numerical Relativity catalogs using deep learning. It uses a U-Net architecture to learn the waveform features of a dataset. After computing the mismatch between those waveforms and the neural predictions, WALDO isolates high mismatch evaluations for anomaly search.

[ascl:1710.001] vysmaw: Fast visibility stream muncher

The vysmaw client library facilitates the development of code for processes to tap into the fast visibility stream on the National Radio Astronomy Observatory's Very Large Array correlator back-end InfiniBand network. It uses the vys protocol to allow loose coupling to clients that need to access memory remotely over an InfiniBand network.

[ascl:1704.011] VULCAN: Chemical Kinetics For Exoplanetary Atmospheres

VULCAN describes gaseous chemistry from 500 to 2500 K using a reduced C-H-O chemical network with about 300 reactions. It uses eddy diffusion to mimic atmospheric dynamics and excludes photochemistry, and can be used to examine the theoretical trends produced when the temperature-pressure profile and carbon-to-oxygen ratio are varied.

[ascl:1407.013] VStar: Variable star data visualization and analysis tool

VStar is a multi-platform, easy-to-use variable star data visualization and analysis tool. Data for a star can be read from the AAVSO (American Association of Variable Star Observers) database or from CSV and TSV files. VStar displays light curves and phase plots, can produce a mean curve, and performs time-frequency analysis with the Weighted Wavelet Z-Transform. It offers tools for period analysis, filtering, and other functions.

[ascl:1811.017] VPLanet: Virtual planet simulator

VPLanet (Virtual Planetary Laboratory) simulates planetary system evolution with a focus on habitability. Physical models, typically consisting of ordinary differential equations for stellar, orbital, tidal, rotational, atmospheric, internal, magnetic, climate, and galactic evolution, are coupled together to simulate evolution for the age of a system.

[ascl:1408.016] vpguess: Fitting multiple Voigt profiles to spectroscopic data

vpguess facilitates the fitting of multiple Voigt profiles to spectroscopic data. It is a graphical interface to VPFIT (ascl:1408.015). Originally meant to simplify the process of setting up first guesses for a subsequent fit with VPFIT, it has developed into a full interface to VPFIT. It may also be used independently of VPFIT for displaying data, playing around with data and models, "chi-by-eye" fits, displaying the result of a proper fit, pretty plots, etc. vpguess is written in C, and the graphics are based on PGPLOT (ascl:1103.002).

[ascl:1408.015] VPFIT: Voigt profile fitting program

The VPFIT program fits multiple Voigt profiles (convolved with the instrument profiles) to spectroscopic data that is in FITS or an ASCII file. It requires CFITSIO (ascl:1010.001) and PGPLOT (ascl:1103.002); the tarball includes RDGEN (ascl:1408.017), which can be used with VPFIT to set up the fits, fit the profiles, and examine the result in interactive mode for setting up initial guesses; vpguess (ascl:1408.016) can also be used to set up an initial file.

[ascl:1309.008] VOStat: Statistical analysis of astronomical data

VOStat allows astronomers to use both simple and sophisticated statistical routines on large datasets. This tool uses the large public-domain statistical computing package R. Datasets can be uploaded in either ASCII or VOTABLE (preferred) format. The statistical computations are performed by VOStat and the results are returned to the user.

[ascl:1205.011] VOSpec: VO Spectral Analysis Tool

VOSpec is a multi-wavelength spectral analysis tool with access to spectra, theoretical models and atomic and molecular line databases registered in the VO. The standard tools of VOSpec include line and continuum fitting, redshift and reddening correction, spectral arithmetic and convolution between spectra, equivalent width and flux calculations, and a best fitting algorithm for fitting selected SEDs to a TSAP service. VOSpec offers several display modes (tree vs table) and organising functionalities according to the available metadata for each service, including distance from the observation position.

[ascl:2206.001] vortex: Helmholtz-Hodge decomposition for an AMR velocity field

vortex performs a Helmholtz-Hodge decomposition on vector fields defined on AMR grids, decomposing a vector field into its solenoidal (divergence-less) and compressive (curl-less) parts. It works natively on vector fields defined on Adaptive Mesh Refinement (AMR) grids, so that it can perform the decomposition over large dynamical ranges; it is also applicable to particle-based simulations. As vortex is devised primarily to investigate the properties of the turbulent velocity field in the Intracluster Medium (ICM), it also includes routines for multi-scale filtering of the velocity field.
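
On a uniform periodic grid the decomposition is a simple projection in Fourier space; a numpy sketch of the idea (vortex itself works natively on AMR grids):

    import numpy as np

    def helmholtz_fft(vx, vy, vz, L=1.0):
        # split v into compressive (curl-free) and solenoidal parts
        n = vx.shape[0]
        k = 2*np.pi*np.fft.fftfreq(n, d=L/n)
        kx, ky, kz = np.meshgrid(k, k, k, indexing='ij')
        k2 = kx**2 + ky**2 + kz**2
        k2[0, 0, 0] = 1.0                      # avoid division by zero
        Vx, Vy, Vz = (np.fft.fftn(a) for a in (vx, vy, vz))
        div = kx*Vx + ky*Vy + kz*Vz            # k . V
        comp = [np.real(np.fft.ifftn(ki*div/k2)) for ki in (kx, ky, kz)]
        sol = [v - c for v, c in zip((vx, vy, vz), comp)]
        return sol, comp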

[ascl:1211.006] VorBin: Voronoi binning method

VorBin (Voronoi binning method) bins two-dimensional data to a constant signal-to-noise ratio per bin. It optimally solves the problem of preserving the maximum spatial resolution of general two-dimensional data, given a constraint on the minimum signal-to-noise ratio. The method is available in both IDL and Python.
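
With the Python implementation, binning reduces to a single call; a toy Gaussian image stands in for real data below, and the keyword names follow the documented interface:

    import numpy as np
    from vorbin.voronoi_2d_binning import voronoi_2d_binning

    x, y = np.meshgrid(np.arange(50.), np.arange(50.))
    x, y = x.ravel(), y.ravel()
    signal = np.exp(-((x - 25)**2 + (y - 25)**2) / 200.0)
    noise = np.full_like(signal, 0.01)

    # bin the pixels to a target S/N of 20 per bin; the first return
    # value gives the bin index assigned to every input pixel
    out = voronoi_2d_binning(x, y, signal, noise, 20, plot=False, quiet=True)
    bin_num = out[0]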

[ascl:1309.006] VOPlot: Toolkit for Scientific Discovery using VOTables

VOPlot is a tool for visualizing astronomical data. It was developed in Java and acts on data available in VOTABLE, ASCII and FITS formats. VOPlot is available as a stand-alone version, which is to be installed on the user's machine, or as a web-based version fully integrated with the VizieR database.

[ascl:1309.007] VOMegaPlot: Plotting millions of points

VOMegaPlot, a Java-based tool, has been developed for visualizing astronomical data that is available in VOTable format. It has been specifically optimized for handling large numbers of points (in the range of millions). It has the same look and feel as VOPlot (ascl:1309.006), and the two tools share common functionality.

[ascl:2109.003] VOLKS2: VLBI Observation for transient Localization Keen Searcher

The VOLKS2 (VLBI Observation for transient Localization Keen Searcher) pipeline conducts single pulse searches and localization in regular VLBI observations as well as single pulse detections from known sources in dedicated observations. In VOLKS2, the search and localization are two independent steps. The search step takes the idea of geodetic VLBI post-processing, fully utilizing the cross-spectrum fringe phase information to maximize the signal power; compared with auto-spectrum-based methods, it is able to extract single pulses from highly RFI-contaminated data. The localization uses geodetic VLBI solving methods, deriving the single pulse location by solving a set of linear equations given the relation between the residual delay and the offset from the a priori position.

[ascl:1811.016] VoigtFit: Absorption line fitting for Voigt profiles

VoigtFit fits Voigt profiles to absorption lines. It fits multiple components for various atomic lines simultaneously, allowing parameters to be tied and fixed, and can automatically fit a polynomial continuum model together with the line profiles. A physical model can be used to constrain thermal and turbulent broadening of absorption lines as well as implementing molecular excitation models. The code uses a χ2 minimization approach to find the best solution and offers interactive features such as manual continuum placement locally around each line, manual masking of undesired fitting regions, and interactive definition of velocity components for various elements, improving the ease of estimating initial guesses.

[ascl:1411.003] voevent-parse: Parse, manipulate, and generate VOEvent XML packets

voevent-parse, written in Python, parses, manipulates, and generates VOEvent XML packets; it is built atop lxml.objectify. Details of transients detected by many projects, including Fermi, Swift, and the Catalina Sky Survey, are currently made available as VOEvents, which is also the standard alert format for future facilities such as LSST and SKA. However, working with XML and adhering to the sometimes lengthy VOEvent schema can be a tricky process. voevent-parse provides convenience routines for common tasks, while allowing the user to utilise the full power of the lxml library when required. An earlier version of voevent-parse was part of the pysovo (ascl:1411.002) library.
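
Reading a packet takes a few lines using the library's documented convenience routines:

    import voeventparse as vp

    with open('example_voevent.xml', 'rb') as f:   # any VOEvent packet
        v = vp.load(f)

    print(v.attrib['ivorn'])            # packet identifier
    print(vp.get_event_position(v))     # sky position, if the packet has one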

[ascl:1304.005] VOBOZ/ZOBOV: Halo-finding and Void-finding algorithms

VOBOZ (VOronoi BOund Zones) is an algorithm to find haloes in an N-body dark matter simulation which has little dependence on free parameters.

ZOBOV (ZOnes Bordering On Voidness) is an algorithm that finds density depressions in a set of points without any free parameters or assumptions about shape. It uses the Voronoi tessellation to estimate densities to find both voids and subvoids. It also measures probabilities that each void or subvoid arises from Poisson fluctuations.

[ascl:2009.002] vlt-sphere: Automatic VLT/SPHERE data reduction and analysis

The high-contrast imager SPHERE at the Very Large Telescope combines extreme adaptive optics and coronagraphy to directly image exoplanets in the near-infrared. The vlt-sphere package enables easy reduction of the data coming from IRDIS and IFS, the two near-infrared subsystems of SPHERE. The package relies on the official ESO pipeline (ascl:1402.010), which must be installed separately.

[ascl:1908.014] Vlasiator: Hybrid-Vlasov simulation code

Vlasiator is a six-dimensional, Vlasov-theory-based simulation code. It simulates the entire near-Earth space at a global scale using the kinetic hybrid-Vlasov approach, to study fundamental plasma processes (reconnection, particle acceleration, shocks), and to gain a deeper understanding of space weather.

[ascl:2207.020] vKompth: Time-dependent Comptonization model for black-hole X-ray binaries

vKompth fits the energy-dependent rms-amplitude and phase-lag spectra of low-frequency quasi-periodic oscillations in low mass black-hole X-ray binaries using a variable Comptonization model. The accretion disc is modeled as a multi-temperature blackbody source emitting soft photons which are then Compton up-scattered in a spherical corona, including feedback of Comptonized photons that return to the disc.

[ascl:1701.002] Vizic: Jupyter-based interactive visualization tool for astronomical catalogs

Vizic is a Python visualization library that builds the connection between images and catalogs through an interactive map of the sky region. The software visualizes catalog data over a custom background canvas using the shape, size and orientation of each object in the catalog and displays interactive and customizable objects in the map. Property values such as redshift and magnitude can be used to filter or apply colormaps, and objects can be selected for further analysis through standard Python functions from inside a Jupyter notebook.

Vizic allows custom overlays to be appended dynamically on top of the sky map; included are Voronoi, Delaunay, Minimum Spanning Tree and HEALPix layers, which are helpful for visualizing large-scale structure. Overlays can be generated, added or removed dynamically with one line of code. Catalog data is kept in a non-relational database. The Jupyter Notebook allows the user to create scripts to analyze and plot the data selected/displayed in the interactive map, making Vizic a powerful and flexible interactive analysis tool. Vizic can be used for data inspection, clustering analysis, galaxy alignment studies, outlier identification or simply large-scale visualizations.

[ascl:1402.001] Vissage: ALMA VO Desktop Viewer

Vissage (VISualisation Software for Astronomical Gigantic data cubEs) is a FITS browser primarily targeting FITS data cubes obtained from ALMA. Vissage offers basic functionality for viewing three-dimensional data cubes, integrated intensity map, flipbook, channel map, and P-V diagram. It has several color sets and color scales available, offers panning and zooming, and can connect with the ALMA WebQL system and the JVO Subaru Image Cutout Service.

[ascl:1011.020] VisIVO: Integrated Tools and Services for Large-Scale Astrophysical Visualization

VisIVO is an integrated suite of tools and services specifically designed for the Virtual Observatory. This suite constitutes a software framework for effective visual discovery in currently available (and next-generation) very large-scale astrophysical datasets. VisIVO consists of VisIVO Desktop - a stand-alone application for interactive visualization on standard PCs, VisIVO Server - a grid-enabled platform for high-performance visualization, and VisIVO Web - a custom designed web portal supporting services based on the VisIVO Server functionality. The main characteristic of VisIVO is support for high-performance, multidimensional visualization of very large-scale astrophysical datasets. Users can obtain meaningful visualizations rapidly while preserving full and intuitive control of the relevant visualization parameters. This paper focuses on newly developed integrated tools in VisIVO Server allowing intuitive visual discovery with 3D views being created from data tables. VisIVO Server can be installed easily on any web server with a database repository. We discuss briefly aspects of our implementation of VisIVO Server on a computational grid and also outline the functionality of the services offered by VisIVO Web. Finally we conclude with a summary of our work and pointers to future developments.

[ascl:1103.007] VisIt: Interactive Parallel Visualization and Graphical Analysis Tool

VisIt is a free interactive parallel visualization and graphical analysis tool for viewing scientific data on Unix and PC platforms. Users can quickly generate visualizations from their data, animate them through time, manipulate them, and save the resulting images for presentations. VisIt contains a rich set of visualization features so that you can view your data in a variety of ways. It can be used to visualize scalar and vector fields defined on two- and three-dimensional (2D and 3D) structured and unstructured meshes. VisIt was designed to handle very large data set sizes in the terascale range and yet can also handle small data sets in the kilobyte range.

VisIt was developed by the Department of Energy (DOE) Advanced Simulation and Computing Initiative (ASCI) to visualize and analyze the results of terascale simulations. It was developed as a framework for adding custom capabilities and rapidly deploying new visualization technologies. Although the primary driving force behind the development of VisIt was the visualization of terascale data, it is also well suited for visualizing data from typical simulations on desktop systems.

[ascl:1408.010] VisiOmatic: Celestial image viewer

VisiOmatic is a web client for IIPImage (ascl:1408.009) and is used to visualize and navigate through large science images from remote locations. It requires STIFF (ascl:1110.006), is based on the Leaflet Javascript library, and works on both touch-based and mouse-based devices.

[ascl:1802.006] VISIBLE: VISIbility Based Line Extraction

VISIBLE applies approximated matched filters to interferometric data, allowing line detection directly in visibility space. The filter can be created from a FITS image or RADMC3D output image, and the weak line data can be a CASA MS or uvfits file. The filter response spectrum can be output either to a .npy file or returned back to the user for scripting.

[ascl:2102.007] viscm: Colormaps analyzer and creator

viscm is a Python tool for visualizing and designing colormaps using colorspacious and matplotlib.

[ascl:1804.019] ViSBARD: Visual System for Browsing, Analysis and Retrieval of Data

ViSBARD interactively visualizes and analyzes space physics data. It provides an interactive integrated 3-D and 2-D environment to determine correlations between measurements across many spacecraft. It supports a variety of spacecraft data products and MHD models and is easily extensible to others. ViSBARD provides a way of visualizing multiple vector and scalar quantities as measured by many spacecraft at once. The data are displayed three-dimensionally along the orbits, which may be displayed either as connected lines or as points. The data display allows the rapid determination of vector configurations, correlations between many measurements at multiple points, and global relationships. With the addition of magnetohydrodynamic (MHD) model data, this environment can also be used to validate simulation results with observed data, use simulated data to provide a global context for sparse observed data, and apply feature detection techniques to the simulated data.

[ascl:2305.002] Virtual Telescope: Next-Generation Space Telescope Simulator

Virtual Telescope predicts the signal-to-noise and other parameters of imaging and/or spectroscopic observations as a function of telescope size, detector noise, and other factors for the Next-Generation Space Telescope.

[ascl:1204.012] VirGO: A Visual Browser for the ESO Science Archive Facility

VirGO is the next generation Visual Browser for the ESO Science Archive Facility developed by the Virtual Observatory (VO) Systems Department. It is a plug-in for the popular open source software Stellarium adding capabilities for browsing professional astronomical data. VirGO gives astronomers the possibility to easily discover and select data from millions of observations in a new visual and intuitive way. Its main feature is to perform real-time access and graphical display of a large number of observations by showing instrumental footprints and image previews, and to allow their selection and filtering for subsequent download from the ESO SAF web interface. It also allows the loading of external FITS files or VOTables, the superimposition of Digitized Sky Survey (DSS) background images, and the visualization of the sky in a `real life' mode as seen from the main ESO sites. All data interfaces are based on Virtual Observatory standards which allow access to images and spectra from external data centers, and interaction with the ESO SAF web interface or any other VO applications supporting the PLASTIC messaging system.

[ascl:2108.006] viper: Velocity and IP EstimatoR

viper (Velocity and IP EstimatoR) measures differential radial velocities from stellar spectra taken through iodine or other gas cells. It convolves the product of a stellar template and a gas cell spectrum with an instrumental profile. Via least square fitting, it optimizes the parameters of the instrumental profile, the wavelength solution, flux normalization, and the stellar Doppler shift. viper offers various functions to describe the instrumental profile such as Gaussian, super-Gaussian, skewed Gaussian or mixtures of Gaussians. The code is developed for echelle spectra; it can handle data from CES, CRIRES+, KECK, OES, TCES, and UVES, and additional instruments can easily be added. A graphical interface facilitates the work with numerous flexible options.

[ascl:1603.003] VIP: Vortex Image Processing pipeline for high-contrast direct imaging of exoplanets

VIP (Vortex Image Processing pipeline) provides pre- and post-processing algorithms for high-contrast direct imaging of exoplanets. Written in Python, VIP provides a very flexible framework for data exploration and image processing and supports high-contrast imaging observational techniques, including angular, reference-star and multi-spectral differential imaging. Several post-processing algorithms for PSF subtraction based on principal component analysis are available as well as the LLSG (Local Low-rank plus Sparse plus Gaussian-noise decomposition) algorithm for angular differential imaging. VIP also implements the negative fake companion technique coupled with MCMC sampling for rigorous estimation of the flux and position of potential companions.
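
A hedged example of PSF subtraction with the Python package; the module layout follows recent VIP releases (older versions organized these functions differently), and random arrays stand in for a real ADI sequence:

    import numpy as np
    import vip_hsci as vip

    # cube: (n_frames, ny, nx) ADI sequence; angs: parallactic angles [deg]
    cube = np.random.default_rng(4).normal(size=(20, 64, 64))
    angs = np.linspace(0.0, 40.0, 20)

    # full-frame PCA speckle subtraction with 5 principal components
    residual = vip.psfsub.pca(cube, angle_list=angs, ncomp=5)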

[ascl:1010.058] VINE: A numerical code for simulating astrophysical systems using particles

VINE is a particle based astrophysical simulation code. It uses a tree structure to efficiently solve the gravitational N-body problem and Smoothed Particle Hydrodynamics (SPH) to simulate gas dynamical effects. The code has been successfully used for a number of studies on galaxy interactions, galactic dynamics, star formation and planet formation and given the implemented physics, other applications are possible as well.

[ascl:1201.006] VIM: Visual Integration and Mining

VIM (Virtual Observatory Integration and Mining) is a data retrieval and exploration application that assumes an astronomer has a list of 'sources' (positions in the sky), and wants to explore archival catalogs, images, and spectra of the sources, in order to identify, select, and mine the list. VIM does this either through web forms, building a custom 'data matrix,' or locally through downloadable Python code. Any VO-registered catalog service can be used by VIM, as well as co-registered image cutouts from VO-image services, and spectra from VO-spectrum services. The user could, for example, show together: proper motions from GSC2, name and spectral type from NED, magnitudes and colors from 2MASS, and cutouts and spectra from SDSS. VIM can compute columns across surveys and sort on these (e.g., 2MASS J magnitude minus SDSS g). For larger sets of sources, VIM utilizes the asynchronous Nesssi services from NVO, which can run thousands of cone and image services overnight.

[ascl:1403.016] Viewpoints: Fast interactive linked plotting of large multivariate data sets

Viewpoints is an interactive tool for exploratory visual analysis of large high-dimensional (multivariate) data. It uses linked scatterplots to find relations in a few seconds that can take much longer with other plotting tools. Its features include linked scatter plots with brushing, dynamic histograms, normalization, and outlier detection/removal.

[ascl:1407.014] VIDE: The Void IDentification and Examination toolkit

The Void IDentification and Examination toolkit (VIDE) identifies voids using a modified version of the parameter-free void finder ZOBOV (ascl:1304.005); a Voronoi tessellation of the tracer particles is used to estimate the density field, followed by a watershed algorithm to group Voronoi cells into zones and subsequently voids. Output is a summary of void properties in plain ASCII; a Python API is provided for analysis tasks, including loading and manipulating void catalogs and particle members, filtering, plotting, computing clustering statistics, stacking, comparing catalogs, and fitting density profiles.

[ascl:1404.010] VictoriaReginaModels: Stellar evolutionary tracks

The Victoria–Regina stellar models comprise seventy-two grids of stellar evolutionary tracks accompanied by complementary zero-age horizontal branches and are presented in “equivalent evolutionary phase” (.eep) files. This Fortran 77 software interpolates isochrones, isochrone population functions, luminosity functions, and color functions of stellar evolutionary tracks.

[ascl:1306.015] VHD: Viscous pseudo-Newtonian accretion

VHD is a numerical study of viscous fluid accretion onto a black hole. The flow is axisymmetric and uses a pseudo-Newtonian potential to model relativistic effects near the event horizon. VHD is based on ZEUS-2D (Stone & Norman 1992) with the addition of an explicit scheme for the viscosity.

[ascl:1204.007] VH-1: Multidimensional ideal compressible hydrodynamics code

VH-1 is a multidimensional ideal compressible hydrodynamics code written in FORTRAN for use on any computing platform, from desktop workstations to supercomputers. It uses a Lagrangian remap version of the Piecewise Parabolic Method developed by Paul Woodward and Phil Colella in their 1984 paper. VH-1 comes in a variety of versions, from a simple one-dimensional serial variant to a multi-dimensional version scalable to thousands of processors.

[ascl:1904.019] Vevacious: Global minima of one-loop effective potentials generator

Vevacious takes a generic expression for a one-loop effective potential energy function and finds all the tree-level extrema, which are then used as the starting points for gradient-based minimization of the one-loop effective potential. The tunneling time from a given input vacuum to the deepest minimum, if different from the input vacuum, can be calculated. The parameter points are given as files in the SLHA format (though the code is not restricted to supersymmetric models), and new model files can be easily generated automatically by the Mathematica package SARAH (ascl:1904.020).

[ascl:2307.017] Veusz: Scientific plotting package

Veusz produces a wide variety of publication-ready 2D and 3D plots. Plots are created by building up plotting widgets with a consistent object-based interface, and the package provides many options for customizing plots. Veusz can import data from text, CSV, HDF5 and FITS files; datasets can also be entered within the program and new datasets created via the manipulation of existing datasets using mathematical expressions and more. The program can also be extended, by adding plugins supporting importing new data formats, different types of data manipulation or for automating tasks, and it supports vector and bitmap output, including PDF, Postscript, SVG and EMF.

[ascl:2203.022] Vetting: Stand-alone tools for vetting transit signals in Kepler, K2 and TESS data

vetting contains simple, stand-alone Python tools for vetting transiting signals in NASA's Kepler, K2, and TESS data. The code performs a centroid test to look for significant changes in the centroid of a star during a transit or eclipse. vetting requires an installation of Python 3.8 or higher.

[ascl:1503.011] VESPA: False positive probabilities calculator

Validation of Exoplanet Signals using a Probabilistic Algorithm (VESPA) calculates false positive probabilities and statistically validates transiting exoplanets. Written in Python, it uses isochrones (ascl:1503.010) and the package simpledist.

[ascl:1802.005] Verne: Earth-stopping effect for heavy dark matter

Verne calculates the Earth-stopping effect for super-heavy Dark Matter (DM). The code allows you to calculate the speed distribution (and DM signal rate) at an arbitrary detector location on the Earth. The calculation takes into account the full anisotropic DM velocity distribution and the full velocity dependence of the DM-nucleus cross section. Results can be obtained for any DM mass and cross section, though the results are most reliable for very heavy DM particles.

[ascl:1802.002] venice: Mask utility

venice reads a mask file (DS9 or fits type) and a catalogue of objects (ascii or fits type) to create a pixelized mask, find objects inside/outside a mask, or generate a random catalogue of objects inside/outside a mask. The program reads the mask file and checks whether a point, given its coordinates, is inside or outside the mask, i.e. inside or outside at least one polygon of the mask.

[ascl:1911.020] VELOCIraptor-STF: Six-dimensional Friends-of-Friends phase space halo finder

VELOCIraptor-STF, formerly STructure Finder (ascl:1306.009), is a 6-Dimensional Friends-of-Friends (6D-FoF) phase space halo finder that constructs halo catalogs. The code uses the MPI and OpenMP APIs and can be compiled as a library for on-the-fly halo finding within an N-body/hydrodynamical code. There is an associated halo merger tree code, TreeFrog (ascl:1911.021).

[ascl:2308.014] velocileptors: Velocity-based Lagrangian and Eulerian PT expansions of redshift-space distortions

velocileptors computes the real- and redshift-space power spectra and correlation functions of biased tracers using 1-loop perturbation theory (with effective field theory counter terms and up to cubic biasing) as well as the real-space pairwise velocity moments. It provides simple computation of the power spectrum wedges or multipoles, and uses a reduced set of parameters for computing the most common case of the redshift-space power spectrum. In addition, velocileptors offers two "direct expansion" modules available in LPT and EPT.
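
A hedged sketch of computing redshift-space multipoles with the LPT module; the import path and method names follow the package README, and the bias-vector ordering is only indicated schematically (verify both against the installed version):

    import numpy as np
    from velocileptors.LPT.lpt_rsd_fftw import LPT_RSD  # assumed module path

    klin, plin = np.loadtxt('pk_linear.txt', unpack=True)  # linear P(k) input
    lpt = LPT_RSD(klin, plin)        # build the 1-loop perturbation tables
    lpt.make_pltable(f=0.8, apar=1.0, aperp=1.0, kmin=0.01, kmax=0.25, nk=50)

    # bias, counterterm and stochastic parameters in the README's ordering
    bvec = [1.8, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
    k, p0, p2, p4 = lpt.combine_bias_terms_pkell(bvec)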

[ascl:1010.021] velfit: A Code for Modeling Non-Circular Flows in Disk Galaxies

High-quality velocity maps of galaxies frequently exhibit signatures of non-circular streaming motions. velfit fits such maps and yields results that are more easily interpreted than those of commonly used procedures. It can estimate the magnitudes of forced non-circular motions over a broad range of bar strengths, from a strongly barred galaxy, through cases of mild bar-like distortions, to placing bounds on the shapes of halos in galaxies having extended rotation curves.

This code is no longer maintained and has been superseded by DiskFit (ascl:1209.011).

[ascl:1610.009] velbin: radial velocity corrected for binary orbital motions

Velbin convolves the radial velocity offsets due to binary orbital motions with a Gaussian to model an observed velocity distribution. This can be used to measure the mean velocity and velocity dispersion from an observed radial velocity distribution, corrected for binary orbital motions. Velbin fits single- or multi-epoch data with any arbitrary binary orbital parameter distribution (as long as it can be sampled properly); however, it always assumes that the intrinsic velocity distribution (i.e. corrected for binary orbital motions) is a Gaussian. Velbin samples (and edits) a binary orbital parameter distribution, fits an observed radial velocity distribution, and creates a mock radial velocity distribution that can be used to provide the fitted radial velocities in the single_epoch or multi_epoch methods.

[ascl:2301.020] VDA: Void Dwarf Analyzer

void-dwarf-analysis analyzes Keck Cosmic Web Imager datacubes to produce maps of kinematic properties (velocity and velocity dispersion), emission line fluxes, and gas-phase metallicities of void dwarf galaxies.

[ascl:2311.002] VCAL-SPHERE: Hybrid pipeline for reduction of VLT/SPHERE data

VCAL-SPHERE, for VIP-based Calibration of VLT/SPHERE data, is a versatile pipeline for high-contrast imaging of exoplanets and circumstellar disks. The pipeline covers all steps of data reduction, including raw calibration, pre-processing and post-processing (i.e., modeling and subtraction of the stellar halo), for the IFS, IRDIS-DBI and IRDIS-CI modes (and combinations thereof) of the VLT instrument SPHERE. The three main steps of the reduction correspond to different modules, where the first follows the recommended EsoRex (ascl:1504.003) workflow and associated recipes with occasional inclusion of VIP (ascl:1603.003) routines (e.g., for PCA-based sky subtraction), while the other two stages fully rely on the VIP toolbox. Although the default parameters of the pipeline should yield a good reduction in most cases, these can be tuned using JSON parameter files for each stage of the pipeline for optimal reduction of specific datasets.

[ascl:1809.004] VBBINARYLENSING: Microlensing light-curve computation

VBBinaryLensing forward models gravitational microlensing events using the advanced contour integration method; it supports single and binary lenses. The lens map is inverted on a collection of points on the source boundary to obtain a corresponding collection of points on the boundaries of the images from which the area of the images can be recovered by use of Green’s theorem. The code takes advantage of a number of techniques to make contour integration much more efficient, including using a parabolic correction to increase the accuracy of the summation, introducing an error estimate on each arc of the boundary to enable defining an optimal sampling, and allowing the inclusion of limb darkening. The code is written as a C++ library and wrapped as a Python package, and can be called from either C++ or Python.
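
Basic usage of the Python wrapper, following the library's documented BinaryMag2 entry point (parameter values below are arbitrary examples):

    import VBBinaryLensing

    VBBL = VBBinaryLensing.VBBinaryLensing()
    VBBL.RelTol = 1e-3     # relative precision goal
    VBBL.a1 = 0.36         # linear limb-darkening coefficient

    s, q = 0.8, 0.1        # separation [Einstein radii] and mass ratio
    y1, y2, rho = 0.05, 0.02, 0.01   # source position and radius
    mag = VBBL.BinaryMag2(s, q, y1, y2, rho)  # finite-source magnification
    print(mag)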

[ascl:1704.005] VaST: Variability Search Toolkit

VaST (Variability Search Toolkit) finds variable objects on a series of astronomical images in FITS format. The software performs object detection and aperture photometry using SExtractor (ascl:1010.064) on each image, cross-matches lists of detected stars, performs magnitude calibration with respect to the first (reference) image and constructs a lightcurve for each object. The sigma-magnitude, Stetson's L variability index, Robust Median Statistic (RoMS) and other plots may be used to visually identify variable star candidates. The two distinguishing features of VaST are its ability to perform accurate aperture photometry of images obtained with non-linear detectors and to handle complex image distortions. VaST can be used in cases of unstable PSF (e.g., bad guiding or with digitized wide-field photographic images), and has been successfully applied to images obtained with telescopes ranging from 0.08 to 2.5m in diameter equipped with a variety of detectors including CCD, CMOS, MIC and photographic plates.

[ascl:1208.016] VARTOOLS: Light Curve Analysis Program

The VARTOOLS program is a command line utility that provides tools for analyzing time series astronomical data. It implements a number of routines for calculating variability/periodicity statistics of light curves, as well as tools for modifying and filtering light curves.

[ascl:2109.026] Varstar Detect: Variable star detection in TESS data

Varstar Detect uses several numerical and statistical methods to filter and interpret the data obtained from TESS. It performs an amplitude test to determine whether a star is variable and if so, provides the characteristics of each star through phenomenological analysis of the lightcurve.

[ascl:2208.007] VapoRock: Modeling magma ocean atmospheres and stellar nebula

VapoRock calculates the equilibrium partial pressures of metal-bearing gas species of specific elements above the magma ocean surface to determine the metal-bearing composition of the atmosphere as a function of temperature and the bulk composition of the magma ocean. It utilizes ENKI's ThermoEngine (ascl:2208.006) and combines estimates for element activities in silicate melts with thermodynamic data for metal and metal oxide vapor species.

[ascl:1111.012] VAPOR: Visualization and Analysis Platform for Ocean, Atmosphere, and Solar Researchers

VAPOR is the Visualization and Analysis Platform for Ocean, Atmosphere, and Solar Researchers. VAPOR provides an interactive 3D visualization environment that runs on most UNIX and Windows systems equipped with modern 3D graphics cards. VAPOR provides:

- A visual data discovery environment tailored towards the specialized needs of the astro and geosciences CFD community
- A desktop solution capable of handling terascale size data sets
- Advanced interactive 3D visualization tightly coupled with quantitative data analysis
- Support for multi-variate, time-varying data
- Close coupling with RSI's powerful interpretive data language, IDL
- Support for 3D visualization of WRF-ARW datasets

[ascl:1506.010] VAPID: Voigt Absorption-Profile [Interstellar] Dabbler

VAPID (Voigt Absorption Profile [Interstellar] Dabbler) models interstellar absorption lines. It predicts profiles and optimizes model parameters by least-squares fitting to observed spectra. VAPID allows cloud parameters to be optimized with respect to several different data sets simultaneously; those data sets may include observations of different transitions of a given species, and may have different S/N ratios and resolutions.

[ascl:1309.002] VAPHOT: Precision differential aperture photometry package

VAPHOT is an aperture photometry package for precise time-series photometry of uncrowded fields, geared towards the extraction of target lightcurves of eclipsing or transiting systems. Its photometric main routine works within the IRAF (ascl:9911.002) environment and is built upon the standard aperture photometry task 'phot' from IRAF, using optimized aperture sizes. The associated analysis program 'VANALIZ' works in the IDL environment. It performs differential photometry with graphical and numerical output. VANALIZ produces plots indicative of photometric stability and permits the interactive evaluation and weighting of comparison stars. Also possible is the automatic or manual suppression of data-points and the output of statistical analyses. Several methods for the calculation of the reference brightness are offered. Specific routines for the analysis of transit 'on'-'off' photometry, comparing the target brightness inside against outside a transit, are also available.

[ascl:1702.004] Validation: Codes to compare simulation data to various observations

Validation provides codes to compare simulated data to several observations: simulated stellar masses and star formation rates, the simulated stellar mass function against the observed stellar mass function from PRIMUS or SDSS-GALEX in several redshift bins from 0.01-1.0, and the simulated B-band luminosity function against the observed stellar mass function. The codes create plots for various attributes, including stellar mass functions and stellar mass to halo mass, and can model predictions (in some cases alongside observational data) to test other mock catalogs.

[ascl:1810.004] VaeX: Visualization and eXploration of Out-of-Core DataFrames

VaeX (Visualization and eXploration) interactively visualizes and explores big tabular datasets. It can calculate statistics such as mean, sum, count, and standard deviation on an N-dimensional grid at up to a billion (10^9) objects/rows per second. Visualization is done using histograms, density plots, and 3d volume rendering, allowing interactive exploration of big data. VaeX uses memory mapping, a zero memory copy policy, and lazy computations for best performance, and integrates well with the Jupyter/IPython notebook/lab ecosystem.
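
The out-of-core statistics are exposed through a simple DataFrame API; a short example using the bundled demo dataset:

    import vaex

    df = vaex.example()                    # bundled demo DataFrame
    print(df.mean(df.x), df.std(df.y))     # out-of-core statistics
    counts = df.count(binby=[df.x, df.y],  # histogram on a 2D grid
                      limits=[[-10, 10], [-10, 10]], shape=128)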

[ascl:1406.009] VADER: Viscous Accretion Disk Evolution Resource

VADER is a flexible, general code that simulates the time evolution of thin axisymmetric accretion disks in time-steady potentials. VADER handles arbitrary viscosities, equations of state, boundary conditions, and source and sink terms for both mass and energy.

[ascl:1207.003] VAC: Versatile Advection Code

The Versatile Advection Code (VAC) is a freely available general hydrodynamic and magnetohydrodynamic simulation software that works in 1, 2 or 3 dimensions on Cartesian and logically Cartesian grids. VAC runs on any Unix/Linux system with a Fortran 90 (or 77) compiler and Perl interpreter. VAC can run on parallel machines using either the Message Passing Interface (MPI) library or a High Performance Fortran (HPF) compiler.

[ascl:1911.002] uvplot: Interferometric visibilities plotter

uvplot makes nice plots of deprojected interferometric visibilities (often called uvplots). It implements plotting functionality, handles MS tables with spectral windows with different numbers of channels, and can import visibilities from ASCII to MS Table. It also allows export of specific channels. uvplot can be installed inside the NRAO CASA package (ascl:1107.013).

[ascl:1410.004] UVOTPY: Swift UVOT grism data reduction

The two Swift UVOT grisms provide uv (170.0-500.0 nm) and visible (285.0-660.0 nm) spectra with a resolution of R~100 and 75. To reduce the grism data, UVOTPY extracts a spectrum given the source sky position and outputs a flux-calibrated spectrum. UVOTPY is a replacement for the UVOTIMGRISM FTOOL (ascl:9912.002) in the HEADAS Swift package. Its extraction uses a curved aperture for the uv spectra, accounts for coincidence losses in the detector, provides more accurate anchor positions for the wavelength scale, and is valid for the whole detector.

[ascl:1402.017] UVMULTIFIT: Fitting astronomical radio interferometric data

UVMULTIFIT, written in Python, is a versatile library for fitting models directly to visibility data. These models can depend on frequency and fitting parameters in an arbitrary algebraic way. The results from the fit to the visibilities of sources with sizes smaller than the diffraction limit of the interferometer are superior to the output obtained from a mere analysis of the deconvolved images. Though UVMULTIFIT is based on the CASA package, it can be easily adapted to other analysis packages that have a Python API.

[ascl:1606.006] uvmcmcfit: Parametric models to interferometric data fitter

Uvmcmcfit fits parametric models to interferometric data. It is ideally suited to extract the maximum amount of information from marginally resolved observations with interferometers like the Atacama Large Millimeter Array (ALMA), Submillimeter Array (SMA), and Plateau de Bure Interferometer (PdBI). uvmcmcfit uses emcee (ascl:1303.002) to do Markov Chain Monte Carlo (MCMC) and can measure the goodness of fit from visibilities rather than deconvolved images, an advantage when there is strong gravitational lensing and in other situations. uvmcmcfit includes a pure-Python adaptation of Miriad’s (ascl:1106.007) uvmodel task to generate simulated visibilities given observed visibilities and a model image and a simple ray-tracing routine that allows it to account for both strongly lensed systems (where multiple images of the lensed galaxy are detected) and weakly lensed systems (where only a single image of the lensed galaxy is detected).

[ascl:2208.014] uvcombine: Combine images with different resolutions

uvcombine combines single-dish and interferometric data. It can combine high-resolution images that are missing large angular scales (Fourier-domain short-spacings) with low-resolution images containing the short/zero spacing. uvcombine includes the "feathering" technique for interferometry data, implementing a similar approach to CASA’s (ascl:1107.013) feather task but with additional options. Also included are consistency tests for the flux calibration and single-dish scale by comparing the data in the uv-overlap range.
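
A minimal numpy sketch of the feathering idea (a schematic, not uvcombine's actual interface; the Gaussian single-dish beam width is an assumed input): the single-dish transform already carries the beam-attenuated low spatial frequencies, so weighting the interferometer transform by one minus the beam response and summing approximately recovers all scales.

```python
import numpy as np

def feather(int_img, sd_img, sd_fwhm_pix):
    """Combine an interferometer image (missing short spacings) with a
    single-dish image; both assumed to be in the same flux units per pixel."""
    ny, nx = int_img.shape
    ky = np.fft.fftfreq(ny)[:, None]
    kx = np.fft.fftfreq(nx)[None, :]
    sigma = sd_fwhm_pix / 2.355
    # Fourier-domain response of a Gaussian single-dish beam
    w = np.exp(-2.0 * (np.pi * sigma) ** 2 * (kx ** 2 + ky ** 2))
    # SD data ~ w * truth, so adding (1 - w) * interferometer data fills
    # in the scales the single dish has suppressed.
    combined_ft = np.fft.fft2(sd_img) + (1.0 - w) * np.fft.fft2(int_img)
    return np.fft.ifft2(combined_ft).real
```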

[ascl:1412.003] UTM: Universal Transit Modeller

The Universal Transit Modeller (UTM) is a light-curve simulator for all kinds of transiting or eclipsing configurations between arbitrary numbers of several types of objects, which may be stars, planets, planetary moons, and planetary rings. A separate fitting program, UFIT (Universal Fitter) is part of the UTM distribution and may be used to derive best fits to light-curves for any set of continuously variable parameters. UTM/UFIT is written in IDL code and its source is released in the public domain under the GNU General Public License.

[ascl:1411.012] util_2comp: Planck-based two-component dust model utilities

The util_2comp software utilities generate predictions of far-infrared Galactic dust emission and reddening based on a two-component dust emission model fit to Planck HFI, DIRBE and IRAS data from 100 GHz to 3000 GHz. These predictions and the associated dust temperature map have angular resolution of 6.1 arcminutes and are available over the entire sky. Implementations in IDL and Python are included.

[ascl:2209.012] URILIGHT: Time-dependent Monte-Carlo radiative-transfer

The time dependent Monte-Carlo code URILIGHT, written in Fortran 90, assumes homologous expansion. Energy deposition resulting from the decay of radioactive isotopes is calculated by a Monte-Carlo solution of the γ-ray transport, for which interaction with matter is included through Compton scattering and photoelectric absorption. The temperature is iteratively solved for in each cell by requiring that the total emissivity equals the total absorbed energy.

[ascl:1412.009] URCHIN: Reverse ray tracer

URCHIN is a Smoothed Particle Hydrodynamics (SPH) reverse ray tracer (i.e. from particles to sources). It calculates the amount of shielding from a uniform background that each particle experiences. Preservation of the adaptive density field resolution present in many gas dynamics codes and uniform sampling of gas resolution elements with rays are two of the benefits of URCHIN; it also offers preservation of Galilean invariance, high spectral resolution, and preservation of the standard uniform UV background in optically thin gas.

[ascl:1512.019] UPSILoN: AUtomated Classification of Periodic Variable Stars using MachIne LearNing

UPSILoN (AUtomated Classification of Periodic Variable Stars using MachIne LearNing) classifies periodic variable stars such as Delta Scuti stars, RR Lyraes, Cepheids, Type II Cepheids, eclipsing binaries, and long-period variables (i.e., superclasses), and their subclasses (e.g., RR Lyrae ab, c, d, and e types) using well-sampled light curves from any astronomical time-series survey in optical bands, regardless of survey-specific characteristics such as color, magnitude, and sampling rate. UPSILoN consists of two parts: one extracts variability features from a light curve, and the other classifies the light curve and returns the extracted features, a predicted class, and a class probability. In principle, UPSILoN can classify light curves with an arbitrary number of data points, but light curves with more than ~80 data points give the best classification quality.
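
A schematic of that two-part design (feature extraction, then classification), assuming placeholder statistics and synthetic training data rather than UPSILoN's actual period-based features and API:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def extract_features(time, mag):
    # Trivial placeholder statistics standing in for UPSILoN's features
    std = mag.std()
    skew = np.mean(((mag - mag.mean()) / std) ** 3)
    return [np.ptp(mag), std, skew]

rng = np.random.default_rng(0)
# Hypothetical labeled training set (feature rows, class labels)
train_X = rng.normal(size=(200, 3))
train_y = rng.integers(0, 3, size=200)
clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(train_X, train_y)

# Classify one synthetic sinusoidal light curve
t = np.linspace(0, 10, 150)
m = 12.0 + 0.3 * np.sin(2 * np.pi * t / 1.7) + rng.normal(0, 0.05, t.size)
features = extract_features(t, m)
label = clf.predict([features])[0]                  # predicted class
probability = clf.predict_proba([features]).max()   # class probability
```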

[ascl:1504.001] UPMASK: Unsupervised Photometric Membership Assignment in Stellar Clusters

UPMASK, written in R, performs membership assignment in stellar clusters. It uses photometry and spatial positions, but can take into account other types of data. UPMASK takes into account arbitrary error models; the code is unsupervised, data-driven, physical-model-free and relies on as few assumptions as possible. The approach followed for membership assessment is based on an iterative process, principal component analysis, a clustering algorithm and a kernel density estimation.
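
A minimal Python sketch of one such iteration (the package itself is written in R; a nearest-neighbor density cut stands in for UPMASK's kernel density estimation, and the component count and 0.8 threshold are illustrative assumptions):

```python
import numpy as np
from scipy.spatial import cKDTree
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

def upmask_iteration(photometry, xy, n_clusters=20):
    """One UPMASK-style pass: cluster in the PCA space of the photometry
    (>= 3 columns assumed), then keep groups that are spatially more
    concentrated on the sky than a uniform field would be."""
    pcs = PCA(n_components=3).fit_transform(photometry)
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(pcs)
    area = np.ptp(xy[:, 0]) * np.ptp(xy[:, 1])
    member = np.zeros(len(xy), dtype=bool)
    for lab in np.unique(labels):
        sel = np.flatnonzero(labels == lab)
        if sel.size < 5:
            continue
        d, _ = cKDTree(xy[sel]).query(xy[sel], k=2)
        mean_nn = d[:, 1].mean()                    # mean nearest-neighbor sep
        expected = 0.5 * np.sqrt(area / sel.size)   # uniform-field expectation
        member[sel] = mean_nn < 0.8 * expected      # spatially concentrated
    return member

rng = np.random.default_rng(0)
phot = rng.normal(size=(500, 4))   # 4 hypothetical magnitudes/colors
xy = rng.uniform(size=(500, 2))    # sky positions
members = upmask_iteration(phot, xy)
```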

[submitted] unWISE-verse: An Integrated WiseView and Zooniverse Data Pipeline

unWISE-verse is an integrated Python pipeline for downloading sets of unWISE time-resolved coadd cutouts from the WiseView image service and uploading subjects to Zooniverse.org for use in astronomical citizen science research. This software was initially designed for the Backyard Worlds: Cool Neighbors research project and is optimized for target sets containing low luminosity brown dwarf candidates. However, unWISE-verse can be applied to other future astronomical research projects that seek to make use of unWISE infrared sky maps, such as studies of infrared variable/transient sources.

[ascl:1901.004] unwise_psf: PSF models for unWISE coadds

The unwise_psf Python module renders point spread function (PSF) models appropriate for use in modeling of unWISE coadd images. unwise_psf translates highly detailed single-exposure WISE PSF models in detector coordinates to the corresponding pixelized PSF models in coadd space, accounting for subtleties including the WISE scan direction and its considerable variation near the ecliptic poles. Applications of the unwise_psf module include performing forced photometry on unWISE coadds, constructing WISE-selected source catalogs based on unWISE coadds and masking unWISE coadd regions contaminated by bright stars.

[ascl:2211.005] unTimely_Catalog_explorer: A search and visualization tool for the unTimely Catalog

unTimely Catalog Explorer searches for and visualizes detections in the unTimely Catalog, a full-sky, time-domain catalog of detections based on WISE and NEOWISE image data acquired between 2010 and 2020. The tool searches the catalog by coordinates; it can create finder charts for each epoch with overplotted catalog positions, plot light curves from the unTimely photometry, overplot these light curves with AllWISE multi-epoch and NEOWISE-R single-exposure (L1b) photometry, and create image blinks with overlaid catalog positions in GIF format.

[ascl:2109.015] unpopular: Using CPM detrending to obtain TESS light curves

unpopular is an implementation of the Causal Pixel Model (CPM) de-trending method to obtain TESS Full-Frame Image (FFI) light curves. The code, written in Python, models the systematics in the light curves of individual pixels as a linear combination of light curves from many other distant pixels and removes shared flux variations. unpopular is able to preserve sector-length astrophysical signals, allowing for the extraction of multi-sector light curves from the FFI data.
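
The core idea reduces to a regularized linear regression; a minimal numpy sketch (schematic only, not unpopular's API):

```python
import numpy as np

def cpm_detrend(target, predictors, lam=1e3):
    """Causal Pixel Model in schematic form: fit the target pixel light
    curve as a ridge-regularized linear combination of light curves of
    distant predictor pixels, then subtract the shared component.
    target: (T,) flux of the pixel of interest
    predictors: (T, P) fluxes of P distant pixels
    lam: ridge regularization strength (illustrative value)"""
    A = predictors
    coeffs = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ target)
    return target - A @ coeffs   # systematics-removed light curve
```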

[ascl:1110.021] Univiewer: Visualisation Program for HEALPix Maps

Univiewer is a visualisation program for HEALPix maps. It is written in C++ and uses OpenGL and the wxWidgets library for cross-platform portability. Using it you can:

- Rotate and zoom maps on the sphere in 3D
- Create high-resolution views of square patches of the map
- Change maximum and minimum values of the colourmap interactively
- Calculate the power spectrum of the full-sky map or a patch
- Display any column of a HEALPix map FITS file on the sphere

Since Univiewer uses OpenGL for 3D graphics, its performance depends on your video card. It has been tested successfully on computers with as little as 8 MB of video memory, but at least 32 MB is recommended for good performance.

In the 3D view, a HEALPix map is projected onto an ECP pixelation to create a texture which is wrapped around the sphere. In calculating the power spectrum, the spherical harmonic transforms are computed using the same ECP pixelation. This inevitably leads to some discrepancies at small scales due to repixelation effects, but they are reasonably small.

[ascl:2302.011] UniverseMachine: Empirical model for galaxy formation

The UniverseMachine applies simple empirical models of galaxy formation to dark matter halo merger trees. For each model, it generates an entire mock universe, which it then observes in the same way as the real Universe to calculate a likelihood function. It includes an advanced MCMC algorithm to explore the allowed parameter space of empirical models that are consistent with observations.

[ascl:1503.007] UniPOPS: Unified data reduction suite

UniPOPS, a suite of programs and utilities developed at the National Radio Astronomy Observatory (NRAO), reduced data from the observatory's single-dish telescopes: the Tucson 12-m, the Green Bank 140-ft, and archived data from the Green Bank 300-ft. The primary reduction programs, 'line' (for spectral-line reduction) and 'condar' (for continuum reduction), used the People-Oriented Parsing Service (POPS) as the command line interpreter. UniPOPS unified previous analysis packages and provided new capabilities; development of UniPOPS continued within the NRAO until 2004 when the 12-m was turned over to the Arizona Radio Observatory (ARO). The submitted code is version 3.5 from 2004, the last supported by the NRAO.

[ascl:2111.014] UniMAP: Unicorn Multi-window Anomaly Detection Pipeline

UniMAP (Unicorn Multi-window Anomaly Detection Pipeline) is a data analysis pipeline that leverages the Temporal Outlier Factor (TOF) method to find anomalies in LVC data. The pipeline requires a target detector and start and stop GPS times describing the interval to analyze, and produces three outputs: 1) an array of GPS times corresponding to TOF detections; 2) a long q-transform of the entire data interval with visualizations of the TOF detections in the time series; and 3) q-transforms of the data windows that triggered TOF detections.

[ascl:1804.022] UniDAM: Unified tool to estimate Distances, Ages, and Masses

UniDAM obtains a homogenized set of stellar parameters from spectrophotometric data of different surveys. Parallax and extinction data can be incorporated into the isochrone fitting method used in UniDAM to reduce distance and age estimate uncertainties for TGAS stars at distances up to 1 kpc; with Gaia end-of-mission parallaxes, distance uncertainties decrease by about a factor of 20 and age uncertainties by a factor of two for stars up to 10 kpc from the Sun.

[submitted] UMIST

Astrochemistry database of chemical species.

[ascl:2008.006] Umbrella: Asteroid detection, validation, and identification

Umbrella detects, validates, and identifies asteroids. The core of this software suite, Umbrella2, includes algorithms and interfaces for all steps of the processing pipeline, including a novel detection algorithm for faint trails. A detection pipeline accessible as a desktop program (ViaNearby) builds on the library to provide near real-time data reduction of asteroid surveys on the Wide Field Camera of the Isaac Newton Telescope. Umbrella can read and write MPC optical reports, supports SkyBoT and VizieR querying, and can be extended by user image processing functions to take advantage of the algorithms framework as a multi-threaded CPU scheduler for easy algorithm parallelization.

[ascl:1104.007] ULySS: A Full Spectrum Fitting Package

ULySS (University of Lyon Spectroscopic Analysis Software) is an open-source software package written in the GDL/IDL language to analyze astronomical data. ULySS fits a spectrum with a linear combination of non-linear components convolved with a line-of-sight velocity distribution (LOSVD) and multiplied by a polynomial continuum. ULySS is used to study stellar populations of galaxies and star clusters and atmospheric parameters of stars.

[submitted] Ulula: a lightweight 2D hydro code for teaching

Ulula is an ultra-lightweight 2D hydro code for teaching purposes. The code is written in pure Python and is designed to be as short and easy to understand as possible while not compromising on performance, which is achieved with a simple Godunov solver and by using numpy for all array operations.

[ascl:1611.001] UltraNest: Pythonic Nested Sampling Development Framework and UltraNest

This three-component package provides a Pythonic implementation of the Nested Sampling integration algorithm for Bayesian model comparison and parameter estimation. It offers multiple implementations for constrained drawing functions and a test suite to evaluate the correctness, accuracy and efficiency of various implementations. The three components are:

- a modular framework for nested sampling algorithms (nested_sampling) and their development;
- a test framework to evaluate the performance and accuracy of algorithms (testsuite); and
- UltraNest, a fast C implementation of a mixed RadFriends/MCMC nested sampling algorithm.
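
For orientation, a minimal sketch of the core nested-sampling loop that this framework implements and tests; the naive rejection step below is exactly what constrained-draw methods such as RadFriends or MCMC replace, since plain rejection becomes exponentially slow as the likelihood constraint tightens.

```python
import numpy as np

def nested_sampling(loglike, prior_transform, ndim, nlive=100, niter=500):
    """Schematic nested-sampling loop (not UltraNest's implementation):
    repeatedly replace the worst live point with a fresh prior draw that
    has a higher likelihood, accumulating the evidence integral."""
    rng = np.random.default_rng(1)
    u = rng.uniform(size=(nlive, ndim))
    L = np.array([loglike(prior_transform(p)) for p in u])
    logZ, logX = -np.inf, 0.0
    for i in range(niter):
        worst = np.argmin(L)
        logX_new = -(i + 1) / nlive                      # expected shrinkage
        logw = np.log(np.exp(logX) - np.exp(logX_new))   # shell prior volume
        logZ = np.logaddexp(logZ, logw + L[worst])       # accumulate evidence
        logX = logX_new
        while True:   # naive rejection draw with L > L[worst]
            p = rng.uniform(size=ndim)
            Lnew = loglike(prior_transform(p))
            if Lnew > L[worst]:
                break
        u[worst], L[worst] = p, Lnew
    return logZ   # final live-point remainder omitted for brevity

# Example: evidence of a 2-D Gaussian (sigma = 0.1) under a unit-cube prior;
# the true logZ is close to 0 since the Gaussian lies well inside the cube.
logZ = nested_sampling(
    lambda x: -0.5 * np.sum(((x - 0.5) / 0.1) ** 2) - np.log(2 * np.pi * 0.01),
    lambda u: u, ndim=2)
```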

[ascl:2008.012] Ujti: Geodesics in general relativity

Ujti calculates geodesics, gravitational lenses, and gravitational redshift for, in principle, any metric. Special attention has been given to compact objects, so the current implementation considers only metrics in spherical coordinates.

[ascl:1704.002] UDAT: A multi-purpose data analysis tool

UDAT is a pattern recognition tool for mass analysis of various types of data, including image and audio. Based on its WND-CHARM (ascl:1312.002) prototype, UDAT computes a large set of numerical content descriptors from each file it analyzes and selects the most informative features using statistical analysis. The tool can perform automatic classification of galaxy images after training with annotated galaxy images. It also has unsupervised learning capabilities, such as query-by-example of galaxies based on morphology: given an input galaxy image of interest, the tool can search through a large database of images to retrieve the galaxies most similar to the query image. The downside of the tool is its computational complexity, which in most cases requires a small or medium cluster.

[ascl:1303.006] UCLCHEM: Time and depth dependent gas-grain chemical model

UCLCHEM is a time and depth dependent gas-grain chemical model that can be used to estimate the fractional abundances (with respect to hydrogen) of gas and surface species in any environment where molecules are present. The model includes both gas and surface reactions. The code starts from the most diffuse state, in which all the gas is in atomic form, and evolves the gas to its final density. Depending on the temperature, atoms and molecules from the gas freeze onto the grains and hydrogenate where possible. The advantage of this approach is that the ice composition is not assumed but is derived by a time-dependent computation of the chemical evolution of the gas-dust interaction process. The code is very modular, has been used to model a variety of regions, and can be coupled with the UCL_PDR and SMMOL codes.

[ascl:1303.004] UCL_PDR: Time dependent photon-dissociation regions model

UCL_PDR is a time dependent photon-dissociation region model that self-consistently calculates the thermal balance. It can be used with gas-phase-only species as well as with surface species. It is very modular, can account for density and pressure gradients, and can be coupled with UCL_CHEM as well as with SMMOL. It has been used to model regions from small scales (e.g., knots in proto-planetary nebulae) to large scales (high-redshift galaxies).

[ascl:2309.002] UBHM: Uncertainty quantification of black hole mass estimation

Uncertain_blackholemass predicts virial black hole masses using a neural network model and quantifies their uncertainties. The scripts retrieve data and run feature extraction and uncertainty quantification for regression. They can be used separately or deployed to existing machine learning methods to generate prediction intervals for the black hole mass predictions.

[ascl:2302.020] UBER: Universal Boltzmann Equation Solver

UBER (Universal Boltzmann Equation Solver) solves the general form of Fokker-Planck equation and Boltzmann equation, diffusive or non-diffusive, that appear in modeling planetary radiation belts. Users can freely specify the coordinate system, boundary geometry and boundary conditions, and the equation terms and coefficients. The solver works for problems in one to three spatial dimensions. The solver is based upon the mathematical theory of stochastic differential equations. By its nature, the solver scheme is intrinsically Monte Carlo, and the solutions thus contain stochastic uncertainty, though the user may dictate an arbitrarily small relative tolerance of the stochastic uncertainty at the cost of longer Monte Carlo iterations.
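
The stochastic idea can be illustrated on the simplest case, a 1-D diffusion equation, whose point solution equals an average over sample paths of the corresponding stochastic differential equation. A minimal sketch (not UBER's interface) showing how the Monte Carlo standard error plays the role of the stochastic uncertainty mentioned above:

```python
import numpy as np

# For df/dt = D d2f/dx2 with constant D, the SDE dX = sqrt(2D) dW implies
# f(x0, t) = E[ f0(X_t) ] with X_t ~ N(x0, 2Dt)  (Feynman-Kac).
def solve_point(f0, x0, t, D=1.0, nsteps=200, npaths=20000, seed=1):
    rng = np.random.default_rng(seed)
    dt = t / nsteps
    x = np.full(npaths, x0, dtype=float)
    for _ in range(nsteps):                      # Euler-Maruyama paths
        x += np.sqrt(2 * D * dt) * rng.standard_normal(npaths)
    vals = f0(x)
    mean = vals.mean()
    stderr = vals.std(ddof=1) / np.sqrt(npaths)  # stochastic uncertainty,
    return mean, stderr                          # shrinks as npaths grows

# Gaussian initial condition spreading; analytic value is 1/sqrt(2) ~ 0.707
mean, err = solve_point(lambda x: np.exp(-x**2 / 2), x0=0.0, t=0.5)
```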

[submitted] U.S. Naval Observatory Ephemerides of the Largest Asteroids (USNO/AE98)

USNO/AE98 contains ephemerides for fifteen of the largest asteroids that The Astronomical Almanac has used since its 2000 edition. These ephemerides are based on the Jet Propulsion Laboratory (JPL) planetary ephemeris DE405 and, thus, aligned to the International Celestial Reference System (ICRS). The data cover the period from 1799 November 16 (JD 2378450.5) through 2100 February 1 (JD 2488100.5). The internal uncertainty in the mean longitude at epoch, 1997 December 18, ranges from 0.05 arcseconds for 7 Iris through 0.22 arcseconds for 65 Cybele, and the uncertainty in the mean motion varies from 0.02 arcseconds per century for 4 Vesta to 0.14 arcseconds per century for 511 Davida.

The Astronomical Almanac has published ephemerides for 1 Ceres, 2 Pallas, 3 Juno, and 4 Vesta since its 1953 edition. Historically, these four asteroids have been observed more than any of the others. Ceres, Pallas, and Vesta deserve such attention because they are the three most massive asteroids, the source of significant perturbations of the planets, the largest in linear size, and among the brightest main belt asteroids. Studying asteroids may provide clues to the origin and primordial composition of the solar system, data for modeling the chaotic dynamics of small solar system bodies, and assessments of potential collisions. Therefore, USNO/AE98 includes more than the traditional four asteroids.

The following criteria were used to select main belt asteroids for USNO/AE98:

- Diameter greater than 300 km, presumably among the most massive asteroids
- Excellent observing history and discovered before 1850
- Largest in their taxonomic class

The massive asteroids included may be studied for their perturbing effects on the planets, while those with detailed observing histories may be used to evaluate the accuracy limits of asteroid ephemerides. The fifteen asteroids that met at least one of these criteria are:

1 Ceres (new mass determination)
2 Pallas (new mass determination)
3 Juno
4 Vesta (new mass determination)
6 Hebe
7 Iris
8 Flora
9 Metis
10 Hygiea
15 Eunomia
16 Psyche
52 Europa
65 Cybele
511 Davida
704 Interamnia

The refereed paper by Hilton (1999, Astron. J. 117, 1077) describes the USNO/AE98 asteroid ephemerides in detail. The associated USNO/AA Tech Note 1998-12 includes residual plots for all fifteen asteroids and a comparison between these ephemerides and those used in The Astronomical Almanac through 1999.

Software to compact, read, and interpolate the USNO/AE98 asteroid ephemerides is also available. It is written in C and designed to work with the C edition of the Naval Observatory Vector Astrometry Software (NOVAS). The programs could be used with tabular ephemerides of other asteroids as well. The associated README file provides the details of this system.

[ascl:1303.008] TYCHO: Stellar evolution code

TYCHO is a general, one dimensional (spherically symmetric) stellar evolution code written in structured Fortran 77; it is designed for hydrostatic and hydrodynamic stages including mass loss, accretion, pulsations and explosions. Mixing and convection algorithms are based on 3D time-dependent simulations. It offers extensive on-line graphics using Tim Pearson's PGPLOT (ascl:1103.002) with X-windows and runs effectively on Linux and Mac OS X laptop and desktop computers.
NOTE: This code is no longer being supported.

[ascl:1210.025] TwoDSSM: Self-gravitating 2D shearing sheet

TwoDSSM solves the equations of self-gravitating hydrodynamics in the shearing sheet, with cooling. TwoDSSM is configured to use a simple, exponential cooling model, although it contains code for a more complicated (and perhaps more realistic) cooling model based on a one-zone vertical model. The complicated cooling model can be switched on using a flag.

[ascl:1407.002] TWODSPEC: Long-slit and optical fiber array spectra extensions for FIGARO

TWODSPEC offers programs for the reduction and analysis of long-slit and optical fiber array spectra, implemented as extensions to the FIGARO package (ascl:1203.013). The software is currently distributed as part of the Starlink software collection (ascl:1110.012). These programs are designed to do as much as possible for the user, enabling quick reduction and analysis of data; for example, LONGSLIT can fit multiple Gaussians to line profiles in batch and decides how many components to fit.

[ascl:1708.015] TWO-POP-PY: Two-population dust evolution model

TWO-POP-PY runs a two-population dust evolution model that follows the upper end of the dust size distribution and the evolution of the dust surface density profile, treating dust surface density, maximum particle size, small- and large-grain velocities, and fragmentation. It derives profiles that describe the dust-to-gas ratios and dust surface density profiles well in protoplanetary disks, as well as the radial flux of solid material rain-out.

[ascl:2210.025] tvguide: Observability by TESS

tvguide determines whether stars and galaxies are observable by TESS. It uses an object's right ascension and declination and estimates the pointing of TESS's cameras using predicted spacecraft ephemerides to determine whether, and for how long, the object is observable with TESS. tvguide returns a file with two columns: the minimum and the maximum number of sectors for which the target is observable.

[ascl:1304.015] TVD: Total Variation Diminishing code

TVD solves the magnetohydrodynamic (MHD) equations by updating the fluid variables along each direction using the flux-conservative, second-order, total variation diminishing (TVD), upwind scheme of Jin & Xin. The magnetic field is updated separately in two-dimensional advection-constraint steps. The electromotive force (EMF) is computed in the advection step using the TVD scheme, and this same EMF is used immediately in the constraint step in order to preserve ∇·B=0 without the need to store intermediate fluxes. The code is extended to three dimensions using operator splitting, and Runge-Kutta is used to get second-order accuracy in time. TVD offers high resolution per grid cell, second-order accuracy in space and time, and enforcement of the ∇·B=0 constraint to machine precision. Written in Fortran, it has no memory overhead and is fast. It is also available in a fully scalable message-passing parallel MPI implementation.
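
For readers unfamiliar with TVD schemes, here is a generic second-order, minmod-limited upwind step for 1-D linear advection; this illustrates the class of scheme, not this code's MHD implementation:

```python
import numpy as np

def minmod(a, b):
    # Slope limiter: the smaller-magnitude slope when signs agree, else zero
    return np.where(a * b > 0, np.sign(a) * np.minimum(abs(a), abs(b)), 0.0)

def tvd_step(u, c):
    """One TVD update of u for advection to the right with Courant
    number c = v*dt/dx, assuming 0 < c < 1 and periodic boundaries."""
    du = minmod(np.roll(u, -1) - u, u - np.roll(u, 1))   # limited slope
    uface = u + 0.5 * (1 - c) * du        # second-order upwind face value
    flux = c * uface                      # flux through the right face
    return u - (flux - np.roll(flux, 1))  # conservative update

# Advect a top-hat profile; the limiter keeps it free of oscillations
u = np.where(abs(np.linspace(0, 1, 200) - 0.3) < 0.1, 1.0, 0.0)
for _ in range(100):
    u = tvd_step(u, 0.5)
```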

[ascl:1907.015] TurbuStat: Turbulence statistics in spectral-line data cubes

TurbuStat implements a variety of turbulence-based statistics described in the astronomical literature and defines distance metrics for each statistic to quantitatively compare spectral-line data cubes, as well as column density, integrated intensity, or other moment maps. The software can simulate observations of fractional Brownian Motion fields, including 2-D images and optically thin H I data cubes. TurbuStat also offers multicore fast-Fourier-transform support and provides a segmented linear model for fitting lines with a break point.

[ascl:1205.004] Turbospectrum: Code for spectral synthesis

Turbospectrum is a 1D LTE spectrum synthesis code which covers 600 molecules, is fast with many lines, and uses the treatment of line broadening described by Barklem & O’Mara (1998).

[submitted] Turbospectrum_NLTE

Turbospectrum_NLTE is the latest version of TS (Turbospectrum), with NLTE capabilities. It computes stellar spectra (flux and intensities) in 1D or average stellar atmosphere models. Computing NLTE stellar spectra requires additional data, downloadable outside GitHub; see the documentation in the DOC folder.

Python wrappers are available at https://github.com/EkaterinaSe/TurboSpectrum-Wrapper/ and https://github.com/JGerbs13/TSFitPy. They allow interpolation between models and fitting of spectra to derive stellar parameters.

[ascl:1906.006] turboSETI: Python-based SETI search algorithm

TurboSETI analyzes filterbank data (frequency vs. time) for narrow band drifting signals; its main purpose is to search for signals of extraterrestrial origin. TurboSETI can search the data for hundreds of drift rates (in Hz/sec) and handles either .fil or .h5 file formats. It has several dependencies, including Blimpy (ascl:1906.002) and Astropy (ascl:1304.002).
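
A schematic of a brute-force de-Doppler search of this kind (turboSETI's real interface and algorithm are more sophisticated; the names and threshold here are illustrative): for each trial drift rate, the spectrogram rows are shifted to undo the drift and summed, and channels whose integrated power stands out are reported.

```python
import numpy as np

def drift_search(spec, drift_bins, snr_thresh=10.0):
    """spec: (ntime, nchan) power spectrogram.
    drift_bins: trial drift rates in channels per time step."""
    ntime, nchan = spec.shape
    hits = []
    for d in drift_bins:
        acc = np.zeros(nchan)
        for t in range(ntime):
            # undo a drift of d channels per time step, then integrate
            acc += np.roll(spec[t], -int(round(d * t)))
        med = np.median(acc)
        mad = 1.4826 * np.median(abs(acc - med))   # robust noise estimate
        snr = (acc - med) / mad
        for chan in np.flatnonzero(snr > snr_thresh):
            hits.append((d, chan, snr[chan]))
    return hits
```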

[ascl:1011.011] turboGL: Accurate Modeling of Weak Lensing

turboGL is a fast Mathematica code based on a stochastic approach to cumulative weak lensing. It can easily compute the lensing PDF for arbitrary halo mass distributions, selection biases, numbers of observations, halo profiles, and evolutions, making it a useful tool for studying how lensing depends on cosmological parameters and how it impacts observations.

[ascl:2110.004] TULIPS: Tool for Understanding the Lives, Interiors, and Physics of Stars

TULIPS (Tool for Understanding the Lives, Interiors, and Physics of Stars) creates diagrams of the structure and evolution of stars. It creates plots and movies based on output from the MESA stellar evolution code (ascl:1010.083). TULIPS represents stars as circles of varying size and color. The code can also visualize the size and perceived color of stars, their interior mixing and nuclear burning processes, their chemical composition, and can compare different MESA models.

[ascl:1604.012] TTVFaster: First order eccentricity transit timing variations (TTVs)

TTVFaster implements analytic formulae for transit time variations (TTVs) that are accurate to first order in the planet–star mass ratios and in the orbital eccentricities; the implementations are available in several languages, including IDL, Julia, Python and C. These formulae compare well with more computationally expensive N-body integrations in the low-eccentricity, low mass-ratio regime when applied to simulated and to actual multi-transiting Kepler planet systems.

[ascl:1404.015] TTVFast: Transit timing inversion

TTVFast efficiently calculates transit times for n-planet systems and the corresponding radial velocities. The code uses a symplectic integrator with a Keplerian interpolator for the calculation of transit times (Nesvorny et al. 2013); it is available in both C and Fortran.

[ascl:2210.010] TSRecon: Time series reconstruction method of massive astronomical catalogs

The time series reconstruction method of massive astronomical catalogs reconstructs the time series data of all celestial objects in astronomical catalogs with great accuracy. The program, which requires a Spark cluster, solves the boundary source leakage problem while ensuring accuracy, and the user can set different parameters for different data sets to filter erroneous records from the catalogs.

[ascl:1406.011] TSP: Time-Series/Polarimetry Package

TSP is an astronomical data reduction package that handles time series data and polarimetric data from a variety of different instruments, and is distributed as part of the Starlink software collection (ascl:1110.012).

[ascl:1509.005] TRUVOT: True Background Technique for the Swift UVOT Grisms

TRUVOT decontaminates Swift UVOT grism spectra for transient objects. The technique makes use of template images in a process similar to image subtraction.

[ascl:2007.019] TROVE: Theoretical ROVibrational Energies

TROVE (Theoretical ROVibrational Energies) performs variational calculations of rovibrational energies for general polyatomic molecules of arbitrary structure in isolated electronic states. The software numerically constructs the kinetic energy operator, which is represented as an expansion in terms of internal coordinates. The code is self-contained, requiring no analytical pre-derivation of the kinetic energy operator. TROVE is also general and can be used with any internal coordinates.

[ascl:2008.025] TRISTAN: TRIdimensional STANford code

TRISTAN (TRIdimensional STANford) is a fully electromagnetic code with full relativistic particle dynamics. The code simulates large-scale space plasma phenomena such as the formation of systems of galaxies. Particles that hit the boundaries are arrested there and redistributed more uniformly by having the boundaries slightly conducting, which allows electrons to recombine with ions and provides a realistic way of eliminating escaping particles from the code. Fresh particle fluxes can then be introduced independently across the boundaries. Written in 1993, this code has largely been superseded by TRISTAN-MP (ascl:1908.008).

[ascl:1908.008] TRISTAN-MP: TRIdimensional STANford - Massively Parallel code

TRISTAN-MP is a fully relativistic Particle-In-Cell (PIC) code for plasma physics computations and self-consistently solves the full set of Maxwell’s equations, along with the relativistic equations of motion for the charged particles. Fields are discretized on a finite 3D or 2D mesh, the computational grid; the code then uses time-centered and space-centered finite difference schemes to advance the equations in time via the Lorentz force equation, and to calculate spatial derivatives, so that the algorithm is second order accurate in space and time. The charges and currents derived from the particles' velocities and positions are then used as source terms to re-calculate the electromagnetic fields. TRISTAN-MP is based on the original TRISTAN code (ascl:2008.025) by O. Buneman (1993).

[ascl:1605.010] TRIPPy: Python-based Trailed Source Photometry

TRIPPy (TRailed Image Photometry in Python) uses a pill-shaped aperture, a rectangle described by three parameters (trail length, angle, and radius) to improve photometry of moving sources over that done with circular apertures. It can generate accurate model and trailed point-spread functions from stationary background sources in sidereally tracked images. Appropriate aperture correction provides accurate, unbiased flux measurement. TRIPPy requires numpy, scipy, matplotlib, Astropy (ascl:1304.002), and stsci.numdisplay; emcee (ascl:1303.002) and SExtractor (ascl:1010.064) are optional.

[ascl:1405.008] TRIPP: Time Resolved Imaging Photometry Package

Written in IDL, TRIPP performs CCD time series reduction and analysis. It provides an on-line check of the incoming frames, performs relative aperture photometry, and provides a set of time series tools, such as calculation of periodograms including false alarm probability determination, epoch folding, sine fitting, and light curve simulations.

[ascl:2207.022] triple-stability: Triple-star system stability determinator

triple-stability uses a simple form of an artificial neural network, a multi-layer perceptron, to check whether a given configuration of a triple-star system is dynamically stable. The code is written in Python and the MLP classifier can be imported into other custom Python 3 scripts.

[ascl:1210.014] TRIP: General computer algebra system for celestial mechanics

TRIP is an interactive computer algebra system that is devoted to perturbation series computations, and specially adapted to celestial mechanics. Its development started in 1988, as an upgrade of the special purpose FORTRAN routines elaborated by J. Laskar for the demonstration of the chaotic behavior of the Solar System. TRIP is a mature and efficient tool for handling multivariate generalized power series, and embeds two kernels, a symbolic and a numerical kernel. This numerical kernel communicates with Gnuplot or Grace to plot the graphics and allows one to plot the numerical evaluation of symbolic objects.

[ascl:2107.028] TRINITY: Dark matter halos, galaxies and supermassive black holes empirical model

TRINITY statistically connects dark matter halos, galaxies and supermassive black holes (SMBHs) from z=0-10. Constrained by multiple galaxy (0 < z < 10) and SMBH datasets (0 < z < 6.5), the empirical model finds the posterior probability distributions of the halo-galaxy-SMBH connection and SMBH properties, all of which are allowed to evolve with redshift. TRINITY can predict many observational data, such as galaxy stellar mass functions and quasar luminosity functions, as well as underlying galaxy and SMBH properties, including average SMBH Eddington ratios. These predictions are made by different code files. There are two types of prediction codes: the first generates observable data given an input redshift or redshift interval; the second generates galaxy or SMBH properties as a function of host halo mass and redshift.

[ascl:1508.009] Trilogy: FITS image conversion software

Trilogy automatically scales and combines FITS images to produce color or grayscale images using Python scripts. The user assigns images to each color channel (RGB) or a single image to grayscale luminosity. Trilogy determines the intensity scaling automatically and independently in each channel to display faint features without saturating bright features. Each channel's scaling is determined based on a sample of the image (or summed images) and two input parameters. One parameter sets the output luminosity of "the noise," currently determined as 1-sigma above the sigma-clipped mean. The other parameter sets what fraction of the data (if any) in the sample region should be allowed to saturate. Default values for these parameters (0.15% and 0.001%, respectively) work well, but the user is able to adjust them. The scaling is accomplished using the logarithmic function y = a log(kx + 1) clipped between 0 and 1, where a and k are constants determined based on the data and desired scaling parameters as described above.
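
The two constraints described above determine a and k; a small sketch solving for them numerically (the bracketing interval and example image levels are assumptions, not Trilogy's internals):

```python
import numpy as np
from scipy.optimize import brentq

def trilogy_scale(x, a, k):
    """The y = a*log(kx + 1) stretch quoted above, clipped to [0, 1]."""
    return np.clip(a * np.log(k * np.asarray(x) + 1.0), 0.0, 1.0)

def solve_ak(x_noise, y_noise, x_sat):
    """Solve the two constraints: the noise level x_noise maps to output
    luminosity y_noise, and the saturation level x_sat maps to 1.
    A root exists when x_noise/x_sat < y_noise < 1, since the ratio below
    rises monotonically from x_noise/x_sat toward 1 as k grows."""
    g = lambda k: np.log(k * x_noise + 1.0) / np.log(k * x_sat + 1.0) - y_noise
    k = brentq(g, 1e-8, 1e8)
    a = 1.0 / np.log(k * x_sat + 1.0)
    return a, k

# Example with made-up levels: noise at 0.1 mapped to 0.15, saturation at 10
a, k = solve_ak(x_noise=0.1, y_noise=0.15, x_sat=10.0)
```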

[ascl:1612.019] Trident: Synthetic spectrum generator

Trident creates synthetic absorption-line spectra from astrophysical hydrodynamics simulations. It uses the yt package (ascl:1011.022) to read in simulation datasets and extends it to provide realistic synthetic observations appropriate for studies of the interstellar, circumgalactic, and intergalactic media.

[ascl:2002.004] triceratops: Candidate exoplanet rating tool

triceratops (Tool for Rating Interesting Candidate Exoplanets and Reliability Analysis of Transits Originating from Proximate Stars) validates planet candidates from the Transiting Exoplanet Survey Satellite (TESS). The code calculates the probabilities of a wide range of transit-producing scenarios using the primary transit of the planet candidate and preexisting knowledge of its host and nearby stars. It then uses the known properties of these stars to calculate star-specific priors for each scenario with estimates of stellar multiplicity and planet occurrence rates.

[ascl:2309.001] TRES: TRiple Evolution Simulation package

TRES simulates hierarchical triple systems with stellar and planetary components, including stellar evolution, stellar winds, tides, general relativistic effects, mass transfer, and three-body dynamics. It combines stellar evolution and interactions with three-body dynamics in a self-consistent way. The code includes the effects of common-envelope evolution, circularized stable mass transfer, tides, gravitational wave emission and up-to-date stellar evolution through SeBa (ascl:1201.003). Other stellar evolution codes, such as SSE (ascl:1303.015), can also be used. TRES is written in the AMUSE (ascl:1107.007) software framework.

[ascl:1911.021] TreeFrog: Construct halo merger trees and compare halo catalogs

TreeFrog reads in particle ID information from various structure catalogs and cross-matches the catalogs, assuming that particle IDs are unique and constant across snapshots. Though it is built as a cross correlator (in that it can match particles across several different catalogs), its principal use is as a halo merger tree builder. TreeFrog produces links between objects found at different snapshots (or catalogs) and uses several possible functions to evaluate the merit of a link between one object at a given snapshot (or in a given catalog) and another object in a previous snapshot (or different catalog). It can also produce a full graph. The code utilizes MPI and OpenMP. It is optimized for reading VELOCIraptor (ascl:1911.020) output but can also read output from other structure finders such as AHF (ascl:1102.009).

[ascl:1508.007] TreeCorr: Two-point correlation functions

TreeCorr efficiently computes two-point correlation functions. It can compute correlations of regular number counts, weak lensing shears, or scalar quantities such as convergence or CMB temperature fluctuations. Two-point correlations may be auto-correlations or cross-correlations, including any combination of shear, kappa, and counts. Two-point functions can be done with correct curved-sky calculation using RA, Dec coordinates, on a Euclidean tangent plane, or in 3D using RA, Dec and a distance. The front end is written in Python, which can be used as a Python module or as a standalone executable using configuration files; the actual computation of the correlation functions is done in C++ using ball trees (similar to kd trees), making the calculation extremely efficient, and when available, OpenMP is used to run in parallel on multi-core machines.
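
A short example of the documented TreeCorr workflow for a count-count autocorrelation with the Landy-Szalay estimator; the coordinate arrays below are random placeholders for real data and random catalogs:

```python
import numpy as np
import treecorr

rng = np.random.default_rng(42)
data = treecorr.Catalog(ra=rng.uniform(0, 10, 20000),
                        dec=rng.uniform(-5, 5, 20000),
                        ra_units='deg', dec_units='deg')
rand = treecorr.Catalog(ra=rng.uniform(0, 10, 40000),
                        dec=rng.uniform(-5, 5, 40000),
                        ra_units='deg', dec_units='deg')

dd = treecorr.NNCorrelation(min_sep=1., max_sep=60., nbins=12,
                            sep_units='arcmin')
rr = treecorr.NNCorrelation(min_sep=1., max_sep=60., nbins=12,
                            sep_units='arcmin')
dd.process(data)   # ball-tree pair counting, OpenMP-parallel in C++
rr.process(rand)
xi, varxi = dd.calculateXi(rr=rr)   # xi(theta) in the 12 angular bins
```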

[ascl:1412.011] TraP: Transients discovery pipeline for image-plane surveys

The TraP is a pipeline for detecting and responding to transient and variable sources in a stream of astronomical images. Images are initially processed using a pure-Python source-extraction package, PySE (ascl:1805.026), which is bundled with the TraP. Source positions and fluxes are then loaded into a SQL database for association and variability detection. The database structure allows for estimation of past upper limits on newly detected sources, and for forced fitting of previously detected sources which have since dropped below the blind-extraction threshold. Developed with LOFAR data in mind, the TraP has been used with data from other radio observatories.

[ascl:2001.002] TRANSPHERE: 1-D spherical continuum radiative transfer

TRANSPHERE is a simple dust continuum radiative transfer code for spherically symmetric circumstellar envelopes. It handles absorption and re-emission and computes the dust temperature self-consistently; it does not, however, deal with scattering. TRANSPHERE uses a variable Eddington factor method for the radiative transfer. The RADMC code (ascl:1108.016) is more versatile, but for a spherically symmetric problem in which scattering is not of much concern, it may be easier to use a simple code such as TRANSPHERE.

Please note that this code has not been updated since 2006.

[ascl:1703.010] TransitSOM: Self-Organizing Map for Kepler and K2 transits

A self-organizing map (SOM) can be used to identify planetary candidates from Kepler and K2 datasets with accuracies near 90% in distinguishing known Kepler planets from false positives. TransitSOM classifies a Kepler or K2 lightcurve using a SOM created and pre-trained using PyMVPA (ascl:1703.009). It includes functions for users to create their own SOMs.

[ascl:2103.010] TransitFit: Exoplanet transit fitting package for multi-telescope datasets

TransitFit fits exoplanetary transit light-curves for transmission spectroscopy studies. The code uses nested sampling for efficient and robust multi-epoch, multi-wavelength fitting of transit data obtained from one or more telescopes. TransitFit allows per-telescope detrending to be performed simultaneously with parameter fitting, including the use of user-supplied detrending algorithms. Host limb darkening can be fitted either independently ("uncoupled") for each filter or combined ("coupled") using prior conditioning from the PHOENIX stellar atmosphere models. For this, TransitFit uses the Limb Darkening Toolkit (ascl:1510.003) together with filter profiles, including user-supplied filter profiles.

[ascl:1704.008] Transit: Radiative-transfer code for planetary atmospheres

Transit calculates the transmission or emission spectrum of a planetary atmosphere with application to extrasolar-planet transit and eclipse observations, respectively. It computes the spectra by solving the one-dimensional line-by-line radiative-transfer equation for an atmospheric model.

[ascl:1611.008] Transit Clairvoyance: Predicting multiple-planet systems for TESS

Transit Clairvoyance uses Artificial Neural Networks (ANNs) to predict the most likely short period transiters to have additional transiters, which may double the discovery yield of the TESS (Transiting Exoplanet Survey Satellite). Clairvoyance is a simple 2-D interpolant that takes in the number of planets in a system with period less than 13.7 days, as well as the maximum radius amongst them (in Earth radii) and orbital period of the planet with maximum radius (in Earth days) in order to predict the probability of additional transiters in this system with period greater than 13.7 days.

[ascl:1106.014] Transit Analysis Package (TAP and autoKep): IDL Graphical User Interfaces for Extrasolar Planet Transit Photometry

We present an IDL graphical user interface-driven software package designed for the analysis of extrasolar planet transit light curves. The Transit Analysis Package (TAP) software uses Markov Chain Monte Carlo (MCMC) techniques to fit light curves using the analytic model of Mandel and Agol (2002). The package incorporates a wavelet based likelihood function developed by Carter and Winn (2009) which allows the MCMC to assess parameter uncertainties more robustly than classic chi-squared methods by parameterizing uncorrelated "white" and correlated "red" noise. The software is able to simultaneously analyze multiple transits observed in different conditions (instrument, filter, weather, etc). The graphical interface allows for the simple execution and interpretation of Bayesian MCMC analysis tailored to a user's specific data set and has been thoroughly tested on ground-based and Kepler photometry. AutoKep provides a similar GUI for the preparation of Kepler MAST archive data for analysis by TAP or any other analysis software. This paper describes the software release and provides instructions for its use.

[ascl:1501.011] transfer: The Sloan Digital Sky Survey Data Transfer Infrastructure

The Sloan Digital Sky Survey (SDSS) produces large amounts of data daily. transfer, written in Python, provides the effective automation needed for daily data transfer operations and management and operates essentially free of human intervention. This package has been tested and used successfully for several years.

[ascl:2212.023] Tranquillity: Creating black hole spin divergence plots

Tranquillity creates an observing screen looking toward a black hole-accretion disk system, seeks the object, then searches for and locates its contour. Subsequently, it attempts to locate the first Einstein "echo" ring and determine its position. Finally, it collates the retrieved information and draws conclusions; these include the inclination of the accretion disk plane relative to the line of sight, and the medians of the main disk and of the first echo. The displacement, and thus the divergence, of the latter two is the information required to construct the divergence plots. Other programs can later automatically read these plots and provide estimations of the central black hole spin.

[ascl:2012.012] TRAN_K2: Planetary transit search

TRAN_K2 searches for periodic transits in the photometric time series of the Kepler K2 mission. The search is made by considering stellar variability and instrumental systematics. TRAN_K2 is written in Fortran 77 and has a single input parameter file that can be edited by the user depending on the type of run and parameter ranges to be used.

[ascl:1601.001] TRADES: TRAnsits and Dynamics of Exoplanetary Systems

TRADES (TRAnsits and Dynamics of Exoplanetary Systems) simultaneously fits observed radial velocities and transit times data to determine the orbital parameters of exoplanetary systems from observational data. It uses a dynamical simulator for N-body systems that also fits the available data during the orbital integration and determines the best combination of the orbital parameters using grid search, χ2 minimization, genetic algorithms, particle swarm optimization, and bootstrap analysis.

[ascl:1304.011] TPZ: Trees for Photo-Z

TPZ, a parallel code written in Python, produces robust and accurate photometric redshift PDFs using prediction trees and random forests. The code also produces ancillary information about the sample used, such as unbiased prior error estimates (giving an estimation of performance) and a ranking of the importance of variables, as well as a map of performance indicating where extra training data is needed to improve overall performance. It is designed to be easy to use and a tutorial is available.
