ASCL.net

Astrophysics Source Code Library

Making codes discoverable since 1999

Browsing Codes

Results 251-500 of 1954 (1927 ASCL, 27 submitted)

[ascl:1110.006] STIFF: Converting Scientific FITS Images to TIFF

STIFF is a program that converts scientific FITS images to the more popular TIFF format for illustration purposes. Most FITS readers and converters do not do a proper job of converting FITS image data to 8 bits. 8-bit images stored in JPEG, PNG or TIFF files have their intensities implicitly stored in a non-linear way, yet most current FITS image viewers and converters simply rescale the input pixel values linearly, giving the user an incorrect translation of the FITS image content. A first consequence is that people working on astronomical images usually have to apply narrow intensity cuts or square-root or logarithmic intensity transformations to actually see something in their deep-sky images. A less obvious consequence is that colors obtained by combining images processed this way are not consistent across such a large range of surface brightnesses. Though other software generally offers the user a choice of nonlinear transformations to make the faint features stand out more clearly, with the limited selection of choices provided, colors will not be accurately rendered and some manual tweaking will be necessary. The purpose of STIFF is to produce beautiful pictures in an automatic and consistent way.
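As a rough illustration of the issue, the sketch below (a hypothetical helper, not part of STIFF) maps linear FITS pixel values to 8-bit levels through intensity cuts and a gamma-style non-linear transfer curve instead of a plain linear rescale:

    import numpy as np

    def to_8bit(data, low=None, high=None, gamma=2.2):
        """Map linear FITS pixel values to 8-bit levels with a non-linear
        (gamma-compressed) transfer curve, rather than a plain linear rescale.
        `low`/`high` are intensity cuts; `gamma` is the display gamma."""
        if low is None:
            low = np.percentile(data, 0.1)
        if high is None:
            high = np.percentile(data, 99.9)
        x = np.clip((data - low) / (high - low), 0.0, 1.0)
        return np.uint8(np.round(255.0 * x ** (1.0 / gamma)))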

[ascl:1810.014] STiC: Stockholm inversion code

STiC is an MPI-parallel non-LTE inversion code for observed full-Stokes data. The code processes lines from multiple atoms in non-LTE, including partial redistribution (PRD) of scattered photons in angle and frequency, and can be used with model atmospheres that have a complex depth stratification without introducing artifacts.

[submitted] stginga: Ginga for STScI

stginga customizes Ginga to aid data analysis for the data supported by STScI (e.g., HST or JWST). For instance, it provides plugins and configuration files that understand HST and JWST data products.

[ascl:1306.009] STF: Structure Finder

STF is a general structure finder designed to find halos, subhaloes, and tidal debris in N-body simulations. The current version is designed to read in particle data (that is, SPH N-body data), but a simple modification of the I/O can have it read grid data from grid-based codes.

[ascl:1805.006] StePS: Stereographically Projected Cosmological Simulations

StePS (Stereographically Projected Cosmological Simulations) compactifies the infinite spatial extent of the Universe into a finite sphere with isotropic boundary conditions to simulate the evolution of the large-scale structure. This eliminates the need for periodic boundary conditions, which are a numerical convenience unsupported by observation and which modify the law of force on large scales in an unrealistic fashion. StePS uses stereographic projection for space compactification and a naive O(N^2) force calculation; this arrives at a correlation function of the same quality more quickly than standard (tree or P3M) algorithms with similar spatial and mass resolution. The O(N^2) force calculation is easy to adapt to modern graphics cards, hence StePS can serve as a high-speed prediction tool for modern large-scale surveys.
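The naive direct-summation force calculation mentioned above can be sketched in a few lines of NumPy (an illustrative O(N^2) kernel, not StePS's own GPU implementation):

    import numpy as np

    def direct_accelerations(pos, mass, soft=1e-3):
        """Naive O(N^2) direct-summation gravitational accelerations (G = 1),
        the kind of pairwise force loop that ports easily to GPUs."""
        dx = pos[None, :, :] - pos[:, None, :]      # pairwise separations x_j - x_i
        r2 = (dx ** 2).sum(axis=-1) + soft ** 2     # softened squared distances
        inv_r3 = r2 ** -1.5
        np.fill_diagonal(inv_r3, 0.0)               # exclude self-interaction
        return (dx * (mass[None, :] * inv_r3)[:, :, None]).sum(axis=1)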

[ascl:1809.014] stepped_luneburg: Stack-based ray tracing code to model a stepped Luneburg lens

stepped_luneburg investigates the scattered light properties of a Luneburg lens approximated as a series of concentric shells with discrete refractive indices. The optical Luneburg lens has promising applications for low-cost, continuous all-sky monitoring to obtain transit light curves of bright, nearby stars. This code implements a stack-based algorithm that tracks all reflected and refracted rays generated at each optical interface of the lens as described by Snell's law. The Luneburg lens model parameters, such as number of lens layers, the power-law that describes the refractive indices, the number of incident rays, and the initial direction of the incident wavefront can be altered to optimize lens performance. The stepped_luneburg module can be imported within the Python environment or used with scripting, and it is accompanied by two other modules, enc_int and int_map, that help the user to determine the resolving power of the lens and the strength of scattered light haloes for the purpose of quality assessment.
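As an illustration of the per-interface optics involved, the following hypothetical helper applies the vector form of Snell's law at a single shell boundary (not code from stepped_luneburg itself):

    import numpy as np

    def refract(d, n, n1, n2):
        """Vector form of Snell's law: refract unit ray direction `d` at an
        interface with unit normal `n`, going from refractive index n1 to n2.
        Returns None when total internal reflection occurs."""
        cos_i = -np.dot(n, d)
        eta = n1 / n2
        k = 1.0 - eta ** 2 * (1.0 - cos_i ** 2)
        if k < 0.0:
            return None                      # total internal reflection
        return eta * d + (eta * cos_i - np.sqrt(k)) * n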

[ascl:1901.012] stellarWakes: Dark matter subhalo searches using stellar kinematic data

stellarWakes uses stellar kinematic data to search for dark matter (DM) subhalos through their gravitational perturbations to the stellar phase-space distribution.

[ascl:1303.028] Stellarics: Inverse Compton scattering from stellar heliospheres

Cosmic ray electrons scatter on the photon fields around stars, including the sun, to create gamma rays by the inverse Compton effect. Stellarics computes the spectrum and angular distribution of this emission. The software also includes general-purpose routines for inverse Compton scattering on a given electron spectrum, for example for interstellar or astrophysical source modelling.

[ascl:1505.009] StellaR: Stellar evolution tracks and isochrones tools

stellaR accesses and manipulates publicly available stellar evolutionary tracks and isochrones from the Pisa low-mass database. It retrieves and plots the required calculations from CDS, constructs by interpolation tracks or isochrones with compositions different from those available in the database, constructs isochrones for ages not included in the database, and extracts relevant evolutionary points from tracks or isochrones.

[ascl:1108.013] STELLA: Multi-group Radiation Hydrodynamics Code

STELLA is a one-dimensional multi-group radiation hydrodynamics code. STELLA incorporates implicit hydrodynamics coupled to a multi-group non-equilibrium radiative transfer for modeling SN II-L light curves. The non-equilibrium description of radiation is crucial for this problem since the presupernova envelope may be of low mass and very dilute. STELLA implicitly treats time dependent equations of the angular moments of intensity averaged over a frequency bin. Local thermodynamic equilibrium is assumed to determine the ionization levels of materials.

[ascl:1108.018] STECKMAP: STEllar Content and Kinematics via Maximum A Posteriori likelihood

STECKMAP stands for STEllar Content and Kinematics via Maximum A Posteriori likelihood. It is a tool for interpreting galaxy spectra in terms of their stellar populations through the derivation of their star formation history, age-metallicity relation, kinematics and extinction. The observed spectrum is projected onto a temporal sequence of models of single stellar populations, so as to determine a linear combination of these models that best fits the observed spectrum. The weights of the various components of this linear combination indicate the stellar content of the population. This procedure is regularized using various penalizing functions. The principles of the method are detailed in Ocvirk et al. 2006.

[ascl:1206.006] statpl: Goodness-of-fit for power-law distributed data

statpl estimates the parameter of power-law distributed data and calculates goodness-of-fit tests for them. Many objects studied in astronomy follow a power-law distribution function (DF), for example the masses of stars or star clusters. Such data is often analyzed by generating a histogram and fitting a straight line to it. The parameters obtained in this way can be severely biased, and the properties of the underlying DF, such as its shape or a possible upper limit, are difficult to extract. statpl is an (effectively) bias-free estimator for the exponent and the upper limit.
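For context, the standard maximum-likelihood estimator for a power-law exponent (Clauset et al. 2009), which avoids the histogram-fitting bias described above, can be written as follows; this is an illustrative sketch, not statpl's own implementation:

    import numpy as np

    def mle_powerlaw_exponent(x, xmin):
        """Maximum-likelihood estimate of alpha for p(x) ~ x^(-alpha), x >= xmin.
        Unlike straight-line fits to a histogram, this estimator is
        asymptotically unbiased; returns (alpha, standard error)."""
        x = np.asarray(x, dtype=float)
        x = x[x >= xmin]
        n = x.size
        alpha = 1.0 + n / np.sum(np.log(x / xmin))
        sigma = (alpha - 1.0) / np.sqrt(n)
        return alpha, sigma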

[ascl:1704.004] STATCONT: Statistical continuum level determination method for line-rich sources

STATCONT determines the continuum emission level in line-rich spectral data by inspecting the intensity distribution of a given spectrum by using different statistical approaches. The sigma-clipping algorithm provides the most accurate continuum level determination, together with information on the uncertainty in its determination; this uncertainty is used to correct the final continuum emission level. In general, STATCONT obtains accuracies of < 10 % in the continuum determination, and < 5 % in most cases. The main products of the software are the continuum emission level, together with its uncertainty, and data cubes containing only spectral line emission, i.e. continuum-subtracted data cubes. STATCONT also includes the option to estimate the spectral index or variation of the continuum emission with frequency.
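The core sigma-clipping idea can be sketched as follows (a simplified illustration, not the actual STATCONT algorithm):

    import numpy as np

    def sigma_clip_continuum(spectrum, kappa=3.0, max_iter=20):
        """Iteratively reject channels far from the running median until only
        continuum-like channels remain; returns the continuum level and a
        rough uncertainty."""
        data = np.asarray(spectrum, dtype=float)
        mask = np.isfinite(data)
        for _ in range(max_iter):
            med, std = np.median(data[mask]), np.std(data[mask])
            new_mask = mask & (np.abs(data - med) < kappa * std)
            if new_mask.sum() == mask.sum():
                break
            mask = new_mask
        return np.median(data[mask]), np.std(data[mask])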

[ascl:1805.010] StarSmasher: Smoothed Particle Hydrodynamics code for smashing stars and planets

Smoothed Particle Hydrodynamics (SPH) is a Lagrangian particle method that approximates a continuous fluid as discrete nodes, each carrying various parameters such as mass, position, velocity, pressure, and temperature. In an SPH simulation the resolution scales with the particle density; StarSmasher is able to handle both equal-mass and equal number-density particle models. StarSmasher solves for hydro forces by calculating the pressure for each particle as a function of the particle's properties - density, internal energy, and internal properties (e.g. temperature and mean molecular weight). The code implements variational equations of motion and libraries to calculate the gravitational forces between particles using direct summation on NVIDIA graphics cards. Using a direct summation instead of a tree-based algorithm for gravity increases the accuracy of the gravity calculations at the cost of speed. The code uses a cubic spline for the smoothing kernel and an artificial viscosity prescription coupled with a Balsara Switch to prevent unphysical interparticle penetration. The code also implements an artificial relaxation force to the equations of motion to add a drag term to the calculated accelerations during relaxation integrations. Initially called StarCrash, StarSmasher was developed originally by Rasio.

[ascl:1703.005] starsense_algorithms: Performance evaluation of various star sensors

The Matlab starsense_algorithms package evaluates the performance of various star sensors through the implementation of centroiding, geometric voting and QUEST algorithms. The physical parameters of a star sensor are parametrized and by changing these parameters, performance estimators such as sky coverage, memory requirement, and timing requirements can be estimated for the selected star sensor.

[ascl:1107.008] STARS: A Stellar Evolution Code

We have developed a detailed stellar evolution code capable of following the simultaneous evolution of both stars in a binary system, together with their orbital properties. To demonstrate the capabilities of the code we investigate potential progenitors for the Type IIb supernova 1993J, which is believed to have been an interacting binary system prior to its primary exploding. We use our detailed binary stellar evolution code to model this system to determine the possible range of primary and secondary masses that could have produced the observed characteristics of this system, with particular reference to the secondary. Using the luminosities and temperatures for both stars (as determined by Maund et al. 2004) and the remaining mass of the hydrogen envelope of the primary at the time of explosion, we find that if mass transfer is 100 per cent efficient the observations can be reproduced by a system consisting of a 15 solar mass primary and a 14 solar mass secondary in an orbit with an initial period of 2100 days. With a mass transfer efficiency of 50 per cent, a more massive system consisting of a 17 solar mass primary and a 16 solar mass secondary in an initial orbit of 2360 days is needed. We also investigate some of the uncertainties in the evolution, including the effects of tidal interaction, convective overshooting and thermohaline mixing.

[ascl:1810.005] STARRY: Analytic computation of occultation light curves

STARRY computes light curves for various applications in astronomy: transits and secondary eclipses of exoplanets, light curves of eclipsing binaries, rotational phase curves of exoplanets, light curves of planet-planet and planet-moon occultations, and more. By modeling celestial body surface maps as sums of spherical harmonics, STARRY does all this analytically and is therefore fast, stable, and differentiable. Coded in C++ but wrapped in Python, STARRY is easy to install and use.

[ascl:1609.002] StarPy: Quenched star formation history parameters of a galaxy using MCMC

StarPy derives the quenching star formation history (SFH) of a single galaxy through the Bayesian Markov Chain Monte Carlo method, using the emcee code (ascl:1303.002). The sample function implements the emcee EnsembleSampler function for the galaxy colors input. Burn-in is run and calculated for the length specified before the sampler is reset and then run for the number of steps specified. StarPy provides the ability to use the look-up tables provided or to create your own.
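A minimal sketch of the burn-in/reset/production pattern described above, using emcee 3's interface; the two-parameter quenching model, parameter names, and data values here are placeholders, not StarPy's actual likelihood:

    import numpy as np
    import emcee

    def log_prob(theta, obs_colour, obs_err):
        """Hypothetical log-posterior for two quenching-SFH parameters
        (quenching time t_q, exponential rate tau); a real model would
        predict galaxy colours from the SFH here."""
        t_q, tau = theta
        if not (0.0 < t_q < 13.8 and 0.0 < tau < 5.0):
            return -np.inf                    # flat priors
        model_colour = t_q / (1.0 + tau)      # placeholder model
        return -0.5 * ((model_colour - obs_colour) / obs_err) ** 2

    ndim, nwalkers = 2, 50
    p0 = np.random.uniform([1.0, 0.1], [13.0, 4.0], size=(nwalkers, ndim))
    sampler = emcee.EnsembleSampler(nwalkers, ndim, log_prob, args=(1.2, 0.05))
    state = sampler.run_mcmc(p0, 200)         # burn-in
    sampler.reset()                           # discard burn-in before production
    sampler.run_mcmc(state, 1000)             # production run
    samples = sampler.get_chain(flat=True)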

[ascl:1406.020] STARMAN: Stellar photometry and image/table handling

STARMAN is a stellar photometry package designed for the reduction of data from imaging systems. Its main components are crowded-field photometry programs, aperture photometry programs, a star finding program, and a CCD reduction program.

Image and table handling are served by a large number of programs which have a general use in photometry and other types of work. The package is a coherent whole, for use in the entire process of stellar photometry from raw images to the final standard-system magnitudes and their plotting as color-magnitude and color-color diagrams. It was distributed as part of the Starlink software collection (ascl:1110.012).

[ascl:1110.012] Starlink: Multi-purpose Astronomy Software

Starlink has many applications within it to meet a variety of needs; it includes:

  • a general astronomical image viewer;
  • data reduction tools, including programs for reducing CCD-like data;
  • general-purpose data-analysis and visualisation tools;
  • image processing, data visualisation, and manipulating NDF components;
  • a flexible and powerful library for handling World Coordinate Systems (partly based on the SLALIB library);
  • a library of routines intended to make accurate and reliable positional-astronomy applications easier to write; and
  • a Hierarchical Data System that is portable and flexible for storing and retrieving data.

[ascl:1411.022] Starlink Figaro: Starlink version of the Figaro data reduction software package

Starlink Figaro is an independently-maintained fork of Figaro (ascl:1203.013) that runs in the Starlink software environment (ascl:1110.012). It is a general-purpose data reduction package targeted mainly at optical/IR spectroscopy. It uses the NDF data format and the ADAM libraries for parameters and messaging.

[ascl:1108.006] STARLIGHT: Spectral Synthesis Code

The study of stellar populations in galaxies is entering a new era with the availability of large and high quality databases of both observed galactic spectra and state-of-the-art evolutionary synthesis models. The power of spectral synthesis can be investigated as a means to estimate physical properties of galaxies. Spectral synthesis is nothing more than the decomposition of an observed spectrum in terms of a superposition of a base of simple stellar populations of various ages and metallicities, producing astrophysically interesting output such as the star-formation and chemical enrichment histories of a galaxy, its extinction and velocity dispersion. This is what the STARLIGHT spectral synthesis code does.

[ascl:1010.076] Starlab: A Software Environment for Collisional Stellar Dynamics

Traditionally, a simulation of a dense stellar system required choosing an initial model, running an integrator, and analyzing the output. Almost all of the effort went into writing a clever integrator that could handle binaries, triples and encounters between various multiple systems efficiently. Recently, the scope and complexity of these simulations has increased dramatically, for three reasons: 1) the sheer size of the data sets, measured in Terabytes, make traditional 'awking and grepping' of a single output file impractical; 2) the addition of stellar evolution data brings qualitatively new challenges to the data reduction; 3) increased realism of the simulations invites realistic forms of 'SOS': Simulations of Observations of Simulations, to be compared directly with observations. We are now witnessing a shift toward the construction of archives as well as tailored forms of visualization including the use of virtual reality simulators and planetarium domes, and a coupling of both with budding efforts in constructing virtual observatories. This review describes these new trends, presenting Starlab as the first example of a full software environment for realistic large-scale simulations of dense stellar systems.

[ascl:1505.007] Starfish: Robust spectroscopic inference tools

Starfish is a set of tools used for spectroscopic inference. It robustly determines stellar parameters using high resolution spectral models and uses Markov Chain Monte Carlo (MCMC) to explore the full posterior probability distribution of the stellar parameters. Additional potential applications include other types of spectra, such as unresolved stellar clusters or supernovae spectra.

[ascl:1204.008] StarFISH: For Inferring Star-formation Histories

StarFISH is a suite of programs designed to determine the star formation history (SFH) of a stellar population, given multicolor stellar photometry and a library of theoretical isochrones. It constructs a library of synthetic color-magnitude diagrams from the isochrones, which includes the effects of extinction, photometric errors and completeness, and binarity. A minimization routine is then used to determine the linear combination of synthetic CMDs that best matches the observed photometry. The set of amplitudes modulating each synthetic CMD describes the star formation history of the observed stellar population.

[ascl:0011.001] StarFinder: A code for stellar field analysis

StarFinder is an IDL code for the deep analysis of stellar fields, designed for Adaptive Optics well-sampled images with high and low Strehl ratio. The Point Spread Function is extracted directly from the frame, to take into account the actual structure of the instrumental response and the atmospheric effects. The code is written in IDL language and organized in the form of a self-contained widget-based application, provided with a series of tools for data visualization and analysis. A description of the method and some applications to Adaptive Optics data are presented.

[ascl:1010.074] StarCrash: 3-d Evolution of Self-gravitating Fluid Systems

StarCrash is a parallel Fortran code based on Smoothed Particle Hydrodynamics (SPH) techniques that calculates the 3-d evolution of self-gravitating fluid systems. The code is particularly suited to the study of stellar interactions, such as mergers of binary star systems and stellar collisions. The StarCrash code comes with several important features, including:

  • Several routines which construct the initial conditions appropriate to a wide variety of physical systems
  • An efficient parallel neighbor-finding algorithm for calculating hydrodynamic quantities
  • A parallel gravitational field solver based on FFT convolution techniques, which uses the FFTW software libraries
  • Relaxation Techniques for single stars and synchronized binaries
  • Three different artificial viscosity treatments to calculate the thermodynamic evolution of the matter
  • An optional gravitational radiation back-reaction treatment, which calculates the damping force from gravity wave losses to lowest relativistic order in a spatially accurate way

[ascl:1104.003] Starburst99: Synthesis Models for Galaxies with Active Star Formation

Starburst99 is a comprehensive set of model predictions for spectrophotometric and related properties of galaxies with active star formation. The models are presented in a homogeneous way for five metallicities between Z = 0.040 and 0.001 and three choices of the initial mass function. The age coverage is 10^6 to 10^9 yr. Spectral energy distributions are used to compute colors and other quantities.

[ascl:1805.009] STARBLADE: STar and Artefact Removal with a Bayesian Lightweight Algorithm from Diffuse Emission

STARBLADE (STar and Artefact Removal with a Bayesian Lightweight Algorithm from Diffuse Emission) separates superimposed point-like sources from a diffuse background by imposing physically motivated models as prior knowledge. The algorithm can also be used on noisy and convolved data, though performing a proper reconstruction including a deconvolution prior to the application of the algorithm is advised; the algorithm could also be used within a denoising imaging method. STARBLADE learns the correlation structure of the diffuse emission and takes it into account to determine the occurrence and strength of a superimposed point source.

[ascl:1111.010] Starbase Data Tables: An ASCII Relational Database for Unix

Database management is an increasingly important part of astronomical data analysis. Astronomers need easy and convenient ways of storing, editing, filtering, and retrieving data about data. Commercial databases do not provide good solutions for many of the everyday and informal types of database access astronomers need. The Starbase database system with simple data file formatting rules and command line data operators has been created to answer this need. The system includes a complete set of relational and set operators, fast search/index and sorting operators, and many formatting and I/O operators. Special features are included to enhance the usefulness of the database when manipulating astronomical data. The software runs under UNIX, MSDOS and IRAF.

[ascl:1801.003] Stan: Statistical inference

Stan facilitates statistical inference at the frontiers of applied statistics and provides both a modeling language for specifying complex statistical models and a library of statistical algorithms for computing inferences with those models. These components are exposed through interfaces in environments such as R, Python, and the command line.

[ascl:1105.012] Stagger: MHD Method for Modeling Star Formation

Stagger is an astrophysical MHD code actively used to model star formation. It is equipped with a multi-frequency radiative transfer module and a comprehensive equation of state module that includes a large number of atomic and molecular species, to be able to compute realistic 3-D models of the near-surface layers of stars. The current version of the code allows a discretization that explicitly conserves mass, momentum, energy, and magnetic flux. The tensor formulation of the viscosity ensures that the viscous force is insensitive to the coordinate system orientation, thereby avoiding artificial grid-alignment.

[ascl:1901.006] ssos: Solar system objects detection pipeline

The ssos pipeline detects and identifies known and unknown Solar System Objects (SSOs) in astronomical images. ssos requires at least 3 images with overlapping field-of-views in the sky taken within a reasonable amount of time (e.g., 2 hours, 1 night). SSOs are detected mainly by judging the apparent motion of all sources in the images. The pipeline serves as a wrapper for the SExtractor (ascl:1010.064) and SCAMP (ascl:1010.063) software suites and allows different source extraction strategies to be chosen. All sources in the images are subject to a highly configurable filter pipeline. ssos is a versatile, light-weight, and easy-to-use software for surveys or PI-observation campaigns lacking a dedicated SSO detection pipeline.

[ascl:1807.032] SSMM: Slotted Symbolic Markov Modeling for classifying variable star signatures

SSMM (Slotted Symbolic Markov Modeling) reduces time-domain stellar variable observations to classify stellar variables. The method can be applied to both folded and unfolded data, and does not require time-warping for waveform alignment. Written in Matlab, the performance of the supervised classification code is quantifiable and consistent, and the rate at which new data is processed is dependent only on the computational processing power available.

[ascl:1303.015] SSE: Single Star Evolution

SSE is a rapid single-star evolution (SSE) code; these analytical formulae cover all phases of evolution from the zero-age main-sequence up to and including remnant phases. It is valid for masses in the range 0.1-100 Msun and metallicity can be varied. The SSE package contains a prescription for mass loss by stellar winds. It also follows the evolution of rotational angular momentum for the star.

[ascl:1705.005] SPTCLASS: SPecTral CLASSificator code

SPTCLASS assigns semi-automatic spectral types to a sample of stars. The main code includes three spectral classification schemes: the first one is optimized to classify stars in the mass range of TTS (K5 or later, hereafter LATE-type scheme); the second one is optimized to classify stars in the mass range of IMTTS (F late to K early, hereafter Gtype scheme), and the third one is optimized to classify stars in the mass range of HAeBe (F5 or earlier, hereafter HAeBe scheme). SPTCLASS has an interactive module that allows the user to select the best result from the three schemes and analyze the input spectra.

[ascl:1411.025] SPT Lensing Likelihood: South Pole Telescope CMB lensing likelihood code

The SPT lensing likelihood code, written in Fortran90, performs a Gaussian likelihood based upon the lensing potential power spectrum using a file from CAMB (ascl:1102.026) which contains the normalization required to get the power spectrum that the likelihood call is expecting.

[ascl:1201.013] SPS: SPIRE Photometer Simulator

The SPS software simulates the operation of the Spectral and Photometric Imaging Receiver (SPIRE) on board ESA's Herschel Space Observatory. It is coded using the Interactive Data Language (IDL), and produces simulated data at the level-0 stage (non-calibrated data in digitised units). The primary uses for the simulator are to:

  • optimize and characterize the photometer observing functions
  • aid in the development, validation, and characterization of the SPIRE data pipeline
  • provide a realistic example of SPIRE data, and thus to facilitate the development of specific analysis tools for specific science cases.
It should be noted that the SPS is not an officially supported product of the SPIRE ICC, and was originally developed for ICC use only. Consequently the SPS can be supported only on a "best efforts" basis.

[ascl:1806.013] SpS: Single-pulse Searcher

Human-made interference mimicking the behavior of celestial radio pulses is a major challenge when searching for millisecond-timescale radio pulses from celestial sources such as pulsars and fast radio bursts, because the samples are highly imbalanced. Single-pulse Searcher (SpS) reduces the presence of radio interference when processing standard output from radio single-pulse searches to produce diagnostic plots useful for selecting good candidates. The modular software allows modifications for specific search characteristics. LOTAAS Single-pulse Searcher (L-SpS) is an implementation of different features of the software (such as a machine-learning approach) developed for a particular study: the LOFAR Tied-Array All-Sky Survey (LOTAAS).

[ascl:1506.008] SPRITE: Sparsity-based super-resolution algorithm

SPRITE (Sparse Recovery of InstrumenTal rEsponse) computes a well-resolved compact source image from several undersampled and noisy observations. The algorithm is based on sparse regularization; adding a sparse penalty in the recovery leads to far better accuracy in terms of ellipticity error, especially at low S/N.

[ascl:1411.015] SPOTROD: Semi-analytic model for transits of spotted stars

SPOTROD is a model for planetary transits of stars with an arbitrary limb darkening law and a number of homogeneous, circular spots on their surface. It facilitates analysis of anomalies due to starspot eclipses, and is a free, open source implementation written in C with a Python API.

[ascl:1809.006] spops: Spinning black-hole binary population synthesis

spops is a database of population synthesis simulations of spinning black-hole binary systems, together with a Python module to query it. Data are obtained with the startrack and precession (ascl:1611.004) numerical codes to consistently evolve binary stars from formation to gravitational-wave detection. spops allows quick exploration of the interplay between stellar physics and black-hole spin dynamics.

[ascl:1103.005] Splotch: Ray Tracer to Visualize SPH Simulations

Splotch is a light and fast, publicly available ray-tracing software tool that supports effective visualization of cosmological simulation data. Its algorithm is designed to deal with point-like data, optimizing the ray-tracing calculation by ordering the particles as a function of their 'depth', defined as a function of one of the coordinates or other associated parameters. Realistic three-dimensional impressions are achieved by compositing the final colour in each pixel, properly calculating the emission and absorption of individual volume elements.

[ascl:1402.007] SPLAT: Spectral Analysis Tool

SPLAT is a graphical tool for displaying, comparing, modifying and analyzing astronomical spectra stored in NDF, FITS and TEXT files as well as in NDX format. It can read in many spectra at the same time and then display these as line plots. Display windows can show one or several spectra at the same time and can be interactively zoomed and scrolled, centered on specific wavelengths, provide continuous coordinate readout, produce printable hardcopy and be configured in many ways. Analysis facilities include the fitting of a polynomial to selected parts of a spectrum, the fitting of Gaussian, Lorentzian and Voigt profiles to emission and absorption lines and the filtering of spectra using average, median and line-shape window functions as well as wavelet denoising. SPLAT also supports a full range of coordinate systems for spectra, which allows coordinates to be displayed and aligned in many different coordinate systems (wavelength, frequency, energy, velocity) and transformed between these and different standards of rest (topocentric, heliocentric, dynamic and kinematic local standards of rest, etc). SPLAT is distributed as part of the Starlink (ascl:1110.012) software collection.

[ascl:1402.008] SPLAT-VO: Spectral Analysis Tool for the Virtual Observatory

SPLAT-VO is an extension of the SPLAT (Spectral Analysis Tool, ascl:1402.007) graphical tool for displaying, comparing, modifying and analyzing astronomical spectra; it includes facilities that allow it to work as part of the Virtual Observatory (VO). SPLAT-VO comes in two different forms, one for querying and downloading spectra from SSAP servers and one for interoperating with VO tools, such as TOPCAT (ascl:1101.010).

[ascl:1103.004] SPLASH: Interactive Visualization Tool for Smoothed Particle Hydrodynamics Simulations

SPLASH (formerly SUPERSPHPLOT) visualizes output from (astrophysical) simulations using the Smoothed Particle Hydrodynamics (SPH) method in one, two and three dimensions. Written in Fortran 90, it uses the PGPLOT graphics subroutine library for plotting. It is based around a command-line menu structure but utilizes the interactive capabilities of PGPLOT to manipulate data interactively in the plotting window. SPLASH is fully interactive; visualizations can be changed rapidly at the touch of a button (e.g. zooming, rotating, shifting cross section positions etc). Data is read directly from the code dump format giving rapid access to results and the visualization is advanced forwards and backwards through timesteps by single keystrokes. SPLASH uses the SPH kernel to render plots of not only density but other physical quantities, giving a smooth representation of the data.

[ascl:1512.015] Spirality: Spiral arm pitch angle measurement

Spirality measures spiral arm pitch angles by fitting galaxy images to spiral templates of known pitch. Written in MATLAB, the code package also includes GenSpiral, which produces FITS images of synthetic spirals, and SpiralArmCount, which uses a one-dimensional Fast Fourier Transform to count the spiral arms of a galaxy after its pitch is determined.

[ascl:1710.004] SPIPS: Spectro-Photo-Interferometry of Pulsating Stars

SPIPS (Spectro-Photo-Interferometry of Pulsating Stars) combines radial velocimetry, interferometry, and photometry to estimate physical parameters of pulsating stars, including presence of infrared excess, color excess, Teff, and ratio distance/p-factor. The global model-based parallax-of-pulsation method is implemented in Python. Derived parameters have a high level of confidence; statistical precision is improved (compared to other methods) due to the large number of data taken into account, accuracy is improved by using consistent physical modeling and reliability of the derived parameters is strengthened by redundancy in the data.

[ascl:1608.020] SPIDERz: SuPport vector classification for IDEntifying Redshifts

SPIDERz (SuPport vector classification for IDEntifying Redshifts) applies powerful support vector machine (SVM) optimization and statistical learning techniques to custom data sets to obtain accurate photometric redshift (photo-z) estimations. It is written for the IDL environment and can be applied to traditional data sets consisting of photometric band magnitudes, or alternatively to data sets with additional galaxy parameters (such as shape information) to investigate potential correlations between the extra galaxy parameters and redshift.

[ascl:1711.019] SPIDERMAN: Fast code to simulate secondary transits and phase curves

SPIDERMAN calculates exoplanet phase curves and secondary eclipses with arbitrary surface brightness distributions in two dimensions. The code uses a geometrical algorithm to solve exactly the area of sections of the disc of the planet that are occulted by the star. Approximately 1000 models can be generated per second in typical use, which makes Markov Chain Monte Carlo analyses practicable. The code is modular and allows comparison of the effect of multiple different brightness distributions for a dataset.

[ascl:1903.016] SpiceyPy: Python wrapper for the NAIF C SPICE Toolkit

SpiceyPy is a Python wrapper for the NAIF C SPICE Toolkit (ascl:1903.015). It is compatible with Python 2 and 3, and was written using ctypes.
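A short usage example; the meta-kernel path is hypothetical, and the SpiceyPy calls shown are the standard Python wrappers of the corresponding CSPICE routines:

    import spiceypy as spice

    # Load a meta-kernel listing the SPICE kernels to use (hypothetical path),
    # convert a UTC string to ephemeris time, and get Mars' position as seen
    # from Earth in the J2000 frame without aberration corrections.
    spice.furnsh("my_kernels.tm")
    et = spice.str2et("2020-01-01T00:00:00")
    pos, light_time = spice.spkpos("MARS", et, "J2000", "NONE", "EARTH")
    spice.kclear()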

[ascl:1903.015] SPICE: Observation Geometry System for Space Science Missions

The SPICE (Spacecraft Planet Instrument C-matrix [“Camera matrix”] Events) toolkit offers a set of building blocks for constructing tools supporting multi-mission, international space exploration programs and research in planetary science, heliophysics, Earth science, and for observations from terrestrial observatories. It computes many kinds of observation geometry parameters, including the ephemerides, orientations, sizes, and shapes of planets, satellites, comets and asteroids. It can also compute the orientation of a spacecraft, its various moving structures, and an instrument's field-of-view location on a planet's surface or atmosphere. It can determine when a specified geometric event occurs, such as when an object is in shadow or is in transit across another object. The SPICE toolkit is available in FORTRAN 77, ANSI C, IDL, and MATLAB.

[ascl:1709.001] SPHYNX: SPH hydrocode for subsonic hydrodynamical instabilities and strong shocks

SPHYNX addresses subsonic hydrodynamical instabilities and strong shocks; it is Newtonian, grounded on the Euler-Lagrange formulation of the smoothed-particle hydrodynamics technique, and density based. SPHYNX uses an integral approach for estimating gradients, employs a flexible family of interpolators to suppress pairing instability, and incorporates volume elements to provide a better partition of unity.

[ascl:1103.009] SPHRAY: A Smoothed Particle Hydrodynamics Ray Tracer for Radiative Transfer

SPHRAY, a Smoothed Particle Hydrodynamics (SPH) ray tracer, is designed to solve the 3D, time dependent, radiative transfer (RT) equations for arbitrary density fields. The SPH nature of SPHRAY makes the incorporation of separate hydrodynamics and gravity solvers very natural. SPHRAY relies on a Monte Carlo (MC) ray tracing scheme that does not interpolate the SPH particles onto a grid but instead integrates directly through the SPH kernels. Given initial conditions and a description of the sources of ionizing radiation, the code will calculate the non-equilibrium ionization state (HI, HII, HeI, HeII, HeIII, e) and temperature (internal energy/entropy) of each SPH particle. The sources of radiation can include point-like objects, diffuse recombination radiation, and a background field from outside the computational volume. The MC ray tracing implementation allows for the quick introduction of new physics and is parallelization friendly. A quick Axis Aligned Bounding Box (AABB) test taken from computer graphics applications allows for the acceleration of the raytracing component. We present the algorithms used in SPHRAY and verify the code by performing all the test problems detailed in the recent Radiative Transfer Comparison Project of Iliev et al. The Fortran 90 source code for SPHRAY and example SPH density fields are made available online.

[ascl:1502.012] SPHGR: Smoothed-Particle Hydrodynamics Galaxy Reduction

SPHGR (Smoothed-Particle Hydrodynamics Galaxy Reduction) is a Python-based open-source framework for analyzing smoothed-particle hydrodynamics simulations. Its basic form can run a baryonic group finder to identify galaxies and a halo finder to identify dark matter halos; it can also assign said galaxies to their respective halos, calculate halo and galaxy global properties, and iterate through previous time steps to identify the most-massive progenitors of each halo and galaxy. Data about each individual halo and galaxy are collated and easy to access.

SPHGR supports a wide range of simulation types including N-body, full cosmological volumes, and zoom-in runs. Support for multiple SPH code outputs is provided by pyGadgetReader (ascl:1411.001), mainly Gadget (ascl:0003.001) and TIPSY (ascl:1111.015).

[ascl:1311.005] Spheroid: Electromagnetic Scattering by Spheroids

Spheroid determines the size distribution of polarizing interstellar dust grains based on electromagnetic scattering by spheroidal particles. It contains subroutines to treat the case of complex refractive indices, and also includes checks for some limiting cases.

[ascl:1309.004] Spherical: Geometry operations and searches on spherical surfaces

The Spherical Library provides an efficient and accurate mathematical representation of shapes on the celestial sphere, such as sky coverage and footprints. Shapes of arbitrary complexity and size can be dynamically created from simple building blocks, whose exact area is also analytically computed. This methodology is also perfectly suited for censoring problematic parts of datasets, e.g., bad seeing, satellite trails or diffraction spikes of bright stars.

[ascl:1806.023] Spheral++: Coupled hydrodynamical and gravitational numerical simulations

Spheral++ provides a steerable parallel environment for performing coupled hydrodynamical and gravitational numerical simulations. Hydrodynamics and gravity are modeled using particle-based methods (SPH and N-Body). It uses an Adaptive Smoothed Particle Hydrodynamics (ASPH) algorithm, provides a total energy conserving compatible hydro mode, and performs fluid and solid material modeling and damage and fracture modeling in solids.

[ascl:9912.001] SPH_1D: Hierarchical gravity/SPH treecode for simulations of interacting galaxies

We describe a fast tree algorithm for gravitational N-body simulation on SIMD parallel computers. The tree construction uses fast, parallel sorts. The sorted lists are recursively divided along their x, y and z coordinates. This data structure is a completely balanced tree (i.e., each particle is paired with exactly one other particle) and maintains good spatial locality. An implementation of this tree-building algorithm on a 16k processor Maspar MP-1 performs well and constitutes only a small fraction (approximately 15%) of the entire cycle of finding the accelerations. Each node in the tree is treated as a monopole. The tree search and the summation of accelerations also perform well. During the tree search, node data that is needed from another processor is simply fetched. Roughly 55% of the tree search time is spent in communications between processors. We apply the code to two problems of astrophysical interest. The first is a simulation of the close passage of two gravitationally interacting disk galaxies using 65,636 particles. We also simulate the formation of structure in an expanding, model universe using 1,048,576 particles. Our code attains speeds comparable to one head of a Cray Y-MP, so single instruction, multiple data (SIMD) type computers can be used for these simulations. The cost/performance ratio for SIMD machines like the Maspar MP-1 make them an extremely attractive alternative to either vector processors or large multiple instruction, multiple data (MIMD) type parallel computers. With further optimizations (e.g., more careful load balancing), speeds in excess of today's vector processing computers should be possible.

[ascl:1404.017] Spextool: Spectral EXtraction tool

Spextool (Spectral EXtraction tool) is an IDL-based data reduction package for SpeX, a medium resolution near-infrared spectrograph on the NASA IRTF. It performs all of the steps necessary to produce spectra ready for analysis and publication including non-linearity corrections, flat fielding, wavelength calibration, telluric correction, flux calibration, and order merging.

[ascl:1308.014] SPEX: High-resolution cosmic X-ray spectra analysis

SPEX is optimized for the analysis and interpretation of high-resolution cosmic X-ray spectra. The software is especially suited for fitting spectra obtained by current X-ray observatories like XMM-Newton, Chandra, and Suzaku. SPEX can fit multiple spectra with different model components simultaneously and handles highly complex models with many free parameters.

[ascl:1807.014] SPEGID: Single-Pulse Event Group IDentification

SPEGID (Single-Pulse Event Group IDentification) identifies astrophysical pulse candidates as trial single-pulse event groups (SPEGs) by first applying Density Based Spatial Clustering of Applications with Noise (DBSCAN) on trial single-pulse events and then merging the clusters that fall within the expected DM (Dispersion Measure) and time span of astrophysical pulses. SPEGID also calculates the peak score for each SPEG in the S/N versus DM space to identify the expected peak-like shape in the signal-to-noise (S/N) ratio versus DM curve of astrophysical pulses. Additionally, SPEGID groups SPEGs that appear at a consistent DM and therefore are likely emitted from the same source. After running SPEGID, periodicity.py can be used to find (or verify) the underlying periodicity among a group of SPEGs (i.e., astrophysical pulse candidates).

[ascl:1310.008] SPECX: Spectral Line Data Reduction Package

SPECX is a general purpose line data reduction system. It can read and write FITS data cubes but has specialist support for the GSD format data from the James Clerk Maxwell Telescope. It includes commands to store and retrieve intermediate spectra in storage registers and perform the fitting and removal of polynomial, harmonic and Gaussian baselines.

SPECX can filter and edit spectra and list and display spectra on a graphics terminal. It is able to perform Fourier transform and power spectrum calculations, process up to eight spectra (quadrants) simultaneously with either the same or different center, and assemble a number of reduced individual spectra into a map file and contour or greyscale any plane or planes of the resulting cube.

Two versions of SPECX are distributed. Version 6.x is the VMS and Unix version and is distributed as part of the Starlink software collection. Version 7.x is a complete rewrite of SPECX distributed for Windows.

[ascl:1902.011] SpecViz: 1D Spectral Visualization Tool

SpecViz interactively visualizes and analyzes 1D astronomical spectra. It reads data from FITS and ASCII tables and allows spectra to be easily plotted and examined. It supports instrument-specific data quality handling, flexible spectral unit conversions, custom plotting attributes, plot annotations, and tiled plots, among other features. SpecViz includes a measurement tool for spectral lines for performing and recording measurements and a model fitting capability for creating simple (e.g., single Gaussian) or multi-component models (e.g., multiple Gaussians for emission and absorption lines in addition to regions of flat continua). SpecViz is built on top of the Specutils (ascl:1902.012) Astropy-affiliated Python library, providing a visual, interactive interface to the analysis capabilities in that library.

[ascl:1210.016] Specview: 1-D spectral visualization and analysis of astronomical spectrograms

Specview is a tool for 1-D spectral visualization and analysis of astronomical spectrograms. Written in Java, it is capable of reading all the Hubble Space Telescope spectral data formats as well as data from several other instruments (such as IUE, FUSE, ISO, FORS and SDSS), preview spectra from MAST, and data from generic FITS and ASCII tables. It can read data from Virtual Observatory servers, and read and write spectrogram data in Virtual Observatory SED format. It can also read files in the SPC Galactic format used in the chemistry field. Once ingested, data can be plotted and examined with a large selection of custom settings. Specview supports instrument-specific data quality handling, flexible spectral units conversions, custom plotting attributes, plot annotations, tiled plots, hardcopy to JPEG files and PostScript file or printer, etc. Specview can be used to build wide-band SEDs, overplotting or combining data from the same astronomical source taken with different instruments and/or spectral bands. Data can be further processed with averaging, splicing, detrending, and Fourier filtering tools. Specview has a spectral model fitting capability that enables the user to work with multi-component models (including user-defined models) and fit models to data.

[ascl:1902.012] Specutils: Spectroscopic analysis and reduction

Specutils provides a basic interface for the loading, manipulation, and common forms of analysis of spectroscopic data. Its generic data containers and accompanying modules can be used to build a particular scientific workflow or higher-level analysis tool. It is an AstroPy (ascl:1304.002) affiliated package, and SpecViz (ascl:1902.011), which is built on top of Specutils, provides a visual, interactive interface to its analysis capabilities.
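A minimal example of the basic data container, built here from synthetic arrays (illustrative only; the line parameters are arbitrary):

    import numpy as np
    import astropy.units as u
    from specutils import Spectrum1D

    # Build a Spectrum1D from plain arrays with units; this is the generic
    # container that higher-level tools such as SpecViz consume.
    wav = np.linspace(6540.0, 6580.0, 400) * u.AA
    flux = (1.0 - 0.5 * np.exp(-0.5 * ((wav.value - 6563.0) / 2.0) ** 2)) * u.Jy
    spec = Spectrum1D(spectral_axis=wav, flux=flux)
    print(spec.spectral_axis.unit, spec.flux.min())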

[ascl:9910.002] SPECTRUM: A stellar spectral synthesis program

SPECTRUM ((C) Richard O. Gray, 1992-2008) is a stellar spectral synthesis program which runs on a number of platforms, including most flavors of UNIX and LINUX. It will also run under Windows 9x/ME/NT/2000/XP using the Cygwin tools or the distributed Windows binaries. The code for SPECTRUM has been written in the "C" language. SPECTRUM computes the LTE synthetic spectrum given a stellar atmosphere model. SPECTRUM can use as input the fully blanketed stellar atmosphere models of Robert Kurucz including the new models of Castelli and Kurucz, but any other stellar atmosphere model which can be cast into the format of Kurucz's models can be used as well. SPECTRUM can be programmed with "command-line switches" to give a number of different outputs. In the default mode, SPECTRUM computes the stellar-disk-integrated normalized-intensity spectrum, but in addition, SPECTRUM will compute the absolute monochromatic flux from the stellar atmosphere or the specific intensity from any point on the stellar surface.

[ascl:1202.010] SPECTRE: Manipulation of single-order spectra

SPECTRE's chief purpose is the manipulation of single-order spectra, and it performs many of the tasks contained in such IRAF routines as "splot" and "rv". It is not meant to replace the much more general capabilities of IRAF, but does some functions in a manner that some might find useful. A brief list of SPECTRE tasks are: spectrum smoothing; equivalent width calculation; continuum rectification; noise spike excision; and spectrum comparison. SPECTRE was written to manipulate coude spectra, and thus is probably most useful for working on high dispersion spectra. Echelle spectra can be gathered from various observatories, reduced to singly-dimensioned spectra using IRAF, then written out as FITS files, thus becoming accessible to SPECTRE. Three different spectra may be manipulated and displayed simultaneously. SPECTRE, written in standard FORTRAN77, can be used only with the SM graphics package.

[ascl:1609.017] spectral-cube: Read and analyze astrophysical spectral data cubes

Spectral-cube provides an easy way to read, manipulate, analyze, and write data cubes with two positional dimensions and one spectral dimension, optionally with Stokes parameters. It is a versatile data container for building custom analysis routines. It provides a uniform interface to spectral cubes, robust to the wide range of conventions of axis order, spatial projections, and spectral units that exist in the wild, and allows easy extraction of cube sub-regions using physical coordinates. It has the ability to create, combine, and apply masks to datasets, is designed to work with datasets too large to load into memory, and provides basic summary statistic methods such as moments and array aggregates.
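A brief usage sketch, assuming a FITS cube whose spectral axis is already in velocity units; the filenames are hypothetical:

    from spectral_cube import SpectralCube
    import astropy.units as u

    # Read a cube, extract a velocity slab by physical coordinates, and
    # compute a zeroth-moment (integrated intensity) map.
    cube = SpectralCube.read("example_cube.fits")
    subcube = cube.spectral_slab(-50 * u.km / u.s, 50 * u.km / u.s)
    mom0 = subcube.moment(order=0)
    mom0.write("example_mom0.fits", overwrite=True)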

[ascl:1701.003] Spectra: Time series power spectrum calculator

Spectra calculates the power spectrum of a time series, equally spaced or not, based on the Spectral Correlation Coefficient (Ferraz-Mello 1981, Astron. Journal 86 (4), 619). It is very efficient for the detection of low frequencies.

[ascl:1111.005] SPECTCOL: Spectroscopic and Collisional Data Retrieval

Studies of astrophysical non-LTE media require the combination of atomic and molecular spectroscopic and collisional data often described differently in various databases. SPECTCOL is a tool that implements VAMDC standards, retrieves relevant information from databases such as CDMS, HITRAN, and BASECOL, and can upload local files. All transfers of data between the client and the databases use the VAMDC-XSAMS schema. The spectroscopic and collisional information is combined and useful outputs (ASCII or XSAMS) are provided for the study of the interstellar medium.

[ascl:1904.018] Specstack: A simple spectral stacking tool

Specstack creates stacked spectra using a simple algorithm with sigma-clipping to combine the spectra of galaxies in the rest-frame into a single averaged spectrum. Though written originally for galaxy spectra, it also works for other types of objects. It is written in Python and is started from the command-line.
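The basic recipe can be sketched as follows; this is a simplified illustration of rest-frame shifting and sigma-clipped combination, not Specstack's exact code:

    import numpy as np

    def stack_restframe(waves, fluxes, redshifts, grid, kappa=3.0):
        """Shift each spectrum to the rest frame, resample onto a common
        wavelength grid, and combine with a sigma-clipped mean."""
        resampled = np.array([
            np.interp(grid, w / (1.0 + z), f, left=np.nan, right=np.nan)
            for w, f, z in zip(waves, fluxes, redshifts)
        ])
        med = np.nanmedian(resampled, axis=0)
        std = np.nanstd(resampled, axis=0)
        clipped = np.where(np.abs(resampled - med) < kappa * std, resampled, np.nan)
        return np.nanmean(clipped, axis=0)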

[ascl:1404.014] SpecPro: Astronomical spectra viewer and analyzer

SpecPro is an interactive program for viewing and analyzing spectra, particularly in the context of modern imaging surveys. In addition to displaying the 1D and 2D spectrum, SpecPro can simultaneously display available stamp images as well as the spectral energy distribution of a source. This extra information can help significantly in assessing a spectrum.

[ascl:1407.003] SPECDRE: Spectroscopy Data Reduction

Specdre performs spectroscopy data reduction and analysis. General features of the package include data cube manipulation, arc line calibration, resampling and spectral fitting. Particular care is taken with error propagation, including tracking covariance. SPECDRE is distributed as part of the Starlink software collection (ascl:1110.012).

[ascl:1203.003] spec2d: DEEP2 DEIMOS Spectral Pipeline

The DEEP2 DEIMOS Data Reduction Pipeline ("spec2d") is an IDL-based, automated software package designed to reduce Keck/DEIMOS multi-slit spectroscopic observations, collected as part of the DEEP2 Galaxy Redshift Survey. The pipeline is best suited for handling data taken with the 1200 line/mm grating tilted towards the red (lambda_c ~ 7800Å). The spec2d reduction package takes the raw DEIMOS data as its input and produces a variety of outputs including 2-d slit spectra and 1-d object spectra.

[ascl:1010.016] SpDust/SpDust.2: Code to Calculate Spinning Dust Spectra

SpDust is an IDL program that evaluates the spinning dust emissivity for user-provided environmental conditions. A new version of the code became available in March, 2010.

[ascl:1711.001] SpcAudace: Spectroscopic processing and analysis package of Audela software

SpcAudace processes long-slit spectra with automated pipelines and performs astrophysical analysis of the resulting data. These powerful pipelines do all the required steps in one pass: standard preprocessing, masking of bad pixels, geometric corrections, registration, optimized spectrum extraction, wavelength calibration, and instrumental response computation and correction. Both high- and low-resolution long-slit spectra are managed for stellar and non-stellar targets. Many types of publication-quality figures can be easily produced: PDF and PNG plots or annotated time series plots. Astrophysical quantities can be derived from individual spectra or large numbers of spectra with advanced functions, from line profile characteristics to equivalent width and periodogram. More than 300 documented functions are available and can be used in TCL scripts for automation. SpcAudace is based on the Audela open source software.

[ascl:1511.011] SparsePZ: Sparse Representation of Photometric Redshift PDFs

SparsePZ uses sparse basis representation to fully represent individual photometric redshift probability density functions (PDFs). This approach requires approximately half the parameters for the same multi-Gaussian fitting accuracy, and has the additional advantage that an entire PDF can be stored by using a 4-byte integer per basis function. Only 10-20 points per galaxy are needed to reconstruct both the individual PDFs and the ensemble redshift distribution, N(z), to an accuracy of 99.9 per cent when compared to the one built using the original PDFs computed with a resolution of δz = 0.01, reducing the required storage of 200 original values by a factor of 10-20. This basis representation can be directly extended to a cosmological analysis, thereby increasing computational performance without losing resolution or accuracy.
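The storage idea can be illustrated with a hypothetical reconstruction routine: a PDF kept as a handful of (basis index, amplitude) pairs over a fixed dictionary of Gaussian basis functions is expanded back onto a redshift grid (illustrative only, not the SparsePZ API):

    import numpy as np

    def sparse_gaussian_pdf(amps, indices, centers, widths, zgrid):
        """Rebuild a photo-z PDF from a few (amplitude, basis index) pairs
        over a dictionary of Gaussians defined by `centers` and `widths`."""
        pdf = np.zeros_like(zgrid)
        for a, i in zip(amps, indices):
            mu, sig = centers[i], widths[i]
            pdf += a * np.exp(-0.5 * ((zgrid - mu) / sig) ** 2)
        norm = np.trapz(pdf, zgrid)
        return pdf / norm if norm > 0 else pdf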

[ascl:1105.006] SPARC: Seismic Propagation through Active Regions and Convection

The Seismic Propagation through Active Regions and Convection (SPARC) code was developed by S. Hanasoge. The acoustic wavefield in SPARC is simulated by numerically solving the linearised 3-D Euler equations in Cartesian geometry (e.g., see Hanasoge, Duvall and Couvidat (2007)). Spatial derivatives are calculated using sixth-order compact finite differences (Lele,1992) and time evolution is achieved through the repeated application of an optimized second-order five-stage Runge-Kutta scheme (Hu, 1996). Periodic horizontal boundaries are used.

[ascl:1812.005] SPAMCART: Smoothed PArticle Monte CArlo Radiative Transfer

SPAMCART generates synthetic spectral energy distributions and intensity maps from smoothed particle hydrodynamics simulation snapshots. It follows discrete luminosity packets as they propagate through a density field, and computes the radiative equilibrium temperature of the ambient dust from their trajectories. The sources can be extended and/or embedded, and discrete and/or diffuse. The density is not mapped on to a grid, and therefore the calculation is performed at exactly the same resolution as the hydrodynamics. The code strictly adheres to Kirchhoff's law of radiation. The algorithm is based on the Lucy Monte Carlo radiative transfer method and is fairly simple to implement, as it uses data structures that are already constructed for other purposes in modern particle codes.

[ascl:1408.006] SPAM: Source Peeling and Atmospheric Modeling

SPAM is an extension to AIPS for reducing high-resolution, low-frequency radio interferometric observations. Direction-dependent ionospheric calibration and image-plane ripple suppression are among the features that help to make high-quality sub-GHz images. Data reductions are captured in well-tested Python scripts that execute AIPS tasks directly (mostly during initial data reduction steps), call high-level functions that make multiple AIPS or ParselTongue calls, and require few manual operations.

[ascl:1806.010] SpaghettiLens: Web-based gravitational lens modeling tool

SpaghettiLens allows citizen scientists to model gravitational lenses collaboratively; the software should also be easily adaptable to any other, reasonably similar problem. It lets volunteers execute a computationally intensive task that cannot easily be run client-side and relies on collaboration among citizen scientists. SpaghettiLens makes survey data available to citizen scientists, manages the model configurations generated by the volunteers, stores the resulting model configuration, and delivers the actual model. A model can be shared and discussed with other volunteers and revised, and new child models can be created, resulting in a branching version tree of models that explore different possibilities. Scientists can choose a collection of models; discussion among volunteers and scientists prunes the tree to determine which models will receive further analysis.

[ascl:1401.002] SpacePy: Python-Based Tools for the Space Science Community

SpacePy provides data analysis and visualization tools for the space science community. Written in Python, it builds on the capabilities of the NumPy and MatPlotLib packages to make basic data analysis, modeling and visualization easier. It contains modules for handling many complex time formats, obtaining data from the OMNI database, and accessing the powerful Onera library. It contains a library of commonly used empirical relationships, performs association analysis, coordinate transformations, radiation belt modeling, and CDF reading, and creates publication quality plots.

[ascl:1504.002] SPA: Solar Position Algorithm

The Solar Position Algorithm (SPA) calculates the solar zenith and azimuth angles in the period from the year -2000 to 6000, with uncertainties of +/- 0.0003 degrees based on the date, time, and location on Earth. SPA is implemented in C; in addition to being available for download, an online calculator using this code is available at http://www.nrel.gov/midc/solpos/spa.html.
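
For orientation only, the zenith angle follows the textbook spherical-astronomy relation cos θz = sin φ sin δ + cos φ cos δ cos H for latitude φ, solar declination δ, and hour angle H. The Python sketch below applies that relation with invented numbers; it is far cruder than the actual SPA algorithm (which includes nutation, aberration, refraction, and more) and does not reach its quoted accuracy.

    import numpy as np

    # Crude illustration of the textbook zenith/azimuth relations; not the SPA algorithm.
    lat = np.radians(39.74)                       # observer latitude (example value)
    dec = np.radians(-23.0)                       # solar declination (example value)
    hour_angle = np.radians(15.0 * (13.5 - 12.0)) # 13:30 local solar time

    cos_zen = (np.sin(lat) * np.sin(dec)
               + np.cos(lat) * np.cos(dec) * np.cos(hour_angle))
    sin_zen = np.sqrt(1.0 - cos_zen**2)
    zenith = np.degrees(np.arccos(cos_zen))

    # Azimuth measured eastward from north
    sin_az = -np.cos(dec) * np.sin(hour_angle) / sin_zen
    cos_az = (np.sin(dec) - np.sin(lat) * cos_zen) / (np.cos(lat) * sin_zen)
    azimuth = np.degrees(np.arctan2(sin_az, cos_az)) % 360.0
    print(f"zenith = {zenith:.2f} deg, azimuth = {azimuth:.2f} deg")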

[ascl:1805.028] SP_Ace: Stellar Parameters And Chemical abundances Estimator

SP_Ace (Stellar Parameters And Chemical abundances Estimator) estimates the stellar parameters Teff, log g, [M/H], and elemental abundances. It employs 1D stellar atmosphere models in Local Thermodynamic Equilibrium (LTE). The code is highly automated and suitable for analyzing the spectra of large spectroscopic surveys with low or medium spectral resolution (R = 2000-20 000). A web service for calculating these values with the software is also available.

[ascl:1307.020] SOPT: Sparse OPTimisation

SOPT (Sparse OPTimisation) is a C implementation of the Sparsity Averaging Reweighted Analysis (SARA) algorithm. The approach relies on the observation that natural images exhibit strong average sparsity; average sparsity outperforms state-of-the-art priors that promote sparsity in a single orthonormal basis or redundant frame, or that promote gradient sparsity.

[ascl:1607.014] SOPIE: Sequential Off-Pulse Interval Estimation

SOPIE (Sequential Off-Pulse Interval Estimation) provides functions to non-parametrically estimate the off-pulse interval of a source function originating from a pulsar. The technique is based on a sequential application of P-values obtained from goodness-of-fit tests for the uniform distribution, such as the Kolmogorov-Smirnov, Cramér-von Mises, Anderson-Darling and Rayleigh goodness-of-fit tests.
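
The core idea, testing whether the rotational phases falling in a candidate off-pulse window are consistent with a uniform distribution, can be sketched as follows with a single Kolmogorov-Smirnov test; the sequential combination of P-values and the window search in SOPIE itself are more involved, and the data below are simulated for illustration.

    import numpy as np
    from scipy import stats

    # Illustrative only: test photon phases inside a trial off-pulse window for uniformity.
    rng = np.random.default_rng(1)
    phases = np.concatenate([rng.uniform(0, 1, 500),             # background
                             rng.normal(0.3, 0.02, 200) % 1.0])  # a pulse near phase 0.3

    def offpulse_pvalue(phases, window):
        lo, hi = window
        sel = phases[(phases >= lo) & (phases < hi)]
        rescaled = (sel - lo) / (hi - lo)        # map the window onto [0, 1)
        return stats.kstest(rescaled, "uniform").pvalue

    print("window containing the pulse :", offpulse_pvalue(phases, (0.2, 0.5)))
    print("window avoiding the pulse   :", offpulse_pvalue(phases, (0.5, 1.0)))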

[ascl:1810.017] SOPHISM: Software Instrument Simulator

SOPHISM models astronomical instrumentation from the entrance of the telescope to data acquisition at the detector, along with software blocks dealing with, for example, demodulation, inversion, and compression. The code performs most analyses done with light in astronomy, such as differential photometry, spectroscopy, and polarimetry. The simulator offers flexibility and implementation of new effects and subsystems, making it user-adaptable for a wide variety of instruments. SOPHISM can be used for all stages of instrument definition, design, operation, and lifetime tracking evaluation.

[ascl:1412.014] SOPHIA: Simulations Of Photo Hadronic Interactions in Astrophysics

SOPHIA (Simulations Of Photo Hadronic Interactions in Astrophysics) solves problems connected to photohadronic processes in astrophysical environments and can also be used for radiation and background studies at high energy colliders such as LEP2 and HERA, as well as for simulations of photon-induced air showers. SOPHIA implements well-established phenomenological models and symmetries of hadronic interactions in a way that correctly describes the available exclusive and inclusive photohadronic cross-section data obtained at fixed-target and collider experiments.

[ascl:1701.012] SONG: Second Order Non-Gaussianity

SONG computes the non-linear evolution of the Universe in order to predict cosmological observables such as the bispectrum of the Cosmic Microwave Background (CMB). More precisely, it is a second-order Boltzmann code, as it solves the Einstein and Boltzmann equations up to second order in the cosmological perturbations.

[ascl:1208.013] SolarSoft: Programming and data analysis environment for solar physics

SolarSoft is a set of integrated software libraries, data bases, and system utilities which provide a common programming and data analysis environment for Solar Physics. The SolarSoftWare (SSW) system is built from Yohkoh, SOHO, SDAC and Astronomy libraries and draws upon contributions from many members of those projects. It is primarily an IDL based system, although some instrument teams integrate executables written in other languages. The SSW environment provides a consistent look and feel at widely distributed co-investigator institutions to facilitate data exchange and to stimulate coordinated analysis. Commonalities and overlap in solar data and analysis goals are exploited to permit application of fundamental utilities to the data from many different solar instruments. The use of common libraries, utilities, techniques and interfaces minimizes the learning curve for investigators who are analyzing new solar data sets, correlating results from multiple experiments or performing research away from their home institution.

[ascl:1412.001] SoFiA: Source Finding Application

SoFiA is a flexible source finding pipeline designed to detect and parameterise sources in 3D spectral-line data cubes. SoFiA combines several powerful source finding and parameterisation algorithms, including wavelet denoising, spatial and spectral smoothing, source mask optimisation, spectral profile fitting, and calculation of the reliability of detections. In addition to source catalogues in different formats, SoFiA can also generate a range of output data cubes and images, including source masks, moment maps, sub-cubes, position-velocity diagrams, and integrated spectra. The pipeline is controlled by simple parameter files and can either be invoked on the command line or interactively through a modern graphical user interface.

[ascl:1403.026] SOFA: Standards of Fundamental Astronomy

SOFA (Standards Of Fundamental Astronomy) is a collection of subprograms, in source-code form, that implement official IAU algorithms for fundamental astronomy computations. SOFA offers more than 160 routines for fundamental astronomy, including time scales (including dealing with leap seconds), Earth rotation, sidereal time, precession, nutation, polar motion, astrometry and transforms between various reference systems (e.g. BCRS, ICRS, GCRS, CIRS, TIRS, ITRS). The subprograms are supported by 55 vector/matrix routines, and are available in both Fortran77 and C implementations.

[ascl:1504.021] SOAP 2.0: Spot Oscillation And Planet 2.0

SOAP (Spot Oscillation And Planet) 2.0 simulates the effects of dark spots and bright plages on the surface of a rotating star, computing their expected radial velocity and photometric signatures. It includes the convective blueshift and its inhibition in active regions.

[ascl:1902.001] SNTD: Supernova Time Delays

Supernova Time Delays (SNTD) simulates and measures the time delays of multiply-imaged supernovae, and offers an improved characterization of the uncertainty caused by microlensing. Lensing time delays can be determined by fitting the multiple light curves of these objects; measuring these delays provides precise tests of lens models or constraints on the Hubble constant and other cosmological parameters that are independent of the local distance ladder. Fitting the effects of microlensing without an accurate prior often leads to biases in the time delay measurement and over-fitting to the data; this can be mitigated by using a Gaussian Process Regression (GPR) technique to determine the uncertainty due to microlensing. SNTD can produce accurate simulations for wide-field time domain surveys such as LSST and WFIRST.
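
As a hedged sketch of the general idea (not SNTD's actual interface), a Gaussian Process Regression over light-curve residuals can turn a smooth microlensing-like distortion into a time-dependent uncertainty envelope; the data, kernel, and numbers below are invented for illustration.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    # Invented example: residuals of a lensed-SN image light curve around a smooth
    # template, possibly distorted by microlensing.
    rng = np.random.default_rng(0)
    t = np.linspace(-10, 40, 30)[:, None]            # days relative to peak
    resid = 0.05 * np.sin(t.ravel() / 12.0) + rng.normal(0, 0.02, t.shape[0])

    kernel = 1.0 * RBF(length_scale=10.0) + WhiteKernel(noise_level=0.02**2)
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(t, resid)

    t_fine = np.linspace(-10, 40, 200)[:, None]
    mean, sigma = gp.predict(t_fine, return_std=True)
    # sigma is a time-dependent uncertainty that could be folded into a time-delay
    # fit instead of an uninformed microlensing prior.
    print("median GP uncertainty: %.3f mag" % np.median(sigma))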

[ascl:1805.017] SNSEDextend: SuperNova Spectral Energy Distributions extrapolation toolkit

SNSEDextend extrapolates core-collapse and Type Ia Spectral Energy Distributions (SEDs) into the UV and IR for use in simulations and photometric classifications. The user provides a library of existing SED templates (such as those in the authors' SN SED Repository) along with new photometric constraints in the UV and/or NIR wavelength ranges. The software then extends the existing template SEDs so their colors match the input data at all phases. SNSEDextend can also extend the SALT2 spectral time-series model for Type Ia SN for a "first-order" extrapolation of the SALT2 model components, suitable for use in survey simulations and photometric classification tools; as the code does not do a rigorous re-training of the SALT2 model, the results should not be relied on for precision applications such as light curve fitting for cosmology.

[ascl:1703.006] SNRPy: Supernova remnant evolution modeling

SNRPy (Super Nova Remnant Python) models supernova remnant (SNR) evolution and is useful for understanding SNR evolution and for modeling observations of SNRs to obtain good estimates of their properties. It includes all phases of the standard path of evolution for spherically symmetric SNRs as well as alternate evolutionary models, including evolution in a cloudy ISM, the fractional energy loss model, and evolution in a hot low-density ISM. The graphical interface takes in various parameters and produces outputs such as shock radius and velocity vs. time, the SNR surface brightness profile, and the spectrum.

[ascl:1505.023] SNooPy: TypeIa supernovae analysis tools

The SNooPy package (also known as SNpy), written in Python, contains tools for the analysis of Type Ia supernovae. It offers interactive plotting of light-curve data and models (and spectra), computation of reddening laws and K-corrections, LM non-linear least-squares fitting of light-curve data, and various types of spline fitting, including Dierckx and tension splines. The package also includes a SNIa lightcurve template generator in the CSP passbands, estimates of Milky Way extinction, and a module for dealing with filters and spectra.

[ascl:1505.022] Snoopy: General purpose spectral solver

Snoopy is a spectral 3D code that solves the MHD and Boussinesq equations and includes optional physics such as compressibility, particles, Braginskii viscosity, and several other effects. It is useful for studying turbulence involving shear and rotation. Snoopy requires the FFTW library (ascl:1201.015) and can run on parallel machines using MPI, OpenMP, or both at the same time.

[ascl:1107.001] SNID: Supernova Identification

We present an algorithm to identify the type of an SN spectrum and to determine its redshift and age. This algorithm, based on the correlation techniques of Tonry & Davis, is implemented in the Supernova Identification (SNID) code. It is used by members of ongoing high-redshift SN searches to distinguish between type Ia and type Ib/c SNe, and to identify "peculiar" SNe Ia. We develop a diagnostic to quantify the quality of a correlation between the input and template spectra, which enables a formal evaluation of the associated redshift error. Furthermore, by comparing the correlation redshifts obtained using SNID with those determined from narrow lines in the SN host galaxy spectrum, we show that accurate redshifts (with a typical error less than 0.01) can be determined for SNe Ia without a spectrum of the host galaxy. Last, the age of an input spectrum is determined with a typical 3-day accuracy, shown here by using high-redshift SNe Ia with well-sampled light curves. The success of the correlation technique confirms the similarity of some SNe Ia at low and high redshifts. The SNID code, which is available to the community, can also be used for comparative studies of SN spectra, as well as comparisons between data and models.
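
The redshift-from-correlation idea can be sketched simply: on a logarithmic wavelength grid a redshift becomes a uniform shift, so the lag of the cross-correlation peak between the input and template spectra gives ln(1+z). The toy Python version below uses a fabricated template and ignores the apodization, filtering, and quality diagnostics that SNID actually applies.

    import numpy as np

    # Toy redshift estimate by cross-correlation on a log-wavelength grid.
    ln_wave = np.linspace(np.log(3500.0), np.log(9000.0), 2048)
    dln = ln_wave[1] - ln_wave[0]

    def template_flux(ln_w):
        # fake rest-frame template: flat continuum plus two absorption features
        w = np.exp(ln_w)
        return (1.0 - 0.4 * np.exp(-0.5 * ((w - 6150.0) / 60.0)**2)
                    - 0.3 * np.exp(-0.5 * ((w - 5450.0) / 50.0)**2))

    z_true = 0.05
    observed = template_flux(ln_wave - np.log(1.0 + z_true))   # redshifted template

    a = observed - observed.mean()
    b = template_flux(ln_wave) - template_flux(ln_wave).mean()
    xcorr = np.correlate(a, b, mode="full")
    lag = np.argmax(xcorr) - (len(b) - 1)
    z_est = np.exp(lag * dln) - 1.0
    print(f"recovered z = {z_est:.4f} (input {z_true})")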

[ascl:1505.033] SNEC: SuperNova Explosion Code

SNEC (SuperNova Explosion Code) is a spherically-symmetric Lagrangian radiation-hydrodynamics code that follows supernova explosions through the envelope of their progenitor star, produces bolometric (and approximate multi-color) light curve predictions, and provides input to spectral synthesis codes for spectral modeling. SNEC's features include 1D (spherical) Lagrangian Newtonian hydrodynamics with artificial viscosity, stellar equation of state with a Saha solver ionization/recombination, equilibrium flux-limited photon diffusion with OPAL opacities and low-temperature opacities, and prediction of bolometric light curves and multi-color lightcurves (in the blackbody approximation).

[ascl:1611.017] SNCosmo: Python library for supernova cosmology

SNCosmo synthesizes supernova spectra and photometry from SN models, and has functions for fitting and sampling SN model parameters given photometric light curve data. It offers fast implementations of several commonly used extinction laws and can be used to construct SN models that include dust. The SNCosmo library includes supernova models such as SALT2, MLCS2k2, Hsiao, Nugent, PSNID, SNANA and Whalen models, as well as a variety of built-in bandpasses and magnitude systems, and provides convenience functions for reading and writing peculiar data formats used in other packages. The library is extensible, allowing new models, bandpasses, and magnitude systems to be defined using an object-oriented interface.

[ascl:1010.027] SNANA: A Public Software Package for Supernova Analysis

SNANA is a general analysis package for supernova (SN) light curves that contains a simulation, light curve fitter, and cosmology fitter. The software is designed with the primary goal of using SNe Ia as distance indicators for the determination of cosmological parameters, but it can also be used to study efficiencies for analyses of SN rates, estimate contamination from non-Ia SNe, and optimize future surveys. Several SN models are available within the same software architecture, allowing technical features such as K-corrections to be consistently used among multiple models, and thus making it easier to make detailed comparisons between models. New and improved light-curve models can be easily added. The software works with arbitrary surveys and telescopes and has already been used by several collaborations, leading to more robust and easy-to-use code. This software is not intended as a final product release, but rather it is designed to undergo continual improvements from the community as more is learned about SNe.

[ascl:1310.007] SMURF: SubMillimeter User Reduction Facility

SMURF reduces submillimeter single-dish continuum and heterodyne data. It is mainly targeted at data produced by the James Clerk Maxwell Telescope, but data from other telescopes have been reduced using the package. SMURF is released as part of the bundle that comprises Starlink (ascl:1110.012) and most of the packages that use it. The two key commands are MAKEMAP for the creation of maps from submillimeter continuum data and MAKECUBE for the creation of data cubes from heterodyne array instruments. The software can also convert data from legacy JCMT file formats to the modern form to allow it to be processed by MAKECUBE. SMURF is a core component of the ORAC-DR (ascl:1310.001) data reduction pipeline for JCMT.

[ascl:1303.005] SMMOL: Spherical Multi-level MOLecular line radiative transfer

SMMOL (Spherical Multi-level MOLecular line radiative transfer) is a molecular line radiative transfer code that uses Accelerated Lambda Iteration to solve the coupled level population and line transfer problem in spherical geometry. The code uses a discretized grid and a ray tracing methodology. SMMOL is designed for high optical depth regimes and can cope with maser emission as long as the spatial-velocity sampling is fine enough.

[ascl:1904.005] SMILI: Sparse Modeling Imaging Library for Interferometry

SMILI uses sparse sampling techniques and other regularization methods for interferometric imaging. The Python-interfaced library is mainly designed for very long baseline interferometry, and has been under active development primarily for the Event Horizon Telescope (EHT).

[ascl:1308.001] SMILE: Orbital analysis and Schwarzschild modeling of triaxial stellar systems

SMILE is interactive software for studying a variety of 2D and 3D models, including arbitrary potentials represented by a basis-set expansion, a spherical-harmonic expansion with coefficients being smooth functions of radius (splines), or a set of fixed point masses. Its main features include:

  • orbit integration in various 2d and 3d potentials (including N-body and basis-set representations of an arbitrary potential);
  • methods for analysis of orbital class, fundamental frequencies, regular or chaotic nature of an orbit, computation of Lyapunov exponents;
  • Poincaré sections (in 2d) and frequency maps (in 3d) for analyzing orbital structure of potential;
  • construction of self-consistent Schwarzschild models; and
  • convenient visualization and integrated GUI environment, and a console scriptable version.
SMILE is portable to different platforms including MS Windows, Linux and Mac.

[ascl:1804.010] SMERFS: Stochastic Markov Evaluation of Random Fields on the Sphere

SMERFS (Stochastic Markov Evaluation of Random Fields on the Sphere) creates large realizations of random fields on the sphere. It uses a fast algorithm based on Markov properties and 1D fast Fourier transforms that generates samples on an n × n grid in O(n² log n) and efficiently derives the necessary conditional covariance matrices.

[ascl:1202.013] SME: Spectroscopy Made Easy

Spectroscopy Made Easy (SME) is IDL software and a compiled external library that fits an observed high-resolution stellar spectrum with a synthetic spectrum to determine stellar parameters. The SME external library is available for Mac, Linux, and Windows systems. Atomic and molecular line data formatted for SME may be obtained from VALD. SME can solve for empirical log(gf) and damping parameters, using an observed spectrum of a star (usually the Sun) as a constraint.

[ascl:1603.007] SMARTIES: Spheroids Modelled Accurately with a Robust T-matrix Implementation for Electromagnetic Scattering

SMARTIES calculates the optical properties of oblate and prolate spheroidal particles, with comparable capabilities and ease-of-use as Mie theory for spheres. This suite of MATLAB codes provides a fully documented implementation of an improved T-matrix algorithm for the theoretical modelling of electromagnetic scattering by particles of spheroidal shape. Included are scripts that cover a range of scattering problems relevant to nanophotonics and plasmonics, including calculation of far-field scattering and absorption cross-sections for fixed incidence orientation, orientation-averaged cross-sections and scattering matrix, surface-field calculations as well as near-fields, wavelength-dependent near-field and far-field properties, and access to lower-level functions implementing the T-matrix calculations, including the T-matrix elements which may be calculated more accurately than with competing codes.

[ascl:1210.021] SMART: Spectroscopic Modeling Analysis and Reduction Tool

SMART is an IDL-based software tool, developed by the IRS Instrument Team at Cornell University, that allows users to reduce and analyze Spitzer data from all four modules of the Infrared Spectrograph, including the peak-up arrays. The software is designed to make full use of the ancillary files generated in the Spitzer Science Center pipeline so that it can either remove or flag artifacts and corrupted data and maximize the signal-to-noise ratio in the extraction routines. It can be run in both interactive and batch modes. SMART includes visualization tools for assessing data quality, basic arithmetic operations for either two-dimensional images or one-dimensional spectra, extraction of both point and extended sources, and a suite of spectral analysis tools.

[ascl:1106.012] SLUG: Stochastically Lighting Up Galaxies

The effects of stochasticity on the luminosities of stellar populations are an often neglected but crucial element for understanding populations in the low mass or low star formation rate regime. To address this issue, we present SLUG, a new code to "Stochastically Light Up Galaxies". SLUG synthesizes stellar populations using a Monte Carlo technique that treats stochastic sampling properly including the effects of clustering, the stellar initial mass function, star formation history, stellar evolution, and cluster disruption. This code produces many useful outputs, including i) catalogs of star clusters and their properties, such as their stellar initial mass distributions and their photometric properties in a variety of filters, ii) two dimensional histograms of color-magnitude diagrams of every star in the simulation, and iii) the photometric properties of field stars and the integrated photometry of the entire simulated galaxy. After presenting the SLUG algorithm in detail, we validate the code through comparisons with starburst99 in the well-sampled regime, and with observed photometry of Milky Way clusters. Finally, we demonstrate SLUG's capabilities by presenting outputs in the stochastic regime.

[ascl:1010.035] SLR: Stellar Locus Regression

Stellar Locus Regression (SLR) is a simple way to calibrate colors at the 1-2% level, and magnitudes at the sub-5% level as limited by 2MASS, without the traditional use of standard stars. With SLR, stars in any field are "standards." This is an entirely new way to calibrate photometry. SLR exploits the simple fact that most stars lie along a well defined line in color-color space called the stellar locus. Cross-match point-sources in flattened images taken through different passbands and plot up all color vs color combinations, and you will see the stellar locus with little effort. SLR calibrates colors by fitting these colors to a standard line. Cross-match with 2MASS on top of that, and SLR will deliver calibrated magnitudes as well.
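
A minimal sketch of the underlying idea, with invented numbers and not the SLR package itself: instrumental colors of stars are compared with a fixed reference stellar locus in color-color space (idealized below as a straight line), and the shift of the fitted locus gives the color zero-point correction; only one combination of the two band offsets is constrained per diagram, which is why SLR combines several color-color diagrams and the 2MASS cross-match.

    import numpy as np

    # Toy stellar-locus regression in a (g-r, r-i) diagram; illustrative only.
    m_ref, c_ref = 0.45, 0.05                    # assumed reference locus: y = m*x + c
    rng = np.random.default_rng(2)

    gr_true = rng.uniform(0.2, 1.3, 300)         # true colors along the locus
    ri_true = m_ref * gr_true + c_ref
    dx_true, dy_true = 0.12, -0.07               # unknown zero-point offsets
    gr_inst = gr_true + dx_true + rng.normal(0, 0.02, 300)
    ri_inst = ri_true + dy_true + rng.normal(0, 0.02, 300)

    # Fit a line to the instrumental colors; its intercept shift measures the
    # combination dy - m_ref * dx of the two zero-point offsets.
    slope, intercept = np.polyfit(gr_inst, ri_inst, 1)
    print("recovered offset combination: %.3f" % (intercept - c_ref))
    print("expected dy - m_ref*dx       : %.3f" % (dy_true - m_ref * dx_true))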

[ascl:9906.001] SLOPES: Least-squares linear regression lines for bivariate datasets

SLOPES computes six least-squares linear regression lines for bivariate datasets of the form (x_i, y_i) with unknown population distributions. Measurement errors, censoring (nondetections), and other complications are not treated. The lines are: the ordinary least-squares regression of y on x, OLS(Y|X); the inverse regression of x on y, OLS(X|Y); the angular bisector of the OLS lines; the orthogonal regression line; the reduced major axis; and the mean-OLS line. The latter four regressions treat the variables symmetrically, while the first two regressions are asymmetrical. Uncertainties for the regression coefficients of each method are estimated via asymptotic formulae, bootstrap resampling, and bivariate normal simulation. These methods, the derivation of the regression coefficient uncertainties, and discussions of their use are provided in the three papers associated with the code; the user is encouraged to read and reference these studies.
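
For reference, the Python sketch below transcribes the commonly quoted slope formulae for several of these estimators (in the style of the Isobe et al. symmetric/asymmetric least-squares lines) on a simulated sample; it is illustrative, attempts no uncertainty estimation, and is not the SLOPES code itself.

    import numpy as np

    # Illustrative slope estimators for a bivariate sample without measurement errors.
    rng = np.random.default_rng(3)
    x = rng.normal(0, 1, 500)
    y = 2.0 * x + rng.normal(0, 1.5, 500)

    xm, ym = x.mean(), y.mean()
    sxx = np.sum((x - xm)**2)
    syy = np.sum((y - ym)**2)
    sxy = np.sum((x - xm) * (y - ym))

    b1 = sxy / sxx                              # OLS(Y|X)
    b2 = syy / sxy                              # OLS(X|Y), expressed as dy/dx
    b_bis = (b1 * b2 - 1.0 + np.sqrt((1.0 + b1**2) * (1.0 + b2**2))) / (b1 + b2)
    b_rma = np.sign(sxy) * np.sqrt(b1 * b2)     # reduced major axis
    b_orth = 0.5 * ((b2 - 1.0 / b1)
                    + np.sign(sxy) * np.sqrt(4.0 + (b2 - 1.0 / b1)**2))  # orthogonal

    for name, b in [("OLS(Y|X)", b1), ("OLS(X|Y)", b2), ("bisector", b_bis),
                    ("orthogonal", b_orth), ("reduced major axis", b_rma)]:
        print(f"{name:20s} slope = {b:.3f}")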

[ascl:1507.005] slimplectic: Discrete non-conservative numerical integrator

slimplectic is a python implementation of a numerical integrator that uses a fixed time-step variational integrator formalism applied to the principle of stationary nonconservative action. It allows nonconservative effects to be included in the numerical evolution while preserving the major benefits of normally conservative symplectic integrators, particularly the accurate long-term evolution of momenta and energy. slimplectic is appropriate for exploring cosmological or celestial N-body dynamics problems where nonconservative interactions, e.g. dynamical friction or dissipative tides, can play an important role.

[ascl:1409.010] Slim: Numerical data compression for scientific data sets

Slim performs lossless compression on binary data files. Written in C++, it operates very rapidly and achieves better compression on noisy physics data than general-purpose tools designed primarily for text.

[ascl:1105.004] SLiM: A Code for the Simulation of Wave Propagation through an Inhomogeneous, Magnetised Solar Atmosphere

The semi-spectral linear MHD (SLiM) code follows the interaction of linear waves through an inhomogeneous three-dimensional solar atmosphere. The background model allows almost arbitrary perturbations of density, temperature, sound speed as well as magnetic and velocity fields. The code is useful in understanding the helioseismic signatures of various solar features, including sunspots.

[ascl:1611.021] SlicerAstro: Astronomy (HI) extension for 3D Slicer

SlicerAstro extends 3D Slicer, a multi-platform package for visualization and medical image processing, to provide a 3-D interactive viewer with 3-D human-machine interaction features, based on traditional 2-D input/output hardware, and analysis capabilities.

[ascl:1403.025] SLALIB: A Positional Astronomy Library

SLALIB is a library of routines that make accurate and reliable positional-astronomy applications easier to write. Most SLALIB routines are concerned with astronomical position and time, but a number have wider trigonometrical, numerical or general applications. A Fortran implementation of SLALIB under GPL licensing is available as part of Starlink (ascl:1110.012).

[ascl:1312.014] SL1M: Synthesis through L1 Minimization

SL1M deconvolves radio synthesis images based on direct inversion of the measured visibilities; it can deal with the non-coplanar baseline effect and can be applied to telescopes with direction-dependent gains. The code is more computationally demanding than some existing methods, but it is highly parallelizable and scales well to clusters of CPUs and GPUs. The algorithm is also extremely flexible, allowing the solution of the deconvolution problem on arbitrarily placed pixels.

[ascl:1511.003] SkyView Virtual Telescope

The SkyView Virtual Telescope provides access to survey datasets ranging from the radio through the gamma-ray regimes. Over 100 survey datasets are currently available. The SkyView library referenced here is used as the basis for the SkyView web site (at http://skyview.gsfc.nasa.gov) but is designed for individual use by researchers as well.

SkyView's approach to accessing surveys is distinct from that of most other toolkits. Rather than providing links to the original data, SkyView immediately re-renders the source data in the user-requested reference frame, projection, scaling, orientation, etc. The library includes a set of geometry transformation and mosaicking tools that may be integrated into other applications independently of SkyView.

[ascl:1312.007] SkyNet: Neural network training tool for machine learning in astronomy

SkyNet is an efficient and robust neural network training code for machine learning. It is able to train large and deep feed-forward neural networks, including autoencoders, for use in a wide range of supervised and unsupervised learning applications, such as regression, classification, density estimation, clustering and dimensionality reduction. SkyNet is implemented in C/C++ and fully parallelized using MPI.

[ascl:1710.005] SkyNet: Modular nuclear reaction network library

The general-purpose nuclear reaction network SkyNet evolves the abundances of nuclear species under the influence of nuclear reactions. SkyNet can be used to compute the nucleosynthesis evolution in all astrophysical scenarios where nucleosynthesis occurs. Any list of isotopes can be evolved and SkyNet supports various different types of nuclear reactions. SkyNet is modular, permitting new or existing physics, such as nuclear reactions or equations of state, to be easily added or modified.

[ascl:1010.066] SkyMaker: Astronomical Image Simulations Made Easy

SkyMaker is a program that simulates astronomical images. It accepts object lists in ASCII generated by the Stuff program to produce realistic astronomical fields. SkyMaker is part of the EFIGI development project.

[ascl:1408.007] Skycorr: Sky emission subtraction for observations without plain sky information

Skycorr is an instrument-independent sky subtraction code that scales physically motivated groups of sky lines in a reference sky spectrum via a fitting approach to improve the removal of sky lines from the object spectrum. Possible wavelength shifts between the two spectra are corrected by fitting Chebyshev polynomials and by advanced rebinning without a decrease in resolution. For the correction, the optimized sky line spectrum and the automatically separated sky continuum (without scaling) are subtracted from the input object spectrum. Tests show that Skycorr performs well (per cent level residuals) for data in different wavelength regimes and of different resolutions, even for relatively long time lags between the object and the reference sky spectrum. Lower quality results are mainly restricted to wavelengths that are not dominated by airglow lines, or to pseudo-continua from unresolved strong emission bands.

[ascl:1109.019] SkyCat: Visualization and Catalog and Data Access Tool

SkyCat is a tool that combines visualization of images and access to catalogs and archive data for astronomy. The tool, developed in Tcl/Tk, was originally conceived as a demo of the capabilities of the class library that was developed for the VLT. The Skycat sources currently consist of five packages:

  • Tclutil - Generic Tcl and C++ utilities
  • Astrotcl - Astronomical Tcl and C++ utilities
  • RTD - Real-time Display classes and widgets
  • Catlib - Catalog library and widgets
  • Skycat - Skycat application and library package
All of the required packages are always included in the tarfile.

[ascl:1609.014] Sky3D: Time-dependent Hartree-Fock equation solver

Written in Fortran 90, Sky3D solves the static or dynamic equations on a three-dimensional Cartesian mesh with isolated or periodic boundary conditions and no further symmetry assumptions. Pairing can be included in the BCS approximation for the static case. The code can be easily modified to include additional physics or special analysis of the results and requires LAPACK and FFTW3.

[ascl:1109.003] SKIRT: Stellar Kinematics Including Radiative Transfer

SKIRT is a radiative transfer code based on the Monte Carlo technique. The name SKIRT, acronym for Stellar Kinematics Including Radiative Transfer, reflects the original motivation for its creation: it has been developed to study the effects of dust absorption and scattering on the observed kinematics of dusty galaxies. In a second stage, the SKIRT code was extended with a module to self-consistently calculate the dust emission spectrum under the assumption of local thermal equilibrium. This LTE version of SKIRT has been used to model the dust extinction and emission of various types of galaxies, as well as circumstellar discs and clumpy tori around active galactic nuclei. A new, extended version of SKIRT code can perform efficient 3D radiative transfer calculations including a self-consistent calculation of the dust temperature distribution and the associated FIR/submm emission with a full incorporation of the emission of transiently heated grains and PAH molecules.

[ascl:1102.020] SKID: Finding Gravitationally Bound Groups in N-body Simulations

SKID finds gravitationally bound groups in N-body simulations. The SKID program will group different types of particles depending on the type of input binary file; these could be dark matter particles, gas particles, star particles, or gas and star particles, depending on what is in the input tipsy binary file. Once groups with at least a certain minimum number of members have been determined, SKID will remove particles which are not bound to the group. SKID must use the original positions of all the particles to determine whether or not particles are bound. This procedure, which we call unbinding, again depends on the type of grouping we are dealing with. There are two cases: one for dark matter only or star particles only (case 1 unbinding), and the other for inputs that include gas, or stars in a dark matter environment (case 2 unbinding).

Skid version 1.3 is a much improved version of the old denmax-1.1 version. The new name was given to avoid confusion with the DENMAX program of Gelb & Bertschinger, and although it is based on the same idea it represents a substantial evolution in the method.

[ascl:1903.002] SIXTE: Simulation of X-ray Telescopes

SIXTE simulates X-Ray telescope observation; the software performs instrument performance analyses and produces simulated event files for mission and analysis studies. SIXTE strives to find a compromise between exactness of the simulation and speed. Using calibration files such as the PSF, RMF and ARF makes efficient simulations possible at comparably high speed, even though they include nonlinear effects such as pileup. Setups for some current and future missions, such as XMM-Newton and Athena, are included in the package; others can be added by the user with relatively little effort through specifying the main instrument characteristics in a flexible, human-readable XML-based format. Properties of X-ray sources to be simulated are described in a detector-independent format, i.e., the same input can be used for simulating observations with all available instruments, and the same input can also be used for simulations with the SIMX simulator. The input files are easily generated from standard data such as XSPEC (ascl:9910.005) spectral models or FITS images with tools provided with the SIXTE distribution. The input data scale well from single point sources up to very complicated setups.

[ascl:1111.008] SITools2: A Framework for Archival Systems

SITools2 is a generic CNES tool produced by a joint effort between CNES and scientific laboratories. SITools provides a self-manageable data access layer that can be deployed on existing scientific laboratory databases. This new version of SITools is a Java-based framework, under an open source license, that provides a portable, highly configurable archive system that is easy for laboratories to use, with a plugin mechanism so developers can add their own applications.

[ascl:1212.008] SIR: Stokes Inversion based on Response functions

SIR is a general-purpose code capable of dealing with gradients of the physical quantities with height. It admits one and two-component model atmospheres. It allows the recovery of the stratification of the temperature, the magnetic field vector, and the line of sight velocity through the atmosphere, and the micro- and macroturbulence velocities - which are assumed to be constant with depth. It is based on the response functions, which enter a Marquardt nonlinear least-squares algorithm in a natural way. Response functions are calculated at the same time as the full radiative transfer equation for polarized light is integrated, which determines values of many free parameters in a reasonable computation time. SIR demonstrates high stability, accuracy, and uniqueness of results, even when simulated observations present signal-to-noise ratios of the order of the lowest acceptable values in real observations.

[ascl:1609.018] SIP: Systematics-Insensitive Periodograms

SIP (Systematics-Insensitive Periodograms) extends the generative model used to create traditional sine-fitting periodograms for finding the frequency of a sinusoid: in addition to a sum of sine and cosine functions over a grid of frequencies, the model includes systematic trends based on a set of eigen light curves, producing periodograms with vastly reduced systematic features. Acoustic oscillations in giant stars and stellar rotation periods can be recovered from SIP periodograms without detrending. The code can also be applied to detect other periodic phenomena, including eclipsing binaries and short-period exoplanet candidates.
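
In essence the periodogram comes from a linear least-squares fit at each trial frequency, with the systematics basis appended to the sine and cosine columns of the design matrix. The schematic Python version below uses fabricated data and fake "eigen light curves" and is not the SIP code itself.

    import numpy as np

    # Schematic systematics-insensitive periodogram: at each trial frequency fit
    # flux = a*sin(2*pi*f*t) + b*cos(2*pi*f*t) + sum_k w_k * ELC_k(t)
    # by linear least squares and record the fraction of variance explained.
    rng = np.random.default_rng(4)
    t = np.linspace(0, 80, 2000)                       # days
    trend = 0.01 * (t / 80.0)**2                       # fake systematic trend
    elcs = np.vstack([trend, np.ones_like(t)])         # fake eigen light curves + offset
    flux = 0.005 * np.sin(2 * np.pi * t / 7.3) + trend + rng.normal(0, 0.003, t.size)

    freqs = np.linspace(0.02, 0.5, 400)                # cycles per day
    power = np.empty_like(freqs)
    for i, f in enumerate(freqs):
        A = np.column_stack([np.sin(2 * np.pi * f * t),
                             np.cos(2 * np.pi * f * t),
                             elcs.T])
        coeffs, *_ = np.linalg.lstsq(A, flux, rcond=None)
        model = A @ coeffs
        power[i] = 1.0 - np.sum((flux - model)**2) / np.sum((flux - flux.mean())**2)

    print("best period: %.2f days" % (1.0 / freqs[np.argmax(power)]))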

[ascl:1010.026] SingLe: A F90-package devoted to Softened Gravity in gaseous discs

Because Newton's law of gravitation diverges as the relative separation |r'-r| tends to zero, it is common to add a positive constant λ, also known as the "softening length", i.e.:

|r'-r|² ← |r'-r|² + λ².

SingLe determines the appropriate value of this softening length λ for a given local disc structure (thickness 2h and vertical stratification ρ), in the axially symmetric, flat disc limit, preserving as well as possible the Newtonian character of the gravitational potential and the associated forces. The mass density ρ(z) is assumed to be locally expandable in the z-direction according to:

ρ(z) = ρ0 [1 + a1 (z/h)² + ... + aq (z/h)^(2q) + ... + aN (z/h)^(2N)].
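
As a reminder of what the substitution does in practice, the tiny Python sketch below compares the Newtonian acceleration from a point mass with the softened (Plummer-type) form obtained when |r'-r|² + λ² replaces |r'-r|² in the potential; the value of λ and all other numbers are arbitrary and unrelated to the values SingLe would derive.

    import numpy as np

    # Softened vs. Newtonian acceleration from a point mass (arbitrary units):
    # potential -G*M/sqrt(r**2 + lam**2) gives acceleration G*M*r/(r**2 + lam**2)**1.5
    G, M, lam = 1.0, 1.0, 0.1
    r = np.logspace(-2, 1, 5)

    a_newton = G * M / r**2
    a_soft = G * M * r / (r**2 + lam**2)**1.5

    for ri, an, asf in zip(r, a_newton, a_soft):
        print(f"r = {ri:7.3f}   Newtonian = {an:10.3f}   softened = {asf:10.3f}")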

[ascl:1708.019] SINFONI Pipeline: Data reduction pipeline for the Very Large Telescope SINFONI spectrograph

The SINFONI pipeline reduces data from the Very Large Telescope's SINFONI (Spectrograph for INtegral Field Observations in the Near Infrared) instrument. It can evaluate the detector linearity and generate a corresponding non-linear pixel map; create a master dark and a hot-pixel map; and create a master flat and a map of pixels whose intensities are greater than a given threshold. It can also compute the optical distortions and slitlet distances, perform wavelength calibration, PSF, telluric standard and other science data reduction, coadd bad pixel maps, collapse a cube to an image over a given wavelength range, and perform cube arithmetic, among other useful tasks.

[ascl:1307.013] SIMX: Event simulator

SIMX simulates a photon-counting detector's response to an input source, including a simplified model of any telescope. The code is not a full ray-trace, but a convolution tool that uses standard descriptions of the telescope PSF (via either a simple Gaussian parameter, an energy-dependent encircled-energy function, or an image of the PSF) and the detector response (using the OGIP response function) to model how sources will appear. SIMX uses a predefined set of PSFs, vignetting information, and instrumental responses and outputs to make the simulation. It is designed to be an approximation tool for estimating issues such as source confusion, background effects, pileup, and other similar issues.
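
To illustrate the convolution-style approach in the simplest possible terms (this is not SIMX's interface), an ideal source image can be blurred with a Gaussian PSF, scaled by an exposure, and converted to Poisson counts; all numbers below are invented.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    # Simplest possible convolution-style detector simulation, illustrative only.
    rng = np.random.default_rng(6)
    image = np.zeros((128, 128))
    image[64, 64] = 50.0                  # point source, counts/s in the ideal image
    image[30, 90] = 5.0                   # a fainter neighbour

    psf_sigma_pix = 3.0                   # Gaussian PSF parameter
    exposure = 100.0                      # seconds
    background = 0.01                     # counts/s/pixel

    expected = gaussian_filter(image, psf_sigma_pix) * exposure + background * exposure
    observed = rng.poisson(expected)      # Poisson realization of the expected counts
    print("total counts:", observed.sum(), " peak pixel:", observed.max())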

[ascl:1904.016] simuTrans: Gravity-darkened exoplanet transit simulator

simuTrans models transit light curves affected by gravity-darkened stars. The code defines a star on a grid by modeling the brightness of each point as blackbody emission, then sets a series of parameters and uses emcee (ascl:1303.002) to explore the posterior probability distribution for the remaining fitted parameters and determine their best-fit values.

[ascl:1903.006] SimSpin: Kinematic analysis of galaxy simulations

The R-package SimSpin measures the kinematics of a galaxy simulation as if it had been observed using an IFU. The functions included in the package can produce a kinematic data cube and measure the "observables" from this data cube, specifically the observable spin parameter λr. This package, once installed, is fully documented and tested.

[ascl:1606.010] SimpLens: Interactive gravitational lensing simulator

SimpLens illustrates some of the theoretical ideas important in gravitational lensing in an interactive way. After parameters for an elliptical mass distribution and external mass are set, SimpLens displays the mass profile and source position, the lens potential and image locations, and indicates the image magnifications and contours of virtual light-travel time. A lens profile can be made shallower or steeper with little change in the image positions and with only the total magnification affected.

[ascl:1110.022] simple_cosfitter: Supernova-centric Cosmological Fitter

This is an implementation of a fairly simple-minded luminosity distance fitter, intended for use with supernova data. The calculational technique is based on evaluating the χ² of the model fit on a grid and marginalization over various nuisance parameters. Of course, the nature of these things is that this code has gotten steadily more complex, so perhaps the simple moniker is no longer justified.
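
A hedged toy version of the grid-plus-marginalization approach is sketched below: χ² of supernova distance moduli is evaluated over a grid in Ωm (flat ΛCDM), and the likelihood is numerically marginalized over a constant magnitude offset. The data, errors, grids, and cosmology are fabricated for illustration and do not reproduce simple_cosfitter's actual options.

    import numpy as np

    # Toy grid-based chi^2 fit to supernova distance moduli in flat LambdaCDM,
    # numerically marginalizing over a constant magnitude offset.
    c_km_s = 299792.458

    def mu_model(z, om, h0=70.0, n=200):
        # distance modulus from trapezoidal integration of 1/E(z)
        mu = np.empty_like(z)
        for i, zi in enumerate(z):
            zz = np.linspace(0.0, zi, n)
            ez = np.sqrt(om * (1 + zz)**3 + (1.0 - om))
            dc = c_km_s / h0 * np.trapz(1.0 / ez, zz)      # comoving distance [Mpc]
            mu[i] = 5.0 * np.log10((1 + zi) * dc) + 25.0
        return mu

    rng = np.random.default_rng(5)
    z_obs = np.sort(rng.uniform(0.02, 1.0, 40))
    mu_obs = mu_model(z_obs, 0.3) + rng.normal(0, 0.15, z_obs.size)
    sig = np.full_like(mu_obs, 0.15)

    om_grid = np.linspace(0.05, 0.6, 56)
    offsets = np.linspace(-0.5, 0.5, 101)                  # nuisance magnitude offset
    chi2_marg = np.empty_like(om_grid)
    for i, om in enumerate(om_grid):
        r = mu_obs - mu_model(z_obs, om)
        chi2 = np.array([np.sum(((r - off) / sig)**2) for off in offsets])
        like = np.trapz(np.exp(-0.5 * (chi2 - chi2.min())), offsets)
        chi2_marg[i] = chi2.min() - 2.0 * np.log(like)     # effective chi^2 after marginalization

    print("best-fit Omega_m ~ %.2f" % om_grid[np.argmin(chi2_marg)])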

[ascl:1010.025] SimFast21: Simulation of the Cosmological 21cm Signal

SimFast 21 generates a simulation of the cosmological 21cm signal. While limited to low spatial resolution, the next generation low-frequency radio interferometers that target 21 cm observations during the era of reionization and prior will have instantaneous fields-of-view that are many tens of square degrees on the sky. Predictions related to various statistical measurements of the 21 cm brightness temperature must then be pursued with numerical simulations of reionization with correspondingly large volume box sizes, of order 1000 Mpc on one side. The authors pursued a semi-numerical scheme to simulate the 21 cm signal during and prior to Reionization by extending a hybrid approach where simulations are performed by first laying down the linear dark matter density field, accounting for the non-linear evolution of the density field based on second-order linear perturbation theory as specified by the Zel'dovich approximation, and then specifying the location and mass of collapsed dark matter halos using the excursion-set formalism. The location of ionizing sources and the time evolving distribution of ionization field is also specified using an excursion-set algorithm. They account for the brightness temperature evolution through the coupling between spin and gas temperature due to collisions, radiative coupling in the presence of Lyman-alpha photons and heating of the intergalactic medium, such as due to a background of X-ray photons. The method is capable of producing the required large volume simulations with adequate resolution in a reasonable time so a large number of realizations can be obtained with variations in assumptions related to astrophysics and background cosmology that govern the 21 cm signal.

[submitted] SimCADO - An observations simulator for infrared telescopes and instruments

SimCADO is a Python package that allows the user to simulate observations with any NIR/Vis imaging system. The package was originally designed to simulate images for the European Extremely Large Telescope (ELT) and MICADO; however, with the proper input it is capable of simulating observations from many different telescope + instrument configurations.

The documentation can be found here: https://simcado.readthedocs.io/en/latest/

[ascl:1811.011] SIM5: Library for ray-tracing and radiation transport in general relativity

The SIM5 library contains routines for relativistic raytracing and radiation transfer in GR. Written in C with a Python interface, it has a special focus on raytracing from accretion disks, tori, hot spots or any other 3D configuration of matter in Kerr geometry, but it can be used with any other metric as well. It handles both optically thick and thin sources as well as transport of polarization of the radiation and calculates the propagation of light rays from the source to an observer through a curved spacetime. It supports parallelization and runs on GPUs.

[ascl:1603.001] SILSS: SPHERE/IRDIS Long-Slit Spectroscopy pipeline

The ESO's VLT/SPHERE instrument includes a unique long-slit spectroscopy (LSS) mode coupled with Lyot coronagraphy in its infrared dual-band imager and spectrograph (IRDIS) for spectral characterization of young, giant exoplanets detected by direct imaging. The SILSS pipeline is a combination of the official SPHERE pipeline and additional custom IDL routines developed within the SPHERE consortium for the speckle subtraction and spectral extraction of a companion's spectrum; it offers a complete end-to-end pipeline, from raw data (science+calibrations) to a final spectrum of the companion. SILSS works on both the low-resolution (LRS) and medium-resolution (MRS) data, and allows correction for some of the known biases of the instrument. Documentation is included in the header of the main routine of the pipeline.

[ascl:1107.016] SIGPROC: Pulsar Signal Processing Programs

SIGPROC is a package designed to standardize the initial analysis of the many types of fast-sampled pulsar data. Currently recognized machines are the Wide Band Arecibo Pulsar Processor (WAPP), the Penn State Pulsar Machine (PSPM), the Arecibo Observatory Fourier Transform Machine (AOFTM), the Berkeley Pulsar Processors (BPP), the Parkes/Jodrell 1-bit filterbanks (SCAMP) and the filterbank at the Ooty radio telescope (OOTY). The SIGPROC tools should help users look at their data quickly, without the need to write (yet) another routine to read data or worry about big/little endian compatibility (byte swapping is handled automatically).

[ascl:1110.023] SiFTO: An Empirical Method for Fitting SN Ia Light Curves

SiFTO is an empirical method for modeling Type Ia supernova (SN Ia) light curves by manipulating a spectral template. We make use of high-redshift SN data when training the model, allowing us to extend it bluer than rest-frame U. This increases the utility of our high-redshift SN observations by allowing us to use more of the available data. We find that when the shape of the light curve is described using a stretch prescription, applying the same stretch at all wavelengths is not an adequate description. SiFTO therefore uses a generalization of stretch which applies different stretch factors as a function of both the wavelength of the observed filter and the stretch in the rest-frame B band. SiFTO has been compared to other published light-curve models by applying them to the same set of SN photometry, and it has been demonstrated that SiFTO and SALT2 perform better than the alternatives when judged by the scatter around the best-fit luminosity distance relationship. When SiFTO and SALT2 are trained on the same data set the cosmological results agree.

[ascl:1703.007] sidm-nbody: Monte Carlo N-body Simulation for Self-Interacting Dark Matter

Self-Interacting Dark Matter (SIDM) is a hypothetical model for cold dark matter in the Universe. A strong interaction between dark matter particles introduces different physics inside dark-matter haloes, making the density profile cored, reducing the number of subhaloes, and triggering gravothermal collapse. sidm-nbody is an N-body simulation code with Direct Simulation Monte Carlo scattering for self-interaction, together with some codes to analyse the gravothermal collapse of isolated haloes. The N-body simulation is based on GADGET 1.1.

[ascl:1706.009] sick: Spectroscopic inference crank

sick infers astrophysical parameters from noisy observed spectra. Phenomena that can alter the data (e.g., redshift, continuum, instrumental broadening, outlier pixels) are modeled and simultaneously inferred with the astrophysical parameters of interest. This package relies on emcee (ascl:1303.002); it is best suited for situations where a grid of model spectra already exists, and one would like to infer model parameters given some data.

[ascl:1411.026] sic: Sparse Inpainting Code

sic (Sparse Inpainting Code) generates Gaussian, isotropic CMB realizations, masks them, and recovers the large-scale masked data using sparse inpainting; it is written in Fortran90.

[ascl:1704.003] Shwirl: Meaningful coloring of spectral cube data with volume rendering

Shwirl visualizes spectral data cubes with meaningful coloring methods. The program has been developed to investigate transfer functions, which combine volumetric elements (or voxels) to set the color, and graphics shaders, functions used to compute several properties of the final image such as color, depth, and/or transparency, as enablers for scientific visualization of astronomical data. The program uses Astropy (ascl:1304.002) to handle FITS files and the World Coordinate System, Qt (and PyQt) for the user interface, and VisPy, an object-oriented Python visualization library binding onto OpenGL.

[ascl:1110.004] SHTOOLS: Tools for Working with Spherical Harmonics

SHTOOLS is an archive of Fortran 95-based software that can be used to perform (among other operations) spherical harmonic transforms and reconstructions, rotations of spherical harmonic coefficients, and multitaper spectral analyses on the sphere. The package accommodates any standard normalization of the spherical harmonic functions ("geodesy" 4π normalized, Schmidt semi-normalized, orthonormalized, and unnormalized), and either real or complex spherical harmonics can be employed. Spherical harmonic transforms are calculated by exact quadrature rules using either (1) the sampling theorem of Driscoll and Healy (1994) where data are equally sampled (or spaced) in latitude and longitude, or (2) Gauss-Legendre quadrature. A least squares inversion routine for irregularly sampled data is included as well. The Condon-Shortley phase factor of (-1)^m can be used or excluded with the associated Legendre functions. The spherical harmonic transforms are accurate to approximately degree 2800, corresponding to a spatial resolution of better than 4 arc minutes. Routines are included for performing localized multitaper spectral analyses and standard gravity calculations, such as computation of the geoid, and the determination of the potential associated with finite-amplitude topography. The routines are fast. Spherical harmonic transforms and reconstructions take on the order of 1 second for bandwidths less than 600 and about 3 minutes for bandwidths close to 2800.
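
For a feel of what a quadrature-based spherical harmonic analysis does (this does not use SHTOOLS and is far slower and more limited), the Python sketch below recovers the coefficients of a band-limited test function with Gauss-Legendre nodes in colatitude and equally spaced longitudes; note that scipy's sph_harm takes the azimuthal angle before the polar angle and returns orthonormalized complex harmonics with the Condon-Shortley phase.

    import numpy as np
    from scipy.special import sph_harm

    # Illustrative spherical-harmonic analysis by Gauss-Legendre quadrature.
    lmax = 4
    nlat = lmax + 1                       # Gauss-Legendre nodes in cos(theta)
    nlon = 2 * lmax + 1                   # equally spaced longitudes

    nodes, weights = np.polynomial.legendre.leggauss(nlat)
    theta = np.arccos(nodes)              # polar angle (colatitude)
    phi = np.linspace(0.0, 2.0 * np.pi, nlon, endpoint=False)
    PHI, THETA = np.meshgrid(phi, theta)  # shapes (nlat, nlon)

    # band-limited test function: Re Y_3^2 + 0.5 * Y_2^0
    f = np.real(sph_harm(2, 3, PHI, THETA)) + 0.5 * np.real(sph_harm(0, 2, PHI, THETA))

    dphi = 2.0 * np.pi / nlon
    for l in range(lmax + 1):
        for m in range(-l, l + 1):
            ylm = sph_harm(m, l, PHI, THETA)
            clm = np.sum(weights[:, None] * np.conj(ylm) * f) * dphi
            if abs(clm) > 1e-10:
                print(f"l={l} m={m:+d}  c_lm = {clm.real:.3f}")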

[ascl:1107.005] Sherpa: CIAO Modeling and Fitting Package

Sherpa is the CIAO (ascl:1311.006) modeling and fitting application made available by the Chandra X-ray Center (CXC). It can be used for analysis of images, spectra and time series from many telescopes, including optical telescopes such as Hubble. Sherpa is flexible, modular and extensible. It has an IPython user interface and it is also an importable Python module. Sherpa models, optimization and statistic functions are available via both C++ and Python for software developers wishing to link such functions directly to their own compiled code.

The CIAO 4.3 Sherpa release supports fitting of 1-D X-ray spectra from Chandra and other X-ray missions, as well as 1-D non-X-ray data, including ASCII data arrays, radial profiles, and lightcurves. The options for grating data analysis include fitting the spectrum with multiple response files required for overlapping orders in LETG observations. Modeling of 2-D spatial data is fully supported, including the PSF and exposure maps. User specified models can be added to Sherpa with advanced "user model" functionality.

[ascl:1108.002] SHERA: SHEar Reconvolution Analysis

Current and upcoming wide-field, ground-based, broad-band imaging surveys promise to address a wide range of outstanding problems in galaxy formation and cosmology. Several such uses of ground-based data, especially weak gravitational lensing, require highly precise measurements of galaxy image statistics with careful correction for the effects of the point-spread function (PSF). The SHERA (SHEar Reconvolution Analysis) software simulates ground-based imaging data with realistic galaxy morphologies and observing conditions, starting from space-based data (from COSMOS, the Cosmological Evolution Survey) and accounting for the effects of the space-based PSF. This code simulates ground-based data, optionally with a weak lensing shear applied, in a model-independent way using a general Fourier space formalism. The utility of this pipeline is that it allows for a precise, realistic assessment of systematic errors due to the method of data processing, for example in extracting weak lensing galaxy shape measurements or galaxy radial profiles, given user-supplied observational conditions and real galaxy morphologies. Moreover, the simulations allow for the empirical test of error estimates and determination of parameter degeneracies, via generation of many noise maps. The public release of this software, along with a large sample of cleaned COSMOS galaxy images (corrected for charge transfer inefficiency), should enable upcoming ground-based imaging surveys to achieve their potential in the areas of precision weak lensing analysis, galaxy profile measurement, and other applications involving detailed image analysis.

This code is no longer maintained and has been superseded by GalSim (ascl:1402.009).

[ascl:1108.017] SHELLSPEC: Simple Radiative Transfer along Line of Sight in Moving Media

SHELLSPEC is designed to calculate lightcurves, spectra and images of interacting binaries and extrasolar planets immersed in a moving circumstellar environment which is optically thin. It solves simple radiative transfer along the line of sight in moving media. The assumptions include LTE and optional known state quantities and velocity fields in 3D. Optional (non)transparent objects such as a spot, disc, stream, jet, shell or stars, as well as empty space, may be defined (embedded) in 3D and their composite synthetic spectrum calculated. A Roche model can be used as a boundary condition for the radiative transfer. The program does not solve the inverse problem of finding the stellar and orbital parameters.

[ascl:1508.010] SHDOM: Spherical Harmonic Discrete Ordinate Method for atmospheric radiative transfer

The Spherical Harmonic Discrete Ordinate Method (SHDOM) radiative transfer model computes polarized monochromatic or spectral band radiative transfer in a one, two, or three-dimensional medium for either collimated solar and/or thermal emission sources of radiation. The model is written in a variant of Fortran 77 and in Fortran90 and requires a Fortran 90 compiler. Also included are programs for generating the optical property files input to SHDOM from physical properties of water cloud particles and aerosols.

[ascl:1811.005] Shark: Flexible semi-analytic galaxy formation model

Shark is a flexible semi-analytic galaxy formation model for easy exploration of different physical processes. Shark has been implemented with several models for gas cooling, active galactic nuclei, stellar and photo-ionization feedback, and star formation (SF). The software can determine the stellar mass function and stellar–halo mass relation at z=0–4; cosmic evolution of the star formation rate density, stellar mass, atomic and molecular hydrogen; local gas scaling relations; and structural galaxy properties. It performs particularly well for the mass–size relation for discs/bulges, the gas–stellar mass and stellar mass–metallicity relations. Shark is written in C++11 and has been parallelized with OpenMP.

[ascl:1307.014] Shapelets: Image Modelling

Shapelets are a complete, orthonormal set of 2D basis functions constructed from Laguerre or Hermite polynomials weighted by a Gaussian. A linear combination of these functions can be used to model any image, in a similar way to Fourier or wavelet synthesis. The shapelet decomposition is particularly efficient for images localized in space, and provide a high level of compression for individual galaxies in astronomical data. The basis has many elegant mathematical properties that make it convenient for image analysis and processing.

[ascl:1204.010] Shape: A 3D Modeling Tool for Astrophysics

Shape is a flexible interactive 3D morpho-kinematical modeling application for astrophysics. It reduces the restrictions on the physical assumptions, data type and amount required for a reconstruction of an object's morphology. It applies interactive graphics and allows astrophysicists to provide a-priori knowledge about the object by interactively defining 3D structural elements. By direct comparison of model prediction with observational data, model parameters can then be automatically optimized to fit the observation.

[ascl:1011.005] Shape of Cosmic String Loops

Complicated cosmic string loops will fragment until they reach simple, non-intersecting ("stable") configurations. Through extensive numerical study we characterize these attractor loop shapes including their length, velocity, kink, and cusp distributions. We find that an initial loop containing M harmonic modes will, on average, split into 3M stable loops. These stable loops are approximately described by the degenerate kinky loop, which is planar and rectangular, independently of the number of modes on the initial loop. This is confirmed by an analytic construction of a stable family of perturbed degenerate kinky loops. The average stable loop is also found to have a 40% chance of containing a cusp. We examine the properties of stable loops of different lengths and find only slight variation. Finally we develop a new analytic scheme to explicitly solve the string constraint equations.

[ascl:1605.003] Shadowfax: Moving mesh hydrodynamical integration code

Shadowfax simulates galaxy evolution. Written in object-oriented modular C++, it evolves a mixture of gas, subject to the laws of hydrodynamics and gravity, and any collisionless fluid only subject to gravity, such as cold dark matter or stars. For the hydrodynamical integration, it makes use of a (co-) moving Lagrangian mesh. The code has a 2D and 3D version, contains utility programs to generate initial conditions and visualize simulation snapshots, and its input/output is compatible with a number of other simulation codes, e.g. Gadget2 (ascl:0003.001) and GIZMO (ascl:1410.003).

[ascl:1712.015] SgrbWorldModel: Short-duration Gamma-Ray Burst World Model

SgrbWorldModel, written in Fortran 90, presents an attempt at modeling the population distribution of the short-duration class of Gamma-Ray Bursts (SGRBs) as detected by NASA's now-defunct Burst And Transient Source Experiment (BATSE) onboard the Compton Gamma Ray Observatory (CGRO). It is assumed that the population distribution of SGRBs is well fit by a multivariate log-normal distribution, whose differential cosmological rate of occurrence follows the Star Formation Rate (SFR) convolved with a log-normal binary-merger delay-time distribution. The best-fit parameters of the model are then found by maximizing the likelihood of the observed data by the BATSE detectors via a native built-in Adaptive Metropolis-Hastings Markov-Chain Monte Carlo (AMH-MCMC) sampler that is part of the code. A model for the detection algorithm of the BATSE detectors is also provided.

[ascl:1210.005] SGNAPS: Software for Graphical Navigation, Analysis and Plotting of Spectra

SGNAPS allows the user to plot a one-dimensional spectrum together with the corresponding two-dimensional spectrum and a reference spectrum (for example the sky spectrum). This makes it possible to check on the reality of spectral features that are present in the one-dimensional spectrum but could be due to bad sky subtraction or fringing residuals. It is also possible to zoom in and out of all three spectra, edit the one-dimensional spectrum, smooth it with a simple square window function, measure the signal-to-noise ratio over a selected wavelength interval, and fit the position of a selected spectral line. SGNAPS also allows the astronomer to obtain quick redshift estimates by providing a tool to fit or mark the position of a spectral line, and a function that will compute a list of possible redshifts based on a list of known lines in galaxy spectra. SGNAPS is derived from the plotting tools of VIPGI and contains almost all of their capabilities.

[ascl:1712.007] SFoF: Friends-of-friends galaxy cluster detection algorithm

SFoF is a friends-of-friends galaxy cluster detection algorithm that operates in either spectroscopic or photometric redshift space. The linking parameters, both transverse and along the line-of-sight, change as a function of redshift to account for selection effects.

[ascl:1304.013] SFH: Star Formation History

SFH is an efficient IDL tool that quickly computes accurate predictions for the baryon budget history in a galactic halo.

[ascl:1010.064] SExtractor: Source Extractor

This new software optimally detects, de-blends, measures and classifies sources from astronomical images: SExtractor (Source Extractor). A very reliable star/galaxy separation can be achieved on most images using a neural network trained with simulated images. Salient features of SExtractor include its ability to work on very large images, with minimal human intervention, and to deal with a wide variety of object shapes and magnitudes. It is therefore particularly suited to the analysis of large extragalactic surveys.

[ascl:1508.006] SExSeg: SExtractor segmentation

SExSeg forces SExtractor (ascl:1010.064) to run using a pre-defined segmentation map (the definition of objects and their borders). The defined segments double as isophotal apertures. SExSeg alters the detection image based on a pre-defined segmentation map while preparing your "analysis image" by subtracting the background in a separate SExtractor run (using parameters you specify). SExtractor is then run in "double-image" mode with the altered detection image and background-subtracted analysis image.

[ascl:1803.009] SETI-EC: SETI Encryption Code

The SETI Encryption code, written in Python, creates a message for use in testing the decryptability of a simulated incoming interstellar message. The code uses images in a portable bit map (PBM) format, then writes the corresponding bits into the message, and finally returns both a PBM image and a text (TXT) file of the entire message. The natural constants (c, G, h) and the wavelength of the message are defined in the first few lines of the code, followed by the reading of the input files and their conversion into 757 strings of 359 bits to give one page. Each header of a page, i.e. the little-endian binary code translation of the tempo-spatial yardstick, is calculated and written on-the-fly for each page.

[ascl:1304.009] Sérsic: Exact deprojection of Sérsic surface brightness profiles

Sérsic is an implementation of the exact deprojection of Sérsic surface brightness profiles described in Baes and Gentile (2011). This code depends on the mpmath Python library for an implementation of the Meijer G function required by the Baes and Gentile (hereafter B+G) formulas for rational values of the Sérsic index. Sérsic requires rational Sérsic indices, but any irrational number can be approximated arbitrarily well by some rational number. The code also depends on scipy, but the dependence is mostly for testing. The implementation of the formulas and the formulas themselves have undergone comprehensive testing.
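As a small, hypothetical illustration of the two points above (not the package's actual interface), the snippet below approximates an irrational-valued Sérsic index by a nearby rational number and shows the kind of mpmath Meijer G call the deprojection relies on.

    from fractions import Fraction
    import mpmath

    # Approximate an irrational Sersic index by a nearby rational p/q,
    # since the B+G formulas are evaluated for rational indices.
    n_target = 2.718281828          # illustrative "irrational" index
    n_rational = Fraction(n_target).limit_denominator(100)
    print(n_rational, float(n_rational))

    # The deprojection is expressed through the Meijer G function, which
    # mpmath evaluates; this trivial instance reduces to exp(-1).
    value = mpmath.meijerg([[], []], [[0], []], 1.0)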

[ascl:1312.001] SERPent: Scripted E-merlin Rfi-mitigation PipelinE for iNTerferometry

SERPent is an automated reduction and RFI-mitigation procedure that uses the SumThreshold methodology; it was originally developed for the LOFAR pipeline. SERPent is written in ParselTongue, enabling interaction with the Astronomical Image Processing System (AIPS). Moreover, SERPent is a simple "out of the box" Python script, which is easy to set up and is free of compilers.

[ascl:1102.010] SEREN: A SPH code for star and planet formation simulations

SEREN is an astrophysical Smoothed Particle Hydrodynamics code designed to investigate star and planet formation problems using self-gravitating hydrodynamics simulations of molecular clouds, star-forming cores, and protostellar disks.

SEREN is written in Fortran 95/2003 with a modular philosophy for adding features into the code. Each feature can be easily activated or deactivated by setting options in the Makefile before compiling the code. This has the added benefit of allowing unwanted features to be removed at the compilation stage, resulting in a smaller and faster executable program. SEREN is written with OpenMP directives to allow parallelization on shared-memory architectures.

[ascl:1404.005] SER: Subpixel Event Repositioning Algorithms

Subpixel Event Repositioning (SER) techniques significantly improve the already unprecedented spatial resolution of Chandra X-ray imaging with the Advanced CCD Imaging Spectrometer (ACIS). Chandra CCD SER techniques are based on the premise that the impact position of events can be refined, based on the distribution of charge among affected CCD pixels. Unlike ACIS SER models that are restricted to corner split (3- and 4-pixel) events and assume that such events take place at the split pixel corners, this IDL code uses two-pixel splits as well, and incorporates more realistic estimates of photon impact positions.

[ascl:1811.004] SEP: Source Extraction and Photometry

SEP (Source Extraction and Photometry) makes the core algorithms of Source Extractor (ascl:1010.064) available as a library of standalone functions and classes. These operate directly on in-memory arrays (no FITS files or configuration files). The code is derived from the Source Extractor code base (written in C) and aims to produce results compatible with Source Extractor whenever possible. SEP consists of a C library with no dependencies outside the standard library and a Python module that wraps the C library in a Pythonic API. The Python wrapper operates on NumPy arrays with NumPy as its only dependency. It is generated using Cython.

From Source Extractor, SEP includes background estimation, image segmentation (including on-the-fly filtering and source deblending), aperture photometry in circular and elliptical apertures, and source measurements such as Kron radius, "windowed" position fitting, and half-light radius. It also adds the following features that are not available in Source Extractor: optimized matched filter for variable noise in source extraction; circular annulus and elliptical annulus aperture photometry functions; local background subtraction in shape consistent with aperture in aperture photometry functions; exact pixel overlap mode in all aperture photometry functions; and masking of elliptical regions on images.
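A minimal usage sketch of the Python wrapper, following the pattern in SEP's documentation (background estimation, source extraction, circular aperture photometry); the toy noise image stands in for real data, and exact call signatures should be checked against the current documentation.

    import numpy as np
    import sep

    # toy image: Gaussian noise standing in for data read from a FITS file
    rng = np.random.default_rng(0)
    data = rng.normal(100.0, 5.0, size=(256, 256)).astype(np.float64)

    bkg = sep.Background(data)          # spatially varying background estimate
    data_sub = data - bkg               # background-subtracted image

    # detect sources 1.5 sigma above the global background RMS
    objects = sep.extract(data_sub, 1.5, err=bkg.globalrms)

    # circular aperture photometry (3-pixel radius) at the detected positions
    flux, fluxerr, flag = sep.sum_circle(data_sub, objects['x'], objects['y'],
                                         3.0, err=bkg.globalrms)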

[ascl:1807.026] SENR: Simple, Efficient Numerical Relativity

SENR (Simple, Efficient Numerical Relativity) provides the algorithmic framework that combines the C codes generated by NRPy+ (ascl:1807.025) into a functioning numerical relativity code. It is part of the numerical relativity code package SENR/NRPy+. The package extends previous implementations of the BSSN reference-metric formulation to a much broader class of curvilinear coordinate systems, making it suitable for modeling physical configurations with approximate or exact symmetries, such as modeling black hole dynamics.

[ascl:1504.009] Self-lensing binary code with Markov chain

The self-lensing binary code with Markov chain code was used to analyze the self-lensing binary system KOI-3278. It includes the MCMC modeling and the key figures.

[ascl:1411.007] segueSelect: SDSS/SEGUE selection function modelling

The Python package segueSelect automatically models the SDSS/SEGUE selection fraction -- the fraction of stars with good spectra -- as a continuous function of apparent magnitude for each plate. The selection function can be determined for any desired sample cuts in signal-to-noise ratio, u-g, r-i, and E(B-V). The package requires Pyfits (ascl:1207.009) and, for coordinate transformations, galpy (ascl:1411.008). It can calculate the KS probability that the spectroscopic sample was drawn from the underlying photometric sample with the model selection function, plot the cumulative distribution function in r-band apparent magnitude of the spectroscopic sample and of the photometric sample combined with the selection-function model for each plate, and, if galpy is installed, transform velocities into the Galactic coordinate frame. The code can also determine the selection function for SEGUE K stars.

[ascl:1607.020] SEEK: Signal Extraction and Emission Kartographer

SEEK (Signal Extraction and Emission Kartographer) processes time-ordered data from single-dish radio telescopes or from the simulation pipeline HIDE (ascl:1607.019), removes artifacts from Radio Frequency Interference (RFI), automatically applies flux calibration, and recovers the astronomical radio signal. With its companion code HIDE, it provides end-to-end simulation and processing of radio survey data.

[ascl:1901.008] SEDobs: Observational spectral energy distribution simulation

SEDobs uses state-of-the-art theoretical galaxy SEDs (spectral energy distributions) to create simulated observations of distant galaxies. It uses BC03 and M05 theoretical models and allows the user to configure the simulated observations that are needed. For a given simulated galaxy, the user is able to simulate multi-spectral and multi-photometric observations.

[ascl:1101.001] Second-order Tight-coupling Code

Prior to recombination, photons, electrons, and atomic nuclei rapidly scattered and behaved almost like a single tightly-coupled photon-baryon plasma. In order to solve the cosmological perturbation equations during that time, Cosmic Microwave Background (CMB) codes use the so-called tight-coupling approximation in which the problematic terms (i.e., the source of the stiffness) are expanded in inverse powers of the Thomson opacity. Most codes only keep the terms linear in the inverse Thomson opacity. We have developed a second-order tight-coupling code to test the validity of the usual first-order tight-coupling code. It is based on the publicly available code CAMB.

[ascl:1201.003] SeBa: Stellar and binary evolution

The stellar and binary evolution package SeBa is fully integrated into the kira integrator, although it can also be used as a stand-alone module for non-dynamical applications. Due to the interaction between stellar evolution and stellar dynamics, it is difficult to solve for the evolution of both systems in a completely self-consistent way. The trajectories of stars are computed using a block timestep scheme, as described earlier. Stellar and binary evolution is updated at fixed intervals (every 1/64 of a crossing time, typically a few thousand years). Any feedback between the two systems may thus experience a delay of at most one timestep. Internal evolution time steps may differ for each star and binary, and depend on binary period, perturbations due to neighbors, and the evolutionary state of the star. Time steps in this treatment vary from several milliseconds up to (at most) a million years.

[ascl:1210.012] SearchCal: The JMMC Evolutive Search Calibrator Tool

SearchCal builds an evolutive catalog of stars suitable as calibrators within any given user-defined angular distance and magnitude around a scientific target. SearchCal can select suitable bright calibration stars (V ≤ 10; K ≤ 5.0) for obtaining the ultimate precision of current interferometric instruments like the VLTI, as well as faint calibration stars up to K ~ 15 around the scientific target. Star catalogs available at the CDS are searched via web requests and provide the useful astrometric and photometric information for selecting calibrators. Missing photometry is computed with an accuracy of about 0.1 mag. The stellar angular diameter is estimated with a precision of about 10% through newly determined surface-brightness versus color-index relations based on the I, J, H and K magnitudes. For each star the squared visibility is computed taking into account the central wavelength and the maximum baseline of the predicted observations.

[ascl:1601.003] SCOUSE: Semi-automated multi-COmponent Universal Spectral-line fitting Engine

The Semi-automated multi-COmponent Universal Spectral-line fitting Engine (SCOUSE) is a spectral line fitting algorithm that fits Gaussian profiles to spectral line emission. It identifies the spatial area over which to fit the data and generates a grid of spectral averaging areas (SAAs). The spatially averaged spectra are fitted according to user-provided tolerance levels, and the best fit is selected using the Akaike Information Criterion, which weights the chi-squared of a best-fitting solution according to the number of free parameters. A more detailed inspection of the spectra can be performed to improve the fit through an iterative process, after which SCOUSE integrates the new solutions into the solution file.

[ascl:1609.006] SCIMES: Spectral Clustering for Interstellar Molecular Emission Segmentation

SCIMES identifies relevant molecular gas structures within dendrograms of emission using the spectral clustering paradigm. It is useful for decomposing objects in complex environments imaged at high resolution.

[ascl:1311.001] SciDB: Open Source DMAS for Scientific Research

SciDB is a DMAS (Data Management and Analytics Software System) optimized for data management of big data and for big analytics. SciDB is organized around multidimensional array storage, a generalization of relational tables, and is designed to be scalable up to petabytes and beyond. Complex analytics are simplified with SciDB because arrays and vectors are first-class objects with built-in optimized operations. Spatial operators and time-series analysis are easy to express. Interfaces to common scientific tools like R as well as programming languages like C++ and Python are provided.

[ascl:1505.008] SCEPtER: Stellar CharactEristics Pisa Estimation gRid

SCEPtER (Stellar CharactEristics Pisa Estimation gRid) estimates the stellar mass and radius given a set of observable quantities; the results are obtained by adopting a maximum likelihood technique over a grid of pre-computed stellar models. The code is quite flexible since different observables can be used, depending on their availability, as well as different grids of models.

[ascl:1803.003] scarlet: Source separation in multi-band images by Constrained Matrix Factorization

SCARLET performs source separation (aka "deblending") on multi-band images. It is geared towards optical astronomy, where scenes are composed of stars and galaxies, but it is straightforward to apply it to other imaging data. Separation is achieved through a constrained matrix factorization, which models each source with a Spectral Energy Distribution (SED) and a non-parametric morphology, or multiple such components per source. The code performs forced photometry (with PSF matching if needed) using an optimal weight function given by the signal-to-noise weighted morphology across bands. The approach works well if the sources in the scene have different colors and can be further strengthened by imposing various additional constraints/priors on each source. Because of its generic utility, this package provides a stand-alone implementation that contains the core components of the source separation algorithm. However, the development of this package is part of the LSST Science Pipeline; the meas_deblender package contains a wrapper to implement the algorithms here for the LSST stack.

[ascl:1209.012] Scanamorphos: Maps from scan observations made with bolometer arrays

Scanamorphos is an IDL program to build maps from scan observations made with bolometer arrays. Scanamorphos can post-process scan observations performed with the Herschel photometer arrays. This post-processing mainly consists of subtracting the total low-frequency noise (both its thermal and non-thermal components), masking cosmic ray hit residuals, and projecting the data onto a map. Although it was developed for Herschel, it is also applicable with minimal adjustment to scan observations made with other bolometer arrays provided they entail sufficient redundancy; it was successfully applied to P-Artemis, an instrument operating on the APEX telescope. Scanamorphos does not assume any particular noise model and does not apply any Fourier-space filtering to the data. It is an empirical tool using only the redundancy built into the observations, taking advantage of the fact that each portion of the sky is sampled at multiple times by multiple bolometers. The user is allowed to optionally visualize and control results at each intermediate step, but the processing is fully automated.

[ascl:1010.063] SCAMP: Automatic Astrometric and Photometric Calibration

Astrometric and photometric calibrations have remained the most tiresome step in the reduction of large imaging surveys. SCAMP has been written to address this problem. The program efficiently computes accurate astrometric and photometric solutions for any arbitrary sequence of FITS images in a completely automatic way. SCAMP is released under the GNU General Public License.

[ascl:1904.015] SBGAT: Small Bodies Geophysical Analysis Tool

SBGAT (Small Body Geophysical Analysis Tool) generates simulated data originating from small-body shape models, combined with advanced shape-modification properties. It uses polyhedral shape models from which mass properties such as volume, center of mass, and inertia can be computed, along with synthetic observations such as lightcurves and radar; these shape models can also be used within dynamical models, such as spherical harmonics and polyhedron gravity modeling. SBGAT can generate spherical harmonics expansions from constant-density polyhedra (and export them to JSON) and evaluate the spherical harmonics expansions. It can also generate YORP coefficients, multi-threaded Polyhedron Gravity Model gravity and potential evaluations, and synthetic light-curve and radar observations for single/primary asteroids.

SBGAT has two distinct packages: a dynamic library SBGAT Core that contains the data structure and algorithm backbone of SBGAT, and SBGAT Gui, which wraps the former inside a VTK, Qt user interface to facilitate user/data interaction. SBGAT Core can be used without the SBGAT Gui wrapper.

[ascl:1601.012] SavGolFilterCov: Savitzky Golay filter for data with error covariance

A Savitzky-Golay filter is often applied to data to smooth the data without greatly distorting the signal; however, almost all data inherently comes with noise, and the noise properties can differ from point to point. This Python script improves upon the traditional Savitzky-Golay filter by accounting for error covariance in the data. The inputs and arguments are modeled after scipy.signal.savgol_filter.
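For reference, the standard scipy call that the script's interface is modeled after is sketched below; how the covariance-aware version accepts its error covariance is not shown, since its exact signature is not described above.

    import numpy as np
    from scipy.signal import savgol_filter

    x = np.linspace(0.0, 4.0 * np.pi, 200)
    y = np.sin(x) + np.random.default_rng(1).normal(0.0, 0.2, x.size)  # noisy signal

    # standard Savitzky-Golay smoothing: 21-sample window, cubic polynomial
    y_smooth = savgol_filter(y, window_length=21, polyorder=3)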

[ascl:1309.005] SATMC: SED Analysis Through Monte Carlo

SATMC is a general purpose, MCMC-based SED fitting code written for IDL and Python. Following Bayesian statistics and Markov chain Monte Carlo algorithms, SATMC derives the best-fit parameter values and returns the sampling of parameter space used to construct confidence intervals and parameter-parameter confidence contours. The fitting may cover any range of wavelengths. The code is designed to incorporate any models (and potential priors) of the user's choice. The user guide lists all the relevant details for including observations, models and usage under both IDL and Python.

[ascl:1707.002] SASRST: Semi-Analytic Solutions for 1-D Radiative Shock Tubes

SASRST, a small collection of Python scripts, attempts to reproduce the semi-analytical one-dimensional equilibrium and non-equilibrium radiative shock tube solutions of Lowrie & Rauenzahn (2007) and Lowrie & Edwards (2008), respectively. The included code calculates the solution for a given set of input parameters and also plots the results using Matplotlib. This software was written to provide validation for numerical radiative shock tube solutions produced by a radiation hydrodynamics code.

[ascl:1404.004] SAS: Science Analysis System for XMM-Newton observatory

The Science Analysis System (SAS) is an extensive suite of software tasks developed to process the data collected by the XMM-Newton Observatory. The SAS extracts standard (spectra, light curves) and/or customized science products, and allows reproduction of the reduction pipelines run to get the PPS products from the ODF files. SAS includes a powerful and extensive suite of FITS file manipulation packages based on the Data Access Layer library.

[ascl:1904.020] SARAH: SUSY and non-SUSY model builder and analyzer

SARAH builds and analyzes SUSY and non-SUSY models. It calculates all vertices, mass matrices, tadpole equations, one-loop corrections for tadpoles and self-energies, and two-loop RGEs for a given model. SARAH writes model files for a variety of other software packages for dark matter studies, includes many SUSY and non-SUSY models, and makes implementing new models efficient and straightforward. Written in Mathematica, SARAH can also use output from Vevacious (ascl:1904.019) to check for the global minimum for a given model and parameter point.

[ascl:1210.029] Sapporo: N-body simulation library for GPUs

Sapporo mimics the behavior of GRAPE hardware and uses the GPU to perform high-precision gravitational N-body simulations. It makes use of CUDA and therefore only works on NVIDIA GPUs. N-body codes currently running on GRAPE-6 can switch to Sapporo by a simple relinking of the library. Sapporo's precision is comparable to that of GRAPE-6, even though internally the GPU hardware is limited to single-precision arithmetic. This limitation is effectively overcome by emulating double precision for calculating the distance between particles.

[ascl:0003.002] SAOImage DS9: A utility for displaying astronomical images in the X11 window environment

SAOImage DS9 is an astronomical imaging and data visualization application. DS9 supports FITS images and binary tables, multiple frame buffers, region manipulation, and many scale algorithms and colormaps. It provides for easy communication with external analysis tasks and is highly configurable and extensible via XPA and SAMP. DS9 is a stand-alone application. It requires no installation or support files. Versions of DS9 currently exist for Solaris, Linux, MacOSX, and Windows. All versions and platforms support a consistent set of GUI and functional capabilities. DS9 supports advanced features such as multiple frame buffers, mosaic images, tiling, blinking, geometric markers, colormap manipulation, scaling, arbitrary zoom, rotation, pan, and a variety of coordinate systems. DS9 also supports FTP and HTTP access. The GUI for DS9 is user configurable. GUI elements such as the coordinate display, panner, magnifier, horizontal and vertical graphs, button bar, and colorbar can be configured via menus or the command line. DS9 is a Tk/Tcl application which utilizes the SAOTk widget set. It also incorporates the X Public Access (XPA) mechanism to allow external processes to access and control its data, GUI functions, and algorithms.

[ascl:1605.015] SAND: Automated VLBI imaging and analyzing pipeline

Search And Non-Destroy (SAND) is a VLBI data reduction pipeline composed of a set of Python programs based on the AIPS interface provided by ObitTalk. It is designed for the massive data reduction of multi-epoch VLBI monitoring research. It can automatically investigate calibrated visibility data, search for all the radio emission above a given noise floor, and do the model fitting either on the CLEANed image or directly on the uv data. It then digests the model-fitting results, intelligently identifies the multi-epoch jet component correspondence, and recognizes linear or non-linear proper motion patterns. The outputs include a CLEANed image catalogue with polarization maps, an animation cube, proper motion fitting, and core light curves. For uncalibrated data, a user can easily add inline modules to do the calibration and self-calibration in a batch for a specific array.

[ascl:1504.011] samiDB: A Prototype Data Archive for Big Science Exploration

samiDB is an archive, database, and query engine to serve the spectra, spectral hypercubes, and high-level science products that make up the SAMI Galaxy Survey. Based on the versatile Hierarchical Data Format (HDF5), samiDB does not depend on relational database structures and hence lightens the setup and maintenance load imposed on science teams by metadata tables. The code, written in Python, covers the ingestion, querying, and exporting of data as well as the automatic setup of an HTML schema browser. samiDB serves as a maintenance-light data archive for Big Science and can be adopted and adapted by science teams that lack the means to hire professional archivists to set up the data back end for their projects.

[ascl:1407.006] SAMI: Sydney-AAO Multi-object Integral field spectrograph pipeline

The SAMI (Sydney-AAO Multi-object Integral field spectrograph) pipeline reduces data from the Sydney-AAO Multi-object Integral field spectrograph (SAMI) for the SAMI Galaxy Survey. The python code organizes SAMI data and, along with the AAO 2dfdr package, carries out all steps in the data reduction, from raw data to fully calibrated datacubes. The principal steps are: data management, use of 2dfdr to produce row-stacked spectra, flux calibration, correction for telluric absorption, removal of atmospheric dispersion, alignment of dithered exposures, and drizzling onto a regular output grid. Variance and covariance information is tracked throughout the pipeline. Some quality control routines are also included.

[ascl:1203.011] SALT2: Spectral Adaptive Lightcurve Template

SALT (Spectral Adaptive Lightcurve Template) is a package for Type Ia Supernovae light curve fitting. Its main purpose is to provide a distance estimator but it can also be used for photometric redshifts, and spectroscopic + photometric identification. This code is also known by the name snfit.

[ascl:1601.006] SAGE: Semi-Analytic Galaxy Evolution

SAGE (Semi-Analytic Galaxy Evolution) models galaxy formation in a cosmological context. SAGE has been rebuilt to be modular and customizable. The model runs on any dark matter cosmological N-body simulation whose trees are organized in a supported format and contain a minimum set of basic halo properties.

[ascl:1306.001] SAC: Sheffield Advanced Code

The Sheffield Advanced Code (SAC) is a fully non-linear MHD code designed for simulations of linear and non-linear wave propagation in gravitationally strongly stratified magnetized plasma. It was developed primarily for the forward modelling of helioseismological processes and for the coupling processes in the solar interior, photosphere, and corona; it is built on the well-known VAC platform that allows robust simulation of the macroscopic processes in gravitationally stratified (non-)magnetized plasmas. The code has no limitations of simulation length in time imposed by complications originating from the upper boundary, nor does it require implementation of special procedures to treat the upper boundaries. SAC inherited its modular structure from VAC, thereby allowing modification to easily add new physics.

[ascl:1111.003] Saada: A Generator of Astronomical Database

Saada transforms a set of heterogeneous FITS files or VOtables of various categories (images, tables, spectra, etc.) into a powerful database deployed on the Web. Databases are located on your host and stay independent of any external server. This job doesn't require writing code. Saada can mix data of various categories in multiple collections. Data collections can be linked to each other, creating relevant browsing paths and allowing data-mining oriented queries. Saada supports four VO services (spectra, images, sources, and TAP). Data collections can be published immediately after the deployment of the Web interface.

[ascl:1103.003] S2PLOT: Three-dimensional (3D) Plotting Library

We present a new, three-dimensional (3D) plotting library with advanced features, and support for standard and enhanced display devices. The library - S2PLOT - is written in C and can be used by C, C++ and FORTRAN programs on GNU/Linux and Apple/OSX systems. S2PLOT draws objects in a 3D (x,y,z) Cartesian space and the user interactively controls how this space is rendered at run time. With a PGPLOT inspired interface, S2PLOT provides astronomers with elegant techniques for displaying and exploring 3D data sets directly from their program code, and the potential to use stereoscopic and dome display devices. The S2PLOT architecture supports dynamic geometry and can be used to plot time-evolving data sets, such as might be produced by simulation codes. In this paper, we introduce S2PLOT to the astronomical community, describe its potential applications, and present some example uses of the library.

[ascl:1211.001] S2LET: Fast wavelet analysis on the sphere

S2LET provides high performance routines for fast wavelet analysis of signals on the sphere. It uses the SSHT code built on the MW sampling theorem to perform exact spherical harmonic transforms on the sphere. The resulting wavelet transform implemented in S2LET is theoretically exact, i.e. a band-limited signal can be recovered from its wavelet coefficients exactly and the wavelet coefficients capture all the information. S2LET also supports the HEALPix sampling scheme, in which case the transforms are not theoretically exact but achieve good numerical accuracy. The core routines of S2LET are written in C and have interfaces in Matlab, IDL and Java. Real signals can be written to and read from FITS files and plotted as Mollweide projections.

[ascl:1110.013] S2HAT: Scalable Spherical Harmonic Transform Library

Many problems in astronomy and astrophysics require a computation of the spherical harmonic transforms. This is in particular the case whenever data to be analyzed are distributed over the sphere or a set of corresponding mock data sets has to be generated. In many of those contexts, rapidly improving resolutions of both the data and simulations puts increasingly bigger emphasis on our ability to calculate the transforms quickly and reliably.

The scalable spherical harmonic transform library S2HAT consists of a set of flexible, massively parallel, and scalable routines for calculating diverse (scalar, spin-weighted, etc) spherical harmonic transforms for a class of isolatitude sky grids or pixelizations. The library routines implement the standard algorithm with the complexity of O(n^3/2), where n is a number of pixels/grid points on the sphere, however, owing to their efficient parallelization and advanced numerical implementation, they achieve very competitive performance and near perfect scalability. S2HAT is written in Fortran 90 with a C interface. This software is a derivative of the spherical harmonic transforms included in the HEALPix package and is based on both serial and MPI routines of its version 2.01, however, since version 2.5 this software is fully autonomous of HEALPix and can be compiled and run without the HEALPix library.

[ascl:1606.008] s2: Object oriented wrapper for functions on the sphere

The s2 package can represent any arbitrary function defined on the sphere. Both real space map and harmonic space spherical harmonic representations are supported. Basic sky representations have been extended to simulate full sky noise distributions and Gaussian cosmic microwave background realisations. Support for the representation and convolution of beams is also provided. The code requires HEALPix (ascl:1107.018) and CFITSIO (ascl:1010.001).
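As an independent illustration of the operations described above (a Gaussian realization on the sphere and its spherical harmonic representation), here is a sketch using the separate healpy package; this is purely illustrative and is not s2's own interface.

    import numpy as np
    import healpy as hp

    nside = 128
    ell = np.arange(3 * nside)
    cl = np.zeros(ell.size)
    cl[2:] = 1.0 / (ell[2:] * (ell[2:] + 1.0))   # toy angular power spectrum

    cmb_map = hp.synfast(cl, nside)              # Gaussian realization on a HEALPix grid
    alm = hp.map2alm(cmb_map)                    # back to spherical harmonic coefficients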

[ascl:9912.003] RVSAO 2.0: Digital Redshifts and Radial Velocities

RVSAO is a set of programs to obtain redshifts and radial velocities from digital spectra. RVSAO operates in the IRAF (Tody 1986, 1993) environment. The heart of the system is xcsao, which implements the cross-correlation method, and is a direct descendant of the system built by Tonry and Davis (1979). emsao uses intelligent heuristics to search for emission lines in spectra, then fits them to obtain a redshift. sumspec shifts and sums spectra to build templates for cross-correlation. linespec builds synthetic spectra given a list of spectral lines. bcvcorr corrects velocities for the motion of the earth. We discuss in detail the parameters necessary to run xcsao and emsao properly. We discuss the reliability and error associated with xcsao derived redshifts. We develop an internal error estimator, and we show how large, stable surveys can be used to develop more accurate error estimators. We develop a new methodology for building spectral templates for galaxy redshifts. We show how to obtain correlation velocities using emission line templates. Emission line correlations are substantially more efficient than the previous standard technique, automated emission line fitting. We compare the use of RVSAO with new methods, which use Singular Value Decomposition and chi-squared fitting techniques.

[ascl:1210.031] RVLIN: Fitting Keplerian curves to radial velocity data

The RVLIN package for IDL is a set of routines that quickly fits an arbitrary number of Keplerian curves to radial velocity data. It can handle data from multiple telescopes (i.e., it solves for the offset), constraints on P, e, and time of periastron passage, and can incorporate transit timing data. The code handles fixed periods and circular orbits in combination and transit time constraints, including for multiple transiting planets.
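RVLIN itself is an IDL package; as a language-neutral illustration of the underlying model (not RVLIN's interface), the Python sketch below evaluates a single-planet Keplerian radial velocity curve, solving Kepler's equation by Newton iteration. Parameter names and values are illustrative.

    import numpy as np

    def keplerian_rv(t, P, K, e, omega, tp, gamma=0.0):
        # P: period, K: semi-amplitude, e: eccentricity, omega: argument of
        # periastron [rad], tp: time of periastron passage, gamma: RV offset.
        M = 2.0 * np.pi * (t - tp) / P                     # mean anomaly
        E = M.copy()
        for _ in range(50):                                # Newton iteration for Kepler's equation
            E -= (E - e * np.sin(E) - M) / (1.0 - e * np.cos(E))
        nu = 2.0 * np.arctan2(np.sqrt(1 + e) * np.sin(E / 2),
                              np.sqrt(1 - e) * np.cos(E / 2))  # true anomaly
        return gamma + K * (np.cos(nu + omega) + e * np.cos(omega))

    t = np.linspace(0.0, 30.0, 100)
    rv = keplerian_rv(t, P=10.0, K=5.0, e=0.2, omega=0.5, tp=1.0)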

[ascl:1505.020] rvfit: Radial velocity curves fitting for binary stars or exoplanets

rvfit, developed in IDL 7.0, fits non-precessing Keplerian radial velocity (RV) curves for double-line and single-line binary stars or exoplanets. It fits a simple Keplerian model to the observed RV and computes the seven parameters (six for a single-line system) from the model. Some parameters can be fixed beforehand if they are known, for instance, if photometric observations are available. The fit is done using an Adaptive Simulated Annealing algorithm optimized for this specific task. Simulated Annealing methods are powerful heuristic algorithms to minimize functions in multiparametric spaces.

[ascl:1406.007] RV: Radial Components of Observer's Velocity

The RV program produces a report listing the components, in a given direction, of the observer's velocity on a given date. This allows an observed radial velocity to be referred to an appropriate standard of rest -- typically either the Sun or an LSR.

As a secondary function, RV computes light time components to the Sun, thus allowing the times of phenomena observed from a terrestrial observatory to be referred to a heliocentric frame of reference. (N.B. It will, of course, also be necessary to express the observations in the appropriate timescale as well as applying light time corrections; in particular, it is likely that an observed UTC will need to be converted to TDB as well as being corrected to the Sun.)

RV is distributed with the Starlink software collection (ascl:1110.012) and uses SLALIB (ascl:1403.025).

[ascl:1802.011] runDM: Running couplings of Dark Matter to the Standard Model

runDM calculates the running of the couplings of Dark Matter (DM) to the Standard Model (SM) in simplified models with vector mediators. By specifying the mass of the mediator and the couplings of the mediator to SM fields at high energy, the code can calculate the couplings at low energy, taking into account the mixing of all dimension-6 operators. runDM can also extract the operator coefficients relevant for direct detection, namely low energy couplings to up, down and strange quarks and to protons and neutrons.

[ascl:1706.002] rtpipe: Searching for Fast Radio Transients in Interferometric Data

rtpipe (real-time pipeline) analyzes radio interferometric data with an emphasis on searching for transient or variable astrophysical sources. The package combines single-dish concepts such as dedispersion and filters with interferometric concepts, including images and the uv-plane. In contrast to time-domain data recorded with large single-dish telescopes, visibilities from interferometers can precisely localize sources anywhere in the entire field of view. rtpipe opens interferometers to the study of the fast transient sky, including sources like pulsars, stellar flares, rotating radio transients, and fast radio bursts. Key portions of the search pipeline, such as image generation and dedispersion, have been accelerated. That, in combination with its multi-threaded, multi-node design, makes rtpipe capable of searching millisecond timescale data in real time on small compute clusters.

[ascl:1607.015] RT1D: 1D code for Rayleigh-Taylor instability

The parallel one-dimensional moving-mesh hydrodynamics code RT1D reproduces the multidimensional dynamics from Rayleigh-Taylor instability in supernova remnants.

[ascl:1808.002] rsigma: Resonant disturbance

rsigma calculates the resonant disturbing function, R(sigma), for a massless particle in an arbitrary orbit perturbed by a planet in circular orbit. This function defines the strength of the resonance (its semi-amplitude) and the location of the stable equilibrium points (the minima). It depends on the variable sigma called critical angle and on the particle's orbital elements a, e, i and the argument of the perihelion. R(sigma) is numerically calculated and the code is valid for arbitrary eccentricities and inclinations, including retrograde orbits.

[ascl:1902.006] RPFITS: Routines for reading and writing RPFITS files

The RPFITS data file format records synthesis visibility data obtained from the Australia Telescope Compact Array (ATCA) at Narrabri, NSW. It is also used for single-dish spectral line data obtained from Parkes and Mopra, including Parkes multibeam data. RPFITS superficially resembles random group FITS, but differs in important respects, making it incompatible with standard FITS software such as FITSIO (ascl:1010.001) and FTOOLS (ascl:9912.002) and, in particular, it precludes the use of fv (ascl:1205.005). The RPFITS Fortran library contains routines for reading and writing RPFITS files. A header file, RPFITS.h, is provided to facilitate usage by C and C++ applications. Also included is rpfhdr, a utility for viewing RPFITS headers (it also works for standard FITS), and rpfex for extracting selected scans from an RPFITS file.

[ascl:1712.009] RODRIGUES: RATT Online Deconvolved Radio Image Generation Using Esoteric Software

RODRIGUES (RATT Online Deconvolved Radio Image Generation Using Esoteric Software) is a web-based radio telescope simulation and reduction tool. From a technical perspective, it is a web-based parameterized Docker container scheduler with a result-set viewer.

[ascl:1210.008] Rockstar: Phase-space halo finder

Rockstar (Robust Overdensity Calculation using K-Space Topologically Adaptive Refinement) identifies dark matter halos, substructure, and tidal features. The approach is based on adaptive hierarchical refinement of friends-of-friends groups in six phase-space dimensions and one time dimension, which allows for robust (grid-independent, shape-independent, and noise-resilient) tracking of substructure. Our method is massively parallel (up to 10^5 CPUs) and runs on the largest current simulations (>10^10 particles) with high efficiency (10 CPU hours and 60 gigabytes of memory required per billion particles analyzed). Rockstar offers significant improvement in substructure recovery as compared to several other halo finders.

[ascl:1201.002] Roche: Visualization and analysis tool for Roche-lobe geometry of evolving binaries

Roche is a visualization and analysis tool for drawing the Roche-lobe geometry of evolving binaries. Roche can be used as a standalone program reading data from the command line or from a file generated by SeBa (ascl:1201.003). Eventually Roche will be able to read data from any other binary evolution program. Roche requires Starlab (ascl:1010.076) version 4.1.1 or later and the pgplot (ascl:1103.002) libraries. Roche creates a series of images, based on the SeBa output file SeBa.data, displaying the evolutionary state of a binary.

[ascl:1502.023] ROBOSPECT: Width fitting program

ROBOSPECT, written in C, automatically measures and deblends line equivalent widths for absorption and emission spectra. ROBOSPECT should not be used for stars with spectra in which there is no discernible continuum over large wavelength regions, nor for the most carbon-enhanced stars for which spectral synthesis would be favored. Although ROBOSPECT was designed for metal-poor stars, it is capable of fitting absorption and emission features in a variety of astronomical sources.

[ascl:1808.011] Robbie: Radio transients and variables detection workflow

Robbie automates cataloging sources, finding variables, and identifying transients in the image domain. It works in a batch processing paradigm with a modular design so components can be swapped out or upgraded to adapt to different input data while retaining a consistent and coherent methodological approach. Robbie is based on commonly used and open software, including AegeanTools (ascl:1212.009) and STILTS/TOPCAT (ascl:1101.010).

[ascl:1603.008] ROBAST: ROOT-based ray-tracing library for cosmic-ray telescopes

ROBAST (ROOT-based simulator for ray tracing) is a non-sequential ray-tracing simulation library developed for wide use in optical simulations of gamma-ray and cosmic-ray telescopes. The library is written in C++ and fully utilizes the geometry library of the ROOT analysis framework, and can build the complex optics geometries typically used in cosmic ray experiments and ground-based gamma-ray telescopes.

[ascl:1104.008] Rmodel: Determining Stellar Population Parameters

This program determines stellar population parameters (e.g., age, metallicity, IMF slope), using as input a pair of line-strength indices, through interpolation in SSP model predictions. Both linear and bivariate fits are computed to perform the interpolation.

[ascl:1403.011] RMHB: Hierarchical Reverberation Mapping

RMHB is a hierarchical Bayesian code for reverberation mapping (RM) that combines results of a sparsely sampled broad line region (BLR) light curve and a large sample of active galactic nuclei (AGN) to infer properties of the sample of AGN. The key idea of RM is to measure the time lag τ between variations in the continuum emission from the accretion disc and the subsequent response of the broad line region (BLR). The measurement of τ is typically used to estimate the physical size of the BLR and is combined with other measurements to estimate the black hole mass M_BH. A major difficulty with RM campaigns is the large amount of data needed to measure τ. RMHB allows a clear interpretation of a posterior distribution for hyperparameters describing the sample of AGN.

[ascl:1409.011] rmfit: Forward-folding spectral analysis software

rmfit uses a forward-folding technique to obtain the best-fit parameters for a chosen model given user-selected source and background time intervals from data files containing observed count rates and a corresponding detector response matrix. rmfit displays lightcurves and spectra using a graphical interface that enables user-defined integrated or time-resolved spectral fits and binning in either time or energy. Originally developed for the analysis of BATSE Gamma-Ray Burst (GRB) spectroscopy, rmfit is a tool for the spectroscopy of transient sources; it accommodates Fermi GBM and LAT data as well as Swift BAT data.

[ascl:1806.024] RMextract: Ionospheric Faraday Rotation calculator

RMextract calculates ionospheric Faraday rotation for a given epoch, location, and line of sight. This Python code extracts TEC, vTEC, Earth magnetic field, and rotation measures from GPS and WMM data for radio interferometry observations.

[ascl:1708.011] RM-CLEAN: RM spectra cleaner

RM-CLEAN reads in dirty Q and U cubes, generates the rotation measure transfer function (RMTF) based on the frequencies given in an ASCII file, and cleans the RM spectra following the algorithm given by Brentjens (2007). The output cubes contain the clean model components and the CLEANed RM spectra. The input cubes must be reordered with mode=312, and the output cubes will have the same ordering and thus must be reordered after being written to disk. RM-CLEAN runs as a MIRIAD (ascl:1106.007) task and a Python wrapper is included with the code.
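As a sketch of the RMTF that such a code builds from the channel frequencies, following the standard rotation measure synthesis definition (this is not RM-CLEAN's MIRIAD interface, and the frequency list is illustrative):

    import numpy as np

    c = 2.998e8                                  # speed of light [m/s]
    freqs = np.linspace(1.3e9, 1.5e9, 64)        # channel frequencies [Hz], illustrative
    lam2 = (c / freqs) ** 2                      # lambda^2 of each channel
    lam2_0 = lam2.mean()                         # reference lambda^2

    phi = np.linspace(-500.0, 500.0, 1001)       # Faraday depth axis [rad m^-2]

    # RMTF: normalized sum over channels of exp(-2i*phi*(lambda^2 - lambda_0^2))
    rmtf = np.exp(-2j * np.outer(phi, lam2 - lam2_0)).sum(axis=1) / lam2.size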

[ascl:1811.009] RLOS: Time-resolved imaging of model astrophysical jets

RLOS (Relativistic Line Of Sight) uses hydrocode output data, such as that from PLUTO (ascl:1010.045), to create synthetic images depicting what a model relativistic astrophysical jet looks like to a stationary observer. The approximate time-delayed imaging algorithm used is implemented within existing line-of-sight code. The software has the potential to study a variety of dynamical astrophysical phenomena in collaboration with other imaging and simulation tools.

[ascl:1410.005] RICH: Numerical simulation of compressible hydrodynamics on a moving Voronoi mesh

RICH (Racah Institute Computational Hydrodynamics) is a 2D hydrodynamic code based on Godunov's method. The code, largely based on AREPO, acts on an unstructured moving mesh. It differs from AREPO in the interpolation and time advancement scheme as well as a novel parallelization scheme based on Voronoi tessellation. Though in many cases a moving mesh gives better results than a static mesh, this is not universally true: where matter moves one way and a sound wave travels the other way (such that the wave is not moving relative to the grid), a static mesh gives better results than a moving mesh. RICH is designed in an object oriented, user friendly way that facilitates incorporation of new algorithms and physical processes.

[ascl:1611.009] RHOCUBE: 3D density distributions modeling code

RHOCUBE models 3D density distributions on a discrete Cartesian grid and their integrated 2D maps. It can be used for a range of applications, including modeling the electron number density in LBV shells and computing the emission measure. The RHOCUBE Python package provides several 3D density distributions, including a powerlaw shell, truncated Gaussian shell, constant-density torus, dual cones, and spiralling helical tubes, and can accept additional distributions. RHOCUBE provides convenient methods for shifts and rotations in 3D, and if necessary, an arbitrary number of density distributions can be combined into the same model cube and the integration ∫ dz performed through the joint density field.
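A minimal sketch of the general idea described above, with illustrative names rather than RHOCUBE's API: sample a truncated Gaussian shell on a Cartesian grid and integrate along z to obtain the 2D map.

    import numpy as np

    # Cartesian grid (arbitrary units), centered on zero
    n = 129
    x = np.linspace(-2.0, 2.0, n)
    X, Y, Z = np.meshgrid(x, x, x, indexing='ij')
    r = np.sqrt(X**2 + Y**2 + Z**2)

    # truncated Gaussian shell: peak radius r0, width sigma, zero outside [rlo, rhi]
    r0, sigma, rlo, rhi = 1.0, 0.15, 0.5, 1.5
    rho = np.exp(-0.5 * ((r - r0) / sigma) ** 2)
    rho[(r < rlo) | (r > rhi)] = 0.0

    # integrated 2D map: sum along the line of sight (z), weighted by the voxel size
    dz = x[1] - x[0]
    image = rho.sum(axis=2) * dz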

[ascl:1502.001] RH 1.5D: Polarized multi-level radiative transfer with partial frequency redistribution

RH 1.5D performs Zeeman multi-level non-local thermodynamical equilibrium calculations with partial frequency redistribution for an arbitrary number of chemical species. Derived from the RH code and written in C, it calculates spectra from 3D, 2D or 1D atmospheric models on a column-by-column basis (or 1.5D). It includes optimization features to speed up or improve convergence, which are particularly useful in dynamic models of chromospheres. While one should be aware of its limitations, the column-by-column (1.5D) calculation of spectra is a good approximation in many cases, and generally allows for faster convergence and more flexible methods of improving convergence. RH 1.5D scales well to at least tens of thousands of CPU cores.

[ascl:1711.006] RGW: Goodman-Weare Affine-Invariant Sampling

RGW is a lightweight R-language implementation of the affine-invariant Markov Chain Monte Carlo sampling method of Goodman & Weare (2010). The implementation is based on the description of the python package emcee (ascl:1303.002).
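RGW itself is written in R; as a compact illustration of the affine-invariant "stretch move" it implements (written here in Python, with a toy Gaussian target), one ensemble update looks like the sketch below.

    import numpy as np

    def stretch_move(walkers, log_prob, a=2.0, rng=np.random.default_rng()):
        # One Goodman & Weare (2010) stretch-move update of a (K, d) ensemble.
        K, d = walkers.shape
        new = walkers.copy()
        for k in range(K):
            j = rng.choice([i for i in range(K) if i != k])   # complementary walker
            z = (1 + (a - 1) * rng.random()) ** 2 / a          # z ~ g(z) on [1/a, a]
            proposal = walkers[j] + z * (walkers[k] - walkers[j])
            log_accept = (d - 1) * np.log(z) + log_prob(proposal) - log_prob(walkers[k])
            if np.log(rng.random()) < log_accept:
                new[k] = proposal
        return new

    # usage: sample a 2D standard normal target
    log_gauss = lambda x: -0.5 * np.sum(x**2)
    walkers = np.random.default_rng(0).normal(size=(20, 2))
    for _ in range(500):
        walkers = stretch_move(walkers, log_gauss)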

[ascl:1710.002] rfpipe: Radio interferometric transient search pipeline

rfpipe supports Python-based analysis of radio interferometric data (especially from the Very Large Array) and searches for fast radio transients. It extends the rtpipe library (ascl:1706.002) with new approaches to parallelization, acceleration, and more portable data products. rfpipe can run in standalone mode or in a cluster environment.

[ascl:1505.028] RESOLVE: Bayesian algorithm for aperture synthesis imaging in radio astronomy

RESOLVE is a Bayesian inference algorithm for image reconstruction in radio interferometry. It is optimized for extended and diffuse sources. Features include parameter-free Bayesian reconstruction of radio continuum data with a focus on extended and weak diffuse sources, reconstruction with uncertainty propagation dependent on measurement noise, and estimation of the spatial correlation structure of the radio astronomical source. RESOLVE provides full support for measurement sets and includes a simulation tool (if uv-coverage is provided).

[ascl:1809.016] RequiSim: Variance weighted overlap calculator

RequiSim computes the Variance Weighted Overlap, which is a measure of the bias on the lensing signal from power spectrum modelling bias for any non-linear model. It assumes that the bias on the power spectrum is Gaussian with a covariance described by a user-provided knowledge matrix that describes the covariance in the bias on the power spectrum. The data from the Euclid wide-field survey are included.

[ascl:1612.022] REPS: REscaled Power Spectra for initial conditions with massive neutrinos

REPS (REscaled Power Spectra) provides accurate, one-percent level, numerical simulations of the initial conditions for massive neutrino cosmologies, rescaling the late-time linear power spectra to the simulation initial redshift.

[ascl:1904.008] repack: Repack and compress line-transition data

repack re-packs and compresses line-transition data for radiative-transfer calculations. It identifies the strong lines that dominate the spectrum from the large majority of weaker lines, returning a binary line-by-line (LBL) file with the strong-line information (wavenumber, Elow, gf, and isotope ID), and an ASCII file with the combined contribution of the weaker lines compressed into a continuum extinction coefficient (in cm-1 amagat-1) as a function of wavenumber and temperature.

[ascl:1505.021] relline: Relativistic line profiles calculation

relline calculates relativistic line profiles; it is compatible with the common X-ray data analysis software XSPEC (ascl:9910.005) and ISIS (ascl:1302.002). The two basic forms are an additive line model (RELLINE) and a convolution model to calculate relativistic smearing (RELCONV).

[ascl:1404.012] RegPT: Regularized cosmological power spectrum

RegPT computes the power spectrum in flat wCDM class models based on the RegPT treatment when provided with either a transfer function or a matter power spectrum. It then gives the multiple-redshift outputs for the power spectrum and optionally provides correlation function data. The Fortran code has two major options for power spectrum calculations: -fast, which quickly computes the power spectrum at two-loop level (typically a few seconds) using a pre-computed data set of PT kernels for fiducial cosmological models, and -direct, in which the code first applies the fast method and then follows the regularized expression for the power spectrum to directly evaluate the multi-dimensional integrals. The output results are the power spectrum from the direct calculation and the difference between the fast and direct results. The code also gives the data set of PT diagrams necessary for power spectrum calculations from which the power spectrum can be constructed.

[ascl:1206.001] RegiStax: Alignment, stacking and processing of images

RegiStax is software for alignment/stacking/processing of images; it was released over 10 years ago and continues to be developed and improved. The current version is RegiStax 6, which supports the following formats: AVI, SER, RFL (RegiStax Framelist), BMP, JPG, TIF, and FIT. This version has a shorter and simpler processing sequence than its predecessor, and separate optimization is no longer necessary because the new image alignment method optimizes directly. The interface of RegiStax 6 has been simplified to look more uniform in appearance and functionality, and RegiStax 6 now uses multi-core processing, allowing multiple cores (using at most 4 is recommended) to work simultaneously during alignment/stacking.

[ascl:1401.004] Reflex: Graphical workflow engine for data reduction

Reflex provides an easy and flexible way to reduce VLT/VLTI science data using the ESO pipelines. It allows graphically specifying the sequence in which the data reduction steps are executed, including conditional stops, loops and conditional branches. It eases inspection of the intermediate and final data products and allows repetition of selected processing steps to optimize the data reduction. The data organization necessary to reduce the data is built into the system and is fully automatic; advanced users can plug their own modules and steps into the data reduction sequence. Reflex supports the development of data reduction workflows based on the ESO Common Pipeline Library. Reflex is based on the concept of a scientific workflow, whereby the data reduction cascade is rendered graphically and data seamlessly flow from one processing step to the next. It is distributed with a number of complete test datasets so users can immediately start experimenting and familiarize themselves with the system.

[ascl:1508.003] REDUCEME: Long-slit spectroscopic data reduction and analysis

The astronomical data reduction package REDUCEME reduces and analyzes long-slit spectroscopic data. The package uses an unformatted FORTRAN raw data format, so FITS files must be transformed to REDUCEME format; the reverse operation (from REDUCEME to FITS format) is also available. The package is a set of programs written in FORTRAN 77 and includes shell scripts (using the C shell syntax) to perform routine tasks; it can be extended by the inclusion of external programs. REDUCEME uses PGPLOT (ascl:1103.002) for line plots and images, and a subset of subroutines, called BUTTON, enables the user to communicate interactively with the image display employing graphic buttons. One advantage of using REDUCEME is that for each image an associated error image can also be processed throughout the reduction process, allowing for careful control of error propagation.

[ascl:1507.017] REDSPEC: NIRSPEC data reduction

REDSPEC is an IDL-based reduction package designed with NIRSPEC in mind, though it can also be used to reduce data from other spectrographs. REDSPEC accomplishes spatial rectification by summing an A+B pair of a calibration star to produce an image with two spectra; the image is remapped on the basis of polynomial fits to the spectral traces and Gaussian centroids that define their separation, producing spectral traces that are straight with respect to the detector rows. The raw images are remapped onto a coordinate system with uniform intervals in spatial extent along the slit and in wavelength along the dispersion axis.
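
The rectification step described above can be illustrated with a toy example. The following sketch is a hypothetical NumPy illustration (not REDSPEC's IDL code): it fits a low-order polynomial to measured trace centroids and then shifts each column so the trace becomes straight along a detector row.

    import numpy as np

    def rectify(image, cols, centroids, order=3):
        """Fit a polynomial trace y(x) to measured centroids and shift every
        column so the trace lies on a constant row (integer shifts for brevity)."""
        coeffs = np.polyfit(cols, centroids, order)
        ny, nx = image.shape
        trace = np.polyval(coeffs, np.arange(nx))
        offsets = np.round(trace - trace.mean()).astype(int)
        rectified = np.empty_like(image)
        for x in range(nx):
            rectified[:, x] = np.roll(image[:, x], -offsets[x])
        return rectified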

[ascl:1106.026] RECFAST: Calculate the Recombination History of the Universe

RECFAST calculates the recombination of H, HeI, and HeII in the early Universe; this involves a line-by-line treatment of each atomic level. It differs from previous calculations in two major ways: first, the ionization fraction x_e is approximately 10% smaller for redshifts <~800, due to non-equilibrium processes in the excited states of H, and second, HeI recombination is much slower than previously thought and is delayed until just before H recombines. RECFAST enables fast computation of the ionization history (and quantities that depend on it, such as the power spectrum of CMB anisotropies) for arbitrary cosmologies.
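
For orientation, the equilibrium (Saha) ionization fraction for hydrogen, which RECFAST improves upon with its non-equilibrium treatment, can be computed in a few lines. The sketch below is a simplified illustration assuming a pure-hydrogen plasma and fiducial values for the CMB temperature and present-day hydrogen density; it is not RECFAST's calculation.

    import numpy as np

    # physical constants (SI)
    k_B = 1.381e-23      # Boltzmann constant [J/K]
    m_e = 9.109e-31      # electron mass [kg]
    h   = 6.626e-34      # Planck constant [J s]
    E_ion = 2.179e-18    # hydrogen ionization energy, 13.6 eV [J]

    # assumed fiducial values (illustrative only)
    T0 = 2.725           # CMB temperature today [K]
    n_H0 = 1.9e-1        # present-day hydrogen number density [m^-3]

    def saha_xe(z):
        """Equilibrium ionization fraction x_e from the Saha equation for pure H."""
        T = T0 * (1.0 + z)
        n_H = n_H0 * (1.0 + z) ** 3
        S = (1.0 / n_H) * (2.0 * np.pi * m_e * k_B * T / h**2) ** 1.5 \
            * np.exp(-E_ion / (k_B * T))
        # solve x_e^2 / (1 - x_e) = S
        return 0.5 * (-S + np.sqrt(S**2 + 4.0 * S))

    print(saha_xe(1500.0), saha_xe(1100.0))  # near unity, then rapidly falling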

[ascl:1110.016] REBOUND: Multi-purpose N-body code for collisional dynamics

REBOUND is a multi-purpose N-body code which is freely available under an open-source license. It was designed for collisional dynamics such as planetary rings but can also solve the classical N-body problem. It is highly modular and can be customized easily to work on a wide variety of different problems in astrophysics and beyond.

REBOUND comes with three symplectic integrators: leap-frog, the symplectic epicycle integrator (SEI), and a Wisdom-Holman mapping (WH). It supports open, periodic, and shearing-sheet boundary conditions. REBOUND can use a Barnes-Hut tree to calculate both self-gravity and collisions. These modules are fully parallelized with MPI as well as OpenMP; the former makes use of a static domain decomposition and a distributed essential tree. Two new collision detection modules based on a plane-sweep algorithm are also implemented. The performance of the plane-sweep algorithm is superior to a tree code for simulations in which one dimension is much longer than the other two and in quasi-two-dimensional simulations with fewer than one million particles.
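
A symplectic leap-frog step of the kind REBOUND offers can be written compactly. The sketch below is a generic, self-contained NumPy illustration of a drift-kick-drift leap-frog update with direct-summation gravity; it is not REBOUND's source code or API.

    import numpy as np

    def accelerations(pos, mass, G=1.0, eps=1e-3):
        """Direct-summation gravitational accelerations with a small softening."""
        acc = np.zeros_like(pos)
        for i in range(len(mass)):
            d = pos - pos[i]
            r2 = (d**2).sum(axis=1) + eps**2
            r2[i] = 1.0                      # avoid division by zero for the self-term
            inv_r3 = r2 ** -1.5
            inv_r3[i] = 0.0                  # exclude self-interaction
            acc[i] = G * (mass[:, None] * d * inv_r3[:, None]).sum(axis=0)
        return acc

    def leapfrog_step(pos, vel, mass, dt):
        """One drift-kick-drift symplectic step."""
        pos = pos + 0.5 * dt * vel                    # drift
        vel = vel + dt * accelerations(pos, mass)     # kick
        pos = pos + 0.5 * dt * vel                    # drift
        return pos, vel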

[ascl:1107.009] REAS3: Modeling Radio Emission from Cosmic Ray Air Showers

The freely available Monte Carlo code REAS for modelling radio emission from cosmic ray air showers has evolved to include the full complexity of air shower physics. REAS3 improves the calculation of the emission contributions, which was not fully consistent in earlier versions of REAS, by incorporating the missing radio emission due to the variation of the number of charged particles during the air shower evolution using an "end-point formalism". With the inclusion of these emission contributions, the structure of the simulated radio pulses changes from unipolar to bipolar, and the azimuthal emission pattern becomes nearly symmetric. Remaining asymmetries can be explained by radio emission due to the variation of the net charge excess in air showers, which is automatically taken into account in the new implementation. REAS3 constitutes the first self-consistent time-domain implementation based on single-particle emission that takes the full complexity of air shower physics into account. REAS3 has been superseded by CoREAS (ascl:1406.003).

[ascl:1506.007] REALMAF: Magnetic power spectra from Faraday rotation maps

REALMAF is a maximum-a-posteriori code to measure magnetic power spectra from Faraday rotation data. It uses a sophisticated model for the magnetic autocorrelation in real space, thus alleviating the need for simplifying assumptions in the processing. REALMAF treats the divergence relation of the magnetic field with a multiplicative factor in Fourier space, which allows modeling the magnetic autocorrelation as a spherically symmetric function.
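
The general technique of enforcing the divergence-free condition with a multiplicative factor in Fourier space can be illustrated generically. The sketch below is a hypothetical NumPy illustration (not REALMAF code) that projects a 3D vector field onto its solenoidal part with the operator P_ij = delta_ij - k_i k_j / k^2.

    import numpy as np

    def solenoidal_projection(B):
        """Project a vector field B[3, nx, ny, nz] onto its divergence-free part
        by applying P_ij = delta_ij - k_i k_j / k^2 in Fourier space."""
        Bk = np.fft.fftn(B, axes=(1, 2, 3))
        k = np.array(np.meshgrid(*(np.fft.fftfreq(n) for n in B.shape[1:]),
                                 indexing="ij"))
        k2 = (k**2).sum(axis=0)
        k2[0, 0, 0] = 1.0                     # avoid division by zero at k = 0
        k_dot_B = (k * Bk).sum(axis=0)
        Bk_sol = Bk - k * k_dot_B / k2
        return np.real(np.fft.ifftn(Bk_sol, axes=(1, 2, 3)))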

[ascl:1408.017] RDGEN: Routines for data handling, display, and adjusting

RDGEN is a collection of routines for data handling, display, and adjustment, with a facility that helps set up files for use with VPFIT (ascl:1408.015); it is included in the VPFIT distribution file. It is useful for setting region boundaries and initial guesses for VPFIT, displaying the accumulated results, examining particular redshift systems and fits to them by eye, testing that the error array is a true reflection of the rms scatter in the data, comparing spectra, and generally examining and even modifying the data.

[ascl:1411.006] RC3 mosaicking pipeline: Creating mosaics for the RC3 Catalogue

The RC3 mosaicking pipeline creates color composite images and scientifically calibrated FITS mosaics in all SDSS imaging bands for all the RC3 galaxies that lie within the survey's footprint, and in the B, R, and IR bands for galaxies on photographic plates taken by the Digitized Palomar Observatory Sky Survey (DPOSS). The pipeline uses SExtractor (ascl:1010.064) for source extraction and STIFF (ascl:1110.006) to generate color images. The mosaicking program first applies a recursive positional-update algorithm to correct the positional inaccuracies inherent in the RC3 catalog, then performs the mosaicking using the Astropy (ascl:1304.002) wrapper to IPAC's Montage (ascl:1010.036) software. The program is generalized into a pipeline that can be easily extended to future survey data or other source catalogs; an online interface is available at http://lcdm.astro.illinois.edu/data/rc3/search.html.
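
A minimal call pattern for the kind of mosaicking step the pipeline performs might look like the sketch below. It assumes the montage-wrapper Python package (the Astropy-affiliated wrapper to Montage) and a local Montage installation, with hypothetical directory names; it is not the RC3 pipeline's own code, and the exact keyword arguments should be checked against the montage-wrapper documentation.

    import montage_wrapper as montage

    # input_fits/ holds the input SDSS frames for one RC3 galaxy (hypothetical
    # directory name); mosaic_out/ receives the reprojected frames and mosaic.
    montage.mosaic("input_fits", "mosaic_out",
                   background_match=True)   # match backgrounds between frames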

[ascl:1105.009] Ray Tracing Codes: run_tau, run_raypath, and ray_kernel

Time-distance helioseismology aims to measure and interpret the travel times of waves propagating between two points located on the solar surface. The travel times are then inverted to infer sub-surface properties that are encoded in the measurements. The trajectory of the waves generally follows the infinite-frequency ray path, although the waves are sensitive to perturbations off this path. Finite-frequency sensitivity kernels are thus needed to give more accurate inversion results.

These ray tracing codes calculate travel-time kernels for a ray. There are three main codes: run_tau calculates the group time as a function of distance, run_raypath calculates the ray paths as well as the phase and group times along the path, and ray_kernel calculates the ray kernels for the sound speed squared.
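
The basic quantity these codes work with, the infinite-frequency travel time along a ray, is the path integral of the inverse sound speed. The toy sketch below is a generic NumPy illustration with a made-up sound-speed profile and arbitrary units (not the actual run_tau code); it evaluates tau = integral ds / c numerically along a given ray path.

    import numpy as np

    def travel_time(path, sound_speed):
        """Integrate tau = integral ds / c along a ray path.
        path: (N, 2) array of (x, z) points; sound_speed: callable c(x, z)."""
        ds = np.sqrt((np.diff(path, axis=0) ** 2).sum(axis=1))
        midpoints = 0.5 * (path[1:] + path[:-1])
        c = sound_speed(midpoints[:, 0], midpoints[:, 1])
        return (ds / c).sum()

    # toy example: a straight, shallow ray in a medium where c increases with depth
    z = np.linspace(0.0, 5.0, 200)                  # depth, arbitrary units
    path = np.column_stack([4.0 * z, z])            # horizontal distance vs depth
    tau = travel_time(path, lambda x, z: 10.0 + 2.0 * z)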

[ascl:0008.002] RATRAN: Radiative Transfer and Molecular Excitation in One and Two Dimensions

RATRAN is a numerical method and computer code to calculate the radiative transfer and excitation of molecular lines. The approach is based on the Monte Carlo method, and incorporates elements from Accelerated Lambda Iteration. It combines the flexibility of the former with the speed and accuracy of the latter. Convergence problems known to plague Monte Carlo methods at large optical depth (>100) are avoided by separating local contributions to the radiation field from the overall transfer problem. The random nature of the Monte Carlo method serves to verify the independence of the solution to the angular, spatial, and frequency sampling of the radiation field. This allows the method to be used in a wide variety of astrophysical problems without specific adaptations. Moreover, the code can be applied to all atoms or molecules for which collisional rate coefficients are available and any axially symmetric source model. Continuum emission and absorption by dust is explicitly taken into account but scattering is neglected. We expect this program to be an important tool in analyzing data from present and future infrared and (sub-)millimeter telescopes.
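
The Monte Carlo element of this approach, estimating the local mean radiation field by averaging over randomly drawn ray directions, can be shown in miniature. The sketch below is a generic illustration for a homogeneous slab with a constant source function and no external radiation; it is not RATRAN's scheme, which couples such sampling to Accelerated Lambda Iteration and the level populations.

    import numpy as np

    def mean_intensity(tau_vertical, S, n_rays=1000, seed=0):
        """Monte Carlo estimate of the mean intensity J at the midplane of a
        homogeneous slab with constant source function S and no incident field."""
        rng = np.random.default_rng(seed)
        mu = rng.uniform(-1.0, 1.0, n_rays)        # random direction cosines
        # optical depth from the midplane to the surface along each direction
        tau = 0.5 * tau_vertical / np.abs(mu)
        # formal solution for constant S with zero incident intensity
        I = S * (1.0 - np.exp(-tau))
        return I.mean()

    print(mean_intensity(100.0, 1.0))   # approaches S for an optically thick slab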
