ASCL.net

Astrophysics Source Code Library

Making codes discoverable since 1999

Browsing Codes

Results 1201-1250 of 1981 (1948 ASCL, 33 submitted)

[ascl:1601.004] Odyssey: Ray tracing and radiative transfer in Kerr spacetime

Odyssey is a GPU-based General Relativistic Radiative Transfer (GRRT) code for computing images and/or spectra in the Kerr metric, which describes the spacetime around a rotating black hole. Odyssey is implemented in CUDA C/C++. For flexibility, the C++ namespace structure is used to separate different tasks; the two default tasks presented in the source code are the redshift of a Keplerian disk and the image of a Keplerian rotating shell at 340 GHz. Odyssey_Edu, an educational software package that uses Odyssey to visualize ray trajectories in the Kerr spacetime, is also available.

[ascl:1806.018] OMEGA: One-zone Model for the Evolution of GAlaxies

OMEGA (One-zone Model for the Evolution of GAlaxies) calculates the global chemical evolution trends of galaxies. From an input star formation history, it uses SYGMA to create, as a function of time, multiple simple stellar populations with different masses, ages, and initial compositions. OMEGA offers several prescriptions for modeling the star formation efficiency and the evolution of galactic inflows and outflows. OMEGA is part of the NuGrid (ascl:1610.015) chemical evolution package.

[ascl:1904.024] OoT: Out-of-Transit Light Curve Generator

OoT (Out-of-Transit) calculates the light curves and radial velocity signals due to a planet orbiting a star. It explicitly models the effects of tides, orbital motion, relativistic beaming, and reflection of the star's light by the planet. The code can also be used to model secondary eclipses.
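
For orientation, the photometric side of such a model can be written as a sum of phase harmonics. The Python sketch below is illustrative only: the amplitudes are free inputs and the sign/phase conventions are one common choice, whereas OoT derives each term from the physical parameters of the system.

```python
import numpy as np

def oot_flux(t, period, a_beam, a_ellip, a_refl):
    """Toy out-of-transit light curve from phase harmonics (not OoT's code).

    a_beam, a_ellip, a_refl are illustrative free amplitudes; OoT computes
    the corresponding terms from the star and planet parameters.
    """
    phi = 2.0 * np.pi * t / period              # orbital phase, transit at phi = 0
    beaming = a_beam * np.sin(phi)              # relativistic Doppler boosting
    ellipsoidal = -a_ellip * np.cos(2.0 * phi)  # tidal distortion, two maxima per orbit
    reflection = -a_refl * np.cos(phi)          # starlight reflected by the planet
    return 1.0 + beaming + ellipsoidal + reflection

# example: one week of observations of a 3.2-day orbit
t = np.linspace(0.0, 7.0, 2000)                 # days
flux = oot_flux(t, period=3.2, a_beam=2e-6, a_ellip=5e-6, a_refl=1e-5)
```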

[ascl:1604.001] OpenMHD: Godunov-type code for ideal/resistive magnetohydrodynamics (MHD)

OpenMHD is a Godunov-type finite-volume code for ideal/resistive magnetohydrodynamics (MHD). It is written in Fortran 90 and is parallelized by using MPI-2 and OpenMP. The code was originally developed for studying magnetic reconnection problems and has been made publicly available in the hope that others may find it useful.

[ascl:1502.002] OpenOrb: Open-source asteroid orbit computation software

OpenOrb (OOrb) contains tools for rigorously estimating the uncertainties resulting from the inverse problem of computing orbital elements using scarce astrometry. It uses the least-squares method and also contains both Monte-Carlo (MC) and Markov-Chain MC versions of the statistical ranging method. Ranging obtains sampled, non-Gaussian orbital-element probability-density functions and is optimized for cases where the amount of astrometry is scarce or spans a relatively short time interval.

[ascl:1509.009] OPERA: Objective Prism Enhanced Reduction Algorithms

OPERA (Objective Prism Enhanced Reduction Algorithms) automatically analyzes astronomical images using the objective-prism (OP) technique to register thousands of low-resolution spectra over large areas. It detects objects in an image, extracts one-dimensional spectra, and identifies emission-line features. The main advantages of this method are: 1) avoiding the subjectivity inherent in the visual inspection used in past studies; and 2) obtaining physical parameters without follow-up spectroscopy.

[ascl:1411.004] OPERA: Open-source Pipeline for Espadons Reduction and Analysis

OPERA (Open-source Pipeline for Espadons Reduction and Analysis) is an open-source collaborative software reduction pipeline for ESPaDOnS data. ESPaDOnS is a bench-mounted high-resolution echelle spectrograph and spectro-polarimeter designed to obtain a complete optical spectrum (from 370 to 1,050 nm) in a single exposure with a mode-dependent resolving power between 68,000 and 81,000. OPERA is fully automated, calibrates on two-dimensional images and reduces data to produce one-dimensional intensity and polarimetric spectra. Spectra are extracted using an optimal extraction algorithm. Though designed for CFHT ESPaDOnS data, the pipeline is extensible to other echelle spectrographs.

[submitted] Opik Collision Probability

The Opik method gives the mean probability of collision of a small body with a given planet. It is a statistical value valid for an orbit with given (a,e,i) and undefined argument of perihelion. In some cases, the planet can eject the small body from the solar system; in these cases, the program estimates the mean time for the ejection. The Opik method does not take into account perturbers other than the planet considered, so it only provides an idea of the timescales involved.

[ascl:1803.013] optBINS: Optimal Binning for histograms

optBINS (optimal binning) determines the optimal number of bins in a uniform bin-width histogram by deriving the posterior probability for the number of bins in a piecewise-constant density model after assigning a multinomial likelihood and a non-informative prior. The maximum of the posterior probability occurs at a point where the prior probability and the joint likelihood are balanced. The interplay between these opposing factors effectively implements Occam's razor by selecting the simplest model that best describes the data.
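
The posterior sketched above has a compact closed form (optBINS implements Knuth's Bayesian binning method). Below is a minimal Python re-expression of the published log-posterior search over the number of bins M, assuming equal-width bins; it is not the distributed optBINS source.

```python
import numpy as np
from scipy.special import gammaln

def optimal_bin_count(data, max_bins=200):
    """Maximize the log posterior for M equal-width bins (multinomial
    likelihood with a non-informative prior, as described above)."""
    N = data.size
    log_post = np.empty(max_bins)
    for M in range(1, max_bins + 1):
        counts, _ = np.histogram(data, bins=M)
        log_post[M - 1] = (N * np.log(M)
                           + gammaln(M / 2.0)
                           - M * gammaln(0.5)
                           - gammaln(N + M / 2.0)
                           + np.sum(gammaln(counts + 0.5)))
    return int(np.argmax(log_post)) + 1

x = np.random.default_rng(0).normal(size=2000)
print(optimal_bin_count(x))   # a few tens of bins for a smooth unimodal sample
```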

[ascl:1310.001] ORAC-DR: Astronomy data reduction pipeline

ORAC-DR is a generic data reduction pipeline infrastructure; it includes specific data processing recipes for a number of instruments. It is used at the James Clerk Maxwell Telescope, United Kingdom Infrared Telescope, AAT, and LCOGT. This pipeline runs at the JCMT Science Archive hosted by CADC to generate near-publication quality data products; the code has been in use since 1998.

[ascl:1210.024] ORBADV: ORBital ADVection by interpolation

ORBADV adopts a ZEUS-like scheme to solve magnetohydrodynamic equations of motion in a shearing sheet. The magnetic field is discretized on a staggered mesh, and magnetic field variables represent fluxes through zone faces. The code uses orbital advection to ensure fast and accurate integration in a large shearing box.

[ascl:1702.001] ORBE: Orbital integrator for educational purposes

ORBE performs numerical integration of an arbitrary planetary system composed of a central star and up to 100 planets and minor bodies. ORBE calculates the orbital evolution of a system of bodies by computing the time evolution of their orbital elements. It is easy to use and is suitable for educational use by undergraduate students in the classroom as a first approach to orbital integrators.
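
ORBE itself works in terms of orbital elements; for readers who want to see what any such integrator rests on, here is a minimal kick-drift-kick (leapfrog) N-body step in Python. The units (AU, years, solar masses, so G = 4*pi^2) and all numbers are illustrative and not taken from ORBE.

```python
import numpy as np

G = 4.0 * np.pi**2          # AU^3 yr^-2 Msun^-1

def accelerations(pos, masses, soft=1e-12):
    """Pairwise gravitational accelerations; `soft` only avoids 0/0 on the diagonal."""
    acc = np.zeros_like(pos)
    for i in range(len(pos)):
        d = pos - pos[i]
        r3 = (np.einsum("ij,ij->i", d, d) + soft) ** 1.5
        acc[i] = G * np.sum(masses[:, None] * d / r3[:, None], axis=0)
    return acc

def leapfrog(pos, vel, masses, dt, n_steps):
    """Kick-drift-kick integration of a star plus planets."""
    for _ in range(n_steps):
        vel += 0.5 * dt * accelerations(pos, masses)
        pos += dt * vel
        vel += 0.5 * dt * accelerations(pos, masses)
    return pos, vel

# Sun plus an Earth-mass planet on a circular 1 AU orbit, integrated for 10 yr
m = np.array([1.0, 3e-6])
p = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
v = np.array([[0.0, 0.0, 0.0], [0.0, 2.0 * np.pi, 0.0]])
p, v = leapfrog(p, v, m, dt=1e-3, n_steps=10000)
```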

[ascl:1307.016] orbfit: Orbit fitting software

Orbfit determines positions and orbital elements, and associated uncertainties, of outer solar system planets. The orbit-fitting procedure is greatly streamlined compared with traditional methods because acceleration can be treated as a perturbation to the inertial motion of the body. Orbfit quickly and accurately calculates orbital elements and ephemerides and their associated uncertainties for targets ≳ 10 AU from the Sun and produces positional estimates and uncertainty ellipses even in the face of the substantial degeneracies of short-arc orbit fits; the sole a priori assumption is that the orbit should be bound or nearly so.

[ascl:1106.015] OrbFit: Software to Determine Orbits of Asteroids

OrbFit is a software system that allows one to compute the orbits of asteroids starting from the observations, to propagate these orbits, and to predict future (and past) positions on the celestial sphere. It is a tool that can be used to find a well-known asteroid, to recover a lost one, to attribute a small group of observations, to identify two orbits with each other, to study future (and/or past) close approaches to Earth and thus assess the risk of an impact, and more.

[ascl:1804.009] orbit-estimation: Fast orbital parameters estimator

orbit-estimation tests and evaluates the Stäckel approximation method for estimating orbit parameters in galactic potentials. It relies on the approximation of the Galactic potential as a Stäckel potential, in a prolate confocal coordinate system, under which the vertical and horizontal motions decouple. By solving the Hamilton-Jacobi equations at the turning points of the horizontal and vertical motions, it is possible to determine the spatial boundary of the orbit and hence calculate the desired orbit parameters.

[ascl:1409.007] ORBS: A reduction software for SITELLE and SpiOMM data

ORBS merges, corrects, transforms, and calibrates interferometric data cubes and produces a spectral cube of the observed region for analysis. It is a fully automatic data reduction software for use with SITELLE (installed at the Canada-France-Hawaii Telescope) and SpIOMM (a prototype attached to the Observatoire du Mont Mégantic); these imaging Fourier transform spectrometers obtain a hyperspectral data cube that samples a 12 arcminute field of view with 4 million visible spectra. ORBS is highly parallelized; its core classes (ORB) have been designed to be used in a suite of software for data analysis (ORCS and OACS), data simulation (ORUS), and data acquisition (IRIS).

[ascl:1304.012] ORIGAMI: Structure-finding routine in N-body simulation

ORIGAMI is a dynamical method of determining the morphology of particles in a cosmological simulation by checking whether, and in how many dimensions, a particle has undergone shell-crossing. The code is written in C and makes use of the Delaunay tessellation calculation routines from the VOBOZ package (which relies on the Qhull package).

[ascl:1204.013] ORSA: Orbit Reconstruction, Simulation and Analysis

ORSA is an interactive tool for scientific-grade Celestial Mechanics computations. Asteroids, comets, artificial satellites, and solar and extra-solar planetary systems can be accurately reproduced, simulated, and analyzed. The software uses JPL ephemeris files for accurate planetary positions and has a Qt-based graphical user interface. It offers an advanced 2D plotting tool and 3D OpenGL viewer as well as the standalone numerical library liborsa, and can import asteroids and comets from all the known databases (MPC, JPL, Lowell, AstDyS, and NEODyS). In addition, it has an integrated download tool to update databases.

[ascl:1710.021] OSIRIS Toolbox: OH-Suppressing InfraRed Imaging Spectrograph pipeline

OSIRIS Toolbox reduces data for the Keck OSIRIS instrument, an integral field spectrograph that works with the Keck Adaptive Optics System. It offers real-time reduction of raw frames into cubes for display and basic analysis. In this real-time mode, it takes about one minute for a preliminary data cube to appear in the “quicklook” display package. The reduction system also includes a growing set of final reduction steps including correction of telluric absorption and mosaicing of multiple cubes.

[ascl:1805.014] OSS: OSSOS Survey Simulator

Directly comparing the properties of discovered trans-Neptunian Objects (TNOs) with dynamical models is impossible because of the observational biases inherent in surveys. The OSSOS Survey Simulator takes an intrinsic orbital model (from, for example, the output of a dynamical Kuiper belt emplacement simulation) and applies the survey biases, so the biased simulated objects can be directly compared with real discoveries.
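
The core loop of any such simulator is easy to sketch: draw objects from the intrinsic model, compute apparent magnitudes, and keep each object with the probability given by the survey's detection efficiency. In the Python sketch below, the magnitude relation (phase correction omitted), the efficiency function, and the population are all hypothetical stand-ins, not the OSSOS survey characterization.

```python
import numpy as np

rng = np.random.default_rng(42)

def apparent_mag(H, r_helio, r_geo):
    # Simplified relation with the phase correction omitted; adequate
    # for distant TNOs, where phase angles are small.
    return H + 5.0 * np.log10(r_helio * r_geo)

def detection_efficiency(m, m50=24.5, width=0.4):
    # Hypothetical smooth efficiency function falling through half its
    # peak value at magnitude m50.
    return 0.9 / (1.0 + np.exp((m - m50) / width))

# toy intrinsic population: semimajor axis a (AU), absolute magnitude H
n = 100000
a = rng.uniform(35.0, 50.0, n)
H = 3.0 + 7.0 * rng.power(5, n)              # steep, illustrative H distribution
m = apparent_mag(H, a, a - 1.0)              # crude: evaluate at r ~ a
detected = rng.random(n) < detection_efficiency(m)
print(f"{detected.sum()} of {n} synthetic objects survive the survey biases")
```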

[ascl:1611.011] OXAF: Ionizing spectra of Seyfert galaxies for photoionization modeling

OXAF provides a simplified model of Seyfert Active Galactic Nucleus (AGN) continuum emission designed for photoionization modeling. It removes degeneracies in the effects of AGN parameters on model spectral shapes and reproduces the diversity of spectral shapes that arise in physically-based models. OXAF accepts three parameters which directly describe the shape of the output ionizing spectrum: the energy of the peak of the accretion disk emission Epeak, the photon power-law index of the non-thermal X-ray emission Γ, and the proportion of the total flux which is emitted in the non-thermal component pNT. OXAF accounts for opacity effects where the accretion disk is ionized because it inherits the ‘color correction’ of OPTXAGNF, the physical model upon which OXAF is based.

[ascl:1806.011] P2DFFT: Parallelized technique for measuring galactic spiral arm pitch angles

P2DFFT is a parallelized version of 2DFFT (ascl:1608.015). It isolates and measures the spiral arm pitch angle of galaxies. The code allows direct input of FITS images, offers the option to output inverse Fourier transform FITS images, and generates idealized logarithmic spiral test images of a specified size that have 1 to 6 arms with pitch angles of -75 degrees to 75 degrees. Further, it can output Fourier amplitude versus inner radius and pitch angle versus inner radius for each Fourier component (m = 0 to m = 6), and calculates the Fourier amplitude-weighted mean pitch angle across m = 1 to m = 6 versus inner radius.
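
The measurement underlying 2DFFT-style codes can be shown in a few lines: resample the image onto a (u = ln r, theta) grid, Fourier transform in both coordinates, and convert the dominant radial frequency of each harmonic m into a pitch angle via P = arctan(-m / p_max). The NumPy sketch below is a serial toy with nearest-neighbor resampling and arbitrary grid sizes, not the parallelized P2DFFT code.

```python
import numpy as np

def pitch_angles(image, m_max=6, n_u=128, n_t=128):
    """Toy 2D-Fourier pitch-angle estimate in (u = ln r, theta) space."""
    ny, nx = image.shape
    yc, xc = (ny - 1) / 2.0, (nx - 1) / 2.0
    u = np.linspace(np.log(2.0), np.log(min(xc, yc)), n_u)      # u = ln r
    t = np.linspace(0.0, 2.0 * np.pi, n_t, endpoint=False)
    uu, tt = np.meshgrid(u, t, indexing="ij")
    xi = np.clip((np.exp(uu) * np.cos(tt) + xc).astype(int), 0, nx - 1)
    yi = np.clip((np.exp(uu) * np.sin(tt) + yc).astype(int), 0, ny - 1)
    polar = image[yi, xi]                       # nearest-neighbor resampling
    p_freqs = 2.0 * np.pi * np.fft.fftfreq(n_u, d=u[1] - u[0])
    angles = {}
    for m in range(1, m_max + 1):
        a_theta = np.sum(polar * np.exp(-1j * m * tt), axis=1)  # theta transform
        a_mp = np.fft.fft(a_theta)                              # u transform
        keep = np.abs(p_freqs) > 1e-6                           # skip the DC term
        p_max = p_freqs[keep][np.argmax(np.abs(a_mp[keep]))]
        angles[m] = np.degrees(np.arctan(-m / p_max))
    return angles
```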

[ascl:1402.030] P2SAD: Particle Phase Space Average Density

P2SAD computes the Particle Phase Space Average Density (P2SAD) in galactic haloes. The model for the calculation is based on the stable clustering hypothesis in phase space, the spherical collapse model, and tidal disruption of substructures. The multiscale prediction for P2SAD computed by this IDL code can be used to estimate signals sensitive to the small scale structure of dark matter distributions (e.g. dark matter annihilation). The code computes P2SAD averaged over the whole virialized region of a Milky-Way-size halo at redshift zero.

[ascl:1205.002] p3d: General data-reduction tool for fiber-fed integral-field spectrographs

p3d is a semi-automatic data-reduction tool designed to be used with fiber-fed integral-field spectrographs. p3d is a highly general and freely available tool based on IDL, but it can be used with full functionality without an IDL license. It is easily extended to include improved algorithms, new visualization tools, and support for additional instruments. It uses a novel algorithm for automatic finding and tracing of spectra on the detector, and includes two methods of optimal spectrum extraction in addition to standard aperture extraction. p3d also provides tools to combine several images, perform wavelength calibration, and flat-field the data.

[ascl:1105.002] PACCE: Perl Algorithm to Compute Continuum and Equivalent Widths

PACCE (Perl Algorithm to Compute continuum and Equivalent Widths) computes continuum and equivalent widths. PACCE is able to determine mean continuum and continuum at line center values, which are helpful in stellar population studies, and is also able to compute the uncertainties in the equivalent widths using photon statistics.
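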

[ascl:1110.011] Pacerman: Polarisation Angle CorrEcting Rotation Measure ANalysis

Pacerman, written in IDL, calculates Faraday rotation measure maps from multi-frequency polarisation angle data. To solve the so-called n-pi ambiguity problem, which arises because the polarisation angle is observationally determined only up to additions of n times pi (where n is an integer), Pacerman uses a global scheme: instead of resolving the ambiguity for each data point independently, it solves the n-pi ambiguity "democratically" within a high signal-to-noise region and uses this information to assist computations in adjacent low signal-to-noise areas.
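
The per-pixel version of the problem shows why a global scheme helps: with only a few frequencies, several integer offsets can fit a single pixel almost equally well. A brute-force Python sketch for one pixel follows; the region-growing, "democratic" part of Pacerman is deliberately not reproduced here.

```python
import numpy as np
from itertools import product

def fit_rm_single_pixel(lam2, chi, sigma, n_range=range(-2, 3)):
    """Weighted least-squares fit of chi = chi0 + RM * lam2, trying every
    combination of n*pi offsets (first frequency held as the reference)."""
    A = np.vstack([np.ones_like(lam2), lam2]).T
    w = 1.0 / sigma                                 # sqrt of the weights
    best_chi2, best_rm = np.inf, np.nan
    for ns in product(n_range, repeat=len(lam2) - 1):
        ang = chi + np.pi * np.array((0,) + ns)
        sol, res, _, _ = np.linalg.lstsq(A * w[:, None], ang * w, rcond=None)
        chi2 = res[0] if res.size else 0.0
        if chi2 < best_chi2:
            best_chi2, best_rm = chi2, sol[1]
    return best_rm                                  # rad m^-2

# four frequencies, true RM = 400 rad m^-2, angles wrapped into [0, pi)
lam2 = np.array([0.0049, 0.0064, 0.0081, 0.0100])   # lambda^2 in m^2
chi = np.mod(0.5 + 400.0 * lam2, np.pi)
print(fit_rm_single_pixel(lam2, chi, sigma=np.full(4, 0.05)))
```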

[ascl:1708.014] PACSman: IDL Suite for Herschel/PACS spectrometer data

PACSman provides an alternative for several reduction and analysis steps performed in HIPE (ascl:1111.001) on PACS spectroscopic data; it is written in IDL. Among the operations possible with it are transient correction, line fitting, map projection, and map analysis. The unchopped scan, chop/nod, and decommissioned wavelength-switching observation modes are supported.

[ascl:1210.009] PAHFIT: Properties of PAH Emission

PAHFIT is an IDL tool for decomposing Spitzer IRS spectra of PAH emission sources, with a special emphasis on the careful recovery of ambiguous silicate absorption and weak, blended dust emission features. PAHFIT is primarily designed for use with full 5-35 micron Spitzer low-resolution IRS spectra. PAHFIT is a flexible tool for fitting spectra, and you can add or disable features, compute combined flux bands, change fitting limits, etc., without changing the code.

PAHFIT uses a simple, physically-motivated model, consisting of starlight, thermal dust continuum in a small number of fixed temperature bins, resolved dust features and feature blends, prominent emission lines (which themselves can be blended with dust features), as well as simple fully-mixed or screen dust extinction, dominated by the silicate absorption bands at 9.7 and 18 microns. Most model components are held fixed or are tightly constrained. PAHFIT uses Drude profiles to recover the full strength of dust emission features and blends, including the significant power in the wings of the broad emission profiles. This means the resulting feature strengths are larger (by factors of 2-4) than are recovered by methods which estimate the underlying continuum using line segments or spline curves fit through fiducial wavelength anchors.

[ascl:1606.002] PAL: Positional Astronomy Library

The PAL library is a partial re-implementation of Pat Wallace's popular SLALIB library, written in C under the GNU GPL license and layered on top of the IAU's SOFA library (or the BSD-licensed ERFA) where appropriate. PAL attempts to stick to the SLA C API where possible.

[ascl:1406.002] PAMELA: Optimal extraction code for long-slit CCD spectroscopy

PAMELA is an implementation of the optimal extraction algorithm for long-slit CCD spectroscopy and is well suited for time-series spectroscopy. It properly implements the optimal extraction algorithm for curved spectra, including on-the-fly cosmic ray rejection as well as proper calculation and propagation of the errors. The software is distributed as part of the Starlink software collection (ascl:1110.012).

[ascl:1805.021] PampelMuse: Crowded-field 3D spectroscopy

PampelMuse analyzes integral-field spectroscopic observations of crowded stellar fields and provides several subroutines to perform the individual steps of the data analysis. All analysis steps assume that the IFS data has been properly reduced and that all the instrumental artifacts have been removed. PampelMuse is designed to correctly handle IFS data regardless of which instrument was used to observe the data. In addition to the actual data, the software also requires an estimate of the variances for the analysis; optionally, it can use a bad pixel mask. The analysis relies on the presence of a reference catalogue, containing coordinates and magnitudes of the stars in and around the observed field.

[ascl:1511.009] Pangloss: Reconstructing lensing mass

Pangloss reconstructs all the mass within a light cone through the Universe. Understanding complex mass distributions like this is important for accurate time delay lens cosmography, and also for accurate lens magnification estimation. It aspires to use all available data in an attempt to make the best of all mass maps.

[ascl:1103.008] Parallel HOP: A Scalable Halo Finder for Massive Cosmological Data Sets

Modern N-body cosmological simulations contain billions ($10^9$) of dark matter particles. These simulations require hundreds to thousands of gigabytes of memory, and employ hundreds to tens of thousands of processing cores on many compute nodes. In order to study the distribution of dark matter in a cosmological simulation, the dark matter halos must be identified using a halo finder, which establishes the halo membership of every particle in the simulation. The resources required for halo finding are similar to the requirements for the simulation itself. In particular, simulations have become too extensive to use commonly-employed halo finders, such that the computational requirements to identify halos must now be spread across multiple nodes and cores. Here we present a scalable parallel halo-finding method called Parallel HOP for large-scale cosmological simulation data. Based on the halo finder HOP, it utilizes MPI and domain decomposition to distribute the halo finding workload across multiple compute nodes, enabling analysis of much larger datasets than is possible with the strictly serial or previous parallel implementations of HOP. We provide a reference implementation of this method as a part of the toolkit yt, an analysis toolkit for Adaptive Mesh Refinement (AMR) data that includes complementary analysis modules. Additionally, we discuss a suite of benchmarks that demonstrate that this method scales well up to several hundred tasks and datasets in excess of $2000^3$ particles. The Parallel HOP method and our implementation can be readily applied to any kind of N-body simulation data and is therefore widely applicable. Parallel HOP is part of yt.

[ascl:1106.009] PARAMESH V4.1: Parallel Adaptive Mesh Refinement

PARAMESH is a package of Fortran 90 subroutines designed to provide an application developer with an easy route to extend an existing serial code which uses a logically cartesian structured mesh into a parallel code with adaptive mesh refinement (AMR). Alternatively, in its simplest use, and with minimal effort, it can operate as a domain decomposition tool for users who want to parallelize their serial codes, but who do not wish to use adaptivity.

The package builds a hierarchy of sub-grids to cover the computational domain, with spatial resolution varying to satisfy the demands of the application. These sub-grid blocks form the nodes of a tree data-structure (quad-tree in 2D or oct-tree in 3D). Each grid block has a logically cartesian mesh. The package supports 1, 2 and 3D models. PARAMESH is released under the NASA-wide Open-Source software license.

[ascl:1010.039] Parameter Estimation from Time-Series Data with Correlated Errors: A Wavelet-Based Method and its Application to Transit Light Curves

We consider the problem of fitting a parametric model to time-series data that are afflicted by correlated noise. The noise is represented by a sum of two stationary Gaussian processes: one that is uncorrelated in time, and another that has a power spectral density varying as $1/f^\gamma$. We present an accurate and fast [O(N)] algorithm for parameter estimation based on computing the likelihood in a wavelet basis. The method is illustrated and tested using simulated time-series photometry of exoplanetary transits, with particular attention to estimating the midtransit time. We compare our method to two other methods that have been used in the literature, the time-averaging method and the residual-permutation method. For noise processes that obey our assumptions, the algorithm presented here gives more accurate results for midtransit times and truer estimates of their uncertainties.
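
A compact sketch of the likelihood evaluation, assuming PyWavelets for the transform. The level-dependent detail variance sigma_r^2 * 2^(-gamma*m) + sigma_w^2 follows the noise model above; for brevity, the coarsest scaling coefficients are given the same variance here, which simplifies the published prescription.

```python
import numpy as np
import pywt

def wavelet_loglike(residuals, sigma_w, sigma_r, gamma=1.0, wavelet="db2"):
    """O(N) Gaussian log-likelihood of time-series residuals in a wavelet basis."""
    coeffs = pywt.wavedec(residuals, wavelet)     # [cA_n, cD_n, ..., cD_1]
    n_levels = len(coeffs) - 1
    logl = 0.0
    for i, c in enumerate(coeffs):
        m = n_levels if i == 0 else n_levels - (i - 1)   # scale index of this set
        var = sigma_r**2 * 2.0 ** (-gamma * m) + sigma_w**2
        logl += -0.5 * np.sum(c**2 / var + np.log(2.0 * np.pi * var))
    return logl

# e.g., score a transit model against data: wavelet_loglike(data - model, 1e-3, 5e-4)
```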

[ascl:1103.014] ParaView: Data Analysis and Visualization Application

ParaView is an open-source, multi-platform data analysis and visualization application. ParaView users can quickly build visualizations to analyze their data using qualitative and quantitative techniques. The data exploration can be done interactively in 3D or programmatically using ParaView's batch processing capabilities.

ParaView was developed to analyze extremely large datasets using distributed memory computing resources. It can be run on supercomputers to analyze terascale datasets as well as on laptops for smaller data.

[ascl:1601.010] PARAVT: Parallel Voronoi Tessellation

PARAVT offers massively parallel computation of Voronoi tessellations (VT hereafter) in large data sets. The code is aimed at astrophysical applications in which VT densities and neighbor lists are widely used. Several serial Voronoi tessellation codes exist; however, no open-source parallel implementations were available to handle the large numbers of particles/galaxies in current N-body simulations and sky surveys. Parallelization is implemented under MPI, and the VT is computed using the Qhull library. The domain decomposition takes into account consistent boundary computation between tasks and supports periodic conditions. In addition, the code computes neighbor lists, the Voronoi density, and the Voronoi cell volume for each particle, and can compute the density on a regular grid.
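
For orientation, the per-particle quantities PARAVT outputs can be computed serially in a few lines with SciPy's Qhull bindings. This sketch omits the MPI domain decomposition and simply skips unbounded boundary cells instead of handling them with ghost particles.

```python
import numpy as np
from scipy.spatial import ConvexHull, Voronoi

def voronoi_volumes(points):
    """Volume of each particle's Voronoi cell; NaN for unbounded cells."""
    vor = Voronoi(points)                        # SciPy wraps Qhull, as PARAVT uses
    volumes = np.full(len(points), np.nan)
    for i, region_index in enumerate(vor.point_region):
        region = vor.regions[region_index]
        if -1 in region or len(region) == 0:     # cell extends to infinity
            continue
        volumes[i] = ConvexHull(vor.vertices[region]).volume
    return volumes

pts = np.random.default_rng(3).random((5000, 3))
vol = voronoi_volumes(pts)
density = 1.0 / vol                              # per-particle Voronoi density
```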

[ascl:1502.005] PARSEC: PARametrized Simulation Engine for Cosmic rays

PARSEC (PARametrized Simulation Engine for Cosmic rays) is a simulation engine for fast generation of ultra-high energy cosmic ray data based on parameterizations of common assumptions of UHECR origin and propagation. Implemented are deflections in unstructured turbulent extragalactic fields, energy losses for protons due to photo-pion production and electron-pair production, as well as effects from the expansion of the universe. Additionally, a simple model to estimate propagation effects from iron nuclei is included. Deflections in the Galactic magnetic field are included using a matrix approach with precalculated lenses generated from backtracked cosmic rays. The PARSEC program is based on object oriented programming paradigms enabling users to extend the implemented models and is steerable with a graphical user interface.

[ascl:1208.020] ParselTongue: AIPS Python Interface

ParselTongue is a Python interface to classic AIPS, Obit and possibly other task-based data reduction packages. It serves as the software infrastructure for some of the ALBUS implementation. It allows you to run AIPS tasks, and access AIPS headers and extension tables from Python. There is also support for running Obit tasks and accessing data in FITS files. Full access to the visibilities in AIPS UV data is also available.

[ascl:1010.005] Particle module of Piernik MHD code

Piernik is a multi-fluid grid magnetohydrodynamic (MHD) code based on the Relaxing Total Variation Diminishing (RTVD) conservative scheme. The original code has been extended by the addition of dust described within the particle approximation. The dust is now described as a system of interacting particles. The particles can interact with gas, which is described as a fluid. The comparison between the test problem results and the results of fluid simulations made with the Piernik code shows the most important differences between the fluid and particle approximations used to describe the dynamical evolution of dust under astrophysical conditions.

[ascl:1010.073] partiview: Immersive 4D Interactive Visualization of Large-Scale Simulations

In dense clusters a bewildering variety of interactions between stars can be observed, ranging from simple encounters to collisions and other mass-transfer encounters. With faster and special-purpose computers like GRAPE, the amount of data per simulation is now exceeding 1 TB. Visualization of such data has become a complex 4D data-mining problem, combining space and time and finding interesting events in these large datasets. We have recently started using the virtual reality simulator installed in the Hayden Planetarium at the American Museum of Natural History to tackle some of these problems. partiview is a program that enables you to visualize and animate particle data. partiview runs on relatively simple desktops and laptops, but is mostly compatible with its big brother VirDir.

[ascl:1809.003] PASTA: Python Astronomical Stacking Tool Array

PASTA performs median stacking of astronomical sources. Written in Python, it can filter sources, provide stack statistics, generate Karma annotations, format source lists, and read information from stacked Flexible Image Transport System (FITS) images. PASTA was originally written to examine polarization stack properties and includes a Monte Carlo modeler for obtaining true polarized intensity from the observed polarization of a stack. PASTA is also useful as a generic stacking tool, even if polarization properties are not being examined.

[ascl:1102.002] PBL: Particle-Based Lensing for Gravitational Lensing Mass Reconstructions of Galaxy Clusters

We present Particle-Based Lensing (PBL), a new technique for gravitational lensing mass reconstructions of galaxy clusters. Traditionally, most methods have employed either a finite inversion or gridding to turn observational lensed galaxy ellipticities into an estimate of the surface mass density of a galaxy cluster. We approach the problem from a different perspective, motivated by the success of multi-scale analysis in smoothed particle hydrodynamics. In PBL, we treat each of the lensed galaxies as a particle and then reconstruct the potential by smoothing over a local kernel with variable smoothing scale. In this way, we can tune a reconstruction to produce a constant signal-to-noise ratio throughout, and maximally exploit regions of high information density.

PBL is designed to include all lensing observables, including multiple image positions and fluxes from strong lensing, as well as weak lensing signals including shear and flexion. In this paper, however, we describe a shear-only reconstruction, and apply the method to several test cases, including simulated lensing clusters, as well as the well-studied "Bullet Cluster" (1E0657-56). In the former cases, we show that PBL is better able to identify cusps and substructures than are grid-based reconstructions, and in the latter case, we show that PBL is able to identify substructure in the Bullet Cluster without even exploiting strong lensing measurements.

[ascl:1708.007] PBMC: Pre-conditioned Backward Monte Carlo code for radiative transport in planetary atmospheres

PBMC (Pre-Conditioned Backward Monte Carlo) solves the vector Radiative Transport Equation (vRTE) and can be applied to planetary atmospheres irradiated from above. The code builds the solution by simulating the photon trajectories from the detector towards the radiation source, i.e. in the reverse order of the actual photon displacements. In accounting for the polarization in the sampling of photon propagation directions and pre-conditioning the scattering matrix with information from the scattering matrices of prior (in the BMC integration order) photon collisions, PBMC avoids the unstable and biased solutions of classical BMC algorithms for conservative, optically-thick, strongly-polarizing media such as Rayleigh atmospheres.

[ascl:1207.012] PCA: Principal Component Analysis for spectra modeling

The mid-infrared spectra of ultraluminous infrared galaxies (ULIRGs) contain a variety of spectral features that can be used as diagnostics to characterize the spectra. However, such diagnostics are biased by our prior prejudices on the origin of the features. Moreover, by using only part of the spectrum they do not utilize the full information content of the spectra. Blind statistical techniques such as principal component analysis (PCA) consider the whole spectrum, find correlated features and separate them out into distinct components.

This code, written in IDL, classifies principal components of IRS spectra to define a new classification scheme using 5D Gaussian mixture modeling. The five PCs and the average spectra for the four classifications used to classify objects are made available with the code.

[ascl:1705.004] PCAT: Probabilistic Cataloger

PCAT (Probabilistic Cataloger) samples from the posterior distribution of a metamodel, i.e., union of models with different dimensionality, to compare the models. This is achieved via transdimensional proposals such as births, deaths, splits and merges in addition to the within-model proposals. This method avoids noisy estimates of the Bayesian evidence that may not reliably distinguish models when sampling from the posterior probability distribution of each model.

The code has been applied in two different subfields of astronomy: high energy photometry, where transdimensional elements are gamma-ray point sources; and strong lensing, where light-deflecting dark matter subhalos take the role of transdimensional elements.

[ascl:1809.002] PCCDPACK: Polarimetry with CCD

PCCDPACK analyzes polarimetry data. The set of routines is written in CL-IRAF (including compiled Fortran codes) and analyzes dozens of point objects simultaneously on the same CCD image. A subpackage, specpol, is included to analyze spectropolarimetry data.

[ascl:1102.022] PDRT: Photo Dissociation Region Toolbox

Ultraviolet photons from O and B stars strongly influence the structure and emission spectra of the interstellar medium. The UV photons energetic enough to ionize hydrogen (hν > 13.6 eV) will create the H II region around the star, but lower energy UV photons escape. These far-UV photons (6 eV < hν < 13.6 eV) are still energetic enough to photodissociate molecules and to ionize low ionization-potential atoms such as carbon, silicon, and sulfur. They thus create a photodissociation region (PDR) just outside the H II region. In aggregate, these PDRs dominate the heating and cooling of the neutral interstellar medium.

As part of the Web Infrared Tool Shed (WITS) we have developed a web tool, called the PDR Toolbox, that allows users to determine the physical parameters of a PDR from a set of spectral line observations. Typical observations of both Galactic and extragalactic PDRs come from ground-based millimeter and submillimeter telescopes such as CARMA or the CSO, or space-based telescopes such as Spitzer, ISO, SOFIA, and Herschel. Given a set of observations of spectral line intensities, PDR Toolbox will compute best-fit FUV incident intensity and cloud density based on our published models of PDR emission.
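
The best-fit step reduces to a chi-square search over a precomputed model grid. In the Python sketch below the model grid is a hypothetical toy, standing in for the published PDR models the toolbox actually uses.

```python
import numpy as np

# hypothetical model grid: a predicted line-intensity ratio on a grid of
# cloud density n (cm^-3) and incident FUV intensity G0 (Habing units)
n_grid = np.logspace(1, 6, 51)
g0_grid = np.logspace(0, 5, 51)
model_ratio = np.add.outer(np.log10(n_grid), -np.log10(g0_grid))  # toy model

observed, err = 1.5, 0.2                       # one observed ratio and its error
chi2 = ((model_ratio - observed) / err) ** 2
i, j = np.unravel_index(np.argmin(chi2), chi2.shape)
print(f"best fit: n = {n_grid[i]:.3g} cm^-3, G0 = {g0_grid[j]:.3g}")
```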

[ascl:1605.008] PDT: Photometric DeTrending Algorithm Using Machine Learning

PDT removes systematic trends in light curves. It finds clusters of light curves that are highly correlated using machine learning, constructs one master trend per cluster and detrends an individual light curve using the constructed master trends by minimizing residuals while constraining coefficients to be positive.
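
A condensed sketch of that pipeline, assuming hierarchical clustering on 1 - correlation as the machine-learning step (PDT's actual clustering may differ) and SciPy's non-negative least squares for the positivity constraint:

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.optimize import nnls

def detrend(lcs, n_clusters=3):
    """Cluster correlated light curves, build one master trend per cluster,
    and remove each curve's non-negative combination of the master trends."""
    lcs = np.asarray(lcs, dtype=float)          # shape (n_curves, n_points)
    corr = np.corrcoef(lcs)
    dist = 1.0 - corr[np.triu_indices(len(lcs), k=1)]   # condensed distances
    labels = fcluster(linkage(dist, method="average"),
                      n_clusters, criterion="maxclust")
    masters = np.array([np.median(lcs[labels == k], axis=0)
                        for k in np.unique(labels)])
    out = np.empty_like(lcs)
    for i, lc in enumerate(lcs):
        coef, _ = nnls(masters.T, lc)           # coefficients constrained >= 0
        out[i] = lc - coef @ masters + np.median(lc)    # keep the flux level
    return out
```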

[ascl:1304.001] PEC: Period Error Calculator

The PEC (Period Error Calculator) algorithm estimates the period error for eclipsing binaries observed by the Kepler Mission. The algorithm is based on propagation of error theory and assumes that observation of every light curve peak/minimum in a long time-series observation can be unambiguously identified. A simple C implementation of the PEC algorithm is available.
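
The propagation-of-error idea can be illustrated by fitting a linear ephemeris T(E) = T0 + P*E to unambiguously numbered minimum times and reading the period uncertainty off the fit covariance. The Python sketch below is a generic stand-in with invented numbers, not the published PEC formulae.

```python
import numpy as np

rng = np.random.default_rng(1)
P_true, sigma_t = 2.470613, 2e-4     # period and per-minimum timing error (days)
epochs = np.arange(1500)             # every minimum identified, as PEC assumes
times = P_true * epochs + rng.normal(0.0, sigma_t, epochs.size)

coef, cov = np.polyfit(epochs, times, 1, cov=True)
print(f"P = {coef[0]:.7f} +/- {np.sqrt(cov[0, 0]):.2e} d")
```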
