
July 2014 additions to the ASCL

Twenty codes were added to the ASCL in July:

AstroML: Machine learning and data mining in astronomy
ASTRORAY: General relativistic polarized radiative transfer code
BayesFlare: Bayesian method for detecting stellar flares
Brut: Automatic bubble classifier
CLE: Coronal line synthesis

e-MERLIN data reduction pipeline
Exopop: Exoplanet population inference
EZ_Ages: Stellar population age calculator
Halogen: Multimass spherical structure models for N-body simulations
kungifu: Calibration and reduction of fiber-fed IFU astronomical spectroscopy

MATLAB package for astronomy and astrophysics
MCMAC: Monte Carlo Merger Analysis Code
Period04: Statistical analysis of large astronomical time series
PINGSoft2: Integral Field Spectroscopy Software
SAMI: Sydney-AAO Multi-object Integral field spectrograph pipeline

SPECDRE: Spectroscopy Data Reduction
The Starfish Diagram: Statistical visualization tool
TWODSPEC: Long-slit and optical fiber array spectra extensions for FIGARO
VIDE: The Void IDentification and Examination toolkit
VStar: Variable star data visualization and analysis tool

June 2014 additions to the ASCL

Twenty codes were added to the ASCL in June:

ASTROM: Basic astrometry program
ASURV: Astronomical SURVival Statistics
Autoastrom: Autoastrometry for Mosaics
CGS4DR: Automated reduction of data from CGS4
COCO: Conversion of Celestial Coordinates

CoREAS: CORSIKA-based Radio Emission from Air Showers simulator
FROG: Time-series analysis
GAUSSCLUMPS: Gaussian-shaped clumping from a spectral map
IRAS90: IRAS Data Processing
IRCAMDR: IRCAM3 Data Reduction Software

IUEDR: IUE Data Reduction package
JCMTDR: Applications for reducing JCMT continuum data in GSD format
MATCH: A program for matching star lists
PAMELA: Optimal extraction code for long-slit CCD spectroscopy
PERIOD: Time-series analysis package

POLMAP: Interactive data analysis package for linear spectropolarimetry
RV: Radial Components of Observer’s Velocity
STARMAN: Stellar photometry and image/table handling
TSP: Time-Series/Polarimetry Package
VADER: Viscous Accretion Disk Evolution Resource

May 2014 additions to the ASCL

Eighteen codes were added to the ASCL in May:

ATV: Image display tool
CURSA: Catalog and Table Manipulation Applications
DATACUBE: A datacube manipulation package
Defringeflat: Fringe pattern removal
DIPSO: Spectrum analysis code

ECHOMOP: Echelle data reduction package
ESP: Extended Surface Photometry
FLUXES: Position and flux density of planets
FORWARD: Forward modeling of coronal observables
HIIPHOT: Automated Photometry of H II Regions

LBLRTM: Line-By-Line Radiative Transfer Model
PHOTOM: Photometry of digitized images
PISA: Position Intensity and Shape Analysis
POLPACK: Imaging polarimetry reduction package
PROPER: Optical propagation routines

TelFit: Fitting the telluric absorption spectrum
The Hammer: An IDL Spectral Typing Suite
TRIPP: Time Resolved Imaging Photometry Package

March and April 2014 code additions

Twenty-six codes were added to the ASCL in March:

ASTERIX: X-ray Data Processing System
BAOlab: Image processing program
CCDPACK: CCD Data Reduction Package
CHIMERA: Core-collapse supernovae simulation code
computePk: Power spectrum computation

disc2vel: Tangential and radial velocity components derivation
GAIA: Graphical Astronomy and Image Analysis Tool
GPU-D: Generating cosmological microlensing magnification maps
GRay: Massive parallel ODE integrator
Inverse Beta: Inverse cumulative density function (CDF) of a Beta distribution

ISAP: ISO Spectral Analysis Package
JAM: Jeans Anisotropic MGE modeling method
KAPPA: Kernel Applications Package
KINEMETRY: Analysis of 2D maps of kinematic moments of LOSVD
Lightcone: Light-cone generating script

MGE_FIT_SECTORS: Multi-Gaussian Expansion fits to galaxy images
MLZ: Machine Learning for photo-Z
pyExtinction: Atmospheric extinction
RMHB: Hierarchical Reverberation Mapping
SLALIB: A Positional Astronomy Library

SOFA: Standards of Fundamental Astronomy
SURF: Submm User Reduction Facility
T(dust) as a function of sSFR
Unified EOS for neutron stars
Viewpoints: Fast interactive linked plotting of large multivariate data sets

YNOGKM: Time-like geodesics in the Kerr-Newman spacetime calculations

And seventeen codes were added to the ASCL in April:

AMBIG: Automated Ambiguity-Resolution Code
AST: World Coordinate Systems in Astronomy
CAP_LOESS_1D & CAP_LOESS_2D: Recover mean trends from noisy data
carma_pack: MCMC sampler for Bayesian inference
Comet: Multifunction VOEvent broker

LTS_LINEFIT & LTS_PLANEFIT: LTS fit of lines or planes
macula: Model of rotational modulations of a spotted star
RegPT: Regularized cosmological power spectrum
SAS: Science Analysis System for XMM-Newton observatory
SER: Subpixel Event Repositioning Algorithms

SpecPro: Astronomical spectra viewer and analyzer
Spextool: Spectral EXtraction tool
TORUS: Radiation transport and hydrodynamics code
TTVFast: Transit timing inversion
VictoriaReginaModels: Stellar evolutionary tracks

WFC3UV_GC: WFC3 UVIS geometric-distortion correction
ZDCF: Z-Transformed Discrete Correlation Function

Code citation news, info, and commentary

Mozilla Science Lab, GitHub and Figshare team up to fix the citation of code in academia
The Mozilla Science Lab, GitHub and Figshare – a repository where academics can upload, share and cite their research materials – are starting to tackle the problem. The trio have developed a system so researchers can easily sync their GitHub releases with a Figshare account. It creates a Digital Object Identifier (DOI) automatically, which can then be referenced and checked by other people.

Discussion of the above article on YCombinator
…it always make me cringe when privately held companies want to define an “open standard” for scientific citations that (surprise!) relies completely on their proprietary infrastructure. I still remember the case of Mendeley, which promised to build an open repository for research documents, and which is now a subsidiary of Elsevier, an organization that does not really embrace “open science”, to put it mildly.

Tool developed at CERN makes software citation easier
Researchers working at CERN have developed a tool that allows source code from the popular software development site GitHub to be preserved and cited through the CERN-hosted online repository Zenodo….
Now, people working on software in GitHub will be able to ensure that their code is not only preserved through Zenodo, but is also provided with a unique digital object identifier (DOI), just like an academic paper.

WebCite
WebCite is an on-demand archiving system for web references (cited webpages and websites, or other kinds of Internet-accessible digital objects) that can be used by authors, editors, and publishers of scholarly papers and books to ensure that cited web material remains available to readers in the future.

DOIs unambiguously and persistently identify published, trustworthy, citable online scholarly literature. Right?
So DOIs unambiguously and persistently identify published, trustworthy, citable online scholarly literature. Right? Wrong.
The examples above are useful because they help elucidate some misconceptions about the DOI itself, the nature of the DOI registration agencies and, in particular, issues being raised by new RAs and new DOI allocation models.

February 2014 code additions

Thirty-five codes were added to the ASCL in February:

Aladin Lite: Lightweight sky atlas for browsers
ANAigm: Analytic model for attenuation by the intergalactic medium
ARTIST: Adaptable Radiative Transfer Innovations for Submillimeter Telescopes
astroplotlib: Astronomical library of plots
athena: Tree code for second-order correlation functions

BAOlab: Baryon Acoustic Oscillations software
BF_dist: Busy Function fitting
CASSIS: Interactive spectrum analyzer
Commander 2: Bayesian CMB component separation and analysis
CPL: Common Pipeline Library

Darth Fader: Galaxy catalog cleaning method for redshift estimation
DexM: Semi-numerical simulations for very large scales
FAMA: Fast Automatic MOOG Analysis
GalSim: Modular galaxy image simulation toolkit
Glue: Linked data visualizations across multiple files

gyrfalcON: N-body code
HALOFIT: Nonlinear distribution of cosmological mass and galaxies
HydraLens: Gravitational lens model generator
KROME: Chemistry package for astrophysical simulations
libsharp: Library for spherical harmonic transforms

MGHalofit: Modified Gravity extension of Halofit
Munipack: General astronomical image processing software
P2SAD: Particle Phase Space Average Density
PyGFit: Python Galaxy Fitter
PyVO: Python access to the Virtual Observatory

PyWiFeS: Wide Field Spectrograph data reduction pipeline
QUICKCV: Cosmic variance calculator
QuickReduce: Data reduction pipeline for the WIYN One Degree Imager
SPLAT-VO: Spectral Analysis Tool for the Virtual Observatory
SPLAT: Spectral Analysis Tool

TARDIS: Temperature And Radiative Diffusion In Supernovae
UVMULTIFIT: Fitting astronomical radio interferometric data
Vissage: ALMA VO Desktop Viewer
wssa_utils: WSSA 12 micron dust map utilities
XNS: Axisymmetric equilibrium configuration of neutron stars

Codes gone bad and how to save them

The ASCL has 779 codes in it now, some of which date back to the 1990s. With the speed at which both the web and code authors (often grad students or postdocs) move, links to some code sites are bound to go bad over time. We regularly run a link checker to make sure we're not pointing to dead pages; when we find a broken link (defined as one we haven't been able to reach for at least two weeks), we look for a new one and, if that fails, email the code author(s) to ask where the code has moved.
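The core of such a checker fits in a few lines of Python. This is only an illustrative sketch, not the ASCL's actual tooling, and it omits the bookkeeping for the two-week threshold:

```python
# Minimal link-status check: a link is "up" if it answers with a
# 2xx/3xx HTTP status; anything else (or any error) counts as down.
import socket
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

def link_ok(url, timeout=10):
    """Return True if the URL currently responds with a 2xx/3xx status."""
    try:
        req = Request(url, method="HEAD",
                      headers={"User-Agent": "ascl-link-check"})
        with urlopen(req, timeout=timeout) as resp:
            return 200 <= resp.status < 400
    except (HTTPError, URLError, socket.timeout, ValueError):
        return False
```

A production checker would also record the date a link first failed, so that an entry is flagged as broken only after failing for the full two weeks.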

We can’t always find a good link, and code authors sometimes don’t reply to our emails. Currently, eight codes (1% of our entries) have bad links; for half of these, we either cannot find the code author or the author has not replied to numerous emails.

What else can we do?

I assume that some code authors simply forget their codes. Having moved on, perhaps to another institution and other work, they have neither the time nor the incentive to create a new web home for a code they wrote some years ago. That’s understandable, but then the code (a unique solution to a problem, an artifact of astrophysics research, a method used in that research) is lost.

We’d like to save the codes (Save the Codes! I may have to put that on glow-in-the-dark pencils); here are a few ideas for authors who no longer want to maintain a site for their codes:

  1. Send an archive file of the code to the ASCL. We can house it, as we do for CHIWEI.
  2. Post the code in an online repository such as GitHub, SourceForge, Code.Google, or Bitbucket if you would like the code to be open source and are open to others continuing its development, or on a site such as Figshare or Zenodo to simply make it available.
  3. Create a Research Compendium for your paper, data, and code on Research Compendia, or a companion website for your research on RunMyCode and load the code and data for your research there.
  4. Ask your institutional library to house it; many institutions have repositories for storing the digital artifacts of academia and research.
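Option 1 above takes barely any effort at all; in Python, for instance, packaging up a code directory is a single call. The directory and file names here are hypothetical stand-ins:

```python
# Zip up a code directory into an archive ready to email or upload.
import os
import shutil

# Stand-in for the directory holding your old research code
os.makedirs("mycode", exist_ok=True)
with open(os.path.join("mycode", "solver.py"), "w") as f:
    f.write("print('hello from my old research code')\n")

# One call produces mycode-1.0.zip in the current directory
archive = shutil.make_archive("mycode-1.0", "zip", root_dir="mycode")
```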

I don’t know about option 4, but options 1-3 should take 15 minutes or less. Surely a code is worth that little bit of extra time to make it available to others even if you don’t want to be bothered with it anymore.

Please save your code; don’t let it go bad!

Where the codes are

There are currently 768 codes registered in the ASCL; the percentages of codes hosted on different popular sites are:

GitHub: 4.17%
SourceForge: 3.78%
Code.Google: 1.96%
Bitbucket: 0.52%

That means roughly 10% of the codes indexed by the ASCL are hosted on a public site conducive to social programming. That’s higher than the 7% from two years ago (by coincidence, almost exactly two years ago) and not unexpected, given the growth of GitHub. Fewer than 1% of ASCL codes were on GitHub two years ago (only 3 at that time, wow!); now 32 are hosted there. For comparison, there were 14 codes on SourceForge two years ago, so while that number has roughly doubled, the growth in the use of GitHub is obviously much greater.
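As a sanity check, the four quoted percentages sum to about 10.4%. The per-site counts below are back-calculated from those percentages of 768 codes, so they are approximations rather than official ASCL figures:

```python
total = 768  # codes registered in the ASCL at the time
# Counts back-calculated from the quoted percentages (approximate)
hosted = {"GitHub": 32, "SourceForge": 29, "Code.Google": 15, "Bitbucket": 4}

combined = 100 * sum(hosted.values()) / total
print(f"{combined:.1f}%")  # 10.4%
```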

Though stored on sites conducive to collaboration, most of these codes are not the products of big collaborations; the majority of the ASCL codes in these repositories have four or fewer authors.

I expect the percentage of codes on such sites to grow as more people use these tools for versioning; I think those who use such tools may also be more open to sharing their codes and advertising them (via links in papers, if nothing else), which makes the codes easier to find and register in the ASCL, too.

Astrophysics Code Sharing II: The Sequel at AAS 223

On Tuesday, January 7, the AAS Working Group on Astronomical Software (WGAS) and the ASCL sponsored a special session on code sharing as a follow-up to the splinter meeting “Astrophysics Code Sharing?” held at AAS 221. The session continued the dialogue on ways to improve the transparency and efficiency of research by sharing codes, and on how to mitigate the negative aspects of releasing them.

[Photo: the session room, with attendees in seats and standing at the back]

Even before the session began, it looked like there would be standing room only. Photo, Peter Teuben, used with permission

Before the session started, however, there were a few nerve-wracking moments; weather- and Amtrak-related delays had one of the presenters arriving at AAS at 2:40 AM the day of the session rather than before lunch on Monday, and another getting to AAS after the session had started (!) but before his talk was to begin. So yes! There were minutes to spare!

The standing-room-only session was moderated by Peter Teuben of the University of Maryland and chairman of the ASCL Advisory Committee; Robert Hanisch, STScI, outgoing chair of the WGAS and also a member of the ASCL Advisory Committee, provided closing remarks. Those not in the room were not without news of what was being said in it, as there was much tweeting about the session (#aas223, #astroCodeShare).

Peter started the session by introducing the speakers (present or not) and explaining a bit how the session would work: code case studies would have 2-minute question periods for any clarifications or questions about the cases themselves, and other questions would be deferred until the open discussion period, which was approximately the latter half of the session.

Presentations
A very brief summary of the main points of the presentations, along with their titles, presenters, and links to slides where available, is given here.

    • Occupy Hard Drives: Making your work more valuable by giving it away, Benjamin Weiner (University of Arizona)
      Ben pointed out that time spent writing software represents an enormous sunk cost that is, unfortunately, not viewed as doing real work, though writing software is part of doing science. He stated that widely used software has enabled at least as much science as a new instrument would. He encouraged people to document their code for their own sake, to release it without worrying about bugs or other potential issues, and to write software methods papers for journals.
      slides (PDF)
    • Maintaining A User Community For The Montage Image Mosaic Toolkit, Bruce Berriman (Caltech)
      In this case study of Montage, Bruce stated that releasing software comes with a cost, but that it is still worth doing. Montage was developed under contract and was designed from the beginning for ease of maintenance, modularity, and sustainability. It is maintained primarily through volunteer effort, and in part through collaborations, e.g., with the LSST EPO team. He said the Caltech license under which Montage is distributed does not allow users to redistribute modified code, nor can Montage be included in distributions such as Red Hat. He suggested that coders consider licensing carefully.
      slides (PDF)
    • Cloudy – simulating the non-equilibrium microphysics of gas and dust, and its observed spectrum, Gary Ferland (University of Kentucky)
      Gary discussed Cloudy, which, with over three decades of use, is the most mature of the three codes covered in this session. The code is autonomous and self-aware, providing warnings about what might have gone wrong when things do go wrong. Though the user community is broad and participants in the summer schools that are held on the code have formed collaborations, a Yahoo! discussion forum for Cloudy has not been as successful as they had hoped. Cloudy was released as open access, with the most permissive license possible; Gary cited NSF as making this necessary since the code was developed with public grant funds. Students who work on the code get industry-standard programming experience, which is intended to help students gain employment after graduation.
      slides (PDF)
    • NSF Policies on Software and Data Sharing and their Implementation, Daniel Katz (National Science Foundation)
      Dan covered the NSF policies that govern software funded by the agency. Though some NSF panels are much more rigorous than others, it is expected that PIs will publish all significant findings, including data and software; he stated quite firmly that, according to the government, data include software. He also said that it is up to the community, via peer review panels, to enforce these policies, that many core research programs don’t enforce them very well, and that the community determines what is and is not acceptable. This may be changing, however: with an Office of Science and Technology Policy memo on open data, OMB policies are pushing harder on open access.
      slides (PDF)
    • The Astropy Project’s Self-Herding Cats Development Model, Erik Tollerud (Yale University)
      The newest of the three code projects highlighted is Astropy. Erik described the grass-roots effort through which the now ~60 codebase contributors organized themselves, an effort that arose out of a common goal: to streamline astronomy tools written in Python, since having eight different packages that do the same thing means seven-eighths of the effort is wasted. He stated that technology now exists that provides good support for such an effort, including GitHub to manage the contributions of many developers, Travis for testing code, and Sphinx for documentation, which is written alongside the code. He pointed out that agreement on the problem was the key to getting the effort to come together, and that consensus, guidelines, and expectations make it work.
      slides (PDF)
    • Costs and benefits of developing out in the open, David W. Hogg (New York University)
      David started out by saying that everything his group does (all papers, grant proposals, comments, and codes) is open, and has been since 2005, and that this was a pragmatic, not an ethical, decision. He stated that the negatives people cite for not releasing code (getting scooped, embarrassment, time, e-mail and support requests, licensing) are overplayed, and that since the public is paying for this work, we should return the products we develop to them. He doesn’t know of a single case of someone getting scooped because he or she shared code. Rather, the benefits of sharing openly (establishing priority, visibility and goodwill, re-use and citations, feedback and bug-catching, and the moral high ground) outweigh the overplayed negatives.
      slides (PDF)

Discussion
After David’s presentation, Peter opened the floor for questions and discussion, and Kelle Cruz from Hunter College was ready! Kelle said that the AAS should require code release and then asked whether anyone from the AAS journals was present. No one was.

[Photo: a slide of discussion prompts that went unneeded]

We didn’t really need to prompt discussion; there was plenty to talk about! Photo, Meredith Rawls, used with permission

Kelle then suggested to Daniel Katz that the NSF should take a stronger role in enforcement. Dan said he would see what he could do to get astronomy reviewers training on what to look for, something he already does for his own area. David Hogg said there aren’t any mechanisms for long-term stewardship of software and asked whether the NSF was looking at this. Dan said it is not at this time, and that the NSF generally avoids long-term commitments of funds.

Someone in the back of the room pointed out that protecting code can also protect its errors, told a sad anecdote to illustrate that point, and commented that code sharing fosters improvements in coding practice. In response to a question about whether it is worthwhile to share very specific code, David answered yes, just post it; if it’s not useful to others, so what? But it just might be! And Benjamin Weiner suggested the code be put on GitHub.

Two questions came from someone else in the back of the room: one asked whether export control restrictions (ITAR) would be changing; the second relayed that PhD students write a code for their thesis but then protect it because, in their perception, the code makes them employable, and asked whether the panel had anything to say about that. Erik Tollerud made the point that people are hired for the skills that went into creating the code, not for a particular code. David replied that he has seen this with data: proprietary data does sometimes give someone leverage for employment. Dan answered the ITAR question by saying that changes in ITAR were probably not coming soon.

Another attendee asked what the cost of making code shareable actually is, and felt that the panelists had swept it under the rug. Ben replied that it’s a community problem: the community needs to reward sharing, and there needs to be a change in values. In the meantime, put the code out there anyway; clean it up if you can, but put it out. David agreed there are costs, but said the benefits are more substantial; the cost is not very large, and the upside is larger than the downside. Bruce thought it worth the effort to plan for sharing upfront, which saves time and money later on; this is harder if the code was not planned that way initially, but one should try to address it when possible.

Nuria Lorente, who was following the session from Australia through Twitter, tweeted that “NOT releasing code also comes at a price, which is often forgotten.”

Andrej Prsa from Villanova made a strong appeal to post code to arXiv; he stated that astro-ph should be open to other things besides preprints. Someone else pointed out that arXiv doesn’t necessarily agree. David said that he put the documentation for emcee, the MCMC Hammer, on arXiv, and that it gets cited. Erik pointed out that additional contributors to a software project such as Astropy don’t get credit if they are not on the author list of the paper uploaded to arXiv. Alberto Accomazzi from ADS mentioned that updating the author list on arXiv is a way to fix that and give others credit, even though the change will not be reflected in ADS.

Someone commented on the need for some sort of infrastructure to support code sharing. David commented that he wants all flowers to bloom, but that some flowers are more valuable than others. Erik said that better search engines will help over time; Astropy is more findable because of better search engines and because more people now link to it. It was also mentioned that with more code sharing, finding useful codes may become more difficult as the signal-to-noise ratio goes down.

Alberto Accomazzi brought up the uncertain provenance of code that has no license, and sometimes no author, attached to it, and stated that such code is hard to deal with because it cannot be shared. This was echoed by David, who pointed out that the lack of a license can prevent a code’s release. Bruce suggested that a licensing workshop would be a good idea, and the idea got traction among attendees. The recent re-licensing of yt was brought up. Dan Katz looks specifically for licensing information when reviewing proposals, and it’s clear to him that many people don’t know what they are doing here and could use guidance. David suggested that people use BSD or MIT licenses if they know nothing about licensing.

Peter Teuben then brought the discussion to an end and turned the podium over to Robert Hanisch for closing remarks.

Session wrapup
Robert Hanisch reiterated that software sharing is fundamental to the dissemination and validation of research results, and though there are carrots and sticks for software sharing, the sticks are not very strong. He also pointed out that nothing within the funding agencies offers support for software development and that there is a disconnect between national policy and implementation. Journals at best only encourage code release, too; they do not require it. A sociological change is necessary; in the meantime, he hopes those attending will just put codes out there! The benefits outweigh the costs.

He also talked about opportunities for change: as of Sunday, January 5, the Working Group on Astronomical Software has Frossie Economou as its new chair, and over the weekend the AAS Council suggested that the WGAS be elevated from a Working Group to a Division within the AAS. He had asked the Council to have the WGAS offer a prize specifically for software; though the Council did not accept the idea as presented, Bob noted that a Division can award prizes independently. Having a Division focused on software will also give software more visibility, and on this hopeful note, the session ended.

… though the discussion continues…

My thoughts (just a few)
This is the fourth discussion session the ASCL has arranged; previous sessions include one at AAS 221 and two at the previous two ADASS meetings. (Links to materials or discussion from previous sessions are below.)

I was glad to hear several of the presenters say that the concerns people have about releasing their codes are overplayed. I was particularly happy when David said that if people would only go ahead and release their imperfect software, others would see that released codes are also imperfect and feel emboldened to release their own imperfect work. Yes! Lose the fear, gain the codes! It really doesn’t need to be perfect; Nick Barnes, among others, has written eloquently, and amusingly, on this subject already. Astronomical software wants to be free; please release it, let it show!

It was hard for me to stay silent when the need for a code sharing infrastructure was mentioned, not because I disagree with the need (I believe it is very great!) but because the ASCL is trying hard to help meet it. I’ve looked at other similar efforts over the years: either they started, lived (usually briefly, though one even flowered), and died, or they still exist but are mostly silent, their code-sharing efforts dormant. The ASCL has been around since 1999, is indexed by ADS, and its use has been increasing. It’s not perfect, but it works and is actively growing.

I believe that science should be as transparent as possible, that code release (absent ITAR and other truly compelling reasons) even if only for examination, not reuse, is part of this transparency, and that ultimately, code release is better for code authors, especially if the astronomy community works together to make it better for them. Code sharing can make astronomy more efficient, too, which is especially important in the current financial climate.

Finally, I want to thank Peter for moderating the session, Bob for offering closing remarks, and the most excellent Ben, Bruce, Gary, Erik, Dan, and David for presenting at this session and also for not protesting even one time about the innumerable emails they received from me from May on. I also have to thank our wonderful volunteer whose name I did not get, alas, for her great work and for counting the 149 (!) attendees, the AAS for accepting the proposal in the first place, and the amazing people who sent this session literally around the world through their tweets. Thank you!

AAS 221 Astronomy Code Sharing? links
Announcement
Omar Laurino joins Astronomy Code Sharing panel
Brief blog post
Astronomy Computing Today post
Slides used at meeting: Google Doc  PDF

ADASS XXIII (2013) links
Announcement
Our eight questions
The eight questions that were discussed/links to discussion notes
Pre-print of proceedings paper

ADASS XXII (2012) links
Birds of a Feather session
Resources used/linked to for ADASS
Pre-print of proceedings paper

July and August 2013 additions to the ASCL

Twenty codes were added to the ASCL in July, and eighteen in August.

July:
AstroTaverna: Tool for Scientific Workflows in Astronomy
cosmoxi2d: Two-point galaxy correlation function calculation
CTI Correction Code
DustEM: Dust extinction and emission modelling
ETC++: Advanced Exposure-Time Calculations

FieldInf: Field Inflation exact integration routines
im2shape: Bayesian Galaxy Shape Estimation
ITERA: IDL Tool for Emission-line Ratio Analysis
K3Match: Point matching in 3D space
LENSVIEW: Resolved gravitational lens images modeling

MAH: Minimum Atmospheric Height
Monte Python: Monte Carlo code for CLASS in Python
NEST: Noble Element Simulation Technique
Obit: Radio Astronomy Data Handling
orbfit: Orbit fitting software

phoSim: Photon Simulator
PURIFY: Tools for radio-interferometric imaging
Shapelets: Image Modelling
SIMX: Event simulator
SOPT: Sparse OPTimisation

August:
APPSPACK: Asynchronous Parallel Pattern Search
BASIN: Beowulf Analysis Symbolic INterface
Ceph_code: Cepheid light-curves fitting
ChiantiPy: Python package for the CHIANTI atomic database
CReSyPS: Stellar population synthesis code

CRUSH: Comprehensive Reduction Utility for SHARC-2 (and more…)
GYRE: Stellar oscillation code
JHelioviewer: Visualization software for solar physics data
LensEnt2: Maximum-entropy weak lens reconstruction
LOSSCONE: Capture rates of stars by a supermassive black hole

MapCurvature: Map Projections
MoogStokes: Zeeman polarized radiative transfer
RADLite: Raytracer for infrared line spectra
SMILE: Orbital analysis and Schwarzschild modeling of triaxial stellar systems
SPEX: High-resolution cosmic X-ray spectra analysis

SYN++: Standalone SN spectrum synthesis
SYNAPPS: Forward-modeling of supernova spectroscopy data sets
THELI GUI: Optical, near- & mid-infrared imaging data reduction

Also in August, we added one very cool web resource, the NASA Exoplanet Archive.