Astrophysics Source Code Library

ASCL Code Record

[submitted] AstroPT

AstroPT is an autoregressive pretrained transformer developed with astronomical use cases in mind. We have trained a selection of foundation models of increasing size, from 1 million to 2.1 billion parameters, on DESI Legacy Survey JPEG imagery, and find that AstroPT follows a similar saturating log-log scaling law to textual models. We also find that the models' performance on downstream tasks, as measured by linear probing, improves with model size up to the model parameter saturation point. We believe that collaborative community development paves the best route towards realising an open-source 'Large Observation Model': a model trained on data taken from the observational sciences at the scale seen in natural language processing. To this end, we release the source code, weights, and dataset for AstroPT under the MIT license, and invite potential collaborators to join us in collectively building and researching these models.
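For readers unfamiliar with linear probing, the sketch below illustrates the general idea only: embeddings are taken from a frozen pretrained model and a simple linear classifier is fit on top of them, so the downstream score reflects the quality of the learned representations rather than any fine-tuning. This is not the AstroPT evaluation code; the embedding array, labels, and sizes here are synthetic stand-ins, and the actual pipeline is described in the linked paper and repository.

    # Minimal linear-probing sketch (illustrative only; not the AstroPT evaluation code).
    # Embeddings would normally come from a frozen pretrained model; here they are
    # random stand-ins so the example runs without model weights or survey data.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Pretend each of 1000 galaxy images was encoded into a 768-dimensional embedding.
    embeddings = rng.normal(size=(1000, 768))
    labels = rng.integers(0, 2, size=1000)  # e.g. a hypothetical binary morphology label

    X_train, X_test, y_train, y_test = train_test_split(
        embeddings, labels, test_size=0.2, random_state=0
    )

    # The "probe" is just a linear classifier; the upstream model stays frozen.
    probe = LogisticRegression(max_iter=1000)
    probe.fit(X_train, y_train)

    print("probe accuracy:", accuracy_score(y_test, probe.predict(X_test)))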

Code site:
https://github.com/Smith42/astroPT
https://huggingface.co/collections/Smith42/astropt-67b48119dab5123a2e3d072e
Preferred citation method:
https://arxiv.org/abs/2405.14930

