Software

ASKAP Science Data Processor software - ASKAPsoft Version 0.23.0

Commonwealth Scientific and Industrial Research Organisation
Guzman, Juan; Whiting, Matthew; Voronkov, Max; Mitchell, Daniel; Ord, Stephen; Collins, Daniel; Marquarding, Malte; Lahur, Paulus; Maher, Tony; Van Diepen, Ger; Bannister, Keith; Wu, Xinyu; Lenc, Emil; Khoo, Jonathan; Bastholm, Eric

Licence & Rights:

Open Licence (GPL)

GPLv3 Licence with CSIRO Disclaimer
https://research.csiro.au/dap/licences/gplv3-licence-with-csiro-disclaimer/

Data is accessible online and may be reused in accordance with licence conditions

All Rights (including copyright) CSIRO 2018.

Access:

Open

Accessible for free

Brief description

ASKAPsoft, the ASKAP Science Data Processor, provides data processing functionality, including:

* Calibration
* Spectral line imaging
* Continuum imaging
* Source detection and generation of source catalogues
* Transient detection

ASKAPsoft is developed as part of the Science Data Processor component of the CSIRO Australian Square Kilometre Array Pathfinder (ASKAP). It is a key component of the ASKAP system: the primary software for storing and processing raw data, and for initiating the archiving of the resulting science data products into the data archive (CASDA).

The processing pipelines within ASKAPsoft are largely written in C++, built on top of casacore and other third-party libraries. The software is designed to be parallelised, where possible, for performance.

ASKAPsoft is designed to be built and run in a standard Unix/Linux environment, and its core dependencies must be provided by the platform. These include, but are not limited to, a C/C++/Fortran compiler, Make, Python 2.7, Java 7 and MPI. More specific dependencies are downloaded by the ASKAPsoft build system and installed within the ASKAPsoft development tree. On the Debian platform specifically, after a standard installation of Debian Wheezy (7.x), the following packages need to be installed with apt-get (an example command follows the list):

* g++
* gfortran
* openjdk-7-jdk
* python-dev
* flex
* bison
* openmpi-bin
* libopenmpi-dev
* libfreetype6-dev
* libpng12-dev
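
As a purely illustrative sketch (the package names are taken from the list above; the exact invocation may vary with your environment), the prerequisites can be installed in a single apt-get call, run as root or via sudo:

    # Install the Debian Wheezy (7.x) build prerequisites listed above.
    apt-get update
    apt-get install g++ gfortran openjdk-7-jdk python-dev flex bison \
        openmpi-bin libopenmpi-dev libfreetype6-dev libpng12-dev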

More information on building, installing and running the software can be found in the README file at the root of the file structure that forms this collection.

Source code can be accessed via the links in the Related Materials section.

-----
A major release, addressing a number of issues with the processing software and the pipeline scripts.

Pipelines:

* When multiple raw MSs are provided for a given beam (split up by
  frequency range), the pipeline now recognises this, merges them
  (after any necessary splitting), and handles all required metadata
  appropriately. The behaviour should be the same regardless of the
  structure of the raw data.
* The selfcal job allocation (for the sbatch call) has been altered
to request a number of nodes, rather than cores +
cores-per-node. This should provide more predictable allocations.
* The weights cutoff parameter given to Selavy is now fully
consistent with the linmos cutoff.
* Fixed a bug that meant the raw data was overwritten when
calibration was applied, even when KEEP_RAW_AV_MS=true.
* The TELESCOP keyword is now added to the FITS headers.
* A bug was fixed that was preventing the full-resolution MSs from
  being included in the CASDA upload.
* New parameters SPECTRAL_IMAGE_MAXUV and SPECTRAL_IMAGE_MINUV allow
  control over the UV distances passed to the spectral imager (see
  the configuration sketch after this list).
* Various improvements to the gatherStats job: it will still run
  after the killAll script has been called, and it now checks for the
  pipeline-errors directory before trying to use it.
* The cubeStats script is now more robust against the failure of a
  single process, so that it doesn't hang but instead carries on as
  best it can.
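
The pipeline parameters mentioned above are set in the user's pipeline configuration file. A minimal sketch is given below, assuming the KEY=value format used elsewhere in these notes (e.g. KEEP_RAW_AV_MS=true); the numeric values are placeholders for illustration only, not recommendations:

    # Hypothetical pipeline configuration excerpt (placeholder values).
    # Keep the raw averaged MS when calibration is applied.
    KEEP_RAW_AV_MS=true
    # New in this release: UV-distance limits passed to the spectral
    # imager (values below are illustrative only).
    SPECTRAL_IMAGE_MINUV=200
    SPECTRAL_IMAGE_MAXUV=6000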


Processing:

* Imaging:
- Fixed a coordinate shift seen in spectral imaging, caused by a
  different direction being provided by the advise functionality.

* Calibration:
- Efficiency improvements to speed up ccalapply.

* Utilities:
- The maximum cache size in mssplit has been adjusted to avoid
  out-of-memory issues.
- The pointing table in MSs produced by msconcat is now trimmed, so
  that very large tables do not result.

* Selavy:
- The restoring beam is now written into the component maps.
- A significant change to the handling of the initial estimates for
  the Gaussian fits, making the fitting more robust and avoiding
  downstream WCS errors that were hampering the analysis.
- Minor fixes to the component and HI catalogues.
- Fixed segfaults in selfcal (3145).

Available: 2019-02-06

Data time period: 2018-12-10 to ..

This dataset is part of a larger collection

Subjects

ASKAP; science data processor; pipeline; radio astronomy; software; data reduction; Astronomical sciences not elsewhere classified; Astronomical sciences; PHYSICAL SCIENCES

Identifiers

DOI: https://doi.org/10.25919/5c5a4f42c5075