Here we describe the various scripts we use for the NGC1333 and CLASSy project. Most of them are based on the Miriad package, but we also use some DS9, KARMA and NEMO. To simplify and organize our scripts, we have assembled a CVS module called MIS, in which all scripts (and the default parameter data to run them) are maintained. First we describe MIS, then the individual scripts. Related information can be found in the EGN pipeline, which is directly derived from the MIS pipeline described here.
MIS (Miriad Interferometry Singledish) toolkit
This is our CVS based module to organize all our reduction scripts, and to store the parameters important to them as well. We keep these in a script-independent way, so anybody can use csh, sh, python, fortran, C, C++, Java, etc. There is also a manual available, although you can also generate a new one from your own MIS version.
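As a quick illustration of that script independence: the parameters are kept as plain keyword=value text, so any of those languages can read them. Below is a minimal sh sketch; the file name pipepars.example and its values are invented for this example.

```shell
# Hypothetical parameter file in the plain keyword=value style MIS uses;
# the file name and values here are invented for illustration only.
cat > pipepars.example <<'EOF'
project=cx323.1E_89NGC133.17
ary=E
trial=1
EOF

# any language can parse this; here plain sh + sed pulls out one value
project=$(sed -n 's/^project=//p' pipepars.example)
echo "project is $project"
```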
If you want to grab a copy of MIS, for now use our CVS repository version (or use this tar file which you will need to cvs update):
cvs -d :pserver:$USER@cvs.astro.umd.edu:/home/cvsroot checkout mis
where $USER is your given username in CVS (or use anonymous if you only need to grab a copy for read-only use).
If you have never used CVS before, get your introduction from our CARMA pages here. If you need an account, you will need to email Peter your preferred username, and the hashed output of the password generator (details in that link) via the command
perl -e 'print crypt("my_password","b6"), "\n";'
and he had better not get the response b60j72OwWnjlk from you, since that would mean you literally used the example password my_password!
Once you have obtained MIS, get it ready for installation:
cd mis
./configure
source mis_start.csh
and you are in principle ready to use the MIS scripts. There are some additional powerful options to construct automated pipelines, but that is for another chapter.
Occasionally there will be updates. If you are not a developer, and just want to update MIS from what others have done, this should work in most cases:

mis -u

Its counterpart,

mis -i

is simply a shortcut to committing all your MIS changes back to CVS for others to pick up. Use that with caution, as it could commit more than you may have bargained for!
Not all scripts have been properly MIS-pipelined yet! Keywords followed by = can be given as input to the script; keywords preceded by = are set by the script.
- getdata: project= rawdata= ary= track= =cvis
- map_inttime: source= =np
- uvcatSD: offname= fix=
- reduceSD.csh: example of single dish data reduction (using sinbad/sinpoly/varmaps) - MWP
- do_reduceSD: the same script, but now MIStified as well as cleaned up.
- do_mapSD: creates the SD cubes
- do_uvcat1: cut the interferometer data down to just the 4 USB windows, and apply flags
- do_inspect1: inspect your data (you can edit to suit your needs)
- do_cal1: calibrate
- do_mos1: example mosaicing of just the UV data
- lincomb.csh: template script for combination of singledish and interferometric data using mossdi ("Stanimirovic method").
Example Session using MIS
A simple non-pipelined session could be as follows:
mis -q
mkdir data10
cd data10
pipepar -c rawdata=/home/teuben/carma/work/n1333/rawdata
getdata project=cx323.1E_89NGC133.17
getdata track=22 ary=E report
map_inttime
A more advanced session making the inttime map would be to first construct a pipeline consisting of getdata and map_inttime for all projects, and then use the piperun command to execute this.
piperun -c n1333.lis 'pipeline 2 getdata map_inttime > Pipefile; pipepar -c project=%s; pipe all'
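Conceptually, piperun loops over the project names listed in n1333.lis and substitutes each one for the %s in the quoted command template. A hand-rolled sh equivalent of that loop is sketched below; the two project names are invented examples.

```shell
# Emulate the piperun substitution loop described above: for every project
# listed in n1333.lis, run the command template with %s replaced by the
# project name.  The project names below are invented examples.
cat > n1333.lis <<'EOF'
cx323.1D_89NGC133.1
cx323.1E_89NGC133.17
EOF

while read project; do
    # a real run would execute: pipepar -c project=$project; pipe all
    echo "pipepar -c project=$project"
done < n1333.lis
```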
In each project directory you will then have an xyt.tab file for the next stage. For example, to get a listing of how many pointings each track had, use
piperun n1333.lis pipepar -v project -v np
and the inttime maps, in various formats (miriad, fits), are then created with map_inttime.
When new data come in
When new data come in, and they have been placed in the rawdata directory, the following MIS command will get them into your local (and hopefully otherwise empty) directory:
pipepar -c
getdata trial=1 ary=D report
where the ary=D is actually the default and not needed. Also, this will make a symlink; if you really must have a copy, add the link=0 keyword.
After this you can cut it down to its basic USB win4 (9,11,13,15) selection using

do_uvcat1

for interferometric data, and

do_uvcatSD

for single dish data. Go ahead and inspect it the way you like, and create a mis.uvflag file with entries such as
edge=2
tsys=500.
select=time(23:09:00.0,23:10:00.0),source(3c84)
select=ant(7),time(00:00:00.0,00:25:00.0)
select=ant(7) tsys=20.,100.
select=time(23:00:00.0,23:20:00.0),ant(16)
select=time(00:00:00.0,00:15:00.0),ant(16)
select=ant(17),win(2,3)
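One convenient way to build that file is with a heredoc; the entries below are a subset of the ones above, and counting the select= lines is a quick sanity check before flagging.

```shell
# Create a mis.uvflag file; each line holds uvflag-style keywords
# (edge=, tsys=, select=).  Entries are a subset of the example above.
cat > mis.uvflag <<'EOF'
edge=2
tsys=500.
select=time(23:09:00.0,23:10:00.0),source(3c84)
select=ant(7),time(00:00:00.0,00:25:00.0)
EOF

# quick sanity check: how many select= entries did we write?
grep -c '^select=' mis.uvflag
```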
You can formally flag this in pipeline fashion using

do_uvflag

and once you are happy with the flag file, save it:

pipesave
and if this is a new project, you will have to add it to CVS for others to benefit:
cd $MIS/def
cvs add cx323.1D_89NGC133.1
cvs add cx323.1D_89NGC133.1/mis.*
cvs commit
anybody else can now retrieve these using the command
in your data project directory. Any changes to these persistent pipe files need to be done using CVS, or, if you only have those files to worry about (and not code), you can use our shortcut

mis -i
Using CVS gives us a well-known tool to make sure we all agree on the parameters and the flag file(s), and whatever else we decide to save on a project-by-project (trial, really) basis.
For final combined mapping we'll have to create another project, since only one mis.def file can exist per project.
Single Dish Data Reduction
When new data come in, here is the standard list of things to do to reduce the Single Dish data (regardless of whether the interferometer people have already flagged things or not):
mis -u
mkdir ExampleD1
cd ExampleD1
pipepar -c
getdata trial=1 ary=D report
do_uvcatSD
do_uvflag vis=sddata
do_reduceSD
You can replace the last line by
do_reduceSD device=/null sleep=0
if you don't want do_reduceSD to prompt and display the data. The maps for each antenna are stored in SD/, and the pictures of all 23 antennas are stored per molecule (e.g. HCN.allants.gif).
If you want to just blanket remove an antenna from the maps, and discard it (this would be on top of the standard flagging) add badants, e.g.
but if you do, make sure this is saved in $MIS for others to pick up
pipesave
mis -i
Working from home/laptop
There's a potential pitfall if you want to work from home or on your laptop, since you may not have the rawdata directory mounted (although you can actually do this using fusefs). But getdata now has an scp option, with which you can fully fake being at work, except of course that symlinks don't work and you get a full copy of the data. Depending on your internet speed, getting the data this way could be slow. Here's an example:
mkdir D1
cd D1
pipepar -c
getdata trial=1 ary=D email@example.com
This will ensure that your rawdata= is properly pointing to the motherland. If not, mis -i would save a bad value to CVS and could mess up your colleagues.
Another option is to use the wget_getdata script to create a local clone of all the compressed tar files. Re-running this script will then also keep your local repository in sync with the master version.
Nightly processing
This script is supposed to run regularly (nightly); afterwards all calibrated and processed visibilities and single dish cubes should be found there for any subsequent mapping experiments. The script lives in $MIS/templates/n1333_nightly.csh and can easily be adapted to your needs. If you place a symlink from your $MIS/Nightly to the location where we keep this Nightly, others can reuse the processed data. This is currently /n/chara6/teuben/n1333/Nightly_latest. You can find a more complete description of the data products in Raw & Calibrated Data.
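If you need to schedule the nightly run yourself, a crontab entry along these lines would do; the 03:30 start time is arbitrary, and since cron does not expand $MIS the full path to your checkout must be spelled out (shown here as a placeholder):

```
# example crontab entry: run the MIS nightly script every day at 03:30
30 3 * * * /bin/csh /path/to/mis/templates/n1333_nightly.csh
```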