- Someone with casaadm account credentials (Drew or Karlee) copies the release candidate into /home/casa/packages/pipeline.
- Rename (or remove) the old test symbolic link to a versioned name, e.g. test_5.6.1.
- They then make a new test symbolic link pointing at the release candidate: ln -sf /home/casa/packages/pipeline/<my new casa package> test.
- The pipeline is set to run on incoming data by setting the ciplRunState variable to RUN in /home/casa/capo/dsoc-test.properties.
- The test pipeline is run in parallel with the production pipeline for about one week, and the results from the two are compared.
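The symlink swap in the steps above can be sketched as follows. To keep the sketch runnable anywhere, it operates in a temporary sandbox directory; on the real system the base directory is /home/casa/packages/pipeline, and the package and version names below are hypothetical stand-ins:

```shell
# Sandbox stand-in for /home/casa/packages/pipeline (hypothetical names).
base=$(mktemp -d)
mkdir -p "$base/casa-pipeline-new"            # stand-in for the new release candidate
ln -s "$base/casa-pipeline-old" "$base/test"  # stand-in for the existing 'test' link

# Step 1: preserve the old 'test' link under a versioned name, e.g. test_5.6.1
mv "$base/test" "$base/test_5.6.1"

# Step 2: point 'test' at the newly copied package
ln -sf "$base/casa-pipeline-new" "$base/test"

readlink "$base/test"   # now resolves to the new package
```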
Test pipeline (workspaces)
- As above, but note that VLASS SECI has its own CasaVersion property in the dsoc-xxxx.properties CAPO file, edu.nrao.workspaces.ProcessingSettings.CasaVersion.vlassSeci, which you may or may not wish to change.
- Workspaces has a separate CAPO property for the RUN/PAUSE/STOP of automated calibration: edu.nrao.workspaces.StandardCalibrationSettings.runState
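Taken together, the CAPO properties named on this page would appear in the properties file roughly as in the fragment below. This is an illustrative sketch only: the values shown are examples, and the CasaVersion value is a placeholder, not a real version string.

```properties
# /home/casa/capo/dsoc-test.properties (fragment; values illustrative)

# CIPL pipeline: RUN to process incoming data
ciplRunState = RUN

# Workspaces: CASA version used specifically by VLASS SECI (placeholder value)
edu.nrao.workspaces.ProcessingSettings.CasaVersion.vlassSeci = <casa version>

# Workspaces: RUN/PAUSE/STOP for automated standard calibration
edu.nrao.workspaces.StandardCalibrationSettings.runState = RUN
```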
7 Comments
Mark Lacy
For 6.1.1 the pipeline fails to execute offline due to a Python path issue, so testing needs to be done via runs from the command line. Furthermore, the method of invoking the hifv recipe has changed in 6.x (PIPE-813). The easiest way to run it from the CASA command line (as recommended by John) is probably:
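The code block from the original comment did not survive the page export. Based on the surrounding description (the "recipe reducer" named below, the list-valued ASDM argument, and the weblog landing in a procedure_hifv sub-directory), the invocation was likely along these lines; mySDM is a placeholder, and this must be run inside a CASA 6.x session with the pipeline, so treat it as a sketch rather than a verified command:

```python
# Run inside a CASA 6.x session with the pipeline; not a standalone script.
# 'mySDM' is a placeholder for the ASDM name in the current (empty) directory.
import pipeline.recipereducer

# vis must be a list, even for a single ASDM; the weblog appears under a
# procedure_hifv sub-directory of the working directory.
pipeline.recipereducer.reduce(vis=['mySDM'], procedure='procedure_hifv.xml')
```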
Note that when run like this the weblog appears in a "procedure_hifv" sub-directory. The square brackets are very important - the asdm must be given as a list, even with only 1 element, otherwise bad things happen. Also, each run needs to be started in an empty directory (i.e. not one containing a previous pipeline run), otherwise the pipeline will attempt to append existing calibrations from the previous run.
Aaron Lawson
For the 6.1.1 pipeline, will it be alright to run these as vlapipe in /lustre/aoc/cluster/pipeline/dsoc-test/qa2/<JobID> or do we need to do this in some other area on lustre, such as our sciops areas? Also, do we know if there is some kind of special directory structure this method expects? (similar to how casa_piperestorescript.py expects to see a products/ rawdata/ and working/ directories)
Mark Lacy
I would run these elsewhere in lustre, just to separate them from the regular tests - your sciops area would be fine provided we can all see it to review the weblogs. This method has no special directory structure, you can run it on an SDM in the same directory if you want. I'm going to wait until my test run has finished though before setting you loose on these tests in case I find more issues...
Mark Lacy
Ok, so I have gotten things to run using the recipe reducer as listed in the comments above, so feel free to pick a dataset and have a go once you get the chance. I did an L+C Multiband from project SC1006 which seemed to work well. Talking to Drew we think one per band (and maybe one more Multiband) should be enough to validate the pipeline.
Drew Medlin
We have the standard Band test we could use. Data are in /lustre/aoc/sciops/dmedlin/pipeline/pl_data/bands/. I'd suggest we have four SRDP DAs, and we can each take two.
Mark Lacy
OK, sounds like a good plan.
Drew Medlin
It was discovered that John T. had already done the band tests, so we are processing some recently observed datasets to see if the results match the current production pipeline.