The CASA team envisions a program to reimplement CASA, informally known as ngCASA, which depends on a separate effort known as CASA Next Generation Infrastructure (CNGI). CNGI aims to "...condense and replace the CASA data processing software infrastructure and casacore code base only with a new and functionally equivalent package."
The CNGI effort previously completed Software and Framework trade studies. The study results identified technology to evaluate in an initial time-boxed prototyping effort; the prototyping is currently in progress.
This Conceptual Design Review (CoDR) will review the outline of the reimplementation program, analyze the CNGI technology down-select process and conclusions, and evaluate the prototyping status and results.
This first ngCASA review will be internal to the NRAO; future reviews will be external.
The full charge document can be found here.
Member | Role | Charge Mapping
---|---|---
M. Whitehead | DMS Software Architect, Panel Chair | All
P. Jagannathan | Algorithm and Imaging Specialist | Design, Trade Study, Framework Study, SE Practices
R. Rosen | ALMA ICT/NA Lead | Transition Plan, Resource Allocation Plan, Schedule, SE Practices
A. Kepley | ALMA/CASA Subsystem Scientist | Roadmap, Schedule
R. Hiriart | ngVLA CSW IPT Lead | Trade Study, Framework Study, Design
R. Farnsworth | Assistant Director, Project Management Department | Roadmap, Transition Plan, Schedule, Resource Allocation Plan
M. Pokorny | HPC Software | Roadmap, Design, Trade Study, Framework Study
J. Robnett | HPC/HTC Software | Roadmap, Design, Transition Plan, Schedule, Resource Allocation Plan
J. Marvil | VLA/CASA Subsystem Scientist | Roadmap, Design, Transition Plan, Schedule, Resource Allocation Plan
B. Butler | VLA/VLBA Science Support | Roadmap, Design, Transition Plan, Schedule, Resource Allocation Plan
J. Kern | Director SRDP | All
J. Masters | Pipeline Lead | Roadmap, Design, Transition Plan, Schedule, Resource Allocation Plan
1. Do the phases in the roadmap appear to be correct? Are the “gates” between the phases, including ready-to-proceed criteria, clear?
2. Does the planned allocation of resources between CASA and ngCASA appear to be sensible to allow the reimplementation to proceed at a reasonable pace while providing ALMA and the VLA with the essential support they need, including some improved capabilities? Relatedly, does the transition plan between CASA and ngCASA appear to be realistic, including the criteria by which CASA will transition to an “important bugs only” maintenance phase and then an end-of-life status?
3. While the schedule at this point is not detailed and is based on engineering estimates (i.e., guesses), does it nevertheless appear to be plausible?
4. Are the CNGI requirements and specifications realistic and sufficiently detailed for this stage of the process?
5. Was the process that resulted in the CNGI technology down-select suitable? Were there any important omissions in either the process or the initial ensemble of technologies that would likely have resulted in a different technology selection? If the committee concludes that the result was incorrect, please provide a detailed explanation (i.e., not just a personal technical bias).
6. Do the planned prototypes: a) appear sufficiently realistic that they will be relevant for estimating the work required for the overall CASA reimplementation; b) cover sufficient data set sizes and hardware installations (single node, single core, up to many nodes) that we can be confident realistic performance conclusions can be drawn; and c) suffice to demonstrate that the functional definition of CNGI will support the level of complexity required by the scientific applications that will be built on it?
7. Are the prototype success metrics clear, especially as they relate to: a) code maintainability and extensibility; b) performance vs. current CASA on a single node, with single and multiple cores; and c) scalability to large numbers of nodes?
8. Are the software engineering aspects of the CNGI defined sufficiently to provide a good base for ongoing development, including version control; testing, including testing requirements and an automated testing framework; and documentation?
9. Is there any other work that was not listed that should be carried out to be available in time for the next review?
Note: Documentation and Charge mappings can be found here.
Date (2020) | Activity | Responsible
---|---|---
May 20 | Document Delivery to the committee; every document should have at least three readers | Whitehead, Raba |
May 21 - Jun 03 | RID Entry and Discussion | Committee |
Jun 03 - 09 | RID Disposition | Project Team |
Jun 09 | Presentations delivered to committee | Whitehead, Raba |
Jun 10-11 | Review meeting; preliminary assessment at close of meeting | All |
Jun 24 | Chair provides Panel report to CASA lead, DMS AD | Whitehead |
Jul 1 | CASA Lead provides report response to Committee, DMS AD; publicly distributed thereafter | Whitehead, Raba |
Prior to the review meeting, committee members identify review item discrepancies (RIDs). The committee chair approves each RID and forwards it to the project team for comment. Approximately one week prior to the review meeting, the review committee chair identifies RIDs that have not reached resolution and places them on the agenda for the review meeting. This methodology facilitates iteration between the review committee and the project team prior to the in-person meeting; during these interactions, misunderstandings and non-controversial findings can be dealt with, allowing the review meeting to focus on discussion of critical issues or disagreements.
The standard RID workflow, adopted from ECSS-M-ST-10-01C (Organization and Conduct of Reviews; 15 November 2008), is defined here. The review meeting should focus on presentations and discussions designed to bring closure to open discrepancies (RIDs). We anticipate a two-day meeting, during which time should be provided for discussion between the committee and the project members. Each working session or day shall end with a restricted meeting of the committee, during which each member shall debrief on the status of the problems identified.
For questions that cannot be answered prior to or during the meeting, ‘Action Items’ shall be defined, including the due date and the organization responsible for performing the action. Each Action Item shall be identified as critical or non-critical. Action Items and RIDs shall be reviewed prior to the end of the meeting.
The Review Committee Chair shall:
The Review Committee members shall, under the authority of the Review Committee Chair:
The RID process described above is implemented using the NRAO instance of the Atlassian Jira package (open-jira.nrao.edu). The package is used to track and mediate communication on the review items prior to the Review Meeting, as well as after-review actions recommended by the committee.
The Jira workflow for review items is defined here. Members of the review committee open discrepancies, supplying a description of the discrepancy and a suggested solution. Discrepancies can be judged by the reporter as major or minor, as differentiated by the workflow in Figure 1. The TTAT Project Manager will review the RIDs for duplication and assign each RID to the appropriate party, transitioning the issue to the “In Progress” state.
Once the project has prepared a suitable response, the ticket is transitioned to the “In Review” state and returned to the original reporter.
At this point one of four actions may be taken:
The completion of the review is defined as resolution of all major RIDs and critical action items as listed in Appendix A of the review report.