By Telcon: A. Cool, J. Mohr, G. Taylor
Members Absent: M. Arnaud, K. Koyama, F. Paerels
Others: A. Tennant, H. Tananbaum, B. Wilkes, R. Brissenden, R. Harnden, F. Seward, A. Prestwich
Manager's Report (Roger Brissenden):
Roger reported that the S/C is still performing superbly, operations
are routine, and there have been no significant hardware problems
since the last meeting. It has been noticed that certain S/C
components (e.g., the EPHIN, propulsion lines, and SIM motors) have
been running hotter. This issue is under study; currently it is at the
warning level and is not yet impacting operations. Consumable usage
(e.g., MUPS fuel, MUPS warm start counts) is under control. Observing
efficiency remains high, approaching 70%. Data processing has improved
somewhat, with an average time of 5-6 days between observation and
data delivery. MSFC is pursuing a sole-source approach for the CXC contract after
August 2003. Roger will present details of the extended phase budget at
the next CUC meeting. Harvey mentioned that the plan is to keep the GO
funding at the same level during the extended phase.
(1) The committee strongly endorses retaining the same GO funding level (in real dollar terms) during the extended mission phase.
As always, the committee greatly appreciates Roger's reports on status, operations, expendables usage, observing efficiency, observations, data systems, data delivery, and grant awards.
Near the end of Roger's presentation, a discussion arose about the change in selecting official (from Anne Kinney for cycle 4 to Harvey Tananbaum for cycle 5) and whether there would or should be any associated changes in policy. For example, previous policy frowned on cuts to exposure time for individual targets.
(2) In the absence of any specific proposed changes to the selection process, we encourage continuing the usual instructions to the peer review. Furthermore, the scientific community respects and accepts the peer review process; therefore, in general the committee would prefer that the selecting official not modify the results of the peer review.
DDT Report (Harvey Tananbaum):
This was Harvey's final report for cycle 3. He used 861 ks out of
1000 ks available. For cycle 4 the director has 1000 ks of
discretionary time, but in cycle 5 this will be reduced to 700
ks. Harvey was asked what fraction of director's discretionary time
requests are rejected. Although he does not track this, he estimates
about 50%.
Plans for Cycle 5 Peer Review (Fred Seward):
Fred reported on plans for the cycle 5 review. One important change
is that proposers must explicitly designate each observing constraint
as either a requirement or a preference. The scheme for disguising GTO
proposals that compete for an individual target with a GO proposal was
reviewed.
(3) We endorse the plan for disguising GTO proposals that compete for
an individual target with a GO proposal and look forward to hearing
from Fred on how it works out. Identification of the GTO proposals
should be done as late in the review process as possible.
Technical and organizational aspects of the upcoming cycle 5 peer review were presented.
(4) Requiring preliminary grades from reviewers 5 days before the review is a good idea, but the CXC should realize that some reviewers will still wait until the last minute to turn in grades. A day or two should be sufficient to compile the preliminary grades before the panel meetings.
(5) The CXC should instruct chairs separately before the panels meet on useful approaches to running a peer review panel.
Fred asked for advice on whether the committee supported a change to evaluate Large (LP) and Very Large Project (VLP) proposals by two panels in the same topic area.
(6) The committee endorses the idea of evaluating Large (LP) and Very Large Project (VLP) proposals by two panels in the same topic area. The committee also makes the following recommendations regarding the review of VLPs: (a) the merging panel should be instructed that it can accept less than the set-aside amount of time for VLPs, and (b) the merging panel should be allowed to balance gray-area proposals in all categories: VLP, LP, and regular proposals.
Fred also floated an idea to limit the number of Chandra projects awarded to successful proposers, with the goal of increasing the number of different GOs participating in Chandra each round. This was purely for discussion.
(7) The committee emphatically rejects this suggestion.
Fred reviewed the fair-share formula. He noted that the holdback reserve for proposals requesting more than the fair-share amount was 4%.
(8) Overall, the committee thought the fair-share formula and holdback reserve were fine, with the following exception for VLP projects.
(9) The CUC believes that VLP projects should be funded well. A VLP with a single 1000 ks observation would have a fair-share amount, under the usual formula, of $150,000, which the committee does not believe is sufficient for these showcase projects. If instead one were to apportion the total GO budget ($10M) in proportion to observing time (a fraction of 0.05), the amount would be $500,000. An appropriate funding level lies between these two extremes. The CXC should develop a formula for funding VLP projects.
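One illustrative possibility, sketched below in Python, is a geometric-mean interpolation between the two extremes. Only the $150K fair-share and $500K proportional anchors come from the discussion above; the interpolation rule and all names in the sketch are assumptions for illustration, not an adopted CXC formula.

    # Hypothetical VLP funding sketch; the geometric-mean rule is illustrative only.
    import math

    FAIR_SHARE = 150_000     # usual fair-share amount for a 1000 ks VLP ($)
    GO_BUDGET = 10_000_000   # total GO budget ($)
    TIME_FRACTION = 0.05     # VLP's assumed fraction of observing time

    proportional = GO_BUDGET * TIME_FRACTION          # $500,000
    vlp_award = math.sqrt(FAIR_SHARE * proportional)  # ~$274,000, between the extremes

    print(f"fair share:   ${FAIR_SHARE:,.0f}")
    print(f"proportional: ${proportional:,.0f}")
    print(f"illustrative VLP award: ${vlp_award:,.0f}")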
Julian Krolik took an action to send Belinda e-mail lists for
advertising the Chandra theory proposal opportunity for cycle
5. [Done]
Metrics for LP Assessment (Belinda Wilkes):
Belinda discussed metrics to assess the scientific productivity of
Chandra programs by reviewing the HST experience. STScI uses two
basic metrics: refereed journal papers and citations to those
papers. Belinda noted several relevant features. There is an average
two-year delay between observation and published paper. Citations are
delayed even longer, by more than three years. It took about 7 years
from launch before HST papers reached a steady state. Belinda
described current activities and plans to accumulate the data required
for Chandra, which is going to be a labor-intensive task. Given the
long time delays, it will take correspondingly long before these
scientific productivity metrics can inform changes to the program.
(10) This approach looks good to the CUC and the CXC should begin collecting data as soon as possible. Even relative figures of merit between different programs, scaled to the same time since program inception, would be useful.
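As a minimal sketch of what such a relative figure of merit might look like, the following Python fragment normalizes refereed-paper counts by program age. The program names and counts are invented, and papers-per-year is one simple scaling choice rather than an adopted CXC metric.

    # Hypothetical age-normalized productivity comparison (illustrative data only).
    programs = {
        # name: (refereed papers to date, years since program inception)
        "Program A": (40, 2.0),
        "Program B": (90, 3.5),
    }

    for name, (papers, years) in programs.items():
        print(f"{name}: {papers / years:.1f} refereed papers/year")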
WebChaSeR: New Release (Sherry Winkelman):
Sherry demoed the latest version of WebChaSeR.
(11) The CUC was very impressed with the capabilities and
functionality of WebChaSeR, and the demo was great! Several members of
the CUC noted that HST data take a while (a couple of days) to
download from the HST archive website because the data are
re-processed on the fly with the current calibrations. On-the-fly
reprocessing with current calibrations is an enhancement that the CXC
might want to consider developing.
E&PO process for Cycle 5 (Kathy Lestition):
Kathy reviewed the cycle 4 E&PO peer review. Submissions doubled
relative to cycle 3: twelve individual and nine institutional
proposals. One-third of the proposals were recommended for
funding, for a total of $198K. Plans for the cycle 5 E&PO
peer review were outlined.
(12) The CUC has no substantial comments on the rules for E&PO proposals for the CfP next year and recommends that the team carry on as before. It would be helpful to publicize more information about the types of proposals accepted; for example, the abstracts should be made available, if they are not already.
Kathy summarized staff hires in process. Positions for an SSU scientist, a research assistant, an administrator, and an education specialist all need to be filled (as of mid-December 2002).
Don Kniffen noted that 8 of the top 10 stories in OSS for the year (2002) were in space science: 4 from HST and 4 from Chandra.
Membership Issues:
Various membership issues were discussed including a change to the
date when members rotate off (now to be the end of summer), the best
time for the winter meeting (mid-January is probably better than
mid-December), and the length of time that the chair should serve.
(13) The committee supports the idea that the chair serve for 2 years after 2 years of service on the committee.
(14) The CUC believes that the committee should continue to have a mix of about half its membership from large NASA/NSF centers and half "general" users from the University/academic community.
Report on Calibration Workshop (Hank Donnelly):
An extremely brief presentation was given by Hank. The CUC was told,
basically, to look at the proceedings of the workshop on the web.
(15) The Calibration Workshop is a great idea. The CUC fully supports
this effort, encourages the CXC cal group to run it annually, and CUC
members will try to attend on a regular basis.
However, the CUC was quite disappointed with this presentation, since we were actually looking for content and insight into the calibration process. Here are some questions that committee members thought should have been addressed. What major calibration issues were raised during the workshop? What issues, if any, were resolved? Was a roadmap for future progress developed? Were any recommendations made? What changes to the calibration plan are needed? How will the CXC cal group implement such changes?
Because the presentation failed to summarize the content of the calibration workshop, the CUC is concerned that the calibration group has failed to extract useful and important information from the workshop. Was there a post-workshop review by the CXC cal group and, if so, what impact did the workshop have on the calibration program? Future presentations by the cal group must convince the CUC that program resources (i.e., funding and observing time) are being utilized effectively and efficiently. A bulleted list of generic information is not sufficient.
Calibration Status, Plans for Cycle 5 (Larry David):
Larry presented the AO3 calibration matrix. This lists detector,
targets, observation frequency, exposure time, observer, objective and
products. Examples of objectives include: "monitor and
measure the QE," "measure effects of CTI on gain and spectral
resolution," "measure the low energy QE uniformity," and so on. Note
that no numerical requirements or goals on the accuracy of measured
calibration quantities were given. This presentation engendered quite
a lot of discussion.
(16) The calibration presentations by the CXC cal group and subsequent discussions with the CUC have provided no rational basis for assessing whether the amount of calibration time is sufficient or excessive. The lack of quantitative requirements and goals for measured calibration quantities is unacceptable. The "seat of the pants" approach applied previously (as exemplified by the HRC-I observation of Vela for measuring the low energy QE uniformity, which now "doesn't look promising") cannot continue. The CUC wants to be provided with specific numerical values, including both requirements and goals, for calibration quantities, along with a narrative that briefly explains how one goes from a particular observation to the calibration quantity. To be quite clear and avoid misunderstanding, we request a brief text document (of order 20 pages) that addresses all calibration issues, e.g., HRMA effective area; gain, QE, resolution, etc. for all ACIS chips; point spread function; astrometric precision; and so on. It should provide the current calibration accuracy, the requirements and goals on accuracy, and the plan to get there from specific calibration observations (e.g., someone analyzes some data set to get some result with errors that constrains some aspect of calibration to the required level of precision). Ideally the document should be oriented toward the viewpoint of users (i.e., what calibrations do I need to do an analysis?). This document should be posted on the web and updated regularly.
Calibration Time in Cycle 5 (Harvey Tananbaum):
Harvey reported that the amount of calibration time in cycle 5 will be
increased to 1 Ms. In the absence of specific numerical goals and
requirements on calibration accuracy and a clear calibration plan, the
CUC is unable to assess whether this request is appropriate, too
much, or too little.
Update on Low Energy QE (Paul Plucinsky):
Paul reviewed the current state of knowledge regarding the decrease in
ACIS low energy sensitivity, the application of effective area
correction software to celestial sources, and areas for future work.
Paul also mentioned the possibility of an ACIS bake-out scenario to
boil off the contamination.
(17) Based on the information at hand, the CUC is not qualified to make an informed recommendation on whether or not to proceed with a bake-out of ACIS. The CUC would like to be informed if this becomes a serious possibility and would be prepared to provide whatever input it can.
Paul asked for feedback from the CUC on prioritizing several issues: improving the gain of S3 below 800 eV, updating the edge energies and EXAFS in the absorption model, producing a resolution matrix for S1, and producing matrices for T = -110 C.
(18) The CUC believes that prioritization at this level should be part of calibration planning.
(19) In light of the low energy QE degradation on ACIS, the CXC should take a serious look at its SI trending and monitoring program. The CUC would like to see the current implementation of the monitoring and trending plan for SI characterization. This should be presented at the June 2003 CUC meeting.
CIAO Update:
In general, Chandra analysis software seems to be in pretty good shape
with no major issues to report.
(20) The CUC is pleased that CTI correction software is now implemented
as part of CIAO.
(21) For the next upgrade of ChaRT, the team should consider eliminating
the need for users to run MARX. This would make ChaRT much easier to use.
Martin Elvis passed out draft copies of the software survey, in response
to an earlier request from this committee.
(22) We thank the CXC software group for their attention to our
request. Individual members of the CUC will provide comments on the
survey directly to Martin Elvis.