Astronomer’s Proposal Tools (APT) Project Management Plan

Tony Krueger

June 6, 2000

 

1.      Project Background

Originally, Hubble proposals were submitted using the Remote Proposal Submission System (RPSS). This submission system was designed in the mid-1980s and represented the state of technology and experience in handling "service mode" observations at the time. It provided the bare minimum of user support, for example, checking for syntax or spelling errors and some illegal configurations. When received at STScI, these proposals were processed to determine feasibility and schedulability. It was at this stage that most problems were discovered, and manual intervention by operations staff was necessary. RPSS, which was used until 1994, was neither conducive to efficient user support strategies nor friendly to observers or observatory staff. After three years of proposal preparation experience, an effort was initiated to improve upon the RPSS process, which led to the release of RPS2 for Cycle 5 (1995) observing. RPS2 was designed to further two major goals:

        Improve the quality of proposals at submission time and thereby decrease the need for observers and operations staff to iterate. This was achieved by making some of the telescope operations constraints available to observers when they prepared their programs.

        Make routine the process of updating proposals after submission for scientific or operational reasons. This was achieved by dividing proposals into "visits" which can be independently planned and scheduled.

RPS2 was implemented using client/server technology that processed a proposal in batch and then graphically displayed the results to the observer. In developing RPS2, we opted for the modest RPS2 architecture instead of a full interactive environment because of the shortcomings of some of STScI's legacy systems and our tight delivery schedule; a fully interactive system was not cost effective in 1994. The vast majority of proposals are submitted today without feasibility or schedulability errors because observers are given sufficient information to remove these errors during proposal development. However, there are major areas with potential for improvement.

In the last three years, technological advances such as the widespread use of the Internet, multi-platform visual development tools, and overall increases in the power of desktop hardware have allowed significant improvements in the user support tools an observatory can provide. APT is envisioned as an integrated environment that will:

        Leverage state-of-the-art technologies;

        Provide modern user support tools;

        Achieve the goals stated in the Project Goals section below.

For the past two years there has been a collaborative effort (Scientist's Expert Assistant - SEA) between Goddard Space Flight Center and STScI to prototype visual and expert system technologies and explore how they can help an observatory achieve the following goals in proposal preparation:

        Reduce the amount of observatory staff support needed for routine observations.

        Increase the self-sufficiency of observers (i.e., make it easier for observers to create accurate, feasible observations).

In August 1999, STScI put together a working group to develop a vision and initial development plan for making APT a reality at STScI. The result of this working group was the white paper “APT: HST Proposal Preparation Environment for the Second Decade”1. This document builds upon the ideas in the white paper and lays out the development and management plan for APT.

In January 2000, STScI started an effort called APT with the goal of enhancing the SEA prototype into a system that can operationally support the HST user community. The SEA team will continue its research to investigate image simulation while STScI focuses on providing APT to the HST user community.

2.      Project Goals & Vision

Today, an HST user must interact with three separate software systems at STScI, all of which look and work differently. For Phase 1, the PI uses Web-based Exposure Time Calculators and LaTeX forms. For Phase 2, they use RPS2 to prepare their programs. Finally, for archival research, they use a tool called STARVIEW 2. APT will incorporate these various tools into one HST user support environment. APT is envisioned as an integrated tools environment that will achieve the following goals.

High Priority Goals

        Allow observers to develop and submit problem-free and schedulable Phase 2 HST science programs.

        Provide observers with tools that are more intuitive, visual, and responsive.  The tools should be easy to use.

        Provide observers with tools that let them spend their time primarily on scientific decisions rather than on the mechanics of using the system.  The tools should facilitate the writing of HST programs.

        Provide a tool that allows the observer to optimize their HST science.

Medium Priority Goals

        Provide a tool that allows the observer to optimize general research.

        Provide documentation/help that is friendly, up to date, and easily accessible to users of varying levels of expertise.

        Provide tools that give timely feedback.

        Provide an extensible software framework which is responsive to change in both technology and observatory operations.

Low Priority Goals

        Replace RPS2 and the existing Web based Exposure Time Calculators.  This will occur naturally as APT is developed.

3.      Goddard STScI Collaboration Plan

The Goddard SEA and STScI APT teams have developed a collaboration agreement which will allow Goddard to continue with research and innovation while STScI concentrates on fielding an operational system. Figure 1 illustrates the organizational structure of both the Goddard and STScI teams.

Figure 1.

Currently the Goddard team is funded until October 1, 2000.  The following agreement was reached between Goddard and STScI.

        All software enhancements will be made available to each team.

        Goddard will act as an open source library for the software, handling distribution of the software to other institutions that request a copy. STScI is willing to take over this role when Goddard is no longer funded.

        Goddard and STScI will have their own copies of the software at their respective institutions.

        Goddard and STScI will work together to field the Visual Target Tuner tool for the June 1, 2000 STScI release.  All changes related to the Visual Target Tuner will be shared immediately via email.

        Goddard will continue its research by extending the software to simulate a photon going down the optical path thus producing a simulated image.

        STScI will enhance the software for operational HST use.

        Both teams will periodically attend each other's project development meetings to keep abreast of the software development efforts.

        Joint email lists will be set up to share information and technical ideas.

4.      Technical Overview

APT consists of two major components: the APT toolset, which provides users with tools that help them prepare their science programs, and the integrated environment, which unifies all the tools and makes them interoperable. It is envisioned that some of the APT tools will work both as stand-alone tools and in the integrated APT environment. Figure 2 shows the list of envisioned tools and when they will be used by the HST user community.

 

Figure 2.

APT Generic Tools

This section describes tools that support more than one specific set of HST users (i.e., Phase 1, Phase 2, Archival Research).

Top Level GUI & Architecture

The top level GUI integrates all the tools together and the architecture provides APT with interoperability and process control between the tools. It will be used for all phases of support.

Visual Target Tuner

The Visual Target Tuner allows observers to visualize their targets and fields of view. It will be used during Phase 1, Phase 2, and for Archival Research. The following major capabilities are planned enhancements to the existing tool.

        Support for Bright Object Checking

        Dithering & Patterns

        Parallel Observations

        Moving Target Support

Displaying Complete View of Program

This tool will provide the observer with a top-level, complete view of their program on a visit-by-visit basis. It will be used primarily during Phase 2 processing. It will show the relationship of visits, exposures, and targets, and will provide output and diagnostics for the program in one display.
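The kind of indented tree view such a tool would display can be sketched as follows. This is a toy illustration only; the class name, method, and the visit/exposure labels are hypothetical, not the actual APT data model.

```java
import java.util.*;

/**
 * Toy sketch of a "complete view" display; the visit/exposure structure
 * here is hypothetical, not the real APT data model. Each visit id maps
 * to a list of exposure summaries, rendered as an indented text tree.
 */
public class ProgramViewSketch {

    /** visits: visit label -> lines describing that visit's exposures. */
    public static String render(String programId,
                                LinkedHashMap<String, List<String>> visits) {
        StringBuilder out = new StringBuilder("Program " + programId + "\n");
        for (Map.Entry<String, List<String>> v : visits.entrySet()) {
            out.append("  Visit ").append(v.getKey()).append("\n");
            for (String exposure : v.getValue())
                out.append("    ").append(exposure).append("\n");
        }
        return out.toString();
    }
}
```

A real implementation would of course render this graphically and attach diagnostics to each node, but the visit/exposure/target hierarchy is the same.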

Submitting Programs to STScI

This tool will allow observers to submit their completed Phase 1 proposals and Phase 2 programs to STScI.

Phase 1 Support Tools

This section describes the tools that will be used during STScI’s Phase 1 process.

Exposure Time Calculators

Exposure Time Calculators allow the observer to calculate exposure times and count rates.  The tool will primarily be used during the Phase 1 process.  We envision replacing the existing CGI web-based exposure time calculators as the APT exposure time calculators are developed.  This tool will also be used to support bright object checking during Phase 2 processing.
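As an illustration of the kind of arithmetic an exposure time calculator performs (this is the simplified source-plus-background Poisson case from textbooks, not the STScI ETC algorithm, which also accounts for read noise, dark current, and instrument throughput):

```java
/**
 * Minimal exposure-time sketch, NOT the STScI ETC algorithm: assumes
 * Poisson noise from the source and a flat background only.
 * Solves  snr = srcRate * t / sqrt((srcRate + bkgRate) * t)  for t.
 */
public class ExposureTimeSketch {

    /** Exposure time in seconds needed to reach targetSnr,
     *  given source and background count rates in counts/s. */
    public static double exposureTime(double targetSnr, double srcRate, double bkgRate) {
        // snr^2 = srcRate^2 * t / (srcRate + bkgRate)
        return targetSnr * targetSnr * (srcRate + bkgRate) / (srcRate * srcRate);
    }

    public static void main(String[] args) {
        // Hypothetical numbers: 50 counts/s source, 5 counts/s background, SNR = 10.
        System.out.printf("t = %.2f s%n", exposureTime(10.0, 50.0, 5.0));
    }
}
```

With no background the formula reduces to t = SNR²/R, the pure source-noise-limited case.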

Phase 2 Support Tools

This section describes the tools that will be used during STScI’s Phase 2 process.

Orbit Planner

The Orbit Planner will allow observers to lay out their exposures in an orbit, including the orbit overheads and science exposures. It will be used primarily during Phase 2 processing.
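As a purely illustrative sketch of the kind of calculation the Orbit Planner automates, exposures plus overheads can be packed first-fit into successive orbits. The fixed per-exposure overhead and visibility window used below are assumptions for illustration; real HST overheads and target visibility vary by instrument and target.

```java
import java.util.List;

/**
 * Illustrative orbit-packing sketch, not the real Orbit Planner logic:
 * each exposure plus a fixed overhead is packed first-fit into orbits,
 * each orbit having an assumed target-visibility window in seconds.
 */
public class OrbitPackingSketch {

    public static int orbitsNeeded(List<Double> exposureSec,
                                   double overheadSec,
                                   double visibilitySec) {
        int orbits = 1;
        double used = 0.0;
        for (double exp : exposureSec) {
            double block = exp + overheadSec;
            if (block > visibilitySec)
                throw new IllegalArgumentException("exposure does not fit in one orbit");
            if (used + block > visibilitySec) { // start a new orbit
                orbits++;
                used = 0.0;
            }
            used += block;
        }
        return orbits;
    }
}
```

For example, three 1200 s exposures with an assumed 360 s overhead each do not fit in an assumed 54-minute (3240 s) visibility window, so a second orbit is needed.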

Visit Planner

The Visit Planner will allow observers to look at a visit's schedulability for the upcoming cycle and to describe timing constraints between visits. It will be used primarily during Phase 2 processing.

Archival Research Support Tools

This section describes the tools available in APT to support archival research.

Archival Research Tool

This tool will allow the observer to look at archived HST observations and query the archive for observations.  It will use the Visual Target Tuner for displaying images and the HST archive tool (STARVIEW2) for querying the archive.  It will also support target duplication checking.

Other Potential Tools

The project envisions adding the following tools to APT after the initial development is completed (January 2003). Currently, the project does not have the resources to support the development of these tools and has therefore delayed their development.

Phase I Submission Form - We would like to provide Phase I proposers with a Web-based electronic form to simplify the submission process. This is currently being studied to determine its benefits and costs.

Canned Observing Strategies - We would like to automate the process of applying customizable observing strategies to observing programs (e.g., mosaicing).

Improved Software Updates - We would like to improve the way observers have access to the latest data on the state of the observatory. We need a strategy that allows up-to-the-minute access to operational changes but also supports those who wish their environment to remain stable while they compare the results of scientific trade-offs. For example, if a new version of the APT software were available, the user would be notified at APT startup and asked whether they would like to download the latest version.

Access to Execution Data - We would like our observer tools to be able to access data on the state of their program. This would be useful, for example, in making schedulability determinations based on exactly when observations have executed or will execute. Such a capability would reduce the effort needed to implement proposals at STScI by decreasing the incidence of observations that are unschedulable because of out-of-date execution information.

Grouping Observations for Global Update - We would like our tool to allow a proposer to group observations and perform a single update to all of them, such as a filter change or a new target. If, for example, an observer finds out at a late stage that a planned target is infeasible, it should be easy to substitute another target without a great deal of search-and-change effort.
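A grouped update of this kind might look like the following sketch; the Exposure class, method name, and filter names are purely illustrative, not a planned APT interface.

```java
import java.util.List;

/**
 * Sketch of a grouped global update (hypothetical API): one change,
 * here a filter swap, is applied to every exposure in a selected group
 * instead of editing each exposure by hand.
 */
public class GroupUpdateSketch {

    public static class Exposure {
        public String target;
        public String filter;
        public Exposure(String target, String filter) {
            this.target = target;
            this.filter = filter;
        }
    }

    /** Swap oldFilter for newFilter across the group; returns the count changed. */
    public static int swapFilter(List<Exposure> group, String oldFilter, String newFilter) {
        int changed = 0;
        for (Exposure e : group) {
            if (e.filter.equals(oldFilter)) {
                e.filter = newFilter;
                changed++;
            }
        }
        return changed;
    }
}
```

Substituting a new target for an infeasible one would follow the same pattern, updating the target field across the group in one operation.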

5.      Software Development Plan

Since APT is a collection of tools with an infrastructure that allows the tools to communicate, small tool teams (one person to several) will be formed to tackle the various tools and infrastructure issues. Each team will consist of software developer(s), an APT user group member, and testing and system engineering support. The APT user group member will be involved in the development process to provide user feedback and requirement clarification as early in the process as possible, and is the main point of contact for any science or requirements issues related to the tool. System engineering will support requirements definition and attend design reviews. Whenever possible, a software developer will be assigned to one and only one tool team, allowing them to concentrate solely on that tool; we do, however, envision assigning people to more than one team when cross-team coordination is needed. Each tool team will be responsible for its tool from conception through maintenance.

All the teams' development efforts will be coordinated through weekly project meetings, which are used to relay development status, handle cross-team coordination, and hold technical reviews and presentations. The APT user group lead attends these meetings so that user issues can be addressed and coordinated.

The original APT working group report “APT: HST Proposal Preparation Environment for the Second Decade”1 called for separate innovation and fielding teams. This model had considerable conceptual appeal, but it was hard to envision how to actually implement it, particularly with a smaller development team than the working group envisioned.

6.      APT User Group

The APT user group consists of STScI astronomers, software developers, and testers. This group will work with the development team to provide science vision and science requirements. They will help evaluate the usability of the tools and provide testing support. The user group team members are listed in Table 1.

Table 1.

The user group will help the APT project with global issues: defining the scope of the project, prioritizing the work, defining an RPS2/APT transition plan, and providing use cases for the tools.

Each tool team has a user group member on it, who is responsible for the following:

 Act as liaison between the user group and the developers; their involvement will vary somewhat depending on the tool.

 Report on the status of the task at the APT user group meeting.

 Provide the science input needed to make decisions about the capabilities of the software.  This will likely involve developing a list of high-level capabilities and science use cases.

 Test the usability and capabilities of software prototypes and suggest changes.

 Suggest instrument scientists and data assistants who can carry out additional testing.

 Review documentation of the software from a science perspective.  They may provide general documentation that gives users a scientific overview of the tool; they are not required to produce detailed documentation of each capability.

 Provide the science input needed to prioritize work.

The user group efforts will be coordinated through weekly user meetings, which include members of the software team and the APT project manager.

The user group will hold requirements reviews for STScI every six months. This will allow APT to get broader input from the STScI science community.

In addition to the STScI APT user group, we will ask the Space Telescope User Committee (STUC) to evaluate APT at approximately six-month intervals (i.e., coinciding with our external releases) to provide guidance and feedback on APT. We also plan to contact a subset of non-STScI PIs who used APT for Phase 1 or Phase 2 preparation to get additional feedback and suggestions. We will demo APT at AAS meetings when major new releases become available.

The APT user group will provide the following testing support.

Functionality Testing: This form of testing is carried out by the user group members working on a particular tool; in addition, instrument group members will likely be involved in testing instrument-specific issues. The goal is to verify that the specified capabilities of the tool are met, including user interface issues. This testing occurs during the independent test phase (see the testing section below) of the development process and is ongoing.

Completeness and Accuracy Testing: The goal of this testing is to ensure that the results of the tool are scientifically accurate and complete; some assistance from the instrument groups will be needed. This form of testing is carried out by the user group members working on a particular tool, with instrument group members likely involved in testing instrument-specific issues. This testing occurs during the independent and acceptance testing phases (see the testing section below) of the development process.

 

7.      Software Development Process

The software development process usually consists of the following stages: concept development, requirements definition, design, implementation, testing, and maintenance support. Whether you are building a prototype to test concepts or building an operational system, the software goes through these stages. For example, a prototype needs a concept, requirements, a design, and an implementation before it can be evaluated, though the effort put into each stage may be minimal and may not go through any formal review. A system delivered for operational use will go through each of these stages much more formally. Some APT tools may need to spend a fair amount of time in prototyping and innovation, while others will be developed from the beginning in a more formal, regimented manner. The APT project will decide on a tool-by-tool basis how much process each tool needs. It is recognized that all operationally released products will go through a regimented development process to ensure the most bug-free and accurate product.

Guidelines and Standards

Source Code Configuration

The primary programming language for APT will be Java. The project will use the Concurrent Versions System (CVS) for source code configuration. The source code repository will be stored outside the operational firewall.

Design Principles and Standards

The system should favor interactive approaches over batch processes. Performance and responsiveness will be considered in all design decisions.

The system should reuse existing components of other systems where appropriate (e.g., SEA components, StarView components).

The system design will be object-oriented.

The system design should implement the Model, View, Controller (MVC) architecture.
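A minimal sketch of the MVC separation follows; the class names are illustrative, not the actual APT design. The model owns the data and notifies registered views when it changes; the controller is the only component that mutates the model in response to user input.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.DoubleConsumer;

/**
 * Minimal Model-View-Controller sketch with illustrative names:
 * views register with the model and are notified on every change;
 * the controller translates raw user input into model updates.
 */
public class MvcSketch {

    public static class ExposureModel {
        private double seconds;
        private final List<DoubleConsumer> views = new ArrayList<>();

        public void addView(DoubleConsumer view) { views.add(view); }

        public void setSeconds(double s) {
            seconds = s;
            for (DoubleConsumer v : views) v.accept(s); // notify all views
        }

        public double getSeconds() { return seconds; }
    }

    public static class ExposureController {
        private final ExposureModel model;
        public ExposureController(ExposureModel model) { this.model = model; }

        /** Called by a view when the user edits the exposure-time field. */
        public void userTypedSeconds(String text) {
            model.setSeconds(Double.parseDouble(text));
        }
    }
}
```

The payoff of this split is that several views (a form field, an orbit layout, a summary display) can stay synchronized automatically because they all observe the same model.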

The design will use a Unified Modeling Language (UML) tool called Rational Rose.

The system will be extensible (i.e., easy to add new features/tools).

The system should be platform independent wherever possible. Java is the leading candidate because it provides a number of advantages:

 interactive graphics and image processing support;

 support for accessing databases and catalog servers;

 user interface support;

 portability; and

 object-oriented development.

The system will use a common data format. The markup language XML is the leading candidate since it is a worldwide standard for data representation. The advantages of XML are:

 support for automatic checking of documents for structural validity;

 availability of a rich array of tools to process and display XML documents; and

 availability of existing Java libraries that already support XML.
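As a purely hypothetical illustration of such a common data format (the actual APT schema had not been defined when this plan was written, and all element and attribute names below are invented), a visit and its exposures might be represented as:

```xml
<!-- Hypothetical sketch only; element and attribute names are invented. -->
<visit id="01" target="NGC 4151">
  <exposure instrument="STIS" config="G430L" time="600"/>
  <exposure instrument="STIS" config="G750L" time="900"/>
</visit>
```

A fragment like this could be validated automatically against a DTD or schema, which is the first of the advantages listed above.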

Coding Standards and Development Environment

 The project developed a set of Java coding standards which are a blend of the GSFC and Sun Microsystems coding standards.

 The project developers will be encouraged to use an Integrated Development Environment (IDE) called CodeGuide to help in their software development.

Development Process

Requirements Reviews

All requirements will go through a requirements review before proceeding to design and implementation. All requirements documentation will be made available to all reviewers before the review. The requirements will be presented at the requirements review, which is open to all interested parties.

Design Reviews

All designs will go through a design review before proceeding to implementation. All design and requirements documentation will be made available to all interested parties, including the primary reviewers, before the design review. The design will be presented at the design review, which is open to all interested parties.

Source Code Reviews

All source code will go through a code review before proceeding to developer testing. The code should compile and load without errors before it is reviewed. Reviewers should be familiar with the programming language being reviewed. All code should be made available prior to the review.

8.      Testing Plan

Testing is performed in two distinct phases, as shown in Figure 3 below.

Figure 3. Testing flow: developer testing, then independent/acceptance testing, then release.

Developer Testing

Each developer is responsible for unit and integration testing their software changes. Since we are using CVS as our configuration management tool, this is easy to do: each developer has their own copy of the software in which they make their changes, and can update the software, test it, and get feedback from users before configuring the change into the code repository. Developers check in their changes when they have fully tested them. Once the software is configured, all changes are available to all the other developers, and additional testing occurs as the developers work with the newly checked-in software.

Independent Testing

A snapshot of the system is checked out of the code repository and put in a test environment for independent testing by APT testers and the APT user group. We plan to internally release a new version of the software every 6-8 weeks for independent testing. This reduces the burden of testing large, complicated releases and allows testing to proceed concurrently with development.

Acceptance Testing

Prior to an external community release, the latest version of the software is baselined. This occurs about one month before release. During this period, software changes will be limited to critical problems that would render the system unusable. Known problems will be documented, and installation testing will be performed on non-STScI platforms. A new version of APT will be released to our external user community every six months, with acceptance testing performed once every six months to coincide with the external releases.

9.      Documentation Plan

This section describes the APT project's documentation plan.

Standards

In keeping with the vision of quickly putting tools into users' hands, and given the project's size (4-6 FTEs), we have chosen not to define a set of documentation standards for our project documentation. We believe this would just be unnecessary overhead for the project.

Team members are expected to produce documentation that is web-displayable. The project will use the following guidelines for its documentation:

        All documentation should be prepared in FrameMaker or MS Word

        All presentations should be prepared in PowerPoint

        All object-oriented design documentation should be prepared with Rational Rose.

        All documentation should be made available in HTML, PDF, and/or ASCII text.

Location and Configuration Management

Documentation shall be stored outside the operational firewall so that it is accessible to everyone on the team. All project documentation should be put on the project web page and stored in /ra/p3/http/apst/apt. We don't envision a need for a configuration control mechanism such as CVS for our project documentation. Documentation in this area will be backed up during system backups.

Software Documents

The project will produce the following documents:

        The project requirements will be documented and placed on-line.  We don’t expect that this will be one document for the entire system, but a collection of documents specific to the various development areas.

        The project designs will be documented using a UML tool and placed on-line.

        Meeting minutes for the various development efforts

        Presentations and status reports

User Documentation

All APT-related user documentation will be available on-line as an integrated part of the APT tool. This documentation will be configured with the APT software so that it is kept consistent with the software. STScI documentation, such as instrument handbooks, will continue to be made available via the STScI web pages.

10. Non-Labor Resource Plan

Hardware

Table 2 shows future APT hardware needs.

 

Hardware Product / Fiscal Year / Comments

1 Dell Desktop PC, FY2001: For Tom Donaldson, who joins the project in October 2000.

2-3 Dell Desktop PCs, FY2001: If APT overguide monies are approved, we will need computers for the software developers.

Computers to support STScI Lisp servers, FY2001: If APT decides not to deliver executable Lisp programs with the downloaded APT package, we will need to run the Lisp programs at STScI for all PC users and will need additional computers to support the STScI servers. A technical decision will be made in early FY01 on how to proceed.

Table 2.

COTS

The following software products are being used or may be needed by the APT project. Table 3 shows both the known and potential costs. Some of the potential costs, particularly the Lisp PC version, will depend upon technical decisions to be made later in the project.

 

COTS Product (Use) / Cost to ESS/APT / Fiscal Year / Comments

FrameMaker (word processing), no cost: currently available and being used.

MS Word (word processing), no cost: currently available and being used.

PowerPoint (presentations and talks), no cost: currently available and being used.

CodeGuide (integrated development environment for Java development), $60 per seat, FY2000: currently available and being used; we have 10 copies.

Rational Rose (object-oriented design tool), $6,000 per seat, FY2000: currently have one copy behind the firewall; may need another copy to support more than one user at a time.

CVS (configuration management tool), freeware: currently available and being used.

InstallAnywhere (Java application installation tool for the Web), freeware, FY2000: currently using the free version; may need to buy the full version (cost not known) to support version checking on installations.

Help Desk software (tool to track user problems and comments), no cost: currently available and in use.

OPR tool (Web-based tool used by developers to track software problem reports), no cost: currently available and in use.

Lisp PC version (Lisp software and development environment for PCs), $12,500 to $25,000 initial cost plus ~$4,000 per year maintenance, FY2000 or FY2001: currently have a Solaris version. We will need this if we want to support running Trans or Spike on a PC as part of the delivered APT application; the initial cost depends upon what we purchase for our development.

Table 3.

Travel & Training

Currently we don’t envision any special travel or training monies needed to support the APT project.  We believe that any travel to conferences or training can be handled as part of the existing ESS travel and training budget.

11.  Security Plan

There are no security issues related to the software development process, and therefore APT does not need to be developed behind the STScI firewall. Some of the user data is proprietary; however, given the low risk of interest in our data, there is also no need to encrypt the data being transferred across the Internet. All servers running at STScI to support the project need to be hosted outside the operational firewall.

12. Management Plan

This section describes the APT management organization and plan.

Project Management

The APT project is being developed in the Engineering & Software Services (ESS) Division with science input from the Institute Science Division, Hubble Division, and Science Policies Division. Tony Krueger is the APT project manager and reports to Rick White, ESS Department Lead. Rick White reports to Stefi Baum, ESS Division Head, who reports to Mike Hauser, Deputy Director, who reports to Steve Beckwith, STScI Director.

Figure 4.

User Risk Management

STScI currently has tools that support Phase 1, Phase 2, and Archival Research.  These tools are available and operational.  We can continue to provide these tools to the HST community if problems arise with the delivery of APT.

Technical Risk Management

There are three major areas of technical risk that need to be planned for:

Expanding the Current Architecture

The current prototype is not integrated with any of the STScI legacy systems needed to support Phase 2 processing. The current architecture will need to be enhanced to support a client/server methodology, and extended to allow non-Java modules to be easily plugged in through a well-defined interface. We plan to prototype some approaches to this problem during the first year of the project, which will allow us to determine an approach and adjust resources early if problems arise.

System Responsiveness

The current prototype is already slow on Sun computers due to the Java engine on that platform. This is a concern, especially since the slower STScI legacy systems have not yet been integrated into APT. We will initially concentrate our efforts on ensuring that the integration of the STScI legacy software does not slow the system down beyond use, which would be a catastrophic system problem. We also expect that Sun will release faster versions of their Java system, so that we will not have to apply resources to that problem. If Java performance on the Sun is not improved by the vendor by the Phase 11 Cycle 2 release, we will survey our user community to see what hardware our users have available and decide whether to speed up APT on the Sun platform or take a different approach. Additionally, we will support the PC platform, where Java performance is not a problem; this gives our user community an alternative to Sun computers for APT processing.

A user profile of our archive (MAST) shows that a large fraction of the MAST community has PCs available. See Figure 5 below.

Figure 5.

APT Usability

This project is highly interactive, using visualization to convey information to the user, and providing an easy-to-use tool will be a challenge. We designed our development strategy with this in mind: we plan to release the software internally every 6-8 weeks to get internal STScI user feedback as soon as possible, and we adopted a phased release approach to the external HST community. Our six-month release schedule will allow us to gather feedback from the HST user community and make usability changes on a six-month timescale.

Project Overguide Monies

We plan to ask for Goddard overguide monies in FY01 and FY02 to help the development team in these high-risk areas. We would like to retire the technical risk as early as possible in the project.

Project Reporting

The APT project manager will provide the following reports to STScI management:

        Written Monthly Status Reports to the Engineering & Software Services Division

        Quarterly status updates to the Hubble Division via the SUSD department reviews.

        Presentations to the STScI Director's Office and the Hubble Division at the Program Management Reviews.

All technical and management documentation should be made available on the APT project web page.

13.  Project Schedule & Labor Resource Plan

The project will have six major releases approximately six months apart, each coinciding with an HST Phase 1 or Phase 2 deadline. Figure 6 shows a high-level delivery schedule, and Figure 7 shows when work will start and how long it will last.

Figure 6.

 

 

Figure 7.

 

 

 

 

 

 
Figure 8 shows the FTE resources needed for the project; the chart coincides with the APT delivery schedule. To read the chart: 2.5 FTEs work on the Visual Target Tuner from January 2000 to June 2000; from June 2000 to January 2001 the level drops to 1.5 FTEs; and so on.

 

Figure 8.

14. References

1.       The “APT: HST Proposal Preparation Environment for the Second Decade” document is available at http://ra.stsci.edu/apst/apt/documents/aptdocumentation.html

2.       The APT project development page is http://ra.stsci.edu/apst/apt