Phase 11: Project Review
Some of the questions we will be answering in this phase are:
- What do I need to do in preparation for the following implementations?
- What has been the impact of the pilot implementation?
- How do I decide what to do next?
- How can I measure the success of the Control-D implementation?
Inputs
Before you start this phase you should have:
- Reviewed the "old" system (Phase 1).
- Implemented generic processing (optional – Phase 7).
- Implemented online viewing (Phase 9).
- Implemented Control-D/WebAccess Server (optional – Phase 10).
Outputs
At the end of this phase you will have:
- Selected the next application to implement.
- Reviewed the automated system.
Requirements for Future Implementations
Not all actions performed during the pilot implementation have to be repeated for subsequent implementations. Many were "one time only" tasks; for example, the administration tasks are in place, the structure of the recipient tree has been decided, and so on.
The list below highlights the outputs that must be achieved in each phase for subsequent applications. In other words, these are the tasks that need to be performed for each new application.
Phase 1 Outputs: Decide Implementation Strategy
- Perform a distribution system review.
- Decide your project objectives.
- Decide your implementation strategy.
- Select a pilot application.
- Select the super users.
- Assign resources to the project.
We will discuss (later in this phase) how to decide which applications to implement next and how to decide your implementation strategy.
The super users selected for the pilot may be able to serve as the user support team for future implementations. If so, we recommend using them, because they already have experience managing the implementation of Control-D in the end user environment. If not, you will have to identify new super users for each new application.
Phase 2 Outputs: Define Recipient Tree
- Set standards for recipient names.
- Insert basic recipient information.
Having defined the recipient tree structure and set standards for recipient naming during the pilot implementation, the requirements for future applications will be to identify the recipients of the application and to insert the basic recipient information in the tree.
Phase 3 Outputs: Design Decollating Missions
- Decide your report decollating mission scheduling method.
- Define decollating missions for the pilot application.
- Insert synonyms into the recipient tree.
- Test decollating missions.
The process of defining report decollating missions should be easier for future applications after the experience gained from the pilot application. When defining report decollating missions for future applications, we suggest that you merge this phase with Phase 6: identify jobs that can use CDAM Direct Write, and define the report decollating mission definitions accordingly.
Phase 4 Outputs: Design Print Bundles
- Define printing missions for the recipients of the pilot application.
- Set printing mission naming standards.
- Design the basic format of the printed bundles.
- Perform initial tests of your printing mission definitions.
It may be that no further printing missions need to be defined. You may want to set up test missions to check that the user output is printed as required. The production implementation of printing reports from new applications may only involve updating the INCLUDE/EXCLUDE parameters of existing printing missions. You should also consider the timing of print bundle production as new applications are implemented and the print volume increases.
Phase 5 Outputs: Implement System Administration Tasks
- Implement the Control-D "housekeeping" utilities.
- Set the SEARCH default using Control-D wish WD0933.
- Define, test and implement backup procedures.
- Define, test and implement restore procedures.
As a result of the pilot implementation, all required administration procedures should already be in place and no further actions should be required for future implementations.
Phase 6 Outputs: Implement CDAM Direct Write
- Create JCL procedures that use the CDAM Direct Write facility.
- Update the relevant report decollating mission definitions for CDAM Direct Write.
We recommended previously that you perform this phase in conjunction with Phase 3, that is, when defining the report decollating mission definitions. Again, we recommend that you identify the top 10% of large volume jobs, as these give you the biggest "payback."
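The "top 10%" selection can be sketched as a simple ranking exercise. The following is a minimal illustration in Python; the job names and line counts are hypothetical, and in practice the volumes would come from your spool or SMF statistics:

```python
# Rank jobs by output volume and keep the top decile as CDAM Direct
# Write candidates. All job names and volumes below are hypothetical.

def top_decile_by_volume(job_volumes):
    """Return the names of the top 10% of jobs (at least one) by output lines."""
    ranked = sorted(job_volumes.items(), key=lambda kv: kv[1], reverse=True)
    cutoff = max(1, len(ranked) // 10)
    return [name for name, _ in ranked[:cutoff]]

job_volumes = {
    "PAYROLL1": 2_500_000, "GLEDGER": 1_800_000, "INVOICES": 950_000,
    "BILLING": 700_000, "STOCKRPT": 410_000, "SALESSUM": 230_000,
    "HRLIST": 120_000, "AUDITLOG": 90_000, "OPSNOTES": 40_000,
    "MISCRPT": 15_000,
}
print(top_decile_by_volume(job_volumes))  # → ['PAYROLL1']
```

However you gather the figures, concentrating Direct Write on the few highest-volume jobs delivers most of the spool savings for the least definition effort.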
Phase 7 Outputs: Handle MSGCLASS Output
- Define and test the generic decollating missions.
- Define appropriate authorizations in the recipient tree for online access to MSGCLASS output.
You may already be processing all MSGCLASS output as a result of the pilot implementation. If not, you should set up the required generic missions or modify the existing ones. Authorizations for MSGCLASS viewing will probably be similar for all applications (that is, access for operations personnel), but you may also need to define access for development teams, and so on.
Phase 8 Outputs: Production Implementation
- Define accepted banner pages and bundle formats.
- Implement the pilot application into production.
- Perform a Report Pruning Survey.
- Implement the results of the Report Pruning Survey.
Depending on your objectives, you can decide whether to perform a Report Pruning Survey after the implementation of each application or to implement all applications first and then perform a global Report Pruning Survey. Note that postponing the survey means postponing the implementation of online viewing services until all applications are defined and are being printed and bundled by Control-D.
Phase 9 Outputs: Online Viewing Implementation
- Perform a user survey about online viewing.
- Set up the online viewing access environments.
- Train appropriate users for online viewing.
- Modify the report decollating mission parameters for online reports.
- Establish report online viewing durations (CTDDELRP).
- Define the required view authorizations in the recipient tree.
- Implement the pilot application for online viewing.
Whichever strategy you follow for the Report Pruning Survey (global or per application), we recommend that online viewing services be implemented at the application level. This ensures that attention is concentrated on a specific application's needs and that quality and objectivity are not lost in a "Big Bang" implementation of online viewing services. We also recommend a gradual introduction of users to online viewing to ease the implementation for you and for them.
Phase 10 Outputs: Control-D/WebAccess Server Implementation
- Set up the Control-D (and z/OS) environment to support Control-D/WebAccess Server users.
- Perform a user survey of potential Control-D/WebAccess Server users.
- Implement Control-D/WebAccess Server for appropriate users.
For potential Control-D/WebAccess Server users, we suggest that you follow the recommendations made for Phase 9.
For future implementations, you should consider combining the survey in Phase 9 with the survey performed here. Many of the support and implementation procedures can be controlled by the Control-D/WebAccess Server administrator, if one has been assigned.
Phase 11 Outputs: Project Review
- Review the automated system.
- Select the next application to implement.
After implementing each application, we suggest that you analyze the implementation and make any necessary enhancements for subsequent applications. Each time you implement, you will have to determine what to do next. The following section discusses how to select applications and assign implementation priorities.
Which Application Should I Select Next?
Having gained the experience of the first implementation, you can implement subsequent applications more easily and quickly. The previous section identified what needs to be done for subsequent applications. Now you have to decide what your strategy will be and which applications to implement next.
Decide Strategy
After the initial implementation, you should consider your implementation strategy for the subsequent applications. This will depend on your project objectives and available resources. You may want to implement all applications for bundling and printing, or you may want to process each application for online viewing.
It is also important that you reassess how many resources are required to achieve your objectives based on the experience gained with the pilot implementation. If possible, you should continue with the staff who already have experience from the pilot implementation.
Select Next Application
Deciding which applications to handle next will depend on several issues. There may be specific problem applications that the implementation of Control-D will solve. There may be high priority systems. There may be pressure from end users eager for improved services.
Generally, we recommend that you select applications for implementation based on the benefits that you will gain by automating them. Prioritize the systems that will deliver the most benefit. Depending on your resources, you may be able to work on multiple applications concurrently.
You should now select the next application to implement.
Compare Systems
Depending on the size of the pilot application, it may be difficult at this stage to get a true picture of the automated system and what benefits it has produced. We suggest that you perform a total system comparison when Control-D is fully implemented, using the following guidelines.
Calculate Project Success
In Phase 1, we asked you to measure your current system: the key resources and services used and provided by the output management system. We will compare those measurements with the automated Control-D system, using the following list to discuss each key area of the output management process and to calculate what has been achieved:
- Measure the volume of printed output.
- Evaluate spool utilization.
- Remove redundant data from circulation.
- Quantify the backup resource.
- Analyze new output management methods.
- Analyze information availability.
- Analyze online viewing usage.
- Review the end user methodology.
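Most of these comparisons reduce to a before-and-after percentage against the Phase 1 baseline. A minimal sketch (the figures shown are hypothetical):

```python
def percent_change(baseline, current):
    """Percentage change from the Phase 1 baseline to the current figure.
    A negative result indicates a reduction (for example, less printed output)."""
    return (current - baseline) / baseline * 100

# Hypothetical figures: pages printed per month before and after Control-D.
print(percent_change(1_200_000, 420_000))  # → -65.0
```

Recording one such figure per key area gives you a concise, comparable summary of what the implementation has achieved.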
Measure the Volume of Printed Output
In Phase 1 we asked, "How much output do you currently produce?"
If the results of the Report Pruning, online viewing and Control-D/WebAccess Server surveys were positive, many reports should have been removed from the print workload. Typically, the move to online viewing accelerates over time: as more applications are implemented under Control-D and users continue to request online access to information, the print workload should continue to fall. Control-D supplies a number of predefined reports that tell you how much output is being printed for each recipient. You can find sample reports in the IOA SAMPLE library.
Evaluate Spool Utilization
In Phase 1 we asked, "How big is the spool and how is it used?"
From the initial study, you should have identified the trends and utilization of spool volumes. With the introduction of CDAM Direct Write, the utilization of the spool should have dropped and the trends should have stabilized. After full implementation, you should reassess the requirements of the spool volumes and reclaim any space that was previously unavailable.
Remove Redundant Data from Circulation
In Phase 1 we asked, "How much unwanted data is sent to users?"
From the results of the Report Pruning survey in Phase 8, you should have identified how much unwanted data was being distributed. In some cases, the results of this survey can be alarming because of the discovery of large volumes of redundant output circulating in the distribution system. You should document what percentage of reports was identified as redundant.
Quantify the Backup Resource
In Phase 1 we asked, "How many resources are used for report backups?"
You calculated how many resources were allocated for the archiving of report data. After implementing the backup missions of Control-D, you should compare the resources used. If you have set up the backup missions according to the recommendations specified in Phase 5, the new archiving strategy should be more economical, because there is no duplication of data being archived and reports are archived in a compressed format.
Analyze the New Output Management Methods
In Phase 1 we asked, "What is the current level of report reruns?"
With the implementation of online viewing in Phase 9, end users now have control over their reports and the ability to restore archived reports if required. This facility should dramatically reduce the number of report rerun requests received from users. The objective should be zero rerun requests from the end user environment once users can manage their reports on their own.
Analyze Information Availability
In Phase 1 we asked, "What are the average report delivery times?"
If you implemented online viewing services for mainframe or PC users, your report delivery services should have vastly improved. Information is now available to users as soon as it is created. For PC users, reports can be automatically downloaded overnight to the PC environment so that they are available for the start of the working day.
If reports are still being printed, the time saved on splitting and bundling of output should enable faster report delivery, as well as a higher quality product. For sites that formerly delivered printed output to users some distance from the computer center, the availability of online information will be one of the highlights of the implementation.
Analyze Online Viewing Usage
In Phase 1 we asked, "How many report recipients have online access?"
During Phase 1, we asked you to assess the scope of your users' access to terminals and PCs. In Phases 9 and 10, we discovered the users' preferences and requirements for online viewing. You should now calculate what percentage of report recipients have adopted online viewing services.
Review End User Methodology
In Phase 1 we asked, "What happens to the report once the user receives it?"
Now that the implementation is complete, you should have a better understanding of how reports are used within your company. It may be many years since anyone investigated the report distribution system. The objective of the implementation was not only to automate and improve the output management system but also to identify redundant data.
The enhanced system should enable the users to improve their handling of information to meet their business requirements.
You should now perform a project review.
Review
During this phase, you examined what is required for future implementations, decided on a strategy for the remaining applications, and reviewed the success of the implementation by comparing it with the previous system.
Before you continue, you should have:
- Selected the next application to implement.
- Reviewed the automated system.