Chapter 7. Specify the System Development Details
You should specify to the supplier (whether the supplier is an internal Company-IR group or a third party) that they must have:

· a good methodology for developing the system

· a formal quality management system for the development, supply and maintenance of the system

You should also make the supplier aware that the Company may audit them.

These are the standards that should be in place in order to develop a system that can be validated.
  
 
  
 
 
The supplier’s system must be developed using a good methodology that follows a life cycle approach.

Note: A general description of a system life cycle methodology can be found in the IR Policy Manual and other references (see References in the Reference Material part of this document).
  
 
 
 
 
 
The supplier’s computer system must be developed using a formal quality management system.

Adherence to a quality management system should provide sufficient documentary evidence for subsequent acceptance by the validation team.
  
 
  
 
 
The quality management system should include procedures associated with:

· documentation control

· project management

· quality planning

· life cycle definition

· testing

· configuration management

· programming/technical development standards
  
 
  
  
 
 
A good methodology and quality plan will ensure that a user requirements specification is developed.

This topic looks at the user requirements specification.
  
 
  
 
 
The user requirements specification:

· describes the functions that a system or system component must or should be capable of performing

· is generally developed by the user in the initial stages of a system development or system selection process

· is written in general terms and specifies what needs to be done, not how it will be done

· is independent of the specific application program (technically non-specific) that will be written or purchased
  
 
 
 
 
 
  
The following techniques may be used to capture relevant user requirements:

· workshops (such as critical requirements analysis workshops)

· interviews

· presentations

· data modeling

· data flow diagrams
  
 
  
 
 
The user requirements specification will be used as the basis for the development of the system acceptance test scripts / performance qualification test scripts (see the topic Performance Qualification in Chapter 8 - Perform Qualification Activities).

The user requirements specification will be reviewed during the specification qualification (see the topic Specification Qualification in Chapter 8 - Perform Qualification Activities).
  
 
 
 
 
 
 
 
A good methodology and quality plan will ensure that a functional specification is developed.

This topic looks at the functional specification.
  
 
 
 
 
 
 
The functional specification, or system specification:

· describes, in a high-level manner, the hardware, software and peripherals that make up the computer system as a whole (Note: In system development terms, this specification will form the basis of system testing.)

· describes how the specific system to be purchased or developed will meet the user and functional requirements

· describes the specific user requirements that will not be met by the system

· should include reference to the data model to be used

· should define the functionality that does not relate directly to the user interface (e.g. system interfaces)

· should define the non-functional requirements, such as performance and availability
  
  
 
  
  
  
 
 
 
The functional specification may be produced:

· when a new application is being developed

· when the users need to be exposed to the system before finalizing their requirements
  
  
 
  
 
 
 
The functional specification may be produced from a prototyping exercise in order to model the required user interface.

The use of prototypes should be carefully controlled (e.g. by time-boxing and limiting the number of iterations) and kept within the scope of the user requirements specification.

The agreed prototype forms part of the functional specification and can be used as the basis for a first-pass conference room pilot.
  
  
 
  
 
 
 
Functional specifications can comprise mechanical, electrical, and software function specifications for systems embedded in manufacturing equipment.
  
  
 
  
 
 
 
The functional specification will be used as the basis for the development of system acceptance test scripts / operational qualification test scripts.

The functional specification is reviewed as part of Design Qualification (see the topic Design Qualification in Chapter 8 - Perform Qualification Activities).
  
  
 
  
  
 
 
A good methodology and quality plan will ensure that a design specification is developed.

This topic looks at the design specification.
  
 
  
 
 
The design specification is a complete definition of the equipment or system, in sufficient detail to enable it to be built.

This specification will form the basis of module/integration testing.
  
 
  
 
 
The design specification is reviewed in the:

· Design Qualification

· Installation Qualification - the design specification is used to check that the correct equipment or system is supplied to the required standards and that it is installed correctly

(See Chapter 8 - Perform Qualification Activities.)
  
 
  
  
 
 
A good methodology and quality plan will ensure that several types of documentation are developed.

This topic looks at the following types of documentation:

· end-user documentation

· administration documentation

· system support documentation
  
 
  
 
 
End-user documentation comprehensively describes the functional operation of the system.

This documentation should include:

· some means of problem solving for the user, such as an index, trouble-shooting guide and description of error messages

· comprehensive drawings of the system, if applicable

End-user documentation is generally produced by the supplier or developer and should be updated each time the system changes.
  
 
  
 
 
Administrator documentation is written for the administrator (the user who will maintain and administer the system).

This documentation:

· describes how to perform the administrator functions, such as:

  · system configuration

  · adding users to the system

  · setting up levels of security

  · setting up and maintaining master control records

· may be a special section of the end-user documentation or may be provided as a separate document

Administration documentation is provided by the supplier.
  
 
  
 
 
System support documentation describes the system administration activities that are specific to the software.

These administration activities include:

· configuration of the environment

· installation

· maintenance documentation

· the running of batch jobs

System support documentation is provided by the supplier or developer for the system administrator.
  
 
  
  
 
 
A good methodology and quality plan will ensure that several types of testing are undertaken throughout the development life cycle.

This topic looks at the following types of testing:

· module testing

· integration testing

· system acceptance testing

· stress testing
  
 
  
 
 
Module testing - sometimes known as unit testing - is testing at the level of a single functional routine or software module.

At a simple level, and independent of the system as a whole, unit testing verifies that the routine provides correct output for a given set of inputs.

Module testing is carried out to verify that the system performs as defined in the design specification (see Chapter 8 - Perform Qualification Activities).
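As an illustration of this level of testing, the sketch below exercises a single routine in isolation with Python's unittest framework. The `dose_in_range` routine and its limits are hypothetical stand-ins for any module under test; they are not taken from this manual.

```python
import unittest

def dose_in_range(value, low=0.0, high=100.0):
    """Hypothetical routine under test: report whether a value lies in range."""
    return low <= value <= high

class TestDoseInRange(unittest.TestCase):
    """Module tests: verify correct output for a given set of inputs,
    independent of the system as a whole."""

    def test_value_inside_range(self):
        self.assertTrue(dose_in_range(50.0))

    def test_boundaries_are_accepted(self):
        self.assertTrue(dose_in_range(0.0))
        self.assertTrue(dose_in_range(100.0))

    def test_value_outside_range(self):
        self.assertFalse(dose_in_range(-1.0))
        self.assertFalse(dose_in_range(100.1))

# Run the module tests and capture the result for the test record.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestDoseInRange)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Each test case here maps to one input set and its expected output, which is the kind of evidence a validation team would expect module testing to produce.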
  
 
  
 
 
Integration testing:

· verifies that the system functions correctly as a whole

· proves that all software modules correctly interface with each other to form the software system as defined in the design specification and functional specification

Integration testing is performed on the fully built system, as it is to be used by the end-users. Data from other external systems may, however, be provided by "dummy" interfaces.

Example: A manufacturing resource planning system might be tested with data provided from a flat file that simulates the interface to the inventory system, without requiring the inventory system to be involved in the testing.

Similarly, a process control system can be tested by "dummying" inputs and outputs from field instruments in the plant.
  
 
 
 
 
 
System acceptance testing is the testing of the system’s interfaces to other systems in the computing environment.

It should cover both the testing of user requirements and system functionality. This not only ascertains that the system accepts data correctly from other systems, but also that it accurately passes data to downstream systems and correctly processes data within the system itself.

System acceptance testing is usually done separately from the integration testing in order to minimize the downtime and expertise requirements for the other systems.

The testing may be performed:

· at the supplier’s site (and then repeated at the user site)

· solely at the user site
  
 
  
 
 
  
Stress testing involves confirming that the system fails in expected ways that are not catastrophic, yet are easily recognized as errors.

There are two categories of stress testing:

· entering data that is outside the range of data acceptable to the system and ensuring that the data is flagged as an error

· testing the system with a high volume of transactions. The objective is to determine the maximum operational capacity at which the system can be run without danger of loss or corruption of data. Based on this type of testing, the system developer will rate the system’s maximum operational capacity.

These two types of stress testing should be performed by the developer of the system as part of module testing and integration testing rather than as a separate activity.

Similar testing of the system related to the user’s planned operation and environment should be included as part of the Performance Qualification (see the topic Performance Qualification in Chapter 8 - Perform Qualification Activities).
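A minimal sketch of both categories, assuming a hypothetical `accept_reading` input routine and an arbitrary 0-100 acceptable range:

```python
def accept_reading(value, low=0.0, high=100.0):
    """Hypothetical data-entry routine: reject readings outside the
    acceptable range instead of silently absorbing them."""
    if not (low <= value <= high):
        raise ValueError(f"reading {value!r} outside range [{low}, {high}]")
    return value

# Category 1: out-of-range data must be flagged as an error,
# i.e. the system fails in an expected, recognizable way.
errors_flagged = 0
for bad in (-5.0, 150.0, float("nan")):
    try:
        accept_reading(bad)
    except ValueError:
        errors_flagged += 1

# Category 2: a high volume of transactions must be processed
# without loss or corruption of data.
store = []
for i in range(100_000):
    store.append(accept_reading(float(i % 100)))
```

In a real exercise the transaction volume would be raised until throughput degrades, and the highest safe volume recorded as the system's rated operational capacity.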
  
 
  
 
 
For a standalone computer system, system acceptance testing broadly equates to OQ and part of PQ. Some aspects of performance qualification may need to be performed by the user after system acceptance testing (especially for configurable software packages).

For an embedded system, system acceptance testing is only part of OQ/PQ, since other machine performance checks of components which do not form a direct part of the system will need to be performed.
  
 
  
 
 
It is very important that direct traceability is established between the specification and the tests performed, i.e. a cross-reference from the test script back to the section in the appropriate specification where the function is defined.

This traceability ensures that all parts of the software are tested, and clearly establishes the acceptance criteria for a given test.
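A minimal sketch of such a cross-reference, using hypothetical test-script and specification-section identifiers:

```python
# Each test script cites the specification section that defines the
# function it exercises; all identifiers here are illustrative.
trace_matrix = {
    "TS-001": "URS 4.1",  # user login test -> user requirements section
    "TS-002": "FS 2.3",   # batch report test -> functional spec section
    "TS-003": "DS 5.2",   # audit-trail test -> design spec section
}

spec_sections = {"URS 4.1", "FS 2.3", "DS 5.2", "FS 2.4"}

# Every test script must point at a real specification section...
dangling = [t for t, s in trace_matrix.items() if s not in spec_sections]
# ...and any section with no test against it is an untested gap.
untested = sorted(spec_sections - set(trace_matrix.values()))
```

Run before qualification, a check like this surfaces both orphaned test scripts and specification sections that no test covers.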
  
 
  
Computer System Retirement
The stages in the retirement process depend on the definition of raw data for each system.

For regulatory systems, data must be archived either electronically or as some form of hardcopy (paper or fiche). Continued on-line access to data may be achieved by data migration to the replacement system, although this should be treated as a development project in its own right (design and testing of migration/translation tools, quality control of a proportion of transferred data, etc.).

A pre-retirement review of validation is necessary to ensure that the documentation and testing package is complete BEFORE the system is dismantled. The physical decommissioning process should be planned to overlap with the replacement system already operating successfully in its production environment.
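The quality control of a proportion of transferred data mentioned above might be sketched as follows; the record layout, sample size, and store names are illustrative assumptions, not a prescribed method.

```python
import random

# Hypothetical record stores keyed by record ID; in practice "source" is the
# retiring system's archive and "migrated" is read back from the replacement.
source = {f"REC-{i:04d}": {"result": i * 0.5} for i in range(1000)}
migrated = dict(source)

# Quality control of a proportion of transferred data: compare a random
# sample of migrated records field-by-field against the source archive.
rng = random.Random(42)  # fixed seed so the audit sample is reproducible
sample_ids = rng.sample(sorted(source), k=100)  # a 10% sample
mismatches = [rid for rid in sample_ids if migrated.get(rid) != source[rid]]
```

Any mismatch would be investigated and the migration tools corrected and re-run before the retiring system is dismantled.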
  
 
  
  