Monday, April 27, 2020

Smith Consulting Software Essay Example

Smith Consulting Software Essay
System Analysis of Smith Consulting System Documentation
BSA/385

Contents: Abstract; System Analysis of Smith Consulting System Documentation; Introduction; Customer Engagement Approach; Software Development Processes and Procedures; Quality Assurance Processes and Procedures; Testing Procedures; Developer Testing; Reliability; Accuracy; Developer Performance Testing; Developer Fault Testing; User Acceptance Reliability; User Acceptance Accuracy; User Acceptance Fault Tolerance; Test System Infrastructure; Hardware/Software Capabilities; Formal Program Specifications Format; Conclusion; Attachments

Abstract
The LTA (Learning Team A) group has been asked to define, develop, and propose standards for a software testing environment at Smith Consulting. The LTA group will review several aspects of the system documentation currently in use at Smith Consulting and propose solutions for each of the areas defined in this document.

System Analysis of Smith Consulting System Documentation

Introduction
Smith Consulting (Smith) has tasked LTA (Learning Team A) with developing a standardized project approach and testing procedures so that prospective clients are confident that Smith is performing its contractual obligations efficiently. These procedures will be generalized so that they can be applied to any project that Smith takes on, and documented well enough that they are applied correctly to each project. This ensures that Smith has repeatable processes in place and can put more resources toward completing the project rather than toward developing the procedures to complete it.

Customer Engagement Approach
Smith is dedicated to the long-term success of the project. Smith does not approach projects as systems to be designed and then left with the client to manage. Smith offers flexible management terms, from support to full-time maintenance of any application Smith designs, ensuring that our dedication to service extends through the life of the product. Smith also strives to help every end-user fully understand and embrace the new technology. Smith understands the challenges end-users face when moving to new technology, and we want every employee to be comfortable using the system.

Software Development Processes and Procedures
Smith realizes that there may not be one clear-cut solution for all software development projects and seeks to use a process that takes into account as many variables as possible when developing new software. This means that Smith will first need to determine the company's needs. The first step in this process is to identify the stakeholders and develop a project timeline and budget. These factors will help drive the project toward the estimated completion date. The stakeholders will include members of Smith so that the project is kept manageable and realistic in terms of completion time and required resources. The next step is creating an analysis team to work with the various stakeholders to understand what the company needs the new system to do.
As the analysis is completed, a more formalized design will be presented to the same stakeholders in the form of a data flow diagram to ensure that all the needs are being met. At this time the stakeholders will have the ability to present changes as part of the change control process described under the Smith quality assurance processes. Smith will also implement additional design methods as required by the specific needs of a project. These methods include the use of new models, tools, and techniques in order to fully understand the system requirements. It may be necessary to bring in third-party vendors to provide and use the tools that these methods require. These vendors will be subject to a procurement process as indicated in the contract for the project and are subject to a determination of need by the stakeholders.

The next step is for a finalized data flow diagram to be presented to a design team, who will then analyze and determine the best approach for implementing the design. This will include the selection of a tool and the solicitation of vendors to provide the tool as needed. The stakeholders will again be consulted before any final determinations are made. Once a tool or vendor is determined, the design team will work on implementing the design. The design will be implemented as part of the project plan timeline, and Smith will provide developer testing in addition to end-user acceptance testing. This ensures that the final product matches the goals of the system as laid out in the project plan. Once acceptance testing is completed, the system will be implemented and will enter its training phase as indicated in the project plan. After full implementation, the system will enter the maintenance phase. Depending on the term of the contract, Smith will be involved in the maintenance phase of the system through its life cycle. All contracts include phone support for as-designed elements of the system.

Quality Assurance Processes and Procedures
Smith is dedicated to providing quality of the highest level in all its services to its clients. This quality is ensured through the use of tools put in place at the beginning of the project. The most important tool in use is the project manager. All of Smith’s project managers are qualified, experienced managers who take a hands-on approach to ensuring that a project stays on the timeline and within the budget. This includes weekly updates to stakeholders as well as daily stand-up meetings to ensure that everyone is on track and there are no concerns that could delay the project. Any concerns are immediately made known, and the manager will do his or her best to provide more resources to address the problem with minimal effect on the project plan. To ensure that the project is kept within scope, a strict change control process will be used to determine what parts of the project can be changed and when. This can differ for every project; Smith approaches each project plan in a similar fashion, including setting primary and secondary goals. The stakeholders have the ability to define these goals during the analysis phase as well as a say in the change control process. Primary goals are strictly followed, and changes are made to these goals only when the outcome of the project is in jeopardy. Secondary goals are more flexible and their change control process is not as severe; however, the process is still strictly adhered to. This adherence to the change control process ensures that the project is a success and not dependent on the success or failure of individual parts.
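To make the change control rules above concrete, the sketch below shows one way a change request might be recorded and gated by goal type. It is only an illustration under assumed rules (primary-goal changes require that the project outcome be at risk, and all changes require stakeholder approval); the field names and logic are not taken from Smith's actual process.

```python
from dataclasses import dataclass

@dataclass
class ChangeRequest:
    """A hypothetical change-control record (illustration only)."""
    description: str
    goal_type: str            # "primary" or "secondary"
    outcome_at_risk: bool     # is the project outcome in jeopardy?
    stakeholder_approved: bool

def may_proceed(request: ChangeRequest) -> bool:
    """Apply the assumed change-control rules described above."""
    if request.goal_type == "primary":
        # Primary goals change only when the project outcome is in jeopardy,
        # and even then the stakeholders must sign off.
        return request.outcome_at_risk and request.stakeholder_approved
    # Secondary goals are more flexible but still follow the process.
    return request.stakeholder_approved

# Example usage with invented values.
cr = ChangeRequest("Add PDF export", "secondary", False, True)
print(may_proceed(cr))  # True under the assumed rules
```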
Testing Procedures
For each level of testing that takes place, Smith will develop charts detailing the testing steps to be performed; an example of the chart is shown in Figure 1. This chart includes sections to identify what is being tested (Actor), what is being done (Action), and a description of the testing task. In addition, a secondary chart will be provided that details each step to be performed to complete the test case in question.

Actor | Action | Description
Customer/Employee/Etc. | Click Button 1 | Button 1 starts the application and takes the user to the login, etc.
Customer/Employee/Etc. | Click Button 2 | Button 2 logs the current user into the system
Continue… | |

Step | Expected Result
1. Launch Application/Etc. | Application window should open and prompt the user for credentials, etc.
2. Login/Etc. | User should be logged into the application and appropriate menu items should be shown based on the user’s security level, etc.
Continue… |

Figure 1 – Sample Software Capability Testing Diagrams

An additional practice that will be followed by Smith engineers is to chart the process flow for all testing for easy review by clients and engineering staff. The format of the process flow is shown in Figure 2.

Figure 2 – Sample Testing Data Flow Diagram

Developer Testing
Smith requires its developers to perform testing on each piece of a project that they are responsible for working on. Each of these tests ensures that the software is ready for the next step in the process, whether that is integration with other pieces or implementation. The developers are required to ensure that the testing covers reliability, accuracy, fault tolerance, and performance as required by the architecture design documentation for the project.

Reliability
Reliability testing is important to ensure that the system is capable of handling input and output in all situations it is likely to encounter during use. This includes programming for proper error handling should a user input unexpected data, as well as for handling exceptions on the data output. Developers use the architecture document to control the input and output and use exception handling to notify the user that something has gone wrong without passing the bad data along to the next process. In order to test reliability, developers will purposefully input bad data, and the system is required to handle it correctly. The developers are required to check not only that the errors are thrown properly, but also that they are informative and that the bad data are not output.

Accuracy
Accuracy testing is important because the input must be received, stored, transformed, and output correctly. A failure at any of these steps will affect the accuracy of the system. Developers must develop algorithms that transform the data properly so that the output is as expected. Since the same results can often be produced by several different algorithms, it is important that accuracy is tested as often as possible. Developers are required to develop and input use cases to ensure the data are being output as required. If the output is not what is expected, the developers can troubleshoot, track down the problem, and run the use cases again. The system architecture document will detail the required results, but developers often must use creativity to develop a solution that matches the requirements. Accuracy testing at the developer stage is a key step toward accuracy in the implemented system.
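As a hedged illustration of the developer-level reliability and accuracy checks described above, the sketch below turns one row of the step chart into an automated unit test: a known input must produce the expected output, and bad input must raise an informative error rather than being passed along. The function names (`parse_quantity`, `apply_discount`) and values are hypothetical and are not taken from any Smith project.

```python
import unittest

def parse_quantity(raw: str) -> int:
    """Convert user input to a quantity; reject bad data with an informative error."""
    try:
        value = int(raw)
    except ValueError:
        raise ValueError(f"Quantity must be a whole number, got {raw!r}") from None
    if value <= 0:
        raise ValueError(f"Quantity must be positive, got {value}")
    return value

def apply_discount(total: float, percent: float) -> float:
    """Transform step whose accuracy is verified against expected outputs."""
    return round(total * (1 - percent / 100), 2)

class DeveloperTests(unittest.TestCase):
    def test_accuracy_known_input_gives_expected_output(self):
        # Accuracy: a use case with a known expected result.
        self.assertEqual(apply_discount(200.00, 15), 170.00)

    def test_reliability_bad_input_raises_informative_error(self):
        # Reliability: bad data is rejected, not passed to the next process.
        with self.assertRaises(ValueError) as ctx:
            parse_quantity("three")
        self.assertIn("whole number", str(ctx.exception))

if __name__ == "__main__":
    unittest.main()
```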
Developer Performance Testing
Performance testing must be performed during the development phase to ensure that the system does not have any resource or application issues prior to delivery to the customer and that it conforms to the stated needs of the customer. At Smith, performance testing is performed during unit testing and during integration testing. With unit performance testing the developer is able to identify issues more easily than during integration testing. During the integration testing phase, any module interaction that could not be identified during unit testing will be tested. Performance testing includes the development of test cases that exercise each module within the client application; if resources such as network connectivity or file I/O are required, these systems will also be tested. The expected results of performance testing will be defined during the system analysis phase and will be approved by the client as acceptance criteria.

Developer Fault Testing
Developer fault testing is a technique used by Smith to inject errors into the software/hardware project to test the capability of the system to handle the errors in expected ways. This method of testing helps developers identify where the application cannot respond to system conditions that are expected during normal use of the product. By using fault testing, the developers can include processes that respond to these error conditions in a meaningful way.
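The sketch below illustrates both ideas in miniature: a performance check that asserts a module stays within an agreed time budget, and a fault-injection check that forces a dependency to fail and verifies the caller degrades gracefully. The module, the 500 ms budget, and the fallback message are assumptions made for illustration, not figures from an actual Smith project.

```python
import time
import unittest
from unittest import mock

def load_report(fetch=None):
    """Build a report from a data source; fall back to a safe message on failure."""
    fetch = fetch or (lambda: [1, 2, 3])
    try:
        rows = fetch()
    except ConnectionError:
        # Fault handling: respond to the injected error in a meaningful way.
        return "Report temporarily unavailable"
    return f"{len(rows)} rows loaded"

class DeveloperPerfAndFaultTests(unittest.TestCase):
    def test_performance_within_budget(self):
        # Performance: the module must finish within the assumed 500 ms budget.
        start = time.perf_counter()
        load_report()
        self.assertLess(time.perf_counter() - start, 0.5)

    def test_fault_injection_is_handled(self):
        # Fault testing: inject a connection failure and expect graceful handling.
        failing_fetch = mock.Mock(side_effect=ConnectionError("network down"))
        self.assertEqual(load_report(failing_fetch), "Report temporarily unavailable")

if __name__ == "__main__":
    unittest.main()
```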
User Acceptance Reliability
Like developer reliability testing, user acceptance testing (UAT) for reliability is important to ensure that the system will reliably handle inputs and outputs. In UAT, test cases will be provided to selected users from the customer’s side. These test cases will allow users to input information into the system so the software can demonstrate that it is capable of handling various types of input and executing on them properly. UAT will allow the customer to provide feedback to Smith on the test cases and on anything that did not perform as expected, so that Smith can make any needed modifications to the system. In the event that Smith has to make changes, UAT will begin again after the system has been adjusted.

User Acceptance Accuracy
User acceptance testing for accuracy will provide users with tests to ensure that the system is capable of handling inputs, loading, processing, storage, and outputs accurately. These tests should be consistent with the original testing that took place during development of the algorithms, to ensure that the system produces the expected results to the degree of accuracy required by the application. All calculations that the system needs to be able to perform should be tested in UAT to demonstrate to the customer that the system handles them correctly. This is the customer’s chance to fully test the system and provide feedback to Smith. The user tests will be developed to demonstrate the full capabilities of the system.

User Acceptance Fault Tolerance
Once the software has been designed and implemented, the end-users who will utilize the software will be able to test the software’s performance (User Acceptance Testing, 2010). Allowing the end-users to test the software will allow IT personnel to make any changes and fix bugs that might cause future errors. Although errors may keep parts of the software from being productive, Smith depends on test results and test reports to ensure that errors in the system are found and fixed. Below are a few tools that are used for user acceptance testing and fault tolerance:

* Test Results and Error Reports
* User Acceptance Sign-off (Figure 3)
* Production Systems
* Final Installation Procedures and Instructions
* Final Documentation and Training Materials
* Project Plan
* Methodology Compliance Form (User Acceptance Testing, 2010) (Figure 4)

Smith will use these tools to ensure the user interface is accurate and ready for deployment. Figures 3 and 4 are sample templates for some of the tools used.

Figure 3 – User Acceptance Sign-Off Sheet
Figure 4 – Methodology Compliance Form

Test System Infrastructure

Hardware/Software Capabilities
Smith uses QuickTest Professional 10 (QuickTest) software to perform testing on all of its software projects. The software is an automated testing environment that uses testing scripts to regression test all parts of a new software product. These testing scripts are created by a developer and ensure that, as new items are added, the previously tested items maintain their same functionality. Since the testing scripts are set up in a GUI environment that uses capture technology to generate them directly from the software environment being tested, the time required to create them is minimal and the expected results are easy to predict (HP, 2007). This ease of use helps control the testing timeline so that projects stay on track. In addition, QuickTest provides customizable reports about errors that can include screenshots and other information, making it easier for developers to recreate and resolve the errors (HP, 2007).
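QuickTest records and replays its scripts inside its own environment, so the snippet below is not QuickTest syntax. It is only a generic sketch of the regression idea described above: previously verified behaviours are kept as a suite of scripted checks and rerun whenever new items are added. The application functions named here (`login`, `add_item`) are hypothetical stand-ins.

```python
import unittest

# Hypothetical application functions under test (stand-ins for a real product).
def login(user: str, password: str) -> bool:
    return user == "demo" and password == "demo123"

def add_item(cart: list, item: str) -> list:
    return cart + [item]

class RegressionSuite(unittest.TestCase):
    """Checks captured from earlier releases; rerun after every change."""

    def test_login_still_accepts_valid_credentials(self):
        self.assertTrue(login("demo", "demo123"))

    def test_login_still_rejects_invalid_credentials(self):
        self.assertFalse(login("demo", "wrong"))

    def test_existing_cart_behaviour_is_unchanged(self):
        self.assertEqual(add_item(["pen"], "paper"), ["pen", "paper"])

if __name__ == "__main__":
    unittest.main()
```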
Smith maintains separate testing hardware to handle its testing needs. This environment utilizes quad-core Intel processors running Windows Vista, equipped with the maximum memory capacity so that testing can run as quickly as possible. The separate environment ensures that testing can run simultaneously with development whenever possible, so that the timeline set up for any project can be strictly adhered to.

Formal Program Specifications Format
The format of the specifications that will be used by the software engineers at Smith is detailed in the attachment named System Requirements Specification. This specification includes a complete description of the requirements of the system to be built. Detailed instructions are included to aid developers in defining the requirements and what will be done to satisfy each requirement.

Conclusion
Smith Consulting takes pride in providing professional development of systems for our clients. Having the processes and requirements outlined in this document in place, such as our quality assurance processes and our developer and user acceptance testing, ensures that the systems Smith Consulting builds remain within the scope of the project plan and within budget. Our processes also help to ensure that the client and key stakeholders are engaged throughout the development lifecycle.

References
Everett, G. D., & McLeod, R., Jr. (2007). Software Testing. Retrieved February 27, 2010, from University of Phoenix eCampus, Entire eBook. BSA385 – Introduction to Software Engineering.
Frenzel, C. W., & Frenzel, J. C. (2004). Management of Information Technology, 4E. Retrieved February 14, 2010, from University of Phoenix eCampus, Entire eBook. BSA385 – Introduction to Software Engineering.
Hewlett-Packard Development Company, L.P. (2007). HP QuickTest Professional software data sheet. Retrieved March 5, 2010, from https://h10078.www1.hp.com/cda/hpdc/navigation.do?action=downloadPDF&caid=3885&cp=54_4000_100&zn=bto&filename=4AA1-2116ENW.pdf
Hewlett-Packard Development Company, L.P. (2010). HP QuickTest Professional software system requirements. Retrieved March 5, 2010, from https://h10078.www1.hp.com/cda/hpms/display/main/hpms_content.jsp?zn=bto&cp=1-11-127-24^9674_4000_100__

Attachments

Purpose: The System Requirements Specification (SRS) is a complete description of the requirements of the system to be built. It is derived from the Customer Requirements. It covers all the business functions, inputs, outputs, and system interfaces of the proposed project, and answers these questions:
* What is the system or software supposed to do (from the customer’s point of view)?
* What users, system hardware, other hardware, and other software does the system interact with?
* What are the performance requirements, such as speed, recovery, and capacity?
* What are any constraints on design?

Scope: The System Requirements Specification must be completed for any systems development project.

Instructions: Identify instructions for using the template.
1. Prior to releasing, remove this template cover page. This is part of the template, not part of the finished document.
2. Angle brackets (< >) indicate information to be input for the specific project. Remove the angle brackets (< >) when the information is entered.
3. Template sections which do not apply to the system can be labeled as “Do not Apply” or removed from the document, as long as the base requirement of information listed above has been recorded.
4. Template instructions are italicized and should be removed from the document.
5. Open the header/footer and update the appropriate information in the header. No information needs to be updated in the footer – this will occur automatically each time the file is closed.

<Project Name> System Requirements Specification
Rev <1.0, 1.x, 2.0,> – Revision # of document. Use 0.1 thru 0.9 for pre-approval drafts. Use 1.0 thru 9.9 for approved copies.
<Date> – Date of revision
Prepared by: <Author>

Approvers
<Include a place for, and acquire approval by, all critical project stakeholders, as required by the Software Development Guidelines. More approvals may be included as deemed appropriate.>
The following “Approvers” are responsible for reviewing this System Requirements Specification and agree with the project’s requirements. The approvers understand and will support the responsibilities described herein for their organization. Note: Approver signatures are captured electronically in the Electronic Qualification Document Management System (EQDMS).

<Name> | Project Lead
<Name> |
<Name> | <other reviewer>
<Name> | <Key Stakeholder #1> <Title>
<Name> | <Key Stakeholder #2> <Title>

Document History
Date Revised | Version No. | Author | Reason for changes
 | | |
 | | |
 | | |

Contents: Introduction; Purpose; Scope; Definitions, Acronyms, and Abbreviations; References; Overview; Overall Description; System Perspective; System Requirements; System Interfaces; User Interfaces; Hardware Interfaces; Software Interfaces; Communications Interfaces; Memory Constraints; Operations; Site Adaptation Requirements; System Functions; User Characteristics; Constraints; Assumptions and Dependencies; Apportioning of Requirements; Functional Requirements; Performance Requirements; Logical Database Requirements; Design Constraints; Standard Compliance; Software System Attributes; Supporting Information

Introduction

Purpose
The SRS identifies all of the system requirements. The system requirements are derived from customer requirements as well as perceived customer needs and specific local and regulatory requirements. The SRS identifies all the system requirements in sufficient detail for the developers to develop a system which meets customer expectations. In addition, the SRS provides sufficient detail for complete system validation. The audience is the entire project team and customer/sponsor representatives.

Scope
<Identify the software product(s) to be produced by name, explain what the software product will and will not do, and include relevant benefits, objectives, and goals of the software.>

Definitions, Acronyms, and Abbreviations
<Define all terms, acronyms, and abbreviations.>

References
<Provide a complete list of all documents referenced elsewhere in this document; identify each document by title, date, and publishing organization; specify the sources from which the references can be obtained, whether an appendix or another document.>

Overview
<Describe what the SRS contains and explain how the SRS is organized.>

Overall Description

System Perspective
<Describe whether the system is totally self-contained or has interactions with other systems within or outside of its environment; a block diagram can be added here to show interconnections with other systems and requirements related to overall systems.>

System Requirements
<The purpose of this section is to describe all of the software requirements to a level of detail sufficient to enable designers to design a system to satisfy those requirements and QA testers to test that the system satisfies those requirements. As a minimum, every requirement should include a description of every input/stimulus into the system and every output/response from the system, and all functions performed by the system in response to an input or in support of an output.>
<The following sub-sections 3.1 – 3.15 identify different requirements categories. It is unlikely that every project will have requirements in each category. It is not necessary to identify requirements in each category; they are provided only as guidance to ensure that each type of requirement is considered. All requirements should be listed in section 3 and each requirement should be uniquely numbered.>

System Interfaces
<List each system interface and identify the functionality of the software to accomplish the system requirement and the interface description to match the system.>

* User Interfaces
<Specify the logical interface between the software product and its users, including configuration characteristics such as required screen formats, page or window layouts, content of reports or menus, or availability of programmable function keys necessary to accomplish the software requirements. Specify the aspects of optimizing the interface with the person(s) who must use the system; an example would be constructing a list of do’s and don’ts on how the system will appear to the user.>
* Hardware Interfaces
<Specify the logical characteristics of each interface between the software product and the hardware components of the system. This includes configuration characteristics, supported devices and how they will be supported (full-screen vs. line-by-line support for a terminal, for example), and protocols.>

* Software Interfaces
<Describe the use of other required software products and interfaces with other application systems. Describe the purpose of the interfacing software and the definition of the interface in terms of message content and format. Reference the document(s) defining the interface(s). Include name, mnemonic, specification number, version number, and source for each required software product.>

* Communications Interfaces
<Specify the various interfaces to communications such as local network protocols, etc.>

Memory Constraints
<Specify any applicable characteristics and limits on primary and secondary memory.>

Operations
<List the following if not already listed in the User Interfaces section above: the various modes of operation in the user organization (user-initiated operation), periods of interactive operation and periods of unattended operation, data processing support functions, and backup and recovery operations.>

Site Adaptation Requirements
<Define the requirements for any data or initialization sequences that are specific to a given site, mission, or operational mode (grid values, safety limits, etc.), and the site- or mission-related features that should be modified to adapt the software to a particular installation.>

System Functions
<Provide a summary of the major functions that the software will perform. The functions should be organized in a way that makes the list of functions understandable to the customer or to anyone else reading the document for the first time. Graphics can be used to show the different functions and their relationships, and the logical relationships among variables.>

User Characteristics
<Describe the most general characteristics of the intended users of the product, including educational level, experience, and technical expertise.>

Constraints
<Describe any items that will limit the developer’s options, such as regulatory policies, hardware limitations, interfaces to other applications, parallel operation, audit functions, control functions, higher-order language requirements, signal handshake protocols, reliability requirements, criticality of the application, and safety and security considerations.>

Assumptions and Dependencies
<List any changes to the software that can affect the requirements listed in this document and could result in an update of this software requirements specification. An example is the availability of an operating system on a specific hardware environment that is designated for the software product.>

Apportioning of Requirements
<Identify requirements that may be delayed until future versions of the system.>

Functional Requirements
<Describe the fundamental actions that must take place in the software in accepting and processing the inputs and in processing and generating the outputs. Examples are validity checks on the inputs, exact sequence of operations, responses to abnormal situations, effect of parameters, and the relationship of outputs to inputs, including input/output sequences and formulas for input-to-output conversion. Partition the functional requirements into sub-functions as necessary.>
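As an illustration of the kind of functional requirement this section calls for (a validity check on an input with a defined response to an abnormal situation), the sketch below implements one hypothetical, uniquely numbered requirement. The requirement number (FR-3.2.1), field, and limits are invented purely to show the format.

```python
# FR-3.2.1 (hypothetical): The system shall accept an order quantity between
# 1 and 999; any other value shall be rejected with a message naming the field.

def validate_order_quantity(raw: str) -> int:
    """Validity check implementing hypothetical requirement FR-3.2.1."""
    if not raw.strip().isdigit():
        raise ValueError("Order quantity: value must be a whole number")
    quantity = int(raw)
    if not 1 <= quantity <= 999:
        raise ValueError("Order quantity: value must be between 1 and 999")
    return quantity

# Relationship of output to input: valid input passes through unchanged.
assert validate_order_quantity(" 42 ") == 42
```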
Performance Requirements
<List requirements in measurable terms related to the following:
* Static numerical requirements, such as the number of terminals to be supported, the number of simultaneous users to be supported, and the amount and type of information to be handled, and
* Dynamic numerical requirements, such as the number of transactions and tasks and the amount of data to be processed within certain time periods for both normal and peak workload conditions.>

Logical Database Requirements
<Specify the logical requirements for any information that is to be placed into a database, such as the types of information used by various functions, frequency of use, accessing capabilities, data entities and their relationships, integrity constraints, and data retention requirements.>

Design Constraints
<Specify design constraints that can be imposed by other standards, hardware limitations, etc.>

Standard Compliance
<Specify requirements derived from existing standards or regulations.>

Software System Attributes
<Describe other software attributes that can serve as requirements, such as factors required to establish reliability, availability, security,
