A-level Computing/AQA/Paper 1/Systematic approach to problem solving

Analysis
Analysis of the system to identify the requirements and define the problem being solved.

For example, the construction of a website could cover:

 * Data - its origin, uses, volumes and characteristics
 * Procedures - what is done, where, when and how, and how errors and exceptions are handled
 * Future - development plans and expected growth rates
 * Problems with any existing system

In the case of a different type of problem such as a simulation or game, the requirements will still need to cover a similar set of considerations.

Design
When designing the system, some or all of the following should be taken into account:
 * Processing: documenting and creating the algorithms and an appropriate modular structure for the solution
 * Data structures: how data will be held and how it will be accessed - for example in a dynamic structure such as a queue or tree, or in a file or database
 * Output: content, format, sequence, frequency, medium etc.
 * Input: volume, frequency, documents used, input methods
 * User interface: screens and dialogues, menus, special-purpose requirements
 * Security: how the data is to be kept secure from accidental corruption or deliberate tampering or hacking
 * Hardware: selection of an appropriate configuration
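
The data-structures decision above can be illustrated with a short sketch. This hypothetical example assumes a system that must hold pending print jobs; because the number of jobs varies at runtime and jobs should be handled in the order they arrive, a dynamic queue structure is a natural fit:

```python
from collections import deque

# Hypothetical design decision: hold pending print jobs in a queue,
# a dynamic structure that grows and shrinks at runtime and processes
# items first-in, first-out (FIFO).

jobs = deque()              # empty queue
jobs.append("report.pdf")   # enqueue at the rear
jobs.append("invoice.pdf")

first = jobs.popleft()      # dequeue from the front
print(first)                # prints "report.pdf" - the earliest job
```

A stack, tree, file or database table could equally be chosen at this stage; the point of the design phase is to document which structure is used and why, before any code is written.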

Implementation
Once the design has been agreed, the programs can be coded. A clear focus needs to be maintained on the ultimate goal of the project, without users or programmers being sidetracked into creating extra features which might be useful, or possible future requirements. Programmers will need to be flexible in accepting user feedback and making changes to their programs as problems or design flaws are detected. In even a moderately complex system it is hard to envision how everything will work together, so iterative changes at every stage are a normal part of a prototyping/agile approach.

Testing
Testing is carried out at each stage of the development process. Once all the programs have been tested with normal, boundary and erroneous data, unit testing, module testing and system testing will also be carried out. Testing is an iterative process: each stage in the test process is repeated when modifications have to be made owing to errors coming to light at a subsequent stage. The system then needs to be tested by the user to ensure that it meets the specification. This is known as acceptance testing, which involves testing with data supplied by the end user rather than data designed especially for testing purposes. It has the following objectives:
 * to confirm that the system delivered meets the original customer specifications
 * to find out whether any major changes in operating procedures will be needed
 * to test the system in the environment in which it will run, with realistic volumes of data
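
The three categories of test data mentioned above can be sketched against a hypothetical validation function. Here it is assumed the system accepts exam marks as whole numbers from 0 to 100:

```python
def validate_mark(mark):
    """Return True if mark is a whole number between 0 and 100 inclusive."""
    return isinstance(mark, int) and 0 <= mark <= 100

# Normal data: typical values the system is expected to accept
assert validate_mark(57) is True

# Boundary data: values at the edges of the valid range,
# where off-by-one errors are most likely
assert validate_mark(0) is True
assert validate_mark(100) is True
assert validate_mark(101) is False

# Erroneous data: values the system should reject outright
assert validate_mark(-5) is False
assert validate_mark("seventy") is False
```

Each unit of the system would have a test plan of this kind; module and system testing then check that the tested units also work correctly in combination.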

Evaluation
The evaluation may include a post-implementation review, which is a critical examination of the system three to six months after it has been put into operation. This waiting period allows users and technical staff to learn how to use the system, get used to new ways of working and understand the new procedures required. It gives management a chance to evaluate the usefulness of the reports and on-line queries that they can make, and to go through several 'month-end' periods when various routine reports will be produced. Shortcomings of the system, if there are any, will become apparent at all levels of the organisation, and users will want a chance to air their views and discuss improvements. The solution should be evaluated on the basis of effectiveness, usability and maintainability. The post-implementation review will focus on the following:
 * a comparison of the system's actual performance with the anticipated performance objectives
 * an assessment of each aspect of the system against preset criteria
 * errors which were made during system development
 * unexpected benefits and problems