Task 2

Identifying and Fixing Defects

Systematically test existing code against requirements using comprehensive test data categories, identify defects through evidence-based analysis, and implement quality fixes.

Assessment Objective: AO2 (Application) - 41%
Estimated Time: 5 hours
Difficulty Level: Challenging

Task Requirements

Core Task Description

Assess the given code against the stated requirements. Design and run tests with clearly identified test data from five categories: valid, valid extreme, invalid, invalid extreme, and erroneous. Record expected versus actual results for every test.

Correct errors and adjust the code so it meets the brief while following sound programming conventions.
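The expected-versus-actual recording loop can be sketched as follows. This is a minimal illustration, not the assessed code: `percentage_to_grade` is a hypothetical function standing in for whatever code the brief provides, and the record fields are one reasonable layout, not a mandated format.

```python
def percentage_to_grade(score):
    """Hypothetical code under test: map a percentage (0-100) to a letter grade."""
    if not isinstance(score, (int, float)) or score < 0 or score > 100:
        raise ValueError("score must be a number between 0 and 100")
    if score >= 70:
        return "A"
    if score >= 50:
        return "B"
    return "C"

def run_test(test_id, category, data, expected):
    """Run one test case and record expected vs actual results."""
    try:
        actual = percentage_to_grade(data)
    except Exception as exc:
        # For tests that should be rejected, record the exception type as the outcome
        actual = type(exc).__name__
    outcome = "PASS" if actual == expected else "FAIL"
    return {"id": test_id, "category": category, "data": data,
            "expected": expected, "actual": actual, "result": outcome}

records = [
    run_test("T1", "valid", 65, "B"),
    run_test("T2", "valid extreme", 100, "A"),
    run_test("T3", "invalid", -5, "ValueError"),
    run_test("T4", "erroneous", "abc", "ValueError"),
]
for record in records:
    print(record)
```

Keeping each record as structured data like this makes it straightforward to transfer results into the annotated test record deliverable.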

Learning Objectives

Analyze existing code against stated requirements systematically
Design comprehensive test cases with appropriate test data categories
Execute tests methodically and document results accurately
Identify defects through logical analysis and testing evidence
Apply appropriate debugging techniques to locate root causes
Implement fixes following sound programming conventions

Time Management

Code Analysis: 45 min
Test Design: 60 min
Test Execution: 90 min
Defect Analysis: 30 min
Code Correction: 75 min

Total Estimated: 5 hours

Prerequisites

Code reading and analysis skills
Testing methodologies
Debugging techniques
Programming conventions

Test Data Categories (Required)

1. Valid Data

Normal input that should be accepted and processed correctly

Example Test Cases:

  • Typical user inputs
  • Standard data formats
  • Expected value ranges

Purpose:

Verify normal functionality works as intended

2. Valid Extreme

Valid input at the boundaries of acceptable ranges

Example Test Cases:

  • Maximum/minimum values
  • Empty strings (if valid)
  • Boundary conditions

Purpose:

Test edge cases within valid parameters

3. Invalid Data

Input that violates business rules but is structurally correct

Example Test Cases:

  • Negative ages
  • Future birth dates
  • Duplicate IDs

Purpose:

Verify appropriate error handling for rule violations

4. Invalid Extreme

Input that exceeds system boundaries or formats

Example Test Cases:

  • Overly long strings
  • Numbers beyond data type limits
  • Special characters

Purpose:

Test system robustness against extreme invalid input

5. Erroneous Data

Input that is fundamentally malformed or corrupted

Example Test Cases:

  • Non-numeric text in number fields
  • Malformed dates
  • SQL injection attempts

Purpose:

Verify security and stability against malformed input
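The five categories above can be illustrated against a single hypothetical function. The example below assumes a `validate_age` function (not part of the brief) that accepts integer ages from 0 to 120; the point is how one value from each category probes different behaviour.

```python
def validate_age(value):
    """Hypothetical code under test: accept integer ages 0-120, reject anything else."""
    if not isinstance(value, int) or isinstance(value, bool):
        raise TypeError("age must be an integer")
    if value < 0 or value > 120:
        raise ValueError("age out of range")
    return value

# One representative value per required test data category
test_data = {
    "valid":           25,             # typical, expected input
    "valid extreme":   0,              # boundary of the accepted range
    "invalid":         -1,             # structurally correct, breaks the business rule
    "invalid extreme": 10**9,          # far beyond any plausible age
    "erroneous":       "twenty-one",   # fundamentally the wrong type
}

for category, value in test_data.items():
    try:
        result = validate_age(value)
        print(f"{category:15} -> accepted: {result}")
    except (TypeError, ValueError) as exc:
        print(f"{category:15} -> rejected: {type(exc).__name__}")
```

A well-chosen set like this exercises the normal path, both boundaries, rule enforcement, robustness, and type handling with only five values.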

Required Deliverables

Annotated Test Records (60% of marks)

Comprehensive test documentation showing all test cases, data, expected results, actual results, and analysis

Format: Spreadsheet or structured document with clear annotations
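A spreadsheet-style test record can be generated directly from test results. The column names below are a suggested layout, not a mandated one; the sample rows are illustrative, including a deliberately failing row annotated with a defect reference.

```python
import csv
import io

# Suggested columns for an annotated test record (an assumption, not a required format)
COLUMNS = ["Test ID", "Category", "Input Data", "Expected Result",
           "Actual Result", "Pass/Fail", "Notes"]

rows = [
    ["T1", "Valid", "25", "accepted", "accepted", "Pass", "typical input"],
    ["T2", "Invalid", "-1", "rejected", "accepted", "Fail",
     "missing range check -> defect D1"],
]

# Write to an in-memory buffer; swap io.StringIO for open("records.csv", "w")
# to produce a file you can open in a spreadsheet application
buffer = io.StringIO()
writer = csv.writer(buffer)
writer.writerow(COLUMNS)
writer.writerows(rows)
print(buffer.getvalue())
```

The Notes column is where annotation happens: link each failing row to the defect it revealed so the examiner can follow the evidence chain.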

Corrected Code (40% of marks)

Modified source code with all identified defects fixed and improvements implemented

Format: Source code files with clear comments explaining changes made

Success Criteria & Assessment

Test Design & Execution

Comprehensive test cases covering all 5 data categories
Clear identification of expected vs actual results
Systematic testing approach with documented methodology
Accurate recording of test outcomes and observations

Defect Identification

Correct identification of all defects present in the code
Clear documentation of symptoms and root causes
Evidence-based analysis linking tests to defects found
Prioritization of defects by severity and impact

Code Correction

All identified defects properly fixed
Solutions maintain or improve code quality
Adherence to programming conventions and best practices
Verification that fixes resolve issues without creating new ones
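Verifying that a fix resolves the issue without creating new ones means re-running the whole suite, not just the test that originally failed. Here is a minimal sketch of that check; `clamp_percent_buggy` and `clamp_percent_fixed` are hypothetical before/after versions of a function with a missing upper-bound check.

```python
def clamp_percent_buggy(value):
    # Defect: the upper bound is never applied
    return max(0, value)

def clamp_percent_fixed(value):
    # Fix: clamp at both ends of the 0-100 range
    return max(0, min(100, value))

# (input, expected) pairs spanning normal values and both boundaries
suite = [(-10, 0), (0, 0), (50, 50), (100, 100), (150, 100)]

def failures(func):
    """Return the (input, expected, actual) triples where func disagrees with the suite."""
    return [(data, expected, func(data))
            for data, expected in suite
            if func(data) != expected]

print("failures before fix:", len(failures(clamp_percent_buggy)))
print("failures after fix:", len(failures(clamp_percent_fixed)))
```

Because the same suite runs against both versions, the output demonstrates both that the defect is fixed and that no previously passing case has regressed.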

Debugging Strategies

Systematic Analysis

Work through code line by line, understanding logic flow

Best used: For complex logic errors or when multiple issues are suspected

Test-Driven Investigation

Use failing test cases to isolate problem areas

Best used: When tests clearly identify specific failure points

Trace Execution

Follow data flow and variable states through execution

Best used: For data handling or calculation errors

Boundary Analysis

Focus on edge cases and input validation logic

Best used: When extreme value tests are failing

Requirements Comparison

Compare code behavior against original specifications

Best used: To identify logic that doesn't match intended functionality
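The trace execution strategy can be as simple as printing intermediate state at each step. The sketch below uses a hypothetical `average_scores` function with an optional trace flag; in practice the same effect can be achieved with a debugger or the `logging` module.

```python
def average_scores(scores, trace=False):
    """Hypothetical function under investigation: mean of a list of scores."""
    total = 0
    for i, score in enumerate(scores):
        total += score
        if trace:
            # Show the variable state at every step so a wrong intermediate
            # value pinpoints exactly where the calculation goes off track
            print(f"step {i}: score={score}, running total={total}")
    result = total / len(scores)
    if trace:
        print(f"final: total={total}, count={len(scores)}, average={result}")
    return result

average_scores([10, 20, 30], trace=True)
```

Comparing the traced values against hand-calculated expected values at each step turns a vague "wrong answer" symptom into a specific faulty line.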

Preparation Steps

1. Review the provided code thoroughly before starting testing
2. Understand the stated requirements and success criteria
3. Plan your testing approach and data categories systematically
4. Prepare test data examples for each of the 5 categories
5. Set up proper documentation templates for recording results
6. Review debugging techniques and tools available to you
7. Practice reading and analyzing similar code examples

Common Mistakes to Avoid

Incomplete test coverage that misses one or more data categories
Poor test data selection that doesn't effectively reveal defects
Inadequate documentation of expected vs actual results
Fixing symptoms rather than addressing root causes
Introducing new bugs while fixing existing ones
Poor annotation and explanation of testing process
Failure to verify that fixes actually resolve the issues
Not following consistent coding conventions in corrections

Recommended Testing Process

Phase 1: Analysis & Planning

  1. Read and understand the code requirements thoroughly
  2. Analyze the provided code structure and logic flow
  3. Design test cases for all 5 data categories
  4. Prepare expected results for each test case

Phase 2: Execution & Analysis

  5. Execute tests systematically and record actual results
  6. Compare expected vs actual results for each test
  7. Identify and document all defects found
  8. Analyze root causes and plan corrections