a. What You Need To Know
Naming conventions
A naming convention is a consistent set of rules used to name variables, functions, methods, classes, objects and other solution elements.
Good naming makes code easier to read, test and maintain.
Camel case
Camel case joins words together with no spaces, and each word after the first begins with a capital letter.
Examples:
- studentName
- calculateTotal
- currentScore
Snake case
Snake case uses underscores to separate words.
Examples:
- student_name
- calculate_total
- current_score
Hungarian notation
Hungarian notation adds a prefix to show the data type or structure before the main name.
Examples:
- iNumStudents
- strFirstName
- arrScores
Key Point
The best names are short, meaningful and consistent. A good name should help someone understand purpose quickly.
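The three conventions can be contrasted in one short sketch (the names and values are invented for illustration):

```python
# Camel case: common in Java, JavaScript and C#
studentName = "Ava"
currentScore = 87

# Snake case: the standard style for Python variables and functions
student_name = "Ava"
def calculate_total(scores):
    return sum(scores)

# Hungarian notation: a type prefix before the main name
iNumStudents = 25         # i   -> integer
strFirstName = "Ava"      # str -> string
arrScores = [87, 92, 78]  # arr -> array/list
```

Whichever convention you pick, use it consistently across the whole solution.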
Internal documentation
Internal documentation means the comments and notes written inside source code.
Its purpose is to help a programmer understand how the code works, why certain decisions were made, and how the code can be maintained later.
Useful internal documentation can include:
- a header comment at the top of the file
- comments explaining functions, methods, inputs and outputs
- comments clarifying a complex section of logic
- notes about testing, upgrades or changes
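The elements above might look like this in practice. The file, author and grading rules here are all hypothetical, sketched only to show where each kind of comment sits:

```python
# -----------------------------------------------------------
# File:    grade_report.py   (hypothetical example)
# Author:  J. Citizen
# Purpose: Convert raw scores into letter grades for a report.
# History: v1.1 - boundary for a B lowered from 75 to 70
# -----------------------------------------------------------

def letter_grade(score):
    """Return a letter grade for a score in the range 0-100.

    Input:  score - an integer percentage
    Output: a one-character grade string
    """
    # Bands are checked from highest down, so each branch is exclusive.
    if score >= 80:
        return "A"
    elif score >= 70:
        return "B"
    elif score >= 50:
        return "C"
    return "N"
```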
What good comments do
Good comments:
- add meaning that the code itself does not already make obvious
- help future maintenance
- reduce time spent re-reading complex logic
What poor comments do
Poor comments simply restate the code in words.
Common Error
Comments should be meaningful and non-trivial. Writing x = x + 1 // add 1 to x adds almost no value.
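The difference is easiest to see side by side. Both comments below describe the same invented line of stock-handling code:

```python
stock_count = 10

# Poor: restates the code in words
stock_count = stock_count + 1  # add 1 to stock_count

# Better: explains why the change happens
stock_count = stock_count + 1  # a returned item goes back into stock
```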
Testing
Testing checks whether software behaves as expected.
Testing is not just about trying the program once. It is a systematic process that involves:
- choosing test data
- predicting expected results
- running the program
- recording actual results
- fixing errors when results do not match
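The steps above can be sketched as a small table of test cases run against expected results. The discount function and its 10%-off rule are invented for illustration:

```python
def discount(total):
    """Hypothetical rule: 10% off orders of $100 or more."""
    if total >= 100:
        return total * 0.9
    return total

# Each test case pairs chosen test data with a predicted result.
test_cases = [
    (50, 50),      # below the threshold: no discount
    (100, 90.0),   # exactly on the threshold: discount applies
    (200, 180.0),  # well above the threshold
]

for data, expected in test_cases:
    actual = discount(data)
    status = "PASS" if actual == expected else "FAIL"
    print(f"input={data} expected={expected} actual={actual} {status}")
```

A FAIL row tells you exactly which input to investigate when fixing the error.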
Test cases
A test case is a planned set of steps used to check whether part of the software works correctly.
Test data
Test data is the data chosen for those tests.
Expected and actual results
Expected results are what should happen if the algorithm is correct. Actual results are what the program really produces during testing.
Boundary values
Boundary testing uses values at, below and above important limits. It is especially useful when range checking is part of the solution.
Examples:
- one below the minimum
- exactly on the minimum
- exactly on the maximum
- one above the maximum
Boundary Testing
If a condition checks a boundary, do not only test the middle of the range. Test just outside and just inside the boundary as well.
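For a hypothetical range check that accepts ages 18 to 65 inclusive, the four boundary values look like this:

```python
def valid_age(age):
    """Hypothetical rule: accept ages from 18 to 65 inclusive."""
    return 18 <= age <= 65

# Boundary test data: just outside and just inside each limit
boundary_tests = [
    (17, False),  # one below the minimum
    (18, True),   # exactly on the minimum
    (65, True),   # exactly on the maximum
    (66, False),  # one above the maximum
]

for age, expected in boundary_tests:
    assert valid_age(age) == expected, f"failed at age {age}"
```

An off-by-one mistake such as writing `<` instead of `<=` would pass middle-of-range tests but fail at age 65.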
Debugging
Debugging is the process of finding and fixing errors in software.
Breakpoints
Breakpoints pause a program at a selected line so the programmer can inspect variable values and program state.
Debugging statements
Debugging statements are extra lines, such as print() statements, added temporarily to show values or track the flow of execution.
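A sketch of temporary print() statements tracking values through a loop (the averaging code is invented; the DEBUG lines would be removed before submission):

```python
scores = [70, 85, 90]
total = 0

for score in scores:
    total = total + score
    print("DEBUG: score =", score, "running total =", total)  # temporary

average = total / len(scores)
print("DEBUG: average =", average)  # temporary
```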
Trace tables
Trace tables simulate the flow of execution step by step. They are especially useful for finding logic errors in algorithms.
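A trace table for a short loop records each variable after every step. The loop below sums 1 to 4; here the rows are collected in code so they can be printed, though in practice a trace table is usually filled in by hand:

```python
total = 0
trace = []  # each row records (i, total) after the loop body runs

for i in range(1, 5):
    total = total + i
    trace.append((i, total))

print("i | total")
for i, t in trace:
    print(f"{i} | {t}")
```

Comparing each row against the values you predicted by hand reveals exactly where a logic error first appears.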
Types of errors
Syntax errors
Syntax errors break the rules of the programming language. These usually stop the program from compiling or running.
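In Python, a missing colon after a condition is a typical syntax error. The broken line is kept inside a string here so that this sketch itself still runs:

```python
bad_source = "if x > 0 print(x)"  # missing the colon after the condition

try:
    compile(bad_source, "<example>", "exec")
except SyntaxError as err:
    print("Syntax error reported:", err.msg)
```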
Runtime errors
Runtime errors happen while the program is running. They often cause crashes or unexpected error messages.
Examples include:
- divide-by-zero
- opening a file that does not exist
- invalid input not being handled properly
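Two of the runtime errors above, sketched in Python along with the exception each one raises:

```python
# Divide-by-zero raises ZeroDivisionError while the program is running
try:
    result = 10 / 0
except ZeroDivisionError:
    print("Cannot divide by zero")

# Opening a file that does not exist raises FileNotFoundError
try:
    with open("no_such_file.txt") as f:
        data = f.read()
except FileNotFoundError:
    print("File not found")
```

Without the try/except handling, either error would crash the program at that line.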
Logic errors
Logic errors happen when the code runs, but the output is wrong because the algorithm or condition is incorrect.
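A sketch of a logic error, using an invented pass rule: the code runs without crashing, but `>` should have been `>=`, so a mark of exactly 50 is wrongly reported as a fail:

```python
def has_passed(mark):
    # Intended rule: 50 or more is a pass.
    # Logic error: '>' excludes 50 itself, so has_passed(50) is False.
    return mark > 50

print(has_passed(50))  # False, but the expected result is True
```

Note that a boundary test at exactly 50 would catch this, while a test in the middle of the range (say 70) would not.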
Distinguishing the three
Syntax errors stop code from working at all. Runtime errors happen during execution. Logic errors let the program run, but produce incorrect behaviour.
Why this section matters
A working program is not automatically a good program. It also needs to be readable, testable and maintainable.
Strong students do not just write code and hope. They:
- name things clearly
- document important logic
- choose test data on purpose
- debug systematically
What this means for your folio
In a folio task, you should be able to:
- apply a consistent naming convention
- include useful internal documentation
- produce test data and expected results
- explain how you found and fixed errors
That process evidence is often just as important as the final output.