This blog post offers tips for good manual testing so that companies can release quality products. The idea is also to combine these manual tests with their documentation and their eventual automation.
What is manual testing?
Since tests are not optional, manual testing should verify that:
- The product is easy to use.
- The product is as free of errors as possible.
- The product tolerates user errors gracefully.
- The product provides self-help.
- The product enforces the validations the client requires.
- The product works as the customer requires.
- The product can be extended and shut down by modules.
- The product can be migrated over time.
- The product ships with its usage documentation.
How to perform a good manual test
Know the product logic
Learn how the product works by reading the input documentation, and hold meetings to resolve any doubts.
Know the product form structure
Learn, through the technical input documentation, about the forms the project will develop, and hold meetings to answer questions.
Know the database schema
Get to know, through the technical input documentation, the database and tables the project's development will touch, and hold meetings to resolve any doubts.
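To make this concrete, part of the schema review can be scripted. A minimal sketch using an in-memory SQLite database; the `channels` table and its columns are assumptions for illustration:

```python
import sqlite3

# Hypothetical database standing in for the project's real schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE channels (id INTEGER PRIMARY KEY, name TEXT NOT NULL)")

# List every table, then its columns, so the tester knows
# which rows each flow under test should affect.
tables = [row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'")]
for table in tables:
    columns = [col[1] for col in conn.execute(f"PRAGMA table_info({table})")]
    print(table, columns)
```

Running the same inspection against the real test database gives a checklist of tables to verify after each flow.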
Follow the logs of the development flow
Review the logs of the development flow. Knowing the logs helps us tell the developer where an error may or may not be occurring, which speeds up the developer's analysis when searching for the error. It also lets us confirm that the flow verification test is correct.
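Part of this log review can be scripted so the tester can point the developer straight at the failing step. A minimal sketch, assuming a simple `timestamp LEVEL message` log format:

```python
# Hypothetical development-flow log; the format is an assumption.
sample_log = """\
2024-05-01 10:00:01 INFO form submitted
2024-05-01 10:00:02 INFO validation started
2024-05-01 10:00:03 ERROR channel name exceeds max length
2024-05-01 10:00:04 INFO rollback complete
"""

def find_errors(log_text):
    """Return (line_number, line) pairs for every ERROR entry."""
    return [(n, line)
            for n, line in enumerate(log_text.splitlines(), start=1)
            if " ERROR " in line]

for number, line in find_errors(sample_log):
    print(f"line {number}: {line}")
```

Handing the developer the exact line numbers shortens the analysis time the section above describes.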
Note: During the manual testing process, you may find improvements or possible improvements for the product. These should be documented for future analysis and implementation.
The test environment should be as close as possible to what developers deploy in production. This leads to more reliable tests, helps us avoid constant rollbacks after delivering the product, and prevents bugs and even possible penalties.
With a good understanding of the business logic, development flow, forms, and database schema, we can create good test cases.
Note: Test cases must be created before the end of development.
Test cases that go the right and wrong way
Different types of validations must be made to the form as indicated below.
- Verify that the system works correctly as stated in the documentation: enter correct data in the form, validate the proper flow through the log, and check the impact on the corresponding tables
- Data type validation (integers, strings, booleans, etc.)
- Character limit validation (min and max)
- Validate spelling in the form
- Validate uniformity with the other application forms
- Test with all fields empty
- Perform load tests (concurrent, sequential)
- Perform stress tests
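The right-path and wrong-path validations above can also be expressed as automated checks. A minimal sketch; the field names and limits (a 3-30 character channel name, an integer channel number) are assumptions:

```python
def validate_channel_form(data):
    """Return a list of validation errors for a hypothetical channel form."""
    errors = []
    name = data.get("name")
    if not isinstance(name, str):
        errors.append("name: must be a string")      # data type validation
    elif not name.strip():
        errors.append("name: must not be empty")     # empty-field test
    elif not 3 <= len(name) <= 30:
        errors.append("name: must be 3-30 characters")  # min/max limit
    number = data.get("number")
    if not isinstance(number, int):
        errors.append("number: must be an integer")  # data type validation
    return errors

# Right path: valid data produces no errors.
assert validate_channel_form({"name": "News 24", "number": 7}) == []
# Wrong path: bad types and an empty field are reported.
assert validate_channel_form({"name": 42, "number": "7"}) == [
    "name: must be a string", "number: must be an integer"]
assert validate_channel_form({"name": "", "number": 7}) == [
    "name: must not be empty"]
```

Each bullet in the list maps to one branch in the validator, so the manual checklist and its future automation stay aligned.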
Document the manual test
To document the manual tests, we need a dedicated tool; for example, we can use Jira with the Zephyr plugin.
Ideally, the input documentation indicates where and how the project will be developed technically. With the test cases for the manual tests defined, we proceed to document the use cases. For this, we must define a naming convention, for example:
CU: use case
AP: application (the product or module under test)
NameTest: test name
NumCaseUse: use case number
The complete name would be CU-AP-NameTest-NumCaseUse.
As a real example, we will take the use case “CU-Television-ChannelHigh-0001”.
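A small helper can keep this nomenclature consistent across test cases. A sketch, assuming the number is zero-padded to four digits as in the example:

```python
def use_case_id(app, test_name, number):
    """Build a use-case name following CU-<App>-<TestName>-<Number>."""
    return f"CU-{app}-{test_name}-{number:04d}"

print(use_case_id("Television", "ChannelHigh", 1))
```

Generating the names instead of typing them avoids typos that would break searches in the test-management tool.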
The structure of a use case must include at least the following:
- Who reports it
- Who it is for
- Development Version
- Acceptance requirements
- Test steps and results (test screenshots can be added here)
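The structure above can also be captured in code so every use case is documented uniformly. A sketch; the field values shown are illustrative assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class UseCase:
    """Mirrors the minimum use-case structure listed above."""
    identifier: str                                   # e.g. CU-Television-ChannelHigh-0001
    reported_by: str                                  # who reports it
    assigned_to: str                                  # who it is for
    version: str                                      # development version
    acceptance: list = field(default_factory=list)    # acceptance requirements
    steps: list = field(default_factory=list)         # test steps and results

case = UseCase(
    identifier="CU-Television-ChannelHigh-0001",
    reported_by="QA team",
    assigned_to="TV module team",
    version="1.4.0",
    acceptance=["Channel is created and visible in the grid"],
    steps=["Open the channel form", "Submit valid data",
           "Verify the log and the table row"],
)
print(case.identifier)
```

A structure like this exports cleanly to a tool such as Jira with Zephyr, keeping manual documentation and future automation in sync.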
While executing the manual tests, we may find errors of different types. When someone finds an error in the application, a ticket must be created to resolve it. At a minimum, the ticket must contain:
- A precise detail of the error
- How to reproduce the error
- Log or screens with evidence of the error
Note: The error is debugged together with the technical leader and its criticality is validated. The resolution priority is set according to that criticality; if the priority is high, a developer must be assigned to fix the bug, and the hours needed for the correction and a delivery date must be estimated.
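The triage rule in this note can be sketched as a simple mapping from criticality to priority; the mapping values and ticket field names are assumptions for illustration:

```python
# Hypothetical mapping: lower number means higher resolution priority.
CRITICALITY_TO_PRIORITY = {"high": 1, "medium": 2, "low": 3}

def triage(ticket):
    """Attach a resolution priority and flag high-criticality bugs for assignment."""
    ticket["priority"] = CRITICALITY_TO_PRIORITY[ticket["criticality"]]
    ticket["needs_assignee"] = ticket["criticality"] == "high"
    return ticket

bug = triage({
    "detail": "Channel form accepts a 200-character name",
    "steps_to_reproduce": ["Open the form", "Paste a long name", "Submit"],
    "evidence": "log line 3, screenshot-01.png",
    "criticality": "high",
})
print(bug["priority"], bug["needs_assignee"])
```

Encoding the rule keeps triage decisions consistent across tickets instead of depending on each reviewer's judgment.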