Quality in software development and delivery chain

What does the term Quality mean? In my opinion, it is everything in a process, from the moment the SOW document is drafted until the point when you begin treating quality itself.

That’s why I share below some ideas that I find important (many of them drawn from day-to-day situations and experience in the software development industry).

What tools or techniques can be applied to ensure quality in the software development process and the delivery chain?

SOW clearly specified

This point speaks for itself: the higher the quality of the document’s content, the clearer the project scope will be. A well-specified SOW is also a great help when a change request (internal or external) arises.

Detailed engineering document (DID)

The DID is a key piece of the process, acting as the SOW’s technical extension. With a detailed DID approved by the client, the technical scope is well delimited for the project’s stakeholders and for execution.
Beyond that, a high-quality DID is an excellent knowledge-base element for later phases, for example Integration and Support.

Peer review process

This part of the process holds that the product generated by a team member (a piece of software, a module, an interface definition, etc.) should be evaluated by a peer. During the review, both peers should discuss and exchange ideas based on constructive criticism. This exercise is performed before delivery to the subsequent stage, in each part of the chain.

Try to minimize complexity (KISS)

Most of the time, complex situations can be solved simply. This makes the analysis take slightly longer, but that time is regained during coding, integration, and later support. Why try to kill a mosquito with a cannonball when a clap is enough?

Operational validation through peer-validated unit tests

Each piece of software can be validated in isolation before it is delivered to the integration stage. For this purpose, unit tests are written and then executed by a peer. In these test runs, not only should the results be evaluated: the module should also be clearly operable. “Clearly operable” means simplicity in starting, checking the status of, and stopping the process, as well as in its log messages.
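As a minimal sketch of the idea above, here is a unit test a peer could run before the module moves on to integration. The `parse_record` function is hypothetical, invented for this illustration; in practice it would live in the module under test.

```python
import unittest

# Hypothetical module under test: a tiny record parser invented for this sketch.
def parse_record(line):
    """Parse an 'id;amount' line into (int, float); raises ValueError on bad input."""
    ident, amount = line.split(";")
    return int(ident), float(amount)

class TestParseRecord(unittest.TestCase):
    """Tests a peer can execute before the module is handed to integration."""

    def test_valid_record(self):
        self.assertEqual(parse_record("42;19.90"), (42, 19.90))

    def test_malformed_record_is_rejected(self):
        # A clearly operable module fails loudly on bad input, not silently.
        with self.assertRaises(ValueError):
            parse_record("not-a-record")
```

The peer would run the suite with `python -m unittest`, checking both that the tests pass and that the module is simple to start, inspect, and stop.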

Documentation generation

Beyond the DID, each module must include, as part of its deliverable, a piece of documentation detailing items such as functionality, interface version, OS dependencies, the architecture manual, and the operating manual.

Stop using the phrase “here it works fine”, and analyze the problem

Most of the time (not to say always), customer environments differ considerably from development conditions, because in the company’s environment the deliverables serve more than one customer.
In a customer’s environment, on the other hand, only the part that corresponds to that customer is present.
The thought “it works here” is often heard when a problem is escalated to the area that produced the deliverable.
What actually happens is that the deployed artifact has dependencies that are external to the client’s own environment, which causes the deployment to fail or not work as expected.

Integration baseline / Product baseline

All products are made of smaller pieces of software that evolve and have their own internal or external interface versions.
The way to keep this evolution orderly is to register baselines. A baseline is practically a “snapshot” of the versions of interfaces, modules, and external dependencies that ensures a set of components works correctly end to end. This helps not only during the development process, but also with traceability over time.
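In its simplest form, a baseline can be a versioned manifest plus a fingerprint that any environment can recompute to confirm it runs the same set of versions. The component names, labels, and versions below are hypothetical, purely for illustration.

```python
import hashlib
import json

# Hypothetical component versions frozen at integration time.
baseline = {
    "name": "product-baseline",
    "label": "2024-Q1-int",  # an assumed labeling scheme
    "components": {
        "billing-core": "3.2.1",
        "provisioning-api": "1.7.0",
    },
    "external_dependencies": {
        "openssl": "3.0.13",
    },
}

def baseline_fingerprint(bl):
    """Stable hash of the manifest, so two environments can verify
    they are running exactly the same baseline."""
    canonical = json.dumps(bl, sort_keys=True).encode("utf-8")
    return hashlib.sha256(canonical).hexdigest()[:12]

print(baseline_fingerprint(baseline))
```

Recording this manifest at each integration milestone gives the end-to-end “snapshot” described above, and the fingerprint makes drift between environments easy to detect.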

Clear operability

The key to an excellent post-coding stage, and to effective maintenance, is that the operation of each module be absolutely clear and understandable to technical personnel.
Logs, regardless of the medium in which they are stored, must have clear chronology and understandable content.
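As a small sketch of the point about logs, Python’s standard logging module can produce timestamped, leveled, single-line entries with clear chronology. The module name and messages are illustrative only.

```python
import logging

# Timestamped, leveled, single-line entries give logs a clear chronology
# and content that any technical reader can follow.
logging.basicConfig(
    format="%(asctime)s %(levelname)s %(name)s: %(message)s",
    level=logging.INFO,
)
log = logging.getLogger("billing-core")  # hypothetical module name

log.info("started (interface v1.7, 2 worker threads)")
log.warning("retrying connection to provisioning-api (attempt 2/5)")
log.info("stopped cleanly after draining 0 pending jobs")
```

Whatever the storage medium, keeping every entry in this shape (when, how severe, who, what) is what makes a log readable during support.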

Mid-term/long-term focus (knowing the systems’ transactionality) to invest effort properly

We are the generators of our products, and the key is to know them thoroughly: their architecture, scalability, functionality, roadmap, improvements, and performance.
That way, we can easily tell whether the product we own can meet the expectations of an implementation.
Many times I have heard “the product does not support more than xxxx transactions per second…”, “you have to invest so that it supports xxxx transactions”. I usually respond by asking how many transactions per second the functional scenario actually requires. Every time this exercise was done, the answer was that no effort was needed at this particular point.
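The questioning exercise above often comes down to simple arithmetic. All the figures below are hypothetical, chosen only to show the shape of the calculation.

```python
# Hypothetical sizing: what the scenario needs vs. what the product handles.
subscribers = 200_000
events_per_subscriber_per_day = 4
peak_factor = 3  # assume traffic at peak is 3x the daily average

seconds_per_day = 86_400
avg_tps = subscribers * events_per_subscriber_per_day / seconds_per_day
peak_tps = avg_tps * peak_factor

supported_tps = 500  # hypothetical product limit

print(f"required peak: {peak_tps:.1f} tps, supported: {supported_tps} tps")
# When supported_tps >= peak_tps, no investment is needed at this point.
```

Running the numbers before investing in “more transactions per second” is exactly the exercise described above; in many real scenarios the required peak is far below the product limit.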

As a final reflection: is quality only a matter of processes or deliverables? No. Quality must be part of us, of our being. If it is in our DNA, everything else will follow naturally, and with a good level of quality.
