Software Quality – Back to basics

Quality Assurance & Quality Control are integral processes of the production life cycle of any product in most industries. While quality governance is an age-old discipline in manufacturing, it is a relatively young topic in the software industry, which itself is only thirty or forty years old. Of late, however, so many well-defined quality processes and standards have been injected into the software development life cycle that half of the production time is eaten up by these measures. In this article, let me attempt a root-cause analysis of poor software quality and ask whether these processes alone can really improve product quality dramatically. Or is it just that we are missing some basics here?

QA, QC, QE, QM, Q-what?

Quality Assurance (QA) is the overall process-definition and execution part of quality management (QM), whereas Quality Control (QC) is the execution of the actual tasks that verify conformance to the quality standards. QA processes start as early as the requirements phase and spread through the design, execution, delivery and maintenance of the software product. QC, the actual process of testing, is one thread of the entire QA process. Quality is not just about functional correctness, which verifies each and every input => process => output thread against the requirements; it also deals with conformance to the performance, scalability, compatibility and usability expectations of the product.

SDLC – The past perfect

In the distant past, software projects were mostly done in research-oriented labs. Programmers had plenty of time to design and code the given requirements into working software. Moreover, most software products then targeted a single platform (hardware/operating system). Passionate developers thoroughly unit tested their code before delivering it for productive use. To make programmers' lives easier, most operating systems and programming languages had a pretty good life expectancy. Coding then was concentrated more on functional correctness than on painting excellent graphical user interfaces. The obvious result was quality software that was more or less bug free, fast in execution and functionally correct.

Challenges hit

However, the 80s witnessed the proliferation of cheaper personal computers and, with them, a number of operating systems, system tools and applications. This period also saw a sudden surge of programming environments, frameworks and graphical-user-interface-oriented programming methodologies. The popularity of the world wide web contributed further programming possibilities and scripting environments. With a large number of ways available to code the same 'Hello World' program, developers were forced to learn new methodologies and programming mediums pretty quickly, most often on the fly during a software project. Proficiency levels started coming down, and so did the quality of the code written and unit tested.

Another thing that added to the chaos was the pace at which businesses computerized their office and factory setups. Sudden implementation pressure and ever-shortening delivery timelines squeezed the time available for unit testing. Either there wasn't sufficient time to test, or the unit testing culture was partly ignored!

During the same era, system and office product companies like Microsoft started shipping new versions of desktop operating systems almost every year. This also meant that code written for one operating system had to run on, or be adapted to, multiple platforms. Platform-specific coding and testing paved the way for another level of quality challenge and a multitude of compatibility and upgrade tests.

Since hardware was getting faster and cheaper, proper benchmark tests were not always carried out. It is also a fact that not many cared about fine-tuning software to run at the best possible performance.

The differentiation and introduction of several dedicated roles in software testing took half the burden away from developers – but in the wrong way. Many developers began to think that unit and module testing were probably not their job but that of the dedicated test crew!

The introduction of many processes – both project management and quality related – shifted the focus from coding and testing to conformance with these standards, many of which had nothing to do with the actual quality of the software produced.

So, how can we deliver better quality?

With too many distractions around flashy technologies, user interfaces and over-engineered products, testers seem to have forgotten one basic thing. In the end, testing is all about passing the use cases, i.e. conformance. The very first thing a software system has to prove is that the given valid inputs produce the expected outputs stated in the specification. This use-case based testing has to be the main area of interest for every tester.
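As an illustration (not from the original article), here is a minimal sketch of such a use-case test in Python; the `calculate_invoice_total` function and the spec example are hypothetical:

```python
import unittest

def calculate_invoice_total(line_items, tax_rate):
    """Hypothetical function under test: sums line items and applies tax."""
    subtotal = sum(quantity * unit_price for quantity, unit_price in line_items)
    return round(subtotal * (1 + tax_rate), 2)

class InvoiceUseCaseTest(unittest.TestCase):
    """Use-case test: valid inputs must yield the outputs stated in the spec."""

    def test_valid_input_gives_expected_output(self):
        # Spec example: 2 x 10.00 plus 1 x 5.00 at 10% tax => 27.50
        line_items = [(2, 10.00), (1, 5.00)]
        self.assertEqual(calculate_invoice_total(line_items, 0.10), 27.50)

if __name__ == "__main__":
    unittest.main()
```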

Regression errors are one area where use cases again have to be stressed. Every single time the codeline changes, testers or automated mechanisms have to perform the use-case tests; it should not be just a click-around test. Integrated automated tests are a must with every build cycle of the software. These days plenty of automated test tools are available on the market, and one can also write custom automated tests via easy scripting options. But since automated testing ultimately revolves around the scripts/code written for the test program, the quality of the automation tools/code itself should be very high.
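A sketch of what such an automated regression check could look like (again purely illustrative, with a hypothetical `render_report` function and baseline file): the current output of a code path is compared against a previously recorded known-good result, so any behavioural change fails the build.

```python
import json
import pathlib
import unittest

def render_report(data):
    """Hypothetical function under regression test."""
    return {"title": data["title"].upper(), "rows": sorted(data["rows"])}

BASELINE_FILE = pathlib.Path("report_baseline.json")  # recorded "known good" output

class ReportRegressionTest(unittest.TestCase):
    def test_output_matches_recorded_baseline(self):
        current = render_report({"title": "sales", "rows": [3, 1, 2]})
        if not BASELINE_FILE.exists():
            # First run: record the baseline so future builds can compare against it.
            BASELINE_FILE.write_text(json.dumps(current))
            self.skipTest("baseline recorded; rerun to compare")
        baseline = json.loads(BASELINE_FILE.read_text())
        # Any unintended change in behaviour shows up as a failed comparison.
        self.assertEqual(current, baseline)

if __name__ == "__main__":
    unittest.main()
```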

The success of manual testing is only as good as the test catalogues/cases available. It should become a habit for developers and testers to discuss, scrutinize and update the test catalogues/cases periodically – something that rarely happens in subsequent releases of a product.
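One lightweight way to keep such a catalogue reviewable by developers and testers alike (an illustration only; the catalogue entries and the `parse_quantity` function are made up) is to hold the cases as plain data that drives the tests:

```python
import unittest

def parse_quantity(text):
    """Hypothetical function under test: parses a quantity string into an int."""
    value = int(text.strip())
    if value < 0:
        raise ValueError("quantity cannot be negative")
    return value

# Test catalogue kept as plain data so developers and testers can review and
# extend it together; each entry is (case id, input, expected output or error).
TEST_CATALOGUE = [
    ("TC-001 plain number", "42", 42),
    ("TC-002 surrounding spaces", "  7 ", 7),
    ("TC-003 negative rejected", "-1", ValueError),
]

class CatalogueDrivenTest(unittest.TestCase):
    def test_catalogue(self):
        for case_id, raw_input, expected in TEST_CATALOGUE:
            with self.subTest(case=case_id):
                if isinstance(expected, type) and issubclass(expected, Exception):
                    with self.assertRaises(expected):
                        parse_quantity(raw_input)
                else:
                    self.assertEqual(parse_quantity(raw_input), expected)

if __name__ == "__main__":
    unittest.main()
```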

Performance and scalability validations should be done on a properly set up IT infrastructure and/or by simulating the actual usage of the system. It is a fact that much of the software produced is never tested for performance and scalability. Many a time, the first version of the software is the guinea pig, and a more realistic, usable version is rolled out only later.
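A rough sketch of such a usage simulation (illustrative only; the `handle_request` function, the user count and the latency budget are all assumptions) could time a batch of concurrent calls and flag when the latency budget is exceeded:

```python
import concurrent.futures
import statistics
import time

def handle_request(payload):
    """Hypothetical request handler whose latency we want to validate."""
    return sum(i * i for i in range(10_000)) + len(payload)

def simulate_load(concurrent_users=50, requests_per_user=20, budget_seconds=0.05):
    """Simulate concurrent usage and report per-request latency."""
    latencies = []

    def one_user(_):
        for _request in range(requests_per_user):
            start = time.perf_counter()
            handle_request("example payload")
            latencies.append(time.perf_counter() - start)

    with concurrent.futures.ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        list(pool.map(one_user, range(concurrent_users)))

    p95 = statistics.quantiles(latencies, n=20)[-1]  # rough 95th percentile
    print(f"requests: {len(latencies)}, p95 latency: {p95:.4f}s")
    if p95 > budget_seconds:
        raise SystemExit("performance budget exceeded")

if __name__ == "__main__":
    simulate_load()
```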

Finally, developers should not forget that unit testing is part and parcel of the development cycle. You have to be your own tester in the first place! In the planning phase, many inexperienced developers and their bosses make the mistake of not allocating the right effort to the modules to be coded: they estimate only the time required to code, not the time to unit/module test the code. This planning mistake ultimately results in poor quality and a blame game (a commonly found phenomenon in any developer-tester interaction).
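To make the point concrete (a made-up example, not from the article): the developer of a small helper writes its unit tests, including the edge cases, as part of the same coding task – so the effort for those tests has to be planned in from the start.

```python
import unittest

def split_full_name(full_name):
    """Hypothetical helper the developer has just written."""
    parts = full_name.strip().split()
    if not parts:
        raise ValueError("name must not be empty")
    return parts[0], " ".join(parts[1:])

class SplitFullNameUnitTest(unittest.TestCase):
    """Unit tests written by the developer alongside the code, before handover."""

    def test_typical_name(self):
        self.assertEqual(split_full_name("Ada Lovelace"), ("Ada", "Lovelace"))

    def test_single_word_name(self):
        self.assertEqual(split_full_name("Plato"), ("Plato", ""))

    def test_empty_name_rejected(self):
        with self.assertRaises(ValueError):
            split_full_name("   ")

if __name__ == "__main__":
    unittest.main()
```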

Six Sigma and what not?

Sometimes I get the feeling that information technology people have a tendency to borrow incompatible ideas from other industries and patch them onto the software world. In my view, Six Sigma and certain other quality processes do not fit software directly. Well, in the end any theory can be made use of, but do we have enough examples where Six Sigma (which originated in electronics manufacturing) has been judiciously and successfully used in a software company? For me, it has succeeded only in creating multiple redundant roles in a project team. Come on, this is not martial arts! We don't need green belts and black belts in software teams – nor do we want historical data on defects per million opportunities.

Finally, 'Quality Engineering' itself is an oxymoron – at least in the software context. For me, quality is about the conformance or verification of something that is already engineered (nobody can produce quality as such!). And the quality process itself should not be over-complicated with unwanted methods and standards. The need of the hour is realistic project plans, committed developers (who are also good unit testers), proper test cases, integrated test automation tools and passionate manual testers.