Software Quality – Back to basics

Quality Assurance and Quality Control are integral parts of the production life cycle of almost any product in most industries. While quality governance is an age-old practice in manufacturing, it is a relatively new topic in the software industry, which itself is only thirty or forty years young. Of late, however, so many well-defined quality processes and standards have been injected into the software development life cycle that half of the production time is eaten up by these measures. In this article, let me try a root-cause analysis of poor software quality and see whether these processes alone can really improve product quality drastically. Or are we simply missing some basics here?

QA, QC, QE, QM, Q-what?

Quality Assurance (QA) is the overall process innovation and execution part of quality management (QM), whereas Quality Control (QC) is the execution of the actual tasks that confirm conformance to the quality standards. QA starts as early as the requirements phase and spreads through the design, execution, delivery and maintenance of the software product. QC, the actual process of testing, is one thread of the entire QA process. Quality is not just about functional correctness – verifying each and every input => process => output thread against the requirements – it also covers conformance in performance, scalability, compatibility and usability of the product.

SDLC – The past perfect

In the distant past, software projects were mostly done in research-oriented labs. Programmers had plenty of time to design and code the given requirements into software. Moreover, most software products then targeted a single platform (hardware/operating system). Passionate developers thoroughly unit tested their code before delivering it for productive usage. To make programmers’ lives easier, most operating systems and programming languages had a pretty good life expectancy. Coding then was concentrated more on functional correctness than on painting excellent graphical user interfaces. The obvious result was quality software that was more or less bug free, fast in execution and functionally correct.

Challenges hit

However, the 80s witnessed the proliferation of cheaper personal computers along with a number of operating systems, system tools and applications. This period also saw a sudden surge of programming environments, frameworks and graphical-user-interface-oriented programming methodologies. The popularity of the World Wide Web contributed further programming possibilities and scripting environments. With so many ways available to code the same ‘Hello World’ program, developers were forced to learn new methodologies and programming media pretty quickly, most often on the fly during project execution. Proficiency levels started coming down, and so did the quality of the code written and unit tested.

Another thing that added to the chaos was the pace at which business establishments computerized their office and factory setups. Sudden implementation pressure and ever-shortening delivery timelines left little time for unit testing. Either there wasn’t sufficient time to test, or the unit testing culture was partly ignored!

During the same era, system and office products companies like Microsoft started shipping new versions of desktop operating systems almost every year. This meant that code written for one O/S had to run on, or be adapted for, multiple platforms. Platform-specific coding and testing paved the way for another level of quality challenge and a multitude of compatibility and upgrade tests.

Since the hardware was getting faster and cheaper, it wasn’t always considered necessary to do proper benchmark tests. It is also a fact that not many cared about fine-tuning the software to make it run at the best possible performance.

The differentiation and introduction of several dedicated roles in software testing took half the burden away from the developers – but in the wrong way. Many developers began to think that unit and module testing were probably not their job but that of the dedicated test crew!

The introduction of many processes – both project-management and quality related – shifted the focus from coding and testing to conformance with these standards, many of which had nothing to do with the actual quality of the software produced.

So, how can we deliver better quality?

With too many distractions around flashy technologies, user interfaces and over-engineered products, testers seem to have forgotten one basic thing. In the end, testing is all about passing the use cases – about conformance. The very first thing a software system has to pass is whether the given valid inputs produce the expected outputs as mentioned in the specifications. This use-case based testing has to be the main area of interest for every tester.
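To make the idea concrete, here is a minimal sketch (my illustration, not taken from any specific project) of what use-case based testing can look like: each test feeds a valid input from a hypothetical specification and checks the expected output. The calculate_invoice_total function and its spec values are invented for the example.

```python
# A minimal, illustrative sketch of use-case based testing: each test feeds a
# valid input from the (hypothetical) specification and checks the expected output.
import unittest

def calculate_invoice_total(items, tax_rate):
    """Hypothetical function under test: sums line items and applies tax."""
    subtotal = sum(price * qty for price, qty in items)
    return round(subtotal * (1 + tax_rate), 2)

class InvoiceUseCases(unittest.TestCase):
    def test_single_item_with_tax(self):
        # Spec: one item at 100.00, qty 1, 10% tax => total 110.00
        self.assertEqual(calculate_invoice_total([(100.0, 1)], 0.10), 110.00)

    def test_multiple_items_no_tax(self):
        # Spec: 2 x 25.00 plus 1 x 50.00, no tax => total 100.00
        self.assertEqual(calculate_invoice_total([(25.0, 2), (50.0, 1)], 0.0), 100.00)

if __name__ == "__main__":
    unittest.main()
```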

Regression errors are another area where use cases have to be stressed. Every single time the codeline changes, the testers or automated mechanisms have to perform the use-case testing – and it should not be just a click-around test. Integrated automated tests are a must with every build cycle of the software. These days plenty of automated test tools are available in the market, and one can also write custom automated tests via easy scripting options. But since automated testing revolves around the scripts/code written for the test program, the quality of the automation tools and code itself has to be very high.
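As a hedged sketch of what "integrated automated tests with every build cycle" can mean in practice, the following build-gate script runs a regression suite with the standard Python test runner and fails the build on any failure. The tests directory and the script itself are assumptions for illustration; any CI or build tool that honours exit codes would work the same way.

```python
# Illustrative build-gate sketch: run the automated regression suite on every
# build and stop the build if any use case fails. Paths and names are hypothetical.
import subprocess
import sys

def run_regression_suite():
    # Discover and run all test modules under ./tests using the standard library runner.
    result = subprocess.run(
        [sys.executable, "-m", "unittest", "discover", "-s", "tests", "-v"]
    )
    return result.returncode

if __name__ == "__main__":
    code = run_regression_suite()
    if code != 0:
        print("Regression detected: failing the build.")
    sys.exit(code)  # a non-zero exit code makes the CI/build step fail
```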

The success of manual testing is only as good as the test catalogues/cases available. It should become a habit for developers and testers to discuss, scrutinize and update the test catalogues/cases periodically – something that rarely happens in subsequent releases of a product.
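One lightweight way to make that habit stick – purely a suggestion on my part, not a prescribed method – is to keep the catalogue as structured data under version control, so developers and testers can review and update it together release after release. The field names below are hypothetical.

```python
# Illustrative shape of a test catalogue entry kept as reviewable data;
# field names are hypothetical. Storing the catalogue in version control
# lets developers and testers update it together, release after release.
from dataclasses import dataclass

@dataclass
class TestCase:
    case_id: str
    use_case: str            # the business use case this test covers
    steps: list              # manual steps or a pointer to an automated script
    expected_result: str
    last_reviewed: str       # e.g. "2007-01-15", bumped at each review
    automated: bool = False

catalogue = [
    TestCase("TC-001", "Create invoice with tax",
             ["Enter two line items", "Apply 10% tax", "Save"],
             "Total equals subtotal plus tax", "2007-01-15", automated=True),
    TestCase("TC-002", "Invoice with zero items",
             ["Save an empty invoice"],
             "System rejects the invoice with a clear message", "2007-01-15"),
]
```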

Performance and scalability validations should be done on a proper IT infrastructure setup and/or with a simulation of the actual usage of the system. It is a fact that much of the software produced is never tested for performance and scalability. Many a time, the first version of the software is the guinea pig, and a more realistic, usable version is rolled out only later.
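As an illustration of "simulating the actual usage of the system", here is a minimal load-test sketch: it fires a number of concurrent simulated users at one operation and reports throughput and latency. The process_request function is a stand-in of my own; in a real validation it would call the actual system on production-like infrastructure.

```python
# Minimal load-test sketch: simulate N concurrent "users" hitting one operation
# and report throughput and latency. process_request stands in for the real call.
import time
import statistics
from concurrent.futures import ThreadPoolExecutor

def process_request(payload):
    """Hypothetical operation under test; replace with a real service call."""
    time.sleep(0.01)          # simulate ~10 ms of work
    return payload * 2

def timed_call(i):
    start = time.perf_counter()
    process_request(i)
    return time.perf_counter() - start

if __name__ == "__main__":
    users, requests = 20, 200
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=users) as pool:
        latencies = list(pool.map(timed_call, range(requests)))
    elapsed = time.perf_counter() - start
    print(f"throughput: {requests / elapsed:.1f} req/s")
    print(f"median latency: {statistics.median(latencies) * 1000:.1f} ms")
    print(f"95th percentile: {sorted(latencies)[int(0.95 * len(latencies))] * 1000:.1f} ms")
```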

Finally, developers should not forget that unit testing is part and parcel of the development cycle. You have to be your own tester in the first place! In the planning phase, many inexperienced developers and their bosses make the mistake of not allocating the right effort for the modules to be coded: most of them think only about the time required to code, not to unit and module test, their code. This planning mistake finally results in poor quality and a blame game (a common phenomenon in any developer-tester interaction).

Six Sigma and what not?

Sometimes I get the feeling that information technology people have a tendency to borrow incompatible ideas from other industries and patch them onto the software world. To my mind, Six Sigma and certain other quality processes are not something that fits software directly. Well, in the end any theory can be put to use, but do we have enough examples where Six Sigma (originally from electronics manufacturing standards) has been judiciously and successfully applied in a software company? For me, it has been successful only in creating multiple redundant roles in a project team. Come on, this is not martial arts! We don’t need green belts and black belts in software teams – nor do we want historic data on defects per million opportunities.

Finally, ‘Quality Engineering’ itself is an oxymoron – at least in the software context. For me, quality is about the conformance or verification of something that is already engineered (nobody can produce quality as such!). And quality assurance itself should not be made too complicated via unwanted processes and methods. The need of the hour is realistic project plans, committed developers (who are also good unit testers), well-maintained test cases, integrated test automation tools and passionate manual testers.

Enterprise SOA for Agile Enterprises

To improve is to change; to be perfect is to change often – Winston Churchill

Churchill seems to be right in a general sense, and his words are very relevant in this modern era of information technology. In a competitive, complex and dynamic business environment, organizations are fast realizing that the ability to transform their IT for future business needs will determine their success. An agile enterprise – an organization that continuously reinvents itself – can succeed only if it has a flexible IT infrastructure that reflects its current organizational structure and business domain. The need for this kind of flexibility resulted in what is called an enterprise architecture, which is nothing but a bundle of loosely coupled functions and processes from the organization’s point of view.

An enterprise architecture gets even better if it is a Service Oriented Architecture (SOA). An SOA is a collection of services that interact with each other to carry out the business processes. Each service would be self-sufficient in terms of data and state maintenance and would form a node in a distributed computing environment. Each business process thus becomes a series of request-response cycles involving one or more services. Please note that we are not necessarily talking about web services here.
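To illustrate the request-response idea (independently of web services or any particular vendor stack), here is a minimal sketch of a self-contained service node and a client call, assuming plain HTTP with JSON as the agreed protocol. The "pricing" service, its data and the endpoint name are all hypothetical.

```python
# Minimal sketch of a self-contained "service" node in an SOA-style setup,
# assuming plain HTTP + JSON as the agreed protocol (all names are illustrative).
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical "order pricing" service: owns its own data, shares no state.
PRICE_LIST = {"A100": 25.0, "B200": 40.0}

class PricingService(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        request = json.loads(self.rfile.read(length))
        # Business logic lives entirely inside the service boundary.
        total = sum(PRICE_LIST.get(item, 0.0) * qty
                    for item, qty in request.get("items", {}).items())
        body = json.dumps({"total": total}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo output quiet
        pass

def call_service(url, payload):
    """One request-response cycle against a service endpoint."""
    req = urllib.request.Request(url, data=json.dumps(payload).encode(),
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

if __name__ == "__main__":
    server = HTTPServer(("localhost", 8081), PricingService)
    threading.Thread(target=server.serve_forever, daemon=True).start()

    # A business process becomes a series of such request-response calls.
    print(call_service("http://localhost:8081/pricing",
                       {"items": {"A100": 2, "B200": 1}}))
    server.shutdown()
```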

SOA really puts some beautiful technology concepts into practice – reusability, encapsulation and a common protocol for usage and data exchange, to name a few. Each independent service could be implemented in any language or on any technology platform. SOA also gives organizations tremendous scope to integrate their existing solutions with external systems by keeping common protocols for publishing and accessing services. This is achieved via the standardization of protocols by leading independent research and standards organizations like the World Wide Web Consortium (W3C). We may recall that first-generation service-oriented architectures like DCOM or CORBA didn’t help everyone, mainly because access protocols across heterogeneous systems were not standardized.

Enterprise SOA seems to be the new buzzword now, promising a scalable, adaptable and open service-oriented architecture for developing and/or adapting enterprise solutions. Leading business software vendors like SAP are investing heavily to move their traditional ERP and domain-specific enterprise applications and suites towards service orientation. While newer applications can be fully service compliant and will work as per the current vision, it remains to be seen how organizations will adapt their existing applications to the SOA platforms these vendors offer. One-time adaptation may not be feasible for many huge organizations, so they can buy the new enterprise SOA applications for fresh requirements and integrate them with their existing infrastructure. Over a period of time they can probably migrate their systems towards enterprise SOA; until that is fully achieved, they will have to be satisfied with a side-by-side model in their IT infrastructure.

A clean enterprise SOA solution is not on the market yet. However, the next two or three years could witness a surge of service-oriented enterprise offerings for both medium and large organizations. Their success may ultimately depend on enterprise-wide protocol standardization (how long will the current standards last?) and a low cost of getting started (for IT optimization). After all, these technological shifts should help enterprises concentrate more on their core business rather than spending time and resources on IT infrastructure alone.

Windows Vista is here!

Microsoft has just announced its new, fancy operating system: Windows Vista. ‘Vista’ means a panorama or landscape view. Well, the naming could not have been better, as the capabilities of the new O/S end pretty much at how your application windows look and are viewed. Beyond that, is it really worth it? I guess not.

I have used Microsoft Windows for application development since the Windows 3.x (Windows/386) era. Since then I have watched the Windows O/S evolve, and I can say that Microsoft is pretty good at marketing whatever Windows flavour it comes up with. After Windows 98 Second Edition, which had a very good user interface for the common man and reasonable stability, I thought NT 4.0 was a pretty cool operating system for developers. It may not have been a gamer’s O/S, but it had everything in it for application developers as well as corporate server requirements. If you categorize PC users broadly into three groups – application developers, information workers and gamers – I would say that 98SE belonged to the information workers and NT belonged to the application developers. That leaves us with one category, the rig lovers, and probably Vista is for them?

Though I used Windows 2000 Professional extensively for almost five years, for development as well as normal PC usage – browsing, chatting, editing and so on – I somehow liked Windows XP better. Well, except for its stupid Start menu, which requires the mouse to be moved vertically and horizontally several times before you can achieve anything. I thought Windows XP was almost complete as a UI-rich operating system until Microsoft announced the arrival of Vista! Now, let us see if it is really a major technology and usability revamp over XP.

If you believe in love at first sight, well, then Vista is for you. For the first time in Microsoft’s O/S history you get to see all your application windows arranged in an ‘almost 3D’ manner. This is quite an eye-catching feature, along with the translucent menu bars. Microsoft calls it the Aero desktop – again, quite a convincing pet name! But beware: the basic Home edition doesn’t support Flip 3D windows. That is a setback for normal home users, who are probably the main target audience for this kind of feature.

Beyond the catchy looks, the other value-add Vista offers is the number of tools it is bundled with – Windows Defender and Firewall, Windows DVD Maker, Instant Search, Windows Movie Maker and a bunch of 3D games, to name a few. But will somebody buy an operating system for these fancy tools? God, err… Billy, alone knows. Wait, the worst is still not over. The so-called Business edition is something I had great hopes for, as I thought it would be a professional developer environment. But Microsoft feels that a few administration features – scheduled complete PC backup/restore, remote desktop, drive encryption and the like – automatically turn your Home edition into a Business edition of the O/S. Should we blame them? Probably not. I think they have got their market, and anything that is dumped on users will be gracefully accepted.

The impact of the Vista rollout on corporates will come in the form of additional investment in Aero-capable graphics cards. Also, if Home edition PCs are equipped with these expensive cards, the price of the final assembly is surely going to shoot up, which will adversely affect the PC market for some time. One of the main features, WDDM (the Windows Display Driver Model), which claims to save the PC in case of a display driver failure, definitely requires spending a few additional bucks while upgrading – all this on top of the $300 you pay for Vista Business edition.

All in all, other than a little enhanced usability for common users and the pyrotechnics that come along with it, I would not buy Windows Vista. But then, it’s Microsoft’s market, and sooner or later it will get you there. I am counting down the days before I am dispossessed of my good old XP 🙁