Small businesses are the lifeblood of the U.S. information technology (IT) sector, with small IT providers accounting for approximately 80% of the IT jobs in the country. No longer shielded from the global marketplace, small IT providers must contend with competition from overseas companies. Further, this competition is not limited to contract programming. Global players are now involved in higher IT functions and consulting opportunities, including system design, systems integration, and contract management. Couple this with a marketplace that continues to evolve rapidly as technology changes and advances, an ever-expanding menu of IT offerings, options and services, and increased customer demand for superior quality products and services in shorter time frames, and it becomes clear that matters are not getting any easier for the small IT provider. With greater competition and more demanding customers, small U.S. IT providers must now prove they can deliver high-quality products and services in a timely and predictable manner. Therefore, it has been suggested that small IT providers should adopt proven software/system engineering and management practices if they wish to survive, and flourish, in this global marketplace.4
However, small companies are often wary of the next great "academic" theory or esoteric model that makes its way through their door. They feel that models used by "big" businesses may not be readily applicable in a small-firm context. The Capability Maturity Model (CMM) is no exception to such criticisms. Though the CMM is clearly one of the models that has made its way not only into practice but also into widespread use,2,6 the majority of evidence regarding the success of the CMM comes from larger organizations. Smaller organizations have been less inclined to adopt the CMM for several reasons: they feel the defined processes are too complex, not applicable, or too time consuming, becoming barriers to productivity; the CMM framework promotes bureaucracy, which in turn stifles their creative, dynamic, and innovative culture; and the costs of dedicated software engineering personnel, training, and documentation make such efforts prohibitively expensive.1,7 So should small firms entertain the use of a formal process development approach as suggested by the CMM framework? In other words, does formal process capability building lead to improved competitive performance in small firms? This article attempts to answer these questions by comparing IT provider firms of varying formal process capabilities and sizes using five different firm performance measures.
The development of process maturity models in the area of systems development began in the 1980s during the software engineering total quality management (TQM) movement. The interest in CMM, and its variant models, now has reached a global audience with IT firms worldwide looking for proven methods for managing the consistent delivery of high-quality software, systems, and related IT services within established budgets and timeframes.2
The original intent of the Capability Maturity Model for Software (SW-CMM), developed under the direction of the Software Engineering Institute (SEI), was to provide the U.S. Department of Defense a means of including the contractor's software development capabilities within the contract award decision-making process. However, the SW-CMM evolved into a framework for process improvement that in turn attracted the attention of several commercial organizations, such as Hewlett-Packard, Hughes Aircraft, Motorola, PRC, Raytheon, and Schlumberger.6 Based in part on the reported success of these firms, the popularity of the SW-CMM grew. As the IT development role expanded and systems became increasingly complex, the SEI responded by introducing the Capability Maturity Model Integration (CMMi) in 2000. The CMMi combined previous system and software capability models, including the SW-CMM, to provide organizations with a framework for enterprisewide IT process improvement.
The CMMi model describes a set of key process areas (KPAs) that, when consistently adopted by an organization, increase an organization's ability to meet its cost, quality, schedule, and performance objectives. Each KPA is assigned to one of four process categories (engineering, process management, project management, and support) as well as to one of five maturity levels8 (see Figure 1). An organization moves up this CMM evolutionary maturity structure by institutionalizing processes required within a certain maturity level. Level one represents the least mature stage (meaning project outcomes are less predictable and tend to involve more rework, defects, and schedule slippage); level five represents the highest level of process maturity, tending toward predictable outcomes and little or no rework/scrap.
There have been several case studies, as well as empirical studies, promoting the benefits of CMM-based systems/software process improvement (SPI) efforts.5,6,9,10,11 These studies are grounded in total quality management (TQM) theories and show a link between increased process capability maturity and improvements in cycle time, cost control, productivity, quality, and customer satisfaction. With growing research support for a relationship between higher levels of maturity and improved organizational performance, it is not difficult to see why thousands of organizations worldwide are spending billions of dollars on CMM process improvement efforts.6 While the conventional belief by small firms is that CMM is only for larger organizations, more recent research indicates this assumption may be false. There are a few case studies of small organizations1,4,7 that have applied the principles of the CMM and thereby seen improvements in cost, development time, and quality. This study seeks to empirically test if higher process capability, based on the CMM, improves performance in small firms.
Data collection for this research was completed in two phases over a nine-month period. The first phase targeted large IT providers, while the second targeted small to medium sized IT providers. Data collection efforts involved the cooperation of two professional IT conferences, one IT industry consortium, and one regional IT professional organization. These efforts generated a total of 84 responses (62 usable). Multiple responses from the same organization/location were averaged to provide a single measure for that organization, resulting in a final sample size of 60. Given confidentiality requirements and other organizational restrictions, we estimate a response rate of 19%. Respondents consist of owners/partners (8%), CEOs/presidents (8%), directors/VPs (18%), managers/supervisors (21%), engineers/analysts/developers (23%), and consultants/strategists (18%). They average 5.6 years of experience in their current organizations.
More important, the sample represents a diverse cross section of the IT marketplace, with a large portion of our sample coming from well-known IT service providers. Some 45% of the sample consists of small firms with 100 or fewer employees (of which 17 companies have 50 or fewer employees), while the remaining 55% consists of organizations with more than 100 employees (of which 23 companies have 1,000 or more employees). Twenty-five companies have $100 million or less in sales, with approximately the same number having 50 or fewer clients. Twenty companies do $1 billion or more in sales, with approximately the same number having 1,000 or more clients.
Firm performance was measured using five items: number of projects exceeding budgeted cost; number of projects delivered on time; productivity; product reliability based on internal performance tests; and number of customer complaints. These items correspond to five common process improvement goals: cost control; on-time delivery; productivity; conformance quality; and customer satisfaction, respectively.2,3,5,9,10,11 The five firm performance items were measured using rates of increase over the previous three years rather than absolute measures. This approach allows for more appropriate comparison among firms of various sizes as well as other differentiating characteristics. All items were analyzed as improvements, reverse scoring raw data where appropriate.
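The reverse-scoring step can be sketched as follows. This is a minimal illustration only: it assumes that items where an increase is undesirable (budget overruns, customer complaints) have their three-year rates of change sign-flipped so that higher values always read as improvement. The item names are hypothetical, not the survey's actual wording.

```python
# Hypothetical sketch of reverse scoring: items where an increase is bad
# are sign-flipped so every item can be analyzed as an improvement.
# Item names are illustrative, not the study's actual survey items.

REVERSE_SCORED = {"projects_over_budget", "customer_complaints"}

def as_improvement(item: str, rate_of_change: float) -> float:
    """Return the rate of change oriented so that higher means better."""
    return -rate_of_change if item in REVERSE_SCORED else rate_of_change
```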
Items to measure a firm's capability maturity were based on the process definitions from the CMMi.8,12 Respondents were asked the extent to which each process is consistently performed as defined for each product/service contracted. For example, respondents were asked the extent to which their organization consistently performed a project monitoring and control process to "provide an understanding of the project's progress so that appropriate corrective actions can be taken when the project's performance deviates significantly from the plan."
A single score for each process category was created by averaging the individual processes responses within a category. A firm's overall process maturity capability was then computed by summing the process category scores. Since few large organizations have successfully obtained the CMM's level five status (with small organizations rarely moving beyond level three), processes included in the study are restricted to maturity level three and below.
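The scoring just described can be sketched as follows, assuming each respondent rates how consistently each process is performed on a numeric scale; the category and rating values here are invented for illustration, not the study's data.

```python
# Hedged sketch of the capability scoring: per-category scores are the mean
# of the process ratings within that category, and the overall capability
# score is the sum of the category scores. Ratings shown are illustrative.

from statistics import mean

responses = {
    "engineering":        [4, 3, 5],
    "process_management": [3, 3, 4],
    "project_management": [5, 4, 4],
    "support":            [2, 3, 3],
}

category_scores = {cat: mean(ratings) for cat, ratings in responses.items()}
overall_capability = sum(category_scores.values())
```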
The sample was segmented into four groups based on firm size and level of process capability: small sized, high process capability firms (Sm-Hi); large sized, high process capability firms (Lg-Hi); small sized, low process capability firms (Sm-Lo); and large sized, low process capability firms (Lg-Lo) (see Figure 2). Firms with 100 or fewer employees were classified as small, while those with more than 100 employees were classified as large. Firms with an overall process maturity capability score equal to or greater than the sample mean were classified as high capability firms, while those scoring below the mean were classified as low capability firms.
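The segmentation rule above amounts to two cutoffs, which can be sketched as a small classification function; the function name and example values are ours, not the study's.

```python
# Illustrative sketch of the four-group segmentation: a 100-employee size
# cutoff and a sample-mean capability cutoff. Values below are made up.

def classify(employees: int, capability: float, mean_capability: float) -> str:
    """Assign a firm to one of the four size/capability groups."""
    size = "Sm" if employees <= 100 else "Lg"
    cap = "Hi" if capability >= mean_capability else "Lo"
    return f"{size}-{cap}"
```

For example, a 40-person firm scoring above the sample mean would fall into the Sm-Hi group, while a 500-person firm scoring below it would fall into Lg-Lo.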
An ANOVA with post hoc analysis, which performs multiple pairwise comparisons, was used to determine whether the mean improvement in performance of the Sm-Hi group differed significantly from that of the other groups. Table 1 shows the difference in the mean performance improvement of the Sm-Hi group with respect to the other three groups for each of the five firm performance measures. For example, the mean improvement in customer satisfaction of the Sm-Hi group is significantly higher (Δ = 1.80 at p < 0.01) than that of the Sm-Lo group. As another example, the Sm-Hi group's mean improvement in cost control is lower than that of the Lg-Lo group; however, this difference (Δ = −0.05) is not significant.
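The omnibus ANOVA step can be illustrated with a from-scratch one-way F statistic; in practice a statistics package would also supply the post hoc pairwise tests. The group data below are invented for the example, not the study's measurements.

```python
# Illustrative one-way ANOVA F statistic computed from scratch (stdlib only).
# F = (between-group mean square) / (within-group mean square); a large F
# suggests the group means differ more than chance alone would explain.

from statistics import mean

def one_way_anova_f(groups: list[list[float]]) -> float:
    """Return the F statistic for a one-way ANOVA across the given groups."""
    all_obs = [x for g in groups for x in g]
    grand = mean(all_obs)
    k = len(groups)                      # number of groups
    n = len(all_obs)                     # total observations
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```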
As can be seen from Table 1, the performance improvement of the Sm-Hi group is consistently higher than that of the Lg-Hi, Sm-Lo, and Lg-Lo groups with respect to every measure in which there is a statistically significant difference. In particular, the performance improvement of the Sm-Hi group was significantly higher than: the Sm-Lo group with respect to customer satisfaction; the Lg-Hi group with respect to productivity, conformance quality, and customer satisfaction; and the Lg-Lo group with respect to conformance quality.
A second analysis compared the improved performance of each small firm group to that of each of the larger firm groups. Table 2 shows the difference in the mean performance improvement of each small firm group (Sm-Lo and Sm-Hi) with respect to the two larger firm groups (Lg-Hi and Lg-Lo). Of the 10 performance comparisons, five per large firm group, only one of the Sm-Hi group's differences was marginally negative (cost control with respect to the Lg-Lo group at −0.05), compared to three much larger negative differences (−0.35, −0.46, and −0.87) for the Sm-Lo group. Of the Sm-Hi group's nine positive differences, five were large (over 1.0), with four of the five being significant. By comparison, the Sm-Lo group had only one large positive difference (productivity with respect to the Lg-Hi group at 1.13); this difference was also the group's only significant one.
Before we discuss the results, it is important to note the sample size in this study is relatively small and results from this study should therefore be interpreted with caution. Despite this limitation, the results do offer some interesting insights and seem to suggest small firms with higher process capability enjoy considerably greater improvements in key firm-level performance goals.
When compared with the Sm-Lo group, the Sm-Hi group was noticeably better in on-time delivery, productivity, conformance quality, and customer satisfaction; however, only customer satisfaction was statistically significant. In addition, the Sm-Hi group compared more favorably to the two large firm groups. The lack of significant differences when comparing the two small IT firm groups head-to-head may revolve around the arguments and counterarguments promoting and criticizing the use of formal processes in smaller IT firms. Potential increases in cost control and on-time delivery through the use of formal processes may be counterbalanced by overhead cost and time associated with performing these processes in a small IT firm environment. Similarly, the use of formal processes may both positively and negatively impact productivity in small IT firms, resulting in a lower net gain. Further, increases in quality and productivity, which typically accompany the embedding of best practices into formal processes, may already exist at some level in smaller IT firms through the informal sharing of these practices. However, the results clearly suggest that having formal processes in a small IT firm environment can directly result in increased customer satisfaction. Formal processes provide clients with predictable and consistent methods for interacting with the IT service provider, which may in turn substantially reduce frustrations that arise from ad hoc ways of dealing with customer issues. Further, formal processes grant the customers a view into the small IT provider's service development and provisioning activities, thus facilitating greater interaction between the provider and client, and ultimately generating greater customer satisfaction. 
While a majority of differences associated with the direct comparison between the two small firm groups are not significant, perhaps due to the small sample size, it seems, based on the results of the second analysis (see Table 2), that smaller-higher capability firms are doing a better job than smaller-lower capability firms at making improvement gains in on-time delivery, productivity, quality, and customer satisfaction compared to larger IT providers.
The research results also indicate that smaller firms with high process capabilities seem to have benefited more, at least in the recent past, from their high process capabilities than larger firms with high process capabilities. It is also important to note that the results show these differences are not merely a result of the difference in firm size between the two groups. Compared to the Lg-Hi group, the Sm-Hi group had greater improvements in cost control and on-time delivery and showed significantly more improvement in productivity, conformance quality, and customer satisfaction. A possible reason for these differences may be that small firms with high process capabilities are better equipped to adjust to change. While the use of the CMM framework allows for better control and coordination of activities between various process groups, changes associated with activities between divergent groups take time. For example, a functional group in a large organization may already be deep into activities associated with a process before a change is formally communicated to it. On the other hand, smaller organizations, through the use of less formal methods of coordination and communication, may find it easier to adjust current activities or processes to meet the changing needs of the customer.
The final comparison of the Sm-Hi group with the Lg-Lo group revealed that the Sm-Hi group's improved performance was higher in on-time delivery, productivity, conformance quality, and customer satisfaction, with conformance quality being significantly higher. In an attempt to make the software/systems development process more adaptive to changing customer demands, several large organizations are moving away from process-centric approaches, like the CMM, to more people-centric approaches common to agile software development. While agile methods do allow large organizations to become more responsive and flexible, they also make the outcome less predictable. It may be that small-high process capability firms are better able to balance this flexibility-control equation than larger firms.
While the results of this research show the potential benefits of the use of the CMM in small firms, questions still remain about how to best implement such formal process capability efforts in these organizations. In order to gain greater insight into which individual processes were most critical to implement, we asked firms in the Sm-Hi group which processes they believed were the most important to their improvement efforts. The resulting top five processes are shown in Table 3. It is interesting to note that the most important processes selected by these firms tend to involve the customer to a greater extent. However, this is not surprising considering that small firms deal with projects requiring close and complex interaction with their customers3 and that these processes allow the customer more insight into the provider's processes, increasing efficiency through preplanning and development.7 These firms also have a very heavy process quality focus, as can be seen from their emphases on configuration management and product and process quality assurance.
The results discussed here clearly suggest small firms can benefit from developing a high level of formal process capability. However, in this endeavor they face two difficult and related challenges: overcoming negative perceptions about CMM, viewed as an overly bureaucratic approach; and overcoming the resistance to using formal processes. Formality and bureaucracy are often synonymous, and run counter to the culture of entrepreneurship that typically pervades smaller firms.
To reap the benefits of a formal process capability approach while being simultaneously cognizant of their entrepreneurial and less formal culture, small firms would be well advised to tailor the CMM model to their own environment and culture and to stress a process focus built on flexibility, efficiency, and quality.1,4,7 Further areas of research may be to identify best practices used by small firms adopting the CMM framework. Of particular interest would be the support processes, often an area of angst for these firms. Also of interest is the relative trade-off between the cost and benefits of implementing formal software/system engineering and management practices (both in large and small firms). Finally, with small firms effectively adopting and adapting proven large firm practices, what lessons can large firms now learn from their small firm counterparts?
8. Kishore, R., Swinarski, M., Jackson, E. and Rao, H.R. A quality-distinction (QD) model of IT capabilities: Conceptualization and two-stage empirical validation using CMMi processes. IEEE Trans. on Eng. Manage (forthcoming).
©2012 ACM 0001-0782/12/0700 $10.00