The Economics of Web Service Development

July 7, 2004

Marcia Gulesian

The financial consequences of your selection of software, hardware, and technology are often dwarfed by the consequences of your decisions about the other component of every computer-based solution -- that is, people. In order to have a fairly concrete discussion of the roles people occupy in software development, I'll cite specific technologies such as web services and PKI; a specific industry -- healthcare; and a specific government regulation, HIPAA.

Management needs to be able to plan and control the activities of the organization. At the same time, organizations are run by people -- be they managers, project managers, or developers -- who can perform with higher or lower levels of efficiency, creativity, and job satisfaction, and each of whom is required to make choices.

This article will discuss the economics and management of creating and later aggregating a new web service, from the perspective of people with different roles who contribute to the final outcome.

I'll also cover other related topics: how you might go about choosing which web-services app(s) to develop when resources are not available to develop all that have been requested, and the cost of complying with recent federal regulations.

I've chosen the healthcare industry in general and hospitals in particular to provide a laboratory for studying the economics and management of software development (and deployment). Whatever your industry, you'll find some or all of the problems you face in the following sections. Hospitals often have many departments that are not managed centrally, with each department providing its own unique solution and few of those solutions connected to one another.

Today their solutions are implemented by a mix of modern (web-based technology) and legacy (mainframes or client/server) systems. In addition, the hospital's internal data infrastructure must interact, subject to federal regulations, with those (often very different) of external enterprises such as insurance companies or governmental and legal entities.

Enter the patient: he or she will often go to several hospital departments for his or her treatment and have to deal with these same insurance companies or governmental and legal entities, independently of the hospital's own exchanges with them. This is a complicated system, even under business-as-usual circumstances.

In the next few sections of this article, I'll give an overview of the hierarchy (from developer to senior management) involved in the creation of a new web-services application. Following that, I'll discuss some of the main issues confronting this hierarchy, both its individual members and the group as a whole.

Figure 1 shows the players, some of their responsibilities, and some of the hard-to-measure parameters that characterize them. While nothing in my discussion is uniquely relevant to web-services development, this relatively new technology has arrived alongside a remarkable array of other new technologies and amid changing economic conditions. As a result, web services is a prime candidate for the issues that I'll address below.

Figure 1

What's the Problem?

Web services is a new paradigm that organizations should use to develop new applications only after rethinking their business model. This is where the business analyst comes in. Many software projects are considered failures because they do not actually solve a valid business problem or have a recognizable return on investment (ROI).

Why spend money building a system to automate a business process if the total cost of ownership is greater than it would have been had no system been built at all? Without this analysis up front, software engineers may end up focusing their efforts on solving technical problems that ultimately do not address the original business problem. You need to be cautious about applying new web-services technology to your old problems.

For example, even though you have the ability to put wrappers around existing applications and reuse existing functionality, you should do so only when it saves time now without creating additional costs later. Many development tools simply wrap existing application objects that were never meant to be exposed as public APIs. Doing so can reveal internal dependencies and make it very hard to change these interfaces over time.
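
To make that concrete, here is a minimal sketch, in Python with entirely invented names, of the difference between blindly wrapping an internal object and exposing a deliberately narrow facade. Nothing here comes from any particular toolkit; it only illustrates the design point.

```python
# Hypothetical example: LegacyBilling stands in for an internal class that
# was never designed as a public API; its cryptic signature and return
# shape (cents, abbreviated keys) are implementation details that may change.
class LegacyBilling:
    def calc(self, rec):
        return {"amt_c": rec["units"] * rec["rate_c"]}   # amount in cents

# A tool that auto-wraps LegacyBilling as a web service would expose all of
# the above. A deliberate facade publishes only a small, stable contract and
# translates the internal shape at the boundary:
class BillingService:
    """Public-facing service interface -- the contract you commit to."""

    def __init__(self):
        self._impl = LegacyBilling()      # internal dependency stays hidden

    def invoice_total(self, units: int, rate_dollars: float) -> float:
        """Return the invoice total in dollars for a single line item."""
        raw = self._impl.calc({"units": units, "rate_c": rate_dollars * 100})
        return round(raw["amt_c"] / 100, 2)

if __name__ == "__main__":
    print(BillingService().invoice_total(3, 19.99))   # 59.97
```

If LegacyBilling later changes its internal units or key names, only the facade's translation layer needs to follow; the published interface stays put.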

And, in your rush to exploit the ease of interoperability afforded by web services, there's an increased danger of exposing yourself to possible legal liability, especially when expert systems or fuzzy logic are involved.

An early expert system (MYCIN), developed at Stanford University School of Medicine to diagnose and recommend treatment for certain blood infections, was never actually used in practice even though, in tests, the software outperformed many clinicians. That decision was made for ethical and legal reasons related to the use of computers in medicine: if it gives the wrong diagnosis, whom do you sue?

Management

The purpose of web services (or any other technology) is to improve the quality of service offered by an institution and, at the same time, when possible, to reduce costs. In some cases, the quality of the service can be related directly to a dollar benefit; in other cases it cannot. For example, what dollar value can be placed on the reduction in waiting time at a hospital emergency room? Yet the decision regarding the resources to expend in providing the improved service should be made primarily on the basis of some measure of performance.

In making such decisions, management often must choose which project to undertake from a list of candidates competing for the same limited investment dollars. Here's where "opportunity costs" are a factor: the winning project chosen by management must generate a rate of return at least equal to the economic return foregone from deferred investments in the losing project(s), plus the cost of implementing the winning project.

But, "opportunity cost" is hard to measure. At the same time, "opportunity cost" is important because it affects an organization's performance far out into the future, which also makes it hard to measure. Finally, cost estimates based on accounting data -- which record actual current or past costs and are thus historical -- must therefore be considered as only first approximations to the relevant costs in managerial economics.

Decision tree analysis is a decision-making tool that should be used, either formally or informally, for comparing the opportunity costs of different proposed projects. This tool provides a systematic framework for analyzing a sequence of interrelated decisions that may be made over time. It clarifies choices and risks and the related benefits of long-term investment alternatives. In Figure 2, square nodes indicate the decision points and round nodes indicate the chance event forks.

Since all decision making is about the future, which nobody can predict with certainty, estimating probabilities (risk) is an important part of the decision-making process. But be mindful that your chief financial officer will likely see the uncertainty surrounding your decision in a much different light than you do. We aren't really bad at estimating risks; what we are really bad at is enumerating all the assumptions that lie behind our estimates.

Figure 2
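
The arithmetic behind such a tree is easy to mechanize. Here's a minimal sketch in Python -- every project, cost, payoff, and probability below is invented for illustration -- that weights each chance fork by its probability and lets the decision node pick the branch with the best expected net value:

```python
# Hypothetical two-project decision; all figures are invented.
projects = {
    "Project A": {"cost": 100_000,
                  "outcomes": [(0.6, 250_000), (0.4, 50_000)]},
    "Project B": {"cost": 60_000,
                  "outcomes": [(0.5, 180_000), (0.5, 40_000)]},
}

def expected_net_value(project):
    """Weight each chance-fork payoff by its probability, net of cost."""
    ev = sum(p * payoff for p, payoff in project["outcomes"])
    return ev - project["cost"]

for name, proj in projects.items():
    print(f"{name}: expected net value = ${expected_net_value(proj):,.0f}")

# The decision node picks the branch with the highest expected net value.
best = max(projects, key=lambda n: expected_net_value(projects[n]))
print("Choose:", best)
```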

After creating a decision tree, managers (assisted by a financial officer when necessary) may use the data it contains to calculate Net Present Value (NPV) -- the difference between the cost of an investment and the return on an investment measured in today's dollars -- or other such determinants of ROI. The article Web Services Return on Investment introduces this subject. Note, however, that building a decision tree and calculating NPV before management decides whether to undertake a proposed project -- without accounting for speed of rollout, system adoption rate, and the like -- can lead to an unexpected outcome.
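
For reference, the NPV calculation itself is simple: discount each year's expected cash flow back to today's dollars at the organization's cost of capital, then sum. A minimal sketch, with invented figures:

```python
def npv(rate, cash_flows):
    """Net present value: cash_flows[0] is the up-front investment
    (negative); cash_flows[t] is the expected net benefit in year t."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical project: $200,000 up front, $75,000 in annual benefits for
# four years, discounted at a 10 percent cost of capital.
flows = [-200_000, 75_000, 75_000, 75_000, 75_000]
print(f"NPV at 10%: ${npv(0.10, flows):,.0f}")    # positive, so worth a look
```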

And, the drawn-out process of "proving" value before implementation is the wrong approach when considering small projects. That's because a low-cost trial can more easily and accurately forecast the outcome. After all is said and done, the best choices about which project to undertake are sometimes made by experienced managers with successful track records, based as much on instinct and knowledge of their organization's human assets as any formal, quantitative analysis of the many variables involved.

Project Management

Project management differs from more traditional management activities, mainly because of its limited framework, which gives rise to a host of rather unique problems. Projects go through a series of stages. During this life cycle, a variety of skill requirements are involved. And projects bring together personnel with diverse behaviors and knowledge and skills, some of whom remain with the project for less than its full life.

The project manager needs to be effective in planning, scheduling, controlling, and identifying risks. An interesting account of how to manage risk on software projects (and provide minimum-cost downside protection) can be found in Waltzing with Bears.

I lean in the direction of having no unnecessary measurements (a.k.a. controls) in project management. The key word is "unnecessary." Project managers employ tools (such as Gantt or PERT charts) to obtain the following (a minimal sketch of the underlying critical-path arithmetic appears after the list):

  1. A graphical display of project activities.
  2. An estimate of how long the project will take.
  3. An indication of which activities are the most critical to timely project completion.
  4. An indication of how long any activity can be delayed without lengthening the project.
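
Here is that critical-path arithmetic in miniature: a forward pass computes each task's earliest finish (and hence the project length), and a backward pass computes the slack that answers items 3 and 4. The task graph is invented for illustration.

```python
# A hypothetical task graph: task -> (duration_in_days, prerequisites).
tasks = {
    "design":  (5,  []),
    "build":   (10, ["design"]),
    "test":    (4,  ["build"]),
    "docs":    (3,  ["design"]),
    "release": (1,  ["test", "docs"]),
}

earliest = {}
def ef(name):
    """Forward pass: earliest finish time of a task (memoized)."""
    if name not in earliest:
        duration, preds = tasks[name]
        earliest[name] = duration + max((ef(p) for p in preds), default=0)
    return earliest[name]

project_length = max(ef(t) for t in tasks)           # item 2: total duration

# Backward pass: latest finish that doesn't lengthen the project.
latest = {t: project_length for t in tasks}
for name in sorted(tasks, key=ef, reverse=True):     # reverse topological order
    duration, preds = tasks[name]
    for p in preds:
        latest[p] = min(latest[p], latest[name] - duration)

for name in tasks:
    slack = latest[name] - ef(name)                  # items 3 and 4
    status = "CRITICAL" if slack == 0 else f"slack = {slack} days"
    print(f"{name:8s} earliest finish: day {ef(name):2d}  ({status})")
```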

The idea is to show interrelationships among activities. But many projects are subject to "change orders" as the work proceeds, whereby the final product has features different from those originally planned. Obviously, changes in scope can also affect quality, schedule, and cost. So quantitative tools for project management are too easily used without factoring in human behavior to the extent that it needs to be.

The Mythical Man-Month and It Sounded Good When We Started offer useful insights for working with people on projects. And, finally, you may have heard an experienced project manager quote the maxim, "A carelessly planned project takes three times longer to complete than expected; a carefully planned one takes only twice as long."

Development

The developer (or the person who makes technology decisions for him or her) must decide on the language, framework, platform, and tools to be used in the development of a web-services (or any other kind of) application. These can be tricky choices. Sometimes they're made for personal, not business, reasons: in support of an individual's career-advancement goals, out of intellectual curiosity about the latest technology to catch the developer's eye, or even based on nothing but marketing hype.

Equally far-reaching are decisions made in the selection of programming methodology: old-fashioned, fly-by-the-seat-of-your-pants (preferred by many), or the sometimes oversold RAD and XP. Taken by themselves, neither of these choices assures the best outcome. However, when considered with other factors -- such as the match of the individual developer's skill sets to the needs of the project, her views on and needs for more or less management, and individual motivation -- logical choices can be made.

Remember, there's a cost to every decision. With RAD, for example, does the software you deliver sooner mean that your customer must accept a product that is less usable, less fully featured, or less efficient? That is, will lower development costs also mean a lower level of customer satisfaction?

Job Design

Good job design is not easy. As jobs continue to evaporate in the post-dotcom era, low morale among IT workers is increasingly threatening productivity, according to a recent study by research and consulting firm Meta Group of Stamford, Conn. The firm reported that 71 percent of IT managers surveyed indicated that IT employee burnout is currently a serious issue in their organizations.

So, it's more important now than ever to design jobs -- at every level of your organization -- with not only a logical approach, which doesn't consider satisfaction of wants and needs, but also a behavioral approach, which does. A job description that calls for specialization, such as only Java or only C#, allows the programmer to concentrate her efforts and thereby become proficient in some aspect of the task.

For some, however, this narrow scope can be dull, downright boring, and the source of dissatisfaction. On the other hand, some workers prefer a job with limited requirements and responsibilities and are not capable of handling jobs with larger scope. So, matching the developer's personality and skill set to his or her job can go a long way toward maximizing productivity and quality.

Other Issues

A number of other issues affect each level of the hierarchy. One, interrupted or successively different job assignments, arguably affects the developer more than other members of the hierarchy. Interruptions in concentration create the need to relearn. They occur when jobs are split or when an expedited job interrupts an existing one.

While the amount of forgetting may be difficult to estimate, the cost of restarting an interrupted task can be material. In contrast, when successive jobs are not interrupted, but simply different, there's no forgetting; it's all about learning. In either case, there's a startup cost.
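
One standard way to put numbers on this learning effect is Wright's learning-curve model (not named in the discussion above, but a common tool in managerial economics), in which every doubling of repetitions cuts task time to a fixed fraction of its previous level. The 85 percent rate and eight-hour first attempt below are invented for illustration:

```python
import math

# Wright's learning-curve model: each doubling of repetitions cuts task
# time to a fixed fraction r of its previous level. The 85% rate and the
# 8-hour first attempt are invented for illustration.
def task_hours(n, first_attempt_hours=8.0, r=0.85):
    """Hours for the n-th repetition of the same task."""
    b = math.log(r, 2)                   # negative exponent for r < 1
    return first_attempt_hours * n ** b

for n in (1, 2, 4, 8):
    print(f"repetition {n}: {task_hours(n):.1f} hours")
# An interruption that forces relearning pushes the worker back down this
# curve; the climb back up is the startup cost.
```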

Another issue is management style. It has a direct effect on every individual in the organizational chart and affects the bottom line. I'll concentrate on the opposite poles of a hands-off management style (a.k.a. university) and a micro-managed (a.k.a. kindergarten) management style.

When the former is paired with skilled workers who like their jobs, creativity is often the result. The other extreme -- the micro-management of under-skilled or unhappy workers -- is frequently a good idea. But the micro-management of skilled workers who otherwise like their jobs can prove counterproductive; you may have your own examples. Many professionals resent excessive attempts to control their behavior.

Contrast this with Peter Drucker's idea that "a manager's job is to make people's strengths effective and their weaknesses irrelevant." In some environments, managers even encourage employees to risk making mistakes and create an atmosphere that encourages them to be open when errors occur. After all, you have to make mistakes before you (and everybody else) can learn from them.

Cost of Federal Regulations

Federal regulations, particularly those concerning confidentiality, affect the bottom line in every industry. The biggest driver of security technology in the healthcare space is the Health Insurance Portability and Accountability Act of 1996 (HIPAA), whose final rules took effect on April 21, 2003. It includes a set of provisions to move a number of administrative healthcare functions online.

The U.S. government has mandated that healthcare entities, which deal with a large amount of confidential information, must gradually increase their HIPAA-compliant volume of transactions. The emergence of standards such as web services (and HL7, an ANSI-accredited standard widely used to interface the independent systems in healthcare institutions concerned with clinical information) can help these organizations achieve HIPAA compliance and re-architect their internal applications infrastructure with less disruption than traditional approaches.

While there's a cost benefit when you implement HIPAA-compliant web-services apps correctly, there can be a steep price to pay when your software fails to meet HIPAA's rules on privacy and the like. Noncompliance can cost your organization thousands of dollars a year in penalties, and a violation of the Act's privacy provision for personal gain or with malicious intent can result in a fine of up to $250,000, imprisonment for up to 10 years, or both.

And, of course, there's also the risk of civil suits. This privacy rule is expected to cost the health-care industry $17.6 billion over 10 years, according to the preamble of the regulation.

HIPAA is for the most part technology neutral, and digital signatures are not presently required. However, without the use of PKI, confidentiality, message integrity, access control, non-repudiation, and end-entity authentication are difficult to achieve.

Key factors to use when estimating the cost of putting a PKI infrastructure in place are the number of users, the number of applications, and whether the Certifying Authority (CA) service is being outsourced or managed fully in-house. If the cost of certificates or related infrastructure services exceeds 10 percent of the overall cost of an application, a positive ROI becomes increasingly difficult to achieve. Naturally, to determine the real cost of PKI, you have to amortize all software, hardware, consulting, and other hidden costs of deployment.
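
A back-of-the-envelope sketch of that 10 percent rule of thumb -- every dollar figure below is invented:

```python
# Hypothetical amortized annual PKI costs (all figures invented).
users = 2_000
cert_cost_per_user = 25.00          # e.g., an outsourced CA's per-seat fee
infrastructure_cost = 20_000        # hardware, software, consulting, operations
application_cost = 500_000          # overall annual cost of the application

pki_cost = users * cert_cost_per_user + infrastructure_cost
share = pki_cost / application_cost
print(f"PKI share of application cost: {share:.0%}")   # 14%
if share > 0.10:
    print("Above the ~10 percent threshold: positive ROI gets hard to achieve.")
```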

Biometrics have become interesting for the healthcare industry because they're inexpensive, mobile, and relatively secure. But biometrics by themselves don't solve HIPAA compliance issues: it takes a digital certificate to achieve non-repudiation with a digitally signed transaction. By combining a digital certificate with a biometric device, however, you can achieve good security practices and HIPAA compliance for many healthcare organizations' tasks.

Total Cost

The notion that "time is money" has been understood from ancient times. So, while the consumer of a cost-saving software application may want to receive it as soon as possible (minimizing "delay costs"), the development team may not want to speed up the project (minimizing "crashing costs"). Management needs to strike the appropriate balance -- on a case-by-case basis -- between these competing interests.
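
As a toy illustration of that balance, suppose every week before delivery forgoes a fixed benefit (the delay cost), while each week cut from the schedule costs an escalating amount in overtime and extra staff (the crashing cost). All figures below are invented:

```python
# Toy trade-off between delay costs and crashing costs (figures invented).
normal_weeks = 20
delay_cost_per_week = 8_000        # benefit foregone each week before delivery
crash_cost = {0: 0, 1: 3_000, 2: 7_000, 3: 13_000, 4: 22_000}  # cumulative

def total_cost(weeks_crashed):
    duration = normal_weeks - weeks_crashed
    return duration * delay_cost_per_week + crash_cost[weeks_crashed]

for w in sorted(crash_cost):
    print(f"crash {w} weeks: deliver in {normal_weeks - w}, "
          f"total cost ${total_cost(w):,}")

best = min(crash_cost, key=total_cost)
print(f"Balance point: crash {best} week(s).")
```

With these numbers, the minimum total cost comes from crashing three weeks: spending a little to speed up pays for itself, but spending a lot does not.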

In addition to those mentioned so far, other costs frequently need to be considered (or disregarded):

  • Sunk Costs -- Costs of past investments that cannot be recovered. They have no relevance to new investments.
  • Overhead Costs -- The best way to determine whether overhead costs should influence the project-selection decision is by estimating the incremental overhead cost, or the difference in these costs with and without a particular (proposed) investment.
  • The Cost of Money -- A very important factor when considering the long-term costs and benefits, especially through periods of high cost of capital.

A discussion of the post-completion audit is necessary in any treatment of capital budgeting. The post-audit involves (1) a comparison of actual results to those predicted in the investment proposal, and (2) an explanation of observed differences. The post-audit has several purposes, including improved forecasts and improved operations.

But the predicted results are based on probabilities and the actual results sometimes fail to meet expectations for reasons beyond the control of the people directly responsible for the outcome, as well as for reasons that no one could realistically be expected to anticipate. So, if the post-audit process is not used carefully, management may be reluctant to suggest potentially profitable but risky projects.

Economic benefit from employing any emerging software technology is possible when the processes of development and deployment are managed well.

Conclusion

By focusing too much on technology and not enough on whether you're actually solving a business problem (as opposed to simply automating a flawed one), on what makes people tick (their individual wants, needs, and skills), and on the hidden costs of compliance with governmental regulations, you can wind up squandering precious resources. In the process, you can dampen the enthusiasm of senior management for a new technology such as web services, which, if properly introduced, can improve your organization's ability to do business.

The most important goal for any software development group should be to achieve customer satisfaction by producing end-deliverables that satisfy, and hopefully even exceed, the customer's expectations. But a separate consideration, and the real driver for software development, should be total cost of ownership. Your organization should produce results that are economically justified, not ones that simply enable bragging rights to having adopted the latest technology.