Software Acquisition Review


For organizations large and small, software provides immense benefits. Yet these benefits do not come without risks; introducing software into a system, particularly software created by groups outside the organization, poses serious information technology security threats. Beyond the initial dangers, there are also ongoing concerns with maintaining the integrity of the new software. When the organization in question is part of the government, rather than a smaller entity with less power and responsibility, the potential consequences are amplified. Furthermore, government organizations are more likely targets of attack than other groups, if only for their visibility and perceived power. The military is perhaps the most obvious target of all for software infiltration, and so it is instructive to examine a particular branch and a specific acquisition. The United States Navy has recently begun implementing a set of mission modules for close-to-shore, shallow-water operations, a system referred to in total as the Littoral Combat Ship, a phrase which, perhaps confusingly, is also the name of the primary class of ships used in the system. Though various structural problems with software components have hindered the integration of the Littoral Combat Ship into the overall framework the Navy employs, to the software risk management assessor the more relevant aspects of acquiring the various modules are the potential threats they may pose. However, by developing a plan of action and milestones, the risks of hidden malware, fraudulent administrator access, and unintentional system security weaknesses can be mitigated.

Three Primary Risks

The primary risks involved in bringing various software components into the overall Littoral Combat Ship network consist of both intentional infiltration attempts and unintentional weaknesses. Of necessity, the modular nature of the approach means that there is not merely one major software acquisition in question but rather multiple smaller acquisitions. O’Rourke (2005) described the Littoral Combat Ship, in the sense of the vessel rather than the overarching system, as follows: “The Littoral Combat Ship (LCS) is to be a small, fast Navy surface combatant with modular weapon systems. The LCS program was announced by the Navy in November 2001 as part of a proposed family of next-generation Navy surface combatants” (p. 1). The fact that each individual ship carries multiple modular weapon systems makes for a unique challenge for software acquisition risk assessment. One unintentional vulnerability to outside attempts to breach system integrity is that, in order to interface between the various modular components, data must at times pass between different software systems over relatively less secure channels. Protecting against this eventuality might well require that the Navy also seek out software specifically designed to integrate diverse components and strengthen these weak “joints” between structural elements. However, the possibility of deliberate system threats is perhaps more alarming.
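To make the concern about weak “joints” more concrete, the following minimal sketch, written in Python purely for illustration, shows one generic way a receiving module could detect tampering with data passed from another module: attaching a keyed hash (HMAC) to each message. The module names, message contents, and shared-key handling are assumptions made for the example, not features of the Navy’s actual architecture.

```python
import hmac
import hashlib
import json

# Hypothetical shared secret; in practice, key management would rely on the
# organization's existing key-distribution infrastructure.
SHARED_KEY = b"example-key-material-not-for-real-use"

def sign_message(payload: dict) -> dict:
    """Attach an HMAC-SHA256 tag so the receiving module can detect tampering."""
    body = json.dumps(payload, sort_keys=True).encode("utf-8")
    tag = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
    return {"body": payload, "tag": tag}

def verify_message(envelope: dict) -> bool:
    """Recompute the tag and compare in constant time before trusting the data."""
    body = json.dumps(envelope["body"], sort_keys=True).encode("utf-8")
    expected = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, envelope["tag"])

# Illustrative exchange: one module passing a status update to another.
message = sign_message({"module": "navigation", "status": "ready"})
assert verify_message(message)
```

A scheme of this kind does not by itself secure a weak channel, but it illustrates the sort of integration-layer safeguard the acquiring organization might require of module vendors.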

Two of the more obvious threats to system security, particularly when many different modules are involved, are the possibility that the software contains malware and the potential for the system to grant access to those who are not authorized. The former threat is fairly straightforward; though the likelihood is heightened when the creation of new software is outsourced to those outside the organization, even someone with ill intent inside the group could deliberately put harmful code into the software. As for unauthorized administrative access, the issue goes beyond simple password protection measures or anti-malware software: a developer who chose to do so could build disguised “backdoor” access into the system. Both risks are directly pertinent to the modular acquisitions under consideration.
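As a purely illustrative sketch of one common mitigation for the hidden-malware risk, the Python code below checks a delivered module file against a digest supplied through a separate, trusted channel before the file is accepted. The file name, manifest, and digest value are hypothetical placeholders, not actual LCS artifacts.

```python
import hashlib
from pathlib import Path

# Hypothetical manifest of expected SHA-256 digests, as might be supplied
# by the vendor through a separate, trusted channel.
EXPECTED_DIGESTS = {
    "mission_module.bin": "0000000000000000000000000000000000000000000000000000000000000000",  # placeholder
}

def sha256_of(path: Path) -> str:
    """Hash the delivered file in chunks so large modules do not exhaust memory."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_delivery(path: Path) -> bool:
    """Refuse to accept a module whose digest does not match the manifest."""
    expected = EXPECTED_DIGESTS.get(path.name)
    return expected is not None and sha256_of(path) == expected

# Hypothetical usage:
# if not verify_delivery(Path("deliveries/mission_module.bin")):
#     raise RuntimeError("Delivered module failed integrity check")
```

Such a check addresses tampering in transit rather than malicious code written by the vendor itself, which is why it would supplement, not replace, code review and access controls.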

Justification of Risks Selected

The risks selected are not arbitrary. Though they may seem to be simply the first threats that come to mind, it is precisely because such concerns are so common that they must bear the primary brunt of the attention given to securing the system. In this manner, the approach to securing the system can be formalized around what the majority of outside attackers would most readily consider the best routes for gaining access to government software systems. Concerning the need for a strong framework, Boehm (1991) described the then-emerging field of software risk management as “. . . an attempt to formalize the risk-oriented correlates of success into a readily applicable set of principles and practices,” with the objective “. . . to identify, address, and eliminate risk items before they become either threats to successful software operation or major sources of software rework” (p. 32). This is the very definition of software acquisition risk management, and it cannot be taken too seriously. Yet determining the most likely avenues for destabilizing the software’s security from an acquisitions point of view is not enough; the organization must also be aware of other vulnerabilities.

Though one would not expect deliberately shoddy work when a major government organization such as the United States Navy commissions software, mistakes are always possible. Oversights on the part of those coding the software are of concern simply because human error is such a regrettably common occurrence. From the outside, it is of course more difficult to attack a system through this angle than through known, deliberately induced weaknesses. However, mistakes are on the whole a far more commonplace variety of vulnerability than a well-planned, deliberate malicious strike. Thus it is helpful to have outside eyes examine the code for holes and structural concerns. Even the possibility of a system crashing at a crucial moment is cause for profound alarm when the software involved may govern the deployment of weapons in key naval encounters. Though in some senses this worry is not strictly within the purview of software acquisition risk management, in a larger, more holistic view of the topic, malicious exploitation can grow out of what was initially a benign mistake by a programmer or group of programmers. For this reason as well as others, it is vital to implement a plan of action and milestones to protect the security of the Littoral Combat Ship software components.

Plan of Action and Milestones

In any assessment of the risks involved in software acquisitions, a plan of action and milestones is an instrumental tool for organizing the evaluator’s thoughts, concerns, and proposed counter-strategies. For the Littoral Combat Ship systems and the associated software modules, the plan of action must begin with an initial phase in which time, budgetary resources, and other quantities are allocated among the various modules. This can be done in a number of ways. One method would be to distribute resources according to the quantity of code involved, the idea being that the sheer size of a software program dictates how long it takes to review and maintain the security of the given module. Another approach, more helpful in this particular circumstance where there is a very real threat of infiltration, would be to divide the relevant quantities according to which systems are more vital or have the greatest potential for harm. These two qualities are not identical; weapons systems, for example, can cause great harm but are not essential to keep a ship afloat and on an appropriate navigational course. Particularly given the shallow-water conditions under which a littoral ship operates by definition, navigational systems must be given special emphasis. An important milestone will have been passed when the appropriate division of resources has been achieved.
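The following sketch illustrates, under assumed numbers only, how such a criticality-and-harm weighting might divide a review budget among modules. The module names, scores, and total hours are invented for the example and do not reflect actual Navy assessments.

```python
# Hypothetical modules scored on a 1-5 scale for how vital they are and how
# much harm they could cause; both scales are assumptions for illustration.
modules = {
    "navigation": {"criticality": 5, "harm_potential": 3},
    "mine_warfare": {"criticality": 3, "harm_potential": 5},
    "surface_warfare": {"criticality": 3, "harm_potential": 5},
    "hull_and_mechanical": {"criticality": 4, "harm_potential": 2},
}

TOTAL_REVIEW_HOURS = 2000  # assumed budget for one review cycle

def allocate(module_scores: dict, total_hours: float) -> dict:
    """Split the review budget in proportion to each module's combined weight."""
    weights = {
        name: attrs["criticality"] + attrs["harm_potential"]
        for name, attrs in module_scores.items()
    }
    weight_sum = sum(weights.values())
    return {name: total_hours * w / weight_sum for name, w in weights.items()}

for name, hours in allocate(modules, TOTAL_REVIEW_HOURS).items():
    print(f"{name}: {hours:.0f} review hours")
```

Whatever weighting is chosen, the point is that the division of resources is made explicit and repeatable, so that the milestone of completing it can be verified.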

From the point at which resources are parceled out, the approach taken must be one of regularly revisiting any decisions made. As threats are identified and eliminated or otherwise shored up, the system must be reevaluated again and again to ensure its ongoing security. At times there can even be an “arms race” between those attempting to secure the software and those trying to infiltrate it, but this is a necessary and ongoing part of the challenge of software security management. In this sense, each threat disposed of can be considered a milestone, as can each regular evaluation successfully completed. The creation of documents and records concerning the ongoing treatment of the system should not be neglected as another opportunity to mark the very real accomplishments of personnel as milestones. Indeed, the greater the number of milestones proposed, both large and small, the easier it will be to keep the personnel involved motivated. This, among other aspects, is part of what justifies the approach given here.
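As a minimal illustration of how disposed-of threats and completed evaluations might be recorded as milestones, the sketch below defines a simple threat register. The identifiers, dates, and descriptions are hypothetical and serve only to show the shape of such a record.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class ThreatRecord:
    """One entry in a hypothetical threat register; all fields are illustrative."""
    identifier: str
    module: str
    description: str
    identified_on: date
    resolved_on: Optional[date] = None

    @property
    def is_milestone(self) -> bool:
        # Under the plan above, each threat disposed of counts as a milestone.
        return self.resolved_on is not None

register = [
    ThreatRecord("T-001", "navigation", "Unsecured inter-module data link",
                 date(2014, 1, 12), date(2014, 3, 2)),
    ThreatRecord("T-002", "mine_warfare", "Unvetted third-party library",
                 date(2014, 2, 20)),
]

milestones_reached = sum(1 for record in register if record.is_milestone)
print(f"Milestones reached this cycle: {milestones_reached}")
```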

Justification of Proposed Actions

For the actions suggested here, justification comes from the existing literature on the topic. The model proposed is essentially one of spiral development. This technique has been established for some time; as early as Boehm (1988), spiral development was described as “. . . an evolving risk-driven approach that provides a framework for guiding the software process” (p. 61). Though spiral models do require more frequent evaluation and replacement of individual components than other approaches, the framework has long been established as adequate for the task at hand. Indeed, there is research specifically advocating the use of a spiral model for the Littoral Combat Ship system’s array of related software. As Forbes, Volkert, Gentile, and Michaud (2009) put it: “Through this continual rolling in of evolving components the system continues to offer more advanced capability. This creates an elaborate tradeoff scenario in which dissimilar attributes must be examined, weighted, and analyzed for best value and applicability to user needs” (p. 1). Though this observation highlights the labor-intensive aspects of the spiral model, it also draws attention to the fact that the strategy provides maximum capacity to revisit and revise elements of the system to best meet user needs at any given moment. Yet in order to continue providing the best value for the system, one essential component is measurement.

Measurement of Proposed Actions

No plan of action is complete without a means of measuring its progress, though the strategies that might be employed differ. For example, Putnam and Myers (2013) discussed the distinction between assessments and metrics, showing a strong preference for the latter as being less subject to evaluator bias. Indeed, the actions suggested here must be measured in terms that are as objective as possible, so that consistency is maintained from year to year. However, Putnam and Myers also rightly pointed out certain dangers of strictly periodic measurement: there can be a tendency to show enthusiasm and motivation only around assessment dates, leaving team performance otherwise lackluster. This problem has no pat solution. One approach would be to schedule evaluations more frequently, leaving little opportunity for the work to grow stale. Yet this carries its own risk, for the more often assessments occur, the less likely personnel are to take any one of them seriously. Though the evaluations might then be weighted, so that the end-of-year examination counts for more than a quarterly one, for example, it is unclear whether that would prove a satisfactory solution to the problems Putnam and Myers identified. Overall, though, these concerns must be set aside in favor of making some decision, even if it does not ultimately prove to be the best of the possible approaches.

Given the available information, the best way to schedule evaluations appears to be on a quarterly basis, with the option to assess more frequently if immediate problems or concerns warrant the change. In many organizations, how power and authority would be distributed to handle this might be unclear, but as the entity under study is the Navy, the existing chain of command ought to establish clear routes of communication and responsibility. On the finer scale, an evaluation should be stated in terms of potential threats eliminated, perhaps as a proportion of the total potential threats identified for the period in question. Part of measurement, too, is record-keeping, and there must be responsible parties for this aspect as well. One concern, of course, is the security and safety of the records themselves. Because the documents created in the process of threat identification and elimination are likely to consist of spreadsheets and word processing files, their integrity will already be covered by the Navy’s existing systems for backup and security. Since there will be nothing particularly novel about the communication and documentation techniques employed, there is no need to elaborate here on separate strategies for safeguarding the data. In the end, the fact that the proposed software is being introduced into an organization with many processes already in place, though it has its downsides, is more of a blessing than a curse for software acquisition risk management.
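A minimal sketch of the quarterly metric described above, computing threats eliminated as a proportion of threats identified in each period, appears below. The counts are assumed values included only for illustration.

```python
def elimination_rate(eliminated: int, identified: int) -> float:
    """Return threats eliminated as a fraction of all threats identified in the period."""
    if identified == 0:
        return 1.0  # a quarter with no identified threats counts as fully addressed
    return eliminated / identified

# Hypothetical quarterly counts: (threats eliminated, threats identified).
quarterly_counts = {"Q1": (8, 10), "Q2": (5, 6), "Q3": (9, 12), "Q4": (7, 7)}

for quarter, (eliminated, identified) in quarterly_counts.items():
    rate = elimination_rate(eliminated, identified)
    print(f"{quarter}: {rate:.0%} of identified threats eliminated")
```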

Conclusion

Malware, administrator fraud, and simple mistakes all undermine confidence in the security of the various modular components of the Littoral Combat Ship systems. However, with a spiral development approach of regularly revisiting these aspects on a scheduled basis, the threats can be mitigated or even eliminated outright. The security of the United States Navy’s shallow-water combat structures and the various vessels involved is too important to neglect, and thus the role of those who evaluate software acquisition risks is as vital here as it is elsewhere. Taken altogether, the variety of ways in which confidence in a software system can be enhanced is well worth the effort invested.

References

Boehm, B. W. (1988). A spiral model of software development and enhancement. Computer, 21(5), 61-72.

Boehm, B. W. (1991). Software risk management: Principles and practices. IEEE Software, 8(1), 32-41.

Forbes, E., Volkert, R., Gentile, P., & Michaud, K. (2009). Implementation of a methodology supporting a comprehensive system-of-systems maturity analysis for use by the Littoral Combat Ship mission module program (No. NPS-AM-09-019). San Diego, CA: Space and Naval Warfare Systems Center.

O’Rourke, R. (2005). Navy Littoral Combat Ship (LCS): Background and issues for Congress. Washington, DC: Library of Congress Congressional Research Service.

Putnam, L., & Myers, W. (2013). Five core metrics: The intelligence behind successful software management. Boston, MA: Addison-Wesley.