Blog Archives

Responsible Online Gaming

In this video from TEDxUniversityofNevada, James Kosta of 3G Studios explains why the next generation of young adults will be strongly attracted to online gaming. Kosta identifies the “3 R’s” of regulated gaming as risk, randomness, and reward, then explains how online platforms will excel at providing this experience to users. Because of this, Kosta calls for a fourth “R” – responsibility.

Kosta packs extensive research and insight into this 15-minute talk. Take the time to watch it, then share your thoughts in the comment section below.

Most IT Project Failures Are NOT Due To Technical Problems

One of the points I stress through the entire sixteen weeks of my MBA IT classes is that most IT project failures – and by some estimates over 60% of all significant IT/IS installations are “challenged” – stem from people problems, not technical problems. Despite the increasing technical sophistication of MBA students in general, this fact still surprises many. In the vast majority of cases of full or partial information system failure, the hardware and software work exactly as specified, and the failure can be traced back to one or more common management misunderstandings, which lead to failures in change management for the systems project.

Arguably the most common misunderstanding is underestimation of the complexity – in organizational terms – of non-trivial information systems implementations. All non-trivial IS implementations change business processes. This is another point I stress, and on reflection it is easy to see that it can be no other way. Even the most sophisticated software is fixed in terms of the information it requires to function. Gathering this information at the level of detail and in the sequence required is a process, sometimes a very fundamental one, and often a process that is new to the organization if not actually counter to existing processes. When management underestimates the amount of change required by the system implementation, they do not adequately prepare operations personnel. The result is resistance to the new system and, in some cases, passive aggression toward the system that is tantamount to sabotage.

Some systems never even survive to full installation due to underestimation of the complexity and corresponding effort of implementation. A very common problem results from expecting operations personnel to handle data/information transfer to the new system while continuing to perform the full range of duties of their current jobs. In a surprising number of instances, management does not even offer overtime or other extra compensation. What is unsurprising is the inevitable resentment and confusion among operations personnel, which frequently causes installations to drag far behind schedule, negatively affects normal business functions, and can result in significant employee turnover. Another factor that frequently accompanies underestimation of the IS project is inadequate training in the new system. Occasionally the need for both training and backfill (the hiring of temporary employees to assist in the installation of a new system) is recognized by management but not pursued due to cost considerations. Many studies have unequivocally shown this to be false economy. The cost of training, backfill, and other implementation support, such as consulting, should be acknowledged by management and factored into the overall cost/benefit calculation for the system.
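The point about factoring implementation support into the cost/benefit calculation can be sketched as simple arithmetic. All figures below are hypothetical, chosen only to illustrate how badly a budget can be understated when training, backfill, and consulting are left out:

```python
# A minimal sketch of a total-cost comparison for an IS project.
# All dollar figures are hypothetical, for illustration only.

def total_implementation_cost(license_fee, training, backfill, consulting):
    """True project cost includes change-management items, not just the license."""
    return license_fee + training + backfill + consulting

# "Appliance mentality" budget: license only.
naive_budget = total_implementation_cost(500_000, 0, 0, 0)

# Realistic budget: license plus training, backfill, and consulting.
realistic_budget = total_implementation_cost(500_000, 60_000, 90_000, 150_000)

print(f"Naive budget:     ${naive_budget:,}")       # $500,000
print(f"Realistic budget: ${realistic_budget:,}")   # $800,000
print(f"Understated by:   {realistic_budget / naive_budget - 1:.0%}")  # 60%
```

Even with modest hypothetical figures, the license-only view understates the real commitment by more than half, which is exactly the shortfall that shows up later as schedule slip and overworked operations staff.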

The mindset that results in underestimation of IS complexity is sometimes termed ‘an appliance mentality’; that is, the system is regarded much as a refrigerator or washing machine might be. You plug it in and it works, right? Alas, no – and studies have shown that this attitude increases as the management hierarchy is ascended. C-level executives, without whose support no project can be expected to succeed, exhibit surprising ignorance of technology, even today.

Another entire set of problems, even more intractable than underestimation of system complexity, arises when non-technical management gives or surrenders control of information systems projects entirely to technical personnel. If control is relinquished solely to technical personnel early in the project, the frequent result is a technically correct system that solves the wrong problem. One technique for correcting this set of issues is dual managers: one from IS/IT and another from the business side of the house. While common in Europe, this management style has yet to gain widespread acceptance in the US.

Invariably in my classes, many hands go up when I ask who has seen a ‘challenged’ IT installation at their workplace. The resulting discussion is a valued part of the learning process for most MBAs. Anyone care to comment on experiences you’ve had with ‘challenged’ systems? (Names can be changed to protect the innocent. 😉)

William L. Kuechler, Jr., Ph.D.

William Kuechler is professor of information systems and chair of the information systems discipline at the University of Nevada, Reno. He holds a bachelor’s degree in electrical engineering from Drexel University and a Ph.D. in computer information systems from Georgia State University.


Best Practices For Software Application Packages

The phrase ‘best practices’ as applied to software applications is relatively new. In the twenty years I was associated with the software industry – first as a programmer, then as an analyst, and eventually as a consultant and product manager – I never used or heard the phrase. There were good reasons for this, but before I can explain them, I need to establish some background.

If you thought about the title of this post, you might well ask yourself, “What do you mean – practices? What does that have to do with software? You mean – written the best way? Or – the most bells and whistles?”

What the phrase ‘best practices’ alludes to in relation to a software package is something I stress in my MBA classes because it surprises many people: ‘practices’ – the business processes that dictate what your company’s employees will do and when they’ll do it – are ‘baked into’ any non-trivial piece of software. When you buy a software application – let’s say your company has decided it could benefit from a Customer Relationship Management (CRM) package – you are locked into the processes, the practices, built into the software. If you use the package, then you WILL feed it the information it needs, in the format it requires and in the sequence it wants. This is so whether or not you work that way now, whether or not you currently have the required information available, and whether or not the process fits your company’s culture.
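To make the ‘baked-in’ idea concrete, here is a minimal Python sketch of a packaged application’s data-entry routine. The field names and the `enter_customer_record` function are invented for illustration; they stand in for the fixed schema any real CRM package would impose:

```python
# Hypothetical sketch: a packaged application dictates exactly what data
# it needs and in what form. Field names here are invented, not from any
# real CRM product.

REQUIRED_FIELDS = ("customer_id", "contact_name", "region", "last_contact_date")

def enter_customer_record(record: dict) -> dict:
    """Accept a record only if it matches the package's fixed schema."""
    missing = [f for f in REQUIRED_FIELDS if f not in record]
    if missing:
        # Your company must now produce this data, whether or not your
        # current process gathers it.
        raise ValueError(f"Record rejected; missing fields: {missing}")
    return record

# A record that fits your old process but not the package's schema:
try:
    enter_customer_record({"customer_id": "C-1001", "contact_name": "A. Smith"})
except ValueError as e:
    print(e)  # the package, not your process, decides what is acceptable
```

The code is trivial, and that is the point: the constraint is not technical sophistication but the fact that the package, not your organization, defines the required inputs.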

Well, you say, software can be modified, right? Right – for a price. The first price you pay when you modify COTS (commercial off-the-shelf) software is that you defeat the reason you bought the package in the first place: it was written, debugged, and ready to use. The second price is Yankee dollars, lots and lots of them. It is widely known that the cost of consulting and modification for most ERP (enterprise resource planning) installations exceeds the multi-million-dollar cost of the software itself.

Thus, it is critical to the success of a software package installation to thoroughly investigate the practices built into the application before you buy, and this brings us back to the notion of ‘best practices.’ Almost no software is developed, even by very large software companies such as Oracle or SAP, without a client. The client supplies the specifications for the application – the screens it should show, the information it needs, the reports it should produce, and so on. Of necessity, the software mirrors the processes needed to satisfy the requirements. Because the early clients of a newly developed software application are typically large, successful organizations, the marketing arms of large software companies began, about fifteen years ago, to promote these ‘baked-in’ processes as ‘the best practices of the best corporations.’ Application developers are thus able to develop the software once and sell it multiple times – a very profitable undertaking. To understand the marketing coup that this was, it is necessary to understand the many circumstances under which ‘best practices’ are actually a detriment.

What is wrong with doing my inventory the way Mercedes-Benz does it, you may ask. Or, why shouldn’t I want to handle my accounts receivable the way Nestlé does? First, issues can arise from disparities in size between your organization and Mercedes. Organizational size breeds process complexity – complexity that leads to high cost and long learning curves for the software that supports it, and likely to a multitude of reports and functions that smaller companies will never use. Even more importantly, standard processes make it impossible to achieve sustainable strategic advantage from your software-enabled processes. A highly advanced, non-standard logistics process is what gives Wal-Mart a strategic advantage over competitors and the ability to grow market share and sustain growth. A very non-standard, computer-supported order fulfillment process is exactly what has enabled Amazon to become a major player in multiple retail marketplaces. As you may have already surmised, the software that supports Amazon and Wal-Mart, at least in the critical areas mentioned, is (and has to be) as non-standard as the processes themselves.

The key to determining whether the ‘best practices’ of a software application package are really best for you lies in understanding which aspects of your business model are core competencies – activities that distinguish your business from others in the same marketplace. A close second in importance is performing a process audit during the purchase cycle of any large software package to determine just how different the embedded ‘best practice’ is from your current processes. The need to make large adjustments to existing processes in order to accommodate a new software application is one of the most widely acknowledged sources of installation failure.
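At its simplest, a process audit of this kind is a gap analysis: list the steps your organization performs today, list the steps the package’s embedded ‘best practice’ assumes, and compare. The step names below are invented for illustration:

```python
# Hypothetical sketch of a simple process-audit gap analysis.
# Step names are invented; a real audit would use your documented processes.

current_process = {"take_order", "check_credit_manually", "ship", "invoice"}
package_process = {"take_order", "automated_credit_score", "reserve_inventory",
                   "ship", "invoice"}

steps_to_adopt = package_process - current_process   # new work for your staff
steps_to_retire = current_process - package_process  # practices you must abandon

print("Must adopt: ", sorted(steps_to_adopt))
print("Must retire:", sorted(steps_to_retire))
```

The larger the two gap sets, the more change management (training, backfill, possibly modification) the installation will demand – and, if a retired step touches a core competency, the stronger the argument against adopting the package’s ‘best practice’ at all.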

William L. Kuechler, Jr., Ph.D.


His academic career follows a successful industry career in information systems development and consulting. His work experience brings insight to his teaching of both IS management and technical material and brings a wealth of practical background to his research. Kuechler’s two primary research themes are the cognitive bases of IS use, development and education, and design science research in IS. He is on the editorial advisory board for the Journal of Information Systems Education and is an associate editor for the Journal of Information