Information Needs For The AIS – 2024

In 1967, Russell Ackoff presented a classic analysis of misinformation in management (copied at the bottom of this document). Now fast-forward to the present. After reading the case, craft your own version of misinformation in management by developing five (5) key incorrect assumptions that management makes about its accounting information systems.

For this assignment, research the Internet for information related to improper assumptions concerning accounting information systems.


Write a five to seven (5-7) page paper in which you:

1.       Based on your research, assess how corporate leaders may make improper assumptions related to accounting information systems and the related information. Indicate the most negative potential impacts on business operations related to these assumptions. Provide support for your rationale.


2.       Suggest three to four (3-4) ways in which organizational performance may be improved when information is properly managed within a business system. Provide support for your rationale.

3.       Evaluate the level of system security (i.e., high, medium, low) needed to ensure information integrity within automated business systems. Provide support for your evaluation.

4.       Use at least three (3) quality resources in this assignment. Note: Wikipedia and similar Websites do not qualify as quality resources.

Your assignment must follow these formatting requirements:

·         Be typed, double spaced, using Times New Roman font (size 12), with one-inch margins on all sides; citations and references must follow APA or school-specific format. Check with your professor for any additional instructions.


REQUIRED: Read the five assumptions, contentions, and Ackoff's explanations. For each of the five, decide whether you agree or disagree with Ackoff's contentions. Defend your position by preparing a report that explains your beliefs. Be prepared to defend your beliefs in class.

ASSUMPTION 1: MANAGEMENT NEEDS MORE INFORMATION

Assumption 1. Most management information systems (MISs) are designed on the assumption that the critical deficiency under which most managers operate is the lack of relevant information.

Contention 1. I do not deny that most managers lack a good deal of information that they should have, but I do deny that this is the most important informational deficiency from which they suffer.

It seems to me that they suffer more from an overabundance of irrelevant information. This is not a play on words. The consequences of changing the emphasis of an MIS from supplying relevant information to eliminating irrelevant information are considerable. If one is preoccupied with supplying relevant information, attention is almost exclusively given to the generation, storage, and retrieval of information; hence, emphasis is placed on constructing data banks, coding, indexing, updating files, using access languages, and so on.

The ideal that has emerged from this orientation is an infinite pool of data into which managers can reach to pull out any information they want. If, however, one sees the manager’s information problem primarily, but not exclusively, as one that arises out of an overabundance of irrelevant information, most of which was not asked for, then the two most important functions of an information system become filtration (or evaluation) and condensation.

The literature on the MIS seldom refers to these functions, let alone considers how to carry them out. My experience indicates that most managers receive much more data (if not information) than they can possibly absorb even if they spend all of their time trying to do so. Hence they already suffer from an information overload.

They must spend a great deal of time separating the relevant documents from the irrelevant ones. For example, I have found that I receive an average of 43 hours of unsolicited reading material each week. The solicited material is usually half again this amount. I have seen a daily stock status report that consists of approximately 600 pages of computer printout.

The report is circulated daily across managers’ desks. I’ve also seen requests for major capital expenditures that come in book size, several of which are distributed to managers each week. It is not uncommon for many managers to receive an average of one journal a day or more. One could go on and on.

[Note: This case (Case 1-2, "Ackoff's Management Misinformation Systems") is adapted from a classic article entitled "Management Misinformation Systems," written by Russell L. Ackoff, which appeared in Management Science. In the article, Ackoff identified five common assumptions about information systems and then explained why he disagreed with them.]

Unless the information overload to which managers are subjected is reduced, any additional information made available by an MIS cannot be expected to be used effectively. Even relevant documents have too much redundancy. Most documents can be considerably condensed without loss of content. My point here is best made, perhaps, by briefly describing an experiment that a few of my colleagues and I conducted on the operations research (OR) literature several years ago.

By using a panel of well-known experts, we identified four OR articles that all members of the panel considered to be “above average” and four articles that were considered to be “below average.” The authors of the eight articles were asked to prepare “objective” examinations (duration 30 minutes) plus answers for graduate students who were to be assigned the articles for reading.

(The authors were not informed about the experiment.) Then several experienced writers were asked to reduce each article to two-thirds and one-third of its original length only by eliminating words. They also prepared a brief abstract of each article. Those who did the condensing did not see the examinations to be given to the students.

A group of graduate students who had not previously read the articles was then selected. Each one was given four articles randomly selected, each of which was in one of its four versions: 100 percent, 67 percent, 33 percent, or abstract. Each version of each article was read by two students. All were given the same examinations.

The average scores on the examinations were compared. For the above-average articles there was no significant difference between average test scores for the 100 percent, 67 percent, and 33 percent versions, but there was a significant decrease in average test scores for those who had read only the abstract.

For the below-average articles there was no difference in average test scores among those who had read the 100 percent, 67 percent, and 33 percent versions, but there was a significant increase in average test scores of those who had read only the abstract. The sample used was obviously too small for general conclusions, but the results strongly indicate the extent to which even good writing can be condensed without loss of information.

I refrain from drawing the obvious conclusions about bad writing. It seems clear that condensation as well as filtration, performed mechanically or otherwise, should be an essential part of an MIS, and that such a system should be capable of handling much, if not all, of the unsolicited as well as solicited information that a manager receives.

ASSUMPTION 2: MANAGERS NEED THE INFORMATION THEY WANT

Assumption 2. Most MIS designers "determine" what information is needed by asking managers what information they would like to have. This is based on the assumption that managers know what information they need and want.

Contention 2. For a manager to know what information he needs, he must be aware of each type of decision he should (as well as does) make, and he must have an adequate model of each.

These conditions are seldom satisfied. Most managers have some conception of at least some of the types of decisions they must make. Their conceptions, however, are likely to be deficient in a very critical way, a way that follows from an important principle of scientific economy: The less we understand a phenomenon, the more variables we require to explain it.

Hence managers who do not understand the phenomena they control play it “safe” and, with respect to information, want “everything.” The MIS designer, who has even less understanding of the relevant phenomena than the manager, tries to provide even more than everything. She thereby increases what is already an overload of irrelevant information.

For example, market researchers in a major oil company once asked their marketing managers what variables they thought were relevant in estimating the sales volume of future service stations. Almost 70 variables were identified. The market researchers then added about half again this many variables and performed a large multiple linear regression analysis of sales of existing stations against these variables and found about 35 to be statistically significant.

A forecasting equation was based on this analysis. An OR team subsequently constructed a model based on only one of these variables, traffic flow, which predicted sales better than the 35-variable regression equation. The team went on to explain sales at service stations in terms of the customers’ perception of the amount of time lost by stopping for service.

The relevance of all but a few of the variables used by the market researchers could be explained by their effect on such a perception. The moral is simple: One cannot specify what information is required for decision making until an explanatory model of the decision process and the system involved has been constructed and tested.
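Ackoff's service-station moral lends itself to a small numerical illustration. The sketch below is entirely hypothetical (it does not use the oil company's data): sales are generated mainly from a single driver standing in for traffic flow, 35 unrelated candidate variables are added, and an ordinary least-squares regression on all of them is compared out of sample with a one-variable model. With few stations relative to the number of variables, the many-variable regression typically predicts no better, and often worse, than the single-variable model.

    # Hypothetical illustration: simulated stations where traffic flow is the only
    # real driver of sales; the extra candidate variables are pure noise.
    import numpy as np

    rng = np.random.default_rng(0)
    N_TRAIN, N_TEST, N_EXTRA = 50, 50, 35   # stations and irrelevant candidate variables

    def make_stations(n):
        traffic = rng.uniform(1_000, 20_000, n)          # cars per day (the real driver)
        extras = rng.normal(size=(n, N_EXTRA))           # candidate variables with no real effect
        sales = 0.5 * traffic + rng.normal(0, 800, n)    # monthly sales, arbitrary units
        return np.column_stack([traffic, extras]), traffic.reshape(-1, 1), sales

    X_all_tr, X_one_tr, y_tr = make_stations(N_TRAIN)
    X_all_te, X_one_te, y_te = make_stations(N_TEST)

    def ols_predict(X_tr, y_tr, X_te):
        X_tr1 = np.column_stack([np.ones(len(X_tr)), X_tr])   # add an intercept column
        X_te1 = np.column_stack([np.ones(len(X_te)), X_te])
        beta, *_ = np.linalg.lstsq(X_tr1, y_tr, rcond=None)
        return X_te1 @ beta

    rmse = lambda y, y_hat: float(np.sqrt(np.mean((y - y_hat) ** 2)))
    print("36-variable regression, test RMSE:", round(rmse(y_te, ols_predict(X_all_tr, y_tr, X_all_te)), 1))
    print("traffic-flow-only model, test RMSE:", round(rmse(y_te, ols_predict(X_one_tr, y_tr, X_one_te)), 1))

In this toy setup the extra variables are pure noise; in the case they were weakly related and statistically significant, but the lesson is the same: more variables did not mean better prediction, and it certainly did not mean better understanding.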

Information systems are subsystems of control systems. They cannot be designed adequately without taking control into account. Furthermore, whatever else regression analyses can yield, they cannot yield understanding and explanation of phenomena. They describe and, at best, predict.

ASSUMPTION 3: GIVING MANAGERS THE INFORMATION THEY NEED IMPROVES THEIR DECISION MAKING

Assumption 3. It is frequently assumed that if managers are provided with the information they need, they will then have no problem in using it effectively.

Contention 3. Operations research (an academic subject area dealing with the application of mathematical models and techniques to business decisions) stands to the contrary.

Give most managers an initial tableau of a typical “real” mathematical programming, sequencing, or network problem and see how close they come to an optimal solution. If their experience and judgment have any value, they may not do badly, but they will seldom do very well. In most management problems there are too many possibilities to expect experience, judgment, or intuition to provide good guesses, even with perfect information.

Furthermore, when several probabilities are involved in a problem, the unguided mind of even a manager has difficulty in aggregating them in a valid way. We all know many simple problems in probability in which untutored intuition usually does very badly (e.g., What are the correct odds that 2 of 25 people selected at random will have their birthdays on the same day of the year?).
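For the birthday question Ackoff uses as an example of untutored intuition, a short calculation settles the odds. Assuming 365 equally likely birthdays and ignoring leap years, the probability that at least two of 25 randomly chosen people share a birthday is about 0.57, i.e., the odds actually favor a shared birthday.

    # Probability that at least two of 25 people share a birthday,
    # assuming 365 equally likely birthdays (leap years ignored).
    def p_shared_birthday(n, days=365):
        p_all_distinct = 1.0
        for k in range(n):
            p_all_distinct *= (days - k) / days
        return 1.0 - p_all_distinct

    p = p_shared_birthday(25)
    print(f"P(shared birthday among 25) = {p:.4f}")       # about 0.5687
    print(f"Odds in favor: about {p / (1 - p):.2f} to 1")  # about 1.32 to 1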

For example, very few of the results obtained by queuing theory, when arrivals and service are probabilistic, are obvious to managers; nor are the results of risk analysis where the managers’ own subjective estimates of probabilities are used. The moral: It is necessary to determine how well managers can use needed information.

When, because of the complexity of the decision process, they cannot use it well, they should be provided with either decision rules or performance feedback so that they can identify and learn from their mistakes.

ASSUMPTION 4: MORE COMMUNICATION MEANS BETTER PERFORMANCE

Assumption 4. A characteristic of most MISs is that they provide managers with better current information about what other managers and their departments are doing.

Underlying this provision is the belief that better interdepartmental communication enables managers to coordinate their decisions more effectively and hence improves the organization’s overall performance. Contention 4. Not only is this not necessarily so, but it seldom is so. One would hardly expect two competing companies to become more cooperative because the information each acquires about the other is improved.

For example, consider the following very much simplified version of a situation I once ran into. The simplification of the case does not affect any of its essential characteristics. A department store has two “line” operations: buying and selling. Each function is performed by a separate department.

The Purchasing Department primarily controls one variable: how much of each item is bought. The Merchandising Department controls the price at which it is sold. Typically, the measure of performance applied to the Purchasing Department was the turnover rate of inventory. The measure applied to the Merchandising Department was gross sales; this department sought to maximize the number of items sold times their price.

Now by examining a single item, let us consider what happens in this system. The merchandising manager, using his knowledge of competition and consumption, set a price that he judged would maximize gross sales. In doing so, he utilized price-demand curves for each type of item. For each price the curves show the expected sales and values on an upper and lower confidence band as well (see Figure 1).

When instructing the Purchasing Department about how many items to make available, the merchandising manager quite naturally used the value on the upper confidence curve. This minimized the chances of his running short, which, if it occurred, would hurt his performance. It also maximized the chances of being overstocked, but this was not his concern, only the purchasing manager’s.

Say, therefore, that the merchandising manager initially selected price P1 and requested that amount Q1 be made available by the Purchasing Department. In this company the purchasing manager also had access to the price-demand curves. She knew that the merchandising manager always ordered optimistically.

Therefore, using the same curve, she read over from Q1 to the upper limit and down to the expected value, from which she obtained Q2, the quantity she actually intended to make available. She did not intend to pay for the merchandising manager’s optimism. If merchandising ran out of stock, it was not her worry.

Now the merchandising manager was informed about what the purchasing manager had done, so he adjusted his price to P2. The purchasing manager in turn was told that the merchandising manager had made this readjustment, so she planned to make only Q3 available. If this process (made possible only by perfect communication between departments) had been allowed to continue, nothing would have been bought and nothing would have been sold.

This outcome was avoided by prohibiting communication between the two departments and forcing each to guess what the other was doing. I have obviously caricatured the situation in order to make the point clear: When organizational units have inappropriate measures of performance that put them in conflict with each other, as is often the case, communication between them may hurt organizational performance, not help it.
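The spiral in this caricature can also be sketched numerically. The figures below are invented for illustration only: expected demand falls linearly with price, the optimistic (upper confidence) curve sits a fixed amount above it, merchandising requests the optimistic quantity, purchasing supplies only the expected quantity at that price, and merchandising then raises the price until the optimistic curve matches the reduced supply. Under these assumptions the supplied quantity shrinks every round until nothing is bought and nothing is sold.

    # Hypothetical sketch of the purchasing/merchandising spiral. All numbers are made up.
    A, B, BAND = 1_000.0, 10.0, 150.0      # demand intercept, slope, width of the confidence band

    def expected(p):                        # expected demand at price p
        return max(A - B * p, 0.0)

    def optimistic(p):                      # upper-confidence ("optimistic") demand at price p
        return expected(p) + BAND

    price = A / (2 * B)                     # price that maximizes expected gross sales
    for step in range(1, 11):
        requested = optimistic(price)       # merchandising's optimistic request
        supplied = expected(price)          # purchasing's corrected quantity
        print(f"round {step}: price={price:6.1f}  requested={requested:7.1f}  supplied={supplied:7.1f}")
        if supplied <= 0:
            print("nothing is bought and nothing is sold")
            break
        price = (A + BAND - supplied) / B   # raise price until the optimistic curve matches the supply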

Organizational structure and performance measurement must be taken into account before opening the floodgates and permitting the free flow of information between parts of the organization.

[Figure 1: Price-demand curves showing optimistic, expected, and pessimistic bands, with prices P1, P2, P3 and corresponding quantities Q1, Q2, Q3.]

ASSUMPTION 5: MANAGERS NEED ONLY TO UNDERSTAND HOW TO USE AN INFORMATION SYSTEM

Assumption 5. A manager does not have to understand how an information system works, only how to use it.

Contention 5. Managers must understand their MIS or they are handicapped and cannot properly operate and control their company. Most MIS designers seek to make their systems as innocuous and unobtrusive as possible to managers, lest they become frightened. The designers try to provide managers with very easy access to the system and assure them that they need to know nothing more about it.

The designers usually succeed in keeping managers ignorant in this regard. This leaves managers unable to evaluate the MIS as a whole. It often makes them afraid to even try to do so, lest they display their ignorance publicly. In failing to evaluate their MIS, managers delegate much of the control of the organization to the system’s designers and operators—who may have many virtues, but managerial competence is seldom among them.

Let me cite a case in point. A chairman of the board of a midsize company asked for help on the following problem. One of his larger (decentralized) divisions had installed a computerized production-inventory control and manufacturing-manager information system about a year earlier. It had acquired about $2 million worth of equipment to do so.

The board chairman had just received a request from the division for permission to replace the original equipment with newly announced equipment that would cost several times the original amount. An extensive “justification” for so doing was provided with the request. The chairman wanted to know whether the request was justified.

He admitted to complete incompetence in this connection. A meeting was arranged at the division, at which I was subjected to an extended and detailed briefing. The system was large but relatively simple. At the heart of it was a reorder point for each item and a maximum allowable stock level. Reorder quantities took lead time as well as the allowable maximum into account.

The computer kept track of stock, ordered items when required, and generated numerous reports on both the state of the system it controlled and its own “actions.” When the briefing was over, I was asked if I had any questions. I did. First I asked if, when the system had been installed, there had been many parts whose stock level exceeded the maximum amount possible under the new system.

I was told there were many. I asked for a list of about 30 and for some graph paper. Both were provided. With the help of the system designer and volumes of old daily reports I began to plot the stock level of the first listed item over time. When this item reached the maximum “allowable” stock level, it had been reordered.

The system designer was surprised and said that by sheer “luck” I had found one of the few errors made by the system. Continued plotting showed that because of repeated premature reordering the item had never gone much below the maximum stock level. Clearly, the program was confusing the maximum allowable stock level and the reorder point.

This turned out to be the case in more than half of the items on the list. Next I asked if they had many paired parts, ones that were only used with each other, for example, matched nuts and bolts. They had many. A list was produced and we began checking the previous day’s withdrawals. For more than half of the pairs the differences in the numbers recorded as withdrawn were very large.

No explanation was provided. Before the day was out it was possible to show by some quick and dirty calculations that the new computerized system was costing the company almost $150,000 per month more than the hand system that it had replaced, most of this in excess inventories. The recommendation was that the system be redesigned as quickly as possible and that the new equipment not be authorized for the time being.

The questions asked of the system had been obvious and simple ones. Managers should have been able to ask them, but—and this is the point—they felt themselves incompetent to do so. They would not have allowed a hand-operated system to get so far out of their control. No MIS should ever be installed unless the managers for whom it is intended are trained to evaluate and hence control it rather than be controlled by it.
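The reorder-point confusion described in this case is easy to reproduce in a minimal sketch. The item names and numbers below are invented, not the division's actual system: with the correct rule an item is reordered only when stock falls to the reorder point, while the buggy rule triggers at the maximum allowable stock level, so the item is reordered almost immediately and average stock stays near the maximum, which is the excess-inventory pattern Ackoff describes.

    # Hypothetical sketch of the inventory logic; all figures are illustrative only.
    from dataclasses import dataclass

    @dataclass
    class Item:
        stock: float
        reorder_point: float      # intended trigger
        max_stock: float          # maximum allowable stock level
        daily_demand: float
        lead_time_days: int

    def should_reorder(item, buggy):
        trigger = item.max_stock if buggy else item.reorder_point
        return item.stock <= trigger

    def reorder_quantity(item):
        # order up to the allowable maximum, allowing for demand during the lead time
        return max(item.max_stock - item.stock + item.daily_demand * item.lead_time_days, 0.0)

    def average_stock(buggy, days=60):
        item = Item(stock=480, reorder_point=120, max_stock=500, daily_demand=20, lead_time_days=5)
        on_order, arrives_in, total = 0.0, None, 0.0
        for _ in range(days):
            if arrives_in == 0:                              # delivery arrives
                item.stock, on_order, arrives_in = item.stock + on_order, 0.0, None
            if arrives_in is None and should_reorder(item, buggy):
                on_order, arrives_in = reorder_quantity(item), item.lead_time_days
            item.stock = max(item.stock - item.daily_demand, 0.0)
            if arrives_in is not None:
                arrives_in -= 1
            total += item.stock
        return total / days

    print("average stock with the correct trigger:", round(average_stock(buggy=False), 1))
    print("average stock with the buggy trigger:  ", round(average_stock(buggy=True), 1))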

Source: Reprinted by permission of Russell L. Ackoff, “Management Misinformation Systems,” Management Science 14, no. 4 (December 1967). The Institute of Management Sciences, 290 Westminster Street, Providence, R.I. 02903.
