INF5190 Concepts and Principles in Knowledge Management


Table of Contents

1 Introduction

2 Theory

2.1 Two types of knowledge

2.2 Knowledge Management in an Actor-Network Theory perspective

2.3 Knowledge Management in a Total Quality Management perspective

2.3.1 Knowledge generation

2.3.2 Knowledge codification

2.3.3 Knowledge transfer

2.4 Organizational theory

2.5 Cultural aspects

3 Method

3.1 Interviews and observations

3.2 Document analysis

4 Case description and analysis

4.1 Organizational analysis

4.2 The NTAX software lifecycle model

4.2.1 Knowledge generation

4.2.2 Knowledge codification

4.2.3 Mechanisms of knowledge transfer

4.3 Main ANT-concepts in a KM perspective applied within the organisation

4.3.1 Heterogeneity and negotiations

4.3.2 Inscriptions

4.3.3 Translation

5 Discussion

5.1 Two ways of understanding

5.2 Knowledge about the current system

5.3 Knowledge beyond the current system

6 Conclusion

References

1  Introduction

The report is written as part of the assessment in INF5190 – “Concepts and Principles in Knowledge Management”. The project group has chosen to look at the software lifecycle for one of the major processes at the Norwegian Tax Administration (NTAX), namely the self-declaration process for private citizens. The report’s primary focus is to investigate to what extent the concepts of Knowledge Management (KM) can give insights into the software lifecycle model, and how such insights may be used for process improvement.

In our understanding of KM, based on the INF5190 lectures and the course literature, we tend to see KM partly as an information technology-induced management philosophy and partly as a management philosophy drawing upon insights from total quality management (TQM). While process improvement seems to be one of the major drivers for installing Knowledge Management, we have found little indication of a clear bridge between the specifics of quality management and knowledge management.

Using an organization such as NTAX as a case for investigating how the practical aspects of KM and TQM relate thus gives context to important research questions from the perspectives of information management, quality management and knowledge management.

In the next chapter we will try to provide a perspective on KM within the context of quality management and information infrastructure theory. The purpose of this exercise is to differentiate between two distinct views on knowledge that seem to lie at the core of KM, namely tacit knowledge and explicit knowledge, although we will argue for Kuhn’s way of differentiating between the two rather than Nonaka’s.

This sets a framework for empirical research, and in the third chapter we give a short description of the methods we have applied when researching NTAX. Results from the case analysis are then presented in chapter four. The results include an analysis of how the knowledge processes of knowledge generation, knowledge codification and knowledge transfer are implemented in the software lifecycle. We try to differentiate between two perspectives on knowledge by applying TQM and ANT separately.

The fifth chapter contains a discussion of the case analysis, where our emphasis has been to use aspects of KM as a way of synthesizing parts of the case analysis that came either from a control perspective or a design perspective. The type of problems one tries to solve puts a constraint on what will be a natural and efficient epistemology. However, as KM aims at handling all aspects of knowledge, both social knowledge and technical knowledge, we try to use this framework for giving a more holistic view on process improvement within the software lifecycles at NTAX.

We conclude the report by commenting on whether we feel the approach gave an added perspective on process improvement, and by giving some recommendations for further process improvement at NTAX.

2  Theory

Our intent with this chapter is to describe knowledge management within a wider philosophical context that includes a perspective on technical and social knowledge representing two distinct types of knowledge, suited for solving two different types of problems. We continue by elaborating on what we see to be the characteristics of knowledge management within each of these two domains, and then conclude the theory chapter with one section on organizational aspects and one section on cultural aspects.

2.1  Two types of knowledge

Kuhn (1962) argues that scientific knowledge evolves in two directions. The normal way of accumulating knowledge is by solving puzzles according to rules defined by the current paradigm of doing science within a given discipline. Every now and then, however, this paradigm may be challenged, and new ways of understanding the world may evolve.

The way Kuhn describes the process of doing scientific research is strikingly similar to the way Argyris and Schön (1978) suggest organizational learning is created.

Figure 1 – Double loop learning (Argyris & Schön, 1978)

An organization may learn through the methods of quality management, i.e. identifying errors and opportunities for improvement and working out ways to improve the system based on such insights; this corresponds to “single loop learning”. From time to time, however, it may strike the organization that the whole system should have been designed in a completely different manner, and by challenging the current assumptions and beliefs, the variables governing the single loop learning may be adjusted. This external perspective on the system is referred to as “double loop learning”.

Jashapara (2004: 135) suggests total quality management (TQM) and business process reengineering (BPR) as possible ways of understanding or implementing double loop learning. In appendix B of ISO 9004:2000, it is suggested that double loop learning should be a natural way of operating any ISO 9001:2000 certifiable quality management system, where the inner loop learning is handled by methods for creating continuous improvements (“kaizen”; Imai, 1986) while the outer loop is handled by “breakthrough management” (Juran, 1964).

What seems to us not all that clearly stated in the quality management and knowledge management literature is the way the inner and outer loops of double loop learning correspond to two different ways of understanding the world. In our understanding, the difference corresponds to the “two cultures” of natural science and social science (Snow, 1964): the two cultures deal with two different concepts of knowledge, depending on whether the purpose of the research is to predict and control nature, or whether it is to “understand” a culture from an anthropological point of view, i.e. to understand the language of the tribe by observing what its members tend to do.

Kuhn explains that his gradual understanding of scientific discovery started with an understanding of hermeneutics (Lee, 1991). A similar approach may be used to explain Deming’s image of the organization as a learning system. In fact, Scherkenbach (1986: 35) uses a model similar to figure one to illustrate the Deming philosophy of organizational learning: process improvement within the current system is the inner loop feedback mechanism between supplier and producer, following the usual methods of statistical process control, while the outer loop feedback mechanism corresponds to consumer research.

In this report we consequently try to distinguish between knowledge processes related to the social science of the outer loop, using Actor-Network Theory (Monteiro, 2000; Latour, 1987) as a possible framework, while applying the traditional methods of statistical quality control (Deming, 1992) for analyzing the knowledge processes within the inner loop.

2.2  Knowledge Management in an Actor-Network Theory perspective

Within the science and technology field it is important to be able to formulate and understand information, innovation and knowledge management processes, and ANT was created in order to better understand not only what kind of knowledge exists and gets distributed in an organization, but also how, thereby exploring how a knowledge network is created (Monteiro, 2000). ANT is, although it is called a theory, more of a material-semiotic method that gives us a powerful linguistic tool for describing KM processes and relations.

Rather than looking at knowledge as something contained by someone or something, it is possible, or even preferable, to recognize knowledge as a network of contextualized data and information. This network may consist of both knowledgeable and perhaps not so knowledgeable humans, of non-humans such as software for processing and storing information, and of a web of experience, an urge to tell and a yearning to know: in all, a suitable network for the creation and transformation of knowledge from tacit to explicit and back, in addition to tacit to tacit and explicit to explicit.

In an environment of learning in multiple directions and contexts, we form networks of knowledge where we teach and learn, show and tell, exchanging a mix of tacit and explicit knowledge in both structured and unstructured manners, influenced by earlier experience, education, familiarity with the tools needed to understand and manage a task, relations to others and so on. We might say that the tools for facilitating KM to a certain extent resemble information infrastructures as described by Hanseth and Braa (2000) and Monteiro (2000): open networks, linked indefinitely to other networks, and as such not necessarily easy to control.

It may help, though, to look at infrastructure as Dahlbom does: as a “regulating skeleton, providing framework and guidelines for the activity”. Even if the immaterial nature of knowledge makes it seemingly incompatible with a physical infrastructure, the more important aspects of an infrastructure are in fact immaterial: agreements, standards and metrics. An important element in information infrastructures is the idea of gateways, a method for combining two or more initially incompatible actors (standards). In an ANT perspective this could be regarded as translation, and in KM as a tool for knowledge distribution. According to Dahlbom, the stable infrastructures of the information society would be its educational institutions, research organisations, legal systems, habits and so on, shared as a common resource (Dahlbom, 2000: 217-220). By sharing, knowledge is created and distributed.

2.3  Knowledge Management in a Total Quality Management perspective

Although the best definition of Total Quality Management (TQM), in a European context, may be the evaluation criteria for the annual quality awards (EFQM, 2006), when discussing the knowledge management principles underlying the ideas of quality management we choose to focus on the ideas put forward by Shewhart and Deming.

A good starting point for discussing KM within the context of TQM is perhaps the ISO 9000 model for designing quality management systems (figure 2). Although the figure appears to consist of a single loop, the system model can be thought of as consisting of two loops, or two types of logic, as illustrated by the box indicating “measurements, analysis and improvement” having input from processes and products on the one hand (single loop feedback) and input from customers on the other (double loop feedback).

In fact, if we identify the customer box on the left hand side of the diagram with the customer box on the right hand side, the structure could topologically be described as a torus, i.e. a double loop learning topology identical to the Argyris model in figure one or the Kuhn model of how science evolves.

Figure 2 – The NS-ISO 9000:2000 quality management framework

As we have already presented Statistical Process Control (SPC) as a tool for knowledge management in a previous INF5190 presentation (Øgland, Bakke, Murad & Bjørnseth, 2006), in this report we will just summarize some of the main insights from that presentation.

2.3.1  Knowledge generation

According to Davenport and Prusak (1998: 52), knowledge generation refers to “…the specific activities and initiatives firms undertake to increase their stock of corporate knowledge”. They characterize the importance of knowledge generation in this way: “…since it is axiomatic that a firm’s greatest asset is its knowledge, then the firm that fails to generate new knowledge will probably cease to exist” (ibid.: 67).

Acquisition of knowledge, as used by Davenport and Prusak (ibid.), refers to the dimension of knowledge generation where knowledge is acquired by the organization from outside as well as developed within it. Chapter six of ISO 9001:2000 contains requirements that people must have sufficient competence for performing their tasks, and that this competence shall be documented.

The main process for generating knowledge within the ISO 9001:2000 model is recording problems and areas for improvement. Although there are requirements throughout the model that help define a consistent model making learning and improvement possible, the main issues are related to chapter eight of the ISO model, where the focus is on applying (knowledge) metrics.

2.3.2  Knowledge codification

Knowledge codification involves codified material such as texts and computer systems that organize and contain knowledge of an organization. The purpose of knowledge codification is: “…to put organizational knowledge into a form that makes it accessible to those who need it” (Davenport and Prusak 1998: 68).

Making a quality system in compliance with the ISO 9000 requirements and recommendations would be a typical example of a system for codifying, storing and managing knowledge for the purpose of improving organizational performance. As the focus of the system is on quality (conformance to standards), the process of codification runs mostly through the processes of quality control.

In order to perform quality control of processes and products, the standard approach is to identify metrics for all critical aspects of the organization and use statistical process control (SPC) to monitor whether the processes are stable and under control (Deming, 1992).
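As a minimal sketch of what such monitoring involves, the 3-sigma control limits of a Shewhart individuals chart can be computed in a few lines. The defect counts below are invented for illustration only; they are not NTAX data.

```python
def control_limits(values):
    """Estimate centre line and 3-sigma limits for an individuals (X) chart,
    using the average moving range to estimate process variation."""
    mean = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    sigma = mr_bar / 1.128  # d2 constant for moving ranges of size 2
    return mean - 3 * sigma, mean, mean + 3 * sigma

def out_of_control(values):
    """Return indices of points outside the control limits, i.e. signals
    of special-cause variation that should trigger investigation."""
    lcl, _, ucl = control_limits(values)
    return [i for i, v in enumerate(values) if v < lcl or v > ucl]

# Example: weekly defect counts for a software process (illustrative).
defects = [4, 5, 3, 4, 6, 5, 4, 15, 5, 4]
print(out_of_control(defects))  # prints [7]: the spike signals instability
```

Points inside the limits are treated as common-cause variation (the system itself must change to improve them), while points outside the limits indicate special causes worth recording, which ties SPC directly to the knowledge generation process described above.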

In other words, codification starts by identifying what may go wrong or what we would like to improve, and then select metrics that are useful from this perspective. In chapter four of ISO 9001:2000 there are also requirements dealing with how to create a map to illustrate how the processes connect.