Archive for the ‘organizational learning’ Category

Enchantment, Organizations, and Mediating Instruments: Potential for a New Consensus?

August 3, 2011

I just came across something that could be helpful in regaining some forward momentum and expanding the frame of reference for the research on caring in nursing with Jane Sumner (Sumner & Fisher, 2008). We have yet to really work in the failure of Habermas's hermeneutic objectivism (Kim, 2002; Thompson, 1981), and we haven't connected what we've done with (a) Ricoeur's (1984, 1985, 1990, 1995) sense of narrative as describing the past en route to prescribing the future (prefiguring, configuring, and refiguring the creation of meaning in discourse) or with (b) Wright's (1999) sense of learning from past data to efficiently and effectively anticipate new data within a stable inferential frame of reference.

Now I've found a recent publication that resonates well with this goal, and includes examples from nursing to boot. Boje and Baskin (2010; see especially pp. 12-17 of the manuscript) cite only secondary literature, but do a good job of articulating where the field stands conceptually and of tracing the sources of that articulation. So they make no mention of Ricoeur on narrative (1984, 1985, 1990) or on play and the heuristic fiction (1981, pp. 185-187), and no mention of Gadamer on play as the most important clue to methodological authenticity (1989, pp. 101-134). It follows that they also make no use of the considerable volume of other available and relevant work on the metaphysics of allure, captivation, enthrallment, rapture, beauty, or eros.

This is all very important because these issues are highly salient markers of the distinction between a modern, Cartesian, and mechanical worldview destructive of enchantment and play, and an amodern, nonCartesian, and organic worldview in tune with enchantment and play. As I have stressed repeatedly in these posts, the way we frame problems is now the primary problem, in opposition to those who think the problem is one of identifying and applying resources, techniques, or will power. It is essential that we learn to frame problems in a way that begins from requirements of subject-object interdependence instead of from assumptions of subject-object independence. Previous posts here explore in greater detail how we are all captivated by the desire for meaning. Any time we choose negotiation or patient waiting over violence, we express faith in the ultimate value of trusting our words. So though Boje and Baskin do not document this larger context, they still effectively show exactly where and how work in the nonCartesian paradigm of enchantment connects with what's going on in organizational change management theory.

The paper’s focus on narrative as facilitating enchantment and disenchantment speaks to our fundamental absorption into the play of language. Enchantment is described on page 2 as involving positive connection with existence, of being enthralled with the wonder of being endowed with natural and cultural gifts.  Though not described as such, this hermeneutics of restoration, as Ricoeur (1967) calls it, focuses on the way symbols give rise to thought in an unasked-for assertion of meaningfulness. The structure we see emerge of its own accord across multiple different data sets from tests, surveys, and assessments is an important example of this gift through which previously identified meanings re-assert themselves anew (see my published philosophical work, such as Fisher, 2004). The contrast with disenchantment of course arises as a function of the dead and one-sided modern Cartesian effort aimed at controlling the environment, which effectively eliminates wonder and meaning via a hermeneutics of suspicion.

In accord with the work done to date with Sumner on caring in nursing, the Boje and Baskin paper describes people’s variable willingness to accept disenchantment or demand enchantment (p. 13) in terms that look quite like preconventional and postconventional Kohlbergian stages. A nurse’s need to shift from one dominant narrative form to another is described as very difficult because of the way she had used the one to which she was accustomed to construct her identity as a nurse (p. 15). Bi-directionality between nurses and patients is implied in another example of a narrative shift in a hospital (p. 16). Both identity and bi-directionality are central issues in the research with Sumner.

The paper also touches on the conceptual domain of instrumental realism, as this is developed in the works of Ihde, Latour, Heelan and others (on p. 6; again, without citing them), and emphasizes a nonCartesian subject-object unity and belongingness, which is described at length in Ricoeur’s work. At the bottom of page 7 and top of 8, storytelling is theorized in terms of retrospection, presentness, and a bet on future meaning, which precisely echoes Ricoeur’s (1984, 1985, 1990) sense of narrative refiguration, configuration, and prefiguration. A connection with measurement comes here, in that what we want is to:

“reach beyond the data in hand to what these data might imply about future data, still unmet, but urgent to foresee. The first problem is how to predict values for these future data, which, by the meaning of inference, are necessarily missing. This meaning of missing must include not only the future data to be inferred but also all possible past data that were lost or never collected” (Wright, 1999, p. 76).

Properly understood and implemented (see previous posts in this blog), measurement based in models of individual behavior provides a way to systematically create an atmosphere of emergent enchantment. Having developmentally sound narratives rooted in individual measures on multiple dimensions over time gives us a shared written history that we can all find ourselves in, and that we can then use to project a vision of a shared future that has reasonable expectations for what’s possible.

This mediation of past and future by means of technical instruments is being described in a way (Miller & O’Leary, 2007) that to me (Fisher & Stenner, 2011) denotes a vital distinction not just between the social and natural sciences, but between economically moribund and inflationary industries such as education, health care, and social services, on the one hand, and economically vibrant and deflationary industries such as microprocessors, on the other.

It is here, and I say this out loud for the first time, even to myself, that I begin to see the light at the end of the tunnel, to see a way that I might find a sense of closure and resolution in the project I took up over 30 years ago. My puzzle has been one of understanding, in theory and practice, how it is that measurement and mathematical thinking are nothing but refinements of the logic used in everyday conversation. It only occurs to me now that, if we can focus the conversations we are in, in ways that balance meaningfulness and precision, that situate each of us as individuals relative to the larger wholes of who we have been and who we might be, that encompass both the welcoming Socratic midwife and the annoying Socratic gadfly as different facets of the same framework, and that enable us to properly coordinate and align technical projects involving investments in intangible capital, well, then, we'll be in a position to engage more productively with the challenges of the day.

There won’t be any panacea but there will be a new consensus and a new infrastructure that, however new they may seem, will enact yet again, in a positive way, the truth of the saying, “the more things change, the more they stay the same.” As I’ve repeatedly argued, the changes we need to implement are nothing but extensions of age-old principles into areas in which they have not yet been applied. We should take some satisfaction from this, as what else could possibly work? The originality of the application does not change the fact that it is rooted in appropriating, via a refiguration, to be sure, a model created for other purposes that works in relation to new purposes.

Another way of putting the question is in terms of that “permanent arbitration between technical universalism and the personality constituted on the ethico-political plane” characteristic of the need to enter into the global technical society while still retaining our roots in our cultural past (Ricoeur, 1974, p. 291). What is needed is the capacity to mediate each individual’s retelling of the grand narrative so that each of us sees ourselves in everyone else, and everyone else in ourselves. Though I am sure the meaning of this is less than completely transparent right now, putting it in writing is enormously satisfying, and I will continue to work on telling the tale as it needs to be told.


Boje, D., & Baskin, K. (2010). Our organizations were never disenchanted: Enchantment by design narratives vs. enchantment by emergence. Journal of Organizational Change Management, 24(4), 411-426.

Fisher, W. P., Jr. (2004, October). Meaning and method in the social sciences. Human Studies: A Journal for Philosophy and the Social Sciences, 27(4), 429-54.

Fisher, W. P., Jr., & Stenner, A. J. (2011, August 31 to September 2). A technology roadmap for intangible assets metrology. International Measurement Confederation (IMEKO). Jena, Germany.

Gadamer, H.-G. (1989). Truth and method (J. Weinsheimer & D. G. Marshall, Trans.) (Second revised edition). New York: Crossroad.

Kim, K.-M. (2002, May). On the failure of Habermas's hermeneutic objectivism. Cultural Studies ↔ Critical Methodologies, 2(2), 270-98.

Miller, P., & O’Leary, T. (2007, October/November). Mediating instruments and making markets: Capital budgeting, science and the economy. Accounting, Organizations, and Society, 32(7-8), 701-34.

Ricoeur, P. (1967). Conclusion: The symbol gives rise to thought. In R. N. Anshen (Ed.), The symbolism of evil (pp. 347-57). Boston, Massachusetts: Beacon Press.

Ricoeur, P. (1974). Political and social essays (D. Stewart & J. Bien, Eds.). Athens, Ohio: Ohio University Press.

Ricoeur, P. (1981). Hermeneutics and the human sciences: Essays on language, action and interpretation (J. B. Thompson, Ed.) (J. B. Thompson, Trans.). Cambridge, England: Cambridge University Press.

Ricoeur, P. (1984, 1985, 1990). Time and Narrative, Vols. 1-3 (K. McLaughlin (Blamey) & D. Pellauer, Trans.). Chicago, Illinois: University of Chicago Press.

Ricoeur, P. (1995). Reply to Peter Kemp. In L. E. Hahn (Ed.), The philosophy of Paul Ricoeur (pp. 395-398). Chicago, Illinois: Open Court.

Sumner, J., & Fisher, W. P., Jr. (2008). The moral construct of caring in nursing as communicative action: The theory and practice of a caring science. Advances in Nursing Science, 31(4), E19-E36.

Thompson, J. B. (1981). Critical hermeneutics: A study in the thought of Paul Ricoeur and Jurgen Habermas. New York: Cambridge University Press.

Wright, B. D. (1999). Fundamental measurement for psychology. In S. E. Embretson & S. L. Hershberger (Eds.), The new rules of measurement: What every educator and psychologist should know (pp. 65-104). Hillsdale, New Jersey: Lawrence Erlbaum Associates.

Creative Commons License
LivingCapitalMetrics Blog by William P. Fisher, Jr., Ph.D. is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 United States License.

A Technology Road Map for Efficient Intangible Assets Markets

February 24, 2011

Scientific technologies, instruments and conceptual images have been found to play vitally important roles in economic success because of the way they enable accurate predictions of future industry and market states (Miller & O’Leary, 2007). The technology road map for the microprocessor industry, based in Moore’s Law, has successfully guided market expectations and coordinated research investment decisions for over 40 years. When the earlier electromechanical, relay, vacuum tube, and transistor computing technology paradigms are included, the same trajectory has dominated the computer industry for over 100 years (Kurzweil, 2005, pp. 66-67).
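The predictive power of a road map like this comes down to simple exponential arithmetic: fix a doubling period, and capacity at any future date follows. A minimal sketch, with illustrative numbers rather than actual industry data (the starting transistor count is an approximation):

```python
# A sketch of the doubling arithmetic behind a Moore's-Law-style road map.
# The specific figures below are illustrative approximations, not data.

def projected_capacity(initial, years_elapsed, doubling_period_years):
    """Capacity after exponential growth with a fixed doubling period."""
    return initial * 2 ** (years_elapsed / doubling_period_years)

# Example: transistor counts doubling roughly every two years,
# starting from about 2,300 transistors (Intel 4004, 1971).
after_40_years = projected_capacity(2300, 40, 2)
print(f"{after_40_years:,.0f}")  # on the order of a few billion transistors
```

It is this kind of shared, checkable projection, more than any single number, that lets an entire industry coordinate expectations and investment decisions decades in advance.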

We need a similar technology road map to guide the creation and development of intangible asset markets for human, social, and natural (HSN) capital. This will involve intensive research on the primary constructs, on what is and is not measurable, and on creating consensus standards for uniform metrics, together with the metrology networks through which those standards will function. Alignments with these developments will require comprehensively integrated economic models, accounting frameworks, and investment platforms, in addition to specific applications deploying the capital formations.

What I'm proposing is, in a sense, just an extension in a new direction of the metrology challenges and issues summarized in Table ITWG15 on page 48 of the 2010 update to the International Technology Roadmap for Semiconductors. Distributed electronic communication facilitated by computers and the Internet is well on the way to creating a globally uniform instantaneous information network. But much of what needs to be communicated through this network remains expressed in locally defined languages that lack common points of reference. Meaningful connectivity demands a shared language.

To those who say we already have the technology necessary and sufficient to the measurement and management of human, social, and natural capital, I say think again. The difference between what we have and what we need is the same as the difference between (a) an economy whose capital resources are not represented in transferable representations like titles and deeds, and that are denominated in a flood of money circulating in different currencies, and, (b) an economy whose capital resources are represented in transferable documents and are traded using a single currency with a restricted money supply. The measurement of intangible assets is today akin to the former economy, with little actual living capital and hundreds of incommensurable instruments and scoring systems, when what we need is the latter. (See previous entries in this blog for more on the difference between dead and living capital.)

Given the model of a road map detailing the significant features of the living capital terrain, industry-specific variations will inform the development of explicit market expectations, the alignment of HSN capital budgeting decisions, and the coordination of research investments. The concept of a technology road map for HSN capital is based in and expands on an integration of hierarchical complexity (Commons & Richards, 2002; Dawson, 2004), complex adaptive functionality (Taylor, 2003), Peirce’s semiotic developmental map of creative thought (Wright, 1999), and historical stages in the development of measuring systems (Stenner & Horabin, 1992; Stenner, Burdick, Sanford, & Burdick, 2006).

Technology road maps replace organizational amnesia with organizational learning by providing the structure of a memory that not only stores information, knowledge, understanding, and wisdom, but makes it available for use in new situations. Othman and Hashim (2004) describe organizational amnesia (OA) relative to organizational learning (OL) in a way that opens the door to a rich application of Miller and O’Leary’s (2007) detailed account of how technology road maps contribute to the creation of new markets and industries. Technology road maps function as the higher organizational principles needed for transforming individual and social expertise into economically useful products and services. Organizational learning and adaptability further need to be framed at the inter-organizational level where their various dimensions or facets are aligned not only within individual organizations but between them within the industry as a whole.

The mediation of the individual and organizational levels, and of the organizational and inter-organizational levels, is facilitated by measurement. In the microprocessor industry, Moore’s Law enabled the creation of technology road maps charting the structure, processes, and outcomes that had to be aligned at the individual, organizational, and inter-organizational levels to coordinate the entire microprocessor industry’s economic success. Such road maps need to be created for each major form of human, social, and natural capital, with the associated alignments and coordinations put in play at all levels of every firm, industry, and government.

It is a basic fact of contemporary life that the technologies we employ every day are so complex that hardly anyone understands how they do what they do. Technological miracles are commonplace events, from transportation to entertainment, from health care to manufacturing. And we usually suffer little in the way of adverse consequences from not knowing how an automatic transmission, a thermometer, or digital video reproduction works. It is enough to know how to use the tool.

This passive acceptance of technical details beyond our ken extends into areas in which standards, methods, and products are much less well defined. Managers, executives, researchers, teachers, clinicians, and others who need measurement but who are unaware of its technicalities are then put in the position of being passive consumers accepting the lowest common denominator in the quality of the services and products obtained.

And that’s not all. Just as the mass market of measurement consumers is typically passive and uninformed, in complementary fashion the supply side is fragmented and contentious. There is little agreement among measurement experts as to which quantitative methods set the standard as the state of the art. Virtually any method can be justified in terms of some body of research and practice, so the confused consumer accepts whatever is easily available or is most likely to support a preconceived agenda.

It may be possible, however, to separate the measurement wheat from the chaff. For instance, measurement consumers may value a way of distinguishing among methods that is based in a simple criterion of meaningful utility. What if all measurement consumers’ own interests in, and reasons for, measuring something in particular, such as literacy or community, were emphasized and embodied in a common framework? What if a path of small steps from currently popular methods of less value to more scientific ones of more value could be mapped? Such a continuum of methods could range from those doing the least to advance the users’ business interests to those doing the most to advance those interests.

The aesthetics, simplicity, meaningfulness, rigor, and practical consequences of strong theoretical requirements for instrument calibration provide such criteria for choices as to models and methods (Andrich, 2002, 2004; Busemeyer & Wang, 2000; Myung, 2000; Pitt, Kim, & Myung, 2003; Wright, 1997, 1999). These criteria could be used to develop and guide explicit considerations of data quality, construct theory, instrument calibration, quantitative comparisons, measurement standard metrics, etc., along a continuum from the most passive and least objective to the most actively involved and most objective.

The passive approach to measurement typically starts from and prioritizes content validity. The questions asked on tests, surveys, and assessments are considered relevant primarily on the basis of the words they use and the concepts they appear to address. Evidence that the questions actually cohere together and measure the same thing is not needed. If there is any awareness of the existence of axiomatically prescribed measurement requirements, these are not considered to be essential. That is, if failures of invariance are observed, they usually provoke a turn to less stringent data treatments instead of a push to remove or prevent them. Little or no measurement or construct theory is implemented, meaning that all results remain dependent on local samples of items and people. Passively approaching measurement in this way is then encumbered by the need for repeated data gathering and analysis, and by the local dependency of the results. Researchers working in this mode are akin to the woodcutters who say they are too busy cutting trees to sharpen their saws.

An alternative, active approach to measurement starts from and prioritizes construct validity and the satisfaction of the axiomatic measurement requirements. Failures of invariance provoke further questioning, and there is significant practical use of measurement and construct theory. Results are then independent of local samples, sometimes to the point that researchers and practical applications are not encumbered with usual test- or survey-based data gathering and analysis.
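The invariance requirement at the heart of the active approach can be shown concretely with the dichotomous Rasch model. In that model, the comparison of any two items, expressed in log-odds, depends only on the difference in their difficulties, not on who is being measured; this is the "specific objectivity" that makes results independent of local samples. A minimal sketch, with hypothetical difficulty and ability values:

```python
import math

# Dichotomous Rasch model and its invariance property.
# All parameter values below are hypothetical, for illustration only.

def rasch_p(ability, difficulty):
    """Probability of a correct response under the Rasch model."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

def logit(p):
    """Log-odds of a probability."""
    return math.log(p / (1.0 - p))

# The log-odds contrast between two items is the same for every person:
# it equals the difference in item difficulties, regardless of ability.
d1, d2 = -0.5, 1.2
for ability in (-2.0, 0.0, 3.0):
    contrast = logit(rasch_p(ability, d1)) - logit(rasch_p(ability, d2))
    print(round(contrast, 6))  # always d2 - d1 = 1.7
```

When observed data fail to reproduce this sample-free pattern, the active approach treats the failure as a prompt for further questioning rather than a reason to relax the model.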

As is often the case, this black and white portrayal tells far from the whole story. There are multiple shades of grey in the contrast between passive and active approaches to measurement. The actual range of implementations is much more diverse than the simple binary contrast would suggest (see the previous post in this blog for a description of a hierarchy of increasingly complex stages in measurement). Spelling out the variation that exists could be helpful for making deliberate, conscious choices and decisions in measurement practice.

It is inevitable that we would start from the materials we have at hand, and that we would then move through a hierarchy of increasing efficiency and predictive control as understanding of any given variable grows. Previous considerations of the problem have offered different categorizations for the transformations characterizing development on this continuum. Stenner and Horabin (1992) distinguish between 1) impressionistic and qualitative, nominal gradations found in the earliest conceptualizations of temperature, 2) local, data-based quantitative measures of temperature, and 3) generalized, universally uniform, theory-based quantitative measures of temperature.

The latter is prized for the way that thermodynamic theory enables the calibration of individual thermometers with no need for testing each one in empirical studies of its performance. Theory makes it possible to know in advance what the results of such tests would be with enough precision to greatly reduce the burden and expenses of instrument calibration.
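The practical payoff of a stage-three system can be sketched as a workflow: theory predicts each instrument's calibration from known features of the construct, and empirical testing is needed only to confirm that predictions fall within tolerance. The "theory" function and all numbers below are hypothetical stand-ins; the point is the shape of the check, not the constants:

```python
# A sketch of theory-based calibration. The theoretical constants and
# observed values are hypothetical, chosen only to illustrate the workflow.

def theoretical_calibration(construct_feature):
    """Hypothetical theory: predict a calibration from a construct feature."""
    slope, intercept = 2.0, -1.0  # assumed theoretical constants
    return slope * construct_feature + intercept

def within_tolerance(predicted, observed, tol=0.1):
    """Does the theoretical prediction agree with the empirical estimate?"""
    return abs(predicted - observed) <= tol

# Empirical calibrations for three instruments (illustrative values),
# keyed by the construct feature theory uses to predict them.
observed = {0.5: 0.05, 1.0: 1.02, 1.5: 1.95}
for feature, empirical in observed.items():
    print(within_tolerance(theoretical_calibration(feature), empirical))
```

When the agreement holds routinely, as it does for thermometers, the expensive empirical step can be dropped for most instruments, which is exactly the reduction in calibration burden the text describes.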

Reflecting on the history of psychosocial measurement in this context, it becomes apparent that these three stages can be further broken down. The previous post in this blog lists the distinguishing features of each of six stages in the evolution of measurement systems, building on the five stages described by Stenner, Burdick, Sanford, and Burdick (2006).

And so what analogue of Moore’s Law might be projected? What kind of timetable can be projected for the unfolding of what might be called Stenner’s Law? Guidance for reasonable expectations is found in Kurzweil’s (2005) charting of historical and projected future exponential increases in the volume of information and computer processing speed. The accelerating growth in knowledge taking place in the world today speaks directly to a systematic integration of criteria for what shall count as meaningful new learning. Maps of the roads we’re traveling will provide some needed guidance and make the trip more enjoyable, efficient, and productive. Perhaps somewhere not far down the road we’ll be able to project doubling rates for growth in the volume of fungible literacy capital globally, or the halving rates in the cost of health capital stocks. We manage what we measure, so when we begin measuring well what we want to manage well, we’ll all be better off.


Andrich, D. (2002). Understanding resistance to the data-model relationship in Rasch’s paradigm: A reflection for the next generation. Journal of Applied Measurement, 3(3), 325-59.

Andrich, D. (2004, January). Controversy and the Rasch model: A characteristic of incompatible paradigms? Medical Care, 42(1), I-7–I-16.

Busemeyer, J. R., & Wang, Y.-M. (2000, March). Model comparisons and model selections based on generalization criterion methodology. Journal of Mathematical Psychology, 44(1), 171-189.

Commons, M. L., & Richards, F. A. (2002, Jul). Organizing components into combinations: How stage transition works. Journal of Adult Development, 9(3), 159-177.

Dawson, T. L. (2004, April). Assessing intellectual development: Three approaches, one sequence. Journal of Adult Development, 11(2), 71-85.

Kurzweil, R. (2005). The singularity is near: When humans transcend biology. New York: Viking Penguin.

Miller, P., & O’Leary, T. (2007, October/November). Mediating instruments and making markets: Capital budgeting, science and the economy. Accounting, Organizations, and Society, 32(7-8), 701-34.

Myung, I. J. (2000). Importance of complexity in model selection. Journal of Mathematical Psychology, 44(1), 190-204.

Othman, R., & Hashim, N. A. (2004). Typologizing organizational amnesia. The Learning Organization, 11(3), 273-84.

Pitt, M. A., Kim, W., & Myung, I. J. (2003). Flexibility versus generalizability in model selection. Psychonomic Bulletin & Review, 10, 29-44.

Stenner, A. J., Burdick, H., Sanford, E. E., & Burdick, D. S. (2006). How accurate are Lexile text measures? Journal of Applied Measurement, 7(3), 307-22.

Stenner, A. J., & Horabin, I. (1992). Three stages of construct definition. Rasch Measurement Transactions, 6(3), 229.

Taylor, M. C. (2003). The moment of complexity: Emerging network culture. Chicago: University of Chicago Press.

Wright, B. D. (1997, Winter). A history of social science measurement. Educational Measurement: Issues and Practice, 16(4), 33-45, 52.

Wright, B. D. (1999). Fundamental measurement for psychology. In S. E. Embretson & S. L. Hershberger (Eds.), The new rules of measurement: What every educator and psychologist should know (pp. 65-104). Hillsdale, New Jersey: Lawrence Erlbaum Associates.
