Archive for the ‘Actor-Network Theory (ANT)’ Category

Revisiting the “Glocal” integration of universals and historical context

April 11, 2014

Integrated considerations of the universal and the local, of pure ideal parameters and messy concrete observations, seem to turn up everywhere in my reading lately. For instance, Ricoeur (1992, p. 289) takes up the problem of human rights, imperfectly realized as a product of Western Europe’s cultural history yet nonetheless adopted by nearly every country in the world. Ricoeur raises the notion of “universals in context or of potential or inchoate universals” that embody the paradox in which

“on the one hand, one must maintain the universal claim attached to a few values where the universal and the historical intersect, and on the other hand, one must submit this claim to discussion, not on a formal level, but on the level of the convictions incorporated in concrete forms of life.”

I could hardly come up with a better description of Rasch measurement theory and practice myself. Any given Rasch model data analysis provides many times more individual-level qualitative statistics on the concrete, substantive observations than on the global quantitative measures. The whole point of graphical displays of measurement information in kidmaps (Chien, Wang, Wang, & Lin, 2009; Masters, 1994), Wright maps (Wilson, 2011), construct maps and self-scoring forms (Best, 2008; Linacre, 1997), etc. is precisely to integrate concrete events as they happened with the abstract ideal of a shared measurement dimension.
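The integrative work these displays do can be sketched in a few lines of code. The toy example below is my own illustration, not any published kidmap or Wright map implementation; the function name, bin width, and layout are all invented for the sketch. It simply places persons and items on one shared logit scale so that concrete observations and the abstract measurement dimension appear in a single picture:

```python
def wright_map(person_measures, item_calibrations, lo=-3.0, hi=3.0, step=0.5):
    """Render a crude text Wright map: persons ('X') in the middle column
    and item names on the right, both located on a shared logit scale."""
    edges = []
    x = hi
    while x >= lo:          # 0.5-logit bins from hi down to lo
        edges.append(x)
        x -= step
    lines = []
    for top in edges:
        n_persons = sum(1 for p in person_measures if top - step < p <= top)
        names = [name for name, d in item_calibrations.items()
                 if top - step < d <= top]
        lines.append(f"{top:5.1f} | {'X' * n_persons:<10} | {' '.join(names)}")
    return "\n".join(lines)

# Hypothetical person measures and item calibrations, in logits.
persons = [-1.2, -0.4, 0.1, 0.3, 0.3, 1.1, 1.8]
items = {"easy": -1.5, "mid": 0.2, "hard": 1.6}
print(wright_map(persons, items))
```

Reading down the printout, a person's row immediately shows which items lie below (likely successes) and above (likely failures) that person's measure, which is the diagnostic logic of a kidmap in miniature.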

It is such a shame that so few of the people thinking about these issues are aware of the practical value of the state of the art in measurement, and that so few include all of the various implications of multifaceted, multilevel, and multi-unidimensional modeling, fit assessment, equating, construct mapping, standard setting, etc. in their critiques.

The problem falls squarely in the domain of recent work on the coproduction of social, scientific, and economic orders (such as Hutchins 2010, 2012; Nersessian, 2012). Systems of standards, from languages to metric units to dollars, prethink the world for us and simplify a lot of complex work. But then we’re stuck at the level of conceptual, social, economic, and scientific complexity implied by those standards, unless we can create new forms of social organization integrating more domains. Those who don’t know anything about the available tools can’t get any analytic traction; those who know about the tools but don’t connect with the practitioners can’t get any applied traction (see Wilson’s Psychometric Society Presidential Address on this; Wilson, 2013); analysts and practitioners who form alliances but fail to include accountants or administrators may lack financial or organizational traction; and so on.

There’s a real need to focus on the formation of alliances across domains of practice, building out the implications of Callon’s (1995, p. 58) observation that “translation networks weave a socionature.” In other words, standards are translated into the languages of different levels and kinds of practice to the extent that people become so thoroughly habituated to them that they succumb to the illusion that the objects of interest are inherently natural in self-evident ways. (My 2014 IOMW talk took this up, though there wasn’t a lot of time for details.)

Those who are studying these networks have come to important insights that set the stage for better measurement and metrology for human, social, and natural capital. For instance, in a study of universalities in medicine, Berg and Timmermans (2000, pp. 55, 56) note:

“In order for a statistical logistics to enhance precise decision making, it has to incorporate imprecision; in order to be universal, it has to carefully select its locales. The parasite cannot be killed off slowly by gradually increasing the scope of the Order. Rather, an Order can thrive only when it nourishes its parasite—so that it can be nourished by it.”

“Paradoxically, then, the increased stability and reach of this network was not due to more (precise) instructions: the protocol’s logistics could thrive only by parasitically drawing upon its own disorder.”

Though Berg and Timmermans show no awareness at all of probabilistic and additive conjoint measurement theory and practice, their description of how a statistical logistics has to work to enhance precise decision making is right on target. This phenomenon of noise-induced order is a kind of social stochastic resonance (Fisher, 1992, 2011b) that provides another direction in which explanations of Rasch measurement’s potential role in establishing new metrological standards (Fisher, 2009, 2011a) have to be taken.

Berg, M., & Timmermans, S. (2000). Order and their others: On the constitution of universalities in medical work. Configurations, 8(1), 31-61.

Best, W. R. (2008). A construct map that Ben Wright would relish. Rasch Measurement Transactions, 22(3), 1169-70 [http://www.rasch.org/rmt/rmt223a.htm].

Callon, M. (1995). Four models for the dynamics of science. In S. Jasanoff, G. E. Markle, J. C. Petersen & T. Pinch (Eds.), Handbook of science and technology studies (pp. 29-63). Thousand Oaks, California: Sage Publications.

Chien, T.-W., Wang, W.-C., Wang, H.-Y., & Lin, H.-J. (2009). Online assessment of patients’ views on hospital performances using Rasch model’s KIDMAP diagram. BMC Health Services Research, 9, 135 [doi:10.1186/1472-6963-9-135 or http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2727503/].

Fisher, W. P., Jr. (1992, Spring). Stochastic resonance and Rasch measurement. Rasch Measurement Transactions, 5(4), 186-187 [http://www.rasch.org/rmt/rmt54k.htm].

Fisher, W. P., Jr. (2009, November). Invariance and traceability for measures of human, social, and natural capital: Theory and application. Measurement, 42(9), 1278-1287.

Fisher, W. P., Jr. (2011a). Bringing human, social, and natural capital to life: Practical consequences and opportunities. In N. Brown, B. Duckor, K. Draney & M. Wilson (Eds.), Advances in Rasch Measurement, Vol. 2 (pp. 1-27). Maple Grove, MN: JAM Press.

Fisher, W. P., Jr. (2011b). Stochastic and historical resonances of the unit in physics and psychometrics. Measurement: Interdisciplinary Research & Perspectives, 9, 46-50.

Hutchins, E. (2010). Cognitive ecology. Topics in Cognitive Science, 2, 705-715.

Hutchins, E. (2012). Concepts in practice as sources of order. Mind, Culture, and Activity, 19, 314-323.

Linacre, J. M. (1997). Instantaneous measurement and diagnosis. Physical Medicine and Rehabilitation State of the Art Reviews, 11(2), 315-324 [http://www.rasch.org/memo60.htm].

Masters, G. N. (1994). KIDMAP – a history. Rasch Measurement Transactions, 8(2), 366 [http://www.rasch.org/rmt/rmt82k.htm].

Nersessian, N. J. (2012). Engineering concepts: The interplay between concept formation and modeling practices in bioengineering sciences. Mind, Culture, and Activity, 19, 222-239.

Wilson, M. R. (2011). Some notes on the term: “Wright Map.” Rasch Measurement Transactions, 25(3), 1331 [http://www.rasch.org/rmt/rmt253.pdf].

Wilson, M. (2013, April). Seeking a balance between the statistical and scientific elements in psychometrics. Psychometrika, 78(2), 211-236.

A New Agenda for Measurement Theory and Practice in Education and Health Care

April 15, 2011

Two key issues on my agenda offer different answers to the question “Why do you do things the way you do in measurement theory and practice?”

First, we can take up the “Because of…” answer to this question. We need to articulate an historical account of measurement that does three things:

  1. that builds on Rasch’s use of Maxwell’s method of analogy by employing it and expanding on it in new applications;
  2. that unifies the vocabulary and concepts of measurement across the sciences into a single framework so far as possible by situating probabilistic models of invariant individual-level within-variable phenomena in the context of measurement’s GIGO principle and data-to-model fit, as distinct from the interactions of group-level between-variable phenomena in the context of statistics’ model-to-data fit; and
  3. that stresses the social, collective cognition facilitated by networks of individuals whose point-of-use measurement-informed decisions and behaviors are coordinated and harmonized virtually, at a distance, with no need for communication or negotiation.
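As a concrete anchor for point 2, the dichotomous Rasch model makes the individual-level invariance explicit: the probability of a correct or affirmative response depends only on the difference between a person parameter and an item parameter, so comparisons between items come out the same for every person. A minimal sketch (my own illustration; the function and variable names are invented):

```python
import math

def rasch_probability(theta, delta):
    """Dichotomous Rasch model: P(x=1) = exp(theta - delta) / (1 + exp(theta - delta)),
    with person ability theta and item difficulty delta, both in logits."""
    return 1.0 / (1.0 + math.exp(-(theta - delta)))

def log_odds(p):
    """Log-odds of a probability; under the model this equals theta - delta."""
    return math.log(p / (1.0 - p))

theta_a, theta_b = -1.0, 2.0   # two very different persons
delta_1, delta_2 = 0.5, 1.5    # two items, true difficulty difference = 1.0

# The log-odds difference between the two items is identical for both
# persons: the person parameter cancels, leaving delta_2 - delta_1.
diff_a = (log_odds(rasch_probability(theta_a, delta_1))
          - log_odds(rasch_probability(theta_a, delta_2)))
diff_b = (log_odds(rasch_probability(theta_b, delta_1))
          - log_odds(rasch_probability(theta_b, delta_2)))
```

This separability of person and item parameters is exactly what distinguishes the individual-level, data-to-model-fit framework described here from group-level statistical modeling, where such comparisons generally depend on the sample at hand.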

We need multiple publications in leading journals on these issues, as well as one or more books that people can cite as a way of making this real and true history of measurement, properly speaking, credible and accepted in the mainstream. A draft article of my own in this vein, which I offer for critique, is available at http://ssrn.com/abstract=1698919; other material is available on request. Anyone who works on this paper with me and makes a substantial contribution to its publication will be added as co-author.

Second, we can take up the “In order that…” answer to the question “Why do you do things the way you do?” From this point of view, we need to broaden the scope of the measurement research agenda beyond data analysis, estimation, models, and fit assessment in three ways:

  1. by emphasizing predictive construct theories that exhibit the fullest possible understanding of what is measured and so enable the routine reproduction of desired proportionate effects efficiently, with no need to analyze data to obtain an estimate;
  2. by defining the standard units to which all calibrated instruments measuring given constructs are traceable; and
  3. by disseminating to front line users on mass scales instruments measuring in publicly available standard units and giving immediate feedback at the point of use.

These two sets of issues define a series of talking points that together constitute a new narrative for measurement in education, psychology, health care, and many other fields. We and others may see our way to organizing new professional societies, new journals, new university-based programs of study, etc. around these principles.

Creative Commons License
LivingCapitalMetrics Blog by William P. Fisher, Jr., Ph.D. is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 United States License.
Based on a work at livingcapitalmetrics.wordpress.com.
Permissions beyond the scope of this license may be available at http://www.livingcapitalmetrics.com.

Consequences of Standardized Technical Effects for Scientific Advancement

January 24, 2011

Note. This is modified from:

Fisher, W. P., Jr. (2004, Wednesday, January 21). Consequences of standardized technical effects for scientific advancement. In A. Leplège (Chair), Session 2.5A. Rasch Models: History and Philosophy. Second International Conference on Measurement in Health, Education, Psychology, and Marketing: Developments with Rasch Models, The International Laboratory for Measurement in the Social Sciences, School of Education, Murdoch University, Perth, Western Australia.

—————————

Over the last several decades, historians of science have repeatedly produced evidence contradicting the widespread assumption that technology is a product of experimentation and/or theory (Kuhn 1961; Latour 1987; Rabkin 1992; Schaffer 1992; Hankins & Silverman 1999; Baird 2002). Theory and experiment typically advance only within the constraints set by a key technology that is widely available to end users in applied and/or research contexts. Thus, “it is not just a clever historical aphorism, but a general truth, that ‘thermodynamics owes much more to the steam engine than ever the steam engine owed to thermodynamics’” (Price 1986, p. 240).

The prior existence of the relevant technology comes to bear on theory and experiment again in the common, but mistaken, assumption that measures are made and experimentally compared in order to discover scientific laws. History and the logic of measurement show that measures are rarely made until the relevant law is effectively embodied in an instrument (Kuhn 1961; Michell 1999). This points to the difficulty experienced in metrologically fusing (Schaffer 1992, p. 27; Lapré & van Wassenhove 2002) instrumentalists’ often inarticulate, but materially effective, knowledge (know-how) with theoreticians’ often immaterial, but well articulated, knowledge (know-why) (Galison 1999; Baird 2002).

Because technology often dictates what, if any, phenomena can be consistently produced, it constrains experimentation and theorizing by focusing attention selectively on reproducible, potentially interpretable effects, even when those effects are not well understood (Ackermann 1985; Daston & Galison 1992; Ihde 1998; Hankins & Silverman 1999; Maasen & Weingart 2001). Criteria for theory choice in this context stem from competing explanatory frameworks’ experimental capacities to facilitate instrument improvements, prediction of experimental results, and gains in the efficiency with which a phenomenon is produced.

In this context, the relatively recent introduction of measurement models requiring additive, invariant parameterizations (Rasch 1960) provokes speculation as to the effect on the human sciences that might be wrought by the widespread availability of consistently reproducible effects expressed in common quantitative languages. Paraphrasing Price’s comment on steam engines and thermodynamics, might it one day be said that as yet unforeseeable advances in reading theory will owe far more to the Lexile analyzer (Burdick & Stenner 1996) than ever the Lexile analyzer owed to reading theory?

Kuhn (1961) speculated that the second scientific revolution of the mid-nineteenth century followed in large part from the full mathematization of physics, i.e., the emergence of metrology as a professional discipline focused on providing universally accessible uniform units of measurement (Roche 1998). Might a similar revolution and new advances in the human sciences follow from the introduction of rigorously mathematical uniform measures?

Measurement technologies capable of supporting the calibration of additive units that remain invariant over instruments and samples (Rasch 1960) have been introduced relatively recently in the human sciences. The invariances produced appear 1) very similar to those produced in the natural sciences (Fisher 1997) and 2) based in the same mathematical metaphysics as that informing the natural sciences (Fisher 2003). Might it then be possible that the human sciences are on the cusp of a revolution analogous to that of nineteenth-century physics? Other factors involved in answering this question, such as the professional status of the field, the enculturation of students, and the scale of the relevant enterprises, define the structure of circumstances that might be capable of supporting the kind of theoretical consensus and research productivity that came to characterize, for instance, work in electrical resistance through the early 1880s (Schaffer 1992).
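The claim that item calibrations remain invariant over samples can be checked directly by simulation. The sketch below is my own illustration, not code from the conference paper; the pairwise conditional estimator it uses is a standard device in the Rasch literature (among persons who got exactly one of two items right, the log of the ratio of the two response patterns estimates the difficulty difference, with the person parameters dropping out entirely). Two samples of sharply different ability recover roughly the same difficulty difference:

```python
import math
import random

def simulate_responses(thetas, deltas, rng):
    """Simulate dichotomous responses under the Rasch model."""
    return [[1 if rng.random() < 1.0 / (1.0 + math.exp(-(th - d))) else 0
             for d in deltas]
            for th in thetas]

def pairwise_difficulty_difference(data, i, j):
    """Estimate delta_j - delta_i from pairwise conditional counts:
    log of (i right, j wrong) over (i wrong, j right)."""
    n_ij = sum(1 for row in data if row[i] == 1 and row[j] == 0)
    n_ji = sum(1 for row in data if row[i] == 0 and row[j] == 1)
    return math.log(n_ij / n_ji)

rng = random.Random(42)
deltas = [0.5, 1.5]   # true difficulty difference = 1.0 logits

# Two deliberately mismatched samples: low-ability and high-ability persons.
low = [rng.gauss(-1.0, 1.0) for _ in range(20000)]
high = [rng.gauss(+1.0, 1.0) for _ in range(20000)]

est_low = pairwise_difficulty_difference(simulate_responses(low, deltas, rng), 0, 1)
est_high = pairwise_difficulty_difference(simulate_responses(high, deltas, rng), 0, 1)
# Both estimates fall near the true value of 1.0 despite the two-logit
# difference in mean ability between the samples.
```

This sample-independence of the calibrations is the simulated analogue of the instrument- and sample-invariance described above, and it is what makes metrological traceability to standard units conceivable for test and survey data.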

Much could be learned from Rasch’s use of Maxwell’s method of analogy (Nersessian, 2002; Turner, 1955), not just in the modeling of scientific laws but from the social and economic factors that made the regularities of natural phenomena function as scientific capital (Latour, 1987). Quantification must be understood in the fully mathematical sense of commanding a comprehensive grasp of the real root of mathematical thinking. Far from being simply a means of producing numbers, to be useful, quantification has to result in qualitatively transparent figure-meaning relations at any point of use for any one of every different kind of user. Connections between numbers and unit amounts of the variable must remain constant across samples, instruments, time, space, and measurers. Quantification that does not support invariant linear comparisons expressed in a uniform metric available universally to all end users at the point of need is inadequate and incomplete. Such standardization is widely respected in the natural sciences but is virtually unknown in the human sciences, largely due to untested hypotheses and unexamined prejudices concerning the viability of universal uniform measures for the variables measured via tests, surveys, and performance assessments.

Quantity is an effective medium for science to the extent that it comprises an instance of the kind of common language necessary for distributed, collective thinking; for widespread agreement on what makes research results compelling; and for the formation of social capital’s group-level effects. It may be that the primary relevant difference between the case of 19th century physics and today’s human sciences concerns the awareness, widespread among scientists in the 1800s and virtually nonexistent in today’s human sciences, that universal uniform metrics for the variables of interest are both feasible and of great human, scientific, and economic value.

In the creative dynamics of scientific instrument making, as in the making of art, the combination of inspiration and perspiration can sometimes result in cultural gifts of the first order. It nonetheless often happens that some of these superlative gifts, no matter how well executed, are unable to negotiate the conflict between commodity and gift economics characteristic of the marketplace (Baird, 1997; Hagstrom, 1965; Hyde, 1979), and so remain unknown, lost to the audiences they deserve, and unable to render their potential effects historically. Value is not an intrinsic characteristic of the gift; rather, value is ascribed as a function of interests. If interests are not cultivated via the clear definition of positive opportunities for self-advancement, common languages, socio-economic relations, and recruitment, gifts of even the greatest potential value may die with their creators. On the other hand, who has not seen mediocrity disproportionately rewarded merely as a result of intensive marketing?

A central problem is then how to strike a balance between individual or group interests and the public good. Society and individuals are interdependent in that children are enculturated into the specific forms of linguistic and behavioral competence that are valued in communities at the same time that those communities are created, maintained, and reproduced through communicative actions (Habermas, 1995, pp. 199-200). The identities of individuals and societies then co-evolve, as each defines itself through the other via the medium of language. Language is understood broadly in this context to include all perceptual reading of the environment, bodily gestures, social action, etc., as well as the use of spoken or written symbols and signs (Harman, 2005; Heelan, 1983; Ihde, 1998; Nicholson, 1984; Ricoeur, 1981).

Technologies extend language by providing media for the inscription of new kinds of signs (Heelan, 1983a, 1998; Ihde, 1991, 1998; Ihde & Selinger, 2003). Thus, mobility desires and practices are inscribed and projected into the world using the automobile; shelter and life style, via housing and clothing; and communications, via alphabets, scripts, phonemes, pens and paper, telephones, and computers. Similarly, technologies in the form of test, survey, and assessment instruments provide the devices on which we inscribe desires for social mobility, career advancement, health maintenance and improvement, etc.

References

Ackermann, J. R. (1985). Data, instruments, and theory: A dialectical approach to understanding science. Princeton, New Jersey: Princeton University Press.

Baird, D. (1997, Spring-Summer). Scientific instrument making, epistemology, and the conflict between gift and commodity economics. Techné: Journal of the Society for Philosophy and Technology, 2(3-4), 25-46. Retrieved 08/28/2009, from http://scholar.lib.vt.edu/ejournals/SPT/v2n3n4/baird.html.

Baird, D. (2002, Winter). Thing knowledge – function and truth. Techné: Journal of the Society for Philosophy and Technology, 6(2). Retrieved 19/08/2003, from http://scholar.lib.vt.edu/ejournals/SPT/v6n2/baird.html.

Burdick, H., & Stenner, A. J. (1996). Theoretical prediction of test items. Rasch Measurement Transactions, 10(1), 475 [http://www.rasch.org/rmt/rmt101b.htm].

Daston, L., & Galison, P. (1992, Fall). The image of objectivity. Representations, 40, 81-128.

Galison, P. (1999). Trading zone: Coordinating action and belief. In M. Biagioli (Ed.), The science studies reader (pp. 137-160). New York, New York: Routledge.

Habermas, J. (1995). Moral consciousness and communicative action. Cambridge, Massachusetts: MIT Press.

Hagstrom, W. O. (1965). Gift-giving as an organizing principle in science. In The scientific community (pp. 12-22). New York: Basic Books. (Rpt. in B. Barnes (Ed.). (1972). Sociology of science: Selected readings (pp. 105-20). Baltimore, Maryland: Penguin Books.)

Hankins, T. L., & Silverman, R. J. (1999). Instruments and the imagination. Princeton, New Jersey: Princeton University Press.

Harman, G. (2005). Guerrilla metaphysics: Phenomenology and the carpentry of things. Chicago: Open Court.

Hyde, L. (1979). The gift: Imagination and the erotic life of property. New York: Vintage Books.

Ihde, D. (1998). Expanding hermeneutics: Visualism in science (Northwestern University Studies in Phenomenology and Existential Philosophy). Evanston, Illinois: Northwestern University Press.

Kuhn, T. S. (1961). The function of measurement in modern physical science. Isis, 52(168), 161-193. (Rpt. in The essential tension: Selected studies in scientific tradition and change (pp. 178-224). Chicago, Illinois: University of Chicago Press, 1977.)

Lapré, M. A., & Van Wassenhove, L. N. (2002, October). Learning across lines: The secret to more efficient factories. Harvard Business Review, 80(10), 107-11.

Latour, B. (1987). Science in action: How to follow scientists and engineers through society. New York, New York: Cambridge University Press.

Maasen, S., & Weingart, P. (2001). Metaphors and the dynamics of knowledge. (Vol. 26. Routledge Studies in Social and Political Thought). London: Routledge.

Michell, J. (1999). Measurement in psychology: A critical history of a methodological concept. Cambridge: Cambridge University Press.

Nersessian, N. J. (2002). Maxwell and “the Method of Physical Analogy”: Model-based reasoning, generic abstraction, and conceptual change. In D. Malament (Ed.), Essays in the history and philosophy of science and mathematics (pp. 129-166). Lasalle, Illinois: Open Court.

Price, D. J. d. S. (1986). Of sealing wax and string. In Little science, big science–and beyond (pp. 237-253). New York, New York: Columbia University Press.

Rabkin, Y. M. (1992). Rediscovering the instrument: Research, industry, and education. In R. Bud & S. E. Cozzens (Eds.), Invisible connections: Instruments, institutions, and science (pp. 57-82). Bellingham, Washington: SPIE Optical Engineering Press.

Rasch, G. (1960). Probabilistic models for some intelligence and attainment tests (Reprint, with Foreword and Afterword by B. D. Wright, Chicago: University of Chicago Press, 1980). Copenhagen, Denmark: Danmarks Paedogogiske Institut.

Roche, J. (1998). The mathematics of measurement: A critical history. London: The Athlone Press.

Schaffer, S. (1992). Late Victorian metrology and its instrumentation: A manufactory of Ohms. In R. Bud & S. E. Cozzens (Eds.), Invisible connections: Instruments, institutions, and science (pp. 23-56). Bellingham, WA: SPIE Optical Engineering Press.

Turner, J. (1955, November). Maxwell on the method of physical analogy. British Journal for the Philosophy of Science, 6, 226-238.


Questions about measurement: If it is so important, why…?

January 28, 2010

If measurement is so important, why is measurement quality so uniformly low?

If we manage what we measure, why is measurement leadership virtually nonexistent?

If we can’t tell if things are getting better, staying the same, or getting worse without good metrics, why is measurement so rarely context-sensitive, focused, integrated, and interactive, as Dean Spitzer recommends it should be?

If quantification is valued for its rigor and convenience, why is no one demanding meaningful mappings of substantive, additive amounts of things measured on number lines?

If everyone is drowning in unmanageable floods of data, why isn’t measurement used to reduce data volumes dramatically—and not only with no loss of information but with the addition of otherwise unavailable forms of information?

If learning and improvement are the order of the day, why isn’t anyone interested in the organizational and individual learning trajectories that are defined by hierarchies of calibrated items?

If resilient lean thinking is the way to go, why aren’t more measures constructed to retain their meaning and values across changes in item content?

If flexibility is a core value, why aren’t we adapting instruments to people and organizations, instead of vice versa?

If fair, just, and meaningful measurement is often lacking in judge-assigned performance assessments, why isn’t anyone estimating the consistency, and the leniency or harshness, of ratings—and removing those effects from the measures made?

If efficiency is valued, why does no one at all seem to care about adjusting measurement precision to the needs of the task at hand, so that time and resources are not wasted in gathering too much or too little data?

If it’s common knowledge that we can do more together than we can as individuals, why isn’t anyone providing the high quality and uniform information needed for the networked collective thinking that is able to keep pace with the demand for innovation?

Since the metric system and uniform product standards are widely recognized as essential to science and commerce, why are longstanding capacities for common metrics for human, social, and natural capital not being used?

If efficient markets are such great things, why isn’t anyone at all concerned about lubricating the flow of human, social, and natural capital by investing in the highest quality measurement obtainable?

If everyone loves a good profit, why aren’t we setting up human, social, and natural capital metric systems to inform competitive pricing of intangible assets, products, and services?

If companies are supposed to be organic entities that mature in a manner akin to human development over the lifespan, why is so little being done to conceive, gestate, midwife, and nurture living capital?

In short, if measurement is really as essential to management as it is so often said to be, why doesn’t anyone seek out the state of the art technology, methods, and experts before going to the trouble of developing and implementing metrics?

I suspect the answers to these questions are all the same. These disconnects between word and deed happen because so few people are aware of the technical advances made in measurement theory and practice over the last several decades.

For the deep background, see previous entries in this blog, various web sites (www.rasch.org, www.rummlab.com, www.winsteps.com, http://bearcenter.berkeley.edu/, etc.), and an extensive body of published work (Rasch, 1960; Wright, 1977, 1997a, 1997b, 1999a, 1999b; Andrich, 1988, 2004, 2005; Bond & Fox, 2007; Fisher, 2009, 2010; Smith & Smith, 2004; Wilson, 2005; Wright & Stone, 1999, 2004).

There is a wealth of published applied research in education, psychology, and health care (Bezruczko, 2005; Fisher & Wright, 1994; Masters, 2007; Masters & Keeves, 1999). To find more, search on Rasch together with the substantive area of interest.

For applications in business contexts, published resources are more limited in number (ATP, 2001; Drehmer, Belohlav, & Coye, 2000; Drehmer & Deklava, 2001; Ludlow & Lunz, 1998; Lunz & Linacre, 1998; Mohamed, et al., 2008; Salzberger, 2000; Salzberger & Sinkovics, 2006; Zakaria, et al., 2008). I have, however, just become aware of the November 2009 publication of what could be a landmark business measurement text (Salzberger, 2009). Hopefully, this book will be just one of many to come, and the questions I’ve raised will no longer need to be asked.

References

Andrich, D. (1988). Rasch models for measurement. (Vols. series no. 07-068). Sage University Paper Series on Quantitative Applications in the Social Sciences). Beverly Hills, California: Sage Publications.

Andrich, D. (2004, January). Controversy and the Rasch model: A characteristic of incompatible paradigms? Medical Care, 42(1), I-7–I-16.

Andrich, D. (2005). Georg Rasch: Mathematician and statistician. In K. Kempf-Leonard (Ed.), Encyclopedia of Social Measurement (Vol. 3, pp. 299-306). Amsterdam: Academic Press, Inc.

Association of Test Publishers. (2001, Fall). Benjamin D. Wright, Ph.D. honored with the Career Achievement Award in Computer-Based Testing. Test Publisher, 8(2). Retrieved 20 May 2009, from http://www.testpublishers.org/newsletter7.htm#Wright.

Bezruczko, N. (Ed.). (2005). Rasch measurement in health sciences. Maple Grove, MN: JAM Press.

Bond, T., & Fox, C. (2007). Applying the Rasch model: Fundamental measurement in the human sciences, 2d edition. Mahwah, New Jersey: Lawrence Erlbaum Associates.

Dawson, T. L., & Gabrielian, S. (2003, June). Developing conceptions of authority and contract across the life-span: Two perspectives. Developmental Review, 23(2), 162-218.

Drehmer, D. E., Belohlav, J. A., & Coye, R. W. (2000, Dec). An exploration of employee participation using a scaling approach. Group & Organization Management, 25(4), 397-418.

Drehmer, D. E., & Deklava, S. M. (2001, April). A note on the evolution of software engineering practices. Journal of Systems and Software, 57(1), 1-7.

Fisher, W. P., Jr. (2009, November). Invariance and traceability for measures of human, social, and natural capital: Theory and application. Measurement (Elsevier), 42(9), 1278-1287.

Fisher, W. P., Jr. (2010). Bringing human, social, and natural capital to life: Practical consequences and opportunities. Journal of Applied Measurement, 11, in press [Pre-press version available at http://www.livingcapitalmetrics.com/images/BringingHSN_FisherARMII.pdf].

Ludlow, L. H., & Lunz, M. E. (1998). The Job Responsibilities Scale: Invariance in a longitudinal prospective study. Journal of Outcome Measurement, 2(4), 326-37.

Lunz, M. E., & Linacre, J. M. (1998). Measurement designs using multifacet Rasch modeling. In G. A. Marcoulides (Ed.), Modern methods for business research. Methodology for business and management (pp. 47-77). Mahwah, New Jersey: Lawrence Erlbaum Associates, Inc.

Masters, G. N. (2007). Special issue: Programme for International Student Assessment (PISA). Journal of Applied Measurement, 8(3), 235-335.

Masters, G. N., & Keeves, J. P. (Eds.). (1999). Advances in measurement in educational research and assessment. New York: Pergamon.

Mohamed, A., Aziz, A., Zakaria, S., & Masodi, M. S. (2008). Appraisal of course learning outcomes using Rasch measurement: A case study in information technology education. In L. Kazovsky, P. Borne, N. Mastorakis, A. Kuri-Morales & I. Sakellaris (Eds.), Proceedings of the 7th WSEAS International Conference on Software Engineering, Parallel and Distributed Systems (Electrical And Computer Engineering Series) (pp. 222-238). Cambridge, UK: WSEAS.

Rasch, G. (1960). Probabilistic models for some intelligence and attainment tests (Reprint, with Foreword and Afterword by B. D. Wright, Chicago: University of Chicago Press, 1980). Copenhagen, Denmark: Danmarks Paedogogiske Institut.

Salzberger, T. (2000). An extended Rasch analysis of the CETSCALE: Implications for scale development and data construction. Department of Marketing, University of Economics and Business Administration, Vienna (WU-Wien) (http://www2.wu-wien.ac.at/marketing/user/salzberger/research/wp_dataconstruction.pdf).

Salzberger, T. (2009). Measurement in marketing research: An alternative framework. Northampton, MA: Edward Elgar.

Salzberger, T., & Sinkovics, R. R. (2006). Reconsidering the problem of data equivalence in international marketing research: Contrasting approaches based on CFA and the Rasch model for measurement. International Marketing Review, 23(4), 390-417.

Smith, E. V., Jr., & Smith, R. M. (2004). Introduction to Rasch measurement. Maple Grove, MN: JAM Press.

Spitzer, D. (2007). Transforming performance measurement: Rethinking the way we measure and drive organizational success. New York: AMACOM.

Wilson, M. (2005). Constructing measures: An item response modeling approach. Mahwah, New Jersey: Lawrence Erlbaum Associates.

Wright, B. D. (1977). Solving measurement problems with the Rasch model. Journal of Educational Measurement, 14(2), 97-116 [http://www.rasch.org/memo42.htm].

Wright, B. D. (1997a, June). Fundamental measurement for outcome evaluation. Physical Medicine & Rehabilitation State of the Art Reviews, 11(2), 261-88.

Wright, B. D. (1997b, Winter). A history of social science measurement. Educational Measurement: Issues and Practice, 16(4), 33-45, 52 [http://www.rasch.org/memo62.htm].

Wright, B. D. (1999a). Fundamental measurement for psychology. In S. E. Embretson & S. L. Hershberger (Eds.), The new rules of measurement: What every educator and psychologist should know (pp. 65-104 [http://www.rasch.org/memo64.htm]). Hillsdale, New Jersey: Lawrence Erlbaum Associates.

Wright, B. D. (1999b). Rasch measurement models. In G. N. Masters & J. P. Keeves (Eds.), Advances in measurement in educational research and assessment (pp. 85-97). New York: Pergamon.

Wright, B. D., & Stone, M. H. (1999). Measurement essentials. Wilmington, DE: Wide Range, Inc. [http://www.rasch.org/memos.htm#measess].

Wright, B. D., & Stone, M. H. (2004). Making measures. Chicago: Phaneron Press.

Zakaria, S., Aziz, A. A., Mohamed, A., Arshad, N. H., Ghulman, H. A., & Masodi, M. S. (2008, November 11-13). Assessment of information managers’ competency using Rasch measurement. ICCIT: Third International Conference on Convergence and Hybrid Information Technology, 1, 190-196 [http://www.computer.org/portal/web/csdl/doi/10.1109/ICCIT.2008.387].

Creative Commons License
LivingCapitalMetrics Blog by William P. Fisher, Jr., Ph.D. is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 United States License.
Based on a work at livingcapitalmetrics.wordpress.com.
Permissions beyond the scope of this license may be available at http://www.livingcapitalmetrics.com.

Review of Spitzer’s Transforming Performance Measurement

January 25, 2010

Everyone interested in practical measurement applications needs to read Dean R. Spitzer’s 2007 book, Transforming performance measurement: Rethinking the way we measure and drive organizational success (New York, AMACOM). Spitzer describes how measurement, properly understood and implemented, can transform organizational performance by empowering and motivating individuals. Measurement understood in this way moves beyond quick fixes and fads to sustainable processes based on a measurement infrastructure that coordinates decisions and actions uniformly throughout the organization.

Measurement leadership, Spitzer says, is essential. He advocates, and many organizations have instituted, the C-suite position of Chief Measurement Officer (Chapter 9). This person is responsible for instituting and managing the four keys to transformational performance measurement (Chapters 5-8):

  • Context sets the tone by presenting the purpose of measurement as either negative (to inspect, control, report, manipulate) or positive (to give feedback, learn, improve).
  • Focus concentrates attention on what’s important, aligning measures with the mission, strategy, and with what needs to be managed, relative to the opportunities, capacities, and skills at hand.
  • Integration addresses the flow of measured information throughout the organization so that the covariations of different measures can be observed relative to the overall value created.
  • Interactivity speaks to the inherently social nature of the purposes of measurement, so that it embodies an alignment with the business model, strategy, and operational imperatives.

Spitzer takes a developmental approach to measurement improvement, providing a Measurement Maturity Assessment in Chapter 12, and also speaking to the issues of the “living company” raised by Arie de Geus’ classic book of that title. Plainly, the transformative potential of performance measurement is dependent on the maturational complexity of the context in which it is implemented.

Spitzer clearly outlines the ways in which each of the four keys and measurement leadership play into or hinder transformation and maturation. He also provides practical action plans and detailed guidelines, stresses the essential need for an experimental attitude toward evaluating change, speaks directly to the difficulty of measuring intangible assets like partnership, trust, skills, etc., and shows appreciation for the value of qualitative data.

Transforming Performance Measurement is not an academic treatise, though all sources are documented, with the endnotes and bibliography running to 25 pages. It was written for executives, managers, and entrepreneurs who need practical advice expressed in direct, simple terms. That practical orientation comes at a price: the book shows no awareness of the technical capacities of measurement as these have been realized in numerous commercial applications in high-stakes and licensure/certification testing over the last 50 years (Andrich, 2005; Bezruczko, 2005; Bond & Fox, 2007; Masters, 2007; Wilson, 2005). This can hardly be counted as a major criticism, since no book of this kind has yet been able to incorporate the often highly technical and mathematical presentations of advanced psychometrics.

That said, Spitzer’s conceptual framework and recommendations are sophisticated enough to be remarkably ready to incorporate insights from measurement theory, testing practice, developmental psychology, and the history of science. Doing so will propel the strategies recommended in this book into widespread adoption and catalyze the emerging re-invention of capitalism. In this coming cultural revolution, intangible forms of capital will be brought to life in common currencies for the exchange of value, performing the same function that kilowatts, bushels, barrels, and hours perform for tangible forms of capital (Fisher, 2009, 2010).

Pretty big claim, you say? Yes, it is. Here’s how it’s going to work.

  • First, measurement leadership within organizations that implements policies and procedures that are context-sensitive, focused, integrated, and interactive (i.e., that have Spitzer’s keys in hand) will benefit from instruments calibrated to facilitate:
    • meaningful mapping of substantive, additive amounts of things measured on number lines;
    • data volume reductions on the order of 80-95% and more, with no loss of information;
    • organizational and individual learning trajectories defined by hierarchies of calibrated items;
    • measures that retain their meaning and values across changes in item content;
    • adapting instruments to people and organizations, instead of vice versa;
    • estimating the consistency, and the leniency or harshness, of ratings assigned by judges evaluating performance quality, with the ability to remove those effects from the performance measures made;
    • adjusting measurement precision to the needs of the task at hand, so that time and resources are not wasted in gathering too much or too little data; and
    • providing the high quality and uniform information needed for networked collective thinking able to keep pace with the demand for innovation.
  • Second, measurement leadership sensitive to the four keys across organizations, both within and across industries, will find value in:
    • establishing industry-wide metrological standards defining common metrics for the expression of the primary human, social, and natural capital constructs of interest;
    • lubricating the flow of human, social, and natural capital in efficient markets broadly defined so as to inform competitive pricing of intangible assets, products, and services; and
    • new opportunities for determining returns on investments in human, community, and environmental resource management.
  • Third, living companies need to be able to mature in a manner akin to human development over the lifespan. Theories of hierarchical complexity and developmental stage transitions that inform the rigorous measurement of cognitive and moral transformations (Dawson & Gabrielian, 2003) will increasingly find highly practical applications in organizational contexts.

Leadership of the kind described by Spitzer is needed not just to make measurement contextualized, focused, integrated, and interactive—and so productive at new levels of effectiveness—but to apply systematically the technical, financial, and social resources needed to realize the rich potentials he describes for the transformation of organizations and empowerment of individuals. Spitzer’s program surpasses the usual focus on centralized statistical analyses and reports to demand the organization-wide dissemination of calibrated instruments that measure in common metrics. The flexibility, convenience, and scientific rigor of instruments calibrated to measure in units that really add up fit the bill exactly. Here’s to putting tools that work in the hands of those who know what to do with them!

References

Andrich, D. (2005). Georg Rasch: Mathematician and statistician. In K. Kempf-Leonard (Ed.), Encyclopedia of Social Measurement (Vol. 3, pp. 299-306). Amsterdam: Academic Press, Inc.

Bezruczko, N. (Ed.). (2005). Rasch measurement in health sciences. Maple Grove, MN: JAM Press.

Bond, T., & Fox, C. (2007). Applying the Rasch model: Fundamental measurement in the human sciences, 2d edition. Mahwah, New Jersey: Lawrence Erlbaum Associates.

Dawson, T. L., & Gabrielian, S. (2003, June). Developing conceptions of authority and contract across the life-span: Two perspectives. Developmental Review, 23(2), 162-218.

Fisher, W. P., Jr. (2009, November). Invariance and traceability for measures of human, social, and natural capital: Theory and application. Measurement (Elsevier), 42(9), 1278-1287.

Fisher, W. P., Jr. (2010). Bringing human, social, and natural capital to life: Practical consequences and opportunities. Journal of Applied Measurement, 11, in press [Pre-press version available at http://www.livingcapitalmetrics.com/images/BringingHSN_FisherARMII.pdf].

Masters, G. N. (2007). Special issue: Programme for International Student Assessment (PISA). Journal of Applied Measurement, 8(3), 235-335.

Spitzer, D. (2007). Transforming performance measurement: Rethinking the way we measure and drive organizational success. New York: AMACOM.

Wilson, M. (2005). Constructing measures: An item response modeling approach. Mahwah, New Jersey: Lawrence Erlbaum Associates.


Protocols for Living Capital

December 23, 2009

David Brooks’ December 22, 2009 column, “The Protocol Society,” hits some really great notes. There are several things worth commenting on. The first point concerns the protection of intellectual property and the encouragement of a free flow of ideas within the overarching operating system of laws, regulations, and property rights. What Brooks is getting at here is the concept of living capital.

A diverse group of writers (Hayek, De Soto, Latour, many others) contrast what they variously term socialist, centralized, and prescientific efforts to control capital’s concrete forms, on the one hand, with capitalist, decentralized, and scientific methods that focus on liberating the flow of capital defined abstractly in terms of the rule of law and transferable representations (titles, deeds, calibrated instruments, etc.). These two senses of capital also apply in the context of intangibles like human, social, and natural capital (Fisher, 2002, 2005, 2009a, 2010).

Second, the movement in economics away from mathematical modeling echoes the broadening appreciation for qualitative methods that has been underway across the social sciences since the 1960s. The issue is one of learning how to integrate substantive concerns for meaningfulness and understanding into the ways we think about economics. The idealized rational consumer assumed in traditional mathematical models imposes a logic that is often not actually observed in practice.

But just because people may not behave in accord with one sense of rationality does not mean there is not a systematic logic employed in the ways they make decisions that are meaningful to them. Further, though few are yet much aware of this, mathematical models are not inherently irreconcilable with qualitative methods (Fisher, 2003a, 2003b; Heelan, 1998; Kisiel, 1973). Scientifically efficacious mathematical thinking has always had deep roots in qualitative, substantive meaning (Heilbron, 1993; Kuhn, 1961; Roche, 1998). Analogous integrations of qualitative and quantitative methods have been used in psychology, sociology, and education for decades (Bond & Fox, 2007; Fisher, 2004; Wilson, 2005; Wright, 1997, 2000).

Third, yes, those societies and subcultures that have the capacities for increasing the velocity of new recipes have measurably greater amounts of social capital than others. The identification of invariant patterns in social capital will eventually lead to the calibration of precision measures and the deployment of universally uniform metrics as common currencies for the exchange of social value (Fisher, 2002, 2005, 2009a, 2009b).

Fourth, though I haven’t read “Smart World,” the book by Richard Ogle that Brooks refers to, the theory of the extended mind embodied in social networks sounds highly indebted to the work of Bruno Latour (1987, 1995, 2005) and others working in the social studies of science (O’Connell, 1993) and in social psychology (Hutchins, 1995; Magnus, 2007). Brooks and Ogle are exactly right in their assertions about the kinds of collective cognition that are needed for real innovation. The devilish details are embedded in the infrastructure of metrological standards and uniform metrics that coordinate and harmonize thought and behavior. We won’t realize our potential for creativity in the domains of the intangible forms of capital and intellectual property until we get our act together and create a new metric system for them (Fisher, 2009a, 2009b, 2010). Every time someone iterates through the protocol exemplified in Brooks’ column, we get a step closer to this goal.

References

Bond, T., & Fox, C. (2007). Applying the Rasch model: Fundamental measurement in the human sciences, 2d edition. Mahwah, New Jersey: Lawrence Erlbaum Associates.

Fisher, W. P., Jr. (2000). Objectivity in psychosocial measurement: What, why, how. Journal of Outcome Measurement, 4(2), 527-563 [http://www.livingcapitalmetrics.com/images/WP_Fisher_Jr_2000.pdf].

Fisher, W. P., Jr. (2002, Spring). “The Mystery of Capital” and the human sciences. Rasch Measurement Transactions, 15(4), 854 [http://www.rasch.org/rmt/rmt154j.htm].

Fisher, W. P., Jr. (2003a, December). Mathematics, measurement, metaphor, metaphysics: Part I. Implications for method in postmodern science. Theory & Psychology, 13(6), 753-90.

Fisher, W. P., Jr. (2003b, December). Mathematics, measurement, metaphor, metaphysics: Part II. Accounting for Galileo’s “fateful omission.” Theory & Psychology, 13(6), 791-828.

Fisher, W. P., Jr. (2004, October). Meaning and method in the social sciences. Human Studies: A Journal for Philosophy and the Social Sciences, 27(4), 429-54.

Fisher, W. P., Jr. (2005). Daredevil barnstorming to the tipping point: New aspirations for the human sciences. Journal of Applied Measurement, 6(3), 173-9 [http://www.livingcapitalmetrics.com/images/FisherJAM05.pdf].

Fisher, W. P., Jr. (2009a, November). Invariance and traceability for measures of human, social, and natural capital: Theory and application. Measurement (Elsevier), 42(9), 1278-1287.

Fisher, W. P., Jr. (2009b). NIST critical national need idea white paper: Metrological infrastructure for human, social, and natural capital (Tech. Rep., http://www.livingcapitalmetrics.com/images/FisherNISTWhitePaper2.pdf). New Orleans: LivingCapitalMetrics.com.

Fisher, W. P., Jr. (2010). Bringing human, social, and natural capital to life: Practical consequences and opportunities. Journal of Applied Measurement, 11, in press [http://www.livingcapitalmetrics.com/images/BringingHSN_FisherARMII.pdf].

Heelan, P. A. (1998, June). The scope of hermeneutics in natural science. Studies in History and Philosophy of Science Part A, 29(2), 273-98.

Heilbron, J. L. (1993). Weighing imponderables and other quantitative science around 1800 (Vol. 24 (Supplement), Part I, pp. 1-337). Historical studies in the physical and biological sciences). Berkeley, California: University of California Press.

Hutchins, E. (1995). Cognition in the wild. Cambridge, Massachusetts: MIT Press.

Kisiel, T. (1973). The mathematical and the hermeneutical: On Heidegger’s notion of the apriori. In E. G. Ballard & C. E. Scott (Eds.), Martin Heidegger: In Europe and America (pp. 109-20). The Hague: Martinus Nijhoff.

Kuhn, T. S. (1961). The function of measurement in modern physical science. Isis, 52(168), 161-193. (Rpt. in T. S. Kuhn, (Ed.). (1977). The essential tension: Selected studies in scientific tradition and change (pp. 178-224). Chicago: University of Chicago Press.)

Latour, B. (1987). Science in action: How to follow scientists and engineers through society. New York: Cambridge University Press.

Latour, B. (1995). Cogito ergo sumus! Or psychology swept inside out by the fresh air of the upper deck: Review of Hutchins’ Cognition in the Wild, MIT Press, 1995. Mind, Culture, and Activity: An International Journal, 3(192), 54-63.

Latour, B. (2005). Reassembling the social: An introduction to Actor-Network-Theory. (Clarendon Lectures in Management Studies). Oxford, England: Oxford University Press.

Magnus, P. D. (2007). Distributed cognition and the task of science. Social Studies of Science, 37(2), 297-310.

O’Connell, J. (1993). Metrology: The creation of universality by the circulation of particulars. Social Studies of Science, 23, 129-173.

Roche, J. (1998). The mathematics of measurement: A critical history. London: The Athlone Press.

Wilson, M. (2005). Constructing measures: An item response modeling approach. Mahwah, New Jersey: Lawrence Erlbaum Associates.

Wright, B. D. (1997, Winter). A history of social science measurement. Educational Measurement: Issues and Practice, 16(4), 33-45, 52 [http://www.rasch.org/memo62.htm].

Wright, B. D., Stone, M., & Enos, M. (2000). The evolution of meaning in practice. Rasch Measurement Transactions, 14(1), 736 [http://www.rasch.org/rmt/rmt141g.htm].


Information and Leadership: New Opportunities for Advancing Strategy, Engaging Customers, and Motivating Employees

December 9, 2009

Or, What’s a Mathematical Model a Model Of, After All?
Or, How to Build Scale Models of Organizations and Use Them to Learn About Organizational Identity, Purpose, and Mission

William P. Fisher, Jr., Ph.D.

The greatest opportunity and most significant challenge to leadership in every area of life today is the management of information. So says Carol Bartz, CEO of Yahoo!, in her entry in The Economist’s annual overview of world events, “The World in 2010.” Information can be both a blessing and a curse. The right information in the right hands at the right time is essential to effectiveness and efficiency. But unorganized and incoherent information can be worse than none at all. Too often, leaders and managers are faced with deciding between gut instincts based in unaccountable intuitions and facts that are potentially seriously flawed, or that are merely presented in such overwhelming volumes as to be useless.

This situation is only going to get worse as information volumes continue to increase. The upside is that solutions exist, solutions that not only reduce data volume by factors as high as hundreds to one with no loss of information, but which also distinguish between merely apparent and really reliable information. What we have in these solutions are the means of following through on Carol Bartz’s information leadership warnings and recommendations.

Clearly communicating what matters, for instance, requires leaders to find meaning in new facts and the changing scene. They have to be able to use their vision of the organization, its mission, and its place in the world to tell what’s important and what isn’t, to put each event or opportunity in perspective. What’s more, the vision of the organization itself has to be dynamic, able to change with changing circumstances.

And this is where a whole new class of useful information solutions comes to bear. It may seem odd to say so, but leadership is fundamentally mathematical. You can begin to get a sense of what I mean in the ambiguity of the way leaders can be calculating. Making use of people’s skills and talents is a challenge that requires being able to assess facts and potentials in a way that intuitively gauges likelihoods of success. It is possible to lead, of course, without being manipulative; the point is that leadership requires an ability to envision and project an abstract heuristic ideal as a fundamental principle for focusing attention and separating the wheat from the chaff. A leader who dithers and wastes time and resources on irrelevancies is a contradiction in terms. An organization is supposed to have an identity, a purpose, and a mission in life independent of the local particulars of who its actual employees, customers, and suppliers are, and independent of the opportunities and challenges that arise in different times and places.

Of course, every organization is colored and shaped to some extent by every person who comes into contact with it, and by the times and places it finds itself in. No one wants to feel like an interchangeable part in a machine, but neither does anyone want to feel completely out of place, with no role to play. If an organization were entirely dependent on the particulars of who, what, when, and where, its status as a coherent organization with an identifiable presence would be compromised. So what we need is to find the right balance between the ideal and the real, the abstract and the concrete, and, as the philosopher Paul Ricoeur put it, between belonging and distanciation.

And indeed, scientists often note that no mathematical model ever holds in every detail in the real world. That isn’t what they’re intended to do, in fact. Mathematical models serve the purpose of being guides to creating meaningful, useful relationships. One of the leading lights of measurement theory, Georg Rasch, said it well over 50 years ago: models aren’t meant to be true, but to be useful.

Rasch accordingly also pointed out that, if we measure mass, force, and acceleration with enough precision, we see that even Newton’s laws of motion are not perfectly true. Measured to the nth decimal place, what we find is that observed amounts of mass, force, and acceleration form probability distributions that do indeed satisfy Newton’s laws. Even in classical physics, then, measurement models are best conceived probabilistically.
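Rasch’s point can be illustrated with a small simulation (my sketch, not from the original post; the mass and acceleration values are hypothetical): repeated force readings scatter around the value Newton’s second law predicts, so the law holds for the distribution rather than for any single measurement.

```python
import random

random.seed(42)

def observe_force(mass, acceleration, noise_sd=0.05, n=10000):
    """Simulate repeated force readings with small Gaussian measurement error."""
    true_force = mass * acceleration  # Newton's second law, F = m * a
    return [random.gauss(true_force, noise_sd) for _ in range(n)]

# Hypothetical mass (kg) and acceleration (m/s^2), chosen only for illustration.
readings = observe_force(mass=2.0, acceleration=9.8)
mean_force = sum(readings) / len(readings)

# No single reading equals 19.6 N exactly, but the distribution centers there.
print(round(mean_force, 1))
```

Measured to enough decimal places, every reading diverges from 19.6; it is the probability distribution, not the individual observation, that satisfies the law.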

Over the last several decades, use of Rasch’s probabilistic measurement models in scaling tests, surveys, and assessments has grown exponentially. As has been explored at length in previous posts in this blog, most applications of Rasch’s models mistakenly treat them as statistical models, and so their real value and importance are missed. But even those actively engaged in using the models appropriately often do not engage with the basic question of what, in their particular application, the model is a model of. The basic assumption seems to be that the model is a mathematical representation of relations between observations recorded in a data set, but this is an extremely narrow and unproductive point of view.

Let’s ask ourselves, instead, how we would model an organization. Why would we want to do that? We would want to do that for the same reasons we model anything, such as creating a safe and efficient way of experimenting with different configurations, and of coming to new understandings of basic principles. If we had a standard model of organizations of a certain type, or of organizations in a particular industry, we could use it to see how different variations on the basic structure and processes cause or are associated with different outcomes. Further, given that such models could be used to calibrate scales meaningfully measuring organizational development, industry-wide standards could be brought to bear in policy, decision making, and education, effecting new degrees of efficiency and effectiveness.

So, we’d previously said that the extent to which an organization finds its identity, realizes its purpose, and advances its mission (i.e., develops) is, within certain limits, a function of its capacity to be independent from local particulars. What we mean by this is that we expect employees to be able to perform their jobs no matter what day of the week it is, no matter who the customer is, no matter which particular instance of a product is involved, etc. Though no amount of skill, training, or experience can prepare someone for every possible contingency, people working in a given job description prepare themselves for a certain set of tasks, and are chosen by the organization for their capacities in that regard.

Similarly, we expect policies, job descriptions, work flows, etc. to function in similar fashions. Though the exact specifics of each employee’s abilities and each situation’s demands cannot be known in advance, enough is known that the defined aims will be achieved with high degrees of success. Of course, this is the point at which the interchangeability of employee ability and task difficulty can become demeaning and alienating. It will be important that we allow room for some creative play, and situate each level of ability along a continuum that allows everyone to see a developmental trajectory personalized to their particular strengths and needs.

So, how do we mathematically model the independence of the organization from its employees, policies, customers, and challenges, and scientifically evaluate that independence?

One way to begin is to posit that organizational development is a function of the differences between the abilities of the people employed, the efficiencies of the policies, alignments, and linkages implemented, and the challenges presented by the market. If we observe the abilities, efficiencies, and challenges by means of a rating scale, the resulting model could be written as:

ln(P_moas / (1 - P_moas)) = b_m - f_o - c_a - r_s

which hypothesizes that the natural logarithm of the response odds (the response probabilities divided by one minus themselves) is equal to the ability b of employee m, minus the efficiency f of policy o, minus the challenge c of market a, minus the difficulty r of obtaining a rating in category s. This model has the form of a multifaceted Rasch model (Linacre, 1989, among others), used in academic research, rehabilitative functional assessments, and medical licensure testing.
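To make the arithmetic concrete, here is a minimal sketch of that model in code; the parameter values are hypothetical, chosen only to show how the logit converts to a response probability.

```python
import math

def response_probability(b_m, f_o, c_a, r_s):
    """Probability of the rating, per ln(P/(1-P)) = b_m - f_o - c_a - r_s."""
    logit = b_m - f_o - c_a - r_s
    return 1.0 / (1.0 + math.exp(-logit))

# When employee ability exactly balances the combined policy efficiency,
# market challenge, and rating-category difficulty, the odds are even:
p_even = response_probability(b_m=1.5, f_o=0.5, c_a=0.5, r_s=0.5)
print(p_even)  # 0.5

# A more able employee facing the same policy, market, and category
# has a correspondingly higher probability:
p_high = response_probability(b_m=3.0, f_o=0.5, c_a=0.5, r_s=0.5)
print(round(p_high, 3))  # 0.818
```

Because each parameter enters the logit additively, each can in principle be estimated independently of the particular values of the others, which is exactly the independence the following questions probe.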

What does it take for each of these model parameters to be independent of the others in the manner that we take for granted in actual practice? Can we frame our observations of the members of each facet in the model in ways that will clearly show us when we have failed to obtain the desired independence? Can we do that in a way that simultaneously provides us with a means for communicating information about individual employees, policies, and challenges efficiently in a common language?

Can that common language be expressed in words and numbers that capitalize on the independence of the model parameters and so mean the same thing across local particulars? Can we set up a system for checking and maintaining the meaning of the parameters over time? Can we build measures of employee abilities, policy efficiencies, and market challenges into our information systems in useful ways? Can we improve the overall quality, efficiency, and meaningfulness of our industry by collaborating with other firms, schools, non-profits, and government agencies in the development of reference standard metrics?

These questions all have the same answer: Yes, we can. These questions set the stage for understanding how effective leadership depends on effective information management. If, as Yahoo! CEO Bartz says, leadership has become more difficult in the age of blogospherical second-guessing and “opposition research,” why not tap all of that critical energy as a resource and put it to work figuring out what differences make a difference? If critics think they have important questions that need to be answered, the independence and consistency, or lack thereof, of their and others’ responses gives real heft to a “put-up-or-shut-up” criterion for distinguishing signal from noise.

This kind of a BS-detector supports leadership in two ways, by focusing attention on meaningful information, and by highlighting significant divergences from accepted opinion. The latter might turn out to be nothing more than exceptionally loud noise, but it might also signal something very important, a contrary opinion sensitive to special information available only from a particular perspective.
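One conventional way to implement such a detector, sketched here under a dichotomous simplification of the model above (the logit values are hypothetical), is to standardize the difference between each observed response and its model expectation; residuals far from zero flag the divergences worth investigating.

```python
import math

def standardized_residual(observed, logit):
    """Compare an observed 0/1 response with its Rasch model expectation."""
    expected = 1.0 / (1.0 + math.exp(-logit))  # model probability of success
    variance = expected * (1.0 - expected)     # binomial variance of the response
    return (observed - expected) / math.sqrt(variance)

# An expected success barely registers...
print(round(standardized_residual(observed=1, logit=2.0), 2))   # 0.37
# ...while a success against long odds stands out as a signal to examine.
print(round(standardized_residual(observed=1, logit=-2.0), 2))  # 2.72
```

The first case is noise to be ignored; the second is either exceptionally loud noise or the contrary signal worth listening to.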

Bartz is right on, then, in saying that the central role of information in leadership has made listening and mentoring more important than ever. Modeling the organization and experimenting with it makes it possible to listen and mentor in completely new ways. Testing data for independent model parameters is akin to tuning the organization like an instrument. When independence is achieved, everything harmonizes. The path forward is clear, since the ratings delineate the range in which organizational performance consistently varies.

Variation in the measures is illustrated by the hierarchy of the policy and market items rated, which take positions in their distributions showing what consistently comes first, and what precedents have to be set for later challenges to be met successfully. By demanding that the model parameters be independent of one another, we have set ourselves up to learn something from the past that can be used to predict the future.

Further and quite importantly, as experience is repeatedly related to these quantitatively-scaled hierarchies, the factors that make policies and challenges take particular positions on the ruler come to be understood, theory is refined, and leadership gains an edge. Now, it is becoming possible to predict where new policies and challenges will fall on the measurement continuum, making it possible for more rapid responses and earlier anticipations of previously unseen opportunities.

It’s a different story, though, when dependencies emerge, as when one or more employees in a particular area unexpectedly disagree with otherwise broadly accepted policy efficiencies or market challenges, or when a particular policy provokes anomalous evaluations relative to some market challenges but not others. There’s a qualitatively different kind of learning that takes place when expectations are refuted. Instead of getting an answer to the question we asked, we got an answer to one we didn’t ask.
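Such unexpected answers can be flagged statistically. Standardized residuals and mean-square ("outfit") statistics are standard Rasch fit diagnostics; the sketch below is illustrative only — the simulated data, the planted contrarian, the flagging threshold, and the use of known rather than estimated model probabilities are all simplifying assumptions of mine.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate Rasch-consistent ratings, then plant one contrarian whose
# answers always run against the broadly accepted expectation.
n_persons, n_items = 200, 12
theta = rng.normal(0, 1, n_persons)      # person measures
theta[0] = 0.0                           # put the contrarian mid-scale
b = np.linspace(-2, 2, n_items)          # item (policy/challenge) positions
p = 1 / (1 + np.exp(-(theta[:, None] - b[None, :])))
x = (rng.random((n_persons, n_items)) < p).astype(int)
x[0] = (p[0] < 0.5).astype(int)          # always the less likely response

# Standardized residuals of observation vs. model expectation; the
# mean-square "outfit" statistic flags answers to questions we didn't ask.
z = (x - p) / np.sqrt(p * (1 - p))
outfit = (z ** 2).mean(axis=1)           # per-person mean-square fit
flagged = np.where(outfit > 3.0)[0]
print("respondents flagged as anomalous:", flagged)
```

The flagged respondent is exactly the one whose disagreement with the consensus deserves a follow-up question rather than dismissal as noise.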

It might just be noise or error, but it is imperative to ask and find out what question the unexpected answer responds to. Routine management thrives on learning how to ever more efficiently predict quantitative results; its polar opposite, innovation, lives on the mystery of unexpected anomalies. If someone hadn’t been able to wonder what value hardened rubber left on a stove might have, what might have killed bacteria in a petri dish, or why an experimental effect disappeared when a lead plate was moved, vulcanized tires, penicillin, and X-ray devices might never have come about.

We are on the cusp of the information analogues of these ground-breaking innovations. Methods of integrating rigorously scientific quantities with qualitative creative grist clarify information in previously unimagined ways, and in so doing make it more leverageable than ever before for advancing strategy, engaging customers, and motivating employees.

The only thing in Carol Bartz’s article that I might take issue with comes in the first line, with the words “will be.” The truth is that information already is our greatest opportunity.

Creative Commons License
LivingCapitalMetrics Blog by William P. Fisher, Jr., Ph.D. is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 United States License.
Based on a work at livingcapitalmetrics.wordpress.com.
Permissions beyond the scope of this license may be available at http://www.livingcapitalmetrics.com.

Contrasting Network Communities: Transparent, Efficient, and Invested vs Not

November 30, 2009

Different networks and different communities have different amounts of social capital going for them. As was originally described by Putnam (1993), some networks are organized hierarchically in a command-and-control structure. The top layers here are the autocrats, nobility, or bosses who run the show. Rigid conformity is the name of the game to get by. Those in power can make or break anyone. Market transactions in this context are characterized by the thumb on the scale, the bribe, and the kickback. Everyone is watching out for themselves.

At the opposite extreme are horizontal networks characterized by altruism and a sense that doing what’s good for everyone will eventually come back around to be good for me. The ideal here is a republic in which the law rules and everyone has the same price of entry into the market.

What I’d like to focus on is what’s going on in these horizontal networks. What makes one a more tightly-knit community than another? The closeness people feel should not be oppressive or claustrophobic or smothering. I’m thinking of community relations in which people feel safe, not just personally but creatively. How and when are diversity, dissent, and innovation not just tolerated but celebrated? What makes it possible for a market in new ideas and new ways of doing things to take off?

And how does a community like this differ from another one that is just as horizontally structured but that does not give rise to anything at all creative?

The answers to all of these questions seem to me to hinge on the transparency, efficiency, and volume of investments in the relationships making up the networks. What kinds of investments? All kinds: emotional, social, intellectual, financial, spiritual, etc. Opaque, inefficient, and low-volume investments lack the thickness and complexity of relationships that we can see through, that are well lubricated, and that are reinforced with frequent visits.

Putnam (1993, p. 183) has a very illuminating way of putting this: “The harmonies of a choral society illustrate how voluntary collaboration can create value that no individual, no matter how wealthy, no matter how wily, could produce alone.” Social capital is the coordination of thought and behavior that embodies trust, good will, and loyalty. Social capital is at play when an individual can rely on a thickly elaborated network of largely unknown others who provide clean water, nutritious food, effective public health practices (sanitation, restaurant inspections, and sewers), fire and police protection, a fair and just judiciary, electrical and information technology, affordably priced consumer goods, medical care, and who ensure the future by educating the next generation.

Life would be incredibly difficult if we could not trust others to obey traffic laws, or to do their jobs without taking unfair advantage of access to special knowledge (credit card numbers, cash, inside information), etc. But beyond that, we gain huge efficiencies in our lives because of the way our thoughts and behaviors are harmonized and coordinated on mass scales. We just simply do not have to worry about millions of things that are being taken care of, things that would completely freeze us in our tracks if they weren’t being done.

Thus, later on the same page, Putnam also observes that, “For political stability, for government effectiveness, and even for economic progress social capital may be even more important than physical or human capital.” And so, he says, “Where norms and networks of civic engagement are lacking, the outlook for collective action appears bleak.”

But what if two communities have identical norms and networks, but they differ in one crucial way: one relies on everyday language, used in conversations and written messages, to get things done, and the other has a new language, one with a heightened capacity for transparent meaningfulness and precision efficiency? Which one is likely to be more creative and innovative?

The question can be re-expressed in terms of Gladwell’s (2000) sense of the factors contributing to reaching a tipping point: the mavens, connectors, salespeople, and the stickiness of the messages. What if the mavens in two communities are equally knowledgeable, the connectors just as interconnected, and the salespeople just as persuasive, but messages are dramatically less sticky in one community than the other? In one network of networks, saying things once gets the right response 99% of the time, but in the other things have to be repeated seven times before the right response comes back even 50% of the time, and hardly anyone makes the effort to repeat things that many times. Guess which community will be safer, more creative, and thriving?
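A back-of-envelope calculation makes the contrast concrete. Using the figures from the thought experiment above (99% first-time success in one community; a 50% success rate only after seven repetitions in the other), and assuming for simplicity that each repetition succeeds independently:

```python
# Community A: one message gets the right response 99% of the time.
q_a = 0.99

# Community B: seven repetitions yield only a 50% chance of success.
# If each repetition independently succeeds with probability q_b, then
# 1 - (1 - q_b)**7 = 0.5, so q_b = 1 - 0.5**(1/7).
q_b = 1 - 0.5 ** (1 / 7)

# Expected messages until the right response (geometric distribution: 1/q).
cost_a = 1 / q_a
cost_b = 1 / q_b
print(f"per-message success: A={q_a:.3f}, B={q_b:.3f}")
print(f"expected messages per successful exchange: A={cost_a:.2f}, B={cost_b:.1f}")
```

The low-stickiness community spends roughly ten messages for every successful exchange where the high-stickiness community spends one — a tenfold tax on every act of coordination.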

All of this, of course, is just another way to bring out the importance of improved measurement for improving network quality and community life. As Surowiecki put it in The Wisdom of Crowds, the SARS virus was sequenced in a matter of weeks by a network of labs sharing common technical standards; without those standards, it would have taken any one of them far longer to do the same job alone. The messages these labs sent back and forth had an elevated stickiness index because they were more transparently and efficiently codified than messages were back in the days before the technical standards were created.

So the question emerges, given the means to create common languages with enhanced stickiness properties, such as we have in advanced measurement models, what kinds of creativity and innovation can we expect when these languages are introduced in the domains of human, social, and natural capital markets? That is the question of the age, it seems to me…

Gladwell, M. (2000). The tipping point: How little things can make a big difference. Boston: Little, Brown, and Company.

Putnam, R. D. (1993). Making democracy work: Civic traditions in modern Italy. Princeton, New Jersey: Princeton University Press.

Surowiecki, J. (2004). The wisdom of crowds: Why the many are smarter than the few and how collective wisdom shapes business, economies, societies and nations. New York: Doubleday.


Al Gore: Marshalling the Collective Will is NOT the Problem–The Problem is the Problem!

November 22, 2009

In his new book, former vice-president Al Gore says we have in hand all the tools we need to solve the climate change crises, except the collective will to do anything about them. I respectfully beg to differ. Finding the will is not the problem. We already have it, and we have it in volumes sufficient to the task. Gore is also wrong in claiming we have the tools we need. There are entire classes of scientific and economic tools that we are missing. It is because we lack the right tools that we are unable to focus and channel our will for solutions.

The short version of my argument is that we don’t have scientific, universally uniform, and ubiquitously used metrics for measuring overall environmental quality. Because we don’t have the measures, we can’t and don’t effectively and efficiently manage our natural capital and environmental assets. Without metrics akin to barrels of oil or bushels of grain, we don’t have markets for matching environmental quality supply with demand for it.

Without tools as essential as metrics and markets, we can’t harness our existing will to improve our relationship with the earth. What will do we have, you might ask? Our collective will is expressed in the profit motive. What we need to do is set up metrics and markets to harness the energy of the profit motive. We need to create systems for trading natural capital (and human and social capital) so that we generate real wealth and drive happiness indexes north by realizing human potential, building thriving communities, and nurturing sustainable environments. The profit motive is not our enemy. It is the source of energy we need to deal with the multiple crises we face: human, social, and environmental.

Now for the long version of my argument. The problem is the problem. We restrict our options for solving problems by the way we frame the issue. Einstein supposedly pointed out that big problems, ones framed at a level where they define the entire paradigmatic orientation to a class of smaller, solvable problems, cannot be solved from within the paradigm they emerge from. We tend to define problems from the modern point of view, in a Cartesian fashion, from the point of view of a subject that is separate from, and in no way involved in the construction of, the objects it encounters. What I want to point out is that it is this Cartesian orientation to problem definition that is itself the problem!

Set aside your opinions on the basic issues concerning climate change, and think about what’s going on. It is undeniable that human activities are implicated in changes to the environment, and that we have to learn to manage our effects on the planet, or they will feed back on us in potentially harmful ways. This is the nature of life in the flux and flow of ecological relationships. It is one of many ways in which observers are inherently implicated in constructing what is observed, which is recognized as holding true as much in physics as in anthropology. These are uncontroversial facts, quite apart from any concern with climate change.

And what these feedback loops imply, as has indeed already been pointed out by generations of scholars and thinkers, is that there is no such thing as a pure Cartesian subject separate from its objects. We shape the things in our world, and those things, in turn, shape us. Subjects and objects are mutually implicated. All observers are participant observers. It is inevitable that what we do and think will change the world, and the new world will require us to think and act differently.

The plethora of environmental crises we face is therefore situated in a new, non-Cartesian paradigm. It is a fundamental error of the first order to approach a non-Cartesian problem as though it were merely another variation on the usual kind of thing that can be addressed fairly well from the Cartesian dualist perspective. When we think, as Al Gore does, that we should be socialistically organizing resources for a centrally administered five-year plan of attack on environmental problems, we are missing the point.

This approach can be put to work only in terms of an authoritarian form of control directed by a dictatorial panel of experts, a military junta, or a self-appointed czar. Framed from a Cartesian point of view, no democratic process will ever compel voters to do what needs to be done. As was illustrated so dramatically by the fall of Communism, the socialistic manipulation of the concrete particulars of human, social, and environmental problems is unsustainable and socially irresponsible.

The fact is that non-Cartesian problems are only made worse when we try to solve them with Cartesian solutions. This is why non-Cartesian problems are often described by philosophers as “hermeneutic,” a word that derives from the name of the Greek god Hermes, known by the ancient Romans as Mercury. Like liquid mercury, non-Cartesian problems merely split and multiply when we grasp at them clumsily, ignoring our own involvement in the creation of the problem.

So we can go on trying to herd cats or nail jello to the wall, but to be part of the solution and not just another way of being part of the problem, we need to set up systems of thought and behavior that are not internally inconsistent and self-contradictory. No matter what we do, if we keep on marshalling resources to attack problems in deliberate and systematic ignorance of this cross-paradigmatic dissonance, we can only make matters worse.

What else can be done? Just what does it mean to go with the flow of the mutual implication of subject and object? How can we explicitly model the problem to include the participant observer?

“The medium is the message,” to quote Marshall McLuhan. As was pointed out so humorously by Woody Allen in his film, “Annie Hall,” this expression is often repeated and often misunderstood. Though all can see that the news and entertainment media are ubiquitous, the meaning of our captivation with the media of creative expression has not yet been clarified sufficiently well for generalized understanding.

Significant advances have occurred in recent years, however. The media we are captivated by define and limit not only how and what we communicate, but who and what we have been, are, and could be. Depending on the quality of their transparency and of the biases that color them, media convey moral, human, and economic values of various kinds. The media through which we express values include every conceivable technology, from alphabets and phonemes to buildings, clothing, and food preparation, to musical instruments, and the creations of art and science.

Media are at the crux of the lesson we have to learn if we are to frame the problems of environmental management so that we are living solutions, not exacerbating problems. Media of all kinds, from pen and paper to television to the Internet, are fundamentally technical. In fact, media are the original technologies. The words “text,” “textile,” and “technique” all trace back, through the Greek “techne” and the Latin “texere,” to a common Indo-European root meaning to make or weave. Technology is our primary medium of shared meaning. Technology embodies the meanings we create and distributes their values across society and around the world.

What we need to do to effect non-Cartesian solutions then is to dwell deeply with our shared meanings and values, and find new ways of living them out, ways that embody the unity of subject and object, problem and solution. Nice rhetoric, you might say, but what does it mean? What is its practical consequence?

Put in academic terms, the pragmatic issue concerns the nature of technology and how it provides measures of reality serving as the media through which we experience the world in terms of shared universals. Primary sources here include the works of writers like Latour, Wise, Jasanoff, Knorr-Cetina, Schaffer, Ihde, Heidegger, and others cited in previous posts in this blog, and in my published work.

To cut to the chase, we can start to think of language and technology as embodying problem-solution unities. Words and tools are situated within ecologies of relationships that define their meanings and functions. We need to be more sensitive to the way meanings and values become embodied in language and technologies, and then are distributed across far-flung networks to coordinate collectively harmonized thought and action.

To get right down to where this all is leading, though it is probably far from obvious, the appropriate non-Cartesian orientation to the problems of environmental management raised in Al Gore’s new book culminates in the creation of the technical networks through which we distribute measures of what we want to manage. These networks comprise the ecologies of meaning and values that we inhabit. Not coincidentally, they also create the markets in which human, social, and natural capital can be efficiently and effectively traded.

When these networks and markets are created, finding the collective will to deal with the environmental challenges we face will be the least of our problems. The profit motive is an exceptionally strong force. What we ought to be doing is figuring out how to harness it as the engine of social change. This contrasts diametrically with Al Gore’s perspective, which treats the profit motive as part of the problem.

Technical networks of instruments traceable to reference standards, and markets for the exchange of the values measured by those instruments, are what we ought to be focusing on. The previous post in this blog proposes an Intangible Assets Metric System, and is related to earlier posts on the role of common currencies for the exchange of meaningful quantitative values in creating functional markets for human, social, and natural capital. What we need are these infrastructural supports for creating the efficient markets in which demand for environmental solutions can be matched with the supply of those solutions. The failure of socialism is testimony to the futility of trying to man-handle our way forward by brute force.

Of course, I will continue living out my life’s mission and passion by continuing to elaborate variations, explanations, and demonstrations of how this could be so….
