
Normalization process theory is a sociological theory of the implementation, embedding, and integration of new technologies and organizational innovations developed by Carl R. May, Tracy Finch, and colleagues. The theory is a contribution to the field of science and technology studies (STS), and is the result of a programme of theory building by May and his co-researchers. Through three iterations, the theory has built upon the Normalization Process Model previously developed by May et al. to explain the social processes that lead to the routine embedding of innovative health technologies.

Normalization Process Theory focuses attention on agentic contributions – the things that individuals and groups do to operationalize new or modified modes of practice as they interact with dynamic elements of their environments. It defines implementation, embedding, and integration as a process that occurs when participants deliberately initiate, and seek to sustain, a sequence of events that brings a new practice into operation. The dynamics of implementation processes are complex, but Normalization Process Theory facilitates understanding by focusing attention on the mechanisms through which participants invest in and contribute to those processes. It reveals

“the work that actors do as they engage with some ensemble of activities (that may include new or changed ways of thinking, acting, and organizing) and by which means it becomes routinely embedded in the matrices of already existing, socially patterned, knowledge and practices.” In a paper published under a Creative Commons license, May and colleagues describe how, since 2006, NPT has undergone three iterations. These have explored, in turn, objects, agents, and contexts.

 * 1) Objects. The first iteration of the theory focused attention on the relationship between the properties of a complex healthcare intervention and the Collective Action of its users. Here, agents’ contributions are made in reciprocal relationship with the emergent capability that they find in the objects – the ensembles of behavioural and cognitive practices – that they enact. These socio-material capabilities are governed by the possibilities and constraints presented by objects, and by the extent to which the objects can be made workable and integrated in practice as they are mobilized.
 * 2) Agents. The second iteration of the theory built on the analysis of Collective Action, and showed how this was linked to the mechanisms through which people make their activities meaningful and build commitments to them. Here, investments of social structural and social cognitive resources are expressed as emergent contributions to social action through a set of generative mechanisms: coherence (what people do to make sense of objects, agency, and contexts); cognitive participation (what people do to initiate and become enrolled in delivering an ensemble of practices); collective action (what people do to enact those practices); and reflexive monitoring (what people do to appraise the consequences of their contributions). These constructs are the core of the theory, and provide the foundation of its analytic purchase on practice.
 * 3) Contexts. The third iteration of the theory developed the analysis of agentic contributions by offering an account of the centrally important structural and cognitive resources on which agents draw as they take action. Here, dynamic elements of social contexts are experienced by agents as capacity (the social structural resources that they possess, including informational and material resources, and social norms and roles) and potential (the social cognitive resources that they possess, including knowledge and beliefs, and individual intentions and shared commitments). These resources are mobilized by agents when they invest in the ensembles of practices that are the objects of implementation.

Normalization process theory is a middle-range theory located within the 'turn to materiality' in STS. It therefore fits well with the case-study oriented approach to empirical investigation used in STS. It also offers a straightforward alternative to actor–network theory, in that it does not insist on the agency of non-human actors, and seeks to be explanatory rather than descriptive. However, because Normalization Process Theory specifies a set of generative mechanisms that empirical investigation has shown to be relevant to the implementation and integration of new technologies, it can also be used in larger-scale structured and comparative studies. Although it fits well with the interpretive approach of ethnography and other qualitative research methods, it also lends itself to systematic review and survey research methods. As a middle-range theory, it can be federated with other theories to explain empirical phenomena. It is compatible with theories of the transmission and organization of innovations, especially diffusion of innovations theory, labor process theory, and psychological theories including the theory of planned behavior and social learning theory.