User:Etchacan/sandbox

PARADIGM DEVELOPMENTS

Definitions. I would like to improve the existing 'Definitions' section of the page on Economics; this is my first attempt at contributing, and I don't want to foul up anyone else's ongoing efforts. I would like to introduce briefly the following points: (i) The contributions of authors from JS Mill right through to Hausman and Skidelsky who are responsible for the foundationist strategy used in building economic theory. In this, economists seek unchallengeable foundational truths, on which they construct theory using logic and maths, so that the whole should be pretty solid. (I realize that not all philosophers of science are enthusiastic about foundationism.) (ii) The contribution of various authors (notably Robbins) to the exclusion of uncertainty (sensu stricto) from economics. (This would need to flag up the issue of the 'Dutch Book' argument against treating uncertainty differently from risk.) This material would also cover the issue of fuzzy data (qualitative/vague) in economics. ... The reason for proposing to cover these two points is that they would enable some of the more recent material on the economics of complex systems (recent books/papers by Potts, and by Smith) to be linked in. This is something readers might reasonably look for enlightenment (?!) on, in the light of the current turbulence in economic affairs, which seems to lie largely outside the scope of mainstream economics. It would also make sense to include Austrian economics as another way of defining economics in this revised section; it would fit in fairly comfortably, because it is connected to the foundationism issue. ... Any comments on this proposal? Etchacan (talk) 15:05, 24 May 2012 (UTC)

...............

Paradigms
All inquiring systems with which we investigate the world necessarily start with a pre-theoretical set of ideas about the nature of the entities involved, how they interact, what the significant questions to be addressed are, and what the proper methods of investigation are. The last of these largely defines what is - and what is not - evidence of a significant anomaly. These ideas tend to come from the wider intellectual environment in which the discipline is embedded. As the discipline's stock of knowledge increases, ... fact nets ... hooks ...

The Historical Sources
Jeremy Bentham ... utilitarianism ... well-being as unidimensional, uncertainty-free

JS Mill reinforced these, elaborated utilitarianism (eg, higher pleasures, justice as a specially important kind of utility) ... also worked on methodology ... method a priori [quote ] ... inconsistency re observational evidence ... with the exception of Mark Blaug, falsificationist, subsequent writers in economics have followed this route, retained these two qualities ... significance for the efficacy of mainstream

Methodenstreit is a German term (lit. 'debate over methods') referring to an intellectual controversy or debate over epistemology, research methodology, or the way in which academic inquiry is framed or pursued. More specifically, it also refers to a particular controversy over the method and epistemological character of economics carried on in the late 1880s and early 1890s between the supporters of the Austrian School of Economics, led by Carl Menger, and the proponents of the (German) Historical School, led by Gustav von Schmoller. To distinguish it from other similar disputes, German speakers sometimes specify it as the Methodenstreit der Nationalökonomie (Methodenstreit of economics), but outside of German-speaking countries, the Germanism Methodenstreit mostly refers to this one. On an intellectual level, the Methodenstreit was a question of whether there could be a science, apart from history, which could explain the dynamics of human action. Politically there were overtones of a conflict between the minarchism of the early Austrian School and the welfare state advocated by the Historical School.

The theory of indifference curves was developed by Francis Ysidro Edgeworth, who explained in his book "Mathematical Psychics: An Essay on the Application of Mathematics to the Moral Sciences", 1881[3], the mathematics needed for their drawing; later on, Vilfredo Pareto was the first author to actually draw these curves, in his book "Manual of Political Economy", 1906[4][5]; others followed in the first part of the 20th century. The theory can be derived from William Stanley Jevons's ordinal utility theory, which posits that individuals can always rank any consumption bundles by order of preference[6].
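For concreteness, a minimal worked illustration (ours, using an assumed Cobb-Douglas utility function rather than anything taken from Edgeworth or Pareto): an indifference curve is simply the set of bundles the consumer ranks as equally good, i.e. a level set of the utility function,

u(x, y) = x^{\alpha} y^{1-\alpha}, \quad 0 < \alpha < 1; \qquad \{(x, y) : u(x, y) = \bar{u}\}, \quad \text{i.e.} \quad y = \left( \bar{u}\, x^{-\alpha} \right)^{1/(1-\alpha)} .

Because only the ranking matters, any strictly increasing transformation of u (for example \log u) generates exactly the same family of curves; that is what makes the utility 'ordinal'.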

However, Alfred Marshall [3] took Mill’s restriction of the economic domain a step further. He acknowledged that real humans are obviously not Homo economicus, but thought that in certain realms that model would be a good approximation. He believed that, in trade and war specifically, the forces pushing people toward what he evidently considered to be the universally winning strategy of ‘economically rational’ chronomyopic individualism would dominate human behaviour. More importantly, he abandoned some features of Mill’s definition, particularly its potential flexibility in relation to institutional, regulatory, and distributional issues; in many ways, he marks the divide between political economy and economics in the modern sense.

That sense is defined in Lionel Robbins’ 1935 essay, The Nature and Significance of Economic Science [4], where he defined economics as the science of constrained choice, primarily concerned with how scarce resources are allocated to competing alternatives. This definition represents a drastic narrowing of the domain. It not only excludes uncertainty, but also excludes from the scope of economics both institutions and the medium-term evolution of economic systems. This arbitrarily isolates economics from its institutional framework – unfortunate, because the two do interact – and severs the causal pathways that link economic choices in one period with the availability of resources in later ones. He explains none of these exclusions, and it is very difficult to understand why his definition has been so widely accepted; it still dominates mainstream thinking, and has been very effective in maintaining the perception that there is such a thing as ‘positive economics’ which is value-free and apolitical.

The essay also helped to entrench apriorism as the proper source of the basic postulates of economics. As an example of one such postulate, Robbins asserts that we can always judge whether different experiences are of equal, greater, or lesser value to us, and that, ‘from this elementary fact’, we can derive the whole structure of neoclassical economics (NCE). (He specifically includes ‘formation of prices’ in this, even though the theory has never been able to handle this, cf Chapter 1, Section 1.3.) Most of our comparisons of experiences are not direct ones, between known, clearly defined items, such as whether to have more butter or more bread; rather, they are comparisons of our expectations about the outcomes of the choices that confront us – whether to take this apprenticeship or that university degree, for example. While this is no problem where we face quantifiable risk, we certainly cannot rank choices in the way that Robbins claimed where uncertainty is involved; and there cannot be indifference curves that involve the outcomes of uncertain choices (cf Chapter 1, Section 1.3.1). Even if we restrict ourselves to decisions that we would all regard as strictly economic ones, many of these are highly uncertain; entrepreneurial decisions, and choices of R&D and marketing strategies, are obvious examples. This exclusion is a very odd one: Robbins must have known the work of Frank Knight and JM Keynes on uncertainty [5]. He should not have simply ignored their concerns; rather, he should have given us reasons for rejecting their conclusions.

No-one doubts that some probabilities are difficult – if not impossible – to estimate, but mainstream economists deny that this can ever make any difference to the principles involved. Their standard reply to objections that depend upon distinguishing uncertainty from mere risk revolves around the so-called ‘Dutch Book’ argument; Daniel Hausman presents an admirably concise (and unusually complete) version of this argument [4]: (i) our subjective probabilities may or may not conform to the axioms of probability (e.g., that the probabilities of mutually exclusive events are additive); (ii) but if they do not, under certain circumstances we would have a ‘Dutch book’, i.e., a portfolio of bets (in the form of commitments to decisions that imply costs and rewards) that is guaranteed to lose us money, regardless of what happens; but (iii) we are economically rational (ER) operators, who want to avoid this; and, crucially, (iv) we will therefore learn to adjust our subjective probabilities accordingly.
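To make step (ii) concrete, here is a minimal numerical sketch (ours, not Hausman's; the probabilities and stakes are invented for illustration): if an agent's subjective probabilities for an event and its complement do not sum to one, selling that agent a bet on each at those prices guarantees the agent a loss whatever happens.

# Minimal sketch of a 'Dutch book' (illustrative only; numbers are invented).
def dutch_book_loss(p_event, p_complement, stake=1.0):
    """Guaranteed loss for an agent who prices a bet on an event at p_event and
    a bet on its complement at p_complement (each paying `stake` if it wins),
    when those two subjective probabilities do not sum to 1."""
    total_paid = (p_event + p_complement) * stake
    total_received = stake            # exactly one of the two outcomes pays off
    return total_paid - total_received

print(dutch_book_loss(0.6, 0.4))      # 0.0  -- additive beliefs: no sure loss
print(dutch_book_loss(0.6, 0.6))      # ~0.2 -- non-additive beliefs: a certain
                                      #         loss, whatever actually happens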

The last item is questionable. Hausman seems to have been aware of this, and attempted to pre-empt objections with a ‘stylised fact’: he says that those who engage in uncertain activities will ‘make fools of themselves’ if their choices do not conform to the axioms, and that this is ‘contrary to observation’. He cites no source and presents no data for that piece of evidence; this is a striking deviation from the determinedly aprioristic approach of mainstream economics, in that it relies on an (alleged) observation to establish the truth of a basic concept. However, the learning necessary to prevent us from making fools of ourselves is only possible if certain conditions are met. First, in each case, decision-takers must have a reasonably large sample of closely similar situations on which to base their probability estimates, and whose consequences they can observe as they develop. Second, the decision-takers must get clear, unambiguous signals about success and failure (and be able to recognize them). Third, success and failure must not be fuzzy (otherwise they cease to be mutually exclusive). Finally, each action must have a single clear outcome, determined by a list of probabilities that remains unchanged during the action.
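The first of these conditions can be illustrated with a small simulation (ours, not Hausman's; the success probability is invented): frequency estimates only settle down once many closely similar trials can be observed, which is exactly what one-off, uncertain situations fail to provide.

# Illustrative simulation: learning a probability from repeated, closely
# similar trials.  With one or a handful of trials the estimate is
# essentially uninformative.
import random

random.seed(0)
TRUE_P = 0.3                                   # assumed underlying success rate

for n_trials in (1, 10, 100, 10_000):
    successes = sum(random.random() < TRUE_P for _ in range(n_trials))
    print(f"{n_trials:>6} trials -> estimated p = {successes / n_trials:.3f}")
# Typical output: the estimates from 1 or 10 trials are wildly off; only the
# large samples approach 0.3.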

Uncertainty violates all these conditions; in fact, they only hold good for situations that are merely risky, and this introduces an uncomfortable element of circularity into the argument. Sample adequacy is problematic: uncertain situations are typically one-offs; at best, they are fuzzy members of some larger class. Signal-clarity is compromised if there is doubt about the nature of the problem being addressed, and the supposed mechanism cannot work in the presence of the well-known phenomenon of ‘self-sealing’, where managers adopt a style that suppresses feedback about failures [5]. Fuzziness over success and failure – often associated with ‘mission creep’ – is not uncommon. If some types of failure may later be reclassified as success, it will often be very difficult to tell whether the actors have learnt, or even should have learnt, to modify their estimates of the relevant probabilities.

Robbins’ version forms the basis of the continuing tradition of apriorism in economics; von Mises’ reasons for rejecting the natural sciences model of investigation are slightly different. He asserts that economic science has to be about deduction from premisses that are known to be true beyond any possibility of challenge, but does not explain why this has to be so. (There are good reasons for thinking that this level of certainty is rare in all sciences, and absent from those that are still evolving, see Chapter 6.) To maintain the delusion that certainty is readily attainable, he has to ignore all the questions that might have been raised (even in his day) about his ‘praxeological categories’ (see below) of exchange, wages, and money. He also puts forward the argument that, because economic phenomena are complex, they can always be interpreted through various theories; and we can only choose among the latter by aprioristic reasoning, because that does not rely on the evidence whose interpretation is being contested. The same argument could be applied to other disciplines whose subject matter is equally complex (cosmology, seismology, medicine, and biology, to name but a few), but which have not retreated into apriorism. More seriously, it also assumes that empirical evidence is never capable of resolving (or at least contributing to the resolution of) disputes among theories – the very point he is trying to establish!

Von Mises was also a methodological individualist, treating co-operation (and social action generally) as the outcome of individuals rationally seeking their separate satisfactions by collaborating for limited, clearly-specified ends. He wanted to prune economics of any concern with collective bodies or actions (including aggregate data such as national income) as real entities that could be manipulated or themselves cause other effects; they were to be dealt with solely as manifestations of the actions of individuals.

However, this is not only the result of the dead hand of history: economic methodologists continue to defend apriorism. Daniel Hausman offers a modern, critical reworking of Mill’s approach [13]. He agrees with Mill that economics is an inexact science, confidence in whose results rests on its ‘basic laws’, not on the tests of their implications. (As we have seen, Mill was quite inconsistent on this.) Hausman claims – correctly, I believe – that, in practice, most modern economists still subscribe to this opinion.

Hausman does set out some conditions that have to be met to justify calling qualified, inexact, and limited generalizations ‘laws’. To qualify, a generalization, taken together with the relevant ceteris paribus clause, must be reliable, and not subject to a continual flow of exceptions. It should also be possible to replace its ceteris paribus clauses with specific mechanisms that make it more reliable or more widely applicable. And it must be possible to identify those sub-domains in which the generalization is not reliable, and specify the interfering factors responsible.

This would all be entirely admirable, if only economists did it. Even limited reflection on what sorts of specific qualifications might enable us to curtail the domain of ceteris paribus suggests that they are almost certain to conflict with the assumptions that underpin the ‘qualified laws’ themselves. For example, ...

Milton Friedman developed a superficially different justification for setting aside realist criticisms of the foundational assumptions of mainstream economics; this is at least as widely quoted as the ‘pure’ version of apriorism. He was a quasi-instrumentalist, claiming that a good theory is simply one that yields better predictions than its competitors. In a deliberately paradoxical way, he stated that significant hypotheses always have assumptions that are ‘wildly inaccurate’, and that ‘descriptive falsity is a necessary quality in powerful hypotheses’ [18].

Friedman uses a now-famous example to show that having unrealistic assumptions should not, by itself, disqualify a theory. He alleges that expert billiards players operate as if they knew the complicated mathematical equations involved, and could solve them in real time; we do not believe this unrealistic model – but (he claims) we have to operate as if we do. There are two things wrong here. (i) Predictors and Reality. Friedman’s choice of predictors has to come from some theory (there are substantial difficulties with the idea of throwing very large numbers of variables into a pool of potential predictors, and letting the gods of statistics sort them out). Suppose (as appears to be the case) that he thinks that players choose cue angle and momentum as if they could calculate the behaviour of balls that are point masses, travelling in straight lines on a frictionless surface, and making perfectly elastic collisions. This will predict poorly, because the players actually impart various combinations of roll, slide, spin, and curve to the cue ball, by exploiting the differences between theory and reality (friction, masses that are not mathematical points, imperfect elasticity, etc). How could he identify the additional variables needed to improve prediction? The answer is, of course, to consider what variables should be effective, i.e., to abandon his criterion of predictive capability, and retreat into some form of realism.
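The gap between the idealised predictors and the real process can be sketched very simply (this example is ours, not Friedman's own formalisation; all numbers are invented for illustration): even before spin and curve enter the picture, a frictionless 'as-if' model and a model with rolling friction disagree about something as basic as how far a struck ball travels.

# Illustrative contrast between the idealised 'as-if' model (point mass, no
# friction) and a slightly more realistic one with rolling friction.
def position_frictionless(v0, t):
    """Distance travelled by a point mass with no friction: x = v0 * t."""
    return v0 * t

def position_with_friction(v0, t, mu=0.2, g=9.81):
    """Distance travelled with constant rolling deceleration a = mu * g;
    the ball stops once its speed reaches zero."""
    a = mu * g
    t_stop = v0 / a
    if t >= t_stop:                       # the ball has already stopped
        return v0 * t_stop - 0.5 * a * t_stop ** 2
    return v0 * t - 0.5 * a * t ** 2

v0, t = 2.0, 1.5                          # initial speed (m/s), elapsed time (s)
print(position_frictionless(v0, t))       # 3.0 m  -- the 'as-if' prediction
print(position_with_friction(v0, t))      # ~1.02 m -- the ball stopped long ago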

(ii) How Agents Get Their Theories. Friedman was equally mistaken in insisting that we are obliged to adopt an as-if approach, in order to be able to link the observed actions to some standard theory. He is assuming that the only basis for managing the trajectories of the balls is the standard elementary model of the physics involved. However, it is entirely possible that the players have learnt, directly, a fuzzy control model for managing the real – and fairly complex – physics of the actual processes.

Skidelsky ... logic.

The Unexplored Alternative
modern complexity theory ... criticisms of applicability ...

?? Fox?? did work on checking whether 'laws of economics' obeyed ...
1. Fox, G. (1997), Reason and Reality in the Methodologies of Economics: An Introduction, Edward Elgar, Cheltenham.
2. Blaug, M. (1980), The Methodology of Economics: Or, How Economists Explain, Cambridge University Press, Cambridge.
3. Hausman, D.M. (1992), The Inexact and Separate Science of Economics, Cambridge University Press, Cambridge.
4. Anderson, A.R. and Belnap, N.D. (1975), Entailment: The Logic of Relevance and Necessity, Princeton University Press, Princeton, NJ.
5. Mill, J.S. (1967), A System of Logic, Ratiocinative and Inductive: Being a Connected View of the Principles of Evidence and the Methods of Scientific Investigation, Longmans, London.

................ &&&

Other disciplines
Mechanisms similar to the original Kuhnian paradigm have been invoked in various disciplines. These include the ideas of major cultural themes, worldviews, ideologies, and mindsets, which have broadly similar meanings applied to smaller- and larger-scale examples of disciplined thought. In addition, Michel Foucault used the terms episteme and discourse, mathesis and taxinomia, for aspects of a "paradigm" in Kuhn's original sense.

Complexity
... the study of emergence in general. Holland suggested that a number of factors are essential if a system is to be capable of showing emergent behaviour. These are:
- A number of different types of element, each with a variety of ways of interacting with the others. (There do not have to be large numbers of either: Conway's Game of Life has only two types of element and only four behaviour-defining rules; see the sketch after this list.)
- Some way in which hierarchies can develop, so that assemblies of elements can take on the same roles as individual elements. (An assembly could be a group of neurons, or an entity in a social or economic situation.)
- Each element (or assembly) should have a reasonably wide fan-out: it should affect a number of others, either directly, or by changing their potential for action.
- The pattern of these ramifications should be varied, and include at least some indirect circular linkages.
- The resulting connections between elements should not allow information and action to move through the whole collection of elements instantaneously (i.e., the system has to operate in ordinal or natural time).
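A minimal sketch of Conway's Game of Life (ours, for illustration only) makes the first point concrete: two element states and a handful of neighbour-counting rules are enough to produce emergent structures, such as gliders, that are stated nowhere in the rules themselves.

# Minimal sketch of Conway's Game of Life (illustrative only).
from collections import Counter

def step(live_cells):
    """Advance one generation; live_cells is a set of (x, y) coordinates."""
    neighbour_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live_cells
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell is alive next generation if it has exactly 3 live neighbours, or
    # has exactly 2 and is already alive -- the entire 'physics' of the system.
    return {cell for cell, n in neighbour_counts.items()
            if n == 3 or (n == 2 and cell in live_cells)}

# A 'glider': five live cells whose pattern re-forms and moves diagonally.
cells = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
for _ in range(8):
    cells = step(cells)
print(sorted(cells))    # the same glider shape, translated diagonally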

Complexity edit:
Complexity is the state created by the presence of four causal features: time; heterogeneity; qualitative variables and ambiguity; and nonlinearity. In turn, these give rise to two of the most intransigently difficult characteristics of complex systems, uncertainty and emergence. Together, these six features ensure that the behaviour of a complex system is difficult to forecast, and – when we are looking at real systems rather than models – its structure is difficult to work out. In practice, this means that policy-making, developing effective strategies, and making decisions in specific cases are all much more difficult than in the case of a system which is simply and merely complicated.
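One of these features on its own is enough to make the forecasting problem vivid; the following toy example (ours, not from the source) uses nonlinearity alone: two trajectories of the logistic map that start almost identically soon share nothing, so a forecast based on a slightly mis-measured starting point is worthless.

# Illustrative only: sensitivity to initial conditions in the logistic map,
# a one-line nonlinear system (r = 4 chosen to put it in its chaotic regime).
def logistic(x, r=4.0):
    return r * x * (1.0 - x)

x, y = 0.300000, 0.300001          # two almost identical starting points
for _ in range(40):
    x, y = logistic(x), logistic(y)
print(abs(x - y))
# After a few dozen steps the difference is typically of the same order as the
# variables themselves: the second trajectory tells us nothing about the first.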