
Thursday, 31 January 2013

Economic Model

From Wikipedia, the free encyclopedia
[Figure: A diagram of the IS/LM model]
In economics, a model is a theoretical construct that represents economic processes by a set of variables and a set of logical and/or quantitative relationships between them. An economic model is a simplified framework designed to illustrate complex processes, often but not always using mathematical techniques. Economic models frequently posit structural parameters: the underlying parameters in a model or class of models.[1] A model may have various parameters, and those parameters may be changed to create various properties. Methodological uses of models include investigation, theorizing, and fitting theories to the world.[2]


Overview

In general terms, economic models have two functions: first as a simplification of and abstraction from observed data, and second as a means of selection of data based on a paradigm of econometric study.
Simplification is particularly important for economics given the enormous complexity of economic processes. This complexity can be attributed to the diversity of factors that determine economic activity; these factors include: individual and cooperative decision processes, resource limitations, environmental and geographical constraints, institutional and legal requirements and purely random fluctuations. Economists therefore must make a reasoned choice of which variables and which relationships between these variables are relevant and which ways of analyzing and presenting this information are useful.
Selection is important because the nature of an economic model will often determine what facts will be looked at, and how they will be compiled. For example, inflation is a general economic concept, but measuring inflation requires a model of behavior, so that an economist can differentiate between real changes in price and changes in price that are to be attributed to inflation.
In addition to their professional academic interest, uses of models include:
  • Forecasting economic activity in a way in which conclusions are logically related to assumptions;
  • Proposing economic policy to modify future economic activity;
  • Presenting reasoned arguments to politically justify economic policy at the national level, to explain and influence company strategy at the level of the firm, or to provide intelligent advice for economic decisions at the level of the household.
  • Planning and allocation, in the case of centrally planned economies, and on a smaller scale in logistics and management of businesses.
  • In finance, predictive models have been used since the 1980s for trading (investment and speculation); for example, emerging-market bonds were often traded based on economic models predicting the growth of the developing nation issuing them. Since the 1990s many long-term risk management models have incorporated economic relationships between simulated variables in an attempt to detect high-exposure future scenarios (often through a Monte Carlo method).
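The Monte Carlo technique mentioned above can be sketched in a few lines. Everything below (drift, volatility, horizon, crisis threshold) is an invented placeholder for illustration, not a calibrated risk model:

```python
import random

def simulate_crisis_probability(n_paths=10_000, years=5, mu=0.03,
                                sigma=0.04, crisis_growth=-0.02, seed=42):
    """Draw random annual growth paths and estimate how often cumulative
    output falls below a 'crisis' threshold (sustained negative growth)."""
    rng = random.Random(seed)
    threshold = (1.0 + crisis_growth) ** years
    crises = 0
    for _ in range(n_paths):
        level = 1.0
        for _ in range(years):
            level *= 1.0 + rng.gauss(mu, sigma)  # one simulated year
        if level < threshold:
            crises += 1
    return crises / n_paths

print(simulate_crisis_probability())
```

A production risk model would replace the independent Gaussian draws with estimated economic relationships between the simulated variables, which is precisely where the modelling difficulties discussed below arise.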
A model establishes an argumentative framework for applying logic and mathematics that can be independently discussed and tested and that can be applied in various instances. Policies and arguments that rely on economic models have a clear basis for soundness, namely the validity of the supporting model.
Economic models in current use do not pretend to be theories of everything economic; any such pretensions would immediately be thwarted by computational infeasibility and the paucity of theories for most types of economic behavior. Therefore, conclusions drawn from models will be approximate representations of economic facts. However, properly constructed models can remove extraneous information and isolate useful approximations of key relationships. In this way more can be understood about the relationships in question than by trying to understand the entire economic process.
The details of model construction vary with type of model and its application, but a generic process can be identified. Generally any modelling process has two steps: generating a model, then checking the model for accuracy (sometimes called diagnostics). The diagnostic step is important because a model is only useful to the extent that it accurately mirrors the relationships that it purports to describe. Creating and diagnosing a model is frequently an iterative process in which the model is modified (and hopefully improved) with each iteration of diagnosis and respecification. Once a satisfactory model is found, it should be double checked by applying it to a different data set.

Types of models

Economic models can be classified along several dimensions: as stochastic or non-stochastic, according to whether all the model variables are deterministic; as discrete or continuous choice models, according to whether all the variables are quantitative; as quantitative or qualitative, according to the model's intended purpose or function; as general equilibrium, partial equilibrium, or even non-equilibrium models, according to the model's ambit; and as rational agent models, representative agent models, etc., according to the characteristics of the economic agents.
  • Non-stochastic mathematical models may be purely qualitative (for example, models involved in some aspect of social choice theory) or quantitative (involving rationalization of financial variables, for example with hyperbolic coordinates, and/or specific forms of functional relationships between variables). In some cases economic predictions of a model merely assert the direction of movement of economic variables, and so the functional relationships are used only in a qualitative sense: for example, if the price of an item increases, then the demand for that item will decrease. For such models, economists often use two-dimensional graphs instead of functions.
  • Qualitative models – Although almost all economic models involve some form of mathematical or quantitative analysis, qualitative models are occasionally used. One example is qualitative scenario planning in which possible future events are played out. Another example is non-numerical decision tree analysis. Qualitative models often suffer from lack of precision.
At a more practical level, quantitative modelling is applied to many areas of economics and several methodologies have evolved more or less independently of each other. As a result, no overall model taxonomy is naturally available. We can nonetheless provide a few examples which illustrate some particularly relevant points of model construction.
  • An accounting model is one based on the premise that for every credit there is a debit. More symbolically, an accounting model expresses some principle of conservation in the form
algebraic sum of inflows = sinks − sources
This principle is certainly true for money and it is the basis for national income accounting. Accounting models are true by convention: any experimental failure to confirm them would be attributed to fraud, arithmetic error, or an extraneous injection (or destruction) of cash, which we would interpret as showing that the experiment was conducted improperly.
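The conservation principle above can be expressed as a one-line consistency check; the figures used are arbitrary toy flows:

```python
def conservation_holds(inflows, sources, sinks, tol=1e-9):
    """Accounting identity: the algebraic sum of inflows must equal
    sinks minus sources. A violation signals fraud, arithmetic error,
    or an unrecorded leakage -- not a falsified theory, since the
    identity holds by convention."""
    return abs(sum(inflows) - (sum(sinks) - sum(sources))) < tol

# Toy flows: 100 in, 30 out, one source of 50 and one sink of 120.
assert conservation_holds(inflows=[100, -30], sources=[50], sinks=[120])
```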
  • Optimality and constrained optimization models – Other examples of quantitative models are based on principles such as profit or utility maximization. An example of such a model is given by the comparative statics of taxation on the profit-maximizing firm. The profit of a firm is given by
 \pi(x,t) = x p(x) - C(x) - t x
where p(x) is the price that a product commands in the market if it is supplied at the rate x, xp(x) is the revenue obtained from selling the product, C(x) is the cost of bringing the product to market at the rate x, and t is the tax that the firm must pay per unit of the product sold.
The profit maximization assumption states that a firm will produce at the output rate x if that rate maximizes the firm's profit. Using differential calculus we can obtain conditions on x under which this holds. The first order maximization condition for x is
 \frac{\partial  \pi(x,t)}{\partial x} =\frac{\partial  (x p(x) - C(x))}{\partial x} -t= 0
Regarding x as an implicitly defined function of t by this equation (see implicit function theorem), one concludes that the derivative of x with respect to t has the same sign as
 \frac{\partial^2 (x p(x) - C(x))}{\partial x^2}=\frac{\partial^2 \pi(x,t)}{\partial x^2},
which is negative if the second order conditions for a local maximum are satisfied.
Thus the profit maximization model predicts something about the effect of taxation on output, namely that output decreases with increased taxation. If the predictions of the model fail, we conclude that the profit maximization hypothesis was false; this should lead to alternate theories of the firm, for example based on bounded rationality.
Borrowing a notion apparently first used in economics by Paul Samuelson, this model of taxation, and the predicted dependency of output on the tax rate, illustrates an operationally meaningful theorem: that is, one which makes an economically meaningful claim that is falsifiable under certain conditions.
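The comparative statics argument above can be checked symbolically. The sketch below assumes hypothetical functional forms, linear inverse demand p(x) = a - b*x and quadratic cost C(x) = c*x**2, chosen purely for concreteness; the sign of dx/dt agrees with the general result:

```python
import sympy as sp

x, t, a, b, c = sp.symbols('x t a b c', positive=True)

# Profit with the hypothetical forms p(x) = a - b*x and C(x) = c*x**2:
profit = x * (a - b * x) - c * x**2 - t * x

# First-order condition d(pi)/dx = 0 yields the optimal output x*(t).
x_star = sp.solve(sp.diff(profit, x), x)[0]

# Comparative statics: the derivative of x* with respect to the tax t.
dx_dt = sp.diff(x_star, t)
print(x_star, dx_dt)  # x* = (a - t)/(2b + 2c), so dx/dt = -1/(2(b + c)) < 0
```

Because dx/dt is negative whenever b + c > 0, the model predicts that output decreases with taxation, exactly as derived above.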
  • Aggregate models. Macroeconomics needs to deal with aggregate quantities such as output, the price level, the interest rate and so on. Now real output is actually a vector of goods and services, such as cars, passenger airplanes, computers, food items, secretarial services, home repair services etc. Similarly price is the vector of individual prices of goods and services. Models in which the vector nature of the quantities is maintained are used in practice, for example Leontief input-output models are of this kind. However, for the most part, these models are computationally much harder to deal with and harder to use as tools for qualitative analysis. For this reason, macroeconomic models usually lump together different variables into a single quantity such as output or price. Moreover, quantitative relationships between these aggregate variables are often parts of important macroeconomic theories. This process of aggregation and functional dependency between various aggregates usually is interpreted statistically and validated by econometrics. For instance, one ingredient of the Keynesian model is a functional relationship between consumption and national income: C = C(Y). This relationship plays an important role in Keynesian analysis.
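The consumption function C = C(Y) in the last bullet can be made concrete with a hypothetical linear form; the parameter values below are illustrative only:

```python
def consumption(Y, autonomous=200.0, mpc=0.8):
    """Keynesian consumption function C(Y) = a + b*Y, with a marginal
    propensity to consume 0 < b < 1 (values are illustrative)."""
    return autonomous + mpc * Y

def equilibrium_income(investment, autonomous=200.0, mpc=0.8):
    """In the simplest Keynesian cross, Y = C(Y) + I, which solves to
    Y* = (a + I)/(1 - b); the multiplier is 1/(1 - b)."""
    return (autonomous + investment) / (1.0 - mpc)

# With mpc = 0.8 the multiplier is 5: raising investment by 100
# raises equilibrium income by about 500.
print(equilibrium_income(100.0), equilibrium_income(200.0))
```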

Quantitative vs. Qualitative models

A quantitative model is designed to produce accurate predictions, without necessarily elucidating the underlying dynamics. A qualitative model, on the other hand, aims to explain these dynamics without necessarily fitting empirical data or producing accurate predictions. Interest rate parity can be deemed a qualitative model in this sense: though it generally fails to fit exchange rate data as well as higher-powered statistical forecasting models, it offers an intuitive interpretation of the exchange rate and its relation to foreign and domestic interest and inflation rates. Views on the relative merits of qualitative and quantitative models vary across the profession: Milton Friedman can be viewed as having advocated a qualitative approach, while Ronald Coase worried that "if you torture the data long enough, it will confess." Prospect theory, as proposed by Nobel laureate Daniel Kahneman, is more quantitative, while rational agent models are more qualitative.
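As a concrete illustration, the covered form of interest rate parity pins down the forward exchange rate from the spot rate and the two interest rates; the numbers below are invented for the example:

```python
def parity_forward_rate(spot, i_domestic, i_foreign):
    """Covered interest rate parity: the forward rate (domestic currency
    per unit of foreign currency) that rules out riskless arbitrage is
    F = S * (1 + i_d) / (1 + i_f)."""
    return spot * (1.0 + i_domestic) / (1.0 + i_foreign)

# With a higher domestic rate, the implied forward rate exceeds the
# spot rate in this quoting convention.
print(parity_forward_rate(spot=1.25, i_domestic=0.05, i_foreign=0.02))
```

The model's appeal is exactly what the paragraph above describes: a one-line intuitive relation, even though realized exchange rates routinely deviate from it.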

Pitfalls

Restrictive, unrealistic assumptions

Provably unrealistic assumptions are pervasive in neoclassical economic theory (also called the "standard theory" or "neoclassical paradigm"), and those assumptions are inherited by simplified models for that theory. (Any model based on a flawed theory cannot transcend the limitations of that theory.) Joseph Stiglitz's 2001 Nobel Prize lecture [3] reviews his work on information asymmetries, which contrasts with the assumption, in standard models, of "perfect information". Stiglitz surveys many aspects of these faulty standard models, and the faulty policy implications and recommendations that arise from their unrealistic assumptions. Stiglitz writes (pp. 519–520):
"I only varied one assumption – the assumption concerning perfect information – and in ways which seemed highly plausible. ... We succeeded in showing not only that the standard theory was not robust – changing only one assumption in ways which were totally plausible had drastic consequences, but also that an alternative robust paradigm with great explanatory power could be constructed. There were other deficiencies in the theory, some of which were closely connected. The standard theory assumed that technology and preferences were fixed. But changes in technology, R & D, are at the heart of capitalism. ... I similarly became increasingly convinced of the inappropriateness of the assumption of fixed preferences. (Footnote: In addition, much of recent economic theory has assumed that beliefs are, in some sense, rational. As noted earlier, there are many aspects of economic behavior that seem hard to reconcile with this hypothesis.)"
Economic models can be such powerful tools for understanding some economic relationships that it is easy to ignore their limitations. One tangible example, where the limits of economic models collided with reality but the models were nevertheless accepted as "evidence" in public policy debates, involved models used to simulate the effects of NAFTA, the North American Free Trade Agreement. James Stanford published an examination of ten of these models.[4][5]
The fundamental issue is circularity: embedding one's assumptions as foundational "input" axioms in a model, then proceeding to "prove" that, indeed, the model's "output" supports the validity of those assumptions. Such a model is consistent with similar models that have adopted those same assumptions. But is it consistent with reality? As with any scientific theory, empirical validation is needed if we are to have any confidence in its predictive ability.
If those assumptions are, in fact, fundamental aspects of empirical reality, then the model's output will correctly describe reality (if it is properly "tuned", and if it is not missing any crucial assumptions). But if those assumptions are not valid for the particular aspect of reality one attempts to simulate, then it becomes a case of "GIGO" (garbage in, garbage out).
James Stanford outlines this issue for the specific computable general equilibrium ("CGE") models that were introduced as evidence into the public policy debate by advocates for NAFTA: [6]
"..CGE models are circular: if trade theory holds that free trade is mutually beneficial, then a quantitative simulation model based on that theoretical structure will automatically show that free trade is mutually beneficial...if the economy actually behaves in the manner supposed by the modeler, and the model itself sheds no light on this question, then a properly calibrated model may provide a rough empirical estimate of the effects of a policy change. But the validity of the model hangs entirely on the prior, nontested specification of its structural relationships ... [Hence, the apparent consensus of pro-NAFTA modelers] reflects more a consensus of prior theoretical views than a consensus of quantitative evidence."
Commenting on Stanford's analysis, one computer scientist wrote,
"When simulating the impact of a trade agreement on labor, it seems absurd to assume a priori that capital is immobile, that full employment will prevail, that unit labor costs are identical in the U.S. and Mexico, that American consumers will prefer products made in America (even if they are more expensive), and that trade flows between the U.S. and Mexico will exactly balance. Yet a recent examination of ten prominent CGE models showed that nine of them include at least one of those unrealistic assumptions, and two of the CGE models included all the above assumptions.
This situation bears a disturbing resemblance to computer-assisted intellectual dishonesty. Human beings have always been masters of self-deception, and hiding the essential basis of one's deception by embedding it in a computer program surely helps reduce what might otherwise become an intolerable burden of cognitive dissonance." [7]
In commenting on the general phenomenon of embedding unrealistic "GIGO" assumptions in neoclassical economic models, Nobel prizewinner Joseph Stiglitz is only slightly more diplomatic (pp. 507–508):
"But the ... model, by construction, ruled out the information asymmetries which are at the heart of macro-economic problems. Only if an individual has a severe case of schizophrenia is it possible for such problems to arise. If one begins with a model that assumes that markets clear, it is hard to see how one can get much insight into unemployment (the failure of the labor market to clear)." [3]
Despite the prominence of Stiglitz's 2001 Nobel Prize lecture, the use of misleading (perhaps intentionally so) neoclassical models persisted in 2007, according to these authors: [8]
" ... projected welfare gains from trade liberalization are derived from global computable general equilibrium (CGE) models, which are based on highly unrealistic assumptions. CGE models have become the main tool for economic analysis of the benefits of multilateral trade liberalization; therefore, it is essential that these models be scrutinized for their realism and relevance. ... we analyze the foundation of CGE models and argue that their predictions are often misleading. ... We appeal for more honest simulation strategies that produce a variety of plausible outcomes."
The working paper, "Debunking the Myths of Computable General Equilibrium Models", [9] provides both a history and a readable theoretical analysis of what CGE models are, and are not. In particular, despite their name, CGE models use neither the Walrasian general equilibrium framework nor the Arrow–Debreu general equilibrium framework. Thus, CGE models are highly distorted simplifications of the theoretical frameworks collectively called "the neoclassical economic paradigm", which were themselves largely discredited by Joseph Stiglitz.
In the "Concluding Remarks" (p. 524) of his 2001 Nobel Prize lecture, Stiglitz examined why the neoclassical paradigm, and models based on it, persists, despite his publication, over a decade earlier, of some of his seminal results showing that information asymmetries invalidated core assumptions of that paradigm and its models:
"One might ask, how can we explain the persistence of the paradigm for so long? Partly, it must be because, in spite of its deficiencies, it did provide insights into many economic phenomena. ... But one cannot ignore the possibility that the survival of the [neoclassical] paradigm was partly because the belief in that paradigm, and the policy prescriptions, has served certain interests." [3]
In the aftermath of the 2007–2009 global economic meltdown, the profession's attachment to unrealistic models has increasingly been questioned and criticized. After a weeklong workshop, one group of economists released a paper highly critical of their own profession's unethical use of unrealistic models. Their abstract offers an indictment of fundamental practices:
"The economics profession appears to have been unaware of the long build-up to the current worldwide financial crisis and to have significantly underestimated its dimensions once it started to unfold. In our view, this lack of understanding is due to a misallocation of research efforts in economics. We trace the deeper roots of this failure to the profession’s focus on models that, by design, disregard key elements driving outcomes in real-world markets. The economics profession has failed in communicating the limitations, weaknesses, and even dangers of its preferred models to the public. This state of affairs makes clear the need for a major reorientation of focus in the research economists undertake, as well as for the establishment of an ethical code that would ask economists to understand and communicate the limitations and potential misuses of their models." [10]

Omitted details

A great danger inherent in the simplification required to fit the entire economy into a model is omitting critical elements. Some economists believe that making the model as simple as possible is an art form, but the details left out are often contentious. For instance:
  • Market models often exclude externalities such as unpunished pollution. Such models are the basis for many environmentalist attacks on mainstream economists. It is said that if the social costs of externalities were included in the models, their conclusions would be very different, and models are often accused of leaving out these terms because of economists' pro-free-market bias.
  • In turn, environmental economics has been accused of omitting key financial considerations from its models. For example, the returns to solar power investments are sometimes modelled without a discount factor, so that the present utility of solar energy delivered in a century's time is precisely equal to that of gas-power-station energy today.
  • Financial models can be oversimplified by relying on historically unprecedented arbitrage-free markets, probably underestimating the chance of crises, and under-pricing or under-planning for risk.
  • Models of consumption either assume that humans are immortal or that teenagers plan their lives around an optimal retirement supported by the next generation. (These conclusions are probably harmless, except possibly to the credibility of the modelling profession.)
  • All models share the problem of the butterfly effect. Because they represent large, complex, nonlinear systems, any missing variable, as well as errors in the values of included variables, can lead to erroneous results.
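The discounting point in the list above is easy to make concrete; the cashflow and rates below are arbitrary illustrative figures:

```python
def present_value(cashflow, years, rate):
    """Discount a future cashflow to the present at an annual rate.
    With rate = 0, energy delivered a century from now is valued
    exactly the same as energy delivered today."""
    return cashflow / (1.0 + rate) ** years

# The omitted detail matters enormously over long horizons:
pv_discounted = present_value(1000.0, years=100, rate=0.05)
pv_undiscounted = present_value(1000.0, years=100, rate=0.0)
print(pv_discounted, pv_undiscounted)  # roughly 7.6 vs 1000.0
```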

Are economic models falsifiable?

The sharp distinction between falsifiable economic models and those that are not is by no means universally accepted. Indeed, one can argue that the ceteris paribus ("all else being equal") qualification that accompanies any claim in economics is nothing more than an all-purpose escape clause (see N. de Marchi and M. Blaug). The all-else-being-equal claim allows holding all variables constant except the few that the model is attempting to reason about. This allows the separation and clarification of the specific relationship. However, in reality all else is never equal, so economic models are guaranteed not to be perfect. The goal of the model is that the isolated and simplified relationship has some predictive power that can be tested; that is, that it is a theory capable of being applied to reality. To qualify as a theory, a model should arguably answer three questions: Theory of what? Why should we care? What merit is in your explanation? If the model fails to do so, it is probably too detached from reality and meaningful societal issues to qualify as theory. Research conducted according to this three-question test finds that in the 2004 edition of the Journal of Economic Theory, only 12% of the articles satisfy the three requirements.[11] Ignoring the fact that the ceteris paribus assumption is being made is another common failure when a model is applied. At a minimum, an attempt must be made to examine the various factors that may not be equal and to take them into account.

History

One of the major problems addressed by economic models has been understanding economic growth. An early attempt to provide a technique to approach this came from the French physiocratic school in the eighteenth century. Among these economists, François Quesnay should be noted, particularly for his development and use of tables he called Tableaux économiques. These tables have in fact been interpreted in more modern terminology as a Leontief model; see the Phillips reference below.
All through the 18th century (that is, well before the founding of modern political economy, conventionally marked by Adam Smith's 1776 Wealth of Nations) simple probabilistic models were used to understand the economics of insurance. This was a natural extrapolation of the theory of gambling, and played an important role both in the development of probability theory itself and in the development of actuarial science. Many of the giants of 18th century mathematics contributed to this field. Around 1730, De Moivre addressed some of these problems in the 3rd edition of the Doctrine of Chances. Even earlier (1709), Nicolas Bernoulli studied problems related to savings and interest in the Ars Conjectandi. In 1730, Daniel Bernoulli studied "moral probability" in his book Mensura Sortis, where he introduced what would today be called "logarithmic utility of money" and applied it to gambling and insurance problems, including a solution of the paradoxical Saint Petersburg problem. All of these developments were summarized by Laplace in his Analytical Theory of Probabilities (1812). Clearly, by the time David Ricardo came along, he had a great deal of well-established mathematics to draw on.

Tests of macroeconomic predictions

In the late 1980s a research institute compared twelve leading macroeconomic models available at the time. They compared the models' predictions for how the economy would respond to specific economic shocks (allowing the models to control for all the variability in the real world; this was a test of model vs. model, not a test against the actual outcome). Although the models simplified the world and started from stable, known, common parameters, the various models gave significantly different answers. For instance, in calculating the impact of a monetary loosening on output, some models estimated a 3% change in GDP after one year, one gave almost no change, and the rest were spread between.[12]
Partly as a result of such experiments, modern central bankers no longer have as much confidence that it is possible to 'fine-tune' the economy as they had in the 1960s and early 1970s. Modern policy makers tend to use a less activist approach, explicitly because they lack confidence that their models will actually predict where the economy is going, or the effect of any shock upon it. The new, more humble approach sees danger in dramatic policy changes based on model predictions, because of several practical and theoretical limitations in current macroeconomic models; in addition to the theoretical pitfalls listed above, some problems specific to aggregate modelling are:
  • Limitations in model construction caused by difficulties in understanding the underlying mechanisms of the real economy. (Hence the profusion of separate models.)
  • The law of unintended consequences, operating on elements of the real economy not yet included in the model.
  • The time lag in both receiving data and the reaction of economic variables to policy makers' attempts to 'steer' them (mostly through monetary policy) in the direction that central bankers want them to move. Milton Friedman vigorously argued that these lags are so long and unpredictably variable that effective management of the macroeconomy is impossible.
  • The difficulty in correctly specifying all of the parameters (through econometric measurements) even if the structural model and data were perfect.
  • The fact that all the model's relationships and coefficients are stochastic, so that the error term becomes very large quickly, and the available snapshot of the input parameters is already out of date.
  • Modern economic models incorporate the reaction of the public and the market to the policy maker's actions (through game theory); this feedback is included in modern models following the rational expectations revolution and Robert Lucas, Jr.'s critique of the optimal control concept of precise macroeconomic management. If the response to the decision maker's actions (and their credibility) must be included in the model, then it becomes much harder to influence some of the simulated variables.

Comparison with models in other sciences

Complex systems specialist and mathematician David Orrell wrote on this issue and explained that weather forecasting, human health and economics use similar methods of prediction (mathematical models). Their systems – the atmosphere, the human body and the economy – also have similar levels of complexity. He found that forecasts fail because the models suffer from two problems: (i) they cannot capture the full detail of the underlying system, so they rely on approximate equations; and (ii) they are sensitive to small changes in the exact form of these equations. This is because complex systems like the economy or the climate consist of a delicate balance of opposing forces, so a slight imbalance in their representation has big effects. Thus, predictions of things like economic recessions are still highly inaccurate, despite the use of enormous models running on fast computers. [2]

The effects of deterministic chaos on economic models

Economic and meteorological simulations may share a fundamental limit to their predictive powers: chaos. Although the modern mathematical work on chaotic systems began in the 1970s, the danger of chaos had been identified and defined in Econometrica as early as 1958:
"Good theorising consists to a large extent in avoiding assumptions....(with the property that)....a small change in what is posited will seriously affect the conclusions."
(William Baumol, Econometrica, 26 see: Economics on the Edge of Chaos).
It is straightforward to design economic models susceptible to butterfly effects of initial-condition sensitivity.[13][14]
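The point is easy to demonstrate with the logistic map, a textbook chaotic system used here as a generic illustration rather than as any specific economic model from the literature:

```python
def logistic_path(x0, r=4.0, steps=30):
    """Iterate the logistic map x -> r*x*(1 - x), which is chaotic at
    r = 4: nearby starting points diverge roughly exponentially."""
    path = [x0]
    for _ in range(steps):
        path.append(r * path[-1] * (1.0 - path[-1]))
    return path

# Two initial conditions differing by one part in a billion:
a = logistic_path(0.4)
b = logistic_path(0.4 + 1e-9)
diffs = [abs(x - y) for x, y in zip(a, b)]
print(diffs[1], max(diffs))  # the gap grows from ~1e-9 to order one
```

Even though each trajectory is fully deterministic, the microscopic difference in starting points is amplified at every step, so a model with any measurement error in its initial conditions loses predictive power over long horizons.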
However, the econometric research program to identify which variables are chaotic (if any) has largely concluded that aggregate macroeconomic variables probably do not behave chaotically. This would mean that refinements to the models could ultimately produce reliable long-term forecasts. However, the validity of this conclusion has generated two challenges:
  • In 2004 Philip Mirowski challenged this view and those who hold it, saying that chaos in economics is suffering from a biased "crusade" against it by neo-classical economics in order to preserve their mathematical models.
  • The variables in finance may well be subject to chaos. Also in 2004, the University of Canterbury study Economics on the Edge of Chaos concluded that, after noise is removed from S&P 500 returns, evidence of deterministic chaos is found.
More recently, chaos (or the butterfly effect) has been identified as less significant than previously thought to explain prediction errors. Rather, the predictive power of economics and meteorology would mostly be limited by the models themselves and the nature of their underlying systems (see Comparison with models in other sciences above).

The critique of hubris in planning

A key strand of free market economic thinking is that the market's "invisible hand" guides an economy to prosperity more efficiently than central planning using an economic model. One reason, emphasized by Friedrich Hayek, is the claim that many of the true forces shaping the economy can never be captured in a single plan. This is an argument which cannot be made through a conventional (mathematical) economic model, because it says that there are critical systemic-elements that will always be omitted from any top-down analysis of the economy.[15]

Examples of economic models

See also

Notes

  1. ^ Moffatt, Mike. (2008) About.com Structural Parameters Economics Glossary; Terms Beginning with S. Accessed June 19, 2008.
  2. ^ Mary S. Morgan, 2008 "models," The New Palgrave Dictionary of Economics, 2nd Edition, Abstract.
       • Vivian Walsh 1987. "models and theory," The New Palgrave: A Dictionary of Economics, v. 3, pp. 482-83.
  3. ^ a b c Joseph E. Stiglitz. 2001 Nobel Prize lecture: "INFORMATION AND THE CHANGE IN THE PARADIGM IN ECONOMICS". http://nobelprize.org/nobel_prizes/economics/laureates/2001/stiglitz-lecture.pdf.
  4. ^ James Stanford. "Continental Economic Integration: Modeling the Impact on Labor," Annals of the American Academy of Political and Social Science, Mar 1993, V526 p. 92-110
  5. ^ James Stanford. 1993. "FREE TRADE AND THE IMAGINARY WORLDS OF ECONOMIC MODELERS". http://www.pcdf.org/1993/45stanfo.htm.
  6. ^ Robert Aponte. "NAFTA AND MEXICAN MIGRATION TO MICHIGAN AND THE U.S.". http://www.jsri.msu.edu/RandS/research/wps/wp25.pdf.
  7. ^ Rick Crawford (1996). "Computer-assisted Crises", in Invisible Crises: What Conglomerate Control of Media Means for America and the World. Ed. George Gerbner, Hamid Mowlana, Herbert I. Schiller. Westview. ISBN 978-0-8133-2072-4. Free, authorized version viewable at: http://infowarethics.org/computer-assisted.crises.html
  8. ^ Lance Taylor & Rudiger von Arnim. March 2007. "Projected Benefits of the Doha Round Hinge on Misleading Trade Models". http://www.newschool.edu/cepa/publications/policynotes/Doha%20Policy%20Note%20Final%2003_12_07.pdf.
  9. ^ Mitra-Kahn, Benjamin H., 2008. "Debunking the Myths of Computable General Equilibrium Models". http://www.newschool.edu/cepa/publications/workingpapers/SCEPA%20Working%20Paper%202008-1%20Kahn.pdf. SCEPA Working Paper 01-2008.
  10. ^ Colander, David; Follmer, Hans; Haas, Armin; Goldberg, Michael D.; Juselius, Katarina; Kirman, Alan; Lux, Thomas; and Sloth, Birgitte: The Financial Crisis and the Systemic Failure of Academic Economics. SSRN 1355882. (March 9, 2009). Univ. of Copenhagen Dept. of Economics Discussion Paper No. 09-03
  11. ^ Klein, Daniel B. and Pedro P. Romero. "Model Building Versus Theorizing: The Paucity of Theory in the Journal of Economic Theory" (May 2007). [1]
  12. ^ Frankel, Jeffrey A. (May 1986). "The Sources of Disagreement Among International Macro Models and Implications for Policy Coordination". NBER Working Paper. http://www.nber.org/papers/w1925.pdf. Retrieved 23 January 2012.
  13. ^ Paul Wilmott on his early research in finance: "I quickly dropped... chaos theory (as) it was too easy to construct ‘toy models’ that looked plausible but were useless in practice." Wilmott, Paul (2009), Frequently Asked Questions in Quantitative Finance, John Wiley and Sons, p. 227, http://books.google.com/books?id=n4swgjSoMyIC&lpg=PT227&pg=PT227#v=onepage
  14. ^ Kuchta, Steve (2004) (pdf), Nonlinearity and Chaos in Macroeconomics and Financial Markets, University of Connecticut, http://www.sp.uconn.edu/~ages/files/NL_Chaos_and_%20Macro%20-%20429%20Essay.pdf
  15. ^ Hayek, Friedrich (September, 1945), "The Use of Knowledge in Society", American Economic Review 35 (4): 519–530, JSTOR 1809376.

[edit] References

  • Baumol, William & Blinder, Alan (1982), Economics: Principles and Policy (2nd ed.), New York: Harcourt Brace Jovanovich, ISBN 0-15-518839-9.
  • Caldwell, Bruce (1994), Beyond Positivism: Economic Methodology in the Twentieth Century (Revised ed.), New York: Routledge, ISBN 0-415-10911-6.
  • Holcombe, R. (1989), Economic Models and Methodology, New York: Greenwood Press, ISBN 0-313-26679-4. Defines model by analogy with maps, an idea borrowed from Baumol and Blinder. Discusses deduction within models, and logical derivation of one model from another. Chapter 9 compares the neoclassical school and the Austrian school, in particular in relation to falsifiability.
  • Lange, Oskar (1945), "The Scope and Method of Economics", Review of Economic Studies (The Review of Economic Studies Ltd.) 13 (1): 19–32, doi:10.2307/2296113, JSTOR 2296113. One of the earliest studies on methodology of economics, analysing the postulate of rationality.
  • de Marchi, N. B. & Blaug, M. (1991), Appraising Economic Theories: Studies in the Methodology of Research Programs, Brookfield, VT: Edward Elgar, ISBN 1-85278-515-2. A series of essays and papers analysing questions about how (and whether) models and theories in economics are empirically verified and the current status of positivism in economics.
  • Morishima, Michio (1976), The Economic Theory of Modern Society, New York: Cambridge University Press, ISBN 0-521-21088-7. A thorough discussion of many quantitative models used in modern economic theory. Also a careful discussion of aggregation.
  • Orrell, David (2007), Apollo's Arrow: The Science of Prediction and the Future of Everything, Toronto: Harper Collins Canada, ISBN 0-00-200740-1.
  • Phillips, Almarin (1955), "The Tableau Économique as a Simple Leontief Model", Quarterly Journal of Economics (The MIT Press) 69 (1): 137–144, doi:10.2307/1884854, JSTOR 1884854.
  • Samuelson, Paul A. (1948), "The Simple Mathematics of Income Determination", in Metzler, Lloyd A., Income, Employment and Public Policy; essays in honor of Alvin Hansen, New York: W. W. Norton.
  • Samuelson, Paul A. (1983), Foundations of Economic Analysis (Enlarged ed.), Cambridge: Harvard University Press, ISBN 0-674-31301-1. This is a classic book carefully discussing comparative statics in microeconomics, though some dynamics is studied as well as some macroeconomic theory. This should not be confused with Samuelson's popular textbook.
  • Tinbergen, Jan (1939), Statistical Testing of Business Cycle Theories, Geneva: League of Nations.
  • Walsh, Vivian (1987), "Models and theory", The New Palgrave: A Dictionary of Economics, 3, New York: Stockton Press, pp. 482–483, ISBN 0-935859-10-1.
  • Wold, H. (1938), A Study in the Analysis of Stationary Time Series, Stockholm: Almqvist and Wicksell.
  • Wold, H. & Jureen, L. (1953), Demand Analysis: A Study in Econometrics, New York: Wiley.

[edit] External links

Tuesday, 18 December 2012

Innovation

From Wikipedia, the free encyclopedia

  (Redirected from Economists of innovation)


Innovation is the development of new customer value through solutions that meet new needs, unarticulated needs, or old customer and market needs in new ways. This is accomplished through different or more effective products, processes, services, technologies, or ideas that are readily available to markets, governments, and society. Innovation differs from invention in that innovation refers to the use of a better and, as a result, novel idea or method, whereas invention refers more directly to the creation of the idea or method itself. Innovation differs from improvement in that innovation refers to the notion of doing something different (Lat. innovare: "to change") rather than doing the same thing better.

Contents


[edit] Etymology

The word innovation derives from the Latin word innovatus, which is the noun form of innovare "to renew or change," stemming from in- ("into") + novus ("new"). Research on the diffusion of innovation was begun in 1903 by the seminal researcher Gabriel Tarde, who first plotted the S-shaped diffusion curve. Tarde (1903) defined the innovation-decision process as a series of steps that includes:[1]
  1. First knowledge
  2. Forming an attitude
  3. A decision to adopt or reject
  4. Implementation and use
  5. Confirmation of the decision

[edit] Inter-disciplinary views

[edit] Individual

Creativity has been studied using many different approaches.

[edit] Society

Due to its widespread effect, innovation is an important topic in the study of economics, business, entrepreneurship, design, technology, sociology, and engineering. In society, innovation aids in comfort, convenience, and efficiency in everyday life. For instance, benchmark innovations in railroad equipment and infrastructure led to greater safety, maintenance, speed, and weight capacity for passenger services: the shift from wood to steel cars, iron to steel rails, stove-heated to steam-heated cars, gas lighting to electric lighting, and diesel-powered to electric-diesel locomotives. By the mid-20th century, trains were making longer, faster, and more comfortable trips at lower costs for passengers.[2] Other innovations that add to everyday quality of life include the progression of the light bulb from incandescent to compact fluorescent and then LED technologies, which offer greater efficiency, durability, and brightness; the move from modems to cellular phones and then smartphones, which supply the public with internet access at any time or place; and the shift from cathode-ray tube to flat-screen LCD televisions, among others.

[edit] Business and economics

In business and economics, innovation is the catalyst to growth. With rapid advancements in transportation and communications over the past few decades, the old-world concepts of factor endowments and comparative advantage, which focused on an area's unique inputs, are outmoded for today's global economy. Economist Joseph Schumpeter, who contributed greatly to the study of innovation, argued that industries must incessantly revolutionize the economic structure from within, that is, innovate with better or more effective processes and products, such as the shift from the craft shop to the factory. He famously asserted that "creative destruction is the essential fact about capitalism."[3] In addition, entrepreneurs continuously look for better ways to satisfy their consumer base with improved quality, durability, service, and price, ambitions that come to fruition in innovation with advanced technologies and organizational strategies.[4]
One prime example is the explosive boom of Silicon Valley startups out of the Stanford Industrial Park. In 1957, dissatisfied employees of Shockley Semiconductor, the company of Nobel laureate and transistor co-inventor William Shockley, left to form an independent firm, Fairchild Semiconductor. After several years, Fairchild grew into a formidable presence in the sector, and its founders in turn left to start companies based on their own latest ideas, followed by leading employees who started firms of their own. Over the next 20 years, this snowball process launched the momentous startup-company explosion of information-technology firms: in essence, Silicon Valley began as 65 new enterprises born out of Shockley's eight former employees.[5]

[edit] Organizations

In the organizational context, innovation may be linked to positive changes in efficiency, productivity, quality, competitiveness, market share, and other measures. All organizations can innovate, including, for example, hospitals,[6] universities, and local governments. For instance, former Mayor Martin O'Malley pushed the City of Baltimore to use CitiStat, a performance-measurement data and management system that allows city officials to maintain statistics on everything from crime trends to the condition of potholes. The system aids in better evaluation of policies and procedures, with accountability and efficiency in terms of time and money. In its first year, CitiStat saved the city $13.2 million.[7] Even mass transit systems have innovated, from hybrid bus fleets to real-time tracking at bus stands. In addition, the growing use of mobile data terminals in vehicles, which serve as communication hubs between vehicles and the control center, automatically sends data on location, passenger counts, engine performance, mileage, and other information. This tool helps to deliver and manage transportation systems.[8]
Still other innovative strategies include hospitals digitizing medical information into electronic medical records; HUD's HOPE VI initiatives, which transform severely distressed urban public housing into revitalized, mixed-income environments; the Harlem Children's Zone, which uses a community-based approach to educate local-area children; and the EPA's brownfield grants, which aid in converting brownfields into sites for environmental protection, green spaces, and community and commercial development.

[edit] Sources of Innovation

There are several sources of innovation. According to Peter F. Drucker, the general sources of innovation are various changes: in industry structure, in market structure, in local and global demographics, in human perception, mood, and meaning, and in the amount of already available scientific knowledge. Other sources include internet research, the development of people's skills, language development, and cultural background, as well as platforms such as Skype and Facebook. In the simplest linear model of innovation, the traditionally recognized source is manufacturer innovation, where an agent (person or business) innovates in order to sell the innovation. Another source, only now becoming widely recognized, is end-user innovation, where an agent (person or company) develops an innovation for their own (personal or in-house) use because existing products do not meet their needs. MIT economist Eric von Hippel identified end-user innovation as by far the most important and critical source in his classic book on the subject, Sources of Innovation.[9] In addition, the famous robotics engineer Joseph F. Engelberger asserts that innovations require only three things: (1) a recognized need, (2) competent people with relevant technology, and (3) financial support.[10] The Kline chain-linked model of innovation[11] places emphasis on potential market needs as drivers of the innovation process, and describes the complex and often iterative feedback loops between marketing, design, manufacturing, and R&D.
Innovation by businesses is achieved in many ways, with much attention now given to formal research and development (R&D) for "breakthrough innovations." R&D helps spur patents and other scientific innovations that lead to productivity growth in such areas as industry, medicine, engineering, and government.[12] Yet innovations can also be developed by less formal on-the-job modification of practice, through exchange and combination of professional experience, and by many other routes. The more radical and revolutionary innovations tend to emerge from R&D, while more incremental innovations may emerge from practice, but there are many exceptions to each of these trends.
An important factor in innovation is customers buying products or using services. Firms may therefore incorporate users in focus groups (the user-centred approach), work closely with so-called lead users (the lead-user approach), or let users adapt products themselves. The lead-user method focuses on idea generation by leading users in order to develop breakthrough innovations. U-STIR, a project to innovate Europe's surface transportation system, employs such workshops.[13] Regarding user innovation, a great deal of innovation is done by those actually implementing and using technologies and products as part of their normal activities. User innovators usually have some personal motivation driving them. Sometimes user-innovators become entrepreneurs selling their product, trade their innovation in exchange for other innovations, or see their innovation adopted by their suppliers. Nowadays, they may also choose to freely reveal their innovations, using methods such as open source. In such networks of innovation, the users or communities of users can further develop technologies and reinvent their social meaning.[14][15]

[edit] Goals/failures

Programs of organizational innovation are typically tightly linked to organizational goals and objectives, to the business plan, and to market competitive positioning. One driver for innovation programs in corporations is the achievement of growth objectives. As Davila et al. (2006) note, "Companies cannot grow through cost reduction and reengineering alone... Innovation is the key element in providing aggressive top-line growth, and for increasing bottom-line results."[16]
One survey across a large number of manufacturing and services organizations found that systematic programs of organizational innovation are most frequently driven by (in decreasing order of popularity): improved quality, creation of new markets, extension of the product range, reduced labor costs, improved production processes, reduced materials, reduced environmental damage, replacement of products/services, reduced energy consumption, and conformance to regulations.[16]
These goals vary between improvements to products, processes and services, and dispel a popular myth that innovation deals mainly with new product development. Most of the goals could apply to any organisation, be it a manufacturing facility, marketing firm, hospital or local government. Whether innovation goals are successfully achieved or otherwise depends greatly on the environment prevailing in the firm.[17]
Conversely, programs of innovation can fail. The causes of failure have been widely researched and can vary considerably. Some causes are external to the organization and outside its control. Others are internal and ultimately within the control of the organization. Internal causes of failure can be divided into causes associated with the cultural infrastructure and causes associated with the innovation process itself. Common causes of failure within the innovation process in most organisations can be distilled into five types: poor goal definition, poor alignment of actions to goals, poor participation in teams, poor monitoring of results, and poor communication and access to information.[18]

[edit] Diffusion

[Figure: Innovation life cycle (successive s-curves)]
Once innovation occurs, innovations may be spread from the innovator to other individuals and groups. It has been proposed that the life cycle of innovations can be described by the 's-curve', or diffusion curve. The s-curve maps growth of revenue or productivity against time. In the early stage of a particular innovation, growth is relatively slow as the new product establishes itself. At some point customers begin to demand the product and its growth increases more rapidly. New incremental innovations or changes to the product allow growth to continue. Towards the end of its life cycle, growth slows and may even begin to decline. In the later stages, no amount of new investment in the product will yield a normal rate of return.
The s-curve derives from the assumption that new products are likely to have a "product life": a start-up phase, a rapid increase in revenue, and eventual decline. In fact, the great majority of innovations never get off the bottom of the curve and never produce normal returns.
Innovative companies will typically be working on new innovations that will eventually replace older ones. Successive s-curves come along to replace older ones and continue to drive growth upwards. In the figure above, the first curve shows a current technology. The second shows an emerging technology that currently yields lower growth but will eventually overtake the current technology and lead to even greater levels of growth. The length of life will depend on many factors.[19]
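The s-curve described above is commonly modeled with a logistic function. A minimal sketch follows; the ceiling, midpoint, and rate parameters are illustrative assumptions, not values from the article:

```python
import math

def s_curve(t, ceiling=100.0, midpoint=10.0, rate=0.5):
    """Logistic diffusion curve: slow start, rapid middle, saturation.

    ceiling  -- the mature market size (revenue/adoption limit)
    midpoint -- time of fastest growth (the inflection point)
    rate     -- steepness of adoption
    All parameter values here are hypothetical.
    """
    return ceiling / (1.0 + math.exp(-rate * (t - midpoint)))

# Period-over-period growth at three points of the life cycle:
early = s_curve(2) - s_curve(1)    # establishing itself: slow
mid = s_curve(10) - s_curve(9)     # rapid demand-driven growth
late = s_curve(20) - s_curve(19)   # saturating market: slow again
print(early, mid, late)  # growth peaks in the middle of the cycle
```

The "successive s-curves" of the text correspond to overlaying a second such curve with a later midpoint and a higher ceiling, representing the emerging technology that eventually overtakes the current one.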

[edit] Measures

Innovation can be measured at two fundamentally different levels: the organizational level and the political level.

[edit] Organizational level

The measurement of innovation at the organizational level relates to individuals, team-level assessments, and private companies from the smallest to the largest. It can be conducted through surveys, workshops, consultants, or internal benchmarking. There is today no established general way to measure organizational innovation. Corporate measurements are generally structured around balanced scorecards covering several aspects of innovation, such as business measures related to finances, innovation-process efficiency, employees' contribution and motivation, as well as benefits for customers. Measured values vary widely between businesses, covering for example new-product revenue, spending on R&D, time to market, customer and employee perception and satisfaction, number of patents, and additional sales resulting from past innovations.[20]

[edit] Political level

At the political level, measures of innovation focus more on a country's or region's competitive advantage through innovation. In this context, organizational capabilities can be evaluated through various frameworks, such as those of the European Foundation for Quality Management. The OECD Oslo Manual (1995) suggests standard guidelines for measuring technological product and process innovation. Some consider the Oslo Manual complementary to the Frascati Manual from 1963. The new Oslo Manual from 2005 takes a wider perspective on innovation and includes marketing and organizational innovation. These standards are used, for example, in the European Community Innovation Surveys.[21]
Innovation has also traditionally been measured through expenditure, for example investment in R&D (research and development) as a percentage of GNP (gross national product). Whether this is a good measurement of innovation has been widely discussed, and the Oslo Manual has incorporated some of the critique of earlier methods of measuring. Nevertheless, the traditional methods still inform many policy decisions. The EU Lisbon Strategy has set a goal that average expenditure on R&D should be 3% of GDP.[22]
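The expenditure measure is a simple ratio. A sketch against the Lisbon Strategy's 3% target; the country names and spending figures below are entirely hypothetical:

```python
def rd_intensity(rd_spending, gdp):
    """R&D expenditure as a share of GDP, expressed in percent."""
    return 100.0 * rd_spending / gdp

# Hypothetical economies; currency units are arbitrary but must match.
countries = {"A": (45.0, 1500.0), "B": (20.0, 1000.0)}

LISBON_TARGET = 3.0  # percent of GDP, per the EU Lisbon Strategy
for name, (rd, gdp) in countries.items():
    share = rd_intensity(rd, gdp)
    status = "meets" if share >= LISBON_TARGET else "below"
    print(f"{name}: {share:.1f}% of GDP ({status} the 3% target)")
```

As the surrounding text notes, this input-side ratio says nothing about whether the spending actually produces innovation, which is part of the critique the Oslo Manual incorporates.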

[edit] Indicators

Many scholars claim that there is a great bias towards the "science and technology mode" (S&T-mode or STI-mode), while the "learning by doing, using and interacting mode" (DUI-mode) is widely ignored. For example, a firm may have the better high-tech product or software, but the learning tasks that are also crucial for innovation are rarely measured or researched.
A common industry view (unsupported by empirical evidence) is that comparative cost-effectiveness research (CER) is a form of price control which, by reducing returns to industry, limits R&D expenditure, stifles future innovation and compromises new products' access to markets.[23] Some academics claim that CER is a valuable value-based measure of innovation which accords truly significant advances in therapy (those that provide 'health gain') higher prices than free-market mechanisms would.[24] Such value-based pricing has been viewed as a means of indicating to industry the type of innovation that should be rewarded from the public purse.[25] The Australian academic Thomas Alured Faunce has developed the case that national comparative cost-effectiveness assessment systems should be viewed as measuring 'health innovation' as an evidence-based concept distinct from valuing innovation through the operation of competitive markets (a method which requires strong anti-trust laws to be effective), on the basis that both methods of assessing innovation in pharmaceuticals are mentioned in annex 2C.1 of the AUSFTA.[26][27][28]

[edit] Measurement indices

Several indexes attempt to measure innovation, including:
  • The Innovation Index, developed by the Indiana Business Research Center, to measure innovation capacity at the county or regional level in the U.S.[29]
  • The State Technology and Science Index, developed by the Milken Institute is a U.S. wide benchmark to measure the science and technology capabilities that furnish high paying jobs based around key components.
  • The Oslo Manual is focused on North America, Europe, and other rich economies.
  • The Bogota Manual, similar to the above, focuses on Latin America and the Caribbean countries.
  • The Creative Class developed by Richard Florida
  • The Innovation Capacity Index (ICI), published by a large number of international professors working collaboratively. The top scorers in the ICI 2009–2010 were: 1. Sweden (82.2); 2. Finland (77.8); and 3. United States (77.5).
  • The Global Innovation Index is a global index measuring the level of innovation of a country, produced jointly by The Boston Consulting Group (BCG), the National Association of Manufacturers (NAM), and The Manufacturing Institute (MI), the NAM's nonpartisan research affiliate. NAM describes it as the "largest and most comprehensive global index of its kind".
  • The INSEAD Global Innovation Index
  • The INSEAD Innovation Efficacy Index

[edit] Global innovation index

This international innovation index is one of many research studies that try to build a ranking of countries related to innovation. Other indexes are the Innovations Indikator, Innovation Union Scoreboard, EIU Innovation Ranking, BCG International Innovation Index, Global Competitiveness Report, World Competitiveness Scoreboard, ITIF Index. The top 3 countries among all these different indexes are Switzerland, Sweden and Singapore.[30]
The global innovation index looks at both the business outcomes of innovation and government's ability to encourage and support innovation through public policy. The study comprised a survey of more than 1,000 senior executives from NAM member companies across all industries; in-depth interviews with 30 of the executives; and a comparison of the "innovation friendliness" of 110 countries and all 50 U.S. states. The findings are published in the report, "The Innovation Imperative in Manufacturing: How the United States Can Restore Its Edge."[31]
The report discusses not only country performance but also what companies are doing and should be doing to spur innovation. It looks at new policy indicators for innovation, including tax incentives and policies for immigration, education and intellectual property.
The latest index was published in March 2009.[32] To rank the countries, the study measured both innovation inputs and outputs. Innovation inputs included government and fiscal policy, education policy and the innovation environment. Outputs included patents, technology transfer, and other R&D results; business performance, such as labor productivity and total shareholder returns; and the impact of innovation on business migration and economic growth. The following is a list of the twenty largest countries (as measured by GDP) by the International Innovation Index:
Rank  Country          Overall  Innovation Inputs  Innovation Performance
1     South Korea      2.26     1.75               2.55
2     United States    1.80     1.28               2.16
3     Japan            1.79     1.16               2.25
4     Sweden           1.64     1.25               1.88
5     Netherlands      1.55     1.40               1.55
6     Canada           1.42     1.39               1.32
7     United Kingdom   1.42     1.33               1.37
8     Germany          1.12     1.05               1.09
9     France           1.12     1.17               0.96
10    Australia        1.02     0.89               1.05
11    Spain            0.93     0.83               0.95
12    Belgium          0.86     0.85               0.79
13    China            0.73     0.07               1.32
14    Italy            0.21     0.16               0.24
15    India            0.06     0.14              −0.02
16    Russia          −0.09    −0.02              −0.16
17    Mexico          −0.16     0.11              −0.42
18    Turkey          −0.21     0.15              −0.55
19    Indonesia       −0.57    −0.63              −0.46
20    Brazil          −0.59    −0.62              −0.51
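The article does not give the index's exact formula, but composite innovation indexes of this kind are typically built by standardizing each sub-indicator across countries (z-scores) and averaging them, which is consistent with the scores above clustering around zero. The following is a hedged sketch of that general technique, with entirely made-up input data, not the BCG/NAM methodology itself:

```python
import statistics

def zscores(values):
    """Standardize one raw indicator across countries.

    Assumes the indicator is not constant across countries
    (a constant column would give a zero standard deviation).
    """
    mean = statistics.mean(values)
    sd = statistics.pstdev(values)
    return [(v - mean) / sd for v in values]

def composite(indicator_columns):
    """Average the per-indicator z-scores for each country."""
    z_cols = [zscores(col) for col in indicator_columns]
    return [sum(zs) / len(zs) for zs in zip(*z_cols)]

# Hypothetical raw data for three countries on two indicators
# (say, patents per capita and R&D share of GDP) -- illustrative only.
patents = [120.0, 80.0, 40.0]
rd_share = [3.0, 2.0, 1.0]

scores = composite([patents, rd_share])
print(scores)  # standardized composite scores, centered on zero
```

Splitting the indicators into "input" and "output" groups and averaging each group separately would yield the two sub-scores reported in the table, with the overall score combining both.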

[edit] Government policies

Given its noticeable effects on efficiency, quality of life, and productivity growth, innovation is a key factor in society and the economy. Consequently, policymakers are working to develop environments that will foster innovation and its resulting positive benefits. For instance, experts are advocating that the U.S. federal government launch a National Infrastructure Foundation: a nimble, collaborative strategic-intervention organization that would house innovation programs from fragmented silos under one entity, inform federal officials on innovation performance metrics, strengthen industry-university partnerships, and support innovation economic-development initiatives, especially to strengthen regional clusters. Because clusters are the geographic incubators of innovative products and processes, a cluster-development grant program would also be targeted for implementation. By focusing on innovation in such areas as precision manufacturing, information technology, and clean energy, other areas of national concern would be tackled, including government debt, carbon footprint, and oil dependence.[12] The U.S. Economic Development Administration understands this reality through its continued Regional Innovation Clusters initiative.[33] In addition, federal grants for R&D, a crucial driver of innovation and productivity growth, should be expanded to levels similar to those of Japan, Finland, South Korea, and Switzerland in order to stay globally competitive. Such grants should also be better directed to metropolitan areas, the essential engines of the American economy.[12]
Many countries recognize the importance of research and development as well as innovation, including Japan's Ministry of Education, Culture, Sports, Science and Technology (MEXT);[34] Germany's Federal Ministry of Education and Research;[35] and the Ministry of Science and Technology in the People's Republic of China [1]. Furthermore, Russia's innovation programme is the Medvedev modernisation programme, which aims at creating a diversified economy based on high technology and innovation. The Government of Western Australia has also established a number of innovation incentives for government departments; Landgate was the first Western Australian government agency to establish its Innovation Program.[36] The Cairns Region established the Tropical Innovation Awards in 2010, open to all businesses in Australia.[37] The 2011 Awards were extended to include participants from all tropical-zone countries.

[edit] See also

[edit] References

  1. ^ Tarde, G. (1903). The laws of imitation (E. Clews Parsons, Trans.). New York: H. Holt & Co.
  2. ^ EuDaly, K, Schafer, M, Boyd, Jim, Jessup, S, McBridge, A, Glischinksi, S. (2009). The Complete Book of North American Railroading. Voyageur Press. 1-352 pgs.
  3. ^ Schumpeter, J. A. (1943). Capitalism, Socialism, and Democracy (6 ed.). Routledge. pp. 81–84. ISBN 0-415-10762-8.
  4. ^ Heyne, P., Boettke, P. J., and Prychitko, D. L. (2010). The Economic Way of Thinking. Prentice Hall, 12th ed. Pp. 163, 317–318.
  5. ^ Gregory Gromov (2011). Silicon Valley History. http://www.netvalley.com/svhistory.html

External links