Recent trends and developments in quantitative methods show that quantitative and qualitative methods are increasingly integrated to jointly handle challenges with broad and profound impacts on the social sciences as a whole. This essay briefly introduces three recent revolutions in quantitative methods. It then shows that the challenges arising from these three revolutions are essentially the same as those facing comparative politics, such as modeling complex interdependence and dealing with fuzzy concepts and the messy real world. Finally, the essay uses a few examples of new analytical tools developed by quantitative methodologists to illustrate that qualitative knowledge and quantitative techniques should be seamlessly mixed to produce innovative and powerful methods. All this points to a common future for comparative politics and quantitative methods. Science is built on comparison; nothing can be learned without comparison.
A researcher can define her own variables and apply simulation techniques to learn the characteristics of a nonstandard, complicated variable. For quantitative methods, the new challenge is how to use variables to express concepts and patterns.
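The simulation point can be made concrete with a minimal sketch. Suppose a researcher defines a nonstandard composite indicator as the "weakest link" (the minimum) of two component scores; such a variable has no simple closed-form distribution, but simulation characterizes it directly. The component names, distributions, and aggregation rule below are invented purely for illustration.

```python
import random
import statistics

random.seed(42)

# Hypothetical composite indicator: "weakest-link" aggregation of two
# component scores (the components and their distributions are assumptions).
draws = []
for _ in range(100_000):
    econ = random.betavariate(2, 5)     # illustrative component 1
    polity = random.betavariate(5, 2)   # illustrative component 2
    draws.append(min(econ, polity))     # the indicator takes the weaker score

# Simulation recovers the indicator's characteristics numerically
print(round(statistics.mean(draws), 3))
print(round(statistics.stdev(draws), 3))
```

The same approach works for ratios, maxima, thresholded indices, or any other variable a researcher cares to define.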
Variables, concepts, and patterns are three different things existing in three different worlds: variables are statistical quantities, concepts are theoretical entities, and patterns are empirical regularities. A scientific study needs to connect the three worlds.
Without transforming concepts into variables, no statistical tools can be applied. Without representing concepts, variables and their realizations (data) are meaningless or even misleading in substantive research, and quantitative methods can generate nonsensical results. The real challenge for both quantitative methods and comparative politics is not to choose among concepts, patterns, and variables, but to connect the three and transform them into one another.
Conventionally, quantitative methods are applied after conceptualization and operationalization have been done, and comparativists proceed as if they must force fuzzy, multidimensional concepts into simple variables in order to apply quantitative methods. Quantitative analysts know more about how to specify a variable that captures the fuzziness, uncertainty, and dimensionality of a concept, but they need to work with people in the substantive fields to understand the concepts and their theoretical applications.
Phil Schmitter observed that more and more researchers use both quantitative methods and small-N analysis of carefully selected cases, as have I and many others. The task is challenging but extremely promising.
As mentioned above, the credibility revolution elevates the importance of qualitative knowledge in identifying causality in observational studies. Identification of causal effects requires qualitative methods; estimation of causal effects is then conducted with quantitative methods. The two tasks are integral to causal inference, and qualitative and quantitative methods are merged into a single method of causal inference.
The Bayesian revolution requires the researcher to specify priors. Some analysts, however, regard priors as a necessary evil for using powerful MCMC techniques, and try to minimize the influence of priors on the posteriors so that the objective likelihood (the data) completely dominates Bayesian learning.
Recognizing the value of unquantifiable information and knowledge, many others advocate embracing the opportunity provided by the Bayesian setup and using priors to incorporate the relevant information we have before we see new data.
Priors can and should be specified with existing theoretical judgments, expert knowledge, and even the intuitions of the researcher (Greenberg). Priors that fully represent our existing understandings and knowledge are valuable and helpful for advancing the cumulative process of learning. Furthermore, the Bayesian approach, with its prior specification, explicitly requires that the researcher be an expert on the subject before she analyzes the data.
If a researcher is completely ignorant about the subject, then she is not qualified to conduct the research, no matter how much data she collects or how skillful she is as a data analyst. Like the credibility revolution, the Bayesian revolution merges qualitative and quantitative methods in a single process of Bayesian updating. The big data revolution blurs the line between quantitative and qualitative methods in an even more fundamental way.
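How an informative prior shapes Bayesian updating can be sketched with the simplest conjugate case, a Beta prior on a probability updated by Binomial data. The numbers below are invented for illustration only: an expert prior centered on 0.8 meets ten new observations pointing the other way.

```python
# Conjugate Beta-Binomial updating: a minimal sketch with invented numbers.
# Informative prior Beta(8, 2) encodes an expert belief that the success
# probability is around 8 / (8 + 2) = 0.8.
prior_a, prior_b = 8, 2
successes, failures = 3, 7          # new data suggesting a much lower rate

# The posterior is Beta(prior_a + successes, prior_b + failures)
post_a, post_b = prior_a + successes, prior_b + failures
post_mean = post_a / (post_a + post_b)          # 11 / 20 = 0.55

# The same data under a flat Beta(1, 1) prior, i.e., no expert knowledge
flat_mean = (1 + successes) / (2 + successes + failures)  # 4 / 12 ≈ 0.33

print(post_mean, round(flat_mean, 3))
```

The expert's posterior (0.55) sits between prior belief and data, while the "ignorant" analyst's posterior (≈0.33) is driven almost entirely by ten observations; which answer is better depends precisely on the quality of the qualitative knowledge encoded in the prior.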
Most big data are not numeric data but texts, audio, video, images, etc. They are often unstructured and require qualitative judgments to make sense of the data and extract signals from noise. Pattern discovery is one of the major tasks in big data analysis and inspires new developments in quantitative methods. Therefore, if the conventional distinction between quantitative and qualitative methods rests on whether data are quantified, the big data revolution erases the distinction altogether.
Numerous new tools have been invented for diverse research tasks. In this section, I use some important recent developments in quantitative methods as examples to show that new tools are increasingly available to deal with these challenges, and to argue that better solutions to the shared problems come from a joint effort of comparativists and quantitative methodologists.
The research task of identification, and the tools for implementing it, such as various matching or re-sampling approaches, make it clear that in pursuing causal studies quantitative methods share the same methodological foundation as comparative case studies (Imbens and Angrist; King and Zeng; Iacus et al.). The only difference is that quantitative methods are primarily interested in the average causal effect across many matched cases, while the comparative case study is more interested in the individual causal effects of a few matched cases.
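The logic of matching is exactly the comparativist's logic of paired comparison, only scaled up and averaged. A minimal exact-matching sketch makes this visible; the units, covariate profiles, and outcomes below are entirely invented for illustration.

```python
from collections import defaultdict

# Hypothetical units: (treated?, covariate profile, outcome).
# The profiles and outcome values are invented for illustration.
units = [
    (1, ("presidential", "federal"), 4.0),
    (0, ("presidential", "federal"), 3.0),
    (1, ("parliamentary", "unitary"), 6.0),
    (0, ("parliamentary", "unitary"), 5.5),
    (1, ("presidential", "unitary"), 2.0),  # no control match: dropped
]

# Group units into strata of exactly matching covariate profiles
strata = defaultdict(lambda: {0: [], 1: []})
for treated, covariates, outcome in units:
    strata[covariates][treated].append(outcome)

# Average the treated-control contrast over strata with both groups present
diffs = []
for group in strata.values():
    if group[0] and group[1]:
        diffs.append(sum(group[1]) / len(group[1]) - sum(group[0]) / len(group[0]))
effect = sum(diffs) / len(diffs)
print(effect)  # 0.75
```

Each stratum is a small comparative case study; the quantitative method simply averages many of them, while a comparativist would scrutinize each matched pair on its own terms.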
Comparativists are good at carefully selecting cases, justifying their case selections, and carefully comparing cases for causal inference. Their training, expertise, skills, and experience are precious for causal inference pursuing average causal effects with a large number of observations.
Many political scientists, comparativists included, do field experiments, especially survey experiments. Is this a qualitative or a quantitative way of doing research? The design of an experiment needs statistical training and local knowledge, and requires quantitative methodologists and comparativists to work together on survey questions, sampling schemes (randomization or stratification), potential confounders, solutions to missing data, and so on.
Are the control group and treatment group exchangeable? These types of questions are not simply statistical or substantive questions.
As mentioned before, another powerful tool for identifying causation is graphical theory (Pearl). Graphical theory is used to find confounders, i.e., common causes of the treatment and the outcome. Graphical diagrams are drawn to help investigate which other association relations should be blocked in order to conclude that the observed association between X and Y equals the causal effect. The justification of which variables should be included to block back-door paths (confounding associations) is based mainly on subject-matter knowledge.
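What "blocking a back-door path" buys can be shown with a simulation, not a real dataset: the structural equations below are invented, with Z a common cause of X and Y (the back door Z → X, Z → Y) and a true effect of X on Y equal to 2. The naive contrast is badly biased; stratifying on Z recovers the truth.

```python
import random

random.seed(0)

# Simulated back-door structure: Z -> X, Z -> Y, X -> Y, true effect = 2.
# All coefficients and distributions are invented for illustration.
n = 20_000
data = []
for _ in range(n):
    z = 1 if random.random() < 0.5 else 0
    x = 1 if random.random() < (0.8 if z else 0.2) else 0
    y = 2 * x + 3 * z + random.gauss(0, 1)
    data.append((z, x, y))

def mean(values):
    return sum(values) / len(values)

# Naive contrast: biased, because treated units disproportionately have z = 1
naive = mean([y for z, x, y in data if x == 1]) - mean([y for z, x, y in data if x == 0])

# Blocking the back door: compare within strata of Z, then average over Z
adjusted = 0.0
for zv in (0, 1):
    stratum = [(x, y) for z, x, y in data if z == zv]
    diff = mean([y for x, y in stratum if x == 1]) - mean([y for x, y in stratum if x == 0])
    adjusted += diff * len(stratum) / n

print(round(naive, 2), round(adjusted, 2))  # naive ~3.8, adjusted ~2.0
```

The adjustment set {Z} is correct here only because we wrote the data-generating process ourselves; in real research, deciding that Z, and not some other variable, closes the back door is exactly the subject-matter judgment the text describes.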
Mediation analysis in the causal inference literature claims to investigate the causal mechanism (Imai et al.).
A causal story can only be constructed in a qualitative way, and the curiosity for causal explanations can only be satisfied with qualitative methods, such as in-depth narratives and process tracing. The recent advancement of quantitative methods on causal inference provides solid methodological rules for testing hypothesized causal relations, but without qualitative knowledge and analysis, we cannot obtain causal inferences or explanations.
The Bayesian revolution enables the researcher to specify sophisticated models. MCMC makes high-dimensional integration an easy job to do, and now we can model interdependence among a large number of units. Cross-level interactions can be nicely incorporated in Bayesian multilevel modeling.
In addition, the flexibility of Bayesian model setup and powerful MCMC tools imply that scholars can analyze horizontal and hierarchical interdependence in a single model. The big data revolution has stimulated many inventions of new analytical methods to model complexity and interdependence, such as network analysis and complex social modeling. The rapid advancement of quantitative methods highlights the crucial role of qualitative knowledge.
I illustrate this point with a few examples. Spatial modeling is perhaps the method most familiar to political scientists for modeling interdependence (Franzese and Hays). Spatial econometrics has been criticized for its pre-specified spatial weights matrix W, i.e., the matrix encoding which units are assumed to influence one another and how strongly.
Different specifications of the spatial weights matrix can generate very different inferences for all parameters in the model, not only the spatial autoregressive coefficient associated with W. Critics worry about the resulting lack of robustness of spatial modeling and are uncomfortable with the fact that the W matrix is specified with qualitative knowledge (Arbia and Fingleton; Conley; Conley and Topa). Indeed, specifying the W matrix injects important information into the model, and only experts on the subject may be able to specify and justify the W matrix with their qualitative judgments about interdependence.
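The sensitivity to W is easy to demonstrate. The sketch below (with invented data: four units arranged on a line) computes the spatial lag Wy, the key regressor in a spatial-lag model, under two defensible but different specifications of W. Since the lag differs, every estimate built on it will differ too.

```python
# Two hypothetical specifications of W for four units on a line.
# The point: the spatial lag Wy changes with the analyst's choice of W.

def row_standardize(W):
    out = []
    for row in W:
        s = sum(row)
        out.append([w / s if s else 0.0 for w in row])
    return out

def spatial_lag(W, y):
    return [sum(w * yj for w, yj in zip(row, y)) for row in W]

n = 4
y = [1.0, 2.0, 3.0, 4.0]  # invented outcome values

# Contiguity: only adjacent units influence each other
W_contig = [[1.0 if abs(i - j) == 1 else 0.0 for j in range(n)] for i in range(n)]
# Inverse distance: every unit influences every other, decaying with distance
W_dist = [[0.0 if i == j else 1.0 / abs(i - j) for j in range(n)] for i in range(n)]

lag_contig = spatial_lag(row_standardize(W_contig), y)
lag_dist = spatial_lag(row_standardize(W_dist), y)
print(lag_contig[0], round(lag_dist[0], 3))  # 2.0 vs ~2.636
```

Which specification is right is not a statistical question: only substantive knowledge about how influence actually travels among the units can adjudicate between contiguity and distance decay.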
Quantitative methods can estimate the effect of interdependence, but qualitative knowledge is necessary to know the structure of interdependence. The solution has to rely on both. The second example is network analysis, which has been increasingly applied in political science to study interdependence (Hafner-Burton and Montgomery; Ward et al.). Spatial modeling can be regarded as a special case of network analysis.
Network analysis is a descriptive tool for summarizing the characteristics of a system and its units, as well as the features of the relationships (ties) connecting units.
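One such unit-level summary is degree centrality, the share of other units a node is tied to. The sketch below computes it for a small invented network of five units; the edge list is purely illustrative, and in practice deciding what counts as a tie is itself a qualitative judgment.

```python
from collections import defaultdict

# A small hypothetical network of ties among five units (A..E).
# The edge list is invented purely for illustration.
edges = [("A", "B"), ("A", "C"), ("A", "D"), ("B", "C"), ("D", "E")]

adjacency = defaultdict(set)
for u, v in edges:
    adjacency[u].add(v)
    adjacency[v].add(u)

n = len(adjacency)
# Normalized degree centrality: share of the other n - 1 units a node touches
degree_centrality = {node: len(nbrs) / (n - 1) for node, nbrs in adjacency.items()}
print(degree_centrality["A"], degree_centrality["E"])  # 0.75 0.25
```

Unit A is tied to three of the four other units (centrality 0.75), while E is tied to only one (0.25); such descriptive summaries are the raw material on which substantive interpretation of the system's structure rests.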