This chapter is concerned with method in art history and focuses on R. G. Collingwood’s impact on Michael Baxandall’s contribution to this subject in Patterns of Intention (1985). It begins with a discussion of Baxandall’s appeal to Collingwood’s notion of “re-enactment” and Karl Popper’s “situational logic,” followed by an explanation of the Collingwoodian roots of his “triangle of re-enactment.” Taking for granted that the proper interpretation of Collingwood’s notion of “re-enactment” is in terms of Peirce’s notion of “abduction,” itself understood in terms of the “Gabbay–Woods schema,” I then offer a reading of chapter IV and of Baxandall’s analysis of Piero della Francesca’s Baptism of Christ that brings to the fore the central influence of Collingwood’s conceptions – the claim being that Baxandall’s successful application of them shows their worth.
Phonological processes tend to involve local dependencies, an observation that has been expressed explicitly or implicitly in many phonological theories, such as the use of minimal symbols in SPE and the inclusion of primarily strictly local constraints in Optimality Theory. I propose a learning-based account of local phonological processes, providing an explicit computational model. The model is grounded in experimental results that suggest children are initially insensitive to long-distance dependencies and that as their ability to track non-adjacent dependencies grows, learners still prefer local generalisations to non-local ones. The model encodes these results by constructing phonological processes starting around an alternating segment and expanding outward to incorporate more phonological context only when surface forms cannot be predicted with sufficient accuracy. The model successfully constructs local phonological generalisations and exhibits the same preference for local patterns that humans do, suggesting that locality can emerge as a computational consequence of a simple learning procedure.
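To make the expand-outward procedure concrete, here is a minimal illustrative sketch in Python; the data structures, threshold value, and function names are assumptions made for illustration, not the author's implementation.

```python
# Illustrative sketch (not the author's model): find the smallest local context
# around an alternating segment that predicts its surface form well enough.
from collections import defaultdict

def prediction_accuracy(pairs, radius):
    """pairs: (underlying_form, target_index, surface_segment) triples.
    Group targets by their context of the given radius and score how often the
    majority surface form within each context is the right one."""
    by_context = defaultdict(list)
    for underlying, i, surface in pairs:
        left = underlying[max(0, i - radius):i]
        right = underlying[i + 1:i + 1 + radius]
        by_context[(left, right)].append(surface)
    correct = sum(max(outs.count(o) for o in set(outs)) for outs in by_context.values())
    return correct / sum(len(outs) for outs in by_context.values())

def learn_context(pairs, threshold=0.95, max_radius=3):
    """Start with no context and expand outward only while prediction is too poor."""
    for radius in range(max_radius + 1):
        if prediction_accuracy(pairs, radius) >= threshold:
            return radius  # smallest (most local) adequate context wins
    return max_radius

# Toy word-final devoicing: /d/ surfaces as [t] before a word boundary "#".
data = [("bad#", 2, "t"), ("bade", 2, "d"), ("rad#", 2, "t"), ("rade", 2, "d")]
print(learn_context(data))  # -> 1: one segment of right-hand context suffices
```

The sketch makes the locality preference explicit: the learner returns the most local generalisation that reaches the accuracy threshold, widening the window only when a narrower one fails.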
In abductive reasoning, scientific theories are evaluated on the basis of how well they would explain the available evidence. There are a number of subtly different accounts of this type of reasoning, most of which are inspired by the popular slogan 'Inference to the Best Explanation.' However, these accounts disagree about exactly how to spell out the slogan so as to avoid various problems for abductive reasoning. This Element aims, firstly, to give an opinionated overview both of the many accounts of abductive reasoning that have been proposed and of the problems that have motivated them; and, secondly, to critically evaluate these accounts in a way that points toward a systematic view of the nature and purpose of abductive reasoning in science. This title is also available as Open Access on Cambridge Core.
This chapter begins by differentiating qualitative and quantitative research. While some have argued that these approaches are incommensurable paradigms, this chapter argues that they are commensurable but suited to answering different research questions. It introduces a typology of research questions, with six types of question – three qualitative (describing phenomena, theoretical framing, and generating explanations) and three quantitative (measuring phenomena, testing theory, and exploring explanations). The chapter ends by reviewing heuristics to help researchers generate novel and productive research questions.
Time is an active element in a communitarian theory of WTO law. Across the passage of time, the idea-complexes of obligations and rights identified in previous chapters interact, bringing about law in a third overarching idea-complex. This chapter examines how this third idea-complex takes the form of a sui generis legal system generating transformative justice. Here the law focuses on the present and is reasoned abductively according to the best inference consistent with current knowledge. Notwithstanding this reconciliation, the transformations required and induced by it are profound. They demand that actors pay attention to interests other than their own. They also demand that actors conceive of and conform their behavior in light of that transformed interest. In WTO law this interest co-exists uneasily with the sovereignty of states so that there is a persistent tension between individual member interests and the collective interest of the membership. Outcomes of WTO disputes often manifest this basic tension.
Many people have had the experience of thinking with pleasure, or of the pleasure of thinking, but can we substantiate the idea that thinking can be pleasurable? This first chapter explores non-psychological productions to search for evidence of the pleasure of thinking in the visual arts, in philosophy – mainly in the writings of Baruch Spinoza and Hannah Arendt – and in the self-writing of a few authors, among them Sigmund Freud and the novelist Chaim Potok. This makes it possible to identify five modalities of the pleasure of thinking. The chapter then sets out the theoretical frame for the book, a sociocultural and developmental psychology, and highlights its ontological and epistemological axioms. Two concepts are defined: on the one hand, the concept of thinking, including reasoning, sense-making, and daydreaming; on the other hand, the concept of pleasure. The methodology used for the book is then described – an abductive integration of theoretical work with the secondary analysis of existing work and of data collected in our past work. Finally, the outline of the volume is presented.
This chapter details the practical, theoretical, and philosophical aspects of experimental science. It discusses how one chooses a project, performs experiments, interprets the resulting data, makes inferences, and develops and tests theories. It then asks the question, "are our theories accurate representations of the natural world, that is, do they reflect reality?" Surprisingly, this is not an easy question to answer. Scientists assume so, but are they warranted in this assumption? Realists say "yes," but anti-realists argue that what we call reality is simply a mental representation of the world as we perceive it, that is, metaphysical in nature. Regardless of one's sense of reality, the fact remains that science has been and continues to be of tremendous practical value. It would have to be a miracle if our knowledge and manipulation of nature were not real. Yet even if they are real, how do we know they are true in an absolute sense, and not just relative to our own experience? This is a thorny philosophical question, the answer to which depends on the context in which it is asked. The take-home message for the practicing scientist is "never assume your results are true."
Prior research has shown the importance of latent user needs for enabling innovation in the early phases of product development. The success of a product depends largely on the extent to which it satisfies customer needs, and latent user needs play a significant role in how a product or service unexpectedly delights the user. Complications arise because traditional need-finding methods cannot account for the nuances of latent user needs. A user's need is multidimensional, while traditional methods are built on deductive reasoning: they isolate parts of the user's needs and point only to what is deducible within their search space. To address this, we introduce abduction as a way to broaden traditional need-finding methods. Through a logic-based argument, it is shown that abduction accounts for the dimensionality of user needs by integrating various traditional need-finding theories, using design knowledge to isolate the latent need. This theoretical development shows that latent need finding must go beyond a deductive focus toward methods that can conjecture with the deduced facts in order to abduce the latent user need.
Design process descriptions in the literature in general, and those using C-K theory in particular, lack some useful cognitive information that may affect the credibility of the process. Notions from abduction research are presented and proposed for enhancing such descriptions. Specifically, it is important to distinguish between design activities that are intuitive and those that result from deliberation, a topic that has long been discussed by philosophers of science and design scholars. The focus of the paper is on the ubiquitous design moves of proposing an idea and selecting among ideas, and on their execution by expert and novice designers.
Macro-causal analysis pivots between exploring patterns of change and exploring causal patterns. It employs a bounded conception of history, keeping causal factors constant, but expands the length of causal chains to explore how their temporal order affects causal outcomes. It analyzes this causal order in terms of elements of physical time: timing, sequencing, tempo, and duration. In paying attention to these elements, it in effect unfreezes physical time, which defines the existing linear notions of causality. Macro-causal analysis relaxes these linearity assumptions by expanding the analysis from what Pierson called short–short to short–long, long–short, and long–long explanations. It thus recognizes that theories rest on temporal assumptions and that understanding those assumptions invites exploring backgrounded causal factors that can help update theories. This updating process is abduction, which ultimately makes hypotheses more testworthy. Besides elongating causal chains, this type of analysis also elongates outcomes by paying attention to a range of near-miss outcomes that are frequently overlooked but often provide important new inductive insights.
Various CHA scholars have contributed to causal process tracing and helped establish it as a principal alternative to VBA. A recent, Bayesian-informed version is particularly relevant to CHA because it shares the same historiographical sensibilities of seeing knowledge evolve through a close dialogue between new findings and existing foreknowledge. It makes causal inferences conditional on a pre-testing articulation of testworthy hypotheses and juxtaposes new test results onto older ones at the post-testing stage. Process tracing defines the testworthiness of hypotheses according to the number and diversity of empirical implications a theory has before the testing starts. It also defines test strength in terms of the specificity of the hypotheses that are tested and of their uniqueness vis-à-vis the alternative explanations against which they are paired. The chapter introduces a new tool, the theory ledger, to help evaluate test strength and to update confidence in causal inferencing.
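As general background (a standard formulation, not a formula taken from the chapter), the Bayesian logic behind such confidence updating can be written in odds form, where the likelihood ratio expresses how specific and unique a piece of evidence E is to hypothesis H relative to its rivals:

\[
\frac{P(H \mid E)}{P(\neg H \mid E)} = \frac{P(E \mid H)}{P(E \mid \neg H)} \cdot \frac{P(H)}{P(\neg H)}
\]

A test is strong in this sense when the likelihood ratio is far from 1, that is, when the predicted evidence would be expected under H but not under the alternative explanations against which it is paired.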
In 1592 Japan’s Toyotomi Hideyoshi mobilized a massive force and invaded Chosŏn Korea. Upon landing in Korea, the Japanese slaughtered almost all the Koreans they could find in the fort of Pusan – an atrocity they called “a festival of blood.” In the sixth month of 1593 the Japanese attacked a local town (Chinju) and killed thousands of Koreans in it. Why did the Japanese troops keep committing atrocities until their invasion ended in failure in 1598? In addition to the acts of brutality they had long practiced in their own country, the Japanese troops grew frustrated as their war of invasion fell apart, and they directed their anger and vexation at the Korean people. A widespread genocide took place in the second half of 1597, when the Japanese resumed a massive attack on Chosŏn’s southern provinces. Hideyoshi was livid that the Koreans who staged “rebellions” in defense had spoiled his military campaign. Hideyoshi ordered his generals: “I will send more troops year after year, kill Koreans one by one, and empty their country.” The freewheeling atrocities the Japanese troops committed in Chosŏn reflected Hideyoshi’s senseless and ruthless push for Chosŏn’s submission to his authority, which he thought was boundless.
Lakshmi Balachandran Nair, Libera Università Internazionale degli Studi Sociali Guido Carli, Italy; Michael Gibbert, Università della Svizzera Italiana, Switzerland; Bareerah Hafeez Hoorani, Radboud University Nijmegen, Institute for Management Research, The Netherlands
In this chapter, we discuss the fundamentals of case study research. First, we discuss the research functions underlying research questions (i.e. exploratory and explanatory). Different types of explanatory research questions catering to these research functions are discussed next. Depending on the focal variable, these questions are termed X-centered (focusing on the independent variable X), Y-centered (focusing on the dependent variable Y), and X&Y-centered (focusing on both X and Y). The importance of the context (Z) is discussed afterwards. The modes of logical reasoning underpinning case study research (i.e. induction, deduction, and abduction) are also discussed. The chapter also addresses some common (mis)conceptions regarding research functions, research questions, research context, and logical reasoning.
A copy of the Ancient History until Caesar today in the Vatican includes an early depiction of the Rape of the Sabine Women, a violent event from the early history of Rome (more on this in Section 6.2). It was probably produced in Genoa or Naples around 1300. The scene is placed in front of a classicizing colonnade with slender columns and depressed arches, potentially an allusion to the antique historical setting. The imagery is minimalist and remains two-dimensional, and compositional decisions limit the exposition of gender-related violence. Altogether there are five pairs of women and men equally distributed under the arches of the colonnade (Figure 6.1). The coifs, hats, and robes of the men, together with the circlets and dresses of the women, transpose the event into an aristocratic–patrician milieu: We witness what noblemen do to noblewomen. The image has a strong symmetrical structure: The same compositions are repeated on the sides, around the central image. There appears to be a sort of amplification of the gestures. On the flanks, the Roman aggressor embraces the shoulders of the Sabine victim and perhaps touches her breasts. The women here raise their arms with the palms turned towards the outside, which may indicate acceptance. One would even be tempted to say that they smile – the approach is welcomed. Quite the opposite, in the inner couples the men clearly hold the women’s wrists, which signals coercion and the use of force. In the center the female victim is embraced; the hands of Romulus (?), who wears a golden hat, rest on her back and caress her chin. She does not reciprocate the gesture, which may express her rejection of the imposed intimacy. In any case, the image shows a mixture of negative and positive reactions to the abduction. It raises the major question of the illustrations that accompany romances: whether or not the female protagonists consented to their capture and the ensuing sexual intercourse. The stories of Helen of Troy, the Sabine women, or Lucretia all revolve around this central issue. Produced primarily for the aristocratic and financial elite in Italy, these French texts and their imagery provide an important backdrop to the communal condemnation of sexual violence and point toward the emergence of eroticized representation after 1400.
The terms “inference to the best explanation” and “abduction” are often used interchangeably. Yet inference to the best explanation is concerned mostly with researchers’ choice of explanations based on findings generated from data, and belongs to the last stage of an empirical study. In contrast, abduction is about formulating hypotheses and selecting the more promising ones to test before or after data collection, and thus corresponds to an earlier stage of inquiry. Given a set of data collected on a phenomenon, there are no systematic steps for reaching the best explanation of the phenomenon. That said, this concluding chapter suggests seven heuristics, derived from Chapters 5 to 7, aimed at assisting management researchers in explaining their findings. Constructing an explanation involves a great deal of judgement and decision making, and it requires imagination and intuition. The process arouses a variety of feelings and emotions. In short, explaining management phenomena is not just a scientific endeavor but also an art.
Chapter 5 – Heuristics and Positions: A Framework for Analysing Discourses of Humanisation – discusses what is necessary from a methodological perspective to analyse the appearance of the individual human being in global politics. To this end, the chapter develops an interpretative methodology and presents in more detail the abductive research logic and the method employed in the case studies. The book draws on positioning analysis and introduces it to the toolbox of IR scholars. The chapter also outlines the research design and its operationalisation for the case studies. The analytical framework makes it possible to overcome methodological individualism by studying the appearance of the individual human being in global politics rather than studying individual human beings. In doing so, the individual human being is made analytically accessible to scholars of global politics.
This chapter introduces Bayesian probability and the rules of probability theory, emphasizing that Bayesian probability emerges as the uniquely consistent extension of deductive (Boolean) logic to contexts of uncertainty and incomplete information.
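For reference, the rules in question and the theorem that follows from them can be stated as follows (a standard formulation, not an excerpt from the chapter), with I denoting background information:

\[
P(A, B \mid I) = P(A \mid B, I)\,P(B \mid I), \qquad P(A \mid I) + P(\neg A \mid I) = 1,
\]
\[
P(H \mid D, I) = \frac{P(D \mid H, I)\,P(H \mid I)}{P(D \mid I)}.
\]

The product and sum rules are the two basic rules of probability theory; Bayes' theorem is their immediate consequence and governs how a hypothesis H is updated in light of data D.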
This chapter examines the history of the concept of “aesthetics” across multiple disciplines. It concludes with recommendations for conceiving and investigating aesthetics as a cross-cultural endeavor that does not privilege Western ways of thinking about aesthetics and art.
Literature suggests that people typically understand knowledge by induction and produce knowledge by synthesis. This paper revisits the various modes of reasoning – explanatory abduction, innovative abduction, deduction, and induction – that earlier researchers have proposed as crucial modes of reasoning underlying the design process. First, the paper expands earlier work on abductive reasoning – an essential mode of reasoning involved in the process of synthesis – by examining its role with the help of the “SAPPhIRE” model of causality. The explanations of abductive reasoning in design using the SAPPhIRE model are compared with those using existing models. Second, the paper captures and analyzes the various modes of reasoning during design synthesis with the help of the “Extended Integrated Model of Designing”. The analysis of participants' speech and outcomes shows the model's ability to explain the various modes of reasoning that occur in design. The results indicate that the above models provide a more extensive account of reasoning in design synthesis. Earlier empirical validation of both models lends further support to the claim of their explanatory capacity.
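As a generic Peircean schematic (not the paper's SAPPhIRE formalization, and using made-up propositions), the contrast between the modes of reasoning can be sketched as follows:

```python
# Toy contrast of the reasoning modes named above (illustrative only).
rule = ("heat_applied", "water_boils")  # a known principle: cause -> effect

def deduction(cause, known_rule):
    """Known rule + known cause => guaranteed effect."""
    return known_rule[1] if cause == known_rule[0] else None

def explanatory_abduction(effect, known_rules):
    """Observed effect + known rules => hypothesized cause(s), not guaranteed."""
    return [cause for cause, eff in known_rules if eff == effect]

def innovative_abduction(desired_effect):
    """Desired effect alone => hypothesize both a new principle and its cause,
    i.e. propose a candidate working principle to be tested in the design."""
    hypothetical_cause = "some_new_means"  # placeholder for a design idea
    return (hypothetical_cause, desired_effect)

print(deduction("heat_applied", rule))               # water_boils
print(explanatory_abduction("water_boils", [rule]))  # ['heat_applied']
print(innovative_abduction("water_boils"))           # ('some_new_means', 'water_boils')
```

The key difference is that deduction only applies known rules, explanatory abduction guesses a cause from a known rule, and innovative abduction must conjecture the rule itself.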
This thesis aims to develop a domain-independent system for repairing faulty Datalog-like theories by combining three existing techniques: abduction, belief revision, and conceptual change. Accordingly, the proposed system is named the ABC repair system (ABC). Given an observed assertion and a current theory, abduction adds axioms that explain the observation by making the corresponding assertion derivable from the expanded theory. Belief revision incorporates a new piece of information that conflicts with the input theory by deleting old axioms. Conceptual change uses the reformation algorithm to block unwanted proofs or unblock wanted ones. The former two techniques change an axiom as a whole, while reformation changes the language in which the theory is written. These three techniques are complementary, but they have not previously been combined into one system. We align these three techniques in ABC, which is capable of repairing logical theories with better results than each individual technique alone. In addition, ABC extends abduction and belief revision to operate on preconditions: the former deletes preconditions from rules, and the latter adds preconditions to rules. Datalog is used as the underlying logic of theories in this thesis, but the proposed system has the potential to be adapted to theories in other logics.
Abstract prepared by Xue Li, drawing directly from the thesis.
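For orientation only, a minimal propositional sketch of the add/delete repair loop described in the abstract is given below; the function names and the toy theory are invented for illustration, and reformation (which rewrites the theory's language) and the precondition extensions are omitted.

```python
# Toy repair loop (illustrative, not the ABC system): a theory is a set of
# facts plus propositional rules of the form (head, frozenset_of_body_atoms).
def closure(facts, rules):
    """Forward-chain to the set of all derivable atoms."""
    derived, changed = set(facts), True
    while changed:
        changed = False
        for head, body in rules:
            if body <= derived and head not in derived:
                derived.add(head)
                changed = True
    return derived

def repair(facts, rules, true_obs, false_obs):
    """Abduction: add axioms until every true observation is derivable.
    Belief revision: delete axioms until no false observation is derivable."""
    facts = set(facts)
    for obs in true_obs:  # abduction (simplest form: assert the observation itself)
        if obs not in closure(facts, rules):
            facts.add(obs)
    for obs in false_obs:  # belief revision: retract an axiom supporting the proof
        while obs in closure(facts, rules):
            culprit = next(f for f in facts if obs not in closure(facts - {f}, rules))
            facts.discard(culprit)
    return facts

rules = [("penguin_flies", frozenset({"penguin", "bird_flies"}))]
print(repair({"penguin", "bird_flies"}, rules,
             true_obs={"swims"}, false_obs={"penguin_flies"}))
```

In this toy run the missing observation "swims" is abduced as a new fact, and one of the axioms supporting the unwanted conclusion "penguin_flies" is retracted, mirroring the add/delete division of labour described in the abstract.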