For much of its modern history, linguistics has taken an ontological stance on language as a structural entity, with a wide set of implications for how languages are understood as bounded entities. This is not about the different epistemological approaches to a structural version of language taken by various schools of linguistics, but about the basic ontological assumptions about what language is. A structural ontology made it possible to treat language as an object amenable to scientific study, enabling descriptions of languages around the world and facilitating many advances in our understandings of languages as structural entities. Yet this very tendency towards seeing languages as autonomous systems has enabled those forms of thinking that emphasize boundedness. When we contrast a structural ontology with a practice ontology, where the focus is on what people do with available linguistic resources, it becomes clear that in some of the recent translanguaging debates, people are talking about different things: language as structure and language as practice. Because structural and social (practice) language ontologies are so different, the debates about translanguaging have become mired in misunderstandings.
The historical relationship between linguistics and applied linguistics, one producing knowledge about language and the other applying it to real-world contexts, creates a hierarchy of linguistic knowledge, with linguistic knowledge at the top, everyday views on language at the bottom and applied linguistics somewhere in the middle, mediating between the two. This relationship has started to shift as applied linguists have sought to develop their own views of language based on their engagements with language users and contexts. A key framework for this book is a form of critical social realism that allows for more than one reality, grounds epistemologies in social relations and takes a critical-ethical position on choosing between different versions of the world. Central to this discussion are questions of ontology – what language is – and the ontological turn in the social sciences. Alongside ontological questions about what languages are, a related concern is whose version of language counts. Various ways of getting at this, from lay, folk and citizen linguistic perspectives, have emphasized this need to include knowledge of language from outside the disciplinary confines of linguistics. A practical theory of language surely needs a strong relationship with how language users think about language.
This chapter traces the rise and decline of the conviction that Britain was a nation uniquely hospitable to refugees and especially proud of its longstanding traditions of asylum. In fact, social attitudes and state policies towards migrants fluctuated dramatically throughout the modern era, triggered by a variety of controversial or destabilizing events ranging from dynastic royal marriages to continental revolutions and international conflicts. By 1905 Britain had passed its first general immigration control act, providing a framework governing who could be permitted to enter the country and under what conditions, regulations that have continued to be refined and extended to this day. Yet this legislation also took up issues about statehood and citizenship that can be traced back to the French Revolution in 1789 and which reverberated throughout the nineteenth and twentieth centuries, issues that have been represented and explored in a wide range of literary and cultural genres by writers and artists including Charlotte Smith, Charles Dickens, Henry James, Israel Zangwill, Agatha Christie, and Iris Murdoch. By 1960 a new post-war recognition of the scale of what was increasingly recognized as a fully global migration crisis had changed these insular local debates forever.
In this essay I consider what myths are, and provide a very short sketch of Darwin’s life and work. I also suggest some possible reasons for the mythology that has grown up around him, paving the way for the chapters to follow.
The history of government presents six lessons. Government has been crucial to building a nation truer to its fundamental values, but it has taken both government and markets to do this. Government is big and complex in response to changes in the economy that sent the country in directions inconsistent with its national values. Despite its size and complexity, government undertakes the same three essential functions it has undertaken throughout the history of the country: it has been necessary to create, sustain, and expand markets; to protect people from economic loss and physical injury; and to maintain a social safety net for people mired in poverty due to age, health, or market conditions not of their own doing. Finally, the pathway towards achieving America’s fundamental political values has been littered with mistakes and regrets. What makes us a nation has changed to encompass those who have been excluded and marginalized.
The book concludes on a positive and action-oriented note. We highlight the role of hope in our society and how it can benefit suicide prevention efforts. The chapter summarizes how the book has accomplished three aims: (1) providing a biopsychosocial perspective on suicide and prevention, (2) serving as a public health intervention by increasing awareness, debunking myths, and destigmatizing suicide, and (3) helping readers shift to a perspective of hope and optimism. Capitalizing on this optimism, the chapter then focuses on increasing agency and offering concrete action steps, so that readers feel they can be part of suicide prevention.
Ordinary civilians are assumed to panic or freeze in crises, but research has shown that this is a myth. In many crises, civilians provide life-saving help to those in need. They may even form emergent groups, temporary organizations that become involved in crisis response activities. Their actions can be of major importance to crisis response efforts, but professionals are often reluctant to include volunteers in formal crisis structures, out of distrust and because it requires considerable adaptation. By excluding volunteers, responders can be sure that trained professionals provide high-quality support to affected communities. The attitude of frontline responders to volunteers thus poses a dilemma. It is important to anticipate the presence of well-intentioned volunteers and build relations with them, so that their skills and intentions can be rapidly identified and potential coordination can be established early on. Civilians can be given a variety of tasks, depending on the crisis, but this should not foreclose the recognition of their possible victimhood. Open engagement enables the adaptive incorporation of civilians into frontline crisis response efforts.
John Gould’s father was a gardener. A very, very good one – good enough to be head of the Royal Gardens at Windsor. John apprenticed, too, becoming a gardener in his own right at Ripley Castle, Yorkshire, in 1825. As good as he was at flowers and trees, birds became young John Gould’s true passion early in life. Like John Edmonstone, John Gould (1804–1881) adopted Charles Waterton’s preservation techniques that kept taxidermied bird feathers crisp and vibrant for decades (some still exist in museums today), and he began to employ the technique to make extra cash. He sold preserved birds and their eggs to fancy Eton schoolboys near his father’s work. His collecting side-hustle soon landed him a professional post: curator and preserver of the new Zoological Society of London. They paid him £100 a year, a respectable sum for an uneducated son of a gardener, though not enough to make him Charles Darwin’s social equal (Darwin initially received a £400 annual allowance from his father plus £10,000 as a wedding present).
Darwin claimed that On the Origin of Species by Means of Natural Selection, or the Preservation of Favoured Races in the Struggle for Life was only an “abstract” of that much longer book he had begun to write in 1856, after his irreverent meeting with J. D. Hooker, T. H. Huxley, and T. V. Wollaston, and Lyell’s exasperated encouragement in May. But he never completed that larger book. Instead, he worked on plants and pigeons and collected information through surveys from other naturalists and professional specimen hunters like Alfred Russel Wallace for the better part of a decade.
For all their scientific prowess and public renown, there is no comparable Lyell-ism, Faraday-ism, Einstein-ism, Curie-ism, Hawking-ism, or deGrasse-Tyson-ism. So, there must be something even more powerful than scientific ideas alone caught in the net of this ism attached to Darwin. And whatever the term meant, it’s fair to say that Darwinism frightened Bryan.
Historian Everett Mendelsohn was intrigued. In the middle of writing a review of an annual survey of academic publications in the History of Science, he marveled that an article in that volume contained almost 40 pages’ worth of references to works on Darwin published in just the years between 1959 and 1963. Almost 200 works published in a handful of years – no single figure in the history of science commanded such an impressive academic following. Yet Mendelsohn noted that, paradoxically, no one had written a proper biography of Darwin by 1965. Oh sure, there was commentary. Lots of commentary. But so many of the authors were retired biologists with a tendency toward hagiography or, at the opposite extreme, axes to grind.
Meeting the “White Raja of Sarawak” in Singapore in 1853 had been a stroke of luck. Honestly, it could have been a major turning point in what had been an unlucky career so far for 30-year-old collector Alfred Russel Wallace (1823–1913) (Figure 4.1). But the steep, rocky, sweaty climb up Borneo’s Mt. Serembu (also known as Bung Moan or Bukit Peninjau) in the last week of December 1855 wasn’t exactly what Wallace expected. His eyeglasses fogged in the humidity. Bamboo taller than buildings crowded the narrow path. Near the top, the rainforest finally parted. But it revealed neither a temple nor some sort of massive colonial complex with all the trappings of empire worthy of a “raja.” Instead, there leaned a modest, very un-colonial-ruler-like white cabin. When he saw it, Wallace literally called it “rude.”
Charles Darwin spent nearly the whole of his writing career attempting to convince his colleagues, the general public, and, by extension, you and me, that change occurs gradually. Tiny slivers of difference accumulate over time like grains of sand in a vast hourglass. Change happens, in other words. It’s painfully slow, but it’s inevitable. By implication, two organisms that look different enough to us to be classified as separate species share, many tens of thousands or even millions of generations back, the same ancestors. (Inbreeding means we don’t even need to go back quite that many generations to demonstrate overlap, but you get the point.) But change that gradual means, as Darwin himself well recognized, that looking for “missing links” would be a pretty silly errand. Differences between one generation and the next look to our eyes just like common variation. It’s one grain falling from the top of the hourglass to the bottom. You can’t perceive the change. You would have to go back in time to find the very first individuals who possessed a particular trait – bat-like wings, say, or human-ish hands – and then, turning to their parents, you would see something almost identical.
Transmutation. “Evolutio,” if you wanted to be fancy and Latin about it. Whatever you want to call it, the grand unrolling of one type into another, connecting all living things into a single tree of life, was all the rage among the society gentlemen. James Burnett, Lord Monboddo, an influential Scottish judge in the 1700s, had said shocking things about it. Monboddo’s metaphysics separated humans from brutes by only the thinnest slice of cognition. And imagine how he scandalized the chattering classes when, according to rumor anyway, he suggested perhaps tails even lingered, dangling from the spines of the underdeveloped. They called him an “eccentric,” a fusty, argumentative judge and a voracious reader. Perhaps too learned – genius and madness, you know.
The Good News finally snagged him. In late September 1881, he was near the end, bedridden, languishing in a soft purple robe, still able to read, though he always preferred to be read to. Lady Hope entered the drawing room at the top of the stairs quietly, respectfully, as the golden hour gently illuminated corn fields and English oak forests through his picturesque bay window. The faintest crown of white hair encircled his head in the late afternoon light; the rest was wizardly beard (Figure 6.1). Lady Hope, the well-known evangelist, was visiting the Darwins, and she approached the old scientist cautiously. But she needn’t have. In his wrinkled hands he held the Bible, open to the New Testament Epistle to the Hebrews. “The Royal Book,” Darwin called it, serenely, mentioning a few favored passages.
The stone is still there in the garden. That’s what gets me. It’s not the house itself – houses decay slowly and can be preserved pretty easily, especially in Britain where even an eighteenth-century country house is not “old.” It’s not even the tree behind the house, alive when Charles Darwin still lived in his Down House, now propped up by guywires against inevitable collapse as a kind of totem of the great naturalist’s existence. If you leave the rear exit, the one that takes you to Darwin’s preserved greenhouse and the stunning flora on a pretty path lined in that particular English way of making the perfectly manicured seem somehow “natural,” you might glance to the left and see behind a small iron fence a one-foot-wide stone. A round mill stone or pottery wheel, it was, or appears to have been.
The legend of Charles Darwin has never been more alive or more potent, but by virtue of this, his legacy has become susceptible to myths and misunderstandings. Understanding Charles Darwin examines key questions: What did Darwin's work change about the world? In what ways is 'Darwinism' reflective of Darwin's own views? What problems were left unsolved? In our elevation of Darwin to this iconic status, have we neglected to recognise the work of other scientists? The book also examines Darwin's struggle with his religious beliefs, considering his findings and whether he was truly an atheist. In this engaging account, Peterson paints an intimate portrait of Darwin drawn from his own words in private correspondence and journals. The result is the Darwin you never knew.
Edited by
Ben Kiernan, Yale University, Connecticut; T. M. Lemos, Huron University College, University of Western Ontario; Tristan S. Taylor, University of New England, Australia
General editor
Ben Kiernan, Yale University, Connecticut
A question not asked to this point in the study of genocide by the scholars associated with this work across the various disciplines is whether there is something inherent in the very social construction we call “religion” that lends itself, adapts itself, all too easily to those communities—both nation-states and non-nation-state actors—that perpetrate genocide, either in actuality or in potential. Thus, this contribution begins with something of a theoretical look-see vis-à-vis that nexus between religion and genocide by suggesting applicable definitions for both and further outlining the constituent factors of each. (NB: There are, in truth, uncomfortable similarities between religious groups and genocidal perpetrator groups which, to my understanding, have never been addressed or explored.) To further bolster my overall argument—that religion, however defined and understood, is a “participating factor” (my preferred term) in all genocides, both historically and contemporarily—a series of case studies, using Raphael Lemkin’s tripartite division from his incomplete History of Genocide—Antiquity, Middle Ages, Modern Times—is examined to determine whether my thesis holds.
Effective methods for training and education in the dissemination of evidence-based treatments are a priority. This commentary provides the perspectives of doctoral clinical psychology graduate student authors on common myths about cognitive behavioural therapy (CBT). Three myths were identified and considered: (1) CBT does not value the therapeutic relationship; (2) CBT is overly rigid; and (3) exposure techniques are cruel. Graduate students were engaged in a competency-based course in Cognitive Behavioural Approaches to Psychotherapy at an American Psychological Association (APA)-accredited doctoral clinical psychology program. The origins of the common myths identified by graduate students included a lack of in-depth coverage of CBT and the brief video segments provided during introductory courses, lived experience with CBT, and pre-determined views of manualized treatment and exposure techniques. Myth-addressing factors discussed by graduate students included holding space at the start of training for a discussion of attitudes about CBT, specific learning activities, and the course content described in this commentary. Finally, self-reported changes in graduate students’ attitudes and behaviour following the course included a more favourable view of CBT as valuing the therapeutic relationship, as well as implementation of the resources provided and the techniques learned and practised in practicum settings. Limitations and lessons learned are discussed through the lens of a model of adult learning that may be applied to future graduate training in evidence-based therapies such as CBT.
Key learning aims
(1) To understand common myths about cognitive behavioural therapy (CBT) that doctoral students in clinical psychology hold prior to entering a course in CBT.
(2) To understand the possible origins of these myths, factors that may address their impacts, and changes in attitudes and behaviour among graduate students as a result.
(3) To examine the lessons learned that can be applied to future training in evidence-based therapies like CBT.