We consider compactification of low-energy string theory, mostly in the supergravity regime and mostly for the heterotic case, and we discuss the conditions for obtaining N = 1 supersymmetry in four dimensions. We review topology issues, in particular the relation of spinors to holonomies, Kähler and Calabi–Yau manifolds, cohomology, homology, and their relation to mass spectra in four dimensions. We explain the moduli space of a Calabi–Yau manifold, including the Kähler moduli and the complex structure moduli. We then consider new features of the type IIB and heterotic E8 × E8 models.
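As a brief worked illustration of how these data fix the four-dimensional spectrum (a standard textbook example, not specific to this chapter's conventions): unbroken N = 1 supersymmetry requires a covariantly constant spinor on the internal six-manifold $K$,
$$\nabla_m \eta = 0 \;\Longrightarrow\; \mathrm{Hol}(K) \subseteq SU(3),$$
so $K$ is a Calabi–Yau threefold, and its Hodge numbers $h^{1,1}$ and $h^{2,1}$ count the Kähler and complex structure moduli. For the quintic hypersurface in $\mathbb{CP}^4$ one has $(h^{1,1}, h^{2,1}) = (1, 101)$, hence one Kähler modulus, 101 complex structure moduli, Euler number $\chi = 2(h^{1,1} - h^{2,1}) = -200$, and, with the standard embedding of the spin connection in the gauge group, $|\chi|/2 = 100$ net generations in the heterotic $E_8 \times E_8$ model.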
Geographic information systems (GIS) are discussed, encompassing database management, geodatabases, the data structure of geographic features, topological data structures, geographic data models and their types, Earth datums, map projections, map scale, geoprocessing and geovisualization, delineation of drainage areas and streams, and derivation of hydrologic parameters using GIS.
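The drainage-delineation and parameter-derivation steps mentioned above can be illustrated with a short, self-contained sketch. This is a generic D8 flow-direction and flow-accumulation routine on a made-up 3 × 3 elevation grid, not code from the chapter:

```python
# Illustrative sketch of D8 drainage analysis (hypothetical data, not from the chapter).
import numpy as np

def d8_flow_direction(dem):
    """Map each cell to its steepest-descent neighbour (None for pits/outlets)."""
    nrows, ncols = dem.shape
    flow_to = {}
    for r in range(nrows):
        for c in range(ncols):
            best_slope, target = 0.0, None
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    rr, cc = r + dr, c + dc
                    if (dr, dc) == (0, 0) or not (0 <= rr < nrows and 0 <= cc < ncols):
                        continue
                    slope = (dem[r, c] - dem[rr, cc]) / np.hypot(dr, dc)
                    if slope > best_slope:
                        best_slope, target = slope, (rr, cc)
            flow_to[(r, c)] = target
    return flow_to

def flow_accumulation(dem, flow_to):
    """Count the cells draining through each cell (process cells from high to low)."""
    acc = np.ones(dem.shape, dtype=int)                  # each cell contributes itself
    for cell in sorted(np.ndindex(dem.shape), key=lambda rc: dem[rc], reverse=True):
        downstream = flow_to[cell]
        if downstream is not None:
            acc[downstream] += acc[cell]
    return acc

dem = np.array([[9.0, 8.0, 7.0],
                [8.0, 5.0, 4.0],
                [7.0, 4.0, 1.0]])
print(flow_accumulation(dem, d8_flow_direction(dem)))
# The outlet (lowest corner) accumulates all 9 cells of the grid.
```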
Chapter 13 discusses neural networks and deep learning; it includes a presentation of deep convolutional networks, which show great potential for the classification of medical images.
The concept of superposed fracture networks consisting of different generations, and often types, of fractures that have developed sequentially is discussed. Superposed networks can consist of different types of extension or shear fractures, and each fracture may abut, cross or follow (reactivate) earlier fractures. An example of a superposed fracture network in Liassic limestones at Lilstock, Somerset, UK, is presented, which comprises two sets of veins and a later joint network. The veins develop as damage zones around faults, with veins of the later set crossing or trailing along the earlier set. The later joints either cross-cut the earlier veins or reactivate them, the latter being common for the thicker (more than about 5 mm) veins. The vein and joint networks have markedly different geometries and topologies. The veins are spatially clustered and are typically dominated by I-nodes, while the joints are more evenly distributed and tend to be dominated by Y-nodes. The combined network of veins and joints at Lilstock is dominated by X-nodes because so many joints cross-cut the earlier veins. Understanding the development of superposed fracture networks improves our understanding of the kinematic, mechanical, tectonic and fluid flow history of rocks.
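The node-based topology description used here (I-, Y- and X-nodes) can be computed directly from a branch list; the following is a minimal illustrative sketch with hypothetical data, not the authors' workflow:

```python
# Classify fracture-network nodes by degree: I = isolated tip (1 branch),
# Y = abutment (3 branches), X = crossing (4 branches). Hypothetical branch list.
from collections import Counter

branches = [(0, 1), (1, 2), (1, 3), (3, 4), (3, 5), (3, 6)]   # (node_a, node_b) pairs

degree = Counter()
for a, b in branches:
    degree[a] += 1
    degree[b] += 1

node_type = {1: "I", 3: "Y", 4: "X"}
counts = Counter(node_type.get(d, "other") for d in degree.values())
total = sum(counts.values())
for t in ("I", "Y", "X"):
    print(f"{t}-nodes: {counts[t]} ({100 * counts[t] / total:.0f}%)")
# A vein-dominated network has a high I-node fraction; abutting joints raise the
# Y-node fraction; joints crossing earlier veins raise the X-node fraction.
```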
Robustness is a property of system analyses, namely monotonic maps from the complete lattice of subsets of a (system’s state) space to the two-point lattice. The definition of robustness requires the space to be a metric space. Robust analyses cannot discriminate between a subset of the metric space and its closure; therefore, one can restrict to the complete lattice of closed subsets. When the metric space is compact, the complete lattice of closed subsets ordered by reverse inclusion is $\omega$-continuous, and robust analyses are exactly the Scott-continuous maps. Thus, one can also ask whether a robust analysis is computable (with respect to a countable base). The main result of this paper establishes a relation between robustness and Scott continuity when the metric space is not compact. The key idea is to replace the metric space with a compact Hausdorff space, and relate robustness and Scott continuity by an adjunction between the complete lattice of closed subsets of the metric space and the $\omega$-continuous lattice of closed subsets of the compact Hausdorff space. We demonstrate the applicability of this result with several examples involving Banach spaces.
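In symbols, and using only statements made above (the paper's quantitative, metric-based definition of robustness is not reproduced here), an analysis and the consequence of robustness read
$$a : (\mathcal{P}(X), \subseteq) \to \mathbf{2} = \{\bot < \top\} \ \text{monotone}, \qquad a \ \text{robust} \;\Longrightarrow\; a(A) = a(\overline{A}) \ \text{for all } A \subseteq X,$$
and, for compact $X$, robust analyses correspond exactly to the Scott-continuous maps on the $\omega$-continuous lattice $(\mathcal{C}(X), \supseteq)$ of closed subsets ordered by reverse inclusion.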
We introduce basic topological concepts, which are used to define continuous mappings and topological invariants. Next, we introduce a differential structure on manifolds to extend calculus from Euclidean spaces to the more general setting of differentiable manifolds.
This detailed yet accessible text provides an essential introduction to the advanced mathematical methods at the core of theoretical physics. The book steadily develops the key concepts required for an understanding of symmetry principles and topological structures, such as group theory, differentiable manifolds, Riemannian geometry, and Lie algebras. Based on a course for senior undergraduate students of physics, it is written in a clear, pedagogical style and would also be valuable to students in other areas of science and engineering. The material has been subject to more than twenty years of feedback from students, ensuring that explanations and examples are lucid and considered, and numerous worked examples and exercises reinforce key concepts and further strengthen readers' understanding. This text unites a wide variety of important topics that are often scattered across different books, and provides a solid platform for more specialized study or research.
Focusing on understanding sources of cybersecurity data, this chapter explores the end-to-end opportunities for data collection. It goes on to discuss these sources in detail and how multiple datasets can be leveraged to understand cyber threats.
The notion of legal space is increasingly being used to address the challenges of multiple and overlapping spheres of legality that the notion of legal order cannot capture. This article shows how legal space can serve as an alternative (or at least complementary) concept to legal order in view of the limitations of the latter. It sketches out a notion of legal space that is inspired by topology, an approach that analyses the qualitative nature of spaces. It is concerned with understanding the ways in which legalities interact, rather than with ‘measuring’ their spatial dimensions. A topology-inspired approach to legal space can contribute to conceptualizing, in a novel manner, the inner structure of legal spaces, the boundaries of these spaces and their interrelations with other spaces. It offers an analytical toolkit for better understanding multiple legalities, providing categories to characterize sets of legal elements as well as phenomena such as overlaps and hybridity. It is conceptually less constrained than the concept of legal order, and thus allows us to address various bodies of law ranging from classical domestic law, EU law and international law to global administrative law, corporate social responsibility law, platform law and lex sportiva.
The exotic internal structure of polar topologies in multiferroic materials offers a rich landscape for materials science research. As the spatial scale of these entities is often subatomic in nature, aberration-corrected transmission electron microscopy (TEM) is the ideal characterization technique. Software to quantify and visualize the slight shifts in atomic placement within unit cells is of paramount importance due to the now routine acquisition of images at such resolution. In the decade since the commercialization of aberration-corrected TEM, many research groups have written their own code to visualize these polar entities. More recently, open-access Python packages have been developed for the purpose of TEM atomic position quantification. Building on these packages, we introduce the TEMUL Toolkit: a Python package for the analysis and visualization of atomic-resolution images. Here, we focus specifically on the TopoTEM module of the toolkit, where we present an easy-to-follow, streamlined workflow for calculating atomic displacements relative to the surrounding lattice and thus plotting polarization. We hope this toolkit will benefit the rapidly expanding field of topology-based nano-electronic and quantum materials research, and we invite the electron microscopy community to contribute to this open-access project.
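The central displacement calculation can be sketched in a few lines of NumPy. This is a generic illustration with made-up column positions, not the TEMUL/TopoTEM API:

```python
# Estimate the polar displacement of a B-site atomic column as its offset from
# the centre of the four surrounding A-site columns (hypothetical positions, in pixels).
import numpy as np

a_sites = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
b_site = np.array([5.4, 4.7])                    # slightly off-centre column

cage_centre = a_sites.mean(axis=0)               # ideal (centrosymmetric) position
displacement = b_site - cage_centre              # displacement vector to plot as an arrow
magnitude = np.linalg.norm(displacement)
angle = np.degrees(np.arctan2(displacement[1], displacement[0]))

print(f"displacement = {displacement}, |d| = {magnitude:.2f} px, angle = {angle:.1f} deg")
```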
Shapes are perceived unanalyzed, without rigid representation of their parts. They do not comply with standard symbolic knowledge representation criteria; they are treated and judged by appearance. Resolving the relationship of parts to parts and of parts to wholes has a constructive role in perception and design. This paper presents a computational account of part–whole figuration in design. To this end, shape rules are used to show how a shape is seen, and shape decompositions having the structures of topologies and Boolean algebras reveal alternative structures for parts. Four examples of shape computation are presented. Topologies demonstrate the relationships of wholes, parts, and subparts in the computations, enabling the comparison and relativization of structures, and lattice diagrams are used to present their order. Retrospectively, the topologies help to recall the generative history and establish computational continuity. When the parts are modified to recognize emergent squares locally, other emergent shapes are highlighted globally as the topology is re-adjusted. Two types of emergence are identified: local and global. Seeing the local parts modifies how we analyze the global whole; thus, a local observation yields a global order.
People search can be reformulated as a submodular maximization problem to achieve solutions with theoretical guarantees. However, the number of submodular function outcomes is $2^N$ for $N$ sets. Compressing functions via a nonlinear Fourier transform and spraying out sets are two ways to overcome this issue. This research proposes the submodular deep compressed sensing of convolutional sparse coding (SDCS-CSC) algorithm and applies the Topological Fourier Sparse Set (TFSS) algorithm to solve people search problems. The TFSS is based on topological and compressed sensing techniques, while the CSC is based on DCS techniques. Both algorithms enable an unmanned aerial vehicle to search for people in its environment. Experiments demonstrate that the algorithms can search for people more efficiently than the benchmark approaches. This research also suggests how to select the CSC or TFSS algorithm for different search problems.
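To make the $2^N$ issue concrete, here is a generic greedy sketch for a submodular coverage objective (hypothetical data; this is the textbook baseline, not the SDCS-CSC or TFSS algorithms):

```python
# With N candidate sensing locations there are 2**N subsets; a greedy loop that
# exploits diminishing returns picks k locations with a (1 - 1/e) guarantee instead.
locations = {                                    # hypothetical location -> cells it covers
    "A": {1, 2, 3}, "B": {3, 4}, "C": {4, 5, 6}, "D": {1, 6}, "E": {7},
}

def coverage(chosen):
    """Submodular objective: number of distinct cells covered by the chosen locations."""
    covered = set()
    for c in chosen:
        covered |= locations[c]
    return len(covered)

def greedy(k):
    chosen = []
    for _ in range(k):
        best = max((c for c in locations if c not in chosen),
                   key=lambda c: coverage(chosen + [c]))
        chosen.append(best)
    return chosen

picked = greedy(3)
print(picked, coverage(picked))                  # ['A', 'C', 'E'] covering 7 cells
print("exhaustive search would evaluate", 2 ** len(locations), "subsets")   # 32
```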
This preliminary chapter contains notations, definitions, and basic concepts needed for the study of Measure Theory and Functional Analysis. Most of this chapter is for reference and may be read only as needed. Included are concepts such as Convergence, Continuity, and Compactness in Euclidean Spaces. The theory of Euclidean Measure and sets of Measure Zero is covered. An overview of Integration, sufficient to begin the study of Functional Analysis, is included. The chapter finishes with topics such as Functions of Bounded Variation and Inequalities, along with a discussion of the Axiom of Choice.
Metric Spaces, Normed Spaces, and Banach Spaces are investigated. Topological concepts of Open and Closed sets, Convergence, Continuity, Compactness, Completeness, and Total Boundedness are studied. The Stone–Weierstrass Approximation Theorem is proven.
A uniform approach to computing with infinite objects like real numbers, tuples of these, compact sets and uniformly continuous maps is presented. In the work of Berger, it was shown how to extract certified algorithms working with the signed digit representation from constructive proofs. Berger and the present author generalised this approach to complete metric spaces and showed how to deal with compact sets. Here, we unify this work and lay the foundations for doing a similar thing for the much more comprehensive class of compact Hausdorff spaces occurring in applications. The approach has the same computational power as Weihrauch’s Type-Two Theory of Effectivity.
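For orientation, the signed digit representation mentioned above can be sketched directly in Python (this is an illustrative hand-written version, not one of the extracted certified programs):

```python
# A real x in [-1, 1] as an infinite stream of digits d_i in {-1, 0, 1} with
# x = sum_{i>=1} d_i * 2**(-i); any n-digit prefix approximates x to within 2**(-n).
from fractions import Fraction
from itertools import islice

def signed_digits(x):
    """Yield the signed digits of x in [-1, 1], using exact rational arithmetic."""
    assert -1 <= x <= 1
    while True:
        if x >= Fraction(1, 2):
            d = 1
        elif x <= Fraction(-1, 2):
            d = -1
        else:
            d = 0
        yield d
        x = 2 * x - d                     # invariant: x stays in [-1, 1]

def approx(x, n):
    """Rational approximation of x from its first n signed digits."""
    return sum(Fraction(d, 2 ** (i + 1)) for i, d in enumerate(islice(signed_digits(x), n)))

x = Fraction(-3, 7)
print(list(islice(signed_digits(x), 9)))  # [0, -1, -1, 0, -1, -1, 0, -1, -1]
print(float(approx(x, 20)), "~", float(x))
```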
The ALEA Coq library formalizes measure theory based on a variant of the Giry monad on the category of sets. This enables the interpretation of a probabilistic programming language with primitives for sampling from discrete distributions. However, continuous distributions have to be discretized because the corresponding measures cannot be defined on all subsets of their carriers. This paper proposes the use of synthetic topology to model continuous distributions for probabilistic computations in type theory. We study the initial σ-frame and the corresponding induced topology on arbitrary sets. Based on these intrinsic topologies, we define valuations and lower integrals on sets and prove versions of the Riesz and Fubini theorems. We then show how the Lebesgue valuation, and hence continuous distributions, can be constructed.
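For reference, the classical axioms that a valuation satisfies on the lattice of opens (stated here in their standard topological form; the paper works with the induced intrinsic topologies rather than this classical setting):
$$\nu : \mathcal{O}(X) \to [0,1], \quad \nu(\emptyset) = 0, \quad U \subseteq V \Rightarrow \nu(U) \le \nu(V), \quad \nu(U) + \nu(V) = \nu(U \cup V) + \nu(U \cap V),$$
together with Scott continuity, $\nu\big(\bigcup_i U_i\big) = \sup_i \nu(U_i)$ for directed families $(U_i)$; the lower integral of a lower semicontinuous $f : X \to [0,\infty]$ (so that $\{f > t\}$ is open) is then $\int f \, d\nu = \int_0^\infty \nu(\{x : f(x) > t\})\, dt$.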
This chapter is a brief reminder of point-set topology, including examples of the most prominent topologies needed later in the text. Further topics include ordinal numbers and the ordinal space (as a topological space), cardinality and counting, and the construction of the Cantor middle-thirds set, the Cantor function (devil’s staircase), and its inverse function.
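As a small illustration of the last construction, a sketch (not from the chapter) of the Cantor function via the usual digit rule: write x in base 3, truncate after the first digit 1, turn the remaining 2s into 1s, and read the result in base 2:

```python
def cantor_function(x, digits=40):
    """Approximate the Cantor (devil's staircase) function on [0, 1]."""
    if x >= 1:
        return 1.0
    value, scale = 0.0, 0.5
    for _ in range(digits):
        x *= 3
        d = int(x)                  # next ternary digit of x
        x -= d
        if d == 1:                  # constant beyond the first ternary digit 1
            return value + scale
        value += scale * (d // 2)   # a ternary 2 becomes a binary 1
        scale /= 2
    return value

for x in (0.0, 1/4, 1/3, 1/2, 2/3, 3/4, 1.0):
    print(f"c({x:.4f}) = {cantor_function(x):.4f}")
# c(1/4) = 1/3 and c(3/4) = 2/3; c is 1/2 on the whole middle third [1/3, 2/3].
```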
Missionaries have flocked to the Kyrgyz Republic ever since the collapse of the Soviet Union. Evangelical-Pentecostal and Tablighi missions have been particularly active on what they conceive of as a fertile post-atheist frontier. But as these missions project their message of truth onto the frontier, the dangers of the frontier may overwhelm them. Based on long-term ethnographic fieldwork amongst foreign and local Tablighis and evangelical-Pentecostals, this article formulates an analytic of the frontier that highlights the affective and relational characteristics of missionary activities and their effects. This analytic explains why and how missionaries are attracted to the frontier, as well as some of the successes and failures of their expansionist efforts. In doing so, the article reveals the potency of instability, a feature that is particularly evident in missionary work, but also resonates with other frontier situations.
Rudimentary catalogues of cosmic voids were first compiled in the mid-1980s, but they were limited in scope by the lack of adequate deep galaxy survey data. Over several decades, catalogues have improved, as have cosmic void identification methods. Voids in the galaxy distribution have become important objects of study and modern tools for investigating properties of the Universe. They have been, and continue to be, applied to problems in precision cosmology. The first step in utilizing this new tool is to compile massive surveys of the distant Universe that yield sufficiently large samples of cosmic voids; reliable void identification techniques, including sophisticated methods of 3D analysis, are then applied. For some tests, “stacked voids” are created to enhance the measurement precision. Specific research results are summarized. Three other topics are discussed: the topology of the void and supercluster structure, LTB Universe models, and void galaxies.