Various measures have been introduced in the existing literature to evaluate extreme risk exposure under the effect of an observable factor. Owing to the desirable properties of the higher-moment (HM) coherent risk measure, we propose a conditional version of the HM (CoHM) risk measure by incorporating the information of an observable factor. We conduct an asymptotic analysis of this measure in the presence of extreme risks under weak contagion at a high confidence level, and apply it to the special case of the conditional Haezendonck–Goovaerts risk measure (CoHG). Numerical illustrations are provided to examine the accuracy of the asymptotic formulas and to analyze the sensitivity of the risk contribution of the CoHG. Based on the asymptotic result in the Fréchet case, we propose an estimator for the CoHM via extrapolation, supported by a simulation study.
Almost by definition of risk, rare events play a crucial role. We tackle this problem by presenting some basic tools from extreme value theory (EVT). From a statistical point of view, the workhorses are the block maxima method (BMM) and the peaks over threshold method (POTM). Besides giving the mathematical formulation, we exemplify both approaches via simulated examples. Once these tools are in place, we can provide estimators of the relevant risk measures such as high-exceedance probabilities, quantiles and return periods. In a crucial part of the book, we then estimate these quantities for sea-level data at Hoek van Holland near Rotterdam. We obtain estimates, including confidence intervals, for the dike height needed to withstand a 1-in-10 000-years storm event. Further applications concern financial data and data from the L’Aquila earthquake. For the latter, we present dynamic models for earthquake aftershocks. After an excursion to the world of records in athletics, we present the signature application of EVT through the story of the sinking of the MV Derbyshire. We show how an application of EVT techniques has saved many lives at sea.
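As a concrete illustration of the POT workflow just described, the following sketch fits a generalized Pareto distribution (GPD) to threshold exceedances of synthetic heavy-tailed data (not the Hoek van Holland series) and extrapolates a 1-in-10 000-observations return level; the threshold choice and the Pareto tail are assumptions of the example.

```python
# Peaks-over-threshold sketch on simulated data: fit a GPD to exceedances
# over a high threshold and extrapolate a far-tail return level.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(42)
x = rng.pareto(3.0, size=20_000) + 1.0   # synthetic heavy-tailed "sea levels"

u = np.quantile(x, 0.95)                 # threshold: empirical 95% quantile
exc = x[x > u] - u                       # exceedances over the threshold
xi, _, beta = genpareto.fit(exc, floc=0) # GPD shape xi and scale beta

# Return level z_T with exceedance probability 1/T per observation:
# z_T = u + (beta/xi) * ((T * zeta_u)**xi - 1), where zeta_u = P(X > u).
T = 10_000
zeta_u = exc.size / x.size
z_T = u + beta / xi * ((T * zeta_u) ** xi - 1.0)
print(f"threshold u = {u:.3f}, GPD shape = {xi:.3f}, "
      f"1-in-{T} return level = {z_T:.3f}")
```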
Expectiles have received increasing attention as a risk measure in risk management because of their coherency and elicitability at levels $\alpha\geq 1/2$. With a view to practical risk assessment, this paper delves into the worst-case expectile, where only partial information on the underlying distribution is available and there is no closed-form representation. We explore the asymptotic behavior of the worst-case expectile on two specified ambiguity sets: one defined via the Wasserstein distance from a reference distribution, for which we transform the problem into a convex optimization problem via the well-known Kusuoka representation, and the other induced by higher-moment constraints. We obtain precise results in some special cases; nevertheless, there are no unified closed-form solutions. We aim to fully characterize the extreme behavior; that is, we pursue an approximate solution as the level $\alpha$ tends to 1. As an application of our technique, we investigate the ambiguity set induced by higher-moment conditions. Finally, we compare our worst-case expectile approach with a more conservative method based on stochastic order, which is referred to as ‘model aggregation’.
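As a minimal numerical companion (not the paper's worst-case analysis), an empirical $\alpha$-expectile can be computed as the root of its asymmetric first-order condition; the Student-t sample below is a stand-in for an unknown loss distribution.

```python
# The alpha-expectile e_alpha solves the monotone first-order condition
#   alpha * E[(X - e)_+] - (1 - alpha) * E[(e - X)_+] = 0,
# so bracketing root-finding on an empirical sample suffices.
import numpy as np
from scipy.optimize import brentq

def expectile(sample, alpha):
    def foc(e):
        return (alpha * np.mean(np.maximum(sample - e, 0.0))
                - (1.0 - alpha) * np.mean(np.maximum(e - sample, 0.0)))
    # foc is positive at the sample minimum and negative at the maximum
    return brentq(foc, sample.min(), sample.max())

rng = np.random.default_rng(0)
x = rng.standard_t(df=4, size=100_000)   # hypothetical heavy-tailed losses
for a in (0.9, 0.99, 0.999):
    print(f"alpha = {a}: expectile = {expectile(x, a):.3f}")
```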
We show how convergence to the Gumbel distribution in an extreme value setting can be understood in an information-theoretic sense. We introduce a new type of score function which behaves well under the maximum operation, and which implies simple expressions for entropy and relative entropy. We show that, assuming certain properties of the von Mises representation, convergence to the Gumbel distribution can be proved in the strong sense of relative entropy.
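A simple simulation (separate from the paper's information-theoretic argument) makes the convergence concrete: for iid standard exponentials, the centered maximum approaches the standard Gumbel law, and the Kolmogorov–Smirnov distance shrinks as $n$ grows.

```python
# For X_i iid standard exponential, max(X_1..X_n) - ln n converges in
# distribution to the standard Gumbel law; track the KS distance.
import numpy as np
from scipy.stats import gumbel_r, kstest

rng = np.random.default_rng(7)
for n in (10, 100, 1000):
    # centered maxima of n iid exponentials, 5000 replications
    m = rng.exponential(size=(5_000, n)).max(axis=1) - np.log(n)
    ks = kstest(m, gumbel_r.cdf).statistic
    print(f"n = {n:>5}: KS distance to standard Gumbel = {ks:.4f}")
```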
We study the geometric and topological features of U-statistics of order k when the k-tuples satisfying geometric and topological constraints do not occur frequently. Using appropriate scaling, we establish the convergence of U-statistics in the vague topology, while the structure of a non-degenerate limit measure is also revealed. Our general result yields various limit theorems for geometric and topological statistics, including persistent Betti numbers of Čech complexes, the volume of simplices, a functional of the Morse critical points, and values of the min-type distance function. The required vague convergence can be obtained as a result of the limit theorem for point processes induced by U-statistics. The latter convergence occurs, in particular, in the $\mathcal{M}_0$-topology.
Modeling dependencies between climate extremes is important for climate risk assessment, for instance when allocating emergency management funds. In statistics, multivariate extreme value theory is often used to model spatial extremes. However, most commonly used approaches require strong assumptions and are either too simplistic or over-parameterized. From a machine learning perspective, generative adversarial networks (GANs) are a powerful tool to model dependencies in high-dimensional spaces. Yet in the standard setting, GANs do not represent dependencies in the extremes well. Here we combine GANs with extreme value theory (evtGAN) to model spatial dependencies in summer maxima of temperature and winter maxima of precipitation over a large part of western Europe. We use data from a stationary 2000-year climate model simulation to validate the approach and explore its sensitivity to small sample sizes. Our results show that evtGAN outperforms classical GANs and standard statistical approaches to model spatial extremes. Already with about 50 years of data, which corresponds to commonly available climate records, we obtain reasonably good performance. In general, dependencies between temperature extremes are better captured than dependencies between precipitation extremes, owing to the high spatial coherence of temperature fields. Our approach can be applied to other climate variables and can be used to emulate climate models when running the very long simulations needed to determine dependencies in the extremes is infeasible.
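The EVT half of such a pipeline can be sketched as follows; this is a hedged outline of the marginal standardization only, with synthetic Gumbel maxima standing in for climate data and the GAN training step omitted.

```python
# Marginal step of an evtGAN-style pipeline: fit a GEV at each grid cell,
# map block maxima to uniform margins so a generative model can learn the
# spatial dependence, then map generated samples back via the fitted
# quantile functions.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(1)
maxima = rng.gumbel(loc=20, scale=3, size=(50, 4))  # 50 years x 4 cells (synthetic)

fits = [genextreme.fit(maxima[:, j]) for j in range(maxima.shape[1])]
u = np.column_stack([genextreme.cdf(maxima[:, j], *fits[j])
                     for j in range(maxima.shape[1])])  # uniform margins

# ... a GAN would be trained on u here to capture the dependence ...
u_gen = rng.uniform(size=(5, 4))  # stand-in for GAN output
samples = np.column_stack([genextreme.ppf(u_gen[:, j], *fits[j])
                           for j in range(u_gen.shape[1])])
print(samples.round(2))
```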
The species–area relationship (SAR) has been described as one of the few general patterns in ecology. Although there are many types of SAR, here we are concerned solely with the so-called species accumulation curve (SAC). The theoretical basis of this relationship is not well established. Here, we suggest that extreme value theory, also known as the statistics of extremes, provides a theoretical foundation for, as well as functions to fit, empirical species accumulation curves. Among the several procedures in extreme value theory, the appropriate way to deal with the species accumulation curve is the so-called block minima procedure. We first provide a brief description of this approach and the relevant formulas. We then illustrate the application of the block minima approach using data on tree species from a 50 ha plot in Barro Colorado Island, Panama. We conclude by discussing the extent to which the assumptions under which the extremal types theorem holds are confirmed by the data. Although we recognize limitations to the present application of extreme value theory, we predict that it will provide fertile ground for future work on the theory of SARs and its application in the fields of ecology, biogeography and conservation.
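The block minima procedure admits a compact numerical sketch, using the identity min(X) = -max(-X) so that standard GEV software applies; the Weibull sample below is illustrative, not the Barro Colorado data.

```python
# Block minima of X equal minus the block maxima of -X, so a GEV fitted to
# the negated block minima characterizes the minimum domain of attraction.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(3)
data = rng.weibull(1.5, size=(200, 25))   # 200 blocks of 25 observations
block_min = data.min(axis=1)

c, loc, scale = genextreme.fit(-block_min)  # fit GEV to negated minima
print(f"GEV shape for minima: {c:.3f} (location {-loc:.3f}, scale {scale:.3f})")
```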
Suppose k balls are dropped into n boxes independently with uniform probability, where n, k are large with ratio approximately equal to some positive real $\lambda$. The maximum box count has a counterintuitive behavior: first of all, with high probability it takes at most two values $m_n$ or $m_n+1$, where $m_n$ is roughly $\frac{\ln n}{\ln \ln n}$. Moreover, it oscillates between these two values with an unusual periodicity. In order to prove this statement and various generalizations, it is first shown that for $X_1,\ldots,X_n$ independent and identically distributed discrete random variables with common distribution F, under mild conditions, the limiting distribution of their maximum oscillates in three possible families, depending on the tail of the distribution. The result stated at the beginning follows from the ensemble equivalence for the order statistics in various allocation problems, obtained via conditioning limit theory. Results about the number of ties for the maximum, as well as applications, are also provided.
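A short simulation (with illustrative parameters) shows the two-point concentration: for k = λn balls in n boxes, the maximum occupancy clusters on two adjacent integers in the vicinity predicted by ln n / ln ln n.

```python
# Drop k = lambda * n balls uniformly into n boxes, record the maximum
# occupancy, and tabulate how often each value occurs across replications.
import numpy as np

rng = np.random.default_rng(5)
lam = 1.0
for n in (10**3, 10**4, 10**5):
    k = int(lam * n)
    maxima = [np.bincount(rng.integers(0, n, size=k), minlength=n).max()
              for _ in range(200)]
    vals, counts = np.unique(maxima, return_counts=True)
    print(f"n = {n:>6}: ln n/ln ln n = {np.log(n)/np.log(np.log(n)):.2f}, "
          f"observed maxima: {dict(zip(vals.tolist(), counts.tolist()))}")
```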
Dielectric breakdown in a thin oxide is presented in terms of an interacting particle system on a two-dimensional lattice. All edges in the system are initially assumed to be closed. An edge between two adjacent vertices opens at a time given by an exponentially distributed random variable. Breakdown occurs at the time an open path first connects the top layer of the lattice to the bottom layer. Using extreme value theory, we show that the time until breakdown is asymptotically Weibull distributed.
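Under a straightforward reading of this model, the breakdown time can be simulated by sorting the exponential opening times and merging components with union-find until the top and bottom layers connect; the grid size and replication count below are arbitrary choices.

```python
# Breakdown time of an L x L grid: edges open at iid exponential times;
# breakdown is the first time an open path joins the top row to the bottom.
import numpy as np

def breakdown_time(L, rng):
    # vertices 0..L*L-1, plus two virtual nodes: TOP = L*L, BOTTOM = L*L+1
    parent = list(range(L * L + 2))
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]   # path halving
            a = parent[a]
        return a
    def union(a, b):
        parent[find(a)] = find(b)

    edges = []
    for r in range(L):
        for c in range(L):
            v = r * L + c
            if c + 1 < L: edges.append((v, v + 1))      # horizontal edge
            if r + 1 < L: edges.append((v, v + L))      # vertical edge
    for c in range(L):                                  # virtual connections
        edges.append((L * L, c))                        # TOP to row 0
        edges.append((L * L + 1, (L - 1) * L + c))      # BOTTOM to last row
    times = rng.exponential(size=len(edges))
    times[-2 * L:] = 0.0                                # virtual edges open at t=0

    for t, (a, b) in sorted(zip(times, edges)):         # open edges in time order
        union(a, b)
        if find(L * L) == find(L * L + 1):              # top meets bottom
            return t

rng = np.random.default_rng(11)
samples = [breakdown_time(20, rng) for _ in range(200)]
print(f"mean breakdown time on a 20x20 grid: {np.mean(samples):.3f}")
```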
We consider point process convergence for sequences of independent and identically distributed random walks. The objective is to derive asymptotic theory for the largest extremes of these random walks. We show convergence of the maximum random walk to the Gumbel or the Fréchet distributions. The proofs depend heavily on precise large deviation results for sums of independent random variables with a finite moment generating function or with a subexponential distribution.
Marine insurance is the oldest type of insurance coverage. Nevertheless, unlike cargo and hull covers, marine liability is a rather young line of business, with claims that can have very heavy and long tails. For reinsurers, the accumulation of losses from an event insured by various Protection and Indemnity clubs is an additional source of very large claims in the portfolio. In this paper, we first describe some recent developments in the marine liability market and then statistically analyze a data set of large losses for this line of business in detail, in terms of both frequency and severity, including censoring techniques and tests for stationarity over time. We further formalize and examine an optimization problem that occurs for reinsurers participating in XL on XL coverages in this line of business and give illustrations of its solution.
In this work we deal with extreme value theory in the context of continued fractions using techniques from probability theory, ergodic theory and real analysis. We give an upper bound for the rate of convergence in the Doeblin–Iosifescu asymptotics for the exceedances of digits obtained from the regular continued fraction expansion of a number chosen randomly from $(0,1)$ according to the Gauss measure. As a consequence, we significantly improve the best known upper bound on the rate of convergence of the maxima in this case. We observe that the asymptotics of order statistics and the extremal point process can also be investigated using our methods.
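The classical limit underlying this rate-of-convergence question, due to Galambos, states that the maximum M_n of the first n digits satisfies P(M_n/n ≤ t) → exp(-1/(t ln 2)). The sketch below checks this numerically; since the digit dependence does not alter the limit, the digits are approximated by iid draws from the stationary Gauss-measure marginal (an assumption of the illustration).

```python
# The Gauss measure has CDF F(x) = log2(1 + x) on (0,1), so x = 2**u - 1
# with u uniform is a Gauss-measure draw, and floor(1/x) has the
# stationary digit distribution P(a >= m) = log2(1 + 1/m).
import numpy as np

rng = np.random.default_rng(13)
n, reps, t = 1_000, 2_000, 1.5
x = np.exp2(rng.uniform(size=(reps, n))) - 1.0   # iid Gauss-measure samples
digits = np.floor(1.0 / x)
emp = np.mean(digits.max(axis=1) <= t * n)
print(f"empirical P(M_n/n <= {t}) = {emp:.3f}, "
      f"limit exp(-1/(t ln 2)) = {np.exp(-1.0/(t*np.log(2))):.3f}")
```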
Frequent large losses from recent catastrophes have caused great concern among insurers and reinsurers, who seek to mitigate such catastrophe risks by issuing catastrophe (CAT) bonds and thereby transferring the risks to the bond market. However, the pricing of CAT bonds remains a challenging task, mainly because the CAT bond market is incomplete and the pricing usually requires knowledge of the tail of the risks. In this paper, we propose a general pricing framework based on a product pricing measure, which combines a distorted probability measure that prices the catastrophe risks underlying the CAT bond with a risk-neutral probability measure that prices interest rate risk. We also demonstrate the use of the peaks over threshold (POT) method to uncover the tail risk. Finally, we conduct case studies using Mexico and California earthquake data to demonstrate the applicability of our pricing framework.
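The two pricing ingredients can be caricatured in a few lines, with all numbers assumed for illustration: a POT-fitted generalized Pareto tail supplies the physical trigger probability, a distortion (here the Wang transform, one common choice of distorted measure) loads it, and interest rate risk is collapsed to a flat discount rate.

```python
# Toy binary (all-or-nothing) CAT bond: pays face value unless losses
# exceed the trigger; trigger probability from an assumed GPD tail fit,
# loaded via the Wang transform g(p) = Phi(Phi^{-1}(p) + lambda).
import numpy as np
from scipy.stats import norm, genpareto

u, xi, beta, zeta_u = 5.0, 0.4, 1.2, 0.05   # assumed POT fit: threshold,
trigger = 12.0                              # shape, scale, P(X > u); trigger level
p = zeta_u * genpareto.sf(trigger - u, xi, scale=beta)  # P(loss > trigger)

lam = 0.6                                   # assumed market price of risk
p_distorted = norm.cdf(norm.ppf(p) + lam)   # Wang-transformed probability

r, T, face = 0.03, 3.0, 100.0               # flat rate, maturity, face value
price = face * (1.0 - p_distorted) * np.exp(-r * T)
print(f"physical p = {p:.4f}, distorted p = {p_distorted:.4f}, price = {price:.2f}")
```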
Based on the ratio of two block maxima, we propose a large sample test for the length of memory of a stationary symmetric α-stable discrete-parameter random field. We show that the power function converges to 1 as the sample size increases to ∞ under various classes of alternatives having longer memory in the sense of Samorodnitsky (2004). Ergodic theory of nonsingular ℤ^d-actions plays a very important role in the design and analysis of our large sample test.
While max-stable processes are typically written as pointwise maxima over an infinite number of stochastic processes, in this paper, we consider a family of representations based on ℓ^p-norms. This family includes both the construction of the Reich–Shaby model and the classical spectral representation by de Haan (1984) as special cases. As the representation of a max-stable process is not unique, we present formulae to switch between different equivalent representations. We further provide a necessary and sufficient condition for the existence of an ℓ^p-norm-based representation in terms of the stable tail dependence function of a max-stable process. Finally, we discuss several properties of the represented processes such as ergodicity or mixing.
CAT bonds play an important role in transferring insurance risks to the capital market. It has been observed that typical CAT bond premiums have changed since the recent financial crisis, which has been attributed to market participants becoming increasingly risk averse. In this work, we first propose a new premium principle, the financial loss premium principle, which includes a term measuring losses in the financial market, represented here by the Conditional Tail Expectation (CTE) of the negative daily log-return of the S&P 500 index. Our analysis of empirical evidence suggests that in the post-crisis market, instead of simply increasing the fixed level of risk load universally, the increased risk aversion should be modeled jointly by a fixed level of risk load and a financial loss factor reflecting trends in the financial market. This new premium principle is shown to be flexible with respect to the confidence/exceedance level of the CTE. In the second part, we focus on the particular example of extreme wildfire risk. The distribution of the amount of precipitation in Fort McMurray, Canada, a very important factor in the occurrence of wildfires, is analyzed using extreme value modeling techniques. A wildfire bond with a parametric precipitation trigger is then designed to mitigate extreme wildfire risk, and its premium is predicted using an extreme value analysis of its expected loss. With an application to the 2016 Fort McMurray wildfire, we demonstrate that the extreme value model is sensible, and we further analyze how our results and construction can be used to provide a design framework for CAT bonds that may appeal to (re)insurers and investors alike.
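The financial loss factor reduces to an expected-shortfall computation; the sketch below computes the CTE of negative daily log-returns on a synthetic series standing in for the S&P 500 index, with the levels chosen for illustration.

```python
# CTE (expected shortfall) of the negative daily log-return at level q:
# the average of the worst (1 - q) fraction of days.
import numpy as np

rng = np.random.default_rng(17)
log_ret = rng.standard_t(df=3, size=2_500) * 0.01   # ~10 years of synthetic returns
neg = -log_ret                                      # losses

for q in (0.95, 0.99):
    var_q = np.quantile(neg, q)                     # Value-at-Risk at level q
    cte_q = neg[neg >= var_q].mean()                # mean loss beyond VaR
    print(f"q = {q}: VaR = {var_q:.4f}, CTE = {cte_q:.4f}")
```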
Extreme value theory for random vectors and stochastic processes with continuous trajectories is usually formulated for random objects where the univariate marginal distributions are identical. In the spirit of Sklar's theorem from copula theory, such marginal standardization is carried out by the pointwise probability integral transform. Certain situations, however, call for stochastic models whose trajectories are not continuous but merely upper semicontinuous (USC). Unfortunately, the pointwise application of the probability integral transform to a USC process does not, in general, preserve the upper semicontinuity of the trajectories. In this paper we give sufficient conditions to enable marginal standardization of USC processes and we state a partial extension of Sklar's theorem for USC processes. We specialize the results to max-stable processes whose marginal distributions and normalizing sequences are allowed to vary with the coordinate.
The extremal behaviour of a Markov chain is typically characterised by its tail chain. For asymptotically dependent Markov chains, existing formulations fail to capture the full evolution of the extreme event when the chain moves out of the extreme tail region, and, for asymptotically independent chains, recent results fail to cover well-known asymptotically independent processes, such as Markov processes with a Gaussian copula between consecutive values. We use more sophisticated limiting mechanisms that cover a broader class of asymptotically independent processes than current methods, including an extension of the canonical Heffernan‒Tawn normalisation scheme, and reveal features which existing methods reduce to a degenerate form associated with nonextreme states.
We consider random variables observed at arrival times of a renewal process, which possibly depends on those observations and has regularly varying steps with infinite mean. Due to the dependence and heavy-tailed steps, the limiting behavior of extreme observations until a given time t tends to be rather involved. We describe the asymptotics and extend several partial results which appeared in this setting. The theory is applied to determine the asymptotic distribution of maximal excursions and sojourn times for continuous-time random walks.
Natural disasters may have considerable impact on society as well as on the (re-)insurance industry. Max-stable processes are ideally suited for the modelling of the spatial extent of such extreme events, but it is often assumed that there is no temporal dependence. Only a few papers have introduced spatiotemporal max-stable models, extending the Smith, Schlather and Brown‒Resnick spatial processes. These models suffer from two major drawbacks: time plays a similar role to space and the temporal dynamics are not explicit. In order to overcome these defects, we introduce spatiotemporal max-stable models where we partly decouple the influence of time and space in their spectral representations. We introduce both continuous- and discrete-time versions. We then consider particular Markovian cases with a max-autoregressive representation and discuss their properties. Finally, we briefly propose an inference methodology which is tested through a simulation study.
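The Markovian cases admit a particularly transparent special case, the max-autoregressive (ARMAX) recursion; the sketch below (with an illustrative coefficient, not the paper's model) shows how such a recursion keeps unit-Fréchet margins while making the temporal dynamics explicit.

```python
# ARMAX recursion X_t = max(phi * X_{t-1}, (1 - phi) * Z_t) with iid
# unit-Frechet innovations Z_t; the stationary margin is again unit Frechet:
# P(X <= q) = exp(-phi/q) * exp(-(1-phi)/q) = exp(-1/q).
import numpy as np

rng = np.random.default_rng(23)
phi, T = 0.7, 10_000
z = 1.0 / -np.log(rng.uniform(size=T))      # unit-Frechet innovations
x = np.empty(T)
x[0] = z[0]
for t in range(1, T):
    x[t] = max(phi * x[t - 1], (1.0 - phi) * z[t])

# Sanity check of the stationary margin against exp(-1/q).
q = 5.0
print(f"empirical P(X <= {q}) = {(x <= q).mean():.3f}, "
      f"unit-Frechet value = {np.exp(-1.0/q):.3f}")
```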