We extend the growth-at-risk (GaR) literature by examining US growth risks over 130 years using a time-varying parameter stochastic volatility regression model. This model effectively captures the distribution of GDP growth over long samples, accommodating changing relationships across variables and structural breaks. Our analysis offers several key insights for policymakers. We identify significant temporal variation in both the level and determinants of GaR. The stability of upside risks to GDP growth, as seen in previous research, is largely confined to the Great Moderation period, with a more balanced risk distribution prior to the 1970s. Additionally, the distribution of GDP growth has narrowed significantly since the end of the Bretton Woods system. Financial stress is consistently associated with higher downside risks, without affecting upside risks. Moreover, indicators such as credit growth and house prices influence both downside and upside risks during economic booms. Our findings also contribute to the financial cycle literature by providing a comprehensive view of the drivers and risks associated with economic booms and recessions over time.
This article studies how sudden changes in bank credit supply impact economic activity. I identify shocks to bank credit supply based on firms’ aggregate debt composition, using a model where firms fund production with bonds and loans. In the model, bank shocks are the only shocks that imply opposite movements in the two types of debt, as firms adjust their debt composition to new credit conditions. Bank shocks account for a third of output fluctuations and are predictive of the bond spread.
We provide new evidence about US monetary policy using a model that: (i) estimates time-varying monetary policy weights without relying on stylized theoretical assumptions; (ii) allows for endogenous breakdowns in the relationship between interest rates, inflation, and output; and (iii) generates a unique measure of monetary policy activism that accounts for economic instability. The joint incorporation of endogenous time-varying uncertainty about the monetary policy parameters and the stability of the relationship between interest rates, inflation, and output materially reduces the probability of determinate monetary policy. The average probability of determinacy over the period post-1982 to 1997 is below 60% (hence well below seminal estimates of determinacy probabilities that are close to unity). Post-1990, the average probability of determinacy is 75%, falling to approximately 60% when we allow for typical levels of trend inflation.
Much historical yield-monitor data comes from fields where a uniform rate of nitrogen was applied. We propose a new approach that uses these data to generate site-specific nitrogen recommendations. Bayesian methods are used to estimate a linear plateau model in which only the plateau varies spatially. The model is illustrated by using it to make site-specific nitrogen recommendations for corn production in Mississippi. The in-sample recommendations generated by this approach return an estimated $9/acre on the example field. The long-term goal is to combine this information with other sources of information, such as remote sensing measurements.
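The linear plateau response described above can be sketched numerically. This is a minimal illustration, not the authors' estimation procedure: all parameter values, prices, and the site-specific plateaus are hypothetical stand-ins for draws from a fitted posterior.

```python
import numpy as np

def plateau_yield(n_rate, beta0, beta1, plateau):
    """Linear-plateau response: yield rises linearly in N until it hits the plateau."""
    return np.minimum(beta0 + beta1 * n_rate, plateau)

def optimal_n(beta0, beta1, plateau, corn_price, n_price):
    """With a linear plateau, the profit-maximising N rate is the rate that just
    reaches the plateau, provided the linear segment is profitable at all."""
    if beta1 * corn_price <= n_price:          # marginal product never covers N cost
        return 0.0
    return max((plateau - beta0) / beta1, 0.0)

# Hypothetical site-specific plateaus (bu/acre); only the plateau varies by site
plateaus = np.array([150.0, 170.0, 190.0])
recs = [optimal_n(60.0, 0.8, p, corn_price=4.0, n_price=0.5) for p in plateaus]
```

Because the plateau is the only spatially varying parameter, the recommended N rate varies across sites purely through the plateau term.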
Fertility decline in human history is a complex enigma. Different triggers have been proposed, among them the increased demand for human capital, which leads parents to make a quantity–quality (QQ) trade-off. This is the first study to examine the existence of a QQ trade-off and possible gender bias by analyzing fertility intentions rather than fertility outcomes. We rely on unified growth theory to understand the QQ trade-off conceptually and on a discrete choice experiment conducted among 426 respondents in Ethiopia to analyze fertility intentions empirically. We confirm the existence of a QQ trade-off only when the number of children is less than six and find that intentions are gendered in two ways: (i) boys are preferred over girls, and (ii) men are willing to trade off more education in return for more children. The results imply that a focus both on stimulating intentions for education, especially girls' education, and on family size intentions is important for accelerating the demographic transition.
Longevity risk is putting increasing financial pressure on governments and pension plans worldwide, owing to rising life expectancy and the growing number of people reaching retirement age. Lee and Carter (1992, Journal of the American Statistical Association, 87(419), 659–671) applied a one-factor dynamic factor model to forecast the trend of mortality improvement, and the model has since become the field’s workhorse. It is, however, well known that their model overlooks cross-dependence between different age groups. We introduce Factor-Augmented Vector Autoregressive (FAVAR) models to the mortality modelling literature. The model, obtained by adding an unobserved factor process to a Vector Autoregressive (VAR) process, nests VAR and Lee–Carter models as special cases and inherits the advantages of both frameworks. A Bayesian estimation approach, adapted from the Minnesota prior, is proposed. An empirical application to US and French mortality data demonstrates the efficacy of the proposed method in both in-sample and out-of-sample performance.
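The Lee–Carter model nested above decomposes log mortality as log m(x,t) = a(x) + b(x)k(t). A minimal SVD-based fit can be sketched as follows; this is the classical Lee–Carter estimator on purely synthetic, illustrative data, not the FAVAR model or Bayesian estimation proposed in the paper.

```python
import numpy as np

def fit_lee_carter(log_m):
    """Fit log m_{x,t} = a_x + b_x k_t (Lee & Carter, 1992) by SVD.
    log_m is an ages-by-years matrix of log central death rates."""
    a = log_m.mean(axis=1)                       # average age profile a_x
    U, s, Vt = np.linalg.svd(log_m - a[:, None], full_matrices=False)
    b, k = U[:, 0], s[0] * Vt[0]                 # leading rank-one component
    scale = b.sum()                              # identification: sum(b_x) = 1
    return a, b / scale, k * scale

# Toy rank-one data with known structure (purely illustrative numbers)
ages, years = 5, 10
a_true = np.linspace(-8.0, -4.0, ages)
b_true = np.full(ages, 1.0 / ages)
k_true = np.linspace(4.0, -4.0, years)           # sums to zero
log_m = a_true[:, None] + np.outer(b_true, k_true)
a_hat, b_hat, k_hat = fit_lee_carter(log_m)
```

Because a single k(t) drives all ages, the model cannot capture cross-dependence between age groups beyond the shared factor, which is the limitation the FAVAR extension addresses.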
Partial equilibrium models have been used extensively by policy makers to prospectively determine the consequences of government programs that affect consumer incomes or the prices consumers pay. However, these models have not previously been used to analyze government programs that inform consumers. In this paper, we develop a model that policy makers can use to quantitatively predict how consumers will respond to risk communications that contain new health information. The model combines Bayesian learning with the utility-maximization of consumer choice. We discuss how this model can be used to evaluate information policies; we then test the model by simulating the impacts of the North Dakota Folic Acid Educational Campaign as a validation exercise.
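The Bayesian learning component described above can be illustrated with a conjugate beta-binomial sketch. This is a hedged toy example, not the authors' model: the prior, the message counts, and the interpretation of campaign messages as binomial signals are all hypothetical.

```python
def update_belief(a, b, n_signals, n_confirming):
    """Beta-binomial update: a Beta(a, b) prior over the perceived health-risk
    probability, revised after n_signals messages, n_confirming of which
    support the risk claim."""
    return a + n_confirming, b + (n_signals - n_confirming)

def perceived_risk(a, b):
    """Posterior mean of a Beta(a, b) belief."""
    return a / (a + b)

# A consumer with a diffuse Beta(1, 1) prior receives ten campaign messages,
# eight of which confirm the health risk (hypothetical numbers)
a_post, b_post = update_belief(1.0, 1.0, 10, 8)
```

The updated belief would then feed into the utility-maximization step of consumer choice, e.g. by weighting the disutility of the health outcome by the posterior risk.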
This study investigates the time-varying effects of international uncertainty shocks. I use a global vector autoregressive model with drifting coefficients and factor stochastic volatility in the errors to model the G7 economies jointly. The measure of uncertainty is constructed by estimating a time-varying scalar driving the innovation variances of the latent factors, which is also included in the conditional mean of the process. To achieve regularization, I use Bayesian techniques for estimation, and rely on hierarchical global–local priors to shrink the high-dimensional multivariate system towards sparsity. I compare the obtained econometric measure of uncertainty to alternative indices and discuss commonalities and differences. Moreover, I find that international uncertainty may differ substantially compared to identically constructed domestic measures. Structural inference points towards pronounced real and financial effects of uncertainty shocks in all considered economies. These effects are subject to heterogeneities over time and the cross-section, providing empirical evidence in favor of using the flexible econometric framework introduced in this study.
We measure the economic impact of varietal improvement and technological change in flue-cured tobacco across quantity (e.g., yield) and quality dimensions under a voluntary quality constraint. Since 1961, flue-cured tobacco breeders in the United States have been subject to the Minimum Standards Program that sets limits on acceptable quality characteristics for commercial tobacco varieties. We implement a Bayesian hierarchical model to measure the contribution of breeding efforts to changes in tobacco yields and quality between 1954 and 2017. The Bayesian model addresses limited data for varieties in the trials and allows easy generation of the necessary parameters of economic interest.
There is renewed interest in levelling up the regions of the UK. The combination of social and political discontent and the sluggishness of key UK macroeconomic indicators, such as productivity growth, has led to increased interest in understanding the regional economies of the UK. In turn, this has led to more investment in economic statistics. Specifically, the Office for National Statistics (ONS) recently started to produce quarterly regional GDP data for the nine English regions and Wales dating back to 2012Q1. This complements existing real GVA data for the regions, available from the ONS on an annual basis back to 1998, with the devolved administrations of Scotland and Northern Ireland producing their own quarterly output measures. In this paper we reconcile these two data sources along with UK quarterly output data that date back to 1970. This enables us to produce both more timely real-terms estimates of quarterly economic growth in the regions of the UK and a new reconciled historical time series of quarterly regional real output data from 1970. We explore a number of features of interest in these new data, including producing a new quarterly regional productivity series and commenting on the evolution of regional productivity growth in the UK.
The generalized linear model (GLM) is a statistical model widely used in actuarial practice, especially for insurance ratemaking. Because property and casualty insurance claim datasets are inherently longitudinal, several attempts have been made to incorporate the unobserved heterogeneity of each policyholder from the repeated observations. Random effects models have been proposed to this end, but theoretical discussion of methods to test for the presence of random effects in the GLM framework remains scarce. In this article, we explore the concept of Bregman divergence, which has useful properties for statistical modeling and can be connected to diverse model selection diagnostics, as in Goh and Dey [(2014) Journal of Multivariate Analysis, 124, 371–383]. Model diagnostics derived from the Bregman divergence can be applied to test the robustness of a chosen prior to possible misspecification of the prior distribution, both under the naive model, which assumes that the random effects follow a point mass prior distribution, and under the proposed model, which assumes a continuous prior density for the random effects. This approach gives insurance companies a concrete framework for testing the presence of nonconstant random effects in both claim frequency and severity, and for selecting an appropriate hierarchical model that explains both the observed and unobserved heterogeneity of policyholders for insurance ratemaking. Both models are calibrated using a claim dataset from the Wisconsin Local Government Property Insurance Fund, which includes observed claim counts and amounts for a portfolio of policyholders.
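The Bregman divergence invoked above is defined as D_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y> for a convex generator phi. A minimal numerical sketch, unrelated to the paper's specific diagnostics, shows the definition and the standard special case phi(x) = ||x||², which recovers squared Euclidean distance.

```python
import numpy as np

def bregman(phi, grad_phi, x, y):
    """Bregman divergence D_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y>."""
    return phi(x) - phi(y) - np.dot(grad_phi(y), x - y)

# With phi(x) = ||x||^2, the Bregman divergence reduces to the squared
# Euclidean distance ||x - y||^2
phi = lambda v: float(np.dot(v, v))
grad_phi = lambda v: 2.0 * v
```

Other choices of phi recover other familiar divergences (e.g. negative entropy yields the Kullback–Leibler divergence), which is what makes the family useful for building model selection diagnostics.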
This study evaluates the effects of vegetative soil conservation practices (afforestation and/or bamboo planting) on farm profit and its components, revenue and variable cost. Since farmers self-select into adopting conservation measures, evaluating their soil conservation practices is subject to selection bias. We address the selection bias using propensity score matching. We also check whether spatial spillover exists in the adoption of vegetative conservation measures and how it affects matching. We use primary survey data from the Darjeeling district of the Eastern Himalayan region for the year 2013. Our results suggest strong spatial correlation. We find that the propensity score estimated from the spatial model provides better matches than the non-spatial model. While the results show that vegetative soil conservation can lead to significant gains in revenue, it also increases costs, so that no significant gains in profit accrue to farmers.
Basis forecasting is important for producers and consumers of agricultural commodities in their risk management decisions. However, the best performing forecasting model found in previous studies varies substantially. Given this inconsistency, we take a Bayesian approach, which addresses model uncertainty by combining forecasts from different models. Results show model performance differs by location and forecast horizon, but the forecast from the Bayesian approach often performs favorably. In some cases, however, the simple moving averages have lower forecast errors. Besides the nearby basis, we also examine basis in a specific month and find that regression-based models outperform others in longer horizons.
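The forecast-combination idea above can be sketched with a simple weighting scheme. This is an illustrative approximation, not the paper's estimator: weights proportional to the exponentiated log predictive score stand in for posterior model probabilities, and all numbers are hypothetical.

```python
import numpy as np

def bma_combine(forecasts, log_scores):
    """Combine competing model forecasts with weights proportional to
    exp(log predictive score), a standard BMA-style approximation."""
    w = np.exp(log_scores - np.max(log_scores))  # stabilised softmax
    w = w / w.sum()
    return float(w @ forecasts), w

# Hypothetical basis forecasts (cents/bu) from three competing models,
# with hypothetical historical log predictive scores
forecasts = np.array([-25.0, -30.0, -20.0])
log_scores = np.array([-10.0, -12.0, -11.0])
combined, weights = bma_combine(forecasts, log_scores)
```

Models that predicted well historically receive more weight, so the combined forecast adapts by location and horizon as relative model performance shifts.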
Policy-critical, micro-level statistical data are often unavailable at the desired level of disaggregation. We present a Bayesian methodology for “downscaling” aggregated count data to the micro level, using an outside statistical sample. Our procedure combines numerical simulation with exact calculation of combinatorial probabilities. We motivate our approach with an application estimating the number of farms in a region, using count totals at higher levels of aggregation. In a simulation analysis over varying population sizes, we demonstrate both robustness to sampling variability and outperformance relative to maximum likelihood. Spatial considerations, implementation of “informative” priors, non-spatial classification problems, and best practices are discussed.
We present a demonstration of a Bayesian spatial probit model for
dichotomous-choice contingent valuation willingness-to-pay (WTP)
questions. If voting behavior is spatially correlated, spatial
interdependence exists within the data, and standard probit models will
result in biased and inconsistent estimated nonbid coefficients. Adjusting
sample WTP to population WTP requires unbiased estimates of the nonbid
coefficients, and we find a $17 difference in population WTP per household
in a standard vs. spatial model. We conclude that failure to correctly model
spatial dependence can lead to differences in WTP estimates with potentially
important policy ramifications.
This study investigates the determinants of producers' adoption of some Best Management Practices (BMPs). Priors about the signs of certain variables are explicitly accounted for by testing inequality restrictions through importance sampling. Education, gender, age, and on-farm residence are found to have significant effects on the adoption of some BMPs. Farms with larger animal production are more apt to implement manure management practices, crop rotation, and riparian buffer strips. Also, farms with more cultivated acres are more inclined to implement herbicide control practices, crop rotation, and riparian buffer strips. Belonging to an agro-environment club has a positive impact on the adoption of most BMPs.
Hedging is one of the most important risk management decisions that farmers make and has a potentially large role in the level of profit eventually earned from farming. Using panel data from a survey of Georgia farmers that recorded their hedging decisions for 4 years on four crops, we examine the role of habit, demographics, farm characteristics, and information sources on the hedging decisions made by 57 different farmers. We find that the role of habit varies widely and that estimation of a single habit effect suffers from aggregation bias. Thus, modeling farmer-level heterogeneity in the examination of habit and hedging is crucial.
Psychologists and economists have proposed various heuristics to account
for behavior in financial markets. These heuristics highlight the cognitive
biases that affect individual beliefs and go some way toward explaining the
anomalies observed in financial markets. The experiment conducted here
tests the conservatism, representativeness, and anchoring-and-adjustment
heuristics in a dynamic setting of fifteen periods: in each period,
subjects receive a piece of financial information and individually revise
their beliefs about the quality of a firm. The observed beliefs prove
incompatible with the hypothesis of Bayesian revision: subjects tend to
overweight small probabilities and underweight large ones. The
representativeness heuristic is likewise invalidated: the econometric
analysis shows that subjects underweight the strongest signals, evidence
that they do not exploit their informational intensity. The conservatism
and anchoring-and-adjustment hypotheses are, by contrast, jointly
validated: subjects underweight new information when revising their
beliefs, but this revision behavior is fully conditioned on whether
subjects move away from, or toward, an anchor value.
This paper proposes a new kind of asymmetric GARCH where the conditional variance obeys two different regimes with a smooth transition function. In one formulation, the conditional variance reacts differently to negative and positive shocks while in a second formulation, small and big shocks have separate effects. The introduction of a threshold allows for a mixed effect. A Bayesian strategy, based on the comparison between posterior and predictive Bayesian residuals, is built for detecting the presence and the shape of non-linearities. The method is applied to the Brussels and Tokyo stock indexes. The attractiveness of an alternative parameterisation of the GARCH model is emphasised as a potential solution to some numerical problems.
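The smooth-transition conditional variance described above can be sketched as a simple recursion. This is a generic illustration of the asymmetric-GARCH idea with a logistic transition function, not the authors' exact specification or Bayesian detection procedure; all parameter values are hypothetical.

```python
import math

def logistic(x, gamma, c):
    """Smooth transition function: near 0 for x << c, near 1 for x >> c."""
    return 1.0 / (1.0 + math.exp(-gamma * (x - c)))

def st_garch_path(eps, omega, alpha_neg, alpha_pos, beta,
                  gamma=5.0, c=0.0, h0=1.0):
    """Conditional variance whose ARCH coefficient moves smoothly from
    alpha_neg (negative past shocks) to alpha_pos (positive past shocks),
    with threshold c allowing a mixed effect."""
    h = [h0]
    for e in eps[:-1]:
        f = logistic(e, gamma, c)
        alpha = (1.0 - f) * alpha_neg + f * alpha_pos
        h.append(omega + alpha * e * e + beta * h[-1])
    return h
```

With alpha_neg > alpha_pos, a negative shock raises next-period variance more than an equally sized positive shock, which is the leverage-style asymmetry the first formulation captures; replacing the shock sign with its magnitude in the transition argument gives the small-versus-big-shock formulation.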
This paper considers a special non-linear time series problem, that of testing for co-integration in a Bayesian framework when there is a break in the co-integrating relationship. It is shown that a partial linearization of the likelihood function resolves many puzzling questions, in particular the identification and common factor restrictions originally embedded in the model. A generalization of the Jeffreys prior is derived for the dynamic parameter that monitors co-integration. The procedure is applied to a once much-debated question in France concerning the wage regulation policy implemented at the beginning of the eighties.