This article is concerned with the problem of determining an unknown source of non-potential, external time-dependent perturbations of an incompressible fluid from large-scale observations of the flow field. A relaxation-based approach is proposed for accomplishing this, which makes use of a nonlinear property of the equations of motion to asymptotically enslave small scales to large scales. In particular, an algorithm is introduced that systematically produces approximations of the flow field on the unobserved scales in order to generate an approximation to the unknown force; the process is then repeated to generate an improved approximation of the unobserved scales, and so on. A mathematical proof of convergence of this algorithm is established in the context of the two-dimensional Navier–Stokes equations with periodic boundary conditions under the assumption that the force belongs to the observational subspace of phase space; at each stage of the algorithm, it is shown that the model error, represented as the difference between the approximating and true force, asymptotically decreases to zero in a geometric fashion provided that sufficiently many scales are observed and certain parameters of the algorithm are appropriately tuned.
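The alternating structure of the algorithm described above can be illustrated on a deliberately simple system. The sketch below is a toy analogue on a damped linear ODE, not the paper's 2-D Navier–Stokes setting; all numbers, the nudging gain `mu`, and the force-update rule are illustrative choices. Only the first `k` components of the state are "observed", a nudged copy is relaxed toward those observations, and the residual of the nudging term corrects the guessed force; the error then shrinks geometrically over the outer iterations.

```python
import numpy as np

# Toy analogue of the relaxation (nudging) source-recovery idea:
#   true system:   dx/dt = -A x + f_true
#   nudged model:  dv/dt = -A v + f_guess - mu * P (v - x_obs)
# where P projects onto the observed ("large-scale") components.

rng = np.random.default_rng(0)
n, k = 8, 4                                   # state size, observed components
A = np.diag(np.linspace(1.0, 4.0, n))         # stable, dissipative dynamics
P = np.zeros((n, n)); P[:k, :k] = np.eye(k)   # projection onto observed scales
f_true = P @ rng.normal(size=n)               # force lies in observed subspace
mu, dt = 20.0, 1e-3                           # nudging strength, time step

def relax(f_guess, x, v, steps=20000):
    """Run the true system and the nudged model side by side to steady state."""
    for _ in range(steps):
        x = x + dt * (-A @ x + f_true)
        v = v + dt * (-A @ v + f_guess - mu * P @ (v - x))
    return x, v

f_guess = np.zeros(n)
x, v = rng.normal(size=n), np.zeros(n)
errors = []
for _ in range(5):                            # outer iterations of the algorithm
    x, v = relax(f_guess, x, v)
    f_guess = f_guess - mu * P @ (v - x)      # correct force by nudging residual
    errors.append(np.linalg.norm(f_guess - f_true))
```

For this diagonal toy problem one can check by hand that each outer iteration scales the force error component-wise by a factor a_i/(a_i + mu), so a large nudging gain gives fast geometric convergence, mirroring the convergence statement in the abstract.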
In order to clarify and visualize the actual structural performance of ships in operation and to establish an optimized, data-driven framework for ship design, construction and operation, an industry-academia joint R&D project on the digital twin for ship structures (DTSS) was conducted in Japan. This paper presents the major achievements of the project. The DTSS aims to capture the stress responses over the whole ship structure in waves by data assimilation that merges hull monitoring and numerical simulation. Three data assimilation methods, namely the wave spectrum method, the Kalman filter method, and the inverse finite element method, were used, and their effectiveness was examined through model- and full-scale ship measurements. Methods for predicting short-term extreme responses and long-term cumulative fatigue damage were developed for navigation and maintenance support using statistical approaches. Compared with conventional approaches, response predictions were significantly improved by DTSS through the use of real response data in encountered waves. Utilization scenarios for DTSS in the maritime industry are presented from the viewpoints of navigation support, maintenance support, rule improvement, and product value improvement, together with future research needs for implementation in the maritime industry.
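One of the three assimilation ingredients named above, the Kalman filter, can be sketched in a deliberately tiny scalar form. This is my own generic illustration, not the project's hull-monitoring implementation; the signal, noise levels, and variances are invented for the example: track a slowly drifting response from noisy sensor readings by blending prediction and observation according to their variances.

```python
import numpy as np

# Scalar Kalman filter sketch: state x drifts slowly (process variance q),
# sensors read it with noise (observation variance r).

rng = np.random.default_rng(5)
T = 200
truth = np.cumsum(0.05 * rng.normal(size=T))      # slowly drifting signal
obs = truth + 0.5 * rng.normal(size=T)            # noisy sensor readings

q, r = 0.05 ** 2, 0.5 ** 2                        # process / observation variance
x, p = 0.0, 1.0                                   # state estimate and its variance
est = []
for z in obs:
    p += q                                        # predict: variance grows
    k_gain = p / (p + r)                          # Kalman gain
    x += k_gain * (z - x)                         # update with the innovation
    p *= (1.0 - k_gain)                           # variance shrinks after update
    est.append(x)
est = np.array(est)
```

Because the gain weights the innovation by the relative confidence in model and data, the filtered estimate tracks the truth with a much smaller mean-square error than the raw observations.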
Edited by
Alik Ismail-Zadeh, Karlsruhe Institute of Technology, Germany; Fabio Castelli, Università degli Studi, Florence; Dylan Jones, University of Toronto; Sabrina Sanchez, Max Planck Institute for Solar System Research, Germany
Abstract: Variational data assimilation through the adjoint method is a powerful emerging technique in geodynamics. It allows one to retrodict past states of the Earth’s mantle as optimal flow histories relative to the current state, so that poorly known mantle flow parameters such as rheology and composition can be tested explicitly against observations gleaned from the geologic record. By yielding testable, time-dependent Earth models, the technique links observations from seismology, geology, mineral physics, and paleomagnetism in a dynamically consistent way, greatly enhancing our understanding of the solid Earth system. It motivates three research fronts. The first is computational, because the iterative nature of the technique, combined with the need for Earth models with high spatial and temporal resolution, classifies the task as a grand-challenge problem at the level of exascale computing. The second is seismological, because the seismic mantle state estimate provides key input information for retrodictions but entails substantial uncertainties. This calls for efforts to construct 3D reference and collaborative seismic models, and to account for seismic data uncertainties. The third is geological, because retrodictions necessarily use simplified Earth models and noisy input data. Synthetic tests show that retrodictions always reduce the final-state misfit, regardless of model and data error. So the quality of any retrodiction must be assessed against geological constraints on past mantle flow. Horizontal surface velocities are an input rather than an output of the retrodiction problem; but viable retrodiction tests can be linked to estimates of vertical lithosphere motion induced by mantle convective stresses.
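The retrodiction idea above, recovering an earlier state as the optimal history relative to a present-day observation, can be sketched on a tiny linear model. This is my own minimal illustration, not mantle convection: the forward operator, its size, and the optimisation settings are all invented for the example. The key ingredient is the adjoint operator (here simply the transpose), which propagates the final-state misfit backwards to give the gradient with respect to the initial state.

```python
import numpy as np

# Adjoint-method sketch: recover the initial state x0 of a linear model
# x_{n+1} = M x_n from an observation of the final state alone.

rng = np.random.default_rng(1)
n, N = 5, 40
Q, _ = np.linalg.qr(rng.normal(size=(n, n)))
M = 0.99 * Q                         # a mildly dissipative forward operator

def forward(x0):
    x = x0.copy()
    for _ in range(N):
        x = M @ x
    return x

x0_true = rng.normal(size=n)
obs = forward(x0_true)               # the "present-day" observed state

def cost_and_grad(x0):
    r = forward(x0) - obs            # final-state misfit
    lam = r
    for _ in range(N):               # adjoint sweep: misfit backwards in time
        lam = M.T @ lam
    return 0.5 * r @ r, lam          # lam equals dJ/dx0 for this linear model

x0 = np.zeros(n)
for _ in range(100):                 # plain gradient descent on the initial state
    J, g = cost_and_grad(x0)
    x0 -= 2.0 * g
```

For this linear, well-conditioned toy the descent recovers the initial state essentially exactly; the abstract's caution applies to the real problem, where model error and noisy data mean a reduced final-state misfit does not by itself validate the retrodiction.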
Abstract: Earthquake early warning (EEW) systems aim to provide advance warning of impending strong shaking by predicting earthquake ground motion in real time or near real time. Many EEW systems are based on a strategy which first quickly determines the earthquake hypocentre and magnitude, and then predicts the strength of ground shaking at various locations from the hypocentral distance and magnitude. Recently, however, a new strategy was proposed in which the current seismic wavefield is rapidly estimated by data assimilation, and the future wavefield is then predicted on the basis of the physics of wave propagation. This technique for real-time prediction of ground shaking in EEW does not necessarily require the earthquake hypocentre and magnitude. In this paper, I review real-time shake-mapping and data assimilation for precise estimation of ongoing ground shaking, and the prediction of future shaking in EEW.
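The two-step wavefield strategy described above can be illustrated in one dimension. The sketch below is my own toy, not any reviewed system: a wave pulse is advected across a grid, dense "stations" near the source nudge a model wavefield toward the data during an assimilation window, and then the physics alone predicts the shaking at a distant, uninstrumented site before the wave arrives there.

```python
import numpy as np

def step(u):                      # rightward advection at CFL = 1: an exact shift
    un = np.empty_like(u)
    un[1:] = u[:-1]
    un[0] = 0.0
    return un

x = np.arange(200, dtype=float)
u_true = np.exp(-0.01 * (x - 20.0) ** 2)   # incoming wave pulse
u_est = np.zeros_like(u_true)              # model starts with no wave at all
stations = np.arange(0, 100, 5)            # dense stations near the source

for _ in range(60):                        # step 1: assimilate station data
    u_true, u_est = step(u_true), step(u_est)
    u_est[stations] += 0.5 * (u_true[stations] - u_est[stations])  # nudging

for _ in range(80):                        # step 2: physics-only prediction
    u_true, u_est = step(u_true), step(u_est)

site = 160                                 # uninstrumented site, reached late
```

Note that the predicted amplitude at the site comes entirely from the assimilated wavefield plus the propagation physics; no hypocentre or magnitude was ever estimated, which is exactly the point of the wavefield-based strategy.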
Abstract: We introduce direct and inverse problems, which describe dynamical processes causing change in the Earth system and its space environment. Well-posedness of these problems is defined in the sense of Hadamard and in the sense of Tikhonov, and is linked to the existence, uniqueness, and stability of the problem solution. Some examples of ill- and well-posed problems are considered. Basic knowledge and approaches in data assimilation and in solving inverse problems are discussed, along with errors and uncertainties in data and model parameters, as well as sensitivities of model results. Finally, we briefly review the book’s chapters, which present state-of-the-art knowledge in data assimilation and geophysical inversions and their applications in many disciplines of the Earth sciences: from the Earth’s core to the near-Earth environment.
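For reference, the Hadamard conditions and the regularised reformulation they motivate can be stated compactly (a standard textbook formulation, not a quotation from the chapter):

```latex
A problem $A\,m = d$, mapping model parameters $m \in M$ to data $d \in D$,
is \emph{well-posed in the sense of Hadamard} if
(i) a solution exists for every admissible $d$,
(ii) the solution is unique, and
(iii) the solution depends continuously on $d$.
If any condition fails, the problem is ill-posed, and a stable approximate
solution may be sought by minimising the Tikhonov functional
\[
  J_\alpha(m) \;=\; \|A\,m - d\|_D^2 \;+\; \alpha\,\|m - m_0\|_M^2,
  \qquad \alpha > 0,
\]
where the regularisation parameter $\alpha$ balances the data misfit against
departure from a prior model $m_0$.
```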
Abstract: Geomagnetic data assimilation is a recently established research discipline in geomagnetism. It aims to optimally combine geomagnetic observations and numerical geodynamo models to better estimate the dynamic state of the Earth’s outer core, and to predict geomagnetic secular variation. Over the past decade, rapid advances have been made in geomagnetic data assimilation on various fronts by several research groups around the globe, such as using geomagnetic data assimilation to understand and interpret the observed geomagnetic secular variation, estimating part of the core state that is not observable on the Earth’s surface, and making geomagnetic forecasts on multi-year time scales. In parallel, efforts have also been made on proxy systems for understanding fundamental statistical properties of geomagnetic data assimilation, and for developing algorithms tailored specifically for geomagnetic data assimilation. In this chapter, we provide a comprehensive overview of these advances, as well as some of the immediate challenges of geomagnetic data assimilation, and possible solutions and pathways to move forward.
Abstract: There is a fundamental need to understand and reduce the errors and uncertainties associated with estimates of seasonal snow analysis and prediction. Over the past few decades, snow cover remote sensing techniques have increased in accuracy, but the retrieval of spatially and temporally continuous estimates of snow depth or snow water equivalent remains a challenging task. Model-based snow estimates often bear significant uncertainties due to model structure and error-prone forcing data and parameter estimates. A potential method to overcome model and observational shortcomings is data assimilation. Data assimilation leverages the information content in both observations and models while minimising the inherent limitations that result from uncertainty. This chapter reviews current snow models, snow remote sensing methods, and data assimilation techniques that can reduce uncertainties in the characterisation of seasonal snow.
Abstract: Geomagnetic data assimilation aims at constraining the state of the geodynamo operating in the Earth’s deep interior with sparse magnetic observations at and above the Earth’s surface. Owing to the difficulty of separating the different magnetic field sources in the observations, spectral models of the geomagnetic field are generally used as inputs for data assimilation. However, the assimilation of raw pointwise observations can be relevant in certain configurations, specifically with paleomagnetic and historical geomagnetic data. Covariance localisation, which is a key ingredient of assimilation performance in an ensemble framework, is relatively unexplored, and differs between spectral and pointwise observations. This chapter introduces the main characteristics of geomagnetic data and magnetic field models, and explores the role of model and observation covariances and localisation in typical assimilation set-ups, focusing on the use of 3D dynamo simulations as the background model.
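Why covariance localisation matters in an ensemble framework can be shown with a generic 1-D sketch (my own illustration, not the chapter's geodynamo set-up; the grid, ensemble size, and taper are all invented): a small ensemble yields a sample covariance contaminated by spurious long-range correlations, and tapering it element-wise with a smooth distance-based function (standing in here for a Gaspari–Cohn taper) suppresses them.

```python
import numpy as np

rng = np.random.default_rng(2)
n, n_ens = 50, 10
d = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
C_true = np.exp(-d / 5.0)                 # true correlations decay quickly
L = np.linalg.cholesky(C_true)
ens = L @ rng.normal(size=(n, n_ens))     # draw a small ensemble

C_samp = np.cov(ens)                      # noisy sample covariance (10 members)
taper = np.exp(-((d / 10.0) ** 2))        # localisation taper, ~zero at long range
C_loc = C_samp * taper                    # Schur (element-wise) product

err_raw = np.linalg.norm(C_samp - C_true)
err_loc = np.linalg.norm(C_loc - C_true)
```

Localisation trades a small bias at mid-range (the taper slightly damps genuine correlations) for a large reduction in long-range sampling noise, which is why the localised covariance is closer to the truth here. How to choose the taper when observations are spectral rather than pointwise is precisely the kind of question the chapter explores.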
Uncertainty quantification (UQ) plays a crucial role in data assimilation (DA), since it affects both the quality of the reconstruction and the near-future forecast. However, traditional UQ approaches are often limited in their ability to handle complex datasets and may have a large computational cost. In this paper, we present a new ensemble-based approach that extends the 4DVarNet framework, an end-to-end deep learning scheme built on variational DA and used to estimate the mean of the state over a given DA window. We use conditional 4DVarNet simulations compliant with the available observations to estimate the 4DVarNet probability density function. Our approach combines the efficiency of 4DVarNet, in terms of computational cost and validation performance, with fast and memory-saving Monte Carlo post-processing of the reconstruction, leading to the so-called En4DVarNet estimation of the state pdf. We demonstrate our approach in a case study involving sea surface height: 4DVarNet is pretrained on an idealized Observing System Simulation Experiment (OSSE), then applied to a real-world dataset (an Observing System Experiment, OSE). Independent realizations of the state are sampled from the catalogue of model-based data used during training. To illustrate our approach, we use a nadir altimeter constellation from January 2017 and show how the uncertainties retrieved by combining 4DVarNet with the statistical properties of the training dataset provide relevant information, in most cases yielding a confidence interval consistent with the CryoSat-2 nadir along-track dataset kept for validation.
Many contemporary problems within the Earth sciences are complex and require an interdisciplinary approach. This book provides a comprehensive reference on data assimilation and inverse problems, as well as their applications across a broad range of geophysical disciplines. With contributions from world-leading researchers, it covers basic knowledge about geophysical inversions and data assimilation and discusses a range of important research issues and applications in atmospheric and cryospheric sciences, hydrology, geochronology, geodesy, geodynamics, geomagnetism, gravity, near-Earth electron radiation, seismology, and volcanology. Highlighting the importance of research in data assimilation for understanding dynamical processes of the Earth and its space environment and for predictability, it summarizes relevant new advances in data assimilation and inverse problems related to different geophysical fields. Covering both theory and practical applications, it is an ideal reference for researchers and graduate students within the geosciences who are interested in inverse problems, data assimilation, predictability, and numerical methods.
Machine learning (ML) and physics have been merging in three areas: (a) computationally expensive components of physical models can be replaced by inexpensive ML models, giving rise to hybrid models; (b) in physics-informed machine learning, ML models can be trained to satisfy the laws of physics (e.g. conservation of energy or mass) either approximately or exactly; and (c) in forecasting, ML models can be combined with numerical/dynamical models under data assimilation.
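Area (b), enforcing a physical law exactly, can be as simple as a post-hoc projection when the constraint is linear. The sketch below is a generic illustration (the "layer masses" and numbers are invented): raw ML outputs are replaced by the closest vector, in the least-squares sense, that satisfies a known conservation constraint.

```python
import numpy as np

def project_to_conservation(y, total):
    """Closest vector to y (least-squares) whose entries sum to `total`:
    the orthogonal projection onto the constraint hyperplane sum(y) = total."""
    return y + (total - y.sum()) / y.size

raw = np.array([2.3, 1.1, 0.7, 1.2])        # hypothetical raw ML outputs
fixed = project_to_conservation(raw, total=5.0)   # known conserved column total
```

Approximate enforcement, by contrast, would add a penalty such as `(y.sum() - total)**2` to the training loss instead of projecting after the fact; the projection route guarantees exact conservation at inference time.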
Data assimilation of flow measurements is an essential tool for extracting information in fluid dynamics problems. Recent works have shown that physics-informed neural networks (PINNs) enable the reconstruction of unsteady fluid flows, governed by the Navier–Stokes equations, provided the network is given enough flow measurements that are appropriately distributed in time and space. In many practical applications, however, experimental measurements involve only time-averaged quantities or their higher-order statistics, which are governed by the under-determined Reynolds-averaged Navier–Stokes (RANS) equations. In this study, we perform PINN-based reconstruction of time-averaged quantities of an unsteady flow from sparse velocity data. The applied technique leverages the time-averaged velocity data to infer unknown closure quantities (the curl of the unsteady RANS forcing), as well as to interpolate the fields from sparse measurements. Furthermore, the method’s capabilities are extended to the assimilation of Reynolds stresses, where PINNs successfully interpolate the data to complete the velocity and stress fields and gain insight into the pressure field of the investigated flow.
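The core idea, jointly recovering a mean field and an unknown closure term from sparse data by imposing the governing balance, can be shown with a linear toy analogue (my own sketch, not the paper's PINN; the 1-D balance u'' = f stands in for the RANS momentum equation, and all grid sizes, bases, and noise levels are invented). Because everything here is linear, the joint reconstruction reduces to one least-squares problem.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 101
x = np.linspace(0.0, 1.0, n)
h = x[1] - x[0]
B = np.stack([np.sin(k * np.pi * x) for k in (1, 2, 3)], axis=1)  # smooth basis for f
u_true = np.sin(2.0 * np.pi * x)
f_true = -(2.0 * np.pi) ** 2 * np.sin(2.0 * np.pi * x)            # u_true'' = f_true

meas = rng.choice(np.arange(1, n - 1), size=20, replace=False)    # sparse data points
m = meas.size
A = np.zeros((m + 2 + (n - 2), n + 3))            # unknowns: u (n values) and c (3 coeffs)
b = np.zeros(A.shape[0])
for r, i in enumerate(meas):                      # measurement rows: u[i] = data
    A[r, i] = 1.0
    b[r] = u_true[i] + 0.005 * rng.normal()
A[m, 0] = 1.0                                     # boundary rows: u(0) = u(1) = 0
A[m + 1, n - 1] = 1.0
for r, i in enumerate(range(1, n - 1)):           # physics rows: u'' - B c = 0
    A[m + 2 + r, [i - 1, i, i + 1]] = np.array([1.0, -2.0, 1.0]) / h ** 2
    A[m + 2 + r, n:] = -B[i]

sol, *_ = np.linalg.lstsq(A, b, rcond=None)
u_rec, c = sol[:n], sol[n:]
f_rec = B @ c                                     # inferred "closure" field
```

Twenty noisy point values of u suffice to pin down both the full field and the forcing, because the physics rows tie every grid value of u to the three forcing coefficients, exactly the role the momentum residual plays in the PINN loss.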
Satellite imagery can detect temporary cloud trails, or ship tracks, formed from aerosols emitted by large ships traversing our oceans, a phenomenon that global climate models cannot directly reproduce. Ship tracks are observable examples of marine cloud brightening, a potential solar climate intervention that shows promise in helping combat climate change. In this paper, we demonstrate a simulation-based approach to learning the behavior of ship tracks based upon a novel stochastic emulation mechanism. Our method uses wind fields to determine the movement of aerosol–cloud tracks and a stochastic partial differential equation (SPDE) to model their persistence behavior. This SPDE incorporates a drift term and a diffusion term, which describe the movement of aerosol particles by wind and their diffusivity through the atmosphere, respectively. We first present our proposed approach with examples using simulated wind fields and ship paths. We then demonstrate our tool by applying the approximate Bayesian computation–sequential Monte Carlo (ABC–SMC) method for data assimilation.
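The drift-diffusion behaviour described above can be sketched at the particle level with an Euler–Maruyama step (an illustrative toy, not the paper's SPDE or emulator; the wind vector, diffusion coefficient, and particle count are invented): each aerosol "parcel" is advected by a constant wind while spreading diffusively.

```python
import numpy as np

# Euler–Maruyama integration of dX = wind dt + sigma dW for many parcels.
rng = np.random.default_rng(3)
n, T, dt = 20000, 1.0, 0.01
wind = np.array([2.0, 0.5])          # drift: wind velocity
sigma = 0.8                          # diffusion coefficient
X = np.zeros((n, 2))                 # all parcels released at the origin
for _ in range(int(T / dt)):
    X += wind * dt + sigma * np.sqrt(dt) * rng.normal(size=X.shape)
```

After time T the parcel cloud is centred at `wind * T` with per-coordinate variance `sigma**2 * T`, the two statistics that the drift and diffusion terms respectively control, which is what an ABC-style fit exploits when matching simulated tracks to observed ones.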
The growth of machine learning (ML) in environmental science can be divided into a slow phase lasting until the mid-2010s and a fast phase thereafter. The rapid transition was brought about by the emergence of powerful new ML methods, allowing ML to successfully tackle many problems where numerical models and statistical models had been hampered. Deep convolutional neural network models greatly advanced the use of ML on 2D or 3D data. Transfer learning has allowed ML to progress in climate science, where data records are generally too short for ML. ML and physics are also merging in new areas, for example: (a) using ML for general circulation model parametrization, (b) adding physics constraints in ML models, and (c) using ML in data assimilation.
We propose an improved adjoint-based method for the reconstruction and prediction of the nonlinear wave field from coarse-resolution measurement data. We adopt the data assimilation framework, using an adjoint equation to search for the optimal initial wave field such that the wave field simulated at later times matches the given measurement data. Compared with the conventional approach, in which the optimised initial surface elevation and velocity potential are independent of each other, our method features an additional constraint that dynamically connects these two control variables through the dispersion relation of waves. The performance of our new method and the conventional method is assessed with nonlinear wave data generated from phase-resolved nonlinear wave simulations using the high-order spectral method. We consider a variety of wave steepnesses and noise levels for the nonlinear irregular waves. It is found that the conventional method tends to overestimate the surface elevation in the high-frequency region and underestimate the velocity potential. In comparison, our new method shows significantly improved performance in the reconstruction and prediction of instantaneous surface elevation, surface velocity potential and high-order wave statistics, including the skewness and kurtosis.
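For a single linear deep-water wave component, the kind of constraint that links the two control variables can be written out explicitly (standard linear wave theory, not the paper's exact formulation):

```latex
\eta(x,t) = a\cos(kx - \omega t), \qquad
\phi(x,z,t) = \frac{a g}{\omega}\, e^{kz} \sin(kx - \omega t), \qquad
\omega^2 = g k ,
```

so that once the elevation amplitude $a$ of a mode is prescribed, the corresponding velocity-potential amplitude $ag/\omega$ is fixed by the dispersion relation $\omega = \sqrt{gk}$, rather than the two fields being optimised as independent unknowns.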
MIT professor Edward Lorenz made a serendipitous discovery about weather in the early 1960s, using a digital computer. He found that a small error in the initial conditions for a weather forecast could grow exponentially into a large error within days. This came to be known as the Butterfly Effect, one of the founding principles of chaos theory. To obtain initial conditions for weather forecasts, data from hundreds of weather balloons launched daily around the world are combined with satellite data through a procedure called data assimilation. The Butterfly Effect limits the useful period of weather prediction to about two weeks in advance. Carrying out an ensemble of weather forecasts, starting from different initial conditions, allows probabilistic predictions that extend beyond this limit. The philosophical concept of determinism is introduced, as is the concept of Laplace’s Demon, motivating the notion of a Climate Demon that predicts climate perfectly.
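Lorenz's discovery is easy to reproduce with his own 1963 three-variable system. The sketch below (a simple forward-Euler integration with standard parameter values; step size and perturbation are illustrative) runs two trajectories from initial conditions differing by one part in a hundred million and records their separation.

```python
import numpy as np

# Lorenz (1963) system with the classic parameters sigma, rho, beta.
def lorenz_step(s, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return s + dt * np.array([
        sigma * (y - x),
        x * (rho - z) - y,
        x * y - beta * z,
    ])

a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-8, 0.0, 0.0])     # "butterfly-sized" perturbation
sep = []
for _ in range(2500):                   # 25 model time units
    a, b = lorenz_step(a), lorenz_step(b)
    sep.append(float(np.linalg.norm(a - b)))
```

The separation grows roughly exponentially until the two trajectories decorrelate completely, saturating at the size of the attractor: the tiny initial error has destroyed all forecast skill, which is the Butterfly Effect in miniature and the reason ensembles of perturbed initial conditions are used for probabilistic prediction.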
This paper presents a new algorithm for lidar data assimilation relying on a new forward model. Current mapping algorithms suffer from multiple shortcomings, which can be related to the lack of a clear forward model. To address these issues, we provide a mathematical framework in which we show how the use of coarse model parameters results in a new data assimilation problem. Understanding this new problem proves essential for deriving sound inference algorithms. We introduce a model parameter specifically tailored to lidar data assimilation, which closely relates to the local mean free path. Using this new model parameter, we derive its associated forward model and provide the resulting mapping algorithm. We further discuss how our proposed algorithm relates to the usual occupancy grid mapping. Finally, we present an example with real lidar measurements.
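For contrast with the paper's approach, the usual occupancy grid mapping it relates to can be sketched in its standard log-odds form (a textbook illustration, not the paper's new forward model; the increment values are arbitrary): each beam decrements the log-odds of the cells it traverses and increments the cell where it hits.

```python
import numpy as np

l_free, l_occ = -0.4, 0.85        # log-odds increments (illustrative values)

def integrate_beam(logodds, cells, hit):
    """cells: indices traversed by the beam, in order.
    hit: True if the beam ended on an obstacle at cells[-1] (vs. max range)."""
    for c in cells[:-1]:
        logodds[c] += l_free      # traversed cells look more free
    logodds[cells[-1]] += l_occ if hit else l_free
    return logodds

grid = np.zeros(10)               # 1-D map; log-odds 0 means p(occupied) = 0.5
for _ in range(5):                # five identical beams hitting cell 6
    grid = integrate_beam(grid, list(range(7)), hit=True)
prob = 1.0 / (1.0 + np.exp(-grid))  # convert log-odds back to probabilities
```

The log-odds form makes repeated beam updates additive and keeps probabilities in (0, 1); cells never visited by any beam remain at 0.5. The paper's point is that this classical update lacks an explicit forward model, which its mean-free-path parameterisation supplies.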
To improve Antarctic sea-ice simulations and estimations, an ensemble-based Data Assimilation System for the Southern Ocean (DASSO) was developed based on a regional sea ice–ocean coupled model, which assimilates sea-ice thickness (SIT) together with sea-ice concentration (SIC) derived from satellites. To validate the performance of DASSO, experiments were conducted from 15 April to 14 October 2016. Generally, assimilating SIC and SIT suppresses the overestimation of sea ice seen in the model's free run. In addition to accounting for uncertainties in the operational atmospheric forcing data, a covariance inflation procedure in the data assimilation further improves the simulation of Antarctic sea ice, especially SIT. The results demonstrate the effectiveness of assimilating sea-ice observations in reconstructing the state of Antarctic sea ice, and also highlight the necessity of more reasonable error estimation for both the background and the observations.
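The covariance inflation step mentioned above has a very simple multiplicative form in ensemble systems (a generic sketch, not DASSO's implementation; sizes and the inflation factor are illustrative): spreading each member about the ensemble mean by a factor lambda > 1 inflates the sample covariance by exactly lambda squared, counteracting the chronic spread underestimation of small ensembles.

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.normal(size=(3, 20))               # 3 state variables, 20 members
lam = 1.1                                  # inflation factor (illustrative)
mean = X.mean(axis=1, keepdims=True)
X_infl = mean + lam * (X - mean)           # multiplicative covariance inflation
```

The ensemble mean is untouched; only the spread (and hence the weight given to observations in the subsequent update) changes.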
The Real Time Mesoscale Analysis (RTMA), a two-dimensional variational analysis algorithm, is used to provide hourly analyses of surface sensible weather elements for situational awareness at a spatial resolution of 3 km over Alaska. In this work we focus on the analysis of horizontal visibility in Alaska, a region prone to weather-related aviation accidents that are due in part to a relatively sparse observation network. We evaluate the impact of assimilating estimates of horizontal visibility derived from a novel network of web cameras in Alaska into the RTMA. Results suggest that the web-camera-derived estimates of visibility can capture low-visibility conditions and have the potential to improve the RTMA visibility analysis under instrument flight rules and low instrument flight rules conditions.
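The analysis update underlying variational schemes of this kind can be written in the textbook optimal-interpolation form (a generic 1-D sketch, not RTMA's actual implementation; the grid, covariances, and "camera" locations are invented): a model background is blended with point observations, weighted by their respective error covariances, and the background error correlations spread the observational information to nearby grid points.

```python
import numpy as np

n = 20
x_b = np.zeros(n)                          # background field (departures from guess)
d = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
B = np.exp(-d / 3.0)                       # background error covariance
H = np.zeros((2, n)); H[0, 4] = 1.0; H[1, 15] = 1.0   # two observation sites
R = 0.1 * np.eye(2)                        # observation error covariance
y = np.array([1.0, -0.5])                  # observed departures at the two sites

K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # gain matrix
x_a = x_b + K @ (y - H @ x_b)              # analysis: background + weighted innovation
```

Each analysed value is pulled most strongly at the observation sites themselves and progressively less at distant grid points, which is how a sparse camera network can still improve a gridded visibility analysis.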