In this paper we revisit the discrepancy principle for Tikhonov regularization of nonlinear ill-posed problems in Hilbert spaces and provide new and improved saturation results under less restrictive conditions compared with the existing results in the literature.
This paper contains two major contributions. First we derive, following the discrete de Rham (DDR) and Virtual Element (VEM) paradigms, pressure-robust methods for the Stokes equations that support arbitrary orders and polyhedral meshes. Unlike other methods presented in the literature, pressure-robustness is achieved here without resorting to an $\boldsymbol{H}({\rm div})$-conforming construction on a submesh, but rather by projecting the volumetric force onto the discrete $\boldsymbol{H}({\bf curl})$ space. The cancellation of the pressure error contribution stems from key commutation properties of the underlying DDR and VEM complexes. The pressure-robust error estimates in $h^{k+1}$ (with $h$ denoting the mesh size and $k\ge 0$ the polynomial degree of the DDR or VEM complex) are proven theoretically and supported by a panel of three-dimensional numerical tests. The second major contribution of the paper is an in-depth study of the relations between the DDR and VEM approaches. We show, in particular, that a complex developed following one paradigm admits a reformulation in the other, and that pairs of related DDR and VEM complexes satisfy commuting diagram properties with the degrees-of-freedom maps.
In the study of extremes, the presence of asymptotic independence signifies that extreme events across multiple variables are unlikely to occur simultaneously. Although well understood in a bivariate context, the concept remains relatively unexplored when addressing the nuances of the joint occurrence of extremes in higher dimensions. In this paper, we propose a notion of mutual asymptotic independence to capture the behavior of joint extremes in dimensions larger than two and contrast it with the classical notion of (pairwise) asymptotic independence. Furthermore, we define $k$-wise asymptotic independence, which lies between pairwise and mutual asymptotic independence. The concepts are compared using examples of Archimedean, Gaussian and Marshall-Olkin copulas, among others. Notably, for the popular Gaussian copula, we provide explicit conditions on the correlation matrix for mutual asymptotic independence to hold; moreover, we are able to compute exact tail orders for various tail events.
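For reference, the classical bivariate notion that the paper's mutual and $k$-wise versions strengthen can be stated via the upper tail dependence coefficient (notation ours, not necessarily the paper's):

```latex
\lambda \;=\; \lim_{u \to 1^-} \Pr\big(F_1(X_1) > u \,\big|\, F_2(X_2) > u\big)
\;=\; \lim_{u \to 1^-} \frac{1 - 2u + C(u,u)}{1-u},
```

where $C$ is the copula of $(X_1,X_2)$; asymptotic independence corresponds to $\lambda = 0$. Pairwise asymptotic independence imposes this condition on each pair of coordinates separately, whereas a mutual notion must control joint tail events over all coordinates at once, which is strictly stronger in dimensions larger than two.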
This paper considers both least squares and quasi-maximum likelihood estimation for the recently proposed scalable ARMA model, a parametric infinite-order vector AR model, and establishes the asymptotic normality of both estimators. This makes inference on this computationally efficient model feasible, especially for economic and financial time series. An efficient block coordinate descent algorithm is further introduced to search for estimates, and a Bayesian information criterion with selection consistency is suggested for model selection. Simulation experiments are conducted to illustrate the finite-sample performance of the estimators, and a real application to six macroeconomic indicators illustrates the usefulness of the proposed methodology.
This paper presents the first analysis of parameter-uniform convergence for a hybridizable discontinuous Galerkin (HDG) method applied to a singularly perturbed convection-diffusion problem in 2D using a Shishkin mesh. The primary difficulty lies in accurately estimating the convection term in the layer, where existing methods often fall short. To address this, a novel error control technique is employed, along with reasonable assumptions on the stabilization function. The results show that, with polynomials of degree at most $k$, the method achieves supercloseness of almost order $k+\frac{1}{2}$ in an energy norm. Numerical experiments confirm the theoretical accuracy and efficiency of the proposed method.
Stable infiniteness, strong finite witnessability, and smoothness are model-theoretic properties relevant to theory combination in satisfiability modulo theories. Theories that are strongly finitely witnessable and smooth are called strongly polite and can be effectively combined with other theories. Toledo, Zohar, and Barrett conjectured that stably infinite and strongly finitely witnessable theories are smooth and therefore strongly polite. They called counterexamples to this conjecture unicorn theories, as their existence seemed unlikely. We prove that, indeed, unicorns do not exist. We also prove versions of the L\"owenheim-Skolem theorem and the {\L}o\'s-Vaught test for many-sorted logic.
This paper studies the influence of probabilism and non-determinism on some quantitative aspect X of the execution of a system modeled as a Markov decision process (MDP). To this end, the novel notion of demonic variance is introduced: For a random variable X in an MDP M, it is defined as 1/2 times the maximal expected squared distance of the values of X in two independent executions of M in which the non-deterministic choices are also resolved independently by two distinct schedulers. It is shown that the demonic variance is between 1 and 2 times as large as the maximal variance of X in M that can be achieved by a single scheduler. This allows defining a non-determinism score for M and X measuring how strongly the difference of X in two executions of M can be influenced by the non-deterministic choices. Properties of MDPs M with extremal values of the non-determinism score are established. Further, the algorithmic problems of computing the maximal variance and the demonic variance are investigated for two random variables, namely weighted reachability and accumulated rewards. In the process, the structure of schedulers maximizing the variance and of scheduler pairs realizing the demonic variance is also analyzed.
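In symbols (a sketch in our notation, writing $X^{(1)}, X^{(2)}$ for the values of $X$ in two independent executions of $M$ under schedulers $\sigma_1, \sigma_2$), the definition above reads:

```latex
\mathbb{V}^{\mathrm{dem}}_{M}(X) \;=\; \tfrac{1}{2}\,\sup_{\sigma_1,\sigma_2}\,
\mathbb{E}_{\sigma_1,\sigma_2}\!\big[\big(X^{(1)} - X^{(2)}\big)^2\big].
```

Taking $\sigma_1 = \sigma_2 = \sigma$ and using the identity $\tfrac12\,\mathbb{E}\big[(X^{(1)}-X^{(2)})^2\big] = \mathrm{Var}_\sigma(X)$ for i.i.d. copies shows that the demonic variance is at least the maximal single-scheduler variance, consistent with the factor between 1 and 2 stated in the abstract.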
Triply periodic minimal surfaces (TPMS) have garnered significant interest due to their structural efficiency and controllable geometry, making them suitable for a wide range of applications. This paper investigates the relationships of porosity and persistence entropy with the shape factor of TPMS. We propose conjectures, derived through the application of machine learning techniques, suggesting that these relationships are polynomial in nature. This study exemplifies the integration of machine learning methodologies in pure mathematical research. Besides the conjectures, we provide mathematical models that may have implications for the design and modeling of TPMS structures in various practical applications.
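The kind of data-driven polynomial conjecture described above can be illustrated with a minimal sketch: fit a low-degree polynomial to samples of a quantity against a shape parameter and inspect the residual. The data here are synthetic, generated from an assumed quadratic law purely for illustration; they are not the paper's TPMS measurements.

```python
import numpy as np

# Hypothetical illustration: relate a shape-factor parameter t to a
# porosity-like quantity via a least-squares polynomial fit.
rng = np.random.default_rng(0)
t = np.linspace(-0.5, 0.5, 41)            # shape-factor samples (assumed range)
truth = 0.5 + 0.8 * t - 0.3 * t**2        # assumed ground-truth quadratic law
noisy = truth + rng.normal(0.0, 1e-3, t.size)  # simulated measurement noise

coeffs = np.polyfit(t, noisy, deg=2)      # fit a degree-2 polynomial
residual = np.max(np.abs(np.polyval(coeffs, t) - truth))
print(residual)                            # small residual supports a quadratic model
```

A small residual relative to the noise level is what would support a polynomial conjecture of the chosen degree; in practice one would also compare fits across degrees to avoid overfitting.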
We describe a simple deterministic near-linear time approximation scheme for uncapacitated minimum cost flow in undirected graphs with real edge weights, a problem also known as transshipment. Specifically, our algorithm takes as input a (connected) undirected graph $G = (V, E)$, vertex demands $b \in \mathbb{R}^V$ such that $\sum_{v \in V} b(v) = 0$, positive edge costs $c \in \mathbb{R}_{>0}^E$, and a parameter $\varepsilon > 0$. In $O(\varepsilon^{-2} m \log^{O(1)} n)$ time, it returns a flow $f$ such that the net flow out of each vertex is equal to the vertex's demand and the cost of the flow is within a $(1 + \varepsilon)$ factor of optimal. Our algorithm is combinatorial and has no running time dependency on the demands or edge costs. With the exception of a recent result presented at STOC 2022 for polynomially bounded edge weights, all almost- and near-linear time approximation schemes for transshipment relied on randomization to embed the problem instance into low-dimensional space. Our algorithm instead deterministically approximates the cost of routing decisions that would be made if the input were subject to a random tree embedding. To avoid computing the $\Omega(n^2)$ vertex-vertex distances that an approximation of this kind suggests, we also take advantage of the clustering method used in the well-known Thorup-Zwick distance oracle.
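As a point of reference for the problem being solved (not the paper's algorithm): in the special case of a single supply vertex and a single demand vertex, uncapacitated min-cost flow in an undirected graph reduces exactly to a shortest-path computation, since all flow travels along a cheapest path. A minimal sketch with Dijkstra's algorithm:

```python
import heapq

def shortest_path_cost(n, edges, s, t):
    """Dijkstra on an undirected graph with positive edge costs."""
    adj = [[] for _ in range(n)]
    for u, v, c in edges:
        adj[u].append((v, c))
        adj[v].append((u, c))
    dist = [float("inf")] * n
    dist[s] = 0.0
    pq = [(0.0, s)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist[u]:
            continue  # stale queue entry
        for v, c in adj[u]:
            if d + c < dist[v]:
                dist[v] = d + c
                heapq.heappush(pq, (dist[v], v))
    return dist[t]

# Demands b = (+2, 0, -2): one supply at vertex 0, one sink at vertex 2.
# Optimal transshipment cost = supply * shortest-path distance.
edges = [(0, 1, 1.0), (1, 2, 1.0), (0, 2, 3.0)]
cost = 2.0 * shortest_path_cost(3, edges, 0, 2)
print(cost)  # 4.0: two units routed via 0 -> 1 -> 2 at cost 2.0 each
```

The general problem, with arbitrary demand vectors $b$, is what the near-linear time scheme above approximates to within a $(1+\varepsilon)$ factor.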
In this paper we combine the principled approach to modalities from multimodal type theory (MTT) with the computationally well-behaved realization of identity types from cubical type theory (CTT). The result -- cubical modal type theory (Cubical MTT) -- has the desirable features of both systems. In fact, the whole is more than the sum of its parts: Cubical MTT validates desirable extensionality principles for modalities that MTT only supported through ad hoc means. We investigate the semantics of Cubical MTT and provide an axiomatic approach to producing models of Cubical MTT based on the internal language of topoi and use it to construct presheaf models. Finally, we demonstrate the practicality and utility of this axiomatic approach to models by constructing a model of (cubical) guarded recursion in a cubical version of the topos of trees. We then use this model to justify an axiomatization of L\"ob induction and thereby use Cubical MTT to smoothly reason about guarded recursion.
With the rising frequency and intensity of climate extremes, robust analytical tools are crucial to predict their impacts on terrestrial ecosystems. Machine learning techniques show promise but require well-structured, high-quality, and curated analysis-ready datasets. Earth observation datasets comprehensively monitor ecosystem dynamics and responses to climatic extremes, yet their complexity can challenge the effectiveness of machine learning models. Despite recent progress in applying deep learning to ecosystem monitoring, there is a need for datasets specifically designed to analyse the impacts of compound heatwave and drought extremes. Here, we introduce the DeepExtremeCubes database, tailored to mapping around these extremes and focusing on persistent natural vegetation. It comprises over 40,000 globally sampled small data cubes (i.e. minicubes), each with a spatial coverage of 2.5 by 2.5 km. Each minicube includes (i) Sentinel-2 L2A images, (ii) ERA5-Land variables and a generated extreme-event cube covering 2016 to 2022, and (iii) ancillary land cover and topography maps. The paper aims to (1) streamline data accessibility, structuring, and pre-processing, and enhance scientific reproducibility, and (2) facilitate biosphere dynamics forecasting in response to compound extremes.