This paper investigates extremal quantiles under two-way cluster dependence. We show that unconditional intermediate order quantiles in the tails are asymptotically Gaussian. This is remarkable because two-way cluster dependence entails potential non-Gaussianity in general, yet extremal quantiles do not suffer from this issue. Building on this result, we extend our analysis to extremal quantile regressions of intermediate order.
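As a rough illustration of the setting (not the paper's estimator or proof), the following minimal simulation sketch draws samples with additive two-way cluster shocks and computes a left-tail intermediate order quantile at the illustrative rate $\tau_n = n^{-1/2}$, so that $\tau_n \to 0$ while $n\tau_n \to \infty$; under the stated result, the replication draws should look approximately Gaussian.

```python
import numpy as np

rng = np.random.default_rng(0)
G, H, R = 50, 50, 2000              # row clusters, column clusters, replications

q_hats = np.empty(R)
for r in range(R):
    alpha = rng.normal(size=G)      # row-cluster shocks
    beta = rng.normal(size=H)       # column-cluster shocks
    eps = rng.normal(size=(G, H))   # idiosyncratic noise
    y = (alpha[:, None] + beta[None, :] + eps).ravel()

    n = y.size
    tau_n = n ** -0.5               # intermediate order: tau_n -> 0, n * tau_n -> oo
    q_hats[r] = np.quantile(y, tau_n)

# Histogramming q_hats should reveal an approximately Gaussian shape.
print(f"mean {q_hats.mean():.3f}, sd {q_hats.std():.3f}")
```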
Riemannian optimization is concerned with problems where the independent variable lies on a smooth manifold. A number of problems from numerical linear algebra fall into this category, where the manifold is usually specified by special matrix structures, such as orthogonality or definiteness. Following this line of research, we investigate tools for Riemannian optimization on the symplectic Stiefel manifold. We complement the existing set of numerical optimization algorithms with a Riemannian trust region method tailored to the symplectic Stiefel manifold. To this end, we derive a matrix formula for the Riemannian Hessian under a right-invariant metric. Moreover, we propose a novel retraction for approximating the Riemannian geodesics. Finally, we conduct a comparative study that juxtaposes the performance of the Riemannian variants of the steepest descent, conjugate gradient, and trust region methods on selected matrix optimization problems that feature symplectic constraints.
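For context, the symplectic Stiefel manifold is the set of matrices $X \in \mathbb{R}^{2n \times 2p}$ satisfying $X^T J_{2n} X = J_{2p}$, where $J_{2k} = \begin{pmatrix} 0 & I_k \\ -I_k & 0 \end{pmatrix}$. A minimal sketch of this feasibility check (not the paper's trust region method or retraction):

```python
import numpy as np

def J(k):
    """Standard symplectic matrix J_{2k} = [[0, I_k], [-I_k, 0]]."""
    I = np.eye(k)
    Z = np.zeros((k, k))
    return np.block([[Z, I], [-I, Z]])

def on_symplectic_stiefel(X, tol=1e-10):
    """Check the defining constraint X^T J_{2n} X = J_{2p} for X in R^{2n x 2p}."""
    n, p = X.shape[0] // 2, X.shape[1] // 2
    return np.linalg.norm(X.T @ J(n) @ X - J(p)) < tol

# A simple point on the manifold: columns e_1..e_p and e_{n+1}..e_{n+p} of I_{2n}.
n, p = 4, 2
E = np.zeros((2 * n, 2 * p))
E[:n, :p] = np.eye(n)[:, :p]
E[n:, p:] = np.eye(n)[:, :p]
print(on_symplectic_stiefel(E))  # True
```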
Threshold selection is a fundamental problem in any threshold-based extreme value analysis. While models are asymptotically motivated, selecting an appropriate threshold for finite samples is difficult and highly subjective with standard methods. Inference for high quantiles can also be highly sensitive to the choice of threshold. Too low a threshold leads to bias in the fit of the extreme value model, while too high a threshold leads to unnecessary additional uncertainty in the estimation of model parameters. We develop a novel methodology for automated threshold selection that directly tackles this bias-variance trade-off. We also develop a method to account for the uncertainty in the threshold estimation and propagate this uncertainty through to high quantile inference. Through a simulation study, we demonstrate the effectiveness of our method for threshold selection and subsequent extreme quantile estimation, relative to the leading existing methods, and show that the method's effectiveness is not sensitive to its tuning parameters. We apply our method to the well-known, troublesome example of the River Nidd dataset.
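A hedged sketch of the classical parameter-stability diagnostic that such automated procedures aim to replace: fit a generalized Pareto distribution to exceedances over a grid of candidate thresholds and inspect how the estimated shape varies (the data and thresholds below are illustrative, not from the paper):

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(1)
x = rng.standard_t(df=3, size=5000)            # heavy-tailed toy sample

# Fit a GPD to the exceedances over each candidate threshold u; a roughly
# constant shape estimate across u is the traditional (subjective) criterion.
for u in np.quantile(x, [0.90, 0.95, 0.975, 0.99]):
    exc = x[x > u] - u
    xi, _, sigma = genpareto.fit(exc, floc=0)  # shape xi, scale sigma
    print(f"u={u:6.3f}  n_exc={exc.size:4d}  xi={xi:6.3f}  sigma={sigma:6.3f}")
```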
The implication problem for conditional independence (CI) asks whether the fact that a probability distribution obeys a given finite set of CI relations implies that a further CI statement also holds in this distribution. This problem has a long and fascinating history, culminating in positive results about implications now known as the semigraphoid axioms as well as impossibility results about a general finite characterization of CI implications. Motivated by violations of faithfulness assumptions in causal discovery, we study the implication problem in the special setting where the CI relations are obtained from a directed acyclic graph (DAG) model along with one additional CI statement. Focusing on the Gaussian case, we use algebraic techniques to give a complete characterization of when such an implication is graphical. Moreover, prompted by the relevance of strong faithfulness in statistical guarantees for causal discovery algorithms, we give a graphical solution for an approximate CI implication problem, in which we ask whether small values of one additional partial correlation entail small values for yet a further partial correlation.
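For concreteness, the partial correlations in question can be read off the precision matrix via the standard identity $\rho_{ij \cdot \text{rest}} = -\Omega_{ij}/\sqrt{\Omega_{ii}\Omega_{jj}}$ with $\Omega = \Sigma^{-1}$. A minimal sketch (the chain example is illustrative, not from the paper):

```python
import numpy as np

def partial_corr(Sigma, i, j):
    """Partial correlation of X_i and X_j given all remaining coordinates,
    computed from the precision matrix Omega = Sigma^{-1}."""
    Omega = np.linalg.inv(Sigma)
    return -Omega[i, j] / np.sqrt(Omega[i, i] * Omega[j, j])

# Toy Gaussian chain 0 -> 1 -> 2: X_0 and X_2 are conditionally
# independent given X_1, so the partial correlation vanishes.
Sigma = np.array([[1.00, 0.50, 0.25],
                  [0.50, 1.00, 0.50],
                  [0.25, 0.50, 1.00]])
print(partial_corr(Sigma, 0, 2))  # ~ 0: the CI statement X_0 _||_ X_2 | X_1
```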
This paper studies many-to-one assignment markets, or matching markets with wages. Although it is well known that the core of this model is non-empty, the structure of the core has not been fully investigated. To the known dissimilarities with the one-to-one assignment game, we add that the bargaining set does not coincide with the core and that the kernel may not be contained in the core. Moreover, not all extreme core allocations can be obtained by means of a lexicographic maximization or a lexicographic minimization procedure, as is the case in the one-to-one assignment game. The maximum and minimum competitive salaries are characterized in two ways: axiomatically and by means of easily verifiable properties of an associated directed graph. Regarding the remaining extreme core allocations of the many-to-one assignment game, we propose a lexicographic procedure that, for each order on the set of workers, sequentially maximizes or minimizes each worker's competitive salary. This procedure provides all extreme vectors of competitive salaries, that is, all extreme core allocations.
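As background only, the total value that core allocations of an assignment game split is that of an optimal matching; a minimal one-to-one sketch with illustrative surplus numbers (the paper's many-to-one analysis and lexicographic procedure are not reproduced here):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Surplus a_{ij} generated when firm i hires worker j (illustrative numbers).
A = np.array([[5.0, 8.0, 2.0],
              [7.0, 3.0, 6.0],
              [4.0, 5.0, 9.0]])
rows, cols = linear_sum_assignment(A, maximize=True)
print(list(zip(rows, cols)), "total surplus:", A[rows, cols].sum())
```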
This paper derives a discrete dual problem for a prototypical hybrid high-order method for convex minimization problems. The discrete primal and dual problems satisfy a weak convex duality that leads to a priori error estimates with convergence rates under additional smoothness assumptions. This duality holds for general polyhedral meshes and arbitrary polynomial degrees of the discretization. A novel postprocessing is proposed that allows for a posteriori error estimates on regular triangulations into simplices using primal-dual techniques. This motivates an adaptive mesh-refining algorithm, which outperforms uniform mesh refinement.
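The abstract does not specify the marking strategy; assuming the standard Dörfler bulk criterion that commonly drives adaptive mesh refinement, a minimal sketch reads:

```python
import numpy as np

def doerfler_marking(eta, theta=0.5):
    """Mark a minimal set of cells whose squared error indicators eta_T^2
    accumulate at least a theta-fraction of the total estimated error."""
    order = np.argsort(eta**2)[::-1]          # largest contributions first
    csum = np.cumsum(eta[order]**2)
    k = np.searchsorted(csum, theta * csum[-1]) + 1
    return order[:k]                          # indices of cells to refine

eta = np.array([0.9, 0.1, 0.4, 0.05, 0.3])    # toy local error indicators
print(doerfler_marking(eta))                  # [0]: cell 0 alone carries >= 50%
```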
This paper concerns the recovery of differential operators from a noisy Fourier transform. In particular, optimal methods are obtained for the recovery of powers of generalized Laplace operators from a noisy Fourier transform in the $L_2$-metric.
Traditional density and quantile estimators are often inconsistent with each other, and their simultaneous use may lead to contradictory results. To address this issue, we propose a novel smooth density estimator that is naturally consistent with the Harrell-Davis quantile estimator. We also provide a jittering implementation to support discrete-continuous mixture distributions.
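For reference, the Harrell-Davis estimator of the $p$-quantile is a Beta-weighted average of all order statistics, $\hat{q}_p = \sum_i W_{n,i} x_{(i)}$ with $W_{n,i} = I_{i/n}(a,b) - I_{(i-1)/n}(a,b)$, $a = (n+1)p$, $b = (n+1)(1-p)$, where $I_x(a,b)$ is the regularized incomplete beta function. A minimal sketch of the quantile side (the paper's matching density estimator is not reproduced):

```python
import numpy as np
from scipy.stats import beta

def harrell_davis(x, p):
    """Harrell-Davis estimate of the p-quantile: a Beta-weighted average of
    the order statistics x_(1) <= ... <= x_(n)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    a, b = (n + 1) * p, (n + 1) * (1 - p)
    edges = np.arange(n + 1) / n
    w = np.diff(beta.cdf(edges, a, b))   # W_i = I_{i/n}(a,b) - I_{(i-1)/n}(a,b)
    return np.dot(w, x)

rng = np.random.default_rng(2)
x = rng.exponential(size=200)
print(harrell_davis(x, 0.5), np.median(x))  # close, but HD varies smoothly in p
```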
When analyzing large datasets, it is common to select a model prior to making inferences. For reliable inferences, it is important to make adjustments that account for the model selection process, resulting in selective inferences. Our paper introduces an asymptotic pivot for inference about the effects of selected variables on conditional quantile functions. Utilizing estimators from smoothed quantile regression, our proposed pivot is easy to compute and ensures asymptotically exact selective inferences without strict distributional assumptions on the response variable. At the core of the pivot is the use of external randomization, which enables us to use the full sample for both selection and inference without partitioning the data into independent subsets or discarding data at either step. On simulated data, we find that: (i) the asymptotic confidence intervals based on our pivot achieve the desired coverage rates, even in cases where sample splitting fails due to insufficient sample size for inference; (ii) our intervals are consistently shorter than those produced by sample splitting across various models and signal settings. We report similar findings when we apply our approach to study risk factors for low birth weights in a publicly accessible dataset of US birth records from 2022.
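A minimal sketch of the smoothed quantile regression ingredient only (the pivot and external randomization are not reproduced); the log-sigmoid smoother below is an assumed stand-in for whichever smoothing scheme the paper uses, recovering the pinball loss as $h \to 0$:

```python
import numpy as np
from scipy.optimize import minimize

def smoothed_pinball(b, X, y, tau, h=0.05):
    """Smooth surrogate of the quantile (pinball) loss: as h -> 0 this
    recovers rho_tau(u) = u * (tau - 1{u < 0})."""
    u = y - X @ b
    return np.mean(tau * u + h * np.logaddexp(0.0, -u / h))

rng = np.random.default_rng(3)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + rng.normal(size=n)
fit = minimize(smoothed_pinball, x0=np.zeros(2), args=(X, y, 0.9))
print(fit.x)  # approx [2.28, 2.0]: the intercept absorbs the 0.9 normal quantile
```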
We propose a method for obtaining parsimonious decompositions of networks into higher order interactions that can take the form of arbitrary motifs. The method is based on a class of analytically solvable generative models in which vertices are connected via explicit copies of motifs, which in combination with non-parametric priors allow us to infer higher order interactions from dyadic graph data without any prior knowledge of the types or frequencies of such interactions. Crucially, we also consider 'degree-corrected' models that correctly reflect the degree distribution of the network and consequently prove to be a better fit for many real-world networks compared to non-degree-corrected models. We test the presented approach on simulated data, for which we recover the set of underlying higher order interactions to a high degree of accuracy. For empirical networks, the method identifies concise sets of atomic subgraphs from within thousands of candidates that cover a large fraction of edges and include higher order interactions of known structural and functional significance. The method not only produces an explicit higher order representation of the network but also a fit of the network to analytically tractable models, opening new avenues for the systematic study of higher order network structures.
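As a purely descriptive illustration of motifs as candidate higher order interactions (not the paper's generative inference), one can count how many edges of a graph are covered by triangles; the Zachary karate club graph below is a stock example:

```python
import networkx as nx

G = nx.karate_club_graph()

# Collect every edge that appears inside at least one triangle.
tri_edges = {frozenset(e)
             for tri in nx.enumerate_all_cliques(G) if len(tri) == 3
             for e in [(tri[0], tri[1]), (tri[0], tri[2]), (tri[1], tri[2])]}
covered = sum(frozenset(e) in tri_edges for e in G.edges())
print(f"{covered}/{G.number_of_edges()} edges lie in at least one triangle")
```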
We consider the problem of causal inference based on observational data (or the related missing data problem) with a binary or discrete treatment variable. In that context, we study counterfactual density estimation, which provides more nuanced information than counterfactual mean estimation (i.e., the average treatment effect). We impose the shape constraint of log-concavity (a unimodality constraint) on the counterfactual densities, develop doubly robust estimators of the log-concave counterfactual density (based on an augmented inverse-probability weighted pseudo-outcome), and establish the consistency of these estimators in various global metrics. Based on these estimators, we also develop asymptotically valid pointwise confidence intervals for the counterfactual density.
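A minimal sketch of the augmented inverse-probability weighted pseudo-outcome mentioned above, shown for the treated potential outcome with nuisance functions assumed known for illustration (the paper estimates them and feeds the pseudo-outcomes into log-concave density estimation):

```python
import numpy as np

def aipw_pseudo_outcome(y, a, pi_hat, mu1_hat):
    """Augmented IPW pseudo-outcome for the treated potential outcome:
    phi_i = A_i / pi(X_i) * (Y_i - mu_1(X_i)) + mu_1(X_i).
    Its sample mean is a doubly robust estimate of E[Y(1)]."""
    return a / pi_hat * (y - mu1_hat) + mu1_hat

# Toy check with known nuisances: pi(x) = 0.5, mu_1(x) = x.
rng = np.random.default_rng(4)
n = 10000
x = rng.normal(size=n)
a = rng.binomial(1, 0.5, size=n)
y1 = x + rng.normal(size=n)              # potential outcome under treatment
y = np.where(a == 1, y1, 0.0)            # only treated outcomes are observed
phi = aipw_pseudo_outcome(y, a, 0.5, x)
print(phi.mean())                        # ~ E[Y(1)] = 0
```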