Using $\Gamma$-convergence, we study the Cahn-Hilliard problem with interface width parameter $\varepsilon > 0$ for phase transitions on manifolds with conical singularities. We prove that minimizers of the corresponding energy functional exist and converge, as $\varepsilon \to 0$, to a function that takes only two values with an interface along a hypersurface that has minimal area among those satisfying a volume constraint. In a numerical example, we use continuation and bifurcation methods to study families of critical points at small $\varepsilon > 0$ on 2D elliptical cones, parameterized by height and ellipticity of the base. Some of these critical points are minimizers with interfaces crossing the cone tip. On the other hand, we prove that interfaces which are minimizers of the perimeter functional, corresponding to $\varepsilon = 0$, never pass through the cone tip for general cones with angle less than $2\pi$. Thus tip minimizers for finite $\varepsilon > 0$ must become saddles as $\varepsilon \to 0$, and we numerically identify the associated bifurcation, finding a delicate interplay of $\varepsilon > 0$ and the cone parameters in our example.
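The abstract leaves the energy functional implicit; as a point of reference, the standard Modica-Mortola form of an $\varepsilon$-diffuse-interface energy (an assumption here, since the paper's exact functional is not displayed) is
\[
E_\varepsilon(u) \;=\; \int_M \Big( \frac{\varepsilon}{2}\,|\nabla u|^2 + \frac{1}{\varepsilon}\,W(u) \Big)\,\mathrm{d}\mathrm{vol}, \qquad W(u) = \tfrac{1}{4}\,\big(1-u^2\big)^2,
\]
minimized subject to a mass constraint $\int_M u\,\mathrm{d}\mathrm{vol} = m$. As $\varepsilon \to 0$, such functionals $\Gamma$-converge, up to the constant $c_0 = \int_{-1}^{1}\sqrt{2W(s)}\,\mathrm{d}s$, to the perimeter of the interface, which is the mechanism behind the minimal-area limit described above.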
We investigate inexact proximity operators for weakly convex functions. To this end, we derive sum rules for proximal $\epsilon$-subdifferentials, incorporating the moduli of weak convexity of the functions into the respective formulas. These rules allow us to characterize inexact proximity operators of weakly convex functions in terms of proximal $\epsilon$-subdifferentials.
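As background for the objects above, a sketch of the standard definitions (the paper's proximal $\epsilon$-subdifferential may be localized differently):
\[
\operatorname{prox}_{\lambda f}(x) \;=\; \operatorname*{arg\,min}_{y} \Big\{ f(y) + \tfrac{1}{2\lambda}\|y - x\|^2 \Big\}, \qquad \partial_\epsilon f(x) \;=\; \big\{ v : f(y) \ge f(x) + \langle v, y - x \rangle - \epsilon \ \text{for all } y \big\},
\]
where $f$ is $\rho$-weakly convex if $f + \tfrac{\rho}{2}\|\cdot\|^2$ is convex. For $\lambda < 1/\rho$ the proximal objective is strongly convex, so $\operatorname{prox}_{\lambda f}$ is single-valued and inexact evaluations can be quantified through $\epsilon$-subdifferentials of this objective.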
In the present paper, we examine a Crouzeix-Raviart approximation of the $p(\cdot)$-Dirichlet problem. We derive a \textit{medius} error estimate, \textit{i.e.}, a best-approximation result, which holds for uniformly continuous exponents and implies \textit{a priori} error estimates, which apply for H\"older continuous exponents and are optimal for Lipschitz continuous exponents. Numerical experiments are carried out to support the theoretical findings.
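For reference, the $p(\cdot)$-Dirichlet problem is the Euler-Lagrange equation of the variable-exponent energy (stated here generically; the paper's precise data and boundary conditions are not shown in the abstract):
\[
\min_{v \in W^{1,p(\cdot)}_0(\Omega)} \int_\Omega \frac{1}{p(x)}\,|\nabla v|^{p(x)}\,\mathrm{d}x - \int_\Omega f\,v\,\mathrm{d}x, \qquad \text{i.e.,} \quad -\operatorname{div}\big(|\nabla u|^{p(x)-2}\,\nabla u\big) = f \ \text{in } \Omega, \quad u = 0 \ \text{on } \partial\Omega.
\]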
Many analyses of multivariate data focus on evaluating the dependence between two sets of variables, rather than the dependence among individual variables within each set. Canonical correlation analysis (CCA) is a classical data analysis technique that estimates parameters describing the dependence between such sets. However, inference procedures based on traditional CCA rely on the assumption that all variables are jointly normally distributed. We present a semiparametric approach to CCA in which the multivariate margins of each variable set may be arbitrary, but the dependence between variable sets is described by a parametric model that provides low-dimensional summaries of dependence. While maximum likelihood estimation in the proposed model is intractable, we propose two estimation strategies: one using a pseudolikelihood for the model and one using a Markov chain Monte Carlo (MCMC) algorithm that provides Bayesian estimates and confidence regions for the between-set dependence parameters. The MCMC algorithm is derived from a multirank likelihood function, which uses only part of the information in the observed data in exchange for being free of assumptions about the multivariate margins. We apply the proposed Bayesian inference procedure to Brazilian climate data and monthly stock returns from the materials and communications market sectors.
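For context, a minimal numpy sketch of the classical, jointly-normal CCA estimator that the semiparametric model relaxes: whiten each block and take an SVD of the cross-covariance. The names (`X`, `Y`, `k`, `reg`) are illustrative, not from the paper.

```python
# Classical CCA via whitening + SVD: a baseline sketch, not the paper's method.
import numpy as np

def cca(X, Y, k=2, reg=1e-8):
    """Top-k canonical correlations and weights for X (n x p) and Y (n x q)."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]
    Sxx = X.T @ X / n + reg * np.eye(X.shape[1])   # within-set covariances
    Syy = Y.T @ Y / n + reg * np.eye(Y.shape[1])
    Sxy = X.T @ Y / n                              # between-set covariance
    Lx = np.linalg.cholesky(Sxx)                   # Sxx = Lx Lx^T
    Ly = np.linalg.cholesky(Syy)
    M = np.linalg.solve(Lx, Sxy)                   # whitened cross-covariance:
    M = np.linalg.solve(Ly, M.T).T                 #   Lx^{-1} Sxy Ly^{-T}
    U, s, Vt = np.linalg.svd(M)
    A = np.linalg.solve(Lx.T, U[:, :k])            # canonical weights for X
    B = np.linalg.solve(Ly.T, Vt[:k].T)            # canonical weights for Y
    return s[:k], A, B
```

The semiparametric model keeps this low-dimensional description of between-set dependence while letting the multivariate margins of each set be arbitrary.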
At STOC 2002, Eiter, Gottlob, and Makino presented a technique called ordered generation that yields an $n^{O(d)}$-delay algorithm listing all minimal transversals of an $n$-vertex hypergraph of degeneracy $d$. Recently, at IWOCA 2019, Conte, Kant\'e, Marino, and Uno asked whether this XP-delay algorithm parameterized by $d$ could be made FPT-delay for a weaker notion of degeneracy, or even parameterized by the maximum degree $\Delta$, i.e., whether it can be turned into an algorithm with delay $f(\Delta)\cdot n^{O(1)}$ for some computable function $f$. Moreover, as a first step toward answering that question, they noted that they could not achieve these time bounds even in the particular case of enumerating minimal dominating sets. In this paper, using ordered generation, we show that an FPT-delay algorithm can be devised for the enumeration of minimal transversals parameterized by the degeneracy and the dimension, giving a positive and more general answer to the latter question.
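For readers outside enumeration complexity, the notions referenced above, sketched informally (the paper's exact notion of hypergraph degeneracy may differ from the usual graph one):
\[
\text{XP-delay: delay } n^{f(k)}, \qquad \text{FPT-delay: delay } f(k)\cdot n^{O(1)},
\]
where the delay of an enumeration algorithm is the maximum time elapsing before the first output, between consecutive outputs, and after the last one.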
This work generalizes the binary search problem to a $d$-dimensional domain $S_1\times\cdots\times S_d$, where $S_i=\{0, 1, \ldots,n_i-1\}$ and $d\geq 1$, in the following way. Given the target element $(t_1,\ldots,t_d)$ to be found, the result of comparing a selected element $(x_1,\ldots,x_d)$ is a sequence of inequalities, each stating that either $t_i < x_i$ or $t_i > x_i$ for $i\in\{1,\ldots,d\}$, of which at least one is correct, and the algorithm does not know the coordinate $i$ on which the correct direction to the target is given. Among other cases, we show asymptotically almost matching lower and upper bounds on the query complexity of $\Omega(n^{d-1}/d)$ and $O(n^{d-1})$ in the case $n_i=n$; in particular, for fixed $d$ these bounds asymptotically match. This problem is equivalent to classical binary search in one dimension and shows interesting differences in higher dimensions. For example, if one imposes that each of the $d$ inequalities be correct, then the search can be completed in $\log_2\max\{n_1,\ldots,n_d\}$ queries. In an intermediate model, in which the algorithm knows which one of the inequalities is correct, $\log_2(n_1\cdots n_d)$ queries suffice. The latter follows from a graph search model proposed by Emamjomeh-Zadeh et al. [STOC 2016].
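To illustrate the intermediate model mentioned above, a small sketch: each answer reveals a coordinate $i$ on which the reported direction is guaranteed correct, so the candidate interval in that coordinate halves, giving about $\log_2(n_1\cdots n_d)$ queries in total. The oracle and all names below are illustrative, not from the paper.

```python
# Intermediate model: the oracle reveals which coordinate's direction is correct.
import random

def search(oracle, sizes):
    """oracle(x) returns None if x is the target, else (i, d) with
    d in {'lt', 'gt'} guaranteed correct in coordinate i."""
    lo = [0] * len(sizes)
    hi = [n - 1 for n in sizes]
    queries = 0
    while True:
        x = [(l + h) // 2 for l, h in zip(lo, hi)]
        ans = oracle(x)
        queries += 1
        if ans is None:
            return x, queries
        i, d = ans
        if d == 'lt':          # t_i < x_i: discard the upper half in coordinate i
            hi[i] = x[i] - 1
        else:                  # t_i > x_i: discard the lower half
            lo[i] = x[i] + 1

def make_oracle(t):
    """Honest but adversarial oracle: reveals some coordinate where x differs from t."""
    def oracle(x):
        wrong = [(i, 'lt' if t[i] < x[i] else 'gt')
                 for i in range(len(t)) if t[i] != x[i]]
        return random.choice(wrong) if wrong else None
    return oracle

print(search(make_oracle([3, 9, 14]), [16, 16, 16]))  # finds [3, 9, 14]
```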
Are asymptotic confidence sequences and anytime $p$-values uniformly valid for a nontrivial class of distributions $\mathcal{P}$? We give a positive answer to this question by deriving distribution-uniform anytime-valid inference procedures. Historically, anytime-valid methods -- including confidence sequences, anytime $p$-values, and sequential hypothesis tests that enable inference at stopping times -- have been justified nonasymptotically. Nevertheless, asymptotic procedures, such as those based on the central limit theorem, occupy an important part of the statistical toolbox due to their simplicity, universality, and weak assumptions. While recent work has derived asymptotic analogues of anytime-valid methods with the aforementioned benefits, these were not shown to be $\mathcal{P}$-uniform, meaning that their asymptotics are not uniformly valid over a class of distributions $\mathcal{P}$. Indeed, the anytime-valid inference literature currently has no central limit theory to draw from that is uniform both in $\mathcal{P}$ and in the sample size $n$. This paper fills that gap by deriving a novel $\mathcal{P}$-uniform strong Gaussian approximation theorem. We apply some of these results to obtain an anytime-valid test of conditional independence without the Model-X assumption, as well as a $\mathcal{P}$-uniform law of the iterated logarithm.
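Schematically, the distribution-uniform guarantee sought here strengthens the usual time-uniform one (a rendering of the general shape, not the paper's exact statement):
\[
\inf_{P \in \mathcal{P}} P\big(\forall n \ge 1 : \mu(P) \in C_n\big) \;\ge\; 1 - \alpha - o(1),
\]
where $(C_n)_{n\ge 1}$ is the confidence sequence for the parameter $\mu(P)$ and the $o(1)$ coverage shortfall must vanish uniformly over $\mathcal{P}$, rather than pointwise for each fixed $P$.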
We consider an observed subcritical Galton-Watson process $\{Y_n,\ n\in \mathbb{Z} \}$ with a correlated stationary immigration process $\{\epsilon_n,\ n\in \mathbb{Z} \}$. Two situations are considered. The first is when $\mbox{Cov}(\epsilon_0,\epsilon_k)=0$ for $k$ larger than some $k_0$: a consistent estimator for the reproduction rate and the mean immigration rate is given, and a central limit theorem is proved. The second is when $\{\epsilon_n,\ n\in \mathbb{Z} \}$ has a general correlation structure: under mixing assumptions, we exhibit an estimator for the logarithm of the reproduction rate and prove that it converges in quadratic mean at an explicit rate. In addition, when the mixing coefficients decrease fast enough, we establish a two-term expansion for the estimator. Numerical illustrations are provided.
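As a simulation baseline (not the paper's estimators, which handle correlated immigration), the classical conditional least-squares fit of $Y_{n+1} \approx m\,Y_n + \lambda$ under i.i.d. immigration; the offspring/immigration laws and parameters below are illustrative.

```python
# Subcritical Galton-Watson process with immigration, plus conditional
# least-squares estimates of the reproduction mean m and immigration mean lam.
import numpy as np

rng = np.random.default_rng(0)
m_true, lam_true, N = 0.6, 2.0, 20_000

Y = np.empty(N + 1, dtype=np.int64)
Y[0] = 5
for n in range(N):
    offspring = rng.poisson(m_true, size=Y[n]).sum()   # Poisson offspring law
    Y[n + 1] = offspring + rng.poisson(lam_true)       # i.i.d. immigration

# OLS on Y_{n+1} = m * Y_n + lam + noise
A = np.column_stack([Y[:-1], np.ones(N)])
m_hat, lam_hat = np.linalg.lstsq(A, Y[1:], rcond=None)[0]
print(f"m_hat = {m_hat:.3f}, lam_hat = {lam_hat:.3f}")
```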
In this work, we propose an efficient nullspace-preserving saddle search (NPSS) method for a class of phase transitions involving translational invariance, where the critical states are often degenerate. The NPSS method consists of two stages: escaping from the basin and searching for the index-1 generalized saddle point. It climbs upward from the generalized local minimum in segments to overcome the challenges posed by degeneracy. In each segment, an effective ascent direction is ensured by keeping it orthogonal to the nullspace of the segment's initial state. The method escapes the basin quickly and converges to the transition states. We apply the NPSS method to phase transitions between crystals, and between crystals and quasicrystals, based on the Landau-Brazovskii and Lifshitz-Petrich free energy functionals. Numerical results demonstrate the good performance of the NPSS method.
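To illustrate the projection idea in the ascent stage, a generic gentlest-ascent-style sketch (not the authors' NPSS method, whose segmented climbing and basin-escape stage are omitted here): the climbing direction `v` is repeatedly projected off a known nullspace basis `N` (e.g., translation modes) before a reflected-gradient step. `grad`, `N`, and all parameters are placeholders.

```python
# Generic index-1 saddle search with nullspace projection (illustrative only).
import numpy as np

def project_out(v, N):
    """Remove the components of v in the column span of N (columns orthonormal)."""
    return v - N @ (N.T @ v)

def saddle_search(grad, x0, N, v0, dt=1e-2, steps=10_000, tol=1e-8):
    x, v = x0.copy(), v0.copy()
    for _ in range(steps):
        g = grad(x)
        v = project_out(v, N)                    # keep ascent direction off the nullspace
        v /= np.linalg.norm(v)
        x = x - dt * (g - 2.0 * (g @ v) * v)     # ascend along v, descend elsewhere
        if np.linalg.norm(project_out(g, N)) < tol:  # stationary modulo nullspace
            break
    return x
```

In practice `v` is also rotated toward the softest Hessian mode, as in dimer-type methods; that update is omitted here for brevity.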
We describe a new dependent-rounding algorithmic framework for bipartite graphs. Given a fractional assignment $\vec x$ of values to edges of a graph $G = (U \cup V, E)$, the algorithms return an integral solution $\vec X$ such that each right-node $v \in V$ has at most one neighboring edge $f$ with $X_f = 1$, and the variables $X_e$ also satisfy broad nonpositive-correlation properties. In particular, for any edges $e_1, e_2$ sharing a left-node $u \in U$, the variables $X_{e_1}, X_{e_2}$ have strong negative-correlation properties, i.e., the expectation of $X_{e_1} X_{e_2}$ is significantly below $x_{e_1} x_{e_2}$. The algorithm is based on generating negatively correlated exponential random variables and using them in a contention-resolution scheme inspired by an algorithm of Im & Shadloo (2020); ours gives stronger and much more flexible negative-correlation properties. Dependent-rounding schemes with negative-correlation properties have been used in approximation algorithms for scheduling jobs on unrelated machines to minimize weighted completion times (Bansal, Srinivasan, & Svensson (2021); Im & Shadloo (2020); Im & Li (2023)). Using our new dependent-rounding algorithm, among other improvements, we obtain a $1.398$-approximation for this problem, significantly improving on the prior $1.45$-approximation of Im & Li (2023).
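An illustrative exponential-clock contention-resolution step for a single right-node $v$: each incident edge $e$ draws an arrival time $\mathrm{Exp}(x_e)$, and $v$ accepts the earliest arrival; a dummy clock of rate $1 - \sum_e x_e$ lets $v$ stay unmatched with the right probability, so $\Pr[e \text{ selected}] = x_e$ exactly. The paper's algorithm draws these exponentials negatively correlated across edges sharing a left-node, which this independent sketch omits; the names below are illustrative.

```python
# One right-node's contention-resolution step via independent exponential clocks.
import numpy as np

rng = np.random.default_rng(1)

def round_right_node(x):
    """x: fractional values of the edges into v, with sum(x) <= 1.
    Returns the index of the selected edge, or None if v stays unmatched."""
    x = np.asarray(x, dtype=float)
    slack = 1.0 - x.sum()
    rates = np.append(x, slack)                       # dummy clock for 'no match'
    clocks = rng.exponential(1.0 / np.maximum(rates, 1e-300))
    winner = int(np.argmin(clocks))                   # P[edge e wins] = x_e / 1
    return None if winner == len(x) else winner
```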
We consider the problem of counting 4-cycles ($C_4$) in an undirected graph $G$ of $n$ vertices and $m$ edges (in bipartite graphs, 4-cycles are also often referred to as $\textit{butterflies}$). A number of previous algorithms for this problem are based on sorting the graph by degree and using randomized hash tables. These are appealing in theory due to compact storage and fast access on average, but the performance of hash tables can degrade unpredictably, and they are also vulnerable to adversarial inputs. We develop a new, simpler algorithm for counting $C_4$ requiring $O(m\bar\delta(G))$ time and $O(n)$ space, where $\bar \delta(G) \leq O(\sqrt{m})$ is the $\textit{average degeneracy}$ parameter introduced by Burkhardt, Faber & Harris (2020). It has several practical advantages over previous algorithms; for example, it is fully deterministic, does not require any sorting of the input graph, and uses only addition and array access in its inner loops. To the best of our knowledge, all previous efficient algorithms for $C_4$ counting have required $\Omega(m)$ space in addition to storing the input graph. Our algorithm is very simple and easily adapted to count 4-cycles incident to each vertex and edge. Empirical tests demonstrate that our array-based approach is $4\times$ -- $7\times$ faster on average than popular hash-table implementations.
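For flavor, a simplified array-based 4-cycle counter in the classical Chiba-Nishizeki style: process vertices in non-increasing-degree order and count 2-paths to later-ordered vertices with a plain integer array, no hashing. This sketch still sorts, unlike the paper's algorithm, and does not reproduce the $O(m\bar\delta(G))$ analysis; it only illustrates the hash-free inner loop.

```python
# Array-based C4 counting: each pair of 2-paths u-v1-w, u-v2-w closes one 4-cycle.
def count_c4(n, adj):
    """adj[u] = list of neighbors of u; returns the number of 4-cycles in G."""
    order = sorted(range(n), key=lambda u: -len(adj[u]))
    pos = [0] * n
    for i, u in enumerate(order):
        pos[u] = i
    L = [0] * n                        # L[w] = number of 2-paths u-*-w seen so far
    total = 0
    for u in order:
        touched = []
        for v in adj[u]:
            if pos[v] <= pos[u]:       # only walk toward later-ordered vertices,
                continue               # so each C4 is counted at its first vertex
            for w in adj[v]:
                if pos[w] <= pos[u] or w == u:
                    continue
                total += L[w]          # each previously seen 2-path closes a C4
                L[w] += 1
                touched.append(w)
        for w in touched:              # cheap reset: no full O(n) re-zeroing scan
            L[w] = 0
    return total
```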