In this work, we propose extropy measures based on the density copula, the distributional copula, and the survival copula, and explore their properties. We study the effect of monotone transformations on the proposed measures and obtain bounds. We establish connections between cumulative copula extropy and three dependence measures: Spearman's rho, Kendall's tau, and Blest's measure of rank correlation. Finally, we propose estimators for the cumulative copula extropy and survival copula extropy with an illustration using real-life datasets.
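As a concrete illustration of the rank-based dependence measures that cumulative copula extropy is connected to, the following minimal Python sketch computes empirical-copula pseudo-observations and the sample versions of Spearman's rho and Kendall's tau. This is generic rank-correlation code on synthetic data, not the paper's extropy estimator:

```python
import numpy as np

def pseudo_observations(x, y):
    """Rank-transform a bivariate sample to the unit square (empirical copula points)."""
    n = len(x)
    u = np.argsort(np.argsort(x)) + 1  # ranks 1..n (ties a.s. absent for continuous data)
    v = np.argsort(np.argsort(y)) + 1
    return u / (n + 1), v / (n + 1)

def spearman_rho(x, y):
    """Spearman's rho: Pearson correlation of the rank-transformed sample."""
    u, v = pseudo_observations(x, y)
    return np.corrcoef(u, v)[0, 1]

def kendall_tau(x, y):
    """Kendall's tau: normalized count of concordant minus discordant pairs."""
    n = len(x)
    s = 0
    for i in range(n):
        for j in range(i + 1, n):
            s += np.sign((x[i] - x[j]) * (y[i] - y[j]))
    return 2 * s / (n * (n - 1))

rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = x + 0.5 * rng.normal(size=200)  # positively dependent synthetic pair
```

Both statistics are invariant under strictly increasing transformations of the marginals, which is why copula-based measures can be expressed through them.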
This paper explores an iterative coupling approach for solving linear thermo-poroelasticity problems, used as a high-fidelity finite element discretization in the training of projection-based reduced order models. One of the main challenges in coupled multi-physics problems is their complexity and computational expense. In this study, we introduce a decoupled iterative solution approach, integrated with reduced order modeling, aimed at improving the efficiency of the computational algorithm. The iterative coupling technique we employ builds upon the established fixed-stress splitting scheme that has been extensively investigated for Biot's poroelasticity. Leveraging solutions produced by this iterative scheme, the reduced order model applies an additional Galerkin projection onto a reduced basis spanned by a small number of modes obtained through proper orthogonal decomposition. The effectiveness of the proposed algorithm is demonstrated through numerical experiments, showcasing its computational efficiency.
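The reduced-basis construction described above can be sketched in a few lines: proper orthogonal decomposition extracts dominant modes from a snapshot matrix via an SVD, and a Galerkin projection restricts a full-order linear system to the span of those modes. The sketch below uses synthetic low-rank snapshots and a diagonal stand-in operator `A`, not the paper's thermo-poroelastic discretization:

```python
import numpy as np

# Toy snapshot matrix: columns play the role of solution states
# collected from the iteratively coupled full-order solver.
rng = np.random.default_rng(1)
S = rng.normal(size=(100, 3)) @ rng.normal(size=(3, 40))  # 40 snapshots, rank 3

# POD: truncated SVD of the snapshot matrix, keeping enough modes
# to capture (nearly) all of the snapshot energy.
U, s, _ = np.linalg.svd(S, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.9999)) + 1
V = U[:, :r]  # reduced basis

# Galerkin projection of a full-order linear system A x = b
# onto the reduced basis: solve the small r x r system instead.
A = np.diag(np.linspace(1.0, 2.0, 100))
b = S[:, 0]
x_r = np.linalg.solve(V.T @ A @ V, V.T @ b)
x_approx = V @ x_r
```

By construction, the reduced solution satisfies Galerkin orthogonality: the full-order residual is orthogonal to the reduced basis.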
We propose a finite difference scheme for the numerical solution of a two-dimensional singularly perturbed convection-diffusion partial differential equation whose solution features interacting boundary and interior layers, the latter due to discontinuities in the source term. The problem is posed on the unit square. The second derivative is multiplied by a singular perturbation parameter, $\epsilon$, while the nature of the first derivative term is such that the flow is aligned with a boundary. Together, these two facts mean that solutions tend to exhibit layers of both exponential and characteristic type. We solve the problem using a finite difference method, specially adapted to the discontinuities and applied on a piecewise-uniform (Shishkin) mesh. We prove that the computed solution converges to the true one at a rate that is independent of the perturbation parameter and is nearly first-order. We present numerical results verifying that these bounds are sharp.
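A minimal sketch of the standard one-dimensional piecewise-uniform (Shishkin) mesh construction is given below. The paper's mesh is two-dimensional with additional transition points for the interior layer, so this only illustrates the basic idea; the constant `sigma_factor` and the layer location at $x=1$ are illustrative choices:

```python
import numpy as np

def shishkin_mesh(N, eps, b=1.0, sigma_factor=2.0):
    """Piecewise-uniform Shishkin mesh on [0, 1] with a boundary layer at x = 1.

    The transition point tau = min(1/2, sigma_factor * eps / b * ln N) splits
    the domain: half the mesh intervals resolve the layer region [1 - tau, 1],
    the other half cover the outer region [0, 1 - tau] on a coarse grid.
    """
    tau = min(0.5, sigma_factor * eps / b * np.log(N))
    coarse = np.linspace(0.0, 1.0 - tau, N // 2 + 1)
    fine = np.linspace(1.0 - tau, 1.0, N // 2 + 1)
    return np.concatenate([coarse, fine[1:]])  # drop duplicated transition point

x = shishkin_mesh(64, 1e-4)
```

As $\epsilon \to 0$ the fine region shrinks like $\epsilon \ln N$, which is what allows error bounds independent of the perturbation parameter.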
In the Bayes paradigm and for a given loss function, we propose the construction of a new type of posterior distribution, which extends the classical Bayes posterior, for estimating the law of an $n$-sample. The loss functions we have in mind are based on the total variation and Hellinger distances as well as some $\mathbb{L}_{j}$-ones. We prove that, with a probability close to one, this new posterior distribution concentrates its mass in a neighbourhood of the law of the data, for the chosen loss function, provided that this law belongs to the support of the prior or, at least, lies close enough to it. We thereby establish that the new posterior distribution enjoys some robustness properties with respect to a possible misspecification of the prior, or more precisely, of its support. For the total variation and squared Hellinger losses, we also show that the posterior distribution keeps its concentration properties when the data are only independent, hence not necessarily i.i.d., provided that most of their marginals, or the average of these, are close enough to some probability distribution around which the prior puts enough mass. The posterior distribution is therefore also stable with respect to the equidistribution assumption. We illustrate these results by several applications. We consider the problems of estimating a location parameter, or both the location and the scale of a density, in a nonparametric framework. Finally, we also tackle the problem of estimating a density, with the squared Hellinger loss, in a high-dimensional parametric model under some sparsity conditions. The results established in this paper are non-asymptotic and provide, as much as possible, explicit constants.
In this work, we present a physics-informed neural network (PINN) model applied to dynamic problems in solid mechanics, covering both forward and inverse problems. In particular, we show how a PINN model can be used efficiently for material identification in a dynamic setting. Throughout, we assume linear continuum elasticity. We first show results for a two-dimensional (2D) plane-strain problem and then apply the same techniques to a three-dimensional (3D) problem. For training data, we use solutions computed with the finite element method. We rigorously show that PINN models are accurate, robust and computationally efficient, especially as surrogate models for material identification problems. We also employ state-of-the-art techniques from the PINN literature that improve on the vanilla PINN implementation. Based on our results, we believe that the framework we have developed can be readily adapted to computational platforms for solving multiple dynamic problems in solid mechanics.
We introduce a novel sampler, called the energy-based diffusion generator, for generating samples from arbitrary target distributions. The sampling model employs a structure similar to a variational autoencoder, utilizing a decoder to transform latent variables from a simple distribution into random variables approximating the target distribution, and we design an encoder based on the diffusion model. Leveraging the powerful modeling capacity of the diffusion model for complex distributions, we can obtain an accurate variational estimate of the Kullback-Leibler divergence between the distributions of the generated samples and the target. Moreover, we propose a decoder based on generalized Hamiltonian dynamics to further enhance sampling performance. Through empirical evaluation, we demonstrate the effectiveness of our method across various complex distributions, showcasing its superiority over existing methods.
In the analysis of cluster randomized trials, two typical features are that individuals within a cluster are correlated and that the total number of clusters can sometimes be limited. While model-robust treatment effect estimators have been recently developed, their asymptotic theory requires the number of clusters to approach infinity, and one often has to empirically assess the applicability of those methods in finite samples. To address this challenge, we propose a conformal causal inference framework that achieves the target coverage probability of treatment effects in finite samples without the need for asymptotic approximations. Meanwhile, we prove that this framework is compatible with arbitrary working models, including machine learning algorithms leveraging baseline covariates, possesses robustness against arbitrary misspecification of working models, and accommodates a variety of within-cluster correlations. Under this framework, we offer efficient algorithms to make inferences on treatment effects at both the cluster and individual levels, applicable to user-specified covariate subgroups and two types of test data. Finally, we demonstrate our methods via simulations and a real data application based on a cluster randomized trial for treating chronic pain.
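For readers unfamiliar with conformal methods, the following toy sketch shows ordinary split-conformal prediction for i.i.d. data: a finite-sample quantile of calibration residuals yields an interval with guaranteed marginal coverage regardless of how the working model was fit. The paper's framework adapts this idea to within-cluster correlation, which this i.i.d. sketch does not capture; the data here are synthetic:

```python
import numpy as np

rng = np.random.default_rng(2)

# Calibration data: outcomes and predictions from an arbitrary working model.
y_cal = rng.normal(size=200)
pred_cal = y_cal + 0.3 * rng.normal(size=200)

def conformal_interval(pred_new, y_cal, pred_cal, alpha=0.1):
    """Split-conformal interval from absolute calibration residuals.

    Uses the ceil((n+1)(1-alpha))-th smallest residual as the radius,
    giving finite-sample coverage >= 1 - alpha under exchangeability.
    """
    n = len(y_cal)
    scores = np.abs(y_cal - pred_cal)
    k = int(np.ceil((n + 1) * (1 - alpha)))  # finite-sample corrected rank
    q = np.sort(scores)[min(k, n) - 1]
    return pred_new - q, pred_new + q

lo, hi = conformal_interval(0.0, y_cal, pred_cal)
```

The guarantee holds for any working model, which mirrors the model-robustness property emphasized in the abstract.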
In this article, we propose an interval constraint programming method for globally solving catalog-based categorical optimization problems. It supports catalogs of arbitrary size and properties of arbitrary dimension, and does not require any modeling effort from the user. A novel catalog-based contractor (or filtering operator) guarantees consistency between the categorical properties and the existing catalog items. This results in an intuitive and generic approach that is exact, rigorous (robust to roundoff errors) and can be easily implemented in an off-the-shelf interval-based continuous solver that interleaves branching and constraint propagation. We demonstrate the validity of the approach on a numerical problem in which a categorical variable is described by a two-dimensional property space. A Julia prototype is available as open-source software under the MIT license at https://github.com/cvanaret/CateGOrical.jl
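A hypothetical sketch of the kind of catalog-based contraction described above (written in Python for consistency with the other sketches here, and not the actual contractor implemented in the CateGOrical.jl prototype): given interval bounds on the property space, catalog items outside the box are discarded and the box is contracted to the interval hull of the remaining items:

```python
# Toy catalog: each item is a tuple of property values (prop1, prop2).
catalog = [(1.0, 10.0), (2.0, 7.0), (4.0, 3.0), (8.0, 1.0)]

def catalog_contractor(box, catalog):
    """Contract a property-space box to the hull of consistent catalog items.

    box: list of (lo, hi) intervals, one per property dimension.
    Returns (contracted_box, feasible_items); an empty box is signalled by None.
    """
    feasible = [item for item in catalog
                if all(lo <= p <= hi for p, (lo, hi) in zip(item, box))]
    if not feasible:
        return None, []  # no catalog item is consistent with the box
    hull = [(min(p[d] for p in feasible), max(p[d] for p in feasible))
            for d in range(len(box))]
    return hull, feasible

box, items = catalog_contractor([(0.0, 5.0), (2.0, 12.0)], catalog)
```

Interleaved with branching, such a contractor prunes property regions that contain no catalog item, which is the consistency property the abstract refers to. A rigorous implementation would additionally use outward-rounded interval arithmetic.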
We study effective randomness-preserving transformations of path-incompressible trees. Some path-incompressible trees with infinitely many paths do not compute perfect path-random trees with computable oracle-use. Sparse perfect path-incompressible trees can be effectively densified, almost surely. We characterize the branching density of path-random trees.
A well-balanced second-order finite volume scheme is proposed and analyzed for a $2 \times 2$ system of non-linear partial differential equations which describes the dynamics of growing sandpiles created by a vertical source on a flat, bounded rectangular table in multiple dimensions. To derive a second-order scheme, we combine a MUSCL-type spatial reconstruction with a strong stability preserving Runge-Kutta time-stepping method. The resulting scheme is ensured to be well-balanced through a modified limiting approach that allows the scheme to reduce to a well-balanced first-order scheme near the steady state while maintaining second-order accuracy away from it. The well-balanced property of the scheme is proven analytically in one dimension and demonstrated numerically in two dimensions. Additionally, numerical experiments reveal that the second-order scheme reduces finite time oscillations, takes fewer time iterations to achieve the steady state and gives sharper resolution of the physical structure of the sandpile, as compared to the existing first-order schemes in the literature.
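The MUSCL-type reconstruction mentioned above can be illustrated in one dimension with the classical minmod limiter. This is a generic sketch on synthetic data; the paper's scheme uses a modified limiting approach near steady states to preserve well-balancing, which this toy does not include:

```python
import numpy as np

def minmod(a, b):
    """Minmod slope limiter: zero at extrema, the smaller-magnitude slope otherwise."""
    return np.where(a * b > 0, np.sign(a) * np.minimum(np.abs(a), np.abs(b)), 0.0)

def muscl_interfaces(u, dx):
    """Second-order MUSCL reconstruction of interface states from cell averages.

    Limited slopes keep the reconstruction within the range of neighbouring
    cell averages, avoiding spurious oscillations near discontinuities.
    """
    du = np.diff(u)
    slope = np.zeros_like(u)
    slope[1:-1] = minmod(du[:-1], du[1:]) / dx  # zero slope in boundary cells
    u_left = u + 0.5 * dx * slope   # state at the right face of each cell
    u_right = u - 0.5 * dx * slope  # state at the left face of each cell
    return u_left, u_right

x = np.linspace(0.0, 1.0, 11)
u = x**2  # smooth, monotone test profile
uL, uR = muscl_interfaces(u, x[1] - x[0])
```

Combined with a strong stability preserving Runge-Kutta method in time, this reconstruction yields the formally second-order accuracy discussed in the abstract.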
This work, in a pioneering approach, attempts to build a biometric system that works purely based on the fluid mechanics governing exhaled breath. We test the hypothesis that the structure of turbulence in exhaled human breath can be exploited to build biometric algorithms. This work relies on the idea that the extrathoracic airway is unique for every individual, making the exhaled breath a biomarker. Methods including a classical multi-dimensional hypothesis-testing approach and machine learning models are employed in building user authentication algorithms, namely user confirmation and user identification. A user confirmation algorithm tries to verify whether a user is the person they claim to be. A user identification algorithm tries to identify a user's identity with no prior information available. A dataset of exhaled breath time series samples from 94 human subjects was used to evaluate the performance of these algorithms. The user confirmation algorithms performed exceedingly well on the given dataset, with over $97\%$ true confirmation rate. The machine learning based algorithm achieved a good true confirmation rate, reiterating our understanding of why machine learning based algorithms typically outperform classical hypothesis-test based algorithms. The user identification algorithm performs reasonably well on the provided dataset, with over $50\%$ of the users identified as being within two possible suspects. We show surprisingly unique turbulent signatures in the exhaled breath that have not been reported before. In addition to discussions on a novel biometric system, we make arguments to utilise this idea as a tool to gain insights into the morphometric variation of the extrathoracic airway across individuals. Such tools are expected to have future potential in the area of personalised medicine.