In this paper, a thermodynamically consistent solution of the interfacial Riemann problem for the first-order hyperbolic continuum model of Godunov, Peshkov and Romenski (GPR model) is presented. In the presence of phase transition, interfacial physics is governed by molecular interactions on a microscopic scale, beyond the scope of the macroscopic continuum model in the bulk phases. The developed two-phase Riemann solvers tackle this multi-scale problem by incorporating a local thermodynamic model to predict the interfacial entropy production. Using phenomenological relations of non-equilibrium thermodynamics, interfacial mass and heat fluxes are derived from the entropy production and provide closure at the phase boundary. We employ the proposed Riemann solvers in an efficient sharp-interface level-set Ghost-Fluid framework to provide coupling conditions at phase interfaces under phase transition. As a single-phase benchmark, Rayleigh-B\'enard convection is studied to compare the hyperbolic thermal relaxation formulation of the GPR model against the hyperbolic-parabolic Euler-Fourier system. The novel interfacial Riemann solvers are validated against molecular dynamics simulations of evaporating shock tubes with the Lennard-Jones shifted and truncated potential. On a macroscopic scale, evaporating shock tubes are computed for the material n-Dodecane and compared against Euler-Fourier results. Finally, the efficiency and robustness of the scheme are demonstrated with shock-droplet interaction simulations that involve both phase transition and surface tension while featuring severe interface deformations.
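For readers less familiar with such closures, the display below sketches the generic Onsager-type force-flux structure that phenomenological relations of this kind typically take at an evaporating interface; the notation (mass flux $\dot{m}$, heat flux $\dot{q}$, resistivity matrix $L$, chemical potential $\mu$, temperature $T$) and the choice of driving forces are illustrative and not necessarily those of the paper.
\[
\begin{pmatrix} \dot{m} \\ \dot{q} \end{pmatrix}
=
\begin{pmatrix} L_{11} & L_{12} \\ L_{21} & L_{22} \end{pmatrix}
\begin{pmatrix} -\llbracket \mu/T \rrbracket \\ \llbracket 1/T \rrbracket \end{pmatrix},
\qquad L_{12}=L_{21},\quad L \succeq 0,
\]
where $\llbracket \cdot \rrbracket$ denotes the jump across the interface. With a symmetric, positive semidefinite $L$, the interfacial entropy production $\dot{m}\,(-\llbracket \mu/T\rrbracket)+\dot{q}\,\llbracket 1/T\rrbracket$ is non-negative by construction, which is what makes the closure thermodynamically consistent.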
We propose an analytical solution for approximating the gradient of the Evidence Lower Bound (ELBO) in variational inference problems where the statistical model is a Bayesian network consisting of observations drawn from a mixture of a Gaussian distribution of interest and unrelated background clutter, known as the clutter problem. The method employs the reparameterization trick to move the gradient operator inside the expectation and relies on the assumption that, because the likelihood factorizes over the observed data, the variational distribution is generally more compactly supported than the Gaussian distribution in the likelihood factors. This allows an efficient local approximation of the individual likelihood factors, which leads to an analytical solution for the integral defining the gradient expectation. We integrate the proposed gradient approximation as the expectation step of an EM (Expectation Maximization) algorithm for maximizing the ELBO, and test it against classical deterministic approaches to Bayesian inference, such as the Laplace approximation, Expectation Propagation and Mean-Field Variational Inference. The proposed method demonstrates good accuracy and convergence rate together with linear computational complexity.
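As background for the reparameterization step, the sketch below shows how the trick moves the gradient inside the expectation for a Gaussian variational posterior $q(\theta)=\mathcal{N}(\mu,\sigma^2)$ on a one-dimensional clutter model. It uses a plain Monte Carlo average for illustration, whereas the method above replaces this average with an analytical local approximation of the likelihood factors; all function names and the toy model parameters are assumptions, not the paper's code.
\begin{verbatim}
import numpy as np

def gauss_pdf(x, mean, var):
    return np.exp(-0.5 * (x - mean) ** 2 / var) / np.sqrt(2.0 * np.pi * var)

def dlogjoint_dtheta(theta, x, w=0.5, a=10.0, b=100.0):
    """Derivative of log p(x, theta) for a 1-D clutter model:
    x_i ~ w*N(theta, 1) + (1-w)*N(0, a),  prior theta ~ N(0, b)."""
    sig = w * gauss_pdf(x, theta, 1.0)       # signal component
    clu = (1.0 - w) * gauss_pdf(x, 0.0, a)   # clutter component
    resp = sig / (sig + clu)                 # responsibility of the signal component
    return -theta / b + np.sum(resp * (x - theta))

def elbo_grad_reparam(mu, sigma, x, n_samples=1000, rng=None):
    """Reparameterization-trick Monte Carlo estimate of the ELBO gradient
    for q(theta) = N(mu, sigma^2), using theta = mu + sigma*eps, eps ~ N(0,1)."""
    rng = np.random.default_rng() if rng is None else rng
    eps = rng.standard_normal(n_samples)
    g = np.array([dlogjoint_dtheta(mu + sigma * e, x) for e in eps])
    grad_mu = g.mean()                           # d/dmu  E_q[log p(x, theta)]
    grad_sigma = (eps * g).mean() + 1.0 / sigma  # plus d/dsigma of the Gaussian entropy
    return grad_mu, grad_sigma

# toy data: true theta = 2, half of the points are clutter
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(2.0, 1.0, 50), rng.normal(0.0, np.sqrt(10.0), 50)])
print(elbo_grad_reparam(0.0, 1.0, x, rng=rng))
\end{verbatim}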
The lifted multicut problem is a combinatorial optimization problem whose feasible solutions relate one-to-one to the decompositions of a graph $G = (V, E)$. Given an augmentation $\widehat{G} = (V, E \cup F)$ of $G$ and given costs $c \in \mathbb{R}^{E \cup F}$, the objective is to minimize the sum of those $c_{uw}$ with $uw \in E \cup F$ for which $u$ and $w$ are in distinct components. For $F = \emptyset$, the problem specializes to the multicut problem, and for $E = \tbinom{V}{2}$ to the clique partitioning problem. We study a binary linear program formulation of the lifted multicut problem. More specifically, we contribute to the analysis of the associated lifted multicut polytopes: Firstly, we establish a necessary, sufficient and efficiently decidable condition for a lower box inequality to define a facet. Secondly, we show that deciding whether a cut inequality of the binary linear program defines a facet is NP-hard.
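For reference, binary linear programs of the kind analyzed here are commonly written with one variable $x_{uw}\in\{0,1\}$ per edge ($x_{uw}=1$ meaning that $u$ and $w$ lie in distinct components), path inequalities over $G$ and cut inequalities for the lifted edges. The schematic below follows the standard lifted multicut formulation from the literature and is given only for orientation; the lower box inequalities mentioned above are then the bounds $x_{uw}\ge 0$.
\begin{align*}
\min_{x \in \{0,1\}^{E \cup F}} \quad & \sum_{uw \in E \cup F} c_{uw}\, x_{uw}\\
\text{s.t.} \quad & x_{uw} \le \sum_{e \in P} x_e && \text{for all } uw \in E \cup F \text{ and all } uw\text{-paths } P \text{ in } G,\\
& 1 - x_{uw} \le \sum_{e \in C} (1 - x_e) && \text{for all } uw \in F \text{ and all } uw\text{-cuts } C \text{ in } G.
\end{align*}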
This work presents a procedure to solve the Euler equations by explicitly updating, in a conservative manner, a generic thermodynamic variable such as temperature, pressure or entropy instead of the total energy. The presented procedure is valid for any equation of state and spatial discretization. When using complex equations of state such as Span-Wagner, choosing the temperature as the generic thermodynamic variable yields substantial reductions in the computational cost associated with thermodynamic evaluations. Results computed with a state-of-the-art thermodynamic model are presented, and computational times are analyzed. Particular attention is dedicated to the conservation of total energy, the propagation speed of shock waves and the jump conditions. The procedure is thoroughly tested using the Span-Wagner equation of state (through the CoolProp thermodynamic library) and the Van der Waals equation of state, both in the ideal and non-ideal compressible fluid-dynamics regimes, by comparing it to the standard total energy update and to analytical solutions where available.
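The cost argument can be seen directly with CoolProp: for Helmholtz-energy equations of state such as Span-Wagner, a (density, temperature) evaluation is explicit, whereas recovering the temperature from (density, internal energy), as a total-energy update requires, triggers an iterative flash. The snippet below is only a small timing illustration of this difference using CoolProp's high-level interface; it is not the paper's solver, and water is used simply because it ships with any CoolProp installation.
\begin{verbatim}
import time
import CoolProp.CoolProp as CP

fluid = "Water"   # any Helmholtz-EOS fluid available in CoolProp
rho = 10.0        # kg/m^3
T = 500.0         # K  (superheated steam, safely single-phase)

# Forward, explicit evaluation: properties from (density, temperature).
u = CP.PropsSI("U", "D", rho, "T", T, fluid)   # specific internal energy, J/kg

def timeit(f, n=2000):
    t0 = time.perf_counter()
    for _ in range(n):
        f()
    return (time.perf_counter() - t0) / n

t_direct = timeit(lambda: CP.PropsSI("P", "D", rho, "T", T, fluid))
# Inverse evaluation: temperature from (density, internal energy),
# i.e. the flash a total-energy update needs in every cell at every stage.
t_flash = timeit(lambda: CP.PropsSI("T", "D", rho, "U", u, fluid))

print(f"(D,T) evaluation: {t_direct*1e6:.1f} us, (D,U) flash: {t_flash*1e6:.1f} us")
\end{verbatim}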
We give a constructive characterization of matrices satisfying the reverse-order law for the Moore--Penrose pseudoinverse. In particular, for a given matrix $A$ we construct another matrix $B$, of arbitrary compatible size and chosen rank, in terms of the right singular vectors of $A$, such that the reverse-order law for $AB$ is satisfied. Moreover, we show that any matrix satisfying this law arises from such a construction. As a consequence, several equivalent conditions for $B^+ A^+$ to be the pseudoinverse of $AB$ are given, for example $\mathcal{C}(A^*AB)=\mathcal{C}(BB^*A^*)$ or $B\left(AB\right)^+A$ being an orthogonal projection. In addition, we parameterize all possible singular value decompositions of a fixed matrix and give Greville-like equivalent conditions for $B^+A^+$ being a $\{1,2\}$-, $\{1,2,3\}$- and $\{1,2,4\}$-inverse of $AB$, with geometric insight in terms of the principal angles between $\mathcal{C}(A^*)$ and $\mathcal{C}(B)$.
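As a quick numerical illustration of the law itself (not of the construction above), the reverse-order law $(AB)^+ = B^+A^+$ holds, for instance, for $B = A^*$, since $(AA^*)^+=(A^+)^*A^+$, but it fails for a generic pair of matrices; the NumPy check below makes this concrete.
\begin{verbatim}
import numpy as np

rng = np.random.default_rng(0)

def reverse_order_law_holds(A, B, tol=1e-10):
    """Check whether (AB)^+ equals B^+ A^+ (Moore-Penrose pseudoinverses)."""
    lhs = np.linalg.pinv(A @ B)
    rhs = np.linalg.pinv(B) @ np.linalg.pinv(A)
    return np.allclose(lhs, rhs, atol=tol)

A = rng.standard_normal((3, 5))

# B = A^* always satisfies the reverse-order law.
print(reverse_order_law_holds(A, A.conj().T))   # expect True

# A generic B of compatible size typically does not.
B = rng.standard_normal((5, 4))
print(reverse_order_law_holds(A, B))            # expect False
\end{verbatim}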
The presence of political misinformation and ideological echo chambers on social media platforms is concerning given the important role that these sites play in the public's exposure to news and current events. Algorithmic systems employed on these platforms are presumed to play a role in these phenomena, but little is known about their mechanisms and effects. In this work, we conduct an algorithmic audit of Twitter's Who-To-Follow friend recommendation system, the first empirical audit to investigate the impact of this algorithm in situ. We create automated Twitter accounts that initially follow left- and right-affiliated U.S. politicians during the 2022 U.S. midterm elections and then grow their information networks using the platform's recommender system. We pair the experiment with an observational study of Twitter users who already follow the same politicians. Broadly, we find that while following the recommendation algorithm leads accounts into dense and reciprocal neighborhoods that structurally resemble echo chambers, the recommender also results in less politically homogeneous networks than those of accounts that grow their networks through social endorsement. Furthermore, accounts that exclusively followed users recommended by the algorithm had fewer opportunities to encounter content centered on false or misleading election narratives than accounts that chose friends based on social endorsement.
This paper proposes a practically efficient algorithm with optimal theoretical regret for the classical network revenue management (NRM) problem with unknown, nonparametric demand. Over a time horizon of length $T$, in each time period the retailer decides the prices of $N$ types of products, which are produced from $M$ types of resources with non-replenishable initial inventory. When demand is nonparametric, under some mild assumptions, Miao and Wang (2021) is the first paper to propose an algorithm with a regret of order $O(\text{poly}(N,M,\ln(T))\sqrt{T})$ (in particular, $\tilde O(N^{3.5}\sqrt{T})$ plus additional higher-order terms that are $o(\sqrt{T})$ for sufficiently large $T\gg N$). In this paper, we improve on this result by proposing a primal-dual optimization algorithm that is not only more practical but also achieves an improved regret of $\tilde O(N^{3.25}\sqrt{T})$, free from additional higher-order terms. A key technical contribution of the proposed algorithm is the so-called demand balancing, which pairs the primal solution (i.e., the price) in each time period with another price to offset the violation of complementary slackness on the resource inventory constraints. Numerical experiments against several benchmark algorithms further illustrate the effectiveness of our algorithm.
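To fix ideas about the slackness condition being balanced, the schematic below shows the standard deterministic (fluid) relaxation of NRM together with the complementary slackness condition between resource inventories and their dual prices; the notation ($\lambda_t$ for demand rates, $r(\cdot)$ for the revenue rate, $A$ for the resource consumption matrix, $b$ for the initial inventory, $\mu$ for the duals) is generic textbook notation and is not taken from the paper.
\[
\max_{\lambda_1,\dots,\lambda_T}\; \sum_{t=1}^{T} r(\lambda_t)
\quad \text{s.t.} \quad \sum_{t=1}^{T} A\,\lambda_t \le b,
\qquad \mu \ge 0,\qquad \mu^\top\!\Big(b - \sum_{t=1}^{T} A\,\lambda_t\Big) = 0 .
\]
Demand balancing, as described above, pairs the price played in a period with a second price whose induced demand offsets the violation of this slackness condition on the inventory constraints.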
The uniform one-dimensional fragment of first-order logic was introduced a few years ago as a generalization of the two-variable fragment to contexts involving relations of arity greater than two. Quantifiers in this logic are used in blocks, each block consisting only of existential quantifiers or only of universal quantifiers. In this paper we consider the possibility of mixing both types of quantifiers within blocks. We show the finite (exponential) model property and NExpTime-completeness of the satisfiability problem for two restrictions of the resulting formalism: in the first we require that every block of quantifiers is either purely universal or ends with an existential quantifier; in the second we restrict the number of variables to three; in both cases equality is not allowed. We also extend the second variant to a rich subfragment of the three-variable fragment (without equality) that still has the finite model property and a decidable, NExpTime-complete satisfiability problem.
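To make the first restriction concrete, here is a small illustration in terms of quantifier blocks only (the relation symbol $R$ is just a placeholder and the uniformity conditions on the matrix are not shown): purely universal blocks and blocks ending with an existential quantifier are admitted, whereas a mixed block ending universally is not.
\[
\underbrace{\forall x\,\forall y}_{\text{purely universal: allowed}} R(x,y),
\qquad
\underbrace{\forall x\,\exists y}_{\text{ends existentially: allowed}} R(x,y),
\qquad
\underbrace{\exists x\,\forall y}_{\text{mixed, ends universally: excluded}} R(x,y).
\]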
This paper introduces a novel pipeline to reconstruct the geometry of multiple interacting people in clothing in a globally coherent scene space from a single image. The main challenge arises from occlusion: part of a human body is not visible from a single view due to occlusion by other people or by the person's own body, which introduces missing geometry and physical implausibility (e.g., penetration). We overcome this challenge by utilizing two human priors, one for complete 3D geometry and one for surface contacts. For the geometry prior, an encoder learns to regress the image of a person with missing body parts to latent vectors; a decoder decodes these vectors to produce 3D features of the associated geometry; and an implicit network combines these features with a surface normal map to reconstruct a complete and detailed 3D human. For the contact prior, we develop an image-space contact detector that outputs a probability distribution of surface contacts between people in 3D. We use these priors to globally refine the body poses, enabling penetration-free and accurate reconstruction of multiple interacting clothed people in the scene space. The results demonstrate that our method produces complete, globally coherent, and physically plausible reconstructions compared to existing methods.
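For orientation, the snippet below sketches one way the described encoder/decoder/implicit-network geometry prior could be wired together in PyTorch; every architectural choice (layer sizes, a pixel-aligned feature lookup, an occupancy output) is an assumption made for illustration and is not taken from the paper.
\begin{verbatim}
import torch
import torch.nn as nn
import torch.nn.functional as F

class GeometryPrior(nn.Module):
    """Schematic geometry prior: image of a (possibly occluded) person
    -> latent vector -> dense features -> implicit occupancy at query points."""
    def __init__(self, latent_dim=256, feat_dim=64):
        super().__init__()
        # Encoder: regresses the person image with missing parts to a latent vector.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, latent_dim),
        )
        # Decoder: turns the latent vector into a dense feature map
        # (standing in for the "3D features of the associated geometry").
        self.decoder = nn.Linear(latent_dim, feat_dim * 16 * 16)
        self.feat_dim = feat_dim
        # Implicit network: per-point features + point + normal -> occupancy.
        self.implicit = nn.Sequential(
            nn.Linear(feat_dim + 3 + 3, 256), nn.ReLU(),
            nn.Linear(256, 256), nn.ReLU(),
            nn.Linear(256, 1),
        )

    def forward(self, image, points, normals):
        # image: (B,3,H,W); points: (B,N,3) queries in [-1,1]; normals: (B,N,3)
        z = self.encoder(image)                                    # (B, latent)
        feat = self.decoder(z).view(-1, self.feat_dim, 16, 16)     # (B,C,16,16)
        xy = points[..., :2].unsqueeze(2)                          # (B,N,1,2)
        f = F.grid_sample(feat, xy, align_corners=True)            # (B,C,N,1)
        f = f.squeeze(-1).transpose(1, 2)                          # (B,N,C)
        occ = self.implicit(torch.cat([f, points, normals], -1))   # (B,N,1)
        return torch.sigmoid(occ)

# toy forward pass
model = GeometryPrior()
img = torch.randn(1, 3, 256, 256)
pts = torch.rand(1, 1024, 3) * 2 - 1
nrm = torch.randn(1, 1024, 3)
print(model(img, pts, nrm).shape)   # torch.Size([1, 1024, 1])
\end{verbatim}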
In the era of generative artificial intelligence and the Internet of Things, the explosive growth in the volume of data, and in the associated need for its processing, analysis, and storage, brings new challenges in identifying spurious and fake information and in protecting the privacy of sensitive data. This has led to an increasing demand for more robust and resilient schemes for authentication, integrity protection, encryption, non-repudiation, and privacy preservation of data. The chapters in this book present some of the state-of-the-art research in the field of cryptography and security in computing and communications.
In this paper the recoverable robust shortest path problem is investigated. A discrete budgeted interval uncertainty representation is used to model the uncertain second-stage arc costs. The known complexity results for this problem are strengthened: it is shown that the problem is $\Sigma_3^p$-hard for the arc exclusion and the arc symmetric difference neighborhoods. Furthermore, it is also proven that the inner adversarial problem for these neighborhoods is $\Pi_2^p$-hard.
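For readers outside the robust optimization community, the nested structure responsible for this level of hardness can be sketched as follows; the notation (first-stage path $X$, recovery neighborhood $\mathcal{N}(X)$, budgeted uncertainty set $\mathcal{U}$) is a generic recoverable robust formulation and is meant only to illustrate the three alternating optimization stages.
\[
\min_{X \in \Phi}\; \Big( \sum_{e \in X} C_e \;+\; \max_{\boldsymbol{c} \in \mathcal{U}} \; \min_{Y \in \mathcal{N}(X)} \sum_{e \in Y} c_e \Big),
\]
where $\Phi$ is the set of $s$-$t$ paths, $C_e$ are the first-stage arc costs, $\mathcal{N}(X)$ is the neighborhood of $X$ (arc exclusion or arc symmetric difference) and $\mathcal{U}$ is the discrete budgeted interval uncertainty set. The min-max-min alternation mirrors the quantifier alternation behind the $\Sigma_3^p$-hardness, and the inner max-min is the adversarial problem referred to above.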