
Equipping the rototranslation group $SE(2)$ with a sub-Riemannian structure inspired by the visual cortex V1, we propose algorithms for image inpainting and enhancement based on hypoelliptic diffusion. We improve on previous implementations of the methods of Citti, Sarti, and Boscain et al. by proposing an alternative that prevents fading and produces sharper results, in a procedure that we call WaxOn-WaxOff. We also exploit the sub-Riemannian structure to define a completely new unsharp filter using $SE(2)$, analogous to the classical unsharp filter for 2D image processing. We demonstrate our method on blood vessel enhancement in retinal scans.
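
For reference, the classical 2D unsharp filter that the $SE(2)$ construction generalizes can be sketched as follows. This is a minimal illustration using a Gaussian blur, not the authors' sub-Riemannian filter; the kernel width `sigma` and strength `amount` are illustrative parameters.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(image, sigma=2.0, amount=1.0):
    """Classical 2D unsharp filter: add back the high-frequency detail
    (original minus blurred), scaled by `amount`."""
    blurred = gaussian_filter(image.astype(float), sigma=sigma)
    detail = image - blurred              # high-frequency residual
    return np.clip(image + amount * detail, 0.0, 1.0)

# usage: sharpened = unsharp_mask(img)  # img is a float image with values in [0, 1]
```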

Related content

Inpainting refers to the process of reconstructing lost or damaged parts of images and videos. In a museum, for example, this work is typically carried out by an experienced curator or art conservator. In the digital world, inpainting (also called image interpolation or video interpolation) uses sophisticated algorithms to replace lost or corrupted image data, mainly in small regions and around small defects.
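
As a minimal illustration of digital inpainting (a classical diffusion-style method, not the hypoelliptic approach above), OpenCV ships two standard algorithms; the file names below are placeholders.

```python
import cv2

# Placeholder file names; the mask is nonzero exactly on the damaged pixels.
img = cv2.imread("damaged.png")
mask = cv2.imread("mask.png", cv2.IMREAD_GRAYSCALE)

# Navier-Stokes based inpainting: propagates image structure into the masked
# region (inpaint radius of 3 pixels); cv2.INPAINT_TELEA is the alternative flag.
restored = cv2.inpaint(img, mask, 3, cv2.INPAINT_NS)
cv2.imwrite("restored.png", restored)
```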

Industrial applications rely heavily on open-source software (OSS) libraries, which provide various benefits. However, they can also present a substantial risk if a vulnerability or attack arises and the community fails to promptly address the issue and release a fix due to inactivity. To monitor the activities of such communities, a comprehensive list of repositories for the libraries of an ecosystem must be accessible. Based on these repositories, the libraries integrated in an application can be monitored to observe whether they are adequately maintained. In this descriptive study, we analyze the accessibility of GitHub repositories for PyPI and NPM libraries. For all available libraries, we extract assigned repository URLs and direct dependencies, and use the PageRank algorithm to comprehensively analyze the ecosystems from a library and dependency-chain perspective. For invalid repository URLs, we derive potential reasons. Both ecosystems show varying accessibility of GitHub repository URLs, depending on the PageRank score of the analyzed libraries. For individual libraries, up to 73.8% of PyPI and up to 69.4% of NPM libraries have repository URLs. Within dependency chains, up to 80.1% of PyPI libraries have URLs, compared to up to 81.1% for NPM. This means that most libraries, especially those of increasing importance, can be monitored on GitHub. Among the most common reasons for invalid repository URLs is that no URL is assigned at all, which accounts for up to 17.9% of PyPI and up to 39.6% of NPM libraries. Package maintainers should address this issue and update the repository information to enable monitoring of their libraries.
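
A minimal sketch of how PageRank could be applied to a library dependency graph, assuming a hypothetical edge list of (library, dependency) pairs; this is not the study's actual pipeline, and the library names are invented.

```python
import networkx as nx

# Hypothetical edge list: an edge A -> B means library A depends on library B,
# so importance flows from dependents to the libraries they rely on.
dependencies = [
    ("app-framework", "http-client"),
    ("http-client", "url-parser"),
    ("cli-tool", "url-parser"),
]

graph = nx.DiGraph(dependencies)
scores = nx.pagerank(graph, alpha=0.85)  # 0.85 is the customary damping factor

# Libraries that many dependency chains pass through receive higher scores.
for lib, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{lib}: {score:.3f}")
```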

In the context of right-censored data, we study the problem of predicting the restricted time to event based on a set of covariates. Under a quadratic loss, this problem is equivalent to estimating the conditional Restricted Mean Survival Time (RMST). To that end, we propose a flexible and easy-to-use ensemble algorithm that combines pseudo-observations and the super learner. The classical theoretical results for the super learner are extended to right-censored data using a new definition of pseudo-observations, the so-called split pseudo-observations. Simulation studies indicate that the split pseudo-observations and the standard pseudo-observations behave similarly even for small sample sizes. The method is applied to maintenance and colon cancer datasets, demonstrating its practical value compared to other prediction methods. We complement the predictions obtained from our method with our RMST-adapted risk measure, prediction intervals, and variable importance measures developed in previous work.
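
A minimal sketch of jackknife pseudo-observations for the RMST under right censoring, using the Kaplan-Meier estimator. This illustrates the standard construction rather than the split pseudo-observations introduced in the paper, and all names are illustrative.

```python
import numpy as np
from lifelines import KaplanMeierFitter

def km_rmst(time, event, tau):
    """Restricted mean survival time: area under the Kaplan-Meier curve on [0, tau]."""
    kmf = KaplanMeierFitter().fit(time, event_observed=event)
    ts = np.unique(np.concatenate(([0.0], kmf.timeline[kmf.timeline < tau], [tau])))
    surv = kmf.survival_function_at_times(ts).values
    # The KM curve is a right-continuous step function, so integrate it exactly.
    return float(np.sum(surv[:-1] * np.diff(ts)))

def rmst_pseudo_observations(time, event, tau):
    """Jackknife pseudo-observations: n * theta_hat - (n - 1) * theta_hat_(-i)."""
    time, event = np.asarray(time, dtype=float), np.asarray(event, dtype=int)
    n = len(time)
    theta_full = km_rmst(time, event, tau)
    pseudo = np.empty(n)
    for i in range(n):
        keep = np.arange(n) != i  # leave observation i out and refit
        pseudo[i] = n * theta_full - (n - 1) * km_rmst(time[keep], event[keep], tau)
    return pseudo  # these can then be regressed on covariates with any learner
```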

The problem Level Planarity asks for a crossing-free drawing of a graph in the plane such that vertices are placed at prescribed y-coordinates (called levels) and such that every edge is realized as a y-monotone curve. In the variant Constrained Level Planarity, each level y is equipped with a partial order <_y on its vertices and in the desired drawing the left-to-right order of vertices on level y has to be a linear extension of <_y. Constrained Level Planarity is known to be a remarkably difficult problem: previous results by Klemz and Rote [ACM Trans. Alg. 2019] and by Br\"uckner and Rutter [SODA 2017] imply that it remains NP-hard even when restricted to graphs whose tree-depth and feedback vertex set number are bounded by a constant and even when the instances are additionally required to be either proper, meaning that each edge spans two consecutive levels, or ordered, meaning that all given partial orders are total orders. In particular, these results rule out the existence of FPT-time (even XP-time) algorithms with respect to these and related graph parameters (unless P=NP). However, the parameterized complexity of Constrained Level Planarity with respect to the vertex cover number of the input graph remained open. In this paper, we show that Constrained Level Planarity can be solved in FPT-time when parameterized by the vertex cover number. In view of the previous intractability statements, our result is best-possible in several regards: a speed-up to polynomial time or a generalization to the aforementioned smaller graph parameters is not possible, even if restricting to proper or ordered instances.

We analyse abstract data types that model numerical structures with a concept of error. Specifically, we focus on arithmetic data types that contain an error value $\bot$ whose main purpose is to always return a value for division. To rings and fields, we add a division operator $x/y$ and study a class of algebras called common meadows wherein $x/0 = \bot$. The set of equations true in all common meadows is named the equational theory of common meadows. We give a finite equational axiomatisation of the equational theory of common meadows and prove that it is complete and that the equational theory is decidable.
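
A minimal sketch of the intended behaviour of the error value, using a hypothetical Python wrapper in which $\bot$ is absorbing and division by zero returns it; the paper's actual treatment is a purely equational axiomatisation, not an implementation.

```python
BOTTOM = object()  # plays the role of the error value ⊥

def div(x, y):
    """Total division: x / 0 = ⊥, and ⊥ propagates through every operation."""
    if x is BOTTOM or y is BOTTOM or y == 0:
        return BOTTOM
    return x / y

def add(x, y):
    return BOTTOM if x is BOTTOM or y is BOTTOM else x + y

# ⊥ is absorbing: once an error appears, every expression containing it is ⊥.
assert div(1, 0) is BOTTOM
assert add(2, div(1, 0)) is BOTTOM
```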

The Unique Games Conjecture (UGC) lies at the heart of a highly dynamic subarea of computational complexity theory, intricately linked to the outstanding P versus NP problem. Despite multiple insightful results in the past few years, a proof of the conjecture remains elusive. In this work, we construct a novel dynamical-systems-based approach for studying unique games and, more generally, the field of computational complexity. We propose a family of dynamical systems whose equilibria correspond to solutions of unique games and prove that unsatisfiable instances lead to ergodic dynamics. Moreover, as the instance hardness increases, the weight of the invariant measure in the vicinity of the optimal assignments scales polynomially, sub-exponentially, or exponentially, depending on the value gap. We numerically reproduce a previously hypothesized hardness plot associated with the UGC. Our results indicate that the UGC is likely true, subject to our proposed conjectures that link dynamical systems theory with computational complexity.

In the $k$-Disjoint Shortest Paths ($k$-DSP) problem, we are given a weighted graph $G$ on $n$ nodes and $m$ edges with specified source vertices $s_1, \dots, s_k$ and target vertices $t_1, \dots, t_k$, and are tasked with determining whether $G$ contains vertex-disjoint $(s_i,t_i)$-shortest paths. For any constant $k$, it is known that $k$-DSP can be solved in polynomial time over undirected graphs and directed acyclic graphs (DAGs). However, the exact time complexity of $k$-DSP remains mysterious, with large gaps between the fastest known algorithms and the best conditional lower bounds. In this paper, we obtain faster algorithms for important cases of $k$-DSP, and present better conditional lower bounds for $k$-DSP and its variants. Previous work solved 2-DSP over weighted undirected graphs in $O(n^7)$ time, and over weighted DAGs in $O(mn)$ time. For the main result of this paper, we present linear-time algorithms for solving 2-DSP on weighted undirected graphs and DAGs. Our algorithms are algebraic, however, and so only solve the detection rather than the search version of 2-DSP. For lower bounds, prior work implied that $k$-Clique can be reduced to $2k$-DSP in DAGs and undirected graphs with $O((kn)^2)$ nodes. We improve this reduction by showing how to reduce from $k$-Clique to $k$-DSP in DAGs and undirected graphs with $O((kn)^2)$ nodes. A variant of $k$-DSP is the $k$-Disjoint Paths ($k$-DP) problem, where the solution paths no longer need to be shortest paths. Previous work reduced from $k$-Clique to $p$-DP in DAGs with $O(kn)$ nodes, for $p = k + k(k-1)/2$. We improve this by showing a reduction from $k$-Clique to $p$-DP for $p = k + \lfloor k^2/4\rfloor$. Under the $k$-Clique Hypothesis from fine-grained complexity, our results establish better conditional lower bounds for $k$-DSP for all $k \ge 4$, and better conditional lower bounds for $p$-DP for all $p \le 4031$.

We introduce a new generalization of relative entropy to non-negative vectors with sums $> 1$. We show in a purely combinatorial setting, with no probabilistic considerations, that in the presence of linear constraints defining a convex polytope, a concentration phenomenon arises for this generalized relative entropy, and we quantify the concentration precisely. We also present a probabilistic formulation, and extend the concentration results to it. In addition, we provide a number of simplifications and improvements to our previous work, notably in dualizing the optimization problem, in the concentration with respect to $\ell_{\infty}$ distance, and in the relationship to generalized KL-divergence. A number of our results apply to general compact convex sets, not necessarily polyhedral.
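
For reference, the classical relative entropy (KL divergence) that is being generalized is defined for probability vectors $p, q$ with entries summing to $1$ as
$$ D(p \,\|\, q) \;=\; \sum_{i} p_i \log \frac{p_i}{q_i}, \qquad p_i, q_i \ge 0,\quad \sum_i p_i = \sum_i q_i = 1; $$
the paper's extension to non-negative vectors with larger sums is not reproduced here.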

In the Minimum Consistent Subset (MCS) problem, we are presented with a connected simple undirected graph $G=(V,E)$, consisting of a vertex set $V$ of size $n$ and an edge set $E$. Each vertex in $V$ is assigned a color from the set $\{1,2,\ldots, c\}$. The objective is to determine a subset $V' \subseteq V$ of minimum possible cardinality such that for every vertex $v \in V$, at least one of its nearest neighbors in $V'$ (measured in terms of hop distance) shares the same color as $v$. The decision problem, asking whether there exists a subset $V'$ of cardinality at most $l$ for some positive integer $l$, is known to be NP-complete even for planar graphs. In this paper, we establish that the MCS problem for trees, when the number of colors $c$ is considered an input parameter, is NP-complete. We propose a fixed-parameter tractable (FPT) algorithm for MCS on trees running in $O(2^{6c}n^6)$ time, significantly improving the currently best-known algorithm, whose running time is $O(2^{4c}n^{2c+3})$. In an effort to comprehensively understand the computational complexity of the MCS problem across different graph classes, we extend our investigation to interval graphs. We show that MCS remains NP-complete for interval graphs, thus enriching the list of graph classes on which the problem is intractable.
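
A minimal sketch of what consistency means, assuming a small adjacency-list graph: the helper below checks whether a candidate subset $V'$ is consistent by comparing BFS hop distances; it does not implement the FPT algorithm from the paper.

```python
from collections import deque

def bfs_distances(adj, source):
    """Hop distances from `source` to every reachable vertex."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for w in adj[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                queue.append(w)
    return dist

def is_consistent(adj, color, subset):
    """Every vertex must have some nearest member of `subset` (in hop distance)
    that shares its color."""
    for v in adj:
        dist = bfs_distances(adj, v)
        nearest = min(dist[u] for u in subset)
        if not any(dist[u] == nearest and color[u] == color[v] for u in subset):
            return False
    return True

# Toy path graph 0-1-2: {0, 2} is consistent, {0} is not (vertex 2 is blue).
adj = {0: [1], 1: [0, 2], 2: [1]}
color = {0: "red", 1: "red", 2: "blue"}
assert is_consistent(adj, color, {0, 2})
assert not is_consistent(adj, color, {0})
```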

In Basili and Pratelli (2024), a novel and coherent concept of interval probability measures has been introduced, providing a method for representing imprecise probabilities and uncertainty. Within the framework of set algebra, we introduced the concepts of weak complementation and interval probability measures associated with a family of random variables, which effectively capture the inherent uncertainty in any event. This paper conducts a comprehensive analysis of these concepts within a specific probability space. Additionally, we elaborate on an updating rule for events, integrating essential concepts of statistical independence, dependence, and stochastic dominance.

In this work, we present an extensive analysis of clock synchronization algorithms, with a specific focus on message complexity. We begin by introducing fundamental concepts in clock synchronization, such as the Byzantine generals problem, as well as specific concepts like clock accuracy, precision, skew, offset, timestamping, and clock drift estimation. We then describe the concept of logical clocks and discuss their implementation in distributed systems, highlighting their significance and the various approaches. The paper then examines four prominent algorithms: Lamport's Algorithm, the Ricart-Agrawala Algorithm, the Vector Clocks Algorithm, and Cristian's Algorithm. Special attention is given to the analysis of message complexity, providing insights into the efficiency of each algorithm. Finally, we compare the message complexities of the discussed algorithms.
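
A minimal sketch of Lamport's logical clock rules (increment before local events and sends, take the max on receive), assuming an in-memory message tuple; in a real system the timestamp is piggybacked on every message, adding O(1) data per message.

```python
class LamportClock:
    """Lamport's logical clock: orders events without synchronized physical clocks."""

    def __init__(self):
        self.time = 0

    def tick(self):
        # Rule 1: increment before every local event or send.
        self.time += 1
        return self.time

    def send(self, payload):
        # Attach the current timestamp to the outgoing message.
        return (self.tick(), payload)

    def receive(self, message):
        # Rule 2: on receipt, jump past the sender's timestamp.
        sender_time, payload = message
        self.time = max(self.time, sender_time) + 1
        return payload

# Two processes exchanging a single message.
a, b = LamportClock(), LamportClock()
msg = a.send("hello")   # a.time == 1
b.tick()                # independent local event, b.time == 1
b.receive(msg)          # b.time == max(1, 1) + 1 == 2
assert b.time > 1
```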
