
The complexity of the list homomorphism problem for signed graphs appears difficult to classify. Existing results focus on special classes of signed graphs, such as trees and reflexive signed graphs. Irreflexive signed graphs are in a certain sense the heart of the problem, as noted in a recent paper of Kim and Siggers. We focus on a special class of irreflexive signed graphs, namely those in which the unicoloured edges form a spanning path or cycle, which we call separable signed graphs. We classify the complexity of list homomorphisms to these separable signed graphs; we believe that they will play an important role in the general resolution of the irreflexive case. We also relate our results to a conjecture of Kim and Siggers concerning the special case of semi-balanced irreflexive signed graphs; we have proved that conjecture in another paper, and the present results add structural information to the topic.
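As a concrete illustration of the problem being classified, here is a minimal brute-force sketch of list homomorphism testing for signed graphs. It checks sign-preserving homomorphisms only and omits switching, so it is an assumption-laden toy, not the dichotomy algorithm of the paper; all identifiers are invented for the example.

```python
from itertools import product

# Brute-force sketch of list homomorphism testing for signed graphs,
# checking sign-preserving homomorphisms only (genuine signed-graph
# homomorphisms also allow switching, which this sketch omits).
def list_homs(G_edges, H_edges, lists):
    """G_edges: {(u, v): set of signs '+'/'-'};
    H_edges: {frozenset({x, y}): set of signs};
    lists: {vertex of G: allowed images in H}."""
    verts = sorted(lists)
    for img in product(*(sorted(lists[v]) for v in verts)):
        f = dict(zip(verts, img))
        if all(signs <= H_edges.get(frozenset({f[u], f[v]}), set())
               for (u, v), signs in G_edges.items()):
            yield f

# G: a positive edge a-b and a negative edge b-c (a signed path);
# H: an irreflexive target with one unicoloured and one bicoloured edge.
G = {('a', 'b'): {'+'}, ('b', 'c'): {'-'}}
H = {frozenset({0, 1}): {'+'}, frozenset({1, 2}): {'+', '-'}}
L = {'a': {0, 1}, 'b': {1}, 'c': {2}}
print(list(list_homs(G, H, L)))   # [{'a': 0, 'b': 1, 'c': 2}]
```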

Related Content

Quantum computing is becoming more of a reality as time passes, bringing several cybersecurity challenges. Modern cryptography is based on the computational complexity of specific mathematical problems, but as new quantum-based computers appear, classical methods might not be enough to secure communications. In this paper, we analyse the readiness of the Galileo Open Service Navigation Message Authentication (OSNMA) against these new threats. The analysis and its assessment were performed using the OSNMA documentation, reviewing the Post-Quantum Cryptography (PQC) algorithms competing in the National Institute of Standards and Technology (NIST) standardization process, and studying the possibility of implementing them in the Galileo service. The main barrier to adopting the PQC approach is the size of both the signature and the key. The analysis shows that OSNMA is not yet prepared to face the quantum threat and that a significant change would be required. This work concludes by assessing different temporary countermeasures that can be implemented to sustain the system's integrity in the short term.
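To make the size barrier concrete, the following back-of-the-envelope sketch compares how many OSNMA subframes a single signature would occupy for the current scheme and for commonly cited PQC candidates. The bandwidth figure (about 600 OSNMA bits per 30 s subframe) and the signature sizes are approximate values from public documentation, used here for illustration only.

```python
# Back-of-the-envelope sketch: how long it would take to broadcast one
# signature over the OSNMA data channel.  Figures are approximate and
# for illustration only (OSNMA carries roughly 40 OSNMA bits per
# nominal page, i.e. about 600 bits per 30 s subframe); scheme sizes
# are the commonly cited NIST round-3 parameter sets.
BITS_PER_SUBFRAME = 600          # assumption: 15 pages x 40 bits
SUBFRAME_SECONDS = 30

signature_bytes = {
    "ECDSA P-256 (current)": 64,
    "Falcon-512":            666,   # approx. average size
    "Dilithium2":            2420,
    "SPHINCS+-128s":         7856,
}

for scheme, nbytes in signature_bytes.items():
    subframes = -(-nbytes * 8 // BITS_PER_SUBFRAME)   # ceiling division
    print(f"{scheme:24s} {nbytes:5d} B  -> "
          f"{subframes:3d} subframes (~{subframes * SUBFRAME_SECONDS} s)")
```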

The delimitation of biological species, i.e., deciding which individuals belong to the same species and whether and how many different species are represented in a data set, is key to the conservation of biodiversity. Much existing work uses only genetic data for species delimitation, often employing some kind of cluster analysis. This can be misleading, because geographically distant groups of individuals can be genetically quite different even if they belong to the same species. We investigate the problem of testing, based on genetic and spatial data, whether two potentially separated groups of individuals belong to a single species. Existing methods such as the partial Mantel test and jackknife-based distance-distance regression are considered. New approaches are proposed: an adaptation of a mixed-effects model, a bootstrap approach, and a jackknife version of the partial Mantel test. All these methods address the issue that distance data violate the independence assumption required for standard inference regarding correlation and regression; a standard linear regression is also considered for comparison. The approaches are compared on simulated meta-populations generated with SLiM and GSpace, two software packages that can simulate spatially explicit genetic data at the individual level. Simulations show that the new jackknife version of the partial Mantel test provides a good compromise between power and respecting the nominal type I error rate. Mixed-effects models have greater power than the jackknife-based methods but tend to display type I error rates slightly above the significance level. An application to brassy ringlets concludes the paper.
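The jackknife version of the partial Mantel test lends itself to a short sketch. The code below implements one plausible reading of the idea (leave one individual out, recompute the partial correlation, and use the jackknife variance for a z-type test); it is illustrative only, not the authors' exact recipe.

```python
import numpy as np

def partial_mantel_r(A, B, C):
    """Partial correlation of the upper-triangle entries of distance
    matrices A and B, controlling for C."""
    iu = np.triu_indices_from(A, k=1)
    a, b, c = A[iu], B[iu], C[iu]
    r = lambda x, y: np.corrcoef(x, y)[0, 1]
    rab, rac, rbc = r(a, b), r(a, c), r(b, c)
    return (rab - rac * rbc) / np.sqrt((1 - rac**2) * (1 - rbc**2))

def jackknife_partial_mantel(A, B, C):
    """Leave-one-individual-out jackknife: deleting row/column i keeps
    the remaining distances intact; the jackknife variance yields an
    approximate z-statistic (a sketch, not the paper's exact method)."""
    n = A.shape[0]
    full = partial_mantel_r(A, B, C)
    keep = lambda i: np.delete(np.arange(n), i)
    reps = np.array([partial_mantel_r(A[np.ix_(keep(i), keep(i))],
                                      B[np.ix_(keep(i), keep(i))],
                                      C[np.ix_(keep(i), keep(i))])
                     for i in range(n)])
    var = (n - 1) / n * np.sum((reps - reps.mean())**2)
    return full, full / np.sqrt(var)

# Toy data: genetic distance (A), same/different-group indicator (B),
# geographic distance (C) for 20 individuals.
rng = np.random.default_rng(0)
xy = rng.uniform(size=(20, 2))
C = np.linalg.norm(xy[:, None] - xy[None, :], axis=-1)
grp = (xy[:, 0] > 0.5).astype(float)
B = np.abs(grp[:, None] - grp[None, :])
A = C + 0.5 * B + rng.normal(scale=0.1, size=C.shape); A = (A + A.T) / 2
r, z = jackknife_partial_mantel(A, B, C)
print(f"partial Mantel r = {r:.3f}, jackknife z = {z:.2f}")
```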

Many combinatorial optimization problems can be formulated as the search for a subgraph that satisfies certain properties and minimizes the total weight. We assume here that the vertices correspond to points in a metric space and can take any position in given uncertainty sets. The cost function to be minimized is then the sum of the distances for the worst-case positions of the vertices in their uncertainty sets. We propose two types of polynomial-time approximation algorithms. The first relies on solving a deterministic counterpart of the problem in which the uncertain distances are replaced with maximum pairwise distances. We study in detail the resulting approximation ratio, which depends on the structure of the feasible subgraphs and on whether the metric space is Ptolemaic. The second algorithm is a fully polynomial-time approximation scheme for the special case of $s$-$t$ paths.
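A minimal sketch of the first algorithmic idea, assuming ball-shaped uncertainty sets and $s$-$t$ paths as the feasible subgraphs: replace each uncertain distance by the maximum pairwise distance between the two balls and run a standard shortest-path computation on the resulting deterministic weights.

```python
import heapq
import math

# For Euclidean balls (center, radius), the maximum pairwise distance
# between two uncertainty sets is ||c_i - c_j|| + r_i + r_j.  Solving
# the problem on these worst-case weights is the deterministic
# counterpart used by the first approximation algorithm.
def worst_case_weight(ci, ri, cj, rj):
    return math.dist(ci, cj) + ri + rj

def dijkstra(n, edges, s, t):
    adj = [[] for _ in range(n)]
    for u, v, w in edges:
        adj[u].append((v, w)); adj[v].append((u, w))
    dist = [math.inf] * n; dist[s] = 0.0
    pq = [(0.0, s)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == t:
            return d
        if d > dist[u]:
            continue
        for v, w in adj[u]:
            if d + w < dist[v]:
                dist[v] = d + w
                heapq.heappush(pq, (dist[v], v))
    return math.inf

balls = [((0, 0), 0.1), ((1, 0), 0.3), ((1, 1), 0.2), ((2, 1), 0.1)]
graph_edges = [(0, 1), (1, 2), (0, 2), (2, 3), (1, 3)]
weighted = [(u, v, worst_case_weight(*balls[u], *balls[v]))
            for u, v in graph_edges]
print("worst-case shortest path 0 -> 3:", dijkstra(4, weighted, 0, 3))
```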

We consider the nonparametric regression problem when the covariates are located on an unknown smooth compact submanifold of a Euclidean space. Defining a random geometric graph structure over the covariates, we analyze the asymptotic frequentist behaviour of the posterior distribution arising from Bayesian priors designed through random basis expansion in the graph Laplacian eigenbasis. Under Hölder smoothness assumptions on the regression function and the density of the covariates over the submanifold, we prove that the posterior contraction rates of such methods are minimax optimal (up to logarithmic factors) for any positive smoothness index.
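The prior construction can be sketched in a few lines: build a neighbourhood graph over the covariates, take the low-frequency eigenvectors of its Laplacian as a basis, and perform conjugate Bayesian linear regression on the coefficients. The truncation level and prior scales below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Minimal sketch: covariates on a circle (the unknown submanifold),
# a random geometric graph, and a Gaussian prior on coefficients in
# the graph Laplacian eigenbasis.  K, tau, sigma are assumptions.
rng = np.random.default_rng(1)
n, K, tau, sigma = 200, 20, 1.0, 0.1

theta = rng.uniform(0, 2 * np.pi, n)           # latent coordinates
X = np.c_[np.cos(theta), np.sin(theta)]        # observed covariates
y = np.sin(3 * theta) + sigma * rng.normal(size=n)

# Random geometric graph: connect points within distance eps = 0.3.
D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
W = (D < 0.3).astype(float) - np.eye(n)
L = np.diag(W.sum(1)) - W                      # graph Laplacian
_, U = np.linalg.eigh(L)
Phi = U[:, :K]                                 # low-frequency eigenbasis

# Conjugate Bayesian linear regression in the eigenbasis.
A = Phi.T @ Phi / sigma**2 + np.eye(K) / tau**2
post_mean = np.linalg.solve(A, Phi.T @ y / sigma**2)
fitted = Phi @ post_mean
print("RMSE of posterior mean fit:",
      np.sqrt(np.mean((fitted - np.sin(3 * theta))**2)))
```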

We give for the first time a complete explicit description of the quasi-cyclic structure of all classical finite generalized quadrangles. Using these descriptions, we construct families of quasi-cyclic LDPC codes derived from the point-line incidence matrices of the quadrangles by explicitly calculating quasi-cyclic generator and parity-check matrices for these codes. This allows us to construct parity-check and generator matrices of all such codes of length up to 400,000. These codes cover a wide range of transmission rates, are easy and fast to implement, and perform close to Shannon's limit with no visible error floors. We also include some performance data for these codes. Furthermore, we give a complete explicit description of the quasi-cyclic structure of the point-line and point-hyperplane incidences of the finite projective and affine spaces.
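The quasi-cyclic structure means the parity-check matrix is a block matrix of circulant blocks, so it is determined by a small exponent matrix. The sketch below assembles such a matrix from shift exponents; the exponent matrix used is a toy example, not one derived from a generalized quadrangle.

```python
import numpy as np

# Assemble a quasi-cyclic parity-check matrix from an exponent matrix.
# Entry e >= 0 stands for the identity cyclically shifted by e
# positions; -1 stands for the all-zero block.
def circulant(shift, size):
    if shift < 0:
        return np.zeros((size, size), dtype=int)
    return np.roll(np.eye(size, dtype=int), shift, axis=1)

def qc_matrix(exponents, size):
    return np.block([[circulant(e, size) for e in row]
                     for row in exponents])

E = [[0, 1, 2, -1],
     [2, 0, -1, 1]]
H = qc_matrix(E, 5)
print(H.shape)                 # (10, 20)
print(H.sum(axis=1))           # row weights of the sparse code
```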

Robotic exploration has long captivated researchers aiming to map complex environments efficiently. Techniques such as potential fields and frontier exploration have traditionally been employed in this pursuit, primarily focusing on solitary agents. Recent advancements have shifted towards optimizing exploration efficiency through multi-agent systems. However, many existing approaches overlook critical real-world factors, such as broadcast range limitations, communication costs, and coverage overlap. This paper addresses these gaps by proposing a distributed maze exploration strategy (CU-LVP) that assumes constrained broadcast ranges and utilizes Voronoi diagrams for better area partitioning. By adapting traditional multi-agent methods to distributed environments with limited broadcast ranges, this study evaluates their performance across diverse maze topologies, demonstrating the efficacy and practical applicability of the proposed method. The code and experimental results supporting this study are available at https://github.com/manouslinard/multiagent-exploration/.
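The core partitioning idea can be sketched as follows: each frontier cell is claimed by the nearest agent, and an agent only competes for frontiers inside its broadcast radius. This is a simplified illustration of Voronoi-style assignment under range constraints, not the CU-LVP implementation itself; names and the Manhattan metric are assumptions.

```python
# Voronoi-style frontier assignment under a limited broadcast range.
def manhattan(a, b):
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def partition_frontiers(agents, frontiers, broadcast_range):
    """agents: {name: (x, y)}; returns {name: [claimed frontier cells]}."""
    assigned = {name: [] for name in agents}
    for cell in frontiers:
        # Only agents that can hear about this frontier compete for it.
        in_range = {n: p for n, p in agents.items()
                    if manhattan(p, cell) <= broadcast_range}
        if in_range:
            winner = min(in_range,
                         key=lambda n: manhattan(in_range[n], cell))
            assigned[winner].append(cell)
    return assigned

agents = {"A": (0, 0), "B": (6, 6)}
frontiers = [(1, 2), (5, 5), (3, 3), (9, 9)]
# (3, 3) and (9, 9) are out of everyone's range and stay unclaimed.
print(partition_frontiers(agents, frontiers, broadcast_range=5))
```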

Differential abundance analysis is a key component of microbiome studies. Although dozens of methods exist, there is currently no consensus on which should be preferred. Correctness of results in differential abundance analysis is an ambiguous concept that cannot be evaluated without simulated data, but we argue that consistency of results across datasets should be considered an essential quality of a well-performing method. We compared the performance of 14 differential abundance analysis methods on datasets from 54 taxonomic profiling studies based on 16S rRNA gene or shotgun sequencing. For each method, we examined how well the results replicated between random partitions of each dataset and between datasets from independent studies. While certain methods showed good consistency, some widely used methods produced a substantial number of conflicting findings. Overall, the highest consistency without unnecessary loss of sensitivity was attained by analyzing relative abundances with a non-parametric method (Wilcoxon test or ordinal regression model) or with linear regression (MaAsLin2). Comparable performance was also attained by analyzing presence/absence of taxa with logistic regression.
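The best-performing simple pipeline named above is easy to reproduce in outline: convert counts to relative abundances, test each taxon with the Wilcoxon rank-sum test, and adjust for multiple testing. The sketch below uses simulated counts with a planted effect for illustration; it assumes scipy >= 1.11 for the Benjamini-Hochberg helper.

```python
import numpy as np
from scipy.stats import mannwhitneyu, false_discovery_control

# Simulated counts for 40 samples x 30 taxa with a planted effect in
# the first three taxa (all values illustrative).
rng = np.random.default_rng(2)
counts = rng.poisson(lam=rng.uniform(1, 50, size=30), size=(40, 30))
group = np.array([0] * 20 + [1] * 20)
counts[group == 1, :3] *= 3                      # planted signal

rel = counts / counts.sum(axis=1, keepdims=True) # relative abundances
pvals = np.array([mannwhitneyu(rel[group == 0, j],
                               rel[group == 1, j]).pvalue
                  for j in range(rel.shape[1])])
qvals = false_discovery_control(pvals)           # Benjamini-Hochberg
print("taxa with q < 0.05:", np.where(qvals < 0.05)[0])
```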

The problems of optimal recovery of univariate functions and their derivatives are studied. To solve these problems, two variants of the truncation method are constructed, both order-optimal in the sense of accuracy as well as in the amount of Galerkin information involved. For numerical summation, we establish how the parameters characterizing the problem affect its stability.
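The truncation method itself is simple to illustrate: sum only the first N(δ) noisy coefficients, with N chosen to balance the truncation error against noise amplification. The decay exponent and balancing rule below are assumptions made for the toy example, not parameters from the paper.

```python
import numpy as np

# Truncation method on the trigonometric (Galerkin) basis: keep only
# the first N(delta) noisy Fourier coefficients.  The balancing rule
# N ~ delta**(-1/(mu+1)) assumes coefficients decaying like k**(-mu-1);
# both mu and the rule are illustrative assumptions.
rng = np.random.default_rng(3)
x = np.linspace(0, 2 * np.pi, 512, endpoint=False)
mu, delta = 2.0, 1e-3
ks = np.arange(1, 200)
coeffs = ks ** (-mu - 1.0)                       # smooth target function
noisy = coeffs + delta * rng.normal(size=ks.size)

N = int(delta ** (-1.0 / (mu + 1.0)))            # truncation level
f_true = sum(c * np.sin(k * x) for k, c in zip(ks, coeffs))
f_rec = sum(c * np.sin(k * x) for k, c in zip(ks[:N], noisy[:N]))
print(f"N(delta) = {N}, sup-error = {np.abs(f_rec - f_true).max():.2e}")
```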

Predictive posterior densities (PPDs) are of interest in approximate Bayesian inference. Typically, they are estimated by simple Monte Carlo (MC) averages over samples from the approximate posterior. We observe that the signal-to-noise ratio (SNR) of such estimators can be extremely low. An analysis for exact inference reveals that the SNR decays exponentially as (a) the mismatch between training and test data, (b) the dimensionality of the latent space, or (c) the size of the test data relative to the training data increases. Further analysis extends these results to approximate inference. To remedy the low-SNR problem, we propose replacing simple MC sampling with importance sampling, using a proposal distribution optimized at test time on a variational proxy for the SNR, and demonstrate that this yields greatly improved estimates.
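The SNR collapse and the importance-sampling remedy can be reproduced on a one-dimensional toy model. In the sketch below, the optimized proposal is replaced by a Gaussian placed near the exact tilted posterior, standing in for the variational optimization described in the paper; the model and all parameter values are assumptions for illustration.

```python
import numpy as np
from scipy.stats import norm

# Toy model: PPD p(y*) = E_q[p(y* | z)] with q = N(0, 1) and
# p(y* | z) = N(y*; z, 1), so the exact value is N(y*; 0, 2).  Plain
# MC degrades as y* moves away from the training data; a proposal
# adapted to y* (near the exact tilted posterior N(y*/2, 1/2))
# restores a usable SNR.
rng = np.random.default_rng(4)
y_star, S = 6.0, 10_000
true_ppd = norm.pdf(y_star, 0.0, np.sqrt(2.0))

z = rng.normal(0.0, 1.0, S)                     # simple MC from q
mc = norm.pdf(y_star, z, 1.0)

zp = rng.normal(y_star / 2, 0.8, S)             # importance proposal
w = (norm.pdf(zp, 0.0, 1.0) * norm.pdf(y_star, zp, 1.0)
     / norm.pdf(zp, y_star / 2, 0.8))

for name, est in [("simple MC", mc), ("importance sampling", w)]:
    snr = est.mean() / (est.std() / np.sqrt(S))
    print(f"{name}: estimate {est.mean():.3e} (true {true_ppd:.3e}), "
          f"SNR {snr:.1f}")
```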

Temporal reasoning with conditionals is more complex than both classical temporal reasoning and reasoning with timeless conditionals, and can lead to rather counter-intuitive conclusions. For instance, Aristotle's famous "Sea Battle Tomorrow" puzzle leads to a fatalistic conclusion: either there will be a sea battle tomorrow or there will not, and whichever it is, that is necessarily the case already now. We propose a branching-time logic LTC to formalise reasoning about temporal conditionals and provide that logic with adequate formal semantics. The logic LTC extends the Nexttime fragment of CTL* with operators for model updates, which restrict the domain to those future moments where the antecedent can still be satisfied. We provide formal semantics for these operators that implements the restrictor interpretation of antecedents of temporalized conditionals by suitably restricting the domain of discourse. As a motivating example, we demonstrate that a version of the 'Sea Battle' argument naturally formalised in our logic is unsound, thereby resolving the problem of the fatalist conclusion it entails: the underlying reasoning-per-cases argument no longer applies when the cases are treated not as material implications but as temporal conditionals. On the technical side, we analyze the semantics of LTC and provide a series of reductions of LTC-formulae, first recursively eliminating the dynamic update operators and then the path quantifiers. Using these reductions we obtain a sound and complete axiomatization for LTC and reduce its decision problem to that of the modal logic KD.
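The restrictor mechanism can be illustrated on a two-moment branching model: both sea-battle premises come out true under the update reading, yet the fatalist conclusion fails, so reasoning per cases is unsound for these conditionals. The encoding below is a toy sketch of the domain-restriction idea, not the paper's semantics verbatim; all names are illustrative.

```python
# A two-moment branching model given as a set of histories
# (state at t=0, state at t=1).
histories = {("now", "battle"), ("now", "no_battle")}

def necessarily(prop, model):
    """'Necessarily prop tomorrow': prop holds at t=1 on every history."""
    return all(prop(h[1]) for h in model)

def restrictor_if(antecedent, consequent, model):
    """Evaluate 'if A tomorrow, C' by updating: restrict the model to
    histories whose tomorrow-moment can still satisfy A."""
    restricted = {h for h in model if antecedent(h[1])}
    return consequent(restricted)

battle = lambda s: s == "battle"
no_battle = lambda s: s == "no_battle"

# Both premises hold under the restrictor reading ...
p1 = restrictor_if(battle, lambda m: necessarily(battle, m), histories)
p2 = restrictor_if(no_battle, lambda m: necessarily(no_battle, m), histories)
# ... yet the fatalist conclusion fails in the unrestricted model.
concl = necessarily(battle, histories) or necessarily(no_battle, histories)
print(p1, p2, concl)   # True True False
```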
