
We present subquadratic algorithms in the algebraic decision-tree model for several \textsc{3Sum}-hard geometric problems, all of which can be reduced to the following question: Given two sets $A$, $B$, each consisting of $n$ pairwise disjoint segments in the plane, and a set $C$ of $n$ triangles in the plane, we want to count, for each triangle $\Delta\in C$, the number of intersection points between the segments of $A$ and those of $B$ that lie in $\Delta$. The problems considered in this paper have been studied by Chan~(2020), who gave algorithms that solve them, in the standard real-RAM model, in $O((n^2/\log^2n)\log^{O(1)}\log n)$ time. We present solutions in the algebraic decision-tree model whose cost is $O(n^{60/31+\varepsilon})$, for any $\varepsilon>0$. Our approach is based on a primal-dual range searching mechanism, which exploits the multi-level polynomial partitioning machinery recently developed by Agarwal, Aronov, Ezra, and Zahl~(2020). A key step in the procedure is a variant of point location in arrangements, say of lines in the plane, which is based solely on the \emph{order type} of the lines, a "handicap" that turns out to be beneficial for speeding up our algorithm.
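For concreteness, here is a quadratic-time brute-force reference for the counting question itself (not the paper's subquadratic algorithm): it intersects every segment of $A$ with every segment of $B$ and tests containment in each triangle of $C$. The helper names are hypothetical, and degenerate configurations (collinear overlaps, touching endpoints) are ignored.

```python
# Brute-force reference for the counting problem (not the subquadratic algorithm):
# for each triangle, count intersection points of segments in A with segments in B
# that fall inside the triangle. Segments are pairs of points; triangles are point triples.

def orient(p, q, r):
    """Sign of the cross product (q - p) x (r - p)."""
    return (q[0]-p[0])*(r[1]-p[1]) - (q[1]-p[1])*(r[0]-p[0])

def seg_intersection(s, t):
    """Intersection point of segments s and t if they properly cross, else None."""
    (p1, p2), (p3, p4) = s, t
    d1, d2 = orient(p3, p4, p1), orient(p3, p4, p2)
    d3, d4 = orient(p1, p2, p3), orient(p1, p2, p4)
    if (d1 > 0) != (d2 > 0) and (d3 > 0) != (d4 > 0):
        u = d1 / (d1 - d2)                      # parameter along segment s
        return (p1[0] + u*(p2[0]-p1[0]), p1[1] + u*(p2[1]-p1[1]))
    return None

def in_triangle(pt, tri):
    """True if pt lies in the closed triangle tri = (a, b, c)."""
    a, b, c = tri
    s1, s2, s3 = orient(a, b, pt), orient(b, c, pt), orient(c, a, pt)
    return (s1 >= 0 and s2 >= 0 and s3 >= 0) or (s1 <= 0 and s2 <= 0 and s3 <= 0)

def count_inside(A, B, C):
    """For each triangle in C, count A-B intersection points lying inside it."""
    pts = [p for a in A for b in B if (p := seg_intersection(a, b)) is not None]
    return [sum(in_triangle(p, tri) for p in pts) for tri in C]
```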

Related content

The ACM/IEEE 23rd International Conference on Model Driven Engineering Languages and Systems (MODELS) is the premier conference series for model-driven software and systems engineering, organized with the support of ACM SIGSOFT and IEEE TCSE. Since 1998, MODELS has covered all aspects of modeling, from languages and methods to tools and applications. Its participants come from diverse backgrounds, including researchers, academics, engineers, and industry practitioners. MODELS 2019 is a forum in which participants exchange cutting-edge research results and innovative practical experience around modeling and model-driven software and systems. This year's edition offers the modeling community an opportunity to further advance the foundations of modeling and to present innovative applications of modeling in emerging areas such as cyber-physical systems, embedded systems, socio-technical systems, cloud computing, big data, machine learning, security, open source, and sustainability.
November 8, 2021

Model selection in contextual bandits is an important complementary problem to regret minimization with respect to a fixed model class. We consider the simplest non-trivial instance of model selection: distinguishing a simple multi-armed bandit problem from a linear contextual bandit problem. Even in this instance, current state-of-the-art methods explore in a suboptimal manner and require strong "feature-diversity" conditions. In this paper, we introduce new algorithms that a) explore in a data-adaptive manner, and b) provide model selection guarantees of the form $\mathcal{O}(d^{\alpha} T^{1- \alpha})$ with no feature-diversity conditions whatsoever, where $d$ denotes the dimension of the linear model and $T$ denotes the total number of rounds. The first algorithm enjoys a "best-of-both-worlds" property, simultaneously recovering two prior results that hold under distinct distributional assumptions. The second removes distributional assumptions altogether, expanding the scope for tractable model selection. Our approach extends to model selection among nested linear contextual bandits under some additional assumptions.
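To make the trade-off expressed by this guarantee concrete, here are two purely illustrative substitutions into the stated bound (no claim is made here about which exponents the algorithms attain):

\[
\alpha = \tfrac{1}{2}:\ \mathcal{O}\big(d^{1/2}T^{1/2}\big) = \mathcal{O}\big(\sqrt{dT}\big), \qquad \alpha = \tfrac{1}{4}:\ \mathcal{O}\big(d^{1/4}T^{3/4}\big);
\]

smaller $\alpha$ trades a milder dependence on the dimension $d$ for a worse dependence on the horizon $T$.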

In the study of geometric problems, the complexity class $\exists \mathbb{R}$ has turned out to play a crucial role. It exhibits a deep connection between purely geometric problems and real algebra, and is sometimes referred to as the "real analogue" of the class NP. While NP can be considered a class of computational problems that deals with existentially quantified Boolean variables, $\exists \mathbb{R}$ deals with existentially quantified real variables. In analogy to $\Pi_2^p$ and $\Sigma_2^p$ in the famous polynomial hierarchy, we introduce and motivate the complexity classes $\forall\exists \mathbb{R}$ and $\exists \forall \mathbb{R}$ with real variables. Our main interest is focused on the Area Universality problem, where we are given a plane graph $G$ and ask whether, for each assignment of areas to the inner faces of $G$, there is an area-realizing straight-line drawing of $G$. We conjecture that Area Universality is $\forall\exists \mathbb{R}$-complete and support this conjecture by a series of partial results, in which we prove $\exists \mathbb{R}$- and $\forall\exists \mathbb{R}$-completeness of variants of Area Universality. To do so, we also introduce the first tools to study $\forall\exists \mathbb{R}$, such as restricted variants of UETR, which are $\forall\exists \mathbb{R}$-complete. Finally, we present geometric problems as candidates for $\forall\exists \mathbb{R}$-complete problems. These problems have connections to the concepts of imprecision, robustness, and extendability.
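For orientation, the prototypical decision problems behind these classes can be phrased as deciding the truth of sentences of the following shapes (a standard ETR/UETR-style formulation, sketched here):

\[
\exists\mathbb{R}:\quad \exists x_1,\ldots,x_n\in\mathbb{R}\;\, \Phi(x_1,\ldots,x_n),
\qquad
\forall\exists\mathbb{R}:\quad \forall x_1,\ldots,x_m\in\mathbb{R}\;\, \exists y_1,\ldots,y_n\in\mathbb{R}\;\, \Phi(x_1,\ldots,x_m,y_1,\ldots,y_n),
\]

where $\Phi$ is a quantifier-free formula built from polynomial equations and inequalities with integer coefficients.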

We revisit Hopcroft's problem and related fundamental problems about geometric range searching. Given $n$ points and $n$ lines in the plane, we show how to count the number of point-line incidence pairs or the number of point-above-line pairs in $O(n^{4/3})$ time, which matches the conjectured lower bound and improves the best previous time bound of $n^{4/3}2^{O(\log^*n)}$ obtained almost 30 years ago by Matou\v{s}ek. We describe two interesting and different ways to achieve the result: the first is randomized and uses a new 2D version of fractional cascading for arrangements of lines; the second is deterministic and uses decision trees in a manner inspired by the sorting technique of Fredman (1976). The second approach extends to any constant dimension. Many consequences follow from these new ideas: for example, we obtain an $O(n^{4/3})$-time algorithm for line segment intersection counting in the plane, $O(n^{4/3})$-time randomized algorithms for bichromatic closest pair and Euclidean minimum spanning tree in three or four dimensions, and a randomized data structure for halfplane range counting in the plane with $O(n^{4/3})$ preprocessing time and space and $O(n^{1/3})$ query time.
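As a point of reference for the quantities being counted, the naive quadratic-time computation is immediate; the sketch below assumes non-vertical lines given in slope-intercept form and uses floating-point comparisons, so it is illustrative only and is not one of the $O(n^{4/3})$ algorithms.

```python
# Quadratic-time reference for Hopcroft-type counting:
# points are (x, y) pairs, lines are (a, b) pairs representing y = a*x + b.

def count_incidences_and_above(points, lines, eps=0.0):
    """Return (#point-line incidences, #point-above-line pairs)."""
    incidences = above = 0
    for (x, y) in points:
        for (a, b) in lines:
            diff = y - (a * x + b)
            if abs(diff) <= eps:     # on the line (exact arithmetic needed for true incidences)
                incidences += 1
            elif diff > 0:           # strictly above the line
                above += 1
    return incidences, above
```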

Probabilistic pushdown automata (pPDA) are a standard operational model for programming languages involving discrete random choices, procedures, and returns. Temporal properties are useful for gaining insight into the chronological order of events during program execution. Existing approaches in the literature have focused mostly on $\omega$-regular and LTL properties. In this paper, we study the model checking problem of pPDA against $\omega$-visibly pushdown languages that can be described by specification logics such as CaRet and are strictly more expressive than $\omega$-regular properties. With these logical formulae, it is possible to specify properties that explicitly take the structured computations arising from procedural programs into account. For example, CaRet is able to match procedure calls with their corresponding future returns, and thus allows one to express fundamental program properties such as total and partial correctness.

We consider an important generalization of the Steiner tree problem, the \emph{Steiner forest problem}, in the Euclidean plane: the input is a multiset $X \subseteq \mathbb{R}^2$, partitioned into $k$ color classes $C_1, C_2, \ldots, C_k \subseteq X$. The goal is to find a minimum-cost Euclidean graph $G$ such that every color class $C_i$ is connected in $G$. We study this Steiner forest problem in the streaming setting, where the stream consists of insertions and deletions of points in $X$. Each input point $x\in X$ arrives with its color $\textsf{color}(x) \in [k]$, and, as usual for dynamic geometric streams, the input points are restricted to the discrete grid $\{0, \ldots, \Delta\}^2$. We design a single-pass streaming algorithm that uses $\mathrm{poly}(k \cdot \log\Delta)$ space and time, and estimates the cost of an optimal Steiner forest solution within a ratio arbitrarily close to the famous Euclidean Steiner ratio $\alpha_2$ (currently $1.1547 \le \alpha_2 \le 1.214$). This approximation guarantee matches the state-of-the-art bound for streaming Steiner tree, i.e., the case $k=1$. Our approach relies on a novel combination of streaming techniques, such as sampling and linear sketching, with the classical Arora-style dynamic-programming framework for geometric optimization problems, which usually requires large memory and has so far not been applied in the streaming setting. We complement our streaming algorithm for the Steiner forest problem with simple arguments showing that any finite approximation requires $\Omega(k)$ bits of space.

We study the revenue guarantees and approximability of item pricing. Recent work shows that with $n$ heterogeneous items, item pricing guarantees an $O(\log n)$ approximation to the optimal revenue achievable by any (buy-many) mechanism, even when buyers have arbitrary combinatorial valuations. However, finding good item prices is challenging -- it is known that, even under unit-demand valuations, it is NP-hard to find item prices that approximate the revenue of the optimal item pricing better than $O(\sqrt{n})$. Our work provides a more fine-grained analysis of the revenue guarantees and computational complexity in terms of the number of item ``categories'', which may be significantly fewer than $n$. We assume the items are partitioned into $k$ categories such that items within a category are totally ordered and a buyer's value for a bundle depends only on the best item it contains from each category. We show that item pricing guarantees an $O(\log k)$ approximation to the optimal (buy-many) revenue and provide a PTAS for computing the optimal item pricing when $k$ is constant. We also provide a matching lower bound showing that the problem is (strongly) NP-hard even when $k=1$. Our results naturally extend to the case where items are only partially ordered, in which case the revenue guarantees and computational complexity depend on the width of the partial ordering, i.e., the largest set in which no two items are comparable.
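To pin down the valuation model, here is a small brute-force sketch of a single buyer's purchase decision under given item prices. The encoding (per-category lists of `(quality, price)` pairs and a `value` function on the tuple of best chosen qualities) is a hypothetical illustration; the enumeration is exponential in $k$ and is in no way the PTAS. Since a bundle's value depends only on the best item per category, buying a dominated item from the same category adds price but no value, so restricting attention to at most one item per category loses nothing.

```python
from itertools import product

def buyer_revenue(items, value):
    """Revenue from one buyer who picks the utility-maximizing bundle.

    items: list of k categories, each a list of (quality, price) pairs.
    value: function mapping a k-tuple of chosen qualities (None = category skipped)
           to the buyer's value for that bundle.
    """
    best_utility, revenue = 0.0, 0.0          # the empty bundle is always available
    for choice in product(*[[None] + cat for cat in items]):
        qualities = tuple(it[0] if it is not None else None for it in choice)
        price = sum(it[1] for it in choice if it is not None)
        utility = value(qualities) - price
        if utility > best_utility:
            best_utility, revenue = utility, price
    return revenue
```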

A 1965 problem due to Danzer asks whether there exists a set in Euclidean space with finite density intersecting any convex body of volume one. A recent approach to this problem is concerned with the construction of dense forests and is obtained by a suitable weakening of the volume constraint. A dense forest is a discrete point set of finite density that gets uniformly close to all sufficiently long line segments. The distribution of points in a dense forest is then quantified in terms of a visibility function. Another way to weaken the assumptions in Danzer's problem is to relax the density constraint. In this respect, a new concept is introduced in this paper, namely that of an optical forest. An optical forest in $\mathbb{R}^{d}$ is a point set with optimal visibility but not necessarily with finite density. In the literature, the best constructions of Danzer sets and dense forests lack effectiveness. The goal of this paper is to provide constructions of dense and optical forests that yield the best known results in any dimension $d \ge 2$ in terms of visibility bounds, density bounds, and effectiveness. Namely, there are three main results in this work: (1) the construction of a dense forest with the best known visibility bound which, furthermore, enjoys the property of being deterministic; (2) the deterministic construction of an optical forest whose density fails to be finite only up to a logarithmic factor; and (3) the construction of a planar Peres-type forest (that is, a dense forest obtained from a construction due to Peres) with the best known visibility bound. This is achieved by constructing a deterministic digital sequence satisfying strong dispersion properties.
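For reference, the central notion used above can be phrased as follows (a sketch of the standard definition from the dense-forest literature): a set $F \subseteq \mathbb{R}^d$ of finite density is a dense forest with visibility function $V$ if

\[
\operatorname{dist}(S, F) \le \varepsilon \quad \text{for every } \varepsilon \in (0,1) \text{ and every line segment } S \subset \mathbb{R}^d \text{ of length at least } V(\varepsilon),
\]

that is, every sufficiently long segment passes within distance $\varepsilon$ of some point of $F$. An optical forest, as used above, retains such a visibility guarantee while dropping the finite-density requirement.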

We introduce a novel model-theoretic framework inspired by graph modification and based on the interplay between model theory and algorithmic graph minors. We propose a new compound logic operating with two types of sentences, expressing graph modification: the modulator sentence, defining some property of the modified part of the graph, and the target sentence, defining some property of the resulting graph. In our framework, modulator sentences are in monadic second-order logic and have models of bounded treewidth, while target sentences express first-order logic properties along with minor-exclusion. Our logic captures problems that are not definable in first-order logic and, moreover, may have instances of unbounded treewidth. It also permits the modelling of wide families of problems involving vertex/edge removals, alternative modulator measures (such as elimination distance or G-treewidth), multistage modifications, and various cut problems. Our main result is that, for this compound logic, model checking can be done in quadratic time. This algorithmic meta-theorem encompasses, unifies, and extends all known meta-algorithmic results on minor-closed graph classes. Moreover, all derived algorithms are constructive, and this, as a byproduct, extends the constructibility horizon of the algorithmic applications of the Graph Minors theorem of Robertson and Seymour. The proposed logic can be seen as a general framework to capitalize on the potential of the irrelevant vertex technique.

In clustering problems, a central decision-maker is given a complete metric graph over vertices and must provide a clustering of vertices that minimizes some objective function. In fair clustering problems, vertices are endowed with a color (e.g., membership in a group), and the features of a valid clustering might also include the representation of colors in that clustering. Prior work in fair clustering assumes complete knowledge of group membership. In this paper, we generalize prior work by assuming imperfect knowledge of group membership through probabilistic assignments. We present clustering algorithms in this more general setting with approximation ratio guarantees. We also address the problem of "metric membership", where different groups have a notion of order and distance. Experiments are conducted using our proposed algorithms as well as baselines to validate our approach and also surface nuanced concerns when group membership is not known deterministically.

We investigate the adversarial robustness of streaming algorithms. In this context, an algorithm is considered robust if its performance guarantees hold even if the stream is chosen adaptively by an adversary that observes the outputs of the algorithm along the stream and can react in an online manner. While deterministic streaming algorithms are inherently robust, many central problems in the streaming literature do not admit sublinear-space deterministic algorithms; on the other hand, classical space-efficient randomized algorithms for these problems are generally not adversarially robust. This raises the natural question of whether there exist efficient adversarially robust (randomized) streaming algorithms for these problems. In this work, we show that the answer is positive for various important streaming problems in the insertion-only model, including distinct elements and more generally $F_p$-estimation, $F_p$-heavy hitters, entropy estimation, and others. For all of these problems, we develop adversarially robust $(1+\varepsilon)$-approximation algorithms whose required space matches that of the best known non-robust algorithms up to a $\text{poly}(\log n, 1/\varepsilon)$ multiplicative factor (and in some cases even up to a constant factor). Towards this end, we develop several generic tools allowing one to efficiently transform a non-robust streaming algorithm into a robust one in various scenarios.
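One generic robustification idea from this line of work is often called sketch switching: run several independent copies of a non-robust algorithm and refresh the published estimate only when it has drifted by a $(1+\varepsilon)$ factor, moving to a fresh copy at each refresh so that each published value comes from a copy whose output has not been revealed before. The sketch below is a purely illustrative rendering of that idea for a monotone quantity (e.g., distinct elements) in an insertion-only stream; `make_estimator`, the copy count, and the drift test are assumptions, and it is not presented as the paper's construction.

```python
import math

class SketchSwitcher:
    """Illustrative 'sketch switching' wrapper for a monotone estimate
    over an insertion-only stream.

    All copies process every update; the published estimate is refreshed only
    when the current copy's estimate has grown by a (1 + eps) factor, and each
    refresh moves on to the next, previously unrevealed copy.
    """

    def __init__(self, make_estimator, eps, max_value):
        # Enough copies to cover every (1 + eps)-scale level up to max_value.
        levels = 1 + math.ceil(math.log(max(max_value, 2)) / math.log(1 + eps))
        self.copies = [make_estimator() for _ in range(levels)]
        self.eps = eps
        self.current = 0
        self.published = 0.0

    def update(self, item):
        for copy in self.copies:
            copy.update(item)

    def query(self):
        if self.copies[self.current].query() > (1 + self.eps) * max(self.published, 1.0):
            # Estimate has drifted: reveal a fresh copy's value and retire the old one.
            self.current = min(self.current + 1, len(self.copies) - 1)
            self.published = self.copies[self.current].query()
        return self.published
```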
