Jacobi's results on the computation of the order and of the normal forms of a differential system are translated into the formalism of differential algebra. In the quasi-regular case, we give complete proofs following Jacobi's arguments. The main result is {\it Jacobi's bound}, still conjectural in the general case: the order of a differential system $P_{1}, \ldots, P_{n}$ is not greater than the maximum $\cO$ of the sums $\sum_{i=1}^{n} a_{i,\sigma(i)}$ over all permutations $\sigma$ of the indices, where $a_{i,j}:={\rm ord}_{x_{j}}P_{i}$, \emph{viz.}\ the \emph{tropical determinant of the matrix $(a_{i,j})$}. The order is precisely equal to $\cO$ iff Jacobi's \emph{truncated determinant} does not vanish. Jacobi also gave a polynomial-time algorithm to compute $\cO$, similar to Kuhn's ``Hungarian method'' and to some variants of shortest-path algorithms; it is related to the computation of integers $\ell_{i}$ such that a normal form may be obtained, in the generic case, by differentiating the equation $P_{i}$ $\ell_{i}$ times. Fundamental results about changes of orderings and the various normal forms a system may have, including differential resolvents, are also provided.
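As a quick illustration of the quantity $\cO$ (and not of Jacobi's polynomial-time algorithm), the tropical determinant can be computed by brute force over all permutations; the matrices in the test are made-up examples:

```python
from itertools import permutations

def tropical_det(a):
    """Tropical (max-plus) determinant: the maximum over all permutations
    sigma of sum_i a[i][sigma(i)].  This exhaustive version runs in O(n!)
    and is only meant to define the quantity; Jacobi's algorithm (akin to
    the Hungarian method) computes it in polynomial time."""
    n = len(a)
    return max(sum(a[i][s[i]] for i in range(n))
               for s in permutations(range(n)))
```

Here `a[i][j]` plays the role of $a_{i,j}={\rm ord}_{x_j}P_i$, the order of equation $P_i$ in the variable $x_j$.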
In this paper we present a finite element analysis for a Dirichlet boundary control problem governed by the Stokes equation. The Dirichlet control is sought in a closed convex subset of the energy space $\mathbf{H}^1(\Omega)$. Most previous works on the Stokes Dirichlet boundary control problem deal with either tangential control or the case where the flux of the control is zero. This choice of control is very particular, and the associated formulation leads to a control with limited regularity. To overcome this difficulty, we introduce the Stokes problem with an outflow condition, where the control acts on the Dirichlet boundary only; hence our control is more general, having both tangential and normal components. We prove well-posedness and discuss the regularity of the control problem. The first-order optimality condition for the control leads to a Signorini problem. We develop a two-level finite element discretization using $\mathbf{P}_1$ elements (on the fine mesh) for the velocity and the control variable and $P_0$ elements (on the coarse mesh) for the pressure variable. A standard energy error analysis gives convergence of order $\frac{1}{2}+\frac{\delta}{2}$ when the control lies in $\mathbf{H}^{\frac{3}{2}+\delta}(\Omega)$; we improve this to $\frac{1}{2}+\delta$, which is optimal. When the control lies in a less regular space, we derive the optimal order of convergence up to the regularity. The theoretical results are corroborated by a variety of numerical tests.
Let a polytope $\mathcal{P}$ be defined in one of the following ways: (i) $\mathcal{P} = \{x \in \mathbb{R}^n \colon A x \leq b\}$, where $A \in \mathbb{Z}^{(n+m) \times n}$, $b \in \mathbb{Z}^{(n+m)}$, and $\operatorname{rank}(A) = n$, (ii) $\mathcal{P} = \{x \in \mathbb{R}_+^n \colon A x = b\}$, where $A \in \mathbb{Z}^{m \times n}$, $b \in \mathbb{Z}^{m}$, and $\operatorname{rank}(A) = m$, and let all rank-order minors of $A$ be bounded by $\Delta$ in absolute value. We show that $|\mathcal{P} \cap \mathbb{Z}^n|$ can be computed by an algorithm with the arithmetic complexity bound $$ O\bigl( \nu(d,m,\Delta) \cdot d^3 \cdot \Delta^4 \cdot \log(\Delta) \bigr), $$ where $d = \dim(\mathcal{P})$ and $\nu(d,m,\Delta)$ is the maximal possible number of vertices in a $d$-dimensional polytope $\mathcal{P}$ defined by one of the systems above. Using this result, we obtain the following arithmetic complexity bounds for computing $|\mathcal{P} \cap \mathbb{Z}^n|$: 1) the bound $O(\frac{d}{m}+1)^m \cdot d^3 \cdot \Delta^4 \cdot \log(\Delta)$, which is polynomial in $d$ and $\Delta$ for any fixed $m$; 2) the bound $O\bigl(\frac{m}{d}+1\bigr)^{\frac{d}{2}} \cdot d^4 \cdot \Delta^4 \cdot \log(\Delta)$, which is polynomial in $m$ and $\Delta$ for any fixed $d$; 3) the bound $O(d)^{4 + \frac{d}{2}} \cdot \Delta^{4+d} \cdot \log(\Delta)$, which is polynomial in $\Delta$ for any fixed $d$. These bounds can be used to obtain faster algorithms for the ILP feasibility problem and for counting integer points in a simplex or in an unbounded Subset-Sum polytope. Unbounded and parametric versions of the above problem are also considered.
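To make the counted quantity $|\mathcal{P} \cap \mathbb{Z}^n|$ concrete, here is an exponential brute-force count over a bounding box for representation (i), $Ax \le b$; it only illustrates the definition and is in no way the paper's efficient algorithm:

```python
from itertools import product

def count_lattice_points(A, b, box):
    """Count integer points x with A x <= b inside the integer box given
    as a list of (lo, hi) ranges per coordinate.  Exhaustive enumeration:
    exponential in the dimension, for illustration only."""
    count = 0
    for x in product(*(range(lo, hi + 1) for lo, hi in box)):
        if all(sum(ai * xi for ai, xi in zip(row, x)) <= bi
               for row, bi in zip(A, b)):
            count += 1
    return count
```

For instance, the triangle $\{x, y \ge 0,\ x + y \le 2\}$ contains six integer points.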
In the negative perceptron problem we are given $n$ data points $({\boldsymbol x}_i,y_i)$, where ${\boldsymbol x}_i$ is a $d$-dimensional vector and $y_i\in\{+1,-1\}$ is a binary label. The data are not linearly separable and hence we content ourselves with finding a linear classifier with the largest possible \emph{negative} margin. In other words, we want to find a unit norm vector ${\boldsymbol \theta}$ that maximizes $\min_{i\le n}y_i\langle {\boldsymbol \theta},{\boldsymbol x}_i\rangle$. This is a non-convex optimization problem (it is equivalent to finding a maximum norm vector in a polytope), and we study its typical properties under two random models for the data. We consider the proportional asymptotics in which $n,d\to \infty$ with $n/d\to\delta$, and prove upper and lower bounds on the maximum margin $\kappa_{\text{s}}(\delta)$ or -- equivalently -- on its inverse function $\delta_{\text{s}}(\kappa)$. In other words, $\delta_{\text{s}}(\kappa)$ is the overparametrization threshold: for $n/d\le \delta_{\text{s}}(\kappa)-\varepsilon$ a classifier achieving vanishing training error exists with high probability, while for $n/d\ge \delta_{\text{s}}(\kappa)+\varepsilon$ it does not. Our bounds on $\delta_{\text{s}}(\kappa)$ match to the leading order as $\kappa\to -\infty$. We then analyze a linear programming algorithm to find a solution, and characterize the corresponding threshold $\delta_{\text{lin}}(\kappa)$. We observe a gap between the interpolation threshold $\delta_{\text{s}}(\kappa)$ and the linear programming threshold $\delta_{\text{lin}}(\kappa)$, raising the question of the behavior of other algorithms.
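The objective $\min_{i\le n} y_i\langle {\boldsymbol \theta},{\boldsymbol x}_i\rangle$ over unit vectors is easy to evaluate even though maximizing it is non-convex; the sketch below pairs the objective with a naive random search (purely illustrative, not the paper's LP-based algorithm):

```python
import math
import random

def margin(theta, data):
    """min_i y_i * <theta / ||theta||, x_i> over the data set."""
    norm = math.sqrt(sum(t * t for t in theta))
    return min(y * sum(t * x for t, x in zip(theta, xi)) / norm
               for xi, y in data)

def random_search(data, d, iters=2000, seed=0):
    """Naive random search over directions: sample Gaussian vectors and
    keep the one with the largest margin.  Only illustrates the objective
    landscape; it has no optimality guarantee."""
    rng = random.Random(seed)
    best_theta, best = None, -float("inf")
    for _ in range(iters):
        theta = [rng.gauss(0.0, 1.0) for _ in range(d)]
        m = margin(theta, data)
        if m > best:
            best, best_theta = m, theta
    return best_theta, best
```

On separable toy data the optimal margin is positive; in the negative perceptron regime the same code returns the best (least negative) achievable value it finds.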
The Johnson-Lindenstrauss lemma states that random projections can be used as a topology-preserving embedding technique for fixed vectors. In this paper, we try to understand how random projections affect the probabilistic properties of random vectors. In particular, we prove that the distribution of the inner product of two independent random vectors $X, Z \in \mathbb{R}^n$ is preserved by a random projection $S:\mathbb{R}^n \to \mathbb{R}^m$. More precisely, \[ \sup_t \left| \text{P}\left(\frac{1}{C_{m,n}} X^TS^TSZ < t\right) - \text{P}\left(\frac{1}{\sqrt{n}} X^TZ < t\right) \right| \le O\left(\frac{1}{\sqrt{n}}+ \frac{1}{\sqrt{m}} \right). \] As a by-product, we obtain a product central limit theorem (product-CLT) for $\sum_{k=1}^{n} X_k Y_k$, where $\{X_k\}$ is a martingale difference sequence and $\{Y_k\}$ has dependence within the sequence. We also obtain the rate of convergence in the spirit of the Berry-Esseen theorem.
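A small Monte Carlo sketch of the underlying mechanism: for a projection $S$ with i.i.d.\ $N(0,1)$ rows, $(Sx)^T(Sz)/m$ is an unbiased estimator of $x^Tz$ (a standard fact; the paper's normalization $C_{m,n}$ and distributional statement are more delicate and are not reproduced here):

```python
import random

def projected_inner(x, z, m, rng):
    """Estimate <x, z> via a Gaussian random projection with m rows:
    if S has iid N(0,1) entries, E[(Sx).(Sz) / m] = <x, z>."""
    n = len(x)
    total = 0.0
    for _ in range(m):
        row = [rng.gauss(0.0, 1.0) for _ in range(n)]
        sx = sum(r * xi for r, xi in zip(row, x))
        sz = sum(r * zi for r, zi in zip(row, z))
        total += sx * sz
    return total / m

rng = random.Random(42)
x = [0.6, 0.8, 0.0]
z = [0.0, 0.6, 0.8]
# True inner product is 0.48; average many projections to see unbiasedness.
est = sum(projected_inner(x, z, 200, rng) for _ in range(500)) / 500
```

Averaging over repeated projections makes the estimate concentrate around $\langle x, z\rangle = 0.48$.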
Makespan minimization on identical machines is a fundamental problem in online scheduling. The goal is to assign a sequence of jobs to $m$ identical parallel machines so as to minimize the maximum completion time of any job. Already in the 1960s, Graham showed that the Greedy strategy, which schedules each job on a currently least-loaded machine, is $(2-1/m)$-competitive. The best deterministic online algorithm currently known achieves a competitive ratio of 1.9201. No deterministic online strategy can obtain a competitiveness smaller than 1.88. In this paper, we study online makespan minimization in the popular random-order model, where the jobs of a given input arrive as a random permutation. It is known that Greedy does not attain a competitive factor asymptotically smaller than 2 in this setting. We present the first improved performance guarantees. Specifically, we develop a deterministic online algorithm that achieves a competitive ratio of 1.8478. The result relies on a new analysis approach. We identify a set of properties that a random permutation of the input jobs satisfies with high probability. Then we conduct a worst-case analysis of our algorithm for the respective class of permutations. The analysis implies that the stated competitiveness holds not only in expectation but with high probability. Moreover, it provides mathematical evidence that job sequences leading to higher performance ratios are extremely rare, pathological inputs. We complement these results with lower bounds for the random-order model. We show that no deterministic online algorithm can achieve a competitive ratio smaller than 4/3. Moreover, no deterministic online algorithm can attain a competitiveness smaller than 3/2 with high probability.
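Graham's Greedy (list scheduling) is simple enough to state in a few lines; this is the classical algorithm from the abstract, not the paper's new 1.8478-competitive strategy:

```python
def greedy_makespan(jobs, m):
    """Graham's Greedy / list scheduling: assign each arriving job to a
    currently least-loaded machine; return the resulting makespan."""
    loads = [0] * m
    for p in jobs:
        i = loads.index(min(loads))  # a least-loaded machine
        loads[i] += p
    return max(loads)
```

For example, jobs (2, 3, 4, 5, 6, 7) on 3 machines yield a Greedy makespan of 11, while the optimum 9 is achieved by the pairs {2,7}, {3,6}, {4,5}; the ratio 11/9 respects Graham's bound $2 - 1/m = 5/3$.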
This paper studies makespan minimization in the secretary model. Formally, jobs, specified by their processing times, are presented in a uniformly random order. An online algorithm has to assign each job permanently and irrevocably to one of $m$ parallel and identical machines such that the time it takes to process them all, the makespan, is minimized in expectation. We give two deterministic algorithms. First, a straightforward adaptation of the semi-online strategy LightLoad provides a very simple algorithm that retains its competitive ratio of 1.75. A new and sophisticated algorithm is 1.535-competitive. These competitive ratios are obtained not only in expectation but, in fact, for all but a very tiny fraction of job orders. Classically, online makespan minimization only considers the worst-case order, where no competitive ratio below 1.885 for deterministic algorithms and 1.581 using randomization is possible; the best randomized algorithm known so far is 1.916-competitive. Our results show that classical worst-case orders are quite rare and pessimistic for many applications. They also demonstrate the power of randomization when compared to much stronger deterministic reordering models. We complement our results by providing the first lower bounds. A competitive ratio obtained on nearly all possible job orders must be at least 1.257. This implies a lower bound of 1.043 for both deterministic and randomized algorithms in the general model.
Data anomalies, i.e., the specific patterns of data operations that may violate the consistency of a database, still lack a unified definition. Known data anomalies include Dirty Write, Dirty Read, Non-repeatable Read, Phantom, Read Skew, Write Skew, etc. In order to improve the efficiency of concurrency control algorithms, data anomalies are also used to define isolation levels, because weak isolation levels can improve the efficiency of transaction processing systems. This paper systematically studies data anomalies and the corresponding isolation levels. We report twenty-two new data anomalies that previous papers have not reported, and classify all data anomalies systematically. Based on the classification of data anomalies, two new isolation-level systems with different granularity are proposed, which reveals the rule for defining isolation levels based on data anomalies and makes the understanding of data anomalies and isolation levels more concise.
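As a concrete example of one of the classical anomalies named above, the following toy simulation shows Write Skew: two transactions each validate the invariant $x + y \ge 0$ against their own snapshot, write disjoint items, and jointly break the invariant (a minimal sketch of the anomaly pattern, not of any particular database engine):

```python
# Shared database state satisfying the invariant x + y >= 0.
db = {"x": 1, "y": 1}

# Both transactions take a snapshot before either one writes.
snapshot_t1 = dict(db)
snapshot_t2 = dict(db)

# T1: its snapshot says x + y >= 2, so writing x = -1 seems safe locally.
if snapshot_t1["x"] + snapshot_t1["y"] >= 2:
    db["x"] = -1

# T2: performs the same check on its own (stale) snapshot and writes y.
if snapshot_t2["x"] + snapshot_t2["y"] >= 2:
    db["y"] = -1

# Each write was locally consistent, yet the global invariant is broken.
violated = db["x"] + db["y"] < 0
```

Because the two writes touch disjoint items, neither transaction conflicts with the other under snapshot-style reads, which is exactly why Write Skew escapes weaker isolation levels.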
In this paper, we propose a new high-order semi-implicit scheme for the all-Mach full Euler equations of gas dynamics. Material waves are treated explicitly, while acoustic waves are treated implicitly, thus avoiding severe CFL restrictions for low-Mach flows. High-order accuracy in time is obtained by a semi-implicit temporal integrator based on the IMEX Runge-Kutta (IMEX-RK) framework. High order in space is achieved by finite difference WENO schemes with characteristic-wise reconstructions adapted to the semi-implicit IMEX-RK time discretization. Type A IMEX schemes are constructed to handle initial conditions that are not well prepared. Moreover, these schemes are proven to be asymptotic preserving and asymptotically accurate as the Mach number vanishes for well-prepared initial conditions. The divergence-free property of the time-discrete schemes is proved. The proposed scheme also captures discontinuous solutions well in the compressible regime, especially for two-dimensional Riemann problems. Numerical tests in one and two space dimensions illustrate the effectiveness of the proposed schemes.
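The explicit/implicit splitting idea can be sketched on a scalar model problem $y' = \cos(t) - y/\varepsilon$, where $\varepsilon$ plays the role of the stiffness (Mach-like) parameter: the non-stiff forcing is explicit, the stiff relaxation implicit. This first-order IMEX Euler step is a toy analogue only, not the paper's high-order IMEX-RK/WENO scheme:

```python
import math

def imex_euler(y0, eps, dt, steps):
    """First-order IMEX splitting for y' = cos(t) - y/eps.
    The forcing cos(t) is taken explicitly, the stiff term -y/eps
    implicitly, so the update solves
        y_new = y + dt*(cos(t) - y_new/eps)
    in closed form and dt is not restricted by eps."""
    y, t = y0, 0.0
    for _ in range(steps):
        y = (y + dt * math.cos(t)) / (1.0 + dt / eps)
        t += dt
    return y
```

With $\varepsilon = 10^{-6}$ and $\mathrm{d}t = 10^{-2}$, a fully explicit Euler step would be violently unstable ($\mathrm{d}t/\varepsilon = 10^4$), while the IMEX step relaxes the solution smoothly onto the slow manifold $y \approx \varepsilon \cos t$.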
The Grundy number of a graph is the maximum number of colours used by the ``First-Fit'' greedy colouring algorithm over all vertex orderings. Given a vertex ordering $\sigma= v_1,\dots,v_n$, the ``First-Fit'' greedy colouring algorithm colours the vertices in the order of $\sigma$, assigning to each vertex the smallest colour unused in its neighbourhood. By restricting this procedure to vertex orderings that are connected, we obtain {\em connected greedy colourings}. For some graphs, all connected greedy colourings use exactly $\chi(G)$ colours; they are called {\em good graphs}. At the other extreme, some graphs do not admit any connected greedy colouring using only $\chi(G)$ colours; they are called {\em ugly graphs}. We show that no perfect graph is ugly. We also give simple proofs of this fact for subclasses of perfect graphs (block graphs, comparability graphs), and show that no $K_4$-minor-free graph is ugly.
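First-Fit is short enough to write out; the path on four vertices then shows how the ordering drives the number of colours (here $\chi = 2$ but the Grundy number is 3):

```python
def first_fit(adj, order):
    """First-Fit greedy colouring along the given vertex order: each
    vertex receives the smallest colour (1, 2, ...) not used by its
    already-coloured neighbours.  Returns the colour assignment."""
    colour = {}
    for v in order:
        used = {colour[u] for u in adj[v] if u in colour}
        c = 1
        while c in used:
            c += 1
        colour[v] = c
    return colour

# Path 0-1-2-3: the natural order uses chi = 2 colours, but colouring
# the two endpoints first forces a third colour on vertex 2.
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
```

The ordering 0, 3, 1, 2 gives both endpoints colour 1, vertex 1 colour 2, and then vertex 2 sees colours {1, 2} and needs colour 3, witnessing the Grundy number of the path.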
The Normalized Cut (NCut) objective function, widely used in data clustering and image segmentation, quantifies the cost of a graph partition in a way that favours balance: clusters or segments that are balanced attain lower values than unbalanced partitions. However, this bias is so strong that NCut avoids singleton partitions entirely, even when a vertex is only very weakly connected to the rest of the graph. Motivated by the B\"uhler-Hein family of balanced cut costs, we propose the family of Compassionately Conservative Balanced (CCB) Cut costs, which are indexed by a parameter that can be used to strike a compromise between the desire to avoid too many singleton partitions and the notion that all partitions should be balanced. We show that CCB-Cut minimization can be relaxed into an orthogonally constrained $\ell_{\tau}$-minimization problem that coincides with the problem of computing Piecewise Flat Embeddings (PFE) for one particular index value, and we present an algorithm for solving the relaxed problem by iteratively minimizing a sequence of reweighted Rayleigh quotients (IRRQ). Using images from the BSDS500 database, we show that image segmentation based on CCB-Cut minimization provides better accuracy with respect to ground truth and greater variability in region size than NCut-based image segmentation.
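The bias against singletons is visible already in the standard NCut value of a bipartition, $\mathrm{cut}(S,\bar S)/\mathrm{vol}(S) + \mathrm{cut}(S,\bar S)/\mathrm{vol}(\bar S)$; the sketch below evaluates it on a tiny two-cluster graph (this is the classical NCut definition, not the CCB-Cut family proposed in the paper):

```python
def ncut(W, S):
    """Normalized Cut of the bipartition (S, complement) for a symmetric
    weight matrix W: cut/vol(S) + cut/vol(complement)."""
    n = len(W)
    S = set(S)
    T = set(range(n)) - S
    cut = sum(W[i][j] for i in S for j in T)

    def vol(C):
        return sum(W[i][j] for i in C for j in range(n))

    return cut / vol(S) + cut / vol(T)

# Two unit-weight triangles {0,1,2} and {3,4,5} joined by a weak edge 2-3.
W = [[0, 1, 1, 0.0, 0, 0],
     [1, 0, 1, 0.0, 0, 0],
     [1, 1, 0, 0.1, 0, 0],
     [0, 0, 0.1, 0, 1, 1],
     [0, 0, 0.0, 1, 0, 1],
     [0, 0, 0.0, 1, 1, 0]]
```

Splitting between the triangles is cheap, while carving out the single vertex 0 costs far more, even though its connection to the rest is not weak here; with a genuinely weakly connected vertex the same penalty is what CCB-Cuts are designed to soften.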