
Many modern datasets exhibit dependencies among observations as well as variables. This gives rise to the challenging problem of analyzing high-dimensional matrix-variate data with unknown dependence structures. To address this challenge, Kalaitzis et al. (2013) proposed the Bigraphical Lasso (BiGLasso), an estimator for precision matrices of matrix-normals based on the Cartesian product of graphs. Subsequently, Greenewald, Zhou and Hero (GZH 2019) introduced a multiway tensor generalization of the BiGLasso estimator, known as the TeraLasso estimator. In this paper, we provide sharper rates of convergence in the Frobenius and operator norms for both the BiGLasso and TeraLasso estimators of inverse covariance matrices, improving upon the rates presented in GZH 2019. In particular, (a) we strengthen the bounds on the relative errors in the operator and Frobenius norms by a factor of approximately $\log p$; and (b) crucially, this improvement allows finite-sample estimation errors in both norms to be derived for the two-way Kronecker sum model. The two-way regime is important because it is the most theoretically challenging setting and, simultaneously, the most common in applications. Normality is not needed in our proofs; instead, we consider sub-Gaussian ensembles and derive tight concentration-of-measure bounds using tensor unfolding techniques. The proof techniques may be of independent interest.
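
The Kronecker-sum structure at the heart of both estimators can be made concrete in a few lines. A minimal NumPy sketch (the 2x2 factors `Psi` and `Theta` are hypothetical) showing the construction $\Psi \oplus \Theta = \Psi \otimes I + I \otimes \Theta$ and the pairwise-sum eigenvalue identity it inherits:

```python
import numpy as np

def kronecker_sum(A, B):
    """Kronecker sum A ⊕ B = A ⊗ I_q + I_p ⊗ B for square A (p x p), B (q x q)."""
    p, q = A.shape[0], B.shape[0]
    return np.kron(A, np.eye(q)) + np.kron(np.eye(p), B)

# Hypothetical 2x2 row- and column-precision factors.
Psi = np.array([[2.0, -1.0], [-1.0, 2.0]])
Theta = np.array([[3.0, 0.5], [0.5, 3.0]])
Omega = kronecker_sum(Psi, Theta)   # 4x4 precision of the vectorized data matrix

# The eigenvalues of A ⊕ B are all pairwise sums of the factors' eigenvalues,
# part of what keeps the Kronecker-sum model tractable.
pair_sums = np.sort(np.add.outer(np.linalg.eigvalsh(Psi),
                                 np.linalg.eigvalsh(Theta)).ravel())
```

Note how the two-way model couples row and column dependence through a sum rather than a product, which is what distinguishes it from the Kronecker-product covariance models.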

Related content

Power posteriors "robustify" standard Bayesian inference by raising the likelihood to a constant fractional power, effectively downweighting its influence in the calculation of the posterior. Power posteriors have been shown to be more robust to model misspecification than standard posteriors in many settings. Previous work has shown that power posteriors derived from low-dimensional, parametric locally asymptotically normal models are asymptotically normal (Bernstein-von Mises) even under model misspecification. We extend these results to show that the power posterior moments converge to those of the limiting normal distribution suggested by the Bernstein-von Mises theorem. We then use this result to show that the mean of the power posterior, a point estimator, is asymptotically equivalent to the maximum likelihood estimator.
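
The tempering mechanism is easy to see in a conjugate example. A minimal sketch (model and parameter values are illustrative, not from the paper): for a normal likelihood with known variance and a normal prior, raising the likelihood to the power eta simply multiplies the data precision by eta, so the power posterior stays normal and shrinks toward the prior as eta decreases:

```python
import numpy as np

def power_posterior_normal(x, eta, mu0=0.0, tau0=1.0, sigma=1.0):
    """Power posterior for the mean of N(mu, sigma^2) under a N(mu0, tau0^2) prior.

    The likelihood is raised to the power eta, so its precision contribution
    is scaled by eta; conjugacy keeps the posterior normal.
    """
    n = len(x)
    prec = 1.0 / tau0**2 + eta * n / sigma**2
    mean = (mu0 / tau0**2 + eta * np.sum(x) / sigma**2) / prec
    return mean, 1.0 / np.sqrt(prec)

x = np.array([1.0, 2.0, 3.0])
full_mean, full_sd = power_posterior_normal(x, eta=1.0)   # standard posterior
half_mean, half_sd = power_posterior_normal(x, eta=0.5)   # downweighted likelihood
```

With eta < 1 the likelihood is downweighted, so the posterior mean is pulled toward the prior mean (0 here) and the posterior spread widens, which is the robustification effect the abstract describes.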

Semivariance is a measure of the dispersion of all observations that fall above the mean or target value of a random variable, and it plays an important role in life-length, actuarial and income studies. In this paper, we develop a new non-parametric test for equality of upper semivariance. We use U-statistic theory to derive the test statistic and then study its asymptotic properties. We also develop a jackknife empirical likelihood (JEL) ratio test for equality of upper semivariance. Extensive Monte Carlo simulation studies are carried out to validate the performance of the proposed JEL-based test. We illustrate the test procedure using real data.
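
The quantity being compared can be sketched directly. A minimal implementation (conventions differ on the divisor; this sketch divides by the full sample size, and the sample values are illustrative):

```python
import numpy as np

def upper_semivariance(x, target=None):
    """Mean squared deviation above a target (default: the sample mean).

    Only observations exceeding the target contribute, which is the notion
    of dispersion compared across samples by the test described above.
    """
    x = np.asarray(x, dtype=float)
    t = x.mean() if target is None else target
    return np.mean(np.maximum(x - t, 0.0) ** 2)

sample = np.array([1.0, 2.0, 3.0, 10.0])   # mean is 4.0; only 10.0 lies above it
```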

Kernel-based multi-marker tests for survival outcomes primarily use the Cox model to adjust for covariates. The proportional hazards assumption made by the Cox model can be unrealistic, especially under long-term follow-up. We develop a suite of novel multi-marker survival tests for genetic association based on the accelerated failure time model, a popular alternative to the Cox model owing to its direct physical interpretation. The tests are based on the asymptotic distributions of their test statistics and are thus computationally efficient. The association tests can account for the heterogeneity of genetic effects across sub-populations/individuals to increase power. All the new tests can handle competing risks and left truncation. Moreover, we develop small-sample corrections that improve the tests' accuracy with small samples. Extensive numerical experiments show that the new tests perform very well in various scenarios. An application to a genetic dataset of Alzheimer's disease illustrates the tests' practical utility.

Anderson acceleration (AA) is a technique for accelerating the convergence of an underlying fixed-point iteration. AA is widely used within computational science, with applications ranging from electronic structure calculation to the training of neural networks. Despite AA's widespread use, relatively little is understood about it theoretically. An important and unanswered question in this context is: To what extent can AA actually accelerate convergence of the underlying fixed-point iteration? While simple enough to state, this question appears rather difficult to answer. For example, it is unanswered even in the simplest (non-trivial) case where the underlying fixed-point iteration consists of applying a two-dimensional affine function. In this note we consider a restarted variant of AA applied to solve symmetric linear systems with restart window of size one. Several results are derived from the analytical solution of a nonlinear eigenvalue problem characterizing residual propagation of the AA iteration. This includes a complete characterization of the method to solve $2 \times 2$ linear systems, rigorously quantifying how the asymptotic convergence factor depends on the initial iterate, and quantifying by how much AA accelerates the underlying fixed-point iteration. We also prove that even if the underlying fixed-point iteration diverges, the associated AA iteration may still converge.
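
A depth-one AA step is simple enough to write out. The sketch below is a textbook AA(1) iteration for scalar fixed-point maps (for scalars it coincides with the secant method on $g(x)-x$); it is not the restarted variant with window size one analyzed in the note, and the test map `cos` is just a classic illustration:

```python
import math

def anderson_m1(g, x0, iters=20):
    """Depth-one Anderson acceleration for a fixed point of g.

    Each step mixes the two most recent fixed-point updates with the weight
    that minimizes the magnitude of the combined residual (a 1-D least
    squares problem).
    """
    x_prev, x = x0, g(x0)
    for _ in range(iters):
        f_prev, f = g(x_prev) - x_prev, g(x) - x
        df = f - f_prev
        if df == 0:                     # residuals stagnated: converged
            break
        beta = f / df                   # least-squares mixing weight
        x_prev, x = x, g(x) - beta * (g(x) - g(x_prev))
    return x

# Accelerate the classic fixed-point iteration x <- cos(x).
x_star = anderson_m1(math.cos, 1.0)
```

Plain iteration of `cos` converges only linearly (rate about 0.67 near the fixed point); the accelerated iteration reaches machine-level residuals in a handful of steps, illustrating the kind of speed-up the note quantifies rigorously.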

Soft robotic shape estimation and proprioception are challenging because of soft robots' complex deformation behaviors and infinite degrees of freedom. A soft robot's continuously deforming body makes it difficult to integrate rigid sensors and to reliably estimate its shape. In this work, we present the Proprioceptive Omnidirectional End-effector (POE), which has six microphones embedded across the tendon-driven soft robot's surface. We first introduce novel applications of previously proposed 3D reconstruction methods to acoustic signals from the microphones for soft robot shape proprioception. To improve the proprioception pipeline's training efficiency and model prediction consistency, we present POE-M. POE-M first predicts key-point positions from the acoustic signal observations with the embedded microphone array; it then uses an energy-minimization method to reconstruct a physically admissible high-resolution mesh of POE given the estimated key points. We evaluate the mesh reconstruction module with simulated data and the full POE-M pipeline with real-world experiments. Ablation studies demonstrate that POE-M's explicit guidance of the key points during the mesh reconstruction process provides robustness and stability to the pipeline. POE-M reduced the maximum Chamfer distance error by 23.10% compared to state-of-the-art end-to-end soft robot proprioception models and achieved a 4.91 mm average Chamfer distance error during evaluation.

The rise of AI in human contexts places new demands on automated systems to be transparent and explainable. We examine some anthropomorphic ideas and principles relevant to such accountability in order to develop a theoretical framework for thinking about digital systems in complex human contexts and the problem of explaining their behaviour. Structurally, systems are made of modular and hierarchical components, which we abstract in a new system model using notions of modes and mode transitions. A mode is an independent component of the system with its own objectives, monitoring data, and algorithms. The behaviour of a mode, including its transitions to other modes, is determined by functions that interpret each mode's monitoring data in the light of its objectives and algorithms. We show how these belief functions can help explain system behaviour by visualising their evaluation as trajectories in higher-dimensional geometric spaces. These ideas are formalised mathematically by abstract and concrete simplicial complexes. We offer three techniques: a framework for design heuristics, a general system theory based on modes, and a geometric visualisation, and apply them in three types of human-centred systems.

The power law is useful in describing count phenomena such as network degrees and word frequencies. With a single parameter, it captures the main feature that the frequencies are linear on the log-log scale. Nevertheless, there have been criticisms of the power law, for example that a threshold needs to be pre-selected without its uncertainty quantified, that the power law is simply inadequate, and that subsequent hypothesis tests are required to determine whether the data could have come from the power law. We propose a modelling framework that combines two different generalisations of the power law, namely the generalised Pareto distribution and the Zipf-polylog distribution, to resolve these issues. The proposed mixture distributions are shown to fit the data well and quantify the threshold uncertainty in a natural way. A model selection step embedded in the Bayesian inference algorithm further answers the question whether the power law is adequate.
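
The "linear on the log-log scale" property is the one-parameter feature the framework builds on. A minimal noise-free sketch (the constant, exponent, and range are illustrative) showing that a least-squares line on log-log axes recovers the exponent exactly:

```python
import numpy as np

# For a pure power law f(k) = C * k**(-alpha), the log-frequencies are linear
# in log k, so ordinary least squares on the log-log scale recovers the
# exponent. Noise-free frequencies, for illustration only.
alpha = 1.8
k = np.arange(1, 101)
f = 1000.0 * k ** (-alpha)
slope, intercept = np.polyfit(np.log(k), np.log(f), 1)   # slope ≈ -alpha
```

Real count data deviate from this line in the tail or below a threshold, which is exactly where the generalised Pareto and Zipf-polylog components of the proposed mixture take over.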

We use tropical algebras as platforms for a very efficient digital signature protocol. Security relies on the computational hardness of factoring one-variable tropical polynomials, a problem known to be NP-hard.
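
For readers unfamiliar with the setting, a short sketch of min-plus tropical polynomial arithmetic (the example polynomials are arbitrary; this illustrates the algebra, not the signature protocol itself). In the min-plus semiring, "addition" is `min` and "multiplication" is `+`, so a one-variable tropical polynomial is the piecewise-linear function $p(x) = \min_i (a_i + i x)$, and factoring asks to write such a function as a tropical product of linear pieces:

```python
# Min-plus (tropical) semiring: "addition" is min, "multiplication" is +.

def trop_eval(coeffs, x):
    """Evaluate the tropical polynomial min_i (coeffs[i] + i*x)."""
    return min(a + i * x for i, a in enumerate(coeffs))

def trop_mul(p, q):
    """Tropical product of polynomials: c_k = min over i+j=k of p_i + q_j."""
    c = [float("inf")] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            c[i + j] = min(c[i + j], a + b)
    return c

p = [0.0, 1.0]          # p(x) = min(0, 1 + x)
q = [2.0, 0.0]          # q(x) = min(2, x)
pq = trop_mul(p, q)     # tropical product corresponds to pointwise p(x) + q(x)
```

Multiplying tropical polynomials is cheap, but recovering the factors from the product is the hard direction the scheme's security rests on.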

Pairwise comparison models are used for quantitatively evaluating utility and ranking in various fields. The increasing scale of modern problems underscores the need to understand statistical inference in these models when the number of subjects diverges, which is currently lacking in the literature except in a few special instances. This paper addresses this gap by establishing an asymptotic normality result for the maximum likelihood estimator in a broad class of pairwise comparison models. The key idea lies in identifying the Fisher information matrix as a weighted graph Laplacian matrix which can be studied via a meticulous spectral analysis. Our findings provide the first unified theory for performing statistical inference in a wide range of pairwise comparison models beyond the Bradley--Terry model, benefiting practitioners with a solid theoretical guarantee for their use. Simulations utilizing synthetic data are conducted to validate the asymptotic normality result, followed by a hypothesis test using a tennis competition dataset.
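
The Bradley-Terry model mentioned above is the simplest member of this class, and its MLE is easy to sketch. A minimal implementation via the classic minorization-maximization update (the win-count data are hypothetical, and the fixed iteration count is a simplification; this is not the paper's method, just the baseline estimator whose asymptotics the theory covers):

```python
import numpy as np

def bradley_terry_mle(wins, iters=500):
    """Bradley-Terry strengths from a win-count matrix via MM iterations.

    wins[i, j] counts how often subject i beat subject j. The update is
    w_i <- W_i / sum_j n_ij / (w_i + w_j), with W_i the total wins of i and
    n_ij the games between i and j, normalized for identifiability.
    """
    n = wins + wins.T                        # games played between each pair
    w = np.ones(wins.shape[0])
    for _ in range(iters):
        denom = (n / np.add.outer(w, w)).sum(axis=1)
        w = wins.sum(axis=1) / denom
        w /= w.sum()                         # fix the scale
    return w

# Hypothetical round-robin results among three players (10 games per pair).
wins = np.array([[0, 8, 9],
                 [2, 0, 7],
                 [1, 3, 0]], dtype=float)
strengths = bradley_terry_mle(wins)
```

The matrix `n` here is also the adjacency matrix of the comparison graph, whose weighted Laplacian is exactly the Fisher information object the abstract's spectral analysis studies.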

Continuous queries over data streams may suffer from blocking operations and/or unbounded waits, which can delay answers until some relevant input arrives through the data stream. These delays may render answers obsolete by the time they arrive, leaving users who sometimes have to make decisions with no help whatsoever. Therefore, it can be useful to provide hypothetical answers - "given the current information, it is possible that X will become true at time t" - instead of no information at all. In this paper we present a semantics for queries and corresponding answers that covers such hypothetical answers, together with an online algorithm for updating the set of facts that are consistent with the currently available information.
