
This paper extends the optimal-trading framework developed in arXiv:2409.03586v1 to compute optimal strategies under real-world constraints. The aim of the current paper, as with the previous one, is to study trading in the context of multi-player non-cooperative games. Whereas the former paper relies on methods from the calculus of variations, with optimal strategies arising as solutions of partial differential equations, the current paper demonstrates that the entire framework can be re-framed as a quadratic programming problem; cast in this light, constraints are readily incorporated into the calculation of optimal strategies. An added benefit is that two-trader equilibria can be calculated as the end points of a dynamic process in which traders make repeated adjustments to each other's strategies.
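
To make the quadratic-programming reformulation concrete, here is a minimal sketch (not the paper's actual formulation) of a two-trader best-response iteration: each trader repeatedly solves a constrained QP against the other's current schedule, and the fixed point of the process plays the role of a two-trader equilibrium. The cost matrix Q, the cross-impact coupling C, the targets, and the per-period limit are hypothetical placeholders.

```python
import numpy as np
import cvxpy as cp

T = 20                          # number of trading periods
rng = np.random.default_rng(0)
M = rng.standard_normal((T, T))
Q = M @ M.T + T * np.eye(T)     # hypothetical positive-definite cost matrix
C = 0.5 * np.eye(T)             # hypothetical cross-impact coupling
target_a, target_b = 1.0, -0.5  # total quantity each trader must execute

def best_response(other, target):
    """Solve one trader's constrained QP given the other trader's schedule."""
    x = cp.Variable(T)
    cost = 0.5 * cp.quad_form(x, Q) + x @ (C @ other)
    constraints = [cp.sum(x) == target,   # complete the target quantity
                   cp.abs(x) <= 0.3]      # per-period participation limit
    cp.Problem(cp.Minimize(cost), constraints).solve()
    return x.value

a, b = np.zeros(T), np.zeros(T)
for _ in range(50):                       # repeated-adjustment process
    a_new = best_response(b, target_a)
    b_new = best_response(a_new, target_b)
    converged = max(np.max(np.abs(a_new - a)), np.max(np.abs(b_new - b))) < 1e-8
    a, b = a_new, b_new
    if converged:                         # fixed point ~ two-trader equilibrium
        break
```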

Related content

This paper deals with the application of probabilistic time integration methods to semi-explicit partial differential-algebraic equations of parabolic type and their semi-discrete counterparts, namely semi-explicit differential-algebraic equations of index 2. The proposed methods iteratively construct a probability distribution over the solution of the deterministic problem, enhancing the information obtained from the numerical simulation. Within this paper, we examine the efficacy of randomized versions of the implicit Euler method, the midpoint scheme, and exponential integrators of first and second order. By demonstrating the consistency and convergence properties of these solvers, we illustrate their utility in capturing the sensitivity of the solution to numerical errors. Our analysis establishes the theoretical validity of randomized time integration for constrained systems and offers insights into the calibration of probabilistic integrators for practical applications.
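
As an illustration of the idea (not the paper's construction), the sketch below applies a randomized variant of the implicit Euler method to a toy semi-explicit index-2 DAE: a Gaussian perturbation is added to the differential increment at each step while the algebraic constraint is enforced exactly, and repeated runs yield an empirical distribution over the numerical solution. The noise scaling sigma * h**1.5 is an assumption chosen to match the first-order accuracy of the method.

```python
import numpy as np
from scipy.optimize import fsolve

# Toy semi-explicit index-2 DAE (not from the paper):
#   y1' = y2 + z,   y2' = -y1 + z,   constraint  0 = y1 + y2.
def f(y, z):
    return np.array([y[1] + z, -y[0] + z])

def g(y):
    return y[0] + y[1]

def randomized_implicit_euler(y0, h, n_steps, sigma=1.0, rng=None):
    """One sample path of a randomized implicit Euler method (assumed noise
    scaling sigma * h**1.5); the algebraic constraint holds at every step."""
    rng = rng or np.random.default_rng()
    y = np.array(y0, dtype=float)
    path = [y.copy()]
    for _ in range(n_steps):
        xi = sigma * h**1.5 * rng.standard_normal(2)

        def residual(u):
            y_new, z_new = u[:2], u[2]
            return np.append(y_new - y - h * f(y_new, z_new) - xi, g(y_new))

        u = fsolve(residual, np.append(y, 0.0))   # solve the implicit step
        y = u[:2]
        path.append(y.copy())
    return np.array(path)

# Repeated runs give an empirical distribution over the numerical solution.
samples = [randomized_implicit_euler([1.0, -1.0], h=0.01, n_steps=200)
           for _ in range(20)]
```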

Evaluating forecasts is essential to understand and improve forecasting and to make forecasts useful to decision makers. Several R packages provide scoring rules, visualisations, and diagnostic tools. One particular challenge, which scoringutils aims to address, is handling the complexity of evaluating and comparing forecasts from several forecasters across multiple dimensions such as time, space, and different types of targets. scoringutils extends the existing landscape by offering a convenient and flexible data.table-based framework for evaluating and comparing probabilistic forecasts (forecasts represented by a full predictive distribution). Notably, scoringutils is the first package to offer extensive support for probabilistic forecasts in the form of predictive quantiles, a format that is currently used by several infectious disease Forecast Hubs. The package is easily extendable, meaning that users can supply their own scoring rules or extend existing classes to handle new types of forecasts. scoringutils provides broad functionality to check the data and diagnose issues, to visualise forecasts and missing data, to transform data before scoring, to handle missing forecasts, to aggregate scores, and to visualise the results of the evaluation. The paper presents the package and its core functionality and illustrates common workflows using example data of forecasts for COVID-19 cases and deaths submitted to the European COVID-19 Forecast Hub.
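
scoringutils itself is an R package; as a language-agnostic illustration of the kind of quantile-based evaluation it supports, the sketch below computes the interval score for a central prediction interval in Python. This shows only the underlying scoring rule, not the scoringutils API.

```python
import numpy as np

def interval_score(observed, lower, upper, alpha):
    """Interval score for a central (1 - alpha) prediction interval.

    Penalises wide intervals and observations falling outside the interval
    (Gneiting & Raftery, 2007); lower scores are better.
    """
    observed, lower, upper = map(np.asarray, (observed, lower, upper))
    width = upper - lower
    below = (2.0 / alpha) * np.clip(lower - observed, 0.0, None)
    above = (2.0 / alpha) * np.clip(observed - upper, 0.0, None)
    return width + below + above

# Example: a 50% interval [90, 120]; the observation 130 lies above it.
print(interval_score(observed=130, lower=90, upper=120, alpha=0.5))  # 70.0
```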

We study the problem of modeling and inference for spatio-temporal count processes. Our approach uses parsimonious parameterisations of multivariate autoregressive count time series models, including possible regression on covariates. We control the number of parameters by imposing spatial neighbourhood structures on the (possibly huge) coefficient matrices that capture spatio-temporal dependencies. This work is motivated by real-data applications that call for such models. Extensive simulation studies show that our approach yields reliable estimators.
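
For intuition, the following sketch simulates one simple member of this model class: a linear Poisson autoregression whose coefficient matrix is restricted to a spatial neighbourhood structure. The model and parameter values are illustrative, not the paper's exact specification.

```python
import numpy as np

def simulate_spatial_count_ar(W, delta, a_self, a_nbr, T, rng=None):
    """Simulate a linear spatio-temporal Poisson autoregression (illustrative).

    Conditional mean: lambda_t = delta + (a_self * I + a_nbr * W) y_{t-1},
    where W is a row-normalised neighbourhood matrix, so each site depends
    only on its own and its neighbours' previous counts.
    """
    rng = rng or np.random.default_rng()
    n = W.shape[0]
    A = a_self * np.eye(n) + a_nbr * W     # sparsity fixed by the neighbourhood
    y = np.zeros((T, n), dtype=int)
    for t in range(1, T):
        lam = delta + A @ y[t - 1]
        y[t] = rng.poisson(lam)
    return y

# Four sites on a line; each site neighbours the adjacent ones.
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
W /= W.sum(axis=1, keepdims=True)          # row-normalise
counts = simulate_spatial_count_ar(W, delta=1.0, a_self=0.4, a_nbr=0.3, T=500)
```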

Split conformal prediction techniques are applied to regression problems with circular responses by introducing a suitable conformity score, leading to prediction sets with adaptive arc length and finite-sample coverage guarantees for any circular predictive model under exchangeable data. Leveraging the high performance of existing predictive models designed for linear responses, we analyze a general projection procedure that converts any linear-response regression model into one suitable for circular responses. When random forests serve as base models in this projection procedure, we harness the out-of-bag dynamics to eliminate the need for a separate calibration sample in the construction of prediction sets. For synthetic and real datasets, the resulting projected random forests model produces more efficient out-of-bag conformal prediction sets, with shorter median arc length, than the split conformal prediction sets generated by two existing alternative models.
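
The basic split conformal construction with an angular-distance conformity score can be sketched as follows; this generic version uses a separate calibration sample and is not the paper's specific score or its out-of-bag variant.

```python
import numpy as np

def angular_distance(theta, phi):
    """Shortest arc length between two angles in [0, 2*pi)."""
    d = np.abs(theta - phi) % (2 * np.pi)
    return np.minimum(d, 2 * np.pi - d)

def split_conformal_arc(pred_calib, y_calib, pred_test, alpha=0.1):
    """Split conformal prediction for circular responses (illustrative sketch).

    The conformity score is the angular distance between prediction and
    observation on a held-out calibration set; the returned radius defines an
    arc [pred - r, pred + r] around each test prediction with approximately
    (1 - alpha) coverage under exchangeability.
    """
    scores = angular_distance(pred_calib, y_calib)
    n = len(scores)
    k = int(np.ceil((n + 1) * (1 - alpha)))      # conformal quantile index
    radius = np.sort(scores)[min(k, n) - 1]
    lower = (pred_test - radius) % (2 * np.pi)
    upper = (pred_test + radius) % (2 * np.pi)
    return lower, upper, radius
```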

Random circuits giving rise to unitary designs are key tools in quantum information science and many-body physics. In this work, we investigate a class of random quantum circuits with a specific gate structure. Within this framework, we prove that one-dimensional structured random circuits with non-Haar random local gates can exhibit substantially more global randomness compared to Haar random circuits with the same underlying circuit architecture. In particular, we derive all the exact eigenvalues and eigenvectors of the second-moment operators for these structured random circuits under a solvable condition, by establishing a link to the Kitaev chain, and show that their spectral gaps can exceed those of Haar random circuits. Our findings have applications in improving circuit depth bounds for randomized benchmarking and the generation of approximate unitary 2-designs from shallow random circuits.

This paper proposes the RePAIR dataset, a challenging benchmark for testing modern computational and data-driven methods for puzzle-solving and reassembly tasks. Our dataset has unique properties that are uncommon in current benchmarks for 2D and 3D puzzle solving. The fragments and fractures are realistic, caused by the collapse of a fresco during a World War II bombing at the Pompeii archaeological park. The fragments are also eroded and have missing pieces with irregular shapes and different dimensions, further challenging the reassembly algorithms. The dataset is multi-modal, providing high-resolution images with characteristic pictorial elements, detailed 3D scans of the fragments, and meta-data annotated by archaeologists. Ground truth has been generated through several years of unceasing fieldwork, including the excavation and cleaning of each fragment, followed by manual puzzle solving by archaeologists of a subset of approximately 1000 pieces among the 16000 available. After digitizing all the fragments in 3D, a benchmark was prepared to challenge current reassembly and puzzle-solving methods, which often solve more simplistic synthetic scenarios. The tested baselines show that there clearly exists a gap to fill in solving this computationally complex problem.

This research introduces a novel hydrofoil-based propulsion framework for unmanned aquatic robots, inspired by the undulating locomotion observed in select aquatic species. The proposed system incorporates a camber-modulating mechanism to enhance hydrofoil propulsive force generation and, ultimately, efficiency. Through dynamic simulations, we validate the effectiveness of the camber-adjusting hydrofoil compared to a symmetric counterpart. The results demonstrate a significant improvement in horizontal thrust, emphasizing the potential of the cambering approach to enhance propulsive performance. Additionally, a prototype flipper design is presented, featuring individual control of heave and pitch motions, as well as a camber-adjustment mechanism. The integrated system not only provides efficient water-based propulsion but also offers the capacity to generate vertical forces during take-off maneuvers for seaplanes. The design is tailored to harness wave energy, contributing to the exploration of alternative energy resources. This work advances the understanding of bionic oscillatory principles for aquatic robots and provides a foundation for future developments in environmentally safe and agile underwater exploration.

We develop and analyze data subsampling techniques for Poisson regression, the standard model for count data $y\in\mathbb{N}$. In particular, we consider the Poisson generalized linear model with ID- and square root-link functions. We consider the method of coresets, which are small weighted subsets that approximate the loss function of Poisson regression up to a factor of $1\pm\varepsilon$. We show $\Omega(n)$ lower bounds against coresets for Poisson regression that continue to hold against arbitrary data reduction techniques up to logarithmic factors. By introducing a novel complexity parameter and a domain shifting approach, we show that sublinear coresets with $1\pm\varepsilon$ approximation guarantee exist when the complexity parameter is small. In particular, the dependence on the number of input points can be reduced to polylogarithmic. We show that the dependence on other input parameters can also be bounded sublinearly, though not always logarithmically. In particular, we show that the square root-link admits an $O(\log(y_{\max}))$ dependence, where $y_{\max}$ denotes the largest count present in the data, while the ID-link requires a $\Theta(\sqrt{y_{\max}/\log(y_{\max})})$ dependence. As an auxiliary result for proving the tightness of the bound with respect to $y_{\max}$ in the case of the ID-link, we show an improved bound on the principal branch of the Lambert $W_0$ function, which may be of independent interest. We further show the limitations of our analysis when $p$th degree root-link functions for $p\geq 3$ are considered, which indicate that other analytical or computational methods would be required if such a generalization is even possible.
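
To fix notation, the sketch below writes out the weighted Poisson regression loss under the ID- and square root-links and evaluates it on a weighted subsample. The uniform subsample is only a placeholder: the paper's coresets are built by importance sampling guided by a complexity parameter, which is what yields the $1\pm\varepsilon$ guarantee.

```python
import numpy as np

def poisson_loss(beta, X, y, link="id", weights=None):
    """Weighted Poisson regression loss (negative log-likelihood up to constants).

    link="id":   mean mu_i = x_i @ beta
    link="sqrt": mean mu_i = (x_i @ beta)**2
    A coreset is a small weighted subset (X_S, y_S, w_S) whose weighted loss
    approximates the full-data loss within a factor of 1 +/- eps.
    """
    eta = X @ beta
    mu = eta if link == "id" else eta**2
    w = np.ones(len(y)) if weights is None else weights
    return np.sum(w * (mu - y * np.log(mu)))

rng = np.random.default_rng(1)
n, d = 10_000, 5
X = np.abs(rng.standard_normal((n, d)))     # nonnegative features keep mu > 0
beta_true = np.abs(rng.standard_normal(d))
y = rng.poisson(X @ beta_true)

# Naive uniform-sampling "coreset" for illustration only; the paper's
# construction uses importance sampling driven by a complexity parameter.
m = 500
idx = rng.choice(n, size=m, replace=False)
w = np.full(m, n / m)                        # reweight so the sums are comparable
full = poisson_loss(beta_true, X, y, "id")
approx = poisson_loss(beta_true, X[idx], y[idx], "id", w)
print(full, approx)
```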

In this work, we propose a novel diagonalization-based preconditioner for the all-at-once linear system arising from the optimal control problem of parabolic equations. The proposed preconditioner is constructed based on an $\epsilon$-circulant modification to the rotated block diagonal (RBD) preconditioning technique, which can be efficiently diagonalized by fast Fourier transforms in a parallel-in-time (PinT) fashion. To our knowledge, this marks the first application of the $\epsilon$-circulant modification to RBD preconditioning. Before our work, studies of PinT preconditioning techniques for the optimal control problem mainly focused on $\epsilon$-circulant modifications to Schur-complement-based preconditioners, which involve the multiplication of forward and backward evolutionary processes and thus square the condition number. Compared with those Schur-complement-based preconditioning techniques in the literature, the advantage of the proposed $\epsilon$-circulant modified RBD preconditioning is that it does not involve the multiplication of forward and backward evolutionary processes. When the generalized minimal residual (GMRES) method is deployed on the preconditioned system, we prove that choosing $\epsilon=\mathcal{O}(\sqrt{\tau})$, with $\tau$ being the temporal step-size, makes the convergence rate of the preconditioned GMRES solver independent of the matrix size and the regularization parameter. This restriction on $\epsilon$ is more relaxed than the assumptions on $\epsilon$ in other works on $\epsilon$-circulant-based preconditioning techniques for the optimal control problem. Numerical results are provided to demonstrate the effectiveness of our proposed solvers.
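
The structural fact exploited for the parallel-in-time diagonalization is that an $\epsilon$-circulant matrix is diagonalized by the FFT after a diagonal scaling. The numpy sketch below verifies this identity on a small example; it illustrates the scalar building block only, not the full block RBD preconditioner.

```python
import numpy as np

def eps_circulant(c, eps):
    """Build an eps-circulant matrix from its first column c: entries wrap
    around as in a circulant, but the wrapped entries are multiplied by eps."""
    n = len(c)
    C = np.zeros((n, n), dtype=complex)
    for j in range(n):
        for k in range(n):
            C[j, k] = c[j - k] if j >= k else eps * c[n + j - k]
    return C

# An eps-circulant matrix is diagonalized by the FFT after a diagonal scaling:
#   C_eps = D^{-1} F* diag(fft(D c)) F D,   with D = diag(eps**(j/n)).
rng = np.random.default_rng(0)
n = 8
c = rng.standard_normal(n)
eps = 0.05
C = eps_circulant(c, eps)

d = eps ** (np.arange(n) / n)                    # diagonal scaling D
lam = np.fft.fft(d * c)                          # eigenvalues of D C D^{-1}
F = np.fft.fft(np.eye(n)) / np.sqrt(n)           # unitary DFT matrix
C_rec = np.diag(1 / d) @ F.conj().T @ np.diag(lam) @ F @ np.diag(d)
print(np.allclose(C, C_rec))                     # True
```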

We introduce a novel approach to error correction decoding in the presence of additive alpha-stable noise, which serves as a model of interference-limited wireless systems. In the absence of modifications to decoding algorithms, treating alpha-stable distributions as Gaussian results in significant performance loss. Building on Guessing Random Additive Noise Decoding (GRAND), we consider two approaches. The first accounts for alpha-stable noise in the evaluation of log-likelihood ratios (LLRs) that serve as input to Ordered Reliability Bits GRAND (ORBGRAND). The second builds on an ORBGRAND variant, originally designed to account for jamming, that treats outlying LLRs as erasures. This results in a hybrid error-and-erasure correcting decoder that corrects errors via ORBGRAND and corrects erasures via Gaussian elimination. The block error rate (BLER) performance of the two approaches is similar. Both outperform decoding that assumes the LLRs originated from Gaussian noise by 2 to 3 dB for [128,112] 5G NR CA-Polar and CRC codes.
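
As a rough illustration of the first approach, the sketch below evaluates BPSK LLRs under symmetric alpha-stable noise using the numerical density from scipy, and then flags outlying LLRs as candidate erasures in the spirit of the hybrid approach. The erasure threshold shown is an arbitrary illustrative choice, not the rule used in the paper.

```python
import numpy as np
from scipy.stats import levy_stable

def sas_llrs(received, alpha, scale):
    """Bit LLRs for BPSK (+1/-1) under symmetric alpha-stable (SaS) noise.

    Evaluates the SaS density numerically; with alpha = 2 this reduces to the
    Gaussian case (up to the scale convention).
    """
    p_plus = levy_stable.pdf(received, alpha, 0.0, loc=1.0, scale=scale)
    p_minus = levy_stable.pdf(received, alpha, 0.0, loc=-1.0, scale=scale)
    return np.log(p_plus) - np.log(p_minus)

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, size=16)
tx = 1.0 - 2.0 * bits                              # BPSK mapping
noise = levy_stable.rvs(1.5, 0.0, scale=0.3, size=tx.size, random_state=0)
rx = tx + noise

llrs = sas_llrs(rx, alpha=1.5, scale=0.3)
# Hybrid approach (illustrative): flag outlying LLRs as erasures before decoding.
erasures = np.abs(llrs) > np.quantile(np.abs(llrs), 0.9)
```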
