
The rapid and massive diffusion of electric vehicles poses new challenges to the electric system, which must be able to supply these new loads, but at the same time opens up new opportunities thanks to the possible provision of ancillary services. Indeed, in the so-called Vehicle-to-Grid (V2G) set-up, the charging power can be modulated throughout the day so that a fleet of vehicles can absorb an excess of power from the grid or provide extra power during a shortage. To this end, many works in the literature focus on optimizing each vehicle's daily charging profile so as to offer the requested ancillary services while guaranteeing a charged battery for each vehicle at the end of the day. However, the size of the economic benefits related to the provision of ancillary services varies significantly with the modeling approach, the underlying assumptions, and the scenarios considered. In this paper we propose a profitability analysis with reference to a recently proposed framework for optimal V2G operation in the presence of uncertainty. We provide necessary and sufficient conditions for profitability in a simplified case, and we show via simulation that they also hold in the general case.
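As a minimal illustration of the kind of per-vehicle scheduling problem involved (all figures below, i.e. prices, charger rating, and energy target, are hypothetical and not taken from the paper), a single vehicle's daily charging profile can be obtained from a small linear program:

import numpy as np
from scipy.optimize import linprog

# Hypothetical data: 24 hourly slots, assumed day-ahead prices, and one EV that
# must recover 30 kWh by the end of the day from a 7 kW charger.
T, dt = 24, 1.0                                          # slots, slot length [h]
price = np.array([0.30] * 7 + [0.20] * 9 + [0.35] * 8)   # EUR/kWh (assumed)
p_max, e_req, eta = 7.0, 30.0, 0.95                      # kW, kWh, charging efficiency

# minimize  sum_t price[t] * p[t] * dt         (energy cost of the profile)
# subject to eta * dt * sum_t p[t] >= e_req    (battery charged by end of day)
#            0 <= p[t] <= p_max                (charger rating)
res = linprog(c=price * dt,
              A_ub=[-eta * dt * np.ones(T)], b_ub=[-e_req],
              bounds=[(0.0, p_max)] * T)
profile = res.x                                          # cheapest feasible profile
print(profile.round(2), float(eta * dt * profile.sum()))

Allowing negative powers (discharge) and adding terms that reward the requested ancillary services would turn this sketch into the kind of V2G scheduling problem discussed above.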

Related content

We numerically investigate the generalized Steklov problem for the modified Helmholtz equation, focusing on the relation between its spectrum and the geometric structure of the domain. We address three distinct aspects: (i) the asymptotic behavior of eigenvalues for polygonal domains; (ii) the dependence of the integrals of eigenfunctions on the domain symmetries; and (iii) the localization and exponential decay of Steklov eigenfunctions away from the boundary, both for smooth shapes and in the presence of corners. For this purpose, we implemented two complementary numerical methods to compute the eigenvalues and eigenfunctions of the associated Dirichlet-to-Neumann operator for various simply connected planar domains. We also discuss applications of the obtained results to the theory of diffusion-controlled reactions and formulate several conjectures of relevance to spectral geometry.
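For concreteness, the eigenvalue problem studied here can be stated in the standard form (with $p > 0$ the parameter of the modified Helmholtz equation and $\partial_n$ the outward normal derivative):
$$ \Delta u = p\,u \quad \text{in } \Omega, \qquad \partial_n u = \mu\, u \quad \text{on } \partial\Omega, $$
so that the Steklov eigenvalues $\mu_0 \le \mu_1 \le \dots$ coincide with the eigenvalues of the Dirichlet-to-Neumann operator $\Lambda_p \colon u|_{\partial\Omega} \mapsto \partial_n u|_{\partial\Omega}$, which maps the boundary trace of a solution of the modified Helmholtz equation to its normal derivative.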

We determine the minimum possible column multiplicity of even, doubly-even, and triply-even codes of a given length. This refines a classification result for the possible lengths of $q^r$-divisible codes over $\mathbb{F}_q$. We also give a few computational results for field sizes $q>2$. Non-existence results for divisible codes with restricted column multiplicities at a given length have applications, e.g., in Galois geometry, and can be used to derive upper bounds on the maximum cardinality of subspace codes.
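For orientation (a brief reminder of the standard notions, not a quotation from the paper): a linear code $C \le \mathbb{F}_q^n$ is $\Delta$-divisible if every codeword weight is a multiple of $\Delta$,
$$ \Delta \mid \operatorname{wt}(c) \quad \text{for all } c \in C, $$
so that over $\mathbb{F}_2$ the even, doubly-even, and triply-even codes are the $2$-, $4$-, and $8$-divisible ones, while the column multiplicity of a point counts how many columns of a generator matrix are equal to it (up to nonzero scalar factors).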

The flexoelectric effect, coupling polarization and strain gradient as well as strain and electric field gradients, is universal to dielectrics but, compared to piezoelectricity, is more difficult to harness, since it requires field gradients and is a small-scale effect. These drawbacks can be overcome by suitably designing metamaterials made of a non-piezoelectric base material but exhibiting apparent piezoelectricity. We develop a theoretical and computational framework to perform topology optimization of the representative volume element of such metamaterials by accurately modeling the governing equations of flexoelectricity using a Cartesian B-spline method, describing the geometry with a level set, and resorting to genetic algorithms for optimization. We consider a multi-objective optimization problem in which area fraction competes with four fundamental piezoelectric functionalities (stress/strain sensor/actuator). We computationally obtain Pareto fronts and discuss the different geometries depending on the apparent piezoelectric coefficient being optimized. In general, we find competitive estimates of apparent piezoelectricity compared to reference materials such as quartz and PZT ceramics. This opens up the possibility of designing devices for sensing, actuation, and energy harvesting from a much wider, cheaper, and more effective class of materials.
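Schematically, and in common textbook notation rather than the paper's, the direct flexoelectric effect couples polarization to the strain gradient through a fourth-order tensor $\mu_{ijkl}$,
$$ P_l = \mu_{ijkl}\,\frac{\partial \varepsilon_{ij}}{\partial x_k}, $$
so a non-piezoelectric base material polarizes wherever the strain varies in space; the apparent piezoelectricity of the metamaterial is then the effective coefficient relating the volume-averaged polarization of the representative volume element to the applied average strain (or stress), which is what the topology optimization seeks to maximize.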

The use of the non-parametric Restricted Mean Survival Time (RMST) endpoint has grown in popularity as trialists look to analyse time-to-event outcomes without the restrictions of the proportional hazards assumption. In this paper, we evaluate the power and type I error rate of the parametric and non-parametric RMST estimators when the treatment effect is explained by multiple covariates, including an interaction term. Utilising the RMST estimator in this way allows the combined treatment effect to be summarised as a one-dimensional estimator, which is evaluated using a one-sided Z-test. The estimators are either fully specified or misspecified, either in terms of unaccounted-for covariates or in terms of misspecified knot points (where trials exhibit crossing survival curves). A placebo-controlled trial of Gamma interferon is used as a motivating example to simulate associated survival times. When correctly specified, the parametric RMST estimator has the greatest power, regardless of the time of analysis. The misspecified RMST estimator generally performs similarly when covariates mirror those of the fitted case-study dataset. However, as the magnitude of the unaccounted-for covariate increases, the power of the estimator decreases. In all cases, the non-parametric RMST estimator has the lowest power, and its power remains heavily dependent on the time of analysis (with a later analysis time associated with greater power).
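For reference, the restricted mean survival time up to a horizon $t^*$ is the area under the survival curve,
$$ \operatorname{RMST}(t^*) = \mathbb{E}\big[\min(T, t^*)\big] = \int_0^{t^*} S(t)\,\mathrm{d}t, $$
estimated non-parametrically by integrating the Kaplan-Meier curve and parametrically by integrating a fitted survival function; the one-sided test of the between-arm difference then takes the usual form $Z = (\widehat{\operatorname{RMST}}_1 - \widehat{\operatorname{RMST}}_0) / \sqrt{\widehat{\operatorname{Var}}_1 + \widehat{\operatorname{Var}}_0}$ (a standard construction; the variance estimators used in the paper may differ).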

The synthesis of information derived from complex networks is a topic of increasing relevance in ecology and environmental sciences. In particular, the aggregation of multilayer networks, i.e. network structures formed by multiple interacting networks (the layers), constitutes a fast-growing field. In several environmental applications, the layers of a multilayer network are modelled as a collection of similarity matrices describing how similar pairs of biological entities are, based on different types of features (e.g. biological traits). The present paper first discusses two main techniques for combining the multilayer information into a single network (the so-called monoplex), namely Similarity Network Fusion (SNF) and Similarity Matrix Average (SMA). The effectiveness of the two methods is then tested on a real-world dataset of the relative abundances of microbial species in the ecosystems of nine glaciers (four in the Alps and five in the Andes). A preliminary clustering analysis of the monoplexes obtained with the different methods shows the emergence of a tightly connected community formed by species that are typical of cryoconite holes worldwide. Moreover, the weights assigned to the different layers by the SMA algorithm suggest that two large South American glaciers (Exploradores and Perito Moreno) are structurally different from the smaller glaciers in both Europe and South America. Overall, these results highlight the importance of integration methods for uncovering the underlying organizational structure of biological entities in multilayer ecological networks.
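As a minimal sketch of the averaging idea behind SMA (the layer weights here are simply user-supplied, whereas the SMA method referenced above determines them from the data), the layers of a multilayer similarity network can be combined into a monoplex as follows:

import numpy as np

def similarity_matrix_average(layers, weights=None):
    # Weighted average of similarity matrices; `layers` has shape (L, n, n).
    layers = np.asarray(layers, dtype=float)
    if weights is None:
        weights = np.full(len(layers), 1.0 / len(layers))
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()                # normalize layer weights
    return np.tensordot(weights, layers, axes=1)     # monoplex, shape (n, n)

# Two toy 3x3 similarity layers (e.g. similarities based on two different traits).
layer_a = np.array([[1.0, 0.8, 0.1], [0.8, 1.0, 0.2], [0.1, 0.2, 1.0]])
layer_b = np.array([[1.0, 0.3, 0.6], [0.3, 1.0, 0.4], [0.6, 0.4, 1.0]])
print(similarity_matrix_average([layer_a, layer_b], weights=[0.7, 0.3]))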

Today, digital identity management for individuals is either inconvenient and error-prone or creates undesirable lock-in effects and violates privacy and security expectations. These shortcomings inhibit the digital transformation in general and seem particularly concerning in the context of novel applications such as access control for decentralized autonomous organizations and identification in the Metaverse. Decentralized or self-sovereign identity (SSI) aims to offer a solution to this dilemma by empowering individuals to manage their digital identity through machine-verifiable attestations stored in a "digital wallet" application on their edge devices. However, when presented to a relying party, these attestations typically reveal more attributes than required and allow end users' activities to be tracked. Several academic works and practical solutions exist to reduce or avoid such excessive information disclosure, from simple selective disclosure to data-minimizing anonymous credentials based on zero-knowledge proofs (ZKPs). We first demonstrate that the SSI solutions currently built with anonymous credentials still lack essential features such as scalable revocation, certificate chaining, and integration with secure elements. We then argue that general-purpose ZKPs in the form of zk-SNARKs can appropriately address these pressing challenges. We describe our implementation and conduct performance tests on different edge devices to illustrate that the performance of zk-SNARK-based anonymous credentials is already practical. We also discuss further advantages that general-purpose ZKPs can easily provide for digital wallets, for instance, to create "designated verifier presentations" that facilitate new design options for digital identity infrastructures that were previously inaccessible because of the threat of man-in-the-middle attacks.
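To ground the baseline notion of "simple selective disclosure" mentioned above, here is a toy sketch based on salted hash commitments; it is not the zk-SNARK-based construction the paper describes, and all names and attributes in it are made up:

import hashlib
import secrets

def commit_attributes(attributes):
    # Issuer-side: one salted SHA-256 commitment per attribute (toy construction;
    # a real credential would also sign the commitment set).
    salts = {k: secrets.token_hex(16) for k in attributes}
    commitments = {k: hashlib.sha256(f"{salts[k]}|{v}".encode()).hexdigest()
                   for k, v in attributes.items()}
    return commitments, salts

def disclose(attributes, salts, keys):
    # Holder-side: reveal only the chosen attributes together with their salts.
    return {k: (attributes[k], salts[k]) for k in keys}

def verify(commitments, disclosed):
    # Relying party recomputes the hashes for the disclosed subset only.
    return all(hashlib.sha256(f"{salt}|{value}".encode()).hexdigest() == commitments[k]
               for k, (value, salt) in disclosed.items())

attrs = {"name": "Alice", "birth_year": "1990", "nationality": "DE"}
commitments, salts = commit_attributes(attrs)
# Only the nationality is revealed; the other attributes stay hidden.
assert verify(commitments, disclose(attrs, salts, ["nationality"]))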

A comprehensive mathematical model of the multiphysics flow of blood and cerebrospinal fluid (CSF) in the brain can be expressed as the coupling of a poromechanics system and Stokes' equations: the former describes fluid filtration through the cerebral tissue and the tissue's elastic response, while the latter models the flow of the CSF in the brain ventricles. This model describes the functioning of the brain's waste clearance mechanism, which has recently been discovered to play an essential role in the progression of neurodegenerative diseases. To model the interactions between different scales in the porous medium, we propose a physically consistent coupling between Multi-compartment Poroelasticity (MPE) equations and Stokes' equations. In this work, we introduce a numerical scheme for the discretization of the coupled MPE-Stokes system, employing a high-order discontinuous Galerkin method on polytopal grids to efficiently account for the geometric complexity of the domain. We analyze the stability and convergence of the space-semidiscretized formulation, prove a priori error estimates, and present a temporal discretization based on a combination of Newmark's $\beta$-method for the elastic wave equation and the $\theta$-method for the other equations of the model. Numerical simulations carried out on test cases with manufactured solutions validate the theoretical error estimates. We also present numerical results on a two-dimensional slice of a patient-specific brain geometry reconstructed from diagnostic images, to assess in practice the advantages of the proposed approach.
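For reference, in generic notation (not the paper's), the two time-stepping schemes combined here take the standard forms: Newmark's method for the second-order elastic wave equation,
$$ u_{n+1} = u_n + \Delta t\,\dot u_n + \frac{\Delta t^2}{2}\Big[(1-2\beta)\,\ddot u_n + 2\beta\,\ddot u_{n+1}\Big], \qquad \dot u_{n+1} = \dot u_n + \Delta t\Big[(1-\gamma)\,\ddot u_n + \gamma\,\ddot u_{n+1}\Big], $$
and the $\theta$-method for a first-order system $\dot y = f(t, y)$,
$$ y_{n+1} = y_n + \Delta t\Big[\theta\, f(t_{n+1}, y_{n+1}) + (1-\theta)\, f(t_n, y_n)\Big]. $$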

We consider the estimation of the cumulative hazard function, and equivalently the distribution function, from censored data under a setup that preserves the privacy of the survival database. This is achieved through an $\alpha$-locally differentially private mechanism for the failure indicators and by proposing a non-parametric kernel estimator for the cumulative hazard function that remains consistent under the privatization. Under mild conditions, we also prove lower bounds on the minimax rates of convergence and show that the proposed estimator is minimax optimal for a well-chosen bandwidth.
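A minimal sketch of one standard $\alpha$-locally differentially private mechanism for binary failure indicators, namely randomized response with a simple debiasing step (the paper's exact mechanism and estimator may differ; all numbers below are illustrative):

import numpy as np

rng = np.random.default_rng(0)

def privatize_indicators(delta, alpha):
    # Report each indicator truthfully with probability e^alpha / (1 + e^alpha),
    # flip it otherwise; this satisfies alpha-local differential privacy.
    p = np.exp(alpha) / (1.0 + np.exp(alpha))
    flip = rng.random(len(delta)) >= p
    return np.where(flip, 1 - delta, delta), p

def debias(z, p):
    # E[z | delta] = p*delta + (1-p)*(1-delta), so this correction is unbiased.
    return (z - (1.0 - p)) / (2.0 * p - 1.0)

delta = rng.integers(0, 2, size=10_000)        # true (non-private) failure indicators
z, p = privatize_indicators(delta, alpha=1.0)  # what the analyst actually observes
print(delta.mean(), debias(z, p).mean())       # debiased mean is close to the true mean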

In recent years, research involving human participants has been critical to advances in artificial intelligence (AI) and machine learning (ML), particularly in the areas of conversational, human-compatible, and cooperative AI. For example, around 12% and 6% of publications at recent AAAI and NeurIPS conferences, respectively, indicate the collection of original human data. Yet AI and ML researchers lack guidelines for ethical, transparent research practices with human participants. Fewer than one in four of these AAAI and NeurIPS papers provides details of ethical review, the collection of informed consent, or participant compensation. This paper aims to bridge this gap by exploring normative similarities and differences between AI research and related fields that involve human participants. Though psychology, human-computer interaction, and other adjacent fields offer historical lessons and helpful insights, AI research raises several specific concerns (namely, participatory design, crowdsourced dataset development, and an expansive role of corporations) that necessitate a contextual ethics framework. To address these concerns, this paper outlines a set of guidelines for ethical and transparent practice with human participants in AI and ML research. These guidelines can be found in Section 4, on pp. 4–7.

We present a deterministic constant-time local algorithm for constructing an approximately maximum flow and an approximately minimum fractional cut in multi-source multi-target networks with bounded degrees and bounded edge capacities. Locality means that the decision made about each edge depends only on its constant-radius neighborhood. We present two applications of the algorithm: one related to the Aldous-Lyons Conjecture, and the other concerning the approximation of the neighborhood distribution of graphs by bounded-size graphs. The scope of our results extends to unimodular random graphs and networks. As a corollary, we generalize the Maximum Flow Minimum Cut Theorem to unimodular random flow networks.
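To make the locality notion concrete, a small sketch (using networkx; the graph, radius, and capacities are made up) of the constant-radius ball around an edge, i.e. the only information a local rule is allowed to inspect when deciding the flow on that edge:

import networkx as nx

def edge_neighborhood(G, edge, radius):
    # Subgraph induced by all nodes within `radius` hops of either endpoint of `edge`.
    u, v = edge
    nodes = set(nx.ego_graph(G, u, radius=radius)) | set(nx.ego_graph(G, v, radius=radius))
    return G.subgraph(nodes).copy()

# Toy bounded-degree network with unit edge capacities.
G = nx.cycle_graph(10)
nx.set_edge_attributes(G, 1.0, "capacity")
ball = edge_neighborhood(G, (0, 1), radius=2)
print(sorted(ball.nodes))  # the nodes a local rule for edge (0, 1) may look at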
