
Mitigating climate change requires a transition away from fossil fuels towards renewable energy. As a result, power generation becomes more volatile and options for microgrids and islanded power-grid operation are being broadly discussed. Therefore, studying the power grids of physical islands, as a model for islanded microgrids, is of particular interest when it comes to enhancing our understanding of power-grid stability. In the present paper, we investigate the statistical properties of the power-grid frequency of three island systems: Iceland, Ireland, and the Balearic Islands. We utilise a Fokker-Planck approach to construct stochastic differential equations that describe market activities, control, and noise acting on power-grid dynamics. Using the obtained parameters we create synthetic time series of the frequency dynamics. Our main contribution is to propose two extensions of stochastic power-grid frequency models and showcase the applicability of these new models to non-Gaussian statistics, as encountered in islands.
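As a minimal illustration of the stochastic modelling approach, the baseline model in this literature reduces to an Ornstein-Uhlenbeck-type SDE for the angular frequency deviation, dω = -c₁ω dt + ε dW, which can be integrated with the Euler-Maruyama scheme. The sketch below is under the assumption of Gaussian noise and illustrative parameter values; it is not the fitted island model or its non-Gaussian extensions.

```python
import numpy as np

def simulate_frequency(c1=0.05, epsilon=2e-3, dt=1.0, n_steps=3600, seed=0):
    """Euler-Maruyama integration of the baseline SDE
        d omega = -c1 * omega dt + epsilon dW,
    a simple Ornstein-Uhlenbeck model for power-grid frequency
    deviations (c1: effective primary control, epsilon: noise amplitude;
    values here are illustrative, not estimated from island data)."""
    rng = np.random.default_rng(seed)
    omega = np.zeros(n_steps)
    for k in range(n_steps - 1):
        omega[k + 1] = (omega[k] - c1 * omega[k] * dt
                        + epsilon * np.sqrt(dt) * rng.normal())
    # Convert the angular-frequency deviation to Hz around the 50 Hz reference.
    return 50.0 + omega / (2 * np.pi)

f = simulate_frequency()  # one hour of synthetic second-resolution data
```

Extending this baseline with time-dependent drift (market activity) or non-Gaussian noise yields the kind of synthetic trajectories discussed in the abstract.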


We introduce a semi-explicit time-stepping scheme of second order for linear poroelasticity satisfying a weak coupling condition. Here, semi-explicit means that the system, which needs to be solved in each step, decouples and hence improves the computational efficiency. The construction and the convergence proof are based on the connection to a differential equation with two time delays, namely one and two times the step size. Numerical experiments confirm the theoretical results and indicate the applicability to higher-order schemes.
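A toy analogue of the semi-explicit idea (not the paper's poroelasticity discretisation) can be sketched on a linear two-field system: the coupling term in the first equation is lagged by one step size, exactly the delay structure mentioned above, so each time step requires only two decoupled scalar solves.

```python
def semi_explicit_step(u, p, tau, a=0.1):
    """One semi-explicit step for the toy weakly coupled system
        u' = -u + a*p,   p' = -p + a*u.
    The coupling term in the u-equation uses p from the previous step
    (a delay of one step size), so the two implicit solves decouple."""
    u_new = (u + tau * a * p) / (1 + tau)      # implicit in u, explicit coupling
    p_new = (p + tau * a * u_new) / (1 + tau)  # then implicit in p
    return u_new, p_new

u, p, tau = 1.0, 0.5, 0.01
for _ in range(1000):  # integrate to T = 10; the exact solution decays to zero
    u, p = semi_explicit_step(u, p, tau)
```

The weak-coupling condition of the paper corresponds here to the coupling constant `a` being small enough that the lagged term does not destroy stability.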

Numerous applications in the field of molecular communications (MC), such as healthcare systems, are often event-driven. The conventional Shannon capacity may not be the appropriate metric for assessing performance in such cases. We propose the identification (ID) capacity as an alternative metric. In particular, we consider randomized identification (RI) over the discrete-time Poisson channel (DTPC), which is typically used as a model for MC systems that utilize molecule-counting receivers. In the ID paradigm, the receiver does not aim to decode the transmitted message; instead, it only wants to determine whether a message of particular significance to it has been sent or not. In contrast to Shannon transmission codes, the size of ID codes for a discrete memoryless channel (DMC) grows doubly exponentially fast with the blocklength if randomized encoding is used. In this paper, we derive the capacity formula for RI over the DTPC subject to peak and average power constraints. Furthermore, we analyze the case of the state-dependent DTPC.
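A minimal sketch of the channel model assumed here: the receiver counts molecules, and each count is Poisson-distributed around the transmitted release rate plus a background term. The name `lambda_dark` and the constraint values are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def dtpc_output(x, lambda_dark=0.5, seed=0):
    """Sample the discrete-time Poisson channel: for transmitted
    molecule-release rates x_1..x_n, the molecule-counting receiver
    observes Y_k ~ Poisson(x_k + lambda_dark), where lambda_dark
    models background (dark) molecules."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    return rng.poisson(x + lambda_dark)

# An input respecting a peak constraint x_k <= 4 and an average-power
# constraint mean(x) <= 2 (illustrative values).
codeword = np.array([0.0, 4.0, 2.0, 0.0, 4.0, 2.0])
counts = dtpc_output(codeword)
```

In the ID setting the receiver would apply a hypothesis test to such counts, rather than decode the full codeword.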

Percolation theory investigates systems of interconnected units, their resilience to damage, and their propensity for propagation. For random networks we can solve percolation problems analytically using the generating function formalism. Yet, with the introduction of higher-order networks, the generating function calculations are becoming difficult to perform and harder to validate. Here, I illustrate the mapping of percolation in higher-order networks to percolation in chygraphs. Chygraphs are defined as a set of complexes, where complexes are hypergraphs with vertex sets in the set of complexes. In a previous work I reported the generating function formalism for percolation in chygraphs and obtained an analytical equation for the order parameter. Taking advantage of this result, I recapitulate analytical results for percolation problems in higher-order networks and report extensions to more complex scenarios using symbolic calculations. The code for symbolic calculations can be found at //github.com/av2atgh/chygraph.
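For an ordinary random graph, the generating-function calculation mentioned above reduces to a scalar self-consistency equation. A minimal sketch for the Poisson (Erdős-Rényi) degree distribution, where the order parameter S satisfies S = 1 - exp(-cS):

```python
import numpy as np

def giant_component_fraction(c, tol=1e-12, max_iter=10_000):
    """Order parameter S (giant-component fraction) for a random graph
    with Poisson degree distribution of mean degree c, from the standard
    generating-function self-consistency equation S = 1 - exp(-c*S),
    solved by fixed-point iteration."""
    S = 1.0  # start from the fully-percolating guess
    for _ in range(max_iter):
        S_new = 1.0 - np.exp(-c * S)
        if abs(S_new - S) < tol:
            break
        S = S_new
    return S_new

S_sub = giant_component_fraction(0.5)    # below the threshold c = 1: S vanishes
S_super = giant_component_fraction(2.0)  # above the threshold: S is about 0.797
```

The chygraph formalism generalises this scalar equation to systems of coupled equations, which is where symbolic calculation becomes useful.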

Geochemical mapping of risk element concentrations in soils is performed in countries around the world. It results in large datasets of high analytical quality, which can be used to identify soils that violate individual legislative limits for safe food production. However, there is a lack of advanced data mining tools suitable for sensitive exploratory analysis of big data that also respect the natural variability of soil composition. To distinguish anthropogenic contamination from natural variation, the analysis of the entire data distributions for smaller sub-areas is key. In this article, we propose a new data mining method for geochemical mapping data based on functional data analysis of probability density functions in the framework of Bayes spaces, after post-stratification of a big dataset into smaller districts. The proposed tools allow us to analyse the entire distribution, going beyond a superficial detection of extreme concentration anomalies. We illustrate the proposed methodology on a dataset gathered according to the Czech national legislation (1990--2009). Taking into account specific properties of probability density functions, together with recent results for the orthogonal decomposition of multivariate densities, enabled us to reveal real contamination patterns that had so far only been suspected in Czech agricultural soils. We process the above Czech soil composition dataset by first compartmentalising it into spatial units, in particular the districts, and by subsequently clustering these districts according to diagnostic features of their uni- and multivariate distributions at the high-concentration ends. Comparison between compartments is key to the reliable distinction of diffuse contamination. In this work, we use soil contamination by Cu-bearing pesticides as an example for empirical testing of the proposed data mining approach.
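A central tool in Bayes-space functional data analysis is the centred log-ratio (clr) transform, which maps densities to the ordinary L² space where standard functional methods apply. A minimal discretised sketch (the clipping constant `eps` and the example density are illustrative):

```python
import numpy as np

def clr(density, eps=1e-12):
    """Centred log-ratio transform of a discretised probability density:
    the standard isometry from the Bayes space of densities to L^2,
    used as the entry point for functional data analysis of densities."""
    d = np.maximum(np.asarray(density, dtype=float), eps)  # guard against zeros
    logd = np.log(d)
    return logd - logd.mean()

def inv_clr(z):
    """Map back from clr coordinates: exponentiate and renormalise."""
    d = np.exp(z - z.max())  # subtract the max for numerical stability
    return d / d.sum()

# Example: clr of a discretised log-normal-like concentration density.
grid = np.linspace(0.1, 5.0, 200)
dens = np.exp(-(np.log(grid)) ** 2 / 0.5)
dens /= dens.sum()
z = clr(dens)
```

Clustering districts then amounts to comparing such clr curves (and their multivariate analogues) rather than raw concentration summaries.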

We consider state and parameter estimation for compartmental models having both time-varying and time-invariant parameters. Although the described Bayesian computational framework is general, we focus on a specific application to the susceptible-infectious-removed (SIR) model, which describes a basic mechanism for the spread of infectious diseases through a system of coupled nonlinear differential equations. The SIR model consists of three states, namely the three compartments, and two parameters which control the coupling among the states. The deterministic SIR model with time-invariant parameters has been shown to be overly simplistic for modelling the complex long-term dynamics of disease transmission. Recognizing that certain model parameters will naturally vary in time due to seasonal trends, non-pharmaceutical interventions, and other random effects, the estimation procedure must systematically permit these time-varying effects to be captured, without unduly introducing artificial dynamics into the system. To this end, we leverage the robustness of the Markov chain Monte Carlo (MCMC) algorithm for the estimation of time-invariant parameters alongside nonlinear filters for the joint estimation of the system state and time-varying parameters. We demonstrate the performance of the framework by first considering a series of examples using synthetic data, followed by an exposition on public health data collected in the province of Ontario.
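The forward model underlying the estimation problem can be sketched in a few lines; this is a plain forward-Euler integration of the SIR equations with constant parameters (the paper treats the transmission rate as time-varying and estimates it jointly with the states), with illustrative parameter values.

```python
import numpy as np

def simulate_sir(beta=0.3, gamma=0.1, s0=0.99, i0=0.01, dt=0.1, days=160):
    """Forward-Euler integration of the SIR compartmental model
        S' = -beta*S*I,   I' = beta*S*I - gamma*I,   R' = gamma*I,
    with transmission rate beta and removal rate gamma, for population
    fractions (S + I + R = 1)."""
    n = int(days / dt)
    S, I, R = np.empty(n), np.empty(n), np.empty(n)
    S[0], I[0], R[0] = s0, i0, 1.0 - s0 - i0
    for k in range(n - 1):
        new_inf = beta * S[k] * I[k] * dt  # new infections this step
        new_rem = gamma * I[k] * dt        # new removals this step
        S[k + 1] = S[k] - new_inf
        I[k + 1] = I[k] + new_inf - new_rem
        R[k + 1] = R[k] + new_rem
    return S, I, R

S, I, R = simulate_sir()
```

In the Bayesian framework, trajectories like these are compared against noisy case counts inside the MCMC and filtering loops.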

Implicit models for magnetic coenergy have been proposed by Pera et al. to describe the anisotropic nonlinear material behavior of electrical steel sheets. This approach aims at predicting the magnetic response for any direction of excitation by interpolating measured B--H curves in the rolling and transverse directions. In an analogous manner, an implicit model for magnetic energy is proposed. We highlight some mathematical properties of these implicit models, discuss their numerical realization, outline the computation of magnetic material laws via implicit differentiation, and consider their potential use for finite element analysis in the context of nonlinear magnetostatics.
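The idea of evaluating a material law via implicit differentiation can be sketched in one dimension: invert a nonlinear B-H relation by Newton's method, then obtain the differential reluctivity dH/dB as 1/B'(H) without symbolic inversion. The saturating law below is a common illustrative form, not the model of Pera et al.

```python
import numpy as np

def H_of_B(B, mu0=4e-7 * np.pi, Bs=1.8, H0=100.0, tol=1e-12):
    """Invert the illustrative saturating law B(H) = mu0*H + Bs*tanh(H/H0)
    by Newton's method, returning H and the differential reluctivity
    dH/dB = 1 / B'(H) obtained by implicit differentiation."""
    H = B / (mu0 + Bs / H0)  # linearised initial guess
    for _ in range(100):
        f = mu0 * H + Bs * np.tanh(H / H0) - B
        df = mu0 + (Bs / H0) / np.cosh(H / H0) ** 2  # = B'(H) > 0
        step = f / df
        H -= step
        if abs(step) < tol:
            break
    dHdB = 1.0 / (mu0 + (Bs / H0) / np.cosh(H / H0) ** 2)
    return H, dHdB

H, dHdB = H_of_B(1.0)
```

In a finite element Newton loop, `dHdB` plays the role of the consistent tangent; the implicit coenergy models generalise this to anisotropic, vector-valued laws.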

We numerically investigate the generalized Steklov problem for the modified Helmholtz equation and focus on the relation between its spectrum and the geometric structure of the domain. We address three distinct aspects: (i) the asymptotic behavior of eigenvalues for polygonal domains; (ii) the dependence of the integrals of eigenfunctions on the domain symmetries; and (iii) the localization and exponential decay of Steklov eigenfunctions away from the boundary for smooth shapes and in the presence of corners. For this purpose, we implemented two complementary numerical methods to compute the eigenvalues and eigenfunctions of the associated Dirichlet-to-Neumann operator for various simply-connected planar domains. We also discuss applications of the obtained results in the theory of diffusion-controlled reactions and formulate several conjectures with relevance in spectral geometry.
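For the unit disk, the Dirichlet-to-Neumann spectrum is available in closed form and serves as a standard validation case for such numerical methods. A sketch under the convention (Δ - μ)u = 0, μ > 0: separation of variables gives radial solutions Iₙ(√μ r), so the Steklov eigenvalues are λₙ = √μ Iₙ'(√μ R)/Iₙ(√μ R).

```python
import numpy as np
from scipy.special import iv, ivp  # modified Bessel I_n and its derivative

def steklov_eigenvalues_disk(mu, R=1.0, n_max=10):
    """Steklov eigenvalues of the modified Helmholtz equation
    (Delta - mu) u = 0 on a disk of radius R. Separation of variables
    yields u_n = I_n(sqrt(mu) r) exp(i n theta), so the eigenvalues of
    the Dirichlet-to-Neumann operator are
        lambda_n = sqrt(mu) * I_n'(sqrt(mu) R) / I_n(sqrt(mu) R),
    each with multiplicity 2 for n >= 1."""
    k = np.sqrt(mu)
    return np.array([k * ivp(n, k * R) / iv(n, k * R)
                     for n in range(n_max + 1)])

lam = steklov_eigenvalues_disk(mu=1.0)
```

For large n these eigenvalues grow linearly in n/R, which is the kind of asymptotic behaviour that aspect (i) probes for polygonal domains.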

The synthesis of information deriving from complex networks is a topic of increasing relevance in ecology and environmental sciences. In particular, the aggregation of multilayer networks, i.e. network structures formed by multiple interacting networks (the layers), constitutes a fast-growing field. In several environmental applications, the layers of a multilayer network are modelled as a collection of similarity matrices describing how similar pairs of biological entities are, based on different types of features (e.g. biological traits). The present paper first discusses two main techniques for combining the multi-layered information into a single network (the so-called monoplex), namely Similarity Network Fusion (SNF) and Similarity Matrix Average (SMA). Then, the effectiveness of the two methods is tested on a real-world dataset of the relative abundance of microbial species in the ecosystems of nine glaciers (four glaciers in the Alps and five in the Andes). A preliminary clustering analysis on the monoplexes obtained with the different methods shows the emergence of a tightly connected community formed by species that are typical of cryoconite holes worldwide. Moreover, the weights assigned to the different layers by the SMA algorithm suggest that two large South American glaciers (Exploradores and Perito Moreno) are structurally different from the smaller glaciers in both Europe and South America. Overall, these results highlight the importance of integration methods in the discovery of the underlying organizational structure of biological entities in multilayer ecological networks.
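The core of the averaging approach can be sketched as a weighted entry-wise mean of the layer matrices. This is a minimal illustration of the SMA idea only; the published algorithm also determines the layer weights from the data, and SNF instead fuses the layers iteratively through message passing.

```python
import numpy as np

def similarity_matrix_average(layers, weights=None):
    """Collapse a multilayer similarity network into a monoplex by a
    (possibly weighted) entry-wise average of the layer matrices."""
    layers = np.asarray(layers, dtype=float)  # shape (L, n, n)
    if weights is None:
        weights = np.ones(len(layers))
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()         # normalise to a convex combination
    return np.tensordot(weights, layers, axes=1)

# Two toy 3-node similarity layers and a weighting favouring the first.
A = np.array([[1.0, 0.8, 0.1], [0.8, 1.0, 0.2], [0.1, 0.2, 1.0]])
B = np.array([[1.0, 0.2, 0.9], [0.2, 1.0, 0.3], [0.9, 0.3, 1.0]])
M = similarity_matrix_average([A, B], weights=[0.75, 0.25])
```

Clustering is then run on the monoplex `M`, and the learned weights themselves carry ecological information, as in the glacier comparison above.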

Surface parameterization plays a fundamental role in many science and engineering problems. In particular, as genus-0 closed surfaces are topologically equivalent to a sphere, many spherical parameterization methods have been developed over the past few decades. However, in practice, mapping a genus-0 closed surface onto a sphere may result in a large distortion due to their geometric difference. In this work, we propose a new framework for computing ellipsoidal conformal and quasi-conformal parameterizations of genus-0 closed surfaces, in which the target parameter domain is an ellipsoid instead of a sphere. By combining simple conformal transformations with different types of quasi-conformal mappings, we can easily achieve a large variety of ellipsoidal parameterizations with their bijectivity guaranteed by quasi-conformal theory. Numerical experiments are presented to demonstrate the effectiveness of the proposed framework.
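One elementary conformal building block commonly composed in such parameterization pipelines is the stereographic projection between the unit sphere and the plane. The sketch below shows only this single map; the framework in the paper reaches the ellipsoidal target by composing further conformal and quasi-conformal mappings.

```python
import numpy as np

def stereographic(p):
    """Conformal stereographic projection from the unit sphere (minus
    the north pole) to the plane: (x, y, z) -> (x, y) / (1 - z)."""
    x, y, z = p
    return np.array([x, y]) / (1.0 - z)

def inv_stereographic(q):
    """Inverse projection from the plane back to the unit sphere."""
    u, v = q
    s = u * u + v * v
    return np.array([2.0 * u, 2.0 * v, s - 1.0]) / (s + 1.0)

p = np.array([0.0, 0.6, 0.8])  # a point on the unit sphere
q = stereographic(p)           # its image in the plane
p_back = inv_stereographic(q)
```

Because the projection is conformal, angle distortion accumulated in the planar stage does not add to that of the spherical stage, which is why such maps are safe intermediate steps.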

At least two different approaches exist to define and solve statistical models for the analysis of economic systems: the typical econometric one, which interprets the Gravity Model specification as the expected link weight of an arbitrary probability distribution, and the one rooted in statistical physics, which constructs maximum-entropy distributions constrained to satisfy certain network properties. In two recent companion papers, these approaches have been successfully integrated within the framework induced by the constrained minimisation of the Kullback-Leibler divergence: specifically, two broad classes of models have been devised, i.e. the integrated and the conditional ones, defined by different probabilistic rules to place links, load them with weights, and turn them into proper econometric prescriptions. Still, the recipes adopted by the two approaches to estimate the parameters entering the definition of each model differ. In econometrics, one typically maximises a likelihood that decouples the binary and weighted parts of a model, treating the network as deterministic; to restore its random character, two alternatives exist: either solving the likelihood maximisation on each configuration of the ensemble and averaging the parameters afterwards, or averaging the likelihood function and maximising the latter. The difference between these approaches lies in the order in which the operations of averaging and maximisation are performed - a difference reminiscent of the quenched and annealed ways of averaging out disorder in spin glasses. The results of the present contribution, devoted to comparing these recipes in the case of continuous conditional network models, indicate that the annealed estimation recipe represents the best alternative to the deterministic one.
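The order-of-operations distinction can be illustrated on a toy model that is much simpler than the network models above: link weights drawn from an exponential distribution with unknown rate. In the quenched recipe each sampled configuration gets its own maximum-likelihood estimate and the estimates are averaged; in the annealed recipe the log-likelihood is averaged over the ensemble first and then maximised (for this family, an MLE on the pooled mean weight). All names and values below are illustrative.

```python
import numpy as np

def exponential_mle(w):
    """MLE of the rate of an exponential distribution: theta = 1/mean(w)."""
    return 1.0 / np.mean(w)

rng = np.random.default_rng(1)
theta_true, n, n_config = 2.0, 10, 5000

# Ensemble of sampled configurations (each row: one network's link weights).
ensemble = rng.exponential(scale=1.0 / theta_true, size=(n_config, n))

# Quenched: maximise the likelihood per configuration, then average estimates.
theta_quenched = np.mean([exponential_mle(w) for w in ensemble])

# Annealed: average the log-likelihood over the ensemble, then maximise;
# here this reduces to an MLE on the pooled mean weight.
theta_annealed = 1.0 / np.mean(ensemble)
```

Even in this toy setting, the quenched average inherits the finite-size bias of the per-configuration MLE, while the annealed estimate concentrates on the true rate, echoing the comparison carried out for the conditional network models.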
