
Amidst the climate crisis, the massive introduction of renewable energy sources has brought tremendous challenges to both the power grid and its surrounding markets. As datacenters have become ever larger and more powerful, they play an increasingly significant role in the energy arena. With their unique characteristics, datacenters have been shown to be well suited to regulating the power grid, yet they currently provide little, if any, such active response. This gap stems from issues such as an unsuitable market design, the high complexity of currently proposed solutions, and the potential risks those solutions carry. This work aims to provide individual datacenters with insights into the feasibility and profitability of directly participating in the energy market. By modelling the power system of datacenters and by conducting simulations on real-world datacenter traces, we demonstrate the substantial financial incentive for individual datacenters to directly participate in both the day-ahead and the balancing markets. In turn, we suggest a new short-term, direct scheme of market participation for individual datacenters in place of the current long-term, inactive participation. Furthermore, we develop a novel proactive dynamic voltage and frequency scaling (DVFS) scheduling algorithm that can both reduce energy consumption and save energy costs during the market participation of datacenters. In developing this scheduler, we also propose an innovative combination of machine learning methods and DVFS that can provide the power grid with indirect demand response (DR). Our experimental results strongly support that individual datacenters can and should directly participate in the energy market, both to save on energy costs and to curb their energy consumption, whilst providing the power grid with indirect DR.
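As a rough illustration of the kind of proactive DVFS scheduling described above, the sketch below couples a learned load forecast with a choice of CPU frequency. The frequency levels, capacity constant, headroom factor, and forecaster are illustrative assumptions, not the scheduler developed in the paper.

```python
# Illustrative sketch: proactive DVFS driven by a learned load forecast.
# The frequency levels, power/capacity model, and forecaster below are
# assumptions for illustration only, not the abstract's scheduler.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

FREQ_LEVELS_GHZ = [1.2, 1.8, 2.4, 3.0]   # assumed available P-states
CAPACITY_PER_GHZ = 100.0                 # assumed work units served per GHz per interval

def train_forecaster(history, lags=6):
    """Fit a simple regressor that predicts next-interval load from recent lags."""
    X = np.array([history[i:i + lags] for i in range(len(history) - lags)])
    y = np.array(history[lags:])
    return GradientBoostingRegressor().fit(X, y)

def choose_frequency(model, recent, headroom=1.1):
    """Pick the lowest frequency whose capacity covers the predicted load plus headroom."""
    predicted = model.predict(np.array(recent[-6:]).reshape(1, -1))[0]
    for f in FREQ_LEVELS_GHZ:
        if f * CAPACITY_PER_GHZ >= predicted * headroom:
            return f
    return FREQ_LEVELS_GHZ[-1]

# Example: synthetic diurnal load trace, then one scheduling decision.
t = np.arange(500)
load = 150 + 80 * np.sin(2 * np.pi * t / 96) + np.random.normal(0, 5, t.size)
forecaster = train_forecaster(load.tolist())
print("next-interval frequency:", choose_frequency(forecaster, load.tolist()), "GHz")
```

The design choice such a scheduler embodies is acting on a prediction of upcoming demand rather than reacting to the current load, which is what allows frequency, and therefore power draw, to be lowered ahead of time.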

Related content

While the digitization of power distribution grids brings many benefits, it also introduces new vulnerabilities to cyber-attacks. To maintain secure operations in the emerging threat landscape, detecting and implementing countermeasures against cyber-attacks is paramount. However, given the lack of publicly available attack data against Smart Grids (SGs) for countermeasure development, simulation-based data generation approaches offer the potential to provide the needed data foundation. Our proposed approach therefore provides flexible and scalable replication of multi-stage cyber-attacks in an SG Co-Simulation Environment (COSE). The COSE consists of an energy grid simulator, simulators for Operational Technology (OT) devices, and a network emulator for realistic IT process networks. To cover both offensive and defensive use cases in the COSE, our simulated attacker can perform network scans, find vulnerabilities, exploit them, gain administrative privileges, and execute malicious commands on OT devices. As an exemplary countermeasure, we present a built-in Intrusion Detection System (IDS) that analyzes the generated network traffic using anomaly detection with Machine Learning (ML) approaches. In this work, we provide an overview of the SG COSE, present a multi-stage attack model with the potential to disrupt grid operations, and show exemplary performance evaluations of the IDS in specific scenarios.
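For the IDS component, a minimal sketch of ML-based anomaly detection on per-flow features might look as follows; the feature set, the synthetic traffic, and the use of scikit-learn's IsolationForest are assumptions for illustration, not the COSE implementation.

```python
# Illustrative sketch: flagging anomalous network flows with an unsupervised model,
# in the spirit of the ML-based IDS described above. The flow features and the
# contamination level are assumptions, not the COSE implementation.
import numpy as np
from sklearn.ensemble import IsolationForest
from sklearn.preprocessing import StandardScaler

# Assumed per-flow features: duration, bytes sent, bytes received, packet count.
rng = np.random.default_rng(0)
benign = rng.normal(loc=[1.0, 500, 800, 20], scale=[0.2, 50, 80, 3], size=(1000, 4))
scan_like = rng.normal(loc=[0.05, 60, 0, 1], scale=[0.01, 5, 1, 0.5], size=(20, 4))

scaler = StandardScaler().fit(benign)
detector = IsolationForest(contamination=0.02, random_state=0).fit(scaler.transform(benign))

# predict() returns -1 for flows the model considers anomalous and +1 otherwise.
labels = detector.predict(scaler.transform(np.vstack([benign[:5], scan_like[:5]])))
print(labels)
```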

Increasing volatility within power transmission and distribution forces power grid operators to expand their use of communication infrastructure to monitor and control their grids. The resulting increase in communication creates a larger attack surface for malicious actors. Indeed, cyber attacks on power grids have already succeeded in causing temporary, large-scale blackouts in the recent past. In this paper, we analyze the communication infrastructure of power grids to derive the fundamental cybersecurity challenges they face. Based on these challenges, we identify a broad set of resulting attack vectors and attack scenarios that threaten the security of power grids. To address these challenges, we propose relying on a defense-in-depth strategy, which encompasses measures for (i) device and application security, (ii) network security, (iii) physical security, as well as (iv) policies, procedures, and awareness. For each of these categories, we distill and discuss a comprehensive set of state-of-the-art approaches and identify further opportunities to strengthen cybersecurity in interconnected power grids.

The growth of the Internet and its associated technologies, including digital services, has tremendously impacted our society. However, scholars have noted trends in data flow and collection, and have alleged mass surveillance and digital supremacy. To this end, nations such as Russia, China, Germany, Canada, France, and Brazil, among others, have taken steps toward changing the narrative. The question now is whether African nations should join these giants in this school of thought on digital sovereignty or fold their hands and remain on the other side of the divide. This question, among others, is the main motivation for this paper, which aims to demystify strategies for reconfiguring data infrastructure in the African context. It also highlights the benefits of digital technologies and their propensity to foster all-round development in the continent, in areas such as economic uplift, employment creation, and national security. There is therefore a need for African nations to design an appropriate blueprint to secure their digital infrastructure and the flow of data within their cyberspace. In addition, a roadmap covering the immediate, short, and long term, in accordance with the framework of African developmental goals, should be put in place to guide implementation.

The paper develops a typical management problem of a large power producer, i.e., one who can partly influence the market price. In particular, the producer routinely needs to decide how much of his generation to commit to fixed-price bilateral contracts (e.g., futures) and how much to the spot market. He also needs to plan how to distribute production across the different plants under his control. The two decisions, namely the sales mix and the generation plan, naturally interact, since the opportunity to influence the spot price depends, among other things, on the amount of energy the producer directs to the spot market. We formulate a risk management problem, namely an optimization problem that trades off the expectation and the conditional value at risk of the producer's profit. The sources of uncertainty are numerous and encompass demand, renewable generation, and the fuel costs of conventional plants. We also model the futures price endogenously, in a way that reflects the information advantage of a large power producer: the market is assumed to forecast the futures price naively, i.e., without anticipating the impact of the large producer on the spot market. The paper provides a MILP formulation of the problem and analyzes its solution through a simulation based on Spanish power market data.
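The trade-off between expectation and conditional value at risk mentioned above is commonly written as a convex combination of the two; a generic form, with weight β and confidence level α, is sketched below. This is the standard Rockafellar–Uryasev-style formulation, not necessarily the paper's exact objective.

```latex
% Generic expectation--CVaR trade-off for the producer's profit \Pi(x,\xi),
% with decision x, uncertainty \xi, weight \beta \in [0,1], and level \alpha.
\max_{x \in X} \; (1-\beta)\, \mathbb{E}_{\xi}\big[\Pi(x,\xi)\big]
  \;+\; \beta \, \mathrm{CVaR}_{\alpha}\big[\Pi(x,\xi)\big],
\qquad
\mathrm{CVaR}_{\alpha}\big[\Pi\big]
  = \max_{\eta} \Big\{ \eta - \tfrac{1}{1-\alpha}\,
      \mathbb{E}\big[(\eta - \Pi)^{+}\big] \Big\}.
```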

Providing forecasts for ultra-long time series plays a vital role in various activities, such as investment decisions, industrial production arrangements, and farm management. This paper develops a novel distributed forecasting framework that tackles the challenges of forecasting ultra-long time series using the industry-standard MapReduce framework. The proposed model combination approach facilitates distributed time series forecasting by combining the local estimators of time series models delivered by worker nodes and minimizing a global loss function. In this way, instead of unrealistically assuming that the data generating process (DGP) of an ultra-long time series stays invariant, we make assumptions only on the DGP of subseries spanning shorter time periods. We investigate the performance of the proposed approach with AutoRegressive Integrated Moving Average (ARIMA) models on a real-data application as well as numerical simulations. Compared with directly fitting the whole series with ARIMA models, our approach improves forecasting accuracy and computational efficiency for both point forecasts and prediction intervals, especially at longer forecast horizons. Moreover, we explore potential factors that may affect the forecasting performance of our approach.
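A minimal sketch of the split-fit-combine pattern for an ultra-long series is given below: the series is partitioned into subseries, the same ARIMA specification is fitted locally on each (the "map" step), and the local estimates are combined before forecasting. The simple parameter averaging used here is only a stand-in for the paper's global-loss-minimizing combination of local estimators.

```python
# Illustrative split-fit-combine workflow for an ultra-long series. The parameter
# averaging below is a crude stand-in for the paper's estimator combination.
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(1)
series = np.cumsum(rng.normal(size=5_000))      # synthetic long series
subseries = np.array_split(series, 10)          # partition across "workers"
order = (1, 1, 1)                               # assumed common specification

# Map: estimate the model locally on every subseries.
local_params = [SARIMAX(s, order=order).fit(disp=False).params for s in subseries]

# Reduce: combine the local estimators (simple average, purely for illustration).
global_params = np.mean(local_params, axis=0)

# Forecast: apply the combined parameters to the most recent subseries.
final_model = SARIMAX(subseries[-1], order=order).filter(global_params)
print(final_model.forecast(steps=10))
```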

Temporal network analysis and the time evolution of network characteristics are powerful tools for describing the changing topology of dynamic networks. This paper uses such approaches to better visualize, and to provide analytical measures for, the changes in performance we observed in Voronoi-type spatial coverage, particularly for the example of time-evolving networks with a changing number of deployed wireless sensors. Specifically, our analysis focuses on the role that different combinations of impenetrable obstacles and environmental noise play in connectivity and overall network structure. We show how the use of (i) temporal network graphs and (ii) network centrality and regularity measures illustrates the differences between various options developed to balance energy and time efficiency in network coverage. Lastly, we compare the outcomes of these measures with less abstract classification variables, such as percent area covered and cumulative distance travelled.
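To illustrate the temporal-graph bookkeeping involved, the sketch below builds snapshot graphs of a growing sensor deployment and tracks a centrality measure over time; the unit-disk connectivity rule, communication radius, and random placements are assumptions, not the deployment model studied in the paper.

```python
# Illustrative sketch: centrality across snapshots of a growing sensor network.
# The unit-disk rule and random placements are assumptions for illustration only.
import numpy as np
import networkx as nx

rng = np.random.default_rng(2)
comm_radius = 0.25

def snapshot_graph(positions):
    """Connect any two sensors within communication range (unit-disk model)."""
    g = nx.Graph()
    g.add_nodes_from(range(len(positions)))
    for i in range(len(positions)):
        for j in range(i + 1, len(positions)):
            if np.linalg.norm(positions[i] - positions[j]) <= comm_radius:
                g.add_edge(i, j)
    return g

positions = rng.random((5, 2))                  # initial deployment in the unit square
for step in range(4):                           # add sensors over time
    positions = np.vstack([positions, rng.random((3, 2))])
    g = snapshot_graph(positions)
    centrality = nx.betweenness_centrality(g)
    print(f"t={step}: nodes={g.number_of_nodes()}, "
          f"components={nx.number_connected_components(g)}, "
          f"max betweenness={max(centrality.values()):.3f}")
```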

This paper addresses the energy management of a grid-connected renewable generation plant coupled with a battery energy storage device in the capacity firming market, which is designed to promote renewable power generation facilities in small non-interconnected grids. The core contribution is a probabilistic forecast-driven strategy, modeled as a min-max-min robust optimization problem with recourse and solved in a tractable manner using a Benders-dual cutting plane algorithm and a column-and-constraint generation algorithm. A dynamic risk-averse parameter selection strategy based on the quantile forecast distribution is proposed to improve the results. A secondary contribution is the use of a recently developed deep learning model known as normalizing flows to generate quantile forecasts of renewable generation for the robust optimization problem. This technique provides a general mechanism for defining expressive probability distributions, requiring only the specification of a base distribution and a series of bijective transformations. Overall, the robust approach improves on a deterministic approach with nominal point forecasts by finding a trade-off between conservative and risk-seeking policies. The case study uses the photovoltaic generation monitored on-site at the University of Li\`ege (ULi\`ege), Belgium.
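The min-max-min structure referred to above can be written generically as a two-stage robust problem with recourse, which is the form typically handled by Benders-dual and column-and-constraint generation algorithms; the notation below is generic and not necessarily the paper's exact model.

```latex
% Generic two-stage robust problem with recourse, of the min--max--min type the
% abstract refers to: first-stage decision x, uncertain realization \xi in the
% uncertainty set \Xi (here built from quantile forecasts), and recourse y.
\min_{x \in X} \; c^{\top} x
  \;+\; \max_{\xi \in \Xi} \; \min_{y \in Y(x,\xi)} \; q^{\top} y
```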

The radio access network (RAN) part of next-generation wireless networks will require efficient solutions for satisfying low-latency and high-throughput services. The open RAN (O-RAN) is one of the candidates to achieve this goal, in addition to increasing vendor diversity and promoting openness. In the O-RAN architecture, network functions are executed in central units (CUs), distributed units (DUs), and radio units (RUs). These entities are virtualized on general-purpose CPUs and form processing pools. These pools can be located in different geographical places and have limited capacity, which affects the energy consumption and performance of the network. Additionally, since user demand is not deterministic, special attention should be paid to allocating resource blocks to users while ensuring the expected quality of service for latency-sensitive traffic flows. In this paper, we propose a joint optimization solution that enhances energy efficiency and provides delay guarantees to users in the O-RAN architecture. We formulate this novel problem and linearize it so that it can be solved with a mixed-integer linear programming (MILP) solver, and we compare it against a baseline that addresses the optimization problem using a disjoint approach. The results show that our approach outperforms the baseline method in terms of energy efficiency.
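As an illustration of the modelling style, the toy MILP below assigns O-RAN function instances to processing pools so that pool capacities and per-flow delay budgets are respected while energy cost is minimized; all pool data, delays, and energy figures are made-up numbers, and the formulation is far simpler than the joint problem in the paper.

```python
# Illustrative toy MILP: place O-RAN functions onto processing pools under
# capacity and delay-budget constraints, minimizing an energy cost.
# All numbers are invented for illustration; this is not the paper's model.
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary, value

functions = ["DU1", "DU2", "CU1"]
pools = ["edge", "regional"]
cpu_need = {"DU1": 4, "DU2": 4, "CU1": 2}        # CPU cores per function
capacity = {"edge": 8, "regional": 10}           # cores per pool
energy = {"edge": 1.0, "regional": 0.6}          # cost per core (regional is cheaper)
delay = {"edge": 1, "regional": 5}               # ms added by placing in that pool
delay_budget = {"DU1": 2, "DU2": 2, "CU1": 10}   # DUs are latency-sensitive

x = LpVariable.dicts("place", (functions, pools), cat=LpBinary)

prob = LpProblem("oran_placement", LpMinimize)
prob += lpSum(energy[p] * cpu_need[f] * x[f][p] for f in functions for p in pools)

for f in functions:
    prob += lpSum(x[f][p] for p in pools) == 1                     # placed exactly once
    prob += lpSum(delay[p] * x[f][p] for p in pools) <= delay_budget[f]
for p in pools:
    prob += lpSum(cpu_need[f] * x[f][p] for f in functions) <= capacity[p]

prob.solve()
print({f: next(p for p in pools if value(x[f][p]) > 0.5) for f in functions})
```

With these numbers, the latency-sensitive DUs end up at the edge pool and the CU moves to the cheaper regional pool, which is the kind of trade-off between energy cost and delay guarantees the abstract describes.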

The long-term goal of this work is the development of high-fidelity simulation tools for dispersive tsunami propagation. A dispersive model is especially important for short wavelength phenomena such as an asteroid impact into the ocean, and is also important in modeling other events where the simpler shallow water equations are insufficient. Adaptive simulations are crucial to bridge the scales from deep ocean to inundation, but have difficulties with the implicit system of equations that results from dispersive models. We propose a fractional step scheme that advances the solution on separate patches with different spatial resolutions and time steps. We show a simulation with 7 levels of adaptive meshes and onshore inundation resulting from a simulated asteroid impact off the coast of Washington. Finally, we discuss a number of open research questions that need to be resolved for high quality simulations.

The concept of the smart grid has been introduced as a new vision of the conventional power grid to find an efficient way of integrating green and renewable energy technologies. Along these lines, the Internet-connected smart grid, also called the energy Internet, is emerging as an innovative approach to making energy available anywhere at any time. The ultimate goal of these developments is to build a sustainable society. However, integrating and coordinating a large and growing number of connections can be a challenging issue for the traditional centralized grid system. Consequently, the smart grid is undergoing a transformation from its centralized form to a decentralized topology. Meanwhile, blockchain has features that make it a promising technology for the smart grid paradigm. In this paper, we aim to provide a comprehensive survey of blockchain applications in the smart grid. To that end, we identify the significant security challenges of smart grid scenarios that can be addressed by blockchain, and then review recent blockchain-based research addressing security issues in the smart grid area. We also summarize several related practical projects, trials, and products that have emerged recently. Finally, we discuss essential research challenges and future directions for applying blockchain to smart grid security.
