
In the era of the Internet of Things (IoT), the massive computing devices surrounding us operate and interact with each other to provide significant services in industry, healthcare, and daily-life activities at home, in offices, in the education sector, and so on. The participating devices in an IoT network usually have resource constraints and are prone to different cyber attacks, leading to loopholes in security and authentication. As a revolutionary and innovative technology applied in cryptocurrency, market prediction, and other areas, blockchain uses a distributed ledger that records transactions securely and efficiently. To exploit blockchain's great potential, both industry and academia have paid significant attention to integrating it with the IoT, as reported in the existing literature. On the other hand, Artificial Intelligence (AI) is able to embed intelligence in a system, and thus AI can be integrated with IoT devices so that they automatically adapt to different environments according to demand. Furthermore, both blockchain and AI can be integrated with the IoT to design an automated, secure, and robust IoT model, as numerous existing works have noted. In this survey, we present a discussion of the IoT, blockchain, and AI, along with descriptions of several research works that apply blockchain and AI in the IoT. In this direction, we point out the strengths and limitations of the related existing research. We also discuss different open challenges in exploiting the full capacities of blockchain and AI when designing an IoT-based model. The highlighted challenges can thus open the door to future IoT models that are intelligent and secure through the integration of blockchain and AI with the IoT.
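Since the abstract rests on the idea of a distributed ledger that records transactions in a tamper-evident way, the sketch below shows a minimal hash-chained ledger. It is only an illustration of the general mechanism (no consensus, no networking) and not a model of any specific blockchain discussed in the surveyed works.

```python
# Minimal sketch of a hash-chained ledger, illustrating the general
# tamper-evidence mechanism behind blockchain; not a model of any
# specific system in the survey (no consensus, no networking).
import hashlib
import json

def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, transactions: list) -> None:
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "prev_hash": prev, "transactions": transactions})

def verify(chain: list) -> bool:
    """A single altered transaction breaks every later prev_hash link."""
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1]) for i in range(1, len(chain))
    )

ledger: list[dict] = []
append_block(ledger, [{"device": "sensor-7", "reading": 42}])
append_block(ledger, [{"device": "actuator-3", "command": "open"}])
print(verify(ledger))                                  # True
ledger[0]["transactions"][0]["reading"] = 99
print(verify(ledger))                                  # False: tampering detected
```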

Related content

Integration, the VLSI Journal. Publisher: Elsevier.

The concept of traditional farming is changing rapidly with the introduction of smart technologies like the Internet of Things (IoT). Under the umbrella of smart agriculture, precision agriculture is gaining popularity by enabling Decision Support System (DSS)-based farming management that uses widespread IoT sensors and wireless connectivity for automated detection and optimization of resources. The success of such a system directly affects crop productivity, and its failure can have severe consequences. Like many other cyber-physical systems, one of the growing challenges in avoiding system adversity is ensuring the system's security, privacy, and trust. But which vulnerabilities, threats, and security issues should we consider when deploying precision agriculture? This paper conducts a holistic, component-level threat modeling of precision agriculture's standard infrastructure using the popular threat-modeling methodology STRIDE to identify common security issues. Our modeling identifies fifty-eight potential security threats to consider. We present them systematically and advise general mitigation strategies to support cyber security in precision agriculture.
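To show how STRIDE-style, component-level modeling can be organized, the sketch below crosses the six STRIDE categories with a few hypothetical precision-agriculture components. The component names are illustrative assumptions, not the paper's actual inventory of fifty-eight threats.

```python
# Minimal sketch of STRIDE-style, component-level threat enumeration.
# The components below are hypothetical illustrations, not the
# threat inventory reported in the paper.
STRIDE = [
    "Spoofing",
    "Tampering",
    "Repudiation",
    "Information Disclosure",
    "Denial of Service",
    "Elevation of Privilege",
]

# Hypothetical components of a precision-agriculture deployment.
components = ["soil-moisture sensor", "LoRa gateway", "cloud DSS backend"]

def enumerate_threats(components, categories):
    """Cross every component with every STRIDE category.

    Real threat modeling would prune and refine this list per component;
    here we simply produce the raw (component, category) pairs to review.
    """
    return [(c, cat) for c in components for cat in categories]

if __name__ == "__main__":
    for component, category in enumerate_threats(components, STRIDE):
        print(f"{component:>24} | {category}")
```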

To support the unprecedented growth of Internet of Things (IoT) applications and the access of a tremendous number of IoT devices, two new technologies have emerged recently to overcome the shortage of spectrum resources. The first, known as integrated sensing and communication (ISAC), aims to share spectrum bandwidth between radar sensing and data communication. The second, called over-the-air computation (AirComp), enables simultaneous transmission and computation of data from multiple IoT devices in the same frequency band. The promising performance of ISAC and AirComp motivates the current work, which develops a framework that combines the merits of both, called integrated sensing and AirComp (ISAA). Two schemes are designed to simultaneously support sensing and computation in multiple-input multiple-output (MIMO) ISAA, namely the shared and separated schemes. The performance metrics of radar sensing and AirComp are evaluated by the mean square errors of the estimated target response matrix and the received computation results, respectively. The design challenge of MIMO ISAA lies in the joint optimization of the radar sensing and data transmission beamformers at the IoT devices and the data aggregation beamformer at the server, which results in a complex non-convex problem. To solve this problem, an algorithmic solution based on semidefinite relaxation is proposed. The results reveal that the beamformer at each sensor needs to account for dual-functional signals in the shared scheme, while dedicated beamformers for sensing and AirComp are needed to mitigate the mutual interference between the two functionalities in the separated scheme. The use case of target location estimation based on ISAA is demonstrated in simulation to show the performance superiority of the proposed designs.
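As a small illustration of the two performance metrics, the sketch below computes the mean square errors of an estimated target response matrix and of a received AirComp result against their ground truths. The matrix sizes, noise levels, and the per-entry averaging convention are assumptions for illustration, not the paper's exact formulation.

```python
# Minimal sketch of the two MSE metrics used to evaluate ISAA performance.
# Dimensions and the averaging convention are illustrative assumptions,
# not the exact formulation in the paper.
import numpy as np

rng = np.random.default_rng(0)

def matrix_mse(estimate: np.ndarray, truth: np.ndarray) -> float:
    """Mean square error between two arrays (averaged per entry)."""
    return float(np.mean(np.abs(estimate - truth) ** 2))

# Radar sensing: error of the estimated target response matrix.
H_true = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
H_est = H_true + 0.1 * (rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4)))

# AirComp: error of the aggregated computation result (e.g., a sum of sensor readings).
y_ideal = rng.standard_normal(8)                       # desired function of the sensors' data
y_received = y_ideal + 0.05 * rng.standard_normal(8)   # after over-the-air aggregation

print("sensing MSE:", matrix_mse(H_est, H_true))
print("AirComp MSE:", matrix_mse(y_received, y_ideal))
```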

With the global Internet of Things (IoT) market size predicted to grow to over 1 trillion dollars in the next 5 years, many large corporations are scrambling to establish their product line as the de facto device suite for consumers. This has led each corporation to develop its devices in a siloed environment, with unique protocols and runtime frameworks that explicitly exclude the ability to work with competitors' devices. This development silo has created problems of programming complexity for application developers as well as concurrency and scalability limitations for applications that involve a network of IoT devices. The Constellation project is a distributed IoT runtime system that attempts to address these challenges by creating an operating-system layer that decouples applications from devices. This layer provides mechanisms designed to allow applications to interface with an underlying substrate of IoT devices while abstracting away the complexities of application concurrency, device interoperability, and system scalability. This paper provides an overview of the Constellation system and details four new project expansions to improve system scalability.
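To illustrate the idea of a layer that decouples applications from devices, the sketch below defines a uniform device interface and a small registry that lets an application query devices by capability rather than by vendor. The class and method names are hypothetical illustrations of the concept, not Constellation's actual API.

```python
# Minimal sketch of an OS-like layer that decouples applications from devices.
# Class and method names are hypothetical, not Constellation's actual API.
from abc import ABC, abstractmethod

class Device(ABC):
    """Uniform interface an application programs against."""

    @abstractmethod
    def read(self) -> dict: ...

class VendorAThermometer(Device):
    def read(self) -> dict:
        # Vendor-specific protocol details would be hidden here.
        return {"kind": "temperature", "celsius": 21.5}

class VendorBThermometer(Device):
    def read(self) -> dict:
        return {"kind": "temperature", "celsius": 21.7}

class Runtime:
    """Registry that lets applications discover devices by capability,
    without knowing which vendor or protocol is behind them."""

    def __init__(self):
        self._devices: list[Device] = []

    def register(self, device: Device) -> None:
        self._devices.append(device)

    def query(self, kind: str) -> list[dict]:
        return [r for d in self._devices if (r := d.read())["kind"] == kind]

runtime = Runtime()
runtime.register(VendorAThermometer())
runtime.register(VendorBThermometer())
print(runtime.query("temperature"))
```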

Stream processing has been an active research field for more than 20 years, but it is now witnessing its prime time due to recent successful efforts by the research community and numerous worldwide open-source communities. This survey provides a comprehensive overview of fundamental aspects of stream processing systems and their evolution in the functional areas of out-of-order data management, state management, fault tolerance, high availability, load management, elasticity, and reconfiguration. We review noteworthy past research findings, outline the similarities and differences between early ('00-'10) and modern ('11-'18) streaming systems, and discuss recent trends and open problems.
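As a small illustration of one of the surveyed functional areas, out-of-order data management, the sketch below buffers events and releases them only once a watermark guarantees no earlier event can still arrive. The fixed-lateness watermark policy is a simplifying assumption for illustration, not a design taken from any particular surveyed system.

```python
# Minimal sketch of out-of-order event handling with a watermark,
# one of the functional areas surveyed above. The fixed-lateness
# watermark policy is a simplifying assumption for illustration.
import heapq

class WatermarkBuffer:
    def __init__(self, max_lateness: int):
        self.max_lateness = max_lateness        # allowed event-time lateness
        self.heap: list[tuple[int, str]] = []   # (event_time, payload)
        self.max_seen = 0

    def insert(self, event_time: int, payload: str) -> list[tuple[int, str]]:
        """Buffer an event; emit all events at or below the new watermark."""
        heapq.heappush(self.heap, (event_time, payload))
        self.max_seen = max(self.max_seen, event_time)
        watermark = self.max_seen - self.max_lateness
        ready = []
        while self.heap and self.heap[0][0] <= watermark:
            ready.append(heapq.heappop(self.heap))
        return ready

buf = WatermarkBuffer(max_lateness=2)
for t, p in [(1, "a"), (4, "b"), (2, "late"), (7, "c")]:
    print(f"arrival t={t}:", buf.insert(t, p))
```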

Edge intelligence refers to a set of connected systems and devices for data collection, caching, processing, and analysis, based on artificial intelligence, in locations close to where the data is captured. The aim of edge intelligence is to enhance the quality and speed of data processing and to protect the privacy and security of the data. Although it emerged only recently, spanning the period from 2011 to now, this field of research has shown explosive growth over the past five years. In this paper, we present a thorough and comprehensive survey of the literature surrounding edge intelligence. We first identify four fundamental components of edge intelligence, namely edge caching, edge training, edge inference, and edge offloading, based on theoretical and practical results pertaining to proposed and deployed systems. We then aim for a systematic classification of the state of the solutions by examining research results and observations for each of the four components and present a taxonomy that includes practical problems, adopted techniques, and application goals. For each category, we elaborate on, compare, and analyse the literature from the perspectives of adopted techniques, objectives, performance, advantages and drawbacks, etc. This survey article provides a comprehensive introduction to edge intelligence and its application areas. In addition, we summarise the development of this emerging research field and the current state of the art, and we discuss important open issues and possible theoretical and technical solutions.
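To make one of the four components concrete, the sketch below shows a toy edge-offloading decision: a task runs locally or is offloaded to an edge server, whichever has the lower estimated latency. The simple latency model is an illustrative assumption, not a result from the surveyed literature.

```python
# Toy edge-offloading decision: run locally or offload to an edge server,
# whichever has the lower estimated latency. The simple latency model is
# an illustrative assumption, not taken from the surveyed literature.
from dataclasses import dataclass

@dataclass
class Task:
    cycles: float        # CPU cycles required
    input_bits: float    # bits to upload if offloaded

def local_latency(task: Task, device_hz: float) -> float:
    return task.cycles / device_hz

def offload_latency(task: Task, uplink_bps: float, edge_hz: float) -> float:
    return task.input_bits / uplink_bps + task.cycles / edge_hz

def decide(task: Task, device_hz: float, uplink_bps: float, edge_hz: float) -> str:
    return ("offload"
            if offload_latency(task, uplink_bps, edge_hz) < local_latency(task, device_hz)
            else "local")

task = Task(cycles=2e9, input_bits=8e6)
print(decide(task, device_hz=1e9, uplink_bps=20e6, edge_hz=10e9))  # -> "offload"
```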

The concept of the smart grid has been introduced as a new vision of the conventional power grid to work out an efficient way of integrating green and renewable energy technologies. In this way, the Internet-connected smart grid, also called the energy Internet, is emerging as an innovative approach to ensure that energy is available from anywhere at any time. The ultimate goal of these developments is to build a sustainable society. However, integrating and coordinating a large number of growing connections can be a challenging issue for the traditional centralized grid system. Consequently, the smart grid is undergoing a transformation from its centralized form to a decentralized topology. On the other hand, blockchain has some excellent features that make it a promising application for the smart grid paradigm. In this paper, we aim to provide a comprehensive survey of the application of blockchain in the smart grid. As such, we identify the significant security challenges of smart grid scenarios that can be addressed by blockchain. Then, we present a number of recent blockchain-based research works in the literature that address security issues in the area of the smart grid. We also summarize several related practical projects, trials, and products that have emerged recently. Finally, we discuss essential research challenges and future directions for applying blockchain to smart grid security issues.

Driven by the visions of the Internet of Things and 5G communications, edge computing systems integrate computing, storage, and network resources at the edge of the network to provide a computing infrastructure that enables developers to quickly develop and deploy edge applications. Nowadays, edge computing systems have received widespread attention in both industry and academia. To explore new research opportunities and assist users in selecting suitable edge computing systems for specific applications, this survey paper provides a comprehensive overview of existing edge computing systems and introduces representative projects. A comparison of open-source tools is presented according to their applicability. Finally, we highlight energy efficiency and deep-learning optimization of edge computing systems. Open issues in analyzing and designing an edge computing system are also studied in this survey.

To make deliberate progress towards more intelligent and more human-like artificial systems, we need to be following an appropriate feedback signal: we need to be able to define and evaluate intelligence in a way that enables comparisons between two systems, as well as comparisons with humans. Over the past hundred years, there has been an abundance of attempts to define and measure intelligence, across both the fields of psychology and AI. We summarize and critically assess these definitions and evaluation approaches, while making apparent the two historical conceptions of intelligence that have implicitly guided them. We note that in practice, the contemporary AI community still gravitates towards benchmarking intelligence by comparing the skill exhibited by AIs and humans at specific tasks such as board games and video games. We argue that solely measuring skill at any given task falls short of measuring intelligence, because skill is heavily modulated by prior knowledge and experience: unlimited priors or unlimited training data allow experimenters to "buy" arbitrary levels of skills for a system, in a way that masks the system's own generalization power. We then articulate a new formal definition of intelligence based on Algorithmic Information Theory, describing intelligence as skill-acquisition efficiency and highlighting the concepts of scope, generalization difficulty, priors, and experience. Using this definition, we propose a set of guidelines for what a general AI benchmark should look like. Finally, we present a benchmark closely following these guidelines, the Abstraction and Reasoning Corpus (ARC), built upon an explicit set of priors designed to be as close as possible to innate human priors. We argue that ARC can be used to measure a human-like form of general fluid intelligence and that it enables fair general intelligence comparisons between AI systems and humans.
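The schematic below is a paraphrase of the kind of quantity the abstract describes, written only to show how the named ingredients (scope, generalization difficulty, priors, experience) could enter a skill-acquisition-efficiency measure. It is explicitly not the paper's exact formal definition.

```latex
% Schematic paraphrase of "intelligence as skill-acquisition efficiency".
% This is NOT the paper's exact definition; it only illustrates how the
% quantities named in the abstract (scope, generalization difficulty,
% priors, experience) could enter such a measure.
\[
  I_{\text{system}} \;\approx\;
  \operatorname*{Avg}_{T \in \text{scope}}
  \left[
    \frac{\text{GeneralizationDifficulty}_T}
         {\text{Priors}_T + \text{Experience}_T}
  \right]
\]
```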

Within the rapidly developing Internet of Things (IoT), numerous and diverse physical devices, edge devices, cloud infrastructure, and their quality of service (QoS) requirements need to be represented within a unified specification in order to enable rapid IoT application development, monitoring, and dynamic reconfiguration. However, heterogeneities among different configuration knowledge representation models pose limitations on the acquisition, discovery, and curation of configuration knowledge for coordinated IoT applications. This paper proposes a unified data model to represent IoT resource configuration knowledge artifacts. It also proposes IoT-CANE (Context-Aware recommendatioN systEm) to facilitate incremental knowledge acquisition and declarative, context-driven knowledge recommendation.
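As a small illustration of what a unified configuration data model could look like, the sketch below defines hypothetical resource and QoS records that describe devices, edge nodes, and cloud services uniformly. The field names and enum values are assumptions for illustration, not IoT-CANE's actual schema.

```python
# Minimal sketch of a unified data model for IoT resource configuration.
# Field names and enum values are hypothetical illustrations,
# not IoT-CANE's actual schema.
from dataclasses import dataclass, field
from enum import Enum

class Tier(Enum):
    DEVICE = "device"
    EDGE = "edge"
    CLOUD = "cloud"

@dataclass
class QoS:
    max_latency_ms: float
    min_availability: float      # e.g. 0.999

@dataclass
class Resource:
    name: str
    tier: Tier
    capabilities: list[str] = field(default_factory=list)
    qos: QoS | None = None

catalogue = [
    Resource("temp-sensor-01", Tier.DEVICE, ["temperature"], QoS(500, 0.99)),
    Resource("edge-gw-east", Tier.EDGE, ["aggregation", "filtering"], QoS(50, 0.999)),
    Resource("cloud-analytics", Tier.CLOUD, ["batch-analytics"], QoS(2000, 0.9999)),
]

# A recommendation step could then filter this catalogue by context,
# e.g. all resources that satisfy a latency bound:
print([r.name for r in catalogue if r.qos and r.qos.max_latency_ms <= 500])
```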

The field of Multi-Agent System (MAS) is an active area of research within Artificial Intelligence, with an increasingly important impact in industrial and other real-world applications. Within a MAS, autonomous agents interact to pursue personal interests and/or to achieve common objectives. Distributed Constraint Optimization Problems (DCOPs) have emerged as one of the prominent agent architectures to govern the agents' autonomous behavior, where both algorithms and communication models are driven by the structure of the specific problem. During the last decade, several extensions to the DCOP model have enabled them to support MAS in complex, real-time, and uncertain environments. This survey aims at providing an overview of the DCOP model, giving a classification of its multiple extensions and addressing both resolution methods and applications that find a natural mapping within each class of DCOPs. The proposed classification suggests several future perspectives for DCOP extensions, and identifies challenges in the design of efficient resolution algorithms, possibly through the adaptation of strategies from different areas.
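To make the DCOP model concrete, the sketch below encodes a toy instance (three agent-owned binary variables and a few soft constraints) and finds the optimal assignment by exhaustive search. Real DCOP algorithms such as DPOP or Max-Sum solve this distributedly among the agents; the brute-force version only shows what is being optimized.

```python
# Toy DCOP instance solved by exhaustive search. This only illustrates
# what a DCOP optimizes; actual DCOP algorithms (e.g. DPOP, Max-Sum)
# solve it in a distributed fashion among the agents.
from itertools import product

variables = ["x1", "x2", "x3"]           # one variable per agent
domain = [0, 1]

# Soft constraints expressed as a total cost to minimize.
def cost(assignment: dict) -> int:
    c = 0
    c += 2 if assignment["x1"] == assignment["x2"] else 0   # prefer x1 != x2
    c += 1 if assignment["x2"] != assignment["x3"] else 0   # prefer x2 == x3
    c += assignment["x1"]                                   # prefer x1 == 0
    return c

best = min(
    (dict(zip(variables, values)) for values in product(domain, repeat=len(variables))),
    key=cost,
)
print("optimal assignment:", best, "cost:", cost(best))
```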
