
Membrane systems are a biologically inspired computational model based on the structure of biological cells and the way chemicals interact with and traverse their membranes. Although their dynamics are described by rules, encoding membrane systems into rewriting logic is not straightforward because of their complex control mechanisms. Multiple alternatives have been proposed in the literature and implemented in the Maude specification language. The recent release of the Maude strategy language and its associated strategy-aware model checker allows these systems to be specified more easily, so that they become executable and verifiable for free. An easily extensible interactive environment transforms membrane specifications into rewrite theories controlled by appropriate strategies, and allows simulating and verifying membrane computations by means of them.
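The rule-based, maximally parallel dynamics mentioned above can be illustrated with a toy multiset-rewriting step. This is a minimal sketch in Python, not the Maude encoding the abstract describes; the rule format and function names are invented for illustration, and products of a step become available only in the next step, as in standard membrane-computing semantics.

```python
from collections import Counter

def applicable(rule_lhs, contents):
    """How many times a rule's left-hand side fits in the membrane contents."""
    return min(contents[s] // n for s, n in rule_lhs.items())

def max_parallel_step(contents, rules):
    """Apply every rule as many times as possible in a fixed order: one
    deterministic resolution of the maximally parallel semantics. Objects
    produced in this step are kept aside until the step ends."""
    contents = Counter(contents)
    produced = Counter()
    for lhs, rhs in rules:
        k = applicable(lhs, contents)
        for s, n in lhs.items():
            contents[s] -= n * k
        for s, n in rhs.items():
            produced[s] += n * k
    return contents + produced

# Two illustrative rules: a^2 -> b and b -> c
rules = [(Counter("aa"), Counter("b")), (Counter("b"), Counter("c"))]
state = Counter("aaaab")                 # membrane holds a^4 b
state = max_parallel_step(state, rules)
print(sorted(state.elements()))          # a^4 became b^2, the old b became c
```

In a full encoding, strategies would control which of several maximal rule assignments is chosen; here the fixed rule order picks one of them.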


February 27, 2024

Fast and efficient simulations of metal additive manufacturing (AM) processes are highly relevant to exploring the full potential of this promising manufacturing technique. The microstructure composition plays an important role in characterizing the part quality and deriving mechanical properties. When complete parts are simulated, one often must resort to strong simplifications, such as layer-wise heating, because the small time step sizes would otherwise require an enormous number of simulated time steps. This article proposes a scan-resolved approach to the coupled thermo-microstructural problem. Building on a highly efficient thermal model, we discuss the implementation of a phenomenological microstructure model for the evolution of the three main constituents of Ti-6Al-4V: stable $\alpha_s$-phase, martensite $\alpha_m$-phase and $\beta$-phase. The implementation is tailored to modern hardware features using vectorization and fast approximations of transcendental functions. A performance model and numerical examples verify the high degree of optimization. We demonstrate the applicability and predictive power of the approach and the influence of scan strategy and geometry. Depending on the specific example, results can be obtained with moderate computational resources in a few hours to days. The numerical examples include a prediction of the microstructure on the full NIST AM Benchmark cantilever specimen.
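The "fast approximations of transcendental functions" ingredient can be sketched with a classic cheap substitute for the exponential, which dominates Arrhenius-type phase-evolution laws. This is an illustrative stand-in, not the article's actual implementation: it uses the identity exp(x) ≈ (1 + x/2^n)^(2^n), evaluated with n squarings instead of a library call.

```python
import math

def fast_exp(x, n=10):
    """Approximate exp(x) as (1 + x/2^n)^(2^n) using n squarings.
    Cheap and branch-free, at the cost of a small relative error."""
    y = 1.0 + x / (1 << n)
    for _ in range(n):
        y *= y
    return y

# Relative error stays small over a moderate argument range.
for x in (-2.0, -0.5, 0.0, 1.0, 3.0):
    rel_err = abs(fast_exp(x) - math.exp(x)) / math.exp(x)
    print(f"x = {x:+.1f}  relative error = {rel_err:.2e}")
```

In a vectorized production kernel the same idea would be applied lane-wise over SIMD registers; the relative error here scales roughly like x²/2ⁿ⁺¹, so n trades accuracy for speed.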

Anthropomorphic social bots are engineered to emulate human verbal communication and generate toxic or inflammatory content across social networking services (SNSs). Bot-disseminated misinformation could subtly yet profoundly reshape societal processes by complexly interweaving factors like repeated disinformation exposure, amplified political polarization, compromised indicators of democratic health, shifted perceptions of national identity, propagation of false social norms, and manipulation of collective memory over time. However, bots' pluripotency across hybridized, multilingual, and heterogeneous media ecologies remains largely unknown, since it cannot be extrapolated from isolated SNS analyses; this underscores the need for a comprehensive framework to characterise bots' emergent risks to civic discourse. Here we propose an interdisciplinary framework to characterise bots' pluripotency, incorporating quantification of influence, network dynamics monitoring, and interlingual feature analysis. When applied to the geopolitical discourse around the Russo-Ukrainian conflict, results from interlanguage toxicity profiling and network analysis elucidated spatiotemporal trajectories of pro-Russian and pro-Ukrainian humans and bots across hybrid SNSs. Weaponized bots predominantly inhabited X, while humans primarily populated Reddit in the social media warfare. This rigorous framework promises to elucidate interlingual homogeneity and heterogeneity in bots' pluripotent behaviours, revealing synergistic human-bot mechanisms underlying regimes of information manipulation, echo chamber formation, and collective memory manifestation in algorithmically structured societies.

We propose a numerical algorithm for the computation of multi-marginal optimal transport (MMOT) problems involving general measures that are not necessarily discrete. By developing a relaxation scheme in which marginal constraints are replaced by finitely many linear constraints and by proving a specifically tailored duality result for this setting, we approximate the MMOT problem by a linear semi-infinite optimization problem. Moreover, we are able to recover a feasible and approximately optimal solution of the MMOT problem, and its sub-optimality can be controlled to be arbitrarily close to 0 under mild conditions. The developed relaxation scheme leads to a numerical algorithm which can compute a feasible approximate optimizer of the MMOT problem whose theoretical sub-optimality can be chosen to be arbitrarily small. Besides the approximate optimizer, the algorithm is also able to compute both an upper bound and a lower bound on the optimal value of the MMOT problem. The difference between the computed bounds provides an explicit upper bound on the sub-optimality of the computed approximate optimizer. Through a numerical example, we demonstrate that the proposed algorithm is capable of computing a high-quality solution of an MMOT problem involving as many as 50 marginals along with an explicit estimate of its sub-optimality that is much less conservative compared to the theoretical estimate.
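The bounding idea in the abstract, where a feasible coupling yields an upper bound, a dual-feasible function yields a lower bound, and their gap controls the sub-optimality of the computed solution, can be sketched in the simplest possible setting: two equal-weight discrete marginals on the line with cost |x − y|. The data below are made up, and this toy is far simpler than the semi-infinite relaxation of the paper.

```python
def upper_bound_sorted_coupling(xs, ys):
    """Cost of the monotone (sorted) coupling of two equal-weight empirical
    measures: feasible, hence an upper bound on the optimal value (and in
    fact optimal for the cost |x - y| in one dimension)."""
    return sum(abs(x - y) for x, y in zip(sorted(xs), sorted(ys))) / len(xs)

def lower_bound_dual(xs, ys):
    """Kantorovich dual lower bound from the 1-Lipschitz test function
    f(x) = x: |E_mu[f] - E_nu[f]| <= W_1(mu, nu)."""
    return abs(sum(xs) / len(xs) - sum(ys) / len(ys))

xs = [0.0, 1.0, 2.0, 5.0]
ys = [1.0, 1.5, 3.0, 6.0]
ub = upper_bound_sorted_coupling(xs, ys)
lb = lower_bound_dual(xs, ys)
print(f"lower bound {lb:.3f} <= optimal value <= upper bound {ub:.3f}")
```

Here the two bounds coincide at 0.875, so the gap certifies that the sorted coupling is exactly optimal; in general, a nonzero gap is an explicit bound on the sub-optimality of the computed coupling, which is the role the bounds play in the paper's algorithm.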

Many biological systems such as cell aggregates, tissues or bacterial colonies behave as unconventional systems of particles that are strongly constrained by volume exclusion and shape interactions. Understanding how these constraints lead to macroscopic self-organized structures is a fundamental question in e.g. developmental biology. To this end, various types of computational models have been developed: phase fields, cellular automata, vertex models, level-set, finite element simulations, etc. We introduce a new framework based on optimal transport theory to model particle systems with arbitrary dynamical shapes and deformability. Our method builds upon the pioneering work of Brenier on incompressible fluids and its recent applications to materials science. It lets us specify the shapes of individual cells and supports a wide range of interaction mechanisms, while automatically taking care of the volume exclusion constraint at an affordable numerical cost. We showcase the versatility of this approach by reproducing several classical systems in computational biology. Our Python code is freely available at: www.github.com/antoinediez/ICeShOT

Diffusion magnetic resonance imaging is sensitive to the microstructural properties of brain tissue. However, estimating clinically and scientifically relevant microstructural properties from the measured signals remains a highly challenging inverse problem that machine learning may help solve. This study investigated if recently developed rotationally invariant spherical convolutional neural networks can improve microstructural parameter estimation. We trained a spherical convolutional neural network to predict the ground-truth parameter values from efficiently simulated noisy data and applied the trained network to imaging data acquired in a clinical setting to generate microstructural parameter maps. Our network performed better than the spherical mean technique and multi-layer perceptron, achieving higher prediction accuracy than the spherical mean technique with less rotational variance than the multi-layer perceptron. Although we focused on a constrained two-compartment model of neuronal tissue, the network and training pipeline are generalizable and can be used to estimate the parameters of any Gaussian compartment model. To highlight this, we also trained the network to predict the parameters of a three-compartment model that enables the estimation of apparent neural soma density using tensor-valued diffusion encoding.

The keyhole phenomenon is widely observed in laser materials processing, including laser welding, remelting, cladding, drilling, and additive manufacturing. Keyhole-induced defects, primarily pores, dramatically affect the performance of final products, impeding the broad use of these laser-based technologies. The formation of these pores is typically associated with the dynamic behavior of the keyhole. So far, the accurate characterization and prediction of keyhole features, particularly keyhole depth, as a function of time has been a challenging task. In situ characterization of keyhole dynamic behavior using synchrotron X-rays is complicated and expensive. Current simulations are hindered by their poor accuracy in predicting keyhole depths due to the lack of real-time laser absorptance data. Here, we develop a machine learning-aided simulation method that allows us to accurately predict keyhole depth over a wide range of processing parameters. Taking titanium and aluminum alloys, two commonly used engineering materials, as examples, we achieve an accuracy with an error margin of 10%, surpassing that of other existing models (whose error margins range from 50% to 200%). Our machine learning-aided simulation method is affordable and readily deployable for a large variety of materials, opening new doors to eliminate or reduce defects for a wide range of laser materials processing techniques.
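The learning-aided ingredient, replacing missing real-time absorptance data with a model fitted to observations, can be sketched with a deliberately simple surrogate. Everything below is hypothetical: a made-up linear dependence of absorptance on keyhole aspect ratio, synthetic noisy observations, and a closed-form least-squares fit standing in for whatever learned model the authors actually use.

```python
import random

random.seed(0)
true_a, true_b = 0.35, 0.08                       # made-up ground truth
xs = [0.5 + 0.25 * i for i in range(20)]          # keyhole aspect ratios
ys = [true_a + true_b * x + random.gauss(0, 0.01) for x in xs]  # noisy data

# Closed-form simple linear regression: absorptance ~ a + b * aspect_ratio
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
a = my - b * mx
print(f"fitted absorptance model: {a:.3f} + {b:.3f} * aspect_ratio")
```

A thermal simulation could then query this surrogate at each time step instead of assuming a constant absorptance, which is the structural role the learned model plays in the method described above.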

Some mechanical systems that are modeled to have inelastic collisions nonetheless possess energy-conserving intermittent-contact solutions, known as collisionless solutions. Such a solution, representing persistent hopping or walking across level ground, may be important for understanding animal locomotion or for designing efficient walking machines. So far, collisionless motion has been analytically studied in simple two degrees of freedom (DOF) systems, or in a system that decouples into 2-DOF subsystems in the harmonic approximation. In this paper we extend the consideration to an N-DOF system, recovering the known solutions as the special N = 2 case of the general formulation. We show that in the harmonic approximation the collisionless solution is determined by the spectrum of the system. We formulate a solution existence condition, which requires the presence of at least one oscillating normal mode in the most constrained phase of the motion. An application of the developed general framework is illustrated by finding a collisionless solution for a rocking motion of a biped with an armed standing torso.
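The spectral existence condition above can be made concrete with a hypothetical 2-DOF example: for a linearized system M x'' + K x = 0, the normal-mode frequencies are the square roots of the eigenvalues of M⁻¹K, and an oscillating mode is one with a strictly positive eigenvalue. The masses and stiffnesses below are invented, and this checks only the generic spectral computation, not the paper's contact phases.

```python
import math

m1, m2 = 1.0, 2.0          # masses (illustrative)
k1, k2 = 3.0, 1.0          # springs: ground-to-m1 and m1-to-m2

# M^{-1} K for the two-mass chain:
#   x1'' = -((k1 + k2) x1 - k2 x2) / m1,  x2'' = -(k2 x2 - k2 x1) / m2
a = (k1 + k2) / m1
b = -k2 / m1
c = -k2 / m2
d = k2 / m2

# Eigenvalues of the 2x2 matrix [[a, b], [c, d]] via trace and determinant
tr, det = a + d, a * d - b * c
disc = math.sqrt(tr * tr - 4.0 * det)
lams = sorted([(tr - disc) / 2.0, (tr + disc) / 2.0])
freqs = [math.sqrt(l) for l in lams]
print("normal-mode frequencies:", [round(f, 4) for f in freqs])
assert all(l > 0 for l in lams)   # both modes oscillate for these parameters
```

In the constrained phases of a hopping motion, rows and columns of M⁻¹K corresponding to the locked coordinates would be removed before this computation, and the existence condition asks for at least one positive eigenvalue of the reduced matrix.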

Interpolation of data on non-Euclidean spaces is an active research area fostered by its numerous applications. This work considers the Hermite interpolation problem: finding a sufficiently smooth manifold curve that interpolates a collection of data points on a Riemannian manifold while matching a prescribed derivative at each point. We propose a novel procedure relying on the general concept of retractions to solve this problem on a large class of manifolds, including those for which computing the Riemannian exponential or logarithmic maps is not straightforward, such as the manifold of fixed-rank matrices. We analyze the well-posedness of the method by introducing and showing the existence of retraction-convex sets, a generalization of geodesically convex sets. We extend to the manifold setting a classical result on the asymptotic interpolation error of Hermite interpolation. We finally illustrate these results and the effectiveness of the method with numerical experiments on the manifold of fixed-rank matrices and the Stiefel manifold of matrices with orthonormal columns.
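In the Euclidean special case, where the retraction is simply x + v, the method reduces to classical cubic Hermite interpolation, which makes for a useful baseline sketch. The data points and derivatives below are arbitrary illustrative numbers.

```python
def hermite(p0, v0, p1, v1, t):
    """Cubic Hermite interpolant on [0, 1] matching values p0, p1 and
    derivatives v0, v1 at the endpoints (standard Hermite basis)."""
    h00 = 2 * t**3 - 3 * t**2 + 1
    h10 = t**3 - 2 * t**2 + t
    h01 = -2 * t**3 + 3 * t**2
    h11 = t**3 - t**2
    return h00 * p0 + h10 * v0 + h01 * p1 + h11 * v1

# The interpolant matches the prescribed endpoint values...
assert hermite(1.0, 2.0, 4.0, -1.0, 0.0) == 1.0
assert hermite(1.0, 2.0, 4.0, -1.0, 1.0) == 4.0

# ...and (numerically) the prescribed derivative at t = 0.
eps = 1e-6
d0 = (hermite(1.0, 2.0, 4.0, -1.0, eps) - hermite(1.0, 2.0, 4.0, -1.0, 0.0)) / eps
print(round(d0, 3))
```

On a manifold, the scalar additions above are replaced by retraction steps and the derivatives live in tangent spaces, which is exactly where the retraction-convex sets introduced in the paper are needed to guarantee well-posedness.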

Dose-finding trials are a key component of the drug development process and rely on a statistical design to help inform dosing decisions. Trialists wishing to choose a design require knowledge of the operating characteristics of competing methods. This is often assessed using a large-scale simulation study with multiple designs and configurations investigated, which can be time-consuming and therefore limits the scope of the simulation. We introduce a new approach to the design of simulation studies of dose-finding trials. The approach simulates all potential outcomes that individuals could experience at each dose level in the trial. Datasets are simulated in advance and then the same datasets are applied to each of the competing methods to enable a more efficient head-to-head comparison. In two case studies we show sizeable reductions in Monte Carlo error for comparing a performance metric between two competing designs. Efficiency gains depend on the similarity of the designs. Comparing two Phase I/II design variants, with high correlation of recommending the same optimal biologic dose, we show that the new approach requires a simulation study that is approximately 30 times smaller than the conventional approach. Furthermore, advance-simulated trial datasets can be reused to assess the performance of designs across multiple configurations. We recommend researchers consider this more efficient simulation approach in their dose-finding studies and we have updated the R package escalation to help facilitate implementation.
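The advance-simulation idea can be sketched in a few lines: draw each patient's potential outcome at every dose once, then feed the same outcome table to each competing dose-selection rule, so the designs are compared on common random numbers. The true toxicity curve and the two escalation rules below are toy inventions for illustration, not real dose-finding designs or the escalation package's API.

```python
import random

random.seed(1)
TRUE_TOX = [0.05, 0.15, 0.30, 0.50]     # illustrative true toxicity per dose
N_PATIENTS, N_TRIALS, TARGET = 30, 200, 0.25

def simulate_outcome_table():
    """Row i, column d: 1 if patient i would experience toxicity at dose d.
    Every potential outcome is drawn up front, once per simulated trial."""
    return [[random.random() < p for p in TRUE_TOX] for _ in range(N_PATIENTS)]

def select_dose(table, step_up_threshold):
    """Toy rule: escalate whenever the observed toxicity rate at the current
    dose stays below the threshold; return the final dose."""
    dose, tox, n = 0, 0, 0
    for row in table:
        tox += row[dose]
        n += 1
        if tox / n < step_up_threshold and dose < len(TRUE_TOX) - 1:
            dose, tox, n = dose + 1, 0, 0
    return dose

target_dose = TRUE_TOX.index(0.30)      # dose closest to the 0.25 target
hits_a = hits_b = 0
for _ in range(N_TRIALS):
    table = simulate_outcome_table()                   # simulated once...
    hits_a += select_dose(table, 0.20) == target_dose  # ...then reused by
    hits_b += select_dose(table, 0.30) == target_dose  # both toy designs
print(f"design A: {hits_a / N_TRIALS:.2f}, design B: {hits_b / N_TRIALS:.2f}")
```

Because both rules see identical tables, their per-trial hits are positively correlated, which shrinks the Monte Carlo error of the estimated difference in performance relative to simulating each design independently: the mechanism behind the roughly 30-fold reduction reported above.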

Validation metrics are key for the reliable tracking of scientific progress and for bridging the current chasm between artificial intelligence (AI) research and its translation into practice. However, increasing evidence shows that particularly in image analysis, metrics are often chosen inadequately in relation to the underlying research problem. This could be attributed to a lack of accessibility of metric-related knowledge: While taking into account the individual strengths, weaknesses, and limitations of validation metrics is a critical prerequisite to making educated choices, the relevant knowledge is currently scattered and poorly accessible to individual researchers. Based on a multi-stage Delphi process conducted by a multidisciplinary expert consortium as well as extensive community feedback, the present work provides the first reliable and comprehensive common point of access to information on pitfalls related to validation metrics in image analysis. Focusing on biomedical image analysis but with the potential of transfer to other fields, the addressed pitfalls generalize across application domains and are categorized according to a newly created, domain-agnostic taxonomy. To facilitate comprehension, illustrations and specific examples accompany each pitfall. As a structured body of information accessible to researchers of all levels of expertise, this work enhances global comprehension of a key topic in image analysis validation.
