Soft robotics is attractive for wearable applications that require conformal interactions with the human body. Soft wearable robotic garments hold promise for supplying dynamic compression or massage therapies of the kind applied for disorders affecting lymphatic and blood circulation. In this paper, we present a wearable robot capable of supplying dynamic compression and massage therapy via peristaltic motion of finger-sized soft, fluidic actuators. We show that this peristaltic wearable robot can supply dynamic compression pressures exceeding 22 kPa at frequencies of 14 Hz or more, meeting the requirements for compression and massage therapy. A large variety of software-programmable compression wave patterns can be generated by varying frequency, amplitude, phase delay, and duration parameters. We first demonstrate the utility of this peristaltic wearable robot for compression therapy, showing fluid transport in a laboratory model of the upper limb. We theoretically and empirically identify driving regimes that optimize fluid transport. We then demonstrate the utility of this garment for dynamic massage therapy. These findings show the potential of such a wearable robot for the treatment of several health disorders associated with lymphatic and blood circulation, such as lymphedema and blood clots.
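As an illustration of how such software-programmable wave patterns can be parameterized, the following minimal Python sketch generates phase-delayed sinusoidal pressure commands for a chain of actuators. The function name, actuator count, and default values are illustrative assumptions, not taken from the paper.

```python
# Illustrative sketch (not the authors' controller): generating a peristaltic
# pressure pattern by phase-delaying a command wave across actuators.
import numpy as np

def peristaltic_pattern(n_actuators=8, amplitude_kpa=22.0, freq_hz=14.0,
                        phase_delay_s=0.02, duration_s=1.0, dt=0.001):
    """Return (time, pattern) where pattern is a (time, n_actuators) array in kPa."""
    t = np.arange(0.0, duration_s, dt)
    pattern = np.zeros((t.size, n_actuators))
    for k in range(n_actuators):
        # Each actuator follows the same wave, shifted by a per-actuator delay,
        # so the pressure peak travels along the limb like a peristaltic wave.
        phase = 2.0 * np.pi * freq_hz * (t - k * phase_delay_s)
        pattern[:, k] = 0.5 * amplitude_kpa * (1.0 + np.sin(phase))
    return t, pattern

t, p = peristaltic_pattern()
print(p.shape, round(p.max(), 1))   # (1000, 8), peak pressure ~22.0 kPa
```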
The number of mobile robots with constrained computing resources that need to execute complex machine learning models has been increasing over the past decade. Commonly, these robots rely on edge infrastructure, accessible over wireless communication, to execute computationally heavy tasks. However, the edge might become unavailable, forcing the tasks to be executed on the robot itself. This work focuses on making such on-robot execution possible by reducing the complexity and the total number of parameters of pre-trained computer vision models, using model compression techniques such as pruning and knowledge distillation. These compression techniques have strong theoretical and practical foundations, but their combined usage has not been widely explored in the literature. This work therefore focuses on investigating the effects of combining these two compression techniques. The results reveal that up to 90% of the total number of parameters of a computer vision model can be removed without any considerable reduction in the model's accuracy.
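A minimal sketch of how the two techniques can be combined is shown below, assuming a torchvision ResNet-18 stands in for the pre-trained computer vision model; the paper's actual models, pruning ratios, and fine-tuning schedules may differ.

```python
# Sketch: magnitude pruning followed by knowledge distillation (PyTorch).
import torch
import torch.nn.functional as F
from torch.nn.utils import prune
from torchvision import models

teacher = models.resnet18(weights="IMAGENET1K_V1").eval()   # original model
student = models.resnet18(weights="IMAGENET1K_V1")          # copy to be pruned

# 1) Pruning: remove 90% of the weights (by L1 magnitude) in every conv layer.
for module in student.modules():
    if isinstance(module, torch.nn.Conv2d):
        prune.l1_unstructured(module, name="weight", amount=0.9)

# 2) Knowledge distillation: recover accuracy by training the pruned student
#    to match the soft predictions of the unpruned teacher.
def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    soft = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                    F.softmax(teacher_logits / T, dim=1),
                    reduction="batchmean") * T * T
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Inside the fine-tuning loop (data loader omitted):
#   with torch.no_grad():
#       t_logits = teacher(images)
#   loss = distillation_loss(student(images), t_logits, labels)
```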
We propose a doubly robust approach to characterizing treatment effect heterogeneity in observational studies. We develop a frequentist inferential procedure that utilizes posterior distributions for both the propensity score and outcome regression models to provide valid inference on the conditional average treatment effect even when high-dimensional or nonparametric models are used. We show that our approach leads to conservative inference in finite samples or under model misspecification, and provides a consistent variance estimator when both models are correctly specified. In simulations, we illustrate the utility of these results in difficult settings such as high-dimensional covariate spaces or highly flexible models for the propensity score and outcome regression. Lastly, we analyze environmental exposure data from NHANES to identify how the effects of these exposures vary by subject-level characteristics.
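For readers unfamiliar with the doubly robust construction, the following Python sketch shows a generic AIPW-style pseudo-outcome for the conditional average treatment effect; the paper's inferential procedure additionally draws from posterior distributions of the propensity score and outcome models, which is omitted here, and the sklearn estimators are placeholders.

```python
# Generic doubly robust (AIPW) pseudo-outcome sketch for CATE estimation.
import numpy as np
from sklearn.linear_model import LogisticRegressionCV, LassoCV

def dr_pseudo_outcomes(X, A, Y):
    """X: covariates, A: binary treatment indicator, Y: outcome."""
    e = LogisticRegressionCV(max_iter=1000).fit(X, A).predict_proba(X)[:, 1]  # propensity
    mu1 = LassoCV().fit(X[A == 1], Y[A == 1]).predict(X)                      # E[Y|X, A=1]
    mu0 = LassoCV().fit(X[A == 0], Y[A == 0]).predict(X)                      # E[Y|X, A=0]
    # Doubly robust pseudo-outcome: consistent if either the propensity score
    # model or the outcome regression model is correctly specified.
    return mu1 - mu0 + A * (Y - mu1) / e - (1 - A) * (Y - mu0) / (1 - e)

# Regressing the pseudo-outcomes on X then characterizes how the treatment
# effect varies with subject-level characteristics.
```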
Simulation is an important step in robotics for creating control policies and testing various physical parameters. Soft robotics is a field that presents unique physical challenges for simulation due to the nonlinearity of deformable material components along with other innovative, and often complex, physical properties. Because of the computational cost of simulating soft and heterogeneous objects with traditional techniques, rigid-body robotics simulators are not well suited to simulating soft robots. Thus, many engineers must build their own one-off simulators tailored to their system, or use existing simulators with reduced performance. To facilitate the development of this exciting technology, this work presents an interactive-speed, accurate, and versatile simulator for a variety of soft robot types. Cronos, our open-source 3D simulation engine, parallelizes a mass-spring model for ultra-fast performance on both deformable and rigid objects. Our approach is applicable to a wide array of nonlinear material configurations, including high deformability, volumetric actuation, and heterogeneous stiffness. This versatility makes it possible to mix materials and geometric components freely within a single robot simulation. By exploiting the flexibility and scalability of nonlinear Hookean mass-spring systems, the framework simulates soft and rigid objects via a highly parallel model at near real-time speed. We describe an efficient GPU CUDA implementation, which we demonstrate to achieve computation of over 1 billion elements per second on consumer-grade GPU cards. The dynamic physical accuracy of the system is validated by comparing results to Euler-Bernoulli beam theory, natural frequency predictions, and empirical data of a soft structure under large deformation.
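To give a sense of the core update that such an engine parallelizes on the GPU, here is a serial NumPy illustration of one Hookean mass-spring integration step; the spring topology, constants, damping, and explicit Euler integrator are illustrative simplifications, not Cronos internals.

```python
# Serial illustration of a single mass-spring update step.
import numpy as np

def step(pos, vel, mass, springs, rest_len, k, dt=1e-4, damping=0.02):
    """pos, vel: (N, 3); mass: (N,); springs: (M, 2) vertex index pairs; rest_len, k: (M,)."""
    i, j = springs[:, 0], springs[:, 1]
    d = pos[j] - pos[i]
    length = np.linalg.norm(d, axis=1, keepdims=True)
    # Hookean spring force along the spring direction (stretched springs pull).
    f = k[:, None] * (length - rest_len[:, None]) * d / length
    force = np.zeros_like(pos)
    np.add.at(force, i, f)
    np.add.at(force, j, -f)
    force -= damping * vel                       # simple velocity damping
    vel = vel + dt * force / mass[:, None]       # explicit Euler integration
    pos = pos + dt * vel
    return pos, vel
```

Because each spring and each vertex update is independent within a step, the same computation maps naturally onto one CUDA thread per element.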
Trust management systems often use registries to authenticate data or to form trust decisions; examples are revocation registries and trust status lists. By introducing distributed ledgers (DLs), it is also possible to create decentralized registries. A verifier then queries a node of the respective ledger, e.g., to retrieve trust status information during the verification of a credential. While this ensures trustworthy information, the process requires the verifier to be online and the ledger node to be available. Additionally, the connection from the verifier to the registry poses a privacy issue, as it leaks information about the user's behavior. In this paper, we resolve these issues by extending existing ledger APIs to support results that are trustworthy even in an offline setting. We do this by introducing attestations of the ledger's state, issued by ledger nodes and aggregatable into a collective attestation by all nodes. This attestation enables a user to prove the provenance of DL-based data to an offline verifier. Our approach is generic, so once deployed it serves as a basis for any use case with an offline verifier. We also provide an implementation for the Ethereum stack and evaluate it, demonstrating the practicality of our approach.
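The following Python sketch conveys only the shape of such an attestation and its offline check; the data fields, the quorum rule, and the injected `verify_sig` callable are assumptions for illustration, not the paper's protocol or its Ethereum implementation.

```python
# Conceptual sketch: an attested ledger state that an offline verifier can check
# against a locally known set of ledger-node keys.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class StateAttestation:
    state_root: str               # commitment to the registry state (e.g. a Merkle root)
    block_height: int             # ledger height the attestation refers to
    signatures: Dict[str, bytes]  # node id -> signature over (state_root, height)

def verify_offline(att: StateAttestation,
                   known_nodes: Dict[str, bytes],
                   verify_sig: Callable[[bytes, bytes, bytes], bool],
                   quorum: int) -> bool:
    """Accept the attested state if enough known ledger nodes signed it."""
    message = f"{att.state_root}:{att.block_height}".encode()
    valid = sum(
        1 for node_id, sig in att.signatures.items()
        if node_id in known_nodes and verify_sig(known_nodes[node_id], message, sig)
    )
    return valid >= quorum
```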
We identify and demonstrate a weakness of Petri nets (PN) in specifying the composite behavior of reactive systems. Specifically, we show how, when specifying multiple requirements in one PN model, modelers are obliged to specify mechanisms for combining these requirements. In many cases, this yields over-specification and incorrect models. We demonstrate how some execution paths are missed and others are generated unintentionally. To support this claim, we analyze PN models from the literature, identify the combination mechanisms, and demonstrate their effect on the correctness of the model. To address this problem, we propose to model system behavior using behavioral programming (BP), a software development and modeling paradigm designed for seamless integration of independent requirements. Specifically, we demonstrate how the semantics of BP, which define how to interweave scenarios into a single model, make it possible to avoid this over-specification. Additionally, while BP maintains the same mathematical properties as PN, it provides means for changing the model dynamically, thus increasing the agility of the specification. We compare BP and PN in quantitative and qualitative measures by analyzing the models, their generated execution paths, and the specification process. Finally, while BP is supported by tools that allow for applying formal methods and reasoning techniques to the model, it lacks the legacy of PN tools and algorithms. To address this issue, we propose semantics and a tool for translating BP models to PN and vice versa.
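For readers unfamiliar with BP semantics, the following self-contained Python toy implements the standard request/wait-for/block synchronization with generator-based b-threads; it is a textbook-style illustration of how independent requirements interweave without a hand-built combination mechanism, not the tool proposed in the paper.

```python
# Minimal BP event-selection loop: b-threads yield sync statements, and the
# runtime picks an event that is requested by some thread and blocked by none.
def bp_run(bthreads, max_steps=20):
    threads = [(bt, bt.send(None)) for bt in bthreads]       # start all b-threads
    for _ in range(max_steps):
        blocked = set().union(*(s.get("block", set()) for _, s in threads))
        requested = set().union(*(s.get("request", set()) for _, s in threads))
        candidates = requested - blocked
        if not candidates:
            break
        event = sorted(candidates)[0]                         # deterministic choice
        print("selected:", event)
        advanced = []
        for bt, stmt in threads:
            if event in stmt.get("request", set()) | stmt.get("wait_for", set()):
                try:
                    advanced.append((bt, bt.send(event)))     # resume affected threads
                except StopIteration:
                    continue
            else:
                advanced.append((bt, stmt))                   # others stay at their sync point
        threads = advanced

# Three independent requirements:
def add_hot():
    for _ in range(3):
        yield {"request": {"HOT"}}

def add_cold():
    for _ in range(3):
        yield {"request": {"COLD"}}

def interleave():                       # alternate HOT and COLD, never two in a row
    while True:
        yield {"wait_for": {"HOT"}, "block": {"COLD"}}
        yield {"wait_for": {"COLD"}, "block": {"HOT"}}

bp_run([add_hot(), add_cold(), interleave()])   # prints HOT, COLD, HOT, COLD, HOT, COLD
```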
Asymmetry, along with heteroscedasticity or contamination, often occurs as data dimensionality grows. In ultra-high dimensional data analysis, such irregular settings are usually overlooked for both theoretical and computational convenience. In this paper, we establish a framework for estimation in high-dimensional regression models using Penalized Robust Approximated quadratic M-estimators (PRAM). This framework accommodates general settings in which the random errors lack symmetry and homogeneity, or the covariates are not sub-Gaussian. To reduce the possible bias caused by the data's irregularity in mean regression, PRAM adopts a loss function with a flexible robustness parameter that grows with the sample size. Theoretically, we first show that, in the ultra-high dimensional setting, PRAM estimators have local estimation consistency at the minimax rate enjoyed by the LS-Lasso. We then show that PRAM with an appropriate non-convex penalty in fact agrees with the local oracle solution, and thus obtains the oracle property. Computationally, we demonstrate the performance of six PRAM estimators using three types of loss functions for approximation (Huber, Tukey's biweight, and Cauchy loss) combined with two types of penalty functions (Lasso and MCP). Our simulation studies and real data analysis demonstrate satisfactory finite-sample performance of the PRAM estimator under general irregular settings.
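As a rough illustration of one such estimator, the sketch below writes down a Huber-loss objective with a Lasso penalty and a robustness parameter that grows with the sample size; the specific choice of tau, the constant c, and the generic small-scale optimizer are assumptions for illustration and not the paper's tuning or algorithm.

```python
# Sketch of a PRAM-style objective: Huber loss + Lasso penalty.
import numpy as np
from scipy.optimize import minimize

def huber(r, tau):
    small = np.abs(r) <= tau
    return np.where(small, 0.5 * r**2, tau * np.abs(r) - 0.5 * tau**2)

def pram_lasso(X, y, lam, c=1.0):
    n, p = X.shape
    tau = c * np.sqrt(n / np.log(p))          # robustness parameter diverging with n
    def objective(beta):
        return huber(y - X @ beta, tau).mean() + lam * np.abs(beta).sum()
    # Derivative-free optimizer: only workable for small p; high-dimensional
    # problems require specialized algorithms.
    return minimize(objective, np.zeros(p), method="Powell").x
```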
We develop a model-based boosting approach for multivariate distributional regression within the framework of generalized additive models for location, scale, and shape. Our approach enables the simultaneous modeling of all distribution parameters of an arbitrary parametric distribution of a multivariate response conditional on explanatory variables, while being applicable to potentially high-dimensional data. Moreover, the boosting algorithm incorporates data-driven variable selection, taking different types of effects into account. A particular merit of our approach is that it allows the association between multiple continuous or discrete outcomes to be modeled through the relevant covariates. After a detailed simulation study investigating estimation and prediction performance, we demonstrate the full flexibility of our approach in three diverse biomedical applications. The first is based on high-dimensional genomic cohort data from the UK Biobank, considering a bivariate binary response (chronic ischemic heart disease and high cholesterol). Here, we are able to identify genetic variants that are informative for the association between cholesterol and heart disease. The second application considers the demand for health care in Australia, with the number of consultations and the number of prescribed medications as a bivariate count response. The third application analyzes two dimensions of childhood undernutrition in Nigeria as a bivariate response, and we find that the correlation between the two undernutrition scores differs considerably depending on the child's age and the region the child lives in.
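To convey the structure of component-wise boosting for a multivariate distribution, the following Python sketch updates the parameters of a bivariate Gaussian (including the correlation, via a Fisher-z link) one base learner at a time; the links, the simple linear base learners, the numerical gradients, and the selection rule are simplifications for illustration rather than the paper's algorithm.

```python
# Structural sketch of component-wise gradient boosting for a bivariate Gaussian.
import numpy as np

def bivariate_normal_nll(y, mu1, mu2, logs1, logs2, zrho):
    s1, s2, rho = np.exp(logs1), np.exp(logs2), np.tanh(zrho)
    z1, z2 = (y[:, 0] - mu1) / s1, (y[:, 1] - mu2) / s2
    q = (z1**2 - 2 * rho * z1 * z2 + z2**2) / (1 - rho**2)
    return np.log(2 * np.pi) + logs1 + logs2 + 0.5 * np.log(1 - rho**2) + 0.5 * q

def boost(X, y, steps=200, nu=0.1, eps=1e-5):
    n, p = X.shape
    eta = {k: np.zeros(n) for k in ["mu1", "mu2", "logs1", "logs2", "zrho"]}
    for _ in range(steps):
        best = None
        for k in eta:                      # negative gradient for each parameter
            bumped = dict(eta); bumped[k] = eta[k] + eps
            g = -(bivariate_normal_nll(y, **bumped) - bivariate_normal_nll(y, **eta)) / eps
            for j in range(p):             # component-wise linear base learners
                xj = X[:, j]
                beta = (xj @ g) / (xj @ xj)
                rss = np.sum((g - beta * xj) ** 2)
                if best is None or rss < best[0]:
                    best = (rss, k, beta * xj)
        _, k, fit = best
        eta[k] = eta[k] + nu * fit         # update only the best-fitting component
    return eta
```

Variable selection arises because, in each boosting step, only the single best-fitting covariate for a single distribution parameter is updated.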
High-performance trajectory tracking control of quadrotor vehicles is an important challenge in aerial robotics. Symmetry is a fundamental property of physical systems and offers a powerful tool for designing high-performance control algorithms. We propose a design methodology that takes any given symmetry, linearises the associated error in a single set of coordinates, and uses LQR design to obtain a high-performance controller; an approach we term Equivariant Regulator design. We show that quadrotor vehicles admit several different symmetries: the direct product symmetry, the extended pose symmetry, and the pose and velocity symmetry, and show that each symmetry can be used to define a global error. We compare the linearised systems via simulation and find that the extended pose and pose and velocity symmetries outperform the direct product symmetry in the presence of large disturbances. This suggests that equivariant, group-affine symmetries yield smaller linearisation error.
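The LQR step of such a design is standard once the error dynamics are linearised; the sketch below shows that step in Python, with a toy double-integrator standing in for the quadrotor error system, since the actual error coordinates depend on the chosen symmetry and are not reproduced here.

```python
# Generic LQR gain from the linearised error dynamics (A, B are placeholders).
import numpy as np
from scipy.linalg import solve_continuous_are

def lqr_gain(A, B, Q, R):
    P = solve_continuous_are(A, B, Q, R)
    return np.linalg.solve(R, B.T @ P)      # K = R^{-1} B^T P

A = np.array([[0.0, 1.0], [0.0, 0.0]])      # toy double-integrator error model
B = np.array([[0.0], [1.0]])
K = lqr_gain(A, B, np.eye(2), np.eye(1))
# Control: u = -K @ error(state, reference), where error() is the global error
# defined by the chosen symmetry.
```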
Bacterial infections are responsible for high mortality worldwide. Antimicrobial resistance underlying the infection and the patient's multifaceted clinical status can hamper the correct choice of antibiotic treatment. Randomized clinical trials provide average treatment effect estimates but are not ideal for risk stratification and optimization of therapeutic choice, i.e., individualized treatment effects (ITE). Here, we leverage large-scale electronic health record data, collected from Southern US academic clinics, to emulate a clinical trial, i.e., a 'target trial', and develop a machine learning model of mortality prediction and ITE estimation for patients diagnosed with acute bacterial skin and skin structure infection (ABSSSI) due to methicillin-resistant Staphylococcus aureus (MRSA). ABSSSI-MRSA is a challenging condition with reduced treatment options: vancomycin is the preferred choice, but it has non-negligible side effects. First, we use propensity score matching to emulate the trial and create a treatment-randomized (vancomycin vs. other antibiotics) dataset. Next, we use these data to train various machine learning methods, including boosted/LASSO logistic regression (BLR/LASSO), support vector machines (SVM), and random forests (RF), and choose the best model in terms of area under the receiver operating characteristic curve (AUC) through bootstrap validation. Lastly, we use the models to calculate ITE and identify possible averted deaths by therapy change. The out-of-bag tests indicate that SVM and RF are the most accurate, with AUCs of 81% and 78%, respectively, but BLR/LASSO is not far behind (76%). When counterfactuals are calculated with BLR/LASSO, vancomycin increases the risk of death, but with large variation (odds ratio 1.2, 95% range 0.4-3.8), and its contribution to the outcome probability is modest. In contrast, the RF exhibits stronger changes in ITE, suggesting more complex treatment heterogeneity.
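The pipeline (matching, outcome modeling, counterfactual ITE) can be sketched compactly; the variable names, the logistic propensity model, and the random forest below are illustrative placeholders for the paper's EHR features and tuned models.

```python
# Illustrative sketch: target-trial emulation via propensity score matching,
# followed by ITE estimation from counterfactual predictions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import NearestNeighbors

def match_and_ite(X, treated, died):
    # 1) Propensity score: P(vancomycin | covariates).
    ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
    # 2) 1:1 nearest-neighbour matching on the propensity score.
    nn = NearestNeighbors(n_neighbors=1).fit(ps[treated == 0].reshape(-1, 1))
    _, idx = nn.kneighbors(ps[treated == 1].reshape(-1, 1))
    matched = np.r_[np.where(treated == 1)[0], np.where(treated == 0)[0][idx.ravel()]]
    # 3) Outcome model trained on the matched, "randomized" dataset.
    rf = RandomForestClassifier(n_estimators=500, oob_score=True)
    rf.fit(np.c_[X[matched], treated[matched]], died[matched])
    # 4) ITE: predicted mortality risk under treatment minus under control.
    p1 = rf.predict_proba(np.c_[X, np.ones(len(X))])[:, 1]
    p0 = rf.predict_proba(np.c_[X, np.zeros(len(X))])[:, 1]
    return p1 - p0
```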
Reliability of SLAM systems is considered one of the critical requirements in modern autonomous systems. This has directed efforts toward developing many state-of-the-art systems, creating challenging datasets, and introducing rigorous metrics to measure SLAM performance. However, the link between datasets and performance in the robustness/resilience context has rarely been explored. To fill this void, characterization of the operating conditions of SLAM systems is essential to provide an environment for the quantitative measurement of robustness and resilience. In this paper, we argue that for proper evaluation of SLAM performance, the characterization of SLAM datasets serves as a critical first step. The study starts by reviewing previous efforts at quantitative characterization of SLAM datasets. Then, the problem of perturbation characterization is discussed and its linkage to SLAM robustness/resilience is established. After that, we propose a novel, generic, and extendable framework for quantitative analysis and comparison of SLAM datasets, together with a description of the different characterization parameters. Finally, we demonstrate the application of our framework by presenting the characterization results of three SLAM datasets: KITTI, EuRoC MAV, and TUM-VI, highlighting the level of insight achieved by the proposed framework.
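To illustrate the flavor of such quantitative characterization, the toy sketch below computes a few motion-dynamics statistics from a dataset's ground-truth trajectory; these particular parameters and the function interface are assumptions for illustration, and the paper's framework covers a broader set of parameters.

```python
# Toy example: motion-dynamics characterization parameters from ground truth.
import numpy as np

def motion_dynamics(timestamps, positions, yaw):
    """timestamps: (N,) seconds; positions: (N, 3) metres; yaw: (N,) radians."""
    dt = np.diff(timestamps)
    lin_speed = np.linalg.norm(np.diff(positions, axis=0), axis=1) / dt
    ang_speed = np.abs(np.diff(np.unwrap(yaw))) / dt
    return {
        "trajectory_length_m": float(np.sum(lin_speed * dt)),
        "mean_linear_speed_mps": float(lin_speed.mean()),
        "max_angular_speed_radps": float(ang_speed.max()),
    }
```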