Browsing by Subject "Reliability"
Item Open Access Accrual Noise Ratio as a Measure of Accrual Reliability (2009) Njoroge, Kenneth. I develop an empirical model that estimates a firm-specific accrual noise ratio (ANR), an operational and statistically grounded measure of accrual reliability, and test the measure's construct validity. The model allows accrual reliability to vary across firms, which is particularly important because many reliability determinants vary in cross-section. Unlike metrics that measure relative perceived reliability, ANR measures accrual reliability independent of the perceptions of investors, creditors, or auditors. I find that ANR relates in expected ways with multiple proxies of accounting reliability, that ANR's relation with the proxies of other accounting constructs is consistent with theory, and that ANR's sensitivity to percentage changes of accrual components is consistent with a subjective ordinal ranking of the components' reliability from prior literature.
Item Open Access An Empirically Based Stochastic Turbulence Simulator with Temporal Coherence for Wind Energy Applications (2016) Rinker, Jennifer Marie. In this dissertation, we develop a novel methodology for characterizing and simulating nonstationary, full-field, stochastic turbulent wind fields.
In this new method, nonstationarity is characterized and modeled via temporal coherence, which is quantified in the discrete frequency domain by probability distributions of the differences in phase between adjacent Fourier components.
The empirical distributions of the phase differences can also be extracted from measured data, and the resulting temporal coherence parameters can quantify the occurrence of nonstationarity in empirical wind data.
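As a rough illustration of the idea (not the dissertation's actual estimation procedure), the phase differences between adjacent Fourier components can be extracted from a wind record with a few lines of NumPy; the synthetic record below is a stand-in for measured data:

```python
import numpy as np

def temporal_coherence_phases(u):
    """Phase differences between adjacent Fourier components of a record.

    Concentrated (non-uniform) phase differences suggest temporal
    coherence, i.e., nonstationarity; near-uniform differences suggest
    a stationary record.
    """
    U = np.fft.rfft(u - np.mean(u))           # one-sided spectrum
    phases = np.angle(U[1:])                  # drop the DC component
    dphi = np.diff(phases)                    # adjacent-component differences
    # wrap to (-pi, pi] so the distribution lives on a single branch
    return (dphi + np.pi) % (2 * np.pi) - np.pi

# example: 10 min of 20 Hz wind speed (synthetic stand-in for measured data)
rng = np.random.default_rng(0)
u = 8.0 + rng.standard_normal(12000)
dphi = temporal_coherence_phases(u)
print(dphi.mean(), dphi.std())                # roughly uniform for white noise
```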
This dissertation (1) implements temporal coherence in a desktop turbulence simulator, (2) calibrates empirical temporal coherence models for four wind datasets, and (3) quantifies the increase in lifetime wind turbine loads caused by temporal coherence.
The four wind datasets were intentionally chosen from locations around the world so that they had significantly different ambient atmospheric conditions.
The prevalence of temporal coherence and its relationship to other standard wind parameters were modeled through empirical joint distributions (EJDs), which involved fitting marginal distributions and calculating correlations.
EJDs have the added benefit of being able to generate samples of wind parameters that reflect the characteristics of a particular site.
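One common way to realize such a joint distribution is a Gaussian copula over fitted marginals; the sketch below uses placeholder marginals and an assumed correlation rather than the dissertation's fitted values:

```python
import numpy as np
from scipy import stats

# placeholder marginals for two wind parameters: mean wind speed (Weibull)
# and a temporal-coherence parameter (lognormal); values are illustrative
marg_u = stats.weibull_min(c=2.0, scale=9.0)
marg_rho = stats.lognorm(s=0.5, scale=0.3)
CORR = 0.4                                     # assumed correlation

def sample_ejd(n, rng):
    """Draw correlated wind-parameter samples via a Gaussian copula."""
    cov = np.array([[1.0, CORR], [CORR, 1.0]])
    z = rng.multivariate_normal(np.zeros(2), cov, size=n)
    u = stats.norm.cdf(z)                      # normals -> uniforms
    return marg_u.ppf(u[:, 0]), marg_rho.ppf(u[:, 1])

u_mean, rho = sample_ejd(1000, np.random.default_rng(1))
```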
Lastly, to characterize the effect of temporal coherence on design loads, we created four models in the open-source wind turbine simulator FAST based on the WindPACT turbines, fit response surfaces to them, and used the response surfaces to calculate lifetime turbine responses to wind fields simulated with and without temporal coherence.
The training data for the response surfaces was generated from exhaustive FAST simulations that were run on the high-performance computing (HPC) facilities at the National Renewable Energy Laboratory.
This process was repeated for wind field parameters drawn from the empirical distributions and for wind samples drawn using the recommended procedure in the wind turbine design standard IEC 61400-1.
The effect of temporal coherence was calculated as a percent increase in the lifetime load over the base value with no temporal coherence.
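That final comparison reduces to a simple relative difference; with placeholder lifetime-load values standing in for the response-surface outputs:

```python
# placeholder lifetime loads, e.g., from response-surface evaluations
load_base = 1.00   # wind fields without temporal coherence
load_tc = 1.07     # wind fields with temporal coherence
increase_pct = 100.0 * (load_tc - load_base) / load_base
print(f"{increase_pct:.1f}% increase in lifetime load")
```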
Item Open Access From Adversaries to Anomalies: Addressing Real-World Vulnerabilities of Deep Learning-based Vision Models (2024) Inkawhich, Matthew Joseph. Deep Neural Networks (DNNs) have driven the performance of computer vision to new heights, which has led to them being rapidly integrated into many of our real-world systems. Meanwhile, the majority of research on DNNs remains focused on enhancing accuracy and efficiency. Furthermore, the evaluation protocols used to quantify performance generally assume idealistic operating conditions that poorly emulate realistic environments. For example, modern benchmarks typically have balanced class distributions, ample training data, consistent object scale, minimal noise, and only test on inputs that lie within the training distribution. As a result, we are currently integrating these naive and under-tested models into our trusted systems! In this work, we focus on the robustness of DNN-based vision models, seeking to understand their vulnerabilities to non-ideal deployment data. The rallying cry of our research is that before these models are deployed into our safety-critical applications (e.g., autonomous vehicles, defense technologies), we must attempt to anticipate, understand, and address all possible vulnerabilities. We begin by investigating a class of malignant inputs that are specifically designed to fool DNN models. We conduct this investigation by taking on the perspective of an adversary who wishes to attack a pretrained DNN by adding (nearly) imperceptible noise to a benign input to fool a downstream model. While most adversarial literature focuses on image classifiers, we seek to understand the feasibility of attacks on other tasks such as video recognition models and deep reinforcement learning agents. Sticking to the theme of realistic vulnerabilities, we primarily focus on black-box attacks, in which the adversary does not assume knowledge of the target model's architecture and parameters. Our novel attack algorithms achieve surprisingly strong effectiveness, thus uncovering serious new potential security risks.
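For a flavor of the black-box setting (a generic query-only baseline, not the dissertation's attack algorithms), the sketch below assumes a model callable that returns class probabilities for a single input and uses nothing but those outputs:

```python
import numpy as np

def random_search_attack(model, x, y_true, eps=0.03, iters=500, rng=None):
    """Query-only black-box attack via random search.

    Assumes model(x) returns a 1-D vector of class probabilities; no
    gradients or parameters of the target model are used.
    """
    rng = rng or np.random.default_rng()
    delta = np.zeros_like(x)
    best = model(x + delta)[y_true]              # true-class confidence
    for _ in range(iters):
        step = rng.uniform(-eps, eps, size=x.shape)
        cand = np.clip(delta + step, -eps, eps)  # keep noise (nearly) imperceptible
        p = model(x + cand)[y_true]
        if p < best:                             # keep steps that hurt the true class
            delta, best = cand, p
    return x + delta
```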
While malignant adversarial inputs represent a critical vulnerability, they are still a fairly niche issue in the context of all problematic inputs for a DNN. In the second phase of our work, we turn our attention to the open-set vulnerability. Here, we acknowledge that during deployment, models may encounter novel classes from outside of their training distribution. Again, the majority of works in this area only consider image classifiers for their simplicity. This motivates us to study the more complex and practically useful open-set object detection problem. We address this problem in two phases. First, we create a tunable class-agnostic object proposal network that can be easily adapted to suit a variety of open-set applications. Next, we define a new Open-Set Object Detection and Discovery (OSODD) task that emphasizes both known and unknown object detection with class-wise separation. We then devise a novel framework that combines our tunable proposal network with a powerful transformer-based foundational model, which achieves state-of-the-art performance on this challenging task.
We conclude with a feasibility study of inference-time dynamic Convolutional Neural Networks (CNNs). We argue that this may be an exciting potential solution for improving robustness to natural variations such as changing object scale, aspect ratio, and surrounding contextual information. Our preliminary results indicate that different inputs have a strong preference for different convolutional kernel configurations. We show that by allowing just four layers of common off-the-shelf CNN models to have dynamic convolutional stride, dilation, and size, we can achieve remarkably high levels of accuracy on classification tasks.
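A minimal PyTorch sketch of the general idea (an assumed design for illustration, not the dissertation's exact architecture): a small gating branch selects a stride/dilation configuration per input at inference time:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DynamicStrideConv(nn.Module):
    """3x3 convolution whose stride/dilation is chosen per input by a
    tiny gating branch (hard argmax, so inference-time only)."""

    def __init__(self, in_ch, out_ch, configs=((1, 1), (2, 1), (1, 2))):
        super().__init__()
        self.configs = configs                  # (stride, dilation) options
        self.weight = nn.Parameter(torch.empty(out_ch, in_ch, 3, 3))
        nn.init.kaiming_normal_(self.weight)
        self.gate = nn.Linear(in_ch, len(configs))

    def forward(self, x):
        scores = self.gate(x.mean(dim=(2, 3)))  # pooled features -> (N, k)
        idx = scores.argmax(dim=1)              # one configuration per input
        outs = []
        for i, xi in enumerate(x):
            s, d = self.configs[int(idx[i])]
            outs.append(F.conv2d(xi.unsqueeze(0), self.weight,
                                 stride=s, dilation=d, padding=d))
        return outs  # spatial sizes differ across configs, hence a list
```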
Item Open Access Multiplex assay reliability and long-term intra-individual variation of serologic inflammatory biomarkers. (Cytokine, 2017-02) McKay, Heather S; Margolick, Joseph B; Martínez-Maza, Otoniel; Lopez, Joseph; Phair, John; Rappocciolo, Giovanna; Denny, Thomas N; Magpantay, Larry I; Jacobson, Lisa P; Bream, Jay H. BACKGROUND: Circulating cytokines, chemokines, and soluble cytokine receptors can serve as biomarkers of inflammation and immune dysregulation. Good reliability of multiplex platforms, which allow for simultaneous, comprehensive biomarker assessment, is critical for their utility in epidemiologic studies. We examined the reliability of the Meso Scale Discovery (MSD) platform to simultaneously quantitate 15 cytokines and chemokines, and of the Luminex platform (R&D Systems) to quantitate 5 soluble receptors and 2 chemokines and cytokines, and evaluated long-term within-person correlation of these biomarkers. METHODS: The detectability and reliability of these assay systems were assessed using the same external controls across plates and archived sera from 250 HIV(-) men in the Multicenter AIDS Cohort Study. Using up to four visits per person from 1984 to 2009, age-adjusted intraclass correlation coefficients (ICC) of biomarkers with >80% detectability (CCL11, CXCL8, CXCL10, CCL2, CCL4, CCL13, CCL17, CXCL13, IL-10, IL-12p70, IL-6, TNF-α, BAFF, sCD14, sCD27, sgp130, sIL-2Rα, and sTNF-R2) were obtained using linear mixed models. RESULTS: Most biomarkers were detectable in >80% of control samples; IFN-γ, GM-CSF, and IL-2 were undetectable in >20% of samples. Among the HIV-uninfected men, most biomarkers showed fair to strong within-person correlation (ICC>0.40) for up to 15 years. The ICC for CXCL8 was good in the short term but decreased with increasing time between visits, becoming lower (ICC<0.40) after 8 years. CONCLUSIONS: These multiplexed assays showed acceptable reliability for use in epidemiologic research, despite some technical variability and limitations in cytokine quantitation. Most biomarkers displayed moderate-to-excellent intra-individual stability over the long term, suggesting their utility in prospective studies investigating etiologic associations with diverse chronic conditions.
Item Open Access Performance and Reliability Evaluation for DSRC Vehicular Safety Communication (2013) Yin, Xiaoyan. Inter-Vehicle Communication (IVC) is a vital part of the Intelligent Transportation System (ITS) and has been extensively researched in recent years. Dedicated Short Range Communication (DSRC) is being seriously considered by the automotive industry and government agencies as a promising wireless technology for enhancing transportation safety and the efficiency of road utilization. In DSRC-based vehicular ad hoc networks (VANETs), transportation safety is one of the most crucial features that need to be addressed. Safety applications usually demand direct vehicle-to-vehicle ad hoc communication due to a highly dynamic network topology and strict delay requirements. Such direct safety communication will involve a broadcast service because safety information can be beneficial to all vehicles around a sender. Broadcasting safety messages is one of the fundamental services in DSRC. In order to provide satisfactory quality of service (QoS) for various safety applications, safety messages need to be delivered both promptly and reliably.
To support the stringent delay and reliability requirements of broadcasting safety messages, researchers have been seeking to test proposed DSRC protocols and suggest improvements. A major hurdle in the development of VANETs for safety-critical services is the lack of methods to determine the effectiveness of VANET design mechanisms for predictable QoS and to evaluate the tradeoffs between network parameters. Computer simulations are extensively used for this purpose. A few analytic models and experiments have been developed to study the performance and reliability of IEEE 802.11p for safety-related applications. In this thesis, we develop detailed analytic models that capture various safety message dissemination features, such as channel contention, backoff behavior, concurrent transmissions, hidden terminal problems, channel fading with path loss, multi-channel operations, and multi-hop dissemination in one-dimensional and two-dimensional traffic scenarios. MAC-level and application-level performance metrics are derived to evaluate the performance and reliability of message broadcasting, providing insight into network parameter settings. Extensive simulations in either MATLAB or NS2 are conducted to validate the accuracy of our proposed models.
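As one small ingredient such analytic models typically combine (with illustrative parameter values, not those of the thesis), the single-transmitter reception probability at distance d under Rayleigh fading with power-law path loss and no interference has the closed form exp(-theta * noise * d^alpha / Pt):

```python
import numpy as np

def rx_prob(d, pt=0.1, alpha=2.5, theta=10.0, noise=1e-8):
    """P(SNR > theta) at distance d under Rayleigh fading with power-law
    path loss, ignoring interference. All parameter values illustrative."""
    return np.exp(-theta * noise * d**alpha / pt)

for d in (50, 100, 200, 300):
    print(f"{d} m: {rx_prob(d):.3f}")   # reception probability decays with distance
```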
Item Open Access Project Resiliency: Overcoming Barriers for Repeatable Microgrids in the United States (2021-04-27) Leon-Hinton, Reed; Nadeem, Hassan; Amjad, Zukhruf. The recent blackouts in Texas and California caused by extreme weather events, such as snowstorms and wildfires, have revealed the growing burden on the national transmission system. Grid outages cost the US about $28 to $33 billion annually, and this expense is growing as climate change leads to increasingly severe weather events across the globe. Microgrids, which are localized grids that can isolate from the main power grid during an outage, are key to strengthening grid resiliency, mitigating grid disturbances, and allowing faster recovery. They also accelerate the integration of distributed and renewable energy resources on the grid. This project analyzes the key barriers that hinder repeatable microgrid deployment, which would enable economies of scale and thus provide cost-effective energy solutions to small-scale manufacturing customers in the US. Additionally, the financial analysis and energy modeling undertaken in this study find that the participation of microgrids in energy markets is vital to deployment on a national scale. This can be achieved through uniform state-level regulation, streamlined interconnection processes, and “microgrid ready” facility infrastructure. Lastly, the risk assessment and mitigation analysis provides a roadmap to public-private financing mechanisms for microgrid deployment.
Item Open Access Smart Microgrids to Improve Reliability and Resiliency of Power Supply in the Southeast (2023-05-01) Pumarejo Villarreal, Jose Eduardo (Puma). Extreme weather events in the Southeast have frequently caused significant damage to the power grid, leaving millions without electricity for extended periods. Despite substantial investments, vulnerabilities stemming from the centralized nature of the system remain unresolved. However, the implementation of decentralized smart microgrid technology presents a potential solution to mitigate the impacts of extreme weather events and enhance power supply reliability and resiliency. Microgrids, which consist of interconnected loads and distributed energy resources, can operate in coordination with the main grid or independently. Each microgrid requires a customized approach to design, installation, and management. Although smart microgrids can improve power supply reliability and resiliency by up to 60%, their high costs often render projects financially unfeasible. To accelerate the adoption of microgrids in the Southeast, clear state-level regulations, standardized guidelines for electric utilities, and economic assessments of resilient infrastructure are needed. Additionally, exploring the establishment of a Southeast ISO could facilitate the replication of successful practices from regions like California, Texas, and New York.
Item Open Access Solarizing the Island of Culebra, Puerto Rico: Rate-Design Model and Analysis (2022-04-22) Abcug, Jeremy; Bettencourt, Allison; Khandelwal, Rajat. Power on the island of Puerto Rico has historically been served through a centralized generation system that has largely failed to provide reliability (the ability of the grid to provide the right quantity and quality of electricity needed and to operate in times of stress) and resilience (the ability of the grid to come back online quickly and for all consumers after a major disruption).
This master’s project team is working with the Fundación Comunitaria de Puerto Rico (FCPR; Puerto Rico Community Foundation) to support the Caribbean’s first community-owned solar utility in Culebra, Puerto Rico, to improve grid reliability and to foster community energy independence. The idea behind Culebra’s solar utility is simple: 50 businesses, non-profits, and critical facilities will pay for the energy service provided by the utility through rooftop solar and battery systems that have been fitted to meet each facility’s individual energy needs. The entities that purchase this electricity become subscribers to the service, and their payments will sustain operation and maintenance (O&M), equipment replacement, system expansion, and any other necessary services. A SWOT analysis is provided to identify the Strengths (S), Weaknesses (W), Opportunities (O), and Threats (T) for the project.
The core objective of this project is the development of a rate-design model to evaluate the optimal rate to charge the subscribers of this solar utility. The rate-design model has three parts: revenue, costs, and financial statements. For the revenue calculations, a load curve for an average subscriber was fitted based on historical consumption data. This information was used in a Monte Carlo simulation to model subscriber demand on a monthly basis, and the simulated demand was compared with solar production forecasts to compute monthly revenue per subscriber.
Four types of costs were considered in the analysis: operations and maintenance, administrative, insurance, and correction costs. All costs increase annually with inflation. An analysis of the cost breakdown shows that correction cost is the largest component, although it declines over time. Operations and maintenance is the second largest component, followed by administrative and insurance costs.
The results from the revenue and cost analysis were used to compute an income statement and statement of cash flows for the solar utility. A set of sensitivity analyses was conducted to assess the effect of input parameters such as inflation, the PREPA electricity rate, the solar utility electricity rate, and taxes on output metrics such as net income, profit margin, subscriber savings, annual revenue, and costs.
Together, the rate-design model and the sensitivity analyses suggest an ideal rate of $0.19/kWh for FCPR to charge subscribers of the solar utility project. FCPR has already submitted an electricity rate of $0.21/kWh to the Puerto Rico Energy Bureau. This team’s analysis shows that the $0.21/kWh rate will help realize significant subscriber savings and ensure the viability of the solar utility project over its initial lifespan of 10 years and beyond. The project is expected to yield $2,600 in annual electricity savings for subscribers and to abate 1,076 MT CO2e annually.
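A stylized sketch of the revenue side of such a model (all demand and production figures below are placeholders, not the team’s fitted load curve or forecasts):

```python
import numpy as np

rng = np.random.default_rng(7)

RATE = 0.19                  # $/kWh charged to subscribers
N_SUBSCRIBERS = 50
N_TRIALS = 10_000

# placeholder monthly demand per subscriber (kWh), standing in for a load
# curve fitted to historical consumption data
demand = rng.normal(900.0, 120.0, size=(N_TRIALS, N_SUBSCRIBERS))
# placeholder per-subscriber solar production forecast (kWh/month)
solar = rng.normal(950.0, 100.0, size=N_TRIALS)

# billable energy is capped by what the solar systems actually produce
billable = np.minimum(demand, solar[:, None])
revenue = (RATE * billable).sum(axis=1)

print(f"mean monthly revenue: ${revenue.mean():,.0f}")
print(f"5th-95th percentile: ${np.percentile(revenue, 5):,.0f} to "
      f"${np.percentile(revenue, 95):,.0f}")
```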