Browsing by Subject "Uncertainty"
Item Open Access Assessing Climate Change under Uncertainty: A Monte Carlo Approach (2008-04-25T07:47:25Z) Park, Jeeyoung
Climate change has emerged as one of the most multifaceted manifestations of global change of our time. However, there is less confidence about exactly how the climate will change in the future, and less confidence still about the adjustments it will induce in natural and human systems. Thus, policy formulation for climate change poses a great challenge because it is a problem of decision-making under uncertainty. To facilitate climate policy decisions, quantification of the uncertainty in climate outcomes under possible policies is needed. This paper presents an approach to assessing climate change under uncertainty using Monte Carlo simulations. Here, I find that in the absence of climate policy, the 95% bound on temperature change in 2100 is 5.79°C. Stringent climate policies with aggressive emissions reductions over time significantly lower the temperature change compared to the no-policy case.
Item Open Access Changes in the Delivery of Veterans Affairs Cancer Care: Ensuring Delivery of Coordinated, Quality Cancer Care in a Time of Uncertainty. (Journal of oncology practice, 2017-11) Zullig, Leah L; Goldstein, Karen M; Bosworth, Hayden B
Item Open Access Clinical research challenges posed by difficult-to-treat depression. (Psychological medicine, 2022-02) Rush, A John; Sackeim, Harold A; Conway, Charles R; Bunker, Mark T; Hollon, Steven D; Demyttenaere, Koen; Young, Allan H; Aaronson, Scott T; Dibué, Maxine; Thase, Michael E; McAllister-Williams, R Hamish
Approximately one-third of individuals in a major depressive episode will not achieve sustained remission despite multiple, well-delivered treatments. These patients experience prolonged suffering and disproportionately utilize mental and general health care resources. The recently proposed clinical heuristic of 'difficult-to-treat depression' (DTD) aims to broaden our understanding and focus attention on the identification, clinical management, treatment selection, and outcomes of such individuals. Clinical trial methodologies developed to detect short-term therapeutic effects in treatment-responsive populations may not be appropriate in DTD. This report reviews three essential challenges for clinical intervention research in DTD: (1) how to define and subtype this heterogeneous group of patients; (2) how, when, and by what methods to select, acquire, compile, and interpret clinically meaningful outcome metrics; and (3) how to choose among alternative clinical trial design options to promote causal inference and generalizability. The boundaries of DTD are uncertain, and an evidence-based taxonomy and reliable assessment tools are preconditions for clinical research and subtyping. Traditional outcome metrics in treatment-responsive depression may not apply to DTD, as they largely reflect only short-term symptomatic change and do not incorporate durability of benefit, side effect burden, or sustained impact on quality of life or daily function.
The trial methodology will also require modification, as trials will likely be of longer duration to examine the sustained impact, raising complex issues regarding control group selection, blinding and its integrity, and concomitant treatments.
Item Open Access Confidence and gradation in causal judgment. (Cognition, 2022-06) O'Neill, Kevin; Henne, Paul; Bello, Paul; Pearson, John; De Brigard, Felipe
When comparing the roles of the lightning strike and the dry climate in causing the forest fire, one might think that the lightning strike is more of a cause than the dry climate, or one might think that the lightning strike completely caused the fire while the dry conditions did not cause it at all. Psychologists and philosophers have long debated whether such causal judgments are graded; that is, whether people treat some causes as stronger than others. To address this debate, we first reanalyzed data from four recent studies. We found that causal judgments were actually multimodal: although most causal judgments made on a continuous scale were categorical, there was also some gradation. We then tested two competing explanations for this gradation: the confidence explanation, which states that people make graded causal judgments because they have varying degrees of belief in causal relations, and the strength explanation, which states that people make graded causal judgments because they believe that causation itself is graded. Experiment 1 tested the confidence explanation and showed that gradation in causal judgments was indeed moderated by confidence: people tended to make graded causal judgments when they were unconfident, but they tended to make more categorical causal judgments when they were confident. Experiment 2 tested the causal strength explanation and showed that although confidence still explained variation in causal judgments, it did not explain away the effects of normality, causal structure, or the number of candidate causes. Overall, we found that causal judgments were multimodal and that people make graded judgments both when they think a cause is weak and when they are uncertain about its causal role.
Item Open Access Development and Calibration of Reaction Models for Multilayered Nanocomposites (2015) Vohra, Manav
This dissertation focuses on the development and calibration of reaction models for multilayered nanocomposites. The nanocomposites comprise sputter-deposited alternating layers of distinct metallic elements. Specifically, we focus on the equimolar Ni-Al and Zr-Al multilayered systems. Computational models are developed to capture the transient reaction phenomena as well as understand the dependence of reaction properties on the microstructure, composition, and geometry of the multilayers. Together with the available experimental data, simulations are used to calibrate the models and enhance the accuracy of their predictions.
Recent modeling efforts for the Ni-Al system have investigated the nature of self-propagating reactions in the multilayers. Model fidelity was enhanced by incorporating melting effects due to aluminum [Besnoin et al. (2002)]. Salloum and Knio formulated a reduced model to mitigate computational costs associated with multi-dimensional reaction simulations [Salloum and Knio (2010a)]. However, existing formulations relied on a single Arrhenius correlation for diffusivity, estimated for the self-propagating reactions, and cannot be used to quantify mixing rates at lower temperatures with reasonable accuracy [Fritz (2011)]. We thus develop a thermal model for a multilayer stack comprising a reactive Ni-Al bilayer (nanocalorimeter) and exploit temperature evolution measurements to calibrate the diffusion parameters associated with solid-state mixing (720-860 K) in the bilayer.
The equimolar Zr-Al multilayered system, when reacted aerobically, is shown to exhibit slow aerobic oxidation of zirconium (in the intermetallic), sustained for about 2-10 seconds after completion of the formation reaction. In a collaborative effort, we aim to exploit the sustained heat release for bio-agent defeat applications. A simplified computational model is developed to capture the extended reaction regime characterized by oxidation of Zr-Al multilayers. Simulations provide insight into the growth phenomenon for the zirconia layer during the oxidation process. It is observed that the growth of zirconia is predominantly governed by the surface reaction. However, once the layer thickens, the growth is controlled by the diffusion of oxygen in zirconia.
A computational model is developed for formation reactions in Zr-Al multilayers. We estimate Arrhenius diffusivity correlations for a low temperature mixing regime characterized by homogeneous ignition in the multilayers, and a high temperature mixing regime characterized by self-propagating reactions in the multilayers. Experimental measurements for temperature and reaction velocity are used for this purpose. Diffusivity estimates for the two regimes are first inferred using regression analysis and full posterior distributions are then estimated for the diffusion parameters using Bayesian statistical approaches. A tight bound on posteriors is observed in the ignition regime whereas estimates for the self-propagating regime exhibit large levels of uncertainty. We further discuss a framework for optimal design of experiments to assess and optimize the utility of a set of experimental measurements for application to reaction models.
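As a rough illustration of the regression step described above, the sketch below fits an Arrhenius diffusivity correlation, D = D0 exp(-Ea/(R T)), by linear regression of ln D against 1/T. The temperature-diffusivity values and the NumPy-based approach are illustrative assumptions only, not the dissertation's actual measurements or code.

```python
import numpy as np

# Hypothetical sketch: infer Arrhenius parameters D = D0 * exp(-Ea / (R * T))
# from a few assumed (temperature, diffusivity) pairs by fitting a line to
# ln(D) versus 1/T. Example values only.
R = 8.314  # gas constant, J/(mol K)

T = np.array([720.0, 760.0, 800.0, 860.0])          # temperatures, K
D = np.array([1.2e-18, 6.5e-18, 2.8e-17, 1.9e-16])  # assumed diffusivities, m^2/s

# ln(D) = ln(D0) - (Ea / R) * (1 / T)  ->  straight line in 1/T
slope, intercept = np.polyfit(1.0 / T, np.log(D), 1)
Ea = -slope * R          # activation energy, J/mol
D0 = np.exp(intercept)   # pre-exponential factor, m^2/s

print(f"Ea ~ {Ea / 1e3:.1f} kJ/mol, D0 ~ {D0:.2e} m^2/s")
```

A Bayesian treatment, as in the dissertation, would place priors on Ea and D0 and sample their posterior rather than report point estimates; the regression above is only the simplest analogue.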
Item Open Access Effective health communication - a key factor in fighting the COVID-19 pandemic. (Patient education and counseling, 2020-05) Finset, Arnstein; Bosworth, Hayden; Butow, Phyllis; Gulbrandsen, Pål; Hulsman, Robert L; Pieterse, Arwen H; Street, Richard; Tschoetschel, Robin; van Weert, Julia
Item Open Access Efficient Algorithms for Querying Large and Uncertain Data (2020) Sintos, Stavros
Query processing is an important problem in many research fields, including database systems, data mining, and geometric computing. The goal is to preprocess input data into an index to efficiently answer queries over the data. In many real applications, we need to handle large amounts of data and/or data that are ambiguous because of human errors and data integration. As datasets become increasingly large and complex, queries are also becoming complex; therefore, new challenging problems have emerged in the area of query processing.
Data summarization helps accelerate expensive data queries by running the query procedure on a small data summary rather than on the entire large dataset.
The first part of this thesis studies the problem of finding small high quality data summaries in the Euclidean space.
Data summarization can be applied either to the entire dataset or to the points in a query range given by the user.
Efficient algorithms are proposed to compute a small summary of the entire dataset so that top-$k$ or user-preference queries are answered efficiently, with high-quality guarantees, by looking only at the data summary.
Such a summary can also be used in multi-criteria decision problems.
Furthermore, near-linear space indexes are designed so that given a query rectangle, a diverse high-valued summary of $k$ points in the query range is returned efficiently.
The second part of the thesis focuses on data queries over uncertain data.
Uncertainty is typically captured using stochastic data models, and querying data requires either statistics about the probabilistic behavior of the underlying data, or data cleaning to reduce the uncertainty of the query answer.
Small size indexes are built so that, given a query rectangle, statistics of the MAX (or top-$k$) operator among the points in the query range are computed in sublinear time.
In addition, approximation algorithms are proposed for selecting data to clean in order to minimize the variance of the result of a query function over a probabilistic database.
While most of the methods hold for general functions, the main focus is on minimizing the variance in fact checking operations.
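To make the uncertain-data setting above concrete, here is a minimal sketch of the semantics of the MAX operator over independent uncertain points, each present with some existence probability. This naive linear pass is only an illustrative baseline with assumed data; the thesis's indexes answer such queries in sublinear time.

```python
from typing import List, Tuple

# Hypothetical sketch (not the thesis's index): exact distribution of MAX over
# independent uncertain points, each given as (value, existence_probability).
def max_distribution(points: List[Tuple[float, float]]):
    """Return [(value, P(MAX = value)), ...] and P(no point exists)."""
    pts = sorted(points, key=lambda vp: vp[0], reverse=True)
    dist = []
    none_larger = 1.0  # probability that no higher-valued point is present
    for value, p in pts:
        dist.append((value, none_larger * p))
        none_larger *= (1.0 - p)
    return dist, none_larger

dist, p_empty = max_distribution([(10.0, 0.5), (7.0, 0.9), (3.0, 0.8)])
expected_max = sum(v * pr for v, pr in dist)
print(dist, p_empty, expected_max)
```

The design point of the thesis is that such statistics can be maintained in small indexes so that an arbitrary query rectangle does not require scanning all points, unlike this brute-force pass.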
Item Open Access Evaluation of dosimetric uncertainty caused by MR geometric distortion in MRI-based liver SBRT treatment planning. (Journal of applied clinical medical physics, 2019-02) Han, Silu; Yin, Fang-Fang; Cai, Jing
PURPOSE: MRI-based treatment planning is a promising technique for liver stereotactic-body radiation therapy (SBRT) treatment planning to improve target volume delineation and reduce radiation dose to normal tissues. MR geometric distortion, however, is a source of potential error in MRI-based treatment planning. The aim of this study is to investigate dosimetric uncertainties caused by MRI geometric distortion in MRI-based treatment planning for liver SBRT. MATERIALS AND METHODS: The study was conducted using computer simulations. 3D MR geometric distortion was simulated using measured data in the literature. Planning MR images with distortions were generated by integrating the simulated 3D MR geometric distortion onto planning CT images. MRI-based treatment plans were then generated on the planning MR images with two dose calculation methods: (1) using original CT numbers; and (2) using organ-specific assigned CT numbers. Dosimetric uncertainties of various dose-volume-histogram parameters were determined as their differences between the simulated MRI-based plans and the original clinical CT-based plans for five liver SBRT cases. RESULTS: The average simulated distortion for the five liver SBRT cases was 2.77 mm. In the case of using original CT numbers for dose calculation, the average dose uncertainties for target volumes and critical structures were <0.5 Gy, and the average uncertainty in target volume percentage at the prescription dose was 0.97%. In the case of using assigned CT numbers, the average dose uncertainties for target volumes and critical structures were <1.0 Gy, and the average uncertainty in target volume percentage at the prescription dose was 2.02%. CONCLUSIONS: Dosimetric uncertainties caused by MR geometric distortion in MRI-based liver SBRT treatment planning were generally small (<1 Gy) when the distortion was 3 mm.
Item Open Access Incorporating Photogrammetric Uncertainty in UAS-based Morphometric Measurements of Baleen Whales (2021) Bierlich, Kevin Charles
Increasingly, drone-based photogrammetry has been used to measure size and body condition changes in marine megafauna. A broad range of platforms, sensors, and altimeters are being applied for these purposes, but there is no unified way to predict photogrammetric uncertainty across this methodological spectrum. As such, it is difficult to make robust comparisons across studies, disrupting collaborations amongst researchers using platforms with varying levels of measurement accuracy.
In this dissertation, I evaluate the major drivers of photogrammetric error and develop a framework to easily quantify and incorporate uncertainty associated with different UAS platforms. To do this, I take an experimental approach to train a Bayesian statistical model using a known-sized object floating at the water’s surface to quantify how measurement error scales with altitude for several different drones equipped with different cameras, focal length lenses, and altimeters. I then use the fitted model to predict the length distributions of unknown-sized humpback whales and assess how predicted uncertainty can affect quantities derived from photogrammetric measurements, such as the age class of an animal (Chapter 1). I also use the fitted model to predict body condition of blue whales, humpback whales, and Antarctic minke whales, providing the first comparison of how uncertainty scales across commonly used 1-, 2-, and 3-dimensional (1D, 2D, and 3D, respectively) body condition measurements (Chapter 2). This statistical framework jointly estimates errors from altitude and length measurements and accounts for altitudes measured with both barometers and laser altimeters while incorporating errors specific to each. The Bayesian statistical model outputs a posterior predictive distribution of measurement uncertainty around length and body condition measurements and allows for the construction of highest posterior density intervals to define measurement uncertainty. This allows one to make probabilistic statements and stronger inferences pertaining to morphometric features critical for understanding life history patterns and potential impacts from anthropogenically altered habitats. From these studies, I find that altimeters can greatly influence measurement predictions, with measurements using a barometer producing greater uncertainty than those using a laser altimeter, which can influence age classifications. I also find that while the different body condition measurements are highly correlated with one another, uncertainty does not scale linearly across 1D, 2D, and 3D body condition measurements, with 2D and 3D uncertainty increasing by factors of 1.44 and 2.14, respectively, compared to 1D measurements. I find that body area index (BAI) accounts for potential variation along the body for each species and is the most precise body condition measurement.
I then use the model to incorporate uncertainty associated with different drone platforms to measure how body condition (as BAI) changes over the course of the foraging season for humpback whales along the Western Antarctic Peninsula (Chapter 3). I find that BAI increases curvilinearly for each reproductive class, with rapid increases in body condition early in the season compared to later in the season. Lactating females had the lowest BAI, reflecting the high energetic costs of reproduction, whereas mature whales had the largest BAI, reflecting their high energy stores for financing the costs of reproduction on the breeding grounds. Calves also increased BAI as opposed to strictly increasing length, while immature whales may increase their BAI and commence an early migration by mid-season. These results set a baseline for monitoring this healthy population in the future as it faces potential impacts from climate change and anthropogenic stresses. This dissertation concludes with a best practices guide for minimizing, quantifying, and incorporating uncertainty associated with photogrammetry data. This work provides novel insights into how to obtain more accurate morphological measurements to help increase our understanding of how animals perform and function in their environment, as well as better track the health of populations over time and space.
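As a rough, hypothetical illustration of how altimeter error propagates into a photogrammetric length estimate (not the dissertation's Bayesian model), the sketch below converts pixels to meters with a standard ground-sample-distance formula and pushes altitude uncertainty through by Monte Carlo; all camera parameters and error magnitudes are assumed.

```python
import numpy as np

# Illustrative only: propagate altimeter error into a length estimate.
# length [m] = pixels * GSD, where GSD = altitude * sensor_width / (focal_length * image_width).
rng = np.random.default_rng(0)

pixels = 2450            # measured body length in image pixels (assumed)
focal_length_mm = 35.0
sensor_width_mm = 23.5
image_width_px = 6000

altitude_m = 40.0        # altimeter reading (assumed)
altitude_sd = 1.5        # typically larger for a barometer than a laser altimeter

alt_samples = rng.normal(altitude_m, altitude_sd, size=10_000)
gsd = alt_samples * sensor_width_mm / (focal_length_mm * image_width_px)  # m per pixel
lengths = pixels * gsd

lo, hi = np.percentile(lengths, [2.5, 97.5])
print(f"length ~ {lengths.mean():.2f} m (95% interval {lo:.2f}-{hi:.2f} m)")
```

The dissertation's framework goes further by jointly modeling altitude and pixel-measurement error and returning full posterior predictive distributions, but the basic altitude-to-length sensitivity is the same as in this toy example.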
Item Open Access Linked Sensitivity Analysis, Calibration, and Uncertainty Analysis Using a System Dynamics Model for Stroke Comparative Effectiveness Research. (Medical decision making : an international journal of the Society for Medical Decision Making, 2016-11) Tian, Yuan; Hassmiller Lich, Kristen; Osgood, Nathaniel D; Eom, Kirsten; Matchar, David B
Background:
As health services researchers and decision makers tackle more difficult problems using simulation models, the number of parameters and the corresponding degree of uncertainty have increased. This often results in reduced confidence in such complex models to guide decision making.
Objective:
To demonstrate a systematic approach of linked sensitivity analysis, calibration, and uncertainty analysis to improve confidence in complex models.
Methods:
Four techniques were integrated and applied to a System Dynamics stroke model of US veterans, which was developed to inform systemwide intervention and research planning: Morris method (sensitivity analysis), multistart Powell hill-climbing algorithm and generalized likelihood uncertainty estimation (calibration), and Monte Carlo simulation (uncertainty analysis).
Results:
Of 60 uncertain parameters, sensitivity analysis identified 29 needing calibration, 7 that did not need calibration but significantly influenced key stroke outcomes, and 24 that were not influential to calibration or stroke outcomes and were fixed at their best-guess values. One thousand alternative well-calibrated baselines were obtained to reflect calibration uncertainty and brought into uncertainty analysis. The initial stroke incidence rate among veterans was identified as the most influential uncertain parameter, for which further data should be collected. That said, accounting for current uncertainty, the analysis of 15 distinct prevention and treatment interventions provided a robust conclusion that hypertension control for all veterans would yield the largest gain in quality-adjusted life years.
Conclusions:
For complex health care models, a mixed approach was applied to examine the uncertainty surrounding key stroke outcomes and the robustness of conclusions. We demonstrate that this rigorous approach can be practical and advocate for such analysis to promote understanding of the limits of certainty in applying models to current decisions and to guide future data collection.
Item Open Access Maladaptive Rule-Governed Behavior in Anorexia Nervosa: The Need for Certainty and Control (2014) Moskovich, Ashley A.
Anorexia nervosa (AN) is a dangerous disorder characterized by unrelenting rigidity that continues even in the presence of deadly outcomes. Despite this, our understanding of factors that promote and maintain rigidity is lacking. The current paper proposes a model suggesting that rigid behaviors in AN can be formulated as maladaptive rule-governed behavior that emerges in contexts of uncertainty and loss of control, such as in the presence of affective arousal. An empirical study examining the differences between individuals weight-recovered from AN (AN-WR) and healthy controls (CN) on parameters of rule-governed behavior in neutral and stressful contexts is described. Seventy-four adults (AN-WR: 36; CN: 38) were randomized to undergo either a stressful or neutral mood manipulation and then completed a laboratory assessment of rule-governed behavior, along with questionnaires measuring difficulties with uncertainty. While the AN-WR group demonstrated greater flexibility in rule implementation compared to the CN group, they evidenced greater impairment in behavioral extinction. Furthermore, although affective arousal did not significantly impact rule-governed behavior as expected, difficulties tolerating uncertainty were significantly related to rule-governed outcomes exclusively in the AN-WR group. Taken together, findings provide preliminary support for maladaptive rule-governed behavior in AN and suggest that this is related to an intolerance of uncertainty. Findings and treatment implications are discussed in light of study limitations.
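To make the uncertainty-analysis step in the stroke comparative-effectiveness abstract above (Linked Sensitivity Analysis, Calibration, and Uncertainty Analysis) more concrete, here is a minimal, hypothetical sketch of Monte Carlo uncertainty propagation: sample uncertain parameters, run a model for each draw, and summarize the spread of a key outcome. The toy model, parameter ranges, and numbers are all assumed and bear no relation to the actual veterans stroke model.

```python
import numpy as np

# Hedged sketch of Monte Carlo uncertainty analysis with a toy stand-in model.
rng = np.random.default_rng(42)
n_draws = 1000

def toy_model(incidence_rate, treatment_effect):
    """Toy outcome: QALYs gained by an intervention (illustrative only)."""
    baseline_strokes = incidence_rate * 9_000_000 * 20  # events over 20 years (assumed population)
    return baseline_strokes * treatment_effect * 0.8     # assumed QALYs per averted event

incidence = rng.uniform(0.002, 0.004, n_draws)              # uncertain incidence rate
effect = rng.normal(0.05, 0.01, n_draws).clip(0, None)      # uncertain relative reduction

qalys = np.array([toy_model(i, e) for i, e in zip(incidence, effect)])
print(f"median QALYs gained: {np.median(qalys):,.0f}")
print(f"95% interval: {np.percentile(qalys, 2.5):,.0f} - {np.percentile(qalys, 97.5):,.0f}")
```

In the published analysis, this step sits downstream of sensitivity screening (Morris method) and calibration, so the sampled parameter distributions reflect calibrated uncertainty rather than arbitrary ranges like the ones above.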
Item Open Access Multiple models for outbreak decision support in the face of uncertainty. (Proceedings of the National Academy of Sciences of the United States of America, 2023-05) Shea, Katriona; Borchering, Rebecca K; Probert, William JM; Howerton, Emily; Bogich, Tiffany L; Li, Shou-Li; van Panhuis, Willem G; Viboud, Cecile; Aguás, Ricardo; Belov, Artur A; Bhargava, Sanjana H; Cavany, Sean M; Chang, Joshua C; Chen, Cynthia; Chen, Jinghui; Chen, Shi; Chen, YangQuan; Childs, Lauren M; Chow, Carson C; Crooker, Isabel; Del Valle, Sara Y; España, Guido; Fairchild, Geoffrey; Gerkin, Richard C; Germann, Timothy C; Gu, Quanquan; Guan, Xiangyang; Guo, Lihong; Hart, Gregory R; Hladish, Thomas J; Hupert, Nathaniel; Janies, Daniel; Kerr, Cliff C; Klein, Daniel J; Klein, Eili Y; Lin, Gary; Manore, Carrie; Meyers, Lauren Ancel; Mittler, John E; Mu, Kunpeng; Núñez, Rafael C; Oidtman, Rachel J; Pasco, Remy; Pastore Y Piontti, Ana; Paul, Rajib; Pearson, Carl AB; Perdomo, Dianela R; Perkins, T Alex; Pierce, Kelly; Pillai, Alexander N; Rael, Rosalyn Cherie; Rosenfeld, Katherine; Ross, Chrysm Watson; Spencer, Julie A; Stoltzfus, Arlin B; Toh, Kok Ben; Vattikuti, Shashaank; Vespignani, Alessandro; Wang, Lingxiao; White, Lisa J; Xu, Pan; Yang, Yupeng; Yogurtcu, Osman N; Zhang, Weitong; Zhao, Yanting; Zou, Difan; Ferrari, Matthew J; Pannell, David; Tildesley, Michael J; Seifarth, Jack; Johnson, Elyse; Biggerstaff, Matthew; Johansson, Michael A; Slayton, Rachel B; Levander, John D; Stazer, Jeff; Kerr, Jessica; Runge, Michael C
Policymakers must make management decisions despite incomplete knowledge and conflicting model projections. Little guidance exists for the rapid, representative, and unbiased collection of policy-relevant scientific input from independent modeling teams. Integrating approaches from decision analysis, expert judgment, and model aggregation, we convened multiple modeling teams to evaluate COVID-19 reopening strategies for a mid-sized United States county early in the pandemic. Projections from seventeen distinct models were inconsistent in magnitude but highly consistent in ranking interventions. The 6-mo-ahead aggregate projections were well in line with observed outbreaks in mid-sized US counties. The aggregate results showed that up to half the population could be infected with full workplace reopening, while workplace restrictions reduced median cumulative infections by 82%. Rankings of interventions were consistent across public health objectives, but there was a strong trade-off between public health outcomes and duration of workplace closures, and no win-win intermediate reopening strategies were identified. Between-model variation was high; the aggregate results thus provide valuable risk quantification for decision making. This approach can be applied to the evaluation of management interventions in any setting where models are used to inform decision making.
This case study demonstrated the utility of our approach and was one of several multimodel efforts that laid the groundwork for the COVID-19 Scenario Modeling Hub, which has provided multiple rounds of real-time scenario projections for situational awareness and decision making to the Centers for Disease Control and Prevention since December 2020.
Item Open Access Organizational Capital Budgeting Model (OCBM) (2009) Kang, Hyoung Goo
Organizational Capital Budgeting Model (OCBM) is a general theory of capital budgeting that incorporates traditional capital budgeting theories together with consideration of a firm's information and organization structure. The traditional financial capital budgeting model is a special case of OCBM. Therefore, OCBM not only broadens the traditional model, but also explains the heterogeneous behaviors of firms using quasi-financial and non-financial versions of capital budgeting. I demonstrate the validity of OCBM with multiple research methods. The field studies of Asian conglomerates are carefully constructed. The conglomerates are an important dataset for studying organizational decision making because of their size, scope, controversial behaviors, and global presence.
Item Open Access Statistical Issues in Quantifying Text Mining Performance (2017) Chai, Christine Peijinn
Text mining is an emerging field in data science because text information is ubiquitous, but analyzing text data is much more complicated than analyzing numerical data. Topic modeling is a commonly-used approach to classify text documents into topics and identify key words, so that the text information of interest is distilled from the large corpus. In this dissertation, I investigate various statistical issues in quantifying text mining performance, and Chapter 1 is a brief introduction.
Chapter 2 is about the adequate pre-processing for text data. For example, words of the same stem (e.g. "study" and "studied") should be assigned the same token because they share the exact same meaning. In addition, specific phrases such as "New York" and "White House" should be retained because many topic classification models focus exclusively on words. Statistical methods, such as conditional probability and p-values, are used as an objective approach to discover these phrases.
Chapter 3 starts the quantification of text mining performance; this measures the improvement of topic modeling results from text pre-processing. Retaining specific phrases increases their distinctiveness because the "signal" of the most probable topic becomes stronger (i.e., the maximum probability is higher) than the "signal" generated by either of the two words separately. Therefore, text pre-processing helps recover semantic information at the word level.
Chapter 4 quantifies the uncertainty of a widely-used topic model, latent Dirichlet allocation (LDA). A synthetic text dataset was created with known topic proportions, and I tried several methods to determine the appropriate number of topics from the data. Currently, the pre-set number of topics is important to the topic model results because LDA tends to utilize all topics allotted, so that each topic has about equal representation.
Last but not least, Chapter 5 explores a few selected text models as extensions, such as supervised latent Dirichlet allocation (sLDA), survey data application, sentiment analysis, and the infinite Gaussian mixture model.
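For a sense of the conditional-probability idea mentioned in Chapter 2 above, the sketch below scores a candidate bigram by how much P(second word | first word) exceeds the word's overall frequency; a large lift suggests retaining the pair as a single phrase token. The toy corpus, the function name, and the omission of a formal p-value test are illustrative simplifications, not the dissertation's actual procedure.

```python
from collections import Counter

# Hypothetical sketch: flag a bigram such as "new york" as a phrase when
# P(second word | first word) greatly exceeds the overall P(second word).
tokens = ("we visited new york and then new york again before flying to a new "
          "airport near york county").split()

unigrams = Counter(tokens)
bigrams = Counter(zip(tokens, tokens[1:]))
n = len(tokens)

def phrase_lift(w1, w2):
    p_w2 = unigrams[w2] / n                            # marginal P(w2)
    p_w2_given_w1 = bigrams[(w1, w2)] / unigrams[w1]   # conditional P(w2 | w1)
    return p_w2_given_w1 / p_w2                        # >> 1 suggests a collocation

print(phrase_lift("new", "york"))  # on a realistic corpus, true phrases score far above 1
```

On a realistically sized corpus, the same ratio (or an associated significance test) separates genuine collocations from incidental word pairs far more sharply than this tiny example can.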
Item Open Access Strategic planning to reduce the burden of stroke among veterans: using simulation modeling to inform decision making. (Stroke, 2014-07) Lich, Kristen Hassmiller; Tian, Yuan; Beadles, Christopher A; Williams, Linda S; Bravata, Dawn M; Cheng, Eric M; Bosworth, Hayden B; Homer, Jack B; Matchar, David B
BACKGROUND AND PURPOSE: Reducing the burden of stroke is a priority for the Veterans Affairs Health System, reflected by the creation of the Veterans Affairs Stroke Quality Enhancement Research Initiative. To inform the initiative's strategic planning, we estimated the relative population-level impact and efficiency of distinct approaches to improving stroke care in the US Veteran population to inform policy and practice. METHODS: A System Dynamics stroke model of the Veteran population was constructed to evaluate the relative impact of 15 intervention scenarios, including both broad and targeted primary and secondary prevention and acute care/rehabilitation, on cumulative (20-year) outcomes including quality-adjusted life years (QALYs) gained, strokes prevented, stroke fatalities prevented, and the number-needed-to-treat per QALY gained. RESULTS: At the population level, a broad hypertension control effort yielded the largest increase in QALYs (35,517), followed by targeted prevention addressing hypertension and anticoagulation among Veterans with prior cardiovascular disease (27,856) and hypertension control among diabetics (23,100). Adjusting QALYs gained by the number of Veterans needed to treat, thrombolytic therapy with tissue-type plasminogen activator was most efficient, needing 3.1 Veterans to be treated per QALY gained. This was followed by rehabilitation (3.9) and targeted prevention addressing hypertension and anticoagulation among those with prior cardiovascular disease (5.1). Probabilistic sensitivity analysis showed that the ranking of interventions was robust to uncertainty in input parameter values. CONCLUSIONS: Prevention strategies tend to have larger population impacts, though interventions targeting specific high-risk groups tend to be more efficient in terms of number-needed-to-treat per QALY gained.
Item Open Access The Role of Energy Models: Characterizing the Uncertainty of the Future Electricity System to Design More Efficient and Effective Laws and Regulations (2017-05-01) Righetti, Tara Kathleen; Godby, Robert; Echeverri, Dalia Patino; Stoellinger, Temple; Coddington, Kipp Andrew
Item Open Access The view for cord blood is "cup half full" not "cup half empty". (Stem cells translational medicine, 2020-10) Kurtzberg, Joanne
Item Open Access Uncertainty Shocks, Asset Supply and Pricing over the Business Cycle (The Review of Economic Studies, 2018) Bianchi, Francesco; Ilut, Cosmin L; Schneider, Martin
This article estimates a business cycle model with endogenous financial asset supply and ambiguity-averse investors. Firms’ shareholders choose not only production and investment, but also capital structure and payout policy subject to financial frictions. An increase in uncertainty about profits lowers stock prices and leads firms to substitute away from debt as well as reduce shareholder payout. This mechanism parsimoniously accounts for the postwar comovement in investment, stock prices, leverage, and payout, at both business cycle and medium-term cycle frequencies.
Ambiguity aversion permits a Markov-switching VAR representation of the model, while preserving the effect of uncertainty shocks on the time variation in the equity premium.
Item Open Access Uncertainty, Policy, and the Risk of New Nuclear Build—a Real Options Approach (2010-04-30T18:32:41Z) O'Connor, Patrick
The United States has recently seen renewed interest in nuclear power, in what is called the Nuclear Renaissance. However, the new licensing processes are untested and the new reactor designs have never been constructed on US soil. Analyzing the history of US nuclear development demonstrates that plants face considerable risk from construction uncertainties, public intervention in the licensing process, and project mismanagement. When these unknowns are coupled with the industry’s poor cost track record, the resulting set of uncertainties and risks may cause investors to be wary of pursuing new nuclear projects. Real Options valuation was used to assess how the risks associated with the uncertainties in the environment for nuclear power could impact the economics of new plants. To value a new nuclear power plant, a decision model was developed incorporating construction, regulatory, and operational uncertainties along with an option to abandon project development. Various policy and uncertainty scenarios were modeled, and a conservative policy goal was developed as an achievable end point for the current levels of subsidy. The results suggest that without subsidy, the first new plants in the United States are economically unattractive in liberalized electricity markets. Subsidized plants have positive investment value, but this value is still marginal. However, cost reductions from standardization and learning could add between $200 and $600 per kilowatt in project value. Additionally, alternative incentive policies and market-based greenhouse gas regulations both considerably improve the economics of new nuclear plants.
Item Open Access Volatility and Uncertainty in Environmental Policy (2013) Maniloff, Peter
Environmental policy is increasingly implemented via market mechanisms. While this is in many ways a great success for the economics profession, a number of questions remain. In this dissertation, I empirically explore what will happen as environmental outcomes are coupled to potentially volatile market phenomena, whether policies can insulate environmental outcomes from market shocks, and whether policymakers should act to mitigate such volatility. I use a variety of empirical methods, including reduced-form and structural econometrics as well as theoretical models, to consider a variety of policy, market, and institutional contexts. The effectiveness of market interventions depends on the context and on the policy mechanism. In particular, energy markets are characterized by low demand elasticities and kinked supply curves, which are very flat below a capacity constraint (elastic) and very steep above it (inelastic). This means that a quantity-based policy that acts on demand, such as releasing additional pollution emission allowances from a reserved fund, would be an effective way to constrain price shocks in a cap-and-trade system. However, a quantity-based policy that lowers the need for inframarginal supply, such as using ethanol as an oil product substitute to mitigate oil shocks, would be ineffective. Similarly, the benefits of such interventions depend on the macroeconomic impacts of price shocks from the sector.
Relatedly, I show that a liability rule designed to reduce risk from low-probability, high-consequence oil spills has very low compliance costs.
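As a purely illustrative footnote to the real-options analysis described in the nuclear new-build abstract above, the sketch below contrasts a commit-regardless expected NPV with one that includes an option to abandon an unprofitable project; the cost and revenue distributions are invented for demonstration and are not drawn from that study.

```python
import numpy as np

# Hedged toy example of real-options logic: flexibility to abandon has value
# when outcomes are uncertain. All figures are assumed.
rng = np.random.default_rng(7)
n_sims = 50_000

operating_value = rng.normal(4_000, 800, n_sims)               # $M, PV of future net revenues
construction_cost = rng.lognormal(np.log(3_500), 0.3, n_sims)  # $M, uncertain build cost

# Without flexibility: commit regardless of how costs and revenues turn out.
npv_commit = operating_value - construction_cost

# With an abandonment option: walk away (NPV = 0, ignoring sunk licensing costs)
# whenever the project would destroy value.
npv_option = np.maximum(npv_commit, 0.0)

print(f"expected NPV, no flexibility: {npv_commit.mean():.0f} $M")
print(f"expected NPV, with abandonment option: {npv_option.mean():.0f} $M")
print(f"implied option value: {npv_option.mean() - npv_commit.mean():.0f} $M")
```

The study itself models the abandonment decision within a multi-stage licensing and construction process and layers policy scenarios on top; this snippet only shows why the option to abandon adds value under uncertainty.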