Browsing by Subject "Information technology"
Item Open Access: A Mobile Health Intervention to Sustain Recent Weight Loss (2012), Shaw, Ryan J.
Background: Obesity is the number one health risk facing Americans. The obesity epidemic in America is attributed to physical inactivity, unhealthy food choices, and excessive food intake. Structured weight loss programs have been successful in initiating behavior change and weight loss; however, weight is almost always regained over time. The rate of weight gain is highest immediately after cessation of a structured weight loss program. Thus, effective interventions are needed that can successfully be used following a structured weight loss program to sustain weight loss and prevent weight relapse. Due to low cost, ubiquity, and ease of use, healthcare communicated through mobile technology, or "mHealth", may serve as an effective medium for reaching a large number of people to facilitate weight loss behaviors. Short message service (SMS), also known as text messaging, is easy to use, ubiquitous, affordable, and can reach people directly wherever they are, regardless of geographic location, socioeconomic status, or demographic factors. A review of the literature found limited information regarding message content and the timing and frequency of message delivery, and only 3 of 14 SMS-related interventions reviewed demonstrated a statistically significant effect on weight loss, diet, or exercise. Information on how to integrate and leverage SMS as a health promotion tool for weight loss was also limited in the literature.
The Behavior Change Process model was used as a guide to understand how to develop an intervention that helps people sustain recent weight loss. Furthermore, research suggests that interventions that target and frame messages about how people reach goals in their lives through either a prevention or a promotion focus may be beneficial in motivating people to self-regulate and sustain recent behavioral changes. The goal of this study was to design an intervention that would help people stay in the continued response phase of the Behavior Change Process and help prevent weight relapse. Using the Behavior Change Process and regulatory focus theory, an intervention was developed that leveraged short message service (SMS) to deliver messages to people who had recently lost weight, in an attempt to help them sustain weight loss and prevent relapse.
Methods: First, a pilot study was conducted to inform the development of an SMS software application, the message content, and the frequency and timing of message delivery. Second, an exploratory 3-arm mixed methods randomized controlled trial was conducted to test the feasibility, acceptability, perceived usefulness, and efficacy of a weight-loss-sustaining mHealth SMS intervention among people with obesity. Participants (N=120) were randomized to a promotion message group, a prevention message group, or an attention-control general health message group. Participants completed baseline assessments and reported their weight at 1 and 3 months post-baseline to assess the efficacy of the intervention in sustaining weight loss. In addition, participants took part in a phone interview following completion of the intervention to assess acceptability and usefulness.
Results: Participants found the message content and intervention acceptable, and a majority perceived value in receiving messages via SMS that promote weight-loss-sustaining behaviors. Interview data suggested that the intervention served as a reminder and daily cue to action. Participants were favorable toward receiving a daily reminder, which they noted helped them stay focused and, in some cases, kept them motivated to continue losing weight. A majority of those who completed a telephone interview (42; 91%) said that they preferred to receive messages on their cell phone because of its accessibility and convenience. One message per day, delivered at approximately 8:00 A.M., was deemed the optimal frequency and delivery time. This was particularly true for weight loss, which many participants described as a daily struggle that begins every morning. With regard to sustaining weight loss, there was a statistical trend toward sustained weight loss at months 1 and 3 in the promotion- and prevention-framed message groups compared with the control group in both the intent-to-treat and evaluable-case analyses. Clinically, there was a significant decrease in mean weight of approximately 5 pounds at month 3 in the promotion and prevention groups compared with the control group. Additionally, effect sizes indicated a large effect of the intervention on sustaining weight loss in the promotion and prevention groups relative to the control group.
Conclusion: Overall, results showed that at the continued response phase of the Behavior Change Process, it was feasible to design an application to deliver promotion- and prevention-framed weight-loss-sustaining messages. In particular, prevention-framed messages may have been more useful in helping participants sustain weight loss. Though the study had less than 80% power to detect a statistically significant difference, the observed effect sizes demonstrated a large effect of the promotion and prevention interventions on sustaining weight loss relative to control. Furthermore, there was a clinically significant increase in mean weight loss and in the number of people who sustained weight loss in the promotion and prevention intervention groups compared with control.
These findings may serve as a reference for future interventions designed to help people thwart relapse and transition from a state of sustaining recent weight loss behaviors to a state of maintenance. Technological tools such as this SMS intervention that are constructed and guided by evidence-based content and theoretical constructs show promise in helping people sustain healthy behaviors that can lead to improved health outcomes.
Item Open Access: An Information Systems Strategy for the Environmental Conservation Community (2008), Barker, Kristin
As the cause of environmental conservation emerges as a global priority, the need for a practical information systems strategy shared among conservation organizations becomes imperative. Historically, researchers and practitioners in conservation have met their own information management and analysis needs, with inevitable variation in methodology, semantics, data formats, and quality. Consequently, conservation organizations have been unable to systematically assess conditions and set informed priorities at various scales, measure the performance of their projects, and improve practices through adaptive management. Moreover, the demands on conservation are changing such that the bottom-up approach to information systems will become an increasing constraint on effective environmental problem solving. Where conservation has historically focused on the protection of “important” places and species and more recently “biodiversity,” it is moving to a systems view, specifically ecosystem-based management, where relationships and processes are as important as the individual elements. In parallel, awareness of human dependency on functioning natural systems is on the rise, and with it the need to explicitly value ecosystem services and inform tradeoffs. Climate change requires conservation to develop dynamic adaptation scenarios at multiple spatial and temporal scales. Finally, the business of conservation is under increased pressure to account for its spending and objectively measure the outcomes of its strategies. All of these changes translate to growing, not shrinking, demands on information and information systems. In response to these challenges, this research presents an information systems strategy for the environmental conservation community.
It proposes the development of a distributed systems infrastructure with end-user tools and shared services that support standardized datasets. Key strategies include removing the barriers to information sharing, providing valuable tools to data producers, and directly supporting heterogeneity in conservation datasets. The strategy concludes with a call for high-level management involvement in information systems strategy and collaborative investment in implementation by the conservation community, partners in government, and donors. Without these steps, conservation as an industry may find itself ill-equipped to meet the changing needs of people and nature.
Item Open Access: Design, Optimization and Test Methods for Robust Digital Microfluidic Biochips (2020), Zhong, Zhanwei
Microfluidic biochips are now being used for biochemical applications such as high-throughput DNA sequencing, point-of-care clinical diagnostics, and immunoassays. In particular, digital microfluidic biochips (DMFBs) are especially promising. They manipulate liquid as discrete droplets of nanoliter or picoliter volumes based on the principle of electrowetting-on-dielectric under voltage-based electrode actuation. DMFBs have been commercially adopted for sample preparation and clinical diagnostics. Techniques have also been developed for high-level synthesis, module placement, and droplet routing.
However, reliability is a major concern in the use of DMFBs for laboratory protocols. In addition to manufacturing defects and imperfections, faults can also arise during a bioassay. For example, excessive or prolonged actuation voltage may lead to electrode breakdown and charge trapping, and DNA fouling may lead to the malfunction of electrodes. Faults may eventually result in errors in droplet operations. If an unexpected error appears during an experiment, the outcome of the experiment will be incorrect. The repetition of an experiment leads to wastage of valuable reagents and time.
Therefore, it is necessary to ensure the correctness of both the hardware and bioassay execution on the biochip. In this thesis, we focus on three aspects of reliability: biochip testing, error/fault recovery, and fault-tolerant synthesis. First, when a biochip is fabricated, defects might occur in parts of the biochip; our objective is therefore to develop biochip testing methods to detect and locate faults. Second, to handle faults that appear during droplet operations or in the hardware, we develop error-recovery procedures and redundancy solutions. Finally, we develop fault-tolerant synthesis techniques so that even if faults occur during droplet operations (e.g., unbalanced splitting), the bioassay can proceed unimpeded. The proposed solutions are applied to two new types of biochip platforms, namely micro-electrode-dot-array (MEDA) biochips and digital acoustofluidics.
Item Embargo: Ethics of Artificial Intelligence, Robotics and Supra-Intelligence (2020), Kasbe, Timothy D.
All things were created by Him and for Him:
Ethics of Artificial Intelligence, Robotics and Supra-Intelligence
Fascination with automation has captured the human imagination for thousands of years. As far back as 800 CE, when Baghdad was at its height as one of the world’s most cultured cities, its House of Wisdom produced a remarkable text, “The Book of Ingenious Devices.” In it were beautiful schematic drawings of machines years ahead of anything in Europe—clocks, hydraulic instruments, even a water-powered organ with swappable pin-cylinders that was effectively a programmable device.
The fascination with automation has come a long way since then. Technological advancements in the last seventy years have provided unprecedented opportunities for humans to explore not only automation, but now also the creation of intelligent and superintelligent machines. These machines promise to mimic human qualities and even supersede humanity in every manner of task and intelligence. The explosion of, and ready access to, information through the internet has proved to be challenging in some regards but has also eased other aspects of life. An example of this would be the way long-lost friends can be reunited through the click of a mouse. Similarly, news accompanied by pictures and videos is now readily available in real-time. These conveniences have also brought unintended consequences. Despite this newfound connectivity, social challenges such as loneliness and suicide are on the rise. Technology has also opened the door to problems such as cyberbullying, election manipulation, and fake news. Information, whether it be accurate or not, spreads across the world at unprecedented speeds, carrying with it change, sometimes for the better, but not always. This is all happening before the anticipated age of superintelligence.
This thesis examines the distinct nature of humanity and God in view of the emergence of superintelligence. Can we see this “new creation” as an addition to God’s creation of humans, angels, and Satan? If that be the case, then questions of ethics and theology need to be addressed. For instance, who gets to program these new superintelligent “beings?” As things stand today, the individuals and corporations with the deepest pockets are racing to be the first to produce superintelligent beings. The so-called “technology horse” has already bolted, with government policy struggling to keep up. Unseen in this race is the prophetic and ethical voice of the church, regarding the meaning of life, and what living in this new reality will look like.
More questions are raised than can be answered in this paper. How does the Church stay true to its message of hope in a world where robots will likely take over everyday jobs? Where will humanity find meaning and contentment? What are we to think about the idea of a universal basic wage? How will such a shift impact migrants and the poor? In this paper I establish a framework for the church to consider different aspects of these challenges, even as people are welcomed weekly into the community of faith.
This thesis represents extensive research into the philosophy and practice of safety engineering, paired with personal experiences as a professional in the technology industry who is also deeply committed to being a disciple of Christ. Primary works I have drawn from extensively include Hauerwas and Wells’ Blackwell Companion to Christian Ethics, and Jungian archetypes in comparing and contrasting biological beings to technological creations. The paper starts with creation accounts from Genesis and the Enuma Elish as a way of exploring the “being” category as it appears on this planet. Personal insights gained working in both enterprise and startup businesses, as well as in my own professional development, have contributed to this work and may be found throughout. This thesis represents a labor of love through which I have learned a great deal about my own profession and faith. However, it is my sincere hope that it will be much more. Through this dissertation I hope to see companies both big and small taking note of the ethical issues discussed here, even as they find themselves unleashing artificial intelligence in the marketplace. At the same time, I expect churches and religious organizations will benefit from this discussion and will, I hope, move to engage more deeply with culture and the marketplace as new opportunities and risks emerge from the implementation of artificial intelligence. If the observations that I have made and the recommendations that I have set forth can inspire even one person to carefully examine his or her identity in Christ, then this work will be successful beyond its original purpose as an academic work.
Item Open Access: Making Sense of Health Information Technology (2012), Kitzmiller, Rebecca Rutherford
Background: Hospital adoption of health information technology (HIT) systems is promoted as essential to decreasing medical errors and their associated 44,000 annual deaths and $17 billion in healthcare costs (Institute of Medicine, 2001; Kohn, Corrigan, & Donaldson, 1999). Leading national healthcare groups, such as the Institute of Medicine, the Agency for Healthcare Research and Quality, the Institute for Healthcare Improvement, and the Leapfrog Group, continue to advocate for increased use of HIT (AHRQ, 2010; Beidler, 2010; Institute of Medicine, 2001; Page, 2003; The Leapfrog Group, 2009), such as provider order entry and electronic health record systems, as a way to improve healthcare quality in hospitals. Even under intense pressure to adopt HIT, however, a mere 2% of US hospitals report having a comprehensive electronic health record system. Further, more than 50% of US hospitals have only rudimentary HIT systems (Jha et al., 2009). With the ARRA HITECH Act of 2009, the pressure on hospitals to quickly adopt HIT and achieve meaningful use is mounting.
While a large body of literature exists about HIT implementation, the majority consists of anecdotal case reports. The remaining studies investigated attitudes about HIT or the impact of HIT on patient care processes and outcomes. Thus, the best strategies for implementing HIT in hospitals remain unknown. Study design choices, such as the use of self-report data, retrospective data collection methods, and subjects drawn from single care units or single healthcare professions, further limit our understanding of HIT implementation in complex hospital care settings.
Methods: This prospective, longitudinal case study used a novel approach, sensemaking, to understand how project teams work to implement HIT in an academic medical center. Sensemaking, defined as the social process of establishing the meaning of events and experiences (Weick, 1995), is associated with learning and problem-solving in research studies of healthcare and non-healthcare settings. Through direct observation and document review, I observed project team social interaction and activities over the course of the 18-month pre-implementation phase of an HIT implementation project in a single tertiary care hospital.
Conclusions: In this study, I described team actions and activities that enhanced clinician team member sensemaking, including frequent, collective interaction with HIT and focusing team members' attention on specific aspects of HIT function. Further, study findings demonstrated that team members' perceptions of HIT and care processes varied across healthcare professions, management levels, and departments. Supportive social interaction from team leaders and members encouraged team member participation and resulted in members voicing observations, perceptions, and attitudes about the HIT and hospital care processes. Sensemaking by HIT teams not only resulted in the identification of needed HIT design changes, but also revealed assumptions and information that may prove critical to successful HIT implementation in hospital care environments. Based on study findings, I suggested strategies for selecting and preparing HIT team members, as well as for HIT team activities. This study advances our understanding of how project teams function and bring about change in complex hospital care environments, both by identifying HIT implementation issues and by describing the link between team member social interaction and implementation actions.
Item Open Access: Materializing Depths: The Potential of Contemporary Art and Media (2016), Choi, Jung Eun
This dissertation argues that critical practices in the expanded field of art, technology, and space illustrate the potential of twenty-first century media by materializing depths of our experiential dimensions. Scholarship on digital embodiment and materialism in art, media studies, and aesthetics has paid much attention to the central role played by the human body in contemporary media environments. Grounded in these studies, however, this study moves forward to understand the more fundamental quality that grounds and conditions the experience of the human body—namely, depth.
Drawing on diverse disciplines, such as art history, visual studies, media studies, critical theory, phenomenology, and aesthetics, this study provides a reconstruction of the notion of depth to unpack the complex dimensionality of human experiences that are solicited by different critical spatial practices. As a spatial medium that produces the body subject and the world through the process of intertwining, depth points to an environmental affordance that prepares or conditions the ways in which the body processes the information in the world. The dimension of depth is not available to natural human perception. However, incorporating twenty-first century media that are seamlessly embedded in physical environments, critical spatial practices sensibly materialize the virtual dimensions of depth by animating space in a way that is different from the past.
This dissertation provides comprehensive analyses of these critical spatial practices by artists who create constructed situations that bring the experiential dimensions of depth to the fore. The acknowledgement of depth allows us to understand the spatialities of bodies and their implication in the vaster worldly spatiality. In doing so, this study attends to major contemporary philosophical and aesthetic challenges by reframing the body as the locus of subjectivity that is always interdependent upon broader sociocultural and technological environments.
Item Open Access: Single Image Super Resolution: Perceptual Quality & Test-time Optimization (2019), Chen, Lei
Image super resolution is defined as recovering a high-resolution image from a given low-resolution input. It has a wide range of applications in modern digital image processing, producing better results in areas including satellite imaging, medical imaging, microscopy, astronomical imaging, and surveillance. However, image super resolution is an ill-posed problem: many distinct high-resolution images are consistent with the same low-resolution input, making it difficult to find the optimal solution.
In this work, various research directions in the area of single image super resolution are studied in depth. The achievements and limitations of each of the proposed methods, including computational efficiency and limits on perceptual performance, are compared. The main contributions of this work include implementing a perceptual score predictor and integrating it into the objective function of the upsampler algorithm. In addition, a test-time optimization algorithm is proposed, aiming to further enhance the quality of the super-resolution image obtained from any upsampler. The proposed methods are implemented and tested using PyTorch, and results are compared on standard benchmark datasets including Set5, Set14, Urban100, and DIV2K.
Results from the perceptual score predictor were evaluated on both the PSNR precision index and the perceptual index, a combination of the perceptual evaluation Ma score and the NIQE score. With the new objective function, the upsampler was able to move along the trade-off curve between precision and perception. The test-time optimization algorithm achieved slight improvements in both the precision and perception indices. Note that the proposed test-time optimization does not require training a new neural network and is therefore computationally efficient.
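For context on the evaluation above: the perceptual index combining the Ma score and NIQE is conventionally computed (as in the PIRM 2018 super-resolution challenge) as an equal-weight average of the two no-reference measures. The sketch below assumes the two scores have already been produced by their respective estimators; the function name is illustrative, not taken from the dissertation.

```python
def perceptual_index(ma_score: float, niqe_score: float) -> float:
    """Combine two no-reference quality measures into a single perceptual
    index, following the PIRM 2018 convention. Ma's score is
    higher-is-better on a roughly 0-10 scale; NIQE is lower-is-better;
    the combined index is lower-is-better."""
    return 0.5 * ((10.0 - ma_score) + niqe_score)

# A perceptually strong result (high Ma, low NIQE) yields a low index,
# while a weak one yields a high index.
good = perceptual_index(ma_score=8.5, niqe_score=3.0)  # 0.5 * (1.5 + 3.0) = 2.25
poor = perceptual_index(ma_score=5.0, niqe_score=7.0)  # 0.5 * (5.0 + 7.0) = 6.0
```

Because both PSNR and this index are reported, moving along the precision-perception trade-off curve means accepting a small PSNR drop in exchange for a lower (better) perceptual index, or vice versa.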
Item Open Access: The Latency Budget: How to Save and What to Buy (2021), Aqeel, Waqar
Novel applications have driven innovation in the Internet over decades. Electronic mail and file sharing drove research for communication and congestion control protocols. Hypertext documents then created the web and put the web browser at the center. Online advertisement commercialized the web and accelerated development in web technologies such as JavaScript along with content delivery and caching. Video streaming then demanded higher bandwidth both in the data center and the home network. The web is now headed towards increased interactivity and immersion. With high bandwidth available to many subscribers, end-to-end network latency is likely to be the bottleneck for interactive applications in the future. While some applications have very stringent latency requirements, many have a "good enough" latency floor, beyond which further speed-up is imperceptible to humans. In the latter case, time saved from reduced network latency can be used to improve other aspects of user experience. For example, most private information retrieval protocols require more computation or multiple roundtrips, and reduced network latency can allow clients to use such protocols to protect user privacy while also delivering good quality of experience. The latency budget is then set by the "good enough" latency floor (which may vary over applications). We can save by reducing network latency, and then spend to improve various aspects of the web ecosystem. This thesis (a) addresses a widespread pitfall in measuring latency on the web, and highlights that (b) there is ample potential to reduce infrastructural, long-distance latency, and (c) the saved latency enables improvements in the web ranging from increased publisher revenues for online ads to improved user privacy for DNS queries.
Item Open Access: The Use and Abuse of Technology: Reconsidering the Ethics of Civil Disobedience, Leaking, and Intellectual Property for the Information Age (2020), Kennedy, Christopher
The suspicion that the advent of the internet marks some qualitative change in the development of human affairs motivates much diagnosis but little instruction about the contemporary political moment. Are there normative implications to recent advances in information technology? This dissertation examines three political conflicts over the use of the internet in a liberal democratic society. Each controversy reflects a basic disagreement about the appropriate domain of the public sphere: whether to accommodate electronic forms of civil disobedience, to treat digital information as intellectual property, or to sanction the act of leaking. Each chapter of the dissertation uses the work and writings of a political activist for insight into competing claims over what should count as a use or abuse of new technology. Electronic methods of political protest clarify an important feature of the justification of civil disobedience that scholars should take into consideration even in the more traditional circumstances in which it is practiced. Current and historical controversies surrounding the ethics of leaking call into question who should have the authority to decide what the public has the right to know. And the free software movement challenges long-standing assumptions about the justification of intellectual property and the public-interest bargain at its heart. Together, these cases illustrate the normative implications of recent advances in information technology.