# Browsing by Subject "algorithms"


Item Open Access Algorithms for the Reeb Graph and Related Concepts (2014) Parsa, Salman

This thesis is concerned with a structure called the Reeb graph. Three main problems are considered. The first is devising an efficient algorithm for constructing the Reeb graph of a simplicial complex with respect to a generic simplex-wise linear real-valued function. We present an algorithm that builds the Reeb graph in almost optimal worst-case deterministic time; this was the first deterministic algorithm whose time bound is linear up to a logarithmic factor. Without loss of generality, the complex is assumed to be 2-dimensional. The algorithm works by sweeping the function values and maintaining a spanning forest for the preimage, or level set, of the current value. Using the observation that the operations that change the level sets are known off-line, we were able to achieve the above bound.
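The sweep can be sketched with a union-find structure. The fragment below tracks only the connected components of the sublevel set as the sweep ascends (a merge-tree view); it leaves out the split events and the off-line batching of operations that the full Reeb graph algorithm needs, and all function and variable names are mine, not the thesis's.

```python
class UnionFind:
    def __init__(self, n):
        self.parent = list(range(n))

    def find(self, x):
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return False
        self.parent[ra] = rb
        return True

def sweep_events(f, edges):
    """Sweep vertices in order of increasing f-value and report where
    sublevel-set components appear ('new') or join ('merge')."""
    n = len(f)
    order = sorted(range(n), key=lambda v: (f[v], v))
    pos = {v: i for i, v in enumerate(order)}
    lower = {v: [] for v in range(n)}  # neighbors below each vertex
    for u, w in edges:
        hi, lo = (u, w) if pos[u] > pos[w] else (w, u)
        lower[hi].append(lo)
    uf = UnionFind(n)
    events = []
    for v in order:
        joined = sum(uf.union(u, v) for u in lower[v])
        if joined == 0:
            events.append((f[v], 'new'))    # local minimum starts a component
        elif joined > 1:
            events.append((f[v], 'merge'))  # v joins several components
    return events
```

On a path with heights [2, 1, 3, 0, 4], for example, the two valleys appear as 'new' events and the interior vertex at height 3 merges them.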

The second topic is dynamic Reeb graphs. As the function changes its values, the Reeb graph also changes, and we are interested in updating it efficiently. We reduce this problem to a graph problem that we call retroactive graph connectivity, and we obtain algorithms for dynamic Reeb graphs by using data structures that solve this graph problem.

The third topic is an argument regarding the complexity of computing Betti numbers. This problem is also related to the Reeb graph by means of vertical homology classes. The question considered here is whether realizing a simplicial complex in Euclidean 4-space can lead to an algorithm for computing its homology groups faster than the usual matrix-reduction methods. Using the observation that the vertical Betti numbers can always be computed more efficiently using the Reeb graph, we show that the answer to this question is in general negative. For instance, given a square matrix over the field with two elements, we construct in linear time a simplicial complex realized in Euclidean 4-space, together with a function on it, such that its horizontal homology group over the same field is isomorphic to the null space of the matrix. It follows that computing Betti numbers for complexes realized in 4-space is as hard as computing the rank of a sparse matrix.
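The matrix side of this equivalence is easy to make concrete: over the field with two elements, the null-space dimension is the number of columns minus the rank, and the rank of a bitmask-encoded matrix can be computed by Gaussian elimination. This is generic textbook code, not the thesis's construction.

```python
def gf2_rank(rows):
    """Rank over GF(2) of a matrix whose rows are given as int bitmasks."""
    pivots = {}  # bit position of a pivot -> reduced row holding it
    rank = 0
    for row in rows:
        while row:
            top = row.bit_length() - 1
            if top not in pivots:
                pivots[top] = row
                rank += 1
                break
            row ^= pivots[top]  # cancel the leading bit
    return rank

def gf2_nullity(rows, ncols):
    """Dimension of the null space: columns minus rank."""
    return ncols - gf2_rank(rows)
```

For the 3 x 3 matrix with rows 110, 011, 101 the three rows sum to zero, so the rank is 2 and the null space is one-dimensional.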

Item Open Access Ambush on Black Veterans: Foreign Disinformation Swayed the 2016 U.S. Presidential Election by Targeting Black Voters (2023-04-19) Jackson, Chandlee A. IV

A Russian-orchestrated influence campaign spread disinformation using social media during the 2016 United States (U.S.) presidential election. Digital evidence shows that Russian operatives developed presumptions about differing identity groups and tailored their interactions to sow strife between groups. The inferred intent was to influence and negatively impact African Americans' voting practices during the 2016 election campaign. Russian influence agents targeted the Black community more heavily than any other identity group. Influence operations also targeted veterans and veteran-adjacent communities; therefore, African American veterans (Black Vets) received twice the indoctrination because of their dual identity. The online impersonations of Black people and veterans on social media platforms were problematic for myriad reasons, but the Russian leadership's facilitation of disinformation represents adversarial exploitation of protection gaps uncovered in the Digital Age. Currently, the First Amendment inhibits the U.S. government and social media platforms from performing the desired protective measures to maintain a healthy online environment that nurtures an informed citizenry. For Black Vets in particular, foreign entities suppressed the voting power of their ethnic group and sought to instigate members of their profession to join domestic violent extremist groups. This project proposes that the U.S. government ought to change its approach to teaching digital media literacy competencies so that vulnerable populations receive the care and skills necessary to reduce their potential for becoming radicalized. Russian disinformation on social media, and how the U.S. will embrace an identity-centric approach to digital media literacy education, is a matter of U.S. national security.

Item Open Access Demonstration and Performance Evaluation of Two Novel Algorithms for Removing Artifacts From Automated Intraoperative Temperature Data Sets: Multicenter, Observational, Retrospective Study (JMIR Perioperative Medicine, 2022-10) Bardia, Amit; Deshpande, Ranjit; Michel, George; Yanez, David; Dai, Feng; Pace, Nathan L; Schuster, Kevin; Mathis, Michael R; Kheterpal, Sachin; Schonberger, Robert B

#### Background

The automated acquisition of intraoperative patient temperature data via temperature probes can produce a number of artifacts related to probe positioning that may impact these probes' utility for observational research.

#### Objective

We sought to compare the performance of two de novo algorithms for filtering such artifacts.

#### Methods

In this observational retrospective study, the intraoperative temperature data of adults who received general anesthesia for noncardiac surgery were extracted from the Multicenter Perioperative Outcomes Group registry. Two algorithms were developed and then compared to the reference standard: anesthesiologists' manual artifact detection. Algorithm 1 (a slope-based algorithm) was based on the linear curve fit of 3 adjacent temperature data points. Algorithm 2 (an interval-based algorithm) assessed the time gaps between contiguous temperature recordings. Sensitivity and specificity values for artifact detection were calculated for each algorithm, as were mean temperatures and areas under the curve for hypothermia (temperatures below 36 °C) for each patient after artifact removal via each methodology.

#### Results

A total of 27,683 temperature readings from 200 anesthetic records were analyzed. The overall agreement among the anesthesiologists was 92.1%. Both algorithms had high specificity but moderate sensitivity (specificity: 99.02% for algorithm 1 vs 99.54% for algorithm 2; sensitivity: 49.13% for algorithm 1 vs 37.72% for algorithm 2; F-score: 0.65 for algorithm 1 vs 0.55 for algorithm 2). The areas under the curve for time × hypothermic temperature and the mean temperatures recorded for each case after artifact removal were similar between the algorithms and the anesthesiologists.

#### Conclusions

The tested algorithms provide an automated way to filter intraoperative temperature artifacts that closely approximates manual sorting by anesthesiologists. Our study provides evidence demonstrating the efficacy of highly generalizable artifact reduction algorithms that can be readily used by observational studies that rely on automated intraoperative data acquisition.

Item Open Access Developing Scalable Abilities for Self-Reconfigurable Robots (2010) Slee, Sam

The power of modern computer systems is due in no small part to their fantastic ability to adapt to whatever tasks they are charged with. Self-reconfigurable robots seek to provide that flexibility in hardware by building a system out of many individual modules, each with limited functionality, but with the ability to rearrange themselves to modify the shape and structure of the overall robotic system and meet whatever challenges are faced. Various hardware systems have been constructed for reconfigurable robots, and algorithms for them produce a wide variety of modes of locomotion. However, the task of efficiently controlling these complex systems -- possibly with thousands or millions of modules comprising a single robot -- is still not fully solved even after years of prior work on the topic.

In this thesis, we investigate theoretical control algorithms for lattice-style self-reconfigurable robots. These robots are composed of modules attached to each other at discrete lattice locations, and the modules move only by transitioning from one lattice location to an adjacent one. Given the physical limitations of the modules in a robot, we show a lower bound on the time to reconfigure that robot, that is, to transition it from one connected arrangement of modules to a different connected arrangement. Furthermore, we develop an algorithm whose running time matches this lower bound, both for a specific example reconfiguration problem and for general reconfiguration between any pair of 2D arrangements of modules. Since these algorithms match the demonstrated lower bound, they are optimal given the assumed abilities of the modules in the robot.
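A minimal version of the lattice model can be written down directly. The sketch below uses a simplified 2D model of my own devising: a module slides to an empty, edge-adjacent cell, and the move is legal only if the robot stays connected. Actual module hardware, and the thesis's model, permit richer transitions.

```python
from collections import deque

def valid_move(modules, src, dst):
    """In a 2D lattice robot, a module may slide from src to an empty,
    edge-adjacent cell dst only if the configuration stays connected."""
    if src not in modules or dst in modules:
        return False
    if abs(src[0] - dst[0]) + abs(src[1] - dst[1]) != 1:
        return False
    moved = (modules - {src}) | {dst}
    # breadth-first search to test connectivity of the new configuration
    start = next(iter(moved))
    seen = {start}
    queue = deque([start])
    while queue:
        x, y = queue.popleft()
        for nb in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if nb in moved and nb not in seen:
                seen.add(nb)
                queue.append(nb)
    return len(seen) == len(moved)
```

For an L-shaped robot the corner module can slide upward, while the end module of a straight line cannot move sideways without disconnecting the robot.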

In addition to our theoretically optimal reconfiguration algorithms, we also contribute to the more practical side of this robotics field with a novel, physically stable control algorithm. The majority of prior theoretical work on control algorithms for self-reconfigurable robots did not consider the effects of gravity upon the robot. As a result, these algorithms often transform a robot into configurations -- arrangements of modules -- that are unstable and would likely break hardware on a real robot with thousands or millions of modules. In this thesis we present an algorithm for locomotion of a self-reconfigurable robot that is always physically stable in the presence of gravity, even though we assume the robot's modules have only a limited ability to withstand tension or shear forces. This algorithm is highly scalable, can be run efficiently on a robot with millions of modules, demonstrates significant speed advantages over prior scalable locomotion algorithms, and is resilient to errors in module actions or message passing. Overall, the contributions of this thesis extend both the theoretical and practical limits of what is possible with control algorithms for self-reconfigurable robots.

Item Open Access Discriminación y colonialidad en la visión algorítmica. Abordajes artísticos para una revisión crítica de la clasificación de personas (Revista 180, 2021-12-31) Idarraga Franco, Hugo

In light of artistic projects using digital media, and specifically artificial intelligence, this article critically analyzes the discrimination of people by machine learning models, in particular classification models designed to perform surveillance and social control tasks. It first argues that, following a philosophical version of objectivism, various algorithmic models claim to classify people "objectively" on the basis of their bodily traits, linking them to psychological and behavioral profiles. The dubious relation questioned here is the one drawn between people's visible traits and invisible characteristics, a relation forged by a colonial gaze that is reproduced today in the workings of algorithmic vision. Second, it argues that this discrimination is materialized in the very design of classification models. On the one hand, it examines the importance of statistics for the functioning of machine learning from the perspective of its historical ties to police practices of surveillance and social control; on the other, it analyzes how that colonial gaze is reproduced in the datasets and in the names of the categories under which images are labeled and classified, thereby determining the reality these classification models perceive algorithmically.

Item Open Access Essays on Criminal Justice and Inequality (2022) Jabri, Ranae

This dissertation comprises three essays on policing and criminal justice, algorithms, and inequality.
The first two essays examine the efficacy and equity implications of data-driven algorithms that are increasingly used in important life-altering decision-making contexts. The third essay investigates when crime responds to punishment.

The first essay studies the impacts on crime and arrests of the neighborhood targeting of police presence brought about by predictive policing algorithms. While predictive policing is widely used, its impacts on crime, and whether those impacts are racially disproportionate, remain open questions. Using a novel dataset, I isolate quasi-experimental variation in police presence induced by predictive-policing algorithms to estimate the causal impacts of algorithm-induced police presence. I find that algorithm-induced police presence decreases serious violent and property crime, and I find evidence that algorithm-induced neighborhood targeting of police presence has disproportionate racial impacts on arrests for traffic incidents and serious violent crime incidents.

The second essay investigates how data-driven algorithms can maximize overall predictive power at the cost of racial and economic justice. Examining a tool that is already widely used in pretrial decision-making, I build a framework to evaluate how each input variable trades off overall predictive power against racial and economic disparities in the scores that defendants receive. I find that using information on the neighborhoods where defendants live only marginally contributes to overall predictive power. However, the use of defendant neighborhood data substantially increases racial and economic disparities, suggesting that machine learning objectives tuned to maximize overall predictive power risk being in conflict with racial and economic justice.

Finally, in the third essay, joint with Sarah Komisarow and Robert Gonzalez, we examine when crime responds to increases in punishment severity. While economic theory suggests that crime should respond to punishment severity, the empirical evidence on this link is ambiguous. We propose an explanation: the effect of an increase in punishment severity depends on the probability of detection, and punishments deter crime when the probability of detection is moderate. We test and validate this explanation using increases in punishment severity in drug-free school zones, together with changes in the probability of detection resulting from a community crime-monitoring program.
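The proposed mechanism can be illustrated with a toy deterrence calculation in the spirit of Becker's model (entirely my own illustration, not the essay's empirical design): an individual offends when the private benefit exceeds the expected punishment, i.e. the detection probability times the penalty.

```python
def crime_rate(benefits, p_detect, penalty):
    """Share of potential offenders who offend in a toy model where a
    person offends iff their private benefit exceeds p_detect * penalty."""
    return sum(b > p_detect * penalty for b in benefits) / len(benefits)

benefits = list(range(1, 101))  # private benefits 1..100

# Doubling the penalty has no effect when detection is negligible...
assert crime_rate(benefits, 0.001, 100) == crime_rate(benefits, 0.001, 200)
# ...but the same doubling reduces crime when detection is moderate.
assert crime_rate(benefits, 0.05, 200) < crime_rate(benefits, 0.05, 100)
```

The point of the sketch is exactly the essay's: the deterrent effect of a severity increase is mediated by the detection probability.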

Item Open Access Homological Illusions of Persistence and Stability (2008-08-04) Morozov, Dmitriy

In this thesis we explore and extend the theory of persistent homology, which captures topological features of a function by pairing its critical values. The result is represented by a collection of points in the extended plane called a persistence diagram.

We start with the question of ridding a function of topological noise, as suggested by its persistence diagram. We give an algorithm that hierarchically finds epsilon-simplifications on 2-manifolds, and we answer the question of when it is impossible to simplify a function in higher dimensions.

We continue by examining time-varying functions. The original algorithm computes the persistence pairing from an ordering of the simplices in a triangulation and takes worst-case time cubic in the number of simplices. We describe how to maintain the pairing in linear time per transposition of consecutive simplices. A side effect of the update algorithm is an elementary proof of the stability of persistence diagrams. We introduce a parametrized family of persistence diagrams called persistence vineyards and illustrate the concept with a vineyard describing a folding of a small peptide. We also use the update procedure as the basis of a simple algorithm for computing the rank invariant of a collection of functions.
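For reference, the cubic-worst-case baseline, the standard column reduction of the boundary matrix over the two-element field, takes only a few lines; this is the textbook algorithm that the transposition-update routine improves on in the dynamic setting, not the thesis's own procedure.

```python
def persistence_pairs(boundaries):
    """Standard persistence reduction over Z/2. boundaries[j] lists the
    filtration indices of the facets of the j-th simplex; simplices must
    appear in filtration order. Returns (birth, death) index pairs."""
    R = [set(b) for b in boundaries]   # columns of the boundary matrix
    low = {}                           # lowest row index -> column using it
    pairs = []
    for j in range(len(R)):
        while R[j] and max(R[j]) in low:
            R[j] ^= R[low[max(R[j])]]  # add the earlier column (mod 2)
        if R[j]:
            i = max(R[j])
            low[i] = j
            pairs.append((i, j))       # simplex i's feature is killed by j
    return pairs
```

For a filtration of two vertices followed by the edge joining them, the younger vertex is paired with the edge, while the older vertex survives as the essential component.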

Guided by the desire to reconstruct stratified spaces from noisy samples, we use the vineyard of the distance function restricted to a 1-parameter family of neighborhoods of a point to assess the local homology of a sampled stratified space at that point. We prove the correctness of this assessment under the assumption of a sufficiently dense sample. We also give an algorithm that constructs the vineyard and makes the local assessment in time at most cubic in the size of the Delaunay triangulation of the point sample.

Finally, to refine the measurement of local homology the thesis extends the notion of persistent homology to sequences of kernels, images, and cokernels of maps induced by inclusions in a filtration of pairs of spaces. Specifically, we note that persistence in this context is well defined, we prove that the persistence diagrams are stable, and we explain how to compute them. Additionally, we use image persistence to cope with functions on noisy domains.

Item Open Access Methods for Systematic Exploratory Analysis of Gene Expression Data with Applications to Cancer Genomics (2017) Wagner, Florian

Advances in technologies for gene expression profiling have resulted in an unprecedented abundance of gene expression data. However, the computational methods available for exploratory analysis of such data are limited in their ability to generate an interpretable overview of biologically relevant similarities and differences among samples. This work first introduces the XL-mHG test, a sensitive and specific hypothesis test for detecting gene set enrichment, and discusses its algorithmic and statistical properties. It then introduces GO-PCA, a method for exploratory analysis of gene expression data using prior knowledge, for which the XL-mHG test serves as a building block. The output of GO-PCA consists of functional expression signatures, designed to provide an interpretable representation of biologically meaningful variation in the data. The power and versatility of the method are demonstrated on heterogeneous human and mouse expression data. Finally, applications of the proposed methods to carcinoma and lymphoma expression data aim to demonstrate their clinical relevance. The effective utilization of prior knowledge in the exploratory analysis of gene expression data through carefully designed computational methods is essential for successfully harnessing the power of current and future platforms for gene expression profiling, with the aim of generating clinically relevant insights into complex diseases such as cancer.
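The idea behind the mHG family of statistics can be sketched in a few lines: rank the genes, then take the best hypergeometric tail probability over all cutoffs. The sketch below omits the X and L parameters that give the XL-mHG test its name (they bound which cutoffs are considered) as well as the exact p-value computation for the resulting minimum; all names are mine.

```python
from math import comb

def hypergeom_tail(k, n, K, N):
    """P(X >= k) when drawing n of N items without replacement, K marked."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(n, K) + 1)) / comb(N, n)

def mhg_statistic(ranked):
    """Minimum hypergeometric tail over all prefixes of a ranked 0/1 list."""
    N, K = len(ranked), sum(ranked)
    best, k = 1.0, 0
    for n in range(1, N):
        k += ranked[n - 1]          # marked items seen in the top n
        best = min(best, hypergeom_tail(k, n, K, N))
    return best
```

With all three marked genes at the top of a list of six, the best cutoff is after three genes, giving 1/C(6,3) = 0.05.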

Item Open Access Two Algorithmic Schemes for Geometric Bipartite Matching and Transportation (2020) Xiao, Allen

Given n red points and n blue points in d-dimensional space and a distance function, the geometric bipartite matching problem is to find a matching of red points to blue points such that the sum of distances between the matched pairs is minimized. The general graph problem, minimum-cost bipartite matching, can be solved in polynomial time using the classical Hungarian algorithm, which leads to a cubic algorithm for the geometric matching problem. We present several algorithms for computing optimal and near-optimal geometric bipartite matchings, as well as for the problem's fractional generalization, geometric transportation. We also present fast algorithms for partial matching, i.e., matching only k pairs for a given k <= n.

Finally, we consider the case in which the red points are allowed to translate (rigidly) and the goal is to find a minimum-cost partial matching under all translations of the red points. We take the cost of a matching to be the sum of squared edge costs (RMS). For a given translation t, let f(t) denote the cost of the optimal partial matching (for a fixed k). A long-standing open problem is whether the number of local minima of f(t) is polynomial; the best known upper bounds are exponential. We show that there is an approximation g(t) of f(t) that has only a quadratic number of local minima.
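For very small instances the basic (non-geometric) matching problem defined above can be solved by brute force, which makes a useful sanity baseline when experimenting with faster algorithms; this O(n!) snippet is my own illustration, not one of the thesis's schemes.

```python
from itertools import permutations
from math import dist, inf

def min_cost_matching(red, blue):
    """Exhaustive minimum-cost perfect matching between equal-size point
    sets under Euclidean distance; feasible only for tiny n."""
    best_cost, best_perm = inf, None
    for perm in permutations(range(len(blue))):
        cost = sum(dist(red[i], blue[j]) for i, j in enumerate(perm))
        if cost < best_cost:
            best_cost, best_perm = cost, perm
    return best_cost, best_perm
```

Matching red points (0,0) and (2,0) to blue points (2,0) and (0,0), for instance, swaps the pairs for a total cost of 0; the Hungarian algorithm finds the same optimum in cubic time.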