Browsing by Author "Kovalsky, Shahar Z"
Now showing 1 - 3 of 3
Item Open Access
A Deep-Learning Algorithm for Thyroid Malignancy Prediction From Whole Slide Cytopathology Images
Dov, David; Kovalsky, Shahar Z; Assaad, Serge; Cohen, Jonathan; Range, Danielle Elliott; Pendse, Avani A; Henao, Ricardo; Carin, Lawrence
We consider thyroid-malignancy prediction from ultra-high-resolution whole-slide cytopathology images. We propose a deep-learning-based algorithm that is inspired by the way a cytopathologist diagnoses the slides. The algorithm identifies diagnostically relevant image regions and assigns them local malignancy scores, which in turn are incorporated into a global malignancy prediction. We discuss the relation of our deep-learning-based approach to multiple-instance learning (MIL) and describe how it deviates from classical MIL methods through the use of a supervised procedure to extract relevant regions from the whole slide. The analysis of our algorithm further reveals a close relation to hypothesis testing, which, along with unique characteristics of thyroid cytopathology, allows us to devise an improved training strategy. We further propose an ordinal regression framework for the simultaneous prediction of thyroid malignancy and an ordered diagnostic score acting as a regularizer, which further improves the predictions of the network. Experimental results demonstrate that the proposed algorithm outperforms several competing methods, achieving performance comparable to that of human experts.

Item Open Access
Neural Network Approximation of Refinable Functions (CoRR, 2021)
Daubechies, Ingrid; DeVore, Ronald; Dym, Nadav; Faigenbaum-Golovin, Shira; Kovalsky, Shahar Z; Lin, Kung-Ching; Park, Josiah; Petrova, Guergana; Sober, Barak
In the desire to quantify the success of neural networks in deep learning and other applications, there is great interest in understanding which functions are efficiently approximated by the outputs of neural networks. By now, there exists a variety of results showing that a wide range of functions can be approximated with sometimes surprising accuracy by these outputs. For example, it is known that the set of functions that can be approximated with exponential accuracy (in terms of the number of parameters used) includes, on the one hand, very smooth functions such as polynomials and analytic functions (see e.g. \cite{E,S,Y}) and, on the other hand, very rough functions such as the Weierstrass function (see e.g. \cite{EPGB,DDFHP}), which is nowhere differentiable. In this paper, we add to the latter class of rough functions by showing that it also includes refinable functions. Namely, we show that refinable functions are approximated by the outputs of deep ReLU networks with a fixed width and increasing depth, with accuracy exponential in their number of parameters. Our results apply to functions used in the standard construction of wavelets as well as to functions constructed via subdivision algorithms in Computer-Aided Geometric Design.
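Note on the first item (thyroid malignancy prediction): the abstract describes assigning local malignancy scores to diagnostically relevant regions and aggregating them into a slide-level prediction. The sketch below illustrates only that generic region-scoring-and-aggregation pattern; the linear scorer, top-k pooling, and all names are illustrative assumptions, not the architecture used in the paper.

```python
# Minimal sketch (assumed, not the paper's architecture): score candidate
# regions of a slide and aggregate the local scores into a global prediction.
import numpy as np

def score_regions(region_features: np.ndarray, w: np.ndarray, b: float) -> np.ndarray:
    """Assign a local malignancy score to each region (here: a simple linear scorer)."""
    return region_features @ w + b

def aggregate(local_scores: np.ndarray, k: int = 16) -> float:
    """Pool local scores into a slide-level score by averaging the top-k most
    suspicious regions (one plausible MIL-style aggregation rule)."""
    top_k = np.sort(local_scores)[-k:]
    return float(top_k.mean())

def predict_slide(region_features: np.ndarray, w: np.ndarray, b: float) -> float:
    """Slide-level malignancy probability via a sigmoid of the aggregated score."""
    global_score = aggregate(score_regions(region_features, w, b))
    return 1.0 / (1.0 + np.exp(-global_score))

# Toy usage: 1000 candidate regions, each described by a 64-dimensional feature vector.
rng = np.random.default_rng(0)
features = rng.normal(size=(1000, 64))
w, b = rng.normal(size=64), 0.0
print(predict_slide(features, w, b))
```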
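Note on the second item (refinable functions): the central object of the title can be stated in one line. This is the standard textbook definition, not quoted from the abstract: a function \(\varphi\) is refinable if it reproduces itself under dilation and translation,
\[
\varphi(x) \;=\; \sum_{k \in \mathbb{Z}} c_k\, \varphi(2x - k),
\]
with finitely many nonzero coefficients \(c_k\). Cardinal B-splines and the Daubechies scaling functions used in wavelet constructions are standard examples, and subdivision algorithms in Computer-Aided Geometric Design produce limit functions of the same kind.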
Item Open Access
Non-Convex Planar Harmonic Maps
Kovalsky, Shahar Z; Aigerman, Noam; Daubechies, Ingrid; Kazhdan, Michael; Lu, Jianfeng; Steinerberger, Stefan
We formulate a novel characterization of a family of invertible maps between two-dimensional domains. Our work follows two classic results: the Radó-Kneser-Choquet (RKC) theorem, which establishes the invertibility of harmonic maps into a convex planar domain; and Tutte's embedding theorem for planar graphs, RKC's discrete counterpart, which proves the invertibility of piecewise-linear maps of triangulated domains satisfying a discrete-harmonic principle, into a convex planar polygon. In both theorems, the convexity of the target domain is essential for ensuring invertibility. We extend these characterizations, in both the continuous and discrete cases, by replacing convexity with a less restrictive condition. In the continuous case, Alessandrini and Nesi provide a characterization of invertible harmonic maps into non-convex domains with a smooth boundary by adding further conditions on orientation preservation along the boundary. We extend their results by defining a condition on the normal derivatives along the boundary, which we call the cone condition; this condition is tractable and geometrically intuitive, encoding a weak notion of local invertibility. The cone condition enables us to extend the result of Alessandrini and Nesi to harmonic maps into non-convex domains with a piecewise-smooth boundary. In the discrete case, we use an analog of the cone condition to characterize invertible discrete-harmonic piecewise-linear maps of triangulations. This gives an analog of our continuous results and characterizes invertible discrete-harmonic maps in terms of the orientation of triangles incident on the boundary.
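Note on the third item (non-convex planar harmonic maps): the classical discrete construction the abstract builds on is Tutte's embedding, in which the boundary vertices of a triangulated disk are pinned to a convex polygon and each interior vertex is placed at a weighted average of its neighbors, i.e. a discrete-harmonic linear system is solved. The sketch below illustrates only that classical convex-boundary case with uniform weights; it does not implement the paper's non-convex extension or its cone condition, and the function names are assumptions for illustration.

```python
# Minimal sketch of a classical Tutte embedding with uniform weights (convex
# boundary only; an assumed illustration, not the paper's method).
import numpy as np

def tutte_embedding(n_vertices, edges, boundary, boundary_positions):
    """Place interior vertices so each is the average of its neighbors,
    with boundary vertices pinned to a prescribed convex polygon."""
    # Adjacency lists from the undirected edge list.
    neighbors = [[] for _ in range(n_vertices)]
    for i, j in edges:
        neighbors[i].append(j)
        neighbors[j].append(i)

    boundary_set = set(boundary)
    interior = [v for v in range(n_vertices) if v not in boundary_set]
    index = {v: k for k, v in enumerate(interior)}
    pos = {v: np.asarray(p, dtype=float) for v, p in zip(boundary, boundary_positions)}

    # Discrete-harmonic system: for each interior v,
    # deg(v) * x_v - sum_{u ~ v, u interior} x_u = sum of pinned boundary neighbors.
    A = np.zeros((len(interior), len(interior)))
    rhs = np.zeros((len(interior), 2))
    for v in interior:
        A[index[v], index[v]] = len(neighbors[v])
        for u in neighbors[v]:
            if u in index:
                A[index[v], index[u]] -= 1.0
            else:
                rhs[index[v]] += pos[u]

    x = np.linalg.solve(A, rhs)

    # Assemble the full coordinate array (pinned boundary + solved interior).
    coords = np.zeros((n_vertices, 2))
    for v, p in pos.items():
        coords[v] = p
    for v in interior:
        coords[v] = x[index[v]]
    return coords

# Toy usage: a unit square with one interior vertex connected to all four corners;
# the interior vertex lands at the centroid (0.5, 0.5).
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (4, 0), (4, 1), (4, 2), (4, 3)]
square = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
print(tutte_embedding(5, edges, boundary=[0, 1, 2, 3], boundary_positions=square))
```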