On the Universality of Rotation Equivariant Point Cloud Networks.

dc.contributor.author: Dym, Nadav
dc.contributor.author: Maron, Haggai
dc.date.accessioned: 2020-12-14T19:39:31Z
dc.date.available: 2020-12-14T19:39:31Z
dc.date.issued: 2020
dc.date.updated: 2020-12-14T19:39:30Z

dc.description.abstract: Learning functions on point clouds has applications in many fields, including computer vision, computer graphics, physics, and chemistry. Recently, there has been a growing interest in neural architectures that are invariant or equivariant to all three shape-preserving transformations of point clouds: translation, rotation, and permutation. In this paper, we present a first study of the approximation power of these architectures. We first derive two sufficient conditions for an equivariant architecture to have the universal approximation property, based on a novel characterization of the space of equivariant polynomials. We then use these conditions to show that two recently suggested models are universal, and to devise two other novel universal architectures.
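The three shape-preserving transformations named in the abstract can be illustrated with a minimal sketch (not from the paper; the function `pairwise_dist_sum` is a hypothetical example of an invariant map, not one of the architectures studied):

```python
import numpy as np

def pairwise_dist_sum(X):
    # A simple function of a point cloud X (n x 3) that is invariant to
    # permutation, translation, and rotation: the sum of all pairwise
    # Euclidean distances between points.
    diffs = X[:, None, :] - X[None, :, :]
    return np.linalg.norm(diffs, axis=-1).sum()

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))

# Random orthogonal matrix via QR decomposition (distances are
# preserved by any orthogonal transform, rotations included).
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
perm = rng.permutation(5)   # random reordering of the points
t = rng.normal(size=3)      # random translation vector

f = pairwise_dist_sum(X)
assert np.isclose(f, pairwise_dist_sum(X @ Q.T))  # rotation-invariant
assert np.isclose(f, pairwise_dist_sum(X[perm]))  # permutation-invariant
assert np.isclose(f, pairwise_dist_sum(X + t))    # translation-invariant
```

Equivariant architectures generalize this: their intermediate features transform predictably (rather than staying fixed) under these group actions, and the paper asks which such architectures can approximate every equivariant function.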

dc.identifier.uri: https://hdl.handle.net/10161/21901
dc.relation.ispartof: CoRR
dc.subject: cs.LG


dc.subject: cs.CG

dc.title: On the Universality of Rotation Equivariant Point Cloud Networks.
dc.type: Journal article

pubs.organisational-group: Trinity College of Arts & Sciences
pubs.organisational-group: Mathematics
pubs.organisational-group: Duke
pubs.volume: abs/2010.02449

Files

Original bundle

Name: 2010.02449v1.pdf
Size: 1.44 MB
Format: Adobe Portable Document Format