Rylan Schaeffer



Properties for Distances / Divergences

Bellemare et al. (2017) introduce several properties of distances and divergences that might be desirable. The first is scale sensitivity: a divergence \(d\) is scale sensitive of order \(\beta\) if, for all random variables \(X, Y\) and all scalars \(c\),

\[d(cX, cY) \leq \lvert c \rvert^{\beta} d(X, Y)\]

Intuitively, this means that scaling both arguments by \(c\) scales the distance by at most \(\lvert c \rvert\), possibly raised to some power \(\beta\).
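As a concrete check, the 1-Wasserstein distance on the real line is scale sensitive of order \(\beta = 1\): scaling both samples by \(c\) scales the empirical distance by exactly \(\lvert c \rvert\). A minimal sketch using SciPy's `wasserstein_distance` (the particular sample distributions here are illustrative, not from the paper):

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
x = rng.normal(loc=0.0, scale=1.0, size=1000)  # samples of X
y = rng.normal(loc=2.0, scale=1.5, size=1000)  # samples of Y

c = 3.0
d_xy = wasserstein_distance(x, y)
d_scaled = wasserstein_distance(c * x, c * y)

# For the 1-Wasserstein distance, the bound is tight:
# d(cX, cY) = |c|^1 * d(X, Y)
assert np.isclose(d_scaled, abs(c) * d_xy)
```

For the empirical 1-Wasserstein distance this holds exactly (up to floating point), since scaling every sample by \(c\) scales each sorted quantile difference by \(\lvert c \rvert\).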

The second property is sum invariance: \(d\) is sum invariant if, for any random variable \(A\) independent of both \(X\) and \(Y\),

\[d(X+A, Y+A) \leq d(X, Y)\]

Intuitively, this means that adding the same independent \(A\) to both \(X\) and \(Y\) (a constant shift is the special case where \(A\) is deterministic) cannot increase the distance between them.
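Sum invariance can likewise be checked numerically for the 1-Wasserstein distance in the special case of a constant shift, where the distance is exactly unchanged. A small sketch (sample distributions again illustrative):

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(1)
x = rng.normal(loc=0.0, scale=1.0, size=1000)  # samples of X
y = rng.normal(loc=2.0, scale=1.5, size=1000)  # samples of Y

a = 5.0  # a deterministic A, i.e. a constant shift
d_xy = wasserstein_distance(x, y)
d_shifted = wasserstein_distance(x + a, y + a)

# For a constant A, the 1-Wasserstein distance satisfies
# d(X + A, Y + A) = d(X, Y), a special case of sum invariance
assert np.isclose(d_shifted, d_xy)
```

For a genuinely random independent \(A\), the inequality \(d(X+A, Y+A) \leq d(X, Y)\) holds at the level of distributions; checking it on finite samples would require accounting for estimation noise, so the sketch sticks to the exact constant-shift case.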