**entropy**

[1908.11184] The maximum entropy of a metric space

yesterday by cshalizi

"We define a one-parameter family of entropies, each assigning a real number to any probability measure on a compact metric space (or, more generally, a compact Hausdorff space with a notion of similarity between points). These entropies generalise the Shannon and Rényi entropies of information theory.

"We prove that on any space X, there is a single probability measure maximising all these entropies simultaneously. Moreover, all the entropies have the same maximum value: the maximum entropy of X. As X is scaled up, the maximum entropy grows; its asymptotics determine geometric information about X, including the volume and dimension. We also study the large-scale limit of the maximising measure itself, arguing that it should be regarded as the canonical or uniform measure on X.

"Primarily we work not with entropy itself but its exponential, called diversity and (in its finite form) used as a measure of biodiversity. Our main theorem was first proved in the finite case by Leinster and Meckes."

to:NB entropy information_theory geometry

KNOB Attack

8 weeks ago by asteroza

The spec allowed 1 byte of entropy?!? What the hell were they smoking?
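For scale: 1 byte of entropy means only 2**8 = 256 possible session keys, so an eavesdropper can simply try them all. A toy sketch of why that is fatal (the `derive` function is a hypothetical stand-in, not the real Bluetooth key-derivation):

```python
import hashlib
import os

# Toy model of the KNOB scenario: the negotiated key entropy is 1 byte,
# so the entire key space is 2**8 = 256 candidate keys.
def derive(key_byte: int) -> bytes:
    # Hypothetical stand-in for the real key-derivation function.
    return hashlib.sha256(bytes([key_byte])).digest()

secret = os.urandom(1)[0]   # the victim's 1 byte of key entropy
observed = derive(secret)   # material the eavesdropper can test guesses against

# Brute force: at most 256 trial derivations recover the session key.
recovered = next(b for b in range(256) if derive(b) == observed)
assert recovered == secret
```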

bluetooth vulnerability low entropy session key security hacking pentesting

[1907.12879] Visual Entropy and the Visualization of Uncertainty

10 weeks ago by cshalizi

"Background: It is possible to find many different visual representations of data values in visualizations, it is less common to see visual representations that include uncertainty, especially in visualizations intended for non-technical audiences. Objective: our aim is to rigorously define and evaluate the novel use of visual entropy as a measure of shape that allows us to construct an ordered scale of glyphs for use in representing both uncertainty and value in 2D and 3D environments. Method: We use sample entropy as a numerical measure of visual entropy to construct a set of glyphs using R and Blender which vary in their complexity. Results: A Bradley-Terry analysis of a pairwise comparison of the glyphs shows participants (n=19) ordered the glyphs as predicted by the visual entropy score (linear regression R2 >0.97, p<0.001). We also evaluate whether the glyphs can effectively represent uncertainty using a signal detection method, participants (n=15) were able to search for glyphs representing uncertainty with high sensitivity and low error rates. Conclusion: visual entropy is a novel cue for representing ordered data and provides a channel that allows the uncertainty of a measure to be presented alongside its mean value."

to:NB visual_display_of_quantitative_information entropy information_theory statistics

Information Theory for Intelligent People

10 weeks ago by pw201

(a play on the "for Dummies" series).

entropy communication shannon information mathematics