Research Images As Art 2014/15 Gallery


Information entropy landscape

[Image: Entry No 36]
Mr Neil Bramley, Cognitive, Perceptual & Brain Sciences

Figure 1 shows how we can construct a space of information entropies. Standard (Shannon) information entropy sits at the centre; it is a measure of your uncertainty given that you hold a particular belief distribution. Here we illustrate with the probability that some variable X takes state x1, x2 or x3 (the corners), with complete uncertainty in the middle. Different assumptions about how your uncertainty tails off over this space yield different information landscapes, with important implications for driving information-seeking behaviour during learning. The Rényi and Tsallis families evolve in different ways over this space. We are exploring the implications of holding different uncertainty functions in the context of choosing what question to ask, or which test to perform, during learning. In Figure 2 we explore a range of 'silly' entropy functions to see to what extent it matters how you codify uncertainty during learning.
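
As a minimal illustrative sketch (not part of the original entry), the snippet below evaluates the three entropy families named in the caption on a belief distribution over three states; the parameter names alpha and q, and the example distributions, are assumptions for illustration. The simplex corners (certainty) give zero entropy and the centre (complete uncertainty) maximises all three measures.

```python
# Sketch of the entropy families described in the caption, evaluated on a
# distribution over three states x1, x2, x3 (a point on the 2-simplex).
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log p_i (natural log)."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]                      # convention: 0 * log 0 = 0
    return -np.sum(nz * np.log(nz))

def renyi_entropy(p, alpha):
    """Renyi entropy H_alpha(p) = log(sum_i p_i^alpha) / (1 - alpha)."""
    p = np.asarray(p, dtype=float)
    if np.isclose(alpha, 1.0):         # limit alpha -> 1 recovers Shannon
        return shannon_entropy(p)
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def tsallis_entropy(p, q):
    """Tsallis entropy S_q(p) = (1 - sum_i p_i^q) / (q - 1)."""
    p = np.asarray(p, dtype=float)
    if np.isclose(q, 1.0):             # limit q -> 1 recovers Shannon
        return shannon_entropy(p)
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

# Centre of the simplex (maximal uncertainty) vs. a corner (certainty).
for p in ([1/3, 1/3, 1/3], [1.0, 0.0, 0.0]):
    print(p, shannon_entropy(p), renyi_entropy(p, 2.0), tsallis_entropy(p, 2.0))
```

Both parametric families reduce to Shannon entropy in the limit of their parameter going to 1, which is why the Shannon landscape can be thought of as sitting at the centre of the space of entropies.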



All images and text copyright their artist/author and MAY NOT be used for any purposes without the express permission of the original artist/author. All Rights Reserved, 2014.
Please contact docschoolweb@ucl.ac.uk with any problems or queries.