# Sapto Condro loves Science and Technology

## Information transfer rate (ITR) and Poco-poco dance

Information transfer rate ($ITR$): $ITR = B \cdot v$
with $B$ as the bits per transfer and $v$ as the transfer speed

Speed/rate ($v$): $v = \frac{n + e}{t}$
with $n$ as the number of good transfers, $e$ as the number of errors, and $t$ as the transfer time

Bits per transfer ($B$): $B = \log_2{N} + P \cdot \log_2{P} + (1 - P) \cdot \log_2{\left(\frac{1 - P}{N - 1}\right)}$
with $N$ as the number of different kinds of information you can transfer, and $P$ as the accuracy, i.e. the number of good transfers ($n$) divided by the total number of transfers

Accuracy ($P$): $P = \frac{n}{n + e}$
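As a sketch, the four definitions above can be written in Python (the function and variable names are my own, not from any standard library):

```python
from math import log2

def accuracy(n, e):
    # P = n / (n + e): fraction of transfers that were correct
    return n / (n + e)

def speed(n, e, t):
    # v = (n + e) / t: total transfers (good and bad) per unit time
    return (n + e) / t

def bits_per_transfer(N, P):
    # B = log2 N + P log2 P + (1 - P) log2((1 - P) / (N - 1))
    # Guard the P = 0 and P = 1 edges, where x * log2(x) -> 0.
    B = log2(N)
    if P > 0:
        B += P * log2(P)
    if P < 1:
        B += (1 - P) * log2((1 - P) / (N - 1))
    return B

def itr(N, n, e, t):
    # ITR = B * v, in bits per unit of t
    return bits_per_transfer(N, accuracy(n, e)) * speed(n, e, t)
```

The `if` guards matter because $P \cdot \log_2{P}$ is taken as $0$ at $P = 0$ and $P = 1$, while `log2(0)` would raise an error.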

***

First Example:

I am learning how to dance. The basic steps are just moving left and right, and the dance instructor calls out “left” or “right”.
One example is “left, left, right, left, right, right, left, right”. Those are the first eight steps of a line dance.

In this case, the information types are just LEFT and RIGHT, so $N = 2$.

In 15 minutes, the instructor taught me a dance routine of 100 steps. Phew, I am sweating. Because I am a beginner, I made 60 mistakes and 40 good steps: $n = 40$, $e = 60$, $t = 15$ minutes.

The accuracy ($P$): $P = \frac{40}{40 + 60} = 0.4 = 40$ percent

The learning speed ($v$): $v = \frac{100}{15} = 6.67$ steps per minute

The bits per learning step ($B$):
$B = \log_2{(2)} + 0.4 \cdot \log_2{(0.4)} + (1 - 0.4) \cdot \log_2{\left(\frac{1 - 0.4}{2 - 1}\right)}$
$B = 1 + 0.4 \cdot (-1.32) + 0.6 \cdot (-0.74)$
$B = 0.03$ bits per step

The information transfer rate: $ITR = 0.03 \cdot 6.67 = 0.2$
So I am learning how to dance at a rate of 0.2 bits per minute.
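The arithmetic above can be checked with a few lines of Python (the text rounds the intermediate value $B$ to 0.03, so the final product differs slightly from the unrounded result):

```python
from math import log2

# Values from the first example
N, n, e, t = 2, 40, 60, 15

P = n / (n + e)        # accuracy: 0.4
v = (n + e) / t        # learning speed: ~6.67 steps per minute
B = log2(N) + P * log2(P) + (1 - P) * log2((1 - P) / (N - 1))
ITR = B * v

print(round(B, 3), round(ITR, 2))  # roughly 0.029 bits and 0.19 bits/min
```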

***

Second Example:

Now, I learn how to dance Poco-poco. This line dance comes from North Sulawesi in Indonesia.
The basic steps are

• move left
• move right
• move forward
• move backward
• lean forward
• lean backward
• twist left
• twist right

So the number of information types is $N = 8$.

More advanced steps are

• cross forward right
• cross forward left
• cross backward right
• cross backward left
• and so on

So $N$ can increase depending on how well you learn the dance.
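To see how a larger step vocabulary raises the payload per step, here is a quick sketch using the bits-per-transfer formula (the fixed accuracy $P = 0.9$ is a hypothetical value of my own; in practice accuracy usually drops as $N$ grows):

```python
from math import log2

def bits_per_transfer(N, P):
    # B = log2 N + P log2 P + (1 - P) log2((1 - P) / (N - 1)), for 0 < P < 1
    return log2(N) + P * log2(P) + (1 - P) * log2((1 - P) / (N - 1))

P = 0.9  # hypothetical accuracy, assumed the same for every N
for N in (2, 4, 8, 12):
    print(N, round(bits_per_transfer(N, P), 2))
```

At the same accuracy, the 8-step Poco-poco vocabulary carries several times more bits per step than the 2-step left/right dance.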

You can learn the basic steps of Poco-poco from here.

***

Third Example:

Well, in a brain-computer interface experiment, a human subject has to do a task with the commands LEFT, RIGHT, UP, DOWN, so $N = 4$. The time $t$, the number of successful tasks $n$, and the number of mistakes $e$ are measured. In the end, you can calculate the information transfer rate (ITR). More of this example can be read in my master thesis (here: wordpress, blogspot, scribd).

***

Well, I am still looking for books about this bits per transfer equation. It has something to do with information entropy.

Bits per transfer ($B$): $B = \log_2{N} + P \cdot \log_2{P} + (1 - P) \cdot \log_2{\left(\frac{1 - P}{N - 1}\right)}$

From Wikipedia, the binary entropy function ($H_b$): $H_b(p) = -p \cdot \log_2{p} - (1 - p) \cdot \log_2{(1 - p)}$
with $p$ as the probability in a Bernoulli process, i.e. a process with only two states, for example success/failure, yes/no, true/false, on/off, or the two sides of a coin toss.

From Wikipedia, the Shannon entropy function ($H$): $H(X) = -\sum_{i = 1}^{n}{p(x_i) \cdot \log_b{p(x_i)}}$
with $p$ as the probability mass function of a discrete random variable $X$ with possible values $\{ x_1, x_2, \dotsc, x_n \}$,
and $b$ as the base of the logarithm, i.e. the number of possible states of each $x_i$ (in a binary or Bernoulli process, $b = 2$).
Entropy is a measure of the uncertainty associated with a random variable. Shannon entropy quantifies the expected value of the information contained in a message. In a binary process, it is quantified in bits.
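The connection can be made explicit. Expanding $\log_2{\left(\frac{1 - P}{N - 1}\right)} = \log_2{(1 - P)} - \log_2{(N - 1)}$ in the bits-per-transfer formula and collecting the $P \cdot \log_2{P}$ and $(1 - P) \cdot \log_2{(1 - P)}$ terms into $-H_b(P)$ gives (my own rearrangement, worth double-checking):

$B = \log_2{N} - H_b(P) - (1 - P) \cdot \log_2{(N - 1)}$

So $B$ is the maximum $\log_2{N}$ bits per transfer, reduced by the uncertainty $H_b(P)$ about whether a transfer succeeded, and by the $(1 - P) \cdot \log_2{(N - 1)}$ bits needed to say which of the $N - 1$ wrong symbols occurred. For $N = 2$ this reduces to $B = 1 - H_b(P)$, which matches the first example: $1 - H_b(0.4) \approx 1 - 0.97 = 0.03$.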

Other reference:

1. J.R. Wolpaw, N. Birbaumer, D.J. McFarland, G. Pfurtscheller, and T.M. Vaughan, “Brain-computer interfaces for communication and control,” Clinical Neurophysiology, vol. 113, pp. 767-791, 2002.
2. M. Cheng, X. Gao, S. Gao, and D. Xu, “Design and Implementation of a Brain-Computer Interface with High Transfer Rate,” IEEE Transactions on Biomedical Engineering, vol. 49, pp. 1181-1186, October 2002.
3. Atmawan-Bisawarna, I.S.C., “Improvement of Response Times in SSVEP-based Brain-Computer Interface,” Master thesis, Information and Automation Engineering, University of Bremen, 2010.

Nürnberg, 17 March 2012