Information transfer rate ($\mathrm{ITR}$):

$$\mathrm{ITR} = B \cdot V$$

with $B$ as bits per transfer and $V$ as speed,

$$V = \frac{S + E}{T}$$

with $S$ as good information transfers, $E$ as errors, and $T$ as the transfer time.
Bits per transfer ($B$):

$$B = \log_2 N + P \log_2 P + (1 - P) \log_2 \frac{1 - P}{N - 1}$$

with $N$ as the number of information types, which shows how many kinds of information you can transfer, and $P$ as the accuracy, which means the number of good information transfers ($S$) divided by the total:

$$P = \frac{S}{S + E}$$
I am learning how to dance. The basic steps are just going left and right; the dance instructor calls out "left" and "right". One example is "left, left, right, left, right, right, left, right". Those are the first eight steps of a line dance. In this case, the information types are just LEFT and RIGHT.

In 15 minutes, the instructor has just taught me a dance routine. There were 100 steps. Phew, I am sweating. Because I am a beginner, I made 60 mistakes and 40 good steps.
The accuracy ($P$):

$$P = \frac{40}{100} = 0.4$$

The learning speed ($V$):

$$V = \frac{40 + 60}{15} \approx 6.67 \text{ transfers per minute}$$

The bits per learning ($B$), with $N = 2$:

$$B = \log_2 2 + 0.4 \log_2 0.4 + 0.6 \log_2 \frac{0.6}{2 - 1} \approx 0.029 \text{ bits per learning}$$

The information transfer rate:

$$\mathrm{ITR} = B \cdot V \approx 0.029 \times 6.67 \approx 0.2 \text{ bits per minute}$$

So I am learning how to dance at a rate of about 0.2 bits per minute.
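The whole dance calculation can be sketched in a few lines of Python. This is just my own sketch of the Wolpaw bits-per-transfer formula; the function name `bits_per_transfer` is a label I chose, not from any library:

```python
import math

def bits_per_transfer(n, p):
    """Wolpaw formula: log2(N) + P*log2(P) + (1-P)*log2((1-P)/(N-1))."""
    b = math.log2(n)
    if p > 0:
        b += p * math.log2(p)
    if p < 1:
        b += (1 - p) * math.log2((1 - p) / (n - 1))
    return b

good, errors = 40, 60          # 40 good steps, 60 mistakes
minutes = 15                   # duration of the lesson
n = 2                          # information types: LEFT and RIGHT
p = good / (good + errors)     # accuracy = 0.4
v = (good + errors) / minutes  # speed: 100 steps / 15 min ≈ 6.67 per minute
itr = bits_per_transfer(n, p) * v
print(round(itr, 2))           # prints 0.19, i.e. about 0.2 bits per minute
```

The guards for `p > 0` and `p < 1` avoid evaluating `log2(0)`, which would raise an error at perfect accuracy or total failure.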
- move left
- move right
- move forward
- move backward
- lean forward
- lean backward
- twist left
- twist right
So the number of information types is $N = 8$.
More advanced steps are
- cross forward right
- cross forward left
- cross backward right
- cross backward left
- and so on
So $N$ can increase, depending on how well you learn the dance.
You can learn the basic steps of Poco-poco from here.
Well, in a brain-computer interface experiment, a human subject has to do tasks with four commands: LEFT, RIGHT, UP, DOWN. So $N = 4$. The time $T$, the number of successful tasks $S$, and the number of mistakes $E$ are measured. In the end, you can calculate the information transfer rate (ITR). More of this example can be read in my master thesis (here: wordpress, blogspot, scribd).
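The same calculation carries over to a BCI session. As a sketch, here is one helper function (`wolpaw_itr` is my own name for it); the session numbers below, 90 correct and 10 wrong tasks in 10 minutes, are hypothetical and not taken from the thesis:

```python
import math

def wolpaw_itr(n, good, errors, minutes):
    """ITR in bits per minute: (bits per transfer) * (transfers per minute)."""
    total = good + errors
    p = good / total                  # accuracy
    b = math.log2(n)                  # bits per transfer (Wolpaw formula)
    if p > 0:
        b += p * math.log2(p)
    if p < 1:
        b += (1 - p) * math.log2((1 - p) / (n - 1))
    return b * total / minutes

# Hypothetical session: 4 commands, 90 correct and 10 wrong tasks in 10 minutes
print(round(wolpaw_itr(4, 90, 10, 10), 2))  # prints 13.73
```

With four commands and 90% accuracy the subject transfers about 1.37 bits per task, and at 10 tasks per minute that gives roughly 13.7 bits per minute.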
Well, I am still looking for books about this bits per transfer equation. It has something to do with information entropy.
Bits per transfer ($B$):

$$B = \log_2 N + P \log_2 P + (1 - P) \log_2 \frac{1 - P}{N - 1}$$
From Wikipedia, the binary entropy function ($H_b$):

$$H_b(p) = -p \log_2 p - (1 - p) \log_2 (1 - p)$$

with $p$ as a probability in a Bernoulli process, which contains only two states, for example success-failure, yes-no, true-false, on-off, or the two sides of a tossed coin.
From Wikipedia, the Shannon entropy function ($H$):

$$H(X) = -\sum_{i=1}^{n} p(x_i) \log_b p(x_i)$$

with $p$ as the probability mass function of a discrete random variable $X$ with possible values $x_1, \ldots, x_n$, and $b$ as the base, corresponding to the possible states of $x$ (in a binary or Bernoulli process, $b = 2$).
Entropy is a measure of the uncertainty associated with a random variable. Shannon entropy quantifies the expected value of the information contained in a message. In a binary process, it is quantified in bits.
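Both entropies are easy to check numerically. The sketch below uses my own function names; note that for two information types the bits-per-transfer formula reduces to one minus the binary entropy of the accuracy, which matches the dance example:

```python
import math

def shannon_entropy(probs, base=2):
    """H(X) = -sum_i p(x_i) * log_b p(x_i); zero-probability terms contribute 0."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

def binary_entropy(p):
    """Shannon entropy of a two-state (Bernoulli) process."""
    return shannon_entropy([p, 1 - p])

print(binary_entropy(0.5))                # prints 1.0: a fair coin carries one full bit
print(round(binary_entropy(0.4), 3))      # prints 0.971
print(round(1 - binary_entropy(0.4), 3))  # prints 0.029: bits per transfer for N = 2
```

For $N = 2$ the Wolpaw formula becomes $1 + P \log_2 P + (1 - P) \log_2 (1 - P) = 1 - H_b(P)$, so the 0.029 bits per learning from the dance example is exactly one bit minus the entropy of a Bernoulli process with $p = 0.4$.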
- J.R. Wolpaw, N. Birbaumer, D.J. McFarland, G. Pfurtscheller, and T.M Vaughan, “Brain-computer interfaces for communication and control,” Clinical Neurophysiology, vol. 113, pp. 767-791, 2002.
- M. Cheng, X. Gao, S. Gao, and D. Xu, “Design and Implementation of a Brain-Computer Interface with High Transfer Rate,” IEEE Transactions on Biomedical Engineering, vol. 49, pp. 1181-1186, October 2002.
- Atmawan-Bisawarna, I.S.C., “Improvement of Response Times in SSVEP-based Brain-Computer Interface,” Master thesis, Information and Automation Engineering, University of Bremen, 2010.
Nürnberg, 17 March 2012