**1. Actual**

"There is nothing in the definition that tells you how to provide yourself with S and P, given some *actual* experiment."

**2. Alignment**

"The code string is shifted by the same amount in order to maintain *alignment*."

**3. Arithmetic**

"Binary coders (such as the Q-coder) are an important special case in *arithmetic* coding."

**4. Binomial Distribution**

""

**5. Channel**

"A *channel* is a communication device with two ends, an input end, or transmitter, and an output end, or receiver."

**6. Condition**

"This theorem may seem, at first glance, to be saying that all you have to do to find the capacity of a channel and the optimal input frequencies is to solve the capacity equations of the channel, the equations arising from the Lagrange Multiplier Theorem, and the *condition* p1 + ... + pn = 1, for p1, ..., pn > 0."

**7. Content**

"Since the average information *content* per character is H(S), it follows that the source is emitting information at the rate rH(S) information units per unit time, on average."
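As a small sketch of the rate rH(S) (the four-symbol source and the rate r used here are assumed for illustration, not taken from the text):

```python
from math import log2

def entropy(probs):
    """Shannon entropy H(S) in bits per character."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Hypothetical source: four characters, emitted at r = 100 characters/second.
probs = [0.5, 0.25, 0.125, 0.125]
r = 100
H = entropy(probs)   # 1.75 bits per character
rate = r * H         # 175 bits per second, on average
```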

**8. Denote**

"Let C *denote* the channel capacity."

**9. Differ**

"By the Law of Large Numbers, if N is large, the source words of length N in which the proportions of the source letters within the word *differ* markedly from the fj have very small probability, collectively."

**10. Entire**

"Furthermore, there is a waste of time involved: you have to wait until the *entire* source string is scanned and processed before you have the code for it."

**11. Exact**

"There are two inconveniences to be dealt with when there is a long run of no rescaling. The first is that we have to carry on doing *exact* computations of the endpoints α and α + ℓ with smaller and smaller values of ℓ."
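A minimal sketch of the interval narrowing being described, using exact rational arithmetic and an assumed three-symbol model (not the text's coder): each symbol shrinks ℓ by its probability, so without rescaling ℓ decays geometrically and the exact endpoints get ever more expensive to carry.

```python
from fractions import Fraction

# Assumed cumulative model: 'a' has prob 1/2, 'b' 1/4, 'c' 1/4.
cum = {'a': (Fraction(0), Fraction(1, 2)),
       'b': (Fraction(1, 2), Fraction(3, 4)),
       'c': (Fraction(3, 4), Fraction(1))}

alpha, ell = Fraction(0), Fraction(1)   # current interval [alpha, alpha + ell)
for sym in "abac":
    lo, hi = cum[sym]
    alpha = alpha + ell * lo            # move to the symbol's subinterval
    ell = ell * (hi - lo)               # ell shrinks by the symbol's width
# After "abac", ell = 1/2 * 1/4 * 1/2 * 1/4 = 1/64
```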

**12. Fingerprint**

""

**13. Frequency**

"We now return to the general case, with A = {a1, ..., an} and ai having input *frequency* pi."

**14. Implicit**

"The usual theory of binary block codes starts from the *implicit* assumption that the relative source frequencies are equal."

**15. Independent**

"The hidden cost suffered here is fixed, *independent* of the length of the source text, and is, therefore, usually essentially negligible."

**16. Inexact**

""

**17. Input**

"*Input* frequency: the relative proportion of occurrence of an input symbol ai. Let's say we have a memoryless channel with input alphabet A = {a1, ..., an}, output alphabet B = {b1, ..., bn}, and transition probabilities qij, for i ∈ {1, ..., n}. The input frequency of a character ai is denoted by pi."

**18. Joint**

"Similarly, the *joint* and conditional entropies are average values of certain kinds of ..."

**19. Jump**

"The sequence x, as seen by the Fourier transform, makes a considerable *jump* each time k goes from −9 to −8, −1 to 0, 7 to 8, 15 to 16, etc."

**20. Matching**

"This can provide very fast *matching*, but it significantly reduces the size of the dictionary."

**21. Prime**

"The constant 40543 which appears in the definition is *prime*, but is there any other reason for its choice?"
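The quote does not reproduce the definition itself; as a hedged sketch only, a multiplicative string hash using 40543 as the prime multiplier might look like the following, where the table size 4096 and the character handling are my own illustrative assumptions. A prime multiplier helps spread structured inputs more evenly across the hash buckets.

```python
def fingerprint(s, mod=4096):
    """Multiplicative string hash with prime multiplier 40543.
    (Illustrative sketch; the text's exact definition may differ.)"""
    h = 0
    for ch in s:
        h = (h * 40543 + ord(ch)) % mod
    return h
```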

**22. Recursive**

"This feature of the utility is known as *recursive* or iterative compression, and will enable you to compress your data files to a tiny fraction of the original size."

**23. Relative**

"When p = 0, the capacity is zero, and any *relative* input frequencies are 'optimal'."

**24. Success**

"When speaking of some unspecified Bernoulli trial, we will call one possible outcome *Success*, or S, and the other Failure, or F."
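With Success probability p, the number of Successes in n independent Bernoulli trials follows the binomial distribution of entry 4. A small sketch of its probability mass function:

```python
from math import comb

def binomial_pmf(n, k, p):
    """P(exactly k Successes in n independent Bernoulli trials,
    each with Success probability p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# e.g. 3 Successes in 10 fair-coin trials: comb(10, 3) = 120 ways,
# each with probability (1/2)**10, giving 120/1024.
```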

**25. Symmetric**

"A binary *symmetric* channel (BSC, for short) is a memoryless channel with A = B = {0, 1}; whichever digit, 0 or 1, is being transmitted, the probability p that it will get through correctly is called the reliability of the channel."
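One way to see the definition in action is to simulate the channel. This sketch (the function name and interface are my own, not the text's) delivers each bit correctly with probability p and flips it with probability 1 − p:

```python
import random

def bsc_transmit(bits, p, rng=None):
    """Send bits through a binary symmetric channel with reliability p."""
    rng = rng or random.Random()
    return [b if rng.random() < p else 1 - b for b in bits]

# With p = 1 the channel is noiseless; with p = 0 every bit is flipped.
```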

**26. Typical**

"Higher-order models may do better, where the necessary statistics are gathered from a sample of '*typical*' text."

**27. Sequence**

The word v is called the leave of the parsing of W into a sequence of the sᵢ.
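The terminology can be illustrated with a greedy parser (an illustrative sketch; the text's parsing procedure may differ in detail): W is split into a sequence of the given strings, and whatever suffix cannot be matched is returned as the leave v, possibly empty.

```python
def parse(W, strings):
    """Greedily parse W into a sequence of the given strings;
    the unmatched suffix is returned as the leave v."""
    pieces, i = [], 0
    while i < len(W):
        for s in sorted(strings, key=len, reverse=True):  # longest match first
            if W.startswith(s, i):
                pieces.append(s)
                i += len(s)
                break
        else:
            return pieces, W[i:]  # nothing matches here: the rest is the leave
    return pieces, ""             # parsed completely, empty leave
```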

**28. Binary**

The most common sort of choice for source alphabet is S = {0, 1}^L, the set of binary words of some fixed length L.
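For example, with L = 3 the set S = {0, 1}^L contains 2^3 = 8 binary words, which can be enumerated directly:

```python
from itertools import product

L = 3
# S = {0,1}^L: all binary words of fixed length L, so |S| = 2**L.
S = ["".join(map(str, w)) for w in product([0, 1], repeat=L)]
# S == ['000', '001', '010', '011', '100', '101', '110', '111']
```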

NP on the adaptive coding assignment. 7 extra points for the words in assignment 3.

ResponderEliminar