Fundamentals of Multimedia
Quiz 4: Lossless Compression Algorithms
Question 1
Essay
Calculate the entropy of a "checkerboard" image in which half of the pixels are BLACK and half of them are WHITE.
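A minimal sketch of the underlying computation (not part of the original question): Shannon entropy of a two-symbol source, with the 50/50 checkerboard split as input.

import math

def entropy(probs):
    # Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero-probability terms
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Checkerboard image: half the pixels BLACK, half WHITE
print(entropy([0.5, 0.5]))  # 1.0 bit per pixel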
Question 2
Essay
Consider a text string containing a set of characters and their frequency counts as follows: A: (15), B: (7), C: (6), D: (6) and E: (5). Show that the Shannon-Fano algorithm produces a tree that encodes this string in a total of 89 bits, whereas the Huffman algorithm needs only 87 bits.
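As a cross-check, a short sketch (not from the original) that computes the total Huffman-coded length directly from the frequency counts. It uses the fact that the weighted path length of the tree equals the sum of the weights of all internal nodes.

import heapq

def huffman_total_bits(freqs):
    # Sum of internal-node weights = total weighted path length of the Huffman tree
    heap = list(freqs)
    heapq.heapify(heap)
    total = 0
    while len(heap) > 1:
        a, b = heapq.heappop(heap), heapq.heappop(heap)
        total += a + b  # each merge adds one bit to every symbol below it
        heapq.heappush(heap, a + b)
    return total

print(huffman_total_bits([15, 7, 6, 6, 5]))  # 87 bits, matching the Huffman figure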
Question 3
Essay
Suppose we have a source consisting of two symbols, X and Y, with probabilities P_X = 2/3 and P_Y = 1/3. (a) Using Arithmetic Coding, what are the resulting bitstrings for the inputs XXX and YXX? (b) A major advantage of adaptive algorithms is that no a priori knowledge of symbol probabilities is required. (1) Propose an Adaptive Arithmetic Coding algorithm and briefly describe how it would work. For simplicity, let's assume both encoder and decoder know that exactly k = 3 symbols will be sent each time. (2) Assume the sequence of messages is initially 3 XXX's, followed by 11 YYY's; that is, the sequence is XXXXXXXXX YYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYY (9 X's followed by 33 Y's). Show how your Adaptive Arithmetic Coding algorithm works, for this sequence, for encoding: (i) the first XXX, (ii) the second XXX, and (iii) the last YYY.
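For part (a), a sketch of how the coding interval narrows, assuming the common convention that X's subinterval precedes Y's (the question does not fix an ordering); exact fractions avoid rounding error.

from fractions import Fraction

def arith_interval(seq, p, order="XY"):
    # Narrow [low, low + width) one symbol at a time
    low, width = Fraction(0), Fraction(1)
    for sym in seq:
        # cumulative probability of the symbols that precede sym
        cum = sum((p[t] for t in order[:order.index(sym)]), Fraction(0))
        low += width * cum
        width *= p[sym]
    return low, low + width

p = {"X": Fraction(2, 3), "Y": Fraction(1, 3)}
print(arith_interval("XXX", p))  # [0, 8/27): any binary fraction inside works
print(arith_interval("YXX", p))  # [2/3, 22/27)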
Question 4
Essay
For the LZW algorithm, assume an alphabet {*, a, b, o, d} of size 5 from which text characters are picked. Suppose the following string of characters is to be transmitted (or stored): dabba*dabba*dabba*doo*doo…
The coder starts with an initial dictionary consisting of the above 5 characters and an index assignment, as in the following table (not reproduced here). Make a table showing the string being assembled, the current character, the output, the new symbol, and the index corresponding to the symbol, for the above input - only go as far as "dabba*dabba*d".
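A sketch of the LZW loop that the requested table should trace. The index assignment 1..5 for *, a, b, o, d is an assumption here, since the original table is not reproduced.

def lzw_encode(text, alphabet):
    dictionary = {ch: i for i, ch in enumerate(alphabet, start=1)}
    next_index = len(alphabet) + 1
    s, output = "", []
    for c in text:
        if s + c in dictionary:
            s += c  # keep extending the current match
        else:
            output.append(dictionary[s])     # output: code for s
            dictionary[s + c] = next_index   # new symbol: s + c
            next_index += 1
            s = c
    if s:
        output.append(dictionary[s])
    return output

print(lzw_encode("dabba*dabba*d", "*abod"))  # codes emitted up to the cutoff point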
Question 5
Essay
Consider an alphabet with three symbols A, B, C, with probabilities P(A) = x, P(B) = y, and P(C) = 1 - x - y. State how you would go about plotting the entropy as a function of x and y. Pseudocode is perhaps the best way to state your answer.
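One possible shape for the requested pseudocode, written as a runnable sketch: sweep the valid region x >= 0, y >= 0, x + y <= 1 on a grid and evaluate H at each point.

import math

def entropy3(x, y):
    # H for P(A)=x, P(B)=y, P(C)=1-x-y, in bits; zero-probability terms contribute 0
    return -sum(p * math.log2(p) for p in (x, y, 1 - x - y) if p > 0)

step = 0.1
for i in range(int(1 / step) + 1):
    y = i * step
    row = []
    for j in range(int(1 / step) + 1):
        x = j * step
        # points outside the probability simplex are left blank
        row.append(f"{entropy3(x, y):5.3f}" if x + y <= 1 + 1e-9 else "  .  ")
    print(" ".join(row))  # a text-mode "plot"; the same grid can feed a surface plotter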
Question 6
Essay
Is the following code uniquely decodable?
1 ↦ a
2 ↦ aba
3 ↦ bab
Question 7
Essay
Consider the question of whether it is true that the size of the Huffman code of a symbol a_i with probability p_i is always less than or equal to ⌈-log p_i⌉. As a counterexample, let the source alphabet be {a, b}. Given p(a) and p(b), construct a Huffman tree. Will the tree be different for different values of p(a) and p(b)? What conclusion can you draw?
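A quick numeric probe of the two-symbol case (the probability values below are chosen purely for illustration): Huffman merges the two leaves once, so each symbol always receives a 1-bit code regardless of p(a).

import math

for pa in (0.5, 0.8, 0.99):
    pb = 1 - pa
    # two leaves merge once: both symbols always get 1-bit codes
    print(f"p(a)={pa:.2f}: code length 1 vs ceil(-log2 p(a)) = "
          f"{math.ceil(-math.log2(pa))}; "
          f"p(b)={pb:.2f}: code length 1 vs ceil(-log2 p(b)) = "
          f"{math.ceil(-math.log2(pb))}")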
Question 8
Essay
Suppose we wish to transmit the 10-character string "MULTIMEDIA". The characters in the string are chosen from a finite alphabet of 8 characters. (a) What are the probabilities of each character in the source string? (b) Compute the entropy of the source string. (c) If the source string is encoded using Huffman coding, draw the encoding tree and compute the average number of bits needed.
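A sketch covering parts (a) through (c) numerically (the encoding tree itself still has to be drawn by hand):

import heapq
import math
from collections import Counter

s = "MULTIMEDIA"
n = len(s)
freq = Counter(s)  # M and I occur twice; the other six characters once each

# (a) probabilities and (b) entropy in bits per character
H = -sum((c / n) * math.log2(c / n) for c in freq.values())
print({ch: c / n for ch, c in freq.items()})
print(f"entropy = {H:.4f} bits/char")

# (c) average Huffman code length, via the sum of internal-node weights
heap = list(freq.values())
heapq.heapify(heap)
total = 0
while len(heap) > 1:
    a, b = heapq.heappop(heap), heapq.heappop(heap)
    total += a + b
    heapq.heappush(heap, a + b)
print(f"average code length = {total / n} bits/char")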
Question 9
Essay
If the source string MULTIMEDIA is now encoded using Arithmetic coding, what is the codeword in fractional decimal representation? How many bits are needed for coding in binary format? How does this compare to the entropy?
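A sketch of the size calculation: the final arithmetic-coding interval width is the product of the symbol probabilities, and the bit count below uses the common ceil(-log2 width) approximation, which is an assumption about the variant of the bound intended here.

import math
from collections import Counter
from fractions import Fraction

s = "MULTIMEDIA"
n = len(s)
freq = Counter(s)

width = Fraction(1)
for ch in s:
    width *= Fraction(freq[ch], n)  # interval shrinks by P(ch) at each step

bits = math.ceil(-math.log2(width))
H = -sum((c / n) * math.log2(c / n) for c in freq.values())
print(f"final interval width = {float(width):.3g}")
print(f"bits needed ~ {bits}, vs total entropy {n * H:.2f} bits")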