A) Alice Jones B) David A. Huffman C) Robert Johnson D) John Smith
A) Variable-length encoding B) ASCII encoding C) Binary encoding D) Fixed-length encoding
A) Symbols at odd indices B) Symbols starting with A C) Rare symbols D) Frequent symbols
A) A code where no codeword is a prefix of another B) A code that uses only 0s and 1s C) A code that starts with the same symbol D) A code with equal-length codewords
A) Perfect tree B) Complete tree C) Optimal binary tree D) Balanced tree
A) Memory consumption B) Compression ratio C) Encoding speed D) Number of symbols
A) O(n²) B) O(log n) C) O(n) D) O(n log n)
A) Compressing the data B) Calculating symbol frequencies C) Assigning binary codes to symbols D) Building a linked list
A) Least frequent symbol B) Symbol with a prime number C) Most frequent symbol D) Symbol with the longest name
A) Stack B) Queue C) Linked list D) Binary heap
A) Prefix codes B) Infix codes C) Suffix codes D) Postfix codes
A) 1960 B) 1952 C) 1955 D) 1949
A) Lempel-Ziv-Welch (LZW) B) Run-length encoding C) Arithmetic coding D) Shannon-Fano coding
A) h(a_i) = w_i * log2(w_i) B) h(a_i) = log2(1 / w_i) C) h(a_i) = -log2(w_i) D) h(a_i) = 2w_i
A) H(A) = ∑(w_i > 0) log2(w_i) B) H(A) = -∑(w_i > 0) w_i * log2(w_i) C) H(A) = ∑(w_i > 0) h(a_i) / w_i D) H(A) = ∑(w_i > 0) w_i / log2(w_i)
A) Zero, since lim_(w→0+) w * log2(w) = 0 B) It contributes negatively to the entropy C) It equals the inverse of its weight D) It is equal to the symbol's information content
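The information-content and entropy formulas appearing in the options above can be checked with a short sketch (the weights and helper names here are illustrative, not part of the quiz):

```python
import math

def information_content(w):
    """Per-symbol information content h(a_i) = -log2(w_i), for weight w_i > 0."""
    return -math.log2(w)

def entropy(weights):
    """Shannon entropy H(A) = -sum over w_i > 0 of w_i * log2(w_i).
    Zero-weight symbols contribute nothing, since lim_(w->0+) w * log2(w) = 0."""
    return -sum(w * math.log2(w) for w in weights if w > 0)

# A fair two-symbol alphabet carries exactly one bit of entropy per symbol.
print(entropy([0.5, 0.5]))  # 1.0
```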
A) Following the right child B) A leaf node C) Following the left child D) An internal node
A) Array B) Queue C) Stack D) Priority queue
A) Two B) Four C) One D) Three
A) The second queue B) The first queue C) Neither queue D) Both queues simultaneously
A) By keeping initial weights in the first queue and combined weights in the second queue B) By sorting both queues by weight after each insertion C) By randomly selecting nodes from either queue D) By only enqueuing nodes with unique weights
A) Randomly select an item from either queue B) Choose the item in the second queue C) Choose the item in the first queue D) Remove both items and start over
A) They become root nodes B) They remain as leaf nodes C) They are removed from the tree D) They are combined into a new internal node
A) Audio file compression. B) Fax machines. C) Image encoding for web pages. D) Text compression in word processors.
A) Only compression-related problems. B) Problems related to sorting data. C) Minimizing the maximum weighted path length, among others. D) Problems that do not involve weights.
A) Binary Huffman algorithm. B) Template Huffman algorithm. C) The package-merge algorithm. D) Adaptive Huffman algorithm.
A) Richard M. Karp. B) Alan Turing. C) Adriano Garsia. D) T. C. Hu.
A) The transmission cost. B) The frequency of occurrence. C) The binary representation. D) The alphabetic order.
A) Stanford University B) MIT C) Harvard University D) Princeton University
A) A frequency table must be stored with the compressed text. B) The original text must be stored alongside the compressed version. C) An encryption key must accompany the compressed data. D) No additional information needs to be stored.
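The concepts the quiz covers — a frequency table, a binary-heap priority queue, repeated merging of the two least frequent nodes, and the resulting prefix code — fit in one short sketch (a minimal illustration, not a production implementation; helper names are assumptions):

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Return a prefix-free code for each symbol in `text`.
    The least frequent symbols sit deepest in the tree and get the
    longest codewords; each merge costs O(log n) on a binary heap."""
    freq = Counter(text)  # the frequency table stored alongside the output
    # Heap entries: (weight, tiebreak, tree); the tiebreak keeps tuples comparable.
    heap = [(w, i, sym) for i, (sym, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:  # degenerate single-symbol alphabet
        return {heap[0][2]: '0'}
    count = len(heap)
    while len(heap) > 1:
        w1, _, left = heapq.heappop(heap)   # two lowest-weight nodes...
        w2, _, right = heapq.heappop(heap)
        count += 1
        heapq.heappush(heap, (w1 + w2, count, (left, right)))  # ...merged
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):
            walk(node[0], prefix + '0')  # following the left child appends 0
            walk(node[1], prefix + '1')  # following the right child appends 1
        else:
            codes[node] = prefix         # a leaf node yields a symbol
        return codes
    return walk(heap[0][2], '')

codes = huffman_codes("abracadabra")
```

Since no codeword is a prefix of another, the encoded bit stream can be decoded unambiguously by walking the tree from the root for each codeword.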