A) Alice Jones B) John Smith C) David A. Huffman D) Robert Johnson
A) ASCII encoding B) Fixed-length encoding C) Binary encoding D) Variable-length encoding
A) Rare symbols B) Symbols at odd indices C) Symbols starting with A D) Frequent symbols
A) A code with equal-length codewords B) A code that uses only 0s and 1s C) A code that starts with the same symbol D) A code where no codeword is a prefix of another
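The options above turn on the defining property of a prefix code: no codeword is a prefix of any other. A minimal sketch of checking that property (the function name and examples are illustrative, not from the quiz):

```python
def is_prefix_free(codewords):
    """Return True if no codeword is a prefix of another.

    After lexicographic sorting, any codeword that is a prefix of
    another ends up immediately before some codeword it prefixes,
    so checking adjacent pairs suffices.
    """
    codes = sorted(codewords)
    return all(not codes[i + 1].startswith(codes[i])
               for i in range(len(codes) - 1))

print(is_prefix_free(["0", "10", "110", "111"]))  # → True
print(is_prefix_free(["0", "01", "11"]))          # → False ("0" prefixes "01")
```

Prefix-freeness is what lets an encoded bit stream be decoded unambiguously without separators between codewords.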
A) Complete tree B) Optimal binary tree C) Balanced tree D) Perfect tree
A) Number of symbols B) Encoding speed C) Memory consumption D) Compression ratio
A) O(n) B) O(log n) C) O(n log n) D) O(n²)
A) Compressing the data B) Calculating symbol frequencies C) Building a linked list D) Assigning binary codes to symbols
A) Most frequent symbol B) Symbol with the longest name C) Symbol with a prime number D) Least frequent symbol
A) Queue B) Stack C) Binary heap D) Linked list
A) Infix codes B) Postfix codes C) Prefix codes D) Suffix codes
A) 1952 B) 1949 C) 1955 D) 1960
A) Run-length encoding B) Shannon-Fano coding C) Arithmetic coding D) Lempel-Ziv-Welch (LZW)
A) h(a_i) = log2(w_i) B) h(a_i) = 2w_i C) h(a_i) = -log2(w_i) D) h(a_i) = w_i * log2(w_i)
A) H(A) = ∑_{w_i > 0} log2(w_i) B) H(A) = ∑_{w_i > 0} h(a_i) / w_i C) H(A) = ∑_{w_i > 0} w_i / log2(w_i) D) H(A) = -∑_{w_i > 0} w_i * log2(w_i)
A) It contributes negatively to the entropy B) It is equal to the symbol's information content C) It equals the inverse of its weight D) Zero, since lim_(w→0+) w * log2(w) = 0
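The three questions above use the standard definitions: a symbol's information content h(a_i) = -log2(w_i), the entropy H(A) = -∑_{w_i > 0} w_i * log2(w_i), and the convention that zero-weight symbols contribute nothing since lim w * log2(w) = 0 as w → 0+. A small sketch of these formulas (function names are illustrative):

```python
from math import log2

def info_content(w):
    """Information content of a symbol with weight (probability) w > 0:
    h(a_i) = -log2(w_i)."""
    return -log2(w)

def entropy(weights):
    """Shannon entropy H(A) = -sum over w_i > 0 of w_i * log2(w_i).
    Zero-weight symbols are skipped: lim_{w→0+} w * log2(w) = 0."""
    return -sum(w * log2(w) for w in weights if w > 0)

print(info_content(0.25))              # → 2.0 bits
print(entropy([0.5, 0.25, 0.25]))      # → 1.5 bits per symbol
print(entropy([0.5, 0.25, 0.25, 0.0])) # zero weight contributes nothing
```

Entropy is the lower bound on the average codeword length any prefix code (including Huffman's) can achieve for the given weights.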
A) An internal node B) Following the right child C) Following the left child D) A leaf node
A) Array B) Queue C) Priority queue D) Stack
A) Four B) Three C) One D) Two
A) Both queues simultaneously B) The second queue C) Neither queue D) The first queue
A) By keeping initial weights in the first queue and combined weights in the second queue B) By only enqueuing nodes with unique weights C) By randomly selecting nodes from either queue D) By sorting both queues by weight after each insertion
A) Choose the item in the second queue B) Choose the item in the first queue C) Remove both items and start over D) Randomly select an item from either queue
A) They remain as leaf nodes B) They are combined into a new internal node C) They are removed from the tree D) They become root nodes
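The questions above walk through the two-queue Huffman construction: sorted leaf weights go in the first queue, combined weights are appended to the second, sorted order is preserved automatically because combined weights emerge in nondecreasing order, and ties are broken in favor of the first queue. A minimal sketch under those rules (it tracks only weights and returns the total weighted path length, not the tree itself):

```python
from collections import deque

def huffman_two_queue(weights):
    """Two-queue Huffman construction over a list of positive weights.

    Queue 1 holds the initial (leaf) weights, pre-sorted; queue 2 holds
    weights of combined internal nodes. Each step dequeues the two
    smallest weights overall, combines them into a new internal node,
    and enqueues the sum at the back of queue 2. Both queues stay
    sorted without a priority queue, so after the initial sort the
    construction takes linear time.
    """
    leaves = deque(sorted(weights))   # first queue: initial weights
    combined = deque()                # second queue: combined weights
    cost = 0                          # total weighted path length

    def pop_min():
        # On a tie, take from the first queue (the leaf queue).
        if not combined or (leaves and leaves[0] <= combined[0]):
            return leaves.popleft()
        return combined.popleft()

    while len(leaves) + len(combined) > 1:
        a, b = pop_min(), pop_min()   # two least-weight nodes
        combined.append(a + b)        # they become one internal node
        cost += a + b
    return cost

print(huffman_two_queue([5, 9, 12, 13, 16, 45]))  # → 224
```

The returned cost equals ∑ w_i * depth(a_i) in the resulting tree, i.e. the total length of the encoded message for those symbol frequencies.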
A) Fax machines. B) Image encoding for web pages. C) Audio file compression. D) Text compression in word processors.
A) Problems that do not involve weights. B) Problems related to sorting data. C) Minimizing the maximum weighted path length, among others. D) Only compression-related problems.
A) Adaptive Huffman algorithm. B) Template Huffman algorithm. C) The package-merge algorithm. D) Binary Huffman algorithm.
A) T. C. Hu. B) Richard M. Karp. C) Alan Turing. D) Adriano Garsia.
A) The transmission cost. B) The binary representation. C) The alphabetic order. D) The frequency of occurrence.
A) Stanford University B) Harvard University C) MIT D) Princeton University
A) The original text must be stored alongside the compressed version. B) A frequency table must be stored with the compressed text. C) No additional information needs to be stored. D) An encryption key must accompany the compressed data.