ThatQuiz Test Library
Huffman Coding
Contributed by: Singh
  • 1. Who introduced Huffman Coding?
A) Alice Jones
B) John Smith
C) David A. Huffman
D) Robert Johnson
  • 2. Which type of encoding does Huffman Coding use?
A) ASCII encoding
B) Fixed-length encoding
C) Binary encoding
D) Variable-length encoding
  • 3. In Huffman Coding, what type of symbols have shorter codes?
A) Rare symbols
B) Symbols at odd indices
C) Symbols starting with A
D) Frequent symbols
  • 4. What is a prefix code in Huffman Coding?
A) A code with equal-length codewords
B) A code that uses only 0s and 1s
C) A code that starts with the same symbol
D) A code where no codeword is a prefix of another
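The prefix property asked about in question 4 is easy to check mechanically. A minimal sketch (the function name is my own): sorting the codewords guarantees that any prefix relation appears between lexicographic neighbours, so one adjacent pass suffices.

```python
def is_prefix_code(codewords):
    """Return True if no codeword is a prefix of another (a prefix code)."""
    words = sorted(codewords)
    # After sorting, if x is a prefix of y then every word between x and y
    # also starts with x, so checking adjacent pairs catches every violation.
    return all(not words[i + 1].startswith(words[i])
               for i in range(len(words) - 1))

print(is_prefix_code(["0", "10", "110", "111"]))  # a valid prefix code
print(is_prefix_code(["0", "01"]))                # "0" is a prefix of "01"
```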
  • 5. What is a Huffman tree also known as?
A) Complete tree
B) Optimal binary tree
C) Balanced tree
D) Perfect tree
  • 6. How is the efficiency of Huffman Coding usually measured?
A) Number of symbols
B) Encoding speed
C) Memory consumption
D) Compression ratio
  • 7. What's the worst-case time complexity of building a Huffman tree?
A) O(n)
B) O(log n)
C) O(n log n)
D) O(n²)
  • 8. Which step comes after building the Huffman tree in the encoding process?
A) Compressing the data
B) Calculating symbol frequencies
C) Building a linked list
D) Assigning binary codes to symbols
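Questions 7 and 8 describe the usual pipeline: count symbol frequencies, repeatedly merge the two lowest-weight nodes via a priority queue (n − 1 merges at O(log n) each, hence O(n log n)), then walk the tree to assign prefix codes. A minimal sketch, assuming Python's `heapq` as the priority queue and an ad-hoc tuple layout for tree nodes:

```python
import heapq
from collections import Counter

def huffman_codes(text):
    # Step 1: count symbol frequencies.
    freq = Counter(text)
    # Step 2: build the tree, always merging the two lowest-weight nodes.
    # Heap entries are (weight, tiebreak id, node); a leaf node is a symbol,
    # an internal node is a (left, right) pair. The unique id keeps tuple
    # comparison from ever reaching the non-comparable node field.
    heap = [(w, i, s) for i, (s, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    count = len(heap)
    if count == 1:  # degenerate one-symbol alphabet still needs one bit
        return {heap[0][2]: "0"}
    while len(heap) > 1:
        w1, _, n1 = heapq.heappop(heap)
        w2, _, n2 = heapq.heappop(heap)
        heapq.heappush(heap, (w1 + w2, count, (n1, n2)))
        count += 1
    # Step 3: walk the tree; 0 = left child, 1 = right child.
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):       # internal node: recurse
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:
            codes[node] = prefix          # leaf: record the codeword
    walk(heap[0][2], "")
    return codes

print(huffman_codes("abracadabra"))
```

The most frequent symbol (`'a'` here) receives the shortest code, and no codeword is a prefix of another, illustrating questions 9 and 11 as well.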
  • 9. In Huffman Coding, what symbol is typically assigned the shortest code?
A) Most frequent symbol
B) Symbol with the longest name
C) Symbol with a prime number
D) Least frequent symbol
  • 10. Which data structure is commonly used to implement a priority queue in Huffman Coding?
A) Queue
B) Stack
C) Binary heap
D) Linked list
  • 11. What kind of codes does Huffman Coding produce?
A) Infix codes
B) Postfix codes
C) Prefix codes
D) Suffix codes
  • 12. In which year was the paper 'A Method for the Construction of Minimum-Redundancy Codes' published?
A) 1952
B) 1949
C) 1955
D) 1960
  • 13. Which method can replace Huffman coding if a better compression ratio is required?
A) Run-length encoding
B) Shannon-Fano coding
C) Arithmetic coding
D) Lempel-Ziv-Welch (LZW)
  • 14. How is the information content h(a_i) of a symbol a_i defined?
A) h(a_i) = log2(1 / w_i)
B) h(a_i) = 2w_i
C) h(a_i) = log2(w_i)
D) h(a_i) = w_i * log2(w_i)
  • 15. What is the formula for entropy H(A)?
A) H(A) = ∑_{w_i > 0} log2(w_i)
B) H(A) = ∑_{w_i > 0} h(a_i) / w_i
C) H(A) = ∑_{w_i > 0} w_i / log2(w_i)
D) H(A) = -∑_{w_i > 0} w_i * log2(w_i)
  • 16. What is the contribution of a symbol with zero probability to entropy?
A) It contributes negatively to the entropy
B) It is equal to the symbol's information content
C) It equals the inverse of its weight
D) Zero, since lim_(w→0+) w * log2(w) = 0
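Questions 14–16 quote the standard definitions. A short numeric check (function names are my own): the filter on w > 0 implements the convention that zero-probability symbols contribute nothing, since lim_(w→0+) w * log2(w) = 0.

```python
import math

def info_content(w):
    """Information content h(a_i) = log2(1 / w_i) of a symbol with probability w."""
    return math.log2(1 / w)

def entropy(weights):
    """Shannon entropy H(A) = -sum over w_i > 0 of w_i * log2(w_i)."""
    # Skipping w == 0 matches the limiting convention w * log2(w) -> 0.
    return -sum(w * math.log2(w) for w in weights if w > 0)

print(info_content(0.25))                  # a 1-in-4 symbol carries 2 bits
print(entropy([0.5, 0.25, 0.25, 0.0]))    # the zero-probability symbol adds nothing
```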
  • 17. What does bit '0' represent in a Huffman tree?
A) An internal node
B) Following the right child
C) Following the left child
D) A leaf node
  • 18. Which data structure is used for efficient insertion and retrieval of nodes by probability in a simple Huffman tree construction algorithm?
A) Array
B) Queue
C) Priority queue
D) Stack
  • 19. How many queues are used in the linear-time method to create a Huffman tree?
A) Four
B) Three
C) One
D) Two
  • 20. In the linear-time Huffman tree construction, where are initial weights enqueued?
A) Both queues simultaneously
B) The second queue
C) Neither queue
D) The first queue
  • 21. When constructing a Huffman tree using two queues, how do you ensure the lowest weight is always at the front?
A) By keeping initial weights in the first queue and combined weights in the second queue
B) By only enqueuing nodes with unique weights
C) By randomly selecting nodes from either queue
D) By sorting both queues by weight after each insertion
  • 22. How do you break ties between queues to minimize variance in Huffman coding?
A) Choose the item in the second queue
B) Choose the item in the first queue
C) Remove both items and start over
D) Randomly select an item from either queue
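Questions 19–22 outline the two-queue linear-time construction. A sketch under the stated assumptions (weights arrive presorted ascending; names are illustrative): initial weights go into the first queue, combined weights into the second, and because combined weights are produced in nondecreasing order, the front of each queue always holds a current minimum, so no heap is needed. Ties are broken in favour of the first queue to keep codeword-length variance low. The function returns the total weighted path length, i.e. the number of bits needed to encode the input.

```python
from collections import deque

def huffman_cost_two_queues(sorted_weights):
    q1 = deque(sorted_weights)  # queue 1: initial (leaf) weights, ascending
    q2 = deque()                # queue 2: combined weights, produced ascending

    def pop_min():
        # Both fronts hold minima; break ties toward queue 1 (lower variance).
        if not q2 or (q1 and q1[0] <= q2[0]):
            return q1.popleft()
        return q2.popleft()

    total = 0
    while len(q1) + len(q2) > 1:
        combined = pop_min() + pop_min()  # merge the two smallest weights
        q2.append(combined)               # combined weights never decrease
        total += combined                 # each merge adds one tree level's cost
    return total

print(huffman_cost_two_queues([1, 1, 2, 2, 5]))
```

Every operation is an O(1) queue step and there are n − 1 merges, so the whole construction is linear once the weights are sorted.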
  • 23. What happens to the two nodes with the smallest probability during Huffman tree construction?
A) They remain as leaf nodes
B) They are combined into a new internal node
C) They are removed from the tree
D) They become root nodes
  • 24. What is a common use of modified Huffman coding?
A) Fax machines
B) Image encoding for web pages
C) Audio file compression
D) Text compression in word processors
  • 25. What kind of problems can Huffman template algorithms solve?
A) Problems that do not involve weights
B) Problems related to sorting data
C) Minimizing the maximum weighted path length, among others
D) Only compression-related problems
  • 26. What algorithm solves the problem of length-limited Huffman coding?
A) Adaptive Huffman algorithm
B) Template Huffman algorithm
C) The package-merge algorithm
D) Binary Huffman algorithm
  • 27. Who solved the Huffman coding problem with unequal letter costs?
A) T. C. Hu
B) Richard M. Karp
C) Alan Turing
D) Adriano Garsia
  • 28. In alphabetic Huffman coding, what must be identical between inputs and outputs?
A) The transmission cost
B) The binary representation
C) The alphabetic order
D) The frequency of occurrence
  • 29. Which university was David A. Huffman attending when he developed the algorithm?
A) Stanford University
B) Harvard University
C) MIT
D) Princeton University
  • 30. What is required when using Huffman coding with unknown input probabilities?
A) The original text must be stored alongside the compressed version.
B) A frequency table must be stored with the compressed text.
C) No additional information needs to be stored.
D) An encryption key must accompany the compressed data.