TASK
Implementation
Build a basic in-memory message queue:
- Producers enqueue messages
- Consumers dequeue messages
- Messages delivered in FIFO order
- Thread-safe for concurrent access
- Support blocking and non-blocking receive
This decouples producers and consumers in time.
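For comparison, Python's standard-library `queue.Queue` already provides these semantics (FIFO order, thread safety, blocking and non-blocking receive), so it is a useful reference point for the behavior the task asks for. A quick illustration:

```python
import queue

q = queue.Queue(maxsize=2)

# Producer side: enqueue without waiting for a consumer.
q.put("m1")
q.put("m2")

# Blocking receive: returns the oldest message, waiting if necessary.
first = q.get()

# Non-blocking receive: raises queue.Empty rather than waiting.
second = q.get_nowait()
try:
    q.get_nowait()
    drained = False
except queue.Empty:
    drained = True  # queue had no message available right now
```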
Sample Test Cases
FIFO order (Timeout: 5000ms)
Input
{
  "src": "c0",
  "dest": "n1",
  "body": {
    "type": "init",
    "msg_id": 1,
    "node_id": "n1",
    "node_ids": [
      "n1"
    ]
  }
}
Expected Output
{"src":"n1","dest":"c0","body":{"type":"init_ok","in_reply_to":1,"msg_id":0}}

Concurrent access (Timeout: 5000ms)
Input
{
  "src": "c0",
  "dest": "n1",
  "body": {
    "type": "init",
    "msg_id": 1,
    "node_id": "n1",
    "node_ids": [
      "n1"
    ]
  }
}
Expected Output
{"src":"n1","dest":"c0","body":{"type":"init_ok","in_reply_to":1,"msg_id":0}}

Hints
Hint 1: Use a thread-safe data structure.
Hint 2: Block on an empty queue, or return None for a non-blocking receive.
Hint 3: Handle multiple concurrent producers and consumers.
OVERVIEW
Message Queues
Queues decouple components: producers emit messages without waiting, consumers process at their own pace. This enables asynchronous processing, load leveling, and resilient architectures.
FIFO Ordering
First-in-first-out ensures messages are processed in send order. This matters for ordered event streams. Strict FIFO limits parallelism - a trade-off to consider.
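The FIFO guarantee above can be demonstrated with a plain `collections.deque`: appending at the tail and popping from the head preserves send order.

```python
from collections import deque

q = deque()
for msg in ["m1", "m2", "m3"]:
    q.append(msg)  # enqueue at the tail

# Dequeue from the head: messages come back in the order they were sent.
received = [q.popleft() for _ in range(len(q))]
```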
Key Concepts
queue, producer-consumer, FIFO
main.py
#!/usr/bin/env python3
import sys
import json
import threading
from collections import deque


class MessageQueue:
    def __init__(self, max_size=10000):
        self.queue = deque()
        self.max_size = max_size
        self.lock = threading.Lock()
        # Both conditions share one lock, so enqueue and dequeue
        # can safely signal each other.
        self.not_empty = threading.Condition(self.lock)
        self.not_full = threading.Condition(self.lock)

    def enqueue(self, message, timeout=None):
        # TODO: Add message, block if full
        pass

    def dequeue(self, timeout=None):
        # TODO: Remove and return message, block if empty
        pass

    def size(self):
        with self.lock:
            return len(self.queue)


if __name__ == "__main__":
    q = MessageQueue()
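One possible way to fill in the TODO bodies, sketched here rather than prescribed: it relies on `threading.Condition.wait_for`, which re-checks a predicate under the lock and returns False on timeout. With `timeout=None` the call blocks indefinitely; with `timeout=0` it is effectively non-blocking. The return conventions (True/False from enqueue, None from a timed-out dequeue) are choices of this sketch, not requirements of the task.

```python
import threading
from collections import deque


class MessageQueue:
    def __init__(self, max_size=10000):
        self.queue = deque()
        self.max_size = max_size
        self.lock = threading.Lock()
        # One shared lock lets each side notify the other's condition.
        self.not_empty = threading.Condition(self.lock)
        self.not_full = threading.Condition(self.lock)

    def enqueue(self, message, timeout=None):
        """Add a message; block while full. Returns False on timeout."""
        with self.not_full:
            if not self.not_full.wait_for(
                    lambda: len(self.queue) < self.max_size, timeout):
                return False
            self.queue.append(message)
            self.not_empty.notify()  # wake one blocked consumer
            return True

    def dequeue(self, timeout=None):
        """Remove and return the oldest message; block while empty.

        timeout=0 gives a non-blocking receive; returns None on timeout.
        """
        with self.not_empty:
            if not self.not_empty.wait_for(lambda: self.queue, timeout):
                return None
            message = self.queue.popleft()
            self.not_full.notify()  # wake one blocked producer
            return message

    def size(self):
        with self.lock:
            return len(self.queue)
```

A consumer that calls `dequeue()` with no timeout blocks until a producer enqueues, which is what decouples the two sides in time.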