TASK
Implementation
Implement a shared cache accessible by all nodes. Instead of each node maintaining its own cache, a dedicated cache server handles all cache operations.
Benefits:
- No duplicate cached data
- Single point for invalidation
- Better memory utilization
Trade-offs:
- Network hop for every cache access
- Cache server becomes a bottleneck
- Single point of failure
Sample Test Cases
Global cache get/set (timeout: 5000ms)
Input
{"src":"c0","dest":"cache","body":{"type":"init","msg_id":1,"node_id":"cache","node_ids":["cache","n1","n2"]}}
{"src":"n1","dest":"cache","body":{"type":"get","msg_id":2,"key":"x"}}
{"src":"n2","dest":"cache","body":{"type":"set","msg_id":3,"key":"x","value":100}}
{"src":"n1","dest":"cache","body":{"type":"get","msg_id":4,"key":"x"}}
Expected Output
{"src":"cache","dest":"c0","body":{"type":"init_ok","in_reply_to":1,"msg_id":0}}
{"src":"cache","dest":"n1","body":{"type":"get_ok","in_reply_to":2,"msg_id":1,"value":null}}
{"src":"cache","dest":"n2","body":{"type":"set_ok","in_reply_to":3,"msg_id":2}}
{"src":"cache","dest":"n1","body":{"type":"get_ok","in_reply_to":4,"msg_id":3,"value":100}}
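The transcript above can be reproduced with a small single-threaded handler. Below is a minimal sketch that reads newline-delimited JSON messages (as in the sample input) and writes replies; `run_cache_node` is an illustrative name, not part of the starter code.

```python
import json
import sys


def run_cache_node(lines, out):
    """Handle init/get/set messages from an iterable of JSON strings.

    Field names and reply shapes follow the sample transcript above;
    msg_id is a per-node counter starting at 0.
    """
    cache = {}
    node_id = None
    next_msg_id = 0

    def reply(dest, body):
        nonlocal next_msg_id
        body["msg_id"] = next_msg_id
        next_msg_id += 1
        out.write(json.dumps({"src": node_id, "dest": dest, "body": body}) + "\n")

    for line in lines:
        msg = json.loads(line)
        body = msg["body"]
        if body["type"] == "init":
            node_id = body["node_id"]
            reply(msg["src"], {"type": "init_ok", "in_reply_to": body["msg_id"]})
        elif body["type"] == "get":
            # A missing key replies with value null, as in the sample output.
            reply(msg["src"], {"type": "get_ok", "in_reply_to": body["msg_id"],
                               "value": cache.get(body["key"])})
        elif body["type"] == "set":
            cache[body["key"]] = body["value"]
            reply(msg["src"], {"type": "set_ok", "in_reply_to": body["msg_id"]})


if __name__ == "__main__":
    run_cache_node(sys.stdin, sys.stdout)
```

Feeding it the four sample input lines produces the four expected replies in order.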
Hints
Hint 1: All nodes access a single cache instance.
Hint 2: Use the network protocol for cache access.
Hint 3: Handle concurrent access safely.
Hint 4 (Go/Python tip): avoid holding the lock while calling reply/send; this causes deadlocks with non-reentrant locks.
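The last hint can be shown directly: hold the lock only while touching the dictionary, and send the reply after releasing it. This is an illustrative sketch (`handle_get`, `cache`, and `reply` are hypothetical names, not part of the starter code).

```python
import threading

lock = threading.Lock()
cache = {}


def handle_get(key, reply):
    # Read under the lock, but release it before sending the reply.
    # If reply() ever tries to take the same non-reentrant lock
    # (directly or via a callback), holding it here would deadlock.
    with lock:
        value = cache.get(key)
    reply({"type": "get_ok", "value": value})
```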
OVERVIEW
Theoretical Hub
Global Cache Architecture
A global cache centralizes cached data in one or more dedicated servers. All application nodes contact the cache server instead of maintaining local caches. This is the model used by Redis and Memcached.
Look-Aside Cache Pattern
In look-aside (cache-aside), the application checks the cache first; on a miss, it queries the database and populates the cache itself. The cache is passive: it knows nothing about the database.
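The look-aside read path can be sketched in a few lines; `get_user` is a hypothetical function, and `cache`/`db` stand for any objects exposing `get`/`set` and `query`.

```python
def get_user(user_id, cache, db):
    # Look-aside: the application checks the cache first.
    user = cache.get(user_id)
    if user is None:
        user = db.query(user_id)    # miss: go to the database...
        cache.set(user_id, user)    # ...and populate the cache ourselves
    return user
```

Note that the cache never touches the database; the application owns the miss path.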
Look-Through Cache Pattern
In look-through, the cache handles database interaction itself: on a miss, it fetches from the database automatically. This simplifies application code but couples the cache to the database.
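By contrast, a look-through cache owns the miss path itself, so callers only ever see the cache interface. A minimal sketch, assuming a `db` object with a `query` method:

```python
class LookThroughCache:
    """Look-through: the cache fetches from the database on a miss."""

    def __init__(self, db):
        self.db = db      # the cache is coupled to the database
        self.data = {}

    def get(self, key):
        if key not in self.data:
            self.data[key] = self.db.query(key)  # fetch on miss
        return self.data[key]
```

The application just calls `cache.get(key)` and never talks to the database directly.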
Key Concepts
shared cache, cache coherence, single point of truth
main.py
#!/usr/bin/env python3
import sys
import json
import time
import threading


class CacheServer:
    def __init__(self):
        self.cache = {}  # key -> (value, expires_at)
        self.lock = threading.Lock()

    def get(self, key):
        # Thread-safe get; an expired entry counts as a miss.
        with self.lock:
            entry = self.cache.get(key)
            if entry is None:
                return None
            value, expires_at = entry
            if time.time() >= expires_at:
                del self.cache[key]
                return None
            return value

    def set(self, key, value, ttl=60):
        # Thread-safe set with a TTL in seconds.
        with self.lock:
            self.cache[key] = (value, time.time() + ttl)

    def delete(self, key):
        # Thread-safe delete; a missing key is a no-op.
        with self.lock:
            self.cache.pop(key, None)


class CacheClient:
    def __init__(self, server):
        self.server = server

    def get(self, key):
        # In a real deployment this is a network request; here the
        # client delegates directly to the shared server object.
        return self.server.get(key)

    def set(self, key, value, ttl=60):
        self.server.set(key, value, ttl)


class Application:
    def __init__(self, cache_client, database):
        self.cache = cache_client
        self.db = database