README
redis-cache - HTTP JSON caching proxy for Redis
Architecture Overview (what the code does)
- The proxy webserver listens on /redis for GET requests with the param key.
- Up to 100 client requests are processed concurrently; the rest are queued. Each client connection remains open until its request is processed.
- Each key/value from a GET request is cached in a fixed-capacity LRU cache with a global expiry.
- Fewer than 100 LOC.
Make and Run
- note: flags are optional
make test
./redis-cache --ttl 360 --keycap 128 --port 8000 --address localhost:6379
Docker
docker-compose up
Complexity of Caching - Big-O
Example Client
package main

import (
	"log"

	"github.com/apibillme/restly"
)

func main() {
	req := restly.New()
	res, statusCode, err := restly.GetJSON(req, "http://localhost:8000/redis", `?key=123`)
	if err != nil {
		log.Fatal(err)
	}
	if statusCode == 200 {
		value := res.Get("value").String()
		log.Println(value)
	} else {
		value := res.Get("error").String()
		log.Println(value)
	}
}
Load Testing
The first key set expires after one second, and the test ran for 5 seconds - run against the docker-compose setup in this repo on a local Mac.
Total Requests: 5000
Requests Per Second: 1000
Success Ratio: 100%
Max: 183.256182ms
Mean: 15.357077ms
50th percentile: 730.348µs
95th percentile: 99.38549ms
99th percentile: 149.668733ms
Not Implemented
- Redis RPC - writing the parser took too long, and handling the net package together with concurrent processing was something I didn't solve.
Documentation
There is no documentation for this package.