How to implement a Golang cache

Caching is a common technique used to store and retrieve frequently accessed data more efficiently. There are various ways to implement caching in Go, depending on your specific requirements. Here, I'll cover a simple in-memory cache using a map.

Role-Based Access Control (RBAC) is a common approach for managing user roles and permissions in a system. In our case study, users have roles, and for each module, they can have read or write permissions. The goal is to implement middleware that evaluates and enforces these authorization rights for each incoming request.

There are two common ways to handle this:

  1. Embed the roles in the token, so each request carries its own rights. This can work for small systems, but it is a bad idea for big ones: tokens grow large, and revoking or changing a role requires reissuing them.
  2. Query the database for the user's roles on every request. This leads to heavy database traffic, but a cache can absorb most of it - which is the subject of this article.

Simple In-Memory Cache using a Map

type Cache struct {
	locker sync.Mutex
	Store  map[string]map[string]map[string]bool
}

// package-level singleton state used by NewCache
var (
	once  sync.Once
	cache Cache
)

func NewCache() CacheInterface {
	once.Do(func() {
		cache = Cache{
			Store: make(map[string]map[string]map[string]bool),
		}
	})
	return &cache
}

The cache implements an interface with three methods:

type CacheInterface interface {
	Put(username, module, key string, right bool)
	Get(username, module, key string) (bool, httperors.HttpErr)
	Invalidate(username string)
}        
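For completeness, here is a sketch of how the three methods could be implemented. The article's `httperors.HttpErr` type is package-specific, so this sketch substitutes the standard `error` interface; the nested maps are created lazily on `Put`.

```go
package main

import (
	"errors"
	"fmt"
	"sync"
)

// Cache mirrors the struct above: username -> module -> key -> right.
type Cache struct {
	locker sync.Mutex
	Store  map[string]map[string]map[string]bool
}

// Put records a right, creating the nested maps as needed.
func (c *Cache) Put(username, module, key string, right bool) {
	c.locker.Lock()
	defer c.locker.Unlock()
	if c.Store[username] == nil {
		c.Store[username] = make(map[string]map[string]bool)
	}
	if c.Store[username][module] == nil {
		c.Store[username][module] = make(map[string]bool)
	}
	c.Store[username][module][key] = right
}

// Get returns the cached right, or an error on a cache miss so the
// caller knows to fall back to the database.
func (c *Cache) Get(username, module, key string) (bool, error) {
	c.locker.Lock()
	defer c.locker.Unlock()
	right, ok := c.Store[username][module][key]
	if !ok {
		return false, errors.New("cache miss")
	}
	return right, nil
}

// Invalidate drops all cached rights for a user, e.g. after a role change.
func (c *Cache) Invalidate(username string) {
	c.locker.Lock()
	defer c.locker.Unlock()
	delete(c.Store, username)
}

func main() {
	c := &Cache{Store: make(map[string]map[string]map[string]bool)}
	c.Put("myrachanto", "users", "READ", true)
	right, err := c.Get("myrachanto", "users", "READ")
	fmt.Println(right, err)
}
```

Note that reading from a nil nested map in `Get` is safe in Go (it yields the zero value), so only `Put` needs to allocate the inner maps.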

In real-world systems with hundreds or thousands of users, storing every user's rights indefinitely quickly leads to memory bloat and degrades performance. Managing the cache's memory consumption is therefore crucial. Here are some strategies:

  1. Use expiration (TTL) - record when each cache entry was added, and run a goroutine with a "time.Ticker" that periodically deletes entries older than a set time limit.
  2. Use LRU (Least Recently Used) or LFU (Least Frequently Used) policies - track usage and periodically evict the least recently or least frequently used entries.
  3. Periodic cleanup - sweep the cache on a schedule and clear it whenever it exceeds a set duration or size.

Testing the package

package cacher

import (
	"testing"

	"github.com/stretchr/testify/assert"
)

func TestCache(t *testing.T) {
	tests := []struct {
		name           string
		actions        func(CacheInterface)
		expectedResult bool
		expectedError  bool
	}{
		{
			name: "Put and Get cache entry",
			actions: func(c CacheInterface) {
				c.Put("myrachanto", "users", "READ", true)
			},
			expectedResult: true,
			expectedError:  false,
		},
		{
			name: "Invalidate cache entry and Get",
			actions: func(c CacheInterface) {
				c.Put("myrachanto", "users", "READ", true)
				c.Invalidate("myrachanto")
			},
			expectedResult: false,
			expectedError:  true,
		},
		{
			name: "Get non-existing user",
			actions: func(c CacheInterface) {
			},
			expectedResult: false,
			expectedError:  true,
		},
	}

	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			// NewCache is a singleton (sync.Once), so the same instance is
			// returned in every subtest; reset shared state to keep subtests independent.
			cache := NewCache()
			cache.Invalidate("myrachanto")
			tt.actions(cache)
			result, err := cache.Get("myrachanto", "users", "READ")

			assert.Equal(t, tt.expectedResult, result)
			if tt.expectedError {
				assert.NotNil(t, err)
			} else {
				assert.Nil(t, err)
			}
		})
	}
}
        

The complete package is available here: The cache package

For distributed systems, better options include Redis and other in-memory data stores, which provide shared state across instances along with built-in expiration and eviction policies.
