<p>I need to implement a cache; let's say an LRU cache. I have chosen a map and a doubly linked list.</p>
<pre><code>type LRUCache struct {
	size  int
	mu    sync.RWMutex
	cache map[string]*list.Element
	queue *list.List
}
</code></pre>
<p>Each cache entry holds a string key and a hit counter.</p>
<pre><code>type lruItem struct {
	key     string
	counter int
}
</code></pre>
<p>On a cache hit, the counter is bumped. Get and Put are straightforward. But I also need a Top function that returns the top 20 cache entries, and the complexity of this operation should be O(log n).
So iterating over the map, copying all the key/value pairs into a slice, and then sorting it is not a good idea. I need help. Any advice?</p>
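For context, Get and Put for a map-plus-list LRU really are simple. Here is a minimal sketch under stated assumptions: the `value` field, the `NewLRUCache` constructor, and the eviction details are my additions for illustration, not the poster's actual code.

```go
package main

import (
	"container/list"
	"fmt"
	"sync"
)

type lruItem struct {
	key     string
	value   interface{} // assumed field; the original struct has only key and counter
	counter int
}

type LRUCache struct {
	size  int
	mu    sync.RWMutex
	cache map[string]*list.Element
	queue *list.List
}

func NewLRUCache(size int) *LRUCache {
	return &LRUCache{
		size:  size,
		cache: make(map[string]*list.Element),
		queue: list.New(),
	}
}

// Get moves the element to the front of the recency list and bumps its hit counter.
func (c *LRUCache) Get(key string) (interface{}, bool) {
	c.mu.Lock()
	defer c.mu.Unlock()
	el, ok := c.cache[key]
	if !ok {
		return nil, false
	}
	c.queue.MoveToFront(el)
	item := el.Value.(*lruItem)
	item.counter++
	return item.value, true
}

// Put inserts or updates a key, evicting the least recently used entry when full.
func (c *LRUCache) Put(key string, value interface{}) {
	c.mu.Lock()
	defer c.mu.Unlock()
	if el, ok := c.cache[key]; ok {
		c.queue.MoveToFront(el)
		el.Value.(*lruItem).value = value
		return
	}
	if c.queue.Len() >= c.size {
		if back := c.queue.Back(); back != nil {
			c.queue.Remove(back)
			delete(c.cache, back.Value.(*lruItem).key)
		}
	}
	c.cache[key] = c.queue.PushFront(&lruItem{key: key, value: value})
}

func main() {
	c := NewLRUCache(2)
	c.Put("a", 1)
	c.Put("b", 2)
	c.Get("a")    // "a" becomes most recently used
	c.Put("c", 3) // evicts "b", the least recently used
	_, ok := c.Get("b")
	fmt.Println(ok) // false
}
```

Both operations are O(1); the open question is only how to get Top in O(log n) on top of this layout.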
<hr/>**Comments:**<br/><br/>

msaggar96: <pre><p>Consider using data structures that have log(n) lookups, specifically heaps, which can be implemented with arrays, sort of similar to what you have. Using a max heap could get you the top 20 items in log(n). </p></pre>

allhatenocattle: <pre><p>I've always thought of heaps as working well with static objects. How would you use a heap in this case, where the counter field of an lruItem increases over time?</p></pre>

babbarshaer: <pre><p>I think that on each operation you would restructure the heap so that its order is maintained. The restructuring key would in this case be the counter. </p></pre>

msaggar96: <pre><p>Yeah, look up heapify algorithms and restructure based on your needs. </p></pre>

DoomFrog666: <pre><p>You could use a queue for the first 20 elements, including their values.</p>
<p>When you get a miss, pop one element off the queue and append the requested one.</p>
<p>Of course, this only works well if you have up to 20 elements that are highly requested.
It has horrible performance on random reads.</p></pre>