File: control

Source: golang-github-bep-lazycache
Section: golang
Priority: optional
Maintainer: Debian Go Packaging Team <team+pkg-go@tracker.debian.org>
Uploaders: Anthony Fok <foka@debian.org>
Rules-Requires-Root: no
Build-Depends: debhelper-compat (= 13),
               dh-sequence-golang,
               golang-any,
               golang-github-frankban-quicktest-dev (>= 1.14.2),
               golang-github-hashicorp-golang-lru-v2-dev (>= 2.0.7)
Testsuite: autopkgtest-pkg-go
Standards-Version: 4.6.2
Vcs-Browser: https://salsa.debian.org/go-team/packages/golang-github-bep-lazycache
Vcs-Git: https://salsa.debian.org/go-team/packages/golang-github-bep-lazycache.git
Homepage: https://github.com/bep/lazycache
XS-Go-Import-Path: github.com/bep/lazycache

Package: golang-github-bep-lazycache-dev
Architecture: all
Multi-Arch: foreign
Depends: golang-github-frankban-quicktest-dev (>= 1.14.2),
         golang-github-hashicorp-golang-lru-v2-dev (>= 2.0.7),
         ${misc:Depends}
Description: Thread-safe in-memory LRU cache with non-blocking cache priming on cache misses
 Lazycache is a simple thread-safe in-memory LRU cache.  Under the hood
 it leverages the great simplelru package in golang-lru, with its excellent
 performance.  One big difference between golang-lru and this library is
 the GetOrCreate method, which provides:
 .
  * Non-blocking cache priming on cache misses.
  * A guarantee that the prime function is only called once for a given key.
  * The cache's RWMutex is not locked during the execution of the prime
    function, which should make it easier to reason about potential deadlocks.
 .
 Other notable features:
 .
  * The API is generic.
  * The cache can be resized while running.
  * When the number of entries overflows the defined cache size, the
    least recently used item gets discarded (LRU).
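
For reference, a minimal usage sketch of the GetOrCreate behaviour described above. It assumes the upstream API as published at github.com/bep/lazycache (New taking an Options struct with a MaxEntries field, and GetOrCreate returning value, found and error); check the package documentation for the exact signatures before relying on this.

package main

import (
	"fmt"

	"github.com/bep/lazycache"
)

func main() {
	// Create a cache holding at most 100 entries; the least recently
	// used entry is evicted when the limit is exceeded.
	cache := lazycache.New(lazycache.Options[string, string]{MaxEntries: 100})

	// GetOrCreate primes the cache on a miss. Per the description above,
	// the create function is called at most once for a given key, and the
	// cache's RWMutex is not held while it runs.
	value, found, err := cache.GetOrCreate("greeting", func(key string) (string, error) {
		return "hello from " + key, nil
	})
	if err != nil {
		panic(err)
	}
	// found is assumed here to report whether the value was already cached.
	fmt.Println(value, found)
}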