@types/lru-cache
- Version 7.6.1
- 17.9 kB
- No dependencies
- MIT license
Install
npm i @types/lru-cache
yarn add @types/lru-cache
pnpm add @types/lru-cache
Overview
TypeScript definitions for lru-cache
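A minimal usage sketch, assuming the lru-cache 7.x runtime package is installed alongside these typings (the export-assignment import shown is one option; with esModuleInterop a default import also works):

```ts
import LRUCache = require("lru-cache");

// A cache of at most 100 entries, each living for up to 5 minutes.
const cache = new LRUCache<string, number>({
  max: 100,
  ttl: 1000 * 60 * 5,
});

cache.set("answer", 42);
cache.get("answer");  // 42
cache.has("missing"); // false
```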
Index
Classes
LRUCache
- [Symbol.iterator]()
- allowStale
- calculatedSize
- clear()
- del()
- delete()
- dispose
- disposeAfter
- dump()
- entries()
- fetch()
- fetchMethod
- find()
- forEach()
- get()
- getRemainingTTL()
- has()
- keys()
- length
- load()
- max
- maxSize
- noDisposeOnSet
- peek()
- pop()
- prune()
- purgeStale()
- rentries()
- reset()
- rforEach()
- rkeys()
- rvalues()
- set()
- size
- sizeCalculation
- ttl
- ttlAutopurge
- ttlResolution
- updateAgeOnGet
- values()
Interfaces
Type Aliases
Classes
class LRUCache
class LRUCache<K, V> implements Iterable<[K, V]> {}
constructor
constructor(options: LRUCache.Options<K, V>);
property allowStale
readonly allowStale: boolean;
property calculatedSize
readonly calculatedSize: number;
The total size of items in cache when using size tracking.
property dispose
readonly dispose: LRUCache.Disposer<K, V>;
property disposeAfter
readonly disposeAfter: LRUCache.Disposer<K, V>;
Since 7.4.0
property fetchMethod
readonly fetchMethod: LRUCache.Fetcher<K, V>;
property length
readonly length: number;
Return total length of objects in cache taking into account the length options function.
Deprecated
since 7.0 use cache.size instead
property max
readonly max: number;
property maxSize
readonly maxSize: number;
property noDisposeOnSet
readonly noDisposeOnSet: boolean;
property size
readonly size: number;
The total number of items held in the cache at the current moment.
property sizeCalculation
readonly sizeCalculation: LRUCache.SizeCalculator<K, V>;
property ttl
readonly ttl: number;
property ttlAutopurge
readonly ttlAutopurge: boolean;
property ttlResolution
readonly ttlResolution: number;
property updateAgeOnGet
readonly updateAgeOnGet: boolean;
method [Symbol.iterator]
[Symbol.iterator]: () => Iterator<[K, V]>;
method clear
clear: () => void;
Clear the cache entirely, throwing away all values.
method del
del: (key: K) => boolean;
Deletes a key out of the cache.
Deprecated
since 7.0 use delete() instead
method delete
delete: (key: K) => boolean;
Deletes a key out of the cache. Returns true if the key was deleted, false otherwise.
method dump
dump: () => Array<[K, LRUCache.Entry<V>]>;
Return an array of [key, entry] objects which can be passed to cache.load()
method entries
entries: () => Generator<[K, V]>;
Return a generator yielding [key, value] pairs, in order from most recently used to least recently used.
method fetch
fetch: <ExpectedValue = V>(key: K, options?: LRUCache.FetchOptions) => Promise<ExpectedValue | undefined>;
Since 7.6.0
method find
find: <T = V>(callbackFn: (value: V, key: K, cache: this) => boolean | undefined | void, options?: LRUCache.GetOptions) => T;
Find a value for which the supplied fn method returns a truthy value, similar to Array.find(). fn is called as fn(value, key, cache).
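A short sketch of find(); the keys and values are illustrative:

```ts
import LRUCache = require("lru-cache");

const scores = new LRUCache<string, number>({ max: 100 });
scores.set("alice", 7);
scores.set("bob", 12);

// Scans from most to least recently used and returns the first match.
const firstHigh = scores.find((value) => value > 10); // 12
```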
method forEach
forEach: <T = this>(callbackFn: (this: T, value: V, key: K, cache: this) => void, thisArg?: T) => void;
Call the supplied callbackFn on each item in the cache, in order from most recently used to least recently used. It is called as callbackFn(value, key, cache).
method get
get: <T = V>(key: K, options?: LRUCache.GetOptions) => T | undefined;
Return a value from the cache. Will update the recency of the cache entry found. If the key is not found, get() will return undefined. This can be confusing when setting values specifically to undefined, as in cache.set(key, undefined). Use cache.has() to determine whether a key is present in the cache at all.
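A sketch of the get()/has() distinction described above; the key names are illustrative:

```ts
import LRUCache = require("lru-cache");

const cache = new LRUCache<string, string | undefined>({ max: 10 });
cache.set("key", undefined);

cache.get("key");  // undefined, even though the key is cached
cache.has("key");  // true
cache.get("nope"); // undefined, key was never set
cache.has("nope"); // false
```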
method getRemainingTTL
getRemainingTTL: (key: K) => number;
Since 7.6.0
method has
has: (key: K) => boolean;
Check if a key is in the cache, without updating the recency or age. Will return false if the item is stale, even though it is technically in the cache.
method keys
keys: () => Generator<K>;
Return a generator yielding the keys in the cache, in order from most recently used to least recently used.
method load
load: (cacheEntries: ReadonlyArray<[K, LRUCache.Entry<V>]>) => void;
Reset the cache and load in the items in entries in the order listed. Note that the shape of the resulting cache may be different if the same options are not used in both caches.
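A sketch of round-tripping entries between two caches with dump() and load(); both caches use the same options, per the note above:

```ts
import LRUCache = require("lru-cache");

const source = new LRUCache<string, number>({ max: 10 });
source.set("x", 1);
source.set("y", 2);

const copy = new LRUCache<string, number>({ max: 10 });
copy.load(source.dump()); // copy now holds the same entries in the same order
```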
method peek
peek: <T = V>(key: K, options?: LRUCache.PeekOptions) => T | undefined;
Like get() but doesn't update recency or delete stale items. Returns undefined if the item is stale, unless allowStale is set either on the cache or in the options object.
method pop
pop: () => V | undefined;
Evict the least recently used item, returning its value or undefined if the cache is empty.
method prune
prune: () => boolean;
Manually iterates over the entire cache proactively pruning old entries.
Deprecated
since 7.0 use purgeStale() instead
method purgeStale
purgeStale: () => boolean;
Delete any stale entries. Returns true if anything was removed, false otherwise.
method rentries
rentries: () => Generator<[K, V]>;
Return a generator yielding [key, value] pairs, in order from least recently used to most recently used.
method reset
reset: () => void;
Clear the cache entirely, throwing away all values.
Deprecated
since 7.0 use clear() instead
method rforEach
rforEach: <T = this>(callbackFn: (this: T, value: V, key: K, cache: this) => void, thisArg?: T) => void;
The same as cache.forEach(...) but items are iterated over in reverse order. (That is, less recently used items are iterated over first.)
method rkeys
rkeys: () => Generator<K>;
Return a generator yielding the keys in the cache, in order from least recently used to most recently used.
method rvalues
rvalues: () => Generator<V>;
Return a generator yielding the values in the cache, in order from least recently used to most recently used.
method set
set: (key: K, value: V, options?: LRUCache.SetOptions<K, V>) => this;
Add a value to the cache.
method values
values: () => Generator<V>;
Return a generator yielding the values in the cache, in order from most recently used to least recently used.
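A sketch of iteration order; the cache also implements Iterable<[K, V]>, so for...of works on the cache itself as well:

```ts
import LRUCache = require("lru-cache");

const c = new LRUCache<string, number>({ max: 3 });
c.set("a", 1);
c.set("b", 2);
c.set("c", 3);

for (const [key, value] of c.entries()) {
  console.log(key, value); // "c" 3, then "b" 2, then "a" 1
}

const reversed = [...c.rkeys()]; // ["a", "b", "c"], least to most recently used
```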
Interfaces
interface DeprecatedOptions
interface DeprecatedOptions<K, V> {}
property maxAge
maxAge?: number;
Maximum age in ms. Items are not pro-actively pruned out as they age, but if you try to get an item that is too old, it'll drop it and return undefined instead of giving it to you.
Deprecated
since 7.0 use options.ttl instead
property stale
stale?: boolean;
By default, if you set a maxAge, it'll only actually pull stale items out of the cache when you get(key). (That is, it's not pre-emptively doing a setTimeout or anything.) If you set stale: true, it'll return the stale value before deleting it. If you don't set this, then it'll return undefined when you try to get a stale entry, as if it had already been deleted.
Deprecated
since 7.0 use options.allowStale instead
method length
length: (value: V, key?: K) => number;
Function that is used to calculate the length of stored items. If you're storing strings or buffers, then you probably want to do something like function(n, key){return n.length}. The default is function(){return 1}, which is fine if you want to store max like-sized things. The item is passed as the first argument, and the key is passed as the second argument.
Deprecated
since 7.0 use options.sizeCalculation instead
interface Entry
interface Entry<V> {}
interface FetcherOptions
interface FetcherOptions<K, V> {}
interface FetchOptions
interface FetchOptions {}
property allowStale
allowStale?: boolean;
property updateAgeOnGet
updateAgeOnGet?: boolean;
interface GetOptions
interface GetOptions {}
property allowStale
allowStale?: boolean;
property updateAgeOnGet
updateAgeOnGet?: boolean;
interface LimitedByCount
interface LimitedByCount {}
property max
max: number;
The number of most recently used items to keep. Note that we may store fewer items than this if maxSize is hit.
interface LimitedBySize
interface LimitedBySize<K, V> {}
property maxSize
maxSize: number;
If you wish to track item size, you must provide a maxSize. Note that we still will only keep up to max *actual items*, so size tracking may cause fewer than max items to be stored. At the extreme, a single item of maxSize size will cause everything else in the cache to be dropped when it is added. Use with caution! Note also that size tracking can negatively impact performance, though for most cases, only minimally.
property sizeCalculation
sizeCalculation?: SizeCalculator<K, V>;
Function to calculate size of items. Useful if storing strings or buffers or other items where memory size depends on the object itself. Also note that oversized items do NOT immediately get dropped from the cache, though they will cause faster turnover in the storage.
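A sketch of a size-tracked cache using these options; the limits shown are illustrative:

```ts
import LRUCache = require("lru-cache");

const byBytes = new LRUCache<string, string>({
  max: 1000,            // hard cap on the number of entries
  maxSize: 1024 * 1024, // roughly 1 MiB of tracked size
  sizeCalculation: (value: string) => value.length,
});
```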
interface LimitedByTTL
interface LimitedByTTL {}
property allowStale
allowStale?: boolean;
Return stale items from cache.get() before disposing of them.
Default: false
property noUpdateTTL
noUpdateTTL?: boolean;
Boolean flag to tell the cache to not update the TTL when setting a new value for an existing key (ie, when updating a value rather than inserting a new value). Note that the TTL value is _always_ set (if provided) when adding a new entry into the cache.
Default: false. Since 7.4.0
property ttl
ttl: number;
Max time to live for items before they are considered stale. Note that stale items are NOT preemptively removed by default, and MAY live in the cache, contributing to its LRU max, long after they have expired.
Also, as this cache is optimized for LRU/MRU operations, some of the staleness/TTL checks will reduce performance, as they will incur overhead by deleting items.
Must be a positive integer in ms, defaults to 0, which means "no TTL"
property ttlAutopurge
ttlAutopurge?: boolean;
Preemptively remove stale items from the cache. Note that this may significantly degrade performance, especially if the cache is storing a large number of items. It is almost always best to just leave the stale items in the cache, and let them fall out as new items are added.
Note that this means that allowStale is a bit pointless, as stale items will be deleted almost as soon as they expire.
Use with caution!
Default: false. Since 7.1.0
property ttlResolution
ttlResolution?: number;
Minimum amount of time in ms in which to check for staleness. Defaults to 1, which means that the current time is checked at most once per millisecond.
Set to 0 to check the current time every time staleness is tested.
Note that setting this to a higher value will improve performance somewhat while using ttl tracking, albeit at the expense of keeping stale items around a bit longer than intended.
Default: 1. Since 7.1.0
property updateAgeOnGet
updateAgeOnGet?: boolean;
Update the age of items on cache.get(), renewing their TTL
Default: false
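A sketch of a TTL-bounded cache using the options above; the durations are illustrative:

```ts
import LRUCache = require("lru-cache");

const sessions = new LRUCache<string, string>({
  max: 10_000,
  ttl: 1000 * 60 * 30,  // entries go stale after 30 minutes
  updateAgeOnGet: true, // reading an entry renews its TTL
});

sessions.set("sid-1", "serialized session");
sessions.getRemainingTTL("sid-1"); // ms left before "sid-1" goes stale
```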
interface PeekOptions
interface PeekOptions {}
property allowStale
allowStale?: boolean;
interface SetOptions
interface SetOptions<K, V> {}
property noDisposeOnSet
noDisposeOnSet?: boolean;
property noUpdateTTL
noUpdateTTL?: boolean;
property size
size?: number;
A value for the size of the entry; prevents calls to the sizeCalculation function.
property sizeCalculation
sizeCalculation?: SizeCalculator<K, V>;
property ttl
ttl?: number;
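A sketch of per-call overrides via the set() options argument; the values are illustrative:

```ts
import LRUCache = require("lru-cache");

const c = new LRUCache<string, string>({
  max: 50,
  maxSize: 1024,
  sizeCalculation: (value: string) => value.length,
  ttl: 60_000,
});

c.set("greeting", "hello", {
  ttl: 5_000, // this entry expires sooner than the cache default
  size: 5,    // pre-computed size; skips sizeCalculation for this entry
});
```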
interface SharedOptions
interface SharedOptions<K, V> {}
property dispose
dispose?: Disposer<K, V>;
Function that is called on items when they are dropped from the cache. This can be handy if you want to close file descriptors or do other cleanup tasks when items are no longer accessible. Called with (value, key). It's called before actually removing the item from the internal cache, so if you want to immediately put it back in, you'll have to do that in a nextTick or setTimeout callback or it won't do anything.
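A sketch of a dispose callback used for cleanup; the Handle type and its close() method are hypothetical stand-ins for any resource-like value:

```ts
import LRUCache = require("lru-cache");

interface Handle {
  close(): void; // hypothetical cleanup hook
}

const handles = new LRUCache<string, Handle>({
  max: 20,
  dispose: (value, key, reason) => {
    // reason is one of 'evict' | 'set' | 'delete' (see DisposeReason below)
    value.close();
  },
});
```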
property disposeAfter
disposeAfter?: Disposer<K, V>;
The same as dispose, but called *after* the entry is completely removed and the cache is once again in a clean state. It is safe to add an item right back into the cache at this point. However, note that it is *very* easy to inadvertently create infinite recursion this way.
Since 7.3.0
property fetchMethod
fetchMethod?: Fetcher<K, V> | null;
Since 7.6.0
Function that is used to make background asynchronous fetches. Called with fetchMethod(key, staleValue). May return a Promise.
If fetchMethod is not provided, then cache.fetch(key) is equivalent to Promise.resolve(cache.get(key)).
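A sketch of fetch() backed by a fetchMethod; loadUser() is a hypothetical loader standing in for a database or HTTP call:

```ts
import LRUCache = require("lru-cache");

interface User {
  id: string;
  name: string;
}

// Hypothetical loader; replace with a real backing-store lookup.
async function loadUser(id: string): Promise<User> {
  return { id, name: "example" };
}

const users = new LRUCache<string, User>({
  max: 500,
  ttl: 1000 * 60,
  fetchMethod: (key) => loadUser(key), // used on cache misses
});

async function getUser(id: string): Promise<User | undefined> {
  // Resolves from the cache when possible, otherwise awaits fetchMethod(id).
  return users.fetch(id);
}
```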
property noDisposeOnSet
noDisposeOnSet?: boolean;
Set to true to suppress calling the dispose() function if the entry key is still accessible within the cache. This may be overridden by passing an options object to cache.set().
Default: false
Type Aliases
type Disposer
type Disposer<K, V> = (value: V, key: K, reason: DisposeReason) => void;
type DisposeReason
type DisposeReason = 'evict' | 'set' | 'delete';
type Fetcher
type Fetcher<K, V> = (key: K, staleKey?: K, options?: FetcherOptions<K, V>) => Promise<V>;
type Options
type Options<K, V> = SharedOptions<K, V> & DeprecatedOptions<K, V> & SafetyBounds<K, V>;
type SafetyBounds
type SafetyBounds<K, V> = LimitedByCount | LimitedBySize<K, V> | LimitedByTTL;
type SizeCalculator
type SizeCalculator<K, V> = (value: V, key: K) => number;
Package Files (1)
Dependencies (0)
No dependencies.
Dev Dependencies (0)
No dev dependencies.
Peer Dependencies (0)
No peer dependencies.