lru-cache
- Version 11.0.2
- 808 kB
- No dependencies
- ISC license
Install
npm i lru-cache
yarn add lru-cache
pnpm add lru-cache
Overview
A cache object that deletes the least-recently-used items.
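The core least-recently-used behavior can be sketched with a plain `Map`, whose insertion order doubles as recency order. This is a simplified illustration of the eviction semantics, not the library's actual implementation (which uses preallocated index arrays for speed):

```typescript
// Minimal LRU sketch: a Map keeps insertion order, so deleting and
// re-inserting a key on every access moves it to the "most recent" end.
class MiniLRU<K, V> {
  private map = new Map<K, V>()
  constructor(private max: number) {}

  get(k: K): V | undefined {
    if (!this.map.has(k)) return undefined
    const v = this.map.get(k) as V
    // refresh recency by moving the entry to the end
    this.map.delete(k)
    this.map.set(k, v)
    return v
  }

  set(k: K, v: V): void {
    if (this.map.has(k)) this.map.delete(k)
    this.map.set(k, v)
    if (this.map.size > this.max) {
      // evict the least recently used entry (first in iteration order)
      const oldest = this.map.keys().next().value as K
      this.map.delete(oldest)
    }
  }
}

const lru = new MiniLRU<string, number>(2)
lru.set('a', 1)
lru.set('b', 2)
lru.get('a')    // touch 'a', so 'b' is now least recently used
lru.set('c', 3) // over capacity: evicts 'b'
```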
Index
Variables
Classes
LRUCache
- [Symbol.iterator]()
- [Symbol.toStringTag]
- allowStale
- allowStaleOnFetchAbort
- allowStaleOnFetchRejection
- calculatedSize
- clear()
- delete()
- dispose
- disposeAfter
- dump()
- entries()
- fetch()
- fetchMethod
- find()
- forceFetch()
- forEach()
- get()
- getRemainingTTL()
- has()
- ignoreFetchAbort
- info()
- keys()
- load()
- max
- maxEntrySize
- maxSize
- memo()
- memoMethod
- noDeleteOnFetchRejection
- noDeleteOnStaleGet
- noDisposeOnSet
- noUpdateTTL
- peek()
- pop()
- purgeStale()
- rentries()
- rforEach()
- rkeys()
- rvalues()
- set()
- size
- sizeCalculation
- ttl
- ttlAutopurge
- ttlResolution
- updateAgeOnGet
- updateAgeOnHas
- values()
Type Aliases
Namespaces
LRUCache
- Count
- Disposer
- DisposeReason
- Entry
- Fetcher
- FetcherFetchOptions
- FetcherOptions
- FetchOptions
- FetchOptionsNoContext
- FetchOptionsWithContext
- GetOptions
- HasOptions
- Memoizer
- MemoizerMemoOptions
- MemoizerOptions
- MemoOptions
- MemoOptionsNoContext
- MemoOptionsWithContext
- Milliseconds
- Options
- OptionsBase
- OptionsMaxLimit
- OptionsSizeLimit
- OptionsTTLLimit
- PeekOptions
- SetOptions
- Size
- SizeCalculator
- Status
Variables
variable TYPE
const TYPE: Symbol;
LRUCache
Classes
class LRUCache
class LRUCache<K extends {}, V extends {}, FC = unknown> {}
Default export, the thing you're using this module to get.

The `K` and `V` types define the key and value types, respectively. The optional `FC` type defines the type of the `context` object passed to `cache.fetch()` and `cache.memo()`.

Keys and values **must not** be `null` or `undefined`.

All properties from the options object (with the exception of `max`, `maxSize`, `fetchMethod`, `memoMethod`, `dispose`, and `disposeAfter`) are added as normal public members. (The listed options are read-only getters.) Changing any of these will alter the defaults for subsequent method calls.
constructor
constructor(options: LRUCache.Options<K, V, FC> | LRUCache<K, V, FC>);
property [Symbol.toStringTag]
[Symbol.toStringTag]: string;
A String value that is used in the creation of the default string description of an object. Called by the built-in method `Object.prototype.toString`.
property allowStale
allowStale: boolean;
property allowStaleOnFetchAbort
allowStaleOnFetchAbort: boolean;
property allowStaleOnFetchRejection
allowStaleOnFetchRejection: boolean;
property calculatedSize
readonly calculatedSize: number;
The total computed size of items in the cache (read-only)
property dispose
readonly dispose: LRUCache.Disposer<K, V>;
LRUCache.OptionsBase.dispose (read-only)
property disposeAfter
readonly disposeAfter: LRUCache.Disposer<K, V>;
LRUCache.OptionsBase.disposeAfter (read-only)
property fetchMethod
readonly fetchMethod: LRUCache.Fetcher<K, V, FC>;
LRUCache.OptionsBase.fetchMethod (read-only)
property ignoreFetchAbort
ignoreFetchAbort: boolean;
property max
readonly max: number;
LRUCache.OptionsBase.max (read-only)
property maxEntrySize
maxEntrySize: number;
property maxSize
readonly maxSize: number;
LRUCache.OptionsBase.maxSize (read-only)
property memoMethod
readonly memoMethod: LRUCache.Memoizer<K, V, FC>;
property noDeleteOnFetchRejection
noDeleteOnFetchRejection: boolean;
property noDeleteOnStaleGet
noDeleteOnStaleGet: boolean;
property noDisposeOnSet
noDisposeOnSet: boolean;
property noUpdateTTL
noUpdateTTL: boolean;
property size
readonly size: number;
The number of items stored in the cache (read-only)
property sizeCalculation
sizeCalculation?: LRUCache.SizeCalculator<K, V>;
property ttl
ttl: number;
property ttlAutopurge
ttlAutopurge: boolean;
property ttlResolution
ttlResolution: number;
property updateAgeOnGet
updateAgeOnGet: boolean;
property updateAgeOnHas
updateAgeOnHas: boolean;
method [Symbol.iterator]
[Symbol.iterator]: () => Generator<[K, V], void, unknown>;
Iterating over the cache itself yields the same results as LRUCache.entries
method clear
clear: () => void;
Clear the cache entirely, throwing away all values.
method delete
delete: (k: K) => boolean;
Deletes a key out of the cache.
Returns true if the key was deleted, false otherwise.
method dump
dump: () => [K, LRUCache.Entry<V>][];
Return an array of [key, LRUCache.Entry] tuples which can be passed to LRUCache#load.
The `start` fields are calculated relative to a portable `Date.now()` timestamp, even if `performance.now()` is available.

Stale entries are always included in the `dump`, even if LRUCache.OptionsBase.allowStale is false.

Note: this returns an actual array, not a generator, so it can be more easily passed around.
method entries
entries: () => Generator<[K, V], void, unknown>;
Return a generator yielding `[key, value]` pairs, in order from most recently used to least recently used.
method fetch
fetch: { ( k: K, fetchOptions: unknown extends FC ? LRUCache.FetchOptions<K, V, FC> : FC extends undefined | void ? LRUCache.FetchOptionsNoContext<K, V> : LRUCache.FetchOptionsWithContext<K, V, FC> ): Promise<undefined | V>; ( k: unknown extends FC ? K : FC extends void ? K : never, fetchOptions?: unknown extends FC ? LRUCache.FetchOptions<K, V, FC> : FC extends void ? LRUCache.FetchOptionsNoContext<K, V> : never ): Promise<V>;};
Make an asynchronous cached fetch using the LRUCache.OptionsBase.fetchMethod function.
If the value is in the cache and not stale, then the returned Promise resolves to the value.
If not in the cache, or beyond its TTL staleness, then `fetchMethod(key, staleValue, { options, signal, context })` is called, and the value returned will be added to the cache once resolved.

If called with `allowStale`, and an asynchronous fetch is currently in progress to reload a stale value, then the former stale value will be returned.

If called with `forceRefresh`, then the cached item will be re-fetched, even if it is not stale. However, if `allowStale` is also set, then the old value will still be returned. This is useful in cases where you want to force a reload of a cached value. If a background fetch is already in progress, then `forceRefresh` has no effect.

If multiple fetches for the same key are issued, then they will all be coalesced into a single call to fetchMethod.

Note that this means that handling options such as LRUCache.OptionsBase.allowStaleOnFetchAbort, LRUCache.FetchOptions.signal, and LRUCache.OptionsBase.allowStaleOnFetchRejection will be determined by the FIRST fetch() call for a given key. This is a known (fixable) shortcoming which will be addressed when someone complains about it, as the fix would involve added complexity and may not be worth the cost for this edge case.

If LRUCache.OptionsBase.fetchMethod is not specified, then this is effectively an alias for `Promise.resolve(cache.get(key))`.

When the fetch method resolves to a value, if the fetch has not been aborted due to deletion, eviction, or being overwritten, then it is added to the cache using the options provided.

If the key is evicted or deleted before the `fetchMethod` resolves, then the AbortSignal passed to the `fetchMethod` will receive an `abort` event, and the promise returned by `fetch()` will reject with the reason for the abort.

If a `signal` is passed to the `fetch()` call, then aborting the signal will abort the fetch and cause the `fetch()` promise to reject with the reason provided.

**Setting `context`**

If an `FC` type is set to a type other than `unknown`, `void`, or `undefined` in the LRUCache constructor, then all calls to `cache.fetch()` _must_ provide a `context` option. If set to `undefined` or `void`, then calls to fetch _must not_ provide a `context` option.

The `context` param allows you to provide arbitrary data that might be relevant in the course of fetching the data. It is only relevant for the course of a single `fetch()` operation, and discarded afterwards.

**Note: `fetch()` calls are inflight-unique**

If you call `fetch()` multiple times with the same key value, then every call after the first will resolve on the same promise¹, _even if they have different settings that would otherwise change the behavior of the fetch_, such as `noDeleteOnFetchRejection` or `ignoreFetchAbort`.

In most cases, this is not a problem (in fact, only fetching something once is what you probably want, if you're caching in the first place). If you are changing the fetch() options dramatically between runs, there's a good chance that you might be trying to fit divergent semantics into a single object, and would be better off with multiple cache instances.

**1**: Ie, they're not the "same Promise", but they resolve at the same time, because they're both waiting on the same underlying fetchMethod response.
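The inflight-uniqueness described above can be sketched independently of the library: track one pending promise per key and hand it to every concurrent caller. This is a hypothetical helper illustrating the coalescing pattern, not lru-cache's internals:

```typescript
// Coalesce concurrent fetches: the first call per key invokes the
// fetcher; later calls made while it is in flight share its promise.
const inflight = new Map<string, Promise<string>>()
let fetcherCalls = 0

async function slowFetch(key: string): Promise<string> {
  fetcherCalls++
  await new Promise(res => setTimeout(res, 10))
  return `value:${key}`
}

function coalescedFetch(key: string): Promise<string> {
  const pending = inflight.get(key)
  if (pending) return pending
  const p = slowFetch(key).finally(() => inflight.delete(key))
  inflight.set(key, p)
  return p
}

// Two concurrent calls for the same key → one underlying fetch.
const p1 = coalescedFetch('k')
const p2 = coalescedFetch('k')
```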
method find
find: ( fn: (v: V, k: K, self: LRUCache<K, V, FC>) => boolean, getOptions?: LRUCache.GetOptions<K, V, FC>) => V | undefined;
Find a value for which the supplied fn method returns a truthy value, similar to `Array.find()`. fn is called as `fn(value, key, cache)`.
method forceFetch
forceFetch: { ( k: K, fetchOptions: unknown extends FC ? LRUCache.FetchOptions<K, V, FC> : FC extends undefined | void ? LRUCache.FetchOptionsNoContext<K, V> : LRUCache.FetchOptionsWithContext<K, V, FC> ): Promise<V>; ( k: unknown extends FC ? K : FC extends void ? K : never, fetchOptions?: unknown extends FC ? LRUCache.FetchOptions<K, V, FC> : FC extends void ? LRUCache.FetchOptionsNoContext<K, V> : never ): Promise<V>;};
In some cases, `cache.fetch()` may resolve to `undefined`, either because a LRUCache.OptionsBase#fetchMethod was not provided (turning `cache.fetch(k)` into just an async wrapper around `cache.get(k)`) or because `ignoreFetchAbort` was specified (either to the constructor or in the LRUCache.FetchOptions). Also, the LRUCache.OptionsBase.fetchMethod may return `undefined` or `void`, making the test even more complicated.

Because inferring the cases where `undefined` might be returned is so cumbersome, but testing for `undefined` can also be annoying, this method can be used instead, which will reject if `this.fetch()` resolves to `undefined`.
method forEach
forEach: ( fn: (v: V, k: K, self: LRUCache<K, V, FC>) => any, thisp?: any) => void;
Call the supplied function on each item in the cache, in order from most recently used to least recently used. `fn` is called as `fn(value, key, cache)`.

If `thisp` is provided, the function will be called in the `this`-context of the provided object, or the cache if no `thisp` object is provided.

Does not update age or recency of use, and does not iterate over stale values.
method get
get: (k: K, getOptions?: LRUCache.GetOptions<K, V, FC>) => V | undefined;
Return a value from the cache. Will update the recency of the cache entry found.
If the key is not found, get() will return `undefined`.
method getRemainingTTL
getRemainingTTL: (key: K) => number;
Return the number of ms left in the item's TTL. If the item is not in the cache, returns `0`. Returns `Infinity` if the item is in the cache without a defined TTL.
method has
has: (k: K, hasOptions?: LRUCache.HasOptions<K, V, FC>) => boolean;
Check if a key is in the cache, without updating the recency of use. Age is updated if LRUCache.OptionsBase.updateAgeOnHas is set to `true` in either the options or the constructor.

Will return `false` if the item is stale, even though it is technically in the cache. The difference can be determined (if it matters) by using a `status` argument, and inspecting the `has` field.
method info
info: (key: K) => LRUCache.Entry<V> | undefined;
Get the extended info about a given entry, to get its value, size, and TTL info simultaneously. Returns `undefined` if the key is not present.

Unlike LRUCache#dump, which is designed to be portable and survive serialization, the `start` value is always the current timestamp, and the `ttl` is a calculated remaining time to live (negative if expired).

Always returns stale values, if their info is found in the cache, so be sure to check for expirations (ie, a negative LRUCache.Entry#ttl) if relevant.
method keys
keys: () => Generator<K, void, unknown>;
Return a generator yielding the keys in the cache, in order from most recently used to least recently used.
method load
load: (arr: [K, LRUCache.Entry<V>][]) => void;
Reset the cache and load the provided entries in the order listed.

The shape of the resulting cache may be different if the same options are not used in both caches.

The `start` fields are assumed to be calculated relative to a portable `Date.now()` timestamp, even if `performance.now()` is available.
method memo
memo: { ( k: K, memoOptions: unknown extends FC ? LRUCache.MemoOptions<K, V, FC> : FC extends undefined | void ? LRUCache.MemoOptionsNoContext<K, V> : LRUCache.MemoOptionsWithContext<K, V, FC> ): V; ( k: unknown extends FC ? K : FC extends void ? K : never, memoOptions?: unknown extends FC ? LRUCache.MemoOptions<K, V, FC> : FC extends void ? LRUCache.MemoOptionsNoContext<K, V> : never ): V;};
If the key is found in the cache, then this is equivalent to LRUCache#get. If not in the cache, then calculate the value using the LRUCache.OptionsBase.memoMethod, and add it to the cache.

If an `FC` type is set to a type other than `unknown`, `void`, or `undefined` in the LRUCache constructor, then all calls to `cache.memo()` _must_ provide a `context` option. If set to `undefined` or `void`, then calls to memo _must not_ provide a `context` option.

The `context` param allows you to provide arbitrary data that might be relevant in the course of fetching the data. It is only relevant for the course of a single `memo()` operation, and discarded afterwards.
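The get-or-compute pattern that `memo()` implements can be sketched like so (a simplified synchronous version with no TTL, size, or context handling):

```typescript
// memo sketch: return the cached value if present, otherwise compute
// it with memoMethod and store it before returning.
const store = new Map<string, number>()
let computeCalls = 0

const memoMethod = (key: string): number => {
  computeCalls++
  return key.length * 10
}

function memo(key: string): number {
  if (store.has(key)) return store.get(key) as number
  const v = memoMethod(key)
  store.set(key, v)
  return v
}

const first = memo('abc')  // computed by memoMethod
const second = memo('abc') // served from the cache
```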
method peek
peek: (k: K, peekOptions?: LRUCache.PeekOptions<K, V, FC>) => V | undefined;
Like LRUCache#get but doesn't update recency or delete stale items.
Returns `undefined` if the item is stale, unless LRUCache.OptionsBase.allowStale is set.
method pop
pop: () => V | undefined;
Evict the least recently used item, returning its value or `undefined` if the cache is empty.
method purgeStale
purgeStale: () => boolean;
Delete any stale entries. Returns true if anything was removed, false otherwise.
method rentries
rentries: () => Generator<(K | V)[], void, unknown>;
Inverse order version of LRUCache.entries
Return a generator yielding `[key, value]` pairs, in order from least recently used to most recently used.
method rforEach
rforEach: ( fn: (v: V, k: K, self: LRUCache<K, V, FC>) => any, thisp?: any) => void;
The same as LRUCache.forEach but items are iterated over in reverse order. (ie, less recently used items are iterated over first.)
method rkeys
rkeys: () => Generator<K, void, unknown>;
Inverse order version of LRUCache.keys
Return a generator yielding the keys in the cache, in order from least recently used to most recently used.
method rvalues
rvalues: () => Generator<V | undefined, void, unknown>;
Inverse order version of LRUCache.values
Return a generator yielding the values in the cache, in order from least recently used to most recently used.
method set
set: ( k: K, v: V | BackgroundFetch<V> | undefined, setOptions?: LRUCache.SetOptions<K, V, FC>) => this;
Add a value to the cache.
Note: if `undefined` is specified as a value, this is an alias for LRUCache#delete.

Fields on the LRUCache.SetOptions options param will override their corresponding values in the constructor options for the scope of this single `set()` operation.

If `start` is provided, then that will set the effective start time for the TTL calculation. Note that this must be a previous value of `performance.now()` if supported, or a previous value of `Date.now()` if not.

Options object may also include `size`, which will prevent calling the `sizeCalculation` function and just use the specified number if it is a positive integer, and `noDisposeOnSet`, which will prevent calling a `dispose` function in the case of overwrites.

If the `size` (or return value of `sizeCalculation`) for a given entry is greater than `maxEntrySize`, then the item will not be added to the cache.

Will update the recency of the entry.

If the value is `undefined`, then this is an alias for `cache.delete(key)`. `undefined` is never stored in the cache.
method values
values: () => Generator<V, void, unknown>;
Return a generator yielding the values in the cache, in order from most recently used to least recently used.
class Stack
class Stack {}
class ZeroArray
class ZeroArray extends Array<number> {}
constructor
constructor(size: number);
Type Aliases
type BackgroundFetch
type BackgroundFetch<V> = Promise<V | undefined> & { __returned: BackgroundFetch<V> | undefined; __abortController: AbortController; __staleWhileFetching: V | undefined;};
Promise representing an in-progress LRUCache#fetch call
type DisposeTask
type DisposeTask<K, V> = [value: V, key: K, reason: LRUCache.DisposeReason];
type Index
type Index = number & { [TYPE]: 'LRUCache Index';};
type NumberArray
type NumberArray = UintArray | number[];
type PosInt
type PosInt = number & { [TYPE]: 'Positive Integer';};
type StackLike
type StackLike = Stack | Index[];
type UintArray
type UintArray = Uint8Array | Uint16Array | Uint32Array;
Namespaces
namespace LRUCache
namespace LRUCache {}
interface Entry
interface Entry<V> {}
Entry objects used by LRUCache#load and LRUCache#dump, and returned by LRUCache#info.
interface FetcherFetchOptions
interface FetcherFetchOptions<K, V, FC = unknown> extends Pick< OptionsBase<K, V, FC>, | 'allowStale' | 'updateAgeOnGet' | 'noDeleteOnStaleGet' | 'sizeCalculation' | 'ttl' | 'noDisposeOnSet' | 'noUpdateTTL' | 'noDeleteOnFetchRejection' | 'allowStaleOnFetchRejection' | 'ignoreFetchAbort' | 'allowStaleOnFetchAbort' > {}
options which override the options set in the LRUCache constructor when calling LRUCache#fetch.
This is the union of GetOptions and SetOptions, plus OptionsBase.noDeleteOnFetchRejection, OptionsBase.allowStaleOnFetchRejection, FetchOptions.forceRefresh, and FetcherOptions.context
Any of these may be modified in the OptionsBase.fetchMethod function, but the GetOptions fields will of course have no effect, as the LRUCache#get call already happened by the time the fetchMethod is called.
interface FetcherOptions
interface FetcherOptions<K, V, FC = unknown> {}
Options provided to the OptionsBase.fetchMethod function.
property context
context: FC;
Object provided in the FetchOptions.context option to LRUCache#fetch
property options
options: FetcherFetchOptions<K, V, FC>;
property signal
signal: AbortSignal;
interface FetchOptions
interface FetchOptions<K, V, FC> extends FetcherFetchOptions<K, V, FC> {}
Options that may be passed to the LRUCache#fetch method.
property context
context?: FC;
Context provided to the OptionsBase.fetchMethod as the FetcherOptions.context param.
If the FC type is specified as unknown (the default), undefined or void, then this is optional. Otherwise, it will be required.
property forceRefresh
forceRefresh?: boolean;
Set to true to force a re-load of the existing data, even if it is not yet stale.
property signal
signal?: AbortSignal;
property status
status?: Status<V>;
interface FetchOptionsNoContext
interface FetchOptionsNoContext<K, V> extends FetchOptions<K, V, undefined> {}
Options provided to LRUCache#fetch when the FC type is `undefined` or `void`
property context
context?: undefined;
interface FetchOptionsWithContext
interface FetchOptionsWithContext<K, V, FC> extends FetchOptions<K, V, FC> {}
Options provided to LRUCache#fetch when the FC type is something other than `unknown`, `undefined`, or `void`
property context
context: FC;
interface GetOptions
interface GetOptions<K, V, FC> extends Pick< OptionsBase<K, V, FC>, 'allowStale' | 'updateAgeOnGet' | 'noDeleteOnStaleGet' > {}
Options that may be passed to the LRUCache#get method.
property status
status?: Status<V>;
interface HasOptions
interface HasOptions<K, V, FC> extends Pick<OptionsBase<K, V, FC>, 'updateAgeOnHas'> {}
Options that may be passed to the LRUCache#has method.
property status
status?: Status<V>;
interface MemoizerMemoOptions
interface MemoizerMemoOptions<K, V, FC = unknown> extends Pick< OptionsBase<K, V, FC>, | 'allowStale' | 'updateAgeOnGet' | 'noDeleteOnStaleGet' | 'sizeCalculation' | 'ttl' | 'noDisposeOnSet' | 'noUpdateTTL' > {}
options which override the options set in the LRUCache constructor when calling LRUCache#memo.
This is the union of GetOptions and SetOptions, plus MemoOptions.forceRefresh, and MemoOptions.context
Any of these may be modified in the OptionsBase.memoMethod function, but the GetOptions fields will of course have no effect, as the LRUCache#get call already happened by the time the memoMethod is called.
interface MemoizerOptions
interface MemoizerOptions<K, V, FC = unknown> {}
Options provided to the OptionsBase.memoMethod function.
property context
context: FC;
Object provided in the MemoOptions.context option to LRUCache#memo
property options
options: MemoizerMemoOptions<K, V, FC>;
interface MemoOptions
interface MemoOptions<K, V, FC = unknown> extends Pick< OptionsBase<K, V, FC>, | 'allowStale' | 'updateAgeOnGet' | 'noDeleteOnStaleGet' | 'sizeCalculation' | 'ttl' | 'noDisposeOnSet' | 'noUpdateTTL' | 'noDeleteOnFetchRejection' | 'allowStaleOnFetchRejection' | 'ignoreFetchAbort' | 'allowStaleOnFetchAbort' > {}
property context
context?: FC;
Context provided to the OptionsBase.memoMethod as the MemoizerOptions.context param.
If the FC type is specified as unknown (the default), undefined or void, then this is optional. Otherwise, it will be required.
property forceRefresh
forceRefresh?: boolean;
Set to true to force a re-load of the existing data, even if it is not yet stale.
property status
status?: Status<V>;
interface MemoOptionsNoContext
interface MemoOptionsNoContext<K, V> extends MemoOptions<K, V, undefined> {}
Options provided to LRUCache#memo when the FC type is `undefined` or `void`
property context
context?: undefined;
interface MemoOptionsWithContext
interface MemoOptionsWithContext<K, V, FC> extends MemoOptions<K, V, FC> {}
Options provided to LRUCache#memo when the FC type is something other than `unknown`, `undefined`, or `void`
property context
context: FC;
interface OptionsBase
interface OptionsBase<K, V, FC> {}
Options which may be passed to the LRUCache constructor.
Most of these may be overridden in the various options that use them.
Despite all being technically optional, the constructor requires that a cache is at minimum limited by one or more of OptionsBase.max, OptionsBase.ttl, or OptionsBase.maxSize.
If OptionsBase.ttl is used alone, then it is strongly advised (and in fact required by the type definitions here) that the cache also set OptionsBase.ttlAutopurge, to prevent potentially unbounded storage.
All options are also available on the LRUCache instance, making it safe to pass an LRUCache instance as the options argument to make another empty cache of the same type.
Some options are marked as read-only, because changing them after instantiation is not safe. Changing any of the other options will of course only have an effect on subsequent method calls.
property allowStale
allowStale?: boolean;
Allow LRUCache#get and LRUCache#fetch calls to return stale data, if available.
By default, if you set `ttl`, stale items will only be deleted from the cache when you `get(key)`. That is, it's not preemptively pruning items, unless OptionsBase.ttlAutopurge is set.

If you set `allowStale: true`, it'll return the stale value *as well as* deleting it. If you don't set this, then it'll return `undefined` when you try to get a stale entry.

Note that when a stale entry is fetched, _even if it is returned due to `allowStale` being set_, it is removed from the cache immediately. You can suppress this behavior by setting OptionsBase.noDeleteOnStaleGet, either in the constructor, or in the options provided to LRUCache#get.

This may be overridden by passing an options object to `cache.get()`. The `cache.has()` method will always return `false` for stale items.

Only relevant if a ttl is set.
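The stale-get behavior can be sketched with an injected clock (a hypothetical helper; the library's actual staleness checks are more involved):

```typescript
// Stale-get sketch: a stale entry is deleted on get; allowStale
// controls whether the just-deleted stale value is still returned.
interface StaleEntry<V> { value: V; start: number; ttl: number }
const data = new Map<string, StaleEntry<string>>()

function getWithStale(
  key: string,
  opts: { allowStale?: boolean } = {},
  now = Date.now(),
): string | undefined {
  const e = data.get(key)
  if (!e) return undefined
  const stale = now - e.start > e.ttl
  if (!stale) return e.value
  data.delete(key) // stale entries are removed on get
  return opts.allowStale ? e.value : undefined
}

const t = Date.now()
data.set('k', { value: 'v', start: t, ttl: 100 })
const fresh = getWithStale('k', {}, t + 50)          // still fresh
data.set('k2', { value: 'v2', start: t, ttl: 100 })
const staleDefault = getWithStale('k2', {}, t + 200) // stale, no allowStale
data.set('k3', { value: 'v3', start: t, ttl: 100 })
const staleAllowed = getWithStale('k3', { allowStale: true }, t + 200)
```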
property allowStaleOnFetchAbort
allowStaleOnFetchAbort?: boolean;
Set to true to return a stale value from the cache when the `AbortSignal` passed to the OptionsBase.fetchMethod dispatches an `'abort'` event, whether user-triggered, or due to internal cache behavior.

Unless OptionsBase.ignoreFetchAbort is also set, the underlying OptionsBase.fetchMethod will still be considered canceled, and any value it returns will be ignored and not cached.

Caveat: since fetches are aborted when a new value is explicitly set in the cache, this can lead to fetch returning a stale value, since that was the fallback value _at the moment the `fetch()` was initiated_, even though the new updated value is now present in the cache.

For example:

```ts
const cache = new LRUCache<string, any>({
  ttl: 100,
  fetchMethod: async (url, oldValue, { signal }) => {
    const res = await fetch(url, { signal })
    return await res.json()
  },
})
cache.set('https://example.com/', { some: 'data' })
// 100ms go by...
const result = cache.fetch('https://example.com/')
cache.set('https://example.com/', { other: 'thing' })
console.log(await result) // { some: 'data' }
console.log(cache.get('https://example.com/')) // { other: 'thing' }
```
property allowStaleOnFetchRejection
allowStaleOnFetchRejection?: boolean;
Set to true to allow returning stale data when a OptionsBase.fetchMethod throws an error or returns a rejected promise.
This differs from using OptionsBase.allowStale in that stale data will ONLY be returned in the case that the LRUCache#fetch fails, not any other times.
If a `fetchMethod` fails, and there is no stale value available, the `fetch()` will resolve to `undefined`. Ie, all `fetchMethod` errors are suppressed.

Implies `noDeleteOnFetchRejection`.

This may be set in calls to `fetch()`, or defaulted on the constructor, or overridden by modifying the options object in the `fetchMethod`.
property dispose
dispose?: Disposer<K, V>;
Function that is called on items when they are dropped from the cache, as `dispose(value, key, reason)`.

This can be handy if you want to close file descriptors or do other cleanup tasks when items are no longer stored in the cache.

**NOTE**: It is called _before_ the item has been fully removed from the cache, so if you want to put it right back in, you need to wait until the next tick. If you try to add it back in during the `dispose()` function call, it will break things in subtle and weird ways.

Unlike several other options, this may _not_ be overridden by passing an option to `set()`, for performance reasons.

The `reason` will be one of the following strings, corresponding to the reason for the item's deletion:

- `evict` Item was evicted to make space for a new addition
- `set` Item was overwritten by a new value
- `expire` Item expired its TTL
- `fetch` Item was deleted due to a failed or aborted fetch, or a fetchMethod returning `undefined`.
- `delete` Item was removed by explicit `cache.delete(key)`, `cache.clear()`, or `cache.set(key, undefined)`.
property disposeAfter
disposeAfter?: Disposer<K, V>;
The same as OptionsBase.dispose, but called *after* the entry is completely removed and the cache is once again in a clean state.
It is safe to add an item right back into the cache at this point. However, note that it is *very* easy to inadvertently create infinite recursion this way.
property fetchMethod
fetchMethod?: Fetcher<K, V, FC>;
Method that provides the implementation for LRUCache#fetch
`fetchMethod(key, staleValue, { signal, options, context })`

If `fetchMethod` is not provided, then `cache.fetch(key)` is equivalent to `Promise.resolve(cache.get(key))`.

If at any time, `signal.aborted` is set to `true`, or if the `signal.onabort` method is called, or if it emits an `'abort'` event which you can listen to with `addEventListener`, then that means that the fetch should be abandoned. This may be passed along to async functions aware of AbortController/AbortSignal behavior.

The `fetchMethod` should **only** return `undefined` or a Promise resolving to `undefined` if the AbortController signaled an `abort` event. In all other cases, it should return or resolve to a value suitable for adding to the cache.

The `options` object is a union of the options that may be provided to `set()` and `get()`. If they are modified, then that will result in modifying the settings to `cache.set()` when the value is resolved, and in the case of OptionsBase.noDeleteOnFetchRejection and OptionsBase.allowStaleOnFetchRejection, the handling of `fetchMethod` failures.

For example, a DNS cache may update the TTL based on the value returned from a remote DNS server by changing `options.ttl` in the `fetchMethod`.
property ignoreFetchAbort
ignoreFetchAbort?: boolean;
Set to true to ignore the `abort` event emitted by the `AbortSignal` object passed to OptionsBase.fetchMethod, and still cache the resulting resolution value, as long as it is not `undefined`.

When used on its own, this means aborted LRUCache#fetch calls are not immediately resolved or rejected when they are aborted, and instead take the full time to await.

When used with OptionsBase.allowStaleOnFetchAbort, aborted LRUCache#fetch calls will resolve immediately to their stale cached value or `undefined`, and will continue to process and eventually update the cache when they resolve, as long as the resulting value is not `undefined`, thus supporting a "return stale on timeout while refreshing" mechanism by passing `AbortSignal.timeout(n)` as the signal.

For example:

```ts
const c = new LRUCache({
  ttl: 100,
  ignoreFetchAbort: true,
  allowStaleOnFetchAbort: true,
  fetchMethod: async (key, oldValue, { signal }) => {
    // note: do NOT pass the signal to fetch()!
    // let's say this fetch can take a long time.
    const res = await fetch(`https://slow-backend-server/${key}`)
    return await res.json()
  },
})

// this will return the stale value after 100ms, while still
// updating in the background for next time.
const val = await c.fetch('key', { signal: AbortSignal.timeout(100) })
```

**Note**: regardless of this setting, an `abort` event _is still emitted on the `AbortSignal` object_, so it may result in invalid results when passed to other underlying APIs that use AbortSignals.

This may be overridden in the OptionsBase.fetchMethod or the call to LRUCache#fetch.
property max
max?: Count;
The maximum number of items to store in the cache before evicting old entries. This is read-only on the LRUCache instance, and may not be overridden.
If set, then storage space will be pre-allocated at construction time, and the cache will perform significantly faster.
Note that significantly fewer items may be stored, if OptionsBase.maxSize and/or OptionsBase.ttl are also set.
**It is strongly recommended to set a `max` to prevent unbounded growth of the cache.**
property maxEntrySize
maxEntrySize?: Size;
The maximum allowed size for any single item in the cache.
If a larger item is passed to LRUCache#set or returned by a OptionsBase.fetchMethod or OptionsBase.memoMethod, then it will not be stored in the cache.
Attempting to add an item whose calculated size is greater than this amount will not cache the item or evict any old items, but WILL delete an existing value if one is already present.
Optional, must be a positive integer if provided. Defaults to the value of `maxSize` if provided.
property maxSize
maxSize?: Size;
Set to a positive integer to track the sizes of items added to the cache, and automatically evict items in order to stay below this size. Note that this may result in fewer than `max` items being stored.

Attempting to add an item to the cache whose calculated size is greater than this amount will be a no-op. The item will not be cached, and no other items will be evicted.

Optional, must be a positive integer if provided.

Sets `maxEntrySize` to the same value, unless a different value is provided for `maxEntrySize`.

At least one of `max`, `maxSize`, or `ttl` is required. This must be a positive integer if set.

Even if size tracking is enabled, **it is strongly recommended to set a `max` to prevent unbounded growth of the cache.**

Note also that size tracking can negatively impact performance, though for most cases, only minimally.
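Size-based eviction as described above can be sketched by evicting from the least-recently-used end until the running total fits (a simplified illustration; the library tracks sizes in preallocated index arrays):

```typescript
// maxSize sketch: keep a running total of entry sizes and evict from
// the least-recently-used end until the new entry fits.
const maxSize = 10
const sized = new Map<string, { value: string; size: number }>()
let totalSize = 0

function setSized(key: string, value: string, size: number): void {
  if (size > maxSize) return // over-size entries are a no-op
  const prev = sized.get(key)
  if (prev) {
    totalSize -= prev.size
    sized.delete(key)
  }
  while (totalSize + size > maxSize && sized.size > 0) {
    // evict least recently used (first in Map iteration order)
    const [oldKey, old] = sized.entries().next().value as
      [string, { value: string; size: number }]
    sized.delete(oldKey)
    totalSize -= old.size
  }
  sized.set(key, { value, size })
  totalSize += size
}

setSized('a', 'A', 4)
setSized('b', 'B', 4)
setSized('c', 'C', 4) // 4+4+4 > 10, so 'a' is evicted
```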
property memoMethod
memoMethod?: Memoizer<K, V, FC>;
Method that provides the implementation for LRUCache#memo
property noDeleteOnFetchRejection
noDeleteOnFetchRejection?: boolean;
Set to true to suppress the deletion of stale data when a OptionsBase.fetchMethod returns a rejected promise.
property noDeleteOnStaleGet
noDeleteOnStaleGet?: boolean;
Do not delete stale items when they are retrieved with LRUCache#get.
Note that the `get` return value will still be `undefined` unless OptionsBase.allowStale is true.
When using time-expiring entries with `ttl`, by default stale items will be removed from the cache when the key is accessed with `cache.get()`.
Setting this option will cause stale items to remain in the cache, until they are explicitly deleted with `cache.delete(key)`, or retrieved with `noDeleteOnStaleGet` set to `false`.
This may be overridden by passing an options object to `cache.get()`.
Only relevant if a ttl is used.
property noDisposeOnSet
noDisposeOnSet?: boolean;
Set to true to suppress calling the OptionsBase.dispose function if the entry key is still accessible within the cache.
This may be overridden by passing an options object to LRUCache#set.
Only relevant if `dispose` or `disposeAfter` are set.
property noUpdateTTL
noUpdateTTL?: boolean;
Boolean flag to tell the cache to not update the TTL when setting a new value for an existing key (ie, when updating a value rather than inserting a new value). Note that the TTL value is _always_ set (if provided) when adding a new entry into the cache.
Has no effect if a OptionsBase.ttl is not set.
May be passed as an option to LRUCache#set.
property sizeCalculation
sizeCalculation?: SizeCalculator<K, V>;
A function that returns a number indicating the item's size.
Requires OptionsBase.maxSize to be set.
If not provided, and OptionsBase.maxSize or OptionsBase.maxEntrySize are set, then all LRUCache#set calls **must** provide an explicit SetOptions.size or sizeCalculation param.
property ttl
ttl?: Milliseconds;
Max time in milliseconds for items to live in cache before they are considered stale. Note that stale items are NOT preemptively removed by default, and MAY live in the cache, contributing to its LRU max, long after they have expired, unless OptionsBase.ttlAutopurge is set.
If set to `0` (the default value), then that means "do not track TTL", not "expire immediately".
Also, as this cache is optimized for LRU/MRU operations, some of the staleness/TTL checks will reduce performance, as they will incur overhead by deleting items.
This is not primarily a TTL cache, and does not make strong TTL guarantees. There is no pre-emptive pruning of expired items, but you _may_ set a TTL on the cache, and it will treat expired items as missing when they are fetched, and delete them.
Optional, but must be a non-negative integer in ms if specified.
This may be overridden by passing an options object to `cache.set()`.
At least one of `max`, `maxSize`, or `ttl` is required. This must be a positive integer if set.
Even if ttl tracking is enabled, **it is strongly recommended to set a `max` to prevent unbounded growth of the cache.**
If ttl tracking is enabled, and `max` and `maxSize` are not set, and `ttlAutopurge` is not set, then a warning will be emitted cautioning about the potential for unbounded memory consumption. (The TypeScript definitions will also discourage this.)
property ttlAutopurge
ttlAutopurge?: boolean;
Preemptively remove stale items from the cache.
Note that this may *significantly* degrade performance, especially if the cache is storing a large number of items. It is almost always best to just leave the stale items in the cache, and let them fall out as new items are added.
Note that this means that OptionsBase.allowStale is a bit pointless, as stale items will be deleted almost as soon as they expire.
Use with caution!
property ttlResolution
ttlResolution?: Milliseconds;
Minimum amount of time in ms in which to check for staleness. Defaults to 1, which means that the current time is checked at most once per millisecond.
Set to 0 to check the current time every time staleness is tested. (This reduces performance, and is theoretically unnecessary.)
Setting this to a higher value will improve performance somewhat while using ttl tracking, albeit at the expense of keeping stale items around a bit longer than their TTLs would indicate.
property updateAgeOnGet
updateAgeOnGet?: boolean;
When using time-expiring entries with `ttl`, setting this to `true` will make each item's age reset to 0 whenever it is retrieved from cache with LRUCache#get, causing it to not expire. (It can still fall out of cache based on recency of use, of course.)
Has no effect if OptionsBase.ttl is not set.
This may be overridden by passing an options object to `cache.get()`.
property updateAgeOnHas
updateAgeOnHas?: boolean;
When using time-expiring entries with `ttl`, setting this to `true` will make each item's age reset to 0 whenever its presence in the cache is checked with LRUCache#has, causing it to not expire. (It can still fall out of cache based on recency of use, of course.)
Has no effect if OptionsBase.ttl is not set.
interface OptionsMaxLimit
interface OptionsMaxLimit<K, V, FC> extends OptionsBase<K, V, FC> {}
property max
max: Count;
interface OptionsSizeLimit
interface OptionsSizeLimit<K, V, FC> extends OptionsBase<K, V, FC> {}
property maxSize
maxSize: Size;
interface OptionsTTLLimit
interface OptionsTTLLimit<K, V, FC> extends OptionsBase<K, V, FC> {}
property ttl
ttl: Milliseconds;
property ttlAutopurge
ttlAutopurge: boolean;
interface PeekOptions
interface PeekOptions<K, V, FC> extends Pick<OptionsBase<K, V, FC>, 'allowStale'> {}
Options that may be passed to the LRUCache#peek method.
interface SetOptions
interface SetOptions<K, V, FC> extends Pick<OptionsBase<K, V, FC>, 'sizeCalculation' | 'ttl' | 'noDisposeOnSet' | 'noUpdateTTL'> {}
Options that may be passed to the LRUCache#set method.
property size
size?: Size;
If size tracking is enabled, then setting an explicit size in the LRUCache#set call will prevent calling the OptionsBase.sizeCalculation function.
property start
start?: Milliseconds;
If TTL tracking is enabled, then setting an explicit start time in the LRUCache#set call will override the default time from `performance.now()` or `Date.now()`.
Note that it must be a valid value for whichever time-tracking method is in use.
property status
status?: Status<V>;
interface Status
interface Status<V> {}
Occasionally, it may be useful to track the internal behavior of the cache, particularly for logging, debugging, or for behavior within the `fetchMethod`. To do this, you can pass a `status` object to the LRUCache#fetch, LRUCache#get, LRUCache#set, LRUCache#memo, and LRUCache#has methods.
The `status` option should be a plain JavaScript object. The following fields will be set on it appropriately, depending on the situation.
property entrySize
entrySize?: Size;
The calculated size for the item, if sizes are used.
property fetch
fetch?: 'get' | 'inflight' | 'miss' | 'hit' | 'stale' | 'refresh';
The status of a LRUCache#fetch operation. Note that this can change as the underlying fetch() moves through various states.
- inflight: there is another fetch() for this key which is in process
- get: there is no OptionsBase.fetchMethod, so LRUCache#get was called
- miss: the item is not in cache, and will be fetched
- hit: the item is in the cache, and was resolved immediately
- stale: the item is in the cache, but stale
- refresh: the item is in the cache, and not stale, but FetchOptions.forceRefresh was specified
property fetchAborted
fetchAborted?: true;
The fetch received an abort signal
property fetchAbortIgnored
fetchAbortIgnored?: true;
The abort signal received was ignored, and the fetch was allowed to continue.
property fetchDispatched
fetchDispatched?: true;
The OptionsBase.fetchMethod was called
property fetchError
fetchError?: Error;
The reason for a fetch() rejection. Either the error raised by the OptionsBase.fetchMethod, or the reason for an AbortSignal.
property fetchRejected
fetchRejected?: true;
The fetchMethod promise was rejected
property fetchResolved
fetchResolved?: true;
The fetchMethod promise resolved successfully
property fetchUpdated
fetchUpdated?: true;
The cached value was updated after a successful call to OptionsBase.fetchMethod
property get
get?: 'stale' | 'hit' | 'miss';
The status of a LRUCache#get operation.
- fetching: the item is currently being fetched; if a previous value is present and allowed, that will be returned
- stale: the item is in the cache, and is stale
- hit: the item is in the cache
- miss: the item is not in the cache
property has
has?: 'hit' | 'stale' | 'miss';
The results of a LRUCache#has operation
- hit: the item was found in the cache
- stale: the item was found in the cache, but is stale
- miss: the item was not found in the cache
property maxEntrySizeExceeded
maxEntrySizeExceeded?: true;
A flag indicating that the item was not stored, due to exceeding the OptionsBase.maxEntrySize
property now
now?: Milliseconds;
The timestamp used for TTL calculation
property oldValue
oldValue?: V;
The old value, specified in the case of `set:'update'` or `set:'replace'`.
property remainingTTL
remainingTTL?: Milliseconds;
the remaining ttl for the item, or undefined if ttls are not used.
property returnedStale
returnedStale?: true;
A fetch or get operation returned a stale value.
property set
set?: 'add' | 'update' | 'replace' | 'miss';
The status of a set() operation.
- add: the item was not found in the cache, and was added
- update: the item was in the cache, with the same value provided
- replace: the item was in the cache, and replaced
- miss: the item was not added to the cache for some reason
property start
start?: Milliseconds;
the start time for the item, or undefined if ttls are not used.
property totalCalculatedSize
totalCalculatedSize?: Size;
The total calculated size of the cache, if sizes are used.
property ttl
ttl?: Milliseconds;
the ttl stored for the item, or undefined if ttls are not used.
type Count
type Count = number;
An integer greater than 0, reflecting a number of items
type Disposer
type Disposer<K, V> = (value: V, key: K, reason: DisposeReason) => void;
A method called upon item removal, passed as the OptionsBase.dispose and/or OptionsBase.disposeAfter options.
type DisposeReason
type DisposeReason = 'evict' | 'set' | 'delete' | 'expire' | 'fetch';
The reason why an item was removed from the cache, passed to the Disposer methods.
- `evict`: The item was evicted because it is the least recently used, and the cache is full.
- `set`: A new value was set, overwriting the old value being disposed.
- `delete`: The item was explicitly deleted, either by calling LRUCache#delete, LRUCache#clear, or LRUCache#set with an undefined value.
- `expire`: The item was removed due to exceeding its TTL.
- `fetch`: A OptionsBase#fetchMethod operation returned `undefined` or was aborted, causing the item to be deleted.
type Fetcher
type Fetcher<K, V, FC = unknown> = (key: K, staleValue: V | undefined, options: FetcherOptions<K, V, FC>) => Promise<V | undefined | void> | V | undefined | void;
The type signature for the OptionsBase.fetchMethod option.
type Memoizer
type Memoizer<K, V, FC = unknown> = (key: K, staleValue: V | undefined, options: MemoizerOptions<K, V, FC>) => V;
The type signature for the OptionsBase.memoMethod option.
type Milliseconds
type Milliseconds = number;
Integer greater than 0, representing some number of milliseconds, or the time at which a TTL started counting from.
type Options
type Options<K, V, FC> = | OptionsMaxLimit<K, V, FC> | OptionsSizeLimit<K, V, FC> | OptionsTTLLimit<K, V, FC>;
The valid safe options for the LRUCache constructor
type Size
type Size = number;
An integer greater than 0, reflecting the calculated size of items
type SizeCalculator
type SizeCalculator<K, V> = (value: V, key: K) => Size;
A function that returns the effective calculated size of an entry in the cache.