
FastCache

7x-10x faster alternative to MemoryCache. A high-performance, lightweight (8KB dll) and thread-safe memory cache for .NET Core (.NET 6 and later)


TL;DR

Basically it's just a ConcurrentDictionary with expiration.

Benchmarks

Windows:

| Method | Mean | Error | StdDev | Gen0 | Allocated |
|---|---|---|---|---|---|
| DictionaryLookup | 65.38 ns | 1.594 ns | 0.087 ns | - | - |
| FastCacheLookup | 67.15 ns | 2.582 ns | 0.142 ns | - | - |
| MemoryCacheLookup | 426.60 ns | 60.162 ns | 3.298 ns | 0.0200 | 128 B |
| FastCacheGetOrAdd | 44.31 ns | 1.170 ns | 0.064 ns | - | - |
| MemoryCacheGetOrAdd | 826.85 ns | 36.609 ns | 2.007 ns | 0.1879 | 1184 B |
| FastCacheAddRemove | 99.97 ns | 12.040 ns | 0.660 ns | 0.0063 | 80 B |
| MemoryCacheAddRemove | 710.70 ns | 32.415 ns | 1.777 ns | 0.0515 | 328 B |

Linux (Ubuntu, Docker):

| Method | Mean | Error | StdDev | Gen0 | Allocated |
|---|---|---|---|---|---|
| FastCacheLookup | 94.97 ns | 3.250 ns | 0.178 ns | - | - |
| MemoryCacheLookup | 1,051.69 ns | 64.904 ns | 3.558 ns | 0.0191 | 128 B |
| FastCacheAddRemove | 148.32 ns | 25.766 ns | 1.412 ns | 0.0076 | 80 B |
| MemoryCacheAddRemove | 1,120.75 ns | 767.666 ns | 42.078 ns | 0.0515 | 328 B |

How is FastCache better

Compared to System.Runtime.Caching.MemoryCache and Microsoft.Extensions.Caching.MemoryCache FastCache is

  • 7x faster reads (11x under Linux!)
  • 10x faster writes
  • Thread safe and atomic
  • Generic (strongly typed keys and values) to avoid boxing/unboxing primitive types
  • MemoryCache uses string keys only, so it allocates strings for keying
  • MemoryCache comes with performance counters that can't be turned off
  • MemoryCache uses heuristic and black magic to evict keys under memory pressure
  • MemoryCache uses more memory, can crash during a key scan

Usage

Install via NuGet

Install-Package Jitbit.FastCache

Then use

var cache = new FastCache<string, int>();

cache.AddOrUpdate(
	key: "answer",
	value: 42,
	ttl: TimeSpan.FromMinutes(1));

cache.TryGet("answer", out int value); //value is 42

//factory pattern! calls the expensive factory only if not cached yet
cache.GetOrAdd(
	key: "answer",
	valueFactory: k => 42,
	ttl: TimeSpan.FromMilliseconds(100));

//handy overload to prevent captures/closures allocation
cache.GetOrAdd(
	key: "answer",
	valueFactory: (k, arg) => 42 + arg.Length,
	ttl: TimeSpan.FromMilliseconds(100),
	factoryArgument: "some state data");

Tradeoffs

FastCache uses Environment.TickCount to monitor items' TTL. Environment.TickCount is 104x faster than DateTime.Now and 26x faster than DateTime.UtcNow.

But Environment.TickCount is limited to Int32, which means it wraps around to int.MinValue once overflowed. The wraparound itself is handled internally, but it meant you could not cache items for more than roughly 25 days (2.4 billion milliseconds).

This limitation no longer applies: since switching to .NET 6 targeting, FastCache uses Environment.TickCount64, which is free of this problem.
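A minimal sketch of the kind of expiration check this approach enables (illustrative only, not FastCache's actual internals):

```csharp
// Environment.TickCount64 is a cheap, monotonic millisecond counter (long),
// so a TTL check is a single integer comparison — no DateTime involved,
// and no Int32 wraparound to work around.
long ttlMs = (long)TimeSpan.FromMinutes(1).TotalMilliseconds;
long expiresAtTicks = Environment.TickCount64 + ttlMs; // stored when the item is added

bool IsExpired() => Environment.TickCount64 > expiresAtTicks;
```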

Another tradeoff: MemoryCache watches memory usage and evicts items once it senses memory pressure. FastCache does none of that; it is up to you to keep your caches reasonably sized. After all, it's just a dictionary.

API Reference

FastCache<TKey, TValue>

Implements IEnumerable<KeyValuePair<TKey, TValue>>, IDisposable.

Constructor

FastCache(int cleanupJobInterval = 10000, EvictionCallback itemEvicted = null)

Creates a new empty cache instance.

| Parameter | Type | Default | Description |
|---|---|---|---|
| cleanupJobInterval | int | 10000 | Background cleanup interval in milliseconds |
| itemEvicted | EvictionCallback | null | Optional callback when an item is evicted (runs on thread pool) |
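For illustration, a cache with a one-minute cleanup interval and a logging eviction callback might be constructed like this (a sketch based on the parameters above):

```csharp
var cache = new FastCache<string, int>(
	cleanupJobInterval: 60_000, // run background cleanup once a minute
	itemEvicted: (key, value) =>
		Console.WriteLine($"evicted {key}={value}")); // invoked on the thread pool
```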

Methods

AddOrUpdate(TKey key, TValue value, TimeSpan ttl)

Adds an item to cache or updates it if it already exists. Updating resets the TTL (sliding expiration).

AddOrUpdate(TKey key, Func<TKey, TValue> addValueFactory, Func<TKey, TValue, TValue> updateValueFactory, TimeSpan ttl)

Factory overload. Uses addValueFactory when the key is new, updateValueFactory when it exists.
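A sketch of using the factory overload as a per-key counter (illustrative; assumes a FastCache<string, int> instance):

```csharp
// Starts at 1 for a new key, increments atomically for an existing one.
cache.AddOrUpdate(
	key: "hits",
	addValueFactory: k => 1,
	updateValueFactory: (k, current) => current + 1,
	ttl: TimeSpan.FromMinutes(5));
```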

TryGet(TKey key, out TValue value) → bool

Attempts to get a value by key. Returns true if found and not expired.

TryAdd(TKey key, TValue value, TimeSpan ttl) → bool

Attempts to add a key/value item. Returns false if the key already exists (and is not expired).
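Because the add is atomic, TryAdd can serve as a cheap "first writer wins" guard (an illustrative sketch; the key name is made up):

```csharp
// Only the first caller within the TTL gets to do the work;
// concurrent and later callers see false until the entry expires.
if (cache.TryAdd("job:nightly-report", 1, TimeSpan.FromSeconds(30)))
{
	// we won: the key was absent (or expired), so run the job
}
```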

GetOrAdd(TKey key, Func<TKey, TValue> valueFactory, TimeSpan ttl) → TValue

Returns existing value if cached, otherwise calls the factory to create, cache, and return it.

GetOrAdd(TKey key, TValue value, TimeSpan ttl) → TValue

Returns existing value if cached, otherwise adds the provided value and returns it.

GetOrAdd<TArg>(TKey key, Func<TKey, TArg, TValue> valueFactory, TimeSpan ttl, TArg factoryArgument) → TValue

Same as GetOrAdd but accepts a factoryArgument to avoid closure allocations.

Touch(TKey key, TimeSpan ttl)

Resets the TTL for an existing (non-expired) item — sliding expiration.
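For example, to keep a session entry alive on every access (an illustrative sketch; the key name is made up):

```csharp
if (cache.TryGet("session:42", out int session))
{
	// the item is still cached; push its expiration out another 20 minutes
	cache.Touch("session:42", TimeSpan.FromMinutes(20));
}
```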

Remove(TKey key)

Removes the item with the specified key.

TryRemove(TKey key, out TValue value) → bool

Removes the item and returns the removed value. Returns false if not found or expired.
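A sketch of take-and-use removal (illustrative, continuing the Usage example above):

```csharp
// Atomically take the value out of the cache, e.g. to flush it elsewhere.
if (cache.TryRemove("answer", out int removed))
{
	Console.WriteLine($"flushed {removed}");
}
```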

EvictExpired()

Manually triggers cleanup of expired items. Rarely needed since TryGet checks TTL anyway.

Clear()

Removes all items from the cache.

Properties

| Property | Type | Description |
|---|---|---|
| Count | int | Total item count, including expired items not yet cleaned up |

Delegate

delegate void EvictionCallback(TKey key, TValue value)

Callback invoked (on thread pool) when an item is evicted from the cache.
