diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md new file mode 100644 index 0000000..a55fb43 --- /dev/null +++ b/CONTRIBUTING.md @@ -0,0 +1,43 @@ +# Contributing to CacheForge + +Thank you for your interest in contributing to CacheForge! We welcome all kinds of contributions, including bug reports, feature requests, documentation improvements, and code enhancements. + +## How to Contribute + +1. **Fork the repository** + - Click the "Fork" button at the top right of the GitHub page. +2. **Clone your fork** + - `git clone https://github.com/<your-username>/cacheforge.git` +3. **Create a new branch** + - `git checkout -b feature/your-feature-name` +4. **Make your changes** + - Ensure your code follows the project's style and conventions. + - Add or update tests as needed. +5. **Commit your changes** + - `git commit -m "Describe your changes"` +6. **Push to your fork** + - `git push origin feature/your-feature-name` +7. **Open a Pull Request** + - Go to the original repository and click "New Pull Request". + - Fill out the PR template and describe your changes clearly. + +## Code Style +- Use TypeScript for all source files. +- Run `npm run lint` before submitting. +- Write clear, concise documentation and comments. +- Add tests for new features and bug fixes. + +## Reporting Issues +- Use [GitHub Issues](https://github.com/oliverkuchies/cacheforge/issues) to report bugs or request features. +- Please provide as much detail as possible, including steps to reproduce, expected behavior, and screenshots if relevant. + +## Community +- Be respectful and constructive in all communications. +- Follow the [Code of Conduct](CODE_OF_CONDUCT.md) if available. + +## License +By contributing, you agree that your contributions will be licensed under the MIT License. + +--- + +Thank you for helping make CacheForge better!
diff --git a/README.md b/README.md index d36054d..436fd02 100644 --- a/README.md +++ b/README.md @@ -1,20 +1,29 @@ -[![npm version](https://img.shields.io/npm/v/cacheforge.svg)](https://www.npmjs.com/package/cacheforge) -[![npm downloads](https://img.shields.io/npm/dm/cacheforge.svg)](https://www.npmjs.com/package/cacheforge) -[![Build Status](https://github.com/oliverkuchies/cacheforge/actions/workflows/main.yml/badge.svg)](https://github.com/oliverkuchies/cacheforge/actions) -[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT) +
+<div align="center">
+  <img src="assets/img/cacheforge.png" alt="CacheForge Logo">
+</div>
+
# cacheforge
+
+<div align="center">
+  <a href="https://www.npmjs.com/package/cacheforge"><img src="https://img.shields.io/npm/v/cacheforge.svg" alt="npm version"></a>
+  <a href="https://www.npmjs.com/package/cacheforge"><img src="https://img.shields.io/npm/dm/cacheforge.svg" alt="npm downloads"></a>
+  <a href="https://github.com/oliverkuchies/cacheforge/actions"><img src="https://github.com/oliverkuchies/cacheforge/actions/workflows/main.yml/badge.svg" alt="Build Status"></a>
+  <a href="https://opensource.org/licenses/MIT"><img src="https://img.shields.io/badge/License-MIT-yellow.svg" alt="License: MIT"></a>
+</div>
+ **cacheforge** is a flexible, multi-level cache library for Node.js and TypeScript that combines the speed of in-memory caching with the persistence of Redis. -Built with extensibility in mind, it features pluggable eviction policies, memory management strategies, and safe versioning for cache invalidation. +Built with extensibility in mind, it features pluggable eviction policies, memory management strategies, and optional versioning for cache invalidation. It utilizes a leveling framework, ensuring that Level 1 is always accessed before Level 2 in the cache hierarchy. - Level 1 might be an in-memory cache, offering faster reads and reducing latency. - Level 2 could be a remote cache such as Redis or Valkey, which serves as a secondary layer when data is not found in Level 1. +- And, optionally, Level 3 could be a persistent storage solution like a database or filesystem. + +This approach reduces load on the lower-level cache and improves overall performance by leveraging the strengths of each caching layer. -This approach reduces load on the lower-level cache and improves overall performance. +By leveraging memory cache as the first level, you can expect reads to be up to 105x faster compared to fetching data directly from Redis. ## Features @@ -92,9 +101,11 @@ console.log(user); // { name: 'John Doe', email: 'john@example.com' } ### Cache Levels -Cache levels represent different storage backends in your caching hierarchy. cacheforge queries levels in order and returns the first hit, promoting cache locality. +Cache levels represent different storage backends in your caching hierarchy. Cacheforge queries levels in order, starting from the fastest (Level 1) to the slowest (Level N). + +When data is requested, the cache service checks each level sequentially until it finds the data or exhausts all levels. -At the top (CacheService), fallbacks are handled. However the added layers do not have fallback logic to reduce complexity.
+If data is found in a lower level, it can be promoted to higher levels for faster future access. **Built-in Levels:** diff --git a/assets/img/cacheforge.png b/assets/img/cacheforge.png new file mode 100644 index 0000000..934fc9a Binary files /dev/null and b/assets/img/cacheforge.png differ diff --git a/benchmark/results/cache-read-performance.table.html b/benchmark/results/cache-read-performance.table.html deleted file mode 100644 index 1dde643..0000000 --- a/benchmark/results/cache-read-performance.table.html +++ /dev/null @@ -1,25 +0,0 @@ - - - - - - - Cache Read Performance - - - - - - - - - - - - - - - -
nameopsmarginpercentSlower
Multi-Level Cache1759304.710
Redis-Only Cache18258.8799.9
- - \ No newline at end of file diff --git a/benchmark/results/cache-write-performance.table.html b/benchmark/results/cache-write-performance.table.html deleted file mode 100644 index 3e2da53..0000000 --- a/benchmark/results/cache-write-performance.table.html +++ /dev/null @@ -1,25 +0,0 @@ - - - - - - - Cache Write Performance - - - - - - - - - - - - - - - -
nameopsmarginpercentSlower
Multi-Level Cache23016.320
Redis-Only Cache106641.8553.67
- - \ No newline at end of file diff --git a/src/features/version-manager.ts b/src/features/version-manager.ts index 97d2f56..29b5add 100644 --- a/src/features/version-manager.ts +++ b/src/features/version-manager.ts @@ -39,6 +39,12 @@ export class VersionManager { } } + /** + * @description Build the versioned key + * @param key + * @param version + * @returns {string} versioned key + */ private buildLookupKey(key: string, version: number) { return `${key}:${version}`; } @@ -120,6 +126,14 @@ export class VersionManager { return this.level.get(versionedKey); } + /** + * @description Set the value for a given key + * @param key - cache key + * @param value - value to set + * @param ttl - optional time to live + * @param namespace - optional namespace for versioning + * @returns {Promise} version + */ async set(key: string, value: T, ttl?: number, namespace?: string) { const versionedKey = await this.getOrSetVersionedKeyLookup(key, namespace); await this.level.set(versionedKey, value, ttl); @@ -128,6 +142,12 @@ export class VersionManager { return version; } + /** + * @description Delete the value for a given key + * @param key - cache key + * @param namespace - optional namespace for versioning + * @returns {Promise} version + */ async del(key: string, namespace?: string) { const versionedKey = await this.getOrSetVersionedKeyLookup(key, namespace); await this.level.del(versionedKey); @@ -136,6 +156,11 @@ export class VersionManager { return version; } + /** + * @description Retrieve version from versioned key + * @param versionedKey + * @returns {string | undefined} version + */ private retrieveVersionFromKey(versionedKey: string): string | undefined { const splitKey = versionedKey.split(":"); const version = splitKey[splitKey.length - 1]; diff --git a/src/levels/memory/eviction-manager.ts b/src/levels/memory/eviction-manager.ts index d80ddac..266f04d 100644 --- a/src/levels/memory/eviction-manager.ts +++ b/src/levels/memory/eviction-manager.ts @@ -6,6 +6,12 @@ 
import type { import { onMemoryChange } from "./memory-event.manager"; export class EvictionManager { + /** + * Constructs an EvictionManager for a given memory cache level and options. + * Registers a listener for memory changes to trigger eviction. + * @param memoryLevel - The memory cache level instance + * @param memoryLevelOptions - Options including strategies and eviction policy + */ constructor( protected memoryLevel: MemoryCacheLevel, protected memoryLevelOptions: MemoryLevelOptions, @@ -13,6 +19,11 @@ export class EvictionManager { this.registerMemoryChangeListener(memoryLevelOptions); } + /** + * Evicts items from memory based on custom strategies defined in options. + * If any strategy's condition is met, triggers the eviction policy. + * @param options - Memory level options containing strategies and eviction policy + */ private async evictByStrategy(options: MemoryLevelOptions) { if ( options.memoryStrategies.find((strategy) => @@ -23,6 +34,11 @@ export class EvictionManager { } } + /** + * Registers a listener to handle memory changes. + * On memory change, evicts expired items and applies eviction strategies. + * @param options - Memory level options + */ public registerMemoryChangeListener( options: MemoryLevelOptions, ) { @@ -32,6 +48,10 @@ export class EvictionManager { }); } + /** + * Evicts all expired items from the memory cache level. + * Removes items whose expiry timestamp is less than or equal to the current time. + */ public async evictExpiredItems() { const now = Date.now(); const evictedItems = []; diff --git a/src/levels/memory/memory.level.ts b/src/levels/memory/memory.level.ts index e13a818..6f0c0d1 100644 --- a/src/levels/memory/memory.level.ts +++ b/src/levels/memory/memory.level.ts @@ -37,6 +37,11 @@ export class MemoryCacheLevel this.evictionManager = new EvictionManager(this, options); } + /** + * Delete multiple values from the cache. + * @param keys The cache keys. 
+ * @return void + */ public async mdel(keys: string[]): Promise { const deletePromises: Promise[] = []; for (const key of keys) { @@ -45,6 +50,11 @@ export class MemoryCacheLevel await Promise.all(deletePromises); } + /** + * Update the cache store with a new item. + * @param key The cache key. + * @param item The item to store. + */ private async updateStore(key: string, item: StoredItem) { this.store.set(key, item); this.heap.insert({ ...item, key }); @@ -52,10 +62,21 @@ export class MemoryCacheLevel triggerMemoryChange(); } + /** + * Get the current size of the cache store in bytes. + * @returns The size of the cache store. + */ public getStoreSize(): number { return this.size; } + /** + * Store multiple values in the cache. + * @param keys The cache keys. + * @param values The values to cache. + * @param ttl Time to live in seconds. + * @returns The cached values. + */ async mset( keys: string[], values: T[], @@ -94,6 +115,11 @@ export class MemoryCacheLevel return results; } + /** + * Retrieve a value from the cache. + * @param key The cache key. + * @returns The cached value or null if not found. + */ async get(key: string): Promise { await this.evictionManager.evictExpiredItems(); @@ -101,6 +127,14 @@ export class MemoryCacheLevel return cachedValue?.value as T; } + + /** + * Store a value in the cache. + * @param key The cache key. + * @param value The value to cache. + * @param ttl Time to live in seconds. + * @returns The cached value. + */ async set(key: string, value: T, ttl: number = DEFAULT_TTL): Promise { const expiryDate = Date.now() + ttl * 1000; const storedItem = { value, expiry: expiryDate }; @@ -108,21 +142,47 @@ export class MemoryCacheLevel return value as T; } + + /** + * Delete a value from the cache. + * @param key The cache key. + * @return void + */ async del(key: string): Promise { this.store.delete(key); } + + /** + * Purge all items from the cache. 
+ * @return void + */ purge(): void { this.heap.clear(); this.store.clear(); } + + /** + * Get the current memory usage percentage. + * @return Memory usage percentage. + */ getMemoryUsage(): number { const memoryAvailable = os.totalmem() - os.freemem(); return (memoryAvailable / os.totalmem()) * 100; } + + /** + * Get the cache heap. + * @return The cache heap. + */ getHeap() { return this.heap; } + /** + * Flush all items from the cache. + * @return Promise that resolves when the operation is complete. + */ + flushAll(): Promise { this.purge(); return Promise.resolve(); diff --git a/src/levels/redis/redis.level.ts b/src/levels/redis/redis.level.ts index 55ecf1e..7d3f521 100644 --- a/src/levels/redis/redis.level.ts +++ b/src/levels/redis/redis.level.ts @@ -25,59 +25,91 @@ export class RedisCacheLevel implements CacheLevel, Lockable { * @param namespace - used to group related cache entries for easier invalidation * @returns */ + /** + * Retrieves a value from Redis by key and parses it if it's JSON. + * @param key - Cache key + * @returns Parsed value of type T or undefined if not found + */ async get(key: string): Promise { const cachedValue = (await this.client.get(key)) as T; - return parseIfJSON(cachedValue); } + /** + * Sets multiple key-value pairs in Redis using a pipeline for efficiency. + * @param keys - Array of cache keys + * @param values - Array of values to cache + * @param ttl - Time to live in seconds + * @returns Array of values that were set + */ async mset(keys: string[], values: T[], ttl = DEFAULT_TTL): Promise { const pipeline = this.client.pipeline(); - for (let i = 0; i < keys.length; i++) { const key = keys[i]; const value = values[i]; pipeline.set(key, serialize(value), "EX", ttl); } - await pipeline.exec(); - return values; } + /** + * Retrieves multiple values from Redis by their keys and deserializes them. 
+ * @param keys - Array of cache keys + * @returns Array of values of type T (undefined if not found) + */ async mget(keys: string[]): Promise { const results = await this.client.mget(...keys); const finalResults: T[] = []; - for (let i = 0; i < keys.length; i++) { const cachedValue = results[i] as never; - if (cachedValue === null || cachedValue === undefined) { finalResults.push(undefined as T); } else { finalResults.push(deserialize(cachedValue)); } } - return finalResults; } + /** + * Sets a single key-value pair in Redis with a TTL. + * @param key - Cache key + * @param value - Value to cache + * @param ttl - Time to live in seconds + * @returns Parsed value of type T + */ async set(key: string, value: T, ttl = DEFAULT_TTL) { await this.client.set(key, serialize(value), "EX", ttl); - return parseIfJSON(value) as T; } + /** + * Deletes a key and its version lookup key from Redis. + * @param key - Cache key to delete + */ async del(key: string) { await this.client.del(key); const versionKey = generateVersionLookupKey(key); await this.client.del(versionKey); } + /** + * Deletes multiple keys from Redis. + * @param keys - Array of cache keys to delete + */ async mdel(keys: string[]) { await this.client.del(...keys); } + /** + * Acquires a distributed lock for a key, executes a callback, and releases the lock. + * Uses Redlock for distributed locking. 
+ * @param key - Cache key to lock + * @param callback - Function to execute while holding the lock + * @param ttl - Time to live for the lock in seconds + * @returns Result of the callback function + */ async lock(key: string, callback: () => Promise, ttl = 30): Promise { const redlockClient = new Redlock([this.client], { driftFactor: 0.01, @@ -86,7 +118,6 @@ export class RedisCacheLevel implements CacheLevel, Lockable { retryJitter: 200, automaticExtensionThreshold: 500, }); - const lock = await redlockClient.acquire([`lock:${key}`], ttl * 1000); try { const result = await callback(); @@ -97,8 +128,7 @@ export class RedisCacheLevel implements CacheLevel, Lockable { } /** - * Not recommended for production use in large datasets - * as it can be slow and blocking. + * Flushes all keys from Redis. Not recommended for production use in large datasets as it can be slow and blocking. */ async flushAll(): Promise { await this.client.flushall();
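The lookup-and-promote flow this PR documents in the README ("queries levels in order, starting from the fastest", "it can be promoted to higher levels for faster future access") can be sketched as below. The `Level` interface and `multiLevelGet` helper are illustrative assumptions for this sketch, not cacheforge's actual API:

```typescript
// Minimal sketch of a multi-level lookup with promotion.
// `Level` and `multiLevelGet` are hypothetical names, not cacheforge's API.
interface Level {
  get(key: string): Promise<unknown | undefined>;
  set(key: string, value: unknown): Promise<void>;
}

// A toy in-memory level standing in for the real MemoryCacheLevel/RedisCacheLevel.
class MapLevel implements Level {
  private store = new Map<string, unknown>();
  async get(key: string) {
    return this.store.get(key);
  }
  async set(key: string, value: unknown) {
    this.store.set(key, value);
  }
}

// Query levels in order (fastest first). On a hit in a deeper level,
// promote the value into every faster level above it so future reads
// are served from Level 1.
async function multiLevelGet(
  levels: Level[],
  key: string,
): Promise<unknown | undefined> {
  for (let i = 0; i < levels.length; i++) {
    const value = await levels[i].get(key);
    if (value !== undefined) {
      for (let j = 0; j < i; j++) {
        await levels[j].set(key, value);
      }
      return value;
    }
  }
  return undefined;
}
```

After a miss in Level 1 and a hit in Level 2, the value is written back into Level 1, which is the behaviour that makes the memory level absorb repeat reads instead of Redis.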