Merged
43 changes: 43 additions & 0 deletions CONTRIBUTING.md
@@ -0,0 +1,43 @@
# Contributing to CacheForge

Thank you for your interest in contributing to CacheForge! We welcome all kinds of contributions, including bug reports, feature requests, documentation improvements, and code enhancements.

## How to Contribute

1. **Fork the repository**
- Click the "Fork" button at the top right of the GitHub page.
2. **Clone your fork**
- `git clone https://github.com/<your-username>/cacheforge.git`
3. **Create a new branch**
- `git checkout -b feature/your-feature-name`
4. **Make your changes**
- Ensure your code follows the project's style and conventions.
- Add or update tests as needed.
5. **Commit your changes**
- `git commit -m "Describe your changes"`
6. **Push to your fork**
- `git push origin feature/your-feature-name`
7. **Open a Pull Request**
- Go to the original repository and click "New Pull Request".
- Fill out the PR template and describe your changes clearly.

## Code Style
- Use TypeScript for all source files.
- Run `npm run lint` before submitting.
- Write clear, concise documentation and comments.
- Add tests for new features and bug fixes.

## Reporting Issues
- Use [GitHub Issues](https://github.com/oliverkuchies/cacheforge/issues) to report bugs or request features.
- Please provide as much detail as possible, including steps to reproduce, expected behavior, and screenshots if relevant.

## Community
- Be respectful and constructive in all communications.
- Follow the [Code of Conduct](CODE_OF_CONDUCT.md) if available.

## License
By contributing, you agree that your contributions will be licensed under the MIT License.

---

Thank you for helping make CacheForge better!
27 changes: 19 additions & 8 deletions README.md
@@ -1,20 +1,29 @@
[![npm version](https://img.shields.io/npm/v/cacheforge.svg)](https://www.npmjs.com/package/cacheforge)
[![npm downloads](https://img.shields.io/npm/dm/cacheforge.svg)](https://www.npmjs.com/package/cacheforge)
[![Build Status](https://github.com/oliverkuchies/cacheforge/actions/workflows/main.yml/badge.svg)](https://github.com/oliverkuchies/cacheforge/actions)
[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
<div>
<img src="assets/img/cacheforge.png" alt="CacheForge Logo" width="300" />
</div>

# cacheforge

<div>
<img src="https://img.shields.io/npm/v/cacheforge.svg" alt="npm version" />
<img src="https://img.shields.io/npm/dm/cacheforge.svg" alt="npm downloads" />
<img src="https://github.com/oliverkuchies/cacheforge/actions/workflows/main.yml/badge.svg" alt="Build Status" />
<img src="https://img.shields.io/badge/License-MIT-yellow.svg" alt="License: MIT" />
</div>

**cacheforge** is a flexible, multi-level cache library for Node.js and TypeScript that combines the speed of in-memory caching with the persistence of Redis.

Built with extensibility in mind, it features pluggable eviction policies, memory management strategies, and safe versioning for cache invalidation.
Built with extensibility in mind, it features pluggable eviction policies, memory management strategies, and optional versioning for cache invalidation.

It utilizes a leveling framework, ensuring that Level 1 is always accessed before Level 2 in the cache hierarchy.

- Level 1 might be an in-memory cache, offering faster reads and reducing latency.
- Level 2 could be a remote cache such as Redis or Valkey, which serves as a secondary layer when data is not found in Level 1.
- Optionally, Level 3 could be a persistent storage layer such as a database or filesystem.

This approach reduces load on the lower-level caches and improves overall performance by leveraging the strengths of each caching layer.

By using the in-memory cache as the first level, reads can be up to 105x faster than fetching the same data directly from Redis.
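As a rough sketch of that ordering (simplified, hypothetical types — `CacheLevel`, `MapLevel`, and `multiLevelGet` here are illustrations, not cacheforge's actual API):

```typescript
// Minimal sketch of ordered multi-level lookup (illustrative only,
// not cacheforge's real API). Each level is just a keyed store;
// levels are queried in order and the first hit wins.
interface CacheLevel {
  get(key: string): unknown | undefined;
  set(key: string, value: unknown): void;
}

class MapLevel implements CacheLevel {
  private store = new Map<string, unknown>();
  get(key: string) { return this.store.get(key); }
  set(key: string, value: unknown) { this.store.set(key, value); }
}

// Level 1 would be in-memory; Level 2 a remote cache such as Redis or Valkey.
function multiLevelGet(levels: CacheLevel[], key: string): unknown | undefined {
  for (const level of levels) {
    const value = level.get(key);
    if (value !== undefined) return value; // first hit wins
  }
  return undefined; // all levels missed
}
```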

## Features

@@ -92,9 +101,11 @@ console.log(user); // { name: 'John Doe', email: 'john@example.com' }

### Cache Levels

Cache levels represent different storage backends in your caching hierarchy. cacheforge queries levels in order, from the fastest (Level 1) to the slowest (Level N), and returns the first hit.

When data is requested, the cache service checks each level sequentially until it finds the data or exhausts all levels.

Fallback handling lives at the top, in CacheService; the individual levels deliberately contain no fallback logic, which keeps them simple.
If data is found in a lower level, it can be promoted to higher levels for faster future access.
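The promotion behaviour can be sketched like this (a hypothetical helper with simplified types, not the real CacheService logic):

```typescript
// Sketch of read-through with promotion: on a hit at level i, the value
// is copied into every faster level 0..i-1 so future reads hit sooner.
type Level = { name: string; store: Map<string, string> };

function getWithPromotion(levels: Level[], key: string): string | undefined {
  for (let i = 0; i < levels.length; i++) {
    const value = levels[i].store.get(key);
    if (value !== undefined) {
      for (let j = 0; j < i; j++) {
        levels[j].store.set(key, value); // promote into faster levels
      }
      return value;
    }
  }
  return undefined;
}
```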

**Built-in Levels:**

Binary file added assets/img/cacheforge.png
25 changes: 0 additions & 25 deletions benchmark/results/cache-read-performance.table.html

This file was deleted.

25 changes: 0 additions & 25 deletions benchmark/results/cache-write-performance.table.html

This file was deleted.

25 changes: 25 additions & 0 deletions src/features/version-manager.ts
@@ -39,6 +39,12 @@ export class VersionManager {
}
}

/**
* @description Build the versioned key for a base key and version number
* @param key - base cache key
* @param version - version number to append
* @returns {string} versioned key
*/
private buildLookupKey(key: string, version: number) {
return `${key}:${version}`;
}
@@ -120,6 +126,14 @@ export class VersionManager {
return this.level.get<T>(versionedKey);
}

/**
* @description Set the value for a given key
* @param key - cache key
* @param value - value to set
* @param ttl - optional time to live
* @param namespace - optional namespace for versioning
* @returns {Promise<number>} version
*/
async set<T>(key: string, value: T, ttl?: number, namespace?: string) {
const versionedKey = await this.getOrSetVersionedKeyLookup(key, namespace);
await this.level.set<T>(versionedKey, value, ttl);
@@ -128,6 +142,12 @@
return version;
}

/**
* @description Delete the value for a given key
* @param key - cache key
* @param namespace - optional namespace for versioning
* @returns {Promise<number>} version
*/
async del(key: string, namespace?: string) {
const versionedKey = await this.getOrSetVersionedKeyLookup(key, namespace);
await this.level.del(versionedKey);
@@ -136,6 +156,11 @@
return version;
}

/**
* @description Retrieve the version from a versioned key
* @param versionedKey - key in the form `<key>:<version>`
* @returns {string | undefined} version
*/
private retrieveVersionFromKey(versionedKey: string): string | undefined {
const splitKey = versionedKey.split(":");
const version = splitKey[splitKey.length - 1];
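The versioned-key scheme added in this diff can be exercised on its own; these standalone helpers mirror `buildLookupKey` and `retrieveVersionFromKey` above:

```typescript
// Mirrors the versioned-key helpers from VersionManager: a versioned key
// is "<key>:<version>", and the version is the last ":"-separated segment.
function buildLookupKey(key: string, version: number): string {
  return `${key}:${version}`;
}

function retrieveVersionFromKey(versionedKey: string): string | undefined {
  const splitKey = versionedKey.split(":");
  return splitKey[splitKey.length - 1];
}
```

Because the version is always read from the last segment, base keys that themselves contain `:` (such as `user:profile`) still round-trip correctly.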
20 changes: 20 additions & 0 deletions src/levels/memory/eviction-manager.ts
@@ -6,13 +6,24 @@ import type {
import { onMemoryChange } from "./memory-event.manager";

export class EvictionManager {
/**
* Constructs an EvictionManager for a given memory cache level and options.
* Registers a listener for memory changes to trigger eviction.
* @param memoryLevel - The memory cache level instance
* @param memoryLevelOptions - Options including strategies and eviction policy
*/
constructor(
protected memoryLevel: MemoryCacheLevel,
protected memoryLevelOptions: MemoryLevelOptions<StoredHeapItem>,
) {
this.registerMemoryChangeListener(memoryLevelOptions);
}

/**
* Evicts items from memory based on custom strategies defined in options.
* If any strategy's condition is met, triggers the eviction policy.
* @param options - Memory level options containing strategies and eviction policy
*/
private async evictByStrategy(options: MemoryLevelOptions<StoredHeapItem>) {
if (
options.memoryStrategies.find((strategy) =>
@@ -23,6 +34,11 @@
}
}

/**
* Registers a listener to handle memory changes.
* On memory change, evicts expired items and applies eviction strategies.
* @param options - Memory level options
*/
public registerMemoryChangeListener(
options: MemoryLevelOptions<StoredHeapItem>,
) {
@@ -32,6 +48,10 @@
});
}

/**
* Evicts all expired items from the memory cache level.
* Removes items whose expiry timestamp is less than or equal to the current time.
*/
public async evictExpiredItems() {
const now = Date.now();
const evictedItems = [];
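The expiry sweep that `evictExpiredItems` performs can be sketched with a plain Map (a simplification — the real level also maintains a heap alongside its store):

```typescript
// Sketch of evicting expired items: remove every entry whose expiry
// timestamp is less than or equal to the current time.
interface StoredItem { value: unknown; expiry: number }

function evictExpiredItems(
  store: Map<string, StoredItem>,
  now: number = Date.now(),
): string[] {
  const evicted: string[] = [];
  for (const [key, item] of store) {
    if (item.expiry <= now) {
      store.delete(key); // safe: Map iteration tolerates deletes
      evicted.push(key);
    }
  }
  return evicted;
}
```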
60 changes: 60 additions & 0 deletions src/levels/memory/memory.level.ts
@@ -37,6 +37,11 @@ export class MemoryCacheLevel
this.evictionManager = new EvictionManager(this, options);
}

/**
* Delete multiple values from the cache.
* @param keys The cache keys.
* @return void
*/
public async mdel(keys: string[]): Promise<void> {
const deletePromises: Promise<void>[] = [];
for (const key of keys) {
@@ -45,17 +50,33 @@
await Promise.all(deletePromises);
}

/**
* Update the cache store with a new item.
* @param key The cache key.
* @param item The item to store.
*/
private async updateStore(key: string, item: StoredItem) {
this.store.set(key, item);
this.heap.insert({ ...item, key });
this.size += serialize(item).length;
triggerMemoryChange();
}

/**
* Get the current size of the cache store in bytes.
* @returns The size of the cache store.
*/
public getStoreSize(): number {
return this.size;
}

/**
* Store multiple values in the cache.
* @param keys The cache keys.
* @param values The values to cache.
* @param ttl Time to live in seconds.
* @returns The cached values.
*/
async mset<T>(
keys: string[],
values: T[],
@@ -94,35 +115,74 @@
return results;
}

/**
* Retrieve a value from the cache.
* @param key The cache key.
* @returns The cached value, or undefined if not found.
*/
async get<T>(key: string): Promise<T> {
await this.evictionManager.evictExpiredItems();

const cachedValue = this.store.get(key) as StoredItem | undefined;

return cachedValue?.value as T;
}

/**
* Store a value in the cache.
* @param key The cache key.
* @param value The value to cache.
* @param ttl Time to live in seconds.
* @returns The cached value.
*/
async set<T>(key: string, value: T, ttl: number = DEFAULT_TTL): Promise<T> {
const expiryDate = Date.now() + ttl * 1000;
const storedItem = { value, expiry: expiryDate };
await this.updateStore(key, storedItem);

return value as T;
}

/**
* Delete a value from the cache.
* @param key The cache key.
* @return void
*/
async del(key: string): Promise<void> {
this.store.delete(key);
}

/**
* Purge all items from the cache.
* @return void
*/
purge(): void {
this.heap.clear();
this.store.clear();
}

/**
* Get the current memory usage percentage.
* @return Memory usage percentage.
*/
getMemoryUsage(): number {
const memoryUsed = os.totalmem() - os.freemem();
return (memoryUsed / os.totalmem()) * 100;
}

/**
* Get the cache heap.
* @return The cache heap.
*/
getHeap() {
return this.heap;
}

/**
* Flush all items from the cache.
* @return Promise that resolves when the operation is complete.
*/
flushAll(): Promise<void> {
this.purge();
return Promise.resolve();