A lightweight collection of TypeScript utilities for common async operation patterns. Each utility is optimized for performance and provides a clean, type-safe API.
Features:
- async-retry: Smart retry logic with exponential backoff for API calls and network operations
- async-cache: Fast LRU caching with TTL support for expensive operations
- async-dedupe: Prevent duplicate API calls and redundant operations
- async-queue: Control concurrency and resource usage with priority queues
- async-poll: Reliable polling with configurable intervals and backoff
- Fully typed with TypeScript
- Comprehensive test coverage
- Tree-shakeable and lightweight
- A single tiny dependency: tiny-lru
npm install async-plugins # npm
pnpm add async-plugins # pnpm
bun add async-plugins # bun
yarn add async-plugins # yarn
Try it online: https://stackblitz.com/edit/js-fjsqnhfc?file=index.js
Perfect for handling flaky API calls or network operations:
import { createAsyncRetry } from 'async-plugins';
const fetchWithRetry = createAsyncRetry({
  retries: 3, // Try up to 3 times
  minTimeout: 1000, // Start with 1s delay
  maxTimeout: 10000, // Cap at 10s delay
  factor: 2, // Double the delay each time
  jitter: true, // Add randomness to prevent thundering herd
  shouldRetry: (error) => {
    // Only retry on network/5xx errors
    return error.name === 'NetworkError' || (error.status && error.status >= 500);
  },
  onRetry: (error, attempt) => {
    console.warn(`Retry attempt ${attempt} after error:`, error);
  },
});
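With these options, the delay roughly doubles from `minTimeout` up to the `maxTimeout` cap, with jitter randomizing each wait. A minimal sketch of that schedule (illustrating the general backoff pattern, not the library's internals):

```typescript
// Illustrative backoff schedule: exponential growth from minTimeout,
// capped at maxTimeout, with optional "full jitter" randomization.
function backoffDelay(
  attempt: number, // 1-based retry attempt
  opts: { minTimeout: number; maxTimeout: number; factor: number; jitter: boolean }
): number {
  const raw = opts.minTimeout * Math.pow(opts.factor, attempt - 1);
  const capped = Math.min(raw, opts.maxTimeout);
  // Full jitter: pick a random delay in [0, capped)
  return opts.jitter ? Math.random() * capped : capped;
}

// With the config above (jitter off): 1000ms, 2000ms, 4000ms, 8000ms, 10000ms (capped)
```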
// Example: Fetch user data with retries
const getUserData = async (userId: string) => {
  try {
    const response = await fetchWithRetry(async () => {
      const res = await fetch(`/api/users/${userId}`);
      // fetch resolves on HTTP errors, so surface the status for shouldRetry
      if (!res.ok) {
        throw Object.assign(new Error(`HTTP ${res.status}`), { status: res.status });
      }
      return res.json();
    });
    return response;
  } catch (error) {
    // All retries failed
    console.error('Failed to fetch user data:', error);
    throw error;
  }
};
Optimize expensive operations and API calls with smart caching:
import { createAsyncCache } from 'async-plugins';
const cache = createAsyncCache({
  ttl: 300000, // Cache for 5 minutes
  maxSize: 1000, // Store up to 1000 items
  staleWhileRevalidate: true, // Return stale data while refreshing
});
// Example: Cache expensive API calls
const getUserProfile = cache(
  async (userId: string) => {
    const response = await fetch(`/api/users/${userId}`);
    return response.json();
  },
  // Optional: Custom cache key generator
  (userId) => `user_profile:${userId}`
);
// First call fetches and caches
const profile1 = await getUserProfile('123');
// Subsequent calls within TTL return cached data
const profile2 = await getUserProfile('123'); // instant return
// After TTL expires, returns stale data and refreshes in background
const profile3 = await getUserProfile('123'); // instant return with stale data
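Stale-while-revalidate boils down to tracking an expiry timestamp per entry and still returning expired values while a refresh runs in the background. A minimal sketch of that idea (not the library's implementation, which builds on tiny-lru):

```typescript
// Minimal TTL store sketch: entries keep an expiry timestamp, and a hit
// past its TTL is returned flagged as stale so the caller can refresh it.
type Entry<T> = { value: T; expires: number };

function createTtlStore<T>(ttlMs: number, now: () => number = Date.now) {
  const store = new Map<string, Entry<T>>();
  return {
    get(key: string): { value: T; stale: boolean } | undefined {
      const entry = store.get(key);
      if (!entry) return undefined;
      // A stale hit is still returned; the caller refreshes in the background.
      return { value: entry.value, stale: now() > entry.expires };
    },
    set(key: string, value: T) {
      store.set(key, { value, expires: now() + ttlMs });
    },
  };
}
```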
Prevent duplicate API calls and redundant operations:
import { createAsyncDedupe } from 'async-plugins';
const dedupe = createAsyncDedupe({
  timeout: 5000, // Auto-expire after 5s
  errorSharing: true, // Share errors between duplicate calls
});
// Example: Prevent duplicate API calls
const fetchUserData = dedupe(async (userId: string) => {
  const response = await fetch(`/api/users/${userId}`);
  return response.json();
});
// Multiple simultaneous calls with same ID
const [user1, user2] = await Promise.all([
  fetchUserData('123'), // Makes API call
  fetchUserData('123'), // Uses result from first call
]);
// Check if operation is in progress
if (dedupe.isInProgress('123')) {
  console.log('Fetch in progress...');
}
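The pattern behind deduplication is straightforward: concurrent calls with the same key share a single in-flight promise, which is dropped once it settles. An illustrative sketch (not the library's code):

```typescript
// Concurrent calls with the same key join one in-flight promise instead of
// re-running the underlying function.
function dedupeByKey<T>(fn: (key: string) => Promise<T>) {
  const inFlight = new Map<string, Promise<T>>();
  return (key: string): Promise<T> => {
    const existing = inFlight.get(key);
    if (existing) return existing; // join the call already in progress
    const p = fn(key).finally(() => inFlight.delete(key)); // expire once settled
    inFlight.set(key, p);
    return p;
  };
}
```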
Control concurrency and manage resource usage:
import { createAsyncQueue } from 'async-plugins';
const queue = createAsyncQueue({
  concurrency: 2, // Process 2 tasks at once
  autoStart: true, // Start processing immediately
});
// Example: Rate-limit API calls
const processUsers = async (userIds: string[]) => {
  const results = await queue.addAll(
    userIds.map((id) => async () => {
      const response = await fetch(`/api/users/${id}`);
      return response.json();
    })
  );
  return results;
};
// Monitor queue status
queue.onEmpty().then(() => {
  console.log('Queue is empty');
});
queue.onDrain().then(() => {
  console.log('All tasks completed');
});
// Queue stats
console.log(queue.stats());
// { pending: 0, active: 2, completed: 10, errors: 0, total: 12 }
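Concurrency control of this kind is usually built on a counting semaphore: a task runs only when one of the `concurrency` slots is free, and finishing a task hands its slot to the next waiter. A minimal sketch of the pattern (not the library's queue, which adds priorities and stats):

```typescript
// Counting-semaphore sketch: at most `concurrency` tasks run at once;
// the rest wait in FIFO order for a slot to free up.
function createLimiter(concurrency: number) {
  let active = 0;
  const waiting: Array<() => void> = [];
  const acquire = () =>
    new Promise<void>((resolve) => {
      if (active < concurrency) {
        active++;
        resolve();
      } else {
        waiting.push(() => { active++; resolve(); });
      }
    });
  const release = () => {
    active--;
    waiting.shift()?.(); // wake the next waiting task, if any
  };
  return async <T>(task: () => Promise<T>): Promise<T> => {
    await acquire();
    try {
      return await task();
    } finally {
      release();
    }
  };
}
```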
Reliable polling with configurable intervals:
import { createAsyncPoller } from 'async-plugins';
// Example: Poll for job completion
const pollJobStatus = createAsyncPoller(
  // Function to poll
  async () => {
    const response = await fetch('/api/job/123');
    return response.json();
  },
  {
    interval: 1000, // Poll every second
    maxAttempts: 30, // Try up to 30 times
    backoff: {
      type: 'exponential',
      factor: 2,
      maxInterval: 30000,
      jitter: true,
    },
    shouldContinue: (result) => result.status === 'running',
    onProgress: (result) => {
      console.log('Job progress:', result.progress);
    },
  }
);
try {
  const finalResult = await pollJobStatus.start();
  console.log('Job completed:', finalResult);
} catch (error) {
  console.error('Polling failed:', error);
}
// Can stop polling manually if needed
pollJobStatus.stop();
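At its core, polling like this is a loop: call the function, check `shouldContinue`, wait, repeat until the result settles or attempts run out. A simplified sketch of that loop (fixed interval, no backoff, not the library's implementation):

```typescript
// Simplified polling loop: call `fn` until `shouldContinue` returns false
// or `maxAttempts` is exhausted, waiting `interval` ms between calls.
async function poll<T>(
  fn: () => Promise<T>,
  opts: { interval: number; maxAttempts: number; shouldContinue: (result: T) => boolean }
): Promise<T> {
  let result = await fn();
  for (let attempt = 1; attempt < opts.maxAttempts && opts.shouldContinue(result); attempt++) {
    await new Promise((r) => setTimeout(r, opts.interval));
    result = await fn();
  }
  if (opts.shouldContinue(result)) {
    throw new Error('Polling exhausted maxAttempts');
  }
  return result;
}
```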
- Focused Purpose: Each utility solves a specific async pattern problem
- Lightweight: Minimal bundle size impact with tree-shaking support
- Type-Safe: Written in TypeScript with comprehensive type definitions
- Customizable: Flexible configuration options for each utility
- Production-Ready: Well-tested and actively maintained
If you have questions or run into issues:
- Check the API Reference
- Create an issue on GitHub
- For security issues, please email me directly
Contributions are welcome!
MIT License. See LICENSE for details.
This project wouldn't be possible without:
- tiny-lru for LRU cache implementation
- Claude 3.7 Sonnet
- Gemini 2.5 Pro