Soenneker.Utils.BackgroundQueue

A high-performance background Task / ValueTask queue


Overview

BackgroundQueue provides a fast, controlled way to execute background work in .NET applications. It prevents overload by queueing and processing work asynchronously with configurable limits and built-in tracking.


Features

  • Supports both Task and ValueTask
  • Configurable queue size
  • Tracks running and pending work
  • Simple DI registration
  • Hosted service for automatic background processing

Installation

dotnet add package Soenneker.Utils.BackgroundQueue

Register the queue:

public void ConfigureServices(IServiceCollection services)
{
    services.AddBackgroundQueueAsSingleton();
}

Starting & Stopping

Start

await serviceProvider.WarmupAndStartBackgroundQueue(cancellationToken);

Synchronous start:

serviceProvider.WarmupAndStartBackgroundQueueSync(cancellationToken);

Stop

await serviceProvider.StopBackgroundQueue(cancellationToken);

Synchronous stop:

serviceProvider.StopBackgroundQueueSync(cancellationToken);

Configuration

{
  "Background": {
    "QueueLength": 5000,
    "LockCounts": false,
    "Log": false
  }
}
  • QueueLength - Maximum number of queued items
  • LockCounts - Enables thread-safe tracking of running work
  • Log - Enables debug logging
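These settings can live in a standard appsettings.json file, which the generic host loads automatically. A minimal sketch, assuming the library binds its options from the "Background" configuration section shown above:

```csharp
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

// appsettings.json (loaded by the generic host by default):
// { "Background": { "QueueLength": 5000, "LockCounts": false, "Log": false } }

var builder = Host.CreateApplicationBuilder(args);

// Registration extension from this library (namespace import assumed)
builder.Services.AddBackgroundQueueAsSingleton();

IHost host = builder.Build();
await host.RunAsync();
```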

Using the Queue

Inject IBackgroundQueue:

private readonly IBackgroundQueue _queue;

public MyClass(IBackgroundQueue queue)
{
    _queue = queue;
}

Queueing a ValueTask

await _queue.QueueValueTask(_ => someValueTask(), cancellationToken);

Queueing a Task

await _queue.QueueTask(_ => someTask(), cancellationToken);

Performance Tip: Prefer Stateful Queueing

Avoid capturing variables in lambdas when queueing work. Captured lambdas allocate and can impact performance under load.

Avoid (captures state)

await _queue.QueueTask(ct => DoWorkAsync(id, ct));

If id is a local variable, this creates a closure.
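For illustration, the compiler lowers a capturing lambda into roughly the following (a sketch; the class and member names are invented stand-ins for compiler-internal names):

```csharp
// Roughly what the compiler generates for: ct => DoWorkAsync(id, ct)
sealed class Closure
{
    public int Id; // the captured local, now a heap-allocated field

    public Task Invoke(CancellationToken ct) => DoWorkAsync(Id, ct);
}
```

Every queueing call allocates a fresh closure instance (and a delegate pointing at it), which is what the stateful overloads below avoid.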


Recommended: Pass State Explicitly

Use the stateful overloads with static lambdas.

ValueTask

await _queue.QueueValueTask(
    myService,
    static (svc, ct) => svc.ProcessAsync(ct),
    ct);

Task

await _queue.QueueTask(
    (logger, id),
    static (s, ct) => s.logger.RunAsync(s.id, ct),
    ct);

Why this is better:

  • No closure allocations
  • Lower GC pressure
  • Best performance for high-throughput queues

The non-stateful overloads remain available for convenience, but stateful queueing is recommended for hot paths.


Waiting for the Queue to Empty

await queue.WaitUntilEmpty(cancellationToken);
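A common shutdown pattern is to drain the queue before stopping the processor. A sketch combining the calls shown in this README (the queue, service provider, and token are assumed to be in scope):

```csharp
// Let in-flight and pending work finish, then stop the background processor.
await queue.WaitUntilEmpty(cancellationToken);
await serviceProvider.StopBackgroundQueue(cancellationToken);
```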

Task Tracking

Check if work is still processing:

bool isProcessing = await queueInformationUtil.IsProcessing(cancellationToken);

Get current counts:

var (taskCount, valueTaskCount) =
    await queueInformationUtil.GetCountsOfProcessing(cancellationToken);
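For example, a test or shutdown routine might poll until processing completes (a sketch; the 100 ms delay interval is arbitrary):

```csharp
// Poll until the queue reports no in-flight work.
while (await queueInformationUtil.IsProcessing(cancellationToken))
{
    await Task.Delay(100, cancellationToken);
}

var (taskCount, valueTaskCount) =
    await queueInformationUtil.GetCountsOfProcessing(cancellationToken);
// Both counts should now be zero.
```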

License

MIT license