
Bulk Insert

Pawel Gerr edited this page Feb 16, 2026 · 8 revisions

Required NuGet packages:
Thinktecture.EntityFrameworkCore.SqlServer
Thinktecture.EntityFrameworkCore.PostgreSQL [EXPERIMENTAL]
Thinktecture.EntityFrameworkCore.Sqlite

Allows bulk-insert of entities.

Enable bulk-insert support

Enable bulk-insert support by using the extension method AddBulkOperationSupport.

If you are using Lazy Loading, disable the registration of temp tables for primitive types: sqlOptions.AddBulkOperationSupport(configureTempTablesForPrimitiveTypes: false).

var services = new ServiceCollection()
                       .AddDbContext<DemoDbContext>(builder => builder
                               // SQL Server
                               .UseSqlServer("conn-string", sqlOptions =>
                                                            {
                                                                 sqlOptions.AddBulkOperationSupport();
                                                            })
                               // PostgreSQL
                               //.UseNpgsql("conn-string", npgsqlOptions =>
                               //                          {
                               //                                npgsqlOptions.AddBulkOperationSupport();
                               //                          })
                               // SQLite
                               //.UseSqlite("conn-string", sqliteOptions =>
                               //                          {
                               //                                sqliteOptions.AddBulkOperationSupport();
                               //                          })
                               );
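For the lazy-loading scenario mentioned above, the registration would look like this (a minimal sketch for SQL Server; the same parameter is assumed to be available on the other providers' options):

```csharp
.UseSqlServer("conn-string", sqlOptions =>
                             {
                                 // disable temp tables for primitive types when Lazy Loading is enabled
                                 sqlOptions.AddBulkOperationSupport(configureTempTablesForPrimitiveTypes: false);
                             })
```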

Usage

Use one of the method overloads of BulkInsertAsync to bulk-insert entities. The method returns the number of inserted rows.

List<Customer> customersToInsert = ...;

// insert entities as a whole
int numberOfInsertedRows = await ctx.BulkInsertAsync(customersToInsert);

// only "Id" has to be sent to the DB
// alternative ways to specify the column(s):
// * c => new { c.Id }
// * c => c.Id
// * new SqlServerBulkInsertOptions { PropertiesToInsert = IEntityPropertiesProvider.Include<Customer>(c => new { c.Id }) };
numberOfInsertedRows = await ctx.BulkInsertAsync(customersToInsert, c => new { c.Id });

Bulk Insert Options

Use the corresponding implementation of IBulkInsertOptions to configure the insert of the entities.

  • SQL Server: SqlServerBulkInsertOptions
  • PostgreSQL: NpgsqlBulkInsertOptions
  • SQLite: SqliteBulkInsertOptions

Insert subset of properties only

By default, all properties of an entity are going to be inserted. You can use the options to specify the columns to insert.

var options = new SqlServerBulkInsertOptions
{
   PropertiesToInsert = IEntityPropertiesProvider.Include<Customer>(c => new { c.Id, c.FirstName, c.LastName }),
   // use "IEntityPropertiesProvider.Exclude" to exclude properties
};

await ctx.BulkInsertAsync(customersToInsert, options);

Table and schema override

By default, the target table name and schema are resolved from the EF Core model metadata. You can override them per-operation via options. When null (the default), the model-based resolution is used.

// SQL Server
var options = new SqlServerBulkInsertOptions
{
   TableName = "ArchiveCustomers",
   Schema = "archive"
};

// PostgreSQL
var options = new NpgsqlBulkInsertOptions
{
   TableName = "ArchiveCustomers",
   Schema = "archive"
};

// SQLite
var options = new SqliteBulkInsertOptions
{
   TableName = "ArchiveCustomers"
};

await ctx.BulkInsertAsync(customersToInsert, options);

Note: Column resolution still uses the EF Core model. The target table must have compatible columns.

Typical SqlBulkCopy options (SQL Server)

var options = new SqlServerBulkInsertOptions
{
  BatchSize = 5_000,
  EnableStreaming = true,
  BulkCopyTimeout = TimeSpan.FromSeconds(5),
  SqlBulkCopyOptions = SqlBulkCopyOptions.Default
};
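The configured options are passed to the bulk insert the same way as in the earlier examples:

```csharp
// pass the configured options to the bulk insert
int numberOfInsertedRows = await ctx.BulkInsertAsync(customersToInsert, options);
```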

PostgreSQL-specific options

PostgreSQL uses the COPY command internally for maximum performance.

var options = new NpgsqlBulkInsertOptions
{
   // FREEZE option: rows are frozen immediately, bypassing MVCC visibility checks.
   // Significantly improves performance for bulk loads into newly created/truncated tables.
   Freeze = true,

   CommandTimeout = TimeSpan.FromSeconds(60),

   PropertiesToInsert = IEntityPropertiesProvider.Include<Customer>(c => new { c.Id, c.FirstName, c.LastName })
};

await ctx.BulkInsertAsync(customersToInsert, options);

Limitations

Default values

Depending on the database you may hit some limitations when using default values. The limitations apply to default values defined using both HasDefaultValueSql and HasDefaultValue.

modelBuilder.Entity<Customer>(builder =>
{
    builder.Property(e => e.StringPropertyWithSqlDefaultValue).HasDefaultValueSql("'foo'");
    builder.Property(e => e.StringPropertyWithDefaultValue).HasDefaultValue("bar");
});

Entity Framework Core itself is able to handle default values properly because every entity is handled individually. Generating individual SQL statements during a bulk insert would contradict the whole idea of this feature.

Possible solutions:

  • Provide all values explicitly (i.e. don't depend on the database or EF to set default values)
  • Don't insert entities as a whole; skip the properties with default values
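The second workaround can be sketched with the options shown earlier (the property names are the ones from the example model above):

```csharp
// skip the properties backed by database default values;
// the database will then apply the defaults itself (see the provider-specific caveats below)
var options = new SqlServerBulkInsertOptions
{
   PropertiesToInsert = IEntityPropertiesProvider.Exclude<Customer>(c => new
                        {
                           c.StringPropertyWithSqlDefaultValue,
                           c.StringPropertyWithDefaultValue
                        })
};

await ctx.BulkInsertAsync(customersToInsert, options);
```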

MS SQL Server

It is not possible to trigger the default value constraint of SQL Server if the column is NOT NULL:

  • If the corresponding .NET property is a reference type (like a string) then SqlBulkCopy will throw an InvalidOperationException with the message Column 'MyStringColumn' does not allow DBNull.Value.
  • If the corresponding .NET property is a non-nullable struct (like an int) then the value is written to the database as-is, i.e. if the .NET value is 0 then 0 is written into the database. The same is true for a Guid, i.e. an empty Guid stays 00000000-0000-0000-0000-000000000000.

If the column allows NULL then the .NET value null will trigger the default value constraint, i.e. we get the expected result.

SQLite

With SQLite the default value constraint doesn't trigger when sending null / NULL / DBNull.Value to either NULL or NOT NULL columns. The only way to trigger the constraint is to not insert the corresponding property at all.

Bulk Insert from Query

Use BulkInsertAsync with an IQueryable<TSource> to insert rows from a server-side query without materializing data client-side. This generates INSERT INTO ... SELECT ... FROM (subquery).

Usage

// Insert from a temp table
await using var tempTable = await ctx.BulkInsertIntoTempTableAsync(sourceEntities, new SqlServerTempTableBulkInsertOptions());

var affectedRows = await ctx.Set<Customer>().BulkInsertAsync(
    tempTable.Query,
    builder => builder
        .Map(e => e.Id, f => f.Id)
        .Map(e => e.Name, f => f.Name)
        .Map(e => e.Email, f => f.Email));

// Insert from a DbSet query
var sourceQuery = ctx.Set<Customer>()
                     .Where(c => c.IsActive)
                     .Select(c => new { c.Id, c.Name, c.Email });

var affectedRows = await ctx.Set<ArchivedCustomer>().BulkInsertAsync(
    sourceQuery,
    builder => builder
        .Map(e => e.Id, f => f.Id)
        .Map(e => e.Name, f => f.Name)
        .Map(e => e.Email, f => f.Email));

Constants, variables, and expressions

Value selectors support constants, captured variables, and arithmetic expressions in addition to source property access. Constant values are emitted as SQL parameters (@__bv_0, @__bv_1, etc.); expressions are translated to SQL with parameters (@__ev_0, @__ev_1, etc.).

// Map a column to a constant value
var affectedRows = await ctx.Set<Customer>().BulkInsertAsync(
    sourceQuery,
    builder => builder
        .Map(e => e.Id, f => f.Id)
        .Map(e => e.Name, f => f.Name)
        .Map(e => e.Score, f => 42));             // constant

// Map a column to a captured variable
var defaultRegion = "EU";

var affectedRows = await ctx.Set<Customer>().BulkInsertAsync(
    sourceQuery,
    builder => builder
        .Map(e => e.Id, f => f.Id)
        .Map(e => e.Name, f => f.Name)
        .Map(e => e.Region, f => defaultRegion));  // captured variable

// Arithmetic on source properties
var affectedRows = await ctx.Set<Customer>().BulkInsertAsync(
    sourceQuery,
    builder => builder
        .Map(e => e.Id, f => f.Id)
        .Map(e => e.Name, f => f.Name)
        .Map(e => e.Score, f => f.Score * 2 + 1)); // arithmetic expression

Note: Value selectors support property access, EF.Property calls, constants, captured variables, and arithmetic operators (+, -, *, /, %). Method calls on properties (e.g., f.Name.Length, f.Name.ToUpper()) are not supported.
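A minimal sketch of an EF.Property value selector, assuming the source query is over an entity type (not a projection) that has a hypothetical shadow property "Region":

```csharp
// source query over the entity type itself, so the shadow
// property is reachable via EF.Property in the value selector
var source = ctx.Set<Customer>().Where(c => c.IsActive);

var affectedRows = await ctx.Set<ArchivedCustomer>().BulkInsertAsync(
    source,
    builder => builder
        .Map(e => e.Id, f => f.Id)
        .Map(e => e.Region, f => EF.Property<string>(f, "Region"))); // hypothetical shadow property
```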

Table and schema override

// SQL Server
var options = new SqlServerBulkInsertFromQueryOptions
{
    TableName = "ArchiveCustomers",
    Schema = "archive"
};

// PostgreSQL
var options = new NpgsqlBulkInsertFromQueryOptions
{
    TableName = "ArchiveCustomers",
    Schema = "archive"
};

// SQLite
var options = new SqliteBulkInsertFromQueryOptions
{
    TableName = "ArchiveCustomers"
};

await ctx.Set<Customer>().BulkInsertAsync(sourceQuery, mapBuilder, options);

Generated SQL

All three providers generate the same structure (only identifier quoting differs):

-- SQL Server
INSERT INTO [Schema].[Table] ([Col1], [Col2])
SELECT s.[SrcCol1], s.[SrcCol2]
FROM (
  <source_query>
) AS s;

-- PostgreSQL / SQLite
INSERT INTO "Schema"."Table" ("Col1", "Col2")
SELECT s."SrcCol1", s."SrcCol2"
FROM (
  <source_query>
) AS s;

Shadow Properties

If an entity has shadow properties then the entity must be attached to the corresponding DbContext to be able to access the properties.
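A minimal sketch, assuming a hypothetical shadow property "CreatedAt" configured via builder.Property<DateTime>("CreatedAt") in OnModelCreating:

```csharp
// entities must be tracked so EF Core can resolve the shadow property values
ctx.AttachRange(customersToInsert);

foreach (var customer in customersToInsert)
   ctx.Entry(customer).Property("CreatedAt").CurrentValue = DateTime.UtcNow;

await ctx.BulkInsertAsync(customersToInsert);
```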
