.NET database big data operation solution (insert, update, delete, query, insert or update)

1. Function introduction

For massive data operations, an ORM's performance bottleneck is entity conversion, and the required throughput cannot be reached with regular SQL statements. The more columns there are, the slower the conversion becomes. SqlSugar pushes conversion performance to the limit and uses each database's best bulk API to reach extreme performance. Of course, once you understand the principle you do not need SqlSugar; the same approach can be implemented with other ORMs.

BulkCopy

BulkCopy is a technique for efficiently inserting large amounts of data into a database in bulk. The principle is to use the high-performance data import interface provided by the database system to write the data in batches at once, instead of inserting rows one by one. This is much faster than row-by-row inserts because it reduces the number of database round trips and therefore the system overhead.

db.Fastest<Order>().BulkCopy(lstData);
db.Fastest<Order>().PageSize(100000).BulkCopy(insertObjs);
// In WinForms, wrap the call in Task.Run; the underlying implementation is asynchronous.
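For a runnable picture, here is a minimal sketch that builds the client and the data before calling BulkCopy. The connection string, the Order entity's fields, and the row count are assumptions for illustration, not from the original text:

// Minimal sketch, assuming a SqlSugar setup and an Order entity with Name/CreateTime.
var db = new SqlSugarClient(new ConnectionConfig
{
    ConnectionString = "...your connection string...", // assumed placeholder
    DbType = DbType.SqlServer,
    IsAutoCloseConnection = true
});

var lstData = new List<Order>();
for (int i = 0; i < 300000; i++)
    lstData.Add(new Order { Name = "order" + i, CreateTime = DateTime.Now }); // assumed fields

// One bulk import instead of 300,000 single-row INSERTs.
db.Fastest<Order>().PageSize(100000).BulkCopy(lstData);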
 

 

BulkUpdate

BulkCopy is used to quickly import the data into a temporary table. The rows are then updated through an efficient set-based operation between tables (such as an UPDATE statement joining the two) instead of being processed one by one. This reduces the number of database operations and the system overhead, improving update efficiency. Finally, the temporary table is cleaned up promptly to keep the data consistent and the database performance stable. This strategy significantly improves performance and efficiency for large-scale data updates.

db.Fastest<Order>().BulkUpdate(GetList()); // update
db.Fastest<Order>().PageSize(100000).BulkUpdate(GetList()); // update
// In WinForms, wrap the call in Task.Run; the underlying implementation is asynchronous.
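To make the principle concrete, the rough shape of what happens under the hood is sketched below with raw SQL via db.Ado. Table and column names are assumed, SqlSugar generates the real statements itself, and temp tables are connection-scoped, so this pattern needs one connection kept open:

// 1. Stage the changed rows: create an empty temp table, then BulkCopy into it.
db.Ado.ExecuteCommand("SELECT TOP 0 * INTO #OrderTemp FROM [Order]");
// ... BulkCopy the changed rows into #OrderTemp ...

// 2. One set-based UPDATE joins the temp table to the target table.
db.Ado.ExecuteCommand(@"UPDATE t SET t.Name = s.Name, t.Price = s.Price
                        FROM [Order] t INNER JOIN #OrderTemp s ON t.Id = s.Id");

// 3. Clean up promptly, as the text above recommends.
db.Ado.ExecuteCommand("DROP TABLE #OrderTemp");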
 

 

BulkMerge (5.1.4.109)

Big data: insert or update

Principle: Oracle and SQL Server use MERGE INTO + BulkCopy; other databases use paging + BulkCopy + query verification.

db.Fastest<Order>().BulkMerge(list);
db.Fastest<Order>().PageSize(100000).BulkMerge(list);
// In WinForms, wrap the call in Task.Run; the underlying implementation is asynchronous.
 

1. Use BulkCopy to batch-import the data to be updated into a temporary table (or a temporary memory table).
2. Use a MERGE INTO statement to compare and merge the temporary table against the target table: the match condition decides whether a row is updated, and the columns to update can be specified in the MERGE INTO statement itself (see the sketch after this list).
3. A single batch operation then updates the data at scale, reducing the number of database operations and improving performance.

This method combines bulk import with flexible conditional updates; it suits scenarios that need large-scale upserts and provides an efficient, fast data update solution.
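For reference, the MERGE step on SQL Server roughly takes the following shape. The table and column names are assumed for illustration; SqlSugar builds the real statement internally:

// Upsert in one statement: matched rows are updated, unmatched rows inserted.
db.Ado.ExecuteCommand(@"MERGE [Order] AS t
    USING #OrderTemp AS s ON t.Id = s.Id
    WHEN MATCHED THEN
        UPDATE SET t.Name = s.Name, t.Price = s.Price
    WHEN NOT MATCHED THEN
        INSERT (Id, Name, Price) VALUES (s.Id, s.Name, s.Price);");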

 

BulkQuery

Purely manual mapping plus Emit binding avoids the boxing and unboxing caused by type mismatches; after warm-up, conversion reaches native-level performance.

db.Queryable<Order>().ToList();
// Paged reading keeps memory low and suits complex DTO conversion and export:
List<Order> order = new List<Order>();
db.Queryable<Order>().ForEach(it => { order.Add(it); /* Do not operate on the database here: this callback runs inside a loop. */ }, 2000);

BulkDelete

Delete operations are slow for several reasons: transaction processing, index maintenance, lock contention, large data volumes, triggers, and hardware limits. Deleting in batches improves performance.

db.Deleteable(list).PageSize(1000).ExecuteCommand();
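The same idea expressed directly in T-SQL: delete in fixed-size chunks so each transaction and log write stays small. The filter column and cutoff below are assumed for illustration (SQL Server syntax):

int affected;
do
{
    // DELETE TOP (n) removes at most n rows per round trip.
    affected = db.Ado.ExecuteCommand(
        "DELETE TOP (1000) FROM [Order] WHERE CreateTime < @cutoff",
        new { cutoff = DateTime.Today.AddYears(-1) });
} while (affected > 0); // loop until no matching rows remain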
 

 

Select INTO

Imports between tables: if both tables are in the same database and the target table already exists, this is faster than BulkCopy.

//Example 1: Select Into with a different entity
db.Queryable<Order>()
  //.IgnoreColumns(it => it.Id) // If Id is auto-increment it can be ignored, but the new Id values will differ.
  .Select(it => new { name = it.name, ...... })
  .IntoTable<OrderBackup>(); // OrderBackup: assumed target entity (the original generic argument was lost)

//Example 2: the same entity into a different table
db.Queryable<Order>()
  //.IgnoreColumns(it => it.Id) // If Id is auto-increment it can be ignored, but the new Id values will differ.
  .IntoTable<Order>("New table name");
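Under the hood, the second example amounts to a single set-based statement, roughly like the following when the target table already exists (column list assumed; the real SQL is generated by SqlSugar):

// Equivalent hand-written SQL for an import between existing tables.
db.Ado.ExecuteCommand(@"INSERT INTO [New table name] (name)
                        SELECT name FROM [Order]");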
 

SqlBulkCopy Class

SqlBulkCopy is a class provided by .NET for high-performance bulk data operations. Combining it with an ORM framework for big-data inserts and updates can deliver even higher performance. Here are some optimization principles and best practices:

1. Use transactions:

Run the SqlBulkCopy operation inside a database transaction. That way, if an error occurs during the insert or update, the entire operation can be rolled back, keeping the data consistent.
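A minimal sketch of this pattern, assuming Microsoft.Data.SqlClient, a target table dbo.[Order], and a DataTable named dataTable prepared elsewhere:

using Microsoft.Data.SqlClient; // System.Data.SqlClient works the same way

using var conn = new SqlConnection(connectionString);
conn.Open();
using var tran = conn.BeginTransaction();
try
{
    // Tie the bulk copy to the transaction so all rows commit or roll back together.
    using (var bulk = new SqlBulkCopy(conn, SqlBulkCopyOptions.Default, tran))
    {
        bulk.DestinationTableName = "dbo.[Order]"; // assumed target table
        bulk.WriteToServer(dataTable);             // dataTable prepared elsewhere
    }
    tran.Commit();
}
catch
{
    tran.Rollback(); // any failure undoes the entire import
    throw;
}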

2. Adjust BatchSize:

BatchSize is the number of rows per batch. Adjust it according to database performance and network bandwidth; a reasonable batch size minimizes round trips to the database and improves throughput.

BulkCopyTimeout is the timeout in seconds; if the operation does not finish within that time, it is aborted. Adjust it according to the data volume and network conditions.
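Both knobs are plain properties on SqlBulkCopy; the values below are illustrative starting points, not recommendations from the original text:

bulk.BatchSize = 20000;     // rows per batch; larger batches mean fewer round trips
bulk.BulkCopyTimeout = 120; // seconds; 0 means no time limit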

3. Use a temporary table:

Sometimes it is faster to insert the data into a temporary table first and then run a SQL statement that copies it from the temporary table into the target table. This also improves performance, because temporary tables are not subject to the target table's constraints and indexes, so inserts into them are faster.
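A sketch of the pattern, reusing the conn and tran from the earlier example; the staging-table layout is assumed, and explicit column lists are safer in practice than SELECT *:

// 1. Create an empty staging table with the same structure as the target.
new SqlCommand("SELECT TOP 0 * INTO #Staging FROM dbo.[Order]", conn, tran).ExecuteNonQuery();

// 2. Bulk-load into the staging table, which has no constraints or user indexes.
using (var bulk = new SqlBulkCopy(conn, SqlBulkCopyOptions.Default, tran))
{
    bulk.DestinationTableName = "#Staging";
    bulk.WriteToServer(dataTable);
}

// 3. One set-based statement moves the rows into the real table.
new SqlCommand("INSERT INTO dbo.[Order] SELECT * FROM #Staging", conn, tran).ExecuteNonQuery();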
4. Pay attention to memory usage:

Big-data operations require attention to memory usage to prevent out-of-memory failures. Consider processing the data in batches instead of loading it all into memory at once.
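One concrete way to keep memory flat is streaming rows from an IDataReader instead of buffering a whole DataTable; the source query and names below are assumed:

using var srcConn = new SqlConnection(sourceConnectionString);
srcConn.Open();
using var cmd = new SqlCommand("SELECT Id, Name FROM dbo.SourceOrders", srcConn);
using var reader = cmd.ExecuteReader();

using var bulk = new SqlBulkCopy(targetConnectionString)
{
    DestinationTableName = "dbo.[Order]",
    EnableStreaming = true, // rows flow through without buffering everything in memory
    BatchSize = 10000
};
bulk.WriteToServer(reader); // pulls from the reader row by row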
In summary, when combining SqlBulkCopy with an ORM framework, big-data insert and update performance can be maximized by tuning parameters, using transactions sensibly, and temporarily disabling indexes and constraints.

More usage: see the SqlSugar 5.x NuGet installation guide on the official SqlSugar documentation site (果糖网).
