Find failed row/rows on SqlBulkCopy.WriteToServer exception and retry SqlBulkCopy by omitting those rows that failed

I am trying to do a bulk insert of a list of objects (List<Invoice> invoices). Sometimes it fails and throws an exception. However, I want to know which rows failed, so that I can redo the bulk insert by omitting those rows. Can I do this?
Asked by user1424876 · 2.4k views

1 Answer
This is one of the drawbacks of using bulk operations: the feedback is an all-or-nothing response.
SqlBulkCopy is deliberately designed for use with sanitised data, so you should first consider how to sanitise your data before you try to copy it. This can take many forms, so we can't cover everything in this post.
The most common constraints that fail are NOT NULL constraints (null values in columns that do not allow them) and foreign keys (values that are null or have no match in the referenced table). Usually you can pre-validate the bulk data for these: query your bulk set for rows that have null values in non-nullable columns, and for rows whose foreign key values do not yet exist in the target database.
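As an illustration, a pre-validation pass over a List<Invoice> could look something like the sketch below. The Invoice class, its InvoiceNumber / CustomerId properties and the dbo.Customers lookup are assumptions made for this example only; substitute your own columns and constraints.

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Linq;

// Hypothetical Invoice shape, used only for this example.
public class Invoice
{
    public string InvoiceNumber { get; set; }   // assumed NOT NULL in the target table
    public int CustomerId { get; set; }         // assumed FK to dbo.Customers
    public decimal Amount { get; set; }
}

public static class InvoiceValidator
{
    // Returns the rows that would violate a NOT NULL or foreign key constraint,
    // so they can be removed (or fixed) before calling WriteToServer.
    public static List<Invoice> FindInvalidRows(List<Invoice> invoices, string connectionString)
    {
        // Load the existing foreign key values once, up front.
        var knownCustomerIds = new HashSet<int>();
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("SELECT CustomerId FROM dbo.Customers", conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    knownCustomerIds.Add(reader.GetInt32(0));
            }
        }

        return invoices
            .Where(i => i.InvoiceNumber == null                   // NOT NULL violation
                     || !knownCustomerIds.Contains(i.CustomerId)) // FK violation
            .ToList();
    }
}
```

Remove the returned rows from your list (or repair them) before running the bulk copy.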
If you are approaching this from a generic point of view, where you do not know the table schema in advance, the conceptual process we usually follow is to break the bulk set down into smaller chunks and execute those chunks separately.
In your interface, allow the user to specify the start row and the number of rows to copy; if the chunk succeeds, remove those rows from the source set, and if it fails, ask the user to try again with a different range.
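A minimal sketch of that chunked approach, assuming your rows are already in a DataTable whose columns line up with the destination table (the table name and chunk size are placeholders):

```csharp
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.Linq;

public static class ChunkedBulkCopy
{
    // Copies the table in fixed-size chunks and returns the rows belonging to
    // any chunk that was rejected, so they can be retried or inspected later.
    public static List<DataRow> CopyInChunks(
        DataTable table, string connectionString, string destinationTable, int chunkSize = 1000)
    {
        var failedRows = new List<DataRow>();
        var rows = table.Rows.Cast<DataRow>().ToArray();

        for (int start = 0; start < rows.Length; start += chunkSize)
        {
            var chunk = rows.Skip(start).Take(chunkSize).ToArray();
            try
            {
                using (var bulk = new SqlBulkCopy(connectionString))
                {
                    bulk.DestinationTableName = destinationTable;
                    bulk.WriteToServer(chunk);   // WriteToServer accepts a DataRow[]
                }
            }
            catch (Exception)
            {
                // The whole chunk was rejected; keep it so it can be narrowed
                // down further (for example row by row) or shown to the user.
                failedRows.AddRange(chunk);
            }
        }
        return failedRows;
    }
}
```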
Your last option is to not do this in bulk at all! You can still use SqlBulkCopy, but send only one row at a time; this allows you to handle each row individually when it fails. If you were using SqlBulkCopy for performance reasons (there are of course other, non-performance reasons to use it), then all of that performance is lost with this method. However, if failures are infrequent, then first trying the full bulk operation and falling back to row-by-row on failure is an option. The Code Project article "Retrieving failed records after an SqlBulkCopy exception" explains one solution to assist with this, but it should be pretty easy for you to come up with your own implementation.
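Here is a rough sketch of that fallback, under the same DataTable assumption as above. With the default settings (BatchSize of 0 and no external transaction) a failed WriteToServer should not leave a partial batch behind, but verify that for your configuration before retrying rows:

```csharp
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

public static class RowByRowFallback
{
    // Tries the whole table in one bulk copy first; only if that fails does it
    // send one row at a time, returning the rows that still fail on their own.
    public static List<DataRow> Copy(DataTable table, string connectionString, string destinationTable)
    {
        var failed = new List<DataRow>();
        try
        {
            using (var bulk = new SqlBulkCopy(connectionString))
            {
                bulk.DestinationTableName = destinationTable;
                bulk.WriteToServer(table);
            }
            return failed;   // fast path succeeded, nothing to report
        }
        catch (Exception)
        {
            // Fall through to the slow, row-by-row path below.
        }

        foreach (DataRow row in table.Rows)
        {
            try
            {
                using (var bulk = new SqlBulkCopy(connectionString))
                {
                    bulk.DestinationTableName = destinationTable;
                    bulk.WriteToServer(new[] { row });   // single-row "bulk" copy
                }
            }
            catch (Exception)
            {
                failed.Add(row);   // this specific row is a culprit
            }
        }
        return failed;
    }
}
```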
You could combine the two approaches: try the whole lot first, then on failure split the table into a number of sub-tables, and keep recursively trying and splitting until you reach tables of one row. This mirrors the process of elimination a user could perform manually, and it still retains some performance benefit over going row by row from the start, but it is only advisable for large sets with relatively low failure rates.
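A sketch of that combined approach, using a simple halving recursion over a DataRow array (same placeholder names as the earlier sketches):

```csharp
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.Linq;

public static class BisectingBulkCopy
{
    // Tries the whole set; on failure splits it in half and recurses, so good
    // rows still go in large batches and only the bad rows end up isolated.
    public static List<DataRow> CopyOrIsolate(DataRow[] rows, string connectionString, string destinationTable)
    {
        var failed = new List<DataRow>();
        if (rows.Length == 0)
            return failed;

        try
        {
            using (var bulk = new SqlBulkCopy(connectionString))
            {
                bulk.DestinationTableName = destinationTable;
                bulk.WriteToServer(rows);
            }
        }
        catch (Exception)
        {
            if (rows.Length == 1)
            {
                failed.Add(rows[0]);   // a single row that still fails is a genuinely bad row
            }
            else
            {
                int half = rows.Length / 2;
                failed.AddRange(CopyOrIsolate(rows.Take(half).ToArray(), connectionString, destinationTable));
                failed.AddRange(CopyOrIsolate(rows.Skip(half).ToArray(), connectionString, destinationTable));
            }
        }
        return failed;
    }
}
```

Call it with something like table.Rows.Cast<DataRow>().ToArray(); whatever it returns are the rows that failed even on their own, which is exactly the set you wanted to omit from the retry.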