You can work around this by creating a new instance of the DataContext for each insert operation:
// Workaround: each insert goes through its own DataContext instance, so the
// entity cache built up by the first insert cannot interfere with the second.
var db = new DataClasses1DataContext(@"Data Source=");
var testTableRecord1 = new testTable();
db.GetTable<testTable>().InsertOnSubmit(testTableRecord1);
db.SubmitChanges();

// Fresh context for the second insert.
db = new DataClasses1DataContext(@"Data Source=");
var testTableRecord2 = new testTable();
db.GetTable<testTable>().InsertOnSubmit(testTableRecord2);
db.SubmitChanges();
If you want to understand what is happening, try putting a breakpoint inside the partial InsertTestTable method: you'll see that it isn't called after the second call to SubmitChanges(). Each instance of the data context maintains a cache of every entity it has inserted, updated, or retrieved. The cache behaves like a dictionary keyed on the entity's primary key, and LINQ to SQL only runs the custom insert logic once per cached entity. In my opinion this is a bug in LINQ to SQL, and I cannot find any documentation that justifies the behavior.
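For reference, the hook in question is the designer-generated partial method on the data context. A minimal sketch of where to set that breakpoint (the name InsertTestTable comes from the question; delegating to ExecuteDynamicInsert is just one possible body for the custom logic):

public partial class DataClasses1DataContext
{
    // Breakpoint here: this fires on the first SubmitChanges() but not the second.
    partial void InsertTestTable(testTable instance)
    {
        // Custom insert logic would normally go here; falling back to the
        // default dynamic insert keeps the sketch minimal.
        this.ExecuteDynamicInsert(instance);
    }
}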
To see what I mean, set the id of each entity to a distinct value, and you'll find that the custom insert logic runs for both inserts even with a single data context:
var db = new DataClasses1DataContext(@"Data Source=");

// Distinct primary keys mean the cache treats these as separate entities.
var testTableRecord1 = new testTable() { id = -1 };
var testTableRecord2 = new testTable() { id = -2 };

db.GetTable<testTable>().InsertOnSubmit(testTableRecord1);
db.SubmitChanges();
db.GetTable<testTable>().InsertOnSubmit(testTableRecord2);
db.SubmitChanges();
My advice is to create a new instance of the data context before each call to SubmitChanges(), or, better still, to batch your inserts into a single SubmitChanges() call when possible. When using LINQ to SQL, the data context should generally be treated as a short-lived, disposable object.
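A sketch of the batched approach, reusing the names from the snippets above (the distinct ids follow the second example; InsertAllOnSubmit simply queues every record for a single submit):

// Queue both records on one context and submit once.
using (var db = new DataClasses1DataContext(@"Data Source="))
{
    var records = new[]
    {
        new testTable() { id = -1 },
        new testTable() { id = -2 }
    };
    db.GetTable<testTable>().InsertAllOnSubmit(records);
    db.SubmitChanges();
}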