
c# - Thread safety for DataTable

I have read this answer, ADO.NET DataTable/DataRow Thread Safety, and can't understand some things. In particular, I can't understand point [2]. What kind of wrapper do I need to use? Can anyone give an example?

I also can't understand what the author means when talking about cascading locks and a full lock. An example of that would help too.


1 Reply


DataTable is simply not designed or intended for concurrent usage (in particular where there is any form of mutation involved). The advisable "wrapper" here would, in my view, be either:

  • remove the need to work on the DataTable concurrently (when involving mutation), or:
  • remove the DataTable, instead using a data-structure that either directly supports what you need (for example a concurrent collection), or which is much simpler and can be trivially synchronized (either exclusive or reader/writer)

Basically: change the problem.


From comments:

The code looks like:

Parallel.ForEach(strings, str =>
{
    DataRow row;
    lock (table) {
        row = table.NewRow();
    }
    MyParser.Parse(str, out row);
    lock (table) {
        table.Rows.Add(row);
    }
});

I can only hope that out row is a typo here, as that won't actually lead to it populating the row created via NewRow(), but: if you absolutely have to use that approach, you can't use NewRow, as the pending row is kinda shared. Your best bet would be:

Parallel.ForEach(strings, str => {
    object[] values = MyParser.Parse(str);
    lock (table) {
        table.Rows.Add(values);
    }
});

The important change in the above is that the lock covers the entire new row process. Note that you will have no guarantee of order when using Parallel.ForEach like this, so it is important that the final order does not need to match exactly (which shouldn't be a problem if the data includes a time component).
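
For comparison, the "concurrent collection" route mentioned at the top could look something like this minimal sketch; it reuses the strings, table and MyParser.Parse names from the code above, and assumes Parse can return the parsed values as an object[]:

var results = new ConcurrentBag<object[]>(); // System.Collections.Concurrent
Parallel.ForEach(strings, str =>
{
    // parse on many threads; ConcurrentBag.Add is thread-safe
    results.Add(MyParser.Parse(str));
});
// back on a single thread: no lock needed to touch the DataTable
foreach (var values in results)
{
    table.Rows.Add(values);
}

Note that this still buffers everything in memory and still loses the original order, which is part of why the streaming approach below is usually the better option.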

However! I still think you are approaching this the wrong way: for parallelism to be relevant, it must be non-trivial data. If you have non-trivial data, you really don't want to have to buffer it all in memory. I strongly suggest doing something like the following, which will work fine on a single thread:

// ObjectReader is from the FastMember package; connectionString is assumed to exist
using(var bcp = new SqlBulkCopy(connectionString))
using(var reader = ObjectReader.Create(ParseFile(path)))
{
    bcp.DestinationTableName = "MyLog";
    bcp.WriteToServer(reader);
}
...
static IEnumerable<LogRow> ParseFile(string path)
{
    using(var reader = File.OpenText(path))
    {
        string line;
        while((line = reader.ReadLine()) != null)
        {
            yield return new LogRow {
                // TODO: populate the row from line here
            };
        }
    }
}
...
public sealed class LogRow {
    /* define your schema here */
}

Advantages:

  • no buffering - this is a fully streaming operation (yield return does not put things into a list or similar)
  • for that reason, the rows can start streaming immediately without needing to wait for the entire file to be pre-processed first
  • no memory saturation issues
  • no threading complications / overheads
  • you get to preserve the original order (not usually critical, but nice)
  • you are only constrained by how fast you can read the original file, which is typically faster on a single thread than it is from multiple threads (contention on a single IO device is just overhead)
  • avoids all the overheads of DataTable, which is overkill here - because it is so flexible it has significant overheads
  • read (from the log file) and write (to the database) are now concurrent rather than sequential

I do a lot of things like ^^^ in my own work, and from experience it is usually at least twice as fast as populating a DataTable in memory first.


And finally - here's an example of an IEnumerable<T> implementation that accepts concurrent readers and writers without requiring everything to be buffered in memory - which would allow multiple threads to parse the data (calling Add and finally Close) with a single thread for SqlBulkCopy via the IEnumerable<T> API:

using System;
using System.Collections;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

/// <summary>
/// Acts as a container for concurrent read/write flushing (for example, parsing a
/// file while concurrently uploading the contents); supports any number of concurrent
/// writers and readers, but note that each item will only be returned once (and once
/// fetched, is discarded). It is necessary to Close() the bucket after adding the last
/// of the data, otherwise any iterators will never finish
/// </summary>
class ThreadSafeBucket<T> : IEnumerable<T>
{
    private readonly Queue<T> queue = new Queue<T>();

    public void Add(T value)
    {
        lock (queue)
        {
            if (closed) // no more data once closed
                throw new InvalidOperationException("The bucket has been marked as closed");

            queue.Enqueue(value);
            if (queue.Count == 1)
            { // someone may be waiting for data
                Monitor.PulseAll(queue);
            }
        }
    }

    public void Close()
    {
        lock (queue)
        {
            closed = true;
            Monitor.PulseAll(queue);
        }
    }
    private bool closed;

    public IEnumerator<T> GetEnumerator()
    {
        while (true)
        {
            T value;
            lock (queue)
            {
                if (queue.Count == 0)
                {
                    // no data; should we expect any?
                    if (closed) yield break; // nothing more ever coming

                    // else wait to be woken, and redo from start
                    Monitor.Wait(queue);
                    continue;
                }
                value = queue.Dequeue();
            }
            // yield it **outside** of the lock
            yield return value;
        }
    }

    IEnumerator IEnumerable.GetEnumerator()
    {
        return GetEnumerator();
    }
}

static class Program
{
    static void Main()
    {
        var bucket = new ThreadSafeBucket<int>();
        int expectedTotal = 0;
        ThreadPool.QueueUserWorkItem(delegate
        {
            int count = 0, sum = 0;
            foreach(var item in bucket)
            {
                count++;
                sum += item;
                if ((count % 100) == 0)
                    Console.WriteLine("After {0}: {1}", count, sum);
            }
            Console.WriteLine("Total over {0}: {1}", count, sum);
        });
        Parallel.For(0, 5000,
            new ParallelOptions { MaxDegreeOfParallelism = 3 },
            i => {
                bucket.Add(i);
                Interlocked.Add(ref expectedTotal, i);
            }
        );
        Console.WriteLine("all data added; closing bucket");
        bucket.Close();
        Thread.Sleep(100);
        Console.WriteLine("expecting total: {0}",
            Interlocked.CompareExchange(ref expectedTotal, 0, 0));
        Console.ReadLine();
    }

}
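
Wiring that bucket up to SqlBulkCopy might then look something like the sketch below (ObjectReader is again from the FastMember package; connectionString, path and a ParseLine(string) => LogRow helper are assumed, and System.IO plus System.Data.SqlClient are also needed; "MyLog" is the same destination table as in the earlier example):

var bucket = new ThreadSafeBucket<LogRow>();
// parse on several threads, feeding the bucket
var parsing = Task.Run(() =>
{
    Parallel.ForEach(File.ReadLines(path), line =>
    {
        bucket.Add(ParseLine(line)); // ParseLine is a hypothetical parser for one log line
    });
    bucket.Close(); // tell the enumerator that no more rows are coming
});
// bulk copy on this thread, pulling rows from the bucket as they arrive
using (var bcp = new SqlBulkCopy(connectionString))
using (var reader = ObjectReader.Create(bucket))
{
    bcp.DestinationTableName = "MyLog";
    bcp.WriteToServer(reader);
}
parsing.Wait();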
