I have a client .NET application and a server .NET application, connected through sockets.
The client sends a string of 20 or so characters every 500 milliseconds.
On my local development machine this works perfectly, but once the client and the server run on two different machines, the server no longer receives the string immediately after it is sent. The client still sends correctly; I've confirmed this with Wireshark, which also shows that the strings do arrive at the server machine every 500 milliseconds.
The problem is that my server application, which is waiting for the message, only actually receives the data every 20 seconds or so - and then it receives all the content accumulated over those 20 seconds at once.
I use asynchronous sockets, and for some reason the callback is simply not invoked more than once every 20 seconds.
In AcceptCallback, the server accepts the connection and calls BeginReceive:

handler.BeginReceive(state.buffer, 0, StateObject.BufferSize, 0, new AsyncCallback(ReadCallback), state);
This works on my local machine, but on my production server ReadCallback is not invoked immediately.
BufferSize is set to 1024. I also tried setting it to 10. That changes how much data is read from the socket at a time once ReadCallback is invoked, but that's not really the problem here. Once ReadCallback is invoked, the rest works fine.
I'm using Microsoft's Asynchronous Server Socket Example, so you can see there what my ReadCallback method looks like.
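For context, the relevant part of that example looks roughly like this. This is a sketch of the AcceptCallback/ReadCallback pattern from the Microsoft sample, not my exact code; the StateObject fields and string handling follow the sample's conventions:

```csharp
using System;
using System.Net.Sockets;
using System.Text;

public class StateObject
{
    public const int BufferSize = 1024;
    public byte[] buffer = new byte[BufferSize];
    public Socket workSocket = null;
    public StringBuilder sb = new StringBuilder();
}

public static class Server
{
    public static void AcceptCallback(IAsyncResult ar)
    {
        // Complete the accept and get the socket for this client.
        Socket listener = (Socket)ar.AsyncState;
        Socket handler = listener.EndAccept(ar);

        StateObject state = new StateObject();
        state.workSocket = handler;
        // Start an asynchronous receive; ReadCallback should fire when data arrives.
        handler.BeginReceive(state.buffer, 0, StateObject.BufferSize, 0,
            new AsyncCallback(ReadCallback), state);
    }

    public static void ReadCallback(IAsyncResult ar)
    {
        StateObject state = (StateObject)ar.AsyncState;
        Socket handler = state.workSocket;

        int bytesRead = handler.EndReceive(ar);
        if (bytesRead > 0)
        {
            // Accumulate the received bytes as text.
            state.sb.Append(Encoding.ASCII.GetString(state.buffer, 0, bytesRead));
            // ... process the received content here ...

            // Re-arm the receive for the next message.
            handler.BeginReceive(state.buffer, 0, StateObject.BufferSize, 0,
                new AsyncCallback(ReadCallback), state);
        }
    }
}
```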
How can I get the BeginReceive callback to fire immediately when data arrives at the server?
--
UPDATE
This has been solved. The cause was that the server had a single processor with a single core. After adding another core, the problem was instantly solved. ReadCallback is now invoked immediately when the data reaches the server.
Thank you all for your suggestions!