I've got a Java program that opens a TCP connection to a listening port on a remote server. I send a request to the server and receive a response. I then let the stream sit idle for 60 minutes. At that point, if I write a new request, it never arrives at the server, and in short order TCP/IP lets me know that the connection has gone away.
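Here's a stripped-down sketch of what the client does (hostname, port, and wire format are placeholders, not my real setup):

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

public class IdleClient {
    public static void main(String[] args) throws Exception {
        // Placeholder host/port standing in for the real server in Canada.
        try (Socket socket = new Socket("server.example.com", 9000)) {
            OutputStream out = socket.getOutputStream();
            InputStream in = socket.getInputStream();

            // First request/response pair works fine.
            out.write("REQUEST 1\n".getBytes(StandardCharsets.US_ASCII));
            out.flush();
            byte[] buf = new byte[1024];
            System.out.println("Got " + in.read(buf) + " bytes back");

            // Let the connection sit idle for 60 minutes.
            Thread.sleep(60L * 60L * 1000L);

            // This write "succeeds" locally (it only fills the socket's send
            // buffer), but the request never reaches the server; the failure
            // surfaces shortly afterward as an IOException on a later read
            // or write.
            out.write("REQUEST 2\n".getBytes(StandardCharsets.US_ASCII));
            out.flush();
            System.out.println("Got " + in.read(buf) + " bytes back");
        } catch (IOException e) {
            System.out.println("Connection died: " + e);
        }
    }
}
```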
My client code runs on a Windows laptop that is connected to a corporate environment via a VPN router. The server is whirring away up in Canada, far away from me here in central Massachusetts, USA, so I'm likely being routed through multiple pieces of networking equipment, and I have no idea which one is causing the stream to break. (I keep thinking of Ghostbusters and "Don't cross the streams!")
What is the best term for when a piece of equipment specifically "forgets" about an idle TCP connection, causing it to break? Is that half-open, half-closed, or just plain gone?
I want to be able to simulate this timeout scenario entirely within my home lab so that testing is easier -- for example, so that I don't have to wait 60 minutes! What's a good technique, and what's the appropriate equipment to simulate this "disconnect"? I've got extra switches here at home, as well as one old (and feisty!) WRT router that could use some lovin'.
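One idea I had for a quick, software-only approximation is a tiny man-in-the-middle proxy that simply stops forwarding once the connection has been idle too long. It isn't faithful at the TCP level -- the proxy's own stack still ACKs my packets, whereas a real middlebox would leave my retransmits unanswered, so an iptables DROP rule on the WRT would be closer -- but it might be enough for exercising application-level handling. A rough sketch, with made-up names and a 30-second limit in place of the 60 minutes:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;
import java.util.concurrent.atomic.AtomicLong;

/**
 * Toy "forgetful firewall": a proxy that silently stops forwarding once
 * the connection has been idle longer than IDLE_LIMIT_MS. Neither side
 * gets a FIN or RST; they just hear silence.
 */
public class ForgetfulProxy {
    static final long IDLE_LIMIT_MS = 30_000; // 30 s instead of 60 min

    public static void main(String[] args) throws IOException {
        try (ServerSocket listener = new ServerSocket(9999)) { // client points here
            while (true) {
                Socket client = listener.accept();
                Socket server = new Socket("lab-server", 9000); // the real server
                AtomicLong lastActivity = new AtomicLong(System.currentTimeMillis());
                pump(client, server, lastActivity);
                pump(server, client, lastActivity);
            }
        }
    }

    // Copy bytes from one socket to the other until the idle limit is hit.
    static void pump(Socket from, Socket to, AtomicLong lastActivity) {
        new Thread(() -> {
            byte[] buf = new byte[4096];
            try {
                InputStream in = from.getInputStream();
                OutputStream out = to.getOutputStream();
                int n;
                while ((n = in.read(buf)) != -1) {
                    if (System.currentTimeMillis() - lastActivity.get() > IDLE_LIMIT_MS) {
                        // "Forget" the flow: keep both sockets open but
                        // discard everything from here on, so the endpoints
                        // see only silence.
                        while (in.read(buf) != -1) { /* black hole */ }
                        return;
                    }
                    lastActivity.set(System.currentTimeMillis());
                    out.write(buf, 0, n);
                    out.flush();
                }
            } catch (IOException ignored) {
                // Sketch-level cleanup only.
            }
        }).start();
    }
}
```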
I do not want to enable keepalive to mask the problem; AFAIK, keepalive won't prevent every possible stream-disconnection scenario anyway. I want to do the best I can at letting the problem occur and then handling it quickly and cleanly when it does.
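For the handling side, this is roughly what I have in mind so far: a bounded read timeout so a black-holed connection can't hang me forever, plus tear-down-and-reconnect on any I/O error. (Names and timeouts are placeholders, and the single retry assumes my requests are idempotent.)

```java
import java.io.IOException;
import java.io.OutputStream;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

public class ResilientClient {
    private Socket socket;

    private void connect() throws IOException {
        socket = new Socket("server.example.com", 9000); // placeholder
        // Bound every read: a dead connection then surfaces as a
        // SocketTimeoutException instead of a read that blocks forever.
        socket.setSoTimeout(10_000);
    }

    public String request(String req) throws IOException {
        if (socket == null || socket.isClosed()) {
            connect();
        }
        try {
            return sendAndReceive(req);
        } catch (IOException firstFailure) {
            // The connection is suspect: drop it and retry exactly once on
            // a fresh socket (only safe if the request is idempotent).
            closeQuietly();
            connect();
            return sendAndReceive(req);
        }
    }

    private String sendAndReceive(String req) throws IOException {
        OutputStream out = socket.getOutputStream();
        out.write((req + "\n").getBytes(StandardCharsets.US_ASCII));
        out.flush();
        byte[] buf = new byte[1024];
        int n = socket.getInputStream().read(buf);
        if (n == -1) {
            throw new IOException("server closed the connection");
        }
        return new String(buf, 0, n, StandardCharsets.US_ASCII);
    }

    private void closeQuietly() {
        try { socket.close(); } catch (IOException ignored) { }
        socket = null;
    }
}
```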
Thank you very much,
Bill S