DatagramSocket.receive() not working in Oracle JServer
I've put my working (under JDK 1.2.2) Java code into Oracle (via loadjava etc.).
It is a small client application which opens a DatagramSocket
connection to the authorisation server, sends a password and receives
an answer: 0 = password OK, 1 = password wrong.
With a small timeout value (datagramsocket.setSoTimeout(timeout)) I received an InterruptedIOException, i.e. a timeout: not enough time to receive data from the server. So I increased the timeout to 500 milliseconds, and that is where my problem starts.

My Java application stops responding (it hangs) exactly at the line where I read the answer from the authorisation server (datagramsocket.receive(datagrampacket)). On the console of the authorisation server I see the request, and I see that the answer (password correct or not) was sent back to the client (the Java application stored in Oracle), but my Java client never returns this answer, because it loops (or waits) forever.

If I set a dummy IP address (not the authorisation server's), the application waits exactly the time I specified and then I receive an InterruptedIOException (timeout). So the client only stops responding in the case where the server actually sends a response back.
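For clarity, here is a minimal sketch of the kind of client described above. The class name, host, port, and one-byte reply layout are my assumptions for illustration, not the original code:

```java
import java.io.InterruptedIOException;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

// Hypothetical reconstruction of the UDP password-check client.
public class AuthClient {
    public static int checkPassword(String host, int port, String password)
            throws Exception {
        DatagramSocket socket = new DatagramSocket();
        try {
            // 500 ms timeout, as in the post.
            socket.setSoTimeout(500);

            // Send the password to the authorisation server.
            byte[] request = password.getBytes();
            socket.send(new DatagramPacket(
                    request, request.length,
                    InetAddress.getByName(host), port));

            // Read the one-byte answer. Under JServer this receive()
            // is where the application hangs; with an unreachable host
            // it instead throws InterruptedIOException (the pre-JDK-1.4
            // timeout exception) after the timeout elapses.
            byte[] buf = new byte[16];
            DatagramPacket in = new DatagramPacket(buf, buf.length);
            socket.receive(in);
            return buf[0]; // 0 = password OK, 1 = password wrong
        } finally {
            socket.close();
        }
    }
}
```

Outside JServer (plain JDK) this pattern works as expected, which matches the poster's statement that the code runs fine under JDK 1.2.2.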
So it seems to me that I can't read response data from the DatagramSocket. Has anyone succeeded?
I don't know whether it could be a problem with the authorisation server.
Is it a problem with privileges? (The user who calls this application has JAVASYSPRIV and JAVAUSERPRIV.)
Is it a problem with buffer space?
I don't see any exception in the UDUMP file. Is there some other log file where I should look?
Thanx
Branislav
Received on Mon Sep 17 2001 - 02:13:35 CDT