
Oracle performance in high-volume data storage.

From: Petri J. Riipinen <petri.riipinen_at_tele.fi>
Date: 1997/04/08
Message-ID: <5id95f$8mg@leivo.inet.fi>

Hi there,

We have a data collection system running 24h a day (several processes on different workstations), and we collect about 200M of data each day. At the moment we store this in a proprietary file format for size and performance reasons. Unfortunately, that makes later analysis of the data a bit difficult with any standard tools.
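
To put that figure in perspective, here is a quick back-of-the-envelope check (in Python, purely for illustration), assuming "200M" means roughly 200 MB spread evenly over the day:

    # Rough sanity check of the sustained write rate.
    bytes_per_day = 200 * 1024 * 1024
    seconds_per_day = 24 * 60 * 60
    print("sustained rate: %.1f KB/s" % (bytes_per_day / float(seconds_per_day) / 1024))
    # -> sustained rate: 2.4 KB/s, in line with the ~2.5 KB/s requirement below

So the raw rate itself is modest; the constraint is that it runs around the clock.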

However, we would now like to make accessing and analyzing the data easier. That would mean storing the data in Oracle (which is what we use elsewhere). The database would probably be running on Sun or HP-UX.

Our requirements:
- Constant database writing at a rate of about 2.5 KB per second;
  this writing must not cause any blocking of the writer processes
  (see the sketch after this list).
- Reading at the same time, in bursts (whenever someone runs the analysis tool).
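
To make the non-blocking requirement concrete, here is a minimal sketch of how a single collector process might decouple data collection from the database writes. It uses Python with the cx_Oracle driver purely for illustration; the table name, columns, credentials and flush interval are all assumptions, not an actual design. The collecting code only touches an in-memory queue, while a background thread drains the queue and does one batched INSERT per flush:

    import queue
    import threading
    import time
    import cx_Oracle  # assumption: any Oracle driver with a Python DB-API would do

    # Hypothetical table and columns.
    INSERT_SQL = ("INSERT INTO measurements (sample_time, station, payload) "
                  "VALUES (:1, :2, :3)")

    row_queue = queue.Queue()      # collectors just put() rows here and carry on

    def db_writer(dsn, flush_interval=5.0):
        """Drain the queue and write rows to Oracle in batches, so the
        collecting side never waits on the database itself."""
        conn = cx_Oracle.connect("collector", "secret", dsn)  # hypothetical credentials
        cur = conn.cursor()
        while True:
            time.sleep(flush_interval)
            batch = []
            while not row_queue.empty():
                batch.append(row_queue.get())
            if batch:
                cur.executemany(INSERT_SQL, batch)   # one round trip per batch
                conn.commit()

    threading.Thread(target=db_writer, args=("dbhost/ORCL",), daemon=True).start()
    # A collector then simply does:
    #     row_queue.put((sample_time, station, payload))

The point of the buffering is that a short database stall only grows the in-memory queue instead of stalling the collectors, which is the non-blocking behaviour we need.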

I have the feeling that Oracle should handle this without any problems.

What is your "gut" feeling about this? Should we look into the parallel-processing version? Any references to similar high-volume storage applications?

Received on Tue Apr 08 1997 - 00:00:00 CDT
