Oracle FAQ: Your Portal to the Oracle Knowledge Grid
 

Newsgroup: comp.databases.oracle.server

Measuring test vs. production db performance

From: Nick Marden <nick_at_quote.com>
Date: 1996/11/12
Message-ID: <3288FA1E.2248@quote.com>

I have the (unfun) task of measuring some system metrics on our test and production database servers and attempting to correlate the numbers. The high level question we are trying to get a handle on is, "If our test database has parameters x, y, and z (for example, CPU load, disk I/O, and swapping) with values a, b, and c, when running a certain application, what values will x, y, and z take on in our production environment, as a function of a, b, and c?"

Since our production database is a big, beefy SparcServer 1000 with 8 CPUs and 750MB of memory, we are unlikely to duplicate it in our test environment. Thus the need for a calibration of sorts for major system performance measurements. My guess is that there are well-defined scripts you can run to test each performance measurement separately (or together) in each environment, and then determine the calibration from those results.
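One way to determine the kind of calibration described above, sketched here as an illustration (not something from the original post): run the same workload at several intensities on both machines, record each metric on each box, and fit a simple least-squares mapping per metric. All names and numbers below are invented for the example.

```python
# Hypothetical sketch: calibrating test-box metrics against production
# measurements for the same workload. The data points are made-up
# illustrations, not figures from the original post.

def fit_linear_calibration(test_vals, prod_vals):
    """Least-squares fit of prod ~ slope * test + intercept for one metric."""
    n = len(test_vals)
    mean_t = sum(test_vals) / n
    mean_p = sum(prod_vals) / n
    sxx = sum((t - mean_t) ** 2 for t in test_vals)
    sxy = sum((t - mean_t) * (p - mean_p)
              for t, p in zip(test_vals, prod_vals))
    slope = sxy / sxx
    intercept = mean_p - slope * mean_t
    return slope, intercept

def predict(slope, intercept, test_val):
    """Project a production metric value from a test-box measurement."""
    return slope * test_val + intercept

# Example: CPU load observed on the test box vs. production for the
# same workload at several load levels (invented numbers; the bigger
# production box absorbs the same work with less load).
test_cpu = [0.5, 1.0, 2.0, 4.0]
prod_cpu = [0.2, 0.4, 0.8, 1.6]

slope, intercept = fit_linear_calibration(test_cpu, prod_cpu)
print(predict(slope, intercept, 3.0))  # projected production CPU load
```

A linear fit is only a first approximation; metrics like swapping tend to behave nonlinearly once memory is exhausted, so in practice you would fit each metric over the operating range you actually care about.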

Anyone know where to start (books, web pages, published research)?

All help appreciated.

Received on Tue Nov 12 1996 - 00:00:00 CST

