Oracle FAQ Your Portal to the Oracle Knowledge Grid
HOME | ASK QUESTION | ADD INFO | SEARCH | E-MAIL US
 


Re: 90GB table on Windows 2000

From: Billy Verreynne <vslabs_at_onwe.co.za>
Date: Tue, 15 Oct 2002 14:45:11 +0200
Message-ID: <aoh2ng$chb$1@ctb-nnrp2.saix.net>


Dusan Bolek wrote:

> I will just add one real world scenario. I will put no comments on
> this, nor I'm pretending that this behaviour should has some impact in
> enterprise solutions.

<snipped>

Interesting. And the real nitty-gritty that provides some foundation for making o/s and hardware comparisons.

Personally I am not surprised by your results.

There are still some issues around NT's memory management. For example, with Daytona (what was that, NT 3.5?) they removed some of the memory management from ring 0 in order to increase performance.

Another culprit is having to deal with the bloated and interfering Explorer desktop. IMO it has _no_ place on a server platform and is responsible for many lockups - e.g. the kernel and other processes may still run fine, but the desktop has crashed the GDI and USER modules to such an extent that you cannot access the machine locally.

> On the other side all WIN2000 PCs (which basicaly were just sending
> requests), exchausted all of their physical memory. Any attempt for
> running more sessions ends with program crashed with exception (OS not
> DB one).

A fine example of how Microsoft gears their caching for desktop user performance. I think that part of the reason you saw this behaviour is how Windows handles buffer caches. To optimise desktop performance, these caches are huge. For example, loading a 10MB WinWord document or Excel spreadsheet will cause that entire document to be cached. Worse, unless you manually perform a hard flush, old memory pages are not flushed to disk, causing contention between old and new pages and intense virtual memory cache swapping. This behaviour is very noticeable on Win95/98/ME.
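To make the "hard flush" idea concrete: the Win32 specifics aside (there the usual trick is opening the handle with FILE_FLAG_NO_BUFFERING or FILE_FLAG_WRITE_THROUGH), the POSIX analogue is telling the kernel explicitly to evict a file's cached pages. A minimal sketch in Python - the function name is mine, and os.posix_fadvise only exists on POSIX systems, so treat this as an illustration of the technique, not Windows code:

```python
import os
import tempfile

def write_and_drop_cache(path, data):
    """Write data to path, then ask the kernel to evict its cached pages.

    Without the fadvise call, the freshly written pages linger in the
    OS buffer cache and compete with everything else for memory - the
    behaviour described above for large WinWord/Excel files.
    """
    fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_TRUNC)
    try:
        os.write(fd, data)
        os.fsync(fd)  # make sure the pages are on disk first
        # POSIX-only "hard flush": drop this file's pages from the cache.
        os.posix_fadvise(fd, 0, 0, os.POSIX_FADV_DONTNEED)
    finally:
        os.close(fd)

with tempfile.NamedTemporaryFile(delete=False) as tmp:
    path = tmp.name
write_and_drop_cache(path, b"x" * (1 << 20))  # 1 MB of throwaway data
print(os.path.getsize(path))
os.unlink(path)
```

The point of the sketch is only that cache eviction is something the application has to ask for; by default the OS keeps everything, which is exactly the desktop-friendly, server-hostile behaviour above.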

I recall that doing something as basic as an XCOPY on NT over the network using UNC paths caused behaviour similar to what you described.

And then I also wonder how much of the bastardised memory management of Win95/98/ME found its way into NT... (none, I have been told by NT kernel developers, but at times I do wonder). For example, every single 32-bit app on Win95/98/ME uses a conventional 640KB memory stub. In theory, there should be no limitation on the number of NT processes one can spawn. At least, it should not go belly up after you open more than 100 copies of something as simple as MineSweeper. :-)
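That spawn limit is easy to probe empirically. A throwaway sketch (Python; the function name and the counts are arbitrary, and the children here are trivial sleepers rather than MineSweeper, but the idea is the same - keep launching until the OS says no):

```python
import subprocess
import sys

def count_spawnable(limit):
    """Try to hold `limit` trivial child processes open at once.

    Returns how many were successfully started before the OS raised
    an error (out of memory, out of handles, etc.) - on a healthy
    system this should simply be `limit`.
    """
    children = []
    started = 0
    try:
        for _ in range(limit):
            # Each child just sleeps briefly; about as cheap as a
            # process gets.
            children.append(subprocess.Popen(
                [sys.executable, "-c", "import time; time.sleep(5)"]))
            started += 1
    except OSError:
        pass  # the OS refused - this is the interesting number
    finally:
        for p in children:
            p.terminate()
            p.wait()
    return started

print(count_spawnable(20))
```

If an OS taps out well before any sensible resource limit is reached, that says more about its process and memory management than about the hardware.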

> clearing of allocated resources was not fully successful and PCs were
> restarted by their respective owners, because of not returning to
> former shape.

One thing to remember though, Dusan, is that NT out-of-the-box is very poorly configured. We ran rock-stable NT systems using SQL-Server, servicing a corporate WAN with well over 1,000 users, many years ago. It did however require a lot of careful tuning and butting heads with many a brick wall.

So yeah, NT can do the job. How well though in the high-end server market is of course another question altogether.

--
Billy
Received on Tue Oct 15 2002 - 07:45:11 CDT

