Data management algorithm needed

From: Chris <spamfree_at_no_spam_today_thanks.com>
Date: Mon, 31 Dec 2001 13:01:37 -0600
Message-ID: <3C30B611.34D7B51B_at_no_spam_today_thanks.com>



I need to manage random-sized blocks of data: fetch them quickly, read them, write them, and so on. Whether the blocks live in memory or in a file, there is the problem of fragmentation: when you delete a block, how do you re-use the space, particularly when the blocks are all of odd sizes?
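
To make the question concrete, here is the kind of scheme I have in mind: keep a free list of holes sorted by offset, allocate first-fit, and coalesce neighbouring holes when a block is deleted. This is only a rough sketch I wrote to illustrate the problem; the arena, the Hole structure, and the 1 MB size in the demo are made-up examples, not anything I actually have.

/* Sketch only: one big arena (a file or a memory region), a free list of
   holes sorted by offset, first-fit allocation, and coalescing of
   neighbouring holes when a block is deleted.  All names are made up. */

#include <stdio.h>
#include <stdlib.h>

typedef struct Hole {               /* one free region: [off, off + size) */
    size_t off, size;
    struct Hole *next;              /* singly linked, sorted by offset */
} Hole;

/* First fit: carve `want` bytes off the front of the first hole that is
   big enough and return its offset, or (size_t)-1 if none is. */
static size_t arena_alloc(Hole **free_list, size_t want)
{
    for (Hole **pp = free_list; *pp; pp = &(*pp)->next) {
        Hole *h = *pp;
        if (h->size < want)
            continue;
        size_t off = h->off;
        h->off  += want;
        h->size -= want;
        if (h->size == 0) {         /* exact fit: drop the empty hole */
            *pp = h->next;
            free(h);
        }
        return off;
    }
    return (size_t)-1;              /* no hole large enough */
}

/* Return the region [off, off + size) to the free list, merging it with
   adjacent holes so space from deleted odd-sized blocks can be re-used. */
static void arena_free(Hole **free_list, size_t off, size_t size)
{
    Hole *prev = NULL, *cur = *free_list;
    while (cur && cur->off < off) {
        prev = cur;
        cur  = cur->next;
    }

    Hole *h = malloc(sizeof *h);
    if (!h) return;                 /* out of memory: the space is simply lost */
    h->off = off; h->size = size; h->next = cur;
    if (prev) prev->next = h; else *free_list = h;

    if (cur && h->off + h->size == cur->off) {        /* merge right */
        h->size += cur->size;
        h->next  = cur->next;
        free(cur);
    }
    if (prev && prev->off + prev->size == h->off) {   /* merge left */
        prev->size += h->size;
        prev->next  = h->next;
        free(h);
    }
}

int main(void)
{
    /* Pretend the whole arena (say, a 1 MB region) starts as one hole. */
    Hole *free_list = malloc(sizeof *free_list);
    free_list->off = 0; free_list->size = 1 << 20; free_list->next = NULL;

    size_t a = arena_alloc(&free_list, 100);    /* three odd-sized blocks */
    size_t b = arena_alloc(&free_list, 37);
    size_t c = arena_alloc(&free_list, 512);

    arena_free(&free_list, a, 100);             /* delete the first two;  */
    arena_free(&free_list, b, 37);              /* their holes coalesce   */

    printf("a=%zu b=%zu c=%zu, first hole now [%zu,%zu)\n",
           a, b, c, free_list->off, free_list->off + free_list->size);
    return 0;
}

Even with coalescing, a long run of odd-sized allocations and deletions can leave holes that are each too small to re-use, which is exactly the fragmentation behaviour I would like to read more about.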

I know that databases typically handle the problem by organizing everything into data pages, where each page has some free space, and when a page fills up it is split. There are different ways of organizing and indexing the pages.
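
Something like this toy slotted page is what I picture: records are packed from the back of a fixed-size page, a slot directory grows from the front, and a full page is split by moving half of its records into an empty new page. Again, this is only an illustrative sketch; PAGE_SIZE, the header layout, and the even split are arbitrary choices on my part, not how any particular database does it.

/* Toy slotted page, for illustration only. */

#include <stdint.h>
#include <stdio.h>
#include <string.h>

#define PAGE_SIZE 4096              /* arbitrary page size for the sketch */

typedef struct { uint16_t nslots;   /* slot entries in use                        */
                 uint16_t free_end; /* record data lives in [free_end, PAGE_SIZE) */
} PageHeader;
typedef struct { uint16_t off, len; } Slot;
typedef union  { uint8_t bytes[PAGE_SIZE]; PageHeader h; } Page;

static PageHeader *hdr(Page *p)   { return &p->h; }
static Slot       *slots(Page *p) { return (Slot *)(p->bytes + sizeof(PageHeader)); }

static void page_init(Page *p)
{
    hdr(p)->nslots = 0;
    hdr(p)->free_end = PAGE_SIZE;
}

static size_t page_free_space(Page *p)
{
    size_t dir_end = sizeof(PageHeader) + hdr(p)->nslots * sizeof(Slot);
    return hdr(p)->free_end - dir_end;
}

/* Insert one record: data goes at the back, a slot entry at the front.
   Returns the slot number, or -1 when the page is full and must be split. */
static int page_insert(Page *p, const void *rec, uint16_t len)
{
    if (page_free_space(p) < len + sizeof(Slot))
        return -1;
    PageHeader *h = hdr(p);
    h->free_end -= len;
    memcpy(p->bytes + h->free_end, rec, len);
    slots(p)[h->nslots].off = h->free_end;
    slots(p)[h->nslots].len = len;
    return h->nslots++;
}

/* Split: re-insert the lower half of the records into the (reset) left
   page and the upper half into an empty right page. */
static void page_split(Page *left, Page *right)
{
    Page tmp = *left;               /* keep the old contents while rebuilding */
    uint16_t n = hdr(&tmp)->nslots, mid = n / 2;
    page_init(left);
    page_init(right);
    for (uint16_t i = 0; i < n; i++)
        page_insert(i < mid ? left : right,
                    tmp.bytes + slots(&tmp)[i].off,
                    slots(&tmp)[i].len);
}

int main(void)
{
    Page left, right;
    page_init(&left);

    char rec[64] = {0};             /* dummy fixed-size records for the demo */
    int n = 0;
    while (page_insert(&left, rec, sizeof rec) != -1)
        n++;

    page_split(&left, &right);
    printf("%d records fit; after split: %u left, %u right\n",
           n, (unsigned)hdr(&left)->nslots, (unsigned)hdr(&right)->nslots);
    return 0;
}

The indirection through the slot directory seems to be what makes variable-length records manageable within a page; what I am missing is a good overview of how the pages themselves are organized and indexed.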

Does anyone know of a good survey of the techniques for solving this class of problems? Perhaps a book or a chapter of a book?
