The Memory Management Glossary
C


A B C D E F G H I J K L M N O P Q R S T U V W X Y Z - Help

Our aim is for these entries to be accurate, comprehensible, and useful, and also to have an entry for all common memory management terms. If you can't find the term you're looking for, if our definition doesn't help you, or if you'd like to suggest corrections or additions, please let us know via our feedback page.

For an explanation of the structure of the entries, and information on how to link to definitions, please see the glossary help page.


cache(1) (also known as memory cache, cache memory)

A processor's memory cache is a small piece of fast, but more expensive memory, usually static memory(1), used for copies of parts of main memory. The cache is automatically used by the processor for fast access to any data currently resident there. Access to the cache typically takes only a few processor clock cycles, whereas access to main memory may take tens or even hundreds of cycles.

What part of main memory is resident in a cache, and the mechanisms by which it is kept consistent, are quite varied. See cache policy.

Some systems have more than one level of cache. The "level 1" cache is the fastest and smallest storage level, "level 2" the next fastest, and so on.

See also: storage hierarchy; cache(2).

cache(2)

A cache is any small, fast piece of storage, used for copies of data that normally reside in a larger, slower piece of storage. The cache is used to speed up access to data resident in the slower storage.

In a typical cache, recently used data is resident in the cache (although the details of this depend on the cache policy). A cache(1) is the most common example of a cache(2).

See also: storage hierarchy.

cache memory (for full details, see cache(1))

A processor's memory cache is a small piece of fast, but more expensive memory, usually static memory(1), used for copies of parts of main memory. The cache is automatically used by the processor for fast access to any data currently resident there. Access to the cache typically takes only a few processor clock cycles, whereas access to main memory may take tens or even hundreds of cycles.

cache policy

Any cache(3) uses a cache policy to decide which data to store. A cache policy is an attempt to predict the future, so that the cache will provide swift responses to future requests.

Cache policy may be implemented in hardware, software, or a combination of both. Some systems allow programs to influence cache policy, by giving hints or directions about future use of data.

There are three main aspects of cache behavior which the cache policy can affect:

Fetch policy

This determines which data is fetched into the cache, usually as a result of receiving a request for data that isn't cached.

Eviction policy

This determines which data is discarded from the cache to provide space for newly fetched data.

Write policy

This determines how and when modifications to cached data are synchronized with the underlying storage.

See also: cache(1); cache(2); cache(3).
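The three aspects above can be sketched together in a toy software cache. The following Python sketch (class and parameter names are illustrative, not from any real system) uses demand fetching, least-recently-used eviction, and write-through as its policy choices:

```python
from collections import OrderedDict

class LRUCache:
    """Toy cache illustrating the three policy aspects.

    Fetch policy: fetch on demand (load from backing store on a miss).
    Eviction policy: discard the least recently used entry when full.
    Write policy: write-through (writes go straight to backing store).
    """
    def __init__(self, backing, capacity):
        self.backing = backing        # the larger, slower storage (a dict here)
        self.capacity = capacity
        self.entries = OrderedDict()  # insertion order doubles as recency order

    def read(self, key):
        if key in self.entries:                 # hit: mark as recently used
            self.entries.move_to_end(key)
            return self.entries[key]
        value = self.backing[key]               # miss: fetch on demand
        self.entries[key] = value
        if len(self.entries) > self.capacity:
            self.entries.popitem(last=False)    # evict least recently used
        return value

    def write(self, key, value):
        self.backing[key] = value               # write-through
        self.entries[key] = value
        self.entries.move_to_end(key)
        if len(self.entries) > self.capacity:
            self.entries.popitem(last=False)
```

A program that mostly re-reads a small working set will then be served almost entirely from `entries`, touching `backing` only on misses.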


caching(3) (also known as memoization, tabling)

Caching is a heuristic that stores answers to questions asked in the past in a cache or a table, in order that they may be more quickly answered in the future. This process is also called memoization and tabling (by the Prolog community).

A "look-ahead cache" attempts to store answers to questions that will be asked soon. A cache(2) is a common example of a cache(3).
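A minimal sketch of memoization in Python (the decorator name and example function are illustrative only):

```python
def memoize(f):
    """Store f's past answers in a table so the same question can be
    answered from the table in the future."""
    table = {}
    def memoized(*args):
        if args not in table:
            table[args] = f(*args)    # compute once, remember the answer
        return table[args]
    return memoized

@memoize
def fib(n):
    # Naive recursion would be exponential; with the table each fib(k)
    # is computed only once, making the whole computation linear.
    return n if n < 2 else fib(n - 1) + fib(n - 2)
```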

cactus stack (also known as spaghetti stack)

A cactus stack is a stack with branches. When diagrammed, its shape resembles that of a saguaro cactus.

In languages that support continuations, activation records can have indefinite extent. One technique for implementing continuations is not to copy the activation records that are captured, but rather to create a fork in the stack below the captured stack frames, so that new frames appear on a parallel branch. Often the forking is done lazily: captured frames are duplicated only if they are modified.
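The branching structure can be modeled by giving each activation record a parent link instead of a slot in a contiguous array; a sketch in Python (all names illustrative):

```python
class Frame:
    """An activation record in a cactus stack: each frame points at its
    parent, so two branches can share a common trunk of frames."""
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent

def push(top, name):
    """Push a new frame on some branch; the old top becomes its parent."""
    return Frame(name, top)

def backtrace(top):
    """Walk a branch from its tip back down to the root."""
    names = []
    while top is not None:
        names.append(top.name)
        top = top.parent
    return names
```

Capturing a continuation amounts to keeping a reference to some `Frame`; pushing onto it later creates a new branch without disturbing the shared trunk.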

card

A card is a division of memory, all cards being of equal size (in a particular area of discourse). A card is usually bigger than a word and smaller than a page. Cards are used in a technique called card-marking whereby dirty bits (which record which portions of old generations have been written into) are maintained for each card. Often the use of cards will also entail the use of a crossing map.

card marking, card-marking

A technique for managing pointer stores into old generations (which in turn is used to track inter-generational pointers). Each generation is divided into a number of equal-sized cards, and when a generation is written into, the particular card written to is recorded (often by using a bit-table). Subsequently, when scanning an older generation in order to collect a younger generation, only the recorded cards (in the old generation) need to be scanned.

See also: generational garbage collection.
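A card table and its write barrier can be sketched like this in Python (card size and names chosen purely for illustration):

```python
CARD_SIZE = 256   # bytes per card; a power of two chosen for illustration

class CardTable:
    """One dirty bit per card. The write barrier sets the bit for the
    card containing any written-to address; a later collection of a
    younger generation scans only the dirty cards of this generation."""
    def __init__(self, generation_size):
        ncards = (generation_size + CARD_SIZE - 1) // CARD_SIZE
        self.dirty = bytearray(ncards)

    def record_write(self, address):
        self.dirty[address // CARD_SIZE] = 1     # the write barrier

    def cards_to_scan(self):
        return [card for card, bit in enumerate(self.dirty) if bit]
```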


cell (for full details, see object)

In memory management, we use the term object or cell to mean a contiguous block of memory(2) forming a single logical structure.

Cheney collector (also known as Cheney scan)

A Cheney collector uses the new semi-space of a two-space collector as a queue of objects remaining to be scanned, thus eliminating the need for recursion when tracing the graph of objects.

See also: two-space collector.
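The idea can be sketched in Python, modeling objects as lists of references (all names illustrative; a real collector works on raw memory and address arithmetic):

```python
class Obj:
    def __init__(self, *fields):
        self.fields = list(fields)   # references to other Objs
        self.forward = None          # forwarding pointer once copied

def cheney_collect(roots):
    """Copy everything reachable from roots into a new space (a list),
    using the new space itself as the queue of objects left to scan."""
    tospace = []

    def copy(obj):
        if obj is None:
            return None
        if obj.forward is None:          # not yet copied
            new = Obj(*obj.fields)
            obj.forward = new
            tospace.append(new)          # enqueue for later scanning
        return obj.forward

    roots = [copy(r) for r in roots]
    scan = 0
    while scan < len(tospace):           # scan pointer chases the allocation pointer
        obj = tospace[scan]
        obj.fields = [copy(f) for f in obj.fields]
        scan += 1
    return roots, tospace
```

The `scan` index is the queue head and `len(tospace)` the tail, so no explicit recursion or worklist is needed.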


Cheney scan (for full details, see Cheney collector)

A Cheney collector uses the new semi-space of a two-space collector as a queue of objects remaining to be scanned, thus eliminating the need for recursion when tracing the graph of objects.

closure

A closure is a function or procedure that is saved along with the current bindings from enclosing blocks for later invocation.

Some programming languages, such as Algol, permit nested blocks to access the local variables of enclosing blocks. Lisp-like languages further permit such an inner block (in particular a function or procedure) to be saved for later invocation. The act of saving such an inner block, along with the current bindings of variables in the enclosing blocks that are referenced by the inner block, is called closing over or capturing those variables. The object created is termed a closure. A closure is invoked just like the function from which it was built, passing whatever parameters the function accepts, but when the function executes, the variables that belong to enclosing blocks will have the bindings that were in effect when the closure was created.

Relevance to memory management: A closure is typically implemented by saving both the function and any activation records that contain variables referenced by the function. The closure creates additional implicit references to the bindings closed over and hence must be accounted for in any memory management scheme. The closure itself is an object that must be managed and may have either dynamic extent or indefinite extent depending on whether it is only used by inner blocks of the creating block or passed out of the creating block.

See also: continuation.
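A small Python example of a closure and the extent of its captured binding (function names are illustrative):

```python
def make_counter():
    """Returns a closure: the inner function together with the captured
    binding of 'count' from the enclosing block."""
    count = 0
    def counter():
        nonlocal count       # the binding closed over, not a copy
        count += 1
        return count
    return counter           # 'count' now has indefinite extent
```

Because `counter` is passed out of the creating block, the binding of `count` must outlive `make_counter`'s activation, which is exactly the memory-management obligation the entry describes.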

coalesce

Coalescing is the act of merging two adjacent free blocks.

Coalescing reduces external fragmentation, but is not totally effective: freed blocks can only be merged if they happen to be adjacent.

Coalescing can be done as soon as blocks are freed, or it can be deferred until some time later (known as deferred coalescing), or it might not be done at all.

Dynamic Storage Allocation: A Survey and Critical Review has details about fragmentation, and which coalescing strategies are effective under what circumstances.
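Deferred coalescing over a whole free list can be sketched like this in Python (addresses and sizes in bytes; the representation is illustrative):

```python
def coalesce(free_blocks):
    """Merge adjacent free blocks. Each block is an (address, size)
    pair; two blocks are adjacent when one ends where the next begins."""
    merged = []
    for addr, size in sorted(free_blocks):
        if merged and merged[-1][0] + merged[-1][1] == addr:
            merged[-1][1] += size          # adjacent: extend the previous block
        else:
            merged.append([addr, size])
    return [tuple(block) for block in merged]
```

An allocator doing immediate coalescing would instead check only the two neighbors of each block at the moment it is freed.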


collect

An object is collected when it is reclaimed by a garbage collector.

Similar terms: reclaim.

collection (for full details, see collection cycle)

A collection cycle is a single complete execution of a tracing garbage collection algorithm.

collection cycle (also known as collection)

A collection cycle is a single complete execution of a tracing garbage collection algorithm.

Each collection cycle includes (not necessarily in strict order) choosing a condemned set; scanning roots and objects that have not been condemned; tracing the object graph to find all condemned objects that are reachable; and reclaiming those that were not reachable.

In non-incremental garbage collection, the mutator pauses at the start of a collection cycle and cannot continue until it is complete. In incremental and parallel garbage collection, a collection cycle can be interleaved with, or simultaneous with, mutator activity.

collector(1) (for full details, see garbage collector)

A (garbage) collector is (an implementation of) a garbage collection algorithm.

collector(2)

In a garbage-collected system, the part that executes the garbage collection code, which discovers unused storage and reclaims it.

For purposes of describing incremental garbage collection, the system is divided into the mutator and the collector. These can be separate threads of computation, or interleaved within the same thread.

Historical note: This term is due to Dijkstra et al.

Opposites: mutator.


color, colour

In a tri-color marking scheme, each node has one of three colors: black, white, or gray. In a treadmill, nodes may also be colored off-white.

committed (for full details, see mapped)

A range of virtual addresses is said to be mapped (committed on Windows®) if there is physical memory(2) associated with the range.

compactifying (for full details, see compaction)

Compaction is the process of moving live objects to eliminate dead space between them. Some people call this compactifying, to distinguish it from techniques for compressing data structures.

compaction (also known as compactifying)

Compaction is the process of moving live objects to eliminate dead space between them. Some people call this compactifying, to distinguish it from techniques for compressing data structures.

Compaction is used to avoid external fragmentation and to increase locality of reference.
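Computing the forwarding addresses for a simple sliding compaction might look like this in Python (a sketch only; a real compactor must also update every reference and then move the objects):

```python
def sliding_compact(blocks):
    """blocks: (address, size, live) triples. Slide the live blocks
    toward address 0, preserving their order (which helps locality of
    reference), and return a table mapping old addresses to new ones."""
    forwarding = {}
    free = 0
    for address, size, live in sorted(blocks):
        if live:
            forwarding[address] = free
            free += size                 # dead space between blocks vanishes
    return forwarding
```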

composite object

In the PostScript® language, composite objects are the boxed objects.

Unlike a simple object, the main data (what PostScript calls the value) in a composite object are stored separately, in VM(2). Several composite objects can share the same value.

Similar terms: boxed.
Opposites: simple object.

comprehensive

A collector(1) is comprehensive if all garbage (or, all unreachable objects) is reclaimed in one collection cycle.

See also: garbage collection.

concurrent garbage collection (for full details, see parallel garbage collection)

A parallel or concurrent collector(2) executes simultaneously with the mutator, usually on a multi-processor machine.

condemned set (for full details, see threatened set)

Condemned objects are those which are candidates for recycling within a collection cycle.

connected

Objects are connected if and only if one contains a reference to the other.

See also: graph.

cons(1)

In Lisp, cons is a primitive operation creating a list element (from English "CONStruct"). By extension, a cons is the element created.

Other links: CONS in Common Lisp HyperSpec.

cons(2) (for full details, see allocate)

Allocation is the process of assigning resources. When requested to by the program, an application memory manager or allocator allocates a block of memory(2) for the program to store its data in. Allocation is also known as consing, from cons(1).

conservative garbage collection

In conservative garbage collection, the layout of objects and roots is not known; instead, the collector(1) assumes that any field that looks like a pointer might be a reference.

Conservative collectors can work with programs where information about the memory(2) layout is not available, because, for example, the language doesn't support GC.

A conservative collector doesn't need to know the format of the objects; it just needs some idea of where the object boundaries are. It regards any field value that looks like a pointer to an object (or, sometimes, into the middle of one) as preventing the recycling of that object. It can't move objects, because the references to the moved objects would then need to be updated, and such ambiguous references must not be modified, in case they weren't pointers after all. Therefore, conservative collectors are usually mark-sweep collectors.

Because references are ambiguous, some objects may be retained despite being actually unreachable. In practice, this happens rarely, and refinements such as black-listing can further reduce the odds.

Opposites: exact garbage collection.
See also: ambiguous root; semi-conservative garbage collection; interior pointer.
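The ambiguous-reference idea can be sketched as a mark phase over a simulated heap in Python (the heap representation and names are illustrative):

```python
def conservative_mark(root_words, heap):
    """heap maps object addresses to lists of field words. Any word
    whose value happens to equal an object's address is treated as a
    possible reference, so the object it 'points' to is retained."""
    addresses = set(heap)
    marked = set()
    worklist = [w for w in root_words if w in addresses]
    while worklist:
        addr = worklist.pop()
        if addr in marked:
            continue
        marked.add(addr)
        # Fields are just words too: follow anything pointer-like.
        worklist.extend(f for f in heap[addr] if f in addresses)
    return marked
```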


constructor(1)

A constructor is a function or method that allocates and initializes an object.

Opposites: destructor(1).

constructor(2)

In C++, a constructor is a member function that is used to initialize a newly-allocated object.

The actual allocation of memory(2) is performed by operator new or the compiler (for static and stack allocation), and the new block is then passed to the appropriate constructor.

See also: destructor(2).

continuation

A continuation is the data required to restore an execution context after invocation of another context, typically as a subroutine.

Relevance to memory management: If continuations can be represented as first-class objects, as in Scheme, the execution contexts can no longer be stored on a stack, instead, (at least some) activation records have to be heap-allocated.

See also: closure.

control stack (also known as activation stack, execution stack)

A stack that stores activation records, particularly subroutine return information, is known as a control stack.

Typically the control stack is supported and used by the hardware architecture and the operating system, limiting the types and sizes of objects that can be stored on it. Often, only one type of object, a stack frame, is permitted, and the layout of that is defined by the hardware architecture.

Relevance to memory management: Theoretically, a control stack is simply an array of activation records, and hence just another object managed by the memory manager. In practice, the control stack is central to the performance of the hardware architecture and may require special treatment. In particular, it may not be accessible as ordinary memory(2), or it may have its own cache(2) with specific updating requirements.

Similar terms: stack.
See also: data stack.

copying garbage collection (also known as scavenging garbage collection)

Copying garbage collection is a kind of tracing garbage collection that operates by relocating reachable objects (this is sometimes called scavenging) and then reclaiming objects that are left behind, which must be unreachable and therefore dead.

A copying garbage collection relies on being able to find and correct all references to copied objects.

Diagram: Copying garbage collection.

Similar terms: moving.
See also: broken heart; forwarding pointer; two-space collector.

core

A historical synonym for main memory, deriving from the cores or ferrite rings which were once the main technology used to implement main memory.

Similar terms: main memory.

creation space

In generational garbage collection, when generations are divided into buckets, the creation space is where new objects are created in each generation.

This term is sometimes used as a synonym for nursery space.

Opposites: aging space.
See also: generational garbage collection.

crossing map

Where memory(2) has already been divided into some fixed-sized unit (for example, pages or cards), a crossing map records where objects lie across the boundaries of the fixed-sized units. In other words, it records which fixed-sized units do not start with the beginning of an object.

A system which implements remembered sets by page-marking or card-marking needs to scan all the pointers in the page or card. If the system cannot scan partial objects (or requires information in the object header in order to scan a partial object), a crossing map is necessary to find the beginning of the first object in the unit.

Relevance to memory management: In a sense, a crossing map is an optimization of tagged architecture. It represents the minimum information necessary to determine how to interpret any word of memory.
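A crossing map over a toy heap might be sketched like this in Python (card size, representation, and names chosen for illustration):

```python
CARD_SIZE = 64

def crossing_map(object_starts, heap_size):
    """For each card, record the start address of the last object at or
    before the card's boundary. If that object starts before the
    boundary, the card begins in the middle of it, and a scanner must
    back up to the recorded header to interpret the card's contents."""
    entries = []
    for card in range(heap_size // CARD_SIZE):
        boundary = card * CARD_SIZE
        starts_below = [s for s in object_starts if s <= boundary]
        entries.append(max(starts_below) if starts_below else None)
    return entries
```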

cyclic data structure

A data structure is cyclic if some of its references form a loop; that is, there's an object that can be reached by following references from itself.
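A two-object cycle, and a check for the loop, in Python (names illustrative):

```python
class Node:
    """A minimal object with one reference field."""
    def __init__(self):
        self.next = None

def cyclic(start):
    """Follow references from 'start'; True if some object is reached
    twice, i.e. the references form a loop."""
    seen = set()
    node = start
    while node is not None:
        if node in seen:
            return True
        seen.add(node)
        node = node.next
    return False
```

Cycles are why reference counting alone cannot reclaim all garbage: each object in the loop keeps the other's count above zero.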
