Commit history (mm/slub.c), newest first:

f64dc58c54  Memoryless nodes: SLUB support  (Christoph Lameter, 18 years ago)
ef8b4520bd  Slab allocators: fail if ksize is called with a NULL parameter  (Christoph Lameter, 18 years ago)
2408c55037  {slub, slob}: use unlikely() for kfree(ZERO_OR_NULL_PTR) check  (Satyam Sharma, 18 years ago)
aadb4bc4a1  SLUB: direct pass through of page size or higher kmalloc requests  (Christoph Lameter, 18 years ago)
1cd7daa51b  slub.c:early_kmem_cache_node_alloc() shouldn't be __init  (Adrian Bunk, 18 years ago)
ba0268a8b0  SLUB: accurately compare debug flags during slab cache merge  (Christoph Lameter, 18 years ago)
5d540fb715  slub: do not fail if we cannot register a slab with sysfs  (Christoph Lameter, 18 years ago)
a2f92ee7e7  SLUB: do not fail on broken memory configurations  (Christoph Lameter, 18 years ago)
9e86943b6c  SLUB: use atomic_long_read for atomic_long variables  (Christoph Lameter, 18 years ago)
1ceef40249  SLUB: Fix dynamic dma kmalloc cache creation  (Christoph Lameter, 18 years ago)
fcda3d89bf  SLUB: Remove checks for MAX_PARTIAL from kmem_cache_shrink  (Christoph Lameter, 18 years ago)
2208b764c1  slub: fix bug in slub debug support  (Peter Zijlstra, 18 years ago)
02febdf7f6  slub: add lock debugging check  (Peter Zijlstra, 18 years ago)
20c2df83d2  mm: Remove slab destructors from kmem_cache_create().  (Paul Mundt, 18 years ago)
9550b105b8  slub: fix ksize() for zero-sized pointers  (Linus Torvalds, 18 years ago)
8ab1372fac  SLUB: Fix CONFIG_SLUB_DEBUG use for CONFIG_NUMA  (Christoph Lameter, 18 years ago)
a0e1d1be20  SLUB: Move sysfs operations outside of slub_lock  (Christoph Lameter, 18 years ago)
434e245ddd  SLUB: Do not allocate object bit array on stack  (Christoph Lameter, 18 years ago)
81cda66261  Slab allocators: Cleanup zeroing allocations  (Christoph Lameter, 18 years ago)
ce15fea827  SLUB: Do not use length parameter in slab_alloc()  (Christoph Lameter, 18 years ago)
12ad6843dd  SLUB: Style fix up the loop to disable small slabs  (Christoph Lameter, 18 years ago)
5af328a510  mm/slub.c: make code static  (Adrian Bunk, 18 years ago)
7b55f620e6  SLUB: Simplify dma index -> size calculation  (Christoph Lameter, 18 years ago)
f1b2633936  SLUB: faster more efficient slab determination for __kmalloc  (Christoph Lameter, 18 years ago)
dfce8648d6  SLUB: do proper locking during dma slab creation  (Christoph Lameter, 18 years ago)
2e443fd003  SLUB: extract dma_kmalloc_cache from get_cache.  (Christoph Lameter, 18 years ago)
0c71001320  SLUB: add some more inlines and #ifdef CONFIG_SLUB_DEBUG  (Christoph Lameter, 18 years ago)
d07dbea464  Slab allocators: support __GFP_ZERO in all allocators  (Christoph Lameter, 18 years ago)
6cb8f91320  Slab allocators: consistent ZERO_SIZE_PTR support and NULL result semantics  (Christoph Lameter, 18 years ago)
ef2ad80c7d  Slab allocators: consolidate code for krealloc in mm/util.c  (Christoph Lameter, 18 years ago)