pub struct Bump { /* private fields */ }
An arena to bump allocate into.
§No Drops
Objects that are bump-allocated will never have their Drop implementation called unless you do it manually yourself. This makes it relatively easy to leak memory or other resources.
If you have a type which internally manages
- an allocation from the global heap (e.g. Vec<T>),
- open file descriptors (e.g. std::fs::File), or
- any other resource that must be cleaned up (e.g. an mmap)
and relies on its Drop implementation to clean up the internal resource, then if you allocate that type with a Bump, you need to find a new way to clean up after it yourself.
Potential solutions are
- calling drop_in_place or using std::mem::ManuallyDrop to manually drop these types (see the sketch below),
- using bumpalo::collections::Vec instead of std::vec::Vec, or
- simply avoiding allocating these problematic types within a Bump.
Note that not calling Drop is memory safe! Destructors are never guaranteed to run in Rust, so you can't rely on them for enforcing memory safety.
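For instance, here is a minimal sketch (illustrative, not from the upstream docs) of the first option above, manually running a destructor with std::ptr::drop_in_place:
use bumpalo::Bump;
let bump = Bump::new();
// `Vec` owns a buffer on the global heap; its `Drop` will not run automatically.
let v = bump.alloc(vec![1, 2, 3]);
assert_eq!(v.len(), 3);
unsafe {
    // Manually run the destructor so the heap buffer is freed.
    // `v` must not be used again after this call.
    std::ptr::drop_in_place(v);
}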
§Example
use bumpalo::Bump;
// Create a new bump arena.
let bump = Bump::new();
// Allocate values into the arena.
let forty_two = bump.alloc(42);
assert_eq!(*forty_two, 42);
// Mutable references are returned from allocation.
let mut s = bump.alloc("bumpalo");
*s = "the bump allocator; and also is a buffalo";
Implementations§
impl Bump
pub fn with_capacity(capacity: usize) -> Bump
Construct a new arena with the specified capacity to bump allocate into.
§Example
let bump = bumpalo::Bump::with_capacity(100);
pub fn reset(&mut self)
Reset this bump allocator.
Performs mass deallocation on everything allocated in this arena by resetting the pointer into the underlying chunk of memory to the start of the chunk. Does not run any Drop implementations on deallocated objects; see the Bump type's top-level documentation for details.
If this arena has allocated multiple chunks to bump allocate into, then the excess chunks are returned to the global allocator.
§Example
let mut bump = bumpalo::Bump::new();
// Allocate a bunch of things.
{
    for i in 0..100 {
        bump.alloc(i);
    }
}
// Reset the arena.
bump.reset();
// Allocate some new things in the space previously occupied by the
// original things.
for j in 200..400 {
bump.alloc(j);
}
pub fn alloc_with<F, T>(&self, f: F) -> &mut T
where
    F: FnOnce() -> T,
Pre-allocates space for an object in this Bump, initializes it using the closure, then returns an exclusive reference to it.
Calling bump.alloc(x) is essentially equivalent to calling bump.alloc_with(|| x). However, if you use alloc_with, then the closure will not be invoked until after allocating space for storing x on the heap.
This can be useful in certain edge-cases related to compiler optimizations. When evaluating bump.alloc(x), semantically x is first put on the stack and then moved onto the heap. In some cases, the compiler is able to optimize this into constructing x directly on the heap, but in many cases it does not.
The function alloc_with tries to help the compiler be smarter. In most cases, doing bump.alloc_with(|| x) in release mode will be enough to help the compiler realize this optimization is valid and construct x directly on the heap.
§Warning
This function critically depends on compiler optimizations to achieve its desired effect. This means that it is not an effective tool when compiling without optimizations.
Even when optimizations are on, this function does not guarantee that the value is constructed on the heap. To the best of our knowledge no such guarantee can be made in stable Rust as of 1.33.
§Panics
Panics if reserving space for T would cause an overflow.
§Example
let bump = bumpalo::Bump::new();
let x = bump.alloc_with(|| "hello");
assert_eq!(*x, "hello");
pub fn alloc_slice_copy<T>(&self, src: &[T]) -> &mut [T]
where
    T: Copy,
Copy a slice into this Bump and return an exclusive reference to the copy.
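An illustrative usage sketch (not taken verbatim from the upstream docs):
let bump = bumpalo::Bump::new();
// Copy the borrowed slice into the arena and get a mutable slice back.
let x = bump.alloc_slice_copy(&[1, 2, 3]);
assert_eq!(x, &[1, 2, 3]);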
pub fn alloc_slice_clone<T>(&self, src: &[T]) -> &mut [T]
where
    T: Clone,
Clone a slice into this Bump and return an exclusive reference to the clone. Prefer alloc_slice_copy if T is Copy.
§Panics
Panics if reserving space for the slice would cause an overflow.
§Example
#[derive(Clone, Debug, Eq, PartialEq)]
struct Sheep {
    name: String,
}
let originals = vec![
    Sheep { name: "Alice".into() },
    Sheep { name: "Bob".into() },
    Sheep { name: "Cathy".into() },
];
let bump = bumpalo::Bump::new();
let clones = bump.alloc_slice_clone(&originals);
assert_eq!(originals, clones);
pub fn alloc_slice_fill_with<T, F>(&self, len: usize, f: F) -> &mut [T]
Allocates a new slice of size len into this Bump and returns an exclusive reference to the copy.
The elements of the slice are initialized using the supplied closure. The closure argument is the position in the slice.
§Panics
Panics if reserving space for the slice would cause an overflow.
§Example
let bump = bumpalo::Bump::new();
let x = bump.alloc_slice_fill_with(5, |i| 5*(i+1));
assert_eq!(x, &[5, 10, 15, 20, 25]);
pub fn alloc_slice_fill_copy<T: Copy>(&self, len: usize, value: T) -> &mut [T]
Allocates a new slice of size len into this Bump and returns an exclusive reference to the copy.
All elements of the slice are initialized to value.
§Panics
Panics if reserving space for the slice would cause an overflow.
§Example
let bump = bumpalo::Bump::new();
let x = bump.alloc_slice_fill_copy(5, 42);
assert_eq!(x, &[42, 42, 42, 42, 42]);
pub fn alloc_slice_fill_clone<T: Clone>(&self, len: usize, value: &T) -> &mut [T]
Allocates a new slice of size len into this Bump and returns an exclusive reference to the copy.
All elements of the slice are initialized to value.clone().
§Panics
Panics if reserving space for the slice would cause an overflow.
§Example
let bump = bumpalo::Bump::new();
let s: String = "Hello Bump!".to_string();
let x: &[String] = bump.alloc_slice_fill_clone(2, &s);
assert_eq!(x.len(), 2);
assert_eq!(&x[0], &s);
assert_eq!(&x[1], &s);
pub fn alloc_slice_fill_iter<T, I>(&self, iter: I) -> &mut [T]
Allocates a new slice into this Bump, sized to the supplied iterator's length, and returns an exclusive reference to it.
The elements are initialized using the supplied iterator.
§Panics
Panics if reserving space for the slice would cause an overflow, or if the supplied iterator returns fewer elements than it promised.
§Example
let bump = bumpalo::Bump::new();
let x: &[i32] = bump.alloc_slice_fill_iter([2, 3, 5].iter().cloned().map(|i| i * i));
assert_eq!(x, [4, 9, 25]);
pub fn alloc_slice_fill_default<T: Default>(&self, len: usize) -> &mut [T]
Allocates a new slice of size len into this Bump and returns an exclusive reference to the copy.
All elements of the slice are initialized to T::default().
§Panics
Panics if reserving space for the slice would cause an overflow.
§Example
let bump = bumpalo::Bump::new();
let x = bump.alloc_slice_fill_default::<u32>(5);
assert_eq!(x, &[0, 0, 0, 0, 0]);
pub fn alloc_layout(&self, layout: Layout) -> NonNull<u8>
Allocate space for an object with the given Layout.
The returned pointer points at uninitialized memory, and should be initialized with std::ptr::write.
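A minimal sketch of pairing alloc_layout with std::ptr::write, as suggested above (illustrative, not from the upstream docs):
use core::alloc::Layout;
let bump = bumpalo::Bump::new();
let ptr = bump.alloc_layout(Layout::new::<u64>());
unsafe {
    // The memory is uninitialized; write a value before reading it.
    let p = ptr.cast::<u64>().as_ptr();
    std::ptr::write(p, 42u64);
    assert_eq!(*p, 42);
}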
pub fn iter_allocated_chunks(&mut self) -> ChunkIter<'_>
Returns an iterator over each chunk of allocated memory that this arena has bump allocated into.
The chunks are returned ordered by allocation time, with the most recently allocated chunk being returned first, and the least recently allocated chunk being returned last.
The values inside each chunk are also ordered by allocation time, with the most recent allocation being earlier in the slice, and the least recent allocation being towards the end of the slice.
§Safety
Because this method takes &mut self, we know that the bump arena reference is unique and therefore there aren't any active references to any of the objects we've allocated in it either. This potential aliasing of exclusive references is one common footgun for unsafe code that we don't need to worry about here.
However, there could be regions of uninitialized memory used as padding between allocations, which is why this iterator has items of type [MaybeUninit<u8>], instead of simply [u8].
The only way to guarantee that there is no padding between allocations or within allocated objects is if all of these properties hold:
- Every object allocated in this arena has the same alignment, and that alignment is at most 16.
- Every object’s size is a multiple of its alignment.
- None of the objects allocated in this arena contain any internal padding.
If you want to use this iter_allocated_chunks method, it is your responsibility to ensure that these properties hold before calling MaybeUninit::assume_init or otherwise reading the returned values.
§Example
let mut bump = bumpalo::Bump::new();
// Allocate a bunch of `i32`s in this bump arena, potentially causing
// additional memory chunks to be reserved.
for i in 0..10000 {
    bump.alloc(i);
}
// Iterate over each chunk we've bump allocated into. This is safe
// because we have only allocated `i32`s in this arena, which fulfills
// the above requirements.
for ch in bump.iter_allocated_chunks() {
    println!("Used a chunk that is {} bytes long", ch.len());
    println!("The first byte is {:?}", unsafe {
        ch.get(0).unwrap().assume_init()
    });
}
// Within a chunk, allocations are ordered from most recent to least
// recent. If we allocated 'a', then 'b', then 'c', when we iterate
// through the chunk's data, we get them in the order 'c', then 'b',
// then 'a'.
bump.reset();
bump.alloc(b'a');
bump.alloc(b'b');
bump.alloc(b'c');
assert_eq!(bump.iter_allocated_chunks().count(), 1);
let chunk = bump.iter_allocated_chunks().nth(0).unwrap();
assert_eq!(chunk.len(), 3);
// Safe because we've only allocated `u8`s in this arena, which
// fulfills the above requirements.
unsafe {
    assert_eq!(chunk[0].assume_init(), b'c');
    assert_eq!(chunk[1].assume_init(), b'b');
    assert_eq!(chunk[2].assume_init(), b'a');
}
pub fn allocated_bytes(&self) -> usize
Calculates the number of bytes currently allocated across all chunks.
If you allocate types of different alignments or types with larger-than-typical alignment in the same arena, some padding bytes might get allocated in the bump arena. Note that those padding bytes will add to this method’s resulting sum, so you cannot rely on it only counting the sum of the sizes of the things you’ve allocated in the arena.
§Example
let bump = bumpalo::Bump::new();
let _x = bump.alloc_slice_fill_default::<u32>(5);
let bytes = bump.allocated_bytes();
assert!(bytes >= core::mem::size_of::<u32>() * 5);
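To illustrate the padding note above (a sketch; the exact count depends on the allocator's internal layout, so only the lower bound is asserted):
let bump = bumpalo::Bump::new();
bump.alloc(1u8);
bump.alloc(2u64); // the u64's 8-byte alignment may introduce padding after the u8
let bytes = bump.allocated_bytes();
// The returned count includes any padding, so it is at least the sum of the sizes.
assert!(bytes >= core::mem::size_of::<u8>() + core::mem::size_of::<u64>());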