/*
 * Copyright (C) 2017 Apple Inc. All rights reserved.
 *
 * Redistribution and use in source and binary forms, with or without
 * modification, are permitted provided that the following conditions
 * are met:
 * 1. Redistributions of source code must retain the above copyright
 *    notice, this list of conditions and the following disclaimer.
 * 2. Redistributions in binary form must reproduce the above copyright
 *    notice, this list of conditions and the following disclaimer in the
 *    documentation and/or other materials provided with the distribution.
 *
 * THIS SOFTWARE IS PROVIDED BY APPLE INC. ``AS IS'' AND ANY
 * EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
 * IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
 * PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL APPLE INC. OR
 * CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
 * EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
 * PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
 * PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY
 * OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
 * (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
 * OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
 */
#pragma once

#include "MarkedBlock.h"
#include "MarkedSpace.h"

#include <wtf/text/CString.h>

namespace JSC {

class AlignedMemoryAllocator;

// The idea of subspaces is that you can provide some custom behavior for your objects if you
// allocate them from a custom Subspace in which you override some of the virtual methods. This
// class is the base class of Subspaces, and it provides a reasonable default implementation,
// where sweeping assumes immortal structure. The common ways of overriding this are:
//
// - Provide customized destructor behavior. You can change how the destructor is called, and you
//   can also specialize the destructor call in the sweep loop.
//
// - Use the Subspace as a quick way to iterate all of the objects in that subspace.
class Subspace {
    WTF_MAKE_NONCOPYABLE(Subspace);
    WTF_MAKE_FAST_ALLOCATED;
public:
    JS_EXPORT_PRIVATE Subspace(CString name, Heap&, AllocatorAttributes, AlignedMemoryAllocator*);
    JS_EXPORT_PRIVATE virtual ~Subspace();

    const char* name() const { return m_name.data(); }
    MarkedSpace& space() const { return m_space; }

    const AllocatorAttributes& attributes() const { return m_attributes; }
    AlignedMemoryAllocator* alignedMemoryAllocator() const { return m_alignedMemoryAllocator; }
    // The purpose of overriding this is to specialize the sweep for your destructors. This won't
    // be called for no-destructor blocks. This must call MarkedBlock::finishSweepKnowingSubspace.
    virtual void finishSweep(MarkedBlock::Handle&, FreeList*);

    // These get called for large objects.
    virtual void destroy(VM&, JSCell*);
    MarkedAllocator* tryAllocatorFor(size_t);
    MarkedAllocator* allocatorFor(size_t);

    JS_EXPORT_PRIVATE void* allocate(size_t);
    JS_EXPORT_PRIVATE void* allocate(GCDeferralContext*, size_t);
    JS_EXPORT_PRIVATE void* tryAllocate(size_t);
    JS_EXPORT_PRIVATE void* tryAllocate(GCDeferralContext*, size_t);

    void prepareForAllocation();

    void didCreateFirstAllocator(MarkedAllocator* allocator) { m_allocatorForEmptyAllocation = allocator; }
    // Finds an empty block from any Subspace that agrees to trade blocks with us.
    MarkedBlock::Handle* findEmptyBlockToSteal();

    template<typename Func>
    void forEachAllocator(const Func&);

    template<typename Func>
    void forEachMarkedBlock(const Func&);

    template<typename Func>
    void forEachNotEmptyMarkedBlock(const Func&);

    template<typename Func>
    void forEachLargeAllocation(const Func&);

    template<typename Func>
    void forEachMarkedCell(const Func&);

    template<typename Func>
    void forEachLiveCell(const Func&);

    static ptrdiff_t offsetOfAllocatorForSizeStep() { return OBJECT_OFFSETOF(Subspace, m_allocatorForSizeStep); }

    MarkedAllocator** allocatorForSizeStep() { return &m_allocatorForSizeStep[0]; }
private:
    MarkedAllocator* allocatorForSlow(size_t);

    // These slow paths are concerned with large allocations and allocator creation.
    void* allocateSlow(GCDeferralContext*, size_t);
    void* tryAllocateSlow(GCDeferralContext*, size_t);
    void didAllocate(void*);

    MarkedSpace& m_space;

    CString m_name;
    AllocatorAttributes m_attributes;

    AlignedMemoryAllocator* m_alignedMemoryAllocator;

    std::array<MarkedAllocator*, MarkedSpace::numSizeClasses> m_allocatorForSizeStep;
    MarkedAllocator* m_firstAllocator { nullptr };
    MarkedAllocator* m_allocatorForEmptyAllocation { nullptr }; // Uses the MarkedSpace linked list of blocks.
    SentinelLinkedList<LargeAllocation, BasicRawSentinelNode<LargeAllocation>> m_largeAllocations;
};

ALWAYS_INLINE MarkedAllocator* Subspace::tryAllocatorFor(size_t size)
{
    if (size <= MarkedSpace::largeCutoff)
        return m_allocatorForSizeStep[MarkedSpace::sizeClassToIndex(size)];
    return nullptr;
}

ALWAYS_INLINE MarkedAllocator* Subspace::allocatorFor(size_t size)
{
    if (size <= MarkedSpace::largeCutoff) {
        if (MarkedAllocator* result = m_allocatorForSizeStep[MarkedSpace::sizeClassToIndex(size)])
            return result;
        return allocatorForSlow(size);
    }
    return nullptr;
}

} // namespace JSC