/*
* Copyright (C) 2013 Google Inc. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without
* modification, are permitted provided that the following conditions are
* met:
*
* * Redistributions of source code must retain the above copyright
* notice, this list of conditions and the following disclaimer.
* * Redistributions in binary form must reproduce the above
* copyright notice, this list of conditions and the following disclaimer
* in the documentation and/or other materials provided with the
* distribution.
* * Neither the name of Google Inc. nor the names of its
* contributors may be used to endorse or promote products derived from
* this software without specific prior written permission.
*
* THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
* LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
* A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
* OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
* SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
* LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
* DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
* THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
* (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
#ifndef Visitor_h
#define Visitor_h
#include "platform/PlatformExport.h"
#include "platform/heap/StackFrameDepth.h"
#include "platform/heap/ThreadState.h"
#include "wtf/Assertions.h"
#include "wtf/Atomics.h"
#include "wtf/Deque.h"
#include "wtf/Forward.h"
#include "wtf/HashMap.h"
#include "wtf/HashTraits.h"
#include "wtf/InstanceCounter.h"
#include "wtf/OwnPtr.h"
#include "wtf/RefPtr.h"
#include "wtf/TypeTraits.h"
#include "wtf/WeakPtr.h"
#if ENABLE(GC_PROFILING)
#include "wtf/text/WTFString.h"
#endif
#if ENABLE(ASSERT)
#define DEBUG_ONLY(x) x
#else
#define DEBUG_ONLY(x)
#endif
namespace blink {
template<typename T> class GarbageCollected;
template<typename T> class GarbageCollectedFinalized;
class GarbageCollectedMixin;
class HeapObjectHeader;
class InlinedGlobalMarkingVisitor;
template<typename T> class Member;
template<typename T> class WeakMember;
class Visitor;
template <typename T> struct IsGarbageCollectedType;
#define STATIC_ASSERT_IS_GARBAGE_COLLECTED(T, ErrorMessage) \
static_assert(IsGarbageCollectedType<T>::value, ErrorMessage)
#define STATIC_ASSERT_IS_NOT_GARBAGE_COLLECTED(T, ErrorMessage) \
static_assert(!IsGarbageCollectedType<T>::value, ErrorMessage)
template<bool needsTracing, WTF::WeakHandlingFlag weakHandlingFlag, WTF::ShouldWeakPointersBeMarkedStrongly strongify, typename T, typename Traits> struct CollectionBackingTraceTrait;
// The TraceMethodDelegate is used to convert a trace method for type T to a TraceCallback.
// This allows us to pass a type's trace method as a parameter to the PersistentNode
// constructor. The PersistentNode constructor needs the specific trace method due to an issue
// with the Windows compiler, which instantiates even unused variables. This causes problems
// in header files where we have only forward declarations of classes.
template<typename T, void (T::*method)(Visitor*)>
struct TraceMethodDelegate {
static void trampoline(Visitor* visitor, void* self) { (reinterpret_cast<T*>(self)->*method)(visitor); }
};
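// A minimal usage sketch (the Node class below is hypothetical): the delegate
// turns a member trace method into a plain function pointer that can be
// paired with a type-erased void* self pointer.
//
// class Node : public GarbageCollected<Node> {
// public:
//     void trace(Visitor*);
// };
//
// TraceCallback callback = &TraceMethodDelegate<Node, &Node::trace>::trampoline;
// callback(visitor, node); // equivalent to node->trace(visitor)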
// GCInfo contains meta-data associated with objects allocated in the
// Blink heap. This meta-data consists of a function pointer used to
// trace the pointers in the object during garbage collection, an
// indication of whether or not the object needs a finalization
// callback, and a function pointer used to finalize the object when
// the garbage collector determines that the object is no longer
// reachable. There is a GCInfo struct for each class that directly
// inherits from GarbageCollected or GarbageCollectedFinalized.
struct GCInfo {
bool hasFinalizer() const { return m_nonTrivialFinalizer; }
bool hasVTable() const { return m_hasVTable; }
TraceCallback m_trace;
FinalizationCallback m_finalize;
bool m_nonTrivialFinalizer;
bool m_hasVTable;
#if ENABLE(GC_PROFILING)
// |m_className| is held as a reference to prevent its destructor from being called at exit.
const String& m_className;
#endif
};
#if ENABLE(ASSERT)
PLATFORM_EXPORT void assertObjectHasGCInfo(const void*, size_t gcInfoIndex);
#endif
// The FinalizerTraitImpl specifies how to finalize objects. Objects
// that inherit from GarbageCollectedFinalized are finalized by
// calling their 'finalize' method which by default will call the
// destructor on the object.
template<typename T, bool isGarbageCollectedFinalized>
struct FinalizerTraitImpl;
template<typename T>
struct FinalizerTraitImpl<T, true> {
static void finalize(void* obj) { static_cast<T*>(obj)->finalizeGarbageCollectedObject(); }
};
template<typename T>
struct FinalizerTraitImpl<T, false> {
static void finalize(void* obj) { }
};
// The FinalizerTrait is used to determine if a type requires
// finalization and what finalization means.
//
// By default classes that inherit from GarbageCollectedFinalized need
// finalization and finalization means calling the 'finalize' method
// of the object. The FinalizerTrait can be specialized if the default
// behavior is not desired.
template<typename T>
struct FinalizerTrait {
static const bool nonTrivialFinalizer = WTF::IsSubclassOfTemplate<typename WTF::RemoveConst<T>::Type, GarbageCollectedFinalized>::value;
static void finalize(void* obj) { FinalizerTraitImpl<T, nonTrivialFinalizer>::finalize(obj); }
};
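// Illustrative sketch (the Connection class is hypothetical): deriving from
// GarbageCollectedFinalized gives the type a non-trivial finalizer, and
// FinalizerTrait<T>::finalize runs the destructor via
// finalizeGarbageCollectedObject().
//
// class Connection : public GarbageCollectedFinalized<Connection> {
// public:
//     ~Connection(); // releases a non-heap resource
//     void trace(Visitor*) { }
// };
//
// static_assert(FinalizerTrait<Connection>::nonTrivialFinalizer, "needs finalization");
// FinalizerTrait<Connection>::finalize(connection); // invokes ~Connection()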
// Trait to get the GCInfo structure for types that have their
// instances allocated in the Blink garbage-collected heap.
template<typename T> struct GCInfoTrait;
template<typename T, bool = WTF::IsSubclassOfTemplate<typename WTF::RemoveConst<T>::Type, GarbageCollected>::value> class NeedsAdjustAndMark;
template<typename T>
class NeedsAdjustAndMark<T, true> {
public:
static const bool value = false;
};
template <typename T> const bool NeedsAdjustAndMark<T, true>::value;
template<typename T>
class NeedsAdjustAndMark<T, false> {
public:
static const bool value = WTF::IsSubclass<typename WTF::RemoveConst<T>::Type, GarbageCollectedMixin>::value;
};
template <typename T> const bool NeedsAdjustAndMark<T, false>::value;
template<typename T, bool = NeedsAdjustAndMark<T>::value> class DefaultTraceTrait;
// HasInlinedTraceMethod<T>::value is true for T supporting
// T::trace(InlinedGlobalMarkingVisitor).
// The template works by checking if T::HasInlinedTraceMethodMarker type is
// available using SFINAE. The HasInlinedTraceMethodMarker type is defined
// by DECLARE_TRACE and DEFINE_INLINE_TRACE helper macros, which are used to
// define trace methods supporting both inlined/uninlined tracing.
template <typename T>
struct HasInlinedTraceMethod {
#if ENABLE(INLINED_TRACE)
private:
typedef char YesType;
struct NoType {
char padding[8];
};
template <typename U> static YesType checkMarker(typename U::HasInlinedTraceMethodMarker*);
template <typename U> static NoType checkMarker(...);
public:
static const bool value = sizeof(checkMarker<T>(nullptr)) == sizeof(YesType);
#else
static const bool value = false;
#endif
};
template <typename T, bool = HasInlinedTraceMethod<T>::value>
struct TraceCompatibilityAdaptor;
// The TraceTrait is used to specify how to mark an object pointer and
// how to trace all of the pointers in the object.
//
// By default, the 'trace' method implemented on an object itself is
// used to trace the pointers to other heap objects inside the object.
//
// However, the TraceTrait can be specialized to use a different
// implementation. A common case where a TraceTrait specialization is
// needed is when multiple inheritance leads to pointers that are not
// to the start of the object in the Blink garbage-collected heap. In
// that case the pointer has to be adjusted before marking.
template<typename T>
class TraceTrait {
public:
// Default implementation of TraceTrait<T>::trace just statically
// dispatches to the trace method of the class T.
template<typename VisitorDispatcher>
static void trace(VisitorDispatcher visitor, void* self)
{
TraceCompatibilityAdaptor<T>::trace(visitor, static_cast<T*>(self));
}
template<typename VisitorDispatcher>
static void mark(VisitorDispatcher visitor, const T* t)
{
DefaultTraceTrait<T>::mark(visitor, t);
}
#if ENABLE(ASSERT)
static void checkGCInfo(const T* t)
{
DefaultTraceTrait<T>::checkGCInfo(t);
}
#endif
};
template<typename T> class TraceTrait<const T> : public TraceTrait<T> { };
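// A specialization sketch (FooImpl and customTrace are hypothetical): tracing
// can be redirected to a custom entry point instead of FooImpl::trace.
//
// template<>
// class TraceTrait<FooImpl> {
// public:
//     template<typename VisitorDispatcher>
//     static void trace(VisitorDispatcher visitor, void* self)
//     {
//         static_cast<FooImpl*>(self)->customTrace(visitor);
//     }
// };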
#if ENABLE(INLINED_TRACE)
#define DECLARE_TRACE(maybevirtual, maybeoverride) \
public: \
typedef int HasInlinedTraceMethodMarker; \
maybevirtual void trace(Visitor*) maybeoverride; \
maybevirtual void trace(InlinedGlobalMarkingVisitor) maybeoverride; \
private: \
template <typename VisitorDispatcher> void traceImpl(VisitorDispatcher); \
public:
#define DEFINE_TRACE(T) \
void T::trace(Visitor* visitor) { traceImpl(visitor); } \
void T::trace(InlinedGlobalMarkingVisitor visitor) { traceImpl(visitor); } \
template <typename VisitorDispatcher> \
ALWAYS_INLINE void T::traceImpl(VisitorDispatcher visitor)
#define DEFINE_INLINE_TRACE(maybevirtual, maybeoverride) \
typedef int HasInlinedTraceMethodMarker; \
maybevirtual void trace(Visitor* visitor) maybeoverride { traceImpl(visitor); } \
maybevirtual void trace(InlinedGlobalMarkingVisitor visitor) maybeoverride { traceImpl(visitor); } \
template <typename VisitorDispatcher> \
inline void traceImpl(VisitorDispatcher visitor)
#else // !ENABLE(INLINED_TRACE)
#define DECLARE_TRACE(maybevirtual, maybeoverride) \
public: \
maybevirtual void trace(Visitor*) maybeoverride;
#define DEFINE_TRACE(T) void T::trace(Visitor* visitor)
#define DEFINE_INLINE_TRACE(maybevirtual, maybeoverride) \
maybevirtual void trace(Visitor* visitor) maybeoverride
#endif
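// Typical usage of these macros, as a hedged sketch (Element and m_parent are
// hypothetical; empty macro arguments declare a non-virtual trace). In the header:
//
// class Element : public GarbageCollected<Element> {
// public:
//     DECLARE_TRACE(, )
// private:
//     Member<Element> m_parent;
// };
//
// And in the .cpp file:
//
// DEFINE_TRACE(Element)
// {
//     visitor->trace(m_parent);
// }
//
// With INLINED_TRACE enabled this expands to both trace(Visitor*) and
// trace(InlinedGlobalMarkingVisitor) overloads forwarding to a shared
// traceImpl; otherwise only the Visitor* overload is emitted.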
// If MARKER_EAGER_TRACING is set to 1, the marker thread is allowed to
// invoke the trace() method of a not-yet-marked object directly upon
// marking it. If it is set to 0, the |trace()| callback for an object will
// be pushed onto an explicit mark stack, which the marker proceeds to
// iteratively pop and invoke. The eager scheme enables inlining of one
// trace() method inside another, while the mark-stack scheme keeps system
// stack usage bounded and under explicit control.
//
// If eager tracing leads to excessively deep |trace()| call chains (and
// the system stack usage that this brings), the marker implementation will
// switch to using an explicit mark stack. Recursive and deep object graphs
// are uncommon for Blink objects.
//
// A class type can opt out of eager tracing by declaring a TraceEagerlyTrait<>
// specialization, mapping the trait's |value| to |false| (see the
// WILL_NOT_BE_EAGERLY_TRACED() macros below). For Blink, this is done for
// the small set of GCed classes that are directly recursive.
#define MARKER_EAGER_TRACING 1
// The TraceEagerlyTrait<T> trait controls whether or not a class
// (and its subclasses) should be eagerly traced or not.
//
// If |TraceEagerlyTrait<T>::value| is |true|, then the marker thread
// should invoke |trace()| on not-yet-marked objects deriving from class T
// right away, and not queue their trace callbacks on its marker stack,
// which it will do if |value| is |false|.
//
// The trait can be declared to enable/disable eager tracing for a class T
// and any of its subclasses, or for just the class T itself but none of its
// subclasses.
//
template<typename T, typename Enabled = void>
class TraceEagerlyTrait {
public:
static const bool value = MARKER_EAGER_TRACING;
};
#define WILL_NOT_BE_EAGERLY_TRACED(TYPE) \
template<typename T> \
class TraceEagerlyTrait<T, typename WTF::EnableIf<WTF::IsSubclass<T, TYPE>::value>::Type> { \
public: \
static const bool value = false; \
}
// Disable eager tracing for TYPE, but not any of its subclasses.
#define WILL_NOT_BE_EAGERLY_TRACED_CLASS(TYPE) \
template<> \
class TraceEagerlyTrait<TYPE> { \
public: \
static const bool value = false; \
}
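// Example opt-out (the ListNode class is hypothetical): a directly recursive
// type whose trace() follows m_next could otherwise build deep eager trace
// call chains, so it opts out of eager tracing:
//
// class ListNode : public GarbageCollected<ListNode> {
// public:
//     void trace(Visitor* visitor) { visitor->trace(m_next); }
// private:
//     Member<ListNode> m_next;
// };
// WILL_NOT_BE_EAGERLY_TRACED_CLASS(ListNode);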
template<typename Collection>
struct OffHeapCollectionTraceTrait;
template<typename T>
struct ObjectAliveTrait {
static bool isHeapObjectAlive(Visitor*, T*);
};
// VisitorHelper contains common implementation of Visitor helper methods.
//
// VisitorHelper avoids virtual methods by using CRTP.
// c.f. http://en.wikipedia.org/wiki/Curiously_Recurring_Template_Pattern
template<typename Derived>
class VisitorHelper {
public:
// One-argument templated mark method. This uses the static type of
// the argument to get the TraceTrait. By default, the mark method
// of the TraceTrait just calls the virtual two-argument mark method on this
// visitor, where the second argument is the static trace method of the trait.
template<typename T>
void mark(T* t)
{
if (!t)
return;
#if ENABLE(ASSERT)
TraceTrait<T>::checkGCInfo(t);
#endif
TraceTrait<T>::mark(Derived::fromHelper(this), t);
STATIC_ASSERT_IS_GARBAGE_COLLECTED(T, "attempted to mark non garbage collected object");
}
// Member version of the one-argument templated trace method.
template<typename T>
void trace(const Member<T>& t)
{
Derived::fromHelper(this)->mark(t.get());
}
// Fallback method used only when we need to trace raw pointers of T.
// This is the case when a member lives inside a union, where Member<T> is not supported.
template<typename T>
void trace(const T* t)
{
Derived::fromHelper(this)->mark(const_cast<T*>(t));
}
template<typename T>
void trace(T* t)
{
Derived::fromHelper(this)->mark(t);
}
// WeakMember version of the templated trace method. It doesn't keep
// the traced thing alive, but will write null to the WeakMember later
// if the pointed-to object is dead. Declaring the argument const is a lie,
// but overload resolution weighs constness too heavily when picking the
// correct overload, so all these trace methods have to take arguments of
// the same constness and let the argument type decide which overload applies.
template<typename T>
void trace(const WeakMember<T>& t)
{
// Check that we actually know the definition of T when tracing.
static_assert(sizeof(T), "we need to know the definition of the type we are tracing");
registerWeakCell(const_cast<WeakMember<T>&>(t).cell());
STATIC_ASSERT_IS_GARBAGE_COLLECTED(T, "cannot weak trace non garbage collected object");
}
template<typename T>
void traceInCollection(T& t, WTF::ShouldWeakPointersBeMarkedStrongly strongify)
{
HashTraits<T>::traceInCollection(Derived::fromHelper(this), t, strongify);
}
// Fallback trace method for part objects to allow individual trace methods
// to trace through a part object with visitor->trace(m_partObject). This
// takes a const argument, because otherwise it will match too eagerly: a
// non-const argument would match a non-const Vector<T>& argument better
// than the specialization that takes const Vector<T>&. For a similar reason,
// the other specializations take a const argument even though they are
// usually used with non-const arguments, otherwise this function would match
// too well.
template<typename T>
void trace(const T& t)
{
if (WTF::IsPolymorphic<T>::value) {
intptr_t vtable = *reinterpret_cast<const intptr_t*>(&t);
if (!vtable)
return;
}
const_cast<T&>(t).trace(Derived::fromHelper(this));
}
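// Part-object sketch (Outer, Inner, and Node are hypothetical): m_inner is
// embedded by value, and visitor->trace(m_inner) dispatches to Inner's own
// trace method through this fallback.
//
// class Inner {
// public:
//     void trace(Visitor* visitor) { visitor->trace(m_node); }
// private:
//     Member<Node> m_node;
// };
// class Outer : public GarbageCollected<Outer> {
// public:
//     void trace(Visitor* visitor) { visitor->trace(m_inner); }
// private:
//     Inner m_inner;
// };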
// For simple cases where you just want to zero out a cell when the thing
// it is pointing at is garbage, you can use this. This will register a
// callback for each cell that needs to be zeroed, so if you have a lot of
// weak cells in your object you should still consider using
// registerWeakMembers below.
//
// In contrast to registerWeakMembers, the weak cell callbacks are
// run on the thread performing garbage collection. Therefore, all
// threads are stopped during weak cell callbacks.
template<typename T>
void registerWeakCell(T** cell)
{
Derived::fromHelper(this)->registerWeakCellWithCallback(reinterpret_cast<void**>(cell), &handleWeakCell<T>);
}
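// Usage sketch (Watcher, Observed, and m_observed are hypothetical): the cell
// is cleared to null by the garbage collector if the pointed-to object dies,
// without the pointer keeping it alive.
//
// void Watcher::trace(Visitor* visitor)
// {
//     visitor->registerWeakCell(&m_observed); // m_observed is an Observed*
// }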
// The following trace methods are for off-heap collections.
template<typename T, size_t inlineCapacity>
void trace(const Vector<T, inlineCapacity>& vector)
{
OffHeapCollectionTraceTrait<Vector<T, inlineCapacity, WTF::DefaultAllocator> >::trace(Derived::fromHelper(this), vector);
}
template<typename T, size_t N>
void trace(const Deque<T, N>& deque)
{
OffHeapCollectionTraceTrait<Deque<T, N> >::trace(Derived::fromHelper(this), deque);
}
#if !ENABLE(OILPAN)
// These trace methods are needed to allow compiling and calling trace on
// transition types. We need to support calls in the non-oilpan build
// because a fully transitioned type (which will have its trace method
// called) might trace a field that is in transition. Once transition types
// are removed these can be removed.
template<typename T> void trace(const OwnPtr<T>&) { }
template<typename T> void trace(const RefPtr<T>&) { }
template<typename T> void trace(const RawPtr<T>&) { }
template<typename T> void trace(const WeakPtr<T>&) { }
// On non-oilpan builds, it is convenient to allow calling trace on
// WillBeHeap{Vector,Deque}<FooPtrWillBeMember<T>>.
// Forbid tracing on-heap objects in off-heap collections.
// This is forbidden because conservative marking cannot identify
// those off-heap collection backing stores.
template<typename T, size_t inlineCapacity> void trace(const Vector<OwnPtr<T>, inlineCapacity>& vector)
{
STATIC_ASSERT_IS_NOT_GARBAGE_COLLECTED(T, "cannot trace garbage collected object inside Vector");
}
template<typename T, size_t inlineCapacity> void trace(const Vector<RefPtr<T>, inlineCapacity>& vector)
{
STATIC_ASSERT_IS_NOT_GARBAGE_COLLECTED(T, "cannot trace garbage collected object inside Vector");
}
template<typename T, size_t inlineCapacity> void trace(const Vector<RawPtr<T>, inlineCapacity>& vector)
{
STATIC_ASSERT_IS_NOT_GARBAGE_COLLECTED(T, "cannot trace garbage collected object inside Vector");
}
template<typename T, size_t inlineCapacity> void trace(const Vector<WeakPtr<T>, inlineCapacity>& vector)
{
STATIC_ASSERT_IS_NOT_GARBAGE_COLLECTED(T, "cannot trace garbage collected object inside Vector");
}
template<typename T, size_t N> void trace(const Deque<OwnPtr<T>, N>& deque)
{
STATIC_ASSERT_IS_NOT_GARBAGE_COLLECTED(T, "cannot trace garbage collected object inside Deque");
}
template<typename T, size_t N> void trace(const Deque<RefPtr<T>, N>& deque)
{
STATIC_ASSERT_IS_NOT_GARBAGE_COLLECTED(T, "cannot trace garbage collected object inside Deque");
}
template<typename T, size_t N> void trace(const Deque<RawPtr<T>, N>& deque)
{
STATIC_ASSERT_IS_NOT_GARBAGE_COLLECTED(T, "cannot trace garbage collected object inside Deque");
}
template<typename T, size_t N> void trace(const Deque<WeakPtr<T>, N>& deque)
{
STATIC_ASSERT_IS_NOT_GARBAGE_COLLECTED(T, "cannot trace garbage collected object inside Deque");
}
#endif
void markNoTracing(const void* pointer) { Derived::fromHelper(this)->mark(pointer, reinterpret_cast<TraceCallback>(0)); }
void markHeaderNoTracing(HeapObjectHeader* header) { Derived::fromHelper(this)->markHeader(header, reinterpret_cast<TraceCallback>(0)); }
template<typename T> void markNoTracing(const T* pointer) { Derived::fromHelper(this)->mark(pointer, reinterpret_cast<TraceCallback>(0)); }
template<typename T, void (T::*method)(Visitor*)>
void registerWeakMembers(const T* obj)
{
Derived::fromHelper(this)->registerWeakMembers(obj, &TraceMethodDelegate<T, method>::trampoline);
}
void registerWeakMembers(const void* object, WeakPointerCallback callback)
{
Derived::fromHelper(this)->registerWeakMembers(object, object, callback);
}
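// Usage sketch (Registry, clearWeakMembers, and m_cachedNode are
// hypothetical): the object registers a member function that runs during weak
// processing and prunes pointers to dead objects itself.
//
// void Registry::trace(Visitor* visitor)
// {
//     visitor->registerWeakMembers<Registry, &Registry::clearWeakMembers>(this);
// }
// void Registry::clearWeakMembers(Visitor* visitor)
// {
//     if (m_cachedNode && !visitor->isAlive(m_cachedNode))
//         m_cachedNode = nullptr;
// }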
template<typename T> inline bool isAlive(T* obj)
{
// Check that we actually know the definition of T when tracing.
static_assert(sizeof(T), "T must be fully defined");
// The strongification of collections relies on the fact that once a
// collection has been strongified, there is no way that it can contain
// non-live entries, so no entries will be removed. Since you can't set
// the mark bit on a null pointer, that means that null pointers are
// always 'alive'.
if (!obj)
return true;
return ObjectAliveTrait<T>::isHeapObjectAlive(Derived::fromHelper(this), obj);
}
template<typename T> inline bool isAlive(const Member<T>& member)
{
return isAlive(member.get());
}
template<typename T> inline bool isAlive(RawPtr<T> ptr)
{
return isAlive(ptr.get());
}
private:
template<typename T>
static void handleWeakCell(Derived* self, void* obj)
{
T** cell = reinterpret_cast<T**>(obj);
if (*cell && !self->isAlive(*cell))
*cell = nullptr;
}
};
// Visitor is used to traverse the Blink object graph. Used for the
// marking phase of the mark-sweep garbage collector.
//
// Pointers are marked and pushed on the marking stack by calling the
// |mark| method with the pointer as an argument.
//
// Pointers within objects are traced by calling the |trace| methods
// with the object as an argument. Tracing objects will mark all of the
// contained pointers and push them on the marking stack.
class PLATFORM_EXPORT Visitor : public VisitorHelper<Visitor> {
public:
friend class VisitorHelper<Visitor>;
friend class InlinedGlobalMarkingVisitor;
enum VisitorType {
GlobalMarkingVisitorType,
GenericVisitorType,
};
virtual ~Visitor() { }
// FIXME: This is a temporary hack to cheat old Blink GC plugin checks.
// Old GC Plugin doesn't accept calling VisitorHelper<Visitor>::trace
// as a valid mark. This manual redirect works around the issue by
// placing the method declaration on the Visitor class itself.
template<typename T>
void trace(const T& t)
{
VisitorHelper<Visitor>::trace(t);
}
using VisitorHelper<Visitor>::mark;
// This method marks an object and adds it to the set of objects
// that should have their trace method called. Since not all
// objects have vtables we have to have the callback as an
// explicit argument, but we can use the templated one-argument
// mark method above to automatically provide the callback
// function.
virtual void mark(const void*, TraceCallback) = 0;
// Used to mark objects during conservative scanning.
virtual void markHeader(HeapObjectHeader*, TraceCallback) = 0;
// Used to delay the marking of objects until the usual marking
// including ephemeron iteration is done. This is used to delay
// the marking of collection backing stores until we know if they
// are reachable from locations other than the collection front
// object. If collection backings are reachable from other
// locations we strongify them to avoid issues with iterators and
// weak processing.
virtual void registerDelayedMarkNoTracing(const void*) = 0;
// If the object calls this during the regular trace callback, then the
// WeakPointerCallback argument may be called later, when the strong roots
// have all been found. The WeakPointerCallback will normally use isAlive
// to find out whether some pointers are pointing to dying objects. When
// the WeakPointerCallback is done the object must have purged all pointers
// to objects where isAlive returned false. In the weak callback it is not
// allowed to touch other objects (except using isAlive) or to allocate on
// the GC heap. Note that even removing things from HeapHashSet or
// HeapHashMap can cause an allocation if the backing store resizes, but
// these collections know to remove WeakMember elements safely.
//
// The weak pointer callbacks are run on the thread that owns the
// object and other threads are not stopped during the
// callbacks. Since isAlive is used in the callback to determine
// whether the objects pointed to are alive, it is crucial that those
// objects belong to the same thread as the object receiving
// the weak callback. Since other threads have been resumed, the
// mark bits are not valid for objects from other threads.
virtual void registerWeakMembers(const void*, const void*, WeakPointerCallback) = 0;
using VisitorHelper<Visitor>::registerWeakMembers;
virtual void registerWeakTable(const void*, EphemeronCallback, EphemeronCallback) = 0;
#if ENABLE(ASSERT)
virtual bool weakTableRegistered(const void*) = 0;
#endif
virtual bool isMarked(const void*) = 0;
virtual bool ensureMarked(const void*) = 0;
#if ENABLE(GC_PROFILE_MARKING)
void setHostInfo(void* object, const String& name)
{
m_hostObject = object;
m_hostName = name;
}
#endif
inline bool canTraceEagerly() const
{
ASSERT(m_stackFrameDepth);
return m_stackFrameDepth->isSafeToRecurse();
}
inline void configureEagerTraceLimit()
{
if (!m_stackFrameDepth)
m_stackFrameDepth = new StackFrameDepth;
m_stackFrameDepth->configureLimit();
}
inline bool isGlobalMarkingVisitor() const { return m_isGlobalMarkingVisitor; }
protected:
explicit Visitor(VisitorType type)
: m_isGlobalMarkingVisitor(type == GlobalMarkingVisitorType)
{ }
virtual void registerWeakCellWithCallback(void**, WeakPointerCallback) = 0;
#if ENABLE(GC_PROFILE_MARKING)
virtual void recordObjectGraphEdge(const void*)
{
ASSERT_NOT_REACHED();
}
void* m_hostObject;
String m_hostName;
#endif
private:
static Visitor* fromHelper(VisitorHelper<Visitor>* helper) { return static_cast<Visitor*>(helper); }
static StackFrameDepth* m_stackFrameDepth;
bool m_isGlobalMarkingVisitor;
};
// We trace vectors by using the trace trait on each element, which means you
// can have vectors of general objects (not just pointers to objects) that can
// be traced.
template<typename T, size_t N>
struct OffHeapCollectionTraceTrait<WTF::Vector<T, N, WTF::DefaultAllocator> > {
typedef WTF::Vector<T, N, WTF::DefaultAllocator> Vector;
template<typename VisitorDispatcher>
static void trace(VisitorDispatcher visitor, const Vector& vector)
{
if (vector.isEmpty())
return;
for (typename Vector::const_iterator it = vector.begin(), end = vector.end(); it != end; ++it)
TraceTrait<T>::trace(visitor, const_cast<T*>(it));
}
};
template<typename T, size_t N>
struct OffHeapCollectionTraceTrait<WTF::Deque<T, N> > {
typedef WTF::Deque<T, N> Deque;
template<typename VisitorDispatcher>
static void trace(VisitorDispatcher visitor, const Deque& deque)
{
if (deque.isEmpty())
return;
for (typename Deque::const_iterator it = deque.begin(), end = deque.end(); it != end; ++it)
TraceTrait<T>::trace(visitor, const_cast<T*>(&(*it)));
}
};
template<typename T, typename Traits = WTF::VectorTraits<T> >
class HeapVectorBacking;
template<typename Table>
class HeapHashTableBacking {
public:
static void finalize(void* pointer);
};
template<typename T>
class DefaultTraceTrait<T, false> {
public:
template<typename VisitorDispatcher>
static void mark(VisitorDispatcher visitor, const T* t)
{
// Default mark method of the trait just calls the two-argument mark
// method on the visitor. The second argument is the static trace method
// of the trait, which by default calls the instance method
// trace(Visitor*) on the object.
//
// If the trait allows it, invoke the trace callback right here on the
// not-yet-marked object.
if (TraceEagerlyTrait<T>::value) {
// Protect against too deep trace call chains, and the
// unbounded system stack usage they can bring about.
//
// Assert against deep stacks so as to flush them out,
// but test and appropriately handle them should they occur
// in release builds.
ASSERT(visitor->canTraceEagerly());
if (LIKELY(visitor->canTraceEagerly())) {
if (visitor->ensureMarked(t)) {
TraceTrait<T>::trace(visitor, const_cast<T*>(t));
}
return;
}
}
visitor->mark(const_cast<T*>(t), &TraceTrait<T>::trace);
}
#if ENABLE(ASSERT)
static void checkGCInfo(const T* t)
{
assertObjectHasGCInfo(const_cast<T*>(t), GCInfoTrait<T>::index());
}
#endif
};
template<typename T>
class DefaultTraceTrait<T, true> {
public:
template<typename VisitorDispatcher>
static void mark(VisitorDispatcher visitor, const T* self)
{
if (!self)
return;
// If you hit this ASSERT, it means that there is a dangling pointer
// from a live thread heap to a dead thread heap. We must eliminate
// the dangling pointer.
// Release builds don't have the ASSERT, but that is OK because
// release builds will crash in the self->adjustAndMark call below,
// since all the entries of the orphaned heaps are zeroed out and
// thus the object does not have a valid vtable.
ASSERT(!pageFromObject(self)->orphaned());
self->adjustAndMark(visitor);
}
#if ENABLE(ASSERT)
static void checkGCInfo(const T*) { }
#endif
};
template<typename T, bool = NeedsAdjustAndMark<T>::value> class DefaultObjectAliveTrait;
template<typename T>
class DefaultObjectAliveTrait<T, false> {
public:
static bool isHeapObjectAlive(Visitor* visitor, T* obj)
{
return visitor->isMarked(obj);
}
};
template<typename T>
class DefaultObjectAliveTrait<T, true> {
public:
static bool isHeapObjectAlive(Visitor* visitor, T* obj)
{
return obj->isHeapObjectAlive(visitor);
}
};
template<typename T> bool ObjectAliveTrait<T>::isHeapObjectAlive(Visitor* visitor, T* obj)
{
return DefaultObjectAliveTrait<T>::isHeapObjectAlive(visitor, obj);
}
// The GarbageCollectedMixin interface and helper macro
// USING_GARBAGE_COLLECTED_MIXIN can be used to automatically define
// TraceTrait/ObjectAliveTrait on non-leftmost deriving classes
// which need to be garbage collected.
//
// Consider the following case:
// class B {};
// class A : public GarbageCollected<A>, public B {};
//
// We can't correctly handle "Member<B> p = &a" as we can't compute the address
// of the object header statically. This can be solved by using GarbageCollectedMixin:
// class B : public GarbageCollectedMixin {};
// class A : public GarbageCollected<A>, public B {
// USING_GARBAGE_COLLECTED_MIXIN(A)
// };
//
// With the helper, as long as we are using Member<B>, TraceTrait<B> will
// dispatch adjustAndMark dynamically to correctly find the address of the
// object header. Note that this is only enabled for Member<B>. For Member<A>,
// whose object header address we can compute statically, this dynamic
// dispatch is not used.
class PLATFORM_EXPORT GarbageCollectedMixin {
public:
virtual void adjustAndMark(Visitor*) const = 0;
virtual bool isHeapObjectAlive(Visitor*) const = 0;
virtual void trace(Visitor*) { }
};
#define USING_GARBAGE_COLLECTED_MIXIN(TYPE) \
public: \
virtual void adjustAndMark(blink::Visitor* visitor) const override \
{ \
typedef WTF::IsSubclassOfTemplate<typename WTF::RemoveConst<TYPE>::Type, blink::GarbageCollected> IsSubclassOfGarbageCollected; \
static_assert(IsSubclassOfGarbageCollected::value, "only garbage collected objects can have garbage collected mixins"); \
if (TraceEagerlyTrait<TYPE>::value) { \
if (visitor->ensureMarked(static_cast<const TYPE*>(this))) \
TraceTrait<TYPE>::trace(visitor, const_cast<TYPE*>(this)); \
return; \
} \
visitor->mark(static_cast<const TYPE*>(this), &blink::TraceTrait<TYPE>::trace); \
} \
virtual bool isHeapObjectAlive(blink::Visitor* visitor) const override \
{ \
return visitor->isAlive(this); \
} \
private:
#if ENABLE(OILPAN)
#define WILL_BE_USING_GARBAGE_COLLECTED_MIXIN(TYPE) USING_GARBAGE_COLLECTED_MIXIN(TYPE)
#else
#define WILL_BE_USING_GARBAGE_COLLECTED_MIXIN(TYPE)
#endif
#if ENABLE(GC_PROFILING)
template<typename T>
struct TypenameStringTrait {
static const String& get()
{
DEFINE_STATIC_LOCAL(String, typenameString, (WTF::extractTypeNameFromFunctionName(WTF::extractNameFunction<T>())));
return typenameString;
}
};
#endif
// s_gcInfoTable holds the per-class GCInfo descriptors; each heap
// object header keeps its index into this table.
extern PLATFORM_EXPORT GCInfo const** s_gcInfoTable;
class GCInfoTable {
public:
PLATFORM_EXPORT static void ensureGCInfoIndex(const GCInfo*, size_t*);
static void init();
static void shutdown();
// One more than the maximum GCInfo index supported.
static const size_t maxIndex = 1 << 15;
private:
static void resize();
static int s_gcInfoIndex;
static size_t s_gcInfoTableSize;
};
// This macro should be used when returning a unique 15-bit integer
// for a given gcInfo.
#define RETURN_GCINFO_INDEX() \
static size_t gcInfoIndex = 0; \
ASSERT(s_gcInfoTable); \
if (!acquireLoad(&gcInfoIndex)) \
GCInfoTable::ensureGCInfoIndex(&gcInfo, &gcInfoIndex); \
ASSERT(gcInfoIndex >= 1); \
ASSERT(gcInfoIndex < GCInfoTable::maxIndex); \
return gcInfoIndex;
template<typename T>
struct GCInfoAtBase {
static size_t index()
{
static const GCInfo gcInfo = {
TraceTrait<T>::trace,
FinalizerTrait<T>::finalize,
FinalizerTrait<T>::nonTrivialFinalizer,
WTF::IsPolymorphic<T>::value,
#if ENABLE(GC_PROFILING)
TypenameStringTrait<T>::get()
#endif
};
RETURN_GCINFO_INDEX();
}
};
template<typename T, bool = WTF::IsSubclassOfTemplate<typename WTF::RemoveConst<T>::Type, GarbageCollected>::value> struct GetGarbageCollectedBase;
template<typename T>
struct GetGarbageCollectedBase<T, true> {
typedef typename T::GarbageCollectedBase type;
};
template<typename T>
struct GetGarbageCollectedBase<T, false> {
typedef T type;
};
template<typename T>
struct GCInfoTrait {
static size_t index()
{
return GCInfoAtBase<typename GetGarbageCollectedBase<T>::type>::index();
}
};
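// Usage sketch (the Node class is hypothetical): allocation code looks the
// index up once per type; the heap object header stores it, and the sweeper
// consults the referenced GCInfo to decide whether finalization is needed.
//
// size_t gcInfoIndex = GCInfoTrait<Node>::index();
// const GCInfo* info = s_gcInfoTable[gcInfoIndex];
// bool needsFinalization = info->hasFinalizer();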
}
#endif