Changeset 161615 in webkit


Timestamp:
Jan 9, 2014 6:28:27 PM
Author:
mhahnenberg@apple.com
Message:

Marking should be generational
https://bugs.webkit.org/show_bug.cgi?id=126552

Reviewed by Geoffrey Garen.

Source/JavaScriptCore:

Re-marking the same objects over and over is a waste of effort. This patch implements
the sticky mark bit algorithm (along with our already-present write barriers) to reduce
overhead during garbage collection caused by rescanning objects.
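
The write barriers mentioned here are the ones this patch extends in heap/Heap.h and
heap/Heap.cpp (hunks below). Condensed, with assertions and write-barrier profiling
counters stripped, the new barrier amounts to:

    // When a pointer to an unmarked (new) cell is stored into a marked (old) cell,
    // remember the old cell so the next EdenCollection re-scans it.
    inline void Heap::writeBarrier(const JSCell* from, JSCell* to)
    {
        if (!from || !isMarked(from))
            return; // Stores out of unmarked (new) cells need no barrier.
        if (!to || isMarked(to))
            return; // Old-to-old edges were already traced.
        Heap::heap(from)->addToRememberedSet(from);
    }

    void Heap::addToRememberedSet(const JSCell* cell)
    {
        if (isInRememberedSet(cell)) // Checks the per-block remembered-set bitmap.
            return;
        MarkedBlock::blockFor(cell)->setRemembered(cell);
        m_slotVisitor.unconditionallyAppend(const_cast<JSCell*>(cell));
    }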

There are now two collection modes, EdenCollection and FullCollection. EdenCollections
only visit new objects or objects that were added to the remembered set by a write barrier.
FullCollections are normal collections that visit all objects regardless of their
generation.
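
The "sticky" mark bits fall out of the new MarkedBlock::clearMarksWithCollectionType
(heap/MarkedBlock.cpp below): a FullCollection wipes the mark and remembered-set bits,
while an EdenCollection leaves them alone, so previously marked objects stay marked and
are not revisited. Stripped of logging and assertions:

    template <HeapOperation collectionType>
    void MarkedBlock::clearMarksWithCollectionType()
    {
        if (collectionType == FullCollection) {
            m_marks.clearAll();         // Start marking from scratch.
            m_rememberedSet.clearAll(); // Remembered bits are rebuilt by the write barrier afterwards.
        }
        // EdenCollection: mark bits stay "sticky"; only new objects and remembered cells get visited.
        m_state = Marked; // Set eagerly; it would become true at the end of the mark phase anyway.
    }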

In this patch, EdenCollections do not do anything in CopiedSpace. This will be fixed in
https://bugs.webkit.org/show_bug.cgi?id=126555.
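
Concretely, copyBackingStores() is now templated on the collection type and bails out
early for eden cycles (heap/Heap.cpp below; copy-phase body elided here):

    template <HeapOperation collectionType>
    void Heap::copyBackingStores()
    {
        if (collectionType == EdenCollection)
            return; // CopiedSpace is left untouched by eden cycles for now (bug 126555).
        m_storageSpace.startedCopying();
        // ... the copy phase proper runs only for FullCollection ...
        m_storageSpace.doneCopying();
    }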

  • bytecode/CodeBlock.cpp:
    (JSC::CodeBlock::visitAggregate):
  • bytecode/CodeBlock.h:
    (JSC::CodeBlockSet::mark):
  • dfg/DFGOperations.cpp:
  • heap/CodeBlockSet.cpp:
    (JSC::CodeBlockSet::add):
    (JSC::CodeBlockSet::traceMarked):
    (JSC::CodeBlockSet::rememberCurrentlyExecutingCodeBlocks):
  • heap/CodeBlockSet.h:
  • heap/CopiedBlockInlines.h:
    (JSC::CopiedBlock::reportLiveBytes):
  • heap/CopiedSpace.cpp:
    (JSC::CopiedSpace::didStartFullCollection):
  • heap/CopiedSpace.h:
    (JSC::CopiedSpace::heap):
  • heap/Heap.cpp:
    (JSC::Heap::Heap):
    (JSC::Heap::didAbandon):
    (JSC::Heap::markRoots):
    (JSC::Heap::copyBackingStores):
    (JSC::Heap::addToRememberedSet):
    (JSC::Heap::collectAllGarbage):
    (JSC::Heap::collect):
    (JSC::Heap::didAllocate):
    (JSC::Heap::writeBarrier):
  • heap/Heap.h:
    (JSC::Heap::isInRememberedSet):
    (JSC::Heap::operationInProgress):
    (JSC::Heap::shouldCollect):
    (JSC::Heap::isCollecting):
    (JSC::Heap::isWriteBarrierEnabled):
    (JSC::Heap::writeBarrier):
  • heap/HeapOperation.h:
  • heap/MarkStack.cpp:
    (JSC::MarkStackArray::~MarkStackArray):
    (JSC::MarkStackArray::clear):
    (JSC::MarkStackArray::fillVector):
  • heap/MarkStack.h:
  • heap/MarkedAllocator.cpp:
    (JSC::isListPagedOut):
    (JSC::MarkedAllocator::isPagedOut):
    (JSC::MarkedAllocator::tryAllocateHelper):
    (JSC::MarkedAllocator::addBlock):
    (JSC::MarkedAllocator::removeBlock):
    (JSC::MarkedAllocator::reset):
  • heap/MarkedAllocator.h:
    (JSC::MarkedAllocator::MarkedAllocator):
  • heap/MarkedBlock.cpp:
    (JSC::MarkedBlock::clearMarks):
    (JSC::MarkedBlock::clearRememberedSet):
    (JSC::MarkedBlock::clearMarksWithCollectionType):
    (JSC::MarkedBlock::lastChanceToFinalize):
  • heap/MarkedBlock.h: Changed atomSize to 16 bytes because we have no objects smaller
    than 16 bytes. This is also to pay for the additional Bitmap for the remembered set
    (see the sizing sketch after this list).
    (JSC::MarkedBlock::didConsumeEmptyFreeList):
    (JSC::MarkedBlock::setRemembered):
    (JSC::MarkedBlock::clearRemembered):
    (JSC::MarkedBlock::atomicClearRemembered):
    (JSC::MarkedBlock::isRemembered):
  • heap/MarkedSpace.cpp:
    (JSC::MarkedSpace::~MarkedSpace):
    (JSC::MarkedSpace::resetAllocators):
    (JSC::MarkedSpace::visitWeakSets):
    (JSC::MarkedSpace::reapWeakSets):
    (JSC::VerifyMarked::operator()):
    (JSC::MarkedSpace::clearMarks):
  • heap/MarkedSpace.h:
    (JSC::ClearMarks::operator()):
    (JSC::ClearRememberedSet::operator()):
    (JSC::MarkedSpace::didAllocateInBlock):
    (JSC::MarkedSpace::clearRememberedSet):
  • heap/SlotVisitor.cpp:
    (JSC::SlotVisitor::~SlotVisitor):
    (JSC::SlotVisitor::clearMarkStack):
  • heap/SlotVisitor.h:
    (JSC::SlotVisitor::markStack):
    (JSC::SlotVisitor::sharedData):
  • heap/SlotVisitorInlines.h:
    (JSC::SlotVisitor::internalAppend):
    (JSC::SlotVisitor::unconditionallyAppend):
    (JSC::SlotVisitor::copyLater):
    (JSC::SlotVisitor::reportExtraMemoryUsage):
    (JSC::SlotVisitor::heap):
  • jit/Repatch.cpp:
  • runtime/JSGenericTypedArrayViewInlines.h:
    (JSC::JSGenericTypedArrayView<Adaptor>::visitChildren):
  • runtime/JSPropertyNameIterator.h:
    (JSC::StructureRareData::setEnumerationCache):
  • runtime/JSString.cpp:
    (JSC::JSString::visitChildren):
  • runtime/StructureRareDataInlines.h:
    (JSC::StructureRareData::setPreviousID):
    (JSC::StructureRareData::setObjectToStringValue):
  • runtime/WeakMapData.cpp:
    (JSC::WeakMapData::visitChildren):
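
A minimal sizing sketch for the MarkedBlock.h atomSize note above, assuming atomsPerBlock
is blockSize / atomSize (that constant itself does not appear in the hunks below) and using
the 64 KB blockSize shown in the MarkedBlock.h hunk:

    #include <cstddef>

    // Per-block bitmap cost: one bit per atom.
    constexpr std::size_t blockSize = 64 * 1024;
    constexpr std::size_t bitmapBytes(std::size_t atomSize) { return (blockSize / atomSize) / 8; }

    static_assert(bitmapBytes(8) == 1024, "marks bitmap alone at the old 8-byte atomSize");
    static_assert(bitmapBytes(16) == 512, "one bitmap at the new 16-byte atomSize");
    // marks + rememberedSet at 16-byte atoms cost what marks alone did at 8-byte atoms.
    static_assert(2 * bitmapBytes(16) == bitmapBytes(8), "the extra bitmap is paid for");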

Source/WTF:

  • wtf/Bitmap.h:
    (WTF::WordType>::count): Added a cast that became necessary when Bitmap
    is used with types smaller than int32_t.

Location:
trunk/Source
Files:
31 edited

  • trunk/Source/JavaScriptCore/ChangeLog

    r161585 r161615  
  • trunk/Source/JavaScriptCore/bytecode/CodeBlock.cpp

    r161557 r161615  
    19551955        otherBlock->visitAggregate(visitor);
    19561956
    1957     visitor.reportExtraMemoryUsage(sizeof(CodeBlock));
     1957    visitor.reportExtraMemoryUsage(ownerExecutable(), sizeof(CodeBlock));
    19581958    if (m_jitCode)
    1959         visitor.reportExtraMemoryUsage(m_jitCode->size());
     1959        visitor.reportExtraMemoryUsage(ownerExecutable(), m_jitCode->size());
    19601960    if (m_instructions.size()) {
    19611961        // Divide by refCount() because m_instructions points to something that is shared
     
    19631963        // Having each CodeBlock report only its proportional share of the size is one way
    19641964        // of accomplishing this.
    1965         visitor.reportExtraMemoryUsage(m_instructions.size() * sizeof(Instruction) / m_instructions.refCount());
     1965        visitor.reportExtraMemoryUsage(ownerExecutable(), m_instructions.size() * sizeof(Instruction) / m_instructions.refCount());
    19661966    }
    19671967
  • trunk/Source/JavaScriptCore/bytecode/CodeBlock.h

    r161557 r161615  
    12701270   
    12711271    (*iter)->m_mayBeExecuting = true;
     1272#if ENABLE(GGC)
     1273    m_currentlyExecuting.append(static_cast<CodeBlock*>(candidateCodeBlock));
     1274#endif
    12721275}
    12731276
  • trunk/Source/JavaScriptCore/dfg/DFGOperations.cpp

    r161557 r161615  
    851851
    852852    ASSERT(!object->structure()->outOfLineCapacity());
     853    DeferGC deferGC(vm.heap);
    853854    Butterfly* result = object->growOutOfLineStorage(vm, 0, initialOutOfLineCapacity);
    854855    object->setButterflyWithoutChangingStructure(vm, result);
     
    861862    NativeCallFrameTracer tracer(&vm, exec);
    862863
     864    DeferGC deferGC(vm.heap);
    863865    Butterfly* result = object->growOutOfLineStorage(vm, object->structure()->outOfLineCapacity(), newSize);
    864866    object->setButterflyWithoutChangingStructure(vm, result);
  • trunk/Source/JavaScriptCore/heap/CodeBlockSet.cpp

    r161557 r161615  
    4646void CodeBlockSet::add(PassRefPtr<CodeBlock> codeBlock)
    4747{
    48     bool isNewEntry = m_set.add(codeBlock.leakRef()).isNewEntry;
     48    CodeBlock* block = codeBlock.leakRef();
     49    bool isNewEntry = m_set.add(block).isNewEntry;
    4950    ASSERT_UNUSED(isNewEntry, isNewEntry);
    5051}
     
    102103        if (!codeBlock->m_mayBeExecuting)
    103104            continue;
    104         codeBlock->visitAggregate(visitor);
     105        codeBlock->ownerExecutable()->methodTable()->visitChildren(codeBlock->ownerExecutable(), visitor);
    105106    }
     107}
     108
     109void CodeBlockSet::rememberCurrentlyExecutingCodeBlocks(Heap* heap)
     110{
     111#if ENABLE(GGC)
     112    for (size_t i = 0; i < m_currentlyExecuting.size(); ++i)
     113        heap->addToRememberedSet(m_currentlyExecuting[i]->ownerExecutable());
     114    m_currentlyExecuting.clear();
     115#else
     116    UNUSED_PARAM(heap);
     117#endif // ENABLE(GGC)
    106118}
    107119
  • trunk/Source/JavaScriptCore/heap/CodeBlockSet.h

    r161557 r161615  
    3131#include <wtf/PassRefPtr.h>
    3232#include <wtf/RefPtr.h>
     33#include <wtf/Vector.h>
    3334
    3435namespace JSC {
    3536
    3637class CodeBlock;
     38class Heap;
    3739class SlotVisitor;
    3840
     
    6668    void traceMarked(SlotVisitor&);
    6769
     70    // Add all currently executing CodeBlocks to the remembered set to be
     71    // re-scanned during the next collection.
     72    void rememberCurrentlyExecutingCodeBlocks(Heap*);
     73
    6874private:
    6975    // This is not a set of RefPtr<CodeBlock> because we need to be able to find
     
    7177    // and all, but that seemed like overkill.
    7278    HashSet<CodeBlock* > m_set;
     79    Vector<CodeBlock*> m_currentlyExecuting;
    7380};
    7481
  • trunk/Source/JavaScriptCore/heap/CopiedBlockInlines.h

    r161557 r161615  
    4343    m_liveBytes += bytes;
    4444
     45    if (isPinned())
     46        return;
     47
    4548    if (!shouldEvacuate()) {
    4649        pin();
  • trunk/Source/JavaScriptCore/heap/CopiedSpace.cpp

    r161557 r161615  
    317317}
    318318
     319void CopiedSpace::didStartFullCollection()
     320{
     321    ASSERT(heap()->operationInProgress() == FullCollection);
     322
     323    ASSERT(m_fromSpace->isEmpty());
     324
     325    for (CopiedBlock* block = m_toSpace->head(); block; block = block->next())
     326        block->didSurviveGC();
     327
     328    for (CopiedBlock* block = m_oversizeBlocks.head(); block; block = block->next())
     329        block->didSurviveGC();
     330}
     331
    319332} // namespace JSC
  • trunk/Source/JavaScriptCore/heap/CopiedSpace.h

    r161557 r161615  
    6161    CopiedAllocator& allocator() { return m_allocator; }
    6262
     63    void didStartFullCollection();
     64
    6365    void startedCopying();
    6466    void doneCopying();
     
    8082
    8183    static CopiedBlock* blockFor(void*);
     84
     85    Heap* heap() const { return m_heap; }
    8286
    8387private:
  • trunk/Source/JavaScriptCore/heap/Heap.cpp

    r161557 r161615  
    254254    , m_minBytesPerCycle(minHeapSize(m_heapType, m_ramSize))
    255255    , m_sizeAfterLastCollect(0)
    256     , m_bytesAllocatedLimit(m_minBytesPerCycle)
    257     , m_bytesAllocated(0)
    258     , m_bytesAbandoned(0)
     256    , m_bytesAllocatedThisCycle(0)
     257    , m_bytesAbandonedThisCycle(0)
     258    , m_maxEdenSize(m_minBytesPerCycle)
     259    , m_maxHeapSize(m_minBytesPerCycle)
     260    , m_shouldDoFullCollection(false)
    259261    , m_totalBytesVisited(0)
    260262    , m_totalBytesCopied(0)
     
    270272    , m_handleSet(vm)
    271273    , m_isSafeToCollect(false)
    272     , m_writeBarrierBuffer(128)
     274    , m_writeBarrierBuffer(256)
    273275    , m_vm(vm)
    274276    , m_lastGCLength(0)
     
    333335{
    334336    if (m_activityCallback)
    335         m_activityCallback->didAllocate(m_bytesAllocated + m_bytesAbandoned);
    336     m_bytesAbandoned += bytes;
     337        m_activityCallback->didAllocate(m_bytesAllocatedThisCycle + m_bytesAbandonedThisCycle);
     338    m_bytesAbandonedThisCycle += bytes;
    337339}
    338340
     
    487489    visitor.setup();
    488490    HeapRootVisitor heapRootVisitor(visitor);
     491
     492    Vector<const JSCell*> rememberedSet(m_slotVisitor.markStack().size());
     493    m_slotVisitor.markStack().fillVector(rememberedSet);
    489494
    490495    {
     
    591596    }
    592597
     598    {
     599        GCPHASE(ClearRememberedSet);
     600        for (unsigned i = 0; i < rememberedSet.size(); ++i) {
     601            const JSCell* cell = rememberedSet[i];
     602            MarkedBlock::blockFor(cell)->clearRemembered(cell);
     603        }
     604    }
     605
    593606    GCCOUNTER(VisitedValueCount, visitor.visitCount());
    594607
     
    602615#endif
    603616
    604     m_totalBytesVisited = visitor.bytesVisited();
    605     m_totalBytesCopied = visitor.bytesCopied();
     617    if (m_operationInProgress == EdenCollection) {
     618        m_totalBytesVisited += visitor.bytesVisited();
     619        m_totalBytesCopied += visitor.bytesCopied();
     620    } else {
     621        ASSERT(m_operationInProgress == FullCollection);
     622        m_totalBytesVisited = visitor.bytesVisited();
     623        m_totalBytesCopied = visitor.bytesCopied();
     624    }
    606625#if ENABLE(PARALLEL_GC)
    607626    m_totalBytesVisited += m_sharedData.childBytesVisited();
     
    616635}
    617636
     637template <HeapOperation collectionType>
    618638void Heap::copyBackingStores()
    619639{
     640    if (collectionType == EdenCollection)
     641        return;
     642
    620643    m_storageSpace.startedCopying();
    621644    if (m_storageSpace.shouldDoCopyPhase()) {
     
    628651        m_storageSpace.doneCopying();
    629652        m_sharedData.didFinishCopying();
    630     } else 
     653    } else
    631654        m_storageSpace.doneCopying();
    632655}
     
    724747}
    725748
     749void Heap::addToRememberedSet(const JSCell* cell)
     750{
     751    ASSERT(cell);
     752    ASSERT(!Options::enableConcurrentJIT() || !isCompilationThread());
     753    if (isInRememberedSet(cell))
     754        return;
     755    MarkedBlock::blockFor(cell)->setRemembered(cell);
     756    m_slotVisitor.unconditionallyAppend(const_cast<JSCell*>(cell));
     757}
     758
    726759void Heap::collectAllGarbage()
    727760{
     
    729762        return;
    730763
     764    m_shouldDoFullCollection = true;
    731765    collect();
    732766
     
    765799        m_vm->prepareToDiscardCode();
    766800    }
    767    
    768     m_operationInProgress = Collection;
    769     m_extraMemoryUsage = 0;
     801
     802    bool isFullCollection = m_shouldDoFullCollection;
     803    if (isFullCollection) {
     804        m_operationInProgress = FullCollection;
     805        m_slotVisitor.clearMarkStack();
     806        m_shouldDoFullCollection = false;
     807        if (Options::logGC())
     808            dataLog("FullCollection, ");
     809    } else {
     810#if ENABLE(GGC)
     811        m_operationInProgress = EdenCollection;
     812        if (Options::logGC())
     813            dataLog("EdenCollection, ");
     814#else
     815        m_operationInProgress = FullCollection;
     816        m_slotVisitor.clearMarkStack();
     817        if (Options::logGC())
     818            dataLog("FullCollection, ");
     819#endif
     820    }
     821    if (m_operationInProgress == FullCollection)
     822        m_extraMemoryUsage = 0;
    770823
    771824    if (m_activityCallback)
     
    781834        GCPHASE(StopAllocation);
    782835        m_objectSpace.stopAllocating();
     836        if (m_operationInProgress == FullCollection)
     837            m_storageSpace.didStartFullCollection();
     838    }
     839
     840    {
     841        GCPHASE(FlushWriteBarrierBuffer);
     842        if (m_operationInProgress == EdenCollection)
     843            m_writeBarrierBuffer.flush(*this);
     844        else
     845            m_writeBarrierBuffer.reset();
    783846    }
    784847
     
    797860    }
    798861
    799     {
     862    if (m_operationInProgress == FullCollection) {
    800863        m_blockSnapshot.resize(m_objectSpace.blocks().set().size());
    801864        MarkedBlockSnapshotFunctor functor(m_blockSnapshot);
     
    803866    }
    804867
    805     copyBackingStores();
     868    if (m_operationInProgress == FullCollection)
     869        copyBackingStores<FullCollection>();
     870    else
     871        copyBackingStores<EdenCollection>();
    806872
    807873    {
     
    820886    }
    821887
    822     m_sweeper->startSweeping(m_blockSnapshot);
    823     m_bytesAbandoned = 0;
     888    if (m_operationInProgress == FullCollection)
     889        m_sweeper->startSweeping(m_blockSnapshot);
     890
     891    {
     892        GCPHASE(AddCurrentlyExecutingCodeBlocksToRememberedSet);
     893        m_codeBlocks.rememberCurrentlyExecutingCodeBlocks(this);
     894    }
     895
     896    m_bytesAbandonedThisCycle = 0;
    824897
    825898    {
     
    832905        HeapStatistics::exitWithFailure();
    833906
     907    if (m_operationInProgress == FullCollection) {
     908        // To avoid pathological GC churn in very small and very large heaps, we set
     909        // the new allocation limit based on the current size of the heap, with a
     910        // fixed minimum.
     911        m_maxHeapSize = max(minHeapSize(m_heapType, m_ramSize), proportionalHeapSize(currentHeapSize, m_ramSize));
     912        m_maxEdenSize = m_maxHeapSize - currentHeapSize;
     913    } else {
     914        ASSERT(currentHeapSize >= m_sizeAfterLastCollect);
     915        m_maxEdenSize = m_maxHeapSize - currentHeapSize;
     916        double edenToOldGenerationRatio = (double)m_maxEdenSize / (double)m_maxHeapSize;
     917        double minEdenToOldGenerationRatio = 1.0 / 3.0;
     918        if (edenToOldGenerationRatio < minEdenToOldGenerationRatio)
     919            m_shouldDoFullCollection = true;
     920        m_maxHeapSize += currentHeapSize - m_sizeAfterLastCollect;
     921        m_maxEdenSize = m_maxHeapSize - currentHeapSize;
     922    }
     923
    834924    m_sizeAfterLastCollect = currentHeapSize;
    835925
    836     // To avoid pathological GC churn in very small and very large heaps, we set
    837     // the new allocation limit based on the current size of the heap, with a
    838     // fixed minimum.
    839     size_t maxHeapSize = max(minHeapSize(m_heapType, m_ramSize), proportionalHeapSize(currentHeapSize, m_ramSize));
    840     m_bytesAllocatedLimit = maxHeapSize - currentHeapSize;
    841 
    842     m_bytesAllocated = 0;
     926    m_bytesAllocatedThisCycle = 0;
    843927    double lastGCEndTime = WTF::monotonicallyIncreasingTime();
    844928    m_lastGCLength = lastGCEndTime - lastGCStartTime;
     
    846930    if (Options::recordGCPauseTimes())
    847931        HeapStatistics::recordGCPauseTime(lastGCStartTime, lastGCEndTime);
    848     RELEASE_ASSERT(m_operationInProgress == Collection);
     932    RELEASE_ASSERT(m_operationInProgress == EdenCollection || m_operationInProgress == FullCollection);
    849933
    850934    m_operationInProgress = NoOperation;
     
    864948        dataLog(after - before, " ms, ", currentHeapSize / 1024, " kb]\n");
    865949    }
    866 
    867 #if ENABLE(ALLOCATION_LOGGING)
    868     dataLogF("JSC GC finishing collection.\n");
    869 #endif
    870950}
    871951
     
    917997{
    918998    if (m_activityCallback)
    919         m_activityCallback->didAllocate(m_bytesAllocated + m_bytesAbandoned);
    920     m_bytesAllocated += bytes;
     999        m_activityCallback->didAllocate(m_bytesAllocatedThisCycle + m_bytesAbandonedThisCycle);
     1000    m_bytesAllocatedThisCycle += bytes;
    9211001}
    9221002
     
    9931073    decrementDeferralDepth();
    9941074    collectIfNecessaryOrDefer();
     1075}
     1076
     1077void Heap::writeBarrier(const JSCell* from)
     1078{
     1079    ASSERT_GC_OBJECT_LOOKS_VALID(const_cast<JSCell*>(from));
     1080    if (!from || !isMarked(from))
     1081        return;
     1082    Heap* heap = Heap::heap(from);
     1083    heap->addToRememberedSet(from);
    9951084}
    9961085
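
As a quick worked example of the new sizing logic in Heap::collect above: after a
FullCollection, m_maxHeapSize is derived from the current heap size and eden gets the
slack; after an EdenCollection, if less than a third of that budget is left for eden,
the next collection is promoted to a full one. The numbers below are hypothetical, for
illustration only:

    #include <cstdio>

    int main()
    {
        const double MB = 1024.0 * 1024.0;
        double maxHeapSize = 32 * MB;     // budget set by the last FullCollection
        double currentHeapSize = 24 * MB; // heap size measured after this EdenCollection
        double maxEdenSize = maxHeapSize - currentHeapSize; // 8 MB of eden headroom left

        // Mirrors the EdenCollection branch in Heap::collect: ratio 0.25 < 1/3,
        // so the next collection will be a FullCollection.
        bool shouldDoFullCollection = maxEdenSize / maxHeapSize < 1.0 / 3.0;
        std::printf("next collection: %s\n", shouldDoFullCollection ? "Full" : "Eden");
        return 0;
    }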
  • trunk/Source/JavaScriptCore/heap/Heap.h

    r161557 r161615  
    9595        static void setMarked(const void*);
    9696
     97        JS_EXPORT_PRIVATE void addToRememberedSet(const JSCell*);
     98        bool isInRememberedSet(const JSCell* cell) const
     99        {
     100            ASSERT(cell);
     101            ASSERT(!Options::enableConcurrentJIT() || !isCompilationThread());
     102            return MarkedBlock::blockFor(cell)->isRemembered(cell);
     103        }
    97104        static bool isWriteBarrierEnabled();
    98         static void writeBarrier(const JSCell*);
     105        JS_EXPORT_PRIVATE static void writeBarrier(const JSCell*);
    99106        static void writeBarrier(const JSCell*, JSValue);
    100107        static void writeBarrier(const JSCell*, JSCell*);
    101         static uint8_t* addressOfCardFor(JSCell*);
    102108
    103109        WriteBarrierBuffer& writeBarrierBuffer() { return m_writeBarrierBuffer; }
     
    121127        // true if collection is in progress
    122128        inline bool isCollecting();
     129        inline HeapOperation operationInProgress() { return m_operationInProgress; }
    123130        // true if an allocation or collection is in progress
    124131        inline bool isBusy();
     
    237244        void markProtectedObjects(HeapRootVisitor&);
    238245        void markTempSortVectors(HeapRootVisitor&);
     246        template <HeapOperation collectionType>
    239247        void copyBackingStores();
    240248        void harvestWeakReferences();
     
    258266        size_t m_sizeAfterLastCollect;
    259267
    260         size_t m_bytesAllocatedLimit;
    261         size_t m_bytesAllocated;
    262         size_t m_bytesAbandoned;
    263 
     268        size_t m_bytesAllocatedThisCycle;
     269        size_t m_bytesAbandonedThisCycle;
     270        size_t m_maxEdenSize;
     271        size_t m_maxHeapSize;
     272        bool m_shouldDoFullCollection;
    264273        size_t m_totalBytesVisited;
    265274        size_t m_totalBytesCopied;
     
    271280        GCIncomingRefCountedSet<ArrayBuffer> m_arrayBuffers;
    272281        size_t m_extraMemoryUsage;
     282
     283        HashSet<const JSCell*> m_copyingRememberedSet;
    273284
    274285        ProtectCountSet m_protectedValues;
     
    323334            return false;
    324335        if (Options::gcMaxHeapSize())
    325             return m_bytesAllocated > Options::gcMaxHeapSize() && m_isSafeToCollect && m_operationInProgress == NoOperation;
    326         return m_bytesAllocated > m_bytesAllocatedLimit && m_isSafeToCollect && m_operationInProgress == NoOperation;
     336            return m_bytesAllocatedThisCycle > Options::gcMaxHeapSize() && m_isSafeToCollect && m_operationInProgress == NoOperation;
     337        return m_bytesAllocatedThisCycle > m_maxEdenSize && m_isSafeToCollect && m_operationInProgress == NoOperation;
    327338    }
    328339
     
    334345    bool Heap::isCollecting()
    335346    {
    336         return m_operationInProgress == Collection;
     347        return m_operationInProgress == FullCollection || m_operationInProgress == EdenCollection;
    337348    }
    338349
     
    371382    inline bool Heap::isWriteBarrierEnabled()
    372383    {
    373 #if ENABLE(WRITE_BARRIER_PROFILING)
     384#if ENABLE(WRITE_BARRIER_PROFILING) || ENABLE(GGC)
    374385        return true;
    375386#else
     
    378389    }
    379390
    380     inline void Heap::writeBarrier(const JSCell*)
    381     {
     391    inline void Heap::writeBarrier(const JSCell* from, JSCell* to)
     392    {
     393#if ENABLE(WRITE_BARRIER_PROFILING)
    382394        WriteBarrierCounters::countWriteBarrier();
    383     }
    384 
    385     inline void Heap::writeBarrier(const JSCell*, JSCell*)
    386     {
     395#endif
     396        if (!from || !isMarked(from))
     397            return;
     398        if (!to || isMarked(to))
     399            return;
     400        Heap::heap(from)->addToRememberedSet(from);
     401    }
     402
     403    inline void Heap::writeBarrier(const JSCell* from, JSValue to)
     404    {
     405#if ENABLE(WRITE_BARRIER_PROFILING)
    387406        WriteBarrierCounters::countWriteBarrier();
    388     }
    389 
    390     inline void Heap::writeBarrier(const JSCell*, JSValue)
    391     {
    392         WriteBarrierCounters::countWriteBarrier();
     407#endif
     408        if (!to.isCell())
     409            return;
     410        writeBarrier(from, to.asCell());
    393411    }
    394412
  • trunk/Source/JavaScriptCore/heap/HeapOperation.h

    r161557 r161615  
    2929namespace JSC {
    3030
    31 enum HeapOperation { NoOperation, Allocation, Collection };
     31enum HeapOperation { NoOperation, Allocation, FullCollection, EdenCollection };
    3232
    3333} // namespace JSC
  • trunk/Source/JavaScriptCore/heap/MarkStack.cpp

    r161557 r161615  
    5858MarkStackArray::~MarkStackArray()
    5959{
    60     ASSERT(m_numberOfSegments == 1 && m_segments.size() == 1);
     60    ASSERT(m_numberOfSegments == 1);
     61    ASSERT(m_segments.size() == 1);
    6162    m_blockAllocator.deallocate(MarkStackSegment::destroy(m_segments.removeHead()));
     63    m_numberOfSegments--;
     64    ASSERT(!m_numberOfSegments);
     65    ASSERT(!m_segments.size());
     66}
     67
     68void MarkStackArray::clear()
     69{
     70    if (!m_segments.head())
     71        return;
     72    MarkStackSegment* next;
     73    for (MarkStackSegment* current = m_segments.head(); current->next(); current = next) {
     74        next = current->next();
     75        m_segments.remove(current);
     76        m_blockAllocator.deallocate(MarkStackSegment::destroy(current));
     77    }
     78    m_top = 0;
     79    m_numberOfSegments = 1;
     80#if !ASSERT_DISABLED
     81    m_segments.head()->m_top = 0;
     82#endif
    6283}
    6384
     
    168189}
    169190
     191void MarkStackArray::fillVector(Vector<const JSCell*>& vector)
     192{
     193    ASSERT(vector.size() == size());
     194
     195    MarkStackSegment* currentSegment = m_segments.head();
     196    if (!currentSegment)
     197        return;
     198
     199    unsigned count = 0;
     200    for (unsigned i = 0; i < m_top; ++i) {
     201        ASSERT(currentSegment->data()[i]);
     202        vector[count++] = currentSegment->data()[i];
     203    }
     204
     205    currentSegment = currentSegment->next();
     206    while (currentSegment) {
     207        for (unsigned i = 0; i < s_segmentCapacity; ++i) {
     208            ASSERT(currentSegment->data()[i]);
     209            vector[count++] = currentSegment->data()[i];
     210        }
     211        currentSegment = currentSegment->next();
     212    }
     213}
     214
    170215} // namespace JSC
  • trunk/Source/JavaScriptCore/heap/MarkStack.h

    r161557 r161615  
    5353#include "HeapBlock.h"
    5454#include <wtf/StdLibExtras.h>
     55#include <wtf/Vector.h>
    5556
    5657namespace JSC {
     
    101102    bool isEmpty();
    102103
     104    void fillVector(Vector<const JSCell*>&);
     105    void clear();
     106
    103107private:
    104108    template <size_t size> struct CapacityFromSize {
  • trunk/Source/JavaScriptCore/heap/MarkedAllocator.cpp

    r161557 r161615  
    1111namespace JSC {
    1212
    13 bool MarkedAllocator::isPagedOut(double deadline)
     13static bool isListPagedOut(double deadline, DoublyLinkedList<MarkedBlock>& list)
    1414{
    1515    unsigned itersSinceLastTimeCheck = 0;
    16     MarkedBlock* block = m_blockList.head();
     16    MarkedBlock* block = list.head();
    1717    while (block) {
    1818        block = block->next();
     
    2525        }
    2626    }
     27    return false;
     28}
    2729
     30bool MarkedAllocator::isPagedOut(double deadline)
     31{
     32    if (isListPagedOut(deadline, m_blockList))
     33        return true;
    2834    return false;
    2935}
     
    3743        DelayedReleaseScope delayedReleaseScope(*m_markedSpace);
    3844        if (m_currentBlock) {
    39             ASSERT(m_currentBlock == m_blocksToSweep);
     45            ASSERT(m_currentBlock == m_nextBlockToSweep);
    4046            m_currentBlock->didConsumeFreeList();
    41             m_blocksToSweep = m_currentBlock->next();
     47            m_nextBlockToSweep = m_currentBlock->next();
    4248        }
    4349
    44         for (MarkedBlock*& block = m_blocksToSweep; block; block = block->next()) {
     50        MarkedBlock* next;
     51        for (MarkedBlock*& block = m_nextBlockToSweep; block; block = next) {
     52            next = block->next();
     53
    4554            MarkedBlock::FreeList freeList = block->sweep(MarkedBlock::SweepToFreeList);
     55           
    4656            if (!freeList.head) {
    4757                block->didConsumeEmptyFreeList();
     58                m_blockList.remove(block);
     59                m_blockList.push(block);
     60                if (!m_lastFullBlock)
     61                    m_lastFullBlock = block;
    4862                continue;
    4963            }
     
    6983    m_freeList.head = head->next;
    7084    ASSERT(head);
     85    m_markedSpace->didAllocateInBlock(m_currentBlock);
    7186    return head;
    7287}
     
    137152   
    138153    m_blockList.append(block);
    139     m_blocksToSweep = m_currentBlock = block;
     154    m_nextBlockToSweep = m_currentBlock = block;
    140155    m_freeList = block->sweep(MarkedBlock::SweepToFreeList);
    141156    m_markedSpace->didAddBlock(block);
     
    148163        m_freeList = MarkedBlock::FreeList();
    149164    }
    150     if (m_blocksToSweep == block)
    151         m_blocksToSweep = m_blocksToSweep->next();
     165    if (m_nextBlockToSweep == block)
     166        m_nextBlockToSweep = m_nextBlockToSweep->next();
     167
     168    if (block == m_lastFullBlock)
     169        m_lastFullBlock = m_lastFullBlock->prev();
     170   
    152171    m_blockList.remove(block);
    153172}
    154173
     174void MarkedAllocator::reset()
     175{
     176    m_lastActiveBlock = 0;
     177    m_currentBlock = 0;
     178    m_freeList = MarkedBlock::FreeList();
     179    if (m_heap->operationInProgress() == FullCollection)
     180        m_lastFullBlock = 0;
     181
     182    if (m_lastFullBlock)
     183        m_nextBlockToSweep = m_lastFullBlock->next() ? m_lastFullBlock->next() : m_lastFullBlock;
     184    else
     185        m_nextBlockToSweep = m_blockList.head();
     186}
     187
    155188} // namespace JSC
  • trunk/Source/JavaScriptCore/heap/MarkedAllocator.h

    r161557 r161615  
    5353    MarkedBlock* m_currentBlock;
    5454    MarkedBlock* m_lastActiveBlock;
    55     MarkedBlock* m_blocksToSweep;
     55    MarkedBlock* m_nextBlockToSweep;
     56    MarkedBlock* m_lastFullBlock;
    5657    DoublyLinkedList<MarkedBlock> m_blockList;
    5758    size_t m_cellSize;
     
    6970    : m_currentBlock(0)
    7071    , m_lastActiveBlock(0)
    71     , m_blocksToSweep(0)
     72    , m_nextBlockToSweep(0)
     73    , m_lastFullBlock(0)
    7274    , m_cellSize(0)
    7375    , m_destructorType(MarkedBlock::None)
     
    101103#endif
    102104    return head;
    103 }
    104 
    105 inline void MarkedAllocator::reset()
    106 {
    107     m_lastActiveBlock = 0;
    108     m_currentBlock = 0;
    109     m_freeList = MarkedBlock::FreeList();
    110     m_blocksToSweep = m_blockList.head();
    111105}
    112106
  • trunk/Source/JavaScriptCore/heap/MarkedBlock.cpp

    r161557 r161615  
    198198}
    199199
     200void MarkedBlock::clearMarks()
     201{
     202    if (heap()->operationInProgress() == JSC::EdenCollection)
     203        this->clearMarksWithCollectionType<EdenCollection>();
     204    else
     205        this->clearMarksWithCollectionType<FullCollection>();
     206}
     207
     208void MarkedBlock::clearRememberedSet()
     209{
     210    m_rememberedSet.clearAll();
     211}
     212
     213template <HeapOperation collectionType>
     214void MarkedBlock::clearMarksWithCollectionType()
     215{
     216    ASSERT(collectionType == FullCollection || collectionType == EdenCollection);
     217    HEAP_LOG_BLOCK_STATE_TRANSITION(this);
     218
     219    ASSERT(m_state != New && m_state != FreeListed);
     220    if (collectionType == FullCollection) {
     221        m_marks.clearAll();
     222        m_rememberedSet.clearAll();
     223    }
     224
     225    // This will become true at the end of the mark phase. We set it now to
     226    // avoid an extra pass to do so later.
     227    m_state = Marked;
     228}
     229
     230void MarkedBlock::lastChanceToFinalize()
     231{
     232    m_weakSet.lastChanceToFinalize();
     233
     234    clearNewlyAllocated();
     235    clearMarksWithCollectionType<FullCollection>();
     236    sweep();
     237}
     238
    200239MarkedBlock::FreeList MarkedBlock::resumeAllocating()
    201240{
  • trunk/Source/JavaScriptCore/heap/MarkedBlock.h

    r161557 r161615  
    2626#include "HeapBlock.h"
    2727
     28#include "HeapOperation.h"
    2829#include "WeakSet.h"
    2930#include <wtf/Bitmap.h>
     
    7374
    7475    public:
    75         static const size_t atomSize = 8; // bytes
     76        static const size_t atomSize = 16; // bytes
    7677        static const size_t atomShiftAmount = 4; // log_2(atomSize) FIXME: Change atomSize to 16.
    7778        static const size_t blockSize = 64 * KB;
     
    141142        FreeList resumeAllocating(); // Call this if you canonicalized a block for some non-collection related purpose.
    142143        void didConsumeEmptyFreeList(); // Call this if you sweep a block, but the returned FreeList is empty.
     144        void didSweepToNoAvail(); // Call this if you sweep a block and get an empty free list back.
    143145
    144146        // Returns true if the "newly allocated" bitmap was non-null
     
    146148        bool clearNewlyAllocated();
    147149        void clearMarks();
     150        void clearRememberedSet();
     151        template <HeapOperation collectionType>
     152        void clearMarksWithCollectionType();
     153
    148154        size_t markCount();
    149155        bool isEmpty();
     
    161167        void setMarked(const void*);
    162168        void clearMarked(const void*);
     169
     170        void setRemembered(const void*);
     171        void clearRemembered(const void*);
     172        void atomicClearRemembered(const void*);
     173        bool isRemembered(const void*);
    163174
    164175        bool isNewlyAllocated(const void*);
     
    191202        size_t m_endAtom; // This is a fuzzy end. Always test for < m_endAtom.
    192203#if ENABLE(PARALLEL_GC)
    193         WTF::Bitmap<atomsPerBlock, WTF::BitmapAtomic> m_marks;
     204        WTF::Bitmap<atomsPerBlock, WTF::BitmapAtomic, uint8_t> m_marks;
     205        WTF::Bitmap<atomsPerBlock, WTF::BitmapAtomic, uint8_t> m_rememberedSet;
    194206#else
    195         WTF::Bitmap<atomsPerBlock, WTF::BitmapNotAtomic> m_marks;
     207        WTF::Bitmap<atomsPerBlock, WTF::BitmapNotAtomic, uint8_t> m_marks;
     208        WTF::Bitmap<atomsPerBlock, WTF::BitmapNotAtomic, uint8_t> m_rememberedSet;
    196209#endif
    197210        OwnPtr<WTF::Bitmap<atomsPerBlock>> m_newlyAllocated;
     
    235248    }
    236249
    237     inline void MarkedBlock::lastChanceToFinalize()
    238     {
    239         m_weakSet.lastChanceToFinalize();
    240 
    241         clearNewlyAllocated();
    242         clearMarks();
    243         sweep();
    244     }
    245 
    246250    inline MarkedAllocator* MarkedBlock::allocator() const
    247251    {
     
    292296
    293297        ASSERT(!m_newlyAllocated);
    294 #ifndef NDEBUG
    295         for (size_t i = firstAtom(); i < m_endAtom; i += m_atomsPerCell)
    296             ASSERT(m_marks.get(i));
    297 #endif
    298298        ASSERT(m_state == FreeListed);
    299299        m_state = Marked;
    300300    }
    301301
    302     inline void MarkedBlock::clearMarks()
    303     {
    304         HEAP_LOG_BLOCK_STATE_TRANSITION(this);
    305 
    306         ASSERT(m_state != New && m_state != FreeListed);
    307         m_marks.clearAll();
    308 
    309         // This will become true at the end of the mark phase. We set it now to
    310         // avoid an extra pass to do so later.
    311         m_state = Marked;
    312     }
    313 
    314302    inline size_t MarkedBlock::markCount()
    315303    {
     
    345333    {
    346334        return (reinterpret_cast<Bits>(p) - reinterpret_cast<Bits>(this)) / atomSize;
     335    }
     336
     337    inline void MarkedBlock::setRemembered(const void* p)
     338    {
     339        m_rememberedSet.set(atomNumber(p));
     340    }
     341
     342    inline void MarkedBlock::clearRemembered(const void* p)
     343    {
     344        m_rememberedSet.clear(atomNumber(p));
     345    }
     346
     347    inline void MarkedBlock::atomicClearRemembered(const void* p)
     348    {
     349        m_rememberedSet.concurrentTestAndClear(atomNumber(p));
     350    }
     351
     352    inline bool MarkedBlock::isRemembered(const void* p)
     353    {
     354        return m_rememberedSet.get(atomNumber(p));
    347355    }
    348356
  • trunk/Source/JavaScriptCore/heap/MarkedSpace.cpp

    r161557 r161615  
    106106    Free free(Free::FreeAll, this);
    107107    forEachBlock(free);
     108    ASSERT(!m_blocks.set().size());
    108109}
    109110
     
    144145    m_normalDestructorSpace.largeAllocator.reset();
    145146    m_immortalStructureDestructorSpace.largeAllocator.reset();
     147
     148    m_blocksWithNewObjects.clear();
    146149}
    147150
     
    149152{
    150153    VisitWeakSet visitWeakSet(heapRootVisitor);
    151     forEachBlock(visitWeakSet);
     154    if (m_heap->operationInProgress() == EdenCollection) {
     155        for (unsigned i = 0; i < m_blocksWithNewObjects.size(); ++i)
     156            visitWeakSet(m_blocksWithNewObjects[i]);
     157    } else
     158        forEachBlock(visitWeakSet);
    152159}
    153160
    154161void MarkedSpace::reapWeakSets()
    155162{
    156     forEachBlock<ReapWeakSet>();
     163    if (m_heap->operationInProgress() == EdenCollection) {
     164        for (unsigned i = 0; i < m_blocksWithNewObjects.size(); ++i)
     165            m_blocksWithNewObjects[i]->reapWeakSet();
     166    } else
     167        forEachBlock<ReapWeakSet>();
    157168}
    158169
     
    306317}
    307318
     319#ifndef NDEBUG
     320struct VerifyMarked : MarkedBlock::VoidFunctor {
     321    void operator()(MarkedBlock* block) { ASSERT(block->needsSweeping()); }
     322};
     323#endif
     324
     325void MarkedSpace::clearMarks()
     326{
     327    if (m_heap->operationInProgress() == EdenCollection) {
     328        for (unsigned i = 0; i < m_blocksWithNewObjects.size(); ++i)
     329            m_blocksWithNewObjects[i]->clearMarks();
     330    } else
     331        forEachBlock<ClearMarks>();
     332#ifndef NDEBUG
     333    forEachBlock<VerifyMarked>();
     334#endif
     335}
     336
    308337void MarkedSpace::willStartIterating()
    309338{
  • trunk/Source/JavaScriptCore/heap/MarkedSpace.h

    r161557 r161615  
    4747
    4848struct ClearMarks : MarkedBlock::VoidFunctor {
    49     void operator()(MarkedBlock* block) { block->clearMarks(); }
     49    void operator()(MarkedBlock* block)
     50    {
     51        block->clearMarks();
     52    }
     53};
     54
     55struct ClearRememberedSet : MarkedBlock::VoidFunctor {
     56    void operator()(MarkedBlock* block)
     57    {
     58        block->clearRememberedSet();
     59    }
    5060};
    5161
     
    106116    void didAddBlock(MarkedBlock*);
    107117    void didConsumeFreeList(MarkedBlock*);
     118    void didAllocateInBlock(MarkedBlock*);
    108119
    109120    void clearMarks();
     121    void clearRememberedSet();
    110122    void clearNewlyAllocated();
    111123    void sweep();
     
    151163    bool m_isIterating;
    152164    MarkedBlockSet m_blocks;
     165    Vector<MarkedBlock*> m_blocksWithNewObjects;
    153166
    154167    DelayedReleaseScope* m_currentDelayedReleaseScope;
     
    263276}
    264277
    265 inline void MarkedSpace::clearMarks()
    266 {
    267     forEachBlock<ClearMarks>();
     278inline void MarkedSpace::didAllocateInBlock(MarkedBlock* block)
     279{
     280    m_blocksWithNewObjects.append(block);
     281}
     282
     283inline void MarkedSpace::clearRememberedSet()
     284{
     285    forEachBlock<ClearRememberedSet>();
    268286}
    269287
  • trunk/Source/JavaScriptCore/heap/SlotVisitor.cpp

    r161557 r161615  
    3434SlotVisitor::~SlotVisitor()
    3535{
    36     ASSERT(m_stack.isEmpty());
     36    clearMarkStack();
    3737}
    3838
     
    6262        m_shouldHashCons = false;
    6363    }
     64}
     65
     66void SlotVisitor::clearMarkStack()
     67{
     68    m_stack.clear();
    6469}
    6570
  • trunk/Source/JavaScriptCore/heap/SlotVisitor.h

    r161557 r161615  
    5050    ~SlotVisitor();
    5151
     52    MarkStackArray& markStack() { return m_stack; }
     53
     54    Heap* heap() const;
     55
    5256    void append(ConservativeRoots&);
    5357   
     
    6266    template<typename T>
    6367    void appendUnbarrieredWeak(Weak<T>*);
     68    void unconditionallyAppend(JSCell*);
    6469   
    6570    void addOpaqueRoot(void*);
     
    6873    int opaqueRootCount();
    6974
    70     GCThreadSharedData& sharedData() { return m_shared; }
     75    GCThreadSharedData& sharedData() const { return m_shared; }
    7176    bool isEmpty() { return m_stack.isEmpty(); }
    7277
    7378    void setup();
    7479    void reset();
     80    void clearMarkStack();
    7581
    7682    size_t bytesVisited() const { return m_bytesVisited; }
     
    9096    void copyLater(JSCell*, CopyToken, void*, size_t);
    9197   
    92     void reportExtraMemoryUsage(size_t size);
     98    void reportExtraMemoryUsage(JSCell* owner, size_t);
    9399   
    94100    void addWeakReferenceHarvester(WeakReferenceHarvester*);
  • trunk/Source/JavaScriptCore/heap/SlotVisitorInlines.h

    r161557 r161615  
    106106    MARK_LOG_CHILD(*this, cell);
    107107
     108    unconditionallyAppend(cell);
     109}
     110
     111ALWAYS_INLINE void SlotVisitor::unconditionallyAppend(JSCell* cell)
     112{
     113    ASSERT(Heap::isMarked(cell));
     114    m_visitCount++;
     115       
    108116    // Should never attempt to mark something that is zapped.
    109117    ASSERT(!cell->isZapped());
     
    219227{
    220228    ASSERT(bytes);
     229    // We don't do any copying during EdenCollections.
     230    ASSERT(heap()->operationInProgress() != EdenCollection);
     231
    221232    m_bytesCopied += bytes;
    222233
     
    227238    }
    228239
    229     if (block->isPinned())
    230         return;
    231 
    232240    block->reportLiveBytes(owner, token, bytes);
    233241}
    234242   
    235 inline void SlotVisitor::reportExtraMemoryUsage(size_t size)
    236 {
     243inline void SlotVisitor::reportExtraMemoryUsage(JSCell* owner, size_t size)
     244{
     245    // We don't want to double-count the extra memory that was reported in previous collections.
     246    if (heap()->operationInProgress() == EdenCollection && MarkedBlock::blockFor(owner)->isRemembered(owner))
     247        return;
     248
    237249    size_t* counter = &m_shared.m_vm->heap.m_extraMemoryUsage;
    238250   
     
    248260}
    249261
     262inline Heap* SlotVisitor::heap() const
     263{
     264    return &sharedData().m_vm->heap;
     265}
     266
    250267} // namespace JSC
    251268
  • trunk/Source/JavaScriptCore/jit/Repatch.cpp

    r161557 r161615  
    4040#include "RepatchBuffer.h"
    4141#include "ScratchRegisterAllocator.h"
     42#include "StackAlignment.h"
    4243#include "StructureRareDataInlines.h"
    4344#include "StructureStubClearingWatchpoint.h"
  • trunk/Source/JavaScriptCore/runtime/JSGenericTypedArrayViewInlines.h

    r161557 r161615  
    448448       
    449449    case OversizeTypedArray: {
    450         visitor.reportExtraMemoryUsage(thisObject->byteSize());
     450        visitor.reportExtraMemoryUsage(thisObject, thisObject->byteSize());
    451451        break;
    452452    }
  • trunk/Source/JavaScriptCore/runtime/JSPropertyNameIterator.h

    r161557 r161615  
    110110    }
    111111   
    112     inline void StructureRareData::setEnumerationCache(VM& vm, const Structure* owner, JSPropertyNameIterator* value)
     112    inline void StructureRareData::setEnumerationCache(VM& vm, const Structure*, JSPropertyNameIterator* value)
    113113    {
    114         m_enumerationCache.set(vm, owner, value);
     114        m_enumerationCache.set(vm, this, value);
    115115    }
    116116
  • trunk/Source/JavaScriptCore/runtime/JSString.cpp

    r161557 r161615  
    7373        StringImpl* impl = thisObject->m_value.impl();
    7474        ASSERT(impl);
    75         visitor.reportExtraMemoryUsage(impl->costDuringGC());
     75        visitor.reportExtraMemoryUsage(thisObject, impl->costDuringGC());
    7676    }
    7777}
  • trunk/Source/JavaScriptCore/runtime/StructureRareDataInlines.h

    r161557 r161615  
    3636}
    3737
    38 inline void StructureRareData::setPreviousID(VM& vm, Structure* transition, Structure* structure)
     38inline void StructureRareData::setPreviousID(VM& vm, Structure*, Structure* structure)
    3939{
    40     m_previous.set(vm, transition, structure);
     40    m_previous.set(vm, this, structure);
    4141}
    4242
     
    5151}
    5252
    53 inline void StructureRareData::setObjectToStringValue(VM& vm, const JSCell* owner, JSString* value)
     53inline void StructureRareData::setObjectToStringValue(VM& vm, const JSCell*, JSString* value)
    5454{
    55     m_objectToStringValue.set(vm, owner, value);
     55    m_objectToStringValue.set(vm, this, value);
    5656}
    5757
  • trunk/Source/JavaScriptCore/runtime/WeakMapData.cpp

    r161557 r161615  
    6565    // This isn't exact, but it is close enough, and proportional to the actual
    6666    // external mermory usage.
    67     visitor.reportExtraMemoryUsage(thisObj->m_map.capacity() * (sizeof(JSObject*) + sizeof(WriteBarrier<Unknown>)));
     67    visitor.reportExtraMemoryUsage(thisObj, thisObj->m_map.capacity() * (sizeof(JSObject*) + sizeof(WriteBarrier<Unknown>)));
    6868}
    6969
  • trunk/Source/WTF/ChangeLog

    r161610 r161615  
  • trunk/Source/WTF/wtf/Bitmap.h

    r161557 r161615  
    197197    }
    198198    for (size_t i = start / wordSize; i < words; ++i)
    199         result += WTF::bitCount(bits[i]);
     199        result += WTF::bitCount(static_cast<unsigned>(bits[i]));
    200200    return result;
    201201}