Changeset 227617 in webkit


Timestamp:
Jan 25, 2018 11:32:00 AM
Author:
fpizlo@apple.com
Message:

JSC GC should support TLCs (thread local caches)
https://bugs.webkit.org/show_bug.cgi?id=181559

Reviewed by Mark Lam and Saam Barati.
Source/JavaScriptCore:


This is a big step towards object distancing by site origin. This patch implements TLCs, or
thread-local caches, which allow each thread to allocate from its own free lists. It also
means that any given thread can context-switch TLCs. This will allow us to do separate
allocation for separate site origins. Eventually, once we reshape how MarkedBlock looks, this
will allow us to have a hard distancing constraint between objects from different origins.

In this new design, every "size class" is represented as a BlockDirectory (formerly known as
MarkedAllocator, prior to r226822). This contains a bag of blocks allocated using some
aligned memory allocator (which roughly represents which cage you came out of), and anyone
using the same allocator can share those blocks - but so long as they are in that
BlockDirectory, they will have the size and type of that directory. Previously, each
BlockDirectory had exactly one FreeList. Now, each BlockDirectory has a doubly-linked list of
LocalAllocators, each of which has a FreeList.
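
The resulting shape, as a rough sketch (hypothetical, simplified field names; the real classes
carry more state than this):

    // Sketch only. One BlockDirectory per size class; any number of
    // LocalAllocators can feed from it, one per ThreadLocalCache that has
    // allocated in this size class.
    class LocalAllocator {
        FreeList m_freeList;         // this thread's free list for the size class
        BlockDirectory* m_directory; // the shared directory we refill from
        LocalAllocator* m_prev;      // membership in the directory's list
        LocalAllocator* m_next;
    };

    class BlockDirectory {
        AlignedMemoryAllocator* m_alignedAllocator; // roughly: which cage
        Vector<MarkedBlock*> m_blocks;              // shared bag of blocks
        LocalAllocator* m_localAllocators;          // head of doubly-linked list
        unsigned m_cellSize;                        // all blocks here use this size
    };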

To decide which LocalAllocator to allocate out of, we need a ThreadLocalCache and a
BlockDirectory. The directory gives us an offset-within-the-ThreadLocalCache, which we simply
call the Allocator (which is just a POD type that contains a 32-bit offset). Each allocation
starts by figuring out what Allocator it wants (often we have this information at JIT time).
Then the allocation loads its ThreadLocalCache::Data from a fast TLS slot. Then we add the
Allocator offset to the ThreadLocalCache::Data to get the LocalAllocator. Note that we use
offsets as opposed to indices to make it easy to do the math on each allocation (if
LocalAllocator had a weird size then every allocation would have to do an imul).
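
In pseudocode, the fast path is just a TLS load and an add (a sketch with simplified,
hypothetical signatures; the real versions live in AllocatorInlines.h and
ThreadLocalCacheInlines.h and are also emitted directly by the JITs):

    // Sketch only: finding the LocalAllocator for a given Allocator offset.
    inline LocalAllocator* localAllocatorFor(Allocator allocator)
    {
        // One load from a fast TLS slot gives this thread's cache data.
        ThreadLocalCache::Data* data = ThreadLocalCache::getImpl();
        // One add gives the LocalAllocator: the Allocator is a byte offset,
        // so there is no index-times-size multiply on this path.
        return reinterpret_cast<LocalAllocator*>(
            reinterpret_cast<char*>(data) + allocator.offset());
    }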

This is a definite slow-down on GC-heavy benchmarks, but by a small margin, and only on
unusually heavy tests. For example, boyer and splay are both 3% regressed, but the Octane
geomean is just fine. The JetStream score regressed by 0.5% with p = 0.08 (so maybe there is
something there, but it's not significant according to our threshold).

Relanding after fixing an ARM64 bug in AssemblyHelpers::emitAllocateWithNonNullAllocator(). That
function needs to be careful to avoid using the scratch register because the FTL will call it
in disallow-scratch-register mode.

  • JavaScriptCore.xcodeproj/project.pbxproj:
  • Sources.txt:
  • b3/B3LowerToAir.cpp:
  • b3/B3PatchpointSpecial.cpp:

(JSC::B3::PatchpointSpecial::admitsStack):

  • b3/B3StackmapSpecial.cpp:

(JSC::B3::StackmapSpecial::forEachArgImpl):
(JSC::B3::StackmapSpecial::isArgValidForRep):

  • b3/B3StackmapValue.cpp:

(JSC::B3::StackmapValue::appendSomeRegisterWithClobber):

  • b3/B3StackmapValue.h:
  • b3/B3Validate.cpp:
  • b3/B3ValueRep.cpp:

(JSC::B3::ValueRep::addUsedRegistersTo const):
(JSC::B3::ValueRep::dump const):
(WTF::printInternal):

  • b3/B3ValueRep.h:

(JSC::B3::ValueRep::ValueRep):

  • bytecode/AccessCase.cpp:

(JSC::AccessCase::generateImpl):

  • bytecode/ObjectAllocationProfile.h:

(JSC::ObjectAllocationProfile::ObjectAllocationProfile):
(JSC::ObjectAllocationProfile::clear):

  • bytecode/ObjectAllocationProfileInlines.h:

(JSC::ObjectAllocationProfile::initializeProfile):

  • dfg/DFGSpeculativeJIT.cpp:

(JSC::DFG::SpeculativeJIT::emitAllocateRawObject):
(JSC::DFG::SpeculativeJIT::compileMakeRope):
(JSC::DFG::SpeculativeJIT::compileAllocatePropertyStorage):
(JSC::DFG::SpeculativeJIT::compileReallocatePropertyStorage):
(JSC::DFG::SpeculativeJIT::compileCreateThis):
(JSC::DFG::SpeculativeJIT::compileNewObject):

  • dfg/DFGSpeculativeJIT.h:

(JSC::DFG::SpeculativeJIT::emitAllocateJSCell):
(JSC::DFG::SpeculativeJIT::emitAllocateJSObject):

  • ftl/FTLAbstractHeapRepository.h:
  • ftl/FTLLowerDFGToB3.cpp:

(JSC::FTL::DFG::LowerDFGToB3::compileMakeRope):
(JSC::FTL::DFG::LowerDFGToB3::compileMaterializeNewObject):
(JSC::FTL::DFG::LowerDFGToB3::allocatePropertyStorageWithSizeImpl):
(JSC::FTL::DFG::LowerDFGToB3::allocateHeapCell):
(JSC::FTL::DFG::LowerDFGToB3::allocateObject):
(JSC::FTL::DFG::LowerDFGToB3::allocatorForSize):
(JSC::FTL::DFG::LowerDFGToB3::allocateVariableSizedObject):
(JSC::FTL::DFG::LowerDFGToB3::allocateVariableSizedCell):

  • heap/Allocator.cpp: Added.

(JSC::Allocator::cellSize const):

  • heap/Allocator.h: Added.

(JSC::Allocator::Allocator):
(JSC::Allocator::offset const):
(JSC::Allocator::operator== const):
(JSC::Allocator::operator!= const):
(JSC::Allocator::operator bool const):

  • heap/AllocatorInlines.h: Added.

(JSC::Allocator::allocate const):
(JSC::Allocator::tryAllocate const):

  • heap/BlockDirectory.cpp:

(JSC::BlockDirectory::BlockDirectory):
(JSC::BlockDirectory::findBlockForAllocation):
(JSC::BlockDirectory::stopAllocating):
(JSC::BlockDirectory::prepareForAllocation):
(JSC::BlockDirectory::stopAllocatingForGood):
(JSC::BlockDirectory::resumeAllocating):
(JSC::BlockDirectory::endMarking):
(JSC::BlockDirectory::isFreeListedCell):
(JSC::BlockDirectory::didConsumeFreeList): Deleted.
(JSC::BlockDirectory::tryAllocateWithoutCollecting): Deleted.
(JSC::BlockDirectory::allocateIn): Deleted.
(JSC::BlockDirectory::tryAllocateIn): Deleted.
(JSC::BlockDirectory::doTestCollectionsIfNeeded): Deleted.
(JSC::BlockDirectory::allocateSlowCase): Deleted.

  • heap/BlockDirectory.h:

(JSC::BlockDirectory::cellKind const):
(JSC::BlockDirectory::allocator const):
(JSC::BlockDirectory::freeList const): Deleted.
(JSC::BlockDirectory::offsetOfFreeList): Deleted.
(JSC::BlockDirectory::offsetOfCellSize): Deleted.

  • heap/BlockDirectoryInlines.h:

(JSC::BlockDirectory::isFreeListedCell const): Deleted.
(JSC::BlockDirectory::allocate): Deleted.

  • heap/CompleteSubspace.cpp:

(JSC::CompleteSubspace::CompleteSubspace):
(JSC::CompleteSubspace::allocatorFor):
(JSC::CompleteSubspace::allocate):
(JSC::CompleteSubspace::allocateNonVirtual):
(JSC::CompleteSubspace::allocatorForSlow):
(JSC::CompleteSubspace::allocateSlow):
(JSC::CompleteSubspace::tryAllocateSlow):

  • heap/CompleteSubspace.h:

(JSC::CompleteSubspace::allocatorForSizeStep):
(JSC::CompleteSubspace::allocatorForNonVirtual):

  • heap/FreeList.h:
  • heap/GCDeferralContext.h:
  • heap/Heap.cpp:

(JSC::Heap::Heap):
(JSC::Heap::lastChanceToFinalize):

  • heap/Heap.h:

(JSC::Heap::threadLocalCacheLayout):

  • heap/IsoCellSet.h:
  • heap/IsoSubspace.cpp:

(JSC::IsoSubspace::IsoSubspace):
(JSC::IsoSubspace::allocatorFor):
(JSC::IsoSubspace::allocate):
(JSC::IsoSubspace::allocateNonVirtual):

  • heap/IsoSubspace.h:

(JSC::IsoSubspace::allocatorForNonVirtual):

  • heap/LocalAllocator.cpp: Added.

(JSC::LocalAllocator::LocalAllocator):
(JSC::LocalAllocator::reset):
(JSC::LocalAllocator::~LocalAllocator):
(JSC::LocalAllocator::stopAllocating):
(JSC::LocalAllocator::resumeAllocating):
(JSC::LocalAllocator::prepareForAllocation):
(JSC::LocalAllocator::stopAllocatingForGood):
(JSC::LocalAllocator::allocateSlowCase):
(JSC::LocalAllocator::didConsumeFreeList):
(JSC::LocalAllocator::tryAllocateWithoutCollecting):
(JSC::LocalAllocator::allocateIn):
(JSC::LocalAllocator::tryAllocateIn):
(JSC::LocalAllocator::doTestCollectionsIfNeeded):
(JSC::LocalAllocator::isFreeListedCell const):

  • heap/LocalAllocator.h: Added.

(JSC::LocalAllocator::offsetOfFreeList):
(JSC::LocalAllocator::offsetOfCellSize):

  • heap/LocalAllocatorInlines.h: Added.

(JSC::LocalAllocator::allocate):

  • heap/MarkedSpace.cpp:

(JSC::MarkedSpace::stopAllocatingForGood):

  • heap/MarkedSpace.h:
  • heap/SlotVisitor.cpp:
  • heap/SlotVisitor.h:
  • heap/Subspace.h:
  • heap/ThreadLocalCache.cpp: Added.

(JSC::ThreadLocalCache::create):
(JSC::ThreadLocalCache::ThreadLocalCache):
(JSC::ThreadLocalCache::~ThreadLocalCache):
(JSC::ThreadLocalCache::allocateData):
(JSC::ThreadLocalCache::destroyData):
(JSC::ThreadLocalCache::installSlow):
(JSC::ThreadLocalCache::installData):
(JSC::ThreadLocalCache::allocatorSlow):
(JSC::ThreadLocalCache::destructor):

  • heap/ThreadLocalCache.h: Added.

(JSC::ThreadLocalCache::offsetOfSize):
(JSC::ThreadLocalCache::offsetOfFirstAllocator):

  • heap/ThreadLocalCacheInlines.h: Added.

(JSC::ThreadLocalCache::getImpl):
(JSC::ThreadLocalCache::get):
(JSC::ThreadLocalCache::install):
(JSC::ThreadLocalCache::allocator):
(JSC::ThreadLocalCache::tryGetAllocator):

  • heap/ThreadLocalCacheLayout.cpp: Added.

(JSC::ThreadLocalCacheLayout::ThreadLocalCacheLayout):
(JSC::ThreadLocalCacheLayout::~ThreadLocalCacheLayout):
(JSC::ThreadLocalCacheLayout::allocateOffset):
(JSC::ThreadLocalCacheLayout::snapshot):
(JSC::ThreadLocalCacheLayout::directory):

  • heap/ThreadLocalCacheLayout.h: Added.
  • jit/AssemblyHelpers.cpp:

(JSC::AssemblyHelpers::emitAllocateWithNonNullAllocator):
(JSC::AssemblyHelpers::emitAllocate):
(JSC::AssemblyHelpers::emitAllocateVariableSized):

  • jit/AssemblyHelpers.h:

(JSC::AssemblyHelpers::vm):
(JSC::AssemblyHelpers::emitAllocateJSCell):
(JSC::AssemblyHelpers::emitAllocateJSObject):
(JSC::AssemblyHelpers::emitAllocateJSObjectWithKnownSize):
(JSC::AssemblyHelpers::emitAllocateWithNonNullAllocator): Deleted.
(JSC::AssemblyHelpers::emitAllocate): Deleted.
(JSC::AssemblyHelpers::emitAllocateVariableSized): Deleted.

  • jit/JITOpcodes.cpp:

(JSC::JIT::emit_op_new_object):
(JSC::JIT::emit_op_create_this):

  • jit/JITOpcodes32_64.cpp:

(JSC::JIT::emit_op_new_object):
(JSC::JIT::emit_op_create_this):

  • runtime/ButterflyInlines.h:

(JSC::Butterfly::createUninitialized):
(JSC::Butterfly::tryCreate):
(JSC::Butterfly::growArrayRight):

  • runtime/DirectArguments.cpp:

(JSC::DirectArguments::overrideThings):

  • runtime/GenericArgumentsInlines.h:

(JSC::GenericArguments<Type>::initModifiedArgumentsDescriptor):

  • runtime/HashMapImpl.h:

(JSC::HashMapBuffer::create):

  • runtime/JSArray.cpp:

(JSC::JSArray::tryCreateUninitializedRestricted):
(JSC::JSArray::unshiftCountSlowCase):

  • runtime/JSArray.h:

(JSC::JSArray::tryCreate):

  • runtime/JSArrayBufferView.cpp:

(JSC::JSArrayBufferView::ConstructionContext::ConstructionContext):

  • runtime/JSCellInlines.h:

(JSC::tryAllocateCellHelper):

  • runtime/JSGlobalObject.cpp:

(JSC::JSGlobalObject::JSGlobalObject):

  • runtime/JSGlobalObject.h:

(JSC::JSGlobalObject::threadLocalCache const):

  • runtime/JSLock.cpp:

(JSC::JSLock::didAcquireLock):

  • runtime/Options.h:
  • runtime/RegExpMatchesArray.h:

(JSC::tryCreateUninitializedRegExpMatchesArray):

  • runtime/VM.cpp:

(JSC::VM::VM):

  • runtime/VM.h:
  • runtime/VMEntryScope.cpp:

(JSC::VMEntryScope::VMEntryScope):

Source/WTF:

  • wtf/Bitmap.h: Just fixing a compile error.
Location:
trunk/Source
Files:
12 added
58 edited

  • trunk/Source/JavaScriptCore/ChangeLog

  • trunk/Source/JavaScriptCore/JavaScriptCore.xcodeproj/project.pbxproj

    --- r227609
    +++ r227617
    @@ -390,4 +390,11 @@
                     0F725CB01C506D3B00AD943A /* B3FoldPathConstants.h in Headers */ = {isa = PBXBuildFile; fileRef = 0F725CAE1C506D3B00AD943A /* B3FoldPathConstants.h */; };
                     0F74B93B1F89614800B935D3 /* PrototypeKey.h in Headers */ = {isa = PBXBuildFile; fileRef = 0F74B93A1F89614500B935D3 /* PrototypeKey.h */; settings = {ATTRIBUTES = (Private, ); }; };
    +                0F75A05E200D25F60038E2CF /* ThreadLocalCache.h in Headers */ = {isa = PBXBuildFile; fileRef = 0F75A055200D25EF0038E2CF /* ThreadLocalCache.h */; settings = {ATTRIBUTES = (Private, ); }; };
    +                0F75A060200D260B0038E2CF /* LocalAllocatorInlines.h in Headers */ = {isa = PBXBuildFile; fileRef = 0F75A05A200D25F00038E2CF /* LocalAllocatorInlines.h */; };
    +                0F75A061200D26180038E2CF /* LocalAllocator.h in Headers */ = {isa = PBXBuildFile; fileRef = 0F75A057200D25F00038E2CF /* LocalAllocator.h */; settings = {ATTRIBUTES = (Private, ); }; };
    +                0F75A062200D261D0038E2CF /* AllocatorInlines.h in Headers */ = {isa = PBXBuildFile; fileRef = 0F75A05D200D25F10038E2CF /* AllocatorInlines.h */; };
    +                0F75A063200D261F0038E2CF /* Allocator.h in Headers */ = {isa = PBXBuildFile; fileRef = 0F75A054200D25EF0038E2CF /* Allocator.h */; settings = {ATTRIBUTES = (Private, ); }; };
    +                0F75A064200D26280038E2CF /* ThreadLocalCacheLayout.h in Headers */ = {isa = PBXBuildFile; fileRef = 0F75A05C200D25F10038E2CF /* ThreadLocalCacheLayout.h */; settings = {ATTRIBUTES = (Private, ); }; };
    +                0F75A0662013E4F10038E2CF /* JITAllocator.h in Headers */ = {isa = PBXBuildFile; fileRef = 0F75A0652013E4EF0038E2CF /* JITAllocator.h */; settings = {ATTRIBUTES = (Private, ); }; };
                     0F766D2C15A8CC3A008F363E /* JITStubRoutineSet.h in Headers */ = {isa = PBXBuildFile; fileRef = 0F766D2A15A8CC34008F363E /* JITStubRoutineSet.h */; settings = {ATTRIBUTES = (Private, ); }; };
                     0F766D3015A8DCE2008F363E /* GCAwareJITStubRoutine.h in Headers */ = {isa = PBXBuildFile; fileRef = 0F766D2E15A8DCDD008F363E /* GCAwareJITStubRoutine.h */; settings = {ATTRIBUTES = (Private, ); }; };
    @@ -2420,4 +2427,15 @@
                     0F725CAE1C506D3B00AD943A /* B3FoldPathConstants.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = B3FoldPathConstants.h; path = b3/B3FoldPathConstants.h; sourceTree = "<group>"; };
                     0F74B93A1F89614500B935D3 /* PrototypeKey.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = PrototypeKey.h; sourceTree = "<group>"; };
    +                0F75A054200D25EF0038E2CF /* Allocator.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = Allocator.h; sourceTree = "<group>"; };
    +                0F75A055200D25EF0038E2CF /* ThreadLocalCache.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = ThreadLocalCache.h; sourceTree = "<group>"; };
    +                0F75A056200D25EF0038E2CF /* ThreadLocalCacheInlines.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = ThreadLocalCacheInlines.h; sourceTree = "<group>"; };
    +                0F75A057200D25F00038E2CF /* LocalAllocator.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = LocalAllocator.h; sourceTree = "<group>"; };
    +                0F75A058200D25F00038E2CF /* ThreadLocalCache.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = ThreadLocalCache.cpp; sourceTree = "<group>"; };
    +                0F75A059200D25F00038E2CF /* LocalAllocator.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = LocalAllocator.cpp; sourceTree = "<group>"; };
    +                0F75A05A200D25F00038E2CF /* LocalAllocatorInlines.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = LocalAllocatorInlines.h; sourceTree = "<group>"; };
    +                0F75A05B200D25F10038E2CF /* ThreadLocalCacheLayout.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = ThreadLocalCacheLayout.cpp; sourceTree = "<group>"; };
    +                0F75A05C200D25F10038E2CF /* ThreadLocalCacheLayout.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = ThreadLocalCacheLayout.h; sourceTree = "<group>"; };
    +                0F75A05D200D25F10038E2CF /* AllocatorInlines.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = AllocatorInlines.h; sourceTree = "<group>"; };
    +                0F75A0652013E4EF0038E2CF /* JITAllocator.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = JITAllocator.h; sourceTree = "<group>"; };
                     0F766D1C15A5028D008F363E /* JITStubRoutine.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = JITStubRoutine.h; sourceTree = "<group>"; };
                     0F766D2615A8CC1B008F363E /* JITStubRoutine.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = JITStubRoutine.cpp; sourceTree = "<group>"; };
    @@ -5448,4 +5466,5 @@
                                     FE1220251BE7F5640039E6F2 /* JITAddGenerator.cpp */,
                                     FE1220261BE7F5640039E6F2 /* JITAddGenerator.h */,
    +                                0F75A0652013E4EF0038E2CF /* JITAllocator.h */,
                                     86A90ECF0EE7D51F00AB350D /* JITArithmetic.cpp */,
                                     A75706DD118A2BCF0057F88F /* JITArithmetic32_64.cpp */,
    @@ -5541,4 +5560,14 @@
                             isa = PBXGroup;
                             children = (
    +                                0F75A054200D25EF0038E2CF /* Allocator.h */,
    +                                0F75A05D200D25F10038E2CF /* AllocatorInlines.h */,
    +                                0F75A059200D25F00038E2CF /* LocalAllocator.cpp */,
    +                                0F75A057200D25F00038E2CF /* LocalAllocator.h */,
    +                                0F75A05A200D25F00038E2CF /* LocalAllocatorInlines.h */,
    +                                0F75A058200D25F00038E2CF /* ThreadLocalCache.cpp */,
    +                                0F75A055200D25EF0038E2CF /* ThreadLocalCache.h */,
    +                                0F75A056200D25EF0038E2CF /* ThreadLocalCacheInlines.h */,
    +                                0F75A05B200D25F10038E2CF /* ThreadLocalCacheLayout.cpp */,
    +                                0F75A05C200D25F10038E2CF /* ThreadLocalCacheLayout.h */,
                                     0FEC3C501F33A41600F59B6C /* AlignedMemoryAllocator.cpp */,
                                     0FEC3C511F33A41600F59B6C /* AlignedMemoryAllocator.h */,
    @@ -8049,4 +8078,5 @@
                                     0FEC85771BDACDC70080FF74 /* AirFrequentedBlock.h in Headers */,
                                     0FEC85791BDACDC70080FF74 /* AirGenerate.h in Headers */,
    +                                0F75A061200D26180038E2CF /* LocalAllocator.h in Headers */,
                                     0FEC857A1BDACDC70080FF74 /* AirGenerationContext.h in Headers */,
                                     0FEC857C1BDACDC70080FF74 /* AirHandleCalleeSaves.h in Headers */,
    @@ -8214,4 +8244,5 @@
                                     0F33FCF81C136E2500323F67 /* B3StackmapGenerationParams.h in Headers */,
                                     0FEC85311BDACDAC0080FF74 /* B3StackmapSpecial.h in Headers */,
    +                                0F75A064200D26280038E2CF /* ThreadLocalCacheLayout.h in Headers */,
                                     0F338DF21BE93AD10013C88F /* B3StackmapValue.h in Headers */,
                                     0F9495881C57F47500413A48 /* B3StackSlot.h in Headers */,
    @@ -8383,4 +8414,5 @@
                                     0F2DD8121AB3D8BE00BBB8E8 /* DFGArgumentsEliminationPhase.h in Headers */,
                                     0F2DD8141AB3D8BE00BBB8E8 /* DFGArgumentsUtilities.h in Headers */,
    +                                0F75A063200D261F0038E2CF /* Allocator.h in Headers */,
                                     0F485322187750560083B687 /* DFGArithMode.h in Headers */,
                                     0F05C3B41683CF9200BAF45B /* DFGArrayifySlowPathGenerator.h in Headers */,
    @@ -8663,4 +8695,5 @@
                                     2AACE63D18CA5A0300ED0191 /* GCActivityCallback.h in Headers */,
                                     BCBE2CAE14E985AA000593AD /* GCAssertions.h in Headers */,
    +                                0F75A060200D260B0038E2CF /* LocalAllocatorInlines.h in Headers */,
                                     0F766D3015A8DCE2008F363E /* GCAwareJITStubRoutine.h in Headers */,
                                     0FD0E5EA1E43D34D0006AB08 /* GCConductor.h in Headers */,
    @@ -8747,4 +8780,5 @@
                                     0FB7F39B15ED8E4600F167B2 /* IndexingType.h in Headers */,
                                     14386A791DD6989C008652C4 /* IndirectEvalExecutable.h in Headers */,
    +                                0F75A062200D261D0038E2CF /* AllocatorInlines.h in Headers */,
                                     0FBF92B91FD76FFF00AC28A8 /* InferredStructure.h in Headers */,
                                     0FBF92BA1FD7700400AC28A8 /* InferredStructureWatchpoint.h in Headers */,
    @@ -8932,4 +8966,5 @@
                                     E33F50871B8449EF00413856 /* JSInternalPromiseConstructor.lut.h in Headers */,
                                     E33F50851B8437A000413856 /* JSInternalPromiseDeferred.h in Headers */,
    +                                0F75A05E200D25F60038E2CF /* ThreadLocalCache.h in Headers */,
                                     E33F50751B8421C000413856 /* JSInternalPromisePrototype.h in Headers */,
                                     A503FA1E188E0FB000110F14 /* JSJavaScriptCallFramePrototype.h in Headers */,
    @@ -9431,4 +9466,5 @@
                                     14BE7D3317135CF400D1807A /* WeakInlines.h in Headers */,
                                     A7CA3AE417DA41AE006538AF /* WeakMapConstructor.h in Headers */,
    +                                0F75A0662013E4F10038E2CF /* JITAllocator.h in Headers */,
                                     E3A32BC71FC83147007D7E76 /* WeakMapImpl.h in Headers */,
                                     E393ADD81FE702D00022D681 /* WeakMapImplInlines.h in Headers */,
  • trunk/Source/JavaScriptCore/Sources.txt

    --- r227609
    +++ r227617
    @@ -467,4 +467,5 @@
     
     heap/AlignedMemoryAllocator.cpp
    +heap/Allocator.cpp
     heap/BlockDirectory.cpp
     heap/CellAttributes.cpp
    @@ -501,4 +502,5 @@
     heap/JITStubRoutineSet.cpp
     heap/LargeAllocation.cpp
    +heap/LocalAllocator.cpp
     heap/MachineStackMarker.cpp
     heap/MarkStack.cpp
    @@ -519,4 +521,6 @@
     heap/SynchronousStopTheWorldMutatorScheduler.cpp
     heap/Synchronousness.cpp
    +heap/ThreadLocalCache.cpp
    +heap/ThreadLocalCacheLayout.cpp
     heap/VisitRaceKey.cpp
     heap/Weak.cpp
  • trunk/Source/JavaScriptCore/b3/B3LowerToAir.cpp

    --- r227609
    +++ r227617
    @@ -1278,4 +1278,10 @@
                     arg = tmp(value.value());
                     break;
    +            case ValueRep::SomeRegisterWithClobber: {
    +                Tmp dstTmp = m_code.newTmp(value.value()->resultBank());
    +                append(relaxedMoveForType(value.value()->type()), immOrTmp(value.value()), dstTmp);
    +                arg = dstTmp;
    +                break;
    +            }
                 case ValueRep::LateRegister:
                 case ValueRep::Register:
  • trunk/Source/JavaScriptCore/b3/B3PatchpointSpecial.cpp

    --- r227609
    +++ r227617
    @@ -119,4 +119,5 @@
                 return true;
             case ValueRep::SomeRegister:
    +        case ValueRep::SomeRegisterWithClobber:
             case ValueRep::SomeEarlyRegister:
             case ValueRep::Register:
  • trunk/Source/JavaScriptCore/b3/B3StackmapSpecial.cpp

    --- r227609
    +++ r227617
    @@ -1,4 +1,4 @@
     /*
    - * Copyright (C) 2015-2017 Apple Inc. All rights reserved.
    + * Copyright (C) 2015-2018 Apple Inc. All rights reserved.
      *
      * Redistribution and use in source and binary forms, with or without
    @@ -111,4 +111,7 @@
                     role = Arg::Use;
                     break;
    +            case ValueRep::SomeRegisterWithClobber:
    +                role = Arg::UseDef;
    +                break;
                 case ValueRep::LateRegister:
                     role = Arg::LateUse;
    @@ -129,4 +132,8 @@
                 // original stackmap value across the Special operation.
                 if (!Arg::isLateUse(role) && optionalDefArgWidth && *optionalDefArgWidth < child.value()->resultWidth()) {
    +                // The role can only be some kind of def if we did SomeRegisterWithClobber, which is
    +                // only allowed for patchpoints. Patchpoints don't use the defArgWidth feature.
    +                RELEASE_ASSERT(!Arg::isAnyDef(role));
    +
                     if (Arg::isWarmUse(role))
                         role = Arg::LateUse;
    @@ -246,4 +253,5 @@
             return true;
         case ValueRep::SomeRegister:
    +    case ValueRep::SomeRegisterWithClobber:
         case ValueRep::SomeEarlyRegister:
             return arg.isTmp();
  • trunk/Source/JavaScriptCore/b3/B3StackmapValue.cpp

    --- r227609
    +++ r227617
    @@ -1,4 +1,4 @@
     /*
    - * Copyright (C) 2015 Apple Inc. All rights reserved.
    + * Copyright (C) 2015-2018 Apple Inc. All rights reserved.
      *
      * Redistribution and use in source and binary forms, with or without
    @@ -54,4 +54,9 @@
     }
     
    +void StackmapValue::appendSomeRegisterWithClobber(Value* value)
    +{
    +    append(ConstrainedValue(value, ValueRep::SomeRegisterWithClobber));
    +}
    +
     void StackmapValue::setConstrainedChild(unsigned index, const ConstrainedValue& constrainedValue)
     {
  • trunk/Source/JavaScriptCore/b3/B3StackmapValue.h

    --- r227609
    +++ r227617
    @@ -1,4 +1,4 @@
     /*
    - * Copyright (C) 2015-2016 Apple Inc. All rights reserved.
    + * Copyright (C) 2015-2018 Apple Inc. All rights reserved.
      *
      * Redistribution and use in source and binary forms, with or without
    @@ -100,5 +100,6 @@
         // to SomeRegister.
         void appendSomeRegister(Value*);
    -
    +    void appendSomeRegisterWithClobber(Value*);
    +
         const Vector<ValueRep>& reps() const { return m_reps; }
     
  • trunk/Source/JavaScriptCore/b3/B3Validate.cpp

    --- r227609
    +++ r227617
    @@ -433,19 +433,6 @@
                     if (value->type() == Void)
                         VALIDATE(value->as<PatchpointValue>()->resultConstraint == ValueRep::WarmAny, ("At ", *value));
    -                else {
    -                    switch (value->as<PatchpointValue>()->resultConstraint.kind()) {
    -                    case ValueRep::WarmAny:
    -                    case ValueRep::SomeRegister:
    -                    case ValueRep::SomeEarlyRegister:
    -                    case ValueRep::Register:
    -                    case ValueRep::StackArgument:
    -                        break;
    -                    default:
    -                        VALIDATE(false, ("At ", *value));
    -                        break;
    -                    }
    -
    +                else
                         validateStackmapConstraint(value, ConstrainedValue(value, value->as<PatchpointValue>()->resultConstraint), ConstraintRole::Def);
    -                }
                     validateStackmap(value);
                     break;
    @@ -572,8 +559,14 @@
             switch (value.rep().kind()) {
             case ValueRep::WarmAny:
    -        case ValueRep::ColdAny:
    -        case ValueRep::LateColdAny:
             case ValueRep::SomeRegister:
             case ValueRep::StackArgument:
    +            break;
    +        case ValueRep::LateColdAny:
    +        case ValueRep::ColdAny:
    +            VALIDATE(role == ConstraintRole::Use, ("At ", *context, ": ", value));
    +            break;
    +        case ValueRep::SomeRegisterWithClobber:
    +            VALIDATE(role == ConstraintRole::Use, ("At ", *context, ": ", value));
    +            VALIDATE(context->as<PatchpointValue>(), ("At ", *context));
                 break;
             case ValueRep::SomeEarlyRegister:
    @@ -582,4 +575,6 @@
             case ValueRep::Register:
             case ValueRep::LateRegister:
    +            if (value.rep().kind() == ValueRep::LateRegister)
    +                VALIDATE(role == ConstraintRole::Use, ("At ", *context, ": ", value));
                 if (value.rep().reg().isGPR())
                     VALIDATE(isInt(value.value()->type()), ("At ", *context, ": ", value));
  • trunk/Source/JavaScriptCore/b3/B3ValueRep.cpp

    --- r227609
    +++ r227617
    @@ -1,4 +1,4 @@
     /*
    - * Copyright (C) 2015-2016 Apple Inc. All rights reserved.
    + * Copyright (C) 2015-2018 Apple Inc. All rights reserved.
      *
      * Redistribution and use in source and binary forms, with or without
    @@ -41,4 +41,5 @@
         case LateColdAny:
         case SomeRegister:
    +    case SomeRegisterWithClobber:
         case SomeEarlyRegister:
         case Constant:
    @@ -72,4 +73,5 @@
         case LateColdAny:
         case SomeRegister:
    +    case SomeRegisterWithClobber:
         case SomeEarlyRegister:
             return;
    @@ -176,4 +178,7 @@
             out.print("SomeRegister");
             return;
    +    case ValueRep::SomeRegisterWithClobber:
    +        out.print("SomeRegisterWithClobber");
    +        return;
         case ValueRep::SomeEarlyRegister:
             out.print("SomeEarlyRegister");
  • trunk/Source/JavaScriptCore/b3/B3ValueRep.h

    --- r227609
    +++ r227617
    @@ -1,4 +1,4 @@
     /*
    - * Copyright (C) 2015-2016 Apple Inc. All rights reserved.
    + * Copyright (C) 2015-2018 Apple Inc. All rights reserved.
      *
      * Redistribution and use in source and binary forms, with or without
    @@ -66,3 +66,7 @@
             // register that this claims to clobber!
             SomeRegister,
    +
    +        // As an input representation, this means that B3 should pick some register but that this
    +        // register is then clobbered with garbage. This only works for patchpoints.
    +        SomeRegisterWithClobber,
     
    @@ -108,5 +112,5 @@
             : m_kind(kind)
         {
    -        ASSERT(kind == WarmAny || kind == ColdAny || kind == LateColdAny || kind == SomeRegister || kind == SomeEarlyRegister);
    +        ASSERT(kind == WarmAny || kind == ColdAny || kind == LateColdAny || kind == SomeRegister || kind == SomeRegisterWithClobber || kind == SomeEarlyRegister);
         }
     
  • trunk/Source/JavaScriptCore/bytecode/AccessCase.cpp

    --- r227609
    +++ r227617
    @@ -1,4 +1,4 @@
     /*
    - * Copyright (C) 2017 Apple Inc. All rights reserved.
    + * Copyright (C) 2017-2018 Apple Inc. All rights reserved.
      *
      * Redistribution and use in source and binary forms, with or without
    @@ -956,13 +956,7 @@
     
                 if (allocatingInline) {
    -                BlockDirectory* allocator = vm.jsValueGigacageAuxiliarySpace.allocatorFor(newSize, AllocatorForMode::AllocatorIfExists);
    -
    -                if (!allocator) {
    -                    // Yuck, this case would suck!
    -                    slowPath.append(jit.jump());
    -                }
    -
    -                jit.move(CCallHelpers::TrustedImmPtr(allocator), scratchGPR2);
    -                jit.emitAllocate(scratchGPR, allocator, scratchGPR2, scratchGPR3, slowPath);
    +                Allocator allocator = vm.jsValueGigacageAuxiliarySpace.allocatorFor(newSize, AllocatorForMode::AllocatorIfExists);
    +
    +                jit.emitAllocate(scratchGPR, JITAllocator::constant(allocator), scratchGPR2, scratchGPR3, slowPath);
                     jit.addPtr(CCallHelpers::TrustedImm32(newSize + sizeof(IndexingHeader)), scratchGPR);
     
  • trunk/Source/JavaScriptCore/bytecode/ObjectAllocationProfile.h

    --- r227609
    +++ r227617
    @@ -44,6 +44,5 @@
     
         ObjectAllocationProfile()
    -        : m_allocator(0)
    -        , m_inlineCapacity(0)
    +        : m_inlineCapacity(0)
         {
         }
    @@ -64,5 +63,5 @@
         void clear()
         {
    -        m_allocator = nullptr;
    +        m_allocator = Allocator();
             m_structure.clear();
             m_inlineCapacity = 0;
    @@ -78,5 +77,5 @@
         unsigned possibleDefaultPropertyCount(VM&, JSObject* prototype);
     
    -    BlockDirectory* m_allocator; // Precomputed to make things easier for generated code.
    +    Allocator m_allocator; // Precomputed to make things easier for generated code.
         WriteBarrier<Structure> m_structure;
         unsigned m_inlineCapacity;
  • trunk/Source/JavaScriptCore/bytecode/ObjectAllocationProfileInlines.h

    --- r227609
    +++ r227617
    @@ -54,5 +54,5 @@
             if (Structure* structure = executable->cachedPolyProtoStructure()) {
                 RELEASE_ASSERT(structure->typeInfo().type() == FinalObjectType);
    -            m_allocator = nullptr;
    +            m_allocator = Allocator();
                 m_structure.set(vm, owner, structure);
                 m_inlineCapacity = structure->inlineCapacity();
    @@ -100,9 +100,9 @@
     
         size_t allocationSize = JSFinalObject::allocationSize(inlineCapacity);
    -    BlockDirectory* allocator = vm.cellSpace.allocatorForNonVirtual(allocationSize, AllocatorForMode::EnsureAllocator);
    +    Allocator allocator = vm.cellSpace.allocatorForNonVirtual(allocationSize, AllocatorForMode::EnsureAllocator);
     
         // Take advantage of extra inline capacity available in the size class.
         if (allocator) {
    -        size_t slop = (allocator->cellSize() - allocationSize) / sizeof(WriteBarrier<Unknown>);
    +        size_t slop = (allocator.cellSize(vm.heap) - allocationSize) / sizeof(WriteBarrier<Unknown>);
             inlineCapacity += slop;
             if (inlineCapacity > JSFinalObject::maxInlineCapacity())
    @@ -114,5 +114,5 @@
         if (isPolyProto) {
             ASSERT(structure->hasPolyProto());
    -        m_allocator = nullptr;
    +        m_allocator = Allocator();
             executable->setCachedPolyProtoStructure(vm, structure);
         } else {
  • trunk/Source/JavaScriptCore/dfg/DFGSpeculativeJIT.cpp

    --- r227609
    +++ r227617
    @@ -114,7 +114,6 @@
     
         if (size) {
    -        if (BlockDirectory* allocator = m_jit.vm()->jsValueGigacageAuxiliarySpace.allocatorForNonVirtual(size, AllocatorForMode::AllocatorIfExists)) {
    -            m_jit.move(TrustedImmPtr(allocator), scratchGPR);
    -            m_jit.emitAllocate(storageGPR, allocator, scratchGPR, scratch2GPR, slowCases);
    +        if (Allocator allocator = m_jit.vm()->jsValueGigacageAuxiliarySpace.allocatorForNonVirtual(size, AllocatorForMode::AllocatorIfExists)) {
    +            m_jit.emitAllocate(storageGPR, JITAllocator::constant(allocator), scratchGPR, scratch2GPR, slowCases);
     
                 m_jit.addPtr(
    @@ -129,9 +128,8 @@
     
         size_t allocationSize = JSFinalObject::allocationSize(inlineCapacity);
    -    BlockDirectory* allocatorPtr = subspaceFor<JSFinalObject>(*m_jit.vm())->allocatorForNonVirtual(allocationSize, AllocatorForMode::AllocatorIfExists);
    -    if (allocatorPtr) {
    -        m_jit.move(TrustedImmPtr(allocatorPtr), scratchGPR);
    +    Allocator allocator = subspaceFor<JSFinalObject>(*m_jit.vm())->allocatorForNonVirtual(allocationSize, AllocatorForMode::AllocatorIfExists);
    +    if (allocator) {
             uint32_t mask = WTF::computeIndexingMask(vectorLength);
    -        emitAllocateJSObject(resultGPR, allocatorPtr, scratchGPR, TrustedImmPtr(structure), storageGPR, TrustedImm32(mask), scratch2GPR, slowCases);
    +        emitAllocateJSObject(resultGPR, JITAllocator::constant(allocator), scratchGPR, TrustedImmPtr(structure), storageGPR, TrustedImm32(mask), scratch2GPR, slowCases);
             m_jit.emitInitializeInlineStorage(resultGPR, structure->inlineCapacity());
         } else
    @@ -4209,7 +4207,6 @@
     
         JITCompiler::JumpList slowPath;
    -    BlockDirectory* blockDirectory = subspaceFor<JSRopeString>(*m_jit.vm())->allocatorForNonVirtual(sizeof(JSRopeString), AllocatorForMode::AllocatorIfExists);
    -    m_jit.move(TrustedImmPtr(blockDirectory), allocatorGPR);
    -    emitAllocateJSCell(resultGPR, blockDirectory, allocatorGPR, TrustedImmPtr(m_jit.graph().registerStructure(m_jit.vm()->stringStructure.get())), scratchGPR, slowPath);
    +    Allocator allocatorValue = subspaceFor<JSRopeString>(*m_jit.vm())->allocatorForNonVirtual(sizeof(JSRopeString), AllocatorForMode::AllocatorIfExists);
    +    emitAllocateJSCell(resultGPR, JITAllocator::constant(allocatorValue), allocatorGPR, TrustedImmPtr(m_jit.graph().registerStructure(m_jit.vm()->stringStructure.get())), scratchGPR, slowPath);
     
         m_jit.storePtr(TrustedImmPtr(0), JITCompiler::Address(resultGPR, JSString::offsetOfValue()));
    @@ -8385,5 +8382,5 @@
         size_t size = initialOutOfLineCapacity * sizeof(JSValue);
     
    -    BlockDirectory* allocator = m_jit.vm()->jsValueGigacageAuxiliarySpace.allocatorForNonVirtual(size, AllocatorForMode::AllocatorIfExists);
    +    Allocator allocator = m_jit.vm()->jsValueGigacageAuxiliarySpace.allocatorForNonVirtual(size, AllocatorForMode::AllocatorIfExists);
     
         if (!allocator || node->transition()->previous->couldHaveIndexingHeader()) {
    @@ -8410,7 +8407,6 @@
         GPRReg scratchGPR3 = scratch3.gpr();
     
    -    m_jit.move(TrustedImmPtr(allocator), scratchGPR2);
         JITCompiler::JumpList slowPath;
    -    m_jit.emitAllocate(scratchGPR1, allocator, scratchGPR2, scratchGPR3, slowPath);
    +    m_jit.emitAllocate(scratchGPR1, JITAllocator::constant(allocator), scratchGPR2, scratchGPR3, slowPath);
         m_jit.addPtr(JITCompiler::TrustedImm32(size + sizeof(IndexingHeader)), scratchGPR1);
     
    @@ -8430,5 +8426,5 @@
         ASSERT(newSize == node->transition()->next->outOfLineCapacity() * sizeof(JSValue));
     
    -    BlockDirectory* allocator = m_jit.vm()->jsValueGigacageAuxiliarySpace.allocatorForNonVirtual(newSize, AllocatorForMode::AllocatorIfExists);
    +    Allocator allocator = m_jit.vm()->jsValueGigacageAuxiliarySpace.allocatorForNonVirtual(newSize, AllocatorForMode::AllocatorIfExists);
     
         if (!allocator || node->transition()->previous->couldHaveIndexingHeader()) {
    @@ -8458,6 +8454,5 @@
     
         JITCompiler::JumpList slowPath;
    -    m_jit.move(TrustedImmPtr(allocator), scratchGPR2);
    -    m_jit.emitAllocate(scratchGPR1, allocator, scratchGPR2, scratchGPR3, slowPath);
    +    m_jit.emitAllocate(scratchGPR1, JITAllocator::constant(allocator), scratchGPR2, scratchGPR3, slowPath);
     
         m_jit.addPtr(JITCompiler::TrustedImm32(newSize + sizeof(IndexingHeader)), scratchGPR1);
    @@ -11449,12 +11444,12 @@
         m_jit.loadPtr(JITCompiler::Address(calleeGPR, JSFunction::offsetOfRareData()), rareDataGPR);
         slowPath.append(m_jit.branchTestPtr(MacroAssembler::Zero, rareDataGPR));
    -    m_jit.loadPtr(JITCompiler::Address(rareDataGPR, FunctionRareData::offsetOfObjectAllocationProfile() + ObjectAllocationProfile::offsetOfAllocator()), allocatorGPR);
    +    m_jit.load32(JITCompiler::Address(rareDataGPR, FunctionRareData::offsetOfObjectAllocationProfile() + ObjectAllocationProfile::offsetOfAllocator()), allocatorGPR);
         m_jit.loadPtr(JITCompiler::Address(rareDataGPR, FunctionRareData::offsetOfObjectAllocationProfile() + ObjectAllocationProfile::offsetOfStructure()), structureGPR);
     
    -    slowPath.append(m_jit.branchTestPtr(MacroAssembler::Zero, allocatorGPR));
    +    slowPath.append(m_jit.branch32(MacroAssembler::Equal, allocatorGPR, TrustedImm32(Allocator().offset())));
     
         auto butterfly = TrustedImmPtr(nullptr);
         auto mask = TrustedImm32(0);
    -    emitAllocateJSObject(resultGPR, nullptr, allocatorGPR, structureGPR, butterfly, mask, scratchGPR, slowPath);
    +    emitAllocateJSObject(resultGPR, JITAllocator::variable(), allocatorGPR, structureGPR, butterfly, mask, scratchGPR, slowPath);
     
         m_jit.loadPtr(JITCompiler::Address(calleeGPR, JSFunction::offsetOfRareData()), rareDataGPR);
    @@ -11482,12 +11477,15 @@
         RegisteredStructure structure = node->structure();
         size_t allocationSize = JSFinalObject::allocationSize(structure->inlineCapacity());
    -    BlockDirectory* allocatorPtr = subspaceFor<JSFinalObject>(*m_jit.vm())->allocatorForNonVirtual(allocationSize, AllocatorForMode::AllocatorIfExists);
    -
    -    m_jit.move(TrustedImmPtr(allocatorPtr), allocatorGPR);
    -    auto butterfly = TrustedImmPtr(nullptr);
    -    auto mask = TrustedImm32(0);
    -    emitAllocateJSObject(resultGPR, allocatorPtr, allocatorGPR, TrustedImmPtr(structure), butterfly, mask, scratchGPR, slowPath);
    -    m_jit.emitInitializeInlineStorage(resultGPR, structure->inlineCapacity());
    -    m_jit.mutatorFence(*m_jit.vm());
    +    Allocator allocatorValue = subspaceFor<JSFinalObject>(*m_jit.vm())->allocatorForNonVirtual(allocationSize, AllocatorForMode::AllocatorIfExists);
    +
    +    if (!allocatorValue)
    +        slowPath.append(m_jit.jump());
    +    else {
    +        auto butterfly = TrustedImmPtr(nullptr);
    +        auto mask = TrustedImm32(0);
    +        emitAllocateJSObject(resultGPR, JITAllocator::constant(allocatorValue), allocatorGPR, TrustedImmPtr(structure), butterfly, mask, scratchGPR, slowPath);
    +        m_jit.emitInitializeInlineStorage(resultGPR, structure->inlineCapacity());
    +        m_jit.mutatorFence(*m_jit.vm());
    +    }
     
         addSlowPathGenerator(slowPathCall(slowPath, this, operationNewObject, resultGPR, structure));
  • trunk/Source/JavaScriptCore/dfg/DFGSpeculativeJIT.h

    r227609 r227617  
    11/*
    2  * Copyright (C) 2011-2017 Apple Inc. All rights reserved.
     2 * Copyright (C) 2011-2018 Apple Inc. All rights reserved.
    33 *
    44 * Redistribution and use in source and binary forms, with or without
     
    31433143    template <typename StructureType> // StructureType can be GPR or ImmPtr.
    31443144    void emitAllocateJSCell(
    3145         GPRReg resultGPR, BlockDirectory* allocator, GPRReg allocatorGPR, StructureType structure,
     3145        GPRReg resultGPR, const JITAllocator& allocator, GPRReg allocatorGPR, StructureType structure,
    31463146        GPRReg scratchGPR, MacroAssembler::JumpList& slowPath)
    31473147    {
     
     31523152    template <typename StructureType, typename StorageType, typename MaskType> // StructureType, StorageType, and MaskType can be GPR or ImmPtr.
    31533153    void emitAllocateJSObject(
    3154         GPRReg resultGPR, BlockDirectory* allocator, GPRReg allocatorGPR, StructureType structure,
     3154        GPRReg resultGPR, const JITAllocator& allocator, GPRReg allocatorGPR, StructureType structure,
    31553155        StorageType storage, MaskType mask, GPRReg scratchGPR, MacroAssembler::JumpList& slowPath)
    31563156    {
  • trunk/Source/JavaScriptCore/ftl/FTLAbstractHeapRepository.h

    r227609 r227617  
    11/*
    2  * Copyright (C) 2013-2017 Apple Inc. All rights reserved.
     2 * Copyright (C) 2013-2018 Apple Inc. All rights reserved.
    33 *
    44 * Redistribution and use in source and binary forms, with or without
     
    132132#define FOR_EACH_INDEXED_ABSTRACT_HEAP(macro) \
    133133    macro(ArrayStorage_vector, ArrayStorage::vectorOffset(), sizeof(WriteBarrier<Unknown>)) \
    134     macro(CompleteSubspace_allocatorForSizeStep, CompleteSubspace::offsetOfAllocatorForSizeStep(), sizeof(BlockDirectory*)) \
     134    macro(CompleteSubspace_allocatorForSizeStep, CompleteSubspace::offsetOfAllocatorForSizeStep(), sizeof(Allocator)) \
    135135    macro(DirectArguments_storage, DirectArguments::storageOffset(), sizeof(EncodedJSValue)) \
    136136    macro(JSLexicalEnvironment_variables, JSLexicalEnvironment::offsetOfVariables(), sizeof(EncodedJSValue)) \
  • trunk/Source/JavaScriptCore/ftl/FTLLowerDFGToB3.cpp

    r227609 r227617  
    58775877        LBasicBlock lastNext = m_out.insertNewBlocksBefore(slowPath);
    58785878       
    5879         BlockDirectory* allocator = subspaceFor<JSRopeString>(vm())->allocatorForNonVirtual(sizeof(JSRopeString), AllocatorForMode::AllocatorIfExists);
     5879        Allocator allocator = subspaceFor<JSRopeString>(vm())->allocatorForNonVirtual(sizeof(JSRopeString), AllocatorForMode::AllocatorIfExists);
    58805880       
    58815881        LValue result = allocateCell(
    5882             m_out.constIntPtr(allocator), vm().stringStructure.get(), slowPath);
     5882            m_out.constInt32(allocator.offset()), vm().stringStructure.get(), slowPath);
    58835883       
    58845884        m_out.storePtr(m_out.intPtrZero, result, m_heaps.JSString_value);
     
    99109910            if (structure->outOfLineCapacity() || hasIndexedProperties(structure->indexingType())) {
    99119911                size_t allocationSize = JSFinalObject::allocationSize(structure->inlineCapacity());
    9912                 BlockDirectory* cellAllocator = subspaceFor<JSFinalObject>(vm())->allocatorForNonVirtual(allocationSize, AllocatorForMode::AllocatorIfExists);
     9912                Allocator cellAllocator = subspaceFor<JSFinalObject>(vm())->allocatorForNonVirtual(allocationSize, AllocatorForMode::AllocatorIfExists);
    99139913
    99149914                bool hasIndexingHeader = hasIndexedProperties(structure->indexingType());
     
    99699969                LValue mask = computeButterflyIndexingMask(vectorLength);
    99709970                LValue fastObjectValue = allocateObject(
    9971                     m_out.constIntPtr(cellAllocator), structure, fastButterflyValue, mask, slowPath);
     9971                    m_out.constInt32(cellAllocator.offset()), structure, fastButterflyValue, mask, slowPath);
    99729972
    99739973                ValueFromBlock fastObject = m_out.anchor(fastObjectValue);
     
    1095610956
    1095710957        size_t sizeInBytes = sizeInValues * sizeof(JSValue);
    10958         BlockDirectory* allocator = vm().jsValueGigacageAuxiliarySpace.allocatorForNonVirtual(sizeInBytes, AllocatorForMode::AllocatorIfExists);
    10959         LValue startOfStorage = allocateHeapCell(m_out.constIntPtr(allocator), slowPath);
     10958        Allocator allocator = vm().jsValueGigacageAuxiliarySpace.allocatorForNonVirtual(sizeInBytes, AllocatorForMode::AllocatorIfExists);
     10959        LValue startOfStorage = allocateHeapCell(m_out.constInt32(allocator.offset()), slowPath);
    1096010960        ValueFromBlock fastButterfly = m_out.anchor(
    1096110961            m_out.add(m_out.constIntPtr(sizeInBytes + sizeof(IndexingHeader)), startOfStorage));
     
    1198911989    LValue allocateHeapCell(LValue allocator, LBasicBlock slowPath)
    1199011990    {
    11991         BlockDirectory* actualAllocator = nullptr;
    11992         if (allocator->hasIntPtr())
    11993             actualAllocator = bitwise_cast<BlockDirectory*>(allocator->asIntPtr());
    11994        
    11995         if (!actualAllocator) {
     11991        JITAllocator actualAllocator;
     11992        if (allocator->hasInt32())
     11993            actualAllocator = JITAllocator::constant(Allocator(allocator->asInt32()));
     11994        else
     11995            actualAllocator = JITAllocator::variable();
     11996       
     11997        if (actualAllocator.isConstant()) {
     11998            if (!actualAllocator.allocator()) {
     11999                LBasicBlock haveAllocator = m_out.newBlock();
     12000                LBasicBlock lastNext = m_out.insertNewBlocksBefore(haveAllocator);
     12001                m_out.jump(slowPath);
     12002                m_out.appendTo(haveAllocator, lastNext);
     12003                return m_out.intPtrZero;
     12004            }
     12005        } else {
    1199612006            // This means that either we know that the allocator is null or we don't know what the
    1199712007            // allocator is. In either case, we need the null check.
    1199812008            LBasicBlock haveAllocator = m_out.newBlock();
    1199912009            LBasicBlock lastNext = m_out.insertNewBlocksBefore(haveAllocator);
    12000             m_out.branch(allocator, usually(haveAllocator), rarely(slowPath));
     12010            m_out.branch(
     12011                m_out.notEqual(allocator, m_out.constInt32(Allocator().offset())),
     12012                usually(haveAllocator), rarely(slowPath));
    1200112013            m_out.appendTo(haveAllocator, lastNext);
    1200212014        }
     
    1200812020        PatchpointValue* patchpoint = m_out.patchpoint(pointerType());
    1200912021        patchpoint->effects.terminal = true;
    12010         patchpoint->appendSomeRegister(allocator);
     12022        if (actualAllocator.isConstant())
     12023            patchpoint->numGPScratchRegisters++;
     12024        else
     12025            patchpoint->appendSomeRegisterWithClobber(allocator);
    1201112026        patchpoint->numGPScratchRegisters++;
    1201212027        patchpoint->resultConstraint = ValueRep::SomeEarlyRegister;
     
    1201812033            [=] (CCallHelpers& jit, const StackmapGenerationParams& params) {
    1201912034                CCallHelpers::JumpList jumpToSlowPath;
     12035               
     12036                GPRReg allocatorGPR;
     12037                if (actualAllocator.isConstant())
     12038                    allocatorGPR = params.gpScratch(1);
     12039                else
     12040                    allocatorGPR = params[1].gpr();
    1202012041               
    1202112042                // We use a patchpoint to emit the allocation path because whenever we mess with
     
    1202612047                // all of the compiler tiers.
    1202712048                jit.emitAllocateWithNonNullAllocator(
    12028                     params[0].gpr(), actualAllocator, params[1].gpr(), params.gpScratch(0),
     12049                    params[0].gpr(), actualAllocator, allocatorGPR, params.gpScratch(0),
    1202912050                    jumpToSlowPath);
    1203012051               
     
    1211812139        size_t size, StructureType structure, LValue butterfly, LValue indexingMask, LBasicBlock slowPath)
    1211912140    {
    12120         BlockDirectory* allocator = subspaceFor<ClassType>(vm())->allocatorForNonVirtual(size, AllocatorForMode::AllocatorIfExists);
    12121         return allocateObject(m_out.constIntPtr(allocator), structure, butterfly, indexingMask, slowPath);
     12141        Allocator allocator = subspaceFor<ClassType>(vm())->allocatorForNonVirtual(size, AllocatorForMode::AllocatorIfExists);
     12142        return allocateObject(m_out.constInt32(allocator.offset()), structure, butterfly, indexingMask, slowPath);
    1212212143    }
    1212312144   
     
    1213812159            size_t actualSize = size->asIntPtr();
    1213912160           
    12140             BlockDirectory* actualAllocator = actualSubspace->allocatorForNonVirtual(actualSize, AllocatorForMode::AllocatorIfExists);
     12161            Allocator actualAllocator = actualSubspace->allocatorForNonVirtual(actualSize, AllocatorForMode::AllocatorIfExists);
    1214112162            if (!actualAllocator) {
    1214212163                LBasicBlock continuation = m_out.newBlock();
     
    1214412165                m_out.jump(slowPath);
    1214512166                m_out.appendTo(continuation, lastNext);
    12146                 return m_out.intPtrZero;
     12167                return m_out.int32Zero;
    1214712168            }
    1214812169           
    12149             return m_out.constIntPtr(actualAllocator);
     12170            return m_out.constInt32(actualAllocator.offset());
    1215012171        }
    1215112172       
     
    1216612187        m_out.appendTo(continuation, lastNext);
    1216712188       
    12168         return m_out.loadPtr(
     12189        return m_out.load32(
    1216912190            m_out.baseIndex(
    1217012191                m_heaps.CompleteSubspace_allocatorForSizeStep,
     
    1218112202        LValue size, RegisteredStructure structure, LValue butterfly, LValue butterflyIndexingMask, LBasicBlock slowPath)
    1218212203    {
    12183         LValue allocator = allocatorForSize(
    12184             *subspaceFor<ClassType>(vm()), size, slowPath);
     12204        LValue allocator = allocatorForSize(*subspaceFor<ClassType>(vm()), size, slowPath);
    1218512205        return allocateObject(allocator, structure, butterfly, butterflyIndexingMask, slowPath);
    1218612206    }
     
    1219012210        LValue size, Structure* structure, LBasicBlock slowPath)
    1219112211    {
    12192         LValue allocator = allocatorForSize(
    12193             *subspaceFor<ClassType>(vm()), size, slowPath);
     12212        LValue allocator = allocatorForSize(*subspaceFor<ClassType>(vm()), size, slowPath);
    1219412213        return allocateCell(allocator, structure, slowPath);
    1219512214    }
     
    1219812217    {
    1219912218        size_t allocationSize = JSFinalObject::allocationSize(structure.get()->inlineCapacity());
    12200         BlockDirectory* allocator = subspaceFor<JSFinalObject>(vm())->allocatorForNonVirtual(allocationSize, AllocatorForMode::AllocatorIfExists);
     12219        Allocator allocator = subspaceFor<JSFinalObject>(vm())->allocatorForNonVirtual(allocationSize, AllocatorForMode::AllocatorIfExists);
    1220112220       
    1220212221        // FIXME: If the allocator is null, we could simply emit a normal C call to the allocator
     
    1221012229       
    1221112230        ValueFromBlock fastResult = m_out.anchor(allocateObject(
    12212             m_out.constIntPtr(allocator), structure, m_out.intPtrZero, m_out.int32Zero, slowPath));
     12231            m_out.constInt32(allocator.offset()), structure, m_out.intPtrZero, m_out.int32Zero, slowPath));
    1221312232       
    1221412233        m_out.jump(continuation);
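
allocateHeapCell above makes the same constant/variable split in the FTL. A compilable sketch of the decision it encodes, using hypothetical stand-in names (the real code builds B3 IR rather than returning an enum):

    // Sketch of allocateHeapCell's allocator handling; FastPathPlan and the two
    // booleans are stand-ins for what the FTL expresses as B3 blocks and branches.
    enum class FastPathPlan {
        Taken,              // constant, non-null: no null check emitted at all
        StaticallySkipped,  // constant, null: m_out.jump(slowPath); fast path is dead
        GuardedAtRunTime    // variable: branch on allocator != Allocator().offset()
    };

    FastPathPlan planFastPath(bool knownAtCompileTime, bool knownNull)
    {
        if (knownAtCompileTime)
            return knownNull ? FastPathPlan::StaticallySkipped : FastPathPlan::Taken;
        return FastPathPlan::GuardedAtRunTime;
    }

The patchpoint hunk then matches the plan: a constant allocator requests a second GP scratch (params.gpScratch(1)) because no incoming register carries the offset, while a variable allocator pins the offset into params[1].gpr() via the new appendSomeRegisterWithClobber, which, going by its name, lets the patchpoint overwrite that register.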
  • trunk/Source/JavaScriptCore/heap/BlockDirectory.cpp

    r227609 r227617  
    2727#include "BlockDirectory.h"
    2828
    29 #include "AllocatingScope.h"
    3029#include "BlockDirectoryInlines.h"
    3130#include "GCActivityCallback.h"
     
    4039namespace JSC {
    4140
    42 static constexpr bool tradeDestructorBlocks = true;
    43 
    4441BlockDirectory::BlockDirectory(Heap* heap, size_t cellSize)
    45     : m_freeList(cellSize)
    46     , m_currentBlock(0)
    47     , m_lastActiveBlock(0)
    48     , m_cellSize(static_cast<unsigned>(cellSize))
     42    : m_cellSize(static_cast<unsigned>(cellSize))
    4943    , m_heap(heap)
    5044{
     45    heap->threadLocalCacheLayout().allocateOffset(this);
    5146}
    5247
     
    8277}
    8378
    84 void BlockDirectory::didConsumeFreeList()
    85 {
    86     if (m_currentBlock)
    87         m_currentBlock->didConsumeFreeList();
    88    
    89     m_freeList.clear();
    90     m_currentBlock = nullptr;
    91 }
    92 
    93 void* BlockDirectory::tryAllocateWithoutCollecting()
    94 {
    95     SuperSamplerScope superSamplerScope(false);
    96    
    97     ASSERT(!m_currentBlock);
    98     ASSERT(m_freeList.allocationWillFail());
    99    
    100     for (;;) {
    101         m_allocationCursor = (m_canAllocateButNotEmpty | m_empty).findBit(m_allocationCursor, true);
    102         if (m_allocationCursor >= m_blocks.size())
    103             break;
    104        
    105         setIsCanAllocateButNotEmpty(NoLockingNecessary, m_allocationCursor, false);
    106 
    107         if (void* result = tryAllocateIn(m_blocks[m_allocationCursor]))
    108             return result;
    109     }
    110    
    111     if (Options::stealEmptyBlocksFromOtherAllocators()
    112         && (tradeDestructorBlocks || !needsDestruction())) {
    113         if (MarkedBlock::Handle* block = m_subspace->findEmptyBlockToSteal()) {
    114             RELEASE_ASSERT(block->alignedMemoryAllocator() == m_subspace->alignedMemoryAllocator());
    115            
    116             block->sweep(nullptr);
    117            
    118             // It's good that this clears canAllocateButNotEmpty as well as all other bits,
    119             // because there is a remote chance that a block may have both canAllocateButNotEmpty
    120             // and empty set at the same time.
    121             block->removeFromDirectory();
    122             addBlock(block);
    123             return allocateIn(block);
    124         }
    125     }
    126    
    127     return nullptr;
    128 }
    129 
    130 void* BlockDirectory::allocateIn(MarkedBlock::Handle* block)
    131 {
    132     void* result = tryAllocateIn(block);
    133     RELEASE_ASSERT(result);
    134     return result;
    135 }
    136 
    137 void* BlockDirectory::tryAllocateIn(MarkedBlock::Handle* block)
    138 {
    139     ASSERT(block);
    140     ASSERT(!block->isFreeListed());
    141    
    142     block->sweep(&m_freeList);
    143    
    144     // It's possible to stumble on a completely full block. Marking tries to retire these, but
    145     // that algorithm is racy and may forget to do it sometimes.
    146     if (m_freeList.allocationWillFail()) {
    147         ASSERT(block->isFreeListed());
    148         block->unsweepWithNoNewlyAllocated();
    149         ASSERT(!block->isFreeListed());
    150         ASSERT(!isEmpty(NoLockingNecessary, block));
    151         ASSERT(!isCanAllocateButNotEmpty(NoLockingNecessary, block));
     79MarkedBlock::Handle* BlockDirectory::findBlockForAllocation()
     80{
     81    m_allocationCursor = (m_canAllocateButNotEmpty | m_empty).findBit(m_allocationCursor, true);
     82    if (m_allocationCursor >= m_blocks.size())
    15283        return nullptr;
    153     }
    154    
    155     m_currentBlock = block;
    156    
    157     void* result = m_freeList.allocate(
    158         [] () -> HeapCell* {
    159             RELEASE_ASSERT_NOT_REACHED();
    160             return nullptr;
    161         });
    162     setIsEden(NoLockingNecessary, m_currentBlock, true);
    163     markedSpace().didAllocateInBlock(m_currentBlock);
    164     return result;
    165 }
    166 
    167 ALWAYS_INLINE void BlockDirectory::doTestCollectionsIfNeeded(GCDeferralContext* deferralContext)
    168 {
    169     if (!Options::slowPathAllocsBetweenGCs())
    170         return;
    171 
    172     static unsigned allocationCount = 0;
    173     if (!allocationCount) {
    174         if (!m_heap->isDeferred()) {
    175             if (deferralContext)
    176                 deferralContext->m_shouldGC = true;
    177             else
    178                 m_heap->collectNow(Sync, CollectionScope::Full);
    179         }
    180     }
    181     if (++allocationCount >= Options::slowPathAllocsBetweenGCs())
    182         allocationCount = 0;
    183 }
    184 
    185 void* BlockDirectory::allocateSlowCase(GCDeferralContext* deferralContext, AllocationFailureMode failureMode)
    186 {
    187     SuperSamplerScope superSamplerScope(false);
    188     ASSERT(m_heap->vm()->currentThreadIsHoldingAPILock());
    189     doTestCollectionsIfNeeded(deferralContext);
    190 
    191     ASSERT(!markedSpace().isIterating());
    192     m_heap->didAllocate(m_freeList.originalSize());
    193    
    194     didConsumeFreeList();
    195    
    196     AllocatingScope helpingHeap(*m_heap);
    197 
    198     m_heap->collectIfNecessaryOrDefer(deferralContext);
    199    
    200     // Goofy corner case: the GC called a callback and now this directory has a currentBlock. This only
    201     // happens when running WebKit tests, which inject a callback into the GC's finalization.
    202     if (UNLIKELY(m_currentBlock))
    203         return allocate(deferralContext, failureMode);
    204    
    205     void* result = tryAllocateWithoutCollecting();
    206    
    207     if (LIKELY(result != 0))
    208         return result;
    209    
    210     MarkedBlock::Handle* block = tryAllocateBlock();
    211     if (!block) {
    212         if (failureMode == AllocationFailureMode::Assert)
    213             RELEASE_ASSERT_NOT_REACHED();
    214         else
    215             return nullptr;
    216     }
    217     addBlock(block);
    218     result = allocateIn(block);
    219     ASSERT(result);
    220     return result;
     84   
     85    setIsCanAllocateButNotEmpty(NoLockingNecessary, m_allocationCursor, false);
     86    return m_blocks[m_allocationCursor];
    22187}
    22288
     
    314180    if (false)
    315181        dataLog(RawPointer(this), ": BlockDirectory::stopAllocating!\n");
    316     ASSERT(!m_lastActiveBlock);
    317     if (!m_currentBlock) {
    318         ASSERT(m_freeList.allocationWillFail());
    319         return;
    320     }
    321    
    322     m_currentBlock->stopAllocating(m_freeList);
    323     m_lastActiveBlock = m_currentBlock;
    324     m_currentBlock = 0;
    325     m_freeList.clear();
     182    m_localAllocators.forEach(
     183        [&] (LocalAllocator* allocator) {
     184            allocator->stopAllocating();
     185        });
    326186}
    327187
    328188void BlockDirectory::prepareForAllocation()
    329189{
    330     m_lastActiveBlock = nullptr;
    331     m_currentBlock = nullptr;
    332     m_freeList.clear();
    333 
     190    m_localAllocators.forEach(
     191        [&] (LocalAllocator* allocator) {
     192            allocator->prepareForAllocation();
     193        });
     194   
    334195    m_allocationCursor = 0;
    335196    m_emptyCursor = 0;
     
    345206}
    346207
     208void BlockDirectory::stopAllocatingForGood()
     209{
     210    if (false)
     211        dataLog(RawPointer(this), ": BlockDirectory::stopAllocatingForGood!\n");
     212   
     213    m_localAllocators.forEach(
     214        [&] (LocalAllocator* allocator) {
     215            allocator->stopAllocatingForGood();
     216        });
     217
     218    auto locker = holdLock(m_localAllocatorsLock);
     219    while (!m_localAllocators.isEmpty())
     220        m_localAllocators.begin()->remove();
     221}
     222
    347223void BlockDirectory::lastChanceToFinalize()
    348224{
     
    355231void BlockDirectory::resumeAllocating()
    356232{
    357     if (!m_lastActiveBlock)
    358         return;
    359 
    360     m_lastActiveBlock->resumeAllocating(m_freeList);
    361     m_currentBlock = m_lastActiveBlock;
    362     m_lastActiveBlock = nullptr;
     233    m_localAllocators.forEach(
     234        [&] (LocalAllocator* allocator) {
     235            allocator->resumeAllocating();
     236        });
    363237}
    364238
     
    380254    // vectors.
    381255   
    382     if (!tradeDestructorBlocks && needsDestruction()) {
     256    if (!Options::tradeDestructorBlocks() && needsDestruction()) {
    383257        ASSERT(m_empty.isEmpty());
    384258        m_canAllocateButNotEmpty = m_live & ~m_markingRetired;
     
    513387}
    514388
     389bool BlockDirectory::isFreeListedCell(const void* target)
     390{
     391    bool result = false;
     392    m_localAllocators.forEach(
     393        [&] (LocalAllocator* allocator) {
     394            result |= allocator->isFreeListedCell(target);
     395        });
     396    return result;
     397}
     398
    515399} // namespace JSC
    516400
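
The net effect in BlockDirectory.cpp: the directory no longer owns a FreeList or m_currentBlock; per-thread LocalAllocators do, and the directory just finds blocks (findBlockForAllocation) and broadcasts lifecycle events to its allocators. A sketch of the broadcast pattern, with a std::vector standing in for the WTF::SentinelLinkedList used above; what each LocalAllocator method must do follows from the deleted directory code:

    #include <vector>

    struct LocalAllocatorSketch {
        // Per-thread analogues of the deleted directory logic:
        void stopAllocating() { /* m_currentBlock->stopAllocating(m_freeList); remember it */ }
        void prepareForAllocation() { /* clear current block, last active block, free list */ }
        void resumeAllocating() { /* m_lastActiveBlock->resumeAllocating(m_freeList) */ }
    };

    struct BlockDirectorySketch {
        std::vector<LocalAllocatorSketch*> m_localAllocators;

        void stopAllocating()
        {
            for (LocalAllocatorSketch* allocator : m_localAllocators)
                allocator->stopAllocating();
        }
        // prepareForAllocation() and resumeAllocating() fan out the same way.
    };

stopAllocatingForGood additionally tears the list down under m_localAllocatorsLock, which is what lets Heap's last-chance finalization (the stopAllocatingForGood call added in Heap.cpp below) leave no LocalAllocator pointing at this directory.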
  • trunk/Source/JavaScriptCore/heap/BlockDirectory.h

    r227609 r227617  
    2727
    2828#include "AllocationFailureMode.h"
     29#include "Allocator.h"
    2930#include "CellAttributes.h"
    3031#include "FreeList.h"
     32#include "LocalAllocator.h"
    3133#include "MarkedBlock.h"
    3234#include <wtf/DataLog.h>
     
    4244class MarkedSpace;
    4345class LLIntOffsetsExtractor;
     46class ThreadLocalCacheLayout;
    4447
    4548#define FOR_EACH_BLOCK_DIRECTORY_BIT(macro) \
     
    7780
    7881public:
    79     static ptrdiff_t offsetOfFreeList();
    80     static ptrdiff_t offsetOfCellSize();
    81 
    8282    BlockDirectory(Heap*, size_t cellSize);
    8383    void setSubspace(Subspace*);
     
    8585    void prepareForAllocation();
    8686    void stopAllocating();
     87    void stopAllocatingForGood();
    8788    void resumeAllocating();
    8889    void beginMarkingForFullCollection();
     
    9899    DestructionMode destruction() const { return m_attributes.destruction; }
    99100    HeapCell::Kind cellKind() const { return m_attributes.cellKind; }
    100     void* allocate(GCDeferralContext*, AllocationFailureMode);
    101101    Heap* heap() { return m_heap; }
    102102
    103     bool isFreeListedCell(const void* target) const;
     103    bool isFreeListedCell(const void* target);
    104104
    105105    template<typename Functor> void forEachBlock(const Functor&);
     
    158158    MarkedSpace& markedSpace() const;
    159159   
    160     const FreeList& freeList() const { return m_freeList; }
     160    Allocator allocator() const { return Allocator(m_tlcOffset); }
    161161   
    162162    void dump(PrintStream&) const;
     
    164164   
    165165private:
     166    friend class LocalAllocator;
    166167    friend class IsoCellSet;
    167168    friend class MarkedBlock;
    168    
    169     JS_EXPORT_PRIVATE void* allocateSlowCase(GCDeferralContext*, AllocationFailureMode failureMode);
    170     void didConsumeFreeList();
    171     void* tryAllocateWithoutCollecting();
     169    friend class ThreadLocalCacheLayout;
     170   
     171    MarkedBlock::Handle* findBlockForAllocation();
     172   
    172173    MarkedBlock::Handle* tryAllocateBlock();
    173     void* tryAllocateIn(MarkedBlock::Handle*);
    174     void* allocateIn(MarkedBlock::Handle*);
    175     ALWAYS_INLINE void doTestCollectionsIfNeeded(GCDeferralContext*);
    176    
    177     FreeList m_freeList;
    178174   
    179175    Vector<MarkedBlock::Handle*> m_blocks;
     
    194190    size_t m_unsweptCursor { 0 }; // Points to the next block that is a candidate for incremental sweeping.
    195191   
    196     MarkedBlock::Handle* m_currentBlock;
    197     MarkedBlock::Handle* m_lastActiveBlock;
    198 
    199     Lock m_lock;
    200192    unsigned m_cellSize;
    201193    CellAttributes m_attributes;
     
    207199    BlockDirectory* m_nextDirectoryInSubspace { nullptr };
    208200    BlockDirectory* m_nextDirectoryInAlignedMemoryAllocator { nullptr };
     201   
     202    Lock m_localAllocatorsLock;
     203    size_t m_tlcOffset;
     204    SentinelLinkedList<LocalAllocator, BasicRawSentinelNode<LocalAllocator>> m_localAllocators;
    209205};
    210206
    211 inline ptrdiff_t BlockDirectory::offsetOfFreeList()
    212 {
    213     return OBJECT_OFFSETOF(BlockDirectory, m_freeList);
    214 }
    215 
    216 inline ptrdiff_t BlockDirectory::offsetOfCellSize()
    217 {
    218     return OBJECT_OFFSETOF(BlockDirectory, m_cellSize);
    219 }
    220 
    221207} // namespace JSC
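
The m_tlcOffset field above is handed out by ThreadLocalCacheLayout (note the new friend declaration), which the constructor calls via heap->threadLocalCacheLayout().allocateOffset(this). A sketch of that registry's assumed shape; only allocateOffset and the friendship are visible in this diff, the rest is stand-in:

    #include <cstddef>
    #include <vector>

    struct BlockDirectorySketch {
        size_t m_tlcOffset { 0 };
    };

    class ThreadLocalCacheLayoutSketch {
    public:
        // Hand each directory a stable byte offset; every thread's cache then
        // keeps that directory's LocalAllocator at the same offset, which is
        // what makes Allocator(m_tlcOffset) meaningful on any thread.
        void allocateOffset(BlockDirectorySketch* directory)
        {
            directory->m_tlcOffset = m_size; // the friendship lets the real class do this
            m_size += localAllocatorStride;
            m_directories.push_back(directory);
        }

        size_t size() const { return m_size; }

    private:
        static constexpr size_t localAllocatorStride = 64; // made-up stride
        std::vector<BlockDirectorySketch*> m_directories;
        size_t m_size { 0 };
    };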
  • trunk/Source/JavaScriptCore/heap/BlockDirectoryInlines.h

    r227609 r227617  
    3232namespace JSC {
    3333
    34 inline bool BlockDirectory::isFreeListedCell(const void* target) const
    35 {
    36     return m_freeList.contains(bitwise_cast<HeapCell*>(target));
    37 }
    38 
    39 ALWAYS_INLINE void* BlockDirectory::allocate(GCDeferralContext* deferralContext, AllocationFailureMode failureMode)
    40 {
    41     return m_freeList.allocate(
    42         [&] () -> HeapCell* {
    43             sanitizeStackForVM(heap()->vm());
    44             return static_cast<HeapCell*>(allocateSlowCase(deferralContext, failureMode));
    45         });
    46 }
    47 
    4834template <typename Functor> inline void BlockDirectory::forEachBlock(const Functor& functor)
    4935{
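
The allocate() fast path deleted above does not disappear; it moves to LocalAllocator (LocalAllocatorInlines.h, not in this excerpt) so that each thread bump-allocates from its own FreeList. A sketch of that free-list fast path, reconstructed from the machine code emitted by emitAllocateWithNonNullAllocator further down; the cell layout (scrambled next pointer at offset 0) is inferred from the loadPtr(Address(resultGPR)) there:

    #include <cstdint>

    struct FreeCell {
        uintptr_t scrambledNext; // next free cell, XORed with the secret
    };

    struct FreeListSketch {
        char* payloadEnd { nullptr };   // end of the current bump-allocation run
        int32_t remaining { 0 };        // bytes left in the bump run
        uintptr_t scrambledHead { 0 };  // head of the cell list, XORed with secret
        uintptr_t secret { 0 };
        uint32_t cellSize { 0 };

        template<typename SlowPathFunc>
        void* allocate(const SlowPathFunc& slowPathFunc)
        {
            if (remaining) {
                // Bump path: cells come off at increasing addresses, ending at payloadEnd.
                char* result = payloadEnd - remaining;
                remaining -= cellSize;
                return result;
            }
            // Pop path: descramble the head; an empty list means the slow path.
            auto* head = reinterpret_cast<FreeCell*>(scrambledHead ^ secret);
            if (!head)
                return slowPathFunc();
            scrambledHead = head->scrambledNext; // stored value is already scrambled
            return head;
        }
    };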
  • trunk/Source/JavaScriptCore/heap/CompleteSubspace.cpp

    r227609 r227617  
    11/*
    2  * Copyright (C) 2017 Apple Inc. All rights reserved.
     2 * Copyright (C) 2017-2018 Apple Inc. All rights reserved.
    33 *
    44 * Redistribution and use in source and binary forms, with or without
     
    2727#include "Subspace.h"
    2828
     29#include "AlignedMemoryAllocator.h"
     30#include "AllocatorInlines.h"
    2931#include "BlockDirectoryInlines.h"
    3032#include "JSCInlines.h"
     33#include "LocalAllocatorInlines.h"
    3134#include "MarkedBlockInlines.h"
    3235#include "PreventCollectionScope.h"
    3336#include "SubspaceInlines.h"
     37#include "ThreadLocalCacheInlines.h"
    3438
    3539namespace JSC {
     
    3943{
    4044    initialize(heapCellType, alignedMemoryAllocator);
    41     for (size_t i = MarkedSpace::numSizeClasses; i--;)
    42         m_allocatorForSizeStep[i] = nullptr;
    4345}
    4446
     
    4749}
    4850
    49 BlockDirectory* CompleteSubspace::allocatorFor(size_t size, AllocatorForMode mode)
     51Allocator CompleteSubspace::allocatorFor(size_t size, AllocatorForMode mode)
    5052{
    5153    return allocatorForNonVirtual(size, mode);
    5254}
    5355
    54 void* CompleteSubspace::allocate(size_t size, GCDeferralContext* deferralContext, AllocationFailureMode failureMode)
     56void* CompleteSubspace::allocate(VM& vm, size_t size, GCDeferralContext* deferralContext, AllocationFailureMode failureMode)
    5557{
    56     return allocateNonVirtual(size, deferralContext, failureMode);
     58    return allocateNonVirtual(vm, size, deferralContext, failureMode);
    5759}
    5860
    59 void* CompleteSubspace::allocateNonVirtual(size_t size, GCDeferralContext* deferralContext, AllocationFailureMode failureMode)
     61void* CompleteSubspace::allocateNonVirtual(VM& vm, size_t size, GCDeferralContext* deferralContext, AllocationFailureMode failureMode)
    6062{
    61     void *result;
    62     if (BlockDirectory* allocator = allocatorForNonVirtual(size, AllocatorForMode::AllocatorIfExists))
    63         result = allocator->allocate(deferralContext, failureMode);
    64     else
    65         result = allocateSlow(size, deferralContext, failureMode);
    66     return result;
     63    Allocator allocator = allocatorForNonVirtual(size, AllocatorForMode::AllocatorIfExists);
     64    return allocator.tryAllocate(
     65        vm, deferralContext, failureMode,
     66        [&] () {
     67            return allocateSlow(vm, size, deferralContext, failureMode);
     68        });
    6769}
    6870
    69 BlockDirectory* CompleteSubspace::allocatorForSlow(size_t size)
     71Allocator CompleteSubspace::allocatorForSlow(size_t size)
    7072{
    7173    size_t index = MarkedSpace::sizeClassToIndex(size);
    7274    size_t sizeClass = MarkedSpace::s_sizeClassForSizeStep[index];
    7375    if (!sizeClass)
    74         return nullptr;
     76        return Allocator();
    7577   
    7678    // This is written in such a way that it's OK for the JIT threads to end up here if they want
     
    8385    // enough: it will have
    8486    auto locker = holdLock(m_space.directoryLock());
    85     if (BlockDirectory* allocator = m_allocatorForSizeStep[index])
     87    if (Allocator allocator = m_allocatorForSizeStep[index])
    8688        return allocator;
    8789
     
    99101            break;
    100102
    101         m_allocatorForSizeStep[index] = directory;
     103        m_allocatorForSizeStep[index] = directory->allocator();
    102104       
    103105        if (!index--)
     
    108110    WTF::storeStoreFence();
    109111    m_firstDirectory = directory;
    110     return directory;
     112    return directory->allocator();
    111113}
    112114
    113 void* CompleteSubspace::allocateSlow(size_t size, GCDeferralContext* deferralContext, AllocationFailureMode failureMode)
     115void* CompleteSubspace::allocateSlow(VM& vm, size_t size, GCDeferralContext* deferralContext, AllocationFailureMode failureMode)
    114116{
    115     void* result = tryAllocateSlow(size, deferralContext);
     117    void* result = tryAllocateSlow(vm, size, deferralContext);
    116118    if (failureMode == AllocationFailureMode::Assert)
    117119        RELEASE_ASSERT(result);
     
    119121}
    120122
    121 void* CompleteSubspace::tryAllocateSlow(size_t size, GCDeferralContext* deferralContext)
     123void* CompleteSubspace::tryAllocateSlow(VM& vm, size_t size, GCDeferralContext* deferralContext)
    122124{
    123     sanitizeStackForVM(m_space.heap()->vm());
     125    sanitizeStackForVM(&vm);
    124126   
    125     if (BlockDirectory* allocator = allocatorFor(size, AllocatorForMode::EnsureAllocator))
    126         return allocator->allocate(deferralContext, AllocationFailureMode::ReturnNull);
     127    if (Allocator allocator = allocatorFor(size, AllocatorForMode::EnsureAllocator))
     128        return allocator.allocate(vm, deferralContext, AllocationFailureMode::ReturnNull);
    127129   
    128130    if (size <= Options::largeAllocationCutoff()
     
    133135    }
    134136   
    135     m_space.heap()->collectIfNecessaryOrDefer(deferralContext);
     137    vm.heap.collectIfNecessaryOrDefer(deferralContext);
    136138   
    137139    size = WTF::roundUpToMultipleOf<MarkedSpace::sizeStep>(size);
    138     LargeAllocation* allocation = LargeAllocation::tryCreate(*m_space.m_heap, size, this);
     140    LargeAllocation* allocation = LargeAllocation::tryCreate(vm.heap, size, this);
    139141    if (!allocation)
    140142        return nullptr;
    141143   
    142144    m_space.m_largeAllocations.append(allocation);
    143     m_space.m_heap->didAllocate(size);
     145    vm.heap.didAllocate(size);
    144146    m_space.m_capacity += size;
    145147   
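
CompleteSubspace::allocateNonVirtual now threads VM& through and funnels everything into Allocator::tryAllocate with a slow-path lambda. AllocatorInlines.h is not in this excerpt, so the following is a sketch with stand-in names throughout; only the shape of the call (fast path or slowPathFunc) and the bounds-checked offset addressing are taken from this patch, and the JIT code below performs the same arithmetic on the same data:

    #include <cstdint>
    #include <functional>

    struct LocalAllocatorSketch {
        void* allocate() { return nullptr; /* per-thread FreeList fast path */ }
    };

    struct ThreadLocalCacheDataSketch {
        uint32_t size;       // how many offset bytes this thread's cache covers
        char allocators[64]; // LocalAllocators stored inline at fixed offsets
    };

    constexpr uint32_t nullAllocatorOffset = 0xFFFFFFFF; // assumed sentinel

    void* tryAllocateSketch(ThreadLocalCacheDataSketch* data, uint32_t offset,
        const std::function<void*()>& slowPathFunc)
    {
        // No allocator for this size class, or this thread's cache has not
        // materialized a LocalAllocator at this offset yet: take the slow path.
        if (offset == nullAllocatorOffset || offset >= data->size)
            return slowPathFunc();
        auto* local = reinterpret_cast<LocalAllocatorSketch*>(data->allocators + offset);
        return local->allocate();
    }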
  • trunk/Source/JavaScriptCore/heap/CompleteSubspace.h

    r227609 r227617  
    4040    // FIXME: Currently subspaces speak of BlockDirectories as "allocators", but that's temporary.
    4141    // https://bugs.webkit.org/show_bug.cgi?id=181559
    42     BlockDirectory* allocatorFor(size_t, AllocatorForMode) override;
    43     BlockDirectory* allocatorForNonVirtual(size_t, AllocatorForMode);
     42    Allocator allocatorFor(size_t, AllocatorForMode) override;
     43    Allocator allocatorForNonVirtual(size_t, AllocatorForMode);
    4444   
    45     void* allocate(size_t, GCDeferralContext*, AllocationFailureMode) override;
    46     JS_EXPORT_PRIVATE void* allocateNonVirtual(size_t, GCDeferralContext*, AllocationFailureMode);
     45    void* allocate(VM&, size_t, GCDeferralContext*, AllocationFailureMode) override;
     46    JS_EXPORT_PRIVATE void* allocateNonVirtual(VM&, size_t, GCDeferralContext*, AllocationFailureMode);
    4747   
    4848    static ptrdiff_t offsetOfAllocatorForSizeStep() { return OBJECT_OFFSETOF(CompleteSubspace, m_allocatorForSizeStep); }
    4949   
    50     BlockDirectory** allocatorForSizeStep() { return &m_allocatorForSizeStep[0]; }
     50    Allocator* allocatorForSizeStep() { return &m_allocatorForSizeStep[0]; }
    5151
    5252private:
    53     BlockDirectory* allocatorForSlow(size_t);
     53    Allocator allocatorForSlow(size_t);
    5454   
    5555    // These slow paths are concerned with large allocations and allocator creation.
    56     void* allocateSlow(size_t, GCDeferralContext*, AllocationFailureMode);
    57     void* tryAllocateSlow(size_t, GCDeferralContext*);
     56    void* allocateSlow(VM&, size_t, GCDeferralContext*, AllocationFailureMode);
     57    void* tryAllocateSlow(VM&, size_t, GCDeferralContext*);
    5858   
    59     std::array<BlockDirectory*, MarkedSpace::numSizeClasses> m_allocatorForSizeStep;
     59    std::array<Allocator, MarkedSpace::numSizeClasses> m_allocatorForSizeStep;
    6060    Vector<std::unique_ptr<BlockDirectory>> m_directories;
    6161};
    6262
    63 ALWAYS_INLINE BlockDirectory* CompleteSubspace::allocatorForNonVirtual(size_t size, AllocatorForMode mode)
     63ALWAYS_INLINE Allocator CompleteSubspace::allocatorForNonVirtual(size_t size, AllocatorForMode mode)
    6464{
    6565    if (size <= MarkedSpace::largeCutoff) {
    66         BlockDirectory* result = m_allocatorForSizeStep[MarkedSpace::sizeClassToIndex(size)];
     66        Allocator result = m_allocatorForSizeStep[MarkedSpace::sizeClassToIndex(size)];
    6767        switch (mode) {
    6868        case AllocatorForMode::MustAlreadyHaveAllocator:
     
    7979    }
    8080    RELEASE_ASSERT(mode != AllocatorForMode::MustAlreadyHaveAllocator);
    81     return nullptr;
     81    return Allocator();
    8282}
    8383
  • trunk/Source/JavaScriptCore/heap/FreeList.h

    r227609 r227617  
    5858
    5959class FreeList {
    60     WTF_MAKE_NONCOPYABLE(FreeList);
    61    
    6260public:
    6361    FreeList(unsigned cellSize);
  • trunk/Source/JavaScriptCore/heap/GCDeferralContext.h

    r227609 r227617  
    11/*
    2  * Copyright (C) 2016 Apple Inc. All rights reserved.
     2 * Copyright (C) 2016-2018 Apple Inc. All rights reserved.
    33 *
    44 * Redistribution and use in source and binary forms, with or without
     
    3030class Heap;
    3131class BlockDirectory;
     32class LocalAllocator;
    3233
    3334class GCDeferralContext {
    3435    friend class Heap;
    3536    friend class BlockDirectory;
     37    friend class LocalAllocator;
    3638public:
    3739    inline GCDeferralContext(Heap&);
  • trunk/Source/JavaScriptCore/heap/Heap.cpp

    r227609 r227617  
    6969#include "SweepingScope.h"
    7070#include "SynchronousStopTheWorldMutatorScheduler.h"
     71#include "ThreadLocalCacheLayout.h"
    7172#include "TypeProfiler.h"
    7273#include "TypeProfilerLog.h"
     
    313314    , m_threadLock(Box<Lock>::create())
    314315    , m_threadCondition(AutomaticThreadCondition::create())
     316    , m_threadLocalCacheLayout(std::make_unique<ThreadLocalCacheLayout>())
    315317{
    316318    m_worldState.store(0);
     
    447449   
    448450    m_arrayBuffers.lastChanceToFinalize();
    449     m_objectSpace.stopAllocating();
     451    m_objectSpace.stopAllocatingForGood();
    450452    m_objectSpace.lastChanceToFinalize();
    451453    releaseDelayedReleasedObjects();
  • trunk/Source/JavaScriptCore/heap/Heap.h

    r227609 r227617  
    8585class StopIfNecessaryTimer;
    8686class SweepingScope;
     87class ThreadLocalCacheLayout;
    8788class VM;
    8889class WeakGCMapBase;
     
    373374    template<typename Func>
    374375    void forEachSlotVisitor(const Func&);
     376   
     377    ThreadLocalCacheLayout& threadLocalCacheLayout() { return *m_threadLocalCacheLayout; }
    375378
    376379private:
     
    715718    CurrentThreadState* m_currentThreadState { nullptr };
    716719    WTF::Thread* m_currentThread { nullptr }; // It's OK if this becomes a dangling pointer.
     720   
     721    std::unique_ptr<ThreadLocalCacheLayout> m_threadLocalCacheLayout;
    717722};
    718723
  • trunk/Source/JavaScriptCore/heap/IsoCellSet.h

    r227609 r227617  
    2626#pragma once
    2727
     28#include "MarkedBlock.h"
    2829#include <wtf/Bitmap.h>
    2930#include <wtf/ConcurrentVector.h>
    3031#include <wtf/FastBitVector.h>
     32#include <wtf/SentinelLinkedList.h>
     33#include <wtf/SharedTask.h>
    3134
    3235namespace JSC {
  • trunk/Source/JavaScriptCore/heap/IsoSubspace.cpp

    r227609 r227617  
    2727#include "IsoSubspace.h"
    2828
     29#include "AllocatorInlines.h"
    2930#include "BlockDirectoryInlines.h"
    3031#include "IsoAlignedMemoryAllocator.h"
     32#include "LocalAllocatorInlines.h"
     33#include "ThreadLocalCacheInlines.h"
    3134
    3235namespace JSC {
     
    3639    , m_size(size)
    3740    , m_directory(&heap, WTF::roundUpToMultipleOf<MarkedBlock::atomSize>(size))
     41    , m_allocator(m_directory.allocator())
    3842    , m_isoAlignedMemoryAllocator(std::make_unique<IsoAlignedMemoryAllocator>())
    3943{
     
    5155}
    5256
    53 BlockDirectory* IsoSubspace::allocatorFor(size_t size, AllocatorForMode mode)
     57Allocator IsoSubspace::allocatorFor(size_t size, AllocatorForMode mode)
    5458{
    5559    return allocatorForNonVirtual(size, mode);
    5660}
    5761
    58 void* IsoSubspace::allocate(size_t size, GCDeferralContext* deferralContext, AllocationFailureMode failureMode)
     62void* IsoSubspace::allocate(VM& vm, size_t size, GCDeferralContext* deferralContext, AllocationFailureMode failureMode)
    5963{
    60     return allocateNonVirtual(size, deferralContext, failureMode);
     64    return allocateNonVirtual(vm, size, deferralContext, failureMode);
    6165}
    6266
    63 void* IsoSubspace::allocateNonVirtual(size_t size, GCDeferralContext* deferralContext, AllocationFailureMode failureMode)
     67void* IsoSubspace::allocateNonVirtual(VM& vm, size_t size, GCDeferralContext* deferralContext, AllocationFailureMode failureMode)
    6468{
    6569    RELEASE_ASSERT(size == this->size());
    66     void* result = m_directory.allocate(deferralContext, failureMode);
     70    void* result = m_allocator.allocate(vm, deferralContext, failureMode);
    6771    return result;
    6872}
  • trunk/Source/JavaScriptCore/heap/IsoSubspace.h

    r227609 r227617  
    4242    size_t size() const { return m_size; }
    4343
    44     BlockDirectory* allocatorFor(size_t, AllocatorForMode) override;
    45     BlockDirectory* allocatorForNonVirtual(size_t, AllocatorForMode);
     44    Allocator allocatorFor(size_t, AllocatorForMode) override;
     45    Allocator allocatorForNonVirtual(size_t, AllocatorForMode);
    4646
    47     void* allocate(size_t, GCDeferralContext*, AllocationFailureMode) override;
    48     JS_EXPORT_PRIVATE void* allocateNonVirtual(size_t, GCDeferralContext*, AllocationFailureMode);
     47    void* allocate(VM&, size_t, GCDeferralContext*, AllocationFailureMode) override;
     48    JS_EXPORT_PRIVATE void* allocateNonVirtual(VM&, size_t, GCDeferralContext*, AllocationFailureMode);
    4949
    5050private:
     
    5757    size_t m_size;
    5858    BlockDirectory m_directory;
     59    Allocator m_allocator;
    5960    std::unique_ptr<IsoAlignedMemoryAllocator> m_isoAlignedMemoryAllocator;
    6061    SentinelLinkedList<IsoCellSet, BasicRawSentinelNode<IsoCellSet>> m_cellSets;
    6162};
    6263
    63 inline BlockDirectory* IsoSubspace::allocatorForNonVirtual(size_t size, AllocatorForMode)
     64inline Allocator IsoSubspace::allocatorForNonVirtual(size_t size, AllocatorForMode)
    6465{
    6566    RELEASE_ASSERT(size == this->size());
    66     return &m_directory;
     67    return m_allocator;
    6768}
    6869
  • trunk/Source/JavaScriptCore/heap/MarkedBlock.cpp

    r227609 r227617  
    400400        return;
    401401
    402     RELEASE_ASSERT(!m_isFreeListed);
    403     RELEASE_ASSERT(!isAllocated());
     402    if (m_isFreeListed) {
     403        dataLog("FATAL: ", RawPointer(this), "->sweep: block is free-listed.\n");
     404        RELEASE_ASSERT_NOT_REACHED();
     405    }
     406   
     407    if (isAllocated()) {
     408        dataLog("FATAL: ", RawPointer(this), "->sweep: block is allocated.\n");
     409        RELEASE_ASSERT_NOT_REACHED();
     410    }
    404411   
    405412    if (space()->isMarking())
  • trunk/Source/JavaScriptCore/heap/MarkedSpace.cpp

    r227609 r227617  
    303303}
    304304
     305void MarkedSpace::stopAllocatingForGood()
     306{
     307    ASSERT(!isIterating());
     308    forEachDirectory(
     309        [&] (BlockDirectory& directory) -> IterationStatus {
     310            directory.stopAllocatingForGood();
     311            return IterationStatus::Continue;
     312        });
     313}
     314
    305315void MarkedSpace::prepareForConservativeScan()
    306316{
  • trunk/Source/JavaScriptCore/heap/MarkedSpace.h

    r227609 r227617  
    9595    Heap* heap() const { return m_heap; }
    9696   
    97     void lastChanceToFinalize(); // You must call stopAllocating before you call this.
     97    void lastChanceToFinalize(); // Must call stopAllocatingForGood first.
    9898    void freeMemory();
    9999
     
    112112
    113113    void stopAllocating();
     114    void stopAllocatingForGood();
    114115    void resumeAllocating(); // If we just stopped allocation but we didn't do a collection, we need to resume allocation.
    115116   
  • trunk/Source/JavaScriptCore/heap/SlotVisitor.cpp

    r227609 r227617  
    3838#include "JSString.h"
    3939#include "JSCInlines.h"
     40#include "MarkingConstraintSolver.h"
    4041#include "SlotVisitorInlines.h"
    4142#include "StopIfNecessaryTimer.h"
  • trunk/Source/JavaScriptCore/heap/SlotVisitor.h

    r227609 r227617  
    3232#include <wtf/Forward.h>
    3333#include <wtf/MonotonicTime.h>
     34#include <wtf/SharedTask.h>
    3435#include <wtf/text/CString.h>
    3536
     
    4243class HeapSnapshotBuilder;
    4344class MarkedBlock;
     45class MarkingConstraint;
    4446class MarkingConstraintSolver;
    4547class UnconditionalFinalizer;
  • trunk/Source/JavaScriptCore/heap/Subspace.h

    r227609 r227617  
    2828#include "AllocationFailureMode.h"
    2929#include "AllocatorForMode.h"
     30#include "Allocator.h"
    3031#include "MarkedBlock.h"
    3132#include "MarkedSpace.h"
     
    5758    void destroy(VM&, JSCell*);
    5859
    59     virtual BlockDirectory* allocatorFor(size_t, AllocatorForMode) = 0;
    60     virtual void* allocate(size_t, GCDeferralContext*, AllocationFailureMode) = 0;
     60    virtual Allocator allocatorFor(size_t, AllocatorForMode) = 0;
     61    virtual void* allocate(VM&, size_t, GCDeferralContext*, AllocationFailureMode) = 0;
    6162   
    6263    void prepareForAllocation();
  • trunk/Source/JavaScriptCore/jit/AssemblyHelpers.cpp

    r227609 r227617  
    11/*
    2  * Copyright (C) 2011-2017 Apple Inc. All rights reserved.
     2 * Copyright (C) 2011-2018 Apple Inc. All rights reserved.
    33 *
    44 * Redistribution and use in source and binary forms, with or without
     
    582582}
    583583#endif
     584
     585void AssemblyHelpers::emitAllocateWithNonNullAllocator(GPRReg resultGPR, const JITAllocator& allocator, GPRReg allocatorGPR, GPRReg scratchGPR, JumpList& slowPath)
     586{
     587    // NOTE: This is carefully written so that we can call it while we disallow scratch
     588    // register usage.
     589       
     590    if (Options::forceGCSlowPaths()) {
     591        slowPath.append(jump());
     592        return;
     593    }
     594   
     595    Jump popPath;
     596    Jump done;
     597   
     598#if ENABLE(FAST_TLS_JIT)
     599    loadFromTLSPtr(fastTLSOffsetForKey(WTF_GC_TLC_KEY), scratchGPR);
     600#else
     601    loadPtr(&vm().threadLocalCacheData, scratchGPR);
     602#endif
     603    if (!isX86())
     604        load32(Address(scratchGPR, ThreadLocalCache::offsetOfSizeInData()), resultGPR);
     605    if (allocator.isConstant()) {
     606        if (isX86())
     607            slowPath.append(branch32(BelowOrEqual, Address(scratchGPR, ThreadLocalCache::offsetOfSizeInData()), TrustedImm32(allocator.allocator().offset())));
     608        else
     609            slowPath.append(branch32(BelowOrEqual, resultGPR, TrustedImm32(allocator.allocator().offset())));
     610        addPtr(TrustedImm32(ThreadLocalCache::offsetOfFirstAllocatorInData() + allocator.allocator().offset()), scratchGPR, allocatorGPR);
     611    } else {
     612        if (isX86())
     613            slowPath.append(branch32(BelowOrEqual, Address(scratchGPR, ThreadLocalCache::offsetOfSizeInData()), allocatorGPR));
     614        else
     615            slowPath.append(branch32(BelowOrEqual, resultGPR, allocatorGPR));
     616        addPtr(TrustedImm32(ThreadLocalCache::offsetOfFirstAllocatorInData()), allocatorGPR);
     617        addPtr(scratchGPR, allocatorGPR);
     618    }
     619
     620    load32(Address(allocatorGPR, LocalAllocator::offsetOfFreeList() + FreeList::offsetOfRemaining()), resultGPR);
     621    popPath = branchTest32(Zero, resultGPR);
     622    if (allocator.isConstant())
     623        add32(TrustedImm32(-allocator.allocator().cellSize(vm().heap)), resultGPR, scratchGPR);
     624    else {
     625        if (isX86()) {
     626            move(resultGPR, scratchGPR);
     627            sub32(Address(allocatorGPR, LocalAllocator::offsetOfCellSize()), scratchGPR);
     628        } else {
     629            load32(Address(allocatorGPR, LocalAllocator::offsetOfCellSize()), scratchGPR);
     630            sub32(resultGPR, scratchGPR, scratchGPR);
     631        }
     632    }
     633    negPtr(resultGPR);
     634    store32(scratchGPR, Address(allocatorGPR, LocalAllocator::offsetOfFreeList() + FreeList::offsetOfRemaining()));
     635    Address payloadEndAddr = Address(allocatorGPR, LocalAllocator::offsetOfFreeList() + FreeList::offsetOfPayloadEnd());
     636    if (isX86())
     637        addPtr(payloadEndAddr, resultGPR);
     638    else {
     639        loadPtr(payloadEndAddr, scratchGPR);
     640        addPtr(scratchGPR, resultGPR);
     641    }
     642       
     643    done = jump();
     644       
     645    popPath.link(this);
     646       
     647    loadPtr(Address(allocatorGPR, LocalAllocator::offsetOfFreeList() + FreeList::offsetOfScrambledHead()), resultGPR);
     648    if (isX86())
     649        xorPtr(Address(allocatorGPR, LocalAllocator::offsetOfFreeList() + FreeList::offsetOfSecret()), resultGPR);
     650    else {
     651        loadPtr(Address(allocatorGPR, LocalAllocator::offsetOfFreeList() + FreeList::offsetOfSecret()), scratchGPR);
     652        xorPtr(scratchGPR, resultGPR);
     653    }
     654    slowPath.append(branchTestPtr(Zero, resultGPR));
     655       
     656    // The object is half-allocated: we have what we know is a fresh object, but
     657    // it's still on the GC's free list.
     658    loadPtr(Address(resultGPR), scratchGPR);
     659    storePtr(scratchGPR, Address(allocatorGPR, LocalAllocator::offsetOfFreeList() + FreeList::offsetOfScrambledHead()));
     660       
     661    done.link(this);
     662}
     663
     664void AssemblyHelpers::emitAllocate(GPRReg resultGPR, const JITAllocator& allocator, GPRReg allocatorGPR, GPRReg scratchGPR, JumpList& slowPath)
     665{
     666    if (allocator.isConstant()) {
     667        if (!allocator.allocator()) {
     668            slowPath.append(jump());
     669            return;
     670        }
     671    }
     672    emitAllocateWithNonNullAllocator(resultGPR, allocator, allocatorGPR, scratchGPR, slowPath);
     673}
     674
     675void AssemblyHelpers::emitAllocateVariableSized(GPRReg resultGPR, CompleteSubspace& subspace, GPRReg allocationSize, GPRReg scratchGPR1, GPRReg scratchGPR2, JumpList& slowPath)
     676{
     677    static_assert(!(MarkedSpace::sizeStep & (MarkedSpace::sizeStep - 1)), "MarkedSpace::sizeStep must be a power of two.");
     678   
     679    unsigned stepShift = getLSBSet(MarkedSpace::sizeStep);
     680   
     681    add32(TrustedImm32(MarkedSpace::sizeStep - 1), allocationSize, scratchGPR1);
     682    urshift32(TrustedImm32(stepShift), scratchGPR1);
     683    slowPath.append(branch32(Above, scratchGPR1, TrustedImm32(MarkedSpace::largeCutoff >> stepShift)));
     684    move(TrustedImmPtr(subspace.allocatorForSizeStep() - 1), scratchGPR2);
     685    load32(BaseIndex(scratchGPR2, scratchGPR1, TimesFour), scratchGPR1);
     686   
     687    emitAllocate(resultGPR, JITAllocator::variable(), scratchGPR1, scratchGPR2, slowPath);
     688}
    584689
    585690void AssemblyHelpers::restoreCalleeSavesFromEntryFrameCalleeSavesBuffer(EntryFrame*& topEntryFrame)
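
The negPtr trick in emitAllocateWithNonNullAllocator above is easy to misread, so here is a self-checking trace of the bump-path arithmetic with made-up numbers (the x86 special-casing exists only because x86 can fold the loads into sub32/addPtr memory operands and skip the extra load into scratchGPR):

    #include <cassert>
    #include <cstdint>

    int main()
    {
        uintptr_t payloadEnd = 0x1000; // FreeList::offsetOfPayloadEnd()
        int32_t remaining = 48;        // FreeList::offsetOfRemaining()
        const int32_t cellSize = 16;

        uintptr_t results[3];
        for (int i = 0; i < 3; ++i) {
            int32_t result = remaining;           // load32 remaining -> resultGPR
            int32_t scratch = result - cellSize;  // add32(TrustedImm32(-cellSize), ...)
            intptr_t negated = -intptr_t(result); // negPtr(resultGPR)
            remaining = scratch;                  // store32 scratchGPR -> remaining
            results[i] = payloadEnd + negated;    // addPtr(payloadEndAddr, resultGPR)
        }

        // Cells come off the run at increasing addresses, ending at payloadEnd:
        assert(results[0] == 0xFD0 && results[1] == 0xFE0 && results[2] == 0xFF0);
        assert(!remaining); // the next allocation takes the pop path
        return 0;
    }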
  • trunk/Source/JavaScriptCore/jit/AssemblyHelpers.h

    r227609 r227617  
    11/*
    2  * Copyright (C) 2011-2017 Apple Inc. All rights reserved.
     2 * Copyright (C) 2011-2018 Apple Inc. All rights reserved.
    33 *
    44 * Redistribution and use in source and binary forms, with or without
     
    3333#include "Heap.h"
    3434#include "InlineCallFrame.h"
     35#include "JITAllocator.h"
    3536#include "JITCode.h"
    3637#include "MacroAssembler.h"
     
    6061
    6162    CodeBlock* codeBlock() { return m_codeBlock; }
     63    VM& vm() { return *m_codeBlock->vm(); }
    6264    AssemblerType_T& assembler() { return m_assembler; }
    6365
     
    14921494    // Call this if you know that the value held in allocatorGPR is non-null. This DOES NOT mean
    14931495    // that allocator is non-null; allocator can be null as a signal that we don't know what the
    1494     // value of allocatorGPR is.
    1495     void emitAllocateWithNonNullAllocator(GPRReg resultGPR, BlockDirectory* allocator, GPRReg allocatorGPR, GPRReg scratchGPR, JumpList& slowPath)
    1496     {
    1497         // NOTE: This is carefully written so that we can call it while we disallow scratch
    1498         // register usage.
    1499        
    1500         if (Options::forceGCSlowPaths()) {
    1501             slowPath.append(jump());
    1502             return;
    1503         }
    1504        
    1505         Jump popPath;
    1506         Jump done;
    1507        
    1508         load32(Address(allocatorGPR, BlockDirectory::offsetOfFreeList() + FreeList::offsetOfRemaining()), resultGPR);
    1509         popPath = branchTest32(Zero, resultGPR);
    1510         if (allocator)
    1511             add32(TrustedImm32(-allocator->cellSize()), resultGPR, scratchGPR);
    1512         else {
    1513             if (isX86()) {
    1514                 move(resultGPR, scratchGPR);
    1515                 sub32(Address(allocatorGPR, BlockDirectory::offsetOfCellSize()), scratchGPR);
    1516             } else {
    1517                 load32(Address(allocatorGPR, BlockDirectory::offsetOfCellSize()), scratchGPR);
    1518                 sub32(resultGPR, scratchGPR, scratchGPR);
    1519             }
    1520         }
    1521         negPtr(resultGPR);
    1522         store32(scratchGPR, Address(allocatorGPR, BlockDirectory::offsetOfFreeList() + FreeList::offsetOfRemaining()));
    1523         Address payloadEndAddr = Address(allocatorGPR, BlockDirectory::offsetOfFreeList() + FreeList::offsetOfPayloadEnd());
    1524         if (isX86())
    1525             addPtr(payloadEndAddr, resultGPR);
    1526         else {
    1527             loadPtr(payloadEndAddr, scratchGPR);
    1528             addPtr(scratchGPR, resultGPR);
    1529         }
    1530        
    1531         done = jump();
    1532        
    1533         popPath.link(this);
    1534        
    1535         loadPtr(Address(allocatorGPR, BlockDirectory::offsetOfFreeList() + FreeList::offsetOfScrambledHead()), resultGPR);
    1536         if (isX86())
    1537             xorPtr(Address(allocatorGPR, BlockDirectory::offsetOfFreeList() + FreeList::offsetOfSecret()), resultGPR);
    1538         else {
    1539             loadPtr(Address(allocatorGPR, BlockDirectory::offsetOfFreeList() + FreeList::offsetOfSecret()), scratchGPR);
    1540             xorPtr(scratchGPR, resultGPR);
    1541         }
    1542         slowPath.append(branchTestPtr(Zero, resultGPR));
    1543        
    1544         // The object is half-allocated: we have what we know is a fresh object, but
    1545         // it's still on the GC's free list.
    1546         loadPtr(Address(resultGPR), scratchGPR);
    1547         storePtr(scratchGPR, Address(allocatorGPR, BlockDirectory::offsetOfFreeList() + FreeList::offsetOfScrambledHead()));
    1548        
    1549         done.link(this);
    1550     }
    1551    
    1552     void emitAllocate(GPRReg resultGPR, BlockDirectory* allocator, GPRReg allocatorGPR, GPRReg scratchGPR, JumpList& slowPath)
    1553     {
    1554         if (!allocator)
    1555             slowPath.append(branchTestPtr(Zero, allocatorGPR));
    1556         emitAllocateWithNonNullAllocator(resultGPR, allocator, allocatorGPR, scratchGPR, slowPath);
    1557     }
     1496    // value of allocatorGPR is. Additionally, if the allocator is not null, then there is no need
     1497    // to populate allocatorGPR - this code will ignore the contents of allocatorGPR.
     1498    void emitAllocateWithNonNullAllocator(GPRReg resultGPR, const JITAllocator& allocator, GPRReg allocatorGPR, GPRReg scratchGPR, JumpList& slowPath);
     1499   
     1500    void emitAllocate(GPRReg resultGPR, const JITAllocator& allocator, GPRReg allocatorGPR, GPRReg scratchGPR, JumpList& slowPath);
    15581501   
    15591502    template<typename StructureType>
    1560     void emitAllocateJSCell(GPRReg resultGPR, BlockDirectory* allocator, GPRReg allocatorGPR, StructureType structure, GPRReg scratchGPR, JumpList& slowPath)
     1503    void emitAllocateJSCell(GPRReg resultGPR, const JITAllocator& allocator, GPRReg allocatorGPR, StructureType structure, GPRReg scratchGPR, JumpList& slowPath)
    15611504    {
    15621505        emitAllocate(resultGPR, allocator, allocatorGPR, scratchGPR, slowPath);
     
    15651508   
    15661509    template<typename StructureType, typename StorageType, typename MaskType>
    1567     void emitAllocateJSObject(GPRReg resultGPR, BlockDirectory* allocator, GPRReg allocatorGPR, StructureType structure, StorageType storage, MaskType mask, GPRReg scratchGPR, JumpList& slowPath)
     1510    void emitAllocateJSObject(GPRReg resultGPR, const JITAllocator& allocator, GPRReg allocatorGPR, StructureType structure, StorageType storage, MaskType mask, GPRReg scratchGPR, JumpList& slowPath)
    15681511    {
    15691512        emitAllocateJSCell(resultGPR, allocator, allocatorGPR, structure, scratchGPR, slowPath);
     
    15771520        GPRReg scratchGPR2, JumpList& slowPath, size_t size)
    15781521    {
    1579         BlockDirectory* allocator = subspaceFor<ClassType>(vm)->allocatorForNonVirtual(size, AllocatorForMode::AllocatorIfExists);
    1580         if (!allocator) {
    1581             slowPath.append(jump());
    1582             return;
    1583         }
    1584         move(TrustedImmPtr(allocator), scratchGPR1);
    1585         emitAllocateJSObject(resultGPR, allocator, scratchGPR1, structure, storage, mask, scratchGPR2, slowPath);
     1522        Allocator allocator = subspaceFor<ClassType>(vm)->allocatorForNonVirtual(size, AllocatorForMode::AllocatorIfExists);
     1523        emitAllocateJSObject(resultGPR, JITAllocator::constant(allocator), scratchGPR1, structure, storage, mask, scratchGPR2, slowPath);
    15861524    }
    15871525   
     
    15941532    // allocationSize can be aliased with any of the other input GPRs. If it's not aliased then it
    15951533    // won't be clobbered.
    1596     void emitAllocateVariableSized(GPRReg resultGPR, CompleteSubspace& subspace, GPRReg allocationSize, GPRReg scratchGPR1, GPRReg scratchGPR2, JumpList& slowPath)
    1597     {
    1598         static_assert(!(MarkedSpace::sizeStep & (MarkedSpace::sizeStep - 1)), "MarkedSpace::sizeStep must be a power of two.");
    1599        
    1600         unsigned stepShift = getLSBSet(MarkedSpace::sizeStep);
    1601        
    1602         add32(TrustedImm32(MarkedSpace::sizeStep - 1), allocationSize, scratchGPR1);
    1603         urshift32(TrustedImm32(stepShift), scratchGPR1);
    1604         slowPath.append(branch32(Above, scratchGPR1, TrustedImm32(MarkedSpace::largeCutoff >> stepShift)));
    1605         move(TrustedImmPtr(subspace.allocatorForSizeStep() - 1), scratchGPR2);
    1606         loadPtr(BaseIndex(scratchGPR2, scratchGPR1, timesPtr()), scratchGPR1);
    1607        
    1608         emitAllocate(resultGPR, nullptr, scratchGPR1, scratchGPR2, slowPath);
    1609     }
     1534    void emitAllocateVariableSized(GPRReg resultGPR, CompleteSubspace& subspace, GPRReg allocationSize, GPRReg scratchGPR1, GPRReg scratchGPR2, JumpList& slowPath);
    16101535   
    16111536    template<typename ClassType, typename StructureType>
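
The declarations above swap BlockDirectory* for JITAllocator throughout the emitAllocate family. A hedged sketch of the two types those signatures imply; the layouts are illustrative, and only the names Allocator, Allocator::offset(), JITAllocator::constant(), and JITAllocator::variable() are confirmed by this changeset (the JITOpcodes hunks below narrow the allocator load from loadPtr to load32, hence the 32-bit field):

    #include <cstdint>

    // An Allocator holds a 32-bit offset; a default-constructed one means
    // "no allocator" (the sentinel value here is an assumption).
    class Allocator {
    public:
        Allocator() = default;
        explicit Allocator(uint32_t offset) : m_offset(offset) { }
        uint32_t offset() const { return m_offset; }
        explicit operator bool() const { return m_offset != UINT32_MAX; }
    private:
        uint32_t m_offset { UINT32_MAX };
    };

    // A JITAllocator records whether the allocator is known when the code is
    // emitted (constant) or arrives at runtime in allocatorGPR (variable).
    class JITAllocator {
    public:
        static JITAllocator constant(Allocator allocator)
        {
            JITAllocator result;
            result.m_isConstant = true;
            result.m_allocator = allocator;
            return result;
        }
        static JITAllocator variable() { return JITAllocator(); }

        bool isConstant() const { return m_isConstant; }
        Allocator allocator() const { return m_allocator; }

    private:
        bool m_isConstant { false };
        Allocator m_allocator;
    };
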
  • trunk/Source/JavaScriptCore/jit/JITOpcodes.cpp

    r227609 r227617  
    11/*
    2  * Copyright (C) 2009-2017 Apple Inc. All rights reserved.
     2 * Copyright (C) 2009-2018 Apple Inc. All rights reserved.
    33 * Copyright (C) 2010 Patrick Gansterer <paroga@paroga.com>
    44 *
     
    8282    Structure* structure = currentInstruction[3].u.objectAllocationProfile->structure();
    8383    size_t allocationSize = JSFinalObject::allocationSize(structure->inlineCapacity());
    84     BlockDirectory* allocator = subspaceFor<JSFinalObject>(*m_vm)->allocatorForNonVirtual(allocationSize, AllocatorForMode::AllocatorIfExists);
     84    Allocator allocator = subspaceFor<JSFinalObject>(*m_vm)->allocatorForNonVirtual(allocationSize, AllocatorForMode::AllocatorIfExists);
    8585
    8686    RegisterID resultReg = regT0;
     
    8888    RegisterID scratchReg = regT2;
    8989
    90     move(TrustedImmPtr(allocator), allocatorReg);
    91     if (allocator)
    92         addSlowCase(Jump());
    93     JumpList slowCases;
    94     auto butterfly = TrustedImmPtr(nullptr);
    95     auto mask = TrustedImm32(0);
    96     emitAllocateJSObject(resultReg, allocator, allocatorReg, TrustedImmPtr(structure), butterfly, mask, scratchReg, slowCases);
    97     emitInitializeInlineStorage(resultReg, structure->inlineCapacity());
    98     addSlowCase(slowCases);
    99     emitPutVirtualRegister(currentInstruction[1].u.operand);
     90    if (!allocator)
     91        addSlowCase(jump());
     92    else {
     93        JumpList slowCases;
     94        auto butterfly = TrustedImmPtr(nullptr);
     95        auto mask = TrustedImm32(0);
     96        emitAllocateJSObject(resultReg, JITAllocator::constant(allocator), allocatorReg, TrustedImmPtr(structure), butterfly, mask, scratchReg, slowCases);
     97        emitInitializeInlineStorage(resultReg, structure->inlineCapacity());
     98        addSlowCase(slowCases);
     99        emitPutVirtualRegister(currentInstruction[1].u.operand);
     100    }
    100101}
    101102
     
    767768    loadPtr(Address(calleeReg, JSFunction::offsetOfRareData()), rareDataReg);
    768769    addSlowCase(branchTestPtr(Zero, rareDataReg));
    769     loadPtr(Address(rareDataReg, FunctionRareData::offsetOfObjectAllocationProfile() + ObjectAllocationProfile::offsetOfAllocator()), allocatorReg);
     770    load32(Address(rareDataReg, FunctionRareData::offsetOfObjectAllocationProfile() + ObjectAllocationProfile::offsetOfAllocator()), allocatorReg);
    770771    loadPtr(Address(rareDataReg, FunctionRareData::offsetOfObjectAllocationProfile() + ObjectAllocationProfile::offsetOfStructure()), structureReg);
    771     addSlowCase(branchTestPtr(Zero, allocatorReg));
     772    addSlowCase(branch32(Equal, allocatorReg, TrustedImm32(Allocator().offset())));
    772773
    773774    loadPtr(cachedFunction, cachedFunctionReg);
     
    779780    auto butterfly = TrustedImmPtr(nullptr);
    780781    auto mask = TrustedImm32(0);
    781     emitAllocateJSObject(resultReg, nullptr, allocatorReg, structureReg, butterfly, mask, scratchReg, slowCases);
     782    emitAllocateJSObject(resultReg, JITAllocator::variable(), allocatorReg, structureReg, butterfly, mask, scratchReg, slowCases);
    782783    emitGetVirtualRegister(callee, scratchReg);
    783784    loadPtr(Address(scratchReg, JSFunction::offsetOfRareData()), scratchReg);
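
Two call shapes appear in this file. op_new_object resolves its Allocator while compiling, so a missing allocator becomes an unconditional slow case and a present one is baked in via JITAllocator::constant(). op_create_this only learns its Allocator at runtime from the ObjectAllocationProfile, which is why the load narrows from loadPtr to load32 and the "no allocator" test becomes a 32-bit compare against Allocator().offset() rather than a null-pointer test. A hedged sketch of that runtime check in plain C++; the struct and sentinel value are assumptions:

    #include <cstdint>

    struct ObjectAllocationProfileSketch {
        uint32_t allocatorOffset; // held a BlockDirectory* before this change
    };

    // Equivalent of: addSlowCase(branch32(Equal, allocatorReg,
    //                                     TrustedImm32(Allocator().offset())));
    bool shouldTakeSlowCase(const ObjectAllocationProfileSketch& profile)
    {
        const uint32_t noAllocator = UINT32_MAX; // assumed Allocator().offset()
        return profile.allocatorOffset == noAllocator;
    }
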
  • trunk/Source/JavaScriptCore/jit/JITOpcodes32_64.cpp

    r227609 r227617  
    11/*
    2  * Copyright (C) 2009-2017 Apple Inc. All rights reserved.
     2 * Copyright (C) 2009-2018 Apple Inc. All rights reserved.
    33 * Copyright (C) 2010 Patrick Gansterer <paroga@paroga.com>
    44 *
     
    8080    Structure* structure = currentInstruction[3].u.objectAllocationProfile->structure();
    8181    size_t allocationSize = JSFinalObject::allocationSize(structure->inlineCapacity());
    82     BlockDirectory* allocator = subspaceFor<JSFinalObject>(*m_vm)->allocatorForNonVirtual(allocationSize, AllocatorForMode::AllocatorIfExists);
     82    Allocator allocator = subspaceFor<JSFinalObject>(*m_vm)->allocatorForNonVirtual(allocationSize, AllocatorForMode::AllocatorIfExists);
    8383
    8484    RegisterID resultReg = returnValueGPR;
     
    8686    RegisterID scratchReg = regT3;
    8787
    88     move(TrustedImmPtr(allocator), allocatorReg);
    89     if (allocator)
    90         addSlowCase(Jump());
    91     JumpList slowCases;
    92     auto butterfly = TrustedImmPtr(nullptr);
    93     auto mask = TrustedImm32(0);
    94     emitAllocateJSObject(resultReg, allocator, allocatorReg, TrustedImmPtr(structure), butterfly, mask, scratchReg, slowCases);
    95     emitInitializeInlineStorage(resultReg, structure->inlineCapacity());
    96     addSlowCase(slowCases);
    97     emitStoreCell(currentInstruction[1].u.operand, resultReg);
     88    if (!allocator)
     89        addSlowCase(jump());
     90    else {
     91        JumpList slowCases;
     92        auto butterfly = TrustedImmPtr(nullptr);
     93        auto mask = TrustedImm32(0);
     94        emitAllocateJSObject(resultReg, JITAllocator::constant(allocator), allocatorReg, TrustedImmPtr(structure), butterfly, mask, scratchReg, slowCases);
     95        emitInitializeInlineStorage(resultReg, structure->inlineCapacity());
     96        addSlowCase(slowCases);
     97        emitStoreCell(currentInstruction[1].u.operand, resultReg);
     98    }
    9899}
    99100
     
    857858    loadPtr(Address(calleeReg, JSFunction::offsetOfRareData()), rareDataReg);
    858859    addSlowCase(branchTestPtr(Zero, rareDataReg));
    859     loadPtr(Address(rareDataReg, FunctionRareData::offsetOfObjectAllocationProfile() + ObjectAllocationProfile::offsetOfAllocator()), allocatorReg);
     860    load32(Address(rareDataReg, FunctionRareData::offsetOfObjectAllocationProfile() + ObjectAllocationProfile::offsetOfAllocator()), allocatorReg);
    860861    loadPtr(Address(rareDataReg, FunctionRareData::offsetOfObjectAllocationProfile() + ObjectAllocationProfile::offsetOfStructure()), structureReg);
    861     addSlowCase(branchTestPtr(Zero, allocatorReg));
     862    addSlowCase(branch32(Equal, allocatorReg, TrustedImm32(Allocator().offset())));
    862863
    863864    loadPtr(cachedFunction, cachedFunctionReg);
     
    869870    auto butterfly = TrustedImmPtr(nullptr);
    870871    auto mask = TrustedImm32(0);
    871     emitAllocateJSObject(resultReg, nullptr, allocatorReg, structureReg, butterfly, mask, scratchReg, slowCases);
     872    emitAllocateJSObject(resultReg, JITAllocator::variable(), allocatorReg, structureReg, butterfly, mask, scratchReg, slowCases);
    872873    addSlowCase(slowCases);
    873874    emitStoreCell(currentInstruction[1].u.operand, resultReg);
  • trunk/Source/JavaScriptCore/runtime/ButterflyInlines.h

    r227609 r227617  
    8080{
    8181    size_t size = totalSize(preCapacity, propertyCapacity, hasIndexingHeader, indexingPayloadSizeInBytes);
    82     void* base = vm.jsValueGigacageAuxiliarySpace.allocateNonVirtual(size, nullptr, AllocationFailureMode::Assert);
     82    void* base = vm.jsValueGigacageAuxiliarySpace.allocateNonVirtual(vm, size, nullptr, AllocationFailureMode::Assert);
    8383    Butterfly* result = fromBase(base, preCapacity, propertyCapacity);
    8484    return result;
     
    8888{
    8989    size_t size = totalSize(preCapacity, propertyCapacity, hasIndexingHeader, indexingPayloadSizeInBytes);
    90     void* base = vm.jsValueGigacageAuxiliarySpace.allocateNonVirtual(size, nullptr, AllocationFailureMode::ReturnNull);
     90    void* base = vm.jsValueGigacageAuxiliarySpace.allocateNonVirtual(vm, size, nullptr, AllocationFailureMode::ReturnNull);
    9191    if (!base)
    9292        return nullptr;
     
    166166    size_t oldSize = totalSize(0, propertyCapacity, hadIndexingHeader, oldIndexingPayloadSizeInBytes);
    167167    size_t newSize = totalSize(0, propertyCapacity, true, newIndexingPayloadSizeInBytes);
    168     void* newBase = vm.jsValueGigacageAuxiliarySpace.allocateNonVirtual(newSize, nullptr, AllocationFailureMode::ReturnNull);
     168    void* newBase = vm.jsValueGigacageAuxiliarySpace.allocateNonVirtual(vm, newSize, nullptr, AllocationFailureMode::ReturnNull);
    169169    if (!newBase)
    170170        return nullptr;
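
Every allocateNonVirtual call in this file (and in the runtime files below) gains a leading VM& argument: the subspace no longer hands out memory from a shared free list and needs the VM to reach the cache installed on the current thread. A minimal sketch of the updated call shape, wrapped in a hypothetical helper; the argument order is taken from the call sites above, and the helper assumes it lives inside JSC with the usual ButterflyInlines.h includes in scope:

    // Hypothetical convenience wrapper; not part of this changeset.
    static void* tryAllocateAuxiliary(VM& vm, size_t size)
    {
        return vm.jsValueGigacageAuxiliarySpace.allocateNonVirtual(
            vm, size, /* GCDeferralContext* */ nullptr,
            AllocationFailureMode::ReturnNull);
    }
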
  • trunk/Source/JavaScriptCore/runtime/DirectArguments.cpp

    r227609 r227617  
    11/*
    2  * Copyright (C) 2015-2017 Apple Inc. All rights reserved.
     2 * Copyright (C) 2015-2018 Apple Inc. All rights reserved.
    33 *
    44 * Redistribution and use in source and binary forms, with or without
     
    119119    putDirect(vm, vm.propertyNames->iteratorSymbol, globalObject()->arrayProtoValuesFunction(), static_cast<unsigned>(PropertyAttribute::DontEnum));
    120120   
    121     void* backingStore = vm.gigacageAuxiliarySpace(m_mappedArguments.kind).allocateNonVirtual(mappedArgumentsSize(), nullptr, AllocationFailureMode::Assert);
     121    void* backingStore = vm.gigacageAuxiliarySpace(m_mappedArguments.kind).allocateNonVirtual(vm, mappedArgumentsSize(), nullptr, AllocationFailureMode::Assert);
    122122    bool* overrides = static_cast<bool*>(backingStore);
    123123    m_mappedArguments.set(vm, this, overrides);
  • trunk/Source/JavaScriptCore/runtime/GenericArgumentsInlines.h

    r227609 r227617  
    11/*
    2  * Copyright (C) 2015-2017 Apple Inc. All rights reserved.
     2 * Copyright (C) 2015-2018 Apple Inc. All rights reserved.
    33 *
    44 * Redistribution and use in source and binary forms, with or without
     
    264264
    265265    if (argsLength) {
    266         void* backingStore = vm.gigacageAuxiliarySpace(m_modifiedArgumentsDescriptor.kind).allocateNonVirtual(WTF::roundUpToMultipleOf<8>(argsLength), nullptr, AllocationFailureMode::Assert);
     266        void* backingStore = vm.gigacageAuxiliarySpace(m_modifiedArgumentsDescriptor.kind).allocateNonVirtual(vm, WTF::roundUpToMultipleOf<8>(argsLength), nullptr, AllocationFailureMode::Assert);
    267267        bool* modifiedArguments = static_cast<bool*>(backingStore);
    268268        m_modifiedArgumentsDescriptor.set(vm, this, modifiedArguments);
  • trunk/Source/JavaScriptCore/runtime/HashMapImpl.h

    r227609 r227617  
    11/*
    2  * Copyright (C) 2016-2017 Apple Inc. All rights reserved.
     2 * Copyright (C) 2016-2018 Apple Inc. All rights reserved.
    33 *
    44 * Redistribution and use in source and binary forms, with or without
     
    208208        auto scope = DECLARE_THROW_SCOPE(vm);
    209209        size_t allocationSize = HashMapBuffer::allocationSize(capacity);
    210         void* data = vm.jsValueGigacageAuxiliarySpace.allocateNonVirtual(allocationSize, nullptr, AllocationFailureMode::ReturnNull);
     210        void* data = vm.jsValueGigacageAuxiliarySpace.allocateNonVirtual(vm, allocationSize, nullptr, AllocationFailureMode::ReturnNull);
    211211        if (!data) {
    212212            throwOutOfMemoryError(exec, scope);
  • trunk/Source/JavaScriptCore/runtime/JSArray.cpp

    r227609 r227617  
    11/*
    22 *  Copyright (C) 1999-2000 Harri Porten (porten@kde.org)
    3  *  Copyright (C) 2003-2017 Apple Inc. All rights reserved.
     3 *  Copyright (C) 2003-2018 Apple Inc. All rights reserved.
    44 *  Copyright (C) 2003 Peter Kelly (pmk@post.com)
    55 *  Copyright (C) 2006 Alexey Proskuryakov (ap@nypop.com)
     
    8282
    8383        unsigned vectorLength = Butterfly::optimalContiguousVectorLength(structure, initialLength);
    84         void* temp = vm.jsValueGigacageAuxiliarySpace.allocateNonVirtual(Butterfly::totalSize(0, outOfLineStorage, true, vectorLength * sizeof(EncodedJSValue)), deferralContext, AllocationFailureMode::ReturnNull);
     84        void* temp = vm.jsValueGigacageAuxiliarySpace.allocateNonVirtual(
     85            vm,
     86            Butterfly::totalSize(0, outOfLineStorage, true, vectorLength * sizeof(EncodedJSValue)),
     87            deferralContext, AllocationFailureMode::ReturnNull);
    8588        if (UNLIKELY(!temp))
    8689            return nullptr;
     
    99102        static const unsigned indexBias = 0;
    100103        unsigned vectorLength = ArrayStorage::optimalVectorLength(indexBias, structure, initialLength);
    101         void* temp = vm.jsValueGigacageAuxiliarySpace.allocateNonVirtual(Butterfly::totalSize(indexBias, outOfLineStorage, true, ArrayStorage::sizeFor(vectorLength)), deferralContext, AllocationFailureMode::ReturnNull);
     104        void* temp = vm.jsValueGigacageAuxiliarySpace.allocateNonVirtual(
     105            vm,
     106            Butterfly::totalSize(indexBias, outOfLineStorage, true, ArrayStorage::sizeFor(vectorLength)),
     107            deferralContext, AllocationFailureMode::ReturnNull);
    102108        if (UNLIKELY(!temp))
    103109            return nullptr;
     
    369375    } else {
    370376        size_t newSize = Butterfly::totalSize(0, propertyCapacity, true, ArrayStorage::sizeFor(desiredCapacity));
    371         newAllocBase = vm.jsValueGigacageAuxiliarySpace.allocateNonVirtual(newSize, nullptr, AllocationFailureMode::ReturnNull);
     377        newAllocBase = vm.jsValueGigacageAuxiliarySpace.allocateNonVirtual(
     378            vm, newSize, nullptr, AllocationFailureMode::ReturnNull);
    372379        if (!newAllocBase)
    373380            return false;
  • trunk/Source/JavaScriptCore/runtime/JSArray.h

    r227609 r227617  
    11/*
    22 *  Copyright (C) 1999-2000 Harri Porten (porten@kde.org)
    3  *  Copyright (C) 2003-2017 Apple Inc. All rights reserved.
     3 *  Copyright (C) 2003-2018 Apple Inc. All rights reserved.
    44 *
    55 *  This library is free software; you can redistribute it and/or
     
    239239        unsigned vectorLength = Butterfly::optimalContiguousVectorLength(structure, vectorLengthHint);
    240240        void* temp = vm.jsValueGigacageAuxiliarySpace.allocateNonVirtual(
     241            vm,
    241242            Butterfly::totalSize(0, outOfLineStorage, true, vectorLength * sizeof(EncodedJSValue)),
    242243            nullptr, AllocationFailureMode::ReturnNull);
  • trunk/Source/JavaScriptCore/runtime/JSArrayBufferView.cpp

    r227609 r227617  
    6767        size_t size = sizeOf(length, elementSize);
    6868        if (size) {
    69             temp = vm.primitiveGigacageAuxiliarySpace.allocateNonVirtual(size, nullptr, AllocationFailureMode::ReturnNull);
     69            temp = vm.primitiveGigacageAuxiliarySpace.allocateNonVirtual(vm, size, nullptr, AllocationFailureMode::ReturnNull);
    7070            if (!temp)
    7171                return;
  • trunk/Source/JavaScriptCore/runtime/JSCellInlines.h

    r227609 r227617  
    11/*
    2  * Copyright (C) 2012-2017 Apple Inc. All rights reserved.
     2 * Copyright (C) 2012-2018 Apple Inc. All rights reserved.
    33 *
    44 * Redistribution and use in source and binary forms, with or without
     
    146146ALWAYS_INLINE void* tryAllocateCellHelper(Heap& heap, size_t size, GCDeferralContext* deferralContext, AllocationFailureMode failureMode)
    147147{
     148    VM& vm = *heap.vm();
    148149    ASSERT(deferralContext || !DisallowGC::isInEffectOnCurrentThread());
    149150    ASSERT(size >= sizeof(T));
    150     JSCell* result = static_cast<JSCell*>(subspaceFor<T>(*heap.vm())->allocateNonVirtual(size, deferralContext, failureMode));
     151    JSCell* result = static_cast<JSCell*>(subspaceFor<T>(vm)->allocateNonVirtual(vm, size, deferralContext, failureMode));
    151152    if (failureMode == AllocationFailureMode::ReturnNull && !result)
    152153        return nullptr;
    153154#if ENABLE(GC_VALIDATION)
    154     ASSERT(!heap.vm()->isInitializingObject());
    155     heap.vm()->setInitializingObjectClass(T::info());
     155    ASSERT(!vm.isInitializingObject());
     156    vm.setInitializingObjectClass(T::info());
    156157#endif
    157158    result->clearStructure();
  • trunk/Source/JavaScriptCore/runtime/JSGlobalObject.cpp

    r227609 r227617  
    336336}
    337337
    338 JSGlobalObject::JSGlobalObject(VM& vm, Structure* structure, const GlobalObjectMethodTable* globalObjectMethodTable)
     338JSGlobalObject::JSGlobalObject(VM& vm, Structure* structure, const GlobalObjectMethodTable* globalObjectMethodTable, RefPtr<ThreadLocalCache> threadLocalCache)
    339339    : Base(vm, structure, 0)
    340340    , m_vm(vm)
     
    354354    , m_runtimeFlags()
    355355    , m_globalObjectMethodTable(globalObjectMethodTable ? globalObjectMethodTable : &s_globalObjectMethodTable)
     356    , m_threadLocalCache(threadLocalCache ? WTFMove(threadLocalCache) : vm.defaultThreadLocalCache)
    356357{
    357358}
  • trunk/Source/JavaScriptCore/runtime/JSGlobalObject.h

    r227609 r227617  
    492492
    493493protected:
    494     JS_EXPORT_PRIVATE explicit JSGlobalObject(VM&, Structure*, const GlobalObjectMethodTable* = 0);
     494    JS_EXPORT_PRIVATE explicit JSGlobalObject(VM&, Structure*, const GlobalObjectMethodTable* = 0, RefPtr<ThreadLocalCache> = nullptr);
    495495
    496496    JS_EXPORT_PRIVATE void finishCreation(VM&);
     
    894894    void setWrapperMap(JSWrapperMap* map) { m_wrapperMap = map; }
    895895#endif
     896   
     897    ThreadLocalCache& threadLocalCache() const { return *m_threadLocalCache.get(); }
    896898
    897899protected:
     
    925927    RetainPtr<JSWrapperMap> m_wrapperMap;
    926928#endif
     929   
     930    RefPtr<ThreadLocalCache> m_threadLocalCache;
    927931};
    928932
  • trunk/Source/JavaScriptCore/runtime/JSLock.cpp

    r227609 r227617  
    11/*
    2  * Copyright (C) 2005-2017 Apple Inc. All rights reserved.
     2 * Copyright (C) 2005-2018 Apple Inc. All rights reserved.
    33 *
    44 * This library is free software; you can redistribute it and/or
     
    2929#include "MachineStackMarker.h"
    3030#include "SamplingProfiler.h"
     31#include "ThreadLocalCacheInlines.h"
    3132#include "WasmMachineThreads.h"
    3233#include <thread>
     
    148149    m_vm->setLastStackTop(thread.savedLastStackTop());
    149150    ASSERT(thread.stack().contains(m_vm->lastStackTop()));
    150 
     151   
     152    m_vm->defaultThreadLocalCache->install(*m_vm);
     153   
    151154    m_vm->heap.machineThreads().addCurrentThread();
    152155#if ENABLE(WEBASSEMBLY)
  • trunk/Source/JavaScriptCore/runtime/Options.h

    r227609 r227617  
    237237    v(bool, useBumpAllocator, true, Normal, nullptr) \
    238238    v(bool, stealEmptyBlocksFromOtherAllocators, true, Normal, nullptr) \
     239    v(bool, tradeDestructorBlocks, true, Normal, nullptr) \
    239240    v(bool, eagerlyUpdateTopCallFrame, false, Normal, nullptr) \
    240241    \
  • trunk/Source/JavaScriptCore/runtime/RegExpMatchesArray.h

    r227609 r227617  
    11/*
    2  *  Copyright (C) 2008-2017 Apple Inc. All Rights Reserved.
     2 *  Copyright (C) 2008-2018 Apple Inc. All Rights Reserved.
    33 *
    44 *  This library is free software; you can redistribute it and/or
     
    4343    JSGlobalObject* globalObject = structure->globalObject();
    4444    bool createUninitialized = globalObject->isOriginalArrayStructure(structure);
    45     void* temp = vm.jsValueGigacageAuxiliarySpace.allocateNonVirtual(Butterfly::totalSize(0, structure->outOfLineCapacity(), true, vectorLength * sizeof(EncodedJSValue)), deferralContext, AllocationFailureMode::ReturnNull);
     45    void* temp = vm.jsValueGigacageAuxiliarySpace.allocateNonVirtual(vm, Butterfly::totalSize(0, structure->outOfLineCapacity(), true, vectorLength * sizeof(EncodedJSValue)), deferralContext, AllocationFailureMode::ReturnNull);
    4646    if (UNLIKELY(!temp))
    4747        return nullptr;
  • trunk/Source/JavaScriptCore/runtime/VM.cpp

    r227609 r227617  
    11/*
    2  * Copyright (C) 2008-2017 Apple Inc. All rights reserved.
     2 * Copyright (C) 2008-2018 Apple Inc. All rights reserved.
    33 *
    44 * Redistribution and use in source and binary forms, with or without
     
    122122#include "StructureInlines.h"
    123123#include "TestRunnerUtils.h"
     124#include "ThreadLocalCacheInlines.h"
    124125#include "ThunkGenerators.h"
    125126#include "TypeProfiler.h"
     
    307308    setLastStackTop(stack.origin());
    308309
     310    defaultThreadLocalCache = ThreadLocalCache::create(heap);
     311    defaultThreadLocalCache->install(*this);
     312
    309313    // Need to be careful to keep everything consistent here
    310314    JSLockHolder lock(this);
     
    495499    m_apiLock->willDestroyVM(this);
    496500    heap.lastChanceToFinalize();
     501   
     502#if !ENABLE(FAST_TLS_JIT)
     503    ThreadLocalCache::destructor(threadLocalCacheData);
     504#endif
    497505
    498506    delete interpreter;
  • trunk/Source/JavaScriptCore/runtime/VM.h

    r227609 r227617  
    5555#include "VMEntryRecord.h"
    5656#include "VMTraps.h"
     57#include "ThreadLocalCache.h"
    5758#include "WasmContext.h"
    5859#include "Watchpoint.h"
     
    657658    JSObject* stringRecursionCheckFirstObject { nullptr };
    658659    HashSet<JSObject*> stringRecursionCheckVisitedObjects;
     660   
     661#if !ENABLE(FAST_TLS_JIT)
     662    ThreadLocalCache::Data* threadLocalCacheData { nullptr };
     663#endif
     664    RefPtr<ThreadLocalCache> defaultThreadLocalCache;
    659665
    660666    LocalTimeOffsetCache localTimeOffsetCache;
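
Taken together, the VM.cpp and VM.h hunks give every VM a default cache with a clear lifecycle: create and install it during VM construction, and, on configurations without fast TLS, explicitly destroy the per-thread Data that the VM itself holds during teardown. A self-contained analogue of that ownership story with stand-in names throughout; only ThreadLocalCache::create(heap), install(*this), and ThreadLocalCache::destructor(threadLocalCacheData) are confirmed above:

    #include <memory>

    struct ThreadLocalData { /* per-thread allocation state */ };

    class CacheSketch {
    public:
        static std::shared_ptr<CacheSketch> create() { return std::make_shared<CacheSketch>(); }
        ThreadLocalData* installOnCurrentThread() { return new ThreadLocalData; }
    };

    class VMSketch {
    public:
        VMSketch()
            : m_defaultCache(CacheSketch::create())
        {
            // VM::VM: defaultThreadLocalCache = ThreadLocalCache::create(heap);
            //         defaultThreadLocalCache->install(*this);
            m_threadLocalData = m_defaultCache->installOnCurrentThread();
        }
        ~VMSketch()
        {
            // ~VM, when !ENABLE(FAST_TLS_JIT):
            //     ThreadLocalCache::destructor(threadLocalCacheData);
            delete m_threadLocalData;
        }
    private:
        std::shared_ptr<CacheSketch> m_defaultCache;
        ThreadLocalData* m_threadLocalData { nullptr };
    };
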
  • trunk/Source/JavaScriptCore/runtime/VMEntryScope.cpp

    r227609 r227617  
    11/*
    2  * Copyright (C) 2013-2017 Apple Inc. All rights reserved.
     2 * Copyright (C) 2013-2018 Apple Inc. All rights reserved.
    33 *
    44 * Redistribution and use in source and binary forms, with or without
     
    3030#include "Options.h"
    3131#include "SamplingProfiler.h"
     32#include "ThreadLocalCacheInlines.h"
    3233#include "VM.h"
    3334#include "Watchdog.h"
     
    4142    , m_globalObject(globalObject)
    4243{
     44    globalObject->threadLocalCache().install(vm);
    4345    ASSERT(!DisallowVMReentry::isInEffectOnCurrentThread());
    4446    ASSERT(Thread::current().stack().isGrowingDownward());
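
This VMEntryScope hook is what makes caches context-switchable: each entry into the VM through a global object installs that object's cache on the current thread, displacing whatever was installed before (the VM's default, or another global object's cache). An illustrative usage fragment; the VMEntryScope constructor arguments match the hunk above, and everything else is assumed context:

    {
        VMEntryScope entryA(vm, globalObjectA); // installs globalObjectA's cache
        // Allocations on this thread are now satisfied from that cache.
    }
    {
        VMEntryScope entryB(vm, globalObjectB); // installs globalObjectB's cache
        // Same thread, different cache: allocations now draw from it instead.
    }
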
  • trunk/Source/WTF/ChangeLog

    r227609 r227617  
     12018-01-25  Filip Pizlo  <fpizlo@apple.com>
     2
     3        JSC GC should support TLCs (thread local caches)
     4        https://bugs.webkit.org/show_bug.cgi?id=181559
     5
     6        Reviewed by Mark Lam and Saam Barati.
     7
     8        * wtf/Bitmap.h: Just fixing a compile error.
     9
    1102018-01-25  Commit Queue  <commit-queue@webkit.org>
    211
  • trunk/Source/WTF/wtf/Bitmap.h

    r227609 r227617  
    2222#include <array>
    2323#include <wtf/Atomics.h>
     24#include <wtf/HashFunctions.h>
    2425#include <wtf/StdLibExtras.h>
    2526#include <stdint.h>