Changeset 155891 in webkit
- Timestamp: Sep 16, 2013 12:48:48 PM
- Location: trunk/Source/JavaScriptCore
- Files: 1 added, 14 edited
trunk/Source/JavaScriptCore/ChangeLog
(r155889 → r155891)

2013-09-13  Mark Hahnenberg  <mhahnenberg@apple.com>

        MarkedBlocks shouldn't be put in Allocated state if they didn't produce a FreeList
        https://bugs.webkit.org/show_bug.cgi?id=121236

        Reviewed by Geoffrey Garen.

        Right now, after a collection all MarkedBlocks are in the Marked block state. When lazy sweeping
        happens, if a block returns an empty free list after being swept, we call didConsumeFreeList(),
        which moves the block into the Allocated block state. This happens both to the block that was
        just being allocated out of (i.e. m_currentBlock) and to any blocks that are completely full.
        We should distinguish between these two cases: m_currentBlock should transition to
        Allocated (because we were just allocating out of it), and any subsequent block that returns an
        empty free list should transition back to the Marked state. This will make the block state more
        consistent with the actual state the block is in, and it will also allow us to speed up moving
        all blocks to the Marked state during generational collection.

        Added a new RAII-style HeapIterationScope class that notifies the Heap when it is about to be
        iterated and when iteration has finished. Any clients that need accurate liveness data when
        iterating over the Heap now need to use a HeapIterationScope so that the state of the Heap can
        be properly restored after they are done iterating. No new GC-allocated objects can be created
        until this object goes out of scope.

        * JavaScriptCore.xcodeproj/project.pbxproj:
        * debugger/Debugger.cpp:
        (JSC::Debugger::recompileAllJSFunctions): Added HeapIterationScope for the Recompiler iteration.
        * heap/Heap.cpp:
        (JSC::Heap::willStartIterating): Callback used by HeapIterationScope to indicate that iteration
        of the Heap is about to begin. This will cause cell liveness data to be canonicalized by calling
        stopAllocating.
        (JSC::Heap::didFinishIterating): Same, but indicates that iteration has finished.
        (JSC::Heap::globalObjectCount): Used HeapIterationScope.
        (JSC::Heap::objectTypeCounts): Ditto.
        (JSC::Heap::markDeadObjects): Ditto.
        (JSC::Heap::zombifyDeadObjects): Ditto.
        * heap/Heap.h:
        * heap/HeapIterationScope.h: Added. New RAII-style object for indicating to the Heap that it's
        about to be iterated or that iteration has finished.
        (JSC::HeapIterationScope::HeapIterationScope):
        (JSC::HeapIterationScope::~HeapIterationScope):
        * heap/HeapStatistics.cpp:
        (JSC::HeapStatistics::showObjectStatistics): Used new HeapIterationScope.
        * heap/MarkedAllocator.cpp:
        (JSC::MarkedAllocator::tryAllocateHelper): We now treat the case where we have just finished
        allocating out of the current block differently from the case where we sweep a block and it
        returns an empty free list. This was the primary point of this patch.
        (JSC::MarkedAllocator::allocateSlowCase): ASSERT that nobody is currently iterating the Heap
        when allocating.
        * heap/MarkedAllocator.h:
        (JSC::MarkedAllocator::reset): All allocators are reset after every collection. We need to make
        sure that m_lastActiveBlock gets cleared, which it might not always be, because we don't call
        takeCanonicalizedBlock on blocks in the large allocators.
        (JSC::MarkedAllocator::stopAllocating): We shouldn't already have a last active block, so ASSERT
        as much.
        (JSC::MarkedAllocator::resumeAllocating): Do the opposite of what stopAllocating does. If we
        don't have an m_lastActiveBlock, then we don't have to worry about undoing anything done by
        stopAllocating. If we do, then we call resumeAllocating on the block, which returns the FreeList
        as it was prior to stopping allocation. We then set the current block to the last active block
        and clear the last active block.
        * heap/MarkedBlock.cpp:
        (JSC::MarkedBlock::resumeAllocating): Any block resuming allocation should be in the Marked
        state, so ASSERT as much. We always allocate an m_newlyAllocated Bitmap if we're FreeListed, so
        if we didn't allocate one, then we know we were Marked when allocation was stopped, so just
        return early with an empty FreeList. If we do have a non-null m_newlyAllocated Bitmap, then we
        need to be swept in order to rebuild our FreeList.
        * heap/MarkedBlock.h:
        (JSC::MarkedBlock::didConsumeEmptyFreeList): This is called if we ever sweep a block and get
        back an empty free list. Instead of transitioning to the Allocated state, we now go straight
        back to the Marked state. This makes sense because we weren't actually allocated out of, so we
        shouldn't be in the Allocated state. Also added some ASSERTs to make sure that we're in the
        state that we expect: all of our mark bits should be set and we should not have an
        m_newlyAllocated Bitmap.
        * heap/MarkedSpace.cpp:
        (JSC::MarkedSpace::MarkedSpace):
        (JSC::MarkedSpace::forEachAllocator): Added a new functor-style iteration method so that we can
        easily iterate over each allocator for, e.g., stopping and resuming allocators without
        duplicating code.
        (JSC::StopAllocatingFunctor::operator()): New functors for use with forEachAllocator.
        (JSC::MarkedSpace::stopAllocating): Ditto.
        (JSC::ResumeAllocatingFunctor::operator()): Ditto.
        (JSC::MarkedSpace::resumeAllocating): Ditto.
        (JSC::MarkedSpace::willStartIterating): Callback that notifies MarkedSpace that it is being
        iterated. Does some ASSERTs, sets a flag, and canonicalizes cell liveness data by calling
        stopAllocating.
        (JSC::MarkedSpace::didFinishIterating): Ditto, but to signal that iteration has completed.
        * heap/MarkedSpace.h:
        (JSC::MarkedSpace::iterationInProgress): Returns true if a HeapIterationScope is currently
        active.
        (JSC::MarkedSpace::forEachLiveCell): Accepts a HeapIterationScope to enforce the rule that you
        have to create one prior to iterating over the Heap.
        (JSC::MarkedSpace::forEachDeadCell): Ditto.
        * runtime/JSGlobalObject.cpp:
        (JSC::JSGlobalObject::haveABadTime): Changed to use new HeapIterationScope.
        * runtime/VM.cpp:
        (JSC::VM::releaseExecutableMemory): Ditto.

2013-09-16  Filip Pizlo  <fpizlo@apple.com>
-
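The HeapIterationScope described above is a straightforward RAII pairing of willStartIterating() with didFinishIterating(). A minimal sketch of that pattern, not the actual JSC code — FakeHeap is a hypothetical stand-in for Heap, whose real callbacks also stop and resume allocation:

```cpp
#include <cassert>

// Hypothetical stand-in for JSC's Heap; only the iteration bookkeeping is modeled.
struct FakeHeap {
    bool iterating = false;
    void willStartIterating() { assert(!iterating); iterating = true; }  // real Heap also calls stopAllocating()
    void didFinishIterating() { assert(iterating); iterating = false; }  // real Heap also resumes allocation
};

// RAII scope: construction marks the heap as being iterated; destruction undoes it,
// even on early return or exception, so the heap state is always restored.
class HeapIterationScope {
public:
    explicit HeapIterationScope(FakeHeap& heap) : m_heap(heap) { m_heap.willStartIterating(); }
    ~HeapIterationScope() { m_heap.didFinishIterating(); }
    HeapIterationScope(const HeapIterationScope&) = delete;
    HeapIterationScope& operator=(const HeapIterationScope&) = delete;
private:
    FakeHeap& m_heap;
};
```

Callers such as Heap::globalObjectCount() create the scope on the stack around forEachLiveCell, so allocation is guaranteed to be stopped for exactly the duration of the iteration.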
trunk/Source/JavaScriptCore/JavaScriptCore.xcodeproj/project.pbxproj
(r155473 → r155891)

@@ -650 +650 @@
 2600B5A7152BAAA70091EE5F /* JSStringJoiner.h in Headers */ = {isa = PBXBuildFile; fileRef = 2600B5A5152BAAA70091EE5F /* JSStringJoiner.h */; };
 2A48D1911772365B00C65A5F /* APICallbackFunction.h in Headers */ = {isa = PBXBuildFile; fileRef = C211B574176A224D000E2A23 /* APICallbackFunction.h */; };
+2AD8932B17E3868F00668276 /* HeapIterationScope.h in Headers */ = {isa = PBXBuildFile; fileRef = 2AD8932917E3868F00668276 /* HeapIterationScope.h */; };
 371D842D17C98B6E00ECF994 /* libz.dylib in Frameworks */ = {isa = PBXBuildFile; fileRef = 371D842C17C98B6E00ECF994 /* libz.dylib */; };
 41359CF30FDD89AD00206180 /* DateConversion.h in Headers */ = {isa = PBXBuildFile; fileRef = D21202290AD4310C00ED79B6 /* DateConversion.h */; };
@@ -1813 +1814 @@
 2600B5A4152BAAA70091EE5F /* JSStringJoiner.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = JSStringJoiner.cpp; sourceTree = "<group>"; };
 2600B5A5152BAAA70091EE5F /* JSStringJoiner.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = JSStringJoiner.h; sourceTree = "<group>"; };
+2AD8932917E3868F00668276 /* HeapIterationScope.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = HeapIterationScope.h; sourceTree = "<group>"; };
 371D842C17C98B6E00ECF994 /* libz.dylib */ = {isa = PBXFileReference; lastKnownFileType = "compiled.mach-o.dylib"; name = libz.dylib; path = usr/lib/libz.dylib; sourceTree = SDKROOT; };
 449097EE0F8F81B50076A327 /* FeatureDefines.xcconfig */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = text.xcconfig; path = FeatureDefines.xcconfig; sourceTree = "<group>"; };
@@ -2851 +2853 @@
 0FC8150814043BCA00CFA603 /* WriteBarrierSupport.cpp */,
 0FC8150914043BD200CFA603 /* WriteBarrierSupport.h */,
+2AD8932917E3868F00668276 /* HeapIterationScope.h */,
 );
 path = heap;
@@ -4029 +4032 @@
 0F766D4415B2A3C0008F363E /* DFGRegisterSet.h in Headers */,
 86BB09C1138E381B0056702F /* DFGRepatch.h in Headers */,
+2AD8932B17E3868F00668276 /* HeapIterationScope.h in Headers */,
 A77A424317A0BBFD00A8DB81 /* DFGSafeToExecute.h in Headers */,
 A741017F179DAF80002EB8BA /* DFGSaneStringGetByValSlowPathGenerator.h in Headers */,
-
trunk/Source/JavaScriptCore/debugger/Debugger.cpp
(r154797 → r155891)

@@ -24 +24 @@
 
 #include "Error.h"
+#include "HeapIterationScope.h"
 #include "Interpreter.h"
 #include "JSFunction.h"
@@ -123 +124 @@
 
     Recompiler recompiler(this);
-    vm->heap.objectSpace().forEachLiveCell(recompiler);
+    HeapIterationScope iterationScope(vm->heap);
+    vm->heap.objectSpace().forEachLiveCell(iterationScope, recompiler);
 }
 
-
trunk/Source/JavaScriptCore/heap/Heap.cpp
(r155413 → r155891)

@@ -30 +30 @@
 #include "GCActivityCallback.h"
 #include "GCIncomingRefCountedSetInlines.h"
+#include "HeapIterationScope.h"
 #include "HeapRootVisitor.h"
 #include "HeapStatistics.h"
@@ -416 +417 @@
 }
 
-void Heap::canonicalizeCellLivenessData()
-{
-    m_objectSpace.canonicalizeCellLivenessData();
+void Heap::willStartIterating()
+{
+    m_objectSpace.willStartIterating();
+}
+
+void Heap::didFinishIterating()
+{
+    m_objectSpace.didFinishIterating();
 }
 
@@ -663 +669 @@
 size_t Heap::globalObjectCount()
 {
-    return m_objectSpace.forEachLiveCell<CountIfGlobalObject>();
+    HeapIterationScope iterationScope(*this);
+    return m_objectSpace.forEachLiveCell<CountIfGlobalObject>(iterationScope);
 }
 
@@ -678 +685 @@
 PassOwnPtr<TypeCountSet> Heap::objectTypeCounts()
 {
-    return m_objectSpace.forEachLiveCell<RecordType>();
+    HeapIterationScope iterationScope(*this);
+    return m_objectSpace.forEachLiveCell<RecordType>(iterationScope);
 }
 
@@ -764 +772 @@
 
     {
-        GCPHASE(Canonicalize);
-        m_objectSpace.canonicalizeCellLivenessData();
+        GCPHASE(StopAllocation);
+        m_objectSpace.stopAllocating();
     }
 
@@ -876 +884 @@
 void Heap::markDeadObjects()
 {
-    m_objectSpace.forEachDeadCell<MarkObject>();
+    HeapIterationScope iterationScope(*this);
+    m_objectSpace.forEachDeadCell<MarkObject>(iterationScope);
 }
 
@@ -955 +964 @@
     // Sweep now because destructors will crash once we're zombified.
     m_objectSpace.sweep();
-    m_objectSpace.forEachDeadCell<Zombify>();
+    HeapIterationScope iterationScope(*this);
+    m_objectSpace.forEachDeadCell<Zombify>(iterationScope);
 }
 
-
trunk/Source/JavaScriptCore/heap/Heap.h
(r155317 → r155891)

@@ -167 +167 @@
     HandleStack* handleStack() { return &m_handleStack; }
 
-    void canonicalizeCellLivenessData();
+    void willStartIterating();
+    void didFinishIterating();
     void getConservativeRegisterRoots(HashSet<JSCell*>& roots);
 
-
trunk/Source/JavaScriptCore/heap/HeapStatistics.cpp
(r139541 → r155891)

@@ -28 +28 @@
 
 #include "Heap.h"
+#include "HeapIterationScope.h"
 #include "JSObject.h"
 #include "Operations.h"
@@ -236 +237 @@
 
     StorageStatistics storageStatistics;
-    heap->m_objectSpace.forEachLiveCell(storageStatistics);
+    {
+        HeapIterationScope iterationScope(*heap);
+        heap->m_objectSpace.forEachLiveCell(iterationScope, storageStatistics);
+    }
     dataLogF("wasted .property storage: %ldkB (%ld%%)\n",
         static_cast<long>(
-
trunk/Source/JavaScriptCore/heap/MarkedAllocator.cpp
(r155652 → r155891)

@@ -31 +31 @@
 {
     if (!m_freeList.head) {
+        if (m_currentBlock) {
+            ASSERT(m_currentBlock == m_blocksToSweep);
+            m_currentBlock->didConsumeFreeList();
+            m_blocksToSweep = m_currentBlock->next();
+        }
+
         for (MarkedBlock*& block = m_blocksToSweep; block; block = block->next()) {
             MarkedBlock::FreeList freeList = block->sweep(MarkedBlock::SweepToFreeList);
             if (!freeList.head) {
-                block->didConsumeFreeList();
+                block->didConsumeEmptyFreeList();
                 continue;
             }
 
             if (bytes > block->cellSize()) {
-                block->canonicalizeCellLivenessData(freeList);
+                block->stopAllocating(freeList);
                 continue;
             }
@@ -77 +83 @@
 #endif
 
+    ASSERT(!m_markedSpace->isIterating());
     ASSERT(!m_freeList.head);
     m_heap->didAllocate(m_freeList.bytes);
-
trunk/Source/JavaScriptCore/heap/MarkedAllocator.h
(r155316 → r155891)

@@ -23 +23 @@
     MarkedAllocator();
     void reset();
-    void canonicalizeCellLivenessData();
+    void stopAllocating();
+    void resumeAllocating();
     size_t cellSize() { return m_cellSize; }
     MarkedBlock::DestructorType destructorType() { return m_destructorType; }
     void* allocate(size_t);
     Heap* heap() { return m_heap; }
-    MarkedBlock* takeCanonicalizedBlock()
+    MarkedBlock* takeLastActiveBlock()
     {
-        MarkedBlock* block = m_canonicalizedBlock;
-        m_canonicalizedBlock = 0;
+        MarkedBlock* block = m_lastActiveBlock;
+        m_lastActiveBlock = 0;
         return block;
     }
@@ -51 +52 @@
     MarkedBlock::FreeList m_freeList;
     MarkedBlock* m_currentBlock;
-    MarkedBlock* m_canonicalizedBlock;
+    MarkedBlock* m_lastActiveBlock;
     MarkedBlock* m_blocksToSweep;
     DoublyLinkedList<MarkedBlock> m_blockList;
@@ -67 +68 @@
 inline MarkedAllocator::MarkedAllocator()
     : m_currentBlock(0)
-    , m_canonicalizedBlock(0)
+    , m_lastActiveBlock(0)
     , m_blocksToSweep(0)
     , m_cellSize(0)
@@ -104 +105 @@
 inline void MarkedAllocator::reset()
 {
+    m_lastActiveBlock = 0;
     m_currentBlock = 0;
     m_freeList = MarkedBlock::FreeList();
@@ -109 +111 @@
 }
 
-inline void MarkedAllocator::canonicalizeCellLivenessData()
+inline void MarkedAllocator::stopAllocating()
 {
+    ASSERT(!m_lastActiveBlock);
     if (!m_currentBlock) {
         ASSERT(!m_freeList.head);
@@ -116 +119 @@
     }
 
-    m_currentBlock->canonicalizeCellLivenessData(m_freeList);
-    m_canonicalizedBlock = m_currentBlock;
+    m_currentBlock->stopAllocating(m_freeList);
+    m_lastActiveBlock = m_currentBlock;
     m_currentBlock = 0;
     m_freeList = MarkedBlock::FreeList();
+}
+
+inline void MarkedAllocator::resumeAllocating()
+{
+    if (!m_lastActiveBlock)
+        return;
+
+    m_freeList = m_lastActiveBlock->resumeAllocating();
+    m_currentBlock = m_lastActiveBlock;
+    m_lastActiveBlock = 0;
 }
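The stopAllocating()/resumeAllocating() pair above hands the in-progress free list off to the current block and remembers that block as m_lastActiveBlock, so allocation can be suspended for iteration and later restored. A simplified model of that hand-off, under the assumption that Block and FreeList are reduced stand-ins for MarkedBlock and its free list (the real versions carry liveness bitmaps and sweeping logic):

```cpp
#include <cassert>

// Hypothetical, heavily reduced stand-ins for MarkedBlock::FreeList and MarkedBlock.
struct FreeList { void* head = nullptr; };

struct Block {
    FreeList saved;
    void stopAllocating(const FreeList& fl) { saved = fl; }  // record the free list as it was
    FreeList resumeAllocating() { return saved; }            // hand it back unchanged
};

struct Allocator {
    Block* currentBlock = nullptr;
    Block* lastActiveBlock = nullptr;
    FreeList freeList;

    // Suspend allocation: park the current block and its free list.
    void stopAllocating() {
        assert(!lastActiveBlock);                      // mirrors the ASSERT in the patch
        if (!currentBlock) { assert(!freeList.head); return; }
        currentBlock->stopAllocating(freeList);
        lastActiveBlock = currentBlock;
        currentBlock = nullptr;
        freeList = FreeList();
    }

    // Undo stopAllocating: restore the saved block and free list, if any.
    void resumeAllocating() {
        if (!lastActiveBlock)
            return;
        freeList = lastActiveBlock->resumeAllocating();
        currentBlock = lastActiveBlock;
        lastActiveBlock = nullptr;
    }
};
```

The asymmetry is deliberate: stopAllocating always clears m_currentBlock, while resumeAllocating is a no-op when nothing was parked, which is exactly why reset() must also clear m_lastActiveBlock.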
trunk/Source/JavaScriptCore/heap/MarkedBlock.cpp
(r148696 → r155891)

@@ -162 +162 @@
 };
 
-void MarkedBlock::canonicalizeCellLivenessData(const FreeList& freeList)
+void MarkedBlock::stopAllocating(const FreeList& freeList)
 {
     HEAP_LOG_BLOCK_STATE_TRANSITION(this);
@@ -200 +200 @@
 }
 
+MarkedBlock::FreeList MarkedBlock::resumeAllocating()
+{
+    HEAP_LOG_BLOCK_STATE_TRANSITION(this);
+
+    ASSERT(m_state == Marked);
+
+    if (!m_newlyAllocated) {
+        // We didn't have to create a "newly allocated" bitmap. That means we were already Marked
+        // when we last stopped allocation, so return an empty free list and stay in the Marked state.
+        return FreeList();
+    }
+
+    // Re-create our free list from before stopping allocation.
+    return sweep(SweepToFreeList);
+}
+
 } // namespace JSC
-
trunk/Source/JavaScriptCore/heap/MarkedBlock.h
(r155652 → r155891)

@@ -133 +133 @@
     // of these functions:
     void didConsumeFreeList(); // Call this once you've allocated all the items in the free list.
-    void canonicalizeCellLivenessData(const FreeList&);
+    void stopAllocating(const FreeList&);
+    FreeList resumeAllocating(); // Call this if you canonicalized a block for some non-collection related purpose.
+    void didConsumeEmptyFreeList(); // Call this if you sweep a block, but the returned FreeList is empty.
 
     // Returns true if the "newly allocated" bitmap was non-null
@@ -278 +280 @@
 }
 
+inline void MarkedBlock::didConsumeEmptyFreeList()
+{
+    HEAP_LOG_BLOCK_STATE_TRANSITION(this);
+
+    ASSERT(!m_newlyAllocated);
+#ifndef NDEBUG
+    for (size_t i = firstAtom(); i < m_endAtom; i += m_atomsPerCell)
+        ASSERT(m_marks.get(i));
+#endif
+    ASSERT(m_state == FreeListed);
+    m_state = Marked;
+}
+
 inline void MarkedBlock::clearMarks()
 {
-
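The MarkedBlock.h changes above split the exit from the FreeListed state in two: a block that was genuinely allocated out of goes to Allocated, while a swept block whose free list came back empty returns to Marked. A toy model of that state machine — the enum and method names mirror the patch, but the sweeping machinery is omitted and sweepToFreeList is a hypothetical simplification of MarkedBlock::sweep(SweepToFreeList):

```cpp
#include <cassert>

enum class BlockState { New, FreeListed, Allocated, Marked };

struct ToyBlock {
    BlockState state = BlockState::Marked;  // after a collection, all blocks are Marked

    // Lazy sweep produces a free list.
    void sweepToFreeList() { assert(state == BlockState::Marked); state = BlockState::FreeListed; }

    // The allocator consumed the free list: the block really was allocated out of.
    void didConsumeFreeList() { assert(state == BlockState::FreeListed); state = BlockState::Allocated; }

    // Sweeping yielded an empty free list: nothing was allocated, so go straight back to Marked.
    void didConsumeEmptyFreeList() { assert(state == BlockState::FreeListed); state = BlockState::Marked; }
};
```

Keeping full blocks in Marked rather than Allocated means a later generational collection has fewer blocks to transition back, which is the speedup the ChangeLog mentions.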
trunk/Source/JavaScriptCore/heap/MarkedSpace.cpp
(r155406 → r155891)

@@ -81 +81 @@
     : m_heap(heap)
     , m_capacity(0)
+    , m_isIterating(false)
 {
     for (size_t cellSize = preciseStep; cellSize <= preciseCutoff; cellSize += preciseStep) {
@@ -111 +112 @@
 void MarkedSpace::lastChanceToFinalize()
 {
-    canonicalizeCellLivenessData();
+    stopAllocating();
     forEachBlock<LastChanceToFinalize>();
 }
@@ -151 +152 @@
 }
 
-void MarkedSpace::canonicalizeCellLivenessData()
+template <typename Functor>
+void MarkedSpace::forEachAllocator()
+{
+    Functor functor;
+    forEachAllocator(functor);
+}
+
+template <typename Functor>
+void MarkedSpace::forEachAllocator(Functor& functor)
 {
     for (size_t cellSize = preciseStep; cellSize <= preciseCutoff; cellSize += preciseStep) {
-        allocatorFor(cellSize).canonicalizeCellLivenessData();
-        normalDestructorAllocatorFor(cellSize).canonicalizeCellLivenessData();
-        immortalStructureDestructorAllocatorFor(cellSize).canonicalizeCellLivenessData();
+        functor(allocatorFor(cellSize));
+        functor(normalDestructorAllocatorFor(cellSize));
+        functor(immortalStructureDestructorAllocatorFor(cellSize));
     }
 
     for (size_t cellSize = impreciseStep; cellSize <= impreciseCutoff; cellSize += impreciseStep) {
-        allocatorFor(cellSize).canonicalizeCellLivenessData();
-        normalDestructorAllocatorFor(cellSize).canonicalizeCellLivenessData();
-        immortalStructureDestructorAllocatorFor(cellSize).canonicalizeCellLivenessData();
-    }
-
-    m_normalSpace.largeAllocator.canonicalizeCellLivenessData();
-    m_normalDestructorSpace.largeAllocator.canonicalizeCellLivenessData();
-    m_immortalStructureDestructorSpace.largeAllocator.canonicalizeCellLivenessData();
+        functor(allocatorFor(cellSize));
+        functor(normalDestructorAllocatorFor(cellSize));
+        functor(immortalStructureDestructorAllocatorFor(cellSize));
+    }
+
+    functor(m_normalSpace.largeAllocator);
+    functor(m_normalDestructorSpace.largeAllocator);
+    functor(m_immortalStructureDestructorSpace.largeAllocator);
+}
+
+struct StopAllocatingFunctor {
+    void operator()(MarkedAllocator& allocator) { allocator.stopAllocating(); }
+};
+
+void MarkedSpace::stopAllocating()
+{
+    ASSERT(!isIterating());
+    forEachAllocator<StopAllocatingFunctor>();
+}
+
+struct ResumeAllocatingFunctor {
+    void operator()(MarkedAllocator& allocator) { allocator.resumeAllocating(); }
+};
+
+void MarkedSpace::resumeAllocating()
+{
+    ASSERT(isIterating());
+    forEachAllocator<ResumeAllocatingFunctor>();
 }
@@ -246 +275 @@
 {
     for (size_t i = 0; i < preciseCount; ++i) {
-        clearNewlyAllocatedInBlock(m_normalSpace.preciseAllocators[i].takeCanonicalizedBlock());
-        clearNewlyAllocatedInBlock(m_normalDestructorSpace.preciseAllocators[i].takeCanonicalizedBlock());
-        clearNewlyAllocatedInBlock(m_immortalStructureDestructorSpace.preciseAllocators[i].takeCanonicalizedBlock());
+        clearNewlyAllocatedInBlock(m_normalSpace.preciseAllocators[i].takeLastActiveBlock());
+        clearNewlyAllocatedInBlock(m_normalDestructorSpace.preciseAllocators[i].takeLastActiveBlock());
+        clearNewlyAllocatedInBlock(m_immortalStructureDestructorSpace.preciseAllocators[i].takeLastActiveBlock());
     }
 
     for (size_t i = 0; i < impreciseCount; ++i) {
-        clearNewlyAllocatedInBlock(m_normalSpace.impreciseAllocators[i].takeCanonicalizedBlock());
-        clearNewlyAllocatedInBlock(m_normalDestructorSpace.impreciseAllocators[i].takeCanonicalizedBlock());
-        clearNewlyAllocatedInBlock(m_immortalStructureDestructorSpace.impreciseAllocators[i].takeCanonicalizedBlock());
+        clearNewlyAllocatedInBlock(m_normalSpace.impreciseAllocators[i].takeLastActiveBlock());
+        clearNewlyAllocatedInBlock(m_normalDestructorSpace.impreciseAllocators[i].takeLastActiveBlock());
+        clearNewlyAllocatedInBlock(m_immortalStructureDestructorSpace.impreciseAllocators[i].takeLastActiveBlock());
     }
 
@@ -271 +300 @@
 }
 
+void MarkedSpace::willStartIterating()
+{
+    ASSERT(!isIterating());
+    stopAllocating();
+    m_isIterating = true;
+}
+
+void MarkedSpace::didFinishIterating()
+{
+    ASSERT(isIterating());
+    resumeAllocating();
+    m_isIterating = false;
+}
+
 } // namespace JSC
-
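MarkedSpace::forEachAllocator above factors the stop/resume loops into a single template that applies a functor to every allocator. The shape of that pattern, reduced to a sketch — Space, Allocator, and the fixed-size allocator list are simplified stand-ins for MarkedSpace, MarkedAllocator, and the precise/imprecise/large allocator groups:

```cpp
#include <cassert>
#include <vector>

// Hypothetical stand-in for MarkedAllocator.
struct Allocator {
    bool allocating = true;
    void stopAllocating() { allocating = false; }
    void resumeAllocating() { allocating = true; }
};

// Hypothetical stand-in for MarkedSpace, holding all allocators in one container.
struct Space {
    std::vector<Allocator> allocators = std::vector<Allocator>(3);

    // One traversal shared by every whole-space operation: apply the functor to each allocator.
    template<typename Functor> void forEachAllocator(Functor& functor) {
        for (Allocator& allocator : allocators)
            functor(allocator);
    }
    // Convenience overload that default-constructs the functor, as in the patch.
    template<typename Functor> void forEachAllocator() {
        Functor functor;
        forEachAllocator(functor);
    }
};

struct StopAllocatingFunctor {
    void operator()(Allocator& allocator) { allocator.stopAllocating(); }
};

struct ResumeAllocatingFunctor {
    void operator()(Allocator& allocator) { allocator.resumeAllocating(); }
};
```

This is the same design choice the patch makes: the traversal over precise, imprecise, and large allocators is written once, and each new whole-space operation only adds a small functor instead of duplicating the three loops.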
trunk/Source/JavaScriptCore/heap/MarkedSpace.h
(r155406 → r155891)

@@ -38 +38 @@
 
 class Heap;
+class HeapIterationScope;
 class JSCell;
 class LiveObjectIterator;
@@ -81 +82 @@
 
     MarkedBlockSet& blocks() { return m_blocks; }
 
-    void canonicalizeCellLivenessData();
+    void willStartIterating();
+    bool isIterating() { return m_isIterating; }
+    void didFinishIterating();
+
+    void stopAllocating();
+    void resumeAllocating(); // If we just stopped allocation but we didn't do a collection, we need to resume allocation.
 
     typedef HashSet<MarkedBlock*>::iterator BlockIterator;
 
-    template<typename Functor> typename Functor::ReturnType forEachLiveCell(Functor&);
-    template<typename Functor> typename Functor::ReturnType forEachLiveCell();
-    template<typename Functor> typename Functor::ReturnType forEachDeadCell(Functor&);
-    template<typename Functor> typename Functor::ReturnType forEachDeadCell();
+    template<typename Functor> typename Functor::ReturnType forEachLiveCell(HeapIterationScope&, Functor&);
+    template<typename Functor> typename Functor::ReturnType forEachLiveCell(HeapIterationScope&);
+    template<typename Functor> typename Functor::ReturnType forEachDeadCell(HeapIterationScope&, Functor&);
+    template<typename Functor> typename Functor::ReturnType forEachDeadCell(HeapIterationScope&);
     template<typename Functor> typename Functor::ReturnType forEachBlock(Functor&);
     template<typename Functor> typename Functor::ReturnType forEachBlock();
@@ -112 +118 @@
     friend class LLIntOffsetsExtractor;
 
+    template<typename Functor> void forEachAllocator(Functor&);
+    template<typename Functor> void forEachAllocator();
+
     // [ 32... 128 ]
     static const size_t preciseStep = MarkedBlock::atomSize;
@@ -134 +143 @@
     Heap* m_heap;
     size_t m_capacity;
+    bool m_isIterating;
     MarkedBlockSet m_blocks;
 };
 
-template<typename Functor> inline typename Functor::ReturnType MarkedSpace::forEachLiveCell(Functor& functor)
-{
-    canonicalizeCellLivenessData();
-
+template<typename Functor> inline typename Functor::ReturnType MarkedSpace::forEachLiveCell(HeapIterationScope&, Functor& functor)
+{
+    ASSERT(isIterating());
     BlockIterator end = m_blocks.set().end();
     for (BlockIterator it = m_blocks.set().begin(); it != end; ++it)
@@ -147 +156 @@
 }
 
-template<typename Functor> inline typename Functor::ReturnType MarkedSpace::forEachLiveCell()
+template<typename Functor> inline typename Functor::ReturnType MarkedSpace::forEachLiveCell(HeapIterationScope& scope)
 {
     Functor functor;
-    return forEachLiveCell(functor);
-}
-
-template<typename Functor> inline typename Functor::ReturnType MarkedSpace::forEachDeadCell(Functor& functor)
-{
-    canonicalizeCellLivenessData();
-
+    return forEachLiveCell(scope, functor);
+}
+
+template<typename Functor> inline typename Functor::ReturnType MarkedSpace::forEachDeadCell(HeapIterationScope&, Functor& functor)
+{
+    ASSERT(isIterating());
     BlockIterator end = m_blocks.set().end();
     for (BlockIterator it = m_blocks.set().begin(); it != end; ++it)
@@ -163 +171 @@
 }
 
-template<typename Functor> inline typename Functor::ReturnType MarkedSpace::forEachDeadCell()
+template<typename Functor> inline typename Functor::ReturnType MarkedSpace::forEachDeadCell(HeapIterationScope& scope)
 {
     Functor functor;
-    return forEachDeadCell(functor);
+    return forEachDeadCell(scope, functor);
 }
 
-
trunk/Source/JavaScriptCore/runtime/JSGlobalObject.cpp
(r155711 → r155891)

@@ -47 +47 @@
 #include "FunctionPrototype.h"
 #include "GetterSetter.h"
+#include "HeapIterationScope.h"
 #include "Interpreter.h"
 #include "JSAPIWrapperObject.h"
@@ -515 +516 @@
     MarkedArgumentBuffer foundObjects; // Use MarkedArgumentBuffer because switchToSlowPutArrayStorage() may GC.
     ObjectsWithBrokenIndexingFinder finder(foundObjects, this);
-    vm.heap.objectSpace().forEachLiveCell(finder);
+    {
+        HeapIterationScope iterationScope(vm.heap);
+        vm.heap.objectSpace().forEachLiveCell(iterationScope, finder);
+    }
     while (!foundObjects.isEmpty()) {
         JSObject* object = asObject(foundObjects.last());
-
trunk/Source/JavaScriptCore/runtime/VM.cpp
(r155558 → r155891)

@@ -43 +43 @@
 #include "GetterSetter.h"
 #include "Heap.h"
+#include "HeapIterationScope.h"
 #include "HostCallReturnValue.h"
 #include "Identifier.h"
@@ -522 +523 @@
     if (dynamicGlobalObject) {
         StackPreservingRecompiler recompiler;
+        HeapIterationScope iterationScope(heap);
         HashSet<JSCell*> roots;
-        heap.canonicalizeCellLivenessData();
         heap.getConservativeRegisterRoots(roots);
         HashSet<JSCell*>::iterator end = roots.end();
@@ -544 +545 @@
 
         }
-        heap.objectSpace().forEachLiveCell<StackPreservingRecompiler>(recompiler);
+        heap.objectSpace().forEachLiveCell<StackPreservingRecompiler>(iterationScope, recompiler);
     }
     m_regExpCache->invalidateCode();