Changeset 94920 in webkit


Timestamp:
Sep 10, 2011 10:49:36 PM
Author:
fpizlo@apple.com
Message:

The executable allocator makes it difficult to free individual
chunks of executable memory
https://bugs.webkit.org/show_bug.cgi?id=66363

Reviewed by Oliver Hunt.

Introduced a best-fit, balanced-tree-based allocator. The allocator
required a balanced tree that does not allocate memory and that
permits the removal of individual nodes directly (as opposed to by
key); neither AVLTree nor WebCore's PODRedBlackTree supported this.
Changed all references to executable code to use a reference-counted
handle.
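
To make the shape of the new allocator concrete, here is a minimal best-fit sketch. It is illustrative only: std::multimap stands in for the non-allocating WTF::RedBlackTree added by this patch, the names (BestFitPool, Handle, m_free) are hypothetical, and details such as the 32-byte rounding, page commit/decommit, and coalescing of adjacent free chunks are omitted.

    #include <cstddef>
    #include <map>
    #include <memory>

    class BestFitPool {
    public:
        // Seed the pool with one contiguous span of memory.
        BestFitPool(char* base, std::size_t size) { m_free.insert({size, base}); }

        // Reference-counted handle: freeing a chunk is just dropping the last reference.
        using Handle = std::shared_ptr<char>;

        Handle allocate(std::size_t bytes)
        {
            auto it = m_free.lower_bound(bytes);   // smallest free chunk >= bytes (best fit)
            if (it == m_free.end())
                return nullptr;
            char* start = it->second;
            std::size_t chunkSize = it->first;
            m_free.erase(it);
            if (chunkSize > bytes)                 // hand the unused tail back to the tree
                m_free.insert({chunkSize - bytes, start + bytes});
            return Handle(start, [this, bytes](char* p) { m_free.insert({bytes, p}); });
        }

    private:
        // Stand-in for the non-allocating red-black tree: key = chunk size, value = chunk start.
        std::multimap<std::size_t, char*> m_free;
    };

Dropping a handle returns exactly that chunk to the free tree, which is the property the old pool-based ExecutableAllocator could not provide.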

Source/JavaScriptCore:

  • GNUmakefile.list.am:
  • JavaScriptCore.exp:
  • JavaScriptCore.vcproj/WTF/WTF.vcproj:
  • JavaScriptCore.xcodeproj/project.pbxproj:
  • assembler/AssemblerBuffer.h:

(JSC::AssemblerBuffer::executableCopy):

  • assembler/LinkBuffer.h:

(JSC::LinkBuffer::LinkBuffer):
(JSC::LinkBuffer::finalizeCode):
(JSC::LinkBuffer::linkCode):

  • assembler/MacroAssemblerCodeRef.h:

(JSC::MacroAssemblerCodeRef::MacroAssemblerCodeRef):
(JSC::MacroAssemblerCodeRef::createSelfManagedCodeRef):
(JSC::MacroAssemblerCodeRef::executableMemory):
(JSC::MacroAssemblerCodeRef::code):
(JSC::MacroAssemblerCodeRef::size):
(JSC::MacroAssemblerCodeRef::operator!):

  • assembler/X86Assembler.h:

(JSC::X86Assembler::executableCopy):
(JSC::X86Assembler::X86InstructionFormatter::executableCopy):

  • bytecode/CodeBlock.h:
  • bytecode/Instruction.h:
  • bytecode/StructureStubInfo.h:
  • dfg/DFGJITCompiler.cpp:

(JSC::DFG::JITCompiler::compile):
(JSC::DFG::JITCompiler::compileFunction):

  • dfg/DFGRepatch.cpp:

(JSC::DFG::generateProtoChainAccessStub):
(JSC::DFG::tryCacheGetByID):
(JSC::DFG::tryBuildGetByIDList):
(JSC::DFG::tryBuildGetByIDProtoList):
(JSC::DFG::tryCachePutByID):

  • jit/ExecutableAllocator.cpp:

(JSC::ExecutableAllocator::initializeAllocator):
(JSC::ExecutableAllocator::ExecutableAllocator):
(JSC::ExecutableAllocator::allocate):
(JSC::ExecutableAllocator::committedByteCount):
(JSC::ExecutableAllocator::dumpProfile):

  • jit/ExecutableAllocator.h:

(JSC::ExecutableAllocator::dumpProfile):

  • jit/ExecutableAllocatorFixedVMPool.cpp:

(JSC::ExecutableAllocator::initializeAllocator):
(JSC::ExecutableAllocator::ExecutableAllocator):
(JSC::ExecutableAllocator::isValid):
(JSC::ExecutableAllocator::underMemoryPressure):
(JSC::ExecutableAllocator::allocate):
(JSC::ExecutableAllocator::committedByteCount):
(JSC::ExecutableAllocator::dumpProfile):

  • jit/JIT.cpp:

(JSC::JIT::privateCompile):

  • jit/JIT.h:

(JSC::JIT::compileCTIMachineTrampolines):
(JSC::JIT::compileCTINativeCall):

  • jit/JITCode.h:

(JSC::JITCode::operator !):
(JSC::JITCode::addressForCall):
(JSC::JITCode::offsetOf):
(JSC::JITCode::execute):
(JSC::JITCode::start):
(JSC::JITCode::size):
(JSC::JITCode::getExecutableMemory):
(JSC::JITCode::HostFunction):
(JSC::JITCode::JITCode):

  • jit/JITOpcodes.cpp:

(JSC::JIT::privateCompileCTIMachineTrampolines):
(JSC::JIT::privateCompileCTINativeCall):

  • jit/JITOpcodes32_64.cpp:

(JSC::JIT::privateCompileCTIMachineTrampolines):
(JSC::JIT::privateCompileCTINativeCall):

  • jit/JITPropertyAccess.cpp:

(JSC::JIT::stringGetByValStubGenerator):
(JSC::JIT::emitSlow_op_get_by_val):
(JSC::JIT::privateCompilePutByIdTransition):
(JSC::JIT::privateCompilePatchGetArrayLength):
(JSC::JIT::privateCompileGetByIdProto):
(JSC::JIT::privateCompileGetByIdSelfList):
(JSC::JIT::privateCompileGetByIdProtoList):
(JSC::JIT::privateCompileGetByIdChainList):
(JSC::JIT::privateCompileGetByIdChain):

  • jit/JITPropertyAccess32_64.cpp:

(JSC::JIT::stringGetByValStubGenerator):
(JSC::JIT::emitSlow_op_get_by_val):
(JSC::JIT::privateCompilePutByIdTransition):
(JSC::JIT::privateCompilePatchGetArrayLength):
(JSC::JIT::privateCompileGetByIdProto):
(JSC::JIT::privateCompileGetByIdSelfList):
(JSC::JIT::privateCompileGetByIdProtoList):
(JSC::JIT::privateCompileGetByIdChainList):
(JSC::JIT::privateCompileGetByIdChain):

  • jit/JITStubs.cpp:

(JSC::JITThunks::JITThunks):
(JSC::DEFINE_STUB_FUNCTION):
(JSC::getPolymorphicAccessStructureListSlot):
(JSC::JITThunks::ctiStub):
(JSC::JITThunks::hostFunctionStub):

  • jit/JITStubs.h:
  • jit/SpecializedThunkJIT.h:

(JSC::SpecializedThunkJIT::SpecializedThunkJIT):
(JSC::SpecializedThunkJIT::finalize):

  • jit/ThunkGenerators.cpp:

(JSC::charCodeAtThunkGenerator):
(JSC::charAtThunkGenerator):
(JSC::fromCharCodeThunkGenerator):
(JSC::sqrtThunkGenerator):
(JSC::floorThunkGenerator):
(JSC::ceilThunkGenerator):
(JSC::roundThunkGenerator):
(JSC::expThunkGenerator):
(JSC::logThunkGenerator):
(JSC::absThunkGenerator):
(JSC::powThunkGenerator):

  • jit/ThunkGenerators.h:
  • runtime/Executable.h:

(JSC::NativeExecutable::create):

  • runtime/InitializeThreading.cpp:

(JSC::initializeThreadingOnce):

  • runtime/JSGlobalData.cpp:

(JSC::JSGlobalData::JSGlobalData):
(JSC::JSGlobalData::dumpSampleData):

  • runtime/JSGlobalData.h:

(JSC::JSGlobalData::getCTIStub):

  • wtf/CMakeLists.txt:
  • wtf/MetaAllocator.cpp: Added.

(WTF::MetaAllocatorHandle::MetaAllocatorHandle):
(WTF::MetaAllocatorHandle::~MetaAllocatorHandle):
(WTF::MetaAllocatorHandle::shrink):
(WTF::MetaAllocator::MetaAllocator):
(WTF::MetaAllocator::allocate):
(WTF::MetaAllocator::currentStatistics):
(WTF::MetaAllocator::findAndRemoveFreeSpace):
(WTF::MetaAllocator::addFreeSpaceFromReleasedHandle):
(WTF::MetaAllocator::addFreshFreeSpace):
(WTF::MetaAllocator::debugFreeSpaceSize):
(WTF::MetaAllocator::addFreeSpace):
(WTF::MetaAllocator::incrementPageOccupancy):
(WTF::MetaAllocator::decrementPageOccupancy):
(WTF::MetaAllocator::roundUp):
(WTF::MetaAllocator::allocFreeSpaceNode):
(WTF::MetaAllocator::freeFreeSpaceNode):
(WTF::MetaAllocator::dumpProfile):

  • wtf/MetaAllocator.h: Added.

(WTF::MetaAllocator::bytesAllocated):
(WTF::MetaAllocator::bytesReserved):
(WTF::MetaAllocator::bytesCommitted):
(WTF::MetaAllocator::dumpProfile):
(WTF::MetaAllocator::~MetaAllocator):

  • wtf/MetaAllocatorHandle.h: Added.
  • wtf/RedBlackTree.h: Added.

(WTF::RedBlackTree::Node::Node):
(WTF::RedBlackTree::Node::successor):
(WTF::RedBlackTree::Node::predecessor):
(WTF::RedBlackTree::Node::reset):
(WTF::RedBlackTree::Node::parent):
(WTF::RedBlackTree::Node::setParent):
(WTF::RedBlackTree::Node::left):
(WTF::RedBlackTree::Node::setLeft):
(WTF::RedBlackTree::Node::right):
(WTF::RedBlackTree::Node::setRight):
(WTF::RedBlackTree::Node::color):
(WTF::RedBlackTree::Node::setColor):
(WTF::RedBlackTree::RedBlackTree):
(WTF::RedBlackTree::insert):
(WTF::RedBlackTree::remove):
(WTF::RedBlackTree::findExact):
(WTF::RedBlackTree::findLeastGreaterThanOrEqual):
(WTF::RedBlackTree::findGreatestLessThanOrEqual):
(WTF::RedBlackTree::first):
(WTF::RedBlackTree::last):
(WTF::RedBlackTree::size):
(WTF::RedBlackTree::isEmpty):
(WTF::RedBlackTree::treeMinimum):
(WTF::RedBlackTree::treeMaximum):
(WTF::RedBlackTree::treeInsert):
(WTF::RedBlackTree::leftRotate):
(WTF::RedBlackTree::rightRotate):
(WTF::RedBlackTree::removeFixup):

  • wtf/wtf.pri:
  • yarr/YarrJIT.cpp:

(JSC::Yarr::YarrGenerator::compile):

  • yarr/YarrJIT.h:

(JSC::Yarr::YarrCodeBlock::execute):
(JSC::Yarr::YarrCodeBlock::getAddr):

Source/JavaScriptGlue:

  • ForwardingHeaders/wtf/MetaAllocatorHandle.h: Added.

Source/WebCore:

No new layout tests because behavior is not changed. New API unit
tests:
Tests/WTF/RedBlackTree.cpp
Tests/WTF/MetaAllocator.cpp

  • ForwardingHeaders/wtf/MetaAllocatorHandle.h: Added.

Tools:

  • TestWebKitAPI/TestWebKitAPI.xcodeproj/project.pbxproj:
  • TestWebKitAPI/Tests/WTF/MetaAllocator.cpp: Added.

(TestWebKitAPI::TEST_F):

  • TestWebKitAPI/Tests/WTF/RedBlackTree.cpp: Added.

(TestWebKitAPI::Pair::findExact):
(TestWebKitAPI::Pair::remove):
(TestWebKitAPI::Pair::findLeastGreaterThanOrEqual):
(TestWebKitAPI::Pair::assertFoundAndRemove):
(TestWebKitAPI::Pair::assertEqual):
(TestWebKitAPI::Pair::assertSameValuesForKey):
(TestWebKitAPI::Pair::testDriver):
(TestWebKitAPI::TEST_F):
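
As a rough illustration of what these API-level tests exercise (this is a guess at the shape of such a test, not the actual TestWebKitAPI code; the backing-buffer handling and the assertions are assumptions, while the MetaAllocator entry points used are the ones visible in this changeset):

    #include <wtf/MetaAllocator.h>
    #include <wtf/RefPtr.h>
    #include <cassert>

    // Hypothetical test allocator backed by ordinary memory rather than
    // executable pages; the page notifications are no-ops for the test.
    class TestAllocator : public WTF::MetaAllocator {
    public:
        TestAllocator(void* base, size_t size)
            : WTF::MetaAllocator(32)                 // 32-byte allocation granularity
        {
            addFreshFreeSpace(base, size);
        }
    protected:
        virtual void* allocateNewSpace(size_t&) { return 0; }  // fixed backing only
        virtual void notifyNeedPage(void*) { }
        virtual void notifyPageIsFree(void*) { }
    };

    // Allocate, drop the handle, and check the free-space accounting recovers.
    void testAllocateAndFree(void* pageAlignedBase, size_t size)
    {
        TestAllocator allocator(pageAlignedBase, size);
        size_t initialFree = allocator.debugFreeSpaceSize();
        {
            WTF::RefPtr<WTF::MetaAllocatorHandle> handle = allocator.allocate(64);
            assert(handle);
            assert(allocator.debugFreeSpaceSize() == initialFree - 64);
        }   // last reference dropped; the chunk goes back to the free tree
        assert(allocator.debugFreeSpaceSize() == initialFree);
    }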

Location:
trunk
Files:
8 added
41 edited

  • trunk/Source/JavaScriptCore/ChangeLog

  • trunk/Source/JavaScriptCore/GNUmakefile.list.am

    r94814 r94920  
    530530        Source/JavaScriptCore/wtf/MD5.cpp \
    531531        Source/JavaScriptCore/wtf/MD5.h \
     532        Source/JavaScriptCore/wtf/MetaAllocator.cpp \
     533        Source/JavaScriptCore/wtf/MetaAllocator.h \
     534        Source/JavaScriptCore/wtf/MetaAllocatorHandle.h \
    532535        Source/JavaScriptCore/wtf/MessageQueue.h \
    533536        Source/JavaScriptCore/wtf/NonCopyingSort.h \
     
    562565        Source/JavaScriptCore/wtf/RandomNumber.h \
    563566        Source/JavaScriptCore/wtf/RandomNumberSeed.h \
     567        Source/JavaScriptCore/wtf/RedBlackTree.h \
    564568        Source/JavaScriptCore/wtf/RefCounted.h \
    565569        Source/JavaScriptCore/wtf/RefCountedLeakCounter.cpp \
  • trunk/Source/JavaScriptCore/JavaScriptCore.exp

    r94919 r94920  
    408408__ZN3WTF12isMainThreadEv
    409409__ZN3WTF12randomNumberEv
     410__ZN3WTF13MetaAllocator17addFreshFreeSpaceEPvm
     411__ZN3WTF13MetaAllocator17freeFreeSpaceNodeEPNS_12RedBlackTreeImPvE4NodeE
     412__ZN3WTF13MetaAllocator18debugFreeSpaceSizeEv
     413__ZN3WTF13MetaAllocator8allocateEm
     414__ZN3WTF13MetaAllocatorC2Em
    410415__ZN3WTF13StringBuilder11reifyStringEv
    411416__ZN3WTF13StringBuilder11shrinkToFitEv
     
    442447__ZN3WTF18dateToDaysFrom1970Eiii
    443448__ZN3WTF18monthFromDayInYearEib
     449__ZN3WTF19MetaAllocatorHandle6shrinkEm
     450__ZN3WTF19MetaAllocatorHandleD1Ev
    444451__ZN3WTF19initializeThreadingEv
    445452__ZN3WTF20equalIgnoringNullityEPNS_10StringImplES1_
     
    522529__ZN3WTF8msToYearEd
    523530__ZN3WTF8nullAtomE
     531__ZN3WTF8pageSizeEv
    524532__ZN3WTF8starAtomE
    525533__ZN3WTF8textAtomE
  • trunk/Source/JavaScriptCore/JavaScriptCore.vcproj/WTF/WTF.vcproj

    r94890 r94920  
    838838                </File>
    839839                <File
     840                        RelativePath="..\..\wtf\MetaAllocator.cpp"
     841                        >
     842                </File>
     843                <File
     844                        RelativePath="..\..\wtf\MetaAllocator.h"
     845                        >
     846                </File>
     847                <File
     848                        RelativePath="..\..\wtf\MetaAllocatorHandle.h"
     849                        >
     850                </File>
     851                <File
    840852                        RelativePath="..\..\wtf\MD5.cpp"
    841853                        >
     
    975987                <File
    976988                        RelativePath="..\..\wtf\RandomNumberSeed.h"
     989                        >
     990                </File>
     991                <File
     992                        RelativePath="..\..\wtf\RedBlackTree.h"
    977993                        >
    978994                </File>
  • trunk/Source/JavaScriptCore/JavaScriptCore.xcodeproj/project.pbxproj

    r94814 r94920  
    5252                0F242DA713F3B1E8007ADD4C /* WeakReferenceHarvester.h in Headers */ = {isa = PBXBuildFile; fileRef = 0F242DA513F3B1BB007ADD4C /* WeakReferenceHarvester.h */; settings = {ATTRIBUTES = (Private, ); }; };
    5353                0F7700921402FF3C0078EB39 /* SamplingCounter.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 0F7700911402FF280078EB39 /* SamplingCounter.cpp */; };
     54                0F963B2713F753BB0002D9B2 /* RedBlackTree.h in Headers */ = {isa = PBXBuildFile; fileRef = 0F963B2613F753990002D9B2 /* RedBlackTree.h */; settings = {ATTRIBUTES = (Private, ); }; };
     55                0F963B2C13F853EC0002D9B2 /* MetaAllocator.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 0F963B2B13F853C70002D9B2 /* MetaAllocator.cpp */; settings = {COMPILER_FLAGS = "-fno-strict-aliasing"; }; };
     56                0F963B2D13F854020002D9B2 /* MetaAllocator.h in Headers */ = {isa = PBXBuildFile; fileRef = 0F963B2A13F853BD0002D9B2 /* MetaAllocator.h */; settings = {ATTRIBUTES = (Private, ); }; };
     57                0F963B2F13FC66BB0002D9B2 /* MetaAllocatorHandle.h in Headers */ = {isa = PBXBuildFile; fileRef = 0F963B2E13FC66AE0002D9B2 /* MetaAllocatorHandle.h */; settings = {ATTRIBUTES = (Private, ); }; };
    5458                0F963B3813FC6FE90002D9B2 /* ValueProfile.h in Headers */ = {isa = PBXBuildFile; fileRef = 0F963B3613FC6FDE0002D9B2 /* ValueProfile.h */; settings = {ATTRIBUTES = (Private, ); }; };
    5559                0FC8150A14043BF500CFA603 /* WriteBarrierSupport.h in Headers */ = {isa = PBXBuildFile; fileRef = 0FC8150914043BD200CFA603 /* WriteBarrierSupport.h */; settings = {ATTRIBUTES = (Private, ); }; };
     
    559563                BC18C46C0E16F5CD00B34460 /* TCPackedCache.h in Headers */ = {isa = PBXBuildFile; fileRef = 5DA479650CFBCF56009328A0 /* TCPackedCache.h */; };
    560564                BC18C46D0E16F5CD00B34460 /* TCPageMap.h in Headers */ = {isa = PBXBuildFile; fileRef = 6541BD6E08E80A17002CBEE7 /* TCPageMap.h */; };
    561                 BC18C46E0E16F5CD00B34460 /* TCSpinLock.h in Headers */ = {isa = PBXBuildFile; fileRef = 6541BD6F08E80A17002CBEE7 /* TCSpinLock.h */; };
     565                BC18C46E0E16F5CD00B34460 /* TCSpinLock.h in Headers */ = {isa = PBXBuildFile; fileRef = 6541BD6F08E80A17002CBEE7 /* TCSpinLock.h */; settings = {ATTRIBUTES = (Private, ); }; };
    562566                BC18C46F0E16F5CD00B34460 /* TCSystemAlloc.h in Headers */ = {isa = PBXBuildFile; fileRef = 6541BD7108E80A17002CBEE7 /* TCSystemAlloc.h */; };
    563567                BC18C4700E16F5CD00B34460 /* Threading.h in Headers */ = {isa = PBXBuildFile; fileRef = E1EE79220D6C95CD00FEA3BA /* Threading.h */; settings = {ATTRIBUTES = (Private, ); }; };
     
    779783                0F77008E1402FDD60078EB39 /* SamplingCounter.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = SamplingCounter.h; sourceTree = "<group>"; };
    780784                0F7700911402FF280078EB39 /* SamplingCounter.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = SamplingCounter.cpp; sourceTree = "<group>"; };
     785                0F963B2613F753990002D9B2 /* RedBlackTree.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = RedBlackTree.h; sourceTree = "<group>"; };
     786                0F963B2A13F853BD0002D9B2 /* MetaAllocator.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = MetaAllocator.h; sourceTree = "<group>"; };
     787                0F963B2B13F853C70002D9B2 /* MetaAllocator.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = MetaAllocator.cpp; sourceTree = "<group>"; };
     788                0F963B2E13FC66AE0002D9B2 /* MetaAllocatorHandle.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = MetaAllocatorHandle.h; sourceTree = "<group>"; };
    781789                0F963B3613FC6FDE0002D9B2 /* ValueProfile.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = ValueProfile.h; sourceTree = "<group>"; };
    782790                0FC8150814043BCA00CFA603 /* WriteBarrierSupport.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = WriteBarrierSupport.cpp; sourceTree = "<group>"; };
     
    17641772                        isa = PBXGroup;
    17651773                        children = (
     1774                                0F963B2E13FC66AE0002D9B2 /* MetaAllocatorHandle.h */,
     1775                                0F963B2B13F853C70002D9B2 /* MetaAllocator.cpp */,
     1776                                0F963B2A13F853BD0002D9B2 /* MetaAllocator.h */,
     1777                                0F963B2613F753990002D9B2 /* RedBlackTree.h */,
    17661778                                C2EE599D13FC972A009CEAFE /* DecimalNumber.cpp */,
    17671779                                C2EE599E13FC972A009CEAFE /* DecimalNumber.h */,
     
    23842396                                86ADD1450FDDEA980006EEC2 /* ARMv7Assembler.h in Headers */,
    23852397                                BC18C3E60E16F5CD00B34460 /* ArrayConstructor.h in Headers */,
     2398                                BC18C46E0E16F5CD00B34460 /* TCSpinLock.h in Headers */,
     2399                                0F963B2F13FC66BB0002D9B2 /* MetaAllocatorHandle.h in Headers */,
     2400                                0F963B2D13F854020002D9B2 /* MetaAllocator.h in Headers */,
     2401                                0F963B2713F753BB0002D9B2 /* RedBlackTree.h in Headers */,
    23862402                                0FC815151405119B00CFA603 /* VTableSpectrum.h in Headers */,
    23872403                                C22B31B9140577D700DB475A /* SamplingCounter.h in Headers */,
     
    26872703                                BC18C46C0E16F5CD00B34460 /* TCPackedCache.h in Headers */,
    26882704                                BC18C46D0E16F5CD00B34460 /* TCPageMap.h in Headers */,
    2689                                 BC18C46E0E16F5CD00B34460 /* TCSpinLock.h in Headers */,
    26902705                                BC18C46F0E16F5CD00B34460 /* TCSystemAlloc.h in Headers */,
    26912706                                971EDEA61169E0D3005E4262 /* Terminator.h in Headers */,
     
    30503065                        buildActionMask = 2147483647;
    30513066                        files = (
     3067                                0F963B2C13F853EC0002D9B2 /* MetaAllocator.cpp in Sources */,
    30523068                                0FD82E2114172CE300179C94 /* DFGCapabilities.cpp in Sources */,
    30533069                                0FD3C82514115D4000FD81CB /* DFGPropagator.cpp in Sources */,
  • trunk/Source/JavaScriptCore/assembler/AssemblerBuffer.h

    r87527 r94920  
    2929#if ENABLE(ASSEMBLER)
    3030
     31#include "JSGlobalData.h"
    3132#include "stdint.h"
    3233#include <string.h>
     
    129130        }
    130131
    131         void* executableCopy(JSGlobalData& globalData, ExecutablePool* allocator)
     132        PassRefPtr<ExecutableMemoryHandle> executableCopy(JSGlobalData& globalData)
    132133        {
    133134            if (!m_index)
    134135                return 0;
    135136
    136             void* result = allocator->alloc(globalData, m_index);
     137            RefPtr<ExecutableMemoryHandle> result = globalData.executableAllocator.allocate(globalData, m_index);
    137138
    138139            if (!result)
    139140                return 0;
    140141
    141             ExecutableAllocator::makeWritable(result, m_index);
     142            ExecutableAllocator::makeWritable(result->start(), result->sizeInBytes());
    142143
    143             return memcpy(result, m_buffer, m_index);
     144            memcpy(result->start(), m_buffer, m_index);
     145           
     146            return result.release();
    144147        }
    145148
  • trunk/Source/JavaScriptCore/assembler/LinkBuffer.h

    r93238 r94920  
    7070
    7171public:
    72     LinkBuffer(JSGlobalData& globalData, MacroAssembler* masm, PassRefPtr<ExecutablePool> executablePool)
    73         : m_executablePool(executablePool)
    74         , m_size(0)
     72    LinkBuffer(JSGlobalData& globalData, MacroAssembler* masm)
     73        : m_size(0)
    7574        , m_code(0)
    7675        , m_assembler(masm)
     
    8382    }
    8483
    85     LinkBuffer(JSGlobalData& globalData, MacroAssembler* masm, ExecutableAllocator& allocator)
    86         : m_executablePool(allocator.poolForSize(globalData, masm->m_assembler.codeSize()))
    87         , m_size(0)
    88         , m_code(0)
    89         , m_assembler(masm)
    90         , m_globalData(&globalData)
    91 #ifndef NDEBUG
    92         , m_completed(false)
    93 #endif
    94     {
    95         linkCode();
    96     }
    97 
    9884    ~LinkBuffer()
    9985    {
     
    186172        performFinalization();
    187173
    188         return CodeRef(m_code, m_executablePool, m_size);
    189     }
    190 
    191     CodeLocationLabel finalizeCodeAddendum()
    192     {
    193         performFinalization();
    194 
    195         return CodeLocationLabel(code());
     174        return CodeRef(m_executableMemory);
    196175    }
    197176
     
    233212        ASSERT(!m_code);
    234213#if !ENABLE(BRANCH_COMPACTION)
    235         m_code = m_assembler->m_assembler.executableCopy(*m_globalData, m_executablePool.get());
     214        m_executableMemory = m_assembler->m_assembler.executableCopy(*m_globalData);
     215        if (!m_executableMemory)
     216            return;
     217        m_code = m_executableMemory->start();
    236218        m_size = m_assembler->m_assembler.codeSize();
    237219        ASSERT(m_code);
    238220#else
    239221        size_t initialSize = m_assembler->m_assembler.codeSize();
    240         m_code = (uint8_t*)m_executablePool->alloc(*m_globalData, initialSize);
    241         if (!m_code)
     222        m_executableMemory = m_globalData->executableAllocator.allocate(*m_globalData, initialSize);
     223        if (!m_executableMemory)
    242224            return;
     225        m_code = (uint8_t*)m_executableMemory->start();
     226        ASSERT(m_code);
    243227        ExecutableAllocator::makeWritable(m_code, initialSize);
    244228        uint8_t* inData = (uint8_t*)m_assembler->unlinkedCode();
     
    298282        jumpsToLink.clear();
    299283        m_size = writePtr + initialSize - readPtr;
    300         m_executablePool->tryShrink(m_code, initialSize, m_size);
     284        m_executableMemory->shrink(m_size);
    301285
    302286#if DUMP_LINK_STATISTICS
     
    367351#endif
    368352   
    369     RefPtr<ExecutablePool> m_executablePool;
     353    RefPtr<ExecutableMemoryHandle> m_executableMemory;
    370354    size_t m_size;
    371355    void* m_code;
  • trunk/Source/JavaScriptCore/assembler/MacroAssemblerCodeRef.h

    r89630 r94920  
    200200// was allocated.
    201201class MacroAssemblerCodeRef {
     202private:
     203    // This is private because it's dangerous enough that we want uses of it
     204    // to be easy to find - hence the static create method below.
     205    explicit MacroAssemblerCodeRef(MacroAssemblerCodePtr codePtr)
     206        : m_codePtr(codePtr)
     207    {
     208        ASSERT(m_codePtr);
     209    }
     210
    202211public:
    203212    MacroAssemblerCodeRef()
    204         : m_size(0)
    205     {
    206     }
    207 
    208     MacroAssemblerCodeRef(void* code, PassRefPtr<ExecutablePool> executablePool, size_t size)
    209         : m_code(code)
    210         , m_executablePool(executablePool)
    211         , m_size(size)
    212     {
    213     }
    214 
    215     MacroAssemblerCodePtr m_code;
    216     RefPtr<ExecutablePool> m_executablePool;
    217     size_t m_size;
     213    {
     214    }
     215
     216    MacroAssemblerCodeRef(PassRefPtr<ExecutableMemoryHandle> executableMemory)
     217        : m_codePtr(executableMemory->start())
     218        , m_executableMemory(executableMemory)
     219    {
     220        ASSERT(m_executableMemory->isManaged());
     221        ASSERT(m_executableMemory->start());
     222        ASSERT(m_codePtr);
     223    }
     224   
     225    // Use this only when you know that the codePtr refers to code that is
     226    // already being kept alive through some other means. Typically this means
     227    // that codePtr is immortal.
     228    static MacroAssemblerCodeRef createSelfManagedCodeRef(MacroAssemblerCodePtr codePtr)
     229    {
     230        return MacroAssemblerCodeRef(codePtr);
     231    }
     232   
     233    ExecutableMemoryHandle* executableMemory() const
     234    {
     235        return m_executableMemory.get();
     236    }
     237   
     238    MacroAssemblerCodePtr code() const
     239    {
     240        return m_codePtr;
     241    }
     242   
     243    size_t size() const
     244    {
     245        if (!m_executableMemory)
     246            return 0;
     247        return m_executableMemory->sizeInBytes();
     248    }
     249   
     250    bool operator!() const { return !m_codePtr; }
     251
     252private:
     253    MacroAssemblerCodePtr m_codePtr;
     254    RefPtr<ExecutableMemoryHandle> m_executableMemory;
    218255};
    219256
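
With the reference-counted handle in place, a generated stub's life cycle looks roughly like the fragment below. This is an illustrative sketch assuming it runs inside JavaScriptCore with a JSGlobalData and a populated MacroAssembler in scope; it is not code from the patch, and generateStub is a hypothetical name.

    MacroAssemblerCodeRef generateStub(JSGlobalData& globalData, MacroAssembler& jit)
    {
        // The LinkBuffer now draws memory from globalData.executableAllocator
        // itself; there is no longer an ExecutablePool to pass in.
        LinkBuffer patchBuffer(globalData, &jit);

        // finalizeCode() hands back a MacroAssemblerCodeRef that owns a
        // reference-counted ExecutableMemoryHandle.
        MacroAssemblerCodeRef stub = patchBuffer.finalizeCode();
        ASSERT(!!stub);

        // The memory stays alive as long as some CodeRef (or other RefPtr)
        // holds the handle; dropping the last reference frees just this chunk.
        ASSERT(stub.executableMemory()->sizeInBytes() == stub.size());

        return stub;
    }

    // For code that is never freed (for example a pointer into a statically
    // linked trampoline), wrap the raw pointer without taking ownership:
    //     MacroAssemblerCodeRef ref = MacroAssemblerCodeRef::createSelfManagedCodeRef(codePtr);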
  • trunk/Source/JavaScriptCore/assembler/X86Assembler.h

    r90237 r94920  
    15961596    }
    15971597   
    1598     void* executableCopy(JSGlobalData& globalData, ExecutablePool* allocator)
    1599     {
    1600         return m_formatter.executableCopy(globalData, allocator);
     1598    PassRefPtr<ExecutableMemoryHandle> executableCopy(JSGlobalData& globalData)
     1599    {
     1600        return m_formatter.executableCopy(globalData);
    16011601    }
    16021602
     
    19401940        void* data() const { return m_buffer.data(); }
    19411941
    1942         void* executableCopy(JSGlobalData& globalData, ExecutablePool* allocator)
    1943         {
    1944             return m_buffer.executableCopy(globalData, allocator);
     1942        PassRefPtr<ExecutableMemoryHandle> executableCopy(JSGlobalData& globalData)
     1943        {
     1944            return m_buffer.executableCopy(globalData);
    19451945        }
    19461946
  • trunk/Source/JavaScriptCore/bytecode/CodeBlock.h

    r94802 r94920  
    325325        JITCode& getJITCode() { return m_jitCode; }
    326326        JITCode::JITType getJITType() { return m_jitCode.jitType(); }
    327         ExecutablePool* executablePool() { return getJITCode().getExecutablePool(); }
     327        ExecutableMemoryHandle* executableMemory() { return getJITCode().getExecutableMemory(); }
    328328        virtual JSObject* compileOptimized(ExecState*, ScopeChainNode*) = 0;
    329329        virtual CodeBlock* replacement() = 0;
  • trunk/Source/JavaScriptCore/bytecode/Instruction.h

    r91952 r94920  
    5151
    5252#if ENABLE(JIT)
    53     typedef CodeLocationLabel PolymorphicAccessStructureListStubRoutineType;
     53    typedef MacroAssemblerCodeRef PolymorphicAccessStructureListStubRoutineType;
    5454
    5555    // Structure used by op_get_by_id_self_list and op_get_by_id_proto_list instruction to hold data off the main opcode stream.
  • trunk/Source/JavaScriptCore/bytecode/StructureStubInfo.h

    r92911 r94920  
    174174        } u;
    175175
    176         CodeLocationLabel stubRoutine;
     176        MacroAssemblerCodeRef stubRoutine;
    177177        CodeLocationCall callReturnLocation;
    178178        CodeLocationLabel hotPathBegin;
  • trunk/Source/JavaScriptCore/dfg/DFGJITCompiler.cpp

    r94914 r94920  
    956956    compileBody();
    957957    // Link
    958     LinkBuffer linkBuffer(*m_globalData, this, m_globalData->executableAllocator);
     958    LinkBuffer linkBuffer(*m_globalData, this);
    959959    link(linkBuffer);
    960960    entry = JITCode(linkBuffer.finalizeCode(), JITCode::DFGJIT);
     
    10141014
    10151015    // === Link ===
    1016     LinkBuffer linkBuffer(*m_globalData, this, m_globalData->executableAllocator);
     1016    LinkBuffer linkBuffer(*m_globalData, this);
    10171017    link(linkBuffer);
    10181018   
  • trunk/Source/JavaScriptCore/dfg/DFGRepatch.cpp

    r94701 r94920  
    9494}
    9595
    96 static void generateProtoChainAccessStub(ExecState* exec, StructureStubInfo& stubInfo, StructureChain* chain, size_t count, size_t offset, Structure* structure, CodeLocationLabel successLabel, CodeLocationLabel slowCaseLabel, CodeLocationLabel& stubRoutine)
     96static void generateProtoChainAccessStub(ExecState* exec, StructureStubInfo& stubInfo, StructureChain* chain, size_t count, size_t offset, Structure* structure, CodeLocationLabel successLabel, CodeLocationLabel slowCaseLabel, MacroAssemblerCodeRef& stubRoutine)
    9797{
    9898    JSGlobalData* globalData = &exec->globalData();
     
    134134    emitRestoreScratch(stubJit, needToRestoreScratch, scratchGPR, success, fail, failureCases);
    135135   
    136     LinkBuffer patchBuffer(*globalData, &stubJit, exec->codeBlock()->executablePool());
     136    LinkBuffer patchBuffer(*globalData, &stubJit);
    137137   
    138138    linkRestoreScratch(patchBuffer, needToRestoreScratch, success, fail, failureCases, successLabel, slowCaseLabel);
    139139   
    140     stubRoutine = patchBuffer.finalizeCodeAddendum();
     140    stubRoutine = patchBuffer.finalizeCode();
    141141}
    142142
     
    177177        emitRestoreScratch(stubJit, needToRestoreScratch, scratchGPR, success, fail, failureCases);
    178178       
    179         LinkBuffer patchBuffer(*globalData, &stubJit, codeBlock->executablePool());
     179        LinkBuffer patchBuffer(*globalData, &stubJit);
    180180       
    181181        linkRestoreScratch(patchBuffer, needToRestoreScratch, stubInfo, success, fail, failureCases);
    182182       
    183         CodeLocationLabel entryLabel = patchBuffer.finalizeCodeAddendum();
    184         stubInfo.stubRoutine = entryLabel;
     183        stubInfo.stubRoutine = patchBuffer.finalizeCode();
    185184       
    186185        RepatchBuffer repatchBuffer(codeBlock);
    187         repatchBuffer.relink(stubInfo.callReturnLocation.jumpAtOffset(stubInfo.deltaCallToStructCheck), entryLabel);
     186        repatchBuffer.relink(stubInfo.callReturnLocation.jumpAtOffset(stubInfo.deltaCallToStructCheck), CodeLocationLabel(stubInfo.stubRoutine.code()));
    188187        repatchBuffer.relink(stubInfo.callReturnLocation, operationGetById);
    189188       
     
    232231   
    233232    RepatchBuffer repatchBuffer(codeBlock);
    234     repatchBuffer.relink(stubInfo.callReturnLocation.jumpAtOffset(stubInfo.deltaCallToStructCheck), stubInfo.stubRoutine);
     233    repatchBuffer.relink(stubInfo.callReturnLocation.jumpAtOffset(stubInfo.deltaCallToStructCheck), CodeLocationLabel(stubInfo.stubRoutine.code()));
    235234    repatchBuffer.relink(stubInfo.callReturnLocation, operationGetByIdProtoBuildList);
    236235   
     
    325324    if (stubInfo.accessType == access_get_by_id_self) {
    326325        ASSERT(!stubInfo.stubRoutine);
    327         polymorphicStructureList = new PolymorphicAccessStructureList(*globalData, codeBlock->ownerExecutable(), stubInfo.callReturnLocation.labelAtOffset(stubInfo.deltaCallToSlowCase), stubInfo.u.getByIdSelf.baseObjectStructure.get());
     326        polymorphicStructureList = new PolymorphicAccessStructureList(*globalData, codeBlock->ownerExecutable(), MacroAssemblerCodeRef::createSelfManagedCodeRef(stubInfo.callReturnLocation.labelAtOffset(stubInfo.deltaCallToSlowCase)), stubInfo.u.getByIdSelf.baseObjectStructure.get());
    328327        stubInfo.initGetByIdSelfList(polymorphicStructureList, 1);
    329328    } else {
     
    351350        MacroAssembler::Jump success = stubJit.jump();
    352351       
    353         LinkBuffer patchBuffer(*globalData, &stubJit, codeBlock->executablePool());
    354        
    355         CodeLocationLabel lastProtoBegin = polymorphicStructureList->list[listIndex - 1].stubRoutine;
     352        LinkBuffer patchBuffer(*globalData, &stubJit);
     353       
     354        CodeLocationLabel lastProtoBegin = CodeLocationLabel(polymorphicStructureList->list[listIndex - 1].stubRoutine.code());
    356355        ASSERT(!!lastProtoBegin);
    357356       
     
    359358        patchBuffer.link(success, stubInfo.callReturnLocation.labelAtOffset(stubInfo.deltaCallToDone));
    360359       
    361         CodeLocationLabel entryLabel = patchBuffer.finalizeCodeAddendum();
    362        
    363         polymorphicStructureList->list[listIndex].set(*globalData, codeBlock->ownerExecutable(), entryLabel, structure);
     360        MacroAssemblerCodeRef stubRoutine = patchBuffer.finalizeCode();
     361       
     362        polymorphicStructureList->list[listIndex].set(*globalData, codeBlock->ownerExecutable(), stubRoutine, structure);
    364363       
    365364        CodeLocationJump jumpLocation = stubInfo.callReturnLocation.jumpAtOffset(stubInfo.deltaCallToStructCheck);
    366365        RepatchBuffer repatchBuffer(codeBlock);
    367         repatchBuffer.relink(jumpLocation, entryLabel);
     366        repatchBuffer.relink(jumpLocation, CodeLocationLabel(stubRoutine.code()));
    368367       
    369368        if (listIndex < (POLYMORPHIC_LIST_CACHE_SIZE - 1))
     
    409408        ASSERT(!!stubInfo.stubRoutine);
    410409        polymorphicStructureList = new PolymorphicAccessStructureList(*globalData, codeBlock->ownerExecutable(), stubInfo.stubRoutine, stubInfo.u.getByIdChain.baseObjectStructure.get(), stubInfo.u.getByIdChain.chain.get());
    411         stubInfo.stubRoutine = CodeLocationLabel();
     410        stubInfo.stubRoutine = MacroAssemblerCodeRef();
    412411        stubInfo.initGetByIdProtoList(polymorphicStructureList, 1);
    413412    } else {
     
    420419        stubInfo.u.getByIdProtoList.listSize++;
    421420       
    422         CodeLocationLabel lastProtoBegin = polymorphicStructureList->list[listIndex - 1].stubRoutine;
     421        CodeLocationLabel lastProtoBegin = CodeLocationLabel(polymorphicStructureList->list[listIndex - 1].stubRoutine.code());
    423422        ASSERT(!!lastProtoBegin);
    424423
    425         CodeLocationLabel entryLabel;
    426        
    427         generateProtoChainAccessStub(exec, stubInfo, prototypeChain, count, offset, structure, stubInfo.callReturnLocation.labelAtOffset(stubInfo.deltaCallToDone), lastProtoBegin, entryLabel);
    428        
    429         polymorphicStructureList->list[listIndex].set(*globalData, codeBlock->ownerExecutable(), entryLabel, structure);
     424        MacroAssemblerCodeRef stubRoutine;
     425       
     426        generateProtoChainAccessStub(exec, stubInfo, prototypeChain, count, offset, structure, stubInfo.callReturnLocation.labelAtOffset(stubInfo.deltaCallToDone), lastProtoBegin, stubRoutine);
     427       
     428        polymorphicStructureList->list[listIndex].set(*globalData, codeBlock->ownerExecutable(), stubRoutine, structure);
    430429       
    431430        CodeLocationJump jumpLocation = stubInfo.callReturnLocation.jumpAtOffset(stubInfo.deltaCallToStructCheck);
    432431        RepatchBuffer repatchBuffer(codeBlock);
    433         repatchBuffer.relink(jumpLocation, entryLabel);
     432        repatchBuffer.relink(jumpLocation, CodeLocationLabel(stubRoutine.code()));
    434433       
    435434        if (listIndex < (POLYMORPHIC_LIST_CACHE_SIZE - 1))
     
    549548                success = stubJit.jump();
    550549           
    551             LinkBuffer patchBuffer(*globalData, &stubJit, codeBlock->executablePool());
     550            LinkBuffer patchBuffer(*globalData, &stubJit);
    552551            patchBuffer.link(success, stubInfo.callReturnLocation.labelAtOffset(stubInfo.deltaCallToDone));
    553552            if (needToRestoreScratch)
     
    556555                patchBuffer.link(failureCases, stubInfo.callReturnLocation.labelAtOffset(stubInfo.deltaCallToSlowCase));
    557556           
    558             CodeLocationLabel entryLabel = patchBuffer.finalizeCodeAddendum();
    559             stubInfo.stubRoutine = entryLabel;
     557            stubInfo.stubRoutine = patchBuffer.finalizeCode();
    560558           
    561559            RepatchBuffer repatchBuffer(codeBlock);
    562             repatchBuffer.relink(stubInfo.callReturnLocation.jumpAtOffset(stubInfo.deltaCallToStructCheck), entryLabel);
     560            repatchBuffer.relink(stubInfo.callReturnLocation.jumpAtOffset(stubInfo.deltaCallToStructCheck), CodeLocationLabel(stubInfo.stubRoutine.code()));
    563561            repatchBuffer.relink(stubInfo.callReturnLocation, appropriatePutByIdFunction(slot, putKind));
    564562           
  • trunk/Source/JavaScriptCore/jit/ExecutableAllocator.cpp

    r87198 r94920  
    2828#include "ExecutableAllocator.h"
    2929
     30#if ENABLE(EXECUTABLE_ALLOCATOR_DEMAND)
     31#include <wtf/MetaAllocator.h>
     32#include <wtf/PageReservation.h>
     33#include <wtf/VMTags.h>
     34#endif
     35
    3036#if ENABLE(ASSEMBLER)
     37
     38using namespace WTF;
    3139
    3240namespace JSC {
     
    3442#if ENABLE(EXECUTABLE_ALLOCATOR_DEMAND)
    3543
    36 ExecutablePool::Allocation ExecutablePool::systemAlloc(size_t size)
     44class DemandExecutableAllocator: public MetaAllocator {
     45public:
     46    DemandExecutableAllocator()
     47        : MetaAllocator(32) // round up all allocations to 32 bytes
     48    {
     49        // Don't preallocate any memory here.
     50    }
     51   
     52    virtual ~DemandExecutableAllocator()
     53    {
     54        for (unsigned i = 0; i < reservations.size(); ++i)
     55            reservations.at(i).deallocate();
     56    }
     57
     58protected:
     59    virtual void* allocateNewSpace(size_t& numPages)
     60    {
     61        size_t newNumPages = (((numPages * pageSize() + JIT_ALLOCATOR_LARGE_ALLOC_SIZE - 1) / JIT_ALLOCATOR_LARGE_ALLOC_SIZE * JIT_ALLOCATOR_LARGE_ALLOC_SIZE) + pageSize() - 1) / pageSize();
     62       
     63        ASSERT(newNumPages >= numPages);
     64       
     65        numPages = newNumPages;
     66       
     67        PageReservation reservation = PageReservation::reserve(numPages * pageSize(), OSAllocator::JSJITCodePages, EXECUTABLE_POOL_WRITABLE, true);
     68        if (!reservation)
     69            CRASH();
     70       
     71        reservations.append(reservation);
     72       
     73        return reservation.base();
     74    }
     75   
     76    virtual void notifyNeedPage(void* page)
     77    {
     78        OSAllocator::commit(page, pageSize(), EXECUTABLE_POOL_WRITABLE, true);
     79    }
     80   
     81    virtual void notifyPageIsFree(void* page)
     82    {
     83        OSAllocator::decommit(page, pageSize());
     84    }
     85
     86private:
     87    Vector<PageReservation, 16> reservations;
     88};
     89
     90static DemandExecutableAllocator* allocator;
     91
     92void ExecutableAllocator::initializeAllocator()
    3793{
    38     PageAllocation allocation = PageAllocation::allocate(size, OSAllocator::JSJITCodePages, EXECUTABLE_POOL_WRITABLE, true);
    39     if (!allocation)
    40         CRASH();
    41     return allocation;
     94    ASSERT(!allocator);
     95    allocator = new DemandExecutableAllocator();
    4296}
    4397
    44 void ExecutablePool::systemRelease(ExecutablePool::Allocation& allocation)
     98ExecutableAllocator::ExecutableAllocator(JSGlobalData&)
    4599{
    46     allocation.deallocate();
     100    ASSERT(allocator);
    47101}
    48102
     
    51105    return true;
    52106}
    53    
     107
    54108bool ExecutableAllocator::underMemoryPressure()
    55109{
    56110    return false;
    57111}
    58    
     112
     113PassRefPtr<ExecutableMemoryHandle> ExecutableAllocator::allocate(JSGlobalData&, size_t sizeInBytes)
     114{
     115    RefPtr<ExecutableMemoryHandle> result = allocator->allocate(sizeInBytes);
     116    if (!result)
     117        CRASH();
     118    return result.release();
     119}
     120
    59121size_t ExecutableAllocator::committedByteCount()
    60122{
    61     return 0;
    62 } 
     123    return allocator->bytesCommitted();
     124}
    63125
     126#if ENABLE(META_ALLOCATOR_PROFILE)
     127void ExecutableAllocator::dumpProfile()
     128{
     129    allocator->dumpProfile();
     130}
    64131#endif
     132
     133#endif // ENABLE(EXECUTABLE_ALLOCATOR_DEMAND)
    65134
    66135#if ENABLE(ASSEMBLER_WX_EXCLUSIVE)
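
The reservation growth in DemandExecutableAllocator::allocateNewSpace above rounds each request up to a large chunk before converting back to whole pages. A worked example, assuming 4 KB pages and taking JIT_ALLOCATOR_LARGE_ALLOC_SIZE to be 64 KB purely for illustration (the real value is platform-dependent):

    #include <cstddef>
    #include <cstdio>

    int main()
    {
        const std::size_t pageSize = 4096;            // assumed 4 KB pages
        const std::size_t largeAllocSize = 64 * 1024; // illustrative value only
        std::size_t numPages = 3;                     // caller asks for 3 pages (12 KB)

        // Same rounding as DemandExecutableAllocator::allocateNewSpace: round the
        // byte size up to a multiple of the large-allocation chunk, then convert
        // back to a whole number of pages.
        std::size_t newNumPages = (((numPages * pageSize + largeAllocSize - 1)
                                    / largeAllocSize * largeAllocSize)
                                   + pageSize - 1) / pageSize;

        std::printf("%zu pages requested -> %zu pages reserved\n", numPages, newNumPages);
        // Prints: 3 pages requested -> 16 pages reserved
        return 0;
    }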
  • trunk/Source/JavaScriptCore/jit/ExecutableAllocator.h

    r93462 r94920  
    2929#include <limits>
    3030#include <wtf/Assertions.h>
     31#include <wtf/MetaAllocatorHandle.h>
    3132#include <wtf/PageAllocation.h>
    3233#include <wtf/PassRefPtr.h>
     
    103104namespace JSC {
    104105
    105 class ExecutablePool : public RefCounted<ExecutablePool> {
    106 public:
    107 #if ENABLE(EXECUTABLE_ALLOCATOR_DEMAND)
    108     typedef PageAllocation Allocation;
    109 #else
    110     class Allocation {
    111     public:
    112         Allocation(void* base, size_t size)
    113             : m_base(base)
    114             , m_size(size)
    115         {
    116         }
    117         void* base() { return m_base; }
    118         size_t size() { return m_size; }
    119         bool operator!() const { return !m_base; }
    120 
    121     private:
    122         void* m_base;
    123         size_t m_size;
    124     };
    125 #endif
    126     typedef Vector<Allocation, 2> AllocationList;
    127 
    128     static PassRefPtr<ExecutablePool> create(JSGlobalData& globalData, size_t n)
    129     {
    130         return adoptRef(new ExecutablePool(globalData, n));
    131     }
    132 
    133     void* alloc(JSGlobalData& globalData, size_t n)
    134     {
    135         ASSERT(m_freePtr <= m_end);
    136 
    137         // Round 'n' up to a multiple of word size; if all allocations are of
    138         // word sized quantities, then all subsequent allocations will be aligned.
    139         n = roundUpAllocationSize(n, sizeof(void*));
    140 
    141         if (static_cast<ptrdiff_t>(n) < (m_end - m_freePtr)) {
    142             void* result = m_freePtr;
    143             m_freePtr += n;
    144             return result;
    145         }
    146 
    147         // Insufficient space to allocate in the existing pool
    148         // so we need allocate into a new pool
    149         return poolAllocate(globalData, n);
    150     }
    151    
    152     void tryShrink(void* allocation, size_t oldSize, size_t newSize)
    153     {
    154         if (static_cast<char*>(allocation) + oldSize != m_freePtr)
    155             return;
    156         m_freePtr = static_cast<char*>(allocation) + roundUpAllocationSize(newSize, sizeof(void*));
    157     }
    158 
    159     ~ExecutablePool()
    160     {
    161         AllocationList::iterator end = m_pools.end();
    162         for (AllocationList::iterator ptr = m_pools.begin(); ptr != end; ++ptr)
    163             ExecutablePool::systemRelease(*ptr);
    164     }
    165 
    166     size_t available() const { return (m_pools.size() > 1) ? 0 : m_end - m_freePtr; }
    167 
    168 private:
    169     static Allocation systemAlloc(size_t n);
    170     static void systemRelease(Allocation& alloc);
    171 
    172     ExecutablePool(JSGlobalData&, size_t n);
    173 
    174     void* poolAllocate(JSGlobalData&, size_t n);
    175 
    176     char* m_freePtr;
    177     char* m_end;
    178     AllocationList m_pools;
    179 };
     106typedef WTF::MetaAllocatorHandle ExecutableMemoryHandle;
    180107
    181108class ExecutableAllocator {
     
    183110
    184111public:
    185     ExecutableAllocator(JSGlobalData& globalData)
    186     {
    187         if (isValid())
    188             m_smallAllocationPool = ExecutablePool::create(globalData, JIT_ALLOCATOR_LARGE_ALLOC_SIZE);
    189 #if !ENABLE(INTERPRETER)
    190         else
    191             CRASH();
    192 #endif
    193     }
     112    ExecutableAllocator(JSGlobalData&);
     113   
     114    static void initializeAllocator();
    194115
    195116    bool isValid() const;
    196117
    197118    static bool underMemoryPressure();
    198 
    199     PassRefPtr<ExecutablePool> poolForSize(JSGlobalData& globalData, size_t n)
    200     {
    201         // Try to fit in the existing small allocator
    202         ASSERT(m_smallAllocationPool);
    203         if (n < m_smallAllocationPool->available())
    204             return m_smallAllocationPool;
    205 
    206         // If the request is large, we just provide a unshared allocator
    207         if (n > JIT_ALLOCATOR_LARGE_ALLOC_SIZE)
    208             return ExecutablePool::create(globalData, n);
    209 
    210         // Create a new allocator
    211         RefPtr<ExecutablePool> pool = ExecutablePool::create(globalData, JIT_ALLOCATOR_LARGE_ALLOC_SIZE);
    212 
    213         // If the new allocator will result in more free space than in
    214         // the current small allocator, then we will use it instead
    215         if ((pool->available() - n) > m_smallAllocationPool->available())
    216             m_smallAllocationPool = pool;
    217         return pool.release();
    218     }
     119   
     120#if ENABLE(META_ALLOCATOR_PROFILE)
     121    static void dumpProfile();
     122#else
     123    static void dumpProfile() { }
     124#endif
     125
     126    PassRefPtr<ExecutableMemoryHandle> allocate(JSGlobalData&, size_t sizeInBytes);
    219127
    220128#if ENABLE(ASSEMBLER_WX_EXCLUSIVE)
     
    349257    static void reprotectRegion(void*, size_t, ProtectionSetting);
    350258#endif
    351 
    352     RefPtr<ExecutablePool> m_smallAllocationPool;
    353259};
    354260
    355 inline ExecutablePool::ExecutablePool(JSGlobalData& globalData, size_t n)
    356 {
    357     size_t allocSize = roundUpAllocationSize(n, pageSize());
    358     Allocation mem = systemAlloc(allocSize);
    359     if (!mem.base()) {
    360         releaseExecutableMemory(globalData);
    361         mem = systemAlloc(allocSize);
    362     }
    363     m_pools.append(mem);
    364     m_freePtr = static_cast<char*>(mem.base());
    365     if (!m_freePtr)
    366         CRASH(); // Failed to allocate
    367     m_end = m_freePtr + allocSize;
    368     deprecatedTurnOffVerifier();
    369 }
    370 
    371 inline void* ExecutablePool::poolAllocate(JSGlobalData& globalData, size_t n)
    372 {
    373     size_t allocSize = roundUpAllocationSize(n, pageSize());
    374    
    375     Allocation result = systemAlloc(allocSize);
    376     if (!result.base()) {
    377         releaseExecutableMemory(globalData);
    378         result = systemAlloc(allocSize);
    379         if (!result.base())
    380             CRASH(); // Failed to allocate
    381     }
    382    
    383     ASSERT(m_end >= m_freePtr);
    384     if ((allocSize - n) > static_cast<size_t>(m_end - m_freePtr)) {
    385         // Replace allocation pool
    386         m_freePtr = static_cast<char*>(result.base()) + n;
    387         m_end = static_cast<char*>(result.base()) + allocSize;
    388     }
    389 
    390     m_pools.append(result);
    391     return result.base();
    392 }
    393 
    394 }
     261} // namespace JSC
    395262
    396263#endif // ENABLE(JIT) && ENABLE(ASSEMBLER)
  • trunk/Source/JavaScriptCore/jit/ExecutableAllocatorFixedVMPool.cpp

    r87527 r94920  
    3232#include <errno.h>
    3333
    34 #include "TCSpinLock.h"
    3534#include <sys/mman.h>
    3635#include <unistd.h>
    37 #include <wtf/AVLTree.h>
     36#include <wtf/MetaAllocator.h>
    3837#include <wtf/PageReservation.h>
    3938#include <wtf/VMTags.h>
     
    4746namespace JSC {
    4847   
    49 #define TwoPow(n) (1ull << n)
     48#if CPU(ARM)
     49static const size_t fixedPoolSize = 16 * 1024 * 1024;
     50#elif CPU(X86_64)
     51static const size_t fixedPoolSize = 1024 * 1024 * 1024;
     52#else
     53static const size_t fixedPoolSize = 32 * 1024 * 1024;
     54#endif
    5055
    51 class AllocationTableSizeClass {
     56class FixedVMPoolExecutableAllocator: public MetaAllocator {
    5257public:
    53     AllocationTableSizeClass(size_t size, size_t blockSize, unsigned log2BlockSize)
    54         : m_blockSize(blockSize)
     58    FixedVMPoolExecutableAllocator()
     59        : MetaAllocator(32) // round up all allocations to 32 bytes
    5560    {
    56         ASSERT(blockSize == TwoPow(log2BlockSize));
    57 
    58         // Calculate the number of blocks needed to hold size.
    59         size_t blockMask = blockSize - 1;
    60         m_blockCount = (size + blockMask) >> log2BlockSize;
    61 
    62         // Align to the smallest power of two >= m_blockCount.
    63         m_blockAlignment = 1;
    64         while (m_blockAlignment < m_blockCount)
    65             m_blockAlignment += m_blockAlignment;
     61        m_reservation = PageReservation::reserveWithGuardPages(fixedPoolSize, OSAllocator::JSJITCodePages, EXECUTABLE_POOL_WRITABLE, true);
     62#if !ENABLE(INTERPRETER)
     63        if (!m_reservation)
     64            CRASH();
     65#endif
     66        if (m_reservation) {
     67            ASSERT(m_reservation.size() == fixedPoolSize);
     68            addFreshFreeSpace(m_reservation.base(), m_reservation.size());
     69        }
    6670    }
    67 
    68     size_t blockSize() const { return m_blockSize; }
    69     size_t blockCount() const { return m_blockCount; }
    70     size_t blockAlignment() const { return m_blockAlignment; }
    71 
    72     size_t size()
     71   
     72protected:
     73    virtual void* allocateNewSpace(size_t&)
    7374    {
    74         return m_blockSize * m_blockCount;
     75        // We're operating in a fixed pool, so new allocation is always prohibited.
     76        return 0;
     77    }
     78   
     79    virtual void notifyNeedPage(void* page)
     80    {
     81        m_reservation.commit(page, pageSize());
     82    }
     83   
     84    virtual void notifyPageIsFree(void* page)
     85    {
     86        m_reservation.decommit(page, pageSize());
    7587    }
    7688
    7789private:
    78     size_t m_blockSize;
    79     size_t m_blockCount;
    80     size_t m_blockAlignment;
     90    PageReservation m_reservation;
    8191};
    8292
    83 template<unsigned log2Entries>
    84 class AllocationTableLeaf {
    85     typedef uint64_t BitField;
     93static FixedVMPoolExecutableAllocator* allocator;
    8694
    87 public:
    88     static const unsigned log2SubregionSize = 12; // 2^12 == pagesize
    89     static const unsigned log2RegionSize = log2SubregionSize + log2Entries;
     95void ExecutableAllocator::initializeAllocator()
     96{
     97    ASSERT(!allocator);
     98    allocator = new FixedVMPoolExecutableAllocator();
     99}
    90100
    91     static const size_t subregionSize = TwoPow(log2SubregionSize);
    92     static const size_t regionSize = TwoPow(log2RegionSize);
    93     static const unsigned entries = TwoPow(log2Entries);
    94     COMPILE_ASSERT(entries <= (sizeof(BitField) * 8), AllocationTableLeaf_entries_fit_in_BitField);
    95 
    96     AllocationTableLeaf()
    97         : m_allocated(0)
    98     {
    99     }
    100 
    101     ~AllocationTableLeaf()
    102     {
    103         ASSERT(isEmpty());
    104     }
    105 
    106     size_t allocate(AllocationTableSizeClass& sizeClass)
    107     {
    108         ASSERT(sizeClass.blockSize() == subregionSize);
    109         ASSERT(!isFull());
    110 
    111         size_t alignment = sizeClass.blockAlignment();
    112         size_t count = sizeClass.blockCount();
    113         // Use this mask to check for spans of free blocks.
    114         BitField mask = ((1ull << count) - 1) << (alignment - count);
    115 
    116         // Step in units of alignment size.
    117         for (unsigned i = 0; i < entries; i += alignment) {
    118             if (!(m_allocated & mask)) {
    119                 m_allocated |= mask;
    120                 return (i + (alignment - count)) << log2SubregionSize;
    121             }
    122             mask <<= alignment;
    123         }
    124         return notFound;
    125     }
    126 
    127     void free(size_t location, AllocationTableSizeClass& sizeClass)
    128     {
    129         ASSERT(sizeClass.blockSize() == subregionSize);
    130 
    131         size_t entry = location >> log2SubregionSize;
    132         size_t count = sizeClass.blockCount();
    133         BitField mask = ((1ull << count) - 1) << entry;
    134 
    135         ASSERT((m_allocated & mask) == mask);
    136         m_allocated &= ~mask;
    137     }
    138 
    139     bool isEmpty()
    140     {
    141         return !m_allocated;
    142     }
    143 
    144     bool isFull()
    145     {
    146         return !~m_allocated;
    147     }
    148 
    149     static size_t size()
    150     {
    151         return regionSize;
    152     }
    153 
    154     static AllocationTableSizeClass classForSize(size_t size)
    155     {
    156         return AllocationTableSizeClass(size, subregionSize, log2SubregionSize);
    157     }
    158 
    159 #ifndef NDEBUG
    160     void dump(size_t parentOffset = 0, unsigned indent = 0)
    161     {
    162         for (unsigned i = 0; i < indent; ++i)
    163             fprintf(stderr, "    ");
    164         fprintf(stderr, "%08x: [%016llx]\n", (int)parentOffset, m_allocated);
    165     }
    166 #endif
    167 
    168 private:
    169     BitField m_allocated;
    170 };
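
The removed allocate() above searches for a run of free blocks with a sliding bit mask. A small standalone illustration of that arithmetic (the values are chosen for the example; the real code additionally shifts the block index by log2SubregionSize to produce a byte offset):

    #include <cstdint>
    #include <cstdio>

    int main()
    {
        const unsigned count = 3, alignment = 4, entries = 64;
        uint64_t allocated = 0xF; // pretend the first four blocks are already taken
        // ((1ull << 3) - 1) << (4 - 3) == 0b1110: the top three blocks of the first window.
        uint64_t mask = ((1ull << count) - 1) << (alignment - count);
        for (unsigned i = 0; i < entries; i += alignment) {
            if (!(allocated & mask)) {
                allocated |= mask;
                std::printf("placed %u blocks at block index %u\n", count, i + (alignment - count));
                return 0;
            }
            mask <<= alignment; // slide the mask one whole alignment window to the left
        }
        std::printf("no free span\n");
        return 0;
    }
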
    171 
    172 
    173 template<class NextLevel>
    174 class LazyAllocationTable {
    175 public:
    176     static const unsigned log2RegionSize = NextLevel::log2RegionSize;
    177     static const unsigned entries = NextLevel::entries;
    178 
    179     LazyAllocationTable()
    180         : m_ptr(0)
    181     {
    182     }
    183 
    184     ~LazyAllocationTable()
    185     {
    186         ASSERT(isEmpty());
    187     }
    188 
    189     size_t allocate(AllocationTableSizeClass& sizeClass)
    190     {
    191         if (!m_ptr)
    192             m_ptr = new NextLevel();
    193         return m_ptr->allocate(sizeClass);
    194     }
    195 
    196     void free(size_t location, AllocationTableSizeClass& sizeClass)
    197     {
    198         ASSERT(m_ptr);
    199         m_ptr->free(location, sizeClass);
    200         if (m_ptr->isEmpty()) {
    201             delete m_ptr;
    202             m_ptr = 0;
    203         }
    204     }
    205 
    206     bool isEmpty()
    207     {
    208         return !m_ptr;
    209     }
    210 
    211     bool isFull()
    212     {
    213         return m_ptr && m_ptr->isFull();
    214     }
    215 
    216     static size_t size()
    217     {
    218         return NextLevel::size();
    219     }
    220 
    221 #ifndef NDEBUG
    222     void dump(size_t parentOffset = 0, unsigned indent = 0)
    223     {
    224         ASSERT(m_ptr);
    225         m_ptr->dump(parentOffset, indent);
    226     }
    227 #endif
    228 
    229     static AllocationTableSizeClass classForSize(size_t size)
    230     {
    231         return NextLevel::classForSize(size);
    232     }
    233 
    234 private:
    235     NextLevel* m_ptr;
    236 };
    237 
    238 template<class NextLevel, unsigned log2Entries>
    239 class AllocationTableDirectory {
    240     typedef uint64_t BitField;
    241 
    242 public:
    243     static const unsigned log2SubregionSize = NextLevel::log2RegionSize;
    244     static const unsigned log2RegionSize = log2SubregionSize + log2Entries;
    245 
    246     static const size_t subregionSize = TwoPow(log2SubregionSize);
    247     static const size_t regionSize = TwoPow(log2RegionSize);
    248     static const unsigned entries = TwoPow(log2Entries);
    249     COMPILE_ASSERT(entries <= (sizeof(BitField) * 8), AllocationTableDirectory_entries_fit_in_BitField);
    250 
    251     AllocationTableDirectory()
    252         : m_full(0)
    253         , m_hasSuballocation(0)
    254     {
    255     }
    256 
    257     ~AllocationTableDirectory()
    258     {
    259         ASSERT(isEmpty());
    260     }
    261 
    262     size_t allocate(AllocationTableSizeClass& sizeClass)
    263     {
    264         ASSERT(sizeClass.blockSize() <= subregionSize);
    265         ASSERT(!isFull());
    266 
    267         if (sizeClass.blockSize() < subregionSize) {
    268             BitField bit = 1;
    269             for (unsigned i = 0; i < entries; ++i, bit += bit) {
    270                 if (m_full & bit)
    271                     continue;
    272                 size_t location = m_suballocations[i].allocate(sizeClass);
    273                 if (location != notFound) {
    274                     // If this didn't already have a subregion, it does now!
    275                     m_hasSuballocation |= bit;
    276                     // Mirror the suballocation's full bit.
    277                     if (m_suballocations[i].isFull())
    278                         m_full |= bit;
    279                     return (i * subregionSize) | location;
    280                 }
    281             }
    282             return notFound;
    283         }
    284 
    285         // A block is allocated if either it is fully allocated or contains suballocations.
    286         BitField allocated = m_full | m_hasSuballocation;
    287 
    288         size_t alignment = sizeClass.blockAlignment();
    289         size_t count = sizeClass.blockCount();
    290         // Use this mask to check for spans of free blocks.
    291         BitField mask = ((1ull << count) - 1) << (alignment - count);
    292 
    293         // Step in units of alignment size.
    294         for (unsigned i = 0; i < entries; i += alignment) {
    295             if (!(allocated & mask)) {
    296                 m_full |= mask;
    297                 return (i + (alignment - count)) << log2SubregionSize;
    298             }
    299             mask <<= alignment;
    300         }
    301         return notFound;
    302     }
    303 
    304     void free(size_t location, AllocationTableSizeClass& sizeClass)
    305     {
    306         ASSERT(sizeClass.blockSize() <= subregionSize);
    307 
    308         size_t entry = location >> log2SubregionSize;
    309 
    310         if (sizeClass.blockSize() < subregionSize) {
    311             BitField bit = 1ull << entry;
    312             m_suballocations[entry].free(location & (subregionSize - 1), sizeClass);
    313             // Check if the suballocation is now empty.
    314             if (m_suballocations[entry].isEmpty())
    315                 m_hasSuballocation &= ~bit;
    316             // No need to check, it clearly isn't full any more!
    317             m_full &= ~bit;
    318         } else {
    319             size_t count = sizeClass.blockCount();
    320             BitField mask = ((1ull << count) - 1) << entry;
    321             ASSERT((m_full & mask) == mask);
    322             ASSERT(!(m_hasSuballocation & mask));
    323             m_full &= ~mask;
    324         }
    325     }
    326 
    327     bool isEmpty()
    328     {
    329         return !(m_full | m_hasSuballocation);
    330     }
    331 
    332     bool isFull()
    333     {   
    334         return !~m_full;
    335     }
    336 
    337     static size_t size()
    338     {
    339         return regionSize;
    340     }
    341 
    342     static AllocationTableSizeClass classForSize(size_t size)
    343     {
    344         if (size < subregionSize) {
    345             AllocationTableSizeClass sizeClass = NextLevel::classForSize(size);
    346             if (sizeClass.size() < NextLevel::size())
    347                 return sizeClass;
    348         }
    349         return AllocationTableSizeClass(size, subregionSize, log2SubregionSize);
    350     }
    351 
    352 #ifndef NDEBUG
    353     void dump(size_t parentOffset = 0, unsigned indent = 0)
    354     {
    355         for (unsigned i = 0; i < indent; ++i)
    356             fprintf(stderr, "    ");
    357         fprintf(stderr, "%08x: [", (int)parentOffset);
    358         for (unsigned i = 0; i < entries; ++i) {
    359             BitField bit = 1ull << i;
    360             char c = m_hasSuballocation & bit
    361                 ? (m_full & bit ? 'N' : 'n')
    362                 : (m_full & bit ? 'F' : '-');
    363             fprintf(stderr, "%c", c);
    364         }
    365         fprintf(stderr, "]\n");
    366 
    367         for (unsigned i = 0; i < entries; ++i) {
    368             BitField bit = 1ull << i;
    369             size_t offset = parentOffset | (subregionSize * i);
    370             if (m_hasSuballocation & bit)
    371                 m_suballocations[i].dump(offset, indent + 1);
    372         }
    373     }
    374 #endif
    375 
    376 private:
    377     NextLevel m_suballocations[entries];
    378     // Subregions exist in one of four states:
    379     // (1) empty (both bits clear)
    380     // (2) fully allocated as a single allocation (m_full set)
    381     // (3) partially allocated through suballocations (m_hasSuballocation set)
    382     // (4) fully allocated through suballocations (both bits set)
    383     BitField m_full;
    384     BitField m_hasSuballocation;
    385 };
    386 
    387 
    388 typedef AllocationTableLeaf<6> PageTables256KB;
    389 typedef AllocationTableDirectory<PageTables256KB, 6> PageTables16MB;
    390 typedef AllocationTableDirectory<LazyAllocationTable<PageTables16MB>, 1> PageTables32MB;
    391 typedef AllocationTableDirectory<LazyAllocationTable<PageTables16MB>, 6> PageTables1GB;
    392 
    393 #if CPU(ARM)
    394 typedef PageTables16MB FixedVMPoolPageTables;
    395 #elif CPU(X86_64)
    396 typedef PageTables1GB FixedVMPoolPageTables;
    397 #else
    398 typedef PageTables32MB FixedVMPoolPageTables;
    399 #endif
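
For reference, the size arithmetic behind the removed page-table typedefs above, mirroring the ASSERTs in the old FixedVMPoolAllocator constructor below (standalone sketch, constant names illustrative):

    #include <cstddef>

    constexpr size_t KB = 1024, MB = 1024 * KB, GB = 1024 * MB;

    constexpr size_t pageTables256KB = (1 << 12) * (1 << 6);      // leaf: 64 x 4KB pages
    constexpr size_t pageTables16MB  = pageTables256KB * (1 << 6); // directory of 64 leaves
    constexpr size_t pageTables32MB  = pageTables16MB * (1 << 1);  // directory of 2 lazy 16MB tables
    constexpr size_t pageTables1GB   = pageTables16MB * (1 << 6);  // directory of 64 lazy 16MB tables

    static_assert(pageTables256KB == 256 * KB, "leaf size");
    static_assert(pageTables16MB == 16 * MB, "16MB directory");
    static_assert(pageTables32MB == 32 * MB, "32MB directory");
    static_assert(pageTables1GB == 1 * GB, "1GB directory");

    int main() { return 0; }
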
    400 
    401 
    402 class FixedVMPoolAllocator
     101ExecutableAllocator::ExecutableAllocator(JSGlobalData&)
    403102{
    404 public:
    405     FixedVMPoolAllocator()
    406     {
    407         ASSERT(PageTables256KB::size() == 256 * 1024);
    408         ASSERT(PageTables16MB::size() == 16 * 1024 * 1024);
    409         ASSERT(PageTables32MB::size() == 32 * 1024 * 1024);
    410         ASSERT(PageTables1GB::size() == 1024 * 1024 * 1024);
    411 
    412         m_reservation = PageReservation::reserveWithGuardPages(FixedVMPoolPageTables::size(), OSAllocator::JSJITCodePages, EXECUTABLE_POOL_WRITABLE, true);
    413 #if !ENABLE(INTERPRETER)
    414         if (!isValid())
    415             CRASH();
    416 #endif
    417     }
    418  
    419     ExecutablePool::Allocation alloc(size_t requestedSize)
    420     {
    421         ASSERT(requestedSize);
    422         AllocationTableSizeClass sizeClass = classForSize(requestedSize);
    423         size_t size = sizeClass.size();
    424         ASSERT(size);
    425 
    426         if (size >= FixedVMPoolPageTables::size())
    427             return ExecutablePool::Allocation(0, 0);
    428         if (m_pages.isFull())
    429             return ExecutablePool::Allocation(0, 0);
    430 
    431         size_t offset = m_pages.allocate(sizeClass);
    432         if (offset == notFound)
    433             return ExecutablePool::Allocation(0, 0);
    434 
    435         void* pointer = offsetToPointer(offset);
    436         m_reservation.commit(pointer, size);
    437         return ExecutablePool::Allocation(pointer, size);
    438     }
    439 
    440     void free(ExecutablePool::Allocation allocation)
    441     {
    442         void* pointer = allocation.base();
    443         size_t size = allocation.size();
    444         ASSERT(size);
    445 
    446         m_reservation.decommit(pointer, size);
    447 
    448         AllocationTableSizeClass sizeClass = classForSize(size);
    449         ASSERT(sizeClass.size() == size);
    450         m_pages.free(pointerToOffset(pointer), sizeClass);
    451     }
    452 
    453     size_t allocated()
    454     {
    455         return m_reservation.committed();
    456     }
    457 
    458     bool isValid() const
    459     {
    460         return !!m_reservation;
    461     }
    462 
    463 private:
    464     AllocationTableSizeClass classForSize(size_t size)
    465     {
    466         return FixedVMPoolPageTables::classForSize(size);
    467     }
    468 
    469     void* offsetToPointer(size_t offset)
    470     {
    471         return reinterpret_cast<void*>(reinterpret_cast<intptr_t>(m_reservation.base()) + offset);
    472     }
    473 
    474     size_t pointerToOffset(void* pointer)
    475     {
    476         return reinterpret_cast<intptr_t>(pointer) - reinterpret_cast<intptr_t>(m_reservation.base());
    477     }
    478 
    479     PageReservation m_reservation;
    480     FixedVMPoolPageTables m_pages;
    481 };
    482 
    483 
    484 static SpinLock spinlock = SPINLOCK_INITIALIZER;
    485 static FixedVMPoolAllocator* allocator = 0;
    486 
    487 
    488 size_t ExecutableAllocator::committedByteCount()
    489 {
    490     SpinLockHolder lockHolder(&spinlock);
    491     return allocator ? allocator->allocated() : 0;
    492 }   
     103    ASSERT(allocator);
     104}
    493105
    494106bool ExecutableAllocator::isValid() const
    495107{
    496     SpinLockHolder lock_holder(&spinlock);
    497     if (!allocator)
    498         allocator = new FixedVMPoolAllocator();
    499     return allocator->isValid();
     108    return !!allocator->bytesReserved();
    500109}
    501110
    502111bool ExecutableAllocator::underMemoryPressure()
    503112{
    504     // Technically we should take the spin lock here, but we don't care if we get stale data.
    505     // This is only really a heuristic anyway.
    506     return allocator && (allocator->allocated() > (FixedVMPoolPageTables::size() / 2));
     113    MetaAllocator::Statistics statistics = allocator->currentStatistics();
     114    return statistics.bytesAllocated > statistics.bytesReserved / 2;
    507115}
    508116
    509 ExecutablePool::Allocation ExecutablePool::systemAlloc(size_t size)
     117PassRefPtr<ExecutableMemoryHandle> ExecutableAllocator::allocate(JSGlobalData& globalData, size_t sizeInBytes)
    510118{
    511     SpinLockHolder lock_holder(&spinlock);
    512     ASSERT(allocator);
    513     return allocator->alloc(size);
     119    RefPtr<ExecutableMemoryHandle> result = allocator->allocate(sizeInBytes);
     120    if (!result) {
     121        releaseExecutableMemory(globalData);
     122        result = allocator->allocate(sizeInBytes);
     123        if (!result)
     124            CRASH();
     125    }
     126    return result.release();
    514127}
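
For context, a minimal sketch of how a caller sees the new allocation path (it assumes the JSC headers and a live JSGlobalData; the helper name allocateOneChunk is illustrative, not part of the patch):

    #include "ExecutableAllocator.h"
    #include "JSGlobalData.h"

    PassRefPtr<JSC::ExecutableMemoryHandle> allocateOneChunk(JSC::JSGlobalData& globalData)
    {
        // ExecutableAllocator::initializeAllocator() must already have run once at startup.
        RefPtr<JSC::ExecutableMemoryHandle> chunk =
            globalData.executableAllocator.allocate(globalData, 4096);
        // The handle is reference counted; once the last RefPtr to it goes away,
        // this chunk alone can be returned to the MetaAllocator-managed pool.
        return chunk.release();
    }
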
    515128
    516 void ExecutablePool::systemRelease(ExecutablePool::Allocation& allocation)
     129size_t ExecutableAllocator::committedByteCount()
    517130{
    518     SpinLockHolder lock_holder(&spinlock);
    519     ASSERT(allocator);
    520     allocator->free(allocation);
     131    return allocator->bytesCommitted();
    521132}
     133
     134#if ENABLE(META_ALLOCATOR_PROFILE)
     135void ExecutableAllocator::dumpProfile()
     136{
     137    allocator->dumpProfile();
     138}
     139#endif
    522140
    523141}
    524142
    525143
    526 #endif // HAVE(ASSEMBLER)
     144#endif // ENABLE(EXECUTABLE_ALLOCATOR_FIXED)
  • trunk/Source/JavaScriptCore/jit/JIT.cpp

    r94802 r94920  
    571571    ASSERT(m_jmpTable.isEmpty());
    572572
    573     LinkBuffer patchBuffer(*m_globalData, this, m_globalData->executableAllocator);
     573    LinkBuffer patchBuffer(*m_globalData, this);
    574574
    575575    // Translate vPC offsets into addresses in JIT generated code, for switch tables.
  • trunk/Source/JavaScriptCore/jit/JIT.h

    r94802 r94920  
    222222        }
    223223
    224         static void compileCTIMachineTrampolines(JSGlobalData* globalData, RefPtr<ExecutablePool>* executablePool, TrampolineStructure *trampolines)
     224        static PassRefPtr<ExecutableMemoryHandle> compileCTIMachineTrampolines(JSGlobalData* globalData, TrampolineStructure *trampolines)
    225225        {
    226226            if (!globalData->canUseJIT())
    227                 return;
     227                return 0;
    228228            JIT jit(globalData, 0);
    229             jit.privateCompileCTIMachineTrampolines(executablePool, globalData, trampolines);
    230         }
    231 
    232         static CodePtr compileCTINativeCall(JSGlobalData* globalData, PassRefPtr<ExecutablePool> executablePool, NativeFunction func)
     229            return jit.privateCompileCTIMachineTrampolines(globalData, trampolines);
     230        }
     231
     232        static CodeRef compileCTINativeCall(JSGlobalData* globalData, NativeFunction func)
    233233        {
    234234            if (!globalData->canUseJIT())
    235                 return CodePtr();
     235                return CodeRef();
    236236            JIT jit(globalData, 0);
    237             return jit.privateCompileCTINativeCall(executablePool, globalData, func);
     237            return jit.privateCompileCTINativeCall(globalData, func);
    238238        }
    239239
     
    275275        void privateCompilePutByIdTransition(StructureStubInfo*, Structure*, Structure*, size_t cachedOffset, StructureChain*, ReturnAddressPtr returnAddress, bool direct);
    276276
    277         void privateCompileCTIMachineTrampolines(RefPtr<ExecutablePool>* executablePool, JSGlobalData* data, TrampolineStructure *trampolines);
     277        PassRefPtr<ExecutableMemoryHandle> privateCompileCTIMachineTrampolines(JSGlobalData*, TrampolineStructure*);
    278278        Label privateCompileCTINativeCall(JSGlobalData*, bool isConstruct = false);
    279         CodePtr privateCompileCTINativeCall(PassRefPtr<ExecutablePool> executablePool, JSGlobalData* data, NativeFunction func);
     279        CodeRef privateCompileCTINativeCall(JSGlobalData*, NativeFunction);
    280280        void privateCompilePatchGetArrayLength(ReturnAddressPtr returnAddress);
    281281
     
    10511051#endif
    10521052        WeakRandom m_randomGenerator;
    1053         static CodePtr stringGetByValStubGenerator(JSGlobalData* globalData, ExecutablePool* pool);
     1053        static CodeRef stringGetByValStubGenerator(JSGlobalData*);
    10541054       
    10551055#if ENABLE(TIERED_COMPILATION)
  • trunk/Source/JavaScriptCore/jit/JITCode.h

    r94559 r94920  
    8484        bool operator !() const
    8585        {
    86             return !m_ref.m_code.executableAddress();
     86            return !m_ref;
    8787        }
    8888
    8989        CodePtr addressForCall()
    9090        {
    91             return m_ref.m_code;
     91            return m_ref.code();
    9292        }
    9393
     
    9797        unsigned offsetOf(void* pointerIntoCode)
    9898        {
    99             intptr_t result = reinterpret_cast<intptr_t>(pointerIntoCode) - reinterpret_cast<intptr_t>(m_ref.m_code.executableAddress());
     99            intptr_t result = reinterpret_cast<intptr_t>(pointerIntoCode) - reinterpret_cast<intptr_t>(m_ref.code().executableAddress());
    100100            ASSERT(static_cast<intptr_t>(static_cast<unsigned>(result)) == result);
    101101            return static_cast<unsigned>(result);
     
    105105        inline JSValue execute(RegisterFile* registerFile, CallFrame* callFrame, JSGlobalData* globalData)
    106106        {
    107             JSValue result = JSValue::decode(ctiTrampoline(m_ref.m_code.executableAddress(), registerFile, callFrame, 0, Profiler::enabledProfilerReference(), globalData));
     107            JSValue result = JSValue::decode(ctiTrampoline(m_ref.code().executableAddress(), registerFile, callFrame, 0, Profiler::enabledProfilerReference(), globalData));
    108108            return globalData->exception ? jsNull() : result;
    109109        }
     
    111111        void* start()
    112112        {
    113             return m_ref.m_code.dataLocation();
     113            return m_ref.code().dataLocation();
    114114        }
    115115
    116116        size_t size()
    117117        {
    118             ASSERT(m_ref.m_code.executableAddress());
    119             return m_ref.m_size;
     118            ASSERT(m_ref.code().executableAddress());
     119            return m_ref.size();
    120120        }
    121121
    122         ExecutablePool* getExecutablePool()
     122        ExecutableMemoryHandle* getExecutableMemory()
    123123        {
    124             return m_ref.m_executablePool.get();
     124            return m_ref.executableMemory();
    125125        }
    126126       
     
     132132        // Host functions are a bit special; they have an m_code pointer but they
     133133        // do not individually ref the executable pool containing the trampoline.
    134         static JITCode HostFunction(CodePtr code)
     134        static JITCode HostFunction(CodeRef code)
    135135        {
    136             return JITCode(code.dataLocation(), 0, 0, HostCallThunk);
     136            return JITCode(code, HostCallThunk);
    137137        }
    138138
     
    144144
    145145    private:
    146         JITCode(void* code, PassRefPtr<ExecutablePool> executablePool, size_t size, JITType jitType)
    147             : m_ref(code, executablePool, size)
     146        JITCode(PassRefPtr<ExecutableMemoryHandle> executableMemory, JITType jitType)
     147            : m_ref(executableMemory)
    148148            , m_jitType(jitType)
    149149        {
  • trunk/Source/JavaScriptCore/jit/JITOpcodes.cpp

    r94688 r94920  
    4343#if USE(JSVALUE64)
    4444
    45 void JIT::privateCompileCTIMachineTrampolines(RefPtr<ExecutablePool>* executablePool, JSGlobalData* globalData, TrampolineStructure *trampolines)
     45PassRefPtr<ExecutableMemoryHandle> JIT::privateCompileCTIMachineTrampolines(JSGlobalData* globalData, TrampolineStructure *trampolines)
    4646{
    4747    // (2) The second function provides fast property access for string length
     
    154154
    155155    // All trampolines constructed! copy the code, link up calls, and set the pointers on the Machine object.
    156     LinkBuffer patchBuffer(*m_globalData, this, m_globalData->executableAllocator);
     156    LinkBuffer patchBuffer(*m_globalData, this);
    157157
    158158    patchBuffer.link(string_failureCases1Call, FunctionPtr(cti_op_get_by_id_string_fail));
     
    165165
    166166    CodeRef finalCode = patchBuffer.finalizeCode();
    167     *executablePool = finalCode.m_executablePool;
     167    RefPtr<ExecutableMemoryHandle> executableMemory = finalCode.executableMemory();
    168168
    169169    trampolines->ctiVirtualCallLink = patchBuffer.trampolineAt(virtualCallLinkBegin);
     
    174174    trampolines->ctiNativeConstruct = patchBuffer.trampolineAt(nativeConstructThunk);
    175175    trampolines->ctiStringLengthTrampoline = patchBuffer.trampolineAt(stringLengthBegin);
     176   
     177    return executableMemory.release();
    176178}
    177179
     
    292294}
    293295
    294 JIT::CodePtr JIT::privateCompileCTINativeCall(PassRefPtr<ExecutablePool>, JSGlobalData* globalData, NativeFunction)
    295 {
    296     return globalData->jitStubs->ctiNativeCall();
     296JIT::CodeRef JIT::privateCompileCTINativeCall(JSGlobalData* globalData, NativeFunction)
     297{
     298    return CodeRef::createSelfManagedCodeRef(globalData->jitStubs->ctiNativeCall());
    297299}
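
A self-managed code ref appears to wrap an existing code pointer without an ExecutableMemoryHandle behind it, which is why the shared ctiNativeCall trampoline can be handed out here without tying its lifetime to a pool chunk (this reading follows the old JITCode.h comment about host functions not ref'ing the trampoline pool). A minimal sketch, with the wrapper name being illustrative:

    #include "MacroAssemblerCodeRef.h"

    // Wrap an already-live trampoline pointer; no ExecutableMemoryHandle is involved,
    // so the returned ref presumably neither keeps a pool chunk alive nor frees one.
    JSC::MacroAssemblerCodeRef wrapExistingTrampoline(JSC::MacroAssemblerCodePtr ptr)
    {
        return JSC::MacroAssemblerCodeRef::createSelfManagedCodeRef(ptr);
    }
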
    298300
  • trunk/Source/JavaScriptCore/jit/JITOpcodes32_64.cpp

    r94688 r94920  
    4141namespace JSC {
    4242
    43 void JIT::privateCompileCTIMachineTrampolines(RefPtr<ExecutablePool>* executablePool, JSGlobalData* globalData, TrampolineStructure *trampolines)
     43PassRefPtr<ExecutableMemoryHandle> JIT::privateCompileCTIMachineTrampolines(JSGlobalData* globalData, TrampolineStructure *trampolines)
    4444{
    4545#if ENABLE(JIT_USE_SOFT_MODULO)
     
    153153
    154154    // All trampolines constructed! copy the code, link up calls, and set the pointers on the Machine object.
    155     LinkBuffer patchBuffer(*m_globalData, this, m_globalData->executableAllocator);
     155    LinkBuffer patchBuffer(*m_globalData, this);
    156156
    157157    patchBuffer.link(string_failureCases1Call, FunctionPtr(cti_op_get_by_id_string_fail));
     
    164164
    165165    CodeRef finalCode = patchBuffer.finalizeCode();
    166     *executablePool = finalCode.m_executablePool;
     166    RefPtr<ExecutableMemoryHandle> executableMemory = finalCode.executableMemory();
    167167
    168168    trampolines->ctiVirtualCall = patchBuffer.trampolineAt(virtualCallBegin);
     
    176176    trampolines->ctiSoftModulo = patchBuffer.trampolineAt(softModBegin);
    177177#endif
     178   
     179    return executableMemory.release();
    178180}
    179181
     
    313315}
    314316
    315 JIT::CodePtr JIT::privateCompileCTINativeCall(PassRefPtr<ExecutablePool> executablePool, JSGlobalData* globalData, NativeFunction func)
     317JIT::CodeRef JIT::privateCompileCTINativeCall(JSGlobalData* globalData, NativeFunction func)
    316318{
    317319    Call nativeCall;
    318     Label nativeCallThunk = align();
    319320
    320321    emitPutImmediateToCallFrameHeader(0, RegisterFile::CodeBlock);
     
    448449
    449450    // All trampolines constructed! copy the code, link up calls, and set the pointers on the Machine object.
    450     LinkBuffer patchBuffer(*m_globalData, this, executablePool);
     451    LinkBuffer patchBuffer(*m_globalData, this);
    451452
    452453    patchBuffer.link(nativeCall, FunctionPtr(func));
    453     patchBuffer.finalizeCode();
    454 
    455     return patchBuffer.trampolineAt(nativeCallThunk);
     454    return patchBuffer.finalizeCode();
    456455}
    457456
  • trunk/Source/JavaScriptCore/jit/JITPropertyAccess.cpp

    r94701 r94920  
    5151#if USE(JSVALUE64)
    5252
    53 JIT::CodePtr JIT::stringGetByValStubGenerator(JSGlobalData* globalData, ExecutablePool* pool)
     53JIT::CodeRef JIT::stringGetByValStubGenerator(JSGlobalData* globalData)
    5454{
    5555    JSInterfaceJIT jit;
     
    7878    jit.ret();
    7979   
    80     LinkBuffer patchBuffer(*globalData, &jit, pool);
    81     return patchBuffer.finalizeCode().m_code;
     80    LinkBuffer patchBuffer(*globalData, &jit);
     81    return patchBuffer.finalizeCode();
    8282}
    8383
     
    123123    linkSlowCase(iter); // base array check
    124124    Jump notString = branchPtr(NotEqual, Address(regT0), TrustedImmPtr(m_globalData->jsStringVPtr));
    125     emitNakedCall(m_globalData->getCTIStub(stringGetByValStubGenerator));
     125    emitNakedCall(CodeLocationLabel(m_globalData->getCTIStub(stringGetByValStubGenerator).code()));
    126126    Jump failed = branchTestPtr(Zero, regT0);
    127127    emitPutVirtualRegister(dst, regT0);
     
    556556    Call failureCall = tailRecursiveCall();
    557557
    558     LinkBuffer patchBuffer(*m_globalData, this, m_codeBlock->executablePool());
     558    LinkBuffer patchBuffer(*m_globalData, this);
    559559
    560560    patchBuffer.link(failureCall, FunctionPtr(direct ? cti_op_put_by_id_direct_fail : cti_op_put_by_id_fail));
     
    565565    }
    566566   
    567     CodeLocationLabel entryLabel = patchBuffer.finalizeCodeAddendum();
    568     stubInfo->stubRoutine = entryLabel;
     567    stubInfo->stubRoutine = patchBuffer.finalizeCode();
    569568    RepatchBuffer repatchBuffer(m_codeBlock);
    570     repatchBuffer.relinkCallerToTrampoline(returnAddress, entryLabel);
     569    repatchBuffer.relinkCallerToTrampoline(returnAddress, CodeLocationLabel(stubInfo->stubRoutine.code()));
    571570}
    572571
     
    616615    Jump success = jump();
    617616
    618     LinkBuffer patchBuffer(*m_globalData, this, m_codeBlock->executablePool());
     617    LinkBuffer patchBuffer(*m_globalData, this);
    619618
    620619    // Use the patch information to link the failure cases back to the original slow case routine.
     
    627626
    628627    // Track the stub we have created so that it will be deleted later.
    629     CodeLocationLabel entryLabel = patchBuffer.finalizeCodeAddendum();
    630     stubInfo->stubRoutine = entryLabel;
     628    stubInfo->stubRoutine = patchBuffer.finalizeCode();
    631629
    632630    // Finally patch the jump to slow case back in the hot path to jump here instead.
    633631    CodeLocationJump jumpLocation = stubInfo->hotPathBegin.jumpAtOffset(patchOffsetGetByIdBranchToSlowCase);
    634632    RepatchBuffer repatchBuffer(m_codeBlock);
    635     repatchBuffer.relink(jumpLocation, entryLabel);
     633    repatchBuffer.relink(jumpLocation, CodeLocationLabel(stubInfo->stubRoutine.code()));
    636634
    637635    // We don't want to patch more than once - in future go to cti_op_put_by_id_generic.
     
    674672        compileGetDirectOffset(protoObject, regT0, cachedOffset);
    675673    Jump success = jump();
    676     LinkBuffer patchBuffer(*m_globalData, this, m_codeBlock->executablePool());
     674    LinkBuffer patchBuffer(*m_globalData, this);
    677675
    678676    // Use the patch information to link the failure cases back to the original slow case routine.
     
    691689    }
    692690    // Track the stub we have created so that it will be deleted later.
    693     CodeLocationLabel entryLabel = patchBuffer.finalizeCodeAddendum();
    694     stubInfo->stubRoutine = entryLabel;
     691    stubInfo->stubRoutine = patchBuffer.finalizeCode();
    695692
    696693    // Finally patch the jump to slow case back in the hot path to jump here instead.
    697694    CodeLocationJump jumpLocation = stubInfo->hotPathBegin.jumpAtOffset(patchOffsetGetByIdBranchToSlowCase);
    698695    RepatchBuffer repatchBuffer(m_codeBlock);
    699     repatchBuffer.relink(jumpLocation, entryLabel);
     696    repatchBuffer.relink(jumpLocation, CodeLocationLabel(stubInfo->stubRoutine.code()));
    700697
    701698    // We don't want to patch more than once - in future go to cti_op_put_by_id_generic.
     
    727724    Jump success = jump();
    728725
    729     LinkBuffer patchBuffer(*m_globalData, this, m_codeBlock->executablePool());
     726    LinkBuffer patchBuffer(*m_globalData, this);
    730727
    731728    if (needsStubLink) {
     
    737734
    738735    // Use the patch information to link the failure cases back to the original slow case routine.
    739     CodeLocationLabel lastProtoBegin = polymorphicStructures->list[currentIndex - 1].stubRoutine;
     736    CodeLocationLabel lastProtoBegin = CodeLocationLabel(polymorphicStructures->list[currentIndex - 1].stubRoutine.code());
    740737    if (!lastProtoBegin)
    741738        lastProtoBegin = stubInfo->callReturnLocation.labelAtOffset(-patchOffsetGetByIdSlowCaseCall);
     
    746743    patchBuffer.link(success, stubInfo->hotPathBegin.labelAtOffset(patchOffsetGetByIdPutResult));
    747744
    748     CodeLocationLabel entryLabel = patchBuffer.finalizeCodeAddendum();
    749 
    750     polymorphicStructures->list[currentIndex].set(*m_globalData, m_codeBlock->ownerExecutable(), entryLabel, structure);
     745    MacroAssemblerCodeRef stubCode = patchBuffer.finalizeCode();
     746
     747    polymorphicStructures->list[currentIndex].set(*m_globalData, m_codeBlock->ownerExecutable(), stubCode, structure);
    751748
    752749    // Finally patch the jump to slow case back in the hot path to jump here instead.
    753750    CodeLocationJump jumpLocation = stubInfo->hotPathBegin.jumpAtOffset(patchOffsetGetByIdBranchToSlowCase);
    754751    RepatchBuffer repatchBuffer(m_codeBlock);
    755     repatchBuffer.relink(jumpLocation, entryLabel);
     752    repatchBuffer.relink(jumpLocation, CodeLocationLabel(stubCode.code()));
    756753}
    757754
     
    792789    Jump success = jump();
    793790
    794     LinkBuffer patchBuffer(*m_globalData, this, m_codeBlock->executablePool());
     791    LinkBuffer patchBuffer(*m_globalData, this);
    795792
    796793    if (needsStubLink) {
     
    802799
    803800    // Use the patch information to link the failure cases back to the original slow case routine.
    804     CodeLocationLabel lastProtoBegin = prototypeStructures->list[currentIndex - 1].stubRoutine;
     801    CodeLocationLabel lastProtoBegin = CodeLocationLabel(prototypeStructures->list[currentIndex - 1].stubRoutine.code());
    805802    patchBuffer.link(failureCases1, lastProtoBegin);
    806803    patchBuffer.link(failureCases2, lastProtoBegin);
     
    809806    patchBuffer.link(success, stubInfo->hotPathBegin.labelAtOffset(patchOffsetGetByIdPutResult));
    810807
    811     CodeLocationLabel entryLabel = patchBuffer.finalizeCodeAddendum();
    812     prototypeStructures->list[currentIndex].set(*m_globalData, m_codeBlock->ownerExecutable(), entryLabel, structure, prototypeStructure);
     808    MacroAssemblerCodeRef stubCode = patchBuffer.finalizeCode();
     809    prototypeStructures->list[currentIndex].set(*m_globalData, m_codeBlock->ownerExecutable(), stubCode, structure, prototypeStructure);
    813810
    814811    // Finally patch the jump to slow case back in the hot path to jump here instead.
    815812    CodeLocationJump jumpLocation = stubInfo->hotPathBegin.jumpAtOffset(patchOffsetGetByIdBranchToSlowCase);
    816813    RepatchBuffer repatchBuffer(m_codeBlock);
    817     repatchBuffer.relink(jumpLocation, entryLabel);
     814    repatchBuffer.relink(jumpLocation, CodeLocationLabel(stubCode.code()));
    818815}
    819816
     
    858855    Jump success = jump();
    859856
    860     LinkBuffer patchBuffer(*m_globalData, this, m_codeBlock->executablePool());
     857    LinkBuffer patchBuffer(*m_globalData, this);
    861858   
    862859    if (needsStubLink) {
     
    868865
    869866    // Use the patch information to link the failure cases back to the original slow case routine.
    870     CodeLocationLabel lastProtoBegin = prototypeStructures->list[currentIndex - 1].stubRoutine;
     867    CodeLocationLabel lastProtoBegin = CodeLocationLabel(prototypeStructures->list[currentIndex - 1].stubRoutine.code());
    871868
    872869    patchBuffer.link(bucketsOfFail, lastProtoBegin);
     
    875872    patchBuffer.link(success, stubInfo->hotPathBegin.labelAtOffset(patchOffsetGetByIdPutResult));
    876873
    877     CodeLocationLabel entryLabel = patchBuffer.finalizeCodeAddendum();
     874    CodeRef stubRoutine = patchBuffer.finalizeCode();
    878875
    879876    // Track the stub we have created so that it will be deleted later.
    880     prototypeStructures->list[currentIndex].set(callFrame->globalData(), m_codeBlock->ownerExecutable(), entryLabel, structure, chain);
     877    prototypeStructures->list[currentIndex].set(callFrame->globalData(), m_codeBlock->ownerExecutable(), stubRoutine, structure, chain);
    881878
    882879    // Finally patch the jump to slow case back in the hot path to jump here instead.
    883880    CodeLocationJump jumpLocation = stubInfo->hotPathBegin.jumpAtOffset(patchOffsetGetByIdBranchToSlowCase);
    884881    RepatchBuffer repatchBuffer(m_codeBlock);
    885     repatchBuffer.relink(jumpLocation, entryLabel);
     882    repatchBuffer.relink(jumpLocation, CodeLocationLabel(stubRoutine.code()));
    886883}
    887884
     
    926923    Jump success = jump();
    927924
    928     LinkBuffer patchBuffer(*m_globalData, this, m_codeBlock->executablePool());
     925    LinkBuffer patchBuffer(*m_globalData, this);
    929926
    930927    if (needsStubLink) {
     
    942939
    943940    // Track the stub we have created so that it will be deleted later.
    944     CodeLocationLabel entryLabel = patchBuffer.finalizeCodeAddendum();
    945     stubInfo->stubRoutine = entryLabel;
     941    CodeRef stubRoutine = patchBuffer.finalizeCode();
     942    stubInfo->stubRoutine = stubRoutine;
    946943
    947944    // Finally patch the jump to slow case back in the hot path to jump here instead.
    948945    CodeLocationJump jumpLocation = stubInfo->hotPathBegin.jumpAtOffset(patchOffsetGetByIdBranchToSlowCase);
    949946    RepatchBuffer repatchBuffer(m_codeBlock);
    950     repatchBuffer.relink(jumpLocation, entryLabel);
     947    repatchBuffer.relink(jumpLocation, CodeLocationLabel(stubRoutine.code()));
    951948
    952949    // We don't want to patch more than once - in future go to cti_op_put_by_id_generic.
  • trunk/Source/JavaScriptCore/jit/JITPropertyAccess32_64.cpp

    r93698 r94920  
    166166}
    167167
    168 JIT::CodePtr JIT::stringGetByValStubGenerator(JSGlobalData* globalData, ExecutablePool* pool)
     168JIT::CodeRef JIT::stringGetByValStubGenerator(JSGlobalData* globalData)
    169169{
    170170    JSInterfaceJIT jit;
     
    194194    jit.ret();
    195195   
    196     LinkBuffer patchBuffer(*globalData, &jit, pool);
    197     return patchBuffer.finalizeCode().m_code;
     196    LinkBuffer patchBuffer(*globalData, &jit);
     197    return patchBuffer.finalizeCode();
    198198}
    199199
     
    233233    linkSlowCase(iter); // base array check
    234234    Jump notString = branchPtr(NotEqual, Address(regT0), TrustedImmPtr(m_globalData->jsStringVPtr));
    235     emitNakedCall(m_globalData->getCTIStub(stringGetByValStubGenerator));
     235    emitNakedCall(m_globalData->getCTIStub(stringGetByValStubGenerator).code());
    236236    Jump failed = branchTestPtr(Zero, regT0);
    237237    emitStore(dst, regT1, regT0);
     
    542542    Call failureCall = tailRecursiveCall();
    543543   
    544     LinkBuffer patchBuffer(*m_globalData, this, m_codeBlock->executablePool());
     544    LinkBuffer patchBuffer(*m_globalData, this);
    545545   
    546546    patchBuffer.link(failureCall, FunctionPtr(direct ? cti_op_put_by_id_direct_fail : cti_op_put_by_id_fail));
     
    551551    }
    552552   
    553     CodeLocationLabel entryLabel = patchBuffer.finalizeCodeAddendum();
    554     stubInfo->stubRoutine = entryLabel;
     553    stubInfo->stubRoutine = patchBuffer.finalizeCode();
    555554    RepatchBuffer repatchBuffer(m_codeBlock);
    556     repatchBuffer.relinkCallerToTrampoline(returnAddress, entryLabel);
     555    repatchBuffer.relinkCallerToTrampoline(returnAddress, CodeLocationLabel(stubInfo->stubRoutine.code()));
    557556}
    558557
     
    607606    Jump success = jump();
    608607   
    609     LinkBuffer patchBuffer(*m_globalData, this, m_codeBlock->executablePool());
     608    LinkBuffer patchBuffer(*m_globalData, this);
    610609   
    611610    // Use the patch information to link the failure cases back to the original slow case routine.
     
    618617   
    619618    // Track the stub we have created so that it will be deleted later.
    620     CodeLocationLabel entryLabel = patchBuffer.finalizeCodeAddendum();
    621     stubInfo->stubRoutine = entryLabel;
     619    stubInfo->stubRoutine = patchBuffer.finalizeCode();
    622620   
    623621    // Finally patch the jump to slow case back in the hot path to jump here instead.
    624622    CodeLocationJump jumpLocation = stubInfo->hotPathBegin.jumpAtOffset(patchOffsetGetByIdBranchToSlowCase);
    625623    RepatchBuffer repatchBuffer(m_codeBlock);
    626     repatchBuffer.relink(jumpLocation, entryLabel);
     624    repatchBuffer.relink(jumpLocation, CodeLocationLabel(stubInfo->stubRoutine.code()));
    627625   
    628626    // We don't want to patch more than once - in future go to cti_op_put_by_id_generic.
     
    667665    Jump success = jump();
    668666   
    669     LinkBuffer patchBuffer(*m_globalData, this, m_codeBlock->executablePool());
     667    LinkBuffer patchBuffer(*m_globalData, this);
    670668   
    671669    // Use the patch information to link the failure cases back to the original slow case routine.
     
    685683
    686684    // Track the stub we have created so that it will be deleted later.
    687     CodeLocationLabel entryLabel = patchBuffer.finalizeCodeAddendum();
    688     stubInfo->stubRoutine = entryLabel;
     685    stubInfo->stubRoutine = patchBuffer.finalizeCode();
    689686   
    690687    // Finally patch the jump to slow case back in the hot path to jump here instead.
    691688    CodeLocationJump jumpLocation = stubInfo->hotPathBegin.jumpAtOffset(patchOffsetGetByIdBranchToSlowCase);
    692689    RepatchBuffer repatchBuffer(m_codeBlock);
    693     repatchBuffer.relink(jumpLocation, entryLabel);
     690    repatchBuffer.relink(jumpLocation, CodeLocationLabel(stubInfo->stubRoutine.code()));
    694691   
    695692    // We don't want to patch more than once - in future go to cti_op_put_by_id_generic.
     
    724721    Jump success = jump();
    725722   
    726     LinkBuffer patchBuffer(*m_globalData, this, m_codeBlock->executablePool());
     723    LinkBuffer patchBuffer(*m_globalData, this);
    727724    if (needsStubLink) {
    728725        for (Vector<CallRecord>::iterator iter = m_calls.begin(); iter != m_calls.end(); ++iter) {
     
    732729    }   
    733730    // Use the patch information to link the failure cases back to the original slow case routine.
    734     CodeLocationLabel lastProtoBegin = polymorphicStructures->list[currentIndex - 1].stubRoutine;
     731    CodeLocationLabel lastProtoBegin = CodeLocationLabel(polymorphicStructures->list[currentIndex - 1].stubRoutine.code());
    735732    if (!lastProtoBegin)
    736733        lastProtoBegin = stubInfo->callReturnLocation.labelAtOffset(-patchOffsetGetByIdSlowCaseCall);
     
    741738    patchBuffer.link(success, stubInfo->hotPathBegin.labelAtOffset(patchOffsetGetByIdPutResult));
    742739
    743     CodeLocationLabel entryLabel = patchBuffer.finalizeCodeAddendum();
    744 
    745     polymorphicStructures->list[currentIndex].set(*m_globalData, m_codeBlock->ownerExecutable(), entryLabel, structure);
     740    CodeRef stubRoutine = patchBuffer.finalizeCode();
     741
     742    polymorphicStructures->list[currentIndex].set(*m_globalData, m_codeBlock->ownerExecutable(), stubRoutine, structure);
    746743   
    747744    // Finally patch the jump to slow case back in the hot path to jump here instead.
    748745    CodeLocationJump jumpLocation = stubInfo->hotPathBegin.jumpAtOffset(patchOffsetGetByIdBranchToSlowCase);
    749746    RepatchBuffer repatchBuffer(m_codeBlock);
    750     repatchBuffer.relink(jumpLocation, entryLabel);
     747    repatchBuffer.relink(jumpLocation, CodeLocationLabel(stubRoutine.code()));
    751748}
    752749
     
    788785    Jump success = jump();
    789786   
    790     LinkBuffer patchBuffer(*m_globalData, this, m_codeBlock->executablePool());
     787    LinkBuffer patchBuffer(*m_globalData, this);
    791788    if (needsStubLink) {
    792789        for (Vector<CallRecord>::iterator iter = m_calls.begin(); iter != m_calls.end(); ++iter) {
     
    796793    }
    797794    // Use the patch information to link the failure cases back to the original slow case routine.
    798     CodeLocationLabel lastProtoBegin = prototypeStructures->list[currentIndex - 1].stubRoutine;
     795    CodeLocationLabel lastProtoBegin = CodeLocationLabel(prototypeStructures->list[currentIndex - 1].stubRoutine.code());
    799796    patchBuffer.link(failureCases1, lastProtoBegin);
    800797    patchBuffer.link(failureCases2, lastProtoBegin);
     
    803800    patchBuffer.link(success, stubInfo->hotPathBegin.labelAtOffset(patchOffsetGetByIdPutResult));
    804801   
    805     CodeLocationLabel entryLabel = patchBuffer.finalizeCodeAddendum();
    806 
    807     prototypeStructures->list[currentIndex].set(callFrame->globalData(), m_codeBlock->ownerExecutable(), entryLabel, structure, prototypeStructure);
     802    CodeRef stubRoutine = patchBuffer.finalizeCode();
     803
     804    prototypeStructures->list[currentIndex].set(callFrame->globalData(), m_codeBlock->ownerExecutable(), stubRoutine, structure, prototypeStructure);
    808805   
    809806    // Finally patch the jump to slow case back in the hot path to jump here instead.
    810807    CodeLocationJump jumpLocation = stubInfo->hotPathBegin.jumpAtOffset(patchOffsetGetByIdBranchToSlowCase);
    811808    RepatchBuffer repatchBuffer(m_codeBlock);
    812     repatchBuffer.relink(jumpLocation, entryLabel);
     809    repatchBuffer.relink(jumpLocation, CodeLocationLabel(stubRoutine.code()));
    813810}
    814811
     
    855852    Jump success = jump();
    856853   
    857     LinkBuffer patchBuffer(*m_globalData, this, m_codeBlock->executablePool());
     854    LinkBuffer patchBuffer(*m_globalData, this);
    858855    if (needsStubLink) {
    859856        for (Vector<CallRecord>::iterator iter = m_calls.begin(); iter != m_calls.end(); ++iter) {
     
    863860    }
    864861    // Use the patch information to link the failure cases back to the original slow case routine.
    865     CodeLocationLabel lastProtoBegin = prototypeStructures->list[currentIndex - 1].stubRoutine;
     862    CodeLocationLabel lastProtoBegin = CodeLocationLabel(prototypeStructures->list[currentIndex - 1].stubRoutine.code());
    866863   
    867864    patchBuffer.link(bucketsOfFail, lastProtoBegin);
     
    870867    patchBuffer.link(success, stubInfo->hotPathBegin.labelAtOffset(patchOffsetGetByIdPutResult));
    871868   
    872     CodeLocationLabel entryLabel = patchBuffer.finalizeCodeAddendum();
     869    CodeRef stubRoutine = patchBuffer.finalizeCode();
    873870   
    874871    // Track the stub we have created so that it will be deleted later.
    875     prototypeStructures->list[currentIndex].set(callFrame->globalData(), m_codeBlock->ownerExecutable(), entryLabel, structure, chain);
     872    prototypeStructures->list[currentIndex].set(callFrame->globalData(), m_codeBlock->ownerExecutable(), stubRoutine, structure, chain);
    876873   
    877874    // Finally patch the jump to slow case back in the hot path to jump here instead.
    878875    CodeLocationJump jumpLocation = stubInfo->hotPathBegin.jumpAtOffset(patchOffsetGetByIdBranchToSlowCase);
    879876    RepatchBuffer repatchBuffer(m_codeBlock);
    880     repatchBuffer.relink(jumpLocation, entryLabel);
     877    repatchBuffer.relink(jumpLocation, CodeLocationLabel(stubRoutine.code()));
    881878}
    882879
     
    922919    Jump success = jump();
    923920   
    924     LinkBuffer patchBuffer(*m_globalData, this, m_codeBlock->executablePool());
     921    LinkBuffer patchBuffer(*m_globalData, this);
    925922    if (needsStubLink) {
    926923        for (Vector<CallRecord>::iterator iter = m_calls.begin(); iter != m_calls.end(); ++iter) {
     
    936933   
    937934    // Track the stub we have created so that it will be deleted later.
    938     CodeLocationLabel entryLabel = patchBuffer.finalizeCodeAddendum();
    939     stubInfo->stubRoutine = entryLabel;
     935    CodeRef stubRoutine = patchBuffer.finalizeCode();
     936    stubInfo->stubRoutine = stubRoutine;
    940937   
    941938    // Finally patch the jump to slow case back in the hot path to jump here instead.
    942939    CodeLocationJump jumpLocation = stubInfo->hotPathBegin.jumpAtOffset(patchOffsetGetByIdBranchToSlowCase);
    943940    RepatchBuffer repatchBuffer(m_codeBlock);
    944     repatchBuffer.relink(jumpLocation, entryLabel);
     941    repatchBuffer.relink(jumpLocation, CodeLocationLabel(stubRoutine.code()));
    945942   
    946943    // We don't want to patch more than once - in future go to cti_op_put_by_id_generic.
  • trunk/Source/JavaScriptCore/jit/JITStubs.cpp

    r94814 r94920  
    758758        return;
    759759
    760     JIT::compileCTIMachineTrampolines(globalData, &m_executablePool, &m_trampolineStructure);
    761     ASSERT(m_executablePool);
     760    m_executableMemory = JIT::compileCTIMachineTrampolines(globalData, &m_trampolineStructure);
     761    ASSERT(!!m_executableMemory);
    762762#if CPU(ARM_THUMB2)
     763763    // Unfortunately, the ARM compiler does not like the use of offsetof on JITStackFrame (since it contains non-POD types),
     
    16091609        if (stubInfo->accessType == access_get_by_id_self) {
    16101610            ASSERT(!stubInfo->stubRoutine);
    1611             polymorphicStructureList = new PolymorphicAccessStructureList(callFrame->globalData(), codeBlock->ownerExecutable(), CodeLocationLabel(), stubInfo->u.getByIdSelf.baseObjectStructure.get());
     1611            polymorphicStructureList = new PolymorphicAccessStructureList(callFrame->globalData(), codeBlock->ownerExecutable(), MacroAssemblerCodeRef(), stubInfo->u.getByIdSelf.baseObjectStructure.get());
    16121612            stubInfo->initGetByIdSelfList(polymorphicStructureList, 1);
    16131613        } else {
     
    16351635    case access_get_by_id_proto:
    16361636        prototypeStructureList = new PolymorphicAccessStructureList(globalData, owner, stubInfo->stubRoutine, stubInfo->u.getByIdProto.baseObjectStructure.get(), stubInfo->u.getByIdProto.prototypeStructure.get());
    1637         stubInfo->stubRoutine = CodeLocationLabel();
     1637        stubInfo->stubRoutine = MacroAssemblerCodeRef();
    16381638        stubInfo->initGetByIdProtoList(prototypeStructureList, 2);
    16391639        break;
    16401640    case access_get_by_id_chain:
    16411641        prototypeStructureList = new PolymorphicAccessStructureList(globalData, owner, stubInfo->stubRoutine, stubInfo->u.getByIdChain.baseObjectStructure.get(), stubInfo->u.getByIdChain.chain.get());
    1642         stubInfo->stubRoutine = CodeLocationLabel();
     1642        stubInfo->stubRoutine = MacroAssemblerCodeRef();
    16431643        stubInfo->initGetByIdProtoList(prototypeStructureList, 2);
    16441644        break;
     
    35993599}
    36003600
    3601 MacroAssemblerCodePtr JITThunks::ctiStub(JSGlobalData* globalData, ThunkGenerator generator)
    3602 {
    3603     std::pair<CTIStubMap::iterator, bool> entry = m_ctiStubMap.add(generator, MacroAssemblerCodePtr());
     3601MacroAssemblerCodeRef JITThunks::ctiStub(JSGlobalData* globalData, ThunkGenerator generator)
     3602{
     3603    std::pair<CTIStubMap::iterator, bool> entry = m_ctiStubMap.add(generator, MacroAssemblerCodeRef());
    36043604    if (entry.second)
    3605         entry.first->second = generator(globalData, m_executablePool.get());
     3605        entry.first->second = generator(globalData);
    36063606    return entry.first->second;
    36073607}
     
    36113611    std::pair<HostFunctionStubMap::iterator, bool> entry = m_hostFunctionStubMap->add(function, Weak<NativeExecutable>());
    36123612    if (!*entry.first->second)
    3613         entry.first->second.set(*globalData, NativeExecutable::create(*globalData, JIT::compileCTINativeCall(globalData, m_executablePool, function), function, ctiNativeConstruct(), callHostFunctionAsConstructor));
     3613        entry.first->second.set(*globalData, NativeExecutable::create(*globalData, JIT::compileCTINativeCall(globalData, function), function, MacroAssemblerCodeRef::createSelfManagedCodeRef(ctiNativeConstruct()), callHostFunctionAsConstructor));
    36143614    return entry.first->second.get();
    36153615}
     
    36193619    std::pair<HostFunctionStubMap::iterator, bool> entry = m_hostFunctionStubMap->add(function, Weak<NativeExecutable>());
    36203620    if (!*entry.first->second) {
    3621         MacroAssemblerCodePtr code = globalData->canUseJIT() ? generator(globalData, m_executablePool.get()) : MacroAssemblerCodePtr();
    3622         entry.first->second.set(*globalData, NativeExecutable::create(*globalData, code, function, ctiNativeConstruct(), callHostFunctionAsConstructor));
     3621        MacroAssemblerCodeRef code = globalData->canUseJIT() ? generator(globalData) : MacroAssemblerCodeRef();
     3622        entry.first->second.set(*globalData, NativeExecutable::create(*globalData, code, function, MacroAssemblerCodeRef::createSelfManagedCodeRef(ctiNativeConstruct()), callHostFunctionAsConstructor));
    36233623    }
    36243624    return entry.first->second.get();
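
Editorial aside: the hostFunctionStub() hunks above wrap the shared ctiNativeConstruct() trampoline with MacroAssemblerCodeRef::createSelfManagedCodeRef(), i.e. a code ref that does not own its memory. The sketch below models that distinction with std::shared_ptr standing in for the ref-counted ExecutableMemoryHandle; the names and layout are hypothetical simplifications, not WebKit's implementation.

    // Hypothetical, simplified model of the two flavors of code ref used above.
    #include <cstddef>
    #include <memory>
    #include <utility>

    struct ExecutableMemoryHandleSketch {
        void* start;
        size_t sizeInBytes; // the real handle releases the executable memory on last deref
    };

    class CodeRefSketch {
    public:
        CodeRefSketch() : m_codePtr(0) { }

        explicit CodeRefSketch(std::shared_ptr<ExecutableMemoryHandleSketch> memory)
            : m_memory(std::move(memory))
            , m_codePtr(m_memory->start) // owning ref: code lives as long as any copy of this ref
        {
        }

        static CodeRefSketch createSelfManaged(void* codePtr)
        {
            CodeRefSketch ref;
            ref.m_codePtr = codePtr; // no handle: nothing is freed when the ref goes away
            return ref;
        }

        void* code() const { return m_codePtr; }
        bool operator!() const { return !m_codePtr; }

    private:
        std::shared_ptr<ExecutableMemoryHandleSketch> m_memory; // null for self-managed refs
        void* m_codePtr;
    };

Thunks produced by a generator hold a handle and are reclaimed when the last ref dies; self-managed refs merely point at code, such as a shared trampoline, whose lifetime is managed elsewhere.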
  • trunk/Source/JavaScriptCore/jit/JITStubs.h

    r94559 r94920  
    295295        MacroAssemblerCodePtr ctiSoftModulo() { return m_trampolineStructure.ctiSoftModulo; }
    296296
    297         MacroAssemblerCodePtr ctiStub(JSGlobalData* globalData, ThunkGenerator generator);
     297        MacroAssemblerCodeRef ctiStub(JSGlobalData*, ThunkGenerator);
    298298
    299299        NativeExecutable* hostFunctionStub(JSGlobalData*, NativeFunction);
     
    303303
    304304    private:
    305         typedef HashMap<ThunkGenerator, MacroAssemblerCodePtr> CTIStubMap;
     305        typedef HashMap<ThunkGenerator, MacroAssemblerCodeRef> CTIStubMap;
    306306        CTIStubMap m_ctiStubMap;
    307307        typedef HashMap<NativeFunction, Weak<NativeExecutable> > HostFunctionStubMap;
    308308        OwnPtr<HostFunctionStubMap> m_hostFunctionStubMap;
    309         RefPtr<ExecutablePool> m_executablePool;
     309        RefPtr<ExecutableMemoryHandle> m_executableMemory;
    310310
    311311        TrampolineStructure m_trampolineStructure;
  • trunk/Source/JavaScriptCore/jit/SpecializedThunkJIT.h

    r94500 r94920  
    3838    public:
    3939        static const int ThisArgument = -1;
    40         SpecializedThunkJIT(int expectedArgCount, JSGlobalData* globalData, ExecutablePool* pool)
     40        SpecializedThunkJIT(int expectedArgCount, JSGlobalData* globalData)
    4141            : m_expectedArgCount(expectedArgCount)
    4242            , m_globalData(globalData)
    43             , m_pool(pool)
    4443        {
    4544            // Check that we have the expected number of arguments
     
    135134        }
    136135       
    137         MacroAssemblerCodePtr finalize(JSGlobalData& globalData, MacroAssemblerCodePtr fallback)
     136        MacroAssemblerCodeRef finalize(JSGlobalData& globalData, MacroAssemblerCodePtr fallback)
    138137        {
    139             LinkBuffer patchBuffer(globalData, this, m_pool.get());
     138            LinkBuffer patchBuffer(globalData, this);
    140139            patchBuffer.link(m_failures, CodeLocationLabel(fallback));
    141140            for (unsigned i = 0; i < m_calls.size(); i++)
    142141                patchBuffer.link(m_calls[i].first, m_calls[i].second);
    143             return patchBuffer.finalizeCode().m_code;
     142            return patchBuffer.finalizeCode();
    144143        }
    145144
     
    175174        int m_expectedArgCount;
    176175        JSGlobalData* m_globalData;
    177         RefPtr<ExecutablePool> m_pool;
    178176        MacroAssembler::JumpList m_failures;
    179177        Vector<std::pair<Call, FunctionPtr> > m_calls;
  • trunk/Source/JavaScriptCore/jit/ThunkGenerators.cpp

    r90425 r94920  
    6464}
    6565
    66 MacroAssemblerCodePtr charCodeAtThunkGenerator(JSGlobalData* globalData, ExecutablePool* pool)
    67 {
    68     SpecializedThunkJIT jit(1, globalData, pool);
     66MacroAssemblerCodeRef charCodeAtThunkGenerator(JSGlobalData* globalData)
     67{
     68    SpecializedThunkJIT jit(1, globalData);
    6969    stringCharLoad(jit);
    7070    jit.returnInt32(SpecializedThunkJIT::regT0);
     
    7272}
    7373
    74 MacroAssemblerCodePtr charAtThunkGenerator(JSGlobalData* globalData, ExecutablePool* pool)
    75 {
    76     SpecializedThunkJIT jit(1, globalData, pool);
     74MacroAssemblerCodeRef charAtThunkGenerator(JSGlobalData* globalData)
     75{
     76    SpecializedThunkJIT jit(1, globalData);
    7777    stringCharLoad(jit);
    7878    charToString(jit, globalData, SpecializedThunkJIT::regT0, SpecializedThunkJIT::regT0, SpecializedThunkJIT::regT1);
     
    8181}
    8282
    83 MacroAssemblerCodePtr fromCharCodeThunkGenerator(JSGlobalData* globalData, ExecutablePool* pool)
    84 {
    85     SpecializedThunkJIT jit(1, globalData, pool);
     83MacroAssemblerCodeRef fromCharCodeThunkGenerator(JSGlobalData* globalData)
     84{
     85    SpecializedThunkJIT jit(1, globalData);
    8686    // load char code
    8787    jit.loadInt32Argument(0, SpecializedThunkJIT::regT0);
     
    9191}
    9292
    93 MacroAssemblerCodePtr sqrtThunkGenerator(JSGlobalData* globalData, ExecutablePool* pool)
    94 {
    95     SpecializedThunkJIT jit(1, globalData, pool);
     93MacroAssemblerCodeRef sqrtThunkGenerator(JSGlobalData* globalData)
     94{
     95    SpecializedThunkJIT jit(1, globalData);
    9696    if (!jit.supportsFloatingPointSqrt())
    97         return globalData->jitStubs->ctiNativeCall();
     97        return MacroAssemblerCodeRef::createSelfManagedCodeRef(globalData->jitStubs->ctiNativeCall());
    9898
    9999    jit.loadDoubleArgument(0, SpecializedThunkJIT::fpRegT0, SpecializedThunkJIT::regT0);
     
    179179defineUnaryDoubleOpWrapper(ceil);
    180180
    181 MacroAssemblerCodePtr floorThunkGenerator(JSGlobalData* globalData, ExecutablePool* pool)
    182 {
    183     SpecializedThunkJIT jit(1, globalData, pool);
     181MacroAssemblerCodeRef floorThunkGenerator(JSGlobalData* globalData)
     182{
     183    SpecializedThunkJIT jit(1, globalData);
    184184    MacroAssembler::Jump nonIntJump;
    185185    if (!UnaryDoubleOpWrapper(floor) || !jit.supportsFloatingPoint())
    186         return globalData->jitStubs->ctiNativeCall();
     186        return MacroAssemblerCodeRef::createSelfManagedCodeRef(globalData->jitStubs->ctiNativeCall());
    187187    jit.loadInt32Argument(0, SpecializedThunkJIT::regT0, nonIntJump);
    188188    jit.returnInt32(SpecializedThunkJIT::regT0);
     
    198198}
    199199
    200 MacroAssemblerCodePtr ceilThunkGenerator(JSGlobalData* globalData, ExecutablePool* pool)
    201 {
    202     SpecializedThunkJIT jit(1, globalData, pool);
     200MacroAssemblerCodeRef ceilThunkGenerator(JSGlobalData* globalData)
     201{
     202    SpecializedThunkJIT jit(1, globalData);
    203203    if (!UnaryDoubleOpWrapper(ceil) || !jit.supportsFloatingPoint())
    204         return globalData->jitStubs->ctiNativeCall();
     204        return MacroAssemblerCodeRef::createSelfManagedCodeRef(globalData->jitStubs->ctiNativeCall());
    205205    MacroAssembler::Jump nonIntJump;
    206206    jit.loadInt32Argument(0, SpecializedThunkJIT::regT0, nonIntJump);
     
    221221static const double negativeHalfConstant = -0.5;
    222222   
    223 MacroAssemblerCodePtr roundThunkGenerator(JSGlobalData* globalData, ExecutablePool* pool)
    224 {
    225     SpecializedThunkJIT jit(1, globalData, pool);
     223MacroAssemblerCodeRef roundThunkGenerator(JSGlobalData* globalData)
     224{
     225    SpecializedThunkJIT jit(1, globalData);
    226226    if (!UnaryDoubleOpWrapper(jsRound) || !jit.supportsFloatingPoint())
    227         return globalData->jitStubs->ctiNativeCall();
     227        return MacroAssemblerCodeRef::createSelfManagedCodeRef(globalData->jitStubs->ctiNativeCall());
    228228    MacroAssembler::Jump nonIntJump;
    229229    jit.loadInt32Argument(0, SpecializedThunkJIT::regT0, nonIntJump);
     
    240240}
    241241
    242 MacroAssemblerCodePtr expThunkGenerator(JSGlobalData* globalData, ExecutablePool* pool)
     242MacroAssemblerCodeRef expThunkGenerator(JSGlobalData* globalData)
    243243{
    244244    if (!UnaryDoubleOpWrapper(exp))
    245         return globalData->jitStubs->ctiNativeCall();
    246     SpecializedThunkJIT jit(1, globalData, pool);
     245        return MacroAssemblerCodeRef::createSelfManagedCodeRef(globalData->jitStubs->ctiNativeCall());
     246    SpecializedThunkJIT jit(1, globalData);
    247247    if (!jit.supportsFloatingPoint())
    248         return globalData->jitStubs->ctiNativeCall();
     248        return MacroAssemblerCodeRef::createSelfManagedCodeRef(globalData->jitStubs->ctiNativeCall());
    249249    jit.loadDoubleArgument(0, SpecializedThunkJIT::fpRegT0, SpecializedThunkJIT::regT0);
    250250    jit.callDoubleToDouble(UnaryDoubleOpWrapper(exp));
     
    253253}
    254254
    255 MacroAssemblerCodePtr logThunkGenerator(JSGlobalData* globalData, ExecutablePool* pool)
     255MacroAssemblerCodeRef logThunkGenerator(JSGlobalData* globalData)
    256256{
    257257    if (!UnaryDoubleOpWrapper(log))
    258         return globalData->jitStubs->ctiNativeCall();
    259     SpecializedThunkJIT jit(1, globalData, pool);
     258        return MacroAssemblerCodeRef::createSelfManagedCodeRef(globalData->jitStubs->ctiNativeCall());
     259    SpecializedThunkJIT jit(1, globalData);
    260260    if (!jit.supportsFloatingPoint())
    261         return globalData->jitStubs->ctiNativeCall();
     261        return MacroAssemblerCodeRef::createSelfManagedCodeRef(globalData->jitStubs->ctiNativeCall());
    262262    jit.loadDoubleArgument(0, SpecializedThunkJIT::fpRegT0, SpecializedThunkJIT::regT0);
    263263    jit.callDoubleToDouble(UnaryDoubleOpWrapper(log));
     
    266266}
    267267
    268 MacroAssemblerCodePtr absThunkGenerator(JSGlobalData* globalData, ExecutablePool* pool)
    269 {
    270     SpecializedThunkJIT jit(1, globalData, pool);
     268MacroAssemblerCodeRef absThunkGenerator(JSGlobalData* globalData)
     269{
     270    SpecializedThunkJIT jit(1, globalData);
    271271    if (!jit.supportsDoubleBitops())
    272         return globalData->jitStubs->ctiNativeCall();
     272        return MacroAssemblerCodeRef::createSelfManagedCodeRef(globalData->jitStubs->ctiNativeCall());
    273273    MacroAssembler::Jump nonIntJump;
    274274    jit.loadInt32Argument(0, SpecializedThunkJIT::regT0, nonIntJump);
     
    287287}
    288288
    289 MacroAssemblerCodePtr powThunkGenerator(JSGlobalData* globalData, ExecutablePool* pool)
    290 {
    291     SpecializedThunkJIT jit(2, globalData, pool);
     289MacroAssemblerCodeRef powThunkGenerator(JSGlobalData* globalData)
     290{
     291    SpecializedThunkJIT jit(2, globalData);
    292292    if (!jit.supportsFloatingPoint())
    293         return globalData->jitStubs->ctiNativeCall();
     293        return MacroAssemblerCodeRef::createSelfManagedCodeRef(globalData->jitStubs->ctiNativeCall());
    294294
    295295    jit.loadDouble(&oneConstant, SpecializedThunkJIT::fpRegT1);
  • trunk/Source/JavaScriptCore/jit/ThunkGenerators.h

    r90237 r94920  
    3232    class JSGlobalData;
    3333    class NativeExecutable;
    34     class MacroAssemblerCodePtr;
     34    class MacroAssemblerCodeRef;
    3535
    36     typedef MacroAssemblerCodePtr (*ThunkGenerator)(JSGlobalData*, ExecutablePool*);
    37     MacroAssemblerCodePtr charCodeAtThunkGenerator(JSGlobalData*, ExecutablePool*);
    38     MacroAssemblerCodePtr charAtThunkGenerator(JSGlobalData*, ExecutablePool*);
    39     MacroAssemblerCodePtr fromCharCodeThunkGenerator(JSGlobalData*, ExecutablePool*);
    40     MacroAssemblerCodePtr absThunkGenerator(JSGlobalData*, ExecutablePool*);
    41     MacroAssemblerCodePtr ceilThunkGenerator(JSGlobalData*, ExecutablePool*);
    42     MacroAssemblerCodePtr expThunkGenerator(JSGlobalData*, ExecutablePool*);
    43     MacroAssemblerCodePtr floorThunkGenerator(JSGlobalData*, ExecutablePool*);
    44     MacroAssemblerCodePtr logThunkGenerator(JSGlobalData*, ExecutablePool*);
    45     MacroAssemblerCodePtr roundThunkGenerator(JSGlobalData*, ExecutablePool*);
    46     MacroAssemblerCodePtr sqrtThunkGenerator(JSGlobalData*, ExecutablePool*);
    47     MacroAssemblerCodePtr powThunkGenerator(JSGlobalData*, ExecutablePool*);
     36    typedef MacroAssemblerCodeRef (*ThunkGenerator)(JSGlobalData*);
     37    MacroAssemblerCodeRef charCodeAtThunkGenerator(JSGlobalData*);
     38    MacroAssemblerCodeRef charAtThunkGenerator(JSGlobalData*);
     39    MacroAssemblerCodeRef fromCharCodeThunkGenerator(JSGlobalData*);
     40    MacroAssemblerCodeRef absThunkGenerator(JSGlobalData*);
     41    MacroAssemblerCodeRef ceilThunkGenerator(JSGlobalData*);
     42    MacroAssemblerCodeRef expThunkGenerator(JSGlobalData*);
     43    MacroAssemblerCodeRef floorThunkGenerator(JSGlobalData*);
     44    MacroAssemblerCodeRef logThunkGenerator(JSGlobalData*);
     45    MacroAssemblerCodeRef roundThunkGenerator(JSGlobalData*);
     46    MacroAssemblerCodeRef sqrtThunkGenerator(JSGlobalData*);
     47    MacroAssemblerCodeRef powThunkGenerator(JSGlobalData*);
    4848}
    4949#endif
  • trunk/Source/JavaScriptCore/runtime/Executable.h

    r94599 r94920  
    174174
    175175#if ENABLE(JIT)
    176         static NativeExecutable* create(JSGlobalData& globalData, MacroAssemblerCodePtr callThunk, NativeFunction function, MacroAssemblerCodePtr constructThunk, NativeFunction constructor)
     176        static NativeExecutable* create(JSGlobalData& globalData, MacroAssemblerCodeRef callThunk, NativeFunction function, MacroAssemblerCodeRef constructThunk, NativeFunction constructor)
    177177        {
    178178            NativeExecutable* executable;
  • trunk/Source/JavaScriptCore/runtime/InitializeThreading.cpp

    r94514 r94920  
    3030#include "InitializeThreading.h"
    3131
     32#include "ExecutableAllocator.h"
    3233#include "Heap.h"
    3334#include "Identifier.h"
     
    3536#include "UString.h"
    3637#include "WriteBarrier.h"
     38#include "dtoa.h"
    3739#include <wtf/DateMath.h>
    3840#include <wtf/Threading.h>
     
    5557#endif
    5658    JSGlobalData::storeVPtrs();
     59    ExecutableAllocator::initializeAllocator();
    5760#if ENABLE(JSC_MULTIPLE_THREADS)
    5861    RegisterFile::initializeThreading();
  • trunk/Source/JavaScriptCore/runtime/JSGlobalData.cpp

    r94811 r94920  
    187187#if ENABLE(ASSEMBLER)
    188188    , executableAllocator(*this)
    189     , regexAllocator(*this)
    190189#endif
    191190    , lexer(new Lexer(this))
     
    441440{
    442441    interpreter->dumpSampleData(exec);
     442#if ENABLE(ASSEMBLER)
     443    ExecutableAllocator::dumpProfile();
     444#endif
    443445}
    444446
  • trunk/Source/JavaScriptCore/runtime/JSGlobalData.h

    r94629 r94920  
    194194#if ENABLE(ASSEMBLER)
    195195        ExecutableAllocator executableAllocator;
    196         ExecutableAllocator regexAllocator;
    197196#endif
    198197
     
    217216#if ENABLE(JIT)
    218217        OwnPtr<JITThunks> jitStubs;
    219         MacroAssemblerCodePtr getCTIStub(ThunkGenerator generator)
     218        MacroAssemblerCodeRef getCTIStub(ThunkGenerator generator)
    220219        {
    221220            return jitStubs->ctiStub(this, generator);
  • trunk/Source/JavaScriptCore/wtf/CMakeLists.txt

    r94452 r94920  
    4343    MathExtras.h
    4444    MessageQueue.h
     45    MetaAllocator.cpp
     46    MetaAllocator.h
     47    MetaAllocatorHandle.h
    4548    NonCopyingSort.h
    4649    ThreadRestrictionVerifier.h
     
    7073    RandomNumber.h
    7174    RandomNumberSeed.h
     75    RedBlackTree.h
    7276    RefCounted.h
    7377    RefCountedLeakCounter.h
  • trunk/Source/JavaScriptCore/wtf/wtf.pri

    r94890 r94920  
    2323    wtf/MD5.cpp \
    2424    wtf/MainThread.cpp \
     25    wtf/MetaAllocator.cpp \
    2526    wtf/NullPtr.cpp \
    2627    wtf/OSRandomSource.cpp \
  • trunk/Source/JavaScriptCore/yarr/YarrJIT.cpp

    r94254 r94920  
    24302430
    24312431        // Link & finalize the code.
    2432         LinkBuffer linkBuffer(*globalData, this, globalData->regexAllocator);
     2432        LinkBuffer linkBuffer(*globalData, this);
    24332433        m_backtrackingState.linkDataLabels(linkBuffer);
    24342434        jitObject.set(linkBuffer.finalizeCode());
  • trunk/Source/JavaScriptCore/yarr/YarrJIT.h

    r78042 r94920  
    6666    int execute(const UChar* input, unsigned start, unsigned length, int* output)
    6767    {
    68         return reinterpret_cast<YarrJITCode>(m_ref.m_code.executableAddress())(input, start, length, output);
     68        return reinterpret_cast<YarrJITCode>(m_ref.code().executableAddress())(input, start, length, output);
    6969    }
    7070
    7171#if ENABLE(REGEXP_TRACING)
    72     void *getAddr() { return m_ref.m_code.executableAddress(); }
     72    void *getAddr() { return m_ref.code().executableAddress(); }
    7373#endif
    7474
  • trunk/Source/JavaScriptGlue/ChangeLog

    r94875 r94920  
     12011-08-18  Filip Pizlo  <fpizlo@apple.com>
     2
     3        The executable allocator makes it difficult to free individual
     4        chunks of executable memory
     5        https://bugs.webkit.org/show_bug.cgi?id=66363
     6
     7        Reviewed by Oliver Hunt.
     8       
     9        Introduced a best-fit, balanced-tree based allocator. The allocator
     10        required a balanced tree that does not allocate memory and that
     11        permits the removal of individual nodes directly (as opposed to by
     12        key); neither AVLTree nor WebCore's PODRedBlackTree supported this.
     13        Changed all references to executable code to use a reference counted
     14        handle.
     15
     16        * ForwardingHeaders/wtf/MetaAllocatorHandle.h: Added.
     17
    1182011-09-09  Mark Hahnenberg  <mhahnenberg@apple.com>
    219
  • trunk/Source/WebCore/ChangeLog

    r94918 r94920  
     12011-09-01  Filip Pizlo  <fpizlo@apple.com>
     2
     3        The executable allocator makes it difficult to free individual
     4        chunks of executable memory
     5        https://bugs.webkit.org/show_bug.cgi?id=66363
     6
     7        Reviewed by Oliver Hunt.
     8       
     9        Introduced a best-fit, balanced-tree based allocator. The allocator
     10        required a balanced tree that does not allocate memory and that
     11        permits the removal of individual nodes directly (as opposed to by
     12        key); neither AVLTree nor WebCore's PODRedBlackTree supported this.
     13        Changed all references to executable code to use a reference counted
     14        handle.
     15
     16        No new layout tests because behavior is not changed.  New API unit
     17        tests:
     18        Tests/WTF/RedBlackTree.cpp
     19        Tests/WTF/MetaAllocator.cpp
     20
     21        * ForwardingHeaders/wtf/MetaAllocatorHandle.h: Added.
     22
    1232011-09-10  Sam Weinig  <sam@webkit.org>
    224
  • trunk/Tools/ChangeLog

    r94917 r94920  
     12011-09-01  Filip Pizlo  <fpizlo@apple.com>
     2
     3        The executable allocator makes it difficult to free individual
     4        chunks of executable memory
     5        https://bugs.webkit.org/show_bug.cgi?id=66363
     6
     7        Reviewed by Oliver Hunt.
     8       
     9        Introduced a best-fit, balanced-tree based allocator. The allocator
     10        required a balanced tree that does not allocate memory and that
     11        permits the removal of individual nodes directly (as opposed to by
     12        key); neither AVLTree nor WebCore's PODRedBlackTree supported this.
     13        Changed all references to executable code to use a reference counted
     14        handle.
     15
     16        * TestWebKitAPI/TestWebKitAPI.xcodeproj/project.pbxproj:
     17        * TestWebKitAPI/Tests/WTF/MetaAllocator.cpp: Added.
     18        (TestWebKitAPI::TEST_F):
     19        * TestWebKitAPI/Tests/WTF/RedBlackTree.cpp: Added.
     20        (TestWebKitAPI::Pair::findExact):
     21        (TestWebKitAPI::Pair::remove):
     22        (TestWebKitAPI::Pair::findLeastGreaterThanOrEqual):
     23        (TestWebKitAPI::Pair::assertFoundAndRemove):
     24        (TestWebKitAPI::Pair::assertEqual):
     25        (TestWebKitAPI::Pair::assertSameValuesForKey):
     26        (TestWebKitAPI::Pair::testDriver):
     27        (TestWebKitAPI::TEST_F):
     28
    1292011-09-10  Andy Estes  <aestes@apple.com>
    230
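
Editorial aside: the best-fit, balanced-tree allocator that these MetaAllocator and RedBlackTree tests exercise can be pictured with the simplified sketch below. It uses std::multimap ordered by chunk size where the real WTF::MetaAllocator uses an intrusive RedBlackTree, and it omits coalescing of adjacent free chunks; it is an illustration only, not the shipped implementation.

    // Hypothetical sketch of best-fit allocation over an ordered free list.
    #include <cstddef>
    #include <map>
    #include <utility>

    class BestFitAllocatorSketch {
    public:
        void addFreeChunk(void* base, size_t size) { m_freeBySize.insert(std::make_pair(size, base)); }

        void* allocate(size_t size)
        {
            // Best fit: take the smallest free chunk that is at least |size| bytes.
            std::multimap<size_t, void*>::iterator it = m_freeBySize.lower_bound(size);
            if (it == m_freeBySize.end())
                return 0; // no chunk is large enough
            void* base = it->second;
            size_t remainder = it->first - size;
            m_freeBySize.erase(it);
            if (remainder)
                addFreeChunk(static_cast<char*>(base) + size, remainder); // return the tail to the free list
            return base;
        }

        void deallocate(void* base, size_t size) { addFreeChunk(base, size); } // no coalescing in this sketch

    private:
        std::multimap<size_t, void*> m_freeBySize; // balanced tree ordered by chunk size
    };

Freeing an individual chunk is just a tree insertion, which is what makes per-chunk reclamation of executable memory practical compared with the pool-based scheme this changeset replaces.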
  • trunk/Tools/TestWebKitAPI/TestWebKitAPI.xcodeproj/project.pbxproj

    r94812 r94920  
    88
    99/* Begin PBXBuildFile section */
     10                0FC6C4CC141027E0005B7F0C /* RedBlackTree.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 0FC6C4CB141027E0005B7F0C /* RedBlackTree.cpp */; };
     11                0FC6C4CF141034AD005B7F0C /* MetaAllocator.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 0FC6C4CE141034AD005B7F0C /* MetaAllocator.cpp */; };
    1012                1A02C84F125D4A8400E3F4BD /* Find.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 1A02C84E125D4A8400E3F4BD /* Find.cpp */; };
    1113                1A02C870125D4CFD00E3F4BD /* find.html in Copy Resources */ = {isa = PBXBuildFile; fileRef = 1A02C84B125D4A5E00E3F4BD /* find.html */; };
     
    131133
    132134/* Begin PBXFileReference section */
     135                0FC6C4CB141027E0005B7F0C /* RedBlackTree.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = RedBlackTree.cpp; path = WTF/RedBlackTree.cpp; sourceTree = "<group>"; };
     136                0FC6C4CE141034AD005B7F0C /* MetaAllocator.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = MetaAllocator.cpp; path = WTF/MetaAllocator.cpp; sourceTree = "<group>"; };
    133137                1A02C84B125D4A5E00E3F4BD /* find.html */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = text.html; path = find.html; sourceTree = "<group>"; };
    134138                1A02C84E125D4A8400E3F4BD /* Find.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = Find.cpp; sourceTree = "<group>"; };
     
    368372                        isa = PBXGroup;
    369373                        children = (
     374                                0FC6C4CE141034AD005B7F0C /* MetaAllocator.cpp */,
     375                                0FC6C4CB141027E0005B7F0C /* RedBlackTree.cpp */,
    370376                                A7A966DA140ECCC8005EF9B4 /* CheckedArithmeticOperations.cpp */,
    371377                                C01363C713C3997300EF3964 /* StringOperators.cpp */,
     
    578584                                C085880013FEC3A6001EF4E5 /* InstanceMethodSwizzler.mm in Sources */,
    579585                                37DC678D140D7C5000ABCCDB /* DOMRangeOfString.mm in Sources */,
     586                                0FC6C4CC141027E0005B7F0C /* RedBlackTree.cpp in Sources */,
     587                                0FC6C4CF141034AD005B7F0C /* MetaAllocator.cpp in Sources */,
    580588                                A7A966DB140ECCC8005EF9B4 /* CheckedArithmeticOperations.cpp in Sources */,
    581589                                939BA91714103412001A01BD /* DeviceScaleFactorOnBack.mm in Sources */,