Announcing 7.0.0-alpha.2
This version brings significant performance improvements! Let's see how caching plans helped us accomplish such a feat.
Context
The current container design
Like previous versions of Inversify, when a service resolution is requested by calling container.get, a planner arranges the bindings to be used into a tree structure (a plan), which is then used to build the instances that compose the requested service.
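To picture what a plan looks like, here is a minimal sketch. The types and names below are illustrative, not Inversify's actual internal API: a plan is a tree of binding nodes, and resolution is a post-order walk that builds leaf dependencies first and feeds them to their parent's factory.

```typescript
// Hypothetical sketch of a resolution plan; names are illustrative,
// not Inversify's real internals.

type ServiceIdentifier = string;

interface PlanNode {
  serviceId: ServiceIdentifier;
  // How to build this node's instance once its dependencies are resolved.
  build: (dependencies: unknown[]) => unknown;
  // One child node per constructor dependency, forming a tree.
  dependencies: PlanNode[];
}

// Resolving a plan is a post-order walk: resolve children first,
// then pass them to the parent node's factory.
function resolve(node: PlanNode): unknown {
  const deps = node.dependencies.map(resolve);
  return node.build(deps);
}

// Example: a Ninja service that depends on a Katana service.
const plan: PlanNode = {
  serviceId: 'Ninja',
  build: ([katana]) => ({ name: 'ninja', weapon: katana }),
  dependencies: [
    { serviceId: 'Katana', build: () => ({ damage: 10 }), dependencies: [] },
  ],
};

const ninja = resolve(plan) as { name: string; weapon: { damage: number } };
console.log(ninja.weapon.damage); // 10
```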
Adding a caching strategy
Previous versions of Inversify don't cache plans. It might sound surprising, but that's the way it is: they expose too many internal APIs, making it impossible for plans to be deterministic. For example, previous versions of Inversify give binding constraints access to the container via the resolution context, opening the door to side effects during the planning phase!
With some reasonable restrictions, inversify@7 keeps the same degree of flexibility with deterministic internal plans. But, wait, if plans are deterministic, they can be cached!
Why cache plans?
A cached plan doesn't need to be computed again, and a plan that is never recomputed is memory the garbage collector never has to reclaim, an operation heavier than you might expect when optimizing at this micro level.
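To make the benefit concrete, here is a minimal memoization sketch (hypothetical names, standing in for Inversify's internal planner, not its real API). Because plans are deterministic, the same service identifier always yields the same plan, so planning runs at most once per service:

```typescript
// Minimal plan-cache sketch; Plan and buildPlan are illustrative
// stand-ins for the internal planner, not Inversify's real API.

type ServiceIdentifier = string;

interface Plan {
  serviceId: ServiceIdentifier;
  resolve: () => unknown;
}

let planningCalls = 0;

// Stand-in for the (expensive) planning phase.
function buildPlan(serviceId: ServiceIdentifier): Plan {
  planningCalls += 1;
  return { serviceId, resolve: () => `instance of ${serviceId}` };
}

const planCache = new Map<ServiceIdentifier, Plan>();

// Deterministic plans make this memoization safe: the same service
// always produces the same plan, so it is computed at most once.
function getPlan(serviceId: ServiceIdentifier): Plan {
  let plan = planCache.get(serviceId);
  if (plan === undefined) {
    plan = buildPlan(serviceId);
    planCache.set(serviceId, plan);
  }
  return plan;
}

getPlan('Ninja').resolve();
getPlan('Ninja').resolve(); // cache hit: no re-planning, no extra garbage
console.log(planningCalls); // 1
```

Note that in a real container such a cache would also need to be invalidated whenever bindings change, e.g. after a bind or unbind call.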
Results
After running some benchmarks, we found that Inversify 7 is faster than Inversify 6 when working with singleton-scoped services, and much faster when working with transient-scoped services.
We will use a faster container as a reference. tsyringe is the perfect candidate: its simplicity allows it to resolve services without any planning phase, so it is bound to be considerably faster than Inversify.
The benchmark implementation can be found in the @inversifyjs/container-benchmarks package. Feel free to run the benchmarks yourself, or have a look at any of the latest monorepo PRs, which include benchmark results.
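For intuition about what these numbers measure, here is a rough, self-contained ops/s sketch. This is not the actual benchmark code (the real suite uses a proper harness with warmup and statistical sampling), and the task below is a hypothetical stand-in for a container.get call:

```typescript
// Rough ops/s measurement sketch; the published benchmarks use a
// dedicated harness rather than a hand-rolled loop like this one.

function measureOpsPerSecond(task: () => void, iterations = 100_000): number {
  // Warm up so the JIT has optimized the task before measuring.
  for (let i = 0; i < 1_000; i++) task();
  const start = process.hrtime.bigint();
  for (let i = 0; i < iterations; i++) task();
  const elapsedNs = Number(process.hrtime.bigint() - start);
  return (iterations / elapsedNs) * 1e9;
}

// Hypothetical task standing in for resolving a service.
const task = () => JSON.parse('{"service":"Ninja"}');
console.log(`${measureOpsPerSecond(task).toFixed(0)} ops/s`);
```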
As of today, the latest benchmark results are as follows:
[CJS] Running benchmarks...

Get service in singleton scope

| (index) | Task name | Latency average (ns) | Latency median (ns) | Throughput average (ops/s) | Throughput median (ops/s) | Samples |
|---------|-----------|----------------------|---------------------|----------------------------|---------------------------|---------|
| 0 | inversifyCurrent | 582.96 ± 0.80% | 561.00 | 1758258 ± 0.01% | 1782531 | 1715395 |
| 1 | inversify6 | 1136.19 ± 0.35% | 982.00 | 961212 ± 0.03% | 1018330 | 880137 |
| 2 | tsyringe | 329.02 ± 1.10% | 291.00 | 3271688 ± 0.01% | 3436426 | 3039353 |

Get service in transient scope

| (index) | Task name | Latency average (ns) | Latency median (ns) | Throughput average (ops/s) | Throughput median (ops/s) | Samples |
|---------|-----------|----------------------|---------------------|----------------------------|---------------------------|---------|
| 0 | inversifyCurrent | 1023.52 ± 0.29% | 992.00 | 1003603 ± 0.01% | 1008065 | 977022 |
| 1 | inversify6 | 5181.39 ± 0.41% | 4960.00 | 198404 ± 0.03% | 201613 | 192999 |
| 2 | tsyringe | 466.06 ± 0.85% | 441.00 | 2261272 ± 0.01% | 2267574 | 2145645 |

Get complex service in singleton scope

| (index) | Task name | Latency average (ns) | Latency median (ns) | Throughput average (ops/s) | Throughput median (ops/s) | Samples |
|---------|-----------|----------------------|---------------------|----------------------------|---------------------------|---------|
| 0 | inversifyCurrent | 614.26 ± 1.74% | 591.00 | 1673265 ± 0.01% | 1692047 | 1627989 |
| 1 | inversify6 | 1061.52 ± 0.48% | 1001.00 | 989393 ± 0.01% | 999001 | 942048 |
| 2 | tsyringe | 335.53 ± 0.58% | 311.00 | 3171618 ± 0.01% | 3215434 | 2980330 |

Get complex service in transient scope

| (index) | Task name | Latency average (ns) | Latency median (ns) | Throughput average (ops/s) | Throughput median (ops/s) | Samples |
|---------|-----------|----------------------|---------------------|----------------------------|---------------------------|---------|
| 0 | inversifyCurrent | 349104.25 ± 0.51% | 340334.00 | 2893 ± 0.28% | 2938 | 2865 |
| 1 | inversify6 | 5023278.98 ± 1.16% | 4783030.50 ± 4958.50 | 200 ± 1.00% | 209 | 200 |
| 2 | tsyringe | 249415.89 ± 0.59% | 237333.00 | 4088 ± 0.32% | 4213 | 4011 |

[ESM] Running benchmarks...

Get service in singleton scope

| (index) | Task name | Latency average (ns) | Latency median (ns) | Throughput average (ops/s) | Throughput median (ops/s) | Samples |
|---------|-----------|----------------------|---------------------|----------------------------|---------------------------|---------|
| 0 | inversifyCurrent | 578.15 ± 0.88% | 541.00 | 1809935 ± 0.01% | 1848429 | 1729642 |
| 1 | inversify6 | 904.50 ± 0.33% | 851.00 | 1159099 ± 0.01% | 1175088 | 1105586 |
| 2 | tsyringe | 287.15 ± 1.12% | 261.00 | 3722553 ± 0.01% | 3831418 | 3482456 |

Get service in transient scope

| (index) | Task name | Latency average (ns) | Latency median (ns) | Throughput average (ops/s) | Throughput median (ops/s) | Samples |
|---------|-----------|----------------------|---------------------|----------------------------|---------------------------|---------|
| 0 | inversifyCurrent | 837.39 ± 0.28% | 802.00 | 1232351 ± 0.01% | 1246883 | 1194187 |
| 1 | inversify6 | 4868.25 ± 0.32% | 4679.00 | 211104 ± 0.03% | 213721 | 205413 |
| 2 | tsyringe | 494.17 ± 0.40% | 461.00 | 2118431 ± 0.01% | 2169197 | 2023611 |

Get complex service in singleton scope

| (index) | Task name | Latency average (ns) | Latency median (ns) | Throughput average (ops/s) | Throughput median (ops/s) | Samples |
|---------|-----------|----------------------|---------------------|----------------------------|---------------------------|---------|
| 0 | inversifyCurrent | 606.92 ± 1.72% | 571.00 | 1739723 ± 0.01% | 1751313 | 1647652 |
| 1 | inversify6 | 924.23 ± 2.51% | 852.00 | 1153936 ± 0.01% | 1173709 | 1081983 |
| 2 | tsyringe | 307.19 ± 0.48% | 290.00 | 3432297 ± 0.01% | 3448276 | 3255286 |

Get complex service in transient scope

| (index) | Task name | Latency average (ns) | Latency median (ns) | Throughput average (ops/s) | Throughput median (ops/s) | Samples |
|---------|-----------|----------------------|---------------------|----------------------------|---------------------------|---------|
| 0 | inversifyCurrent | 249696.67 ± 0.47% | 241791.00 | 4047 ± 0.24% | 4136 | 4005 |
| 1 | inversify6 | 4596691.40 ± 1.18% | 4710800.50 ± 16420.50 | 219 ± 0.99% | 212 ± 1 | 218 |
| 2 | tsyringe | 235461.28 ± 0.43% | 227424.00 | 4296 ± 0.25% | 4397 | 4247 |
As you can see, inversify@7 performs far better than inversify@6 in every scenario, especially when working with transient scopes. It's still slower than tsyringe, but I honestly believe Inversify is now fast enough, while providing a more flexible API.
What's next
There is room for further performance optimization. It's out of scope for inversify@7, but our goal is to make Inversify faster than tsyringe while keeping the same flexible API. If you want to be part of it, don't hesitate to open a discussion in the monorepo and join us in this adventure!
