Date:  16 May 2017 09:18:34 Hong Kong Time

Possible performance regressions across different versions of SpiderMonkey in interpreted mode

NNTP-Posting-Host:  2620:0:1006:1:186f:e19f:36fb:572


I work on a project called Cobalt, an application container designed to run web apps, primarily on embedded TV devices.  We use SpiderMonkey as our JavaScript engine, and we are currently rebasing from SpiderMonkey 24 to SpiderMonkey 45.  On our application-specific benchmarks, which measure input latency (effectively total JavaScript execution time), we are seeing a performance regression of about 40% when running in interpreted mode on a Raspberry Pi 1, our reference low-end platform.

While we plan to investigate as much as we can on our side, I would like to ask: is a performance regression of this magnitude to be expected?  Our SpiderMonkey configuration, HTML application, and bindings code are all held about as constant as they can be.  If yes, are there any high-ROI configuration (or even code) changes we can make to mitigate it?  If no, which areas should we start looking into first?
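For reference, here is how we have been isolating the engine itself from our embedding and bindings code: we run the same benchmark script against each version's standalone js shell with all JITs disabled.  This is just a sketch — the paths and the benchmark file are placeholders for our setup — but the --no-ion and --no-baseline shell flags are what we use to force interpreter-only execution in both versions.

```sh
# Compare the two engines' standalone shells directly, with all JITs
# disabled, to separate engine-level regressions from changes in our
# embedding/bindings code.  ./js24/js, ./js45/js, and bench.js are
# placeholders for the respective shell builds and our benchmark script.
time ./js24/js --no-ion --no-baseline bench.js
time ./js45/js --no-ion --no-baseline bench.js
```

If the standalone shells reproduce the ~40% gap, the regression is in the interpreter itself rather than in our integration.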