From:  David Teller <dteller@mozilla.com>
Date:  30 May 2017 20:44:21 Hong Kong Time
Newsgroup:  news.mozilla.org/mozilla.dev.tech.js-engine.internals
Subject:  Re: Possible performance regressions across different versions of SpiderMonkey in interpreted mode

NNTP-Posting-Host:  63.245.214.181

   Hi Nathan,

 I'm sorry I can't answer your questions. I haven't heard of any
regression, but I don't think we perform regular testing on RPi. Do you,
by any chance, have profiles for both versions? I can't promise that
someone will have time to look at them, but if someone does, it would be
helpful.

Cheers,
 David

On 5/16/17 3:18 AM, nathanrogers--- via dev-tech-js-engine-internals wrote:
> Hello,
> 
> I work on a project called Cobalt (cobalt.foo), an application container designed to run web apps (namely youtube.com/tv) on embedded TV devices.  We use SpiderMonkey as our JavaScript engine and are currently rebasing from SpiderMonkey 24 to SpiderMonkey 45.  On our application-specific benchmarks, which measure input latency (effectively total JavaScript execution time), we are seeing a performance regression of about 40% when running in interpreted mode on a Raspberry Pi 1, our reference low-end platform.
> 
> While we plan to investigate as much as possible on our side as well, I would like to ask: is a performance regression of this magnitude to be expected?  Our SpiderMonkey configuration, HTML application, and bindings code are all held about as constant as they can be.  If a regression is expected, are there any high-ROI configuration (or even code) changes we can make to mitigate it?  If not, which areas should we start looking into first?
> 
> Thanks,
> 
> -Nathan
> _______________________________________________
> dev-tech-js-engine-internals mailing list
> dev-tech-js-engine-internals@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-tech-js-engine-internals
>