JavaScript has long outgrown its original target applications: it is used not only to build complex web clients, but also web servers,
games, and even desktop applications. The most appealing advantage of moving applications to JavaScript is the ability to run
the same code on a large number of different devices. It is not surprising, then, that many compilers target JavaScript as an intermediate language.
However, writing optimization and analysis passes for a compiler that emits JavaScript is challenging: time spent optimizing the code
in a certain way can pay off handsomely for some browsers, yet be a futile effort for others. For example, we show that applying JavaScript code
optimizations on a tablet running Windows 8 and Internet Explorer 11 increased performance by, on average, 5 times, while running them on a desktop with
Windows 7 and Firefox decreased performance by 20%. Such a scenario demands a radically new solution for the traditional compiler optimization
flow. This paper proposes collecting web clients' performance data to build a crowdsourced compiler flag suggestion system in the cloud that
helps the compiler perform the appropriate optimizations for each client platform. Because this information comes from crowdsourcing rather than
manual investigation, fruitless or harmful optimizations are automatically discarded. Our approach is based on live measurements taken while
clients use the application on real platforms, proposing a new paradigm for how optimizations are tested.
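The crowdsourced suggestion service described above can be sketched as a simple aggregation store: clients report the speedup each optimization produced on their platform, and the service recommends only optimizations whose measured average speedup on that platform exceeds 1. This is a minimal illustration; the class, method names, and platform/optimization labels are hypothetical, not the paper's actual implementation.

```javascript
// Hypothetical sketch of a crowdsourced flag-suggestion store.
// Clients report (platform, optimization, speedup) triples; the store
// aggregates them and suggests only optimizations that, on average,
// actually helped on that platform (speedup > 1).
class FlagSuggestionStore {
  constructor() {
    // platform -> Map(optimization -> array of reported speedups)
    this.reports = new Map();
  }

  report(platform, optimization, speedup) {
    if (!this.reports.has(platform)) this.reports.set(platform, new Map());
    const byOpt = this.reports.get(platform);
    if (!byOpt.has(optimization)) byOpt.set(optimization, []);
    byOpt.get(optimization).push(speedup);
  }

  suggest(platform) {
    const byOpt = this.reports.get(platform);
    if (!byOpt) return []; // no crowd data for this platform yet
    const suggestions = [];
    for (const [opt, speedups] of byOpt) {
      const avg = speedups.reduce((a, b) => a + b, 0) / speedups.length;
      if (avg > 1) suggestions.push(opt); // discard fruitless/harmful ones
    }
    return suggestions;
  }
}

// Example mirroring the abstract's numbers: an optimization that gives
// a 5x speedup on the tablet but a 20% slowdown (0.8x) on the desktop.
const store = new FlagSuggestionStore();
store.report("IE11/Win8/tablet", "inline-functions", 5.0);
store.report("Firefox/Win7/desktop", "inline-functions", 0.8);
console.log(store.suggest("IE11/Win8/tablet"));     // ["inline-functions"]
console.log(store.suggest("Firefox/Win7/desktop")); // []
```

With live measurements feeding `report`, an optimization that helps on one platform but hurts on another is served only where it pays off, which is the discarding behavior the abstract claims.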