Adam Caramon said:

keithcurtis said:

The problem is that it's not a universal issue. Others have chimed in to say that they (like I) have not experienced this. There's only so many things that can be tested for. There are a LOT of variables in play: OS, browser, extensions, hardware (including graphics cards), ISPs... Real-world feedback is important.

Which would be a good reason to release these kinds of features on the development server first and ask a wide array of users to test, right? You could even encourage testing on the DEV server by offering some cheap art pack to all users who perform a testing script on the DEV server and report any issues they run into.

FWIW, here is a quick narrated recording of the Chrome Task Manager showing spikes in CPU usage when Roll20 is active on screen: https://youtu.be/7e7bFv9mcNY

-Adam

Interesting. I get similar spikes on my Chromebook, but nothing that I could detect during the play experience; no lag, for instance. On my Mac, I got nothing above an 11.8% spike while scrolling around a very large DL map, and most of the time it sat around 3-4%. The Mac test might not be relevant, though; the M1 handles memory and graphics very differently.

This problem also seemed to hit at about the same time as a Windows update. Since I don't use Windows, that's a possibility as well.

A reward for posting reports on the Dev server is a good idea, but I would think it would have to be merit-based somehow, to prevent "Yeah, looks good" reports from people who didn't really test anything. Maybe a bounty for bugs? As it was, the old system of posting on Dev and asking for feedback did very little. Reports were very few, and could be skewed by the fact that only Pro users could access it.

I agree a better testing system is needed, but I'm not sure it would have really helped in this particular case. For example, despite several reports, I don't know how many people have bothered to return any useful info to Roll20 (console logs, repro steps, machine specs, differences in game size or complexity, diligent testing on multiple browsers or with extensions disabled, etc.). A testing script is also only as good as the problems you expect to find. It's tough getting useful info, which is probably why they had to resort to a form for this one. I don't know what the form looks like, since I'm not experiencing the issue.