PerfDog’s development began with the explosive popularity of PUBG: Battlegrounds. In 2018, PUBG took the world by storm, boosting global PC hardware sales by 40% that year. The development team then set out to port the game to mobile while preserving its gameplay and PC-like experience, which led to PUBG Mobile. This was the first time the PUBG team had attempted a next-gen-quality game on a mobile platform using the UE engine, and it presented a significant challenge.
During the performance testing phase, the team faced cross-platform testing issues between iOS and Android. On iOS, Xcode’s profiling tools worked only on a Mac and only against debug builds. With over a million lines of code in a UE-engine game, a clean debug compile took at least a day, making the process inefficient. The team also lacked comparative data on competitors’ performance and on industry standards.
On Android, the platform’s openness meant manufacturers shipped customized system versions and thousands of testing tools existed, but no unified tool offered consistent analysis. Compatibility and accuracy issues compounded the challenge, resulting in frequent retests and misjudgments.
Faced with these challenges, Awen decided to lead the team in creating a solution. Their goal was to develop a simple, highly usable tool capable of supporting all mobile platforms. This marked the inception of PerfDog, which showed promising results in internal Tencent projects. As it was adopted for more projects, feedback from developers, testers, product managers, and designers helped refine PerfDog, leading to its public release in 2019.
From a game operator’s perspective, development teams strive to deliver better game experiences to players. With PUBG Mobile widely expected to become a phenomenon, OEMs paid close attention to its performance on their devices, sending their flagship phones to the game team for performance testing. This collaboration highlighted external demand for performance testing, prompting the PerfDog team to extend their tools to OEMs and improve player experiences across devices.
As mobile gaming grew, Awen and his team noticed players and tech YouTubers using PerfDog for performance evaluations. Users preferred PerfDog over traditional benchmarking, which only indicated hardware performance but not necessarily gaming experience quality. PerfDog effectively identified hardware performance issues and bugs, crucial factors affecting gameplay.
Social media and YouTube influenced players’ perceptions of device performance, impacting purchasing decisions, which in turn encouraged OEMs to optimize their hardware to address weaknesses identified by PerfDog. Sometimes, issues stem from inadequate GPU or chip optimization, prompting collaboration between OEMs and chip manufacturers for targeted improvements.
Through these partnerships, led by the shared goal of enhancing gaming performance, PerfDog fostered an ecosystem of continuous improvement, involving game companies, OEMs, and tech influencers.
Initially designed for gaming, PerfDog’s applicability caught the attention of non-gaming teams, demonstrating its versatility. By opening up to external manufacturers, PerfDog found applications across smartphones, chips, and IoT devices, evolving into a universal testing tool suitable for diverse scenarios.
To ensure reliability and professional performance metrics, Awen’s team continuously validates PerfDog’s accuracy. Collaborations with the National Institute of Metrology helped PerfDog achieve authoritative certification. Legal compliance aided by Tencent’s international legal team further solidified PerfDog’s standing.
Awen recalls, “Initially, PerfDog was to serve the internal PUBG project. Later, it expanded within Tencent, then to domestic manufacturers. Eventually, international partners, including Supercell, Riot, EA, and companies like Samsung and SK, expressed interest, leading to PerfDog’s global launch in late 2019. By 2020, its presence at GDC garnered even more attention.”
PerfDog has evolved from a standalone test tool into a suite that integrates cloud services and the PerfDog Service for industrial-scale performance management. The team has also developed products such as PerfSight and CrashSight for in-game user performance analysis and crash analysis, offering comprehensive and convenient performance monitoring. PerfDog now supports a wide range of devices, from mobile phones to VR headsets, providing end-to-end performance testing for apps, videos, browsers, and networks.
PerfDog features several intuitive metrics developed from team experience, including unique indicators like Jank, Smooth, and Frame Power (FPower).
At WWDC 18, Apple introduced Frame Pacing, highlighting that a high frame rate doesn’t always equate to a smooth experience. In Apple’s comparison, the left sequence averages 40 FPS yet stutters visibly because one frame exceeds 100 ms, while the right sequence runs at a steady 30 FPS with a consistent 33 ms per frame.
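Apple’s frame-pacing point can be made concrete with two hypothetical frametime sequences (the numbers below are illustrative, chosen to mirror the 40 FPS vs. 30 FPS comparison; they are not Apple’s actual data):

```python
def avg_fps(frame_times_ms):
    """Average frame rate implied by a list of per-frame times (ms)."""
    return 1000 * len(frame_times_ms) / sum(frame_times_ms)

# 10 frames in ~250 ms -> 40 FPS on average, but one 100 ms spike
left = [16.67] * 9 + [100]
# Steady ~30 FPS: every frame takes the same 33.33 ms
right = [33.33] * 9

print(round(avg_fps(left)))   # 40 -> higher FPS, yet visibly stutters
print(round(avg_fps(right)))  # 30 -> lower FPS, yet perfectly even pacing
print(max(left), max(right))  # worst frame: 100 ms vs 33.33 ms
```

The averages hide exactly what the worst-frame numbers reveal: evenness of pacing, not average FPS, is what the eye perceives as smoothness.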
Though early Android had a reputation for lag, Google’s Project Butter, introduced in Android 4.1, brought quantifiable smoothness metrics. Google’s jank calculation logic works off the display’s v-sync interval: if a new frame is not ready by the next v-sync, the display repeats the previous frame, and that missed refresh is counted as a jank.
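A minimal sketch of this v-sync-based counting, assuming a 60 Hz display (v-sync every 1000/60 ≈ 16.67 ms); this is an illustration of the logic described above, not Google’s or PerfDog’s exact implementation:

```python
def count_janks(frame_times_ms, vsync_ms=1000 / 60):
    """Count frames that missed a v-sync deadline.

    A frame whose render time exceeds the v-sync interval forces the
    display to repeat the previous frame; each such miss counts as a
    jank under the v-sync-based logic. (Illustrative sketch only.)
    """
    return sum(1 for t in frame_times_ms if t > vsync_ms)

# Nine smooth 16 ms frames plus one 100 ms spike -> 1 jank
frames = [16] * 9 + [100]
print(count_janks(frames))  # 1
```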
Awen observed that FPS does not fully capture the user experience, so multiple metrics must be combined for comprehensive analysis. The initial method of combining FPS and Jank produced results that diverged from user perception, which led to PerfDog’s refined Jank metrics. Following the upgrade, the team launched the enhanced PerfDogJank indicator and promoted it across the gaming industry. However, a new challenge soon emerged. By 2019, most mobile devices had 60 Hz displays, while flagship phones released after 2020 already exceeded 120 Hz. Rapid hardware upgrades raised user expectations of game performance: even the slightest lag may now be noticed, and the established PerfDogJank standard was no longer sufficient for performance testing. Consequently, later versions introduced a new indicator, SmallJank, designed to accurately capture minor stutters during gameplay and reflect the real user experience.
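PerfDog’s public documentation describes its refined jank criterion roughly as follows: a frame counts as a jank when its frametime exceeds twice the average of the previous three frames and also exceeds two movie-frame durations (2 × 1000/24 ≈ 83.33 ms). The sketch below is a paraphrase of that documented rule, not PerfDog’s actual code:

```python
def perfdog_style_janks(frame_times_ms):
    """Sketch of a PerfDog-style jank count (paraphrased from public
    docs): a frame is a jank when it takes more than twice the average
    of the previous three frames AND exceeds two movie-frame times
    (2 * 1000/24 ms). Illustrative only."""
    MOVIE_FRAME_MS = 1000 / 24
    janks = 0
    for i in range(3, len(frame_times_ms)):
        prev_avg = sum(frame_times_ms[i - 3:i]) / 3
        t = frame_times_ms[i]
        if t > 2 * prev_avg and t > 2 * MOVIE_FRAME_MS:
            janks += 1
    return janks

frames = [16, 16, 16, 16, 200, 16]
print(perfdog_style_janks(frames))  # 1
```

The relative condition (vs. the previous three frames) is what separates a genuine stutter from a uniformly low but steady frame rate, which a fixed threshold alone would misclassify.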
The Smooth index offers a more precise evaluation of the fluidity of a game or app. As the frame-rate screenshot below illustrates, the overall frame rate can remain relatively stable while the time taken by individual frames fluctuates noticeably, as reflected in the FrameTime curve. In other words, stuttering can persist even when the frame rate is unchanged. Moreover, the duration and severity of each stutter vary, so users perceive them differently even when the number of stutters is the same. The development team needed more precise metrics to analyse this phenomenon, and to this end PerfDog introduced the Smooth indicator.
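The gap a smoothness metric must capture can be shown with two runs that share the same average FPS but differ wildly in per-frame fluctuation. The helper below is an illustrative sketch (frametime jitter measured as standard deviation), not PerfDog’s actual Smooth formula:

```python
import statistics

def fps_and_jitter(frame_times_ms):
    """Return (average FPS, frametime jitter as population std dev).

    Two runs can report identical FPS while one fluctuates far more
    frame to frame -- the difference the Smooth index is meant to
    expose. (Sketch only; not PerfDog's Smooth formula.)
    """
    avg_ms = sum(frame_times_ms) / len(frame_times_ms)
    return 1000 / avg_ms, statistics.pstdev(frame_times_ms)

steady = [20] * 10        # 50 FPS, perfectly even pacing
bursty = [10, 30] * 5     # also 50 FPS, but frametimes swing 10-30 ms
print(fps_and_jitter(steady))  # (50.0, 0.0)
print(fps_and_jitter(bursty))  # (50.0, 10.0)
```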
While working with numerous project teams, the PerfDog developers heard repeated requests for more scientific, quantitative indicators of power-consumption performance. This is the rationale behind PerfDog’s FPower metric (energy consumption per frame).
Awen commented, “From the user’s perspective, what matters is that games and apps run at high frame rates and high image quality with low heat and low power consumption. However, this approach to performance scheduling inevitably fragments the user experience: as the phone’s temperature rises and its energy consumption increases, the processor clocks down, frame rates drop, and the user is left with a stuttering, overheating device.” The PerfDog team therefore analysed the time and energy consumed by each frame, which led to the development of FPower.
Frame rate is determined by waiting and computation, while power consumption is driven by computation and scheduling. Only by optimising computation can power consumption be truly reduced.
In the early stages of development, teams may take measures such as reducing the frame rate to optimise energy consumption. While this cuts power draw, it degrades the user experience. The objective is to reduce power without a noticeable change in frame rate. FPower = Power / Frame Rate measures per-frame energy cost more accurately and has become a key performance indicator for power optimisation in PerfDog.
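A quick worked example of the FPower definition, with made-up power figures for illustration (power in mW divided by FPS gives energy per frame in mJ):

```python
def fpower(power_mw, fps):
    """FPower = power / frame rate, i.e. energy per frame (mJ/frame)."""
    return power_mw / fps

# Halving the frame rate lowers total power draw, but FPower shows
# each frame has actually become MORE expensive to render:
print(fpower(3000, 60))  # 50.0 mJ/frame at 60 FPS, 3.0 W
print(fpower(2000, 30))  # ~66.7 mJ/frame at 30 FPS, 2.0 W
```

This is exactly the trap described above: the raw power number improves while per-frame efficiency worsens, which is why FPower, not total power, is the optimisation target.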
Q: Are there any breakthroughs or plans for PerfDog to access deeper-level metrics?
Awen: Absolutely, we aim to provide users with more comprehensive and in-depth low-level metric information. We are actively collaborating with hardware manufacturers like Qualcomm and Imagination. In the near future, we hope PerfDog will be able to offer users detailed information at the hardware and even driver levels, facilitating better and faster performance issue identification.
Q: In game projects, how do various numerical parameters affect the client, and which performance metrics should a new game focus on optimizing?
Awen: Games are incredibly complex from an app perspective, second only to operating systems. During game testing, looking at just one or a few metrics won’t reveal the true performance standards. We often advise examining a comprehensive range of metrics. Furthermore, differences between game types mean performance metrics can vary widely, so it’s not advisable to arbitrarily set standards for projects. I suggest using industry benchmarks from similar competitors as a reference for optimization. PerfDog already offers this functionality, and I encourage users to explore it.
Q: Can PerfDog use a single account to test multiple devices simultaneously?
Awen: Testing multiple phones with a single account on one computer is already a feature of PerfDog. By launching PerfDog software multiple times on a PC, you can test up to three phones simultaneously. Users might wonder why it’s limited to three devices. Given the extensive performance metrics and UI displayed during testing, testing too many phones simultaneously could lead to incomplete results or system overloads. We recommend testing on a maximum of three devices.
Are you curious about how PerfDog can elevate your game testing experience? Or perhaps you'd like to dive deeper into our other cutting-edge testing strategies? Either way, we'd love to hear from you. Our expert team is here to connect and provide you with the guidance and support you need to ensure your game testing is both efficient and precise.
Book a Meeting with us!
Furthermore, we cordially invite you to try out Tencent's UDT platform, a cloud-based solution that grants you remote access to devices and seamlessly integrates with your local test devices, thereby broadening your testing horizons. We firmly believe that UDT can bring unmatched convenience and efficiency to your game testing endeavors.
WeTest, with over a decade of experience in quality management, is an integrated quality cloud platform dedicated to establishing global quality standards and enhancing product quality. As a member of the IEEE-approved Global Game Quality Assurance Working Group, it is recognized for its commitment to quality assurance. WeTest has served over 10,000 enterprise clients across 140+ countries.
Focusing on advanced testing tools development, WeTest integrates AI technology to launch professional game testing tools such as PerfDog, CrashSight, and UDT (Next-Gen Multi-Terminal Unified Access Management Automated Testing Platform), aiding over a million developers worldwide in boosting efficiency. Additionally, WeTest offers comprehensive testing service solutions for mobile, PC, and console games, covering compatibility, security, functionality, localization testing and other various services, ensuring product quality for over one thousand game companies globally.
Give it a try for free today. Register Now!