Runtime Integrations Performance Plan #698
Comments
Just a couple of comments:

I understand Cesium's priorities, but it might be useful to give a nod to native clients of cesium-native in general, not just those maintained by Cesium. I'm selfishly thinking of vsgCs, but there are (or will be) others, and the performance of cesium-native may be easier to characterize against a lighter-weight client than Unreal et al.

I have had great luck with Tracy and have annotated the parts of vsgCs that implement the cesium-native interfaces with the Tracy macros. It would be awesome if the cesium-native tracing annotations "just worked" with Tracy; maybe they do, but I haven't tried it yet.

At least some of the performance impact of cesium-native on the client comes from the quality of the glTF model representation of the tiles. #610 would be useful for testing that, among other things.
One thing that we should consider is identical test content across platforms where this is feasible, so that we can establish and maintain a consistent performance baseline across Unity, Unreal, Omniverse, and others.
@timoore, great comments. And yes, we should absolutely consider other projects using cesium-native. I've updated the verbiage for this.

@r-veenstra Agreed. There is a lot of value in finding performance consistency (or inconsistency) among Cesium's integrations. I've updated the description, adding sections for Performance Reporting and Testing. There are more ideas and details that need to be fleshed out... Comments still very welcome.
Updated the description to add 'Unknowns' and 'Proposed Task Schedule' sections. I think the main ideas are fleshed out. There are still unknowns as to the role of cesium-native and how to maintain performance testing consistency. At the very least, we are asking good questions.
A couple of notes from my team at Cesium:
Making these easy to measure would be very helpful to us!
Closing this. I believe we've reaped a good deal of value from this issue, and I don't believe the remaining work is worth pursuing, at least right now. If any of it is, it should probably be written up in a smaller, more targeted issue, so please do so if any of your ideas got lost. Here are some highlights since this issue was written:
Some outstanding questions... (answered to the best of my ability)
Originally this issue set out to create a plan, but along the way it came to support a methodology:
Thank you to everyone who contributed to this issue!
(currently work in progress, feedback welcome)
This issue proposes a methodology for determining performance and answering questions such as:
Is our product performant?
What are the questions we should ask?
How do we answer them?
This applies to Cesium's runtime integrations, but it also extends to any community project that uses cesium-native:
cesium-unreal
cesium-unity
cesium-omniverse
Background
When we say performance, what are we referring to?
How do we measure performance?
The Case for Performance Reporting
We need a baseline. A performance report shows how our products are doing, quickly, without a deep dive into technical data.
The details should be intuitive, whether the reader is in programming, project management, or business development.
Example Report Output
In the example above, the Cesium for Unreal plugin is tested for load time under various conditions. Three test instances were run; each line shows the instance's conditions and whether it passed.
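As a concrete sketch of the idea, a report like this could be generated from structured test results. The field names and values below are illustrative placeholders, not an existing Cesium format:

```python
# Hypothetical test results: each dict is one test instance, with its
# conditions, measured load time, and pass/fail state. These names and
# numbers are invented for illustration only.
results = [
    {"cache": "empty", "location": "Denver",    "load_time_s": 4.2, "passed": True},
    {"cache": "warm",  "location": "Denver",    "load_time_s": 1.8, "passed": True},
    {"cache": "empty", "location": "Melbourne", "load_time_s": 6.1, "passed": False},
]

def summarize(results):
    """Produce one human-readable report line per test instance."""
    lines = []
    for r in results:
        status = "PASS" if r["passed"] else "FAIL"
        lines.append(f"{status}  cache={r['cache']:<6} "
                     f"location={r['location']:<10} "
                     f"load={r['load_time_s']:.1f}s")
    return "\n".join(lines)

print(summarize(results))
```

The point of a format like this is that the same structured data can feed both a quick human-readable summary and deeper technical analysis.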
Reports should help with hard questions
A programmer might ask:
A business developer may ask:
A product owner could ask:
Performance Testing
Reports summarize data, and that data comes from performance testing.
Performance tests answer specific questions about predefined use cases. The best tests are simple ones, for example: "How long did a level take to load?" Add more conditions to a test to create more coverage. Create enough tests with enough coverage, and you get closer to a comprehensive picture of performance.
Test data (even a report) should be generated without user intervention. The ultimate goal would be to run under continuous integration with extensive coverage across device types and use cases.
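To make the "without user intervention" goal concrete, here is a minimal sketch of an unattended test runner. The `load_level` callable and the time budget are hypothetical stand-ins for whatever the runtime integration under test actually provides:

```python
import json
import time

def run_load_test(load_level, budget_s=5.0):
    """Time one load operation and emit a machine-readable record
    that a report generator could consume later."""
    start = time.perf_counter()
    load_level()
    elapsed = time.perf_counter() - start
    return {"load_time_s": round(elapsed, 3), "passed": elapsed <= budget_s}

# Stand-in for a real level load, so the sketch runs end to end.
record = run_load_test(lambda: time.sleep(0.01))
print(json.dumps(record))
```

Emitting machine-readable records like this is what would let a CI job collect results across many devices and conditions without anyone watching.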
Example Tests
Test 1: Load times when starting an app
A pseudo-definition of how we could conceptually define a test.
The common section defines what any runtime integration could implement. It should be described well enough to:
This shows 2 states the cache can be in (empty or warm) in addition to 3 different locations to start in. This means at minimum 6 (2x3) different instances of the test should be run.
The client-specific section would define additional aspects that cesium-native would not know about, such as the Unreal versions or platforms being built. These would add more condition sets, multiplying the 6 runs by 3 and then by 5, for a total of 90 (6x3x5).
Already we see that judicious use of conditions will be needed to keep report generation time reasonable.
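The combinatorics above can be sketched as a cross product of condition sets. The specific condition values below are illustrative placeholders; only the counts (2, 3, 3, 5) match the arithmetic in the text:

```python
from itertools import product

# Conditions any runtime integration could implement.
common = {
    "cache": ["empty", "warm"],            # 2 cache states
    "start_location": ["A", "B", "C"],     # 3 starting locations
}
# Conditions cesium-native would not know about (placeholder values).
client_specific = {
    "engine_version": ["x", "y", "z"],                          # 3 versions
    "platform": ["Win64", "Linux", "macOS", "Android", "iOS"],  # 5 platforms
}

conditions = {**common, **client_specific}
instances = [dict(zip(conditions, values))
             for values in product(*conditions.values())]

print(len(instances))  # 2 * 3 * 3 * 5 = 90 test instances
```

Trimming any one condition set cuts the total multiplicatively, which is why judicious pruning matters so much here.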
Test 2: Per-tile load times
Test 3: Frame rate when flying from location to location
The Role of Cesium-Native
Cesium-native does not know about the project that is using it, but it can provide supporting tools and methodology.
It is ultimately the runtime integration's responsibility to integrate performance tests into the project.
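One hypothetical shape such a supporting tool could take is a scoped timer that records named spans, which each runtime integration then forwards to its own profiler or report. This is a sketch of the concept, not an API cesium-native actually exposes:

```python
import time
from contextlib import contextmanager

spans = []  # collected (name, seconds) pairs

@contextmanager
def traced(name):
    """Record how long the enclosed block took, under a given name.
    An integration could forward these spans to Tracy, Unreal Insights,
    or a report generator."""
    start = time.perf_counter()
    try:
        yield
    finally:
        spans.append((name, time.perf_counter() - start))

# Example: instrument a (simulated) tileset load.
with traced("tileset-load"):
    time.sleep(0.01)

print(spans[0][0])
```

Because the span names would come from the shared library, the same instrumentation points could produce comparable measurements across Unreal, Unity, and Omniverse.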
Research will be required here:
Unknowns
What will be the integration workflow to adopt performance testing?
For example, if I'm developing for the Cesium for Unreal plugin and want to see performance reporting, I would:
This sounds ok, but could it be better?
What does this offer implementations on mobile devices (tablets and phones)? Traditionally it is hard to integrate automated performance testing here. Does the shared nature of cesium-native give us an advantage?
What does this offer VR / AR / XR headset development?
How far into development workflow does this extend? As prevalent as unit tests in CI?
Proposed Task Schedule
(replace with actual issues as they are created)
Reference
Performance improvements for unreal and native
CesiumGS/cesium-unreal#867
Load tracing in cesium native
#267