
[Bug] Model throws console warning and does not load #9237

Open · 1 of 7 tasks
joshnice opened this issue Nov 7, 2024 · 4 comments
Labels: bug

joshnice commented Nov 7, 2024

Description

Hi,

I have a model that I tried loading into the latest version of deck.gl, but it does not load and I get the warning below. The same model worked in deck.gl version 8.

I understand this may be a luma.gl issue and may not be fixed, but do you know what I can do to the model itself to get it to load?

I was using the model on a ScenegraphLayer added to a MapboxOverlay.

"[.WebGL-00007A84000E4D00] GL_INVALID_OPERATION: Insufficient buffer size."

Locker.zip
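
For context, this is roughly how the layer is set up (a minimal sketch; the file name, position, and map setup are simplified stand-ins):

import {MapboxOverlay} from '@deck.gl/mapbox';
import {ScenegraphLayer} from '@deck.gl/mesh-layers';

// 'locker.glb' stands in for the model from the attached Locker.zip.
const layer = new ScenegraphLayer({
  id: 'locker',
  data: [{position: [0, 0]}],
  scenegraph: 'locker.glb',
  getPosition: d => d.position,
  sizeScale: 1
});

// `map` is an existing mapbox-gl Map instance.
map.addControl(new MapboxOverlay({layers: [layer]}));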

Flavors

  • Script tag
  • React
  • Python/Jupyter notebook
  • MapboxOverlay
  • GoogleMapsOverlay
  • CartoLayer
  • ArcGIS

Expected Behavior

The model loads.

Steps to Reproduce

Load the attached file into deck.gl.

Environment

  • Framework version: Latest
  • Browser: Chrome
  • OS: Windows

Logs

[.WebGL-00007A84000E4D00] GL_INVALID_OPERATION: Insufficient buffer size.

joshnice added the bug label Nov 7, 2024
donmccurdy (Collaborator) commented Nov 14, 2024

It seems the problem is caused by the uint8 indices in this model. I'm not sure where the fix would belong, but you can work around the issue by switching to uint16 indices. After loading the model into https://gltf.report/, run the following in the script panel in the sidebar...

// Widen every primitive's index accessor from uint8 to uint16.
for (const mesh of document.getRoot().listMeshes()) {
  for (const prim of mesh.listPrimitives()) {
    const indices = prim.getIndices();
    if (indices) {
      indices.setArray(new Uint16Array(indices.getArray()));
    }
  }
}

... then re-export from the right-hand panel.
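
If it's easier to script this locally than in gltf.report, a similar sketch using glTF-Transform's NodeIO (assuming Node.js with @gltf-transform/core installed, run from an ES module so top-level await works; file names are placeholders):

import {NodeIO} from '@gltf-transform/core';

const io = new NodeIO();
const document = await io.read('locker.glb'); // placeholder input path

// Same conversion as above: widen uint8 index arrays to uint16.
for (const mesh of document.getRoot().listMeshes()) {
  for (const prim of mesh.listPrimitives()) {
    const indices = prim.getIndices();
    if (indices) {
      indices.setArray(new Uint16Array(indices.getArray()));
    }
  }
}

await io.write('locker-uint16.glb', document); // placeholder output path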

ibgreen (Collaborator) commented Nov 14, 2024

I suppose we could add a simple step to loaders.gl post-processing that converts the indices (a rough sketch follows below). I'd prefer that to trying to chase down support for 8-bit indices across the frameworks:

  1. WebGPU doesn't support 8-bit indices.
  2. I have lately dealt with indices in other contexts (Arrow support) and have come to favor always using 32-bit indices; it simplifies the code that consumes them.
  3. The savings from 8- and 16-bit indices are usually not significant: they only apply to small models, where there isn't much to save, while a really big model, where size savings would matter most, needs 32-bit indices anyway.
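
As a rough sketch of what such a step could look like (the function name and the accessor shape are assumptions for illustration, not actual loaders.gl API):

// Hypothetical post-processing helper: widen uint8 index arrays to uint16.
// (Name and accessor shape are illustrative assumptions, not loaders.gl API.)
function widenUint8Indices(indices) {
  if (indices && indices.value instanceof Uint8Array) {
    return {...indices, value: new Uint16Array(indices.value)};
  }
  return indices;
}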

donmccurdy (Collaborator) commented Nov 14, 2024

@ibgreen skipping support for uint8 indices, and doing the conversion in loaders.gl, sounds very reasonable to me. The graphics drivers under WebGL and WebGPU are likely converting uint8 indices to uint16 or uint32 behind the scenes anyway.

joshnice (Author) commented

> It seems the problem is caused by the uint8 indices in this model. [...] then re-export from the right-hand panel.

Thank you!
