I'm not entirely sure why WebGPU lacks support for 16-bit (u)integers


March 27, 2024 · 1 min read

  • Dev-Tools
  • web-graphics
  • threejs

Renaud (@onirenaud)

2024-03-27 | ❤️ 59 | 🔁 5


I'm not entirely sure why WebGPU lacks support for 16-bit (u)integers, but to ensure an easier transition from WebGLRenderer to WebGPURenderer, I've implemented a fallback to 32-bit integers in the WebGPU backend: https://github.com/mrdoob/three.js/pull/28008

threejs
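
As described in the tweet, the fallback amounts to storing 16-bit integer data in 32-bit buffers before it reaches the WebGPU backend. The TypeScript sketch below illustrates that widening step; the helper name `widenTo32Bit` and its shape are assumptions for illustration, not the code from the linked PR.

```typescript
// Minimal sketch (illustrative, not the PR's actual code): widen 16-bit integer
// typed arrays to their 32-bit equivalents so the data can be consumed by a
// backend that, per the tweet, doesn't accept 16-bit (u)integers.
type AttributeArray =
  | Uint16Array
  | Int16Array
  | Uint32Array
  | Int32Array
  | Float32Array;

function widenTo32Bit(
  array: AttributeArray
): Uint32Array | Int32Array | Float32Array {
  if (array instanceof Uint16Array) {
    // Constructing a Uint32Array from a Uint16Array copies and widens each element.
    return new Uint32Array(array);
  }
  if (array instanceof Int16Array) {
    return new Int32Array(array);
  }
  return array; // already a 32-bit type: pass through unchanged
}

// Usage: widen a 16-bit attribute array before creating the GPU buffer.
const ids = new Uint16Array([0, 1, 2, 65535]);
const widened = widenTo32Bit(ids); // Uint32Array [0, 1, 2, 65535]
console.log(widened instanceof Uint32Array); // true
```

The trade-off is that widened attributes take twice the memory; the tweet frames this purely as a compatibility measure to ease migration from WebGLRenderer to WebGPURenderer.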


Tags

  • domain-web-graphics
  • domain-rendering

