2

I need the GPU to do some computations, and I pass the shader an int array using glTexImage2D (internal format GL_RGBA). I use texture() to read the data back, but it returns a vec4. Is it possible to convert a vec4 to an int in GLSL?

Alex
KiBa1215
  • Keep in mind that the average embedded system does not have the same processing power as a regular computer, depending on how your game looks you might not have that much gpu processing power to spare for physics (which I assume is your intent). – Daniel Carlsson Jun 24 '15 at 08:26

2 Answers

3

I think you can convert it manually by manipulating the bytes yourself.

uvec4 bytes = uvec4(col * 255.0);
uint integerValue = (bytes.r << 24) | (bytes.g << 16) | (bytes.b << 8) | bytes.a;

The code is untested, but the basic idea is to convert your normalized RGBA colors (0-1) into 0-255 bytes, shift each byte into place, and OR them together.

acidleaf
  • It's helpful. But in my shader, uvec4 bytes = uvec4(tex * 255.0); uint i = (bytes.a << 24) | (bytes.b << 16) | (bytes.g << 8) | (bytes.r); works for me. :) Thank you anyway. – KiBa1215 Jun 25 '15 at 07:32
  • Note that GL ES only implements bitshifts in version 3. This solution wouldn't work on gl es < 3 (i.e. webgl 1) – Edwin Joassart Aug 31 '22 at 07:23
2

Shaders are heavily optimized for floating point, not integer math. The four values you will get are your RGBA values, each from 0 to 1. That's what the texture is meant to represent, and that's what you'll get.

If you need to convert these four values to one larger value, you are more than welcome to do something like

highp float encoded = floor(sampled.a * 255.0 + 0.5) + floor(sampled.b * 255.0 + 0.5) * 256.0 + floor(sampled.g * 255.0 + 0.5) * 65536.0 + floor(sampled.r * 255.0 + 0.5) * 16777216.0;

Or the other way around if you encoded your values in big endian.

Remember that the resulting value is a float, and you may have precision problems if you start jockeying with the result.

Panda Pajama
  • Thank you. I tried your approach and am now facing a problem: I have passed an int array (such as "123, 123, 123, ..., 123") to the shader, but what I get in the shader through texture() is always vec4(0,0,0,1). Why? @Panda Pajama – KiBa1215 Jun 24 '15 at 09:17
  • @KiBa1215: That would be a completely different problem. Are you loading the texture correctly? Is the sampler set correctly? Is the uniform value for your sampler set correctly? – Panda Pajama Jun 24 '15 at 09:46
  • Note that GL ES only implements bitshifts in version 3. This solution works also on gl es < 3 (i.e. webgl 1) – Edwin Joassart Aug 31 '22 at 07:22
  • @EdwinJoassart The snippet in my answer doesn't do bitshifts. – Panda Pajama Sep 24 '22 at 08:13
  • @PandaPajama that's what I'm saying. This is a great alternative to bitshifting which is not available below gles 3 – Edwin Joassart Sep 24 '22 at 15:32