Ah yes... Okay, I see. Thanks for the superclear update!
To add to Caleb's excellent summary, the difference with my technique for packing two values is that it encodes the integers using exactly representable floating-point values (see some of the links I mention above), then decodes them accordingly on the shader side.
Unfortunately, I'm stuck with what GLES 2.0 gives me, so I'm still unsure whether I have an exact decoding or just a good approximation; hence the mention of a test. I'm at least heartened that it no longer wildly freaks out on some values. (By analogy, think about trying different identities for, say, 1 - cos(x). Mathematically they're all equal, but in real-world computing, some might give more accurate or stable results than others. In this case I'm aiming for an equation that is 100% exact.)
 - Up to 20 bits, so for example two integers from 0 to 1023, while still able to just sneak in 1024 as well on a lowest-common-denominator device. Often these will only temporarily be integers, e.g. after a math.floor(x * 1024) transformation, which will be undone once in the shader.
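To make the arithmetic concrete, here's a rough sketch of that pack/unpack round trip (in Python rather than Lua/GLSL; the function names are mine, and the base of 1024 comes from the post above — squeezing in 1024 itself would need a slightly larger base). The key point is that 20 bits stays well under float32's 24-bit mantissa, so the combined value is exactly representable:

```python
import math

BASE = 1024.0  # 10 bits per component; 20 bits total < 2^24, so exact in float32

def pack(a, b):
    """Pack two integers in [0, 1023] into one exactly representable float."""
    assert 0 <= a < BASE and 0 <= b < BASE
    return a * BASE + b

def unpack(packed):
    """Mirror of the shader-side decode: floor-divide, then subtract."""
    a = math.floor(packed / BASE)
    b = packed - a * BASE
    return int(a), int(b)

# Round trip over some edge values:
for a in (0, 1, 511, 1023):
    for b in (0, 512, 1023):
        assert unpack(pack(a, b)) == (a, b)
```

The GLSL-side decode would follow the same shape, e.g. `float a = floor(x / 1024.0); float b = x - a * 1024.0;` — whether that stays exact under mediump precision is exactly what the test mentioned above would check.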
Wow, it's been a while. I've decided that, if uniform userdata is coming fairly soon, I'll just wait and go with that for clarity and "normal Corona-ness." I posted a topic about it here: https://forums.coronalabs.com/topic/59706-uniform-userdata/
The inputs needed:
 - width and height of the layer, in pixels (because GLSL sends coordinates in the [0, 1] range)
 - width and height of the tileset, in tiles
 - width and height of each tile, in pixels
 - preferably, the margin and spacing of the tileset as well

That makes at least 6 inputs required, and ideally 8.
If you were willing to trade off flexibility, might you require that the tileset scheme be one of a few supported configurations, then just pass a single number to describe it? (You'd get a 4-for-1 deal; otherwise you'd fall back to conventional tile rendering for "atypical" tileset schemes.)
tileset scheme "1" might decode to: 256x256 sheet, 16x16 tiles
tileset scheme "2" might decode to: 512x512 sheet, 32x32 tiles
tileset scheme "3" might decode to: 512x512 sheet, 30x30 tiles, border of 1 for AA edge extrusion
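A minimal sketch of how such a preset table might look on the Lua side (shown in Python; the ids and values mirror the three examples above, while the table layout and function name are hypothetical):

```python
# Map a single scheme id to the full tileset parameter set.
SCHEMES = {
    1: {"sheet": (256, 256), "tile": (16, 16), "border": 0},
    2: {"sheet": (512, 512), "tile": (32, 32), "border": 0},
    3: {"sheet": (512, 512), "tile": (30, 30), "border": 1},
}

def decode_scheme(scheme_id):
    """Expand one scheme id into the full parameters; None means
    'atypical tileset, fall back to conventional tile rendering'."""
    return SCHEMES.get(scheme_id)

# Derived values follow, e.g. tiles per row for scheme 3,
# assuming the border pads each side of every tile:
params = decode_scheme(3)
cell = params["tile"][0] + 2 * params["border"]   # 30 + 2 = 32
tiles_per_row = params["sheet"][0] // cell        # 512 // 32 = 16
```

Only the scheme id would need to reach the shader; the expansion could happen either CPU-side before setting uniforms or via a branch/lookup in the shader itself.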
Hm. Interesting idea; I'll think it through. Maybe a better way would be to encode only certain values as presets. That is, the layer width and height could be normal numbers, but the tile size could be a preset, since tiles are usually 16x16 / 32x32 / 64x64. Or, alternatively, I could assume people use square tiles and store the tile size in one variable. Good stuff to think about; thanks for bringing this approach up.
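A tiny sketch of that "presets for tile size only" variant (the sizes come from the post; the names are mine): layer width and height travel as ordinary numbers, while the square tile size is sent as a small preset index.

```python
TILE_SIZES = (16, 32, 64)  # common square tile sizes

def encode_tile_size(px):
    """Tile size in pixels -> preset index; raises ValueError if unsupported."""
    return TILE_SIZES.index(px)

def decode_tile_size(index):
    """Preset index -> tile size in pixels (int() tolerates a float uniform)."""
    return TILE_SIZES[int(index)]

assert decode_tile_size(encode_tile_size(32)) == 32
```

Since the index only needs values 0-2, it could even ride along in the spare bits of one of the packed floats discussed earlier in the thread.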