Getting pixels from texture/image [OpenGL]

Five_stars

13-05-2009 18:29:26

I've got a strange (I think) problem. I have to read bytes from an image. My code (based on viewtopic.php?f=3&t=4252&p=24307&hilit=data+pointer#p24307) works with the DirectX renderer, but for several reasons I have to use OpenGL.

The strange part is that under OpenGL I get only black pixels (0.0 0.0 0.0 0.0), for all kinds of textures (png, jpg, dxt :), bmp :)).

As a fallback I can read pixels from an Image instead, but there I hit another problem: I can't convert either return value (1. Image.getData(), a uchar*; 2. Image.getPixelBox().data, an int) into my ctypes storage.

Code:
import ctypes
import ogre.renderer.OGRE as ogre

tex = None
data = None
width = None
height = None

if ogre.Root.getSingleton().getRenderSystem().getName() == 'OpenGL Rendering Subsystem':
    # OpenGL path: locking the texture buffer gives black pixels, so read via an Image instead
    img = ogre.Image()
    img.load('mytex.jpg', ogre.ResourceGroupManager.DEFAULT_RESOURCE_GROUP_NAME)  # jpg or something else

    width = img.getWidth()
    height = img.getHeight()

    # data = ctypes.c_void_p(img.getPixelBox().data)  # ? Don't know what to do with data
    # data = img.getData()  # ?
else:  # DirectX case: lock the texture buffer and read it directly
    tex = self.billboardEntity.getSubEntity(0).getMaterial().\
        getTechnique(0).getPass(0).getTextureUnitState(0)._getTexturePtr()

    width = tex.getWidth()
    height = tex.getHeight()

    # lock() returns the address of the pixel data as an int
    data = tex.getBuffer().lock(0, width * height * 4,
                                ogre.HardwareBuffer.LockOptions.HBL_READ_ONLY)

storageclass = ctypes.c_uint8 * (width * height * 4)  # assumes 4 bytes per pixel
buffer = storageclass.from_address(data)  # fails on the OpenGL path: data is still None
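
Here is roughly what I imagine the Image branch should look like, assuming getPixelBox().data really is the raw integer address of the pixel buffer, and using PixelUtil to get the bytes per pixel instead of hard-coding 4. I haven't gotten this to work, hence the question:

Code:
# continuing from the OpenGL branch above: img is the loaded ogre.Image
pb = img.getPixelBox()
bpp = ogre.PixelUtil.getNumElemBytes(img.getFormat())  # bytes per pixel for this format

# treat pb.data as the integer address of the image's pixel buffer
storageclass = ctypes.c_uint8 * (img.getWidth() * img.getHeight() * bpp)
buffer = storageclass.from_address(pb.data)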


Any solution is welcome, even a slow one :), I can wait 5 seconds :). Sorry if this isn't a Python-Ogre problem.
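
P.S. Another idea I've seen on the Ogre forums is to skip lock() entirely and blit the hardware buffer into system memory with blitToMemory(). A rough sketch, untested: it assumes the texture really is a 4-byte format like PF_A8R8G8B8, and that Python-Ogre's CastVoidPtr helper can turn a ctypes address into the void* the PixelBox constructor wants:

Code:
# tex is the texture from the DirectX branch above
bufsize = tex.getWidth() * tex.getHeight() * 4  # 4 bytes per pixel assumed
storage = (ctypes.c_uint8 * bufsize)()          # ctypes array to receive the pixels

pb = ogre.PixelBox(tex.getWidth(), tex.getHeight(), 1,
                   ogre.PixelFormat.PF_A8R8G8B8,
                   ogre.CastVoidPtr(ctypes.addressof(storage)))
tex.getBuffer().blitToMemory(pb)  # copy the texture contents into storage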

team23

21-05-2009 05:26:48

I don't believe it's a Python-Ogre issue; I hit a similar problem with OpenGL on both Windows and Mac, while DirectX works fine.

Found a topic on it a little while back:
http://www.ogre3d.org/forums/viewtopic.php?f=2&t=48284

No solution unfortunately.