Path: chuka.playstation.co.uk!scea!greg_labrec@interactive.sony.com
From: mperdue@iquest.net (Mario Perdue)
Newsgroups: scea.yaroze.programming.2d_graphics
Subject: Re: More GsBG questions
Date: Thu, 24 Apr 1997 15:19:41 GMT
Organization: SCEA Net Yaroze News
Lines: 89
Message-ID: <335f711f.2848316@205.149.189.29>
References: <335F08E0.7758@charlie.cns.iit.edu>
NNTP-Posting-Host: ind-0010-15.iquest.net
X-Newsreader: Forte Free Agent 1.1/32.230

Hi Ed,

>1) Why are 4-bit CLUT images reversed? If you look at the
>image definitions for my tile1_image_data[] array, you'll
>see that I needed to do some weird byteswapping to get the
>image to display the way I wanted it to. I started
>experimenting with 15-bit direct images (with sprites), and
>they were stored intuitively, not requiring byteswapping.

That's just the way the data is stored. I'll post my file on the TIM
format today. It's not the 'official' document, but it's based on a fax
of that document. It will be a lot clearer after you see the format.

>2) Performance: Is there a general difference in the
>performance of working with 4-bit vs. 8-bit CLUT vs. 15-bit vs.
>24-bit direct images? Just a generalization is all I'm after
>at this point. I would assume that the CLUT images are
>slower, since the routines or GPU must translate the CLUT
>info into real RGB values before the image is painted into
>the display buffer.

You're probably right, but there hasn't been enough of a difference for
me to notice it. You may want to do some timing tests to verify it.

>3) You'll notice in the attached code, when I define the
>second cell, whose texture LoadImage() puts right next
>to the texture of the first cell, I need to define its
>".u" value as 16, even though it is physically in
>VRAM 4 pixels to the right of the first cell. Apparently
>the routines/(GPU?) automatically adjust the ".u" member
>depending on whether it is a 4-bit/8-bit CLUT or 15-bit direct
>texture? Is that right?

I think you may be confusing shorts, bytes and pixels. You're working
with an image that has a 4-bit color depth (16 colors). You've defined
the image using shorts, with 4 shorts defining each line of the image.
Each short is two bytes wide, and each byte holds 2 pixels (4 bits per
pixel). That means the image is 16 pixels wide. So, in VRAM your image
is 4 shorts to the right of the first image, which translates to 16
pixels. In short, yes, the color depth is factored in when determining
a pixel's location.

>4) Why is GsSortClear() somehow different than the other
>GsSort() functions? All the GsSort() functions do is add a
>command to the list of GPU commands in the given OT,
>right? They don't actually set the GPU into action,
>right? Then later you tell the GPU to start working on
>the complete list of commands, right?
>Well, it's almost as if when preparing the commands, the
>GsSort() funcs don't need to know about the current
>display buffer (the GPU will figure out where exactly
>in VRAM the drawing commands should write to when the
>time comes), but GsSortClear() seems to peek at the
>current display buffer to figure out what VRAM area to
>create a clear command for. I don't get it.
>
>  Works:                 Gives Black Screen:
>  Clear OT               Clear OT
>  Call GsSort() funcs    Call GsSort() funcs
>  VSync/SwapBuffer       Call GsSortClear()
>  Call GsSortClear()     VSync/SwapBuffer
>  Draw OT                Draw OT

GsSortClear() works pretty much like all the other GsSortXxx()
functions. In the example that works, you do the GsSortClear() AFTER
you swap the display buffers. In the other example, you do the
GsSortClear() BEFORE you swap the display buffers. GsSwapDispBuff()
does a lot more than just change the pointer to the active display
buffer. Among other things, it sets the 2D clipping rectangle to
prevent you from accidentally writing to the displayed image. In the
first example, the GsSortClear() can't do much because the clipping
rectangle has been moved. In the second, it does exactly what you told
it to do and clears the screen.
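For what it's worth, here's the "Works" column written out as a bare
main loop, roughly the way I do it. This is just a sketch (the packet
and OT sizes are placeholders; size them for your own scene):

    #include <libps.h>

    #define OT_LENGTH   1                 /* 2^1 OT entries; plenty here  */
    #define PACKET_SIZE 2048              /* rough guess at packet space  */

    GsOT     WorldOT[2];                  /* one ordering table per buffer */
    GsOT_TAG OTTags[2][1 << OT_LENGTH];
    PACKET   PacketArea[2][PACKET_SIZE];

    int main(void)
    {
        int buf;

        GsInitGraph(320, 240, GsNONINTER | GsOFSGPU, 1, 0);
        GsDefDispBuff(0, 0, 0, 240);      /* buffers at (0,0) and (0,240) */

        WorldOT[0].length = OT_LENGTH;  WorldOT[0].org = OTTags[0];
        WorldOT[1].length = OT_LENGTH;  WorldOT[1].org = OTTags[1];

        while (1) {
            buf = GsGetActiveBuff();      /* buffer we can draw into      */
            GsSetWorkBase((PACKET *)PacketArea[buf]);
            GsClearOt(0, 0, &WorldOT[buf]);           /* "Clear OT"       */

            /* ... GsSortFixBg16(), sprite sorts, etc. ("Call GsSort()") */

            DrawSync(0);                  /* let the GPU finish last OT   */
            VSync(0);                     /* "VSync"                      */
            GsSwapDispBuff();             /* "SwapBuffer"                 */
            GsSortClear(0, 0, 0, &WorldOT[buf]);  /* clear AFTER the swap */
            GsDrawOt(&WorldOT[buf]);      /* "Draw OT"                    */
        }
        return 0;
    }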
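And going back to #3 for a second, here's the same shorts-to-pixels
arithmetic spelled out in code. The helper is my own, not a library
call, just to show where the 16 comes from:

    /* Convert an offset measured in 16-bit VRAM shorts into a .u offset
     * in pixels, given the texture's color depth in bits per pixel. */
    int shorts_to_u(int nshorts, int bpp)
    {
        switch (bpp) {
        case 4:  return nshorts * 4;   /* 4-bit CLUT:   4 pixels/short  */
        case 8:  return nshorts * 2;   /* 8-bit CLUT:   2 pixels/short  */
        default: return nshorts;       /* 15-bit direct: 1 pixel/short  */
        }
    }

    /* Your second cell sits 4 shorts to the right of the first, so:   */
    /*   shorts_to_u(4, 4) == 16   ->   .u = 16                        */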
>5) When I define the second of my double buffers via
>GsDefDispBuff() to be at the VRAM y-location of 256 (as is
>done in some examples), rather than *immediately* below the
>first buffer (at 240), the GsSortFixBg16() routine draws
>the BG 16 pixels lower on the odd frames!!!

I don't know what's going on here. I'll have to play around with it a
bit.

Mario