I feel dumb asking this. I really do. For months, maybe years, I've been trying to do this, but I still can't. I can't create a Z-buffer in Direct3D. No matter what I try, I can't get it to work. I've tried everything, I've followed every tutorial I can find - it just doesn't work. AddAttachedSurface fails. Here's the code. It's obvious, it's extremely standard, and it doesn't work. Note that I'm actually trying to create a stencil buffer, but I can't create a plain Z-buffer either.

Code:
bool D3D7VideoDriver::setupSurfaces(int nW, int nH, int nBPP, int nDXFullFlag, int nFlags)
{
    HRESULT hr;
    bool bRet = false;

    m_pDepthStencilSurf = NULL;
    m_pPrimSurf = NULL;
    m_pBackSurf = NULL;
    m_pD3D = NULL;
    m_pD3DDev = NULL;

    do
    {
        if (nDXFullFlag != 0)
        {
            hr = m_pDD->SetCooperativeLevel(m_hWnd, DDSCL_EXCLUSIVE | DDSCL_FULLSCREEN);
            if (FAILED(hr)) break;
            hr = m_pDD->SetDisplayMode(nW, nH, nBPP, 0, 0);
            if (FAILED(hr)) break;
        }
        else
        {
            hr = m_pDD->SetCooperativeLevel(m_hWnd, DDSCL_NORMAL);
            if (FAILED(hr)) break;
        }

        // Create the primary surface
        DDSURFACEDESC2 ddsd;
        ZeroMemory(&ddsd, sizeof(DDSURFACEDESC2));
        ddsd.dwSize = sizeof(DDSURFACEDESC2);
        ddsd.dwFlags = DDSD_CAPS;
        ddsd.ddsCaps.dwCaps = DDSCAPS_PRIMARYSURFACE;
        if (nDXFullFlag)
        {
            ddsd.dwFlags |= DDSD_BACKBUFFERCOUNT;
            ddsd.ddsCaps.dwCaps |= DDSCAPS_FLIP | DDSCAPS_COMPLEX | DDSCAPS_3DDEVICE;
            ddsd.dwBackBufferCount = 1;
        }
        hr = m_pDD->CreateSurface(&ddsd, &m_pPrimSurf, NULL);
        if (FAILED(hr)) break;

        // Create or get the back buffer
        if (nDXFullFlag)
        {
            DDSCAPS2 ddsCaps = { DDSCAPS_BACKBUFFER, 0, 0, 0 };
            hr = m_pPrimSurf->GetAttachedSurface(&ddsCaps, &m_pBackSurf);
            if (FAILED(hr)) break;
        }
        else
        {
            ddsd.dwFlags = DDSD_WIDTH | DDSD_HEIGHT | DDSD_CAPS;
            ddsd.ddsCaps.dwCaps = DDSCAPS_OFFSCREENPLAIN | DDSCAPS_3DDEVICE;
            ddsd.dwWidth = nW;
            ddsd.dwHeight = nH;
            hr = m_pDD->CreateSurface(&ddsd, &m_pBackSurf, NULL);
            if (FAILED(hr)) break;

            // Create a clipper
            LPDIRECTDRAWCLIPPER pcClipper;
            hr = m_pDD->CreateClipper(0, &pcClipper, NULL);
            if (FAILED(hr)) break;
            pcClipper->SetHWnd(0, m_hWnd);
            m_pPrimSurf->SetClipper(pcClipper);
            pcClipper->Release();
        }

        ddsd.dwSize = sizeof(DDSURFACEDESC2);
        m_pDD->GetDisplayMode(&ddsd);
        if (ddsd.ddpfPixelFormat.dwRGBBitCount <= 8) break;

        hr = m_pDD->QueryInterface(IID_IDirect3D7, (VOID**)&m_pD3D);
        if (FAILED(hr)) break;

        if (nFlags & GAF_STENCIL_BUFFER)
        {
            // Read supported formats
            s_lZBufferFormats.removeAll();
            m_pD3D->EnumZBufferFormats(IID_IDirect3DHALDevice, __enumZBufferCallback, this);

            // Select the best stencil format
            DDSURFACEDESC2 pZDesc;
            ZeroMemory(&pZDesc, sizeof(DDSURFACEDESC2));
            pZDesc.dwSize = sizeof(DDSURFACEDESC2);
            pZDesc.dwFlags = DDSD_CAPS | DDSD_PIXELFORMAT | DDSD_WIDTH | DDSD_HEIGHT;
            pZDesc.ddsCaps.dwCaps = DDSCAPS_ZBUFFER;
            pZDesc.dwWidth = ddsd.dwWidth;
            pZDesc.dwHeight = ddsd.dwHeight;
            GFC_FOREACH_VAR(DDPIXELFORMAT, pPF, s_lZBufferFormats)
            {
                if (GFC_FOREACH_INDEX() == 0 ||
                    pPF.dwStencilBitDepth > pZDesc.ddpfPixelFormat.dwStencilBitDepth)
                {
                    pZDesc.ddpfPixelFormat = pPF;
                }
            }
            hr = m_pDD->CreateSurface(&pZDesc, &m_pDepthStencilSurf, NULL);
            if (FAILED(hr)) break;
            hr = m_pBackSurf->AddAttachedSurface(m_pDepthStencilSurf);
            if (FAILED(hr))
            {
                GameApp::getApp()->writeToLog("D: Attach failed : 0x%08X\n", hr);
                break;
            }
        }

        hr = m_pD3D->CreateDevice(IID_IDirect3DHALDevice, m_pBackSurf, &m_pD3DDev);
        if (FAILED(hr)) break;

        DWORD dwRenderWidth = nW;
        DWORD dwRenderHeight = nH;
        D3DVIEWPORT7 vp = { 0, 0, dwRenderWidth, dwRenderHeight, 0.0f, 1.0f };
        hr = m_pD3DDev->SetViewport(&vp);
        if (FAILED(hr)) break;

        bRet = true;
    } while (false);

    if (!bRet)
    {
        SAFE_RELEASE(m_pD3DDev);
        SAFE_RELEASE(m_pD3D);
        SAFE_RELEASE(m_pPrimSurf);
        if (nDXFullFlag)
            m_pBackSurf = NULL;
        else
            SAFE_RELEASE(m_pBackSurf);
        SAFE_RELEASE(m_pDepthStencilSurf);
    }

    return bRet;
}

Anyone?
Is moving up to something that's only 10 years old an option? In DX9 you want presentParams->AutoDepthStencilFormat = D3DFMT_D24S8 and you're done. The code below is there to make it rock solid, but most of that paranoia isn't needed.

Code:
tBOOL RZRenderMgr::PDCheckDepthBufferFormat(D3DFORMAT BackBufferFormat, D3DFORMAT DepthTestFormat)
{
    HRESULT Res;
    D3DDISPLAYMODE AdapterMode;

    Res = D3DObject->GetAdapterDisplayMode(D3DADAPTER_DEFAULT, &AdapterMode);
    if (FAILED(Res)) return (FALSE);

    Res = D3DObject->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, AdapterMode.Format,
                                       D3DUSAGE_DEPTHSTENCIL, D3DRTYPE_SURFACE, DepthTestFormat);
    if (FAILED(Res)) return (FALSE);

    Res = D3DObject->CheckDepthStencilMatch(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, AdapterMode.Format,
                                            BackBufferFormat, DepthTestFormat);
    if (FAILED(Res)) return (FALSE);

    return (TRUE);
}

tERROR RZRenderMgr::PDSetAppropriateDepthBufferFormat(D3DPRESENT_PARAMETERS *Presentage)
{
    D3DFORMAT Format;

    RZDebug::LogEvent("Finding depth-buffer format");

    // System can handle the zbuffer and optional stencil buffer for us
    Presentage->EnableAutoDepthStencil = TRUE;

    // But we still need to decide the format.
    // Run through the possibles in order of desirability
    Format = D3DFMT_D24S8;
    if (PDCheckDepthBufferFormat(Presentage->BackBufferFormat, Format))
    {
        Presentage->AutoDepthStencilFormat = Format;
        RZDebug::LogEvent(" Set to D3DFMT_D24S8");
        return (ERR_NO_ERRORS);
    }
    Format = D3DFMT_D32;
    if (PDCheckDepthBufferFormat(Presentage->BackBufferFormat, Format))
    {
        Presentage->AutoDepthStencilFormat = Format;
        RZDebug::LogEvent(" Set to D3DFMT_D32");
        return (ERR_NO_ERRORS);
    }
    Format = D3DFMT_D24X8;
    if (PDCheckDepthBufferFormat(Presentage->BackBufferFormat, Format))
    {
        Presentage->AutoDepthStencilFormat = Format;
        RZDebug::LogEvent(" Set to D3DFMT_D24X8");
        return (ERR_NO_ERRORS);
    }
    Format = D3DFMT_D16;
    if (PDCheckDepthBufferFormat(Presentage->BackBufferFormat, Format))
    {
        Presentage->AutoDepthStencilFormat = Format;
        RZDebug::LogEvent(" Set to D3DFMT_D16");
        return (ERR_NO_ERRORS);
    }

    RZDebug::LogError(" No usable depth-buffer");
    return (ERR_WRONG_FORMAT);
}

DX8 was crap for basic surface setup, and DX7 was an utter nightmare, IIRC. I don't have the code around anymore or I'd post it, but basically anything before DX9 is night and day compared to DX9 and after.
It's pretty hard to track down a problem with just the source code. I'd be curious to see what's inside pZDesc before the call to m_pDD->CreateSurface(&pZDesc, &m_pDepthStencilSurf, NULL);

Also, what is this doing?

Code:
if (GFC_FOREACH_INDEX() == 0 || pPF.dwStencilBitDepth > pZDesc.ddpfPixelFormat.dwStencilBitDepth)
    pZDesc.ddpfPixelFormat = pPF;

Why is it > and not >= ?

JC
AppleWood, casual game devs couldn't give a crap about the technology - the important thing is the game's quality and polish. So it doesn't matter whether they use DX7 or DX10; the game won't be better because of it. If they have an engine that has worked for years and is up to date with what players expect quality-wise, why would you want them to upgrade it? If it isn't broken, why fix it? It took me a while to understand this (as I used to be an engine programmer), but it makes perfect sense. It's not that they're scared to update, it's just not worth it.

JC
I'm doing ZeroMemory on it and just initializing the parts I need. As with everything else, I tried other ways to init it (copying the DESC from the back buffer, ...). The loop is trying to pick the pixel format with the most stencil bits; I don't think > vs >= would make much difference, and I also tried different pixel formats (the first one, the last one, one set by hand, and so on). You're right about upgrading, which is why my code is still DX7 - requiring a Z-buffer shouldn't be enough to force an upgrade. But Applewood's suggestion isn't bad, really. If everything else fails, I'll end up doing just that.
From the DirectX 7 help:

So, assuming a hardware device, try changing this:

Code:
pZDesc.ddsCaps.dwCaps = DDSCAPS_ZBUFFER;

to this:

Code:
pZDesc.ddsCaps.dwCaps = DDSCAPS_ZBUFFER | DDSCAPS_VIDEOMEMORY;

Hope that works.

George
Let's not open that door again. I'm not advocating that you need to fill your game with ubershaders. DX9 can still rustle up a plain textured quad the same as everything else. The difference it has over DX7, though, is that the drivers are up to date, the interfaces are sane, and it's generally a far less buggy and problematic system to use. While I concede that a few years ago DX7 made sense because you didn't have to supply the redist, these days that argument's looking a bit sad IMO - almost everyone will have XP and a DX9 card. Yes, even in the 200 dollar laptop they bought years ago. You are far more likely to hit a driver failure on DX7 than on DX9. You mean apart from the bit where the OP said his engine is broken and he can't fix it? A final note: if you're doing 2D only and are worried about compatibility, why the hell is anyone using DX anything? If you write an efficient sprite routine in software, you'd be amazed how much you can do. Those who are old enough, think back to what you could do in 640x400 on a non-Pentium DX2-66MHz and scale that up a thousandfold. You see, this "it must use DX7" thing doesn't hold water from whatever angle you come at it.
Actually, a few weeks ago I tried GMarkou's WIP game on my brand new XP-powered Acer Aspire One, and the game wouldn't launch because some DX9 DLL was missing. (There, I did it; I planted doubts in half of the people reading these posts.) Because of this, I agree that the best option would be to just write a software blitter. It's not rocket science, and the algorithms are well known and documented.
Dumbest DirectX answer ever: Then don't.

- - - -

Actually, I have a "dumb" question too. I've pretty much sold myself on only doing a DirectX 9 graphics port (since I'll also have a GL fixed-function port around for the less fortunate), but I'm not planning on touching that for a good few months. So, a question to put my mind at ease. The GMA 945/950, that crazy-ass video chip in netbooks and cheap motherboards everywhere: let's forget performance for a moment, and its complete lack of hardware T&L. The million-dollar gameshow question is: if I force-feed that GPU driver some HLSL shader code (that meets the Pixel Shader 2.0 spec) and stuff in some vertices and UVs, will something resembling textured polygons come out? I've worked with GLSL shaders a few times now, and recently spent some time with Direct3D Mobile (DirectX 9 syntax, but fixed function). I'd just like to know if I can get something resembling pixel shading out of a GMA 945/950. Anything older than that I really don't care about (if you can't alternatively run GL, it sucks to be you). More than happy to make people download a redistributable, if that's the only problem. K-THX-BYE-LOL!
AFAIK the GMAs support Pixel Shader 2.0 fine (they also support vertex shaders, but the calculation is done on the CPU).
ALL of the Intels, even the 8 series, are a problem for one reason only: they can't do vertex shaders in hardware. This is not fatal, nor even complicated to handle - they have a shared memory architecture, so vertex access is fast and most VS stuff can be done quickly and efficiently. The actual reason half the apps out there fail is that people don't respond to the driver caps properly and assume they can create a hardware pure device. On these chips you can't. You don't even need any caps tests, really. This code is all you'll need to put Intel behind you:

Code:
HWND Window = RZCore::Core->GetWindowHandle();
Res = D3DObject->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, Window,
                              D3DCREATE_HARDWARE_VERTEXPROCESSING | D3DCREATE_PUREDEVICE |
                              D3DCREATE_FPU_PRESERVE | D3DCREATE_MULTITHREADED,
                              &Presentage, &D3DDevice);
if (FAILED(Res))
{
    // Fall back to software vertex processing (the path the Intels take)
    Res = D3DObject->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, Window,
                                  D3DCREATE_SOFTWARE_VERTEXPROCESSING |
                                  D3DCREATE_FPU_PRESERVE | D3DCREATE_MULTITHREADED,
                                  &Presentage, &D3DDevice);
    if (FAILED(Res))
    {
        RZDebug::LogError("Unable to create ANY D3D Device: Err=%08x, Bailing out", Res);
        return (ERR_NO_VIDEO_MODE);
    }
}
It's DDERR_CANNOTATTACHSURFACE. The very useful documentation says "A surface cannot be attached to another requested surface". Thanks, MSDN.
While that's technically correct and I appreciate the irony, don't tell me a f'ing zbuffer is beyond the capabilities of DX7! It's not like I'm trying to do anything fancy here...
Heh, you kinda answered your own point there. It should be easy, but it's just not. If we assume there's just a simple bug in your setup code, ignore that for a moment and just look at how much code you actually have - just for a f'ing zbuffer! The versions of DirectX came out in fairly rapid succession, every year or two, for a few reasons, one of the main ones being that they were bodges and fixes for stuff that had gone before. One of the biggest overhauls was the entire swapchain and rendertarget mess that you are encountering. There were exceptions all over the place due to bad drivers and lies about caps bits, etc. Then in 2002/3 DX9 hit, and that stayed with us for a pretty damn long time, because we had finally arrived at a stable and sensible SDK that did everything it's meant to do without much fuss - it just worked. When I look back on the codepaths for DX6 & 7 it makes me shudder, and I still can't believe people are using it by choice. Short version: DX7 is ten years old and not really fit for purpose. If it were, there wouldn't have been versions 8 and 9 following fairly rapidly. (All newer versions since 9 are brand new architectures and could reasonably be called something else.) Note that I've not once preached about advanced rendering features; this is all about getting a basic system running with workable drivers and no support calls. You all want that, right? You were lucky to get a verbose error message. Most of them, IIRC, just said "invalid params". :s
Have you tried the code on another machine, to rule out drivers? I agree with Applewood. DirectX 7 was fine when the people writing video card drivers gave a damn about DX7. Nvidia don't give a monkey's buttocks whether their cards work with DirectX 7 games any more - the latest Nvidia drivers screw up half of my back catalog. Do they care? No. Even if you don't use a single feature of DX9 or 10, move away from 7. Driver support for 7 or older will go from bad to terrible in the next few years. This sucks, but it's true.
Applewood, you are talking without knowing. Read the posts properly: nobody ever said people MUST use DX7. I said (and others have said it many times) that there is no point upgrading if you don't need it. And most casual games don't need anything more than DX7: DRAWTRIANGLE. When I started as a casual game dev in 2006, my engine was using DX9. It was a nightmare compatibility-wise (and I used to be a 3D expert, by the way). I made an engine using DX7 in 3 days. I'm still using it, and it is uber-robust - 3x more robust than anything I built when creating engines for the AAA industry. Now, I spent the last 15 months working on my latest game, Anka (check it out, and you will see that I really worked hard to produce it). Why would I make my dev time even longer by updating an engine that has absolutely no need for an update? You say that DX7 is badly supported nowadays - how would you know, when you haven't touched it for a decade? Most top casual games are produced with DX7 (even if DX8 is becoming the norm now), and these games are downloaded millions of times. Trust me, these games work on more PCs than any AAA game could claim. Now I'm planning to upgrade my engine to DX8 (not DX9 - there are still XP PCs that only have DX8, as BadSector experienced). I'll make it cross-platform, and it's a good time to do it. If that weren't the case, I would still be using DX7 for my next game and wouldn't see anything wrong with that. Again, no one ever said you must use DX7; if people are still using it, it's because they didn't see any reason to upgrade.

JC
Nope, not touched it for a decade, and that makes me a bad person to ask for advice about using DX7. You'll notice I made no attempt to actually fix Gabriel's problem. The point you're missing is that ATI and nVidia and even Intel haven't touched DX7 for a decade either, and that's something you should really pay attention to! I'm fully aware that there are a large number of DX7 games out there. So what? You can still get a version of Bedlam that was 2D only and ran under DOS. But I bet you can't run it... Actually, that point about DX7 has been hammered home on this very board many, many times. But asked why I care to try and update that opinion, or in fact keep telling people to make life easy on themselves, I couldn't really give an answer. If you like trying to support antiquated stuff because you're in some kind of comfort zone, it's really none of my business. Enjoy! Every casual game I've seen doesn't need that either. Point me at your favourite 3 casual games and I could rewrite the lot using pure software and guarantee you 100% compatibility forever. But that's hard, so people don't do it. Yeah, I get that a lot, usually from people who don't like the message. I admittedly don't do much in the PC indie space, so ignore me if you want. Cliff does, though - listen to him!
As a DX7 user, I would recommend taking any opportunity to upgrade to DX9. Official support for DX7 was dropped in 2007. Starting with Vista there is a new driver model; DX9 was ported to it, but all previous versions of DX are emulated (mapped to DX9Ex, I believe).