# DirectX 9.0 Crashing



## Projekt92 (Dec 14, 2008)

So I've recently upgraded my computer, and I thought I'd try out newer, better games. Or at least I hoped.

Anyway, I bought and installed Resident Evil 5 through Steam. It installed no problem and everything ran smoothly. It installed a generic version of DirectX 9.0 that was out of date, so I went and installed the most up-to-date version.

When I try to actually play the game, I get this error.

"RESIDENT EVIL 5: RE5DX9.exe - Fatal Application Error

D3DERR_DRIVERINTERNALERROR : present result"

Basic System Specs....

*OS:* Windows XP Home Edition 32 Bit (Service Pack 3)
*Motherboard:* A790GXM-A
*Processor:* AMD Athlon(tm) 64 X2 Dual Core Processor 6000+ 
*Video Card:* SAPPHIRE Radeon X1550 Series
*DirectX Version:* DirectX 9.0c
*RAM:* 2 GB Corsair memory (not sure of the speed off the top of my head, but I know I meet the requirements)

Thanks for anything ya got!


----------



## Projekt92 (Dec 14, 2008)

Also, just discovered this, but apparently before the game loads, there's a cinematic. For me, all I see is credits. Here's what I SHOULD be seeing. 

YouTube - Resident Evil 5 Opening HD

But all I see is a black screen with those credits in white. I can catch the subtitles too, and I do get sound.

The game crashes and gives me that DirectX error at the end of the cinematic, when the game starts to load in.

So I've put you all in my seat and given you some basic specs, hopefully someone has some ideas. :tongue:

Thanks
Mike


----------



## RockmasteR (Aug 10, 2007)

hello Projekt92,
the game needs an Nvidia GeForce 6800, which is roughly equal to an ATI HD 2400 or X1650.
Your card is below the minimum requirements, so I doubt it'll play well.
Anyway, be sure to get the latest drivers for your video card from here:

http://support.amd.com/us/gpudownlo...spx?type=2.4.1&product=2.4.1.3.8&lang=English

But first uninstall your current driver, then install the one I posted.


----------



## Projekt92 (Dec 14, 2008)

Done, no luck. =\

My question for graphics requirements...

If I'm not mistaken, core clock speed is one of the bigger components we look at when determining whether or not you meet the requirements, yes? Here's what I found....

SAPPHIRE Radeon X1550 Series: 550 MHz
Nvidia GeForce 6800 series: 350 MHz
ATI Radeon HD 2400 Pro: 525 MHz

Just a question, thanks!

Mike


----------



## koala (Mar 27, 2005)

The core speed is only one of the factors that determines how well a graphics card will work in games. You also need to look at the memory clock, pixel/vertex shaders, texture units, etc.

So although the GeForce 6800 has a lower core clock speed than your X1550, its memory bandwidth and texture fill rate (22 GB/s, 3900 MTexels/s) are nearly double the X1550's (13 GB/s, 2200 MTexels/s). Also, the 6800 uses a 256-bit memory bus, while your X1550 uses the much slower 128-bit.
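Those bandwidth figures fall straight out of the bus width and the effective memory transfer rate. A minimal sketch of the arithmetic, assuming the commonly quoted reference clocks for each card (the exact rates vary by board partner):

```python
# Peak memory bandwidth = effective transfer rate (MT/s) x bus width (bytes).
# Clock figures below are assumed reference specs, not measured values.

def bandwidth_gbs(effective_mts, bus_bits):
    """Peak memory bandwidth in GB/s (decimal GB)."""
    return effective_mts * (bus_bits // 8) / 1000

# GeForce 6800: ~700 MT/s effective DDR on a 256-bit bus
print(bandwidth_gbs(700, 256))   # ~22.4 GB/s

# Radeon X1550: 800 MT/s effective DDR2 on a 128-bit bus
print(bandwidth_gbs(800, 128))   # 12.8 GB/s
```

The 256-bit bus is what lets the 6800 nearly double the X1550's bandwidth despite a lower memory clock.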


Which graphics driver version is installed? Did you reboot after updating?

Go to Start > Run > *dxdiag* > Display tab, enable the Acceleration buttons if they are disabled, then run the 2 tests. Any problems listed in the Notes box?


----------



## RockmasteR (Aug 10, 2007)

The core clock is not the only factor that determines a video card's speed.
You also have the memory clock, memory interface, memory transfer rate, and pixels per clock.

So even though the 6800 has a lower core clock than the X1550, it's actually faster.

The memory clock plays a great role as well (800 MHz DDR2 or 1.6 GHz GDDR3).
GDDR3 is far better and faster than DDR2.
The memory interface is very important too.
There are video cards that have a 64-bit interface, which is very bad (like the HD 2400, being a low-end card),
and there are cards that have 256-bit like the GeForce 9600 GT,
or even 512-bit like the GTX 285. The wider the memory interface, the better.
Pixels per clock also plays a good role in a video card; of course, the more the better.

So let's make a comparison: ATI X1550 vs. GeForce 6800

ATI X1550:

Core: 550 MHz
Memory Clock: 800 MHz DDR2
Memory Interface: *64-bit or 128-bit*
Memory Transfer Rate: 6.4 GB/s or 12.8 GB/s
Pixels per Clock: 4
DirectX: 9.0c


GeForce 6800:

Core: 325 MHz
Memory Clock: *600 MHz*
Memory Interface: *256-bit*
Memory Transfer Rate: *19.2 GB/s*
Pixels per Clock: *12*
DirectX: 9.0c

Here, the 6800's figures in bold are the better ones.
So you see, the GeForce 6800 is far better than the X1550 (*64-bit or 128-bit* vs. *256-bit*).

Do you have any similar new games? And does the problem persist in other games (old or new)?


----------



## Projekt92 (Dec 14, 2008)

Ahhh, I understand.

Well, let's see. World of Warcraft runs a little better than before, but that's probably because of my motherboard upgrade. Left 4 Dead is a new game I got that runs flawlessly, but that game doesn't really push the limits for graphics. Elder Scrolls IV: Oblivion is another game that now runs flawlessly, but again, not that new of a game.

I think what I might do is steal my little brother's GFX card for a bit. He has an ATI Radeon HD 4350. Well above what I need.

Any other ideas?


----------



## McNinja (Jun 22, 2008)

An ATI HD 4350 might run the game, but it is not much of an upgrade from your current card. It might only have the newer pixel shader versions.

ATI X1550 to ATI HD 4350.

You might be able to play the game on the lowest settings possible. The 4350 is only slightly better than your current card.

*ATI 4350*
Manufacturer:	ATi
Series:	Radeon HD 4k
GPU:	RV710
Release Date:	2008-09-29
Interface:	PCI-E 2.0 x16
Core Clock:	600 MHz
Shader Clock:	600 MHz
Memory Clock:	500 MHz (1000 DDR)
Memory Bandwidth:	8 GB/sec
FLOPS:	96 GFLOPS
Pixel Fill Rate:	2400 MPixels/sec
Texture Fill Rate:	4800 MTexels/sec

*ATI X1550*
Manufacturer:	ATi
Series:	Radeon X1K
GPU:	RV516
Release Date:	2007-01-08
Interface:	PCI
Core Clock:	550 MHz
Memory Clock:	400 MHz (800 DDR)
Memory Bandwidth:	12.8 GB/sec
Shader Operations:	2200 MOperations/sec
Pixel Fill Rate:	2200 MPixels/sec
Texture Fill Rate:	2200 MTexels/sec
Vertex Operations:	275 MVertices/sec

These stats are a little confusing, but it looks like performance is almost the same, except for the memory bandwidth, which is lower in the 4350, though it has higher vertex shader throughput.

Not much of an upgrade if you ask me.
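The pixel and texture fill rates in those spec lists are just the core clock multiplied by how many units the chip can work per clock. A quick sketch of that arithmetic, where the ROP and texture-unit counts are assumptions taken from commonly listed specs for these chips:

```python
# Fill rate = core clock (MHz) x number of units completing work per clock.
# The unit counts below (ROPs / texture units) are assumed reference specs.

def fill_rate(core_mhz, units):
    """Fill rate in MPixels/s or MTexels/s."""
    return core_mhz * units

# HD 4350: 600 MHz core, 4 ROPs, 8 texture units (assumed)
print(fill_rate(600, 4))   # 2400 MPixels/s
print(fill_rate(600, 8))   # 4800 MTexels/s

# X1550: 550 MHz core, 4 ROPs and 4 texture units (assumed)
print(fill_rate(550, 4))   # 2200 MPixels/s and 2200 MTexels/s
```

This is why the two cards' pixel fill rates look so close despite the 4350 being a generation newer: both push 4 pixels per clock at similar core speeds, and the 4350's advantage shows up mainly in texture rate and shader throughput.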


----------

