Installing Resident Evil 5!

Maybe it's time to get a new video card.
My GeForce 550 Ti rocks, and it's been almost 3 years since I got it.
You might want to look into a GeForce 750 Ti.
I've heard they are really good and don't require an extra power connector.

Excellent recommendation. The GTX 750 Ti (which also has NVENC, if you want to do screencasting) has a very good performance-per-watt ratio and a compact size.

:)
 
I know. Actually, I was aiming to save up $300 for a GTX 970, but I've hit a problem: high school is coming soon for me and my income is low, and I'll need a laptop for high school, because I need a computer there and I won't be at home. But if I find a way to get the money, I'll buy a new GPU :) Actually, there was a computer for $600 with a GTX 750 Ti. The day my parents bought our computer they didn't know much about CPUs, GPUs, and other hardware, so they bought this PC for $500 instead. It was a mistake :D
 
A more affordable option might be a 2 GB GT 630/GT 730, but make sure it's the GK208 model (384 shaders). This card costs around $50-60 at most, doesn't cost much, and would improve on your current card by a factor of about 2x to 2.5x.

I came from a GT 520 (the GT 520 uses the same chip as your GT 610*: 40 nm, 48 shaders, 4 ROPs, and 8 TMUs) before buying my current GeForce GT 630 2048 MB (GK208).


*Here is the GPU chart for both GPUs (GT 520 and GT 610), courtesy of TechPowerUp:




And this is the GPU chart for my GT 630 (GK208):


Comparing your GT 610 and my GT 630 (GK208), the differences are:

Manufacturing process

GT 610: 40 nm vs GT 630 (GK208): 28 nm (a smaller process normally means lower power consumption and better performance)

Shaders

GT 610: 48 vs GT 630 (GK208): 384

ROPs

GT 610: 4 vs GT 630 (GK208): 8

TMUs

GT 610: 8 vs GT 630 (GK208): 16

In summary, the GT 630/GT 730 (GK208) offers about 2x to 2.5x the performance of your current GT 610, while consuming only 25 W and costing around $60.
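On paper the GK208 looks far more than 2-2.5x faster; the gap between raw specs and real-world gain comes from the parts that scale less (ROPs and TMUs only double). A rough sketch of the spec ratios, using just the numbers above:

```python
# Spec ratios between the GT 610 and the GT 630 (GK208), from the figures above.
gt610 = {"shaders": 48, "rops": 4, "tmus": 8}
gt630 = {"shaders": 384, "rops": 8, "tmus": 16}

for key in gt610:
    print(f"{key}: {gt630[key] / gt610[key]:.0f}x")

# Shaders jump 8x, but ROPs and TMUs only double, which is why the
# practical gain lands nearer 2x-2.5x than 8x.
```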

However, many games need moderate settings (especially under Wine) to get good performance. For more information, you can check my channel to see whether I've tested the game you want to run.

:)
 
But actually, how much does a plain GTX 750 Ti cost? And what does "Ti" mean? Does it mean the card is also called a GTX Titan?
 

The GTX 750 Ti should cost around $130, and above it the same kind of relation holds as between your current card and mine.

GTX 750 (non-Ti): around $120 for the 2048 MB model


GTX 750 Ti: around $135 for the 2048 MB model


In summary, the GTX 750 Ti should have roughly 2x to 2.5x the performance of the GeForce GT 630 GK208.

However, the memory on the GTX 750 Ti is much, much faster than on my card. The GT 630 GK208 has 1800 MHz 64-bit DDR3 (around 14.4 GB/s of memory bandwidth), while the GTX 750 Ti has 5400 MHz 128-bit GDDR5 (around 86.4 GB/s). In this respect the ratio is much higher: the GTX 750 Ti offers 6 times the memory bandwidth of the GT 630 GK208.
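The bandwidth figures follow directly from effective memory clock times bus width; a quick check of the arithmetic, using the numbers above:

```python
def bandwidth_gbs(effective_mhz: float, bus_bits: int) -> float:
    """Memory bandwidth in GB/s: effective clock (Hz) * bus width (bytes)."""
    return effective_mhz * 1e6 * (bus_bits / 8) / 1e9

gt630 = bandwidth_gbs(1800, 64)      # GT 630 GK208: 14.4 GB/s
gtx750ti = bandwidth_gbs(5400, 128)  # GTX 750 Ti:  86.4 GB/s
print(gt630, gtx750ti, gtx750ti / gt630)  # ratio works out to 6x
```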

As for "GTX" and "Ti": these are just names NVIDIA uses, normally to distinguish more powerful editions from the regular non-Ti models. For example:

These are the GPU charts from the GPU database for the GTX 750 (non-Ti) and GTX 750 Ti:



In summary, the GTX 750 (non-Ti) has 512 shaders, 32 TMUs, and 16 ROPs,

while the GTX 750 Ti has 640 shaders, 40 TMUs, and 16 ROPs.

However, both models normally come with 128-bit GDDR5 (around 80 GB/s of memory bandwidth).

:)
 
Is the GTX 750 worth its price? $140 doesn't sound like much for a GPU that beats my current GPU about 5 times over but costs only 2.5x more...
 
Another idea:

Looking around on forums and elsewhere, I found out that buying an AMD card for $150 gives you much more than NVIDIA (probably because NVIDIA, just like Apple or Xbox, makes you pay for the trademark too): more powerful cards for the same price. However, I have heard that AMD cards don't support some shaders well, especially GLSL (GLSL geometry shaders using version 400 (OpenGL 4.0), among others). On the Blender forums (I use Blender a lot for game development) I often see complaints about shaders not working on AMD/ATI. Is it really the case that AMD supports fewer shaders and doesn't support recent OpenGL and GLSL versions, as well as some GLSL shaders? And... as far as I know, AMD doesn't like PhysX, so it supports it weakly, so I think Assassin's Creed would run at a lower framerate, right? Well, if you can recommend me a card, I'd be happy, because it's hard to even choose between the two brands, let alone look further. And after that, getting the money will be an obstacle too...
 

AMD cards consume a lot of power, and as you said, their OpenGL support is still very immature (maybe it will be in a better state sometime in the future: 2 years, maybe?).

I almost forgot: be careful if you want to buy a used AMD card, since many of them were used for mining. If you want an AMD card, it's probably a better choice to look for a new one.

PhysX is not relevant on Linux for now; under Wine it's a different story.

Another point is that most game developers only support the proprietary drivers (NVIDIA/AMD). The AMD open-source drivers need more work on stability, performance, and compatibility, while AMD's closed-source driver gives more performance but has trouble with compatibility and stability.

In summary, if you want to play both Wine games and native games, your only option for now is NVIDIA with the proprietary drivers.

:)
 
OK! Got it... However, what is mining with a graphics card? I expect it's not digging up ores using a graphics card instead of a pickaxe or drill :D
 
Some time ago, the HD 7xxx series was very popular for Bitcoin mining*. In those days many people bought lots of Radeons for this purpose (it went so far that AMD had trouble satisfying user demand at the time).


But when custom ASICs appeared, this practice died out, because the new ASICs perform better than GPUs.

:)
 
From what I read, it mostly says you pay more for electricity than you make in profit. Um... Is it even possible to convert Bitcoins into real money in a bank account?
 
Hi! I wanted to report a problem with RE5: it is much too dark for me. In some places it is almost black and I can't see anything, for example at the start of chapter 2-1:
Any ideas how to fix it?
 
In my case this appears inside enclosed buildings. Maybe turn up the gamma or brightness; I may test more later.

If you can, report this bug on the Wine Bugzilla (you'll need to create an account).

:)
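Raising gamma helps in near-black scenes because it lifts dark tones much more than bright ones. A quick sketch of the standard gamma curve (plain Python, nothing game-specific):

```python
def apply_gamma(v: float, gamma: float) -> float:
    """Map a normalized pixel value v (0..1) through a display gamma curve.

    gamma > 1 brightens shadows strongly while barely touching highlights.
    """
    return v ** (1.0 / gamma)

# Dark pixels gain the most: 0.05 roughly quintuples, 0.95 barely moves.
for v in (0.05, 0.5, 0.95):
    print(f"{v:.2f} -> gamma 2.2: {apply_gamma(v, 2.2):.2f}")
```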
 
I've seen this with a few games. One side effect of changing brightness and/or gamma is that Linux might keep the setting even after the game is closed.
That IS a strange bug. Have you researched it online at all?
It might be a video driver setting, too.
 
Well, I don't know exactly about all this. Could you explain how to do this more smoothly? In the screen settings (on the monitor's controls: menu -> picture section) there are brightness, R, G, B, and other values, but no gamma. Why is that?
 
What distro are you using again?
You may have to do some playing around with Linux settings and game settings.
Do any other games have this problem in PlayOnLinux?
 
Here is something you can try...
Go to PlayOnLinux
Select Resident Evil 5
Click Configure
Go to Display Tab
Direct Draw Renderer = GDI
Close Configure
Launch Game

This change could really screw up other things, so it's a shot in the dark.
If it doesn't work, then set it back to "default"
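For reference, the Display tab is just a front end for a Wine registry value; the equivalent entry looks something like this (a sketch of the `HKCU\Software\Wine\Direct3D` key; deleting the value restores Wine's default behavior):

```ini
[HKEY_CURRENT_USER\Software\Wine\Direct3D]
"DirectDrawRenderer"="gdi"
```

You could also set it back to "opengl" instead of deleting it, which is the usual non-GDI renderer.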
 