[quote]@MidnightButterSweats it’s true though, the game runs terribly for some people who have a PC way better than mine, and I only use an i5-4460 and an AMD 280 and it runs well for me.
I really just can’t believe anyone who says this. I know it’s unoptimized, but I’m using hardware that was already aging in 2012 + a Radeon 280 (not even an X), and I get a steady 100+ on Ultra at 1920x1200.[/quote]
LONG POST INCOMING!!
This isn’t directed at you or anybody else in particular, so please nobody take offense. This is more of an educational post for people who might not know what “optimization” actually involves, or who have the wrong expectations for it.
The term “optimize,” as in “your games are not optimized,” sometimes gets thrown around as a catch-all so that people don’t have to look at their own hardware/issues or understand what the engine (UE3) expects out of a setup. It’s just easier to blame the game, so they run with it, and it’s a common response because “other games work fine.” A lot of people don’t understand what optimization even means, and instead of blaming their now-aging low- to mid-range CPU/GPU that is possibly overheating (of course they never check that), they would rather throw out sentences like “the game isn’t optimized,” which excuses them from doing any troubleshooting whatsoever.
For example, a lot of people with laptops have pretty weak processors. Sure, you were sold an i5 or i7, but they were the “M” (mobile) variants, which are much weaker than the similarly named desktop versions (which is part of the marketing to make people feel better about the power of their purchase). Mobile processors tend to have more heat issues due to their compact design and the lower thermal thresholds at which something like SpeedStep kicks in. And due to consistent exposure to excessive heat, laptops tend to develop more heat issues as they age, partly because a lot of manufacturers use thermal compounds that degrade over time/usage. That’s a fun fact many don’t know about laptops.

The same can be said about the “M” versions of GPUs, which can be extremely underpowered and sometimes even use last-gen tech despite having current-gen naming schemes. Laptops also use the onboard GPU for almost everything to save on power/heat, then switch over to the discrete GPU when they detect a program that needs it. However, this isn’t always automatic, especially for new games or games in pre-release states that haven’t had a profile created by Nvidia/AMD yet and require manual attention. Another fun fact of laptops for those who aren’t aware.
Another common issue: multi-core AMD CPUs with low single/dual-thread performance won’t run a UE3 engine game very well, because UE3 is not set up to use all of those cores (or all of your modern GPU). Strong single-thread performance is what you want for UE3 games, which tends to rule out the older AMD CPUs, because AMD went all in on cramming in as many cores as possible, hoping that would give them a performance lead over the competition. But many games/engines still don’t properly support more than 2 cores. I’m not slighting AMD owners; it’s just a fact that those chips are weaker in single/dual-thread applications, which includes many games/engines (but not all, of course). Also, UE3 is famous for inconsistent frame draw times. So even if you keep a constant 60/120/144 FPS like your counter says, you will still get frame draw time spikes compared to some other games, because that’s just how UE3 works in general. This frame time issue gets worse on certain setups and on lower-end/older hardware. You can view these draw time issues by recording with FRAPS and loading the results into a tool called FRAFS Bench Viewer.
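To illustrate the frame draw time point with a toy example (made-up numbers, not real DB/FRAPS data): two runs can show the exact same average FPS on a counter while one of them is full of spikes that you feel as stutter.

```python
# Toy illustration: same average FPS, very different frame-time consistency.
# All numbers are invented for the example.
smooth = [16.7] * 60               # every frame takes ~16.7 ms -> steady ~60 FPS
spiky  = [10.0] * 54 + [77.0] * 6  # same total time, but six big 77 ms spikes

def stats(frame_times_ms):
    """Return (average FPS rounded, worst single frame in ms)."""
    avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
    worst = max(frame_times_ms)    # one bad frame is what you feel as a hitch
    return round(avg_fps), worst

print(stats(smooth))  # (60, 16.7)
print(stats(spiky))   # (60, 77.0)
```

Both runs report “60 FPS,” but the second one has frames taking 77 ms, which is exactly the kind of spike an FPS counter hides and a frame-time graph (like FRAFS Bench Viewer shows) makes obvious.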
Also, when developers talk about how they are “optimizing” a game, they look at it on a range of hardware and make sure that certain textures or views on maps don’t radically drop FPS for one reason or another on each of those platforms. They are not writing magic code to get you more FPS. They are simply checking that a particle effect on an explosion, or a particular texture or piece of geometry on a map, won’t tank FPS on one platform compared to another. Or that something like a player joining the server doesn’t spike the server (DB still has that problem). That’s what the optimizing process typically consists of, and SD has a guy going through their maps doing exactly this. But that will only improve FPS consistency while looking around the maps for some people; it’s not going to be a wand-waving magic fix for every person’s FPS woes.
Some developers who employ ENGINE WIZARDS (SD does not, to my knowledge) will go into UE’s code (or DX, but we’ll stick to UE for simplicity) and try to speed things up to maximize FPS for their particular game, focusing on events that could impact frame draw times. But that typically gets limited results, because the engine is already pretty set in its ways and about as optimized as it’s going to be without modification. UE3 is heavily CPU-reliant and doesn’t offload much to your GPU, which is why a lot of you will see GPUs being used at only 40-50% of capacity with DB running. Any massive changes (like leaning on the GPU more) would require a lot of work at the engine feature level, which is complicated and often not worth the time invested. That is why many game devs don’t create or even attempt to heavily modify engines, and instead buy them as a platform to build on. Building your own engine or heavily modifying an existing one is hard, and it can wreck your milestone/release timetable if you run into issues (check the history of the tons of games canceled over this exact issue).
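If you want to check whether you’re in this CPU-bound boat yourself, Nvidia users can ask nvidia-smi for GPU utilization while DB is running (AMD has its own tools). A small sketch, with the caveat that the helper function and the sample value here are mine, not anything official; nvidia-smi’s CSV mode prints utilization as a string like “43 %”:

```python
import subprocess

def gpu_utilization_percent(sample=None):
    """Return GPU utilization as an int percentage.

    If `sample` is given, parse that string instead of calling nvidia-smi
    (handy if you just pasted the output from a terminal). The expected
    format is nvidia-smi's CSV mode, e.g. "43 %".
    """
    if sample is None:
        sample = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=utilization.gpu",
             "--format=csv,noheader"],
            text=True)
    # Strip the trailing "%" and whitespace, leaving just the number.
    return int(sample.strip().rstrip("%").strip())

# With a CPU-bound UE3 title running, you might see something like:
print(gpu_utilization_percent(sample="43 %"))  # 43
```

If that number sits around 40-50% while your FPS is low, the GPU is waiting on the CPU, which matches the UE3 behavior described above; a stronger GPU won’t help much in that case.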
So that’s my take on the whole “optimization” fad that has swept gaming forums over the last couple of years. Learn what it means so you can set your expectations correctly. And don’t take this as a “DB is perfect” post… because it’s not. Just know that if you’re on a laptop (any hardware), on older hardware, or on AMD hardware that relies heavily on multi-threading, or you have heat issues that you have not/refuse to troubleshoot (you have no idea how many people insist they don’t have heat issues when it turns out they do), then you might (read again, MIGHT) run into FPS consistency problems with DB and UE3 games in general. If you’ve factored in all of that, then well done. If you then still have FPS issues, or you’re on hardware that’s going to give you problems you can’t do much about, then you can move on to things like FPS configs to help save the day. If that doesn’t work, there are probably other software- or hardware-related issues going on that might not be easy to troubleshoot, or impossible to overcome without changes. It’s not automatically “the devs need to optimize their game!!!” and call it a day.
Also, this is a PS3 game, but you get a good idea of what “optimization” is from this making-of video. That’s how devs can optimize at not just the engine level but the hardware level, neither of which SD really has access to when working on DB on PC, to my knowledge (somebody correct me if I’m wrong here). Hardware-level access, or something very close to it, is actually a huge feature of both Vulkan and DirectX 12, so devs can use tools to try to squeeze frames out of every area of their games and test across tons of hardware platforms. I’m sure UE3 has some tools to help, but again, nothing will be a magic wand here.
Hopefully a few more FPS get squeezed out, frame times stay as consistent as possible across common hardware, and events like explosions or server joins get attention. But nothing is going to take you from 30 FPS to 90 or similar unless you had other issues.
[quote=“Amerika;198820”]UE3 is heavily CPU-reliant and doesn’t offload much to your GPU, which is why a lot of you will see GPUs being used at only 40-50% of capacity with DB running. Any massive changes (like leaning on the GPU more) would require a lot of work at the engine feature level, which is complicated and often not worth the time invested.
[/quote]
For NVIDIA cards, couldn’t you just set the game to render everything on the GPU by default in the NVIDIA Control Panel (and I’m sure the AMD equivalent, the Catalyst Control Center) to negate the issue where UE3 doesn’t render on the GPU by default?
[quote=“Nibbles;198824”][quote=“Amerika;198820”]UE3 is heavily CPU-reliant and doesn’t offload much to your GPU, which is why a lot of you will see GPUs being used at only 40-50% of capacity with DB running. Any massive changes (like leaning on the GPU more) would require a lot of work at the engine feature level, which is complicated and often not worth the time invested.
[/quote]
For NVIDIA cards, couldn’t you just set the game to render everything on the GPU by default in the NVIDIA Control Panel (and I’m sure the AMD equivalent, the Catalyst Control Center) to negate the issue where UE3 doesn’t render on the GPU by default?[/quote]
If only it were that simple. I’m sure some devs have come up with solutions, but I’m honestly not that familiar with all the different developer-specific iterations of UE3 where they’ve solved scaling issues. Others reading might have more specific information. I mostly just know the common UE3 issues that many games tend to have, especially ones by smaller dev teams.