Monitor

Úlfheðinn

Section Moderator
So I am in the market for a decent monitor.

I'm not really willing to break the bank, but I'll also have a pretty powerful PC, so it would be dumb to buy something too cheap. From what I've read online I think I'm looking for a 24 or 27 inch monitor (leaning towards the 24 inch, because it seems like 27 inch ones are a little less sharp at 1080p and only really kick ass at 1440p).

I don't really play FPSes that much anymore and honestly I'm mostly going to be playing Battle of Stalingrad and other flight simulators.

With that being said, I've been looking at the Asus VG248QE and the BenQ XL2420Z for the ridiculously good refresh rate and near total lack of input lag.

At the same time I've heard good things about two IPS monitors: the Samsung S24D390HL and the Dell U2414H.

So I guess it hinges on whether I want a super fast, nearly input-lag-free monitor or a higher quality IPS one.

Any recommendations?
 
While this isn't particularly on topic, it's related.

Nvidia just announced some fancy new stuff. Most applicable here is something called DSR - Dynamic Super Resolution. It lets their cards render at 4K and then downsample to 1920x1080. It's like any other downsampling hack, except this one works at the driver level.

It works on their brand new GPUs - the GTX 980/970 at $550 and $330 launch prices. It might be an excellent alternative to buying a high end monitor. Reported benchmarks show the 970 roughly equal to or slightly better than an R9 290, which I have experience with, and let me tell you it's a beast. You could pair a 970 with a 1080p monitor and get similar results to buying a higher resolution monitor, except this would also let you run graphics-intensive games well at 1080p five years from now. I would expect it to absolutely destroy your desired games at 4K.
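To make the downsampling idea concrete, here's a toy sketch in Python (names are my own stand-ins; DSR itself reportedly uses a fancier Gaussian filter inside the driver, while this is just a plain 2x2 box average to show the principle):

[code]
# Toy illustration of 4K -> 1080p downsampling: average each 2x2 block of
# pixels in the high-res frame into one pixel of the output frame.
import numpy as np

frame_4k = np.random.rand(2160, 3840, 3)  # stand-in for a frame rendered at 4K (H x W x RGB)

# Group pixels into 2x2 blocks and average: (2160, 3840) -> (1080, 1920)
frame_1080p = frame_4k.reshape(1080, 2, 1920, 2, 3).mean(axis=(1, 3))

print(frame_1080p.shape)  # (1080, 1920, 3)
[/code]

The payoff is that each output pixel is built from four rendered samples, which is why downsampled 1080p looks noticeably cleaner than native 1080p.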
 
Recycling:

I found out that we have a "Salora 24" Full HD LED TV with a DVD player, 200 Hz, 12V, DVB-T2, 3 x HDMI, USB" lying around in our cabinets, so I decided to ditch my old monitor and upgrade for free.
I know literally nothing about monitors, even though I've asked a few times and read the responses to other people, so pardon my ignorance. It's working, I guess, over a VGA cable. With HDMI the picture quality wasn't as sharp and the input lag was rather murderous, and I couldn't figure out how to deal with it. So: VGA, 1080p and 60 Hz (60 Hz was the maximum over HDMI as well), despite the advertised 200 Hz, but I guess there's something about that I don't understand.

Done with the rambling and on to the questions: Am I missing out on anything except audio on my TV-monitor-thingy by using a VGA cable over HDMI? Can I fix the input lag and the not-so-sharp picture I got with the HDMI cable (note: it's not an expensive cable)? And what can I do about my screen chopping off a few pixels at the top (E: actually it chops pixels from the left side as well)? I've already tried mucking about with the limited settings the monitor has. It's not a huge problem, but it's certainly annoying.

Also, unrelated to the monitor (well, not really): I didn't lose any frames in Arma 3 after upgrading from 900p, so if I conclude that in Arma 3 I'm bottlenecked more by my i7 3820 than by my GTX 780 Ti, am I wrong? I do remember reading that Arma is notoriously hard on the CPU.

I think that is all, thanks.
 
Arma is generally going to be bottlenecked by your single threaded performance before your GPU, although if you go to a wilderness area and crank all the settings up, your GPU will struggle to keep up. If you really want to find your GPU bottleneck, increase your render scaling to 200% (4K), 300% (~6K) or 400% (8K) and watch your framerate dip to a few dozen at best.
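For reference, here's a quick back-of-the-envelope sketch in Python, assuming the scaling percentage applies to each axis of a 1920x1080 base:

[code]
# What the render resolution works out to at each scaling setting,
# assuming the percentage scales each axis of a 1920x1080 base.
base_w, base_h = 1920, 1080
for pct in (100, 200, 300, 400):
    w, h = base_w * pct // 100, base_h * pct // 100
    print(f"{pct}%: {w}x{h} ({w * h // (base_w * base_h)}x the pixels of 1080p)")
[/code]

200% is already four times the pixels of native 1080p and 400% is sixteen times, which is why the framerate collapses so fast.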
 
Alright, thanks. I mean I already get a somewhat stable 30 that sometimes dips into the twenties so I think I'm fine  :lol:

I'd still like to deal with the borders of my screen though; I think I'll do some googling tomorrow once I get home from school.
 
If you're on an AMD card then you need to go into the CCC and turn the scaling option off; it's a slider somewhere in the options. Don't know why, but it seems AMD decided it would be a wonderful idea to make the default setting downscale the entire image...

You should also be getting more than 20-30 FPS on an i7 in Arma 3, short of the most intensive areas or servers with a lot of "stuff".
 
I wish I had AMD if it's that easy >.>

Well, with graphics on ultra I get 20-40 FPS on one of the "tutorial" missions; the peak is 40 but it's usually around 30, and it drops to 20 when stuff happens or I spin around too much. If I host a server myself, no matter what graphics settings I use I get around 20 FPS, which is pretty ridiculous, but hey, gotta take one for the team.
 
Set up a dedicated server on your desktop as a separate process. It entirely eliminates the extra overhead of running the server and client in the same process, effectively doubling the concurrency and halving the single threaded load.

I split my 4 cores (hyperthreading disabled) equally between arma3.exe and arma3server.exe, leading to almost 100% utilization for some missions and, more importantly, good client FPS without sacrificing server stability/AI routines.

If you want, you could run the server on a single core, but it would max that core out very quickly and your AI would start to get dumber  :grin:
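If it helps, here's a minimal launch sketch in Python for Windows; the install path and the -config/-connect parameters are just placeholders for your own setup:

[code]
# Launch arma3server.exe and arma3.exe as separate processes, each pinned to
# two of four cores using cmd's "start /affinity" (the value is a hex bitmask
# of logical cores: 0x3 = cores 0-1, 0xC = cores 2-3).
import subprocess

ARMA_DIR = r"C:\Program Files (x86)\Steam\steamapps\common\Arma 3"  # adjust to your install

subprocess.run('start "server" /affinity 3 arma3server.exe -config=server.cfg',
               shell=True, cwd=ARMA_DIR)
subprocess.run('start "client" /affinity C arma3.exe -connect=127.0.0.1',
               shell=True, cwd=ARMA_DIR)
[/code]

You could do the same from two shortcuts or a .bat file; the only important part is that the two affinity masks don't overlap.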
 
Could the problem with my screen borders extending beyond the actual screen be overscan? I don't even know if that's a thing with VGA, but I tried all the overscan options with an HDMI cable and it didn't really change anything. I mucked about with my NVIDIA control panel and nothing seemed to work, and the TV's settings weren't of any use either.
 
Oh, TVs are weird. Good luck getting it to fit correctly at the right resolution. My "1080p" TV only displays properly at 1366x768 for computer inputs.
 
****. Well I guess it's just a "minor" inconvenience as it doesn't really hide anything. Meh, gotta deal with it.

Thanks anyway. I'll try to remember to post here if I figure out how to fix it.
 