On the Brink of chang...ing my hardware


(Auzner) #41

Reread my post and you’ll see I was already aware of the nature of film capturing temporal data. I think it was amusing to you because what I said went over your head. Had you understood me you would not be repeating what I said. So I’ll explain it to you; by now I’m very used to this. Video games render a frame as if it were taken by a camera with a 1/∞ shutter speed, an instantaneous moment in time. Things like water drops and fan blades will appear frozen and detailed. Placing 24 of those frames together in a second makes a very disjointed (laggy) animation. A much smoother view requires 30-60 frames; beyond that it will not appear disjointed or distract with an apparent lack of animation. Film, on the other hand:

Anything moving will be a blur, but animated at 24fps it seems natural. The film frame is able to represent light through time with that blur across 1/24th of a second. As a still frame it will lack discernible detail; it would make a bad still or desktop wallpaper because we’re viewing light through time. In a 24fps animation of blurs the motion is seen as fluid, and no one thinks it looks disjointed or too slow. If these facts weren’t true we would be using much higher frame rates now that the industry is strong, compared to the early 20th century when film was emerging.
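Here’s a rough back-of-the-envelope sketch of that difference (the object speed is made up for illustration): a game frame has effectively zero exposure time, while a film frame integrates the motion over its whole exposure, so anything moving smears across it.

speed_px_per_s = 2400.0  # hypothetical object speed across the screen

for label, exposure_s in [("game frame (instantaneous)", 0.0),
                          ("film frame, 1/24 s exposure", 1.0 / 24.0)]:
    blur_px = speed_px_per_s * exposure_s   # distance covered while the frame is exposed
    print(f"{label}: ~{blur_px:.0f} px of motion smear")

With those numbers the game frame has 0 px of smear and the film frame about 100 px, which is the frozen-fan-blade versus blurred-fan-blade difference described above.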

I know, I’m just a homo sapien. All those other forums always bring me to this conclusion.

I’ve played games before at 200fps+ on a 60 Hz CRT and “noticed” the “clarity” of it, because a true 60fps was being achieved and there were absolutely no troughs of 10fps or 15fps. But personally I believe 40fps is good enough not to impact player response. FPS is still an average and doesn’t mean every frame is delivered on a fixed timing like film.
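To illustrate the “FPS is an average” point, here’s a small sketch (the frame times are invented): two runs with the same average frame rate, but very different worst-case frames.

steady = [1 / 60] * 60                     # 60 frames of ~16.7 ms each = 1 second
spiky  = [1 / 200] * 57 + [0.715 / 3] * 3  # 57 fast frames plus three long stalls = 1 second

for name, frame_times in [("steady", steady), ("spiky", spiky)]:
    avg_fps  = len(frame_times) / sum(frame_times)
    worst_ms = max(frame_times) * 1000
    print(f"{name}: avg {avg_fps:.0f} fps, worst single frame {worst_ms:.0f} ms")

Both report 60fps on average, but the second run has individual frames well over 200 ms, which is exactly the kind of trough a plain fps counter hides.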


(BMXer) #42

You sounded smart with all that but your first post basically claiming anything above 60hz was pointless is still fail.
120hz>60hz, try it and you will see or refer back to what you quoted DarkangelUK saying.


(1234567) #43

We just bought this gizmo that will allow us to use our TV as our monitor.


(BMXer) #44

That gizmo will likely add a ton of input lag to your gameplay. Generally, playing PC games on anything other than a PC monitor results in some hideous input lag. It makes it feel like your xhair can’t keep up with your mouse movements. Input lag is one of the main reasons you don’t see people using smaller HDTVs as monitors.
For fast paced games like Quake and Brink, the rule of thumb is to stay at a 5ms response time or below.

That being said, it is pretty darn cool to watch shoutcasts and stuff on a 50" plasma! Mirroring what’s on your gaming monitor so your lover can watch you shred noobs from the couch could be sick too!


(Otto) #45

[QUOTE=BMXer;268487]You sounded smart with all that but your first post basically claiming anything above 60hz was pointless is still fail.
120hz>60hz, try it and you will see or refer back to what you quoted DarkangelUK saying.[/QUOTE]

The statement is not accurate, and without getting overly complicated I will try to explain what has already been explained.
The refresh rate of the monitor is the limiting factor for your needed FPS.

Your monitor’s maximum refresh rate is also equal to the best in-game frame rate you should want to be able to generate.
ie:
60hz monitor = maximum useful FPS 60
120hz monitor = maximum useful FPS 120

The only time this is not the case is when the refresh rate is upscaled by the monitor itself, simulating a higher rate than it is actually being fed.

Another reason why someone would argue that having a higher FPS than the monitor can display looks better is frame drop: during a complicated/high-action part of the game the frames can dip hard if your hardware isn’t up to the task, creating the jerky frame lag we have all experienced before. If you lock your frames to your monitor’s refresh rate, great; if not, it can be prudent to aim to keep your FPS well above your monitor’s refresh rate, helping ensure a great image no matter what you encounter.
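As a rough sketch of that frame-lock idea (just an illustration in Python, not any particular engine): cap rendering to the display’s refresh rate by sleeping out whatever is left of each frame’s time budget, since anything rendered beyond that rate is never shown.

import time

refresh_hz   = 60                 # assumed monitor refresh rate
frame_budget = 1.0 / refresh_hz   # ~16.7 ms per refresh at 60 Hz

def run_capped(render_frame, seconds=1.0):
    end = time.perf_counter() + seconds
    frames = 0
    while time.perf_counter() < end:
        start = time.perf_counter()
        render_frame()                          # the game's actual rendering work
        elapsed = time.perf_counter() - start
        if elapsed < frame_budget:
            time.sleep(frame_budget - elapsed)  # idle until the next refresh slot
        frames += 1
    return frames

print(run_capped(lambda: None), "frames in one second")  # roughly the refresh rate

Engines do this with a frame limiter or vsync rather than a sleep loop, but the budget math is the same.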

edit:

[QUOTE=BMXer;268593]That gizmo will likely add a ton of input lag to your gameplay. Generally, playing PC games on anything other than a PC monitor results in some hideous input lag. It makes it feel like your xhair can’t keep up with your mouse movements. Input lag is one of the main reasons you don’t see people using smaller HDTVs as monitors.
For fast paced games like Quake and Brink, the rule of thumb is to stay at a 5ms response time or below.

That being said, it is pretty darn cool to watch shoutcasts and stuff on a 50" plasma! Mirroring what’s on your gaming monitor so your lover can watch you shred noobs from the couch could be sick too![/QUOTE]

If the Gizmo is wireless, then yes, it might cause some lag. If he is like me, having gamed on a 42" LCD HDTV for years and now running a 100" Epson LCD projector on a direct connection from the video card to the display, he has little to worry about. Modern HDTVs have come a long way; granted, spec for spec they don’t match, say, a 22" desktop monitor, but the difference is very minor.

No, not everyone prefers using such a huge screen as we do, at first, but when you get used to the massive step up in size it comes very naturally. I for one felt out of place when I first went from my old 17" LCD monitor to 100", but just like getting used to a mouse and keyboard when you have used a console all your life, I would never go back.


(Weapuh) #46

Refresh rates on TVs are generally slower than on PC monitors; it’s unnecessary since TVs don’t need to cater to fast human input. There is what seems like input lag (not hardware lag but refresh rate lag). My brother doesn’t notice it on his giant LCD HDTV (most people don’t notice when it’s their primary display), but I find it unplayable.


(LyndonL) #47

Obviously can’t be too bad since the majority of gamers (consoles) play on HDTVs :tongue:


(Weapuh) #48

It’s less noticeable because a controller isn’t a 1:1 ratio of in-hand movement to on-screen movement the way a mouse (without accel etc.) is.


(Otto) #49

I find it funny that people notice such a critical difference that playing on an HDTV is a “sacrifice”. I’m not saying people can’t notice it, but we are talking about fractions of seconds.
Maybe I’m getting old. I grew up on an NES on a used 20" CRT TV, 1980s tech; later moved to a PC with a 15" budget CRT monitor around 1993; years later, around 1999, moved to an early-gen 17" LCD monitor and used it competitively for years. When I made the move from it to a modern HDTV some years later, around 2007, it was paradise compared to the others. The difference between modern HDTVs and monitors is mere fractions compared to the leaps and bounds made in the past. Perhaps many people are spoiled these days when it comes to differences in technology.

I pulled up some specs on one of the best Samsung 46" HDTVs and one of Samsung’s best 23.6" desktop monitors.
THE MONITOR 23.6"
http://www.futureshop.ca/en-CA/product/samsung-samsung-23-6-pcoip-lcd-monitor-with-5ms-response-time-lf24ppbcb-za-black-lf24ppbcb-za/10150969.aspx?path=f41bc7056dc4c06e7aeba77b60303b30en02
1080p
Response Time 5 ms
Refresh Rate: 120 Hz (assumed or less)

THE 46" HDTV
http://www.futureshop.ca/en-CA/product/samsung-samsung-46-1080p-120hz-led-hdtv-un46c6500-future-shop-exclusive-un46c6500/10140386.aspx?path=37e79adb4721e5d17ae5ad0b9374488cen02
Response Time 4 ms
Refresh Rate 120 Hz

Granted, I am not a video display expert, but according to the specs, if response time is what everyone says counts, we are talking about a 1ms difference. What’s even funnier in this case is that the HDTV, which is twice the size, has the better response time. I know there are both monitors and HDTVs with even better response times than these two random examples I picked, and it is easier to find a desktop monitor with a 1ms rating than an HDTV. Even then, it just seems very elitist to be so fussy over a difference of 3 milliseconds.

I concede that this is a matter of personal preference; I just don’t like the idea of old misconceptions about technology still lingering on, and of those misconceptions holding people back from the amazing experience of gaming on a big, beautiful HDTV or projector.


(BMXer) #50

The “sacrifice” is huge IMO. Input lag makes aiming more inaccurate. No amount of screen size is worth not being able to accurately control my xhair. Not to mention the fact that you either have to sit so far away from the display that the size is pointless, or you have to physically move your head so much you miss stuff.

Try playing on a 120hz LCD in QuakeLive, ETQW, COD, etc with r_displayrefresh “120”. Now try your HDTV/projector/60hz monitor. Now tell me the 120hz LCD monitor doesn’t feel considerably smoother and more responsive. For me and everyone I know on 120hz, there is no turning back!
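For reference, this is roughly what that setup looks like in the console/config of a Quake-engine game (cvar names as used in Quake Live and other Quake 3 engine games; check your own game’s console, and the cap value is just a common choice):

// rough example for a 120 Hz monitor
seta r_displayrefresh "120"   // request a 120 Hz video mode
seta com_maxfps "125"         // cap the frame rate near the refresh rate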

I’m no video expert either, but everything I have read about HDTV refresh rates leads me to believe most of the 120/240hz stuff is basically a gimmick to try to eliminate motion blur. I have read in a bunch of places that 120hz on an HDTV is just 60hz doubled, which is not at all the same as what’s happening on a PC monitor. Hopefully someone who knows what I mean will chime in and explain it better. I have yet to play an fps game on anything other than a PC monitor where I couldn’t immediately notice the input lag… granted I haven’t tried a 100" projector :).


(Nail) #51

Why, your new computer will have HDMI output, just use that


(Otto) #52

Computer location. My computer and ceiling-mounted projector are only 9ft away from each other if I were to hang an HDMI cable through my room in mid-air. In order to keep the room looking professional and still maintain the best possible video signal to the projector, I had to route a high-grade 25ft HDMI cable through my walls. When you span long distances with any form of data cable the signal will degrade; to help prevent this you need a top-grade HDMI cable. In my case 25ft of HDMI cost me $200 just for the cable, and that was on sale on eBay (retail for the same cable is $350+tax). Cost is one thing; putting the wire neatly in the walls and ceiling is another.

I’m lucky my equipment is all in one room, and it was still as difficult as it was. If our gizmo friend is trying to stream video from one side of the house to the other, getting an HDMI cable long enough and wiring it through the rafters is another matter. Though he isn’t getting the best video signal he could be, the labor alone of running the wire may not be practical in his situation.


(Nail) #53

So you’re saying use an HDMI cable; the only gadget I know of is a wireless media streamer, which is not what I’d use for gaming.


(Auzner) #54

Monoprice.com sells long, heavy-gauge HDMI cables very cheap. Also, if you’re going to spend hundreds of dollars you might as well do HDMI over Cat6. Those use the Cat6 wiring standard, not Ethernet protocols, to transfer video signals; it’s a non-standard, proprietary approach that allows very long runs with smaller wire. Video signals take a lot of bandwidth, so wireless or Ethernet is the wrong way to do HD. The “Gizmo” is also a waste, whatever it is.

[QUOTE=Otto;268632]I find it funny that people notice such a critical difference that playing on an HDTV is a “sacrifice”.

Granted, I am not a video display expert, but according to the specs, if response time is what everyone says counts, we are talking about a 1ms difference[/QUOTE]
When people discuss input lag they mean the added delay of the display’s DSP converting DVI to LVDS. HDTVs that aren’t LCDs don’t have direct 1:1 pixel mapping and do strange things like overscan and underscan the signal. More processing/filtering is going on, so there is a delay as the clocks cycle.
Response time is about the physical polarization speed of the pixels. There are advantageous ways to report this number which do not reflect practical use. I reasoned that at 60 Hz a response of under 16ms is needed; lower is better in case they “lie”.
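For context, that 16ms figure is just the time budget of a single refresh, i.e. 1000 ms divided by the refresh rate; a quick sketch:

for hz in (60, 120):
    print(f"{hz} Hz -> {1000 / hz:.1f} ms per refresh")
# 60 Hz -> 16.7 ms, 120 Hz -> 8.3 ms, so a 4-5 ms response time fits either budget

If the panel can’t finish changing a pixel inside that window, the previous frame is still visibly smearing into the next one.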

The display does stroboscopic processing on the signal to reduce temporal aliasing. It attempts to create extra “motion blur” frames in between the actual frames of data. The practical effect is that every movie looks slightly sped up, or like a soap opera. It still does 3:2 pulldown, even. Some make the mistake of thinking it’s superior because 120/24 works out to a nice integer. They’re not true 120 Hz televisions, because only the signaling to the panel operates at that rate. The actual input standards (HDMI, DVI, etc.) will only accept 60 Hz signals. You can’t feed one a 120 Hz signal and refresh the panel at that rate; it only does 120 Hz for its own processed frames. I personally can’t stand movies “enhanced” by this. In-store demos are always scrolling maps or blocks of text; while it’s interesting to see a difference, no movie is just a bunch of rapidly scrolling text, so it’s an impractical demo. Plasma is something else, and everything there is still truly 60 Hz.
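The arithmetic behind the pulldown remark, as a quick sketch:

film_fps = 24
for panel_hz in (60, 120):
    print(f"{panel_hz} / {film_fps} = {panel_hz / film_fps:g} refreshes per film frame")
# 60/24 = 2.5, so frames alternate between 3 and 2 refreshes (3:2 pulldown judder);
# 120/24 = 5 exactly, which is why the even division looks appealing on paper

As said above, though, the even division only helps if the set is actually being fed and refreshed at 120 Hz, not just interpolating a 60 Hz input.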


(Otto) #55

[QUOTE=Auzner;270708]
The actual input standards (HDMI, DVI, etc.) will only accept 60 Hz signals. You can’t feed one a 120 Hz signal and refresh the panel at that rate; it only does 120 Hz for its own processed frames. [/QUOTE]

Does that mean, since all modern displays use either DVI or HDMI, that there is no screen (monitor/HDTV/projector/etc.) out there that can accept a raw 120hz signal and passively process it?


(brbrbr) #56

Talking about DVI, that’s true only for DVI-D capable devices, and it seriously depends on resolution too, i.e. the resulting bandwidth.


(Auzner) #57

3D capable TVs have to accept 120 Hz signals, so it is possible over the interface. I meant that the 120 Hz and 240 Hz panels with “image smoothing” that use HDMI 1.3 only take in 60 Hz signals and reprocess them higher. 3DTV has to use 120 Hz or higher so that you can have 60 Hz per eye with shutter glasses. I suppose you could give them 120 Hz 2D signals as well. TVs aren’t like computer monitors and tend to have more input lag.


(Auzner) #58

If anyone wants to neckbeard displays, the monitors that trump everything, which no one can contest, are the FW900 models: Sony GDM-FW900, Sony GDM-FW9010, Sun X7145A, and HP A7217A. These are 24" widescreen 1920x1200 CRTs.

http://www.sunshack.org/data/sh/2.0/infoserver.central/data/syshbk/Devices/Monitor/MONITOR_Color_24_FD_Premium_CRT.html

You can run that at 85 Hz despite what it says. Being analog, there’s “no input lag”. If you can find one it’s probably cheaper than a 24" LCD, since everyone just wants to get rid of these.


(BMXer) #59

I actually have an FW900. I used it for about 6 months or so. I got mine used for $100! It’s a great monitor. The ENORMOUS size and weight (literally 100lbs!) made it kind of a pain in the rear.
After playing on it and then going to a Viewsonic 22" 120hz LCD, I can without a doubt say I would never go back to the FW900. It’s been sitting out in my shop collecting dust. The difference in input lag is not noticeable at all, and the LCD felt just as smooth at 120hz in-game.

And I don’t know, man. I could go measure the thing, but I could swear the actual screen size is about the same as my 22". Maybe it’s just the huge CRT case that makes the actual screen look smaller or something, but I didn’t feel like I lost any screen size going to the 22" LCD… or maybe only 2 inches isn’t that noticeable. :slight_smile:


(Herandar) #60

Sounds like my television. Eleven years ago I finally broke down and got a high-end Sony television. This was a year or so before the first HD flat screen models came out, and I was fine with a 36" XBR.

I still have it, and it still works perfectly. It has a very sharp SD picture. I’m not afraid that anyone would ever steal it, though. The damn thing weighs 286 pounds (130 kilos).