BigBill
Posted: Sun Oct 15, 2006 12:12 pm
Joined: Sat Oct 14, 2006 12:38 pm
Posts: 24
Location: San Diego
Just purchased an AirStar-HD5000AV-PCI for HDTV capture. I realize there aren't any 1080p stations in my area. My TV will take 1080p via HDMI and up to 1080i from the component input. I have read here that capturing and writing to disk doesn't require encoding. Once the show is on the hard drive, is it still in 720p or 1080i format? Is it MPEG? Does it require decoding to play it on the TV? What card would give me the best resolution on the TV? Thanks, Bill

tjc
Posted: Sun Oct 15, 2006 12:48 pm
Joined: Thu Mar 25, 2004 11:00 am
Posts: 9551
Location: Arlington, MA
[quote="BigBill"]Once the show is on the hard drive is it still in 720p or 1080i format, is it mpeg? Does it require decoding to play it on the tv? What card would give me the best resolution on the tv?[/quote]
Yes, it's still in the original resolution; yes, it's MPEG-2 (digital HDTV broadcasts are just an encoded MPEG stream); yes, it requires decoding for playback; and for the last question, "it depends". You want a card which has the proper outputs to drive your HDTV at the appropriate resolutions, and which supports a fast enough dot clock to paint the screen at the maximum resolution. For a start, look at what people are using for HDTV in the Tier 1 Hardware Recommendations; the KRP Dragon is probably a good point of reference.
For example in [url]http://mysettopbox.tv/phpBB2/viewtopic.php?t=12158[/url] there is a mention that the maximum dot clock on the FX5200 card may not be high enough.
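To put rough numbers on the dot-clock issue (my arithmetic, not from this thread): using the standard CEA total timings for 1080-line modes (2200x1125 pixels per full frame), 1080p60 needs roughly a 148.5 MHz pixel clock, well above the 135 MHz ceiling mentioned below for some older cards, while 1080i60 fits comfortably:

```python
# Rough pixel-clock estimates using standard CEA-861 total timings
# (2200x1125 total pixels for 1080-line modes). Illustrative only.

def pixel_clock_hz(h_total, v_total, refresh_hz):
    """Pixel clock = total pixels per frame * frames per second."""
    return h_total * v_total * refresh_hz

# 1080p60: full 1125-line frames painted 60 times per second
clk_1080p60 = pixel_clock_hz(2200, 1125, 60)

# 1080i60: 60 fields/s, but each field carries half the lines,
# so the effective full-frame rate is 30 Hz
clk_1080i60 = pixel_clock_hz(2200, 1125, 30)

print(clk_1080p60 / 1e6)  # 148.5 (MHz)
print(clk_1080i60 / 1e6)  # 74.25 (MHz)
```

So a card whose RAMDAC or TMDS encoder tops out around 135 MHz can paint 1080i but not 1080p at 60 Hz.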

ceenvee703
Posted: Sun Oct 15, 2006 5:36 pm
Joined: Fri Apr 02, 2004 10:08 am
Posts: 1637
Location: Virginia, USA
[quote="BigBill"]I realize there aren't any 1080p stations in my area.[/quote]
And just to clarify, there aren't any 1080p stations in ANY area of the US. 1080p is not part of the ATSC broadcast specification.

BigBill
Posted: Sun Oct 15, 2006 10:53 pm
Joined: Sat Oct 14, 2006 12:38 pm
Posts: 24
Location: San Diego
I believe 1080/24p is still part of the ATSC broadcast spec.

ceenvee703
Posted: Mon Oct 16, 2006 2:57 am
Joined: Fri Apr 02, 2004 10:08 am
Posts: 1637
Location: Virginia, USA
You are correct and I stand corrected. The problem with 1080p is one of bandwidth, not that it's not part of the spec.
http://en.wikipedia.org/wiki/1080p#Broa ... _standards
My original take-home message is still correct, though: there are no 1080p broadcasters in the US and probably won't be for a long time.

getnate
Posted: Tue Oct 17, 2006 4:10 pm
Joined: Wed Apr 26, 2006 10:47 am
Posts: 17
[quote="tjc"]For example in http://mysettopbox.tv/phpBB2/viewtopic.php?t=12158 there is a mention that the maximum dot clock on the FX5200 card may not be high enough.[/quote]
I struggled with this problem for a day, but there is a solution. You can work around the maximum pixel clock limitation in the latest drivers by setting the "NoMaxPClkCheck" option in your XF86Config-4. This option and others are documented in the README for the drivers. I know the hardware in the FX5200 supports a pixel clock > 135 MHz; 1080p and higher resolutions work in Windows on the same machine!
FYI, I also had to disable all forms of EDID in the driver because my Westinghouse 1080p monitor was telling the card it did not support 1080p. After all this I got the FX5200 128MB working at 1080p with the latest NVIDIA drivers.
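A sketch of what the relevant "Device" section might look like. The exact option names and spellings vary between NVIDIA driver releases (the "ModeValidation" wrapper and "UseEDID" shown here are my assumptions, and the Identifier is just a placeholder), so check the README shipped with your specific driver version:

```
Section "Device"
    Identifier  "nvidia0"            # placeholder name
    Driver      "nvidia"
    # Skip the driver's maximum-pixel-clock sanity check
    Option      "ModeValidation" "NoMaxPClkCheck"
    # Ignore the display's EDID entirely (the Westinghouse panel
    # was wrongly reporting that it did not support 1080p)
    Option      "UseEDID" "FALSE"
EndSection
```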

gamelyn31
Posted: Thu Oct 19, 2006 7:15 am
Joined: Sun Oct 23, 2005 7:25 pm
Posts: 62
Location: Chicago, IL
getnate - just out of curiosity, are you using DVI, VGA, or Component (or something else?) Is the Westinghouse "monitor" classified as a computer monitor or a tv?
I only ask because I connect my mythbox to my Panasonic tv using DVI and even after disabling EDID it just won't display 1080i (though it displays 720p just fine by upconverting). The tv's specs show native support for 1080i and even give me the modeline settings for it - nonetheless it doesn't work (all I get is a black screen). I've tried several different modelines, no luck.
I'm wondering if it might work using component instead of DVI...
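For reference, a commonly quoted 1080i modeline uses the standard broadcast timing (74.25 MHz pixel clock). This is the generic SMPTE/CEA timing, not necessarily what the Panasonic manual lists, so treat it as a starting point rather than a known-good line for that set:

```
# Generic 1920x1080 interlaced timing, 60 Hz field rate
#                     pclk  hdisp hss  hse  htot  vdisp vss  vse  vtot
Modeline "1920x1080i" 74.25 1920 2008 2052 2200  1080 1084 1094 1125 Interlace +HSync +VSync
```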

afrosheen
Posted: Tue Oct 31, 2006 9:59 am
Joined: Sat Mar 26, 2005 3:49 pm
Posts: 290
1080i probably isn't native on that set, particularly if it's a plasma or last year's LCD. If the default resolution (or max resolution) is 1366x768 then it's 720p. If it's taking a 1080i signal, it's definitely rescaling it. My 32" Aquos is the same way. It will 'do' 1080i but the max it can physically display is 720p. All sources are either up or downconverted to fit the native res.

gamelyn31
Posted: Thu Nov 09, 2006 10:55 pm
Joined: Sun Oct 23, 2005 7:25 pm
Posts: 62
Location: Chicago, IL
I failed to mention that the Panasonic tv is a CRT. It is 1080i. I can get it to work with a regular set top box, via component, it just doesn't seem to want to work via DVI from my Mythbox. That's why I want to try Mythbox --> TV via component, just to see if it works.
Unfortunately, my 1-yr-old hard drive in my mythbox died, so I never got the chance. I finally got a replacement drive, so I'll be able to try it out this weekend... hey, at least I get to install R5D1 now!

ed.gatzke
Posted: Fri Nov 10, 2006 8:52 am
Joined: Mon Oct 02, 2006 7:24 am
Posts: 39
Recent thread on mythtv-users
http://www.gossamer-threads.com/lists/m ... ers/235004
Or read some of the wikipedia stuff for basics on 1080p vs 1080i vs 720p etc.
I just read some people talking about 720p sets rescaling the signal to their native resolution, so when you see 1366x768 you are actually not getting a true 720p signal.
I think you might as well get a 1080p set and upscale and deinterlace everything. I have been waiting for 1080p sets for years, ever since I heard about the numerous resolutions in the ATSC standard, so I am about to jump to HD finally...

afrosheen
Posted: Fri Nov 10, 2006 9:16 am
Joined: Sat Mar 26, 2005 3:49 pm
Posts: 290
Well, actually, since the 720p spec is 1280x720 and the set's native resolution is 1366x768, you get true 720p, but the set is using the extra pixels for overscan, so it does a little scaling on its own. There's a reason 99% of HD TVs and monitors are built to that resolution rather than a truer 1280x720.
I still think 1080i is pointless. My Aquos will accept a 1080i signal but it's 720p before it gets to my eyeballs, and progressive always looks better than interlaced anyway. Plus if you don't have the physical pixels needed for 1080, then it's extra pointless. Go with 720p on 1366x768 native sets and 1080p on 1920x1080 sets. Simple.

ed.gatzke
Posted: Fri Nov 10, 2006 1:00 pm
Joined: Mon Oct 02, 2006 7:24 am
Posts: 39
Quote: There's a reason 99% of hd tvs and monitors are all built to that resolution rather than a truer 1280x720.
I think it goes back to the old 1024x768 resolution. 1366x768 is just a widescreen extension of the old XGA format. I forget why we got stuck with those old VGA, SVGA, XGA resolutions (640x480, 800x600, 1024x768), but I think it has to do with optimally using memory buffers, back when 1 MB was a lot. Maybe I am clueless on this.
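The memory-buffer story can at least be sanity-checked with some quick arithmetic (my illustration, not from the thread). At 8 bits per pixel, common in that era, 1024x768 lands just under a 1 MiB framebuffer:

```python
# How classic VGA-era modes map onto power-of-two framebuffer sizes,
# assuming 8 bits per pixel (one byte per pixel). Illustrative only.

MiB = 1024 * 1024

for w, h in [(640, 480), (800, 600), (1024, 768)]:
    nbytes = w * h  # 8 bpp -> one byte per pixel
    print(f"{w}x{h}: {nbytes} bytes ({nbytes / MiB:.2f} MiB)")

# 1024x768 at 8 bpp is 786432 bytes -- it just fits inside a
# 1 MiB framebuffer, consistent with the memory-size story.
```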
Still, I want to see all my pixels, so any display less than 1080 lines of resolution seems like a waste to me.

afrosheen
Posted: Fri Nov 10, 2006 1:49 pm
Joined: Sat Mar 26, 2005 3:49 pm
Posts: 290
If your resolution is at least 1280x720 you *are* seeing all your pixels, it just depends on what format they arrive in and if there's any scaling involved.
The 'p' in 720p means progressive and the 'i' in 1080i is interlaced. Progressive almost always looks better because each frame is a complete frame on screen, whereas interlaced video draws every other line on each pass. The links posted about two responses up cover the differences way better than I can.
Once again, if you get the choice, always go with progressive scan. All the new stuff is moving toward 1080p (Blu-ray, HD DVD, new screens).
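Some rough numbers on the interlaced-vs-progressive tradeoff (my arithmetic, not from the thread): 1080i60 delivers 60 fields per second of 540 lines each, while 720p60 delivers 60 complete 720-line frames, so 720p actually pushes more unique scan lines per second:

```python
# Unique scan lines delivered per second, ignoring blanking intervals.

# 1080i60: 60 fields/s, each field carries every other line (540 lines)
lines_1080i = 60 * (1080 // 2)

# 720p60: 60 complete 720-line frames per second
lines_720p = 60 * 720

print(lines_1080i)  # 32400
print(lines_720p)   # 43200
```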