 Post subject: 1080i Fields Swapped
PostPosted: Sun Oct 01, 2006 2:57 pm 
Joined: Wed Dec 22, 2004 1:45 pm
Posts: 28
Hi.

I'm running R5D1 on a "near-Dragon" system. I have the ASUS 6200TC video card and the MSI K8N Neo4 Platinum motherboard.

When I play back a 1080i recording through my 1080i modeline, the interlaced fields are obviously swapped. Motion looks really strange, and on about half of the scene changes you see a field from the new scene, a field from the old scene, and then fields from the new scene; that is, scene changes flicker between the old and new scene for a moment.

This happens with XvMC, libmpeg2, and Standard playback. If I switch to a 1920x540p modeline and run Bob deinterlacing it looks pretty good, but I wanted 1080i because then I wouldn't have to muck with the deinterlace settings each time I want to watch a 1080i show, and I would also be sure that the upper scanlines were drawn below the lower scanlines (with 540p/Bob it's a crapshoot).
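(For the curious: the 1920x540p mode is basically the 1080i modeline below with the vertical numbers halved -- something like the following, though this is just the derivation and not necessarily the exact line I'm running:)
Code:
    Modeline "1920x540" 79.84 1920 2040 2200 2368 540 545 553 562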

My modeline is:
Code:
    Modeline "1920x1080i" 79.84 1920 2040 2200 2368 1080 1090 1106 1125 +vsync -hsync interlace

but I've tried several and all have the same problem.
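In case it matters, the modeline is declared in the Monitor section of xorg.conf and referenced from the Screen section's Modes line -- roughly like this (trimmed to the relevant parts; the identifiers are just placeholders):
Code:
    Section "Monitor"
        Identifier "TV"
        # custom 1080i timing fed to the VGA -> component transcoder
        Modeline   "1920x1080i" 79.84 1920 2040 2200 2368 1080 1090 1106 1125 +vsync -hsync interlace
    EndSection

    Section "Screen"
        Identifier "Screen0"
        Monitor    "TV"
        DefaultDepth 24
        SubSection "Display"
            Depth 24
            Modes  "1920x1080i" "1280x720"
        EndSubSection
    EndSection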

Thanks for any help.


 Post subject:
PostPosted: Mon Oct 02, 2006 12:24 am 
Joined: Tue Jan 18, 2005 2:07 am
Posts: 1532
Location: California
You're probably going to need to provide more info to get some help on this one, such as:

1. What version of KnoppMyth?

2. What video card?

3. What video driver and version?

4. What output device are you displaying on?

5. What type of connection to the video display are you using?

As for fiddling with the deinterlace settings -- I've documented a technique to automate that process in this thread: http://mysettopbox.tv/phpBB2/viewtopic. ... ght=filter


Marc


 Post subject:
PostPosted: Mon Oct 02, 2006 12:55 am 
Joined: Wed Dec 22, 2004 1:45 pm
Posts: 28
marc.aronson wrote:
1. What version of KnoppMyth?

R5D1
marc.aronson wrote:
2. What video card?

ASUS Extreme N6200TC256/TD

marc.aronson wrote:
3. What video driver and version?

NVIDIA 8774
marc.aronson wrote:
4. What output device are you displaying on?

Sony KD34XBR960.
marc.aronson wrote:
5. What type of connection to the video display are you using?

VGA -> Audio Authority 9A60 -> Component. Since the 9A60 just transcodes the analog signal, the fields are presumably already swapped when they come out of the VGA port on the card.


 Post subject:
PostPosted: Mon Oct 02, 2006 9:29 pm 
Joined: Tue Jan 18, 2005 2:07 am
Posts: 1532
Location: California
My HD video connection has always been either VGA or DVI -- I've never used the Audio Authority 9A60, but there are others on the forum who have. Perhaps one of them has some insights into this...

Marc


 Post subject:
PostPosted: Tue Oct 03, 2006 1:16 pm 
Joined: Wed Dec 22, 2004 1:45 pm
Posts: 28
marc.aronson wrote:
My HD video connection has always been either VGA or DVI -- I've never used the Audio Authority 9A60, but there are others on the forum who have. Perhaps one of them has some insights into this...

Marc


Well, I was pretty sure it had nothing to do with the 9A60, so I connected my box directly via VGA to a ViewSonic P95f+ multiscan CRT and ran my usual 1080i mode. The problem persists.

Marc, if you run a 1080i modeline without problems, could you give it to me, and I can see if the modeline has anything to do with it?


 Post subject:
PostPosted: Wed Oct 04, 2006 9:16 pm 
Joined: Tue Jan 18, 2005 2:07 am
Posts: 1532
Location: California
Sure. I don't use 1080i anymore since I now connect to my HDTV via DVI, and the DVI connection only understands 720p, but I used the following modeline successfully for several months with the VGA connection:

Code:
        modeLine "1920x1080" 74.25 1920 1960 2016 2200 1080 1082 1088 1125 Interlace


Marc


 Post subject:
PostPosted: Sat Oct 07, 2006 8:20 pm 
Joined: Wed Dec 22, 2004 1:45 pm
Posts: 28
marc.aronson wrote:
Sure. I don't use 1080i anymore since I now connect to my HDTV via DVI, and the DVI connection only understands 720p, but I used the following modeline successfully for several months with the VGA connection:

Marc


No good. With your modeline I still have the problem.

Leif


 Post subject:
PostPosted: Sat Oct 07, 2006 8:39 pm 
Joined: Tue Jan 18, 2005 2:07 am
Posts: 1532
Location: California
Maybe it's something else. Consider:


1. utilities/setup->setup->tv_settings->Playback->Screen 1 (General playback):
Do not select the following: "Enable OpenGL Vertical sync for timing", "Enable realtime priority threads", "use video as timebase".

Make sure you do select "Extra Audio Buffering".

2. utilities/setup->setup->tv_settings->Playback->Screen 8 (Overscan): Make sure all of these are set to "0".

3. utilities/setup->setup->Appearance: Tell me how you have screen 3 (video mode settings) configured.

Marc


 Post subject:
PostPosted: Sat Oct 07, 2006 9:10 pm 
Joined: Wed Dec 22, 2004 1:45 pm
Posts: 28
marc.aronson wrote:
1. utilities/setup->setup->tv_settings->Playback->Screen 1 (General playback):
Do not select the following: "Enable OpenGL Vertical sync for timing", "Enable realtime priority threads", "use video as timebase".

Make sure you do select "Extra Audio Buffering".

Previously I had this just as you say, except for "Enable realtime priority threads," which I have now unchecked.

marc.aronson wrote:
2. utilities/setup->setup->tv_settings->Playback->Screen 8 (Overscan): Make sure all of these are set to "0".

This is how I have them set.

marc.aronson wrote:
3. utilities/setup->setup->Appearance: Tell me how you have screen 3 (video mode settings) configured.

Separate video modes for GUI and TV playback is checked.

GUI: 1280x720
Video Output: 1920x1080
Rate: Any
Aspect: 16:9

In X / In Y / Output / Rate / Aspect
1920 / 1080 / 1920x1080 / Any / 16:9
1280 / 720 / 1280x720 / Any / 16:9
0 / 0 / 400x300 / blank / Default

I made the one change you suggested (disable realtime priority thread) and I still have the problem.

I also tried changing the Y scan displacement from 0 to 1 or -1, and that doesn't fix it either!

I found another mention of fields being swapped:

http://www.gossamer-threads.com/lists/mythtv/users/214506#214506

I e-mailed him. Hopefully he's made some progress. :)

Leif


 Post subject:
PostPosted: Sun Oct 08, 2006 12:42 am 
Joined: Wed Dec 22, 2004 1:45 pm
Posts: 28
I have a theory!

OK, so apparently MPEG data can have a top_field_first flag. I found a place in MythTV where this flag is handled by the Bob filter, but I can't find a place outside the Bob filter where it is handled (though I'm not completely sure).

Since my display works fine at 1920x540p with Bob, and not well without it, my bet is that my local station is sending MPEG data whose top_field_first flag is the opposite of what my video card expects in the interlaced mode.

This would explain why it doesn't happen to many people. You'd have to be in an area where the TV station sends MPEG this way.

I spent about half an hour trying to get ffmpeg to print the value of this field for my shows, but I'm giving up for now.
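(In case anyone wants to try the same check: with a reasonably recent ffmpeg/ffprobe, something along these lines should print the flag per frame -- the exact entry names vary between versions, so treat it as a sketch.)
Code:
    # dump interlacing flags for the first 30 video frames of a recording
    ffprobe -v error -select_streams v:0 \
            -show_entries frame=interlaced_frame,top_field_first \
            -read_intervals "%+#30" -of csv recording.mpg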


 Post subject:
PostPosted: Sun Oct 08, 2006 4:46 am 
Joined: Tue Jan 18, 2005 2:07 am
Posts: 1532
Location: California
MythTV 0.20 advertises a feature that automatically detects whether a video requires deinterlacing. While this won't solve the 1080i problem you described, it would let you always use the 540p mode without having to manually toggle the deinterlace setting. There are postings that describe how to migrate R5D1 to 0.20.

Marc


 Post subject:
PostPosted: Mon Oct 09, 2006 5:26 pm 
Joined: Wed Dec 22, 2004 1:45 pm
Posts: 28
marc.aronson wrote:
MythTV 0.20 advertises a feature that automatically detects whether a video requires deinterlacing. While this won't solve the 1080i problem you described, it would let you always use the 540p mode without having to manually toggle the deinterlace setting. There are postings that describe how to migrate R5D1 to 0.20.

Marc


I've noticed a few people complaining that the detection in 0.20 isn't very good, and that XvMC has issues as well, so I think I'll skip it for now. And it would be nice to get this working with a real 1080i modeline, so I could have the full spatial resolution.

I ran my test, getting ffmpeg to tell me what the top_field_first flag was. It's always 1 for my recordings. I have been *completely* unable to figure out what my video card is supposed to be doing with an interlaced mode. I guess back in the day when interlaced modes were common, computers displayed static images and nobody cared.

I also read the code for the "Overscan" page and found out why changing the value to -1 or 1 did nothing: if you don't stretch or shrink the display by setting the first two entries to something nonzero, it ignores the last two entries. I'm tempted to compile my own 0.19-fixes without this restriction so I can try moving the video down one line and see if that fixes it for all my recordings.


 Post subject:
PostPosted: Tue Oct 10, 2006 12:16 pm 
Joined: Wed Dec 22, 2004 1:45 pm
Posts: 28
[quote="volfy"]
I ran my test, getting ffmpeg to tell me what the top_field_first flag was. It's always 1 for my recordings.
[/quote]
Oops, never mind. Actually I have recordings that have top_field_first=0, and they play wrong too. So my theory that the problem is MythTV somehow not honoring that flag when not running Bob is wrong.

Back to square one.


 Post subject:
PostPosted: Tue Oct 10, 2006 1:52 pm 
Joined: Fri Apr 02, 2004 10:08 am
Posts: 1637
Location: Virginia, USA
[quote="volfy"]I have been *completely* unable to figure out what my video card is supposed to be doing w/ an interlaced mode.[/quote]

I've been following this thread, and I think this is the key to your problem: making sure that your video card is actually outputting an interlaced signal, that the interlacing matches what your TV expects, and that playback of interlaced material is synced so the right fields are shown at the right times.
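One quick sanity check for the first part (nothing MythTV-specific, just the X server log): after starting X with the interlaced modeline, the log should show the mode being accepted with its Interlace flag. Something like this, though the exact wording depends on the driver version:
Code:
    # look for the 1080i mode in the X server log
    grep -iE "interlace|1920x1080" /var/log/Xorg.0.log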

Given that, I think most people just run progressive and let Bob handle the deinterlacing. It's easier and the results are pretty smooth.

