 Post subject:
PostPosted: Sat Jul 03, 2004 8:07 am 
Joined: Sat Mar 13, 2004 6:41 am
Posts: 44
Well, I don't know how far along R5 is, but these drivers are well worth considering. The interlaced modes are great for TV-out, so there's no need for deinterlacing anymore (good for weaker machines), and the overscan and anti-flicker controls are excellent too, as is the nvidia-settings tool.

As for the GLX problem, I guess the Debian gurus here will solve it anyway (maybe we need kernel 2.6 for this).


BTW, how can I run a script before mythfrontend autostarts? I want to run nvidia-settings -l before mythfrontend.

Thanks


 
 Post subject:
PostPosted: Sat Jul 03, 2004 11:22 am 
Joined: Thu Mar 25, 2004 11:00 am
Posts: 9551
Location: Arlington, MA
tanzania wrote:
BTW, how can I run a script before mythfrontend autostarts? I want to run nvidia-settings -l before mythfrontend.

Just add it to your /home/mythtv/.fvwm/.fvwm2rc out near the end.
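For example, a minimal sketch (the stock .fvwm2rc very likely already defines an InitFunction with a line that launches mythfrontend, so the exact spot will differ; the point is just that the nvidia-settings line goes first):

Code:
AddToFunc InitFunction
+ "I" Exec exec nvidia-settings -l
+ "I" Exec exec mythfrontend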

Most of the things that it controls can also be done through settings in your /etc/X11/XF86Config-4 or through the Xv controls, which has the advantage of reducing your "part count". The biggest advantage of the tool is being able to interactively tune the video output. 8-)
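For instance, the TV output type and overscan can be set in the Device section of XF86Config-4; a rough sketch only (the Identifier and values here are placeholders, and you should check the nvidia driver README for the options your driver version actually supports):

Code:
Section "Device"
    Identifier "nvidia0"
    Driver     "nvidia"
    Option     "TVOutFormat" "SVIDEO"   # or "COMPOSITE"
    Option     "TVOverScan"  "0.7"      # 0.0 to 1.0; higher fills more of the screen
EndSection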


 
 Post subject: Re: works great
PostPosted: Sat Jul 03, 2004 5:00 pm 
Joined: Mon Jun 28, 2004 7:35 pm
Posts: 18
xmichael wrote:
I'm still sad that NVidia hasn't done anything to provide "flicker reduction" features, like their Windows counterpart...


Update the drivers, then run nvidia-settings, and there is a flicker-reduction control just like on Windows. It works very well and makes the program-info text much more readable.

**Edit: Oops, it's nvidia-settings, not nvidia-configure.


 
 Post subject:
PostPosted: Sun Jul 04, 2004 8:45 am 
Joined: Sun Mar 07, 2004 5:34 am
Posts: 116
Location: UK
Hate to rain on everyone's parade.

I pulled my Matrox card out, put my 440MX back in, loaded the new drivers, and played for hours with modelines etc.

I am using DVB feeds, which are interlaced by nature. As far as I can see there is no benefit for interlacing over the original drivers. The improvement seems to be interlaced modes on the VGA output; it does not seem to help with passing interlaced content seamlessly through the card to the S-Video output. So this is great for those with an HDTV with VGA input, but for us old-fashioned people on S-Video/composite there is nothing to be gained on the interlacing front. :(


 
 Post subject:
PostPosted: Sun Jul 04, 2004 10:49 am 
Joined: Thu Mar 25, 2004 11:00 am
Posts: 9551
Location: Arlington, MA
Are you running the display at 640x480 (or the PAL native resolution)?


 
 Post subject:
PostPosted: Sun Jul 04, 2004 1:31 pm 
Joined: Sun Mar 07, 2004 5:34 am
Posts: 116
Location: UK
I am running at 800x600, 50 Hz interlaced. Unfortunately the new drivers refuse the 720x576 modeline that worked perfectly well with the previous drivers and mark it as "not a valid TV-out mode". [Tell that to the whole of Europe apart from France, Australia ...... :lol: :lol: ]
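For reference, this is the sort of 720x576 interlaced modeline I mean, the standard 13.875 MHz PAL one that the old drivers were happy with (timings quoted from memory, so treat them as approximate):

Code:
ModeLine "720x576PAL" 13.875 720 744 808 888 576 581 585 625 -hsync -vsync Interlace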

I was running the Myth GUI at 720x576, sizing the picture to that, and then using overscan (which does work :D) to fill the screen.

I have just seen there is a similar thread running on the Myth mailing list. Seems I am not alone in pointing out that the emperor needs a new tailor 8)

Playing films and anything that is frame-based works great without kerneldeint, but it always has. A good test is sports or fast-panning TV studio shows.


 
 Post subject:
PostPosted: Sun Jul 04, 2004 4:33 pm 
Joined: Fri Apr 02, 2004 10:08 am
Posts: 1637
Location: Virginia, USA
I had always heard that nVidia's TV-out did not properly leave an interlaced signal alone. I saw as much when trying to do PVR-type stuff under Windows... although the nVidia software claimed to support interlaced passthrough, I never saw it on sports and live TV shows.

It seemed that only certain boards (Matrox, possibly?) supported passing an interlaced signal straight through to their TV outputs.


 
 Post subject:
PostPosted: Mon Jul 05, 2004 2:58 pm 
Joined: Thu Jun 24, 2004 4:06 pm
Posts: 23
red321 wrote:
Hate to rain on everyone's parade.

I pulled my Matrox card out, put my 440MX back in, loaded the new drivers, and played for hours with modelines etc.

I am using DVB feeds, which are interlaced by nature. As far as I can see there is no benefit for interlacing over the original drivers. The improvement seems to be interlaced modes on the VGA output; it does not seem to help with passing interlaced content seamlessly through the card to the S-Video output. So this is great for those with an HDTV with VGA input, but for us old-fashioned people on S-Video/composite there is nothing to be gained on the interlacing front. :(


Maybe you had your settings wrong. I just updated to the new nvidia drivers; I have a GeForce4 MX card and use S-Video out. By switching the output mode to either of the new HD480 modes (instead of the NTSC-M mode), I get an awesome, flicker-free output on my Toshiba 27" TV. And the XvMC video playback is MUCH better than it ever was with the old drivers. I had previously given up on XvMC and was using software decoding with the kerneldeint filter active.
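(For anyone who wants to try the same switch: it is just the TVStandard option in the Device section of XF86Config-4. My exact file may differ, but roughly:)

Code:
Option "TVStandard" "HD480i"   # was "NTSC-M"; "HD480p" is the other new HD480 mode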

The video playback quality with these new drivers and XvMC enabled is vastly superior to software decoding with the standard deinterlacing filter included with MythTV, and it's almost as good (maybe even exactly as good) as using the kerneldeint filter. Only now the processor load has dropped drastically during video playback. There's no distracting ghosting effect that I can see, and even my girlfriend who knows nothing about such things commented on how nice the video output was.

As far as the anti-flicker setting goes, that seems to be enabled when you use any of the HDxxxx output modes. I didn't have to run nvidia-settings to get the flicker to go away. All thin horizontal lines in the settings screens are now rock-steady and smooth. All fonts are much easier to read. As far as the menus go, the viewing quality is now comparable to my Bell ExpressVu satellite receiver's menu screens. Clear, sharp, and no flicker whatsoever.

One note: I did have to comment out the GLX module for the new driver to work. I don't use GLX-enabled stuff anyway, so I didn't bother fixing that Debian-related problem.
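(For anyone hunting for it, that means the Module section of /etc/X11/XF86Config-4, roughly like this; leave the other Load lines alone:)

Code:
Section "Module"
    # Load "glx"    <- commented out
    ...
EndSection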


 
 Post subject:
PostPosted: Tue Jul 06, 2004 9:20 am 
Joined: Sun Mar 07, 2004 5:34 am
Posts: 116
Location: UK
Can you post your XF86Config-4? I would be very happy to be wrong on this one :lol:, but my understanding agrees with this thread:

http://www.gossamer-threads.com/lists/m ... sers/75519

I am using PAL with interlaced content, and it will not accept any 576 modelines as valid.

If it works for you, I am happy, but I would be even happier if it worked for me :wink:


 
 Post subject: nvidia-settings
PostPosted: Thu Jul 08, 2004 5:06 am 
Hi,
I am running R4 with the new Nvidia 6106 drivers. When I start an xterm and then start nvidia-settings, the program opens in a window at the bottom of the screen so that most of the window is off-screen. I can only see about the top inch of the nvidia-settings program window.

I have tried clicking and dragging the window to move, resize, or maximize it, with no luck. I can get it to minimize, but I can never get the window positioned so I can actually use nvidia-settings.

Is anyone else having this problem? How do I fix it? Help!

THANKS!

Andrew Lynch


  
 
 Post subject:
PostPosted: Thu Jul 08, 2004 2:22 pm 
Joined: Sat Mar 13, 2004 6:41 am
Posts: 44
GLX problem solved: apt-get update and then apt-get install nvidia-glx worked now. Seems it just took some time until the packages were released.
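(In other words, nothing fancy:)

Code:
apt-get update
apt-get install nvidia-glx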


 
 Post subject: Re: nvidia-settings
PostPosted: Thu Jul 08, 2004 4:11 pm 
Joined: Thu Mar 25, 2004 11:00 am
Posts: 9551
Location: Arlington, MA
lynchaj wrote:
Hi,
I am running R4 with the new Nvidia 6106 drivers. When I start an xterm and then start nvidia-settings, the program opens in a window at the bottom of the screen so that most of the window is off-screen. I can only see about the top inch of the nvidia-settings program window.


At a guess you have a virtual screen size that is bigger than the actual screen size. Look at your X logs (in /var/log/XFree86.0.log) to see what the actual resolution is; you may even find a virtual vs. actual message in there. Fix your /etc/X11/XF86Config-4 to use that same resolution.
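A sketch of the sort of thing to look for in the Screen section (the identifiers and mode names will be different on your box):

Code:
Section "Screen"
    Identifier   "TV Screen"
    Device       "nvidia0"
    Monitor      "TV"
    DefaultDepth 24
    SubSection "Display"
        Depth    24
        Modes    "640x480"
        # Virtual 1280 1024   <- a Virtual line bigger than the mode causes exactly this; remove it or make it match
    EndSubSection
EndSection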


 
 Post subject:
PostPosted: Thu Jul 08, 2004 4:16 pm 
Joined: Thu Mar 25, 2004 11:00 am
Posts: 9551
Location: Arlington, MA
tanzania wrote:
GLX problem solved: apt-get update and then apt-get install nvidia-glx worked now. Seems it just took some time until the packages were released.
Did you do your initial install via apt-get? What was your package list?

I used the Nvidia script (with the results previously mentioned), and am wondering if I should rip it out and take the apt-get route...


 
 Post subject: virtual vs. actual size
PostPosted: Thu Jul 08, 2004 6:59 pm 
Is this what you are referring to in XFree86.0.log?

(**) NVIDIA(0): Validated modes for display device TV-0:
(**) NVIDIA(0): Mode "640x480": 25.2 MHz, 31.5 kHz, 59.9 Hz
(II) NVIDIA(0): Virtual screen size determined to be 640 x 480
(==) NVIDIA(0): DPI set to (75, 75)

I fiddled around with this a bit more and ended up starting a second X server with

startx -- :1

On that display, plain old icewm is the window manager and (ahhhh....) no more funky window management.
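(You can also name the client explicitly if you want a particular window manager on the second server; the icewm path below is just where Debian usually puts it:)

Code:
startx /usr/bin/icewm -- :1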

I got nvidia-settings to run and write a config file. I'll just edit that with vi from now on and skip the nvidia-settings GUI.
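(For reference, the file nvidia-settings writes is ~/.nvidia-settings-rc, and you can apply it later without opening the GUI:)

Code:
nvidia-settings -l    # load ~/.nvidia-settings-rc and apply it, no GUI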

Thanks!

Andrew Lynch


  
 
PostPosted: Thu Jul 08, 2004 8:20 pm 
Joined: Thu Mar 25, 2004 11:00 am
Posts: 9551
Location: Arlington, MA
lynchaj wrote:
Is this what you are referring to in XFree86.0.log?

(**) NVIDIA(0): Validated modes for display device TV-0:
(**) NVIDIA(0): Mode "640x480": 25.2 MHz, 31.5 kHz, 59.9 Hz
(II) NVIDIA(0): Virtual screen size determined to be 640 x 480
(==) NVIDIA(0): DPI set to (75, 75)

Yes, but it looks like it's right, unless fvwm has a bigger virtual size...
lynchaj wrote:
On that, plain old icewm is the wm and (ahhhh....) no more funky window management.

"Plain old icewm"! My how times have changed. The fvwm project was started to be a lightweight and minimal virtual window manager based on twm. It's still pretty svelte compared to a KDE or Gnome environment.


 
