nVIDIA and 3D Acceleration - Puppy 2.12 & 2.13
Hello All
First let me say thank you for Puppy, it's cool, small & FAST,
and I like it a lot, and think I will be spending a lot of time
playing around with it.
In fact I had (still have) a type 2 install of Puppy 2.12
that I had been playing with, ie: trying to sort out 3D acceleration
using the nvidia driver, 3DCC etc (not sure if I succeeded or not, really).
Then mooching about on the forum today, I noticed that Puppy 2.13 has been
released..... mmmm... well I had to, didn't I. So I did.
I downloaded and installed (type 2, different partition) Puppy 2.13
and proceeded as I had with Puppy 2.12 in trying to sort out 3D acceleration.
That is to say, I followed the instructions posted here:
http://www.murga-linux.com/puppy/viewto ... ia&t=13155
Posted: Mon Nov 27, 2006 9:30 am
By tempestuous.
And here:
http://www.murga-linux.com/puppy/viewto ... en&t=11192
Posted: Wed Sep 20, 2006 11:59 pm
Quote
#######################################################
It's designed to be uncompressed from the uppermost directory, so do this -
cd /
tar -zxvf /path/to/my/files/NVIDIA-v8756.tar.gz
######################################################
I had already downloaded NVIDIA-9629-k2.6.18.1.tar.gz and put it in /
so I did this in a console:
cd /
tar -zxvf /NVIDIA-9629-k2.6.18.1.tar.gz
(I assume "uppermost directory" means /
as opposed to /root.
Am I correct in that or not?)
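One way to answer that question yourself is to list an archive's contents without extracting it: tar stores paths without a leading slash, so they land relative to wherever you run it, which is why the instructions say to cd / first. A sketch using a throwaway archive (since the real tarball isn't assumed to be present):

```shell
# Build a throwaway archive that mimics the driver package's layout,
# then list it. Entries like "usr/X11R7/..." have no leading slash,
# so extracting from / drops them under /usr/X11R7/... as intended.
mkdir -p /tmp/demo/usr/X11R7/lib
tar -czf /tmp/demo.tar.gz -C /tmp/demo usr
tar -tzf /tmp/demo.tar.gz
```

Running `tar -tzf` on the real NVIDIA tarball the same way shows exactly where its files will go before you commit to extracting.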
And then continued as per the instructions, those in the first link.
Quote
######################################################
Run "depmod".
The main nvidia module should load the 2 modules already in Puppy that it needs; agpgart and i2c-core ...
but in practice this fails because of Puppy 2.12's zdrv_212.sfs setup.
The fix is to manually load those modules now so they're ready in /lib/modules/...
Do this -
modprobe agpgart
modprobe i2c-core
######################################################
( Both of these were already loaded, in both 2.12 & 2.13, according to lsmod )
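A small gotcha when checking with lsmod: the kernel lists module names with underscores, so i2c-core appears as i2c_core. A sketch of a load-only-if-missing check, run here against canned lsmod output so it works anywhere (on a real system you would pipe lsmod itself and replace the echoes with modprobe):

```shell
# Canned lsmod output for illustration; note i2c_core, not i2c-core.
sample='Module                  Size  Used by
i2c_core               19968  1
agpgart                29960  2'
is_loaded() {
    # Normalise dashes to underscores before matching column 1.
    echo "$sample" | awk 'NR>1 {print $1}' | grep -qx "$(echo "$1" | tr '-' '_')"
}
is_loaded agpgart  && echo "agpgart already loaded"
is_loaded i2c-core && echo "i2c-core already loaded"
is_loaded rivafb   || echo "rivafb not loaded - modprobe would be needed"
```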
Quote
######################################################
Another problem is that for some nVidia cards Puppy will automatically load the rivafb module,
which blocks the nvidia module from loading.
My fix is to add this line to the start of /usr/X11R7/bin/xwin -
rmmod -f rivafb
######################################################
( This did not show in lsmod, in either 2.12 or 2.13 )
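If you do add tempestuous' line to xwin, a guard keeps it from being inserted twice by accident. A sketch that works on a test copy (the fallback stub is only so the sketch runs on a box without /usr/X11R7/bin/xwin; on a real Puppy install you would edit the script itself, after backing it up):

```shell
# Work on a test copy; fall back to a stub if the real xwin is absent.
cp /usr/X11R7/bin/xwin /tmp/xwin.test 2>/dev/null || printf '#!/bin/sh\n' > /tmp/xwin.test
# Insert "rmmod -f rivafb" right after the shebang, but only if absent.
grep -q 'rmmod -f rivafb' /tmp/xwin.test || sed -i '1a rmmod -f rivafb' /tmp/xwin.test
head -n 2 /tmp/xwin.test
```

Running the grep-guarded sed a second time adds nothing, so the edit is safe to repeat.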
Quote
######################################################
Now modify your /etc/X11/xorg.conf -
In Section "Module"
ADD:
Load "glx"
Load "dbe"
######################################################
( These were both there already )
Quote
######################################################
REMOVE:
Load "dri"
Load "GLcore"
######################################################
( Neither of these existed )
Quote
######################################################
In Section "Device"
CHANGE "nv" to "nvidia"
######################################################
( This I did change )
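For reference, after those edits the relevant parts of xorg.conf look roughly like this (the Identifier shown is a placeholder; keep whatever your existing file already has, only the Driver and Load lines change):

```
Section "Module"
    Load "glx"
    Load "dbe"
    # no Load "dri" or Load "GLcore" with the nvidia driver
EndSection

Section "Device"
    Identifier "Card0"      # keep your file's existing identifier
    Driver     "nvidia"     # was "nv"
EndSection
```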
Quote
######################################################
Now restart Xorg and you should have accelerated 3D graphics,
for games, and for hardware accelerated XvMC video playback.
######################################################
Well then, how does one test this out?
I came across this thread:
Puppy 2.12: 3D-Control-Center V2.00
http://www.murga-linux.com/puppy/viewto ... er+control
I downloaded http://www.dotpups.de/3DCC/2.12/3DCC.pup
which apparently requires Gtklist04MU.pup.
So with both installed, I open 3DCC and do as described:
1.) Install DRM modules to enable acceleration for your kernel.
2.) Install OpenGL to be able to run 3D applications.
3.) Install a DRI module for your graphics card.
4.) REBOOT the computer!
5.) Test 3D with the Demo.
Except part 3.) Do I need to do anything here, if using the nVIDIA driver?
Running the Demo gives a result of 16 FPS (ish) in BOTH 2.12 & 2.13.
Judging by some of the figures being bandied about the forum, this appears
pretty pathetic.
Could someone give an idea of what to expect from an MSI FX5200-TD128 (nVIDIA GeForce),
just a ballpark sort of figure?
So after mucho searching and trying various suggestions found, I still get 16 FPS.
Am I missing something?
I have tried this on several FRESH installs using either
NVIDIA-9629-k2.6.18.1.tar.gz from tempestuous
NVIDIA-7184-k2.6.18.1.tar.gz from tempestuous
or
NVIDIA-Linux-x86-1.0-9746-pkg1.run from nvidia.com
NVIDIA-Linux-x86-1.0-7184-pkg1.run from nvidia.com
Still it is always around 16 FPS
So some more reading reveals something interesting (to me at least).
Comparing the files mentioned in
Appendix C. Installed Components of the README
(http://download.nvidia.com/XFree86/Linu ... dix-c.html)
and those in Puppy,
I noticed that
/usr/X11R7/lib/xorg/modules/extensions/libglx.so
wasn't a symlink to
/usr/X11R7/lib/xorg/modules/extensions/libglx.so.1.0.9629
So I renamed it to
/usr/X11R7/lib/xorg/modules/extensions/libglx.soORIGINAL
I then right-clicked on
/usr/X11R7/lib/xorg/modules/extensions/libglx.so.1.0.9629
to create a link named libglx.so.
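The same rename-and-link can be done from a console. A sketch using stand-in files in a scratch directory (on the real system the directory is /usr/X11R7/lib/xorg/modules/extensions and you would skip the mkdir/touch lines):

```shell
mkdir -p /tmp/extensions && cd /tmp/extensions
touch libglx.so libglx.so.1.0.9629   # stand-ins for the real files
mv libglx.so libglx.soORIGINAL       # keep Xorg's own libglx.so around
ln -s libglx.so.1.0.9629 libglx.so   # point libglx.so at nvidia's version
ls -l libglx.so
```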
Then I rebooted the system and tried the Demo again... now I get 50 FPS.
I did this in both Puppy 2.12 & Puppy 2.13 with the same result: 50 FPS.
Is that it, or am I still doing something wrong?
By comparison, I installed the nVIDIA driver in SUSE 10.2 then ran the
antinspect screensaver; that gives around 200 FPS with shadows.
This is all on the same machine.
So my final question is:
is this the best I can hope for (50 FPS) in Puppy, or does somebody have any
ideas what I'm doing wrong?
Maybe a NEW HOWTO that doesn't just point from one thread to the next
(it gets a bit confusing for us Newbs).
Any advice or suggestions very welcome.
If you need more info, just ask and I will post it.
P.S.
I know I shouldn't just go about changing/creating files... but
that's the beauty of Puppy. It doesn't take long to do a fresh install,
so if I break it, no problem, but a lesson learnt.
MSI KT6 Delta
AMD XP2000
512 MB RAM
MSI FX5200-TD128 (nVIDIA GeForce)
TIA
CatDude
Nvidia does not use DRI -> commenting out the line is correct.
If the 3DCC package does not include the Nvidia docs, please see:
Nvidia Support Home Page
Nvidia User Forums
When the user downloads and installs their own drivers (highly recommended),
they include very extensive documentation - how to enable all enhancements - BIG troubleshooting section!
Yes, when the proprietary Nvidia drivers are used, only the included frame buffer support (mandatory) and Nvidia's own AGP should be used.
If Puppy has glxgears, paste in the results (do not change the window size that pops up).
Not a good test - but most have it, so you can compare.
Should show over 5400 FPS - let it run for a few seconds to get a rough average.
Check your /var/log folder for the X log files - see the results.
Paste in your /etc/X11/xorg.conf file for comparison of the settings now used.
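To act on that log-check advice, grep the X log for driver lines. A sketch against a canned log excerpt so it runs anywhere (the real file is /var/log/Xorg.0.log, and the exact wording varies by driver version):

```shell
# Write a sample of the kind of lines the nvidia driver leaves in the
# X log, then filter for them the same way you would on the real file.
cat > /tmp/Xorg.0.log <<'EOF'
(II) LoadModule: "glx"
(II) NVIDIA GLX Module  1.0-9629
(II) NVIDIA(0): AGP 8X successfully initialized
EOF
grep -i 'nvidia' /tmp/Xorg.0.log
```

A line naming the NVIDIA GLX module means nvidia's libglx.so was loaded rather than Xorg's own; if only Xorg's libglx appears, acceleration is not active.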
Problem with distribution-supplied packages - they have pre-set dependencies.
If any are incorrectly symlinked - or the kernel/gcc etc. libraries change -
the Nvidia kernel wrapper & its own GLX extensions will not work until re-compiled against the user's own system.
If any developer's package is not compiled on a clean system, inadvertent optional dependencies get included.
All specs must match (true of any hardware driver).
HTH
Do not install OpenGL with 3DCC, as it overwrites Nvidia's OpenGL files with incompatible ones from the original Puppy.
To fix that, you would have to extract the Nvidia package again.
But the 3D-demo should be one way to test acceleration.
Maybe Nvidia has its own tools, like fglrxinfo or fglrxgears.
Or install a small game like sable:
http://murga-linux.com/puppy/viewtopic. ... ble&t=7909
Note: it requires a colour depth of 24 bit.
Mark
Gn2 & MU,
thank you both for your replies.
So then, if I was to do a fresh install of puppy-2.13-seamonkey-fulldrivers.iso,
would you suggest that I use the driver package supplied by tempestuous
or the driver from nvidia.com?
If I used tempestuous's package, am I right in thinking that when he says
"It's designed to be uncompressed from the uppermost directory"
he means / and not /root?
Should this be done whilst still in X, or am I supposed to exit to the prompt first?
Also, if I use this package, do I need to use 3DCC or not? For the DRM part, maybe.
If I used the driver from nvidia.com,
has the above ISO got what is necessary, or do I need zdrv_213.sfs as well? And would I require the use of 3DCC?
Sorry for all the questions, it's just that I can't find a Puppy-specific howto
that does not leap about between threads spread over several months.
It confuses the hell out of me, it really does.
Surely it is possible to have one (a howto, that is) that is on 1 page only, explaining
exactly what is required, and in what order it should be done.
Anyway, back to what I mentioned in my first post, about the symlink.
It obviously had an effect on the result from the Demo in 3DCC,
ie: from 16 to 50 FPS, but it still is only a quarter of what I get in SUSE (200 FPS).
Can I hope for better or not?
Whatever happens, Puppy is here to stay. Oh yeah.
CatDude
The 3DCC is not required for tempestuous' package.
I would suggest using his package; I don't know how easy it is to use the installer from nvidia.
You would need devx_213.sfs, which has the compiler.
tempestuous' package does not need that, as it already includes a compiled kernel module.
To extract the archive, you can stay in X, but would have to restart X after extraction.
You could use these console commands (assuming you saved the tar.gz to "/"):
cd /
tar -xzvf NVIDIA-9629-k2.6.18.1.tar.gz
200 fps - with the 3D-Demo from 3DCC, in Suse?
Does it work in Suse?
I ask because it only makes sense to compare values using the same program and the same screen settings (resolution, colour depth).
Mark
Ah, I see.
No, I don't know why you get 50 frames only.
Though this is not a bad value.
Would be interesting to compare some games now.
They usually switch to their own fullscreen mode.
You could try in Suse:
antinspect -fps -window
to have similar conditions as in Puppy.
I don't know how to switch Puppy's antinspect to fullscreen; if you want to test:
/usr/local/3DCC/resource/antinspect
Mark
Hello MU
MU wrote: Maybe you can run antinspect from Puppy in Suse?
Mark
First, an apology for being such a numpty yesterday:
I forgot to put ./ in front of the command you gave me to try in SUSE.
So I decided to give it another shot; I also copied over to SUSE
/usr/local/3DCC/resource/antinspect, as you also suggested.
Well, I booted over into SUSE again and made the following observations.
This is the SUSE screensaver Antinspect
being run from a terminal using your command:
suseman@susedragon:/usr/lib/xscreensaver> ./antinspect -fps -window
RESULT = 50 FPS
This is the PUPPY 3DCC Demo
being run from a terminal using your command:
suseman@susedragon:~/Desktop> ./antinspect -fps -window
RESULT = 50 FPS
Mmmm, interesting. Like I said, running the SUSE screensaver via the desktop setup,
it reports 200 FPS.
For the pure hell of it, I tried running the SUSE antinspect in Puppy.
It's a no-go I'm afraid; I got this error message:
sh-3.00# ./antinspect -fps -window
./antinspect: error while loading shared libraries: libpam.so.0: cannot open
shared object file: No such file or directory
sh-3.00#
So the conclusion I have come to
is that I probably should accept 50 FPS as the limit in Puppy (as regards the 3DCC Demo, that is).
Not being a programmer or anything, I wouldn't know how to look any deeper into this, so for now I won't even try.
Thanks anyway for your help.
CatDude
OK, your results are clear:
if you run antinspect in a window, it is able to use 50 fps, in Suse and in Puppy.
So it seems your card is configured optimally.
That you get 200 fps in Suse fullscreen is because this is a totally different screen mode.
X does not have to calculate window borders and other things, so performance is better.
I experience the same running games:
in windowed mode they are often much slower than in fullscreen mode.
Mark
Hi MU
MU wrote: That you get 200 fps in suse fullscreen, is because this is a totally different screenmode.
X does not have to calculate windowborders and other things, so performance is better.
I didn't realise that, m8 (silly me).
Thanks for setting my mind at rest.
PS:
The 3DCC Demo works in Mepis as well, but I only got 14 FPS
because I failed to get the nvidia driver installed... ah well.
CatDude
@CatDude wrote: Maybe a NEW HOWTO that doesn't just point from one thread to the next
(it gets a bit confusing for us Newbs).
May I ask if maybe you could write a step-by-step UltraNewbieGuide on how you got your nvidia driver working?
I'm new to Puppy and other linuxes(?) as well,
so a step-by-step howto would be a great help
(at least for me).
cu
Due to how Puppy works, the "rivafb.ko" module is loaded on every restart of the computer from "zdrv_213.sfs".
So you cannot simply delete that, and X will not start after a reboot.
To avoid that, add this line to /etc/rc.d/rc.local:
rmmod rivafb
If you forget that and find that X does not start, type:
rmmod rivafb
xwin
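Appending that line with a guard keeps it from piling up if you run the edit twice. A sketch against a stand-in file (on the real system the target is /etc/rc.d/rc.local):

```shell
RC=/tmp/rc.local                     # stand-in for /etc/rc.d/rc.local
touch "$RC"
grep -qx 'rmmod rivafb' "$RC" || echo 'rmmod rivafb' >> "$RC"
grep -qx 'rmmod rivafb' "$RC" || echo 'rmmod rivafb' >> "$RC"   # second run adds nothing
cat "$RC"
```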
If you like, you can test a preview of an installer:
http://murga-linux.com/puppy/viewtopic.php?t=14438
Mark
Last edited by MU on Sat 13 Jan 2007, 15:04, edited 1 time in total.