Not your average Puppy box

For talk and support relating specifically to Puppy derivatives
prehistoric
Posts: 1744
Joined: Tue 23 Oct 2007, 17:34

Not your average Puppy box

#1 Post by prehistoric »

I'm posting this from an unusually large Puppy box. The hardinfo report is attached (and yes, it really is gzipped). The motherboard is a used Gigabyte GA-970A-DS3. The processor, also used, is an AMD FX-8320, with 8 cores running at up to 3.5 GHz without overclocking. It has 32 GB of 1600 MHz DDR3 RAM, which could be tweaked to run at 1866 MHz; I've opted for stability instead. There are four 3 TB Seagate hard drives currently installed. Formatted capacity is about 2.73 TiB each, or a total of 10.91 TiB, most of which should be available for scratch space during computations.
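Incidentally, the "missing" capacity on each drive is just decimal-versus-binary units: drive vendors quote powers of ten, while most tools report powers of two. A quick check of the arithmetic:

```python
# A "3 TB" drive is 3 * 10^12 bytes; tools that report TiB divide by 2^40.
DECIMAL_TB = 10**12
BINARY_TIB = 2**40

per_drive_tib = 3 * DECIMAL_TB / BINARY_TIB
total_tib = 4 * per_drive_tib

print(f"{per_drive_tib:.2f} TiB each, {total_tib:.2f} TiB total")
# → 2.73 TiB each, 10.91 TiB total
```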

There were some adventures in getting to this point, and I'm not through with problems, but Quirky doesn't seem to be much more problematic than many standard distros when confronted with nothing but GPT disks. I've also tried Ubuntu 15.04 and Linux Mint 17.1; even when told they had complete freedom to wipe the disks and start over, they typically failed to install.

In the past I've used distros like Linux Mint which had been designed to make installation automatic on machines with a UEFI BIOS. Once I had any kind of Linux booting, it was possible to edit GRUB2 configurations to boot other systems. This was particularly helpful when dealing with laptops, which were never really intended to boot anything except Windoze.

At the moment I'm not able to recommend any distro which really does well on problem machines with UEFI BIOS. I'm booting this one from a flash drive because of problems installing to those GPT-only drives. For me this counts as a failure. On other machines I have actually installed SSDs for the OS, to get around problems in booting, leaving GPT hard drives for data.

Other problems have included distros either not allowing CPU frequency scaling at all, or setting it to such conservative values as to make the resulting system useless for the heavy computation intended. It is possible to override such settings, but I don't feel like investing that effort in a distro which would rather I did not use it that way. Quirky is now straightforward in this regard.
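Overriding such settings is usually just a write to the kernel's cpufreq sysfs nodes. A minimal sketch, assuming the stock Linux cpufreq interface (the helper names here are my own, and `set_all_cores` needs root):

```python
from pathlib import Path

def choose_governor(available: str, preferred=("ondemand", "performance")) -> str:
    """Pick the first governor we want that the kernel actually offers."""
    offered = available.split()
    for gov in preferred:
        if gov in offered:
            return gov
    return offered[0]

def set_all_cores(gov: str) -> None:
    """Write the chosen governor to every core's sysfs node (needs root)."""
    for node in Path("/sys/devices/system/cpu").glob(
            "cpu[0-9]*/cpufreq/scaling_governor"):
        node.write_text(gov + "\n")

# Example: given the governors a typical kernel reports, prefer on-demand scaling.
print(choose_governor("conservative ondemand userspace powersave performance"))
# → ondemand
```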

Still another problem turned up when several distributions failed to establish a working Internet connection using the Realtek RTL8169 Ethernet controller. This seemed to be close to working, but it was easier to use Quirky than to debug the problems on other systems.

There were even problems recognizing the mouse and keyboard when they were connected via a hub in a KVM switch. I ended up plugging in directly, though Quirky did a better job than the others. It was just too much trouble to rearrange cables when I switched distros.

I've decided that support for GPT partition tables is uneven. The only characteristic I've really found consistent across different BIOSes is the ability to boot Windoze. Simply using Gparted from different distributions is likely to wreck partition tables and filesystems; for safety, stay with one version you can trust, because there are definite incompatibilities. I was fortunate that I was building this box from scratch and did not have any files to risk on those drives.

My overall impression of UEFI is that a great deal has been done to complicate a simple idea of being able to initialize a variety of devices, address huge mass storage devices and bootstrap systems from minimal code in a BIOS. (We tend to forget that the first BIOS was in 8 KB of mask-programmed ROM. We now have a complete OS in some BIOSes.) Whatever abstractions have been introduced to make code independent of particular hardware have been severely compromised by vendors attempting to gain temporary competitive advantage.

So far, this machine has demonstrated stability by calculating Pi to 5,000,000,000 decimal places in a little over half an hour. I've still got more to do in convincing existing computational software it is not a good idea to put huge temporary files in the same place as the OS.
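Until the computational software cooperates, one blunt workaround is to hand it an explicit scratch directory on a data drive rather than letting it default to the OS partition. A minimal sketch, with a throwaway directory standing in for a real mount point (the path would be something like a mount on one of the big drives; everything here is illustrative):

```python
import os
import shutil
import tempfile

# On the real box the scratch area would be a mount point on one of the
# big data drives; a throwaway directory stands in so the sketch is
# runnable anywhere.
scratch = tempfile.mkdtemp(prefix="scratch-")

# Create the work file explicitly inside the scratch area instead of
# letting it default to /tmp on the OS partition.
with tempfile.NamedTemporaryFile(dir=scratch, suffix=".work") as work:
    work.write(b"\0" * 4096)  # stand-in for a multi-gigabyte temporary
    in_scratch = os.path.dirname(work.name) == scratch

shutil.rmtree(scratch)
print(in_scratch)  # → True
```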

I'm open to suggestions for what you do with this much computational power, besides running servers or playing games. (My only reason for that rebuilt Radeon card is to get video memory bandwidth off the processor-memory bus.) This is a machine one could scarcely have imagined when I started in computing. The millennium has arrived. What do we do next?

P.S. There was a mysterious problem with drive icons. They got messed up during experiments with Gparted changing disk drive partition tables and file systems, and never recovered. At one point the software identified an internal hard drive as a USB drive for reasons I can't explain. At another point it showed icons for USB drives not plugged in. I did a clean install from DVD to USB flash drive to get this running nicely for the screenshot.
Attachments
screenshot.jpg.gz
(126.21 KiB) Downloaded 323 times
dmesg.txt.gz
(12.97 KiB) Downloaded 194 times
hardinfo_report.html.gz
hardinfo report on machine actually running Quirky 7.0.4.1
(7.01 KiB) Downloaded 218 times

rokytnji
Posts: 2262
Joined: Tue 20 Jan 2009, 15:54

#2 Post by rokytnji »

Sounds pretty prehistoric. :)

The poor man's version, I guess, would be an HP Z800 Workstation with 12 RAM slots.

Linux Lite installed OK to the Apple 120 GB GPT SSD in my Dell E5500 laptop, so I'm not sure why you are having trouble with GPT partitions.

prehistoric wrote: I'm open to suggestions for what you do with this much computational power...
Kernel building will take minutes rather than hours. Video editing should fly like the wind also.
But I am no expert on these sort of things.
Boinc maybe?

Jasper

#3 Post by Jasper »

Hi prehistoric,

Re your: "I'm open to suggestions for what you do with this much computational power,......"

The largest prime number found thus far has 17,425,170 decimal digits so it would likely take more than a year to make a manuscript copy (your pi calc was impressive, too).

So you might join GIMPS (Great Internet Mersenne Prime Search)
http://www.mersenne.org/ letting us know if you do and especially if you become a "winner".

That's a lot of power you have (perhaps with even more oomph than Forum member Q5sys and his jet-propelled flying m/c).

One of our sons has just given me his discarded 8-to-10-year-old desktop computer (lightning fast to me). I run Precise 5.6 Multi-session-CD/DVD (so it runs entirely in RAM).

A 240 MB remaster using Gzip -1 and the code below takes 10 seconds here, and it would be interesting to know your result for an identical, or similar, experiment.

My regards
Attachments
ram2sfs.jpg
amend the highlighted text as, and if, appropriate
(50.29 KiB) Downloaded 336 times
ram2sfs.gz
rename after download by removing the fake (dot).gz
(313 Bytes) Downloaded 197 times

Ted Dog
Posts: 3965
Joined: Wed 14 Sep 2005, 02:35
Location: Heart of Texas

#4 Post by Ted Dog »

lol, why, just why, are you a Puppy Linux fan? You seem so out of the norm here; of course you should be using the expand2 ram option in FatDog with all that memory. :wink:

Hey mods, can we change his name from prehistoric to future-storic? I do not know if I could even fill up that amount of hard drive space in my lifetime.

bigpup
Posts: 13886
Joined: Sun 11 Oct 2009, 18:15
Location: S.C. USA

#5 Post by bigpup »

When using Gparted, use the latest version, not what comes with an operating system.

Gparted live CD or USB.

For partitioning I would suggest you use the Gparted live CD or USB that you can get from here.
You can download a free version to make your own Gparted live CD or USB.
It is up to date and specifically made to run Gparted.
Info:
http://gparted.sourceforge.net/livecd.php
Download:
http://sourceforge.net/projects/gparted ... ve-stable/

From the latest version of the Gparted manual:

Note: The default partition table type is msdos for disks smaller than 2 Tebibytes in size (assuming a 512-byte sector size) and gpt for disks 2 Tebibytes and larger. See the section called “Specifying Partition Type”.
The things they do not tell you, are usually the clue to solving the problem.
When I was a kid I wanted to be older.... This is not what I expected :shock:
YaPI(any iso installer)

prehistoric
Posts: 1744
Joined: Tue 23 Oct 2007, 17:34

#6 Post by prehistoric »

Jasper wrote: ...One of our sons has just given me his discarded 8-to-10-year-old desktop computer (lightning fast to me). I run Precise 5.6 Multi-session-CD/DVD (so it runs entirely in RAM).

A 240 MB remaster using Gzip -1 and the code below takes 10 seconds here, and it would be interesting to know your result for an identical, or similar, experiment.
I've downloaded your script, and will report results when I get them, but right now I'm embroiled in some pesky activities required to take care of two houses and a menagerie. Computers get deferred because I've never had one crap on the floor.

prehistoric
Posts: 1744
Joined: Tue 23 Oct 2007, 17:34

#7 Post by prehistoric »

Ted Dog wrote: lol, why, just why, are you a Puppy Linux fan? You seem so out of the norm here; of course you should be using the expand2 ram option in FatDog with all that memory. :wink:

Hey mods, can we change his name from prehistoric to future-storic? I do not know if I could even fill up that amount of hard drive space in my lifetime.
You will understand that prehistoric remains a suitable moniker when you learn that I'm running the heavy computation from a pure console interface, without a desktop. I only use the desktop in setting things up for a run. If I can separate the trivial-sized temporary files from the humongous ones, Quirky may be the fastest OS available. Running from RAM cuts I/O activity, which becomes a bottleneck when you are working with large scratch files or RAID arrays. A few megabytes out of 32 GB is a small price to pay for this speed-up.

If I had to, I could install a fifth 3 TB drive. That would give me over 12 TB for scratch space, enough to calculate Pi to 1.5 trillion decimal digits, though it would take a while. I just got a good deal on 3 TB drives, and had this chassis sitting here, so I thought I'd see what happens when I try to max it out. The drives may still go into other computers, with the assurance that they have been carefully burned in and tested for reliability.

I'm actually curious about problems in combinatorial optimization which might be cracked this way, maybe something which might give Puppy systems an unfair advantage. More generally, I wonder about problems which might benefit people in general which could use a dedicated machine, perhaps something in computational biology.

For integer arithmetic I think this AMD cpu is quite competitive. For compute-bound problems with heavy floating point, Intel processors probably have an advantage, but I'm not planning to run out and plop down a kilobuck for a chip with a lower clock rate.

prehistoric
Posts: 1744
Joined: Tue 23 Oct 2007, 17:34

#8 Post by prehistoric »

bigpup wrote:When using Gparted. Use the latest version, not what comes with an operating system.

Gparted live CD or USB.

For partitioning I would suggest you use the Gparted live CD or USB that you can get from here.
You can download a free version to make your own Gparted live CD or USB.
It is up to date and specifically made to run Gparted.
Info:
http://gparted.sourceforge.net/livecd.php
Download:
http://sourceforge.net/projects/gparted ... ve-stable/...
Thanks bigpup! I'll try the latest.

I have an older Gparted CD, but didn't want to mix versions while I was trying automatic installation. I suspect some distro installers depend on defects in the Gparted version they call matching defects in their own setup scripts. Ugh!

I'm hoping BarryK will figure this mess out in 7.0.5 or so. My short-term memory is not what it was when I was a young whippersnapper. I can recall a time when I read an 8-foot shelf of manuals before I understood how to install a new computer. Today, I forget what I'm doing while I'm doing it. I've become extremely cautious about things like leaving stoves on or water running.

I also have a problem with not completely forgetting things, like how to punch PL/I scientific characters on an IBM 026 keypunch.

8Geee
Posts: 2181
Joined: Mon 12 May 2008, 11:29
Location: N.E. USA

#9 Post by 8Geee »

I'm a bit behind the times here also... I thought UEFI could be disabled by using the legacy setting in the BIOS. Or maybe the MoBo will not allow it? AMD going UEFI... perish the thought.
Linux user #498913 "Some people need to reimagine their thinking."
"Zuckerberg: a large city inhabited by mentally challenged people."

prehistoric
Posts: 1744
Joined: Tue 23 Oct 2007, 17:34

#10 Post by prehistoric »

8Geee wrote: I'm a bit behind the times here also... I thought UEFI could be disabled by using the legacy setting in the BIOS. Or maybe the MoBo will not allow it? AMD going UEFI... perish the thought.
There certainly are workarounds, one of which I'm using at present. I've turned off secure boot. I can boot legacy devices.

The problem is that accessing a full 3 TB drive without workarounds currently requires GPT disks, without traditional MBR partition tables. I can see the handwriting on the wall for a time when proprietary systems will close off those other options, if not absolutely, then by making the complexity even worse for consumers wishing to do anything that is not proprietary.
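For anyone wondering why the cutoff lands where it does: an MBR partition table stores partition start and length as 32-bit sector counts, so with the usual 512-byte logical sectors it tops out at 2 TiB, short of a 3 TB drive. A quick check:

```python
# MBR stores partition start/length as 32-bit sector counts; with the
# usual 512-byte logical sectors, that caps addressable space at 2 TiB.
SECTOR_BYTES = 512
mbr_limit = (2**32) * SECTOR_BYTES

print(mbr_limit)          # → 2199023255552 bytes
print(mbr_limit / 2**40)  # → 2.0 (TiB), hence GPT for a 3 TB drive
```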

Open Source systems are currently at a disadvantage they did not create themselves. Perversely, this complexity is then attributed to those systems. These changes are always being pushed as matters of "security". This despite abundant evidence that many commercial systems are riddled with terrible vulnerabilities. The term "security" must mean "provides us with a secure cash-flow."

I support some systems for a few friends who can't wean themselves from dependence on Microsoft, etc. Keeping them from doing their banking via Moldova is a constant struggle.

I not only run Linux myself, I have also provided Linux systems to people willing to make the effort. Some use Puppy, some use Ubuntu or Linux Mint. I'll admit that updates for Puppy are not yet as easy as I'd like. Updates for Ubuntu and Linux Mint are now easier to deal with than those for Windoze.

(If you don't believe me, try to take a W7 netbook with 2 GB RAM back to system restore, then do all the updates to bring it up to date. I spent several days on the telephone with one friend who had to do this after his system was infested. It would have taken 30 minutes to do a complete reinstall of a popular Linux system from scratch, and I'm not even considering Puppy. Oh yes, anyone else get hit by the Windows update that declared some legitimate W7 systems were pirated?)

One friend who had IT experience, but not much Linux experience, took my suggestion about putting a popular Linux system on the machine in his workshop. (He chose Ubuntu.) I get about 5 questions from him about problems with Windoze for every question concerning Linux. We've almost found substitutes for the last applications tying him to Windoze.

At the other extreme I want people to understand that the U.S. Office of Personnel Management was hacked, losing records on some 14 million employees with security clearances. These included a 127-page background questionnaire which one friend spent a month filling out. (How does he feel about Chinese agents getting this? Unprintable.) Do government leaders have better security at their command? Nein. What about computer security companies? Oops.

All this is without considering "coming attractions". I still haven't seen the USB flashdrive firmware hack in the wild, (probably because normal security is so poor it isn't necessary to go to such lengths.) And what about firmware modifications in storage systems?

I lack the resources to deal with all these potential threats, but there is one thing I can do right now: stop running common software which makes it easy for such compromises to "phone home" to their creators. If such hacks need to try many different approaches to compromising the OS the probability of detecting them goes way up.

Does this help explain why I am doing things "the hard way"?

prehistoric
Posts: 1744
Joined: Tue 23 Oct 2007, 17:34

#11 Post by prehistoric »

I'm still not on that monster machine, so various experiments will have to wait.

There is probably some misunderstanding of what I'm asking for in terms of suggested computational tasks. I use programs for calculating ridiculous numbers of digits for Pi or gamma to test that machines are reliable. I have no particular need of billions of digits which might be mistaken for random numbers.

I'm aware of such things as searches for Mersenne primes or non-trivial zeros of the Riemann zeta function, and feel like this has gone quite far already. I don't feel like further computation there is of much use to anyone, except for bragging rights.

I'm probably not a good sysadmin for a compile farm, though I might be willing to do something particularly important if it doesn't overload my declining mental powers. I no longer consider myself a programmer, and my mental stamina is limited.

What I'm wondering about are applications of modern techniques in combinatorial optimization which nobody else has tried. Here's something pretty strange in antenna design as an example. This used a genetic algorithm, but there are other techniques like simulated annealing. These are just two approaches out of many. Here's a book which gives an overview. (Caution: if you buy the ebook watch out for "preapproved future charges" to credit cards or Paypal. This could be a legitimate way to provide updates, or it could be a scam.) You can read the book on-line for free.
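Of the techniques named above, simulated annealing is perhaps the easiest to sketch: always accept an improving move, accept a worsening one with probability exp(-delta/T), and cool the temperature T gradually so the search settles down. A toy version against a trivial objective (the function names and parameters are invented for illustration, not taken from any particular optimization package):

```python
import math
import random

def simulated_annealing(cost, neighbor, x0, t0=1.0, cooling=0.995,
                        steps=20000, seed=1):
    """Generic simulated annealing: minimize cost() starting from x0."""
    rng = random.Random(seed)
    x, c = x0, cost(x0)
    best, best_c = x, c
    t = t0
    for _ in range(steps):
        y = neighbor(x, rng)
        cy = cost(y)
        # Always accept improvements; accept worsenings with probability
        # exp(-delta / temperature), which shrinks as t cools.
        if cy < c or rng.random() < math.exp((c - cy) / t):
            x, c = y, cy
            if c < best_c:
                best, best_c = x, c
        t *= cooling
    return best, best_c

# Toy objective: minimize (x - 3)^2 by random local steps from x = 0.
best, best_c = simulated_annealing(
    cost=lambda x: (x - 3.0) ** 2,
    neighbor=lambda x, rng: x + rng.uniform(-0.5, 0.5),
    x0=0.0,
)
print(round(best, 1))  # settles near 3
```

A real combinatorial problem would swap in its own cost and neighbor functions (for a tour, say, swapping two cities) without touching the annealing loop.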

There is no shortage of algorithms, and a range of programs are available. What I'm looking for are applications which have not already been done. It would be especially cute to apply this to the problem of packing a maximum of functionality into Puppy, if only we could automate the evaluation of solutions.

Puppus Dogfellow
Posts: 1667
Joined: Tue 08 Jan 2013, 01:39
Location: nyc

#12 Post by Puppus Dogfellow »

prehistoric wrote: [...]
What I'm looking for are applications which have not already been done. It would be especially cute to apply this to the problem of packing a maximum of functionality into Puppy, if only we could automate the evaluation of solutions.
i'm not sure if this is what you have in mind, but BarryK said it took him 12 hours to compile LibreOffice from scratch in (iirc) april, and that he hasn't yet attempted to do so in T2. from the post that follows the link:

BarryK wrote:
Bindee wrote:
takes about 12 hours to compile.
:shock:

What are you using , An atom based notebook.?

:mrgreen:
I have an Intel i3-370M CPU, with effectively 4 cores; however, I compiled LibreOffice on 1 core only, which made it extra slow.

I have a problem with big compiles if I use all cores; it seems my CPU overheats and the laptop abruptly turns itself off.

I get this compiling gcc in T2, and Seamonkey inside or outside T2. For T2, the default was one less than the max cores, so it was using 3. No good, still crashed the laptop.

Maybe 2 cores would work, but I have got T2 set to one core, which is very painfully slow. Besides, we have discovered some apps fail to compile in T2 if you use more than one core, though T2 can be set up to use one core on a per-app basis.

the link has Barry's notes as an attachment if you want to look into it further. maybe you could tax the machine by running a 32-bit virtual machine simultaneously and compile updated 32- and 64-bit packages in one go. see if you could beat barry's time. if the doggy bag benchmark takes off, useful apps may start really rolling in. or not. time to beat: 12 hours.

:D
(i think he's a minor revision or two behind and i don't think he plans to make many more stripped down libreoffice packages).

NeroVance
Posts: 201
Joined: Wed 10 Oct 2012, 23:00
Location: Halifax, Canada

#13 Post by NeroVance »

For a prehistoric fellow, you sure got some high-end tech there, mate :wink:.
I imagine that there are probably people out there who would kill for a machine like that, though as for what to do with such a beast, I can't say atm.

prehistoric
Posts: 1744
Joined: Tue 23 Oct 2007, 17:34

#14 Post by prehistoric »

NeroVance wrote: For a prehistoric fellow, you sure got some high-end tech there, mate :wink:.
I imagine that there are probably people out there who would kill for a machine like that, though as for what to do with such a beast, I can't say atm.
Please note that I did not buy this stuff new, and I couldn't very well get it used until the bloom was off the rose as far as gamers were concerned.

@Puppus Dogfellow,

I've checked that this system does not overheat when all cores are fully loaded. That Pi computation is an example. I doubt there would be a problem with compiling.

I'm running this with a stock AMD cooler, though even that has heat pipes. I can check the progress of a heavy computation by listening to the CPU fan rev up.

It would help if someone else were to do the business of setting various flags, and writing scripts. I've stopped running kernel compiles myself because there are simply too many things to remember for my impaired short-term memory.

I'm still in the business of house and dog sitting. I'll have a chance to experiment with that monster in a few days.

gcmartin

#15 Post by gcmartin »

Hello @Prehistoric

2 PUPPY distros come to mind which scale to your system. Namely @StemSee's distro(s) and @Dry Falls distro(s).

Both of these provide OOTB operations with almost NO NEED to install additional software to become immediately useful. @StemSee's distro has all the necessary stuff built in to allow your machine to run as many VM guests as you would ever want, again with no need to install software to immediately test and use.

Post any behavior you experience on their threads for any findings, assistance, or guidance you desire, as they are responsively helpful.

Hope this is useful info as you proceed. And congratulations on your newest BIG DOG! 8)

prehistoric
Posts: 1744
Joined: Tue 23 Oct 2007, 17:34

#16 Post by prehistoric »

@gcmartin

I'm not really shopping for distros for this box. I've tried several, including Ubuntu, Linux Mint, Fatdog and some other Puplets I haven't listed. My problem is that Quirky 7.0.4.1 handles all the installed hardware, while older kernels fail on things like the gigabit network adapter. It is apparent this early board was built with a buggy RTL8169 chip from Realtek, a company which has created problems for any number of Linux systems. The 3.19 kernel in Quirky has code to work around the problem. I also must have CPU scaling to avoid overheating.

I'm not very worried about user interface. When I run a session with heavy computing I typically drop back to the console interface to cut overhead from system tasks. I did this with those tests computing Pi and gamma to ridiculous numbers of decimal digits.

My comment about compiling was intended to limit requests to builds where I have to figure out a complicated build environment and tweak large numbers of flags. If someone else sets things up so I can build stuff while the computer does all the heavy lifting, I'm amenable to that approach.

I'm fairly certain I do not want to run a server. I have enough trouble maintaining a few systems for people I know well. The Internet is a jungle, and defense against predators becomes a full-time job.

What I have been thinking about are combinatorial optimization problems for which I can get programs already designed. I would like problems for this to be as independent of computing environment as possible. This gets me out of the loop of constantly fighting with computing environments which are convenient for somebody else.

An extreme example would be problems and solutions in the form of tables or graph structures which could be transferred as comma-separated lists, with a simple description of the criteria to measure value of a proposed solution. I think I could encode such descriptions in formulas to be passed to optimization programs.
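A sketch of how little machinery such an interchange would need: the graph arrives as a comma-separated edge list, a candidate solution is just an ordered list of nodes, and the scoring criterion is total closed-tour length. (The format and names here are invented for illustration.)

```python
import csv
import io

# Hypothetical interchange format: one "node,node,weight" edge per line.
EDGES_CSV = """a,b,4
b,c,2
c,a,5
"""

def load_weights(text):
    """Read an undirected edge list from CSV text into a weight table."""
    w = {}
    for u, v, d in csv.reader(io.StringIO(text)):
        w[(u, v)] = w[(v, u)] = float(d)
    return w

def tour_cost(tour, w):
    """Criterion: total length of the closed tour visiting nodes in order."""
    return sum(w[(tour[i], tour[(i + 1) % len(tour)])]
               for i in range(len(tour)))

weights = load_weights(EDGES_CSV)
print(tour_cost(["a", "b", "c"], weights))  # → 11.0 (4 + 2 + 5)
```

An optimizer would only ever call `tour_cost`, so the problem description and the search machinery stay completely decoupled.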

This is all a kind of thought experiment to drill down to the question of what can you use computational power for, other than running more and more complicated systems which themselves consume not only computer memory and processor cycles, but also user time and mental effort.

gcmartin

#17 Post by gcmartin »

... use computational power for ...
Validation. Simulation. Mathematical description of physical/chemical/biological phenomena. Prediction of quantum behavior. Calculation of the date of the Sun's collapse, which will consume the Earth in its collapse (a primary reasoning behind Mars colonization). SETI.

All kinds of efforts that "computationals" help mankind with.

prehistoric
Posts: 1744
Joined: Tue 23 Oct 2007, 17:34

#18 Post by prehistoric »

gcmartin wrote:
... use computational power for ...
Validation. Simulation. Mathematical description of physical/chemical/biological phenomena. Prediction of quantum behavior. Calculation of the date of the Sun's collapse, which will consume the Earth in its collapse (a primary reasoning behind Mars colonization). SETI.

All kinds of efforts that "computationals" help mankind with.
I'll admit that the possibility the Sun will consume the Earth in 500 million years does not panic me. Even there, I'm satisfied with modest calculations already done.

Surprisingly, mathematical models of stellar interiors seem to be pretty good in the region where radiative transfer of energy predominates. The modeling problems set in about halfway to the surface, along with convective transport, which moves material toward the visible surface at about walking speed. Moving plasma also generates magnetic fields, and by the time these reach the corona they can dominate gravitational forces in some prominences. If we really understood magnetohydrodynamics, we'd have working fusion power generators. Last time I checked, we were still short of modeling the full complexities of tokamaks.

Modeling the convective region is very important to understanding mixing, and mixing is a major factor in composition of stellar interiors. This is a primary variable in the evolution of main-sequence stars like the Sun.

If all that is not enough, there is a problem of mass loss from the surface via "solar wind". This will pick up as the Sun ages. We can expect the Sun to lose a substantial fraction of its total mass before it becomes a red giant. This is another big factor in those predictions.

Fortunately, we, or the species which replaces us, will have hundreds of millions of years to figure things out.

Could we have some more suggestions appropriate for this century?

prehistoric
Posts: 1744
Joined: Tue 23 Oct 2007, 17:34

#19 Post by prehistoric »

Here's a worthy cause concerned with discovering how proteins fold, which is an important aspect of many diseases: Folding@Home.

This can run on some Puppy derivatives related to Ubuntu, but unfortunately I have not been able to get my Ubuntu versions to correctly handle the buggy RTL8169 gigabit Ethernet interface on this box. Internet connectivity is necessary for this project. I will definitely want to run a 64-bit version.

Getting this to run on Quirky 7 would allow me to keep the computer busier than its owner.

Added: I've now taken wyzguy's advice, and changed the IOMMU setting in the BIOS. Also changed some related IO settings. There is no particular problem with that RTL8169 on common 64-bit distros.
Last edited by prehistoric on Sat 11 Jul 2015, 15:02, edited 1 time in total.

Deacon
Posts: 185
Joined: Tue 19 Mar 2013, 15:14
Location: USA

Re: Not your average Puppy box

#20 Post by Deacon »

prehistoric wrote: So far, this machine has demonstrated stability by calculating Pi to 5,000,000,000 decimal places in a little over half an hour. I've still got more to do in convincing existing computational software it is not a good idea to put huge temporary files in the same place as the OS.

I'm open to suggestions for what you do with this much computational power, besides running servers or playing games. (My only reason for that rebuilt Radeon card is to get video memory bandwidth off the processor-memory bus.) This is a machine one could scarcely have imagined when I started in computing. The millennium has arrived. What do we do next?
Mine cryptocurrencies and donate them to "Deacon's Hamburger Fund". (I will gladly pay you Tuesday.) I also accept PayPal.
