Puppy Linux Discussion Forum
Puppy HOME page : puppylinux.com
"THE" alternative forum : puppylinux.info
 
All times are UTC - 4
 Forum index » Advanced Topics » Cutting edge
I've seen the future of Linux...
Moderators: Flash, Ian, JohnMurga
Page 4 of 4 [56 Posts]
mcewanw

Joined: 16 Aug 2007
Posts: 3194
Location: New Zealand

PostPosted: Thu 19 Mar 2009, 10:27    Post subject: Re: C can do more than bash since it's a lower level language  

DMcCunney wrote:

It's not just bash (or the Bourne shell, C shell, Korn shell, Z shell, etc.). The various utilities you call from shell scripts also tend to assume ASCII strings. Awk, diff, grep, sed, tr and the like are intended to process ASCII files. As mentioned, the developers of Unix were software developers, and many of the utilities provided with a *nix system are intended to manipulate and transform source code in various ways.


Very true in general, and in the traditional UNIX "philosophy" - bash, Bourne, Korn and so on fit in with that philosophy. I just didn't want anyone imagining that ASCII was an enforced limitation. There are many utilities which also work with binary data, and such utilities can easily be written in C for calling from bash. The command cmp, for example, compares files byte by byte; as far as I remember, the files don't need to be ASCII. But the traditional shell, along with utilities such as those you mentioned, was designed and optimised for processing text files, a very transportable format. Via a more recently devised scheme such as XML, text data can even carry a complex structure yet still be processed/parsed/transformed with the old UNIX utils such as sed and awk. Awk is a particularly interesting utility too, of course, since it is a programming language in its own right.
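[Editor's note: the cmp point is easy to demonstrate at a prompt. A minimal sketch; the file names are invented for the example.]

```shell
# cmp compares byte by byte, so the inputs need not be text.
# a.bin and b.bin are throwaway example files containing raw (non-ASCII) bytes.
printf '\000\001\002' > a.bin
printf '\000\001\003' > b.bin
cmp a.bin a.bin && echo "identical"   # exit status 0: every byte matches
cmp -s a.bin b.bin || echo "differ"   # -s: silent; non-zero exit on first mismatch
rm -f a.bin b.bin
```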
DMcCunney

Joined: 02 Feb 2009
Posts: 894

PostPosted: Thu 19 Mar 2009, 10:52    Post subject: Re: bash AS the C extension module for Python and so on  

mcewanw wrote:
DMcCunney wrote:

Whether it's more human readable is entirely a matter of whether you know the language. A lot of folks are proficient in Bourne shell scripting (and the bash script language is Bourne shell with extensions). Fewer people are proficient in Python and Tcl/Tk.

This is odd but absolutely true. Python "should" be easier to learn (and probably is) but bash (or Bourne) is so endemic to UNIX/Linux programming that anyone and everyone wanting to play with Linux system operation pushes themselves to learn the sometimes ungainly and painful bash syntax.

You pretty much have to to advance beyond the end user level, since most of the system is configured and controlled by shell scripts.

Quote:
I did, and I also learned C (for other reasons though). Later I worked hard to learn Python, and though its syntax is extremely easy to understand and read, I nevertheless struggled terribly to get on top of it - as MU suggests earlier, for some bizarre reason my brain had become too used to the less-than-nice-looking syntax of C and the way it almost exactly models the way data is used in UNIX. But that brings me to the following, which I don't agree with at all [EDIT: sorry... I do agree with DMcCunney to some extent after all; see my next post]

It's not a surprise. The development of computer languages has involved progressively higher levels of abstraction. If you are accustomed to and comfortable with dealing with the level provided by C/C++, Python can be a challenge. The data you are processing just doesn't look the same.

Quote:
DMcCunney wrote:

First, Unix was developed by programmers, who wanted a better environment for software development. Software source code is all ASCII, and the implicit assumption of the shell and pipelines is that programs are passing around ASCII strings. If the data you are diddling is not ASCII, this breaks down.

Linux, as with modern UNIX, is written in C,

Well, more likely C++ now, which is not the same thing. (A lot of issues stem from programmers trying to treat C++ like it was C.)

Quote:
though in some ways the relationship between C and UNIX is like the chicken and the egg scenario. It is as if C is written 'for' UNIX/Linux; it provides all the system calls and functionality to provide pipes and redirection.

Up through about v3, Unix was written in the PDP-11 assembly language of the DEC mini it was developed on. Around v4 (1973), C had matured enough to be substituted, and most of Unix was rewritten in it. The design of Unix influenced C, and vice versa.

Quote:
In many ways, all the bash shell does is provide a slightly more convenient way to use the underlying system calls - via libraries of very simple C routines. Though an interpretive language, each utility provided by bash is actually very efficient (since small bits of C code).

We may be talking about different things. From my viewpoint, bash doesn't provide the utilities. Bash implements high-level control constructs you can use to tie external utilities together to perform processing tasks. The Unix philosophy was one tool for one job, each optimized to do a particular thing. If you needed to do more than one thing to data, you tied together more than one tool in a script.
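[Editor's note: the "one tool for one job" idea reads like this in practice. An illustrative pipeline, not anything from Puppy itself.]

```shell
# Each stage does exactly one thing; the shell only supplies the plumbing.
printf 'the cat sat on the mat\n' |
  tr ' ' '\n' |   # split: one word per line
  sort |          # group duplicate words together
  uniq -c |       # count each distinct word
  sort -rn        # rank: most frequent first ("the" appears twice)
```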

Quote:
But C doesn't store data as ASCII strings at all (in fact C doesn't 'really' have a concept of 'ASCII string' handling at all); everything is stored as binary, with an address pointer marking the beginning and a null character (a binary byte) marking the end of the so-called 'string' (a binary string) - i.e. it's stored as binary, not ASCII. Pipes handle binary data, not just ASCII, and the facility is actually provided by C system calls; bash just provides a human user interface to the facility.

Agreed. C is a language intended for systems programming, and can manipulate any kind of data. I never suggested C used or expected ASCII. I said the sort of things you used shell scripts to do expected ASCII.
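[Editor's note: that pipes are byte streams rather than text channels, as both posters note, is easy to check from the shell. A minimal sketch.]

```shell
# A pipe carries raw bytes, NULs included; nothing is interpreted as "text".
printf 'hi\000there' | wc -c            # 8 bytes: h i NUL t h e r e
head -c 1024 /dev/urandom | wc -c       # 1024 random binary bytes pass through intact
```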

Quote:
So why look for a C extension module to bolt onto Python in order to get rid of bash (or Bourne, Korn or the C shell etc)? Bash IS an optimised extension module written in C for just the purpose required. But the syntax is ugly, because it is so low-level and close to C itself. bash just mirrors the underlying C system calls and, in a way, therefore, UNIX/Linux mirrors C. In human culture the individual, some say, is constructed from language - we 'are' the language that produces us. The relationship between C, bash, and Linux, follows that same model.

Agreed. And why bolt it on to python? A lot of what was done by shell scripts calling external utilities is now done in perl, which includes the capabilities formerly provided by things like awk and sed.

Quote:
But I absolutely agree that much of the time it would be nice to write in a more human-oriented computer language - for much of ALL programming, including scripting. And why have different syntax for that higher level scripting language in comparison with the compiling language for producing faster executables (where speed of execution is critical)?'

"One language to rule them all, one language to bind them..."

The C shell, developed by Bill Joy, was intended to have a script language whose syntax resembled C. It's still available - the original C shell got released as open source, and the enhanced tcsh (which fixes an assortment of csh parser bugs) always was. But every *nix system I'm aware of makes extensive use of Bourne shell language scripts for configuration and control. Attempting to rip that out and replace it would be an enormous task, and arguably a bad idea.

Quote:
So I also agree thoroughly with the concept being described by BigPilot at the start of this thread and argued strongly for by Mr. Maxwell. Python syntax for both the main system script language AND the compiling language (e.g. Genie) is a holy grail to look for. But you still need bash (as the lower-level C extension module/library of routines) to provide that level of system call interface to the higher level Python or Python-like code. No need to reinvent the wheel - you still need a pretty low level library of system calls such as provided by bash.

You can actually get the low-level calls in Python. It's a language in which complete applications can be written, and as such requires that sort of access. It simply provides a higher level of abstraction.

Quote:
The only problem is that Python itself tends to be too big to contemplate using as a scripting language for a Linux distribution which is purposely being kept unbloated in size.

As far as I can tell, the problem isn't that Python is big. (The PET for Python 2.5.4 is 4.5MB.) The bigger problem is that Python is really intended for faster, more powerful hardware than the sort of machine Puppy tends to be installed on. Barry complained in a blog post that he thought Python was bloated. Well, yeah, if your target system is a P200 with 64MB of RAM...

Most Python installations are on machines where it's feasible to write an entire application in Python because performance will be acceptable. On a Puppy machine, this may not be the case.

Quote:
Lua would be a better fit. But then we need a version of say Genie that also provides a Lua-like syntax.

Why?

Genie and Vala are essentially wrappers around C, providing a higher-level view of the system. But the output is C, to be compiled and built by GCC and friends, and you have to have the devx_XXX.sfs package installed to create your applications.

The nice thing about script languages like Python and TclTk is that you don't have to have the Gnu Compiler Suite installed to write in them and run the results.

Quote:
And the scripting language Lua needs to provide a GUI toolkit that looks good and is compatible with the most popular applications - i.e. a version of Lua, say, that interfaces to a gtk2 toolkit. John Murga's murgaLua uses FLTK, and that is probably its downfall; otherwise I'm sure such a tiny, powerful language would have been a permanent part of all Puppy distributions.

There is an effort to add gtk to Lua 5. See http://oertl.at/wolfgang/en/project/lua-gtk

Quote:
Useful though gtkdialog has been to the Puppy community, it is limited in power (and that limitation has a major effect on Puppy development, in my opinion). Genie/Vala looks like a very positive way forward, but I'd still like a scripting language with the same or similar syntax (with bash as an extension to provide the lower-level pipe interface and so on). Because of the size factor, I still favour a Lua syntax, especially since Lua, like Python, is useful as a cross-platform language and a very popular one, actively used by the gaming community for scripting purposes. If only Genie/Vala were more popular/cross-platform, maybe C and C++ would become less important (which would make a lot of people happy, since they are hardly considered easy to learn, let alone the associated "popular" GUI toolkit interfaces for them).

I think Genie/Vala were intended precisely to address that "difficult to learn" issue, but they still use C under the hood.

Quote:
Lua isn't 'quite' as pretty or obvious as Python to look at, so I honestly believe that I have a much better chance of mastering it.

It's also a lot smaller and less resource intensive, and is a good fit for low-end-oriented distros like Puppy. DSL makes extensive use of it.
______
Dennis
mcewanw

Joined: 16 Aug 2007
Posts: 3194
Location: New Zealand

PostPosted: Thu 19 Mar 2009, 11:38    Post subject: Lua syntax for everything near or damn it - would do me  

DMcCunney wrote:

mcewanw wrote:
Lua would be a better fit. But then we need a version of say Genie that also provides a Lua-like syntax.

Why?

Genie and Vala are essentially wrappers around C, providing a higher-level view of the system. But the output is C, to be compiled and built by GCC and friends, and you have to have the devx_XXX.sfs package installed to create your applications.


Why... because nice though Python code looks I remember being bored stiff with the language (no idea why). But ... I dabbled much more recently with Lua and that interested me. And ... I'm lazy. I would also sometimes like to use a scripting language which had less of the idiosyncrasies endemic to bash; getting a bash script to work as planned can sometimes be a real pain to the brain. But I would also like to use a higher level compiling language at times rather than C. And being lazy, I can't be bothered trying to learn and remember a different syntax for the scripting language and the compiling language. So Lua for scripting and a Lua-like-syntax-wrapper-for-C is exactly what I'd like to have available at the moment (rather than Genie, with its Python-like wrapper for C). Well, Lua-syntax wrapper in preference to a Lua-like-syntax wrapper.

I'm pretty sure Linux is written in C and not C++.
I used to teach Linux kernel programming and I don't think the basics have changed.
DMcCunney

Joined: 02 Feb 2009
Posts: 894

PostPosted: Thu 19 Mar 2009, 16:19    Post subject: Re: Lua syntax for everything near or damn it - would do me  

mcewanw wrote:
DMcCunney wrote:

mcewanw wrote:
Lua would be a better fit. But then we need a version of say Genie that also provides a Lua-like syntax.

Why?

Genie and Vala are essentially wrappers around C, providing a higher-level view of the system. But the output is C, to be compiled and built by GCC and friends, and you have to have the devx_XXX.sfs package installed to create your applications.

Why... because nice though Python code looks I remember being bored stiff with the language (no idea why). But ... I dabbled much more recently with Lua and that interested me. And ... I'm lazy. I would also sometimes like to use a scripting language which had less of the idiosyncrasies endemic to bash; getting a bash script to work as planned can sometimes be a real pain to the brain.

The issues I've seen bite are understanding parsing and precedence, and wrapping your mind around regular expressions.

Quote:
But I would also like to use a higher level compiling language at times rather than C. And being lazy, I can't be bothered trying to learn and remember a different syntax for the scripting language and the compiling language. So Lua for scripting and a Lua-like-syntax-wrapper-for-C is exactly what I'd like to have available at the moment (rather than Genie, with its Python-like wrapper for C). Well, Lua-syntax wrapper in preference to a Lua-like-syntax wrapper.

I haven't seen one, and I don't expect to. Lua is aimed at a different class of problems. For instance, it's great for embedding in other programs to provide scripting capabilities. There's an editor called SciTE-it based on SciTE and the Scintilla edit control that uses Lua as the extension language, and the author of SciTE-it finally abandoned that effort and created TextAdept, which is a small C core and a whole lot of Lua to provide functionality and extensibility. (I have both installed under Puppy.)

Genie/Vala look nice, but I prefer things like Java and Python on projects like Puppy because you don't have to have the Gnu Compiler suite installed to write and run applications.

Quote:
I'm pretty sure Linux is written in C and not C++.
I used to teach Linux kernel programming and I don't think the basics have changed.

No, I think you're right. I was misremembering, because AT&T at one point was rewriting the Unix kernel in C++.

I can see Linux not wanting to go there, because of variance in different C++ compilers. The Mozilla project had to put in various workarounds in their code for breakages in things like the C++ compiler provided on HP-UX. I can only imagine what the Linux kernel would require. ANSI C is at least a fairly clearly defined standard, with a lot less disagreement about exactly how to implement it.
_______
Dennis
DMcCunney

Joined: 02 Feb 2009
Posts: 894

PostPosted: Thu 19 Mar 2009, 16:29    Post subject: Re: C can do more than bash since it's a lower level language  

mcewanw wrote:
DMcCunney wrote:

It's not just bash (or the Bourne shell, C shell, Korn shell, Z shell, etc.). The various utilities you call from shell scripts also tend to assume ASCII strings. Awk, diff, grep, sed, tr and the like are intended to process ASCII files. As mentioned, the developers of Unix were software developers, and many of the utilities provided with a *nix system are intended to manipulate and transform source code in various ways.

Very true in general, and in the traditional UNIX "philosophy" - bash, Bourne, Korn and so on fit in with that philosophy. I just didn't want anyone imagining that ASCII was an enforced limitation. There are many utilities which also work with binary data, and such utilities can easily be written in C for calling from bash. The command cmp, for example, compares files byte by byte; as far as I remember, the files don't need to be ASCII.

No, they don't. And we now have things like bdiff for binary files.

Quote:
But the traditional shell, along with utilities such as those you mentioned, was designed and optimised for processing text files, a very transportable format. Via a more recently devised scheme such as XML, text data can even carry a complex structure yet still be processed/parsed/transformed with the old UNIX utils such as sed and awk. Awk is a particularly interesting utility too, of course, since it is a programming language in its own right.

I attended a talk once by Peter Weinberger, the "W" in awk. He stated that awk was originally designed to do "one liners" at the command prompt to examine files and return a result, and he described his shock the first time he encountered a multi-page awk script. :P
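[Editor's note: a one-liner of the kind Weinberger had in mind - examine input at the prompt and return a single result. The input is invented for the example.]

```shell
# Sum the second column of whitespace-separated input: a classic awk one-liner.
printf 'alpha 1\nbeta 2\ngamma 3\n' | awk '{ sum += $2 } END { print sum }'
# prints: 6
```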

XML has potential to transform a lot of processing. Another thing I've been following with interest is Microsoft's PowerShell. It provides a command prompt and permits scripts, pipelining and the like, but what it passes around aren't ASCII strings - they're fully qualified .NET objects. I suspect there's an enormous potential in something like this, even if I do loathe Microsoft.
______
Dennis
vader

Joined: 07 Apr 2009
Posts: 19

PostPosted: Sat 11 Apr 2009, 23:58    Post subject:  

This thread may be a little old, but I just read it. :)

Firstly, I'm a professional programmer (have been for way too long). Every language has its pros and cons.

The shell was originally written to interface with the system (i.e. run programs etc.). Shell scripts are really just macros of commands, albeit quite complex now. If you are running many command-line utilities, then shell is one of the easiest.

C was written because assembler was too hard (no, seriously). It has the lowest-level access to the machine, and the best performance. It is harder than the other languages to write something in (less is done for you), but gives the smallest and fastest program (memory footprint).

C++ tried to make C easier by creating well-defined APIs (interfaces) between sections of code. This made writing easier, but at a cost of size and speed (although not that much).

Java is compiled to byte code (fake machine code) which can be run by the Java runtime. Byte code doesn't depend on machine architecture as true machine code does. This means it can be run on any architecture that runs the Java runtime. Most modern Java implementations compile the byte code to native code for speed. Java is still (much) slower than C, but faster than interpreted languages. Java has a very simple syntax that relies on libraries to do anything.

Perl is an odd mix. When it runs, perl reads the text script, compiles it to internal byte code (or native machine code), and then runs that program. In many cases, perl can run as fast as native code plus the time taken to compile the script - e.g. 0.5 seconds to compile and 0.5 seconds to run, versus C taking 0.5 seconds.

Python also compiles to byte code, so it has the advantage of better speed; however, it has the overhead of compilation. There are apps that create native code, but like perl and java, they are still much less efficient than true native code.

As far as booting goes, shell is more memory-efficient than perl, python or java - which is why it is still used. C programs would be faster, but are much trickier to modify. In the boot process, you spend *far* more time running the apps which detect/set hardware etc. than running the script. Even if python were magically 10 times faster, the speed increase in boot time would be negligible.

Utilities are another matter. They are stand-alone apps which the user invokes. A language which allows quick development and is simple is better in most cases than a very efficient, fast, small app that takes twice as long to write and debug. In this case, shell script with gtkdialog, or python, are great. Java is stuck in the middle ground, not fast enough to warrant the extra programming effort.

I have not had any experience with vala/genie, but they seem to be exactly the right solution for utility apps (from what I read :) ).

One other thing, to answer DMcCunney - XML is a very inefficient way to pass messages. The reason people use it is that it is meant to be "humanly readable". It is easier to program than serialised or binary protocols, but it comes at a memory and speed cost. It makes reasonable config files, but is still expensive to read. Micro$oft can be easy to program for, but is very inefficient compared to properly coded, specific apps (hope my bias isn't showing :) ). A multi-meg hello-world app is a bit silly. :)

Cheers,
Vader

PS. I work every day with Linux - recently trying to decrease the boot time of an embedded Linux device. :)
Colonel Panic


Joined: 16 Sep 2006
Posts: 1915

PostPosted: Fri 19 Mar 2010, 19:26    Post subject:  

I can't honestly compete with this level of erudition (though I've enjoyed reading this thread). Just one observation and a question, if I may.

Firstly, I've downloaded and tried Pardus now and I have to say that I did find it slow on my machine, though it's probably an excellent distro for someone with a newer one and who likes KDE 4. I'm guessing that its reliance on Python is the reason it's slow.

Secondly, the question; Plan 9 was once called "the Unix which Unix should have been." Has anyone ever tried to base a distro on Plan 9 instead of a Unix/Linux base? Did Bill Gates ever think of basing a version of Windows on it?

_________________
Acer Aspire M1610 (Core 2 Duo, 2.3 GHz), 3 GB of RAM, 320 GB hard drive running Debian 9.2.1, Slackware 14.2 (32-bit), VLocity 7.2 Final, X-Slacko 4.3, FerenOS (32-bit), Devuan 1.0.0, Stella 6.8 and Pardus 1.7.1.
Pizzasgood


Joined: 04 May 2005
Posts: 6266
Location: Knoxville, TN, USA

PostPosted: Sat 20 Mar 2010, 00:54    Post subject:  

There are some projects based on Plan 9 directly, or on other projects based on Plan 9 - for example, Octopus. Mostly, though, Plan 9-derived projects seem to be aimed at researchers.

Plan 9 is on my list of OSes to learn more about in the future, along with Haiku and BSD, but I have more important priorities to deal with in the near term.

_________________
Between depriving a man of one hour from his life and depriving him of his life there exists only a difference of degree. --Muad'Dib

Colonel Panic


Joined: 16 Sep 2006
Posts: 1915

PostPosted: Sun 21 Mar 2010, 20:58    Post subject:  

Thanks for replying, PG. I think it was Eric Raymond who said the trouble with Plan 9 was not that it wasn't better than Unix (it was), but that it wasn't better enough to justify replacing Unix with it.
Colonel Panic


Joined: 16 Sep 2006
Posts: 1915

PostPosted: Mon 20 Nov 2017, 17:39    Post subject:  

Sorry, wrong thread.
amigo

Joined: 02 Apr 2007
Posts: 2634

PostPosted: Tue 21 Nov 2017, 14:20    Post subject:  

Actually useful that you bumped the thread - I've been looking at lua-bash again recently, and at some other loadable builtins from bashdiff.
Powered by phpBB © 2001, 2005 phpBB Group