Puppy Package Site - Planning Stages

News, happenings
Message
Author
User avatar
smokey01
Posts: 2813
Joined: Sat 30 Dec 2006, 23:15
Location: South Australia :-(
Contact:

#41 Post by smokey01 »

Jemimah, if you provide the details in a PM, I can set it up for you.

Names etc.

ocpaul20
Posts: 260
Joined: Thu 31 Jan 2008, 08:00
Location: PRC

#42 Post by ocpaul20 »

Did anything happen to this?

I saw somewhere that someone suggested a torrent method of downloading packages.

We could easily create a torrent-cache sort of site where we keep pointers to where the packages are stored, plus a small (or large) XML file with details of each package: who maintains it, where the .pet, SFS, or ISO is located, and so on.

That way the XML files could be organized by distro, package, etc., and the packages themselves could be spread out all over the internet. They could be on individuals' hosted servers if necessary.

I think the load of too many downloads needs to be spread out so that no single site gets hit with all the users requesting packages.

If this idea were adopted, we would just need to decide what information is required in the XML file and write a fairly simple application to sort it in different orders.
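Purely as a sketch of the idea, one entry in such an XML cache might look like the following. Every tag name, value, and URL here is invented for illustration, and Python's standard-library ElementTree is just one way the "fairly simple application" could read it:

```python
import xml.etree.ElementTree as ET

# Hypothetical entry in the proposed package-pointer file.
# All field names and values below are made up for this sketch.
ENTRY = """\
<package>
  <name>geany</name>
  <version>1.24</version>
  <format>pet</format>
  <distro>slacko-5.7</distro>
  <maintainer>somebody</maintainer>
  <url>http://example.org/pets/geany-1.24.pet</url>
  <md5>0123456789abcdef0123456789abcdef</md5>
</package>
"""

pkg = ET.fromstring(ENTRY)
# A sorting application could collect many such entries and order them
# by distro, name, maintainer, and so on.
print(pkg.findtext("distro"), pkg.findtext("name"), pkg.findtext("url"))
```

The packages themselves could live anywhere; only these small pointer files would need to sit on the central site.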

Each night a script could go through and check the availability of each server. If a package turned out to be unavailable when requested, its link could turn red or yellow, for example, so that others knew it was down.
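A nightly check like that could be only a few lines. The sketch below uses nothing but Python's standard library; the colour names echo the red/yellow idea above, and the mirror URL is invented:

```python
import urllib.request

def mirror_status(url, timeout=10):
    """Classify a mirror for the status page: 'green' if it answers,
    'yellow' on an unexpected reply, 'red' if unreachable.
    (Colour names are illustrative, matching the idea in the post.)"""
    try:
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return "green" if resp.status == 200 else "yellow"
    except OSError:
        return "red"

# A cron job could loop over every mirror listed in the XML cache
# and recolour the links based on this result.
print(mirror_status("http://mirror.invalid/pets/"))
```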

I think torrents are the way forward, but there is no easy, small application that runs in the background and notifies the user when the package is ready for installation.

PPM would need to start automatically, perhaps once the md5 had been checked and the download was complete, with a check at shutdown to warn that a download was still in progress, or to pause the torrent until the next boot.
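The md5 gate mentioned above is the simple part. A sketch with Python's hashlib (the payload and expected hash are invented for illustration):

```python
import hashlib

def md5_ok(data: bytes, expected_md5: str) -> bool:
    """Return True when the downloaded bytes match the advertised md5 --
    the point at which PPM could safely be started automatically."""
    return hashlib.md5(data).hexdigest() == expected_md5.lower()

# Illustration with in-memory data; a real check would read the
# finished .pet file from disk, ideally in chunks.
payload = b"pretend this is a finished package download"
print(md5_ok(payload, hashlib.md5(payload).hexdigest()))
```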
==================
Running DebianDog Jessie Frugal with /live and maybe with changes or savefile or.., who knows?

User avatar
Q5sys
Posts: 1105
Joined: Thu 11 Dec 2008, 19:49
Contact:

#43 Post by Q5sys »

ocpaul20 wrote:Did anything happen to this?
No one was really very interested in working on it.
ocpaul20 wrote:I think torrents are the way forward, but there is no easy, small application that runs in the background and notifies the user when the package is ready for installation.
Torrents are only good in situations where you have a lot of people who want the same file and are willing to share it for a long period of time.

Within a few months most torrents die, and we'd be left with one server hosting everything, which is no different from just hosting it there in the first place.

The number of people that are going to be actively downloading random_package.pet at the same time is going to be very low, so there is no benefit to torrents. I have a friend who runs a 100+ Gbit CDN and will give me an extremely low rate, but I need help with the site design to make it viable.

I'm not a web developer, so I need someone who is to help build up the site. Hosting the packages itself is the easy part.

ocpaul20
Posts: 260
Joined: Thu 31 Jan 2008, 08:00
Location: PRC

#44 Post by ocpaul20 »

As I understand it, the beauty of peer-to-peer is that:
a) you can stop downloads and pick them up later, since the download arrives as many "containers" of information;

b) the downloaded "containers" are shared amongst the peers while the peers are leeching, which spreads the load (since the "containers" don't HAVE to come from the original seeder);

c) I'm not sure how trackers come into it yet.

========================
OK, so if P2P is not the answer, then maybe the small XML files could be the component that allows people to search for and select the package they want.

We would have to define a format that provides fields for all the information we need.

In my mind, the difficulty would be getting the package uploader to fill in the required fields so that, as a whole, it made sense when combined with all the other entries that other people had made for their uploaded packages.
==================
Running DebianDog Jessie Frugal with /live and maybe with changes or savefile or.., who knows?

User avatar
Q5sys
Posts: 1105
Joined: Thu 11 Dec 2008, 19:49
Contact:

#45 Post by Q5sys »

ocpaul20 wrote:As I understand it, the beauty of peer-to-peer is that:
a) you can stop downloads and pick them up later, since the download arrives as many "containers" of information.
This is true, and it would be helpful for distributing ISOs, but it really doesn't make sense for distributing 10 MB packages. How often do you really need to pause and resume a 10 MB download?
ocpaul20 wrote:b) the downloaded "containers" are shared amongst the peers while the peers are leeching, which spreads the load (since the "containers" don't HAVE to come from the original seeder)
Again, I doubt you're going to have many people all downloading the same package at the same time... so the load isn't going to be spread around. And most people aren't going to continue to 'seed' the package once they get it and install it. Remember, Puppy for the most part is focusing on people with minimal hardware; they aren't going to want a torrent client running on their machine all day long.
ocpaul20 wrote:c) I'm not sure how trackers come into it yet.
Well, we'd most likely have to run our own tracker. It's not hard; I just don't think it's the best option.

Also keep in mind that some ISPs like to throttle torrent traffic, so some people would get slower rates than just downloading directly over HTTP(S) or FTP.
ocpaul20 wrote:OK, so if P2P is not the answer, then maybe the small XML files could be the component that allows people to search for and select the package they want.

We would have to define a format that provides fields for all the information we need.

In my mind, the difficulty would be getting the package uploader to fill in the required fields so that, as a whole, it made sense when combined with all the other entries that other people had made for their uploaded packages.
Client side:

I think the way TazOC handled this in LightHousePup is probably a good way to go, because it makes it easier for people to come along later and tweak it for their own use if they want. As much as I like 'standards', I also like making things so people can go their own way.

Server side:

You simply have a form that someone needs to fill out for their package to be made public and visible to everyone. If they can't be bothered to fill out the form... then I guess they don't really want people to have it.


What I think makes the most sense:

I have long thought that the way Arch Linux handles the information for AUR packages is the simplest and best way: http://aur.archlinux.org. Also, since it's all web based, there is no need for a 'package search' on the client side. Users have a web browser, so why write something custom to run on their systems? This also means the system can be used across the various Puppy bases.
Using the AUR site as an example, all we would need to do is add a field for the release version (Slacko 5.7, Tahr, etc.) so people only search for packages that will work on their system.

This way everything is in one central place: everyone can share their packages in one place, and bugs, feedback, etc. can all live there too, instead of being scattered around different sites as they are now (forum, wiki, random site where a package is stored).

All of this is stuff I've wanted to set up before, but of the people I've talked to... no one else really seems keen on using it. I'm not going to put time and money into building something for the whole community to ignore.

User avatar
ally
Posts: 1957
Joined: Sat 19 May 2012, 19:29
Location: lincoln, uk
Contact:

#46 Post by ally »

The files I upload to archive.org can be downloaded as torrents

:)

Post Reply