How to loop saved Internet stream?

Booting, installing, newbie
fabrice_035
Posts: 765
Joined: Mon 28 Apr 2014, 17:54
Location: Bretagne / France

How to loop saved Internet stream?

#1 Post by fabrice_035 »

Hi,

I'm trying to save an Internet stream to a single file that wraps around in a loop; it's not easy to explain.

For example, when we save a stream to a file, the scheme looks like this:

|start|------record-continuously---------[file grows with recording time]

What I'm looking for is this (with a max file size of 20 MB or less):

|start|------record-continuously-------< loop back to start

:roll:

Currently I download the stream with:

wget "http://www.streaming_site.com/part_increase_number.ts" -O - >> output_file.ts

After a while output_file.ts gets too big. I just want to always keep the last five minutes (like a dashcam).

Thanks for the help.
Regards.

trapster
Posts: 2117
Joined: Mon 28 Nov 2005, 23:14
Location: Maine, USA

#2 Post by trapster »

Streamripper can save to a max file size.
I'm not sure it's what you're looking for.

Code:

# streamripper
Usage: streamripper URL [OPTIONS]
Opts: -h             - Print this listing
      -v             - Print version info and quit
      -a [file]      - Rip to single file, default name is timestamped
      -A             - Don't write individual tracks
      -d dir         - The destination directory
      -D pattern     - Write files using specified pattern
      -s             - Don't create a directory for each stream
      -r [[ip:]port] - Create relay server on base ip:port, default port 8000
      -R #connect    - Max connections to relay, default 1, -R 0 is no limit
      -L file        - Create a relay playlist file
      -z             - Don't scan for free ports if base port is not avail
      -p url         - Use HTTP proxy server at <url>
      -o always      - Always overwrite tracks in complete
      -o never       - Never overwrite tracks in complete
      -t             - Don't overwrite tracks in incomplete
      -c             - Don't auto-reconnect
      -l seconds     - Number of seconds to run, otherwise runs forever
      -M megabytes   - Stop ripping after this many megabytes
      -q [start]     - Add sequence number to output file
      -i             - Don't add ID3V1 Tags to output file
      -u useragent   - Use a different UserAgent than "Streamripper"
      -w rulefile    - Parse metadata using rules in file.
      -m timeout     - Number of seconds before force-closing stalled conn
      -k count       - Skip over first <count> tracks before starting to rip
      -I interface   - Rip from specified interface (e.g. eth0)
      -T             - Truncate duplicated tracks in incomplete
      --quiet        - Don't print ripping status to console
      --debug        - Save debugging trace
      --xs_???       - Invoke splitpoint detection rules (see README/man page)
# 
trapster
Maine, USA

Asus eeepc 1005HA PU1X-BK
Frugal install: Slacko
Currently using full install: DebianDog

fabrice_035
Posts: 765
Joined: Mon 28 Apr 2014, 17:54
Location: Bretagne / France

#3 Post by fabrice_035 »

@trapster, thanks, but it's a video stream, and I can't use ffmpeg either. Only wget or curl will work.


My script (needs youtube-dl / python) to download the video stream:

Code:


#!/bin/sh

rm -f file.ts

# Ask youtube-dl for the channel's JSON metadata, then pull out the manifest URL
youtube-dl -j 'http://www.giniko.com/watch.php?id=220' > delux.txt
m3u="http:$(python -mjson.tool < delux.txt | grep '"manifest_url"' | awk -F':' '{print $3}' | tr -d ',"' | head -1)"

# The last line of the master playlist names the media playlist
curl "$m3u" > delux.txt
m3u=$(tail -1 delux.txt)

while true; do
    echo "loop^"

    # Fetch the media playlist to learn the current segment sequence number
    curl "http://nimlive6.giniko.com/deluxemusic/deluxemusic/$m3u" > m3u.txt
    start=$(grep SEQUENCE m3u.txt | awk -F':' '{print $2}')
    bas="media-u9p46989a_w924998918_"

    fails=0
    while true; do
        # The query string on the last playlist line carries the token
        end=$(tail -1 m3u.txt | awk -F'?' '{print $2}')

        # Probe the segment first; --spider downloads nothing
        wget -q --spider "http://nimlive6.giniko.com/deluxemusic/deluxemusic/$bas$start.ts?$end"
        EXITCODE=$?
        echo "Exit code was $EXITCODE"

        if [ $EXITCODE -ne 0 ]; then
            echo "wget executed with error"
            # After a few misses, break out to refresh the playlist
            fails=$((fails + 1))
            [ $fails -ge 5 ] && break
            sleep 3
        else
            fails=0
            wget "http://nimlive6.giniko.com/deluxemusic/deluxemusic/$bas$start.ts?$end" -O - >> file.ts
            start=$((start + 1))
        fi
    done
done
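
To keep file.ts from growing forever, one option is to trim it back to a byte budget after each segment is appended. A sketch using only coreutils (`trim_tail` is a made-up name, not a real tool):

```shell
#!/bin/sh
# Sketch: keep only the newest MAX_BYTES of a growing capture file.
# trim_tail FILE MAX_BYTES truncates FILE to its last MAX_BYTES bytes.
trim_tail() {
    file=$1
    max=$2
    size=$(wc -c < "$file")
    if [ "$size" -gt "$max" ]; then
        tail -c "$max" "$file" > "$file.tmp" && mv "$file.tmp" "$file"
    fi
}

# e.g. after each:  wget ... -O - >> file.ts
# trim_tail file.ts $((20 * 1024 * 1024))   # cap at ~20 MB
```

The cut will usually land mid-packet, but MPEG-TS decoders resync on the next 0x47 sync byte, so playback normally recovers after a brief glitch.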

jafadmin
Posts: 1249
Joined: Thu 19 Mar 2009, 15:10

#4 Post by jafadmin »

One could write a small "vidloop" utility in C to do this.

Create the ".ts" buffer in RAM and write into it with wrap-around pointers to make the loop. Flush the buffer to disk when wget terminates.

Code:

wget "http://www.streaming_site.com/part_increase_number.ts" -O - | vidloop -s 20m -f output_file.ts 
"-s" specifies the buffer size, "-f" the output file path

A direct on-disk FIFO-style file would be too I/O-intensive to be worth attempting.
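
vidloop doesn't exist yet, but GNU coreutils can approximate the same rolling window today: cut the pipe into fixed-size chunk files and prune the oldest as you go. A sketch (chunk size, count, and names are illustrative; `--filter` and negative `head -n` are GNU-only):

```shell
#!/bin/sh
# ring_split: cut stdin into fixed-size chunk files named PREFIX*,
# keeping only the KEEP newest. Requires GNU split and GNU head.
ring_split() {
    prefix=$1
    bytes=$2
    keep=$3
    # For each finished chunk, the --filter command writes it out,
    # then deletes every chunk except the KEEP most recent ones.
    split -b "$bytes" -d -a 6 \
      --filter="cat > \"\$FILE\"; ls ${prefix}* 2>/dev/null | head -n -$keep | xargs -r rm --" \
      - "$prefix"
}

# Intended use (per the 20 MB figure above):
#   wget "http://www.streaming_site.com/part_increase_number.ts" -O - |
#     ring_split chunk. 2m 10
#   cat chunk.* > output_file.ts   # reassemble the rolling window
```

Since the numeric suffixes sort chronologically, a plain `cat chunk.*` rebuilds the last ~20 MB in order.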


fabrice_035
Posts: 765
Joined: Mon 28 Apr 2014, 17:54
Location: Bretagne / France

#5 Post by fabrice_035 »

@jafadmin, yeah, good idea, but I'm not able to write that myself.
:shock:
