How to record YouTube live streams on a Linux server

Coachella streams a load of their sets on YouTube every year and nobody seems to record them all, so I've been volunteering since 2015. Some rips you find are one-off screen recordings made with software like Camtasia, which nearly always drop frames and are kinda shit. For the best quality you have to record directly from the YouTube source. youtube-dl is great, but it's designed for downloading normal YouTube videos and I never got it working on live streams. The only thing that worked for me was Livestreamer, which was discontinued but lives on as its fork, streamlink. Here's the streamlink command I used to record Coachella 2018 channel 1:
streamlink --http-no-ssl-verify https://www.youtube.com/watch?v=9TUBf6l7FBg best -o "output.ts"

I had a problem with SSL certs so I added --http-no-ssl-verify, but you probably don't need it. Run that command and streamlink should start recording; Ctrl+C stops it. You're left with a .ts file, the MPEG transport stream format that YouTube broadcasts in because it tolerates transmission errors well. You can play .ts files in VLC and MPC, but seeking in them is annoyingly slow and VLC sometimes plays the audio out of sync. They also don't carry an index, so players can't report a duration. That's bad for me, as I need to play the files straight off the server and pick start and stop times to cut the Coachella streams into individual sets. So I prefer to convert to MP4 using ffmpeg, with this command:
ffmpeg -i input.ts -acodec copy -bsf:a aac_adtstoasc -vcodec copy output.mp4
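If you want to sanity-check the remux without a nine-hour recording, here's a sketch that generates a short synthetic .ts with ffmpeg's lavfi sources and runs the same remux command. All filenames here are throwaway, and it assumes your ffmpeg build includes libx264 and the native AAC encoder:

```shell
# Self-contained demo of the .ts -> .mp4 remux on a synthetic 2-second stream
tmp=$(mktemp -d)
# Generate a test pattern plus silent audio and mux them into a transport stream
ffmpeg -loglevel error \
    -f lavfi -i testsrc=duration=2:size=320x240:rate=25 \
    -f lavfi -i anullsrc=r=44100:cl=stereo \
    -t 2 -c:v libx264 -c:a aac "$tmp/input.ts"
# Copy both streams without re-encoding; aac_adtstoasc rewrites the AAC
# framing from ADTS (what .ts uses) to what the MP4 container expects
ffmpeg -loglevel error -i "$tmp/input.ts" \
    -acodec copy -bsf:a aac_adtstoasc -vcodec copy "$tmp/output.mp4"
# Unlike the raw .ts, the MP4 now reports a proper duration
ffprobe -v error -show_entries format=duration -of csv=p=0 "$tmp/output.mp4"
```

Because both codecs are copied, the remux takes seconds even on a 9-hour file.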

If you're recording a long YouTube live stream, like the 9-hour Coachella streams, the stream can fail for whatever reason; in my experience YouTube always breaks the connection around the 6-hour mark. You need a way for streamlink to start back up when that happens so you don't miss anything. The only solution I could come up with is a while true loop in a bash wrapper script:
#!/bin/bash
dir=sunday_ch1
name=coachella_ch1
mkdir -p "$dir/ts_segments"
while true; do
    # Find the next free segment number so a restart never
    # overwrites an earlier recording
    i=1
    while [[ -e "$dir/ts_segments/$name-$i.ts" ]]; do
        (( i++ ))
    done
    streamlink --http-no-ssl-verify "https://www.youtube.com/watch?v=9TUBf6l7FBg" best -o "$dir/ts_segments/$name-$i.ts"
done

Note that you can't stop this script with Ctrl+C: that only kills streamlink, and the loop restarts it, like it's supposed to. You have to kill -9 the script itself. Find the PID of the bash script with ps -ef and kill it first, then kill -9 the orphaned streamlink process as well; it won't come back because the wrapper is already dead. Make sure you understand how to stop the shell script before starting it!
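The same stop sequence can be sketched against a stand-in background loop instead of the real script, just to show the mechanics (the loop here is a dummy; in real use its PID would belong to the wrapper script):

```shell
# Stand-in for the wrapper script: an endless loop in the background
bash -c 'while true; do sleep 1; done' &
wrapper_pid=$!
# In real use you'd find this PID with: ps -ef | grep yourscript
kill -9 "$wrapper_pid"            # kill the wrapper first so nothing respawns
wait "$wrapper_pid" 2>/dev/null   # reap the dead process
kill -0 "$wrapper_pid" 2>/dev/null || echo "wrapper is gone"
```

Killing the wrapper before the child matters: the other way round, the wrapper would respawn streamlink faster than you can kill it.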

The script saves the first recording to sunday_ch1/ts_segments/coachella_ch1-1.ts. The inner while loop checks whether a file with the current number already exists and, if so, increments i, so writes are always sequential: when the script restarts it writes to coachella_ch1-2.ts rather than overwriting coachella_ch1-1.ts, and so on.
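You can watch the numbering logic work in isolation against a throwaway directory (the names here are just for illustration):

```shell
# Demo of the sequential-numbering loop from the script
dir=$(mktemp -d)
name=coachella_ch1
# Pretend two segments were already recorded
touch "$dir/$name-1.ts" "$dir/$name-2.ts"
i=1
while [[ -e "$dir/$name-$i.ts" ]]; do
    (( i++ ))
done
echo "next segment: $name-$i.ts"   # next segment: coachella_ch1-3.ts
```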

I run my bash scripts in screen sessions so I can log out of the server and let it do its thing. So that's my recording setup. I also cut the long streams into individual sets with ffmpeg, add those sets to my database, and upload them to Mega or Google Drive through their APIs so the links land in the database and come out in a formatted Reddit post, but that's all outside the scope of this blog post.

Hope this helps someone! Leave a comment if you've any questions; don't email me.