Creating a Temporary File Hoster

Last modification on 2022-04-27

For the past couple of years, whenever I wanted to upload a file, I would curl it to lainsafe[0], i/[1], and recently[2].

0: lainsafe
1: kallist uploader
2:

Since I want to self-host, I thought I could just use one of the programs those three used. Earlier today, though, I realized I could instead copy the file(s) I want to upload via rsync/scp to a public directory that gets served by an httpd or gopherd.

From what I understand, the previous file hosters had a program running that reads the file the user uploads to them, does some renaming, and writes it to a directory that is served. After some time, that file is deleted. The first part can be handled via rsync/scp as mentioned previously. For automatic deletion, I recently saw in find's man page that it can list files that haven't been modified recently via the -mtime flag, so that can be used with a cron job.

But while thinking this idea through, I got stumped on how to print back the URL of the uploaded file: if the filename is appended to the base URL as is, it could contain spaces and other invalid unescaped characters which programs trying to download it may not like.

I thought I could just write a separate program for this. However, that seemed more complicated than just copying the file to the server. So, with the help of awk and some StackExchanging, I've been able to do it:

urlencode() {
	awk '
	BEGIN { for (i = 1; i < 256; i++) hex[sprintf("%c", i)] = sprintf("%%%02X", i) }
	{
		for (i = 1; i <= length($0); i++) {
			c = substr($0, i, 1)
			printf("%s", c ~ /^[-._~0-9a-zA-Z]$/ ? c : hex[c])
		}
		printf "\n"
	}'
}

[ -z "$1" ] && exit 1
FILE=$1
# SERVER and BASEURL should be set to your server's SSH destination and public URL

scp "$FILE" "$SERVER":files/ || exit 1
printf "%s/" "$BASEURL"
basename "$FILE" | urlencode
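As a quick sanity check, the encoder can be exercised on its own. This standalone sketch repeats the urlencode function in full (the sample filename is made up): it keeps RFC 3986's unreserved characters and percent-encodes everything else.

```shell
# Standalone copy of the urlencode function for testing.
urlencode() {
	awk '
	BEGIN { for (i = 1; i < 256; i++) hex[sprintf("%c", i)] = sprintf("%%%02X", i) }
	{
		for (i = 1; i <= length($0); i++) {
			c = substr($0, i, 1)
			printf("%s", c ~ /^[-._~0-9a-zA-Z]$/ ? c : hex[c])
		}
		printf "\n"
	}'
}

printf '%s\n' "my file (1).txt" | urlencode
# prints my%20file%20%281%29.txt
```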

Then, to purge these files once they become too old (e.g. older than 3 days), you can put something like this in a cron job that runs daily (replace the directory path):

0 0 * * * find /path/to/dir/ -type f -mtime +3 -exec rm {} \;
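To see what that find invocation actually matches, here is a small sketch (the directory and filenames are made up, and the -d flag used to backdate a file is GNU touch):

```shell
dir=$(mktemp -d)
touch "$dir/new.txt"                   # modified just now
touch -d '5 days ago' "$dir/old.txt"   # GNU touch: backdate the mtime
find "$dir" -type f -mtime +3          # files modified more than 3 days ago
# prints only .../old.txt
rm -r "$dir"
```

Note that -mtime +3 counts whole 24-hour periods, so it matches files at least 4 days old.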

You can also put this command in /etc/daily.local, in a script under /etc/cron.daily, or in whatever file your root crontab's @daily entry runs (if there is one).

And that's it! The only difficult part I ran into was encoding the name of the file, which I originally did in C. However, having a mixed C and shell program just for file uploading didn't sit right with me. It seems like whenever you're in doubt, you can rely on awk, huh.