Download of images while observing remotely

During a remote observing session, one of the concerns is the retrieval of images. You need a means to access the data, and you may also worry about bandwidth, because downloading large amounts of data is a resource-demanding task that could affect the responsiveness of the system on the remote side.

To address the first problem, whenever our users need immediate access to the data, the images are copied during the observing session to our public FTP area (anonymous access; i.e., if asked for a user and password, provide anonymous as the user and leave the password blank, or enter your e-mail address instead). Every image is available on the FTP within a couple of minutes of being saved on our main storage area (though an effort is made to have it there after just a few seconds). You can download the data there at will.
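
For instance, with common command-line clients, anonymous FTP access works without any extra setup. The host and path below are placeholders, not the real addresses; substitute the ones we provide:

 $ wget ftp://ftp.example.org/pub/incoming/image0001.fits
 $ curl -O ftp://ftp.example.org/pub/incoming/image0001.fits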

To address the second problem (bandwidth), we suggest using download tools that let you limit the transfer rate.
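
Both wget and curl accept such a limit. For example, to cap the transfer at 100 kB/s (placeholder URL again):

 $ wget --limit-rate=100k ftp://ftp.example.org/pub/incoming/image0001.fits
 $ curl --limit-rate 100k -O ftp://ftp.example.org/pub/incoming/image0001.fits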

Of course, downloading a corpus of data that grows over time forces you either to make periodic downloads over the night (if done manually, you may miss some files, which is fine if you only cherry-pick a few images) or to download the data in one go during the daytime, when nobody is using the line. If you nevertheless want to download the images as they are created, we have written a tool to save you the trouble of doing it yourself.
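
If you opt for the manual bulk approach, wget alone can mirror a whole FTP directory in one go, skipping files already fetched on a previous run (the URL is once more a placeholder):

 $ wget -r -nd -nc ftp://ftp.example.org/pub/incoming/alfosc/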

Image-downloading script

The script itself is available here. It is a UNIX bash shell script. Upon download, you should change its permissions to make it executable:

 $ chmod +x download.sh

Using the script

The script is thoroughly commented and prints a usage guide when run without arguments, but we will summarize a few points here for the sake of redundancy.

  • The script runs in a loop, checking every 5 seconds for changes on the FTP. This means you can simply run it at the beginning of the night and let it download the images for you as they are obtained (a stripped-down sketch of this loop is shown after this list).
    Press Ctrl+C if you want to terminate the script.
  • This script relies on two widely known and available tools: wget and curl. Please make sure both are installed on your system. Chances are that one of them is installed by default on your computer, but perhaps not both. On a Debian or Debian-derived system (like Ubuntu), the following line will do the trick:
     $ sudo apt-get install wget curl
  • The script will, by default, limit the download rate, in order to interfere as little as possible with remote operations. The rate can be changed by passing an argument on the command line.
    See the comments in the script if you want to set the default download rate to a higher value.
  • You can disable the rate limit by passing the -f flag on the command line. Doing this is a good idea if you are going to download the files in bulk (e.g., during the morning), or if your bandwidth is high.
  • The script downloads the files into the directory that was current when it was run. Some intelligence is coded into the script, so if you use the same directory during the whole observation (or sets of observations), it will save time and effort by not downloading files that have already been downloaded.
    The script assumes that every file except the very latest has been downloaded without problems. It will attempt to resume the download of this last file, in case the transfer was interrupted for some reason (e.g., by hitting Ctrl+C).
    This means that you can cancel the download at any time and resume it later without fearing data loss.
    The script will also create a subdirectory for each instrument, so the recommended setup is:
     $ mkdir directoryfordownloads
     $ cd directoryfordownloads
     $ ...... # here you run the script with your preferred options
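
As promised above, here is a minimal sketch of the loop the script runs, assuming wget is used for the transfers; the URL is a placeholder, and the real script adds per-instrument subdirectories, argument handling, and error checks on top of this:

 URL="ftp://ftp.example.org/pub/incoming/alfosc/"  # placeholder; use the real FTP path
 while true; do
     # -r: fetch everything under URL; -nd: do not recreate the remote directory tree;
     # -c: resume a partially downloaded file and skip files already complete;
     # --limit-rate: throttle the transfer so remote operations are not disturbed
     wget -q -r -nd -c --limit-rate=100k "$URL"
     sleep 5   # check the FTP area for new files every 5 seconds
 done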

Example

We will assume you have copied the script to the base directory for the downloads. Say you want to download ALFOSC images, temporarily limiting the download rate to half a megabyte per second. That is roughly 500 kB/s, so:

 $ ./download.sh 500 alfosc
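
If instead you are fetching the whole night in bulk (e.g., during the morning), the -f flag described above drops the rate limit. Going by the same calling convention, the invocation would presumably look like the line below, but check the script's own usage message for the exact syntax:

 $ ./download.sh -f alfosc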
Last modified: December 12 2022