Recovering from Stuck Files in Uploads Directory
SFTP Gateway uses a Linux tool called incrond to monitor directories for incoming files. If there are many nested subdirectories, or many files arriving in many different directories all at once, incrond can become overwhelmed and skip setting up monitoring on some directories. When a directory is not successfully set up for monitoring, files appear to be "stuck": they sit in that directory and never transfer to S3.
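For background, incrond drives its watches from incron tables (viewable with "sudo incrontab -l"), where each entry pairs a watched path, an event mask, and a command to run; in the command, $@ expands to the watched path and $# to the file name. A representative entry might look like the following (the movetos3 command path here is illustrative, not necessarily SFTP Gateway's actual entry):

```
/home/user/uploads IN_CLOSE_WRITE,IN_MOVED_TO /usr/local/bin/movetos3 $@/$#
```

Each subdirectory needs its own entry like this, which is why a skipped directory silently stops transferring files.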
If this happens to your instance, please send a detailed recursive listing of the user's upload directory showing the stuck files (the output of this command: ls -alR /home/<username>), along with your /var/log/movetos3/movetos3.log file, to support@thorntech.com, and we can help you pinpoint exactly what went wrong.
One way to prevent directories from being skipped is to enable the "sftpgateway.singlethread=yes" option, as described in our Best Practices for Production KB article. This feature prevents incrond from becoming overloaded by processing file events sequentially instead of all at once (which can overwhelm the server, leading to missed directories and files).
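Enabling the option is a one-line change in SFTP Gateway's properties file (the file location varies by installation and is an assumption here, not a documented path; check the Best Practices article for your version):

```
# SFTP Gateway properties file (path varies by install -- this is
# an assumption, not the documented location):
sftpgateway.singlethread=yes
```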
The "addsftpuser" command-line script is designed to be idempotent, meaning you can run it as many times as you want, and it will always configure the user's directory for proper monitoring. Its last step scans the user's upload directory for existing subdirectories and makes sure each one is properly set up for monitoring.
Once "addsftpuser" has been re-run to fix monitoring of all directories, you may still have existing files that need to be transferred to S3. It is not necessary to upload these files a second time. They can simply be "touched", and they will transfer to S3 automatically as long as the directory is set up for monitoring (you can verify that the directory is being monitored by running "sudo incrontab -l").
Run the following commands as the root user in the user's upload directory to touch all existing files automatically:
sudo su -
cd /home/user/uploads
find . -mmin +60 -type f ! -name "*.tmp" ! -name "*.filepart" -exec touch {} \;
This command recursively finds all files older than 60 minutes that do not have an extension of ".tmp" or ".filepart", then runs the Linux "touch" command on each one. The filters are in place to prevent touching a file that is actively in the process of uploading via SFTP. Note that the name patterns are quoted so the shell does not expand them before find sees them.
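To see exactly what those filters select, here is a self-contained sketch you can run against a throwaway directory (the file names are made up for illustration; -print is used instead of -exec touch so the selection is visible):

```shell
# Build a scratch tree with a mix of old files, fresh files, and
# in-progress upload artifacts.
tmpdir=$(mktemp -d)
mkdir -p "$tmpdir/sub"
# Older than 60 minutes: eligible for re-touching.
touch -d '2 hours ago' "$tmpdir/old.csv" "$tmpdir/sub/nested.bin"
# In-progress upload artifacts: must be left alone.
touch -d '2 hours ago' "$tmpdir/partial.filepart" "$tmpdir/scratch.tmp"
# Modified just now, so not matched by -mmin +60.
touch "$tmpdir/fresh.csv"

cd "$tmpdir"
# Same filters as the real command above, printing matches instead
# of touching them.
matches=$(find . -mmin +60 -type f ! -name "*.tmp" ! -name "*.filepart" | sort)
echo "$matches"   # ./old.csv and ./sub/nested.bin
cd - >/dev/null
rm -rf "$tmpdir"
```

Only the two old data files are selected; the partial uploads and the freshly modified file are left untouched.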
Version 1.003.01 has a known bug that could skip directories if multiple subdirectories are uploaded recursively from certain SFTP clients. This was fixed in version 1.003.2.