These days we use the cloud for almost everything, but sometimes we still need to pull files from an SFTP server. Here are two solutions for that.
Pull and remove with sftp
This solution pulls the files and then removes them from the remote. There is a gotcha: if you expect a lot of files, there is a chance that a new file arrives while the “get -r …” command is still running, and the “rm *” that follows will then delete it before it was ever downloaded. So this approach is only suitable if you expect a few files a day or week.
Create a batchfile.sh
get -r upload/* incoming/
rm upload/*
Then add a cron job:
0 5 * * * /usr/bin/sftp -b batchfile.sh username@sftp-corp.company.com
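Note that sftp in batch mode (-b) cannot prompt for a password, so the cron job needs non-interactive authentication, typically an SSH key already installed on the remote. A minimal setup sketch, reusing the placeholder username and host from the cron line above (if the server is SFTP-only, the key may have to be installed by its administrator instead):

# Generate a passphrase-less key and install it on the SFTP server
ssh-keygen -t ed25519 -N "" -f ~/.ssh/id_ed25519
ssh-copy-id username@sftp-corp.company.com

# Run the batch file once by hand to confirm it works before relying on cron
/usr/bin/sftp -b batchfile.sh username@sftp-corp.company.com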
Only pulling with lftp
When I don’t have permission to remove the files from the remote SFTP server, I use the following off-the-shelf approach.
This cron job synchronizes the remote files (only those newer than 7 days, and only ones missing locally or newer than the local copy) to /home/USERNAME/incoming:
0 5 * * * /usr/bin/lftp -u USERNAME,none -e 'mirror --newer-than="now-7days" --only-newer --exclude .ssh --only-missing / /home/USERNAME/incoming; quit' sftp://sftp-corp.company.com
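Before scheduling it, you can run the same mirror once from a shell to see what it transfers; adding --verbose makes lftp print every file it copies. A sketch with the same placeholder USERNAME and host:

/usr/bin/lftp -u USERNAME,none -e 'mirror --newer-than="now-7days" --only-newer --exclude .ssh --only-missing --verbose / /home/USERNAME/incoming; quit' sftp://sftp-corp.company.com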
When the postgres in your distribution is stuck at version 10 and you have to upgrade to postgres-11, a good way to do a capistrano deploy is like this:
Do the system install with
yum install postgresql10-contrib postgresql10-devel
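The PGDG packages put their binaries under /usr/pgsql-10/bin, which is the path used in the bundler config below; a quick sanity check that pg_config is where we expect it:

/usr/pgsql-10/bin/pg_config --version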
Then, in your /shared/.bundle/config, add a line pointing to the location of the pg libraries:
---
BUNDLE_PATH: "/opt/application/shared/bundle"
BUNDLE_BUILD__PG: "--with-pg-config=/usr/pgsql-10/bin/pg_config"
BUNDLE_FROZEN: "true"
BUNDLE_JOBS: "4"
BUNDLE_WITHOUT: "development:test"
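If you prefer not to edit the file by hand, bundler can write the same setting for you. A sketch, run from the capistrano shared directory (e.g. /opt/application/shared) so it lands in that directory's .bundle/config; newer bundler versions prefer the bundle config set --local form:

bundle config --local build.pg --with-pg-config=/usr/pgsql-10/bin/pg_config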
Thanks to my colleague Kris for finding the solution.
https://blog.juliobiason.net/thoughts/things-i-learnt-the-hard-way/
“phlpwcspweb3” is found in “Amazon Web Services – Tagging Best Practices”.
From what I can decode from “phlpwcspweb3”, it is something related to web, and there are probably at least 3 instances of that kind.
According to AWS, this should be a meaningful hostname.
If you have decoded it, you probably do not need to read further…