In short, you should accept all HTTP headers and pass along the necessary headers and cookies.
- 10.0.0.0-10.0.0.255: 10.0.0.0/24
- 10.0.0.0-10.0.255.255: 10.0.0.0/16
- 10.0.0.0-10.255.255.255: 10.0.0.0/8
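The ranges above can be double-checked with Python's standard `ipaddress` module:

```python
import ipaddress

for cidr in ("10.0.0.0/24", "10.0.0.0/16", "10.0.0.0/8"):
    net = ipaddress.ip_network(cidr)
    # network_address and broadcast_address are the first and last IPs in the block
    print(f"{cidr}: {net.network_address}-{net.broadcast_address} "
          f"({net.num_addresses} addresses)")
# → 10.0.0.0/24: 10.0.0.0-10.0.0.255 (256 addresses)
# → 10.0.0.0/16: 10.0.0.0-10.0.255.255 (65536 addresses)
# → 10.0.0.0/8: 10.0.0.0-10.255.255.255 (16777216 addresses)
```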
Adding a soft link at /usr/local/bin/php should solve the problem.
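For example (a sketch — the source path `/opt/php/bin/php` is an assumption; point it at wherever your PHP binary actually lives):

```shell
# link the installed binary onto the default PATH
# (adjust the source path to your actual PHP install)
sudo ln -s /opt/php/bin/php /usr/local/bin/php

# verify the link resolves
php -v
```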
The FTP account is correct, but getting the directory listing fails: “Could not retrieve directory listing”. Strange, isn’t it?
At first I thought something was wrong with the FTP add-user script, so I tried creating the account manually, but it still didn’t work.
After an hour of research, here is the answer: I should use passive mode. And for Pure-FTPd to run in passive mode, I had to open more ports to the public, so this is mainly a firewall issue.
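For reference, a sketch of the fix, assuming a Debian-style Pure-FTPd install and the ufw firewall; the 30000-50000 range is arbitrary — pick any free high-port range:

```shell
# tell Pure-FTPd which ports to use for passive-mode data connections
# (Debian-style config: one-line option files under /etc/pure-ftpd/conf)
echo "30000 50000" | sudo tee /etc/pure-ftpd/conf/PassivePortRange
sudo service pure-ftpd restart

# open the same range (plus the FTP control port) in the firewall
sudo ufw allow 21/tcp
sudo ufw allow 30000:50000/tcp
```

In active mode the server connects back to the client for data transfers, which NAT and firewalls usually block; passive mode keeps the client initiating both connections, at the cost of the server needing a published range of data ports.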
I spent the whole afternoon searching for ways to secure files on AWS S3. Here is how I solved it:
Upload the files to S3 as “private” objects, so public users have no access even if they know the file URL.
Then use a Laravel route to retrieve the file, adding the auth middleware to that route at the same time.
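A minimal sketch of such a route, assuming the `s3` disk is already configured in `config/filesystems.php`; the `/files/{filename}` URL and the `documents/` prefix are illustrative, not from the original notes:

```php
<?php
// routes/web.php — only authenticated users can reach the file
use Illuminate\Support\Facades\Route;
use Illuminate\Support\Facades\Storage;

Route::get('/files/{filename}', function (string $filename) {
    $path = 'documents/' . $filename; // hypothetical object prefix

    abort_unless(Storage::disk('s3')->exists($path), 404);

    // stream the private object back through the app,
    // so the S3 URL itself is never exposed to the browser
    return Storage::disk('s3')->download($path);
})->middleware('auth');
```

Because the object stays private, the only way to read it is through this route, and the `auth` middleware rejects unauthenticated requests before the download ever starts.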