
ralphmorris's avatar

Using Amazon S3 or storing user uploaded files on your server

I have been looking at Amazon S3 today, trying to figure out its advantages/disadvantages and the use cases where it is a good choice.

I have a web app where many users can each have many uploaded files in several places, for example: portfolio, review images, and booking notes with attached images. As a very rough estimate, 50 users could have as many as 3k images, with the potential to grow a lot more with more users if all goes well.

Would this be a good candidate for using S3? I am debating with myself: I want to keep things simple and not integrate it if it wouldn't affect performance, but if it is best practice to use a service like S3 for a project like this, then I'd rather do it now than do a migration down the line.

Any advice/ideas appreciated.

0 likes
12 replies
bashy's avatar

If you do not want to have to worry about keeping the files on your server and the extra security concerns that come with that, I'd use S3. It's cheap for what you get.

You can always extract all the data and put it elsewhere if you decide it's not best?

1 like
dipasquo's avatar

Take a look at Laravel's Filesystem. It provides an abstraction layer, which means you can store files locally today and on AWS S3 tomorrow (or vice versa) with very little code change required: https://laravel.com/docs/5.4/filesystem

If you're already running your app on EC2 instance(s), S3 network latency is nominal, so why not use S3 for file storage? It's likely to give you flexibility down the road that local file storage will not.

Whether or not your app is running on EC2, if upload actions are infrequent but downloads are frequent, using S3 for storage would allow you to forgo storage and bandwidth concerns.
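To make the "local today, S3 tomorrow" point concrete, here is a minimal sketch of Laravel's config/filesystems.php. The env key names and placeholder values are illustrative and vary slightly between Laravel versions:

```php
<?php
// config/filesystems.php — a minimal sketch; bucket/credential values
// are placeholders pulled from environment variables.

return [
    // Swap this env value to 's3' later without touching upload/read code.
    'default' => env('FILESYSTEM_DISK', 'local'),

    'disks' => [
        'local' => [
            'driver' => 'local',
            'root'   => storage_path('app'),
        ],

        's3' => [
            'driver' => 's3',
            'key'    => env('AWS_ACCESS_KEY_ID'),
            'secret' => env('AWS_SECRET_ACCESS_KEY'),
            'region' => env('AWS_DEFAULT_REGION', 'us-east-1'),
            'bucket' => env('AWS_BUCKET'),
        ],
    ],
];
```

With this in place, calls like Storage::put('file.jpg', $contents) go to whichever disk is the default, so switching backends is a config change rather than a code change.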

ralphmorris's avatar

Thanks @bashy and @dipasquo for your responses.

My app is close to going live and I am looking at hosting solutions at the moment with either AWS or Digital Ocean.

@bashy by extra security concerns are you referring to access or backups etc.? I think peace of mind is definitely what I am looking for, from a performance and reliability point of view. I'm sure S3 is very reliable, but I'm always hesitant to rely on too many external services in case of changes to their API etc.

I also read that S3 had some big outage earlier this year. Obviously that can happen with any host, but I guess you'd still be safer with a service like S3 than storing locally? Though, as @dipasquo points out, I could have them stored locally but do full backups to S3.

bashy's avatar

Best Answer

@ralphmorris Yeah, you can use multiple disks in Laravel, so you can store a backup locally as well. By security, I meant storing and serving the files from the app: as long as you have validation and image/file checks to make sure nothing malicious gets uploaded, you're fine.

ralphmorris's avatar

Cool, thanks. I think my approach will be that each upload gets stored both locally and on S3. If S3 were ever to have problems, I could just change a config value to start serving the images from local storage.
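That dual-write idea can be sketched with Laravel's multiple-disk support; which disk serves reads then becomes a single config/env switch. The $file variable and path are illustrative:

```php
<?php

use Illuminate\Support\Facades\Storage;

// Assume $file is a validated uploaded file and $path its storage path.
$contents = file_get_contents($file->getRealPath());

// Write the same file to both disks...
Storage::disk('s3')->put($path, $contents);
Storage::disk('local')->put($path, $contents);

// ...and read from whichever disk the config currently points at.
// Flipping 'filesystems.default' (or its env var) switches the source
// of truth without any code change.
$image = Storage::disk(config('filesystems.default'))->get($path);
```

The trade-off, as noted later in the thread, is that every write/update/delete now has to touch two places.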

dipasquo's avatar

You can also look at S3 cross-region replication. In the AWS outage of a few months ago, I do not believe there was any data loss, just data inaccessibility. People with multi-region deployments did not suffer the way people isolated to the affected region did.

bashy's avatar

Yeah I'd definitely suggest looking at replicating the data across regions or hosts.

ralphmorris's avatar

That's really interesting, thanks. I didn't realise that was a service. I've just been reading about it, and it looks like if there was an outage I could just change the config to point to the other region and it would keep working. It also looks like you can set it up to be bidirectional, so if I did PUTs to the backup bucket it would sync as well. I guess the only question is whether it would sync automatically after a problem on one of them, but generally that sounds like a really good solution.

If there are two copies, I think I'd prefer not to keep one locally as well, both to save space and to keep the code cleaner, rather than having to write/update/delete in two places.

ralphmorris's avatar

@bashy @dipasquo Ok last question I promise :) This has been really helpful.

Currently my app is on L5.2. Although I'd really like to update it to get some of 5.4's features, that isn't going to happen for quite a while.

Maybe a dumb question but is the S3 API likely to change to the point where Storage::put('file.jpg', $resource); using league/flysystem-aws-s3-v3 ~1.0 would just stop working?

Or, on 5.2, is this likely to be maintained with a simple composer update locally, making any amendments, and then composer install on the production server?

Or am I likely to not have these kind of problems with it?

This is my first app built solely by myself so haven't got much experience in maintaining a Laravel app after it's built.

Thanks again

bashy's avatar

@ralphmorris Basically, if laravel/framework for 5.2 or the thephpleague packages get an update (doubtful for Laravel 5.2), you can just composer update locally, check it still works, and then push & deploy (composer install on the server picks up the committed lock file).

If you upgrade Laravel itself, you'd obviously need to check whether the Storage/Flysystem API is still the same.

ralphmorris's avatar

Thanks @bashy, that makes sense. But is there ever likely to be an update on the S3 side that would force you to update the thephpleague package / L5.2 to keep it working?

I'm sure it's impossible to say for certain, but I'd guess the S3 API is pretty damn solid and wouldn't need huge updates?

Cheers

bashy's avatar

@ralphmorris There shouldn't be, because the SDK pins a specific AWS API version. They might stop supporting a version if it gets too old, but I doubt that's any time soon. You could/should probably keep the league package up to date within the major version (1.*), though, for security fixes and whatnot.
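The "within the major version" advice maps directly onto composer version constraints. A sketch of the relevant composer.json lines for the setup discussed in this thread (versions are illustrative):

```json
{
    "require": {
        "laravel/framework": "5.2.*",
        "league/flysystem-aws-s3-v3": "~1.0"
    }
}
```

The ~1.0 constraint lets composer update pull any 1.x release (bug and security fixes) while refusing a breaking 2.0 upgrade.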

1 like
