How to force disable S3 storage?


#1

I was testing using Minio as the backend for the amazon-s3 storage engine.

I configured all of the settings, then first tried to migrate a file to amazon-s3 (with --copy), and it failed. OK, a configuration error on the Minio side (it turns out that if the bucket doesn't exist beforehand, no error is returned, and Phabricator thinks the file was uploaded correctly).

So then I created the bucket, tried migrate again, and this time it worked.
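For reference, the migration step corresponds roughly to the following (a sketch; exact flags may differ by Phabricator version, see `./bin/files help migrate`):

```shell
# From the phabricator/ install directory: migrate existing files into the
# amazon-s3 storage engine, copying data rather than moving it (--copy).
./bin/files migrate --engine amazon-s3 --all --copy
```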

Now, happy with my tests, I wanted to disable S3 storage. I deleted the storage.s3.bucket key from the config, thinking Phabricator would then stop using S3.
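The same key deletion can be done from the CLI, roughly like this (assuming a local install with `./bin/config` available):

```shell
# Remove the bucket key so the amazon-s3 engine has no destination bucket.
./bin/config delete storage.s3.bucket
# The endpoint (and any credential keys) need to go too before the
# engine is reported as fully disabled.
./bin/config delete amazon-s3.endpoint
```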

When I paste an image from my clipboard into a comment box (take a screenshot with Print Screen, paste), the image itself uploads fine to the blob engine, but the autogenerated image-preview.png is broken, and when I try to view the file in the web interface I just get:

Unhandled Exception ("PhabricatorFileStorageConfigurationException")

No 'storage.s3.bucket' specified!

And now all image thumbnails are trying to go to S3, and I don't know how to stop S3 from being used! I deleted the amazon-s3.endpoint key as well, but that changed nothing.

So, for my corrupt files (where the migrate failed) I can just delete them and re-upload, but I'd love to have thumbnails back.

Edit

One more note: I restarted the webserver and phd, then saw a small setup issue saying "incomplete S3 configuration", and the page said "S3 is disabled until you fill in missing details" (cool, I thought, maybe it's fixed). To be safe, I deleted all amazon-s3 keys, reloaded the page, and the error was gone.

Paste an image from the clipboard into a comment: the preview is still broken, with No 'storage.s3.bucket' specified!


#2

OK, I realized that the file whose migration to S3 failed at the very beginning (it was an image-preview) is the one that keeps getting re-created as an S3 entry.

I'm guessing there is some hash of this preview image and it's just pointing to the same underlying storage location for some sort of deduplication.
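To illustrate the guess: if files are deduplicated by a content hash, then identical preview bytes will always map back to the same (broken) storage entry, no matter how many times they are regenerated. A minimal sketch of the idea (not Phabricator's actual hashing code):

```shell
# Identical byte content -> identical content hash -> one shared storage entry.
printf 'same preview bytes' > /tmp/preview_a
printf 'same preview bytes' > /tmp/preview_b
hash_a=$(sha256sum /tmp/preview_a | cut -d' ' -f1)
hash_b=$(sha256sum /tmp/preview_b | cut -d' ' -f1)
if [ "$hash_a" = "$hash_b" ]; then
  echo "deduplicated: both uploads resolve to the same storage handle"
fi
```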

If it's safe to delete rows from phabricator_file.file, I could just delete the offending files outright (it seems no one has permission to delete thumbnails…).


#3

OK I fixed this:

Found the original blob entry in phabricator_file.file_storageblob, then changed the broken file row's storageEngine to blob and its storageHandle to that blob row's id.

Deleted all my duplicate uploads that were created in an attempt to get thumbnails working.

Files are clean and the original blob is still alive and well.
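For anyone hitting the same thing, the repair above amounts to SQL along these lines (a sketch assuming default database and column names; back up first; `<FILE_ID>` and `<BLOB_ID>` are placeholders for the ids you find in your own install):

```shell
# Point the broken file row back at the local MySQL blob engine.
# <FILE_ID>: id of the broken row in phabricator_file.file
# <BLOB_ID>: id of the surviving row in phabricator_file.file_storageblob
mysql phabricator_file -e "
  UPDATE file
     SET storageEngine = 'blob',
         storageHandle = '<BLOB_ID>'
   WHERE id = <FILE_ID>;
"
```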


#4

Thanks for the report and followup. It’s likely that we should be doing a better job handling the case where the destination S3 bucket doesn’t exist but the upload appears to succeed.


#5

Oh wait, I think I misunderstood. From your description, it sounds like it’s Minio that’s swallowing the “no such bucket” exception.


#6

To be honest, I'm not sure where the fault lies; I will investigate further (as I'd like to switch to Minio-backed storage eventually).

Either Phabricator didn’t understand the file wasn’t sent, or Minio didn’t reject it properly. I’ll find out.