Uploading files to S3 works fine with my Wagtail/Django application (both static files and media uploads). Now I'm trying to use ManifestStaticFilesStorage to enable cache busting. The URLs are generated correctly by the application, and the files are copied to S3 with hashed names. But each time I run collectstatic, some files get copied to S3 twice, each with a different hash. So far the issue occurs for all CSS files.
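For context, the storage classes below are wired up in settings.py roughly like this (the bucket name and location values here are placeholders, not my real ones):

# settings.py (sketch - bucket name and locations are placeholders)
AWS_STORAGE_BUCKET_NAME = 'my-bucket'
STATICFILES_LOCATION = 'static'
MEDIAFILES_LOCATION = 'media'

STATICFILES_STORAGE = 'custom_storages.StaticStorage'
DEFAULT_FILE_STORAGE = 'custom_storages.MediaStorage'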
file.a.css is loaded by the application and is the file referenced in staticfiles.json - however, it is a 20.0 B file in S3 (it should be 6.3 KB). file.b.css has the correct contents in S3 - however, it does NOT appear in the output generated by collectstatic.
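To compare the two hashed copies I listed the objects with a quick boto3 sketch (bucket and prefix are placeholders for my real ones):

import boto3

# List the hashed CSS objects and their sizes to spot the truncated copy
s3 = boto3.client('s3')
resp = s3.list_objects_v2(Bucket='my-bucket', Prefix='static/css/')
for obj in resp.get('Contents', []):
    print(obj['Key'], obj['Size'])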
# custom_storages.py
from django.conf import settings
from django.contrib.staticfiles.storage import ManifestFilesMixin
from storages.backends.s3boto import S3BotoStorage


class CachedS3Storage(ManifestFilesMixin, S3BotoStorage):
    """S3 storage that writes a staticfiles.json manifest of hashed names."""
    pass


class StaticStorage(CachedS3Storage):
    # Static assets live under the STATICFILES_LOCATION prefix in the bucket.
    location = settings.STATICFILES_LOCATION


class MediaStorage(S3BotoStorage):
    # Media uploads; never overwrite an existing file with the same name.
    location = settings.MEDIAFILES_LOCATION
    file_overwrite = False
Deps:
"boto==2.47.0",
"boto3==1.4.4",
"django-storages==1.5.2",
"Django==2.0.8"
Any pointers on where to look to track down this issue would be appreciated! :)
Edit:
Looking more carefully at all the files copied to S3, the issue occurs ONLY for CSS files. Disabling the push to S3 and writing the assets to the local filesystem instead works as expected (see the snippet below for how I tested that).
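For the local test I swapped the static storage for Django's built-in manifest storage, along these lines:

# settings.py (local test) - use Django's stock on-disk manifest storage
STATICFILES_STORAGE = 'django.contrib.staticfiles.storage.ManifestStaticFilesStorage'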
Edit 2:
Updated all the deps to their latest versions - same behavior as above.