Django “chunked uploads” to Amazon S3
We're using S3Boto3Storage to upload media files to our S3 bucket on Amazon, and it works well. However, since we're on Cloudflare's free plan, we're limited to a maximum of 100 MB per request, which is a big problem; even the Enterprise plan is capped at 500 MB.
Is there a way to use some kind of "chunked upload" to get around the 100 MB-per-request limit?
model.py
from django.db import models

class Media(models.Model):
    name = models.CharField(max_length=100, null=True)
    file = models.FileField(upload_to=get_path)  # get_path is defined elsewhere in the project
storage.py
from storages.backends.s3boto3 import S3Boto3Storage

class MediaStorage(S3Boto3Storage):
    location = 'media'
    file_overwrite = False
views.py
from rest_framework import status
from rest_framework.decorators import api_view
from rest_framework.response import Response

from .serializers import MediaSerializer  # assumed location of the serializer

@api_view(['POST'])
def upload_media(request):
    if request.method == 'POST':
        serializer = MediaSerializer(data=request.data)
        if serializer.is_valid():
            serializer.save()
            return Response(serializer.data)
        return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
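For context, this is roughly how a custom storage backend like MediaStorage is wired up. The following is only a minimal sketch; the module path, bucket name and region are placeholders rather than values from the question, and credentials usually come from the environment or an IAM role instead of settings.py.

settings.py (sketch)
DEFAULT_FILE_STORAGE = 'myapp.storage.MediaStorage'  # placeholder module path
AWS_STORAGE_BUCKET_NAME = 'my-media-bucket'          # placeholder bucket name
AWS_S3_REGION_NAME = 'eu-central-1'                  # placeholder region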
Tags: django, cloudflare, django-storage
asked Nov 23 '18 at 15:15 by PinkyTV, edited Nov 23 '18 at 15:30
Amazon recommends multipart uploads for big files. I haven't worked with S3Boto3Storage in a while, but maybe you'll find this gist useful: gist.github.com/Hydriz/4413028 It contains an example of multipart uploads in parts of 200 MB each. – Alex, Nov 23 '18 at 16:09
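To illustrate the multipart upload API the comment refers to, here is a minimal sketch using boto3 directly; the bucket, key and file names are placeholders and error handling is omitted. Note that S3 requires every part except the last to be at least 5 MB, and that this only splits the server-to-S3 transfer; the browser-to-server request is still subject to Cloudflare's limit, which is what the answer below addresses.

import boto3

BUCKET, KEY = 'my-bucket', 'media/bigfile.bin'   # placeholders
PART_SIZE = 200 * 1024 * 1024                    # 200 MB parts, as in the linked gist

s3 = boto3.client('s3')
upload = s3.create_multipart_upload(Bucket=BUCKET, Key=KEY)
parts = []

with open('bigfile.bin', 'rb') as f:             # placeholder local file
    part_number = 1
    while True:
        data = f.read(PART_SIZE)
        if not data:
            break
        resp = s3.upload_part(Bucket=BUCKET, Key=KEY, PartNumber=part_number,
                              UploadId=upload['UploadId'], Body=data)
        parts.append({'ETag': resp['ETag'], 'PartNumber': part_number})
        part_number += 1

s3.complete_multipart_upload(Bucket=BUCKET, Key=KEY, UploadId=upload['UploadId'],
                             MultipartUpload={'Parts': parts})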
1 Answer
In order to bypass that limit, you'll have to use something like resumable.js on the client side to split the upload into chunks and send them to the server via REST calls. On the server side, you then reassemble the file before pushing it to S3.
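As a rough sketch of the server side of that approach (not the answerer's code): a DRF view that accepts resumable.js-style chunks, appends them to a temporary file, and saves the reassembled file through the Media model so it ends up in S3 via MediaStorage. The parameter names follow resumable.js defaults, the temp-file layout is an assumption, and it assumes chunks arrive in order (simultaneousUploads = 1).

# views.py (sketch; parameter names follow resumable.js defaults)
import os

from django.conf import settings
from django.core.files import File
from rest_framework import status
from rest_framework.decorators import api_view
from rest_framework.response import Response

from .models import Media  # assumed import path


@api_view(['POST'])
def upload_chunk(request):
    chunk = request.FILES['file']
    chunk_number = int(request.data['resumableChunkNumber'])
    total_chunks = int(request.data['resumableTotalChunks'])
    identifier = request.data['resumableIdentifier']
    filename = request.data['resumableFilename']

    # Append each chunk to a temp file keyed by the upload identifier
    # (assumes chunks arrive sequentially, i.e. simultaneousUploads = 1).
    tmp_path = os.path.join(settings.MEDIA_ROOT, 'tmp', identifier)
    os.makedirs(os.path.dirname(tmp_path), exist_ok=True)
    with open(tmp_path, 'ab') as f:
        for piece in chunk.chunks():
            f.write(piece)

    if chunk_number == total_chunks:
        # Last chunk received: save the reassembled file through the model,
        # which uploads it to S3 via the configured storage backend.
        with open(tmp_path, 'rb') as f:
            media = Media(name=filename)
            media.file.save(filename, File(f))
        os.remove(tmp_path)
        return Response({'status': 'complete'})

    return Response({'status': 'partial'}, status=status.HTTP_200_OK)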
answered Nov 23 '18 at 19:36 by 2ps