How to stream upload a file using multipart/form-data #474
I did some digging and the issue seems to stem from this call:

```python
form = parse_multipart((body.stream if hasattr(body, 'stream') else body), header_params)
```

I tried to go ahead and replace that call:

```python
# form = parse_multipart((body.stream if hasattr(body, 'stream') else body), header_params)
form = FieldStorage(fp=body, outerboundary=header_params['boundary'])
print(form)
```

@timothycrosley, if you have any suggestion on how I could move forward with implementing this, I'd appreciate it.

Update: using this kind of invocation:

```python
form = FieldStorage(fp=body, outerboundary=header_params['boundary'],
                    environ={'REQUEST_METHOD': 'POST', 'CONTENT_TYPE': 'MULTIPART/FORM-DATA'})
```

I was able to get a `FieldStorage` instance back, though my gut feeling is that it still isn't parsing the form correctly.
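For what it's worth, the `cgi` module (deprecated since Python 3.11) isn't the only stdlib option: the `email` parser can handle the same kind of multipart payload. A minimal sketch, where the boundary, header values, and field name are invented for illustration:

```python
import io
from email.parser import BytesParser
from email.policy import default

# Hypothetical multipart payload, for illustration only
raw = (b"Content-Type: multipart/form-data; boundary=BOUND\r\n\r\n"
       b"--BOUND\r\n"
       b'Content-Disposition: form-data; name="foo"\r\n\r\n'
       b"hello\r\n"
       b"--BOUND--\r\n")

msg = BytesParser(policy=default).parsebytes(raw)

# Map each part's form-data name to its decoded content
form = {part.get_param('name', header='content-disposition'): part.get_content()
        for part in msg.iter_parts()}
```

Note this parses from an in-memory buffer rather than streaming, so it sidesteps rather than solves the chunked-upload question.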
@timothycrosley I had success working this out by incorporating the multipart parser into hug's input formats:

```python
@content_type('multipart/form-data')
def multipart(body, **header_params):
    """Converts multipart form data into native Python objects."""
    from multipart import MultipartParser

    if header_params and 'boundary' in header_params:
        if isinstance(header_params['boundary'], str):
            header_params['boundary'] = header_params['boundary'].encode()
    parser = MultipartParser(stream=body, boundary=header_params['boundary'],
                             disk_limit=17179869184)
    form = dict(zip([p.name for p in parser.parts()],
                    [(p.filename, p.file) if p.filename else p.file.read().decode()
                     for p in parser.parts()]))
    # form = parse_multipart((body.stream if hasattr(body, 'stream') else body), header_params)
    # for key, value in form.items():
    #     if type(value) is list and len(value) == 1:
    #         form[key] = value[0]
    return form
```
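One nit on the handler above: `parser.parts()` is iterated twice to build the dict. A single comprehension produces the same mapping in one pass; sketched here with stand-in part objects (the `Part` namedtuple is hypothetical, not the `multipart` library's class):

```python
import io
from collections import namedtuple

# Stand-in for the multipart library's part objects (hypothetical)
Part = namedtuple('Part', 'name filename file')

parts = [
    Part('file', 'example_file', io.BytesIO(b'binary payload')),
    Part('bar', None, io.BytesIO(b'0123456789')),
]

# Files keep their (filename, stream) pair; plain fields are decoded to str
form = {p.name: (p.filename, p.file) if p.filename else p.file.read().decode()
        for p in parts}
```

Besides reading each text field only once, this keeps file parts as unread streams, which matters for the streaming goal of this issue.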
Hug example code:

```python
@hug.post('/upload', versions=1)
def upload_file(body, request, response):
    """Receives a stream of bytes and writes them to a file."""
    print(body)
    filename = body['file'][0]
    filebody = body['file'][1]
    with open(filename, 'wb') as f:
        chunksize = 4096
        while True:
            chunk = filebody.read(chunksize)
            if not chunk:
                break
            f.write(chunk)
    return
```

The resulting `body`:

```python
{
    'file': ('example_file', <_io.BytesIO object at 0x7fc32549dee8>),
    'bar': '0123456789',
    'foo': 'hellohello'
}
```
@milancurcic I currently maintain streaming-form-data, a streaming multipart/form-data parser. Some others running into a similar issue with Flask have found it useful, so I thought I might suggest it for hug as well; hopefully it helps with the issue here too.
The `cgi` module cannot handle POST with multipart/form-data in 3.x (https://bugs.python.org/issue4953); the multipart parser adapted to it:

```python
# multipart.py, line 232
def _lineiter(self):
    #####
    for line in lines:
        if line.endswith(b"\r\n"):
            yield line[:-2], b"\r\n"
        elif line.endswith(b"\n"):
            yield line[:-1], b"\n"
        elif line.endswith(b"\r"):
            yield line[:-1], b"\r"
        else:
            yield line, b""
```
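The trick in the `_lineiter` snippet above, yielding each line together with its terminator so boundary matching works for `\r\n`, `\n`, and bare `\r` endings alike, is easy to replicate standalone. A minimal sketch (`split_keep_terminator` is my own name for it, not the library's):

```python
import io

def split_keep_terminator(lines):
    """Yield (line, terminator) pairs, mirroring multipart's _lineiter."""
    for line in lines:
        if line.endswith(b"\r\n"):
            yield line[:-2], b"\r\n"
        elif line.endswith(b"\n"):
            yield line[:-1], b"\n"
        elif line.endswith(b"\r"):
            yield line[:-1], b"\r"
        else:
            yield line, b""

pairs = list(split_keep_terminator(io.BytesIO(b"a\r\nb\nc")))
# Caveat: iterating a binary stream splits on \n only, so a bare \r
# mid-stream needs chunk-based reading, as the real parser does.
```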
I am trying to get hug to receive a `multipart/form-data` POST request and stream the body in chunks straight to disk. I was able to successfully stream-upload a large binary file using an `application/octet-stream` POST method. Here is my hug method:

And here is my curl snippet:

The above works and I'm able to stream-upload the file like this because in the `upload_file` function, `body` is a `gunicorn.http.body.Body` instance, which I am able to stream straight to disk in chunks.

However, I need to be able to upload files from a browser, which sends a `multipart/form-data` POST request. To emulate this with curl, I do:

This time, in hug, the `body` is a dictionary, and `body['file']` is a `bytes` instance. However, I don't know how to stream this to disk without loading the whole thing into memory first. Is there a way I could obtain the body as a file object that I could stream straight to disk?

Any help much appreciated, and thank you for the fantastic work on Hug!
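For the `application/octet-stream` case that already works, the chunked copy boils down to a read loop like the one below. This is a generic sketch, not the exact handler from this issue:

```python
def stream_to_disk(body, path, chunksize=4096):
    """Copy a readable binary stream to disk in fixed-size chunks."""
    with open(path, 'wb') as f:
        while True:
            chunk = body.read(chunksize)
            if not chunk:
                break
            f.write(chunk)
```

`shutil.copyfileobj(body, f, chunksize)` performs the same loop in the stdlib; either way, memory use stays bounded by the chunk size rather than the upload size.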