I would like to know which of json.dump() and json.dumps() is more efficient when it comes to encoding a large array to JSON format. Can you please show me an example of using json.dump()?
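For reference, this is the minimal form I assume json.dump() takes (the filename and data here are just placeholders):

import json

data = list(range(1000000))  # placeholder for the real array

with open('output.json', 'w') as f:
    # json.dump() writes the encoded output to the file object,
    # rather than returning a string like json.dumps() does
    json.dump(data, f)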
Actually, I am making a Python CGI that gets a large amount of data from a MySQL database using the SQLAlchemy ORM. After some user-triggered processing, I store the final output in an array that I finally convert to JSON. But when converting to JSON with:
print json.dumps({'success': True, 'data': data}) #data is my array
I get the following error:
Traceback (most recent call last):
File "C:/script/cgi/translate_parameters.py", line 617, in <module>
f.write(json.dumps(mytab,default=dthandler,indent=4))
File "C:Python27libjson\__init__.py", line 250, in dumps
sort_keys=sort_keys, **kw).encode(obj)
File "C:Python27libjsonencoder.py", line 209, in encode
chunks = list(chunks)
MemoryError
So, my guess is to use json.dump() to convert the data by chunks. Any ideas on how to do this? Or other ideas besides using json.dump()?
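For what it's worth, this is roughly what I have in mind, reusing the same mytab and dthandler from the traceback above. It is only a sketch of my guess, not something I have verified actually avoids the MemoryError:

import json

with open('output.json', 'w') as f:
    # json.dump() encodes the object and writes the resulting chunks
    # directly to the file object, instead of first building the whole
    # JSON string in memory the way json.dumps() does
    json.dump(mytab, f, default=dthandler, indent=4)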
See Question&Answers more detail:
os 与恶龙缠斗过久,自身亦成为恶龙;凝视深渊过久,深渊将回以凝视…