Suppose I have a classic SQL database. Up to now, I have let applications/users connect directly to the database and execute queries. The connection is ODBC-like, or better yet database-specific, and as usual is optimized for binary data transfer.
Now I am asked to deny direct access to the database, and write an API layer to retrieve information.
The question is: an HTTP API call with a JSON response is very inefficient compared to direct access. The payload is bloated: JSON is text rather than binary, and even when compressed, the standard row-oriented JSON structure repeats the column names for every single row, so bandwidth usage suffers badly.
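To illustrate the overhead of repeated column names, here is a minimal sketch (with made-up data) comparing the usual row-oriented JSON with a column-oriented layout, where each column name appears only once:

```python
import json

# Hypothetical result set: 3 columns, 1000 rows.
rows = [{"id": i, "name": f"user{i}", "score": i * 0.5} for i in range(1000)]

# Row-oriented JSON: every row repeats the column names.
row_json = json.dumps(rows)

# Column-oriented JSON: names appear once, values in parallel arrays.
col_json = json.dumps({
    "columns": ["id", "name", "score"],
    "data": [[r["id"], r["name"], r["score"]] for r in rows],
})

print(len(row_json), len(col_json))  # the columnar form is substantially smaller
```

A columnar response is still text, so it does not close the gap with a binary wire protocol, but it removes the per-row key overhead while staying plain JSON.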
I know about pagination, chunking, and so on, so it is technically feasible: but in the end, the total time clients spend and the total data transferred are far worse than with direct access.
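For reference, the kind of chunked/streamed response I mean can be sketched as a generator yielding newline-delimited JSON, so the client starts consuming rows before the server has serialized the whole result set (the helper and row source here are hypothetical):

```python
import json

def stream_ndjson(cursor_rows, chunk_size=500):
    """Yield batches of newline-delimited JSON rows (hypothetical helper
    for an HTTP chunked/streaming response)."""
    buf = []
    for row in cursor_rows:
        buf.append(json.dumps(row))
        if len(buf) >= chunk_size:
            yield ("\n".join(buf) + "\n").encode()
            buf = []
    if buf:  # flush the final partial batch
        yield ("\n".join(buf) + "\n").encode()

# Usage with a fake cursor of 1200 rows:
rows = ({"id": i, "value": i * 2} for i in range(1200))
chunks = list(stream_ndjson(rows))
print(len(chunks))  # 3 batches: 500 + 500 + 200 rows
```

This helps latency and memory, but as noted it does not reduce the total bytes on the wire.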
So I wonder: is there a better specification for serving large amounts of data via an API? Maybe a binary content type? Maybe another protocol instead of HTTP? I don't know whether there are established best practices for retrieving data from a database efficiently without a direct connection.
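On the binary question: binary content types over HTTP are a real option (columnar formats such as Apache Arrow, or Protocol Buffers over gRPC, are commonly used for exactly this). As a rough stdlib-only illustration of why binary framing helps, here is a sketch packing the same made-up rows as JSON text versus fixed-width binary records:

```python
import json
import struct

# Hypothetical result set: 10,000 rows of (int id, float value).
rows = [(i, i * 0.5) for i in range(10000)]

# Text payload: the usual JSON array-of-arrays.
text = json.dumps(rows).encode()

# Binary payload: fixed-width records, little-endian 4-byte int + 8-byte double.
binary = b"".join(struct.pack("<id", i, v) for i, v in rows)

print(len(text), len(binary))  # the binary encoding is smaller, with no parsing of digits needed
```

In practice you would use an existing serialization format with schemas and client libraries rather than hand-rolled `struct` packing; this only shows the size and framing difference in principle.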
If a user successfully ran a SELECT * before, they should be able to keep running the same SELECT * via the API, without limitations on data size or time.
Please share what you think about it, thank you!
question from:
https://stackoverflow.com/questions/65926365/what-are-alternatives-to-send-large-amount-of-data-via-api