Ever wanted to serve huge JSON files over the network (like 100+ MB files)?
An efficient way to handle this problem is to convert the JSON file to binary and then send it to the client.
First, let's simply convert the JSON file to a `.gz` file and see the size difference.
Comparison: actual file size – 107 MB, compressed `.gz` file size – 3.2 MB.
Linux command to convert JSON to `.gz`:
`gzip filename.json`
This will convert your file to a `.gz` file.
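If you would rather do this as part of a Node.js build step instead of on the command line, a minimal sketch using Node's built-in `zlib` module could look like this (the file names are just placeholders):

```typescript
import { createReadStream, createWriteStream } from "fs";
import { createGzip } from "zlib";
import { pipeline } from "stream";

// Compress data.json into data.json.gz with a stream pipeline,
// so the whole 100+ MB file is never held in memory at once.
pipeline(
  createReadStream("data.json"),
  createGzip({ level: 9 }), // maximum compression
  createWriteStream("data.json.gz"),
  (err) => {
    if (err) {
      console.error("Compression failed:", err);
    } else {
      console.log("Wrote data.json.gz");
    }
  }
);
```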
Now you can simply serve this file over the network, and the client has to decode the `.gz` file back to JSON.
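One caveat when serving the file: if your server sets `Content-Encoding: gzip` on the response, the browser will decompress it transparently and `fetch` will already give you plain JSON; the manual decoding shown below is only needed when the file is served as an opaque binary. A minimal Express sketch for that case (paths and port are assumptions):

```typescript
import express from "express";
import path from "path";

const app = express();

// Serve the pre-compressed file as raw binary; the client inflates it itself.
app.get("/data/data.json.gz", (_req, res) => {
  res.setHeader("Content-Type", "application/octet-stream");
  res.sendFile(path.resolve("data", "data.json.gz"));
});

app.listen(3000, () => console.log("Serving on http://localhost:3000"));
```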
A useful library for converting the binary data back to a string is [pako](https://www.npmjs.com/package/pako).
Use the code below in React.js to convert the `.gz` file back to `.json`.
import { inflate } from "pako";

// Inflate the gzipped binary back into a JSON string.
const parse = (bin: any) => inflate(bin, { to: "string" });

export const fetchJSON = () => {
  return fetch("/data/data.json.gz").then(async (response) => {
    const blob = await response.blob();
    // Read the blob as an ArrayBuffer, then inflate it.
    return readSingleFile(blob).then((res) => {
      const jsonData = parse(res); // still a JSON string; call JSON.parse on it to get the object
      console.log(jsonData);
      return jsonData;
    });
  });
};

async function readSingleFile(file: File | Blob) {
  return new Promise((resolve, reject) => {
    const reader = new FileReader();
    reader.onload = function (e) {
      resolve(e?.target?.result);
    };
    reader.onabort = reject;
    reader.onerror = reject;
    reader.readAsArrayBuffer(file);
  });
}
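For example, you could call this from a component and `JSON.parse` the inflated string before using it (the component name and import path here are just an illustration):

```tsx
import { useEffect, useState } from "react";
import { fetchJSON } from "./fetchJSON"; // wherever the snippet above lives

export function DataStatus() {
  const [data, setData] = useState<any>(null);

  useEffect(() => {
    // fetchJSON resolves with the inflated JSON string, so parse it here.
    fetchJSON().then((jsonString) => setData(JSON.parse(jsonString)));
  }, []);

  return <div>{data ? "Dataset loaded" : "Loading…"}</div>;
}
```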
*Why this article even makes sense:*
Well, recently I was working with deck.gl and had to render huge JSON dataset files (over 150+ MB). If the files are present locally, that's OK, but imagine serving these dataset files from a CDN 😨😨.
For this problem I searched the internet for how to serve/optimize large dataset files efficiently in deck.gl and found nothing, so eventually I ended up converting the files to binary and decoding them in the browser to render the map contents.
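For context, once the file is inflated and parsed, the result can be handed to a deck.gl layer like any other GeoJSON object. A rough sketch (the layer type and accessors are just an example, not my exact code):

```tsx
import DeckGL from "@deck.gl/react";
import { GeoJsonLayer } from "@deck.gl/layers";

// Renders an already-decoded GeoJSON object (e.g. the result of the
// fetchJSON + JSON.parse flow above) as a deck.gl layer.
export function BigDatasetMap({ geojson }: { geojson: any }) {
  const layer = new GeoJsonLayer({
    id: "big-dataset",
    data: geojson,
    filled: true,
    getFillColor: [0, 120, 255, 120],
    pickable: true,
  });

  return (
    <DeckGL
      initialViewState={{ longitude: 0, latitude: 0, zoom: 2 }}
      controller={true}
      layers={[layer]}
    />
  );
}
```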
I know this isn't the optimal approach, but if anybody has a better approach or any experience rendering large datasets with deck.gl, please comment down below.
Thanks.