How to export a big image? - google-earth-engine

I want to export an image collection with geemap.ee_export_image_collection_to_drive(), and I got this: Error: Exported image is too big (64240321180032 bytes > 64000000000000). (Error code: 3)
All the images in the collection are useful, so I'd rather not use clip() or reduceRegion().
So how can I export an image with geemap.ee_export_image_collection_to_drive() if it is bigger than the maximum number of bytes?

This is a 64 TB image, which is pretty big. I've never come close to exporting anything that size, so you may have run into an EE limit. I would suggest that you tile your AOI and make multiple exports. When exporting to Google Drive, if your image is somewhat big, it will be exported in tiles anyway, so if you want a single image you still have to either merge those tiles or create a VRT.
Still, an image that size leads me to believe that you have done something wrong. If you're sure you haven't, make sure you have enough free space in Google Drive before exporting.
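As a minimal sketch of the tiling approach (assuming a hypothetical Sentinel-2 collection and a rectangular AOI; the collection ID, coordinates, scale and grid size are placeholders you would replace), you can split the AOI into a grid and start one export task per tile:

```python
import ee

ee.Initialize()

# Placeholder collection and AOI -- substitute your own.
collection = ee.ImageCollection('COPERNICUS/S2_SR').filterDate('2021-01-01', '2021-02-01')
image = collection.mosaic()  # or loop over the collection and export each image

def tile_grid(xmin, ymin, xmax, ymax, n):
    """Yield an n x n grid of rectangular tiles covering the AOI."""
    dx, dy = (xmax - xmin) / n, (ymax - ymin) / n
    for i in range(n):
        for j in range(n):
            yield ee.Geometry.Rectangle([
                xmin + i * dx, ymin + j * dy,
                xmin + (i + 1) * dx, ymin + (j + 1) * dy,
            ])

# One export task per tile; each stays well under the size limit.
for k, tile in enumerate(tile_grid(100.0, 20.0, 110.0, 30.0, 4)):
    ee.batch.Export.image.toDrive(
        image=image,
        description=f'export_tile_{k}',
        folder='ee_exports',
        region=tile,
        scale=10,
        maxPixels=1e13,
    ).start()
```

The resulting GeoTIFFs can then be merged locally, for example with gdalbuildvrt, if a single mosaic is needed.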

Related

Firebase: cloud storage high bandwidth

I'm making a Flutter app that works like social media, so it uses pictures a lot. With that in mind, looking at the Firestore data, the reads with about 15 people look like this:
But they are far outpaced by my storage bandwidth usage:
I'm thinking about possible reasons why this is the case. First, I am not sure if this could potentially be a problem, but I save each user's images into a respective folder:
Furthermore, looking through the images, I notice that file sizes average around a few hundred kilobytes, with a few being sized in megabytes, the largest I found is 9 MB. Are these too large?
Are there any other ideas I am missing? I am trying to implement caching on my front end to help with this problem, but I am open to any other possible reasons and solutions.
Furthermore, looking through the images, I notice that file sizes average around a few hundred kilobytes.
If you take a look at the major social media apps, the average size is around a few tens of kilobytes, not a few hundred kilobytes. If you look at Instagram, for instance, all images are around 40-50 kilobytes.
With a few being sized in megabytes, the largest I found is 9 MB. Are these too large?
Way too large.
Are there any other ideas I am missing?
Yes, resize the images before uploading them to storage.
I am trying to implement caching on my front end to help with this problem, but I am open to any other possible reasons and solutions.
A caching mechanism will help you avoid re-reading an image from the server every time the user uses your app. However, it will not help the first time the user opens the app, when the image does have to be downloaded.
Instead of serving a single 9 MB image, for the same bandwidth you could serve roughly 180 images of 50 KB each.
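As a rough, hypothetical sketch of the resizing idea (shown here server-side in Python with Pillow; the 1080 px edge and JPEG quality are just example values, and in a Flutter app you would do the equivalent on the client before uploading):

```python
from io import BytesIO
from PIL import Image

def compress_for_upload(raw_bytes: bytes, max_edge: int = 1080, quality: int = 70) -> bytes:
    """Downscale and re-encode an image so uploads land in the tens of kilobytes."""
    img = Image.open(BytesIO(raw_bytes))
    img.thumbnail((max_edge, max_edge))  # shrinks in place, keeps aspect ratio
    out = BytesIO()
    img.convert('RGB').save(out, format='JPEG', quality=quality, optimize=True)
    return out.getvalue()
```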

Barcode that can hold a large amount of data (50 mb)

I am creating an app that allows you to quickly share a file (like a photo or video). It will work by generating something like a barcode, but it can be animated or anything. Just a scannable thing that can be used to share large amounts of data. I know a QR code can only hold up to about 3 KB, but in this case I'm not limited to a static image. Anything that can be used to transfer data from a screen to a high-resolution camera works. (However, I don't want it to upload the file to a server and then generate a QR code linking to that file.)
Thanks!
You need a storage account somewhere, hold the data in that storage, and have your QR code redirect to that storage.
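For the "QR code that points at the stored file" part, a minimal sketch using the Python qrcode library (the URL is a hypothetical shareable link you would get after uploading the file to your storage):

```python
import qrcode

# Hypothetical shareable link obtained after uploading the file to storage.
file_url = 'https://storage.example.com/uploads/photo-1234.jpg'

# A short URL fits comfortably within a QR code's capacity.
img = qrcode.make(file_url)
img.save('share_photo.png')
```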

Is there a good way to display map tiles dynamically and in real time?

Ⅰ. Using python-mapnik (Linux) + PostGIS
I've tried using Mapnik to show big data (more than about 6,000,000 polygon features with many vertices in PostGIS), rendering tiles in real time from a Python service without a cache, but I ran into problems:
1. At the beginning, loading the table from the database takes a lot of time.
2. When I zoom to level 12, map tiles load slowly.
Ⅱ. Using Python (Flask) + PostGIS (MVT) + mapbox-gl
1. Displaying 1,000,000 simple polygon features is very fast, but with the big data (more than about 6,000,000 polygon features with many vertices) the SELECT query takes a long time; it's slower than Mapnik.
Now I don't know how to complete my research on displaying big vector tiles quickly and in real time.
Is there anyone else interested in displaying data quickly? Any help or suggestions would be appreciated!
Finally, please forgive my poor English description.
Here is some information about vector tiles I've found; maybe it will be useful to somebody like me:
Vector tiles, PostGIS and OpenLayers
An update on MVT encoders
Aggregating data for faster map tiles
PostGIS Performance Profiling
MVT generation: Mapnik vs PostGIS
awesome-vector-tiles
You may want to reduce as much as possible the amount of data being transferred from your database to your rendering engine.
This blog post from CARTO may give you some ideas, even though it's focused on point data.
For polygon datasets, in order to reduce the amount of data moved to the renderer, you may want to create simplified versions of the geometries to use depending on the zoom level. Mapshaper is a nice tool to simplify polygons while retaining their topology. And in any case, always combine ST_RemoveRepeatedPoints with ST_SnapToGrid so you are not wasting rendering CPU on vertices that don't change any pixels.
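As a rough Flask + PostGIS sketch of that idea (assuming PostGIS 3.x and a hypothetical my_polygons table stored in EPSG:3857 with an id column; the connection string, layer name and tolerances are placeholders), you can simplify and snap inside the database and only ship the resulting MVT:

```python
import psycopg2
from flask import Flask, Response

app = Flask(__name__)
conn = psycopg2.connect('dbname=gis user=gis')  # placeholder connection string

TILE_SQL = """
WITH bounds AS (
    SELECT ST_TileEnvelope(%(z)s, %(x)s, %(y)s) AS geom
),
mvtgeom AS (
    SELECT ST_AsMVTGeom(
               ST_SimplifyPreserveTopology(
                   ST_RemoveRepeatedPoints(ST_SnapToGrid(t.geom, %(grid)s), %(grid)s),
                   %(grid)s),
               bounds.geom, 4096, 64, true) AS geom,
           t.id
    FROM my_polygons t, bounds          -- hypothetical table in EPSG:3857
    WHERE t.geom && bounds.geom
)
SELECT ST_AsMVT(mvtgeom.*, 'polygons', 4096, 'geom') FROM mvtgeom;
"""

@app.route('/tiles/<int:z>/<int:x>/<int:y>.mvt')
def tile(z, x, y):
    # Snap/simplify to roughly one screen pixel, so low zooms move far fewer vertices.
    grid = 40075016.68 / 256 / (2 ** z)
    with conn.cursor() as cur:
        cur.execute(TILE_SQL, {'z': z, 'x': x, 'y': y, 'grid': grid})
        row = cur.fetchone()
    data = bytes(row[0]) if row and row[0] else b''
    return Response(data, mimetype='application/vnd.mapbox-vector-tile')
```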

cloud functions truncating logs

I currently rely on getting some information from cloud functions through console.log(information)
However, whenever the logs are long they are truncated when viewing them in the browser. Scrolling brings up the next set of log entries, so the long entries remain incomplete (see attached image).
I'm aware that I can store this data in the database, but sometimes it's convenient to view it in the logs as well.
Does anyone have a better way of getting around this?
There is no easy workaround for this. You would have to build a string of whatever you want to log, split it into segments that fit on screen, then log those substrings separately. Or, you could write the log somewhere else temporarily, such as a database.
This is a known issue, but please feel free to file a bug report to add your voice.
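As a minimal sketch of the split-and-log approach (the question logs from Node with console.log, but the same idea works in any runtime; the 1000-character chunk size is just a guess at what fits on screen):

```python
def log_in_chunks(text: str, chunk_size: int = 1000) -> None:
    """Log a long string in numbered pieces so no single entry gets cut off."""
    total = (len(text) + chunk_size - 1) // chunk_size
    for i in range(total):
        piece = text[i * chunk_size:(i + 1) * chunk_size]
        print(f'[{i + 1}/{total}] {piece}')  # stdout ends up in the function's logs
```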

Why does Teradata SQL Assistant stop color-coding text?

I am using a very long .sql file (5,000 lines) in Teradata SQL Assistant. When I initially copy-pasted the text in, I got the usual font-colors (SELECT, FROM, etc show up in blue font, text strings in a pink/purple color, etc.)
However, when I saved and then re-opened this file directly, the font was all black. No colors at all. If I copy-paste the code into a new SQL Assistant query window, the color shows up again. But if I save that new query and then re-open it, I get just a solid black font.
Can anyone help me figure out what is going on here? It's not an absolute deal-breaker (I can still run the code), but it's definitely annoying to debug a wall of uniform, black font.
I can't post the code because (1) it's for work and (2) there's a ton of it. It's a long series of INSERT statements into a diagnostic table with the results of running SELECT on a trio of other tables for the purposes of looking for bad data. It does include some long SQL-code snippets as text (within quotes).
This might be due to the size of the query; some people have complained that parsing/highlighting is too slow for huge source files.
There are two options under Tools - Options - Query:
Limit parsing for Queries larger than ... KB
Disable parsing for Queries larger than ... KB
Try increasing those values.
