.accdb file smaller than .mdb? - ms-access-2010

I created a new form in an existing database and also added many new fields.
The database is old and was saved in .mdb format; Access automatically converted it to .accdb.
I finished the new form, and nothing else was changed in the database apart from the new form and fields. I noticed, however, that the original .mdb version is a much larger file (9 MB) than the new .accdb version (6 MB).
I did not delete anything from the original, nor did I compact the file (at least not knowingly).
Does converting an .mdb file to .accdb automatically compact the database?

Related

Databricks Auto Loader - why is new data not written to the table when the original CSV file is deleted and a new CSV file is uploaded?

I have a question about Auto Loader and writeStream.
Here is my use case:
A few days ago I uploaded 2 CSV files into the Databricks file system, then read them and wrote them to a table with Auto Loader.
Today I found that the files I uploaded earlier contained wrong data, so I deleted those old CSV files and uploaded 2 new, correct CSV files.
Then I read and wrote the new files with Auto Loader streaming.
I found that the stream could read the data from the new files successfully, but failed to write it to the table with writeStream.
I then deleted the checkpoint folder with all of its subfolders and files, re-created the checkpoint folder, and ran the read and write stream again; this time the data was written to the table successfully.
Questions:
Since Auto Loader has detected the new files, why can't it write them to the table successfully until I delete the checkpoint folder and create a new one?
Auto Loader works best when new files are ingested into a directory; overwriting existing files can give unexpected results. I haven't worked with the option cloudFiles.allowOverwrites set to True myself, but it might help you (see the documentation linked below).
On the question about readStream detecting the overwritten files, but writeStream not: This is because of the checkpoint. The checkpoint is always linked to the writeStream operation. If you do
df = (spark.readStream.format("cloudFiles")
      .option("cloudFiles.format", "csv")  # Auto Loader requires the source format
      .option("cloudFiles.schemaLocation", "<path_to_checkpoint>")
      .load("<filepath>"))
display(df)
then you will always see the data from all the files in the directory. If you use writeStream, you need to add .option("checkpointLocation", "<path_to_checkpoint>"). This checkpoint remembers that the (overwritten) files have already been processed, which is why the overwritten files are only processed again after you delete the checkpoint.
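A minimal sketch of the writeStream side with an explicit checkpoint. The placeholder paths, the Delta target, the table name, and the availableNow trigger are assumptions for illustration, not details from the question:

```python
# Sketch only: replace the <placeholders> with real paths/names.
df = (spark.readStream.format("cloudFiles")
      .option("cloudFiles.format", "csv")                # required for Auto Loader
      .option("cloudFiles.schemaLocation", "<path_to_checkpoint>")
      # .option("cloudFiles.allowOverwrites", "true")    # opt in to reprocessing overwritten files
      .load("<filepath>"))

# The checkpoint given to writeStream records which files were already
# processed, so overwritten files are skipped until the checkpoint is deleted.
(df.writeStream
   .format("delta")
   .option("checkpointLocation", "<path_to_checkpoint>")
   .trigger(availableNow=True)
   .toTable("<target_table>"))
```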
Here is some more documentation about the topic:
https://learn.microsoft.com/en-us/azure/databricks/ingestion/auto-loader/faq#does-auto-loader-process-the-file-again-when-the-file-gets-appended-or-overwritten
cloudFiles.allowOverwrites https://docs.databricks.com/ingestion/auto-loader/options.html#common-auto-loader-options

For a Xamarin new-version release, will the local database file be overwritten?

I have a local database file in my Assets and Resources folder. The database file gets copied to a local database after I create all the tables. If I release a new version with an updated database file in the Assets & Resources folder, will it overwrite the previous one? If so, what happens to the local database already created in the user's app?
Do I have to delete all the tables upon new version start up then recreate them with the new data? Or can they be merged?
I am nearing the end of coding my first app and wanted to take this into consideration, to future-proof it as much as possible.

ASP.NET MVC 4 Get file input stream before completely filling the server's memory

I am having a hard time figuring out how to get the file's InputStream from a file-upload POST request on the server before the file is completely loaded into memory.
This is not a problem for smaller files, but I am worried about what happens when uploading a larger file (1 GB or more). I found a possible solution using HttpContext.Request.GetBufferlessInputStream(true), but this stream includes the whole request, not just the uploaded file, so if I use it to upload a file, for example to Azure Blob Storage, I end up with a corrupted file. I also lose all the information about the file (file name, size, etc.).
Is there any convenient way of uploading a large file to the server without filling its memory? I would like to get the stream and then use it to upload the file anywhere in chunks.
Thank you.
I used the DevExpress UploadControl for a similar task. It supports large file uploads in chunks: a temporary file is saved on the server's hard drive, and you can access it through a FileStream without loading it fully into server memory. It also supports direct upload to Azure, Amazon and Dropbox.
The same is true for their MVC Upload control.
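As the question notes, the bufferless stream carries the entire multipart body, not just the file bytes, which is why saving it verbatim yields a corrupted file. A toy illustration of what the raw request actually contains, in plain Python using the stdlib email parser (nothing ASP.NET-specific; the boundary and file content are made up):

```python
from email.parser import BytesParser
from email.policy import default

# A minimal multipart/form-data body: boundary lines and part headers
# surround the actual file bytes.
body = (b"------boundary123\r\n"
        b'Content-Disposition: form-data; name="file"; filename="data.bin"\r\n'
        b"Content-Type: application/octet-stream\r\n"
        b"\r\n"
        b"FILEBYTES\r\n"
        b"------boundary123--\r\n")

# This is roughly what the raw request stream contains -- saving it
# verbatim stores the boundaries and headers along with the file bytes.
raw_request = (b"Content-Type: multipart/form-data; boundary=----boundary123\r\n"
               b"\r\n" + body)

# Parsing the multipart structure recovers the file name and the bare payload.
msg = BytesParser(policy=default).parsebytes(raw_request)
part = next(msg.iter_parts())
print(part.get_filename())            # data.bin
print(part.get_payload(decode=True))  # just the file bytes
```

The same idea applies server-side: something has to parse the multipart framing out of the raw stream before the file bytes can be forwarded anywhere.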

Trying to open ArcGIS created Dbf file in Sqlite3

I am using a GIS program called ArcGIS to create a .dbf file from shapefile data. I tried opening the .dbf file in sqlite3, and sqlite3 reported "Error: file is encrypted or is not a database". What causes this error? Why can't I open the .dbf in sqlite3?
When I open the dbf file in Excel I have no issues.
Edit: I am a new user on Stack Overflow, so I am confused about why there is no explanation for the -1 vote. What does it mean? And if I get no input on why downvotes happen, how can I learn to write better questions?
Edit 2: Since getting an answer, I have researched more and now understand that .dbf is an old format with no SQL component. Originally I thought (wrongly) that if dbf and SQLite are both databases, they must be compatible. When you are just starting out, questions that seem basic to others may not be so basic to you.
A .dbf file is a dBase database file. SQLite is a different database system with a completely different database file format. SQLite clients are not made to handle .dbf files. So the behavior you see is expected.
If you really need to access this data with a SQLite client, you could use ArcGIS's Create SQLite Database tool and copy the data from the shapefile to a SQLite database.
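The error message itself comes from sqlite3 checking the file header: every SQLite 3 database starts with a fixed 16-byte magic string, while a dBase file starts with a version byte. A quick way to check which kind of file you have, sketched in plain Python (the helper name and the minimal fake .dbf header are made up for illustration):

```python
import os
import sqlite3
import tempfile

SQLITE_MAGIC = b"SQLite format 3\x00"  # first 16 bytes of every SQLite 3 database

def is_sqlite3_db(path):
    """Return True if the file begins with the SQLite 3 magic header."""
    with open(path, "rb") as f:
        return f.read(16) == SQLITE_MAGIC

# A real SQLite database passes the check...
db_path = os.path.join(tempfile.mkdtemp(), "test.db")
con = sqlite3.connect(db_path)
con.execute("CREATE TABLE t (x INTEGER)")
con.commit()
con.close()
print(is_sqlite3_db(db_path))   # True

# ...while a dBase-style file (first byte is a dBase version number) does not,
# so sqlite3 reports "file is encrypted or is not a database".
dbf_path = db_path + ".dbf"
with open(dbf_path, "wb") as f:
    f.write(b"\x03" + b"\x00" * 31)  # minimal fake .dbf header
print(is_sqlite3_db(dbf_path))  # False
```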

Save Downloaded File in www Folder ionic 2

I'm creating an app where users need to work with large databases. Rather than having the users download the data and populate an SQLite database on the client (which would take a long time), I'm hoping I can use downloadable, pre-populated databases.
I found cordova-sqlite-ext, which allows working with pre-populated databases, but SQLite files must be located in the www folder for this to work. Is it actually possible to download files to this folder in Ionic/Cordova (on non-rooted devices)?
It's always good practice to store your files in the app's data directory. Check this comment of mine and see if it helps you:
https://github.com/driftyco/ionic-native/issues/881#issuecomment-270832521
I had a requirement to download a zip file (with a SQLite file in it), unzip it, store the file in the app's directory, and query the DB. I was able to achieve this using plugins, and it works quite well.
