Using geonames service locally

I was planning to use the GeoNames API, but it actually seems far more sensible to use the GeoNames data locally rather than through their web API. So my question is: does anyone know how to interface with the GeoNames data locally?
Do I need to manually import it into a database and then reference it like a normal DB?

Yes, you will have to import the data manually if you want to query it locally.
1 - If you plan to use MySQL, you can follow the explanations in this post:
Importing data from geonames.org database into MySQL DB
2 - If the table structure is not up to date, you can also refer to the GeoNames forum:
http://forum.geonames.org/gforum/posts/list/732.page
3 - Once you have created your tables following the given structure, you can use LOAD DATA INFILE or LOAD DATA LOCAL INFILE to bulk import the data from the CSV files (see the sketch after this list).
4 - The import of the allCountries table can take more than 10 minutes locally. It is usually faster to create your indexes after the data is imported, since maintaining indexes during a bulk load slows it down considerably.
(A shell script is also available if you are comfortable with the command line.)
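For example, a minimal LOAD DATA sketch for MySQL might look like the following (the geoname table name and column list assume the structure from the forum post above, and allCountries.txt is the tab-separated GeoNames dump; adjust both to your actual schema):
-- bulk-load the GeoNames allCountries.txt dump
LOAD DATA LOCAL INFILE 'allCountries.txt'
INTO TABLE geoname
CHARACTER SET utf8mb4
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n'
(geonameid, name, asciiname, alternatenames, latitude, longitude,
 fclass, fcode, country, cc2, admin1, admin2, admin3, admin4,
 population, elevation, gtopo30, timezone, moddate);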

Is there a way to back up my App Services / Usergrid data

App Services is a great place to store data, but now that I have a lot of critical info in there I realized there isn't a way to create a backup or roll back to an earlier state (in case I did something stupid like -X DELETE /users).
Any way to back up this data either online or offline?
Apart from using the API to fetch records x by x and storing them locally, there is no solution at the moment. The team is planning an S3 integration (export data to S3), but no completion date is defined for that yet.
Looks like the only way is to query the data using e.g. cURL and save the results to a local file. I don't believe there is a way to export natively.
http://apigee.com/docs/app-services/content/working-queries
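As a rough sketch of that cURL approach (the org, app, and collection names are placeholders, a valid access token is assumed, and Usergrid pages results with a cursor value returned in each response):
# fetch the first page of a collection and save it locally
curl "https://api.usergrid.com/my-org/my-app/users?access_token=$TOKEN&limit=100" -o users-0.json
# read the cursor field from that response, then fetch the next page
curl "https://api.usergrid.com/my-org/my-app/users?access_token=$TOKEN&limit=100&cursor=$CURSOR" -o users-1.json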
Since the 2014/2015 versions of Usergrid it is possible to make exports and imports using the "Usergrid tools".
This page explains how to install them:
https://github.com/apache/incubator-usergrid/tree/master/stack/tools
Basically, once you run
$ java -jar usergrid-tools.jar export
it will export your data as JSON files in an export directory.
There are several export and import tools available; the best way to see them is to visit this page:
https://github.com/apache/incubator-usergrid/tree/6d962b7fe1cd5b47896ca16c0d0b9a297df45a54/stack/tools/src/main/java/org/apache/usergrid/tools

Clone Oracle Express Edition 11g R2

I have installed Oracle XE 11g R2 on my machine. I ran a few scripts that do the setup for our application by creating schemas and procedures. Now I want to clone this database so that other people, using the cloned dbf files, can see the base schema on their respective machines and work on their individual requirements on top of it.
Right now it has six dbf files:
CONTROL.DBF
SYSAUX.DBF
SYSTEM.DBF
TEMP.DBF
UNDO.DBF
USER.DBF
Can I just give them the files, or do I need to create a server parameter file (SPFILE) or a control file as well? What about the redo logs?
I have very little knowledge of database administration, so please advise. I understand that this is not Enterprise Edition, so not everything may be supported, but I assume the cloning process is similar for XE.
While it is possible to restore a database using the data files, I strongly suspect that is not what you're really after. If you're not an experienced DBA, the number of possible issues you'll encounter trying to restore a backup on a different machine and then create an appropriate database instance is rather large.
More likely, what you really want to do is generate a full export of your database. The other people that need your application would then install Oracle and import the export that you generated.
The simplest possible approach would be to run, at a command line:
exp / as sysdba full=y file=myDump.dmp
You would then send myDump.dmp to the other users, who would import it into their own databases:
imp / as sysdba full=y file=myDump.dmp
This will only be a logical backup of your database. It will not include things like the parameters the database has been set to use, so other users may be configured to use more (or less) memory, to have a different file layout, or even to run a slightly different version of Oracle; but it does not sound like you need that degree of cloning. If you have a large amount of data, using the Data Pump versions of the export and import utilities would be more efficient. My guess, from the fact that you haven't even created a new tablespace, is that you don't have enough data for this to be a concern.
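For reference, the Data Pump equivalents would look roughly like this (a sketch only: the credentials are placeholders, and Data Pump writes to a server-side directory object such as DATA_PUMP_DIR rather than the client's working directory):
expdp system/password full=y directory=DATA_PUMP_DIR dumpfile=myDump.dmp
impdp system/password full=y directory=DATA_PUMP_DIR dumpfile=myDump.dmp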
For more information, consult the Oracle documentation on the export and import utilities.

Load sqlite database into Postgres

I have been developing locally for some time and am now pushing everything to production. Of course I was also adding data to the development server without thinking that I hadn't reconfigured it to be Postgres.
Now I have a SQLite DB whose information I need to be on a remote VPS, in a Postgres DB there.
I have tried dumping to a .sql file but am getting a lot of syntax complaints from Postgres. What's the best way to do this?
For pretty much any conversion between two databases the options are:
1. Do a schema-only dump from the source database. Hand-convert it and load it into the target database. Then do a data-only dump from the source DB in the most compatible form of SQL dump it offers. Try loading that into the target DB. When you hit problems, script transformations to the dump using sed/awk/perl/whatever and try again. Repeat until it loads and the results match.
2. Like (1), hand-convert the schema. Then write a script in your preferred language that connects to both databases, SELECTs from one, and INSERTs into the other, possibly with some transformations of data types and representations.
3. Use an ETL tool like Talend or Pentaho to connect to both databases and convert between them. ETL tools are like a "somebody else already wrote it" version of (2), but they can take some learning.
4. Hope that someone has already written a conversion tool. Heroku has one called sequel that will work for SQLite -> PostgreSQL, though it's unclear whether it's available outside Heroku and able to function without the rest of the Heroku infrastructure and code.
After any of those, some post-transfer steps, like using setval() to initialize sequences, are typically required.
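For instance, a sequence reset might look like this (the table and column names are just placeholders):
-- point the serial sequence at the current max id after the bulk load
SELECT setval(pg_get_serial_sequence('mytable', 'id'), (SELECT max(id) FROM mytable));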
Heroku's database conversion tool is called sequel. Here are the ruby gems you need:
gem install sequel
gem install sqlite3
gem install pg
Then this worked for me for a sqlite database file named 'tweets.db' in the current working directory:
sequel -C sqlite://tweets.db postgres://pgusername:pgpassword@localhost/pgdatabasename
PostgreSQL supports "foreign data wrappers", which allow you to directly access any data source through the DB, including SQLite, even up to automatically importing the schema. You can then use create table localtbl as (select * from remotetbl) to get your data into actual PG storage.
https://wiki.postgresql.org/wiki/Foreign_data_wrappers
https://github.com/pgspider/sqlite_fdw
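A minimal sketch with sqlite_fdw might look like the following (the server name, file path, and table names are assumptions; check the sqlite_fdw README for the exact options it supports):
CREATE EXTENSION sqlite_fdw;
CREATE SERVER sqlite_server FOREIGN DATA WRAPPER sqlite_fdw
    OPTIONS (database '/path/to/tweets.db');
IMPORT FOREIGN SCHEMA public FROM SERVER sqlite_server INTO public;
-- copy a foreign table into native PostgreSQL storage
CREATE TABLE tweets_local AS SELECT * FROM tweets;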

How can I convert my Access database (.accdb) to SQLite?

How can I convert my Access database (.accdb) to an SQLite database(.sqlite)?
Maybe you can use a several-step algorithm:
1. Export (convert) the Access table or query to an Excel file.
2. Save the Excel file as a CSV file.
3. Use any SQLite manager (for example, phpLiteAdmin) to import the data from the CSV file into an existing SQLite table.
Besides Android and iOS, which use SQLite, there are also web hosts that offer no database engine other than SQLite.
1) If you want to convert the structure of the DB, you should use a DB-modeling tool:
create a new model from the existing Access database
generate a SQL script for creating the SQLite database
use this script in your SQL helper
2) If you want to import data from an Access database into your Android app, I think you can do case #1, migrate all data from the Access database to a temporary SQLite database, save it to the assets folder, and copy it from assets to the internal SQLite database during the first app start (a sketch of that copy step follows).
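That copy step might look roughly like this in Java (a sketch only; the app.db file name and the class and method names are assumptions):
import java.io.*;
import android.content.Context;

public final class DbInstaller {
    // Copy the prebuilt SQLite file shipped in assets/ into the app's
    // database directory on first start.
    public static void copyDatabaseFromAssets(Context context) throws IOException {
        File target = context.getDatabasePath("app.db");
        target.getParentFile().mkdirs(); // the databases/ dir may not exist yet
        try (InputStream in = context.getAssets().open("app.db");
             OutputStream out = new FileOutputStream(target)) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) > 0) {
                out.write(buf, 0, n);
            }
        }
    }
}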

How to import a CSV file (already uploaded in blob storage) in Azure (.NET)

I want to import a CSV file (already uploaded to blob storage) in Azure.
For example, I have uploaded test.csv to blob storage; now I just want to import that test.csv file in .NET (Azure), and after importing I will insert that data into an Azure database. I am using C# .NET. Please suggest how I can achieve this. I want to follow the steps below:
Create a CSV file with all rows.
Upload it as a blob.
Parse it with a Worker role and insert it into the SQL Azure DB.
Thanks.
A bit more clarification around your question would be helpful. Are you trying to upload a file to Azure blob storage? Download it from there for your app to consume? What language(s) are you using?
There are plenty of examples of loading files into and pulling them from Azure blob storage using .NET, and at least a handful for doing it with Java or PHP.
If you can clarify what you're trying to do, I'd be happy to point you at the appropriate ones. :)
-- answer based on comment update --
The steps for retrieving the blob are fairly straightforward:
1) Create your Azure storage client using your Azure storage credentials.
add a using clause:
using Microsoft.WindowsAzure.StorageClient;
get a client for accessing blob storage:
CloudStorageAccount account = CloudStorageAccount.FromConfigurationSetting("<nameofyourconfigsetting>");
CloudBlobClient tmpClient = account.CreateCloudBlobClient();
get a reference to the blob you want to download:
CloudBlob myBlob = tmpClient.GetBlobReference("container/myblob.csv");
2) read the blob & save to a file
myBlob.DownloadToFile("<path>/myblob.csv");
The save location can be the %temp% location, or if it's a large file you may want to allocate some local storage space and put it there. The other thing to keep in mind is that if you are doing this in a role instance, you'll need measures in place to prevent two instances from concurrently trying to process the same file. If the file is small enough, you can probably even keep it as a memory stream and process it that way. If this is the case, you can use the DownloadToStream method of the CloudBlob object.
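Putting those pieces together, a rough sketch might look like this (the connection-setting name and container path are assumptions, and the naive comma split won't handle quoted CSV fields):
using System.IO;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

// download the blob into memory and walk it line by line
CloudStorageAccount account = CloudStorageAccount.FromConfigurationSetting("StorageConnectionString");
CloudBlobClient client = account.CreateCloudBlobClient();
CloudBlob blob = client.GetBlobReference("container/test.csv");
using (var stream = new MemoryStream())
{
    blob.DownloadToStream(stream);
    stream.Position = 0;
    using (var reader = new StreamReader(stream))
    {
        string line;
        while ((line = reader.ReadLine()) != null)
        {
            string[] fields = line.Split(','); // naive split; use a real CSV parser for quoted fields
            // insert `fields` into the SQL Azure database here
        }
    }
}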
For additional reading, I'd recommend checking out the MSDN library for the details of the StorageClient namespace and the CloudBlob class. Additionally, the Windows Azure Platform Training Kit has some good labs to help you get a better understanding of how Azure Storage works.
