Read WordPress user data into a local MS Access database

I'm an MS Access developer who has a client with a WordPress website. I need a way of reading any new users who are added to the WordPress site into a Microsoft Access database that resides on the user's local machine.
I only need to add any new records and don't need to write anything back to the site.
What would be the easiest way of doing something like this? I would sooner do it from the Access side as I'm not very familiar with WordPress. I would also prefer it to be automated, so the user doesn't have to manually export from WordPress and then import the data every time.
Any advice appreciated.
Thanks
Justin

There's no way to have WordPress connect to an Access DB because Access isn't persistent; it's on demand. The only way I can imagine handling it would depend on Access more than anything. Assuming Access can use the ODBC connectors installed on the Windows system it's running on, you could do the following:
Set up the MySQL ODBC Connector on any systems which would run the Access DB you're setting up. I assume the WordPress install is using MySQL.
Set up a new ODBC connection using that connector within Access to the WordPress
database (you can use the same connection information stored in the
WordPress root directory in a file called wp-config.php).
If you're connecting successfully, you can then read the wp_users table using properly formatted SQL commands (select * from wp_users, for instance).
The function within Access that manages this can either be scheduled to run periodically while Access is open or just when Access is initialized, depending on your needs.
You'll have to compare it to a local table of users to find differences if you're interested in all changes, though user_login is static through normal channels so it's a good key. There's also a "user_registered" date/time column in wp_users, so you could just look for users who registered since your last update to the local Access table.
I'm not familiar with Access beyond a cursory understanding of it as a data source and some minor development functions, so there may be a much easier way to do it, but this is how I'd do it in any system that needed the user information from WordPress.
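To make that last step concrete: inside Access you'd most likely link wp_users as an ODBC linked table (or use a pass-through query) and run an append query against your local table. Since none of that code is shown above, here is a rough sketch of the same sync logic written in C# against the MySQL ODBC driver; the driver name, connection details, and the idea of keeping a "last sync" timestamp locally are assumptions for the sketch, not anything from the original answer.

    // Sketch only: pull users registered since the last sync from the WordPress database via ODBC.
    using System;
    using System.Data.Odbc;

    class WpUserSync
    {
        static void Main()
        {
            // Connection details come from wp-config.php (DB_HOST, DB_NAME, DB_USER, DB_PASSWORD);
            // the driver name depends on which MySQL ODBC connector you installed.
            string connStr = "Driver={MySQL ODBC 5.3 Unicode Driver};" +
                             "Server=example.com;Database=wordpressdb;Uid=wpuser;Pwd=secret;";

            // Normally you'd read this from the local Access table (e.g. MAX(user_registered)).
            DateTime lastSync = new DateTime(2013, 1, 1);

            using (var conn = new OdbcConnection(connStr))
            using (var cmd = new OdbcCommand(
                "SELECT ID, user_login, user_email, user_registered " +
                "FROM wp_users WHERE user_registered > ? ORDER BY user_registered", conn))
            {
                cmd.Parameters.AddWithValue("@lastSync", lastSync);
                conn.Open();
                using (OdbcDataReader rdr = cmd.ExecuteReader())
                {
                    while (rdr.Read())
                    {
                        // Here you'd append the row to the local Access table instead of printing it.
                        Console.WriteLine("{0}\t{1}\t{2}",
                            rdr["user_login"], rdr["user_email"], rdr["user_registered"]);
                    }
                }
            }
        }
    }

The same SELECT, with the placeholder replaced by your stored timestamp, is essentially what an Access pass-through or append query would send to MySQL.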

Because WordPress and MS Access live in very different environments, a trick is needed to "marry" the two. Here is another approach we can use:
Use IIS as a web server. To avoid colliding with the port used by WordPress, don't use port 80; any other port will do.
Use ASP to access the Access database (there is a block of ASP code that opens a connection to the Access file, creates a recordset over that connection, and writes the records out to the browser). Call the file recordshower.asp.
The recordshower.asp file must be reachable from the browser.
After that, go back to a page in WordPress and insert <iframe src="recordshower.asp" width="..."></iframe>
So a WordPress page can display the contents of an Access table, with the help of ASP.
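No ASP code is shown above, so as a rough illustration here is the same "open the Access file, loop over a recordset, write it to the browser" pattern, written as an ASP.NET generic handler in C# (the stack used elsewhere on this page) rather than classic ASP; the file path, provider string, and table name are made up for the example.

    <%@ WebHandler Language="C#" Class="RecordShower" %>
    // recordshower.ashx - rough ASP.NET stand-in for the recordshower.asp idea described above.
    using System.Web;
    using System.Data.OleDb;

    public class RecordShower : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            context.Response.ContentType = "text/html";
            string connStr = @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\data\clients.accdb;";
            using (var conn = new OleDbConnection(connStr))
            using (var cmd = new OleDbCommand("SELECT * FROM tblUsers", conn))
            {
                conn.Open();
                using (OleDbDataReader rdr = cmd.ExecuteReader())
                {
                    context.Response.Write("<table>");
                    while (rdr.Read())
                    {
                        context.Response.Write("<tr>");
                        for (int i = 0; i < rdr.FieldCount; i++)
                            context.Response.Write("<td>" + HttpUtility.HtmlEncode(rdr[i].ToString()) + "</td>");
                        context.Response.Write("</tr>");
                    }
                    context.Response.Write("</table>");
                }
            }
        }

        public bool IsReusable { get { return false; } }
    }

The iframe would then point at recordshower.ashx instead of recordshower.asp.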

I export the WordPress entries to a .CSV, then run an Excel macro, stored in my personal workbook, to open the file from my downloads directory, convert the .CSV to .XLS, and write it to an XLS file in a known location. I then push a button in my MS Access program to read the entries from the .XLS file (which is externally linked to MS Access as an Excel spreadsheet) and update my MS Access tables with the data.
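As a variation on this (not what the workflow above does), you could skip the Excel step and have the import read the downloaded .CSV directly. A minimal C# sketch, with made-up file paths, table and column names, and a deliberately naive comma split:

    using System;
    using System.Data.OleDb;
    using System.IO;
    using System.Linq;

    class CsvToAccess
    {
        static void Main()
        {
            string csvPath = @"C:\Users\me\Downloads\wordpress-users.csv";
            string accessConn = @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\data\clients.accdb;";

            using (var conn = new OleDbConnection(accessConn))
            {
                conn.Open();
                foreach (string line in File.ReadLines(csvPath).Skip(1))   // skip the header row
                {
                    // Naive split; a real CSV with quoted commas needs a proper parser.
                    string[] parts = line.Split(',');
                    using (var cmd = new OleDbCommand(
                        "INSERT INTO tblUsers (UserLogin, UserEmail) VALUES (?, ?)", conn))
                    {
                        cmd.Parameters.AddWithValue("@p1", parts[0]);
                        cmd.Parameters.AddWithValue("@p2", parts[1]);
                        cmd.ExecuteNonQuery();   // a real version would skip logins that already exist
                    }
                }
            }
        }
    }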

Related

Can I hide .laccdb files or change their name with MS Access settings?

Title says it simply.
I have an MS Access database on a shared drive, and the majority of users aren't experts, so quite often they leave their PC with the database open and the PC then locks. Someone else will come along, switch it to their account, go to open the database, and get confused by the two files with the same name.
I can think of solutions for this, e.g. using shortcuts so they don't actually see the .laccdb or .accdb file.
But what I want to know is if there are any settings in Access (2010) that can make the .laccdb file hidden when it is created, or just give it a random name like Word or Excel temp files?
When I Google this, the results are more about removing people from the database so you can delete the .laccdb file.
This is actually a multi-user setup even though just one user can be active.
So you need to distribute the frontend to each user while having the backend in a folder with access for all users - that could be a subfolder of C:\Users\Public.
Here's a script that will handle the distribution:
Deploy and update a Microsoft Access application in a Citrix environment

Saving images from a website and accessing them from another website

I have created a website, let's name it a.com. On this website (a.com) the users can upload an image that gets saved to a folder on a remote server, and the path is saved to the database.
There is a second website (b.com), hosted on the same server, where the image needs to be retrieved.
Can we do this? If yes, then please suggest a solution.
Language Used = Asp.net 4.0 C#
Backend = SQL2008
It can be done, if both websites can access the remote folder. You need to use something like System.Net.WebClient to download the image from the remote server.
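For example, a minimal sketch of the WebClient approach; the URL (built from the path stored in the shared database) and the local destination are placeholders:

    using System.Net;

    class ImageFetcher
    {
        static void Main()
        {
            string imageUrl = "http://a.com/uploads/image.jpg";             // path stored in the database
            string localPath = @"C:\inetpub\wwwroot\bcom\cache\image.jpg";  // where b.com wants its copy

            using (var client = new WebClient())
            {
                // Downloads the remote file and writes it to disk on b.com's side.
                client.DownloadFile(imageUrl, localPath);
            }
        }
    }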
Since you're using SQL and .NET, I'm going to assume you're using IIS as well.
Creating a virtual directory is probably your best bet: easy to set up, easy to manage, and extremely handy to understand (if not vital).
http://www.iis.net/learn/get-started/planning-your-iis-architecture/understanding-sites-applications-and-virtual-directories-on-iis
This is assuming you want to display the image on b.com. If you simply want to retrieve it, you can do so by retrieving the path and using the image in any other way; you may need to set permissions on the image folder.
First of all, both sites should use the same database to share information about the images.
If I understand correctly, b.com and the files are on the same server. If so, put the folder's virtual path in configuration.
After that, check the database for new records. If you find one, parse the name of the file and find it on the server.
As an example, you get an image on a.com and insert it into the database as
name="image.jpg"
b.com checks the database for new records and finds image.jpg.
b.com reads its configuration and finds
path="http://c.com/"
and combines the path and the image name:
fulllink="http://c.com/image.jpg"
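If it helps, the "combine path and image name" step might look like this in C#; the appSettings key name is an assumption:

    using System;
    using System.Configuration;

    class ImageLinkBuilder
    {
        // Builds the full link from the configured base path and the file name read from the database.
        static string BuildLink(string imageName)
        {
            // web.config: <appSettings><add key="ImageBasePath" value="http://c.com/" /></appSettings>
            string basePath = ConfigurationManager.AppSettings["ImageBasePath"];
            return new Uri(new Uri(basePath), imageName).ToString();   // e.g. http://c.com/image.jpg
        }

        static void Main()
        {
            Console.WriteLine(BuildLink("image.jpg"));
        }
    }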

In-app downloadable content

I'm trying to implement a kind of private 'cloud store' for my web application. Let me explain: I have 'reports' (a file containing queries etc.) which can be installed on a client PC. Normally, we e-mail the files to end users whenever we create new ones and they manually import them into the app (using standard file upload in any browser).
Now we want to take it a step further and create a page which will pull a list of files from our site, e.g. www.me.com/reports. The app will go through the list, compare it to those installed, and display new ones, updated ones, etc. An end user could then just click a button and the files are downloaded to the server and installed.
I'm trying to avoid writing any web server code; I'd prefer to just create a Windows-authenticated virtual directory that allows file listing (or something close to this). I'm thinking maybe some JavaScript that will silently download the file to the client, then upload it back to the intranet IIS server. All done without user interaction. Is this even possible?
I'd like to get anyone's thoughts on how something like this could be implemented, and what pitfalls I should watch out for.
Thanks
JK

Creating a MAMP Local Copy of a Drupal 6 Website

We're currently rebranding a client of ours and the time has come to bring the new brand to their website.
I don't have much experience with Drupal other than theming (I've themed a Drupal website in the past but I'm not very familiar with the software's inner workings).
As this website is live, it's obviously not feasible for me to make any changes to the live environment, so I have downloaded the source files of the website to a local webserver (MAMP).
I also have a MySQL dump of the database.
I'm not sure what files need to be changed inside Drupal to allow access to the MAMP webserver. Could somebody point me in the right direction here?
How would I connect the database to the website, which files need modification?
I think the client is running Drupal 6.
Update:
I've installed the database and linked it up using the below line:
$db_url = 'mysql://root#localhost/databasename';
I've hidden databasename for anonymity.
As it's MAMP, the database has no password. When I load up the website I get an error that install.php is not found. It's not there because the website is already 'installed'.
I've also updated the $base_url to read:
$base_url = 'http://localhost:8888/foldername';
You only need to modify one file, 'sites/default/settings.php'; you'll just need to change the database connection string in there to match your new database settings. There may be a couple of other settings in there you need to tweak depending on the setup of the site (for example the $base_url or $cookie_domain).
Other than that, everything in your installation should use relative paths, so there shouldn't be any need to make more changes.
I was facing the same problem; after a couple of hours of trying I found a solution: you have to check the DB (tick it in the list of databases in localhost). [Note the warning shown after the DB list: "Enabling the database statistics here might cause heavy traffic between the web server and the MySQL server", so enable only the DB you want to use.] This will redirect to http://localhost:8888/foldername/install.php successfully.

ASP.NET Image Upload Architecture

What would be the best method to implement the following scenario:
The web site calls for an image gallery that has both private and public images to be stored. I've heard that you can either store them in a file hierarchy or in a database. In a file-hierarchy setup, how would you prevent direct access to the image? In a database setup, access to the images would only be possible via the web page view. What would be an effective solution to pursue?
[Edit] Thanks all for the responses. I decided that the database route is the best option for this application, since I do not have direct access to the server and am confined to a web-root folder. All the responses were most appreciated.
Having used both methods I'd say go with the database. If you store them on the filestore and they need protecting then you'd have to store them outside the web root and use a handler (like John mentions) to retrieve them anyway. It's as easy to write a handler to stream them direct from the database, and you get a few advantages:
With a database you don't need to worry about filestore permissions or generating unique filenames or folder hierarchies etc.
With a database you can easily apply permissions and protection directly - no trying to work out who can view what based on paths etc.
With a database you can store the image and metadata all together - when you delete the metadata you delete the image - no possibility of orphaned records where you delete from the database but not from the filestore
Easier to back up the database and images and then restore
The disadvantage is performance, but you can use caching etc. to help with that. You can also use FILESTREAM storage in SQL Server 2008 (it isn't available in 2005), which means you get filesystem performance but via the DB:
"FILESTREAM integrates the SQL Server
Database Engine with an NTFS file
system by storing varbinary(max)
binary large object (BLOB) data as
files on the file system. Transact-SQL
statements can insert, update, query,
search, and back up FILESTREAM data.
Win32 file system interfaces provide
streaming access to the data.
FILESTREAM uses the NT system cache
for caching file data. This helps
reduce any effect that FILESTREAM data
might have on Database Engine
performance. The SQL Server buffer
pool is not used; therefore, this
memory is available for query
processing."
Using a file hierarchy, you can put the files outside the website folder. For example, suppose the web folder is c:/inetpub/wwwroot/somesite; put the files under c:/images/, so that web users won't be able to access the image files directly. But then you cannot use a direct link in your website either; you need to create some procedure to read the file and return the stream.
Personally I think it's better to put the file in the database, and still create some procedure to retrieve the binary image data and return it wherever it's needed.
In reality both scenarios are very similar, so it's up to you... Databases weren't designed to serve files, but if the size isn't really a concern for you, I don't see a problem with doing it.
To answer your question about direct access, you'd set up the file-based images the same way you would for the database: you'd use some sort of page (probably a .ashx handler) that serves the images, allowing you a layer of logic between the user and the image to determine whether or not they should have access to it. The actual directory the images are located in would then need to either a) not be part of the directory structure in IIS or b) if it is part of IIS, only allow Windows-authenticated access, and only allow the account the application process is running under access to the directory.
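As a rough sketch of that handler idea (the folder location, content type, and access check are assumptions; substitute whatever logic decides who may see which image):

    <%@ WebHandler Language="C#" Class="ProtectedImage" %>
    // protectedimage.ashx - serves image files kept outside the IIS directory structure.
    using System.IO;
    using System.Web;

    public class ProtectedImage : IHttpHandler
    {
        private const string ImageFolder = @"D:\PrivateImages";   // not under the site root

        public void ProcessRequest(HttpContext context)
        {
            // Strip any directory components so the caller can't walk the file system.
            string name = Path.GetFileName(context.Request.QueryString["name"] ?? "");

            // The "layer of logic between the user and image" - replace with the gallery's real rules.
            if (!context.User.Identity.IsAuthenticated || name.Length == 0)
            {
                context.Response.StatusCode = 403;
                return;
            }

            string fullPath = Path.Combine(ImageFolder, name);
            if (!File.Exists(fullPath))
            {
                context.Response.StatusCode = 404;
                return;
            }

            context.Response.ContentType = "image/jpeg";   // or infer from the file extension
            context.Response.WriteFile(fullPath);
        }

        public bool IsReusable { get { return false; } }
    }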
If you're using IIS 7, since .NET jumps into the pipeline early, I believe you can protect .jpg files as well, just by using a role manager and applying roles to file system folders. If you're using IIS 6, I've done something similar to the answer by John, where I store the actual file outside of the wwwroot and use a handler to decide if the user has the correct credentials to view the image.
I would avoid the database unless you have a strong reason to do this - and I don't think a photo gallery is one of them.
Neither. Amazon S3 offers a very simple API for accepting uploads. You can use SimpleDB or your SQL database to track the URLs and permissions. Set the entire S3 bucket to private, and authenticate to it using your AWS key on the ASP.NET server.
Very little code is required to upload to S3, and very little more would be required to perform the bookkeeping in SQL.
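For a sense of scale, a minimal upload sketch with the AWS SDK for .NET; the bucket name, key, and region are placeholders, credentials come from your AWS configuration, and the SQL bookkeeping row (key, owner, public/private flag) would be inserted right after the PutObject call:

    using Amazon;
    using Amazon.S3;
    using Amazon.S3.Model;

    class S3Uploader
    {
        static void Upload(string localFile, string key)
        {
            using (var client = new AmazonS3Client(RegionEndpoint.USEast1))
            {
                var request = new PutObjectRequest
                {
                    BucketName = "my-private-gallery",      // the bucket itself stays private
                    Key = key,                              // e.g. "users/42/photo123.jpg"
                    FilePath = localFile,
                    CannedACL = S3CannedACL.Private
                };
                client.PutObject(request);
            }
        }
    }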
Once they're in S3, grab the image resizer library and the S3 Reader plugin and you can have your entire system running in under an hour. And - it will scale properly. No disk or database space limits. Ever.
You can implement authorization using the AuthorizeImage event of the Image Resizer library. Just throw an AccessDeniedException if access isn't allowed for the current user.
If you want to tune performance a bit more, add both the DiskCache and CloudFront plugins. CloudFront can edge-cache the public images (inexpensively), and DiskCache will handle the private images, serving them at static-file speeds.
