How can I read a website's data automatically at a specific time? For example, I want my site to be able to read any newspaper's articles automatically each morning. In other words, I want to read another website's data each day, automatically.
Have you done enough brainstorming on what exactly you need to implement? I think you need this:
Run code at specific time c#
Now google it. Where do the first three links point to?
Once you have that, you need to look into what design best suits you. You can put this timer logic in your server-side code, or create a Windows service which pulls the news feed.
Actually, you can implement this without a timer by creating a scheduled task in Windows.
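For example, here is a minimal sketch of the scheduled-task route: a small console program that downloads one page and saves it to disk, which you would register with Windows Task Scheduler to run each morning. The URL and output folder are placeholders, not real endpoints.

```csharp
// Minimal sketch: a console program that downloads one page and saves it to disk.
// Register the compiled .exe with Windows Task Scheduler to run every morning.
// The URL and output folder below are placeholders.
using System;
using System.IO;
using System.Net;

class NewsFetcher
{
    static void Main()
    {
        string url = "http://www.example-newspaper.com/frontpage";
        string outputFolder = @"C:\NewsDumps";

        Directory.CreateDirectory(outputFolder);

        using (var client = new WebClient())
        {
            // Download the raw HTML; parsing out the articles is a separate step
            // (e.g. with an HTML parser such as HtmlAgilityPack).
            string html = client.DownloadString(url);

            string fileName = Path.Combine(
                outputFolder,
                "frontpage-" + DateTime.Now.ToString("yyyy-MM-dd") + ".html");

            File.WriteAllText(fileName, html);
        }
    }
}
```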
I've been working with an ASP Classic site with over 500 files, some of which aren't used and some of which are; along with a database with hundreds of procs and functions and tables, in the same shape.
I need a way to get a grip on the project so I can eventually migrate it. I don't have time yet to walk through every single page and look at the SQL (stored procedures are in the database and are called properly within the ASP pages), so I'm at a loss as to how to get a handle on this.
My immediate thought is to make ASP classes and put them into the pages as I go - they'd pretty much be used for getting and setting fields, validation, and sending recordsets into display functions.
Is this a reasonable approach? Am I missing some strategy?
How would you approach this? A migration to another platform is being considered at this point, but it isn't feasible in the short term (the next couple of months).
You can try to compile the project using http://aspclassiccompiler.codeplex.com/, or you can migrate to ASP.NET MVC one page at a time (as needed), using a mix of both in the meantime.
My simple advice is: stop thinking about code. Spend more time with the UI, actually using it, and spend time examining the database schema in detail.
Edit
If you are trying to determine which pages are active, use IIS logging to harvest the distinct pages hit. Also do some scripting to collect the names of the files and then text-search the files in the site, looking for any occurrence of those names. This information should identify parts of the site that are rarely used or dead.
However, in all probability there will be considerable content in the "active" files which is also dead. Let me reiterate: do not add classes or refactor the code at this stage; concentrate on understanding what it does, not how it does it. Understanding the DB schema is a vital step, and then understanding which UI interactions bring about specific changes in the DB.
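As a rough illustration of the scripting step (the text-search half, not the IIS log harvesting), a sketch like the following could flag files that no other file in the site appears to reference. The site root path is an assumption, and includes or redirects built from dynamic strings will still need manual checking.

```csharp
// Rough sketch: flag .asp files that no other file in the site appears to reference.
// The site root path is a placeholder; dynamically built includes won't be caught.
using System;
using System.IO;
using System.Linq;

class DeadFileScan
{
    static void Main()
    {
        string siteRoot = @"C:\inetpub\wwwroot\legacy-site";

        string[] aspFiles = Directory.GetFiles(siteRoot, "*.asp", SearchOption.AllDirectories);

        // Read every file once so we can search all content in memory.
        var contents = aspFiles.ToDictionary(f => f, f => File.ReadAllText(f));

        foreach (string candidate in aspFiles)
        {
            string name = Path.GetFileName(candidate);

            bool referenced = contents.Any(kv =>
                kv.Key != candidate &&
                kv.Value.IndexOf(name, StringComparison.OrdinalIgnoreCase) >= 0);

            if (!referenced)
                Console.WriteLine("Possibly dead: " + candidate);
        }
    }
}
```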
I'm looking for a portlet-like solution that would collect and report usage analytics in Liferay... but Google Analytics is not an option, unfortunately.
Stats by community and group, session tracking, on top of the usual bounce and exit rates, referrals, origin, etc. I know I'm kind of asking for the wheel to be reinvented, but there is plenty of usage data that Liferay can collect and Google can't. I've already checked Piwik, and it looks very impressive.
Any suggestions? TIA,
As of 2015 there is the Audience Targeting plugin, which (at least for Liferay 6.2) comes bundled with the analytics-api / analytics-hook modules, which collect some useful analytics data. Keep in mind:
So far it doesn't look like there is any standalone use for them, as they were introduced, I believe, to enable the content-visited, page-visited and other such rules in Audience Targeting itself; you can't see the raw events in any of the provided portlets.
The events are stored as rows in a SQL database, so I would be concerned about its performance in the long run (with thousands of clicks every minute, etc.), although I say this purely theoretically, as I haven't done any tests myself nor checked whether any performance-enhancing measures are implemented.
What you can do, however, is put together your own portlet which creates some graphs etc. based on the data stored in the CT_Analytics_AnalyticsEvent table.
Right now, I don't think there is any out-of-the-box feature available for this; you might need to create it yourself. There are two options:
1) Create a JavaScript library if you need real-time web analysis (this is the same idea as creating a Google Analytics-style tracking library).
2) This option is quite easy. Liferay stores everything in the database, so you can have a report portlet which shows a report based on that data. We did this for one project where we were tracking session IDs/IPs and logged-in user details for portlets.
To achieve option 2), you can create a new Liferay service which will be used to store and retrieve this data.
Hope this helps
You already mention Piwik, which is similar to Google Analytics. You probably have your own theme (almost everybody changes the appearance to look like their own site), and it's quite appropriate to place the relevant Piwik stats snippet in there.
You can also, as Felix suggests, mine your log files. Liferay stores some data, and your web server access logs are also well worth mining. And, of course, you can change your theme to log even more for every page access; just take care that you don't create a performance bottleneck by writing too much during one page request.
So, coming back to your question: built in like Google Analytics: no. Easily integratable (like Piwik): yes, of course. Completely customizable: yes, of course.
Edit: It just so happens that David has created and documented an integration that makes using Piwik even easier.
I've been using this site for quite a while, usually being able to sort out my questions by browsing through the questions and following tags. However, I've recently come across a question that is rather hard to look up amongst the great number of questions asked - a question I hope some of you might be able to share your opinion on.
As my problem is a bit hard to fit into the single line that goes in the title, I'll try to give a bit more detail on the problem I've encountered. So, as the title says, I need to filter, or limit, some of the response data my standard ASP.NET SOAP-based web service returns when invoking various web methods. The web service is used to return data used by other systems (a data repository, more or less), where the client today is able to specify a few parameters on how the data should be filtered and in return gets a full set of data back.
Well, easy enough, I thought: just put additional filtering options on the existing web methods which need a bit more filtering applied, make adjustments on the server side and we are all set to go - well, unfortunately it turned out to be a bit trickier than that.
The problem I am facing is that I'm working on a web service running in a production environment, which needs to be extended so that additional filters can be applied to the existing web methods being invoked without affecting the calls already being made by the customer's other systems through their existing client stubs. This is where I am a bit troubled, since I can't seem to find a "right solution" for extending the current web service.
Today, the filter is sent as a custom data structure which holds information on how the data should be filtered, but I am not sure whether I can simply add more information to this data structure without breaking code on the clients. One of my co-workers suggested a solution where I would extend the web.config on the server side to hold a section with details on which data should be excluded (filtered out), but I don't find this to be a viable solution in the long run - and I don't trust customers with such an option, since it is likely to go wrong at some point. So the solution I am looking for is a way to apply a "second filter" to the data being requested, so that instead of getting a full set of data back the client only gets a fraction; it should be implemented so that the filter can be easily modified, and it must not affect the current client calls.
Any suggestions on how I should approach this problem?
Thanks!
Kind regards,
E.
A pretty common practice is to create another instance of the application, OR use part of the URL to signify the version of the endpoint the clients are connecting to - perhaps the virtual directory is the date. That way old calls will go to the old API and new calls will come in on the new API.
http://api.example.com/dostuff
vs
http://api.example.com/6-7-2011/dostuff
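Here is one rough sketch of the dated-URL idea using System.Web.Routing; the handler names and the {version} token are hypothetical, and with plain ASMX services you could just as well deploy a second copy of the service under a dated virtual directory instead.

```csharp
// Sketch of date-versioned URLs with System.Web.Routing.
// Handler names and the {version} token are illustrative only.
using System.Web;
using System.Web.Routing;

public class DoStuffRouteHandler : IRouteHandler
{
    public IHttpHandler GetHttpHandler(RequestContext requestContext)
    {
        // Null for the legacy route, a date string such as "6-7-2011" for versioned calls.
        string version = requestContext.RouteData.Values["version"] as string;
        return new DoStuffHandler(version);
    }
}

public class DoStuffHandler : IHttpHandler
{
    private readonly string _version;
    public DoStuffHandler(string version) { _version = version; }

    public bool IsReusable { get { return false; } }

    public void ProcessRequest(HttpContext context)
    {
        if (_version == null)
        {
            // Legacy behaviour: return the full data set exactly as today.
        }
        else
        {
            // New behaviour: apply the extra filtering for clients that opted in
            // by calling the dated URL.
        }
    }
}

public static class RouteConfig
{
    // Call this from Application_Start in Global.asax.
    public static void RegisterRoutes(RouteCollection routes)
    {
        routes.Add(new Route("dostuff", new DoStuffRouteHandler()));           // old clients
        routes.Add(new Route("{version}/dostuff", new DoStuffRouteHandler())); // e.g. /6-7-2011/dostuff
    }
}
```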
I've asked this before, but I was hoping for another answer and perhaps some code samples, because I've been having a difficult time with this. I have an ASP.NET page. The user hits the "Run" button and code IN AN ASSEMBLY, not in the APP_CODE folder, is called and runs a long process that moves product info from a file into the database. While the user waits, I would like them to see status updates, like what product the import process is on and other status info. I'm assuming I'd break off into another thread and use Ajax, but I have no idea how to do this. Some code samples would be very helpful, thanks.
A simpler way to do this without needing to go into multithreading (which can cause all sorts of nasty, hard-to-track-down bugs) is to use AsyncResults in .NET together with AJAX, which allows you to query a running process.
A good example to start you off can be found here.
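The AsyncResult pattern itself takes a bit more plumbing; as a simpler illustration of the general idea (kick off the work, then query its progress from the browser), here is a rough polling sketch. All class and method names are hypothetical.

```csharp
// Rough sketch of a polling approach: the import runs on a ThreadPool thread,
// writes its progress into shared state, and a lightweight handler returns
// that progress to an AJAX poll. All names are hypothetical.
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Web;

public static class ImportJobs
{
    // Job id -> latest status message. Shared across requests because it's static.
    public static readonly ConcurrentDictionary<Guid, string> Status =
        new ConcurrentDictionary<Guid, string>();

    public static Guid Start()
    {
        Guid jobId = Guid.NewGuid();
        Status[jobId] = "Starting...";

        ThreadPool.QueueUserWorkItem(_ =>
        {
            // Call into the long-running import assembly here, updating
            // Status[jobId] as each product is processed, e.g.:
            // Status[jobId] = "Importing product " + productCode;
            Status[jobId] = "Done";
        });

        return jobId;
    }
}

// Expose this handler (e.g. as /ImportStatus.ashx) and poll it from JavaScript
// every couple of seconds, displaying the returned text to the user.
public class ImportStatusHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        Guid jobId = new Guid(context.Request.QueryString["id"]);
        string status;
        context.Response.ContentType = "text/plain";
        context.Response.Write(
            ImportJobs.Status.TryGetValue(jobId, out status) ? status : "Unknown job");
    }
}
```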
Found it: by using HttpResponse.Flush.
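For reference, the HttpResponse.Flush approach can be as simple as writing a status line and flushing it inside the import loop, so the browser receives the page incrementally. A minimal sketch; the Product type and the import methods stand in for the real calls into the external assembly.

```csharp
// Minimal sketch of streaming status with HttpResponse.Flush from a Web Forms code-behind.
// Product, LoadProductsFromFile and ImportProduct are placeholders for the real import assembly.
using System;
using System.Collections.Generic;
using System.Web.UI;

public partial class RunImportPage : Page
{
    protected void RunButton_Click(object sender, EventArgs e)
    {
        Response.BufferOutput = false;   // don't hold the whole response back until the end

        foreach (Product product in LoadProductsFromFile())
        {
            ImportProduct(product);      // the long-running work in the external assembly

            Response.Write("Imported " + product.Name + "<br />");
            Response.Flush();            // push what we have so far to the browser
        }

        Response.Write("Import complete.");
    }

    // Placeholders for the real import assembly:
    private IEnumerable<Product> LoadProductsFromFile() { yield break; }
    private void ImportProduct(Product product) { }
}

public class Product { public string Name { get; set; } }
```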
Due to the lack of response to my original question, probably owing to poor wording on my part, I have thought about it since and decided to reword it, hopefully for the better! :)
We create custom business software for our customers, and quite often they want attachments to be added to certain business entities. For example, they want to attach a Word document to a customer, or an image to a job. I'm curious as to how others are handling the following:
How does the user attach documents? Single attachment? Batch attachment?
How do you display the attached documents? Simple list? Detailed list?
And the killer question: how does the user then edit attached documents? Is this even possible in a web environment? Granted, the user can always just view the attachment.
Is there a good control library to help manage this process?
Our current development environment is ASP.NET and C#, but I think this is a pretty tool-agnostic question, save for the fact that I need to work in a web environment.
It seems we always run into problems with customers when working with attachments in a web environment, so I am looking for successes that other programmers have had with their user base on how best to interact with attachments.
Start with one file upload control ("Browse button"), and use JavaScript to dynamically add more upload controls if they want to attach multiple files in a single batch.
Display them in a simple list format (Filename, type, size, date), but provide full details somewhere else if they want them.
If they want to edit the files, they have to download them and then re-upload them. Hence, you need a way for them to say "this attachment overrides that old attachment" (a rough server-side sketch of this follows below).
I'm not familiar with C# and ASP.NET, so I can't recommend any libraries that will help.
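For the upload-and-overwrite flow, a rough ASP.NET sketch might look like this; the attachments folder, control names and the overwrite-by-filename convention are all assumptions, not a recommendation for production as-is.

```csharp
// Rough sketch: save each posted file, overwriting an older attachment of the same name.
// The attachments folder and the "overwrite by filename" convention are assumptions.
using System;
using System.IO;
using System.Web;
using System.Web.UI;

public partial class AttachmentsPage : Page
{
    private const string AttachmentRoot = @"C:\AppData\Attachments";

    protected void UploadButton_Click(object sender, EventArgs e)
    {
        // Request.Files picks up every file input on the form, including the
        // ones added dynamically with JavaScript for batch uploads.
        for (int i = 0; i < Request.Files.Count; i++)
        {
            HttpPostedFile posted = Request.Files[i];
            if (posted == null || posted.ContentLength == 0)
                continue;

            string safeName = Path.GetFileName(posted.FileName);
            string target = Path.Combine(AttachmentRoot, safeName);

            // Saving over an existing path is how "this attachment overrides that
            // old attachment" is handled in this sketch; a real system would
            // probably keep versions instead of silently overwriting.
            posted.SaveAs(target);
        }
    }
}
```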
http://developer.yahoo.com/yui/uploader/