How to configure IIS7 Advanced Logging rollover by local time? - iis-7

In the options I set a daily schedule (see picture http://gyazo.com/05843b6e29d51f21ee7ed2d6aa4d5157), but IIS creates the new log file using UTC time, not local time.

Advanced Logging doesn't support that feature; the following page documents that the filename is always in UTC format:
http://technet.microsoft.com/en-us/library/ee791730%28v=ws.10%29.aspx
I don't know if it's useful for you, but normal and Advanced Logging can be enabled at the same time. For regular IIS logging you have the "Use local time for file naming and rollover" option available.
Otherwise the only option I can think of would be to log hourly and write a scheduled script to read the files and convert the filenames and data to local time.
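For illustration, something like the following (run from a scheduled task) could rename the hourly files to local time. The u_ex<yyMMddHH>.log naming pattern and the log directory below are assumptions, so check what Advanced Logging actually produces on your server; also note this only renames the files, it doesn't rewrite the timestamps inside them.
using System;
using System.Globalization;
using System.IO;

class RenameLogsToLocalTime
{
    static void Main()
    {
        // Assumption: hourly Advanced Logging files in a single folder.
        var logDir = @"C:\inetpub\logs\AdvancedLogs";

        foreach (var path in Directory.GetFiles(logDir, "u_ex*.log"))
        {
            var name = Path.GetFileNameWithoutExtension(path);   // e.g. "u_ex14032615"
            var stamp = name.Substring(4);                        // "14032615" (yyMMddHH, UTC)

            DateTime utc;
            if (!DateTime.TryParseExact(stamp, "yyMMddHH", CultureInfo.InvariantCulture,
                                        DateTimeStyles.AssumeUniversal | DateTimeStyles.AdjustToUniversal,
                                        out utc))
                continue;   // skip anything that doesn't match the assumed pattern

            // Rename using the equivalent local-time stamp.
            var local = utc.ToLocalTime();
            var newPath = Path.Combine(logDir, "u_ex" + local.ToString("yyMMddHH") + "_local.log");
            if (!File.Exists(newPath))
                File.Move(path, newPath);
        }
    }
}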

Related

Is there any Logviewer application available to see 4GL logs? -PROGRESS 4GL

For me it's quite hard to view the 4GL log file. Is there any log viewer application available?
The logfiles are plain text; any text viewer will do.
Which is best depends on your OS and personal preferences.
On Linux I'd use less, or vi for "small" logfiles.
On Windows I use BareTail regularly, but I also use VS Code, EditPad, ..., Notepad.
The 4GL log files should really not be that hard to read?
As a possible built in option: in OpenEdge Management there's a log viewer.
On a newly set up server with classic appserver/adminserver it generally runs on http://<server-ip>:9090/ with credentials admin/admin (to be changed at first login).
This might be different with PAS (I'm not working with that, at least not yet).
We have our databases hosted in AWS so I send all my logs (DB, Webspeed, PASOE, AdminServer, etc etc etc) to Cloudwatch with a relevant retention policy. It's not hard to build a Date/Time parsing formula to calculate when the log entry happens. That way I can give people access to the logfiles easily without them needing access to all the different servers. It's a very neat solution. And we can send non-Progress logs over too, so Apache etc. And grepping them is dead easy too.

nopcommerce 4.0 datasettings.json transform

This may seem a bit trivial... but how do you go about transforming the DB connection for a nopCommerce app as it is deployed to various environments?
The db connection is set in app_data\datasettings.json.
Normally this type of stuff is handled with web.config transforms.
How do you go about setting up build transforms for different environments (dev, test, prod)?
I am also looking around this topic.
In my humble opinion, the nopCommerce config is a pain, because it makes it really hard to do proper Continuous Integration/Continuous Delivery while keeping secrets safe.
At initial deployment you are greeted with the install page. The problem is that the installation process writes a bunch of files to the server, including datasettings.json, where the connection string to the DB is hard-coded.
This means that when I deploy nopCommerce to Azure App Service, for deployments after installation, I have to make sure NOT to delete "additional files on the server" or the config will be deleted, since these config files written by the installer are not in source control.
It is really impractical not to be able to use standard ASP.NET connection strings, environment variables or Key Vault.
To answer your question on how to transform the config file, one possibility is to use a PowerShell script to read, transform, and write the config file directly on the App Service instance. There is an API for that.
https://blogs.msdn.microsoft.com/gabeshapiro/2017/01/01/samples-for-using-the-azure-app-service-kudu-rest-api-to-programmatically-manage-files-in-your-site/
https://github.com/projectkudu/kudu/wiki/REST-API
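As a rough sketch of that approach (using the Kudu VFS API from C# rather than PowerShell): the snippet below reads the file, swaps the connection string, and writes it back. The site name, deployment credentials, file path and the old/new connection-string values are all placeholders, and the plain string replace is only for illustration.
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class PatchDataSettings
{
    static async Task Main()
    {
        // Placeholders: Kudu deployment credentials and the file's VFS URL for your site.
        var user = "$my-nopcommerce-site";
        var password = "deployment-password";
        var fileUrl = "https://my-nopcommerce-site.scm.azurewebsites.net/api/vfs/site/wwwroot/App_Data/dataSettings.json";

        using (var client = new HttpClient())
        {
            var token = Convert.ToBase64String(Encoding.ASCII.GetBytes(user + ":" + password));
            client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", token);

            // Read the current file so only the connection string has to be swapped.
            var json = await client.GetStringAsync(fileUrl);
            json = json.Replace("OLD-CONNECTION-STRING", "NEW-CONNECTION-STRING");

            // Kudu requires an If-Match header when overwriting an existing file.
            var request = new HttpRequestMessage(HttpMethod.Put, fileUrl)
            {
                Content = new StringContent(json, Encoding.UTF8, "application/json")
            };
            request.Headers.IfMatch.Add(EntityTagHeaderValue.Any);

            var response = await client.SendAsync(request);
            response.EnsureSuccessStatusCode();
        }
    }
}
The deployment credentials can come from the publish profile or a pipeline secret at release time, so the real connection string never has to be committed to source control.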
Alternatively, you can modify the source to read from Web.Config:
Change the connection string of nopCommerce?

What is the best way of storing last run date-time/ execution date-time for windows service?

I have a Windows service that runs every hour. I would like to store the last run date-time so that I can check that value every time the service runs.
This value is used as one of the parameters for the stored procedure the service uses to get data.
Now, my question is: what is the best/ideal way of storing this last-run date value?
I can think of 3 ways:
Store it in a database table.
Store it in a text or XML file in the application folder (question here is: is it good to create a text file in the application folder and update it every hour?).
Create a section in the config file and update it every time the service is executed.
Experts, please advise.
If you have a single service that needs to check the last time the process it executes was run to completion, I would store it as a user setting. If you are thinking that maybe you'll have many services that share this task, putting it in a shared location (like a database) would be helpful, although it should be easy to migrate the data from a user setting to a database should you need to use multiple services.
I would avoid a custom XML or data file simply because there's already support for user (run-time) settings in .NET and Visual Studio.
Update:
If you want to use user settings, you can simply double-click Settings in Solution Explorer (under Properties in a project), or in the properties for a project select the Settings tab. Add a value with User scope (probably of type System.DateTime). If you named it LastExecuted, you would use code like the following to update the value:
Properties.Settings.Default.LastExecuted = DateTime.Now;
Properties.Settings.Default.Save();
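And at the start of each run you can read the same setting back and pass it to your stored procedure. In this sketch the procedure name dbo.GetChangedData and its @LastRun parameter are made up; it also assumes the LastExecuted user setting described above exists in the project.
using System;
using System.Data;
using System.Data.SqlClient;

class Worker
{
    void RunOnce(string connectionString)
    {
        // Value persisted by the previous run (see the Settings setup above).
        DateTime lastRun = Properties.Settings.Default.LastExecuted;

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("dbo.GetChangedData", connection))   // hypothetical procedure
        {
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.Add("@LastRun", SqlDbType.DateTime).Value = lastRun;

            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                // ... process rows changed since the last run ...
            }
        }

        // Persist the new timestamp once the run completes successfully.
        Properties.Settings.Default.LastExecuted = DateTime.Now;
        Properties.Settings.Default.Save();
    }
}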

How do I get site usage from IIS?

I want to build a list of User-Url pairs.
How can I do that?
By default, IIS creates log files in the system32\LogFiles directory of your Windows folder. Each website has its own folder beginning with "W3SVC" and incrementing sequentially from there (i.e. "W3SVC1", "W3SVC2", etc.). In there you'll find a series of log files containing details of each request to your website.
To analyse the files, you can either parse them manually (i.e. suck them into SQL Server and query them) or use a tool like WebTrends Log Analyzer. Having said that, if you really want to track website usage you might be better off taking a look at Google Analytics. It's much simpler to use, without dealing with large volumes of log files or paying hefty license fees.
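If you do decide to parse them yourself, a minimal sketch like the one below builds the User-Url list directly. It assumes W3C-format logs with the cs-username and cs-uri-stem fields enabled, and the log folder path is hard-coded for illustration.
using System;
using System.Collections.Generic;
using System.IO;

class UserUrlFromIisLogs
{
    static void Main()
    {
        // Adjust to wherever your site's logs actually live (W3SVC1, W3SVC2, ...).
        var logDir = @"C:\Windows\System32\LogFiles\W3SVC1";
        var pairs = new HashSet<string>();

        foreach (var file in Directory.GetFiles(logDir, "*.log"))
        {
            int userIndex = -1, urlIndex = -1;
            foreach (var line in File.ReadLines(file))
            {
                if (line.StartsWith("#Fields:"))
                {
                    // The header tells us which column holds which field.
                    var fields = line.Substring("#Fields:".Length).Trim().Split(' ');
                    userIndex = Array.IndexOf(fields, "cs-username");
                    urlIndex = Array.IndexOf(fields, "cs-uri-stem");
                    continue;
                }
                if (line.StartsWith("#") || userIndex < 0 || urlIndex < 0)
                    continue;

                var values = line.Split(' ');
                if (values.Length <= Math.Max(userIndex, urlIndex))
                    continue;

                pairs.Add(values[userIndex] + "\t" + values[urlIndex]);
            }
        }

        foreach (var pair in pairs)
            Console.WriteLine(pair);
    }
}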
If you have any means of identifying your users via the web server logs (e.g. a username in the cookie) then you can do it by parsing your web logs and getting the info from the cs-uri-query and cs(Cookie) fields.
Alternatively you can rely on external tracking systems (e.g. Omniture).
I ended up finding the log files in C:\inetpub\logs\LogFiles.
I used Log Parser Studio from Microsoft to parse the data. It has lots of documentation on how to query IIS log files, including sample queries.

IIS7 Profiling

Is there a way to profile IIS7? (freeware?)
Number of connections
Bandwidth usage
Errors (Event Viewer?)
-...
thx, Lieven Cardoen
PS: something similar to SQL Server profiling
There's nothing quite like MSSQL's Profiler, but there is a set of tools:
Perfmon will show you the # of current connections per website. Perfmon.msc, web service, current connections, select website, click add. Don't like the interactive nature of perfmon? No problem, use logman.exe, a nice CLI for perfmon.
Bandwidth usage you can get from your log files if you enable the bytes sent and bytes received fields in IIS logging. It's also available via performance counters: Web Service, Bytes Sent/sec and Bytes Received/sec. I think the two complement each other fairly well.
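If you'd rather read those counters programmatically than watch Perfmon, a small sketch like this works; the "Default Web Site" instance name is a placeholder, so use your site's name as it appears in Perfmon.
using System;
using System.Diagnostics;
using System.Threading;

class WebServiceCounters
{
    static void Main()
    {
        var site = "Default Web Site";   // placeholder: the instance name shown in Perfmon

        using (var connections = new PerformanceCounter("Web Service", "Current Connections", site))
        using (var bytesSent = new PerformanceCounter("Web Service", "Bytes Sent/sec", site))
        using (var bytesReceived = new PerformanceCounter("Web Service", "Bytes Received/sec", site))
        {
            while (true)
            {
                // Rate counters need two samples, so the first reading will be 0.
                Console.WriteLine("Connections: {0}  Sent/sec: {1:N0}  Received/sec: {2:N0}",
                    connections.NextValue(), bytesSent.NextValue(), bytesReceived.NextValue());
                Thread.Sleep(1000);
            }
        }
    }
}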
IIS7 has a new feature called Failed Request Tracing. You can tell it to log all 500s, or any .aspx page that takes 15 seconds to run, or based on event severity. It saves all of this information in an XML file for you under \inetpub so it's easily parseable, and also gives you a nice XSLT to view it in your browser and drill down if you like.
http://learn.iis.net/page.aspx/266/troubleshooting-failed-requests-using-tracing-in-iis7/
Try out the Administration Pack for IIS 7.0. It has:
Configuration Editor:
The configuration editor module will help you manage your configuration files. This tool is available for server administrators only. It allows you to edit any section, attribute, element or collection in your configuration file. In addition to editing these values you are also able to lock and unlock them. The configuration editor also allows you to generate scripts based on the actions you take as well as search the file to see where values are being used.
IIS Reports:
The IIS Reports module enables you to view key statistics about your website. You can also generate your own module reports to gather information relevant to you and your business. Currently you can view the output of these reports as charts and/or tables.
Database Manager:
This module is no longer part of the Administration Pack and instead is offered as a separate download in the IIS Download Center.
UI Extensions:
UI Extension modules allow you to manage existing features through IIS Manager.
The FastCGI module allows you to manage your FastCGI settings.
The two ASP.NET modules allow you to manage your authorization and custom errors settings.
Finally, HTTP Request Filtering allows you to set up rules for HTTP request filtering.
