This may be a basic question, but I haven't been able to find any information. I know that to view log files in PCF I can use
cf logs and cf logs --recent
The issue is that these only return recent logs. I want to retrieve the archived log files from PCF for my jobs. Is there a way to get that information?
I have several Symfony projects generating logs, and I want to be able to view those logs in a single place. This is my plan…
Each system writes its logs to a separate folder in an AWS S3 bucket.
A Laravel project will then access these logs, probably by pulling them down to a local store, and I will view the currently available logs using Arcanedev LogViewer.
Has anyone been able to write Symfony logs directly to an AWS S3 bucket? (I've only found references to uploading them on a cron schedule.)
As an aside, has anyone come across a web-based log service that lets you view logs in a format as good as Arcanedev LogViewer's? We currently use CloudWatch, which is good at what it does and certainly has its place, but for me the killer feature of Arcanedev LogViewer is this screen…
Arcanedev Logviewer screenshot
Thank you
Gary
For some unexplained reason, I haven't been able to use Alfresco since yesterday.
Let me describe what happens.
First of all, I didn't change any configuration files or anything like that.
I started the Tomcat and PostgreSQL services, then tried to load "localhost:8080/share", but it just kept loading forever.
I checked the log files too, but that was no use either: there are no error messages, nothing unusual.
After that I deleted the alfresco and share folders inside "webapps", just in case, but that didn't help either.
Finally, I can't stop these services from the service manager, because I am at work and don't have the required access privileges.
My main concern is that I don't even know the cause of this issue, so I don't even know how to ask for help.
If you don't have permission to stop the services, you can't completely delete the files in the alfresco and share folders, so clearing them out of webapps won't help.
You need to find out whether the problem is in Alfresco Share, the Alfresco repository, the database, or Tomcat.
Check Tomcat
Open http://localhost:8080 and make sure Tomcat is running.
Check Database
Check that the database service is running, either from the service manager or by connecting with the pgAdmin tool.
Check Repo
http://localhost:8080/alfresco - It should display some basic information about the Alfresco repository; otherwise, the repository itself has clearly failed to start.
Check Share
http://localhost:8080/share - It should display the login page if everything is working.
Logs
Check the alfresco.log, share.log, solr.log, catalina, tomcatstdout and tomcatstderror log files. Some error information will almost certainly have been recorded in one of them.
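The checks above can be sketched as a small triage helper. This is a hypothetical illustration, not Alfresco tooling: it assumes you have already recorded the outcome of each check (for example by hitting the URLs in a browser) and simply encodes the order in which to interpret them.

```python
def diagnose(tomcat_up, db_up, repo_up, share_up):
    """Interpret the four checks in order: Tomcat, database, repo, Share.

    Each argument is True if the corresponding check succeeded.
    Returns the first layer that appears broken, or 'ok'.
    """
    if not tomcat_up:
        return "tomcat"          # http://localhost:8080 did not respond
    if not db_up:
        return "database"        # PostgreSQL service not running
    if not repo_up:
        return "alfresco repo"   # /alfresco did not show repo info
    if not share_up:
        return "share"           # /share did not show the login page
    return "ok"

# Example: Tomcat and the database respond, but /alfresco fails,
# so the repository is the first layer to investigate.
print(diagnose(True, True, False, False))  # → alfresco repo
```

The point of checking in this order is that each layer depends on the ones before it: a broken repository will also make Share hang, so there is no value in debugging Share until /alfresco responds.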
I have an R script which calls the BigQuery API and then executes some queries. It works fine if I start the script using a batch file. However, when I try to start the script as SYSTEM, it appears that it can't log in to BigQuery. Maybe this is because the BigQuery authentication file (.httr-oauth) is valid for my user, not for SYSTEM.
The info in the .httr-oauth file is hashed, so I can't change the user (if there is information about the user in there at all). Is there some way to create another .httr-oauth file for SYSTEM? Or is there another error I have bumped into?
I have a DLL that opens a file for processing. It attempts to find the file with the FindFile() function. I also have a service that calls the DLL, and here is the problem: when the path to the file is a network path, FindFile() fails to find it, but only when called from the service; if I call the DLL directly from my application, it finds the file. I'm sure the FindFile() function gets the same parameters in both cases, because I write them to a log file. The parameter looks like this:
"\\SERVER\SERVER_USERS\USERX\TEST.TXT"
I know this is six months after the question, but I figured I'd answer it anyway. Usually it is a permissions thing. If the service does not have access to the network folder, then it won't find anything. Many services run under the local SYSTEM account by default, and that account doesn't have built-in access to network files. So try making sure the service runs as an account that has access to the network folder in question.
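One quick way to confirm this from the service side is to probe the path in code and log the exact error rather than just "not found". A minimal sketch (the UNC path in the comment is a placeholder; run this under the same account as the service):

```python
import os

def probe_path(path):
    """Try to open the path for reading and report the outcome."""
    try:
        with open(path, "rb"):
            return "readable"
    except FileNotFoundError:
        return "not found"       # bad path, or the share is invisible to this account
    except PermissionError:
        return "access denied"   # the account can see the file but may not read it
    except OSError as exc:
        return f"error: {exc}"

# Demonstrated here with a local path; inside the service you would point it
# at the UNC path instead, e.g. probe_path(r"\\SERVER\share\file.txt").
print(probe_path(os.devnull))  # → readable
```

If the same path reports "readable" from your application but "not found" or "access denied" from the service, the difference is the account, not the DLL.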
I want to build a list of user-URL pairs from my web server logs.
How can I do that?
By default, IIS creates log files in the system32\LogFiles directory of your Windows folder. Each website has its own folder whose name begins with "W3SVC" and then increments sequentially (i.e. "W3SVC1", "W3SVC2", etc.). In there you'll find a series of log files containing details of each request to your website.
To analyse the files, you can either parse them manually (i.e. load them into SQL Server and query them) or use a tool like WebTrends Log Analyzer. Having said that, if you really want to track website usage, you might be better off taking a look at Google Analytics. It's much simpler to use, with no large volumes of log files to deal with or hefty license fees to pay.
If you have any means of identifying your users via the web server logs (e.g. a username in the cookie), then you can do it by parsing your web logs and getting the info from the cs-uri-query and cs(Cookie) fields.
Alternatively, you can rely on an external tracking system (e.g. Omniture).
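As a sketch of the log-parsing approach, here is a minimal parser for W3C-format logs that pairs a username pulled out of the cs(Cookie) field with cs-uri-stem. The sample field layout and the user= cookie key are assumptions for illustration; your site's #Fields line and cookie format will differ.

```python
SAMPLE_LOG = """\
#Fields: date time cs-method cs-uri-stem cs-uri-query cs(Cookie)
2015-06-01 10:00:01 GET /home - user=alice;session=1
2015-06-01 10:00:05 GET /reports year=2015 user=bob;session=2
2015-06-01 10:00:09 GET /home - user=alice;session=1
"""

def parse_user_urls(log_text):
    """Return (user, url) pairs from a W3C-format log.

    Assumes the #Fields header names a cs(Cookie) field containing
    a 'user=' key; real IIS configurations vary.
    """
    fields = []
    pairs = []
    for line in log_text.splitlines():
        if line.startswith("#Fields:"):
            fields = line.split()[1:]   # field names follow '#Fields:'
            continue
        if line.startswith("#") or not line.strip():
            continue                    # skip other directives / blanks
        row = dict(zip(fields, line.split()))
        cookie = row.get("cs(Cookie)", "")
        user = next((p.split("=", 1)[1] for p in cookie.split(";")
                     if p.startswith("user=")), None)
        if user:
            pairs.append((user, row["cs-uri-stem"]))
    return pairs

print(parse_user_urls(SAMPLE_LOG))
# [('alice', '/home'), ('bob', '/reports'), ('alice', '/home')]
```

Reading the field names from the #Fields directive rather than hard-coding column positions matters, because IIS lets you enable or disable individual fields per site, which shifts every column after them.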
I ended up finding the log files in C:\inetpub\logs\LogFiles.
I used Log Parser Studio from Microsoft to parse the data. It has lots of documentation on how to query IIS log files, including sample queries.