The issue is I have a catalogue of reports that have security policies assigned to them. For example, 'manager' can access this report but not that one, and so on throughout the catalogue. I have been told that a large chunk of the reports need these security policies updated. The only thing is, it seems very tedious to go through each and every report and update access manually.
Is there a quick way of doing this for, say, 500+ reports, some nested within others?
Use the command line:
runcat.cmd -cmd setItemPermissions
Run it with the -help suffix to get the documentation, then build the invocation you need:
runcat.cmd -cmd setItemPermissions -help
So a friend of mine asked me to help him configure automatic replication of a table in his MariaDB database to another table that's supposed to be an exact copy of the source/primary table.
The databases are on the same server, MariaDB version 10.2.44, on a cPanel-managed web server run by a webhost. We are accessing the databases using HeidiSQL, which is what I'm hoping I can use to configure everything.
After lots of googling, this is the article I suspect makes the most sense for what we want to do, but it doesn't look like it is automatic to any extent: https://mariadb.com/kb/en/setting-up-replication/
Is this the best way to do what we're trying to do? Is there a better way? Any suggestions?
Thanks!
As @ysth said, triggers can be used in this case.
When creating a trigger that works across databases, you need to qualify the trigger name with the database name. So, for example:
CREATE TRIGGER database_name.trigger_name
Otherwise you'll get an "Out of schema" error.
The database you need to specify is the one where the "listener" lives, i.e. the one containing the table whose changes fire the trigger.
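For illustration, here's a minimal sketch of an insert-sync trigger, assuming hypothetical databases source_db and copy_db that both contain an identically structured orders table (a full mirror would also need matching UPDATE and DELETE triggers):

CREATE TRIGGER source_db.orders_after_insert
AFTER INSERT ON source_db.orders
FOR EACH ROW
  -- mirror every new row into the copy table in the other database
  INSERT INTO copy_db.orders (id, amount)
  VALUES (NEW.id, NEW.amount);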
I probably have multiple newbie questions, but I am unsure how to work with Telepat based on just the documentation.
When creating an app, we are expected to give a key, yet the field name is keys. Is there any reason for that? I am assuming it would have to be unique, but the documentation does not mention whether that is the case or what error to expect if the rule is violated.
Referring to http://docs.telepat.io/api.html#api-Admin-AdminCreateContext: Admin Create does not seem to require authentication, even when doing it from the API. It also omits the response on success. Just a 200 may be sufficient, but...
There is no way to get App ID. What am I missing?
First of all, what version of Telepat are you using? Changes to the infrastructure happen often. The latest version is 0.2.5 (although I'd try to download from the develop branch, since improvements and bug fixes land on a day-by-day basis).
You can add multiple API keys for an application and distribute them in whatever way you want. The system doesn't mind if you add a key that already exists at the moment.
That may be down to an old Telepat build; I can't get into detail on this.
/admin/app/create returns the application object, including its ID. Also, /admin/apps returns a list of all the applications you have.
I have a client who has set up a testing environment in some AI language. It runs some predefined test cases and stores the results as log files (comma-separated txt files). My job is to identify and suggest a reporting system, and I have these options in mind:
1. Import the logs into MSSQL and use its reporting services (SSRS), or
2. Import the logs into MySQL and use PHP to develop custom reporting.
I am thinking that going with option 2 is better. The reason is that the logs are inconsistent and contain unexpected wild characters that databases normally don't accept, so I can write some PHP scripts to clean them up before loading them into the database.
If this were your problem, what would you suggest?
It depends on how fancy you need to be. If the data is in CSV files, you could go as simple as loading it into Excel (or their favorite spreadsheet tool) and using spreadsheet macros to analyze it.
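If you do go the MySQL route instead, the load itself is a single statement once the files are cleaned up. A minimal sketch, assuming a hypothetical table test_results whose columns match the file layout:

-- load one cleaned-up, comma-separated log file into the table
LOAD DATA LOCAL INFILE '/path/to/cleaned_results.txt'
INTO TABLE test_results
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n';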
I want to build a list of which users requested which URLs on my website (a user-URL list).
How can I do that?
By default, IIS creates log files in the system32\LogFiles directory of your Windows folder. Each website has its own folder beginning with “W3SVC” and incrementing sequentially from there (i.e. “W3SVC1”, “W3SVC2”, etc.). In there you’ll find a series of log files containing details of each request to your website.
To analyse the files, you can either parse them manually (i.e. suck them into SQL Server and query them) or use a tool like WebTrends Log Analyser. Having said that, if you really want to track website usage you might be better off taking a look at Google Analytics: much simpler to use, with no large volumes of log files to deal with or hefty license fees to pay.
If you have any means of identifying your users via web server logs (e.g. a username in the cookie), then you can do it by parsing your web logs and pulling the info from the cs-uri-query and cs(Cookie) fields.
Alternatively you can rely on external tracking systems (e.g. Omniture).
I ended up finding the log files in C:\inetpub\logs\LogFiles.
I used Log Parser Studio from Microsoft to parse the data. It has lots of documentation on how to query IIS log files, including sample queries.
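For illustration, a sketch of the kind of query it accepts, counting requests per authenticated user and URL. This assumes the W3C fields cs-username and cs-uri-stem are enabled in your IIS logging settings ('[LOGFILEPATH]' is Log Parser Studio's placeholder for the log location you configure):

SELECT cs-username, cs-uri-stem, COUNT(*) AS Hits
FROM '[LOGFILEPATH]'
WHERE cs-username IS NOT NULL
GROUP BY cs-username, cs-uri-stem
ORDER BY Hits DESC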
The scenario is this: I have a SQL Server database online that I am using to demo an application. During development, I have added extra fields, modified field types, changed keys and added some new tables locally.
What's the best way for me to update the online database with the new structure and not lose the data? The database is a SQL Server 2005 one.
Download a trial of Red Gate SQL Compare, compare your two servers and you are done. If you do this often, it is well worth the $400, or get one of their bundles for a better bang for the buck.
And I do not work for Red Gate, just a happy customer!
Write update scripts that modify your live database structure to match the new one, inserting any required data as you go.
You may find it necessary to use temporary tables to do this.
It's best to test this process in a staging environment before running the scripts against the live one.
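A typical update script is just a sequence of ALTER and INSERT statements; a minimal sketch, with hypothetical table and column names:

-- add a new nullable column, then tighten an existing one
ALTER TABLE dbo.Customers ADD MiddleName NVARCHAR(50) NULL;
ALTER TABLE dbo.Orders ALTER COLUMN Quantity INT NOT NULL;

-- seed any data the new structure requires
INSERT INTO dbo.OrderStatus (StatusId, StatusName) VALUES (1, 'Pending');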
Depending on what exactly you've done you may be able to get away with alter statements, though from the sounds of it (removing keys and whatnot) you're doing some heavy lifting that may make that a less-than-ideal solution. You should probably look into creating a maintenance plan or, better yet, a SQL Server Integration Services project in Visual Studio. You should be able to migrate the data in the existing database to a new one using those tools.
This probably isn't of huge help retrospectively, but I always script all structural DB changes to my development database. By using a version number to determine the current version of the DB, I can run the required scripts on the live DB, bringing it back in line at the same time as the new code is uploaded.
This also works for any content changes: for instance, if the change in the underlying structure has an effect on the content stored, you can also write scripts to migrate the data accordingly.
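A minimal sketch of the version-tracking part, with hypothetical names (each upgrade script ends by recording the version it brings the DB up to):

-- one-row-per-upgrade history of applied schema versions
CREATE TABLE dbo.SchemaVersion (
    Version   INT      NOT NULL PRIMARY KEY,
    AppliedOn DATETIME NOT NULL DEFAULT GETDATE()
);

-- last line of upgrade script number 42
INSERT INTO dbo.SchemaVersion (Version) VALUES (42);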
1. Make a copy of the existing database to copy from.
2. Make another copy and alter it to your new schema. Save the DDL for reuse.
3. Write queries that copy data from #1 to #2 (see the sketch below). Save the queries for reuse.
4. Check the results.
5. Repeat until done.
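A minimal sketch of step 3, with hypothetical database and table names, copying rows from the old-schema copy (#1) into the new-schema copy (#2):

-- map old columns onto the new structure as you copy
INSERT INTO NewSchemaDb.dbo.Customers (CustomerId, FullName, Email)
SELECT CustomerId, FirstName + ' ' + LastName, Email
FROM OldSchemaDb.dbo.Customers;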