ODBC Microsoft Query BMC Remedy SLM Status Table

I want to use Microsoft Query to pull out stats on incident SLA status that can normally be seen in the SLM Status window.
However, I am struggling to find the proper table to get the data from. Which table is available to use as an ODBC data source for getting this information?

The data you are looking for is stored in the SLM:Measurement form. You'll want the following fields:
SVTTitle (SVT Title) (300411500)
GoalCategoryChar (Incident Response Time) (300426800)
GoalTimeHr (Hours) (300396000)
GoalTimeMin (Min) (300451200)
GoalSchedCost (Cost Per Min) (301489500)
SVTDueDate (Due Date/Time) (300364900)
MeasurementStatus (Progress) (300365100)
ApplicationUserFriendlyID (Incident ID) (301238500)
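As a concrete starting point, here is a hedged Microsoft Query / ODBC sketch against that form. Treat it as an assumption-laden example: the Remedy ODBC driver typically exposes each form as a table under its form name, but whether the columns show up under the database names above or under their labels depends on the driver configuration, and the incident ID is a placeholder.
SELECT "SVTTitle", "GoalCategoryChar", "GoalTimeHr", "GoalTimeMin",
       "GoalSchedCost", "SVTDueDate", "MeasurementStatus", "ApplicationUserFriendlyID"
FROM "SLM:Measurement"
WHERE "ApplicationUserFriendlyID" = 'INC000000000101'
ORDER BY "SVTDueDate"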
From what I can tell, the "Next Target Date:" is calculated by Remedy when the SLM dialog is opened, using these active links:
SLM:IntegrationDialog:OnLoadSelectTimeBasedTab_SetNextDueDate-Incident
SLM:IntegrationDialog:OnLoadSelectTimeBasedTab_SetNextDueDate-Change
SLM:IntegrationDialog:OnLoadSelectTimeBasedTab_SetNextDueDate-Request
It isn't stored in the table.
Hope this helps!

Related

How to parse JSON and create a trigger based on incoming JSON data

Hi, I want to monitor a Postgres database using ODBC and show a notification based on a condition. I'm creating an item with db.odbc.get[,{$DSN_NAME}]; please find the screenshot of my item configuration.
I am able to get data; please find the screenshot of the received data below.
Now I want to process this data and notify the user that a job has failed if its status equals 8. I have tried it with a trigger, but I can't get it to work.
Please find the screenshot of the trigger configuration, and also the error that occurred.
Can anyone help me with this? Please also correct me if my approach is wrong, since I'm very new to this.
I'm also trying low-level discovery, but I don't know the exact way of doing it.
I have tried the below, where I'm facing the following issue:
Cannot create item: item with the same key "db.odbc.select[testing_odbc {#job_name},{$DSN_NAME}]" already exists.
Find the screenshot of the discovery rule below.
Then I'm creating an item prototype as below.
Please find the sample data from the discovery rule:
{
  "data": [
    {"job_name": "job1", "job_status": 1},
    {"job_name": "job2", "job_status": 0},
    {"job_name": "job3", "job_status": 2}
  ]
}
I'm scheduling the discovery rule every 20 seconds and the item prototype every 30 seconds, and I guess that every 20 seconds it's trying to create an item with the same key as before.
How do I resolve this, and what SQL query do I need to give in the item prototype?
That JSON text is not a number, so you can't compare it to a number.
Options:
Change your query to return a number.
Use JSONPath preprocessing to select the number from the JSON (e.g. $[0]["Status"]).
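For the item prototype's query, here is a hedged sketch, assuming a hypothetical jobs table whose job_name and job_status columns match the sample LLD data above, and assuming your discovery rule exposes the job name as an LLD macro such as {#JOB_NAME} (all of these names are placeholders for your real schema and macros):
SELECT job_status FROM jobs WHERE job_name = '{#JOB_NAME}'
Because {#JOB_NAME} differs per discovered job, using it in both the item key and the query gives each generated item a unique key, which is what lets discovery re-run without trying to create duplicates.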

Show LAST REFRESH timestamp in a PUBLISHED report on POWERBI SERVICE

I already checked Stack Overflow for an answer, but I only found questions related to showing a timestamp in Power BI Desktop, which is pretty different from the behaviour in the Power BI Service; e.g. see
How to display current date and time in power bi visuals?
Visualizing last refresh date in power bi
Why?
I don't want to see in my report the timestamp of the current date and time, since I already have this in the status bar of my operating system.
I don't want to see in my report the timestamp of the last "report" refresh, when only the measures get updated (like in the Service).
I don't want to see the timestamp of the last re-import of (most likely unchanged) data in the Desktop/Service.
What I want to see in my report is the timestamp of the last "dataset" refresh in the Service, which cannot be achieved by a measure, but only by an M function!
The problem now is that the Service runs in UTC time, while I'm of course interested in local time, and all the M functions that convert a datetimezone value only accept a fixed time shift in hours and do not consider daylight saving time.
What would a solution look like to properly overcome this deficit and show the proper local time of the dataset refresh in a PBI Service report?
For whatever reason, Microsoft hasn't built in native daylight saving handling yet, but please vote for them to fix this here.
However, various people have suggested workarounds involving defining the dates/times when DST changes things or referencing an external oracle.
https://intellitect.com/convert-utc-local-time-daylight-savings-support-power-bi/
https://blog.crossjoin.co.uk/2017/03/28/daylight-saving-time-and-time-zones-in-m/
https://powerpivotpro.com/2019/01/dst-refresh-date-function-power-bi-service/
https://radacad.com/solving-dax-time-zone-issue-in-power-bi
As a workaround, I've been pulling the proper local time from worldtimeapi.org so far; see e.g. this Power Query M script:
let
    Source = Json.Document(
        Web.Contents("http://worldtimeapi.org/api/timezone/Europe/Berlin")),
    #"Converted to Table" = Record.ToTable(Source),
    #"Filtered Rows" = Table.SelectRows(
        #"Converted to Table", each ([Name] = "datetime")),
    #"Removed Columns" = Table.RemoveColumns(#"Filtered Rows", {"Name"}),
    #"Changed Type" = Table.TransformColumnTypes(
        #"Removed Columns", {{"Value", type datetimezone}}),
    #"Renamed Columns" = Table.RenameColumns(
        #"Changed Type", {{"Value", "Europe/Berlin"}})
in
    #"Renamed Columns"
However, I just realized that this has meanwhile become somewhat obsolete:
In the Power BI Service, switch the New Look to ON; then in the title bar next to the report name you get e.g. "Data updated 26/04/20", and in the drop-down menu you can even see the exact update time.

How do I create a running count of outcomes sequentially by date and unique to a specific person/ID?

I have a list of unique customers who have made transactions over a year (Jan – Dec). They have bought products using 3 different methods (card, cash, check). My goal is to build a multi-classification model to predict the method of payment.
To do this, I am engineering some recency and frequency features into my training data, but am having trouble with the following frequency count, because the only way I know how to do it is in Excel using the COUNTIFS and SUMIFS functions, which are prohibitively slow. If someone can help and/or suggest another solution, it would be very much appreciated.
So I have a data set with 3 columns (Customer ID, Purchase Date, and Payment Type) that is sorted by Purchase Date, then Customer ID. How do I get a prior frequency count of payment type, by date, that excludes the count of the current row's transaction and of any future transactions whose Purchase Date is greater? So basically I want to do a running count of each payment option, based on a unique Customer ID, over a date range that is earlier than the purchase date of that training row. In my head I see it as "crawling" backwards through the transactions and counting. A simplified screenshot of the data frame, with the 3 prior-count columns I am looking to generate programmatically, is below.
[Screenshot of the simplified data frame]
This gives you the answer as a list of CustomerID, PurchaseDate, PaymentMethod, and the prior count:
SELECT CustomerID, PurchaseDate, PaymentMethod,
    (
        SELECT COUNT(CustomerID) FROM History AS T
        WHERE T.CustomerID = History.CustomerID
          AND T.PaymentMethod = History.PaymentMethod
          AND T.PurchaseDate < History.PurchaseDate
    )
    AS PriorCount
FROM History;
You can save this query and use it as the source for a crosstab query to get the columnar format you want (a sketch of that crosstab follows the notes below).
Some notes:
I assumed "History" as the source table name - you can change the query above to use the correct source
To use this as a query, open a new query in design view. Close the window that asks what tables the query is to be built on. Open the SQL view of the query design - like design view, but it shows the SQL instead of the normal design interface. Copy the above into the SQL view.
You should now be able to switch to datasheet view and see the results
When the query is working to your satisfaction, save it with any appropriate name
Open a new query in design view
When you get the list of tables to include, switch to the list of queries and include the query you just saved
Change the query type to crosstab and update the query as needed to select rows, columns and values - look up "access crosstab queries" if you need more help.
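For reference, here is a hedged sketch of what that crosstab could look like in SQL view, assuming you saved the query above as PriorCounts (the query name and the First() aggregate are assumptions; adjust as needed):
TRANSFORM First(Q.PriorCount)
SELECT Q.CustomerID, Q.PurchaseDate
FROM PriorCounts AS Q
GROUP BY Q.CustomerID, Q.PurchaseDate
PIVOT Q.PaymentMethod;
This pivots each PaymentMethod value into its own column, one row per customer and purchase date.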
Another tip to see what is happening here:
You can take the subquery - the part inside the () above - and make just that statement into its own query, excluding the opening and closing (). Then you can look at its design view to see what it does.
Save it with an appropriate name and put it into the query above in place of the statement in () - then you can look at the design view.
Sometimes it's easier to visualize and learn from 2 queries strung together this way than to work with subqueries.

Database table in which WTPart or Change activity Maturity History is stored

I need to pull the date on which a WTPart was in the InWork state or a CN was in the Published state.
I did my analysis and found that there should be a maturity history table in the database, but I ended up with a table called MaturityBaseline, which does not hold this information. I need guidance on which table this information is stored in. Even the API com.ptc.windchill.enterprise.history.HistoryTablesCommands.maturityHistory(wtObject) uses a MaturityHistory class.
Have you tried this method from the same class?
com.ptc.windchill.enterprise.history.HistoryTablesCommands.getLegacyLifeCycleHistory(LifeCycleManaged arg0);
I have never tried it myself, though. Also check the HistoryRecord table in the database to see whether it has any info related to this.
I know this question is many years old now, but I stumbled across it while looking at how to find the history of a change issue/problem report.
My problem report has the number TA00025 and I want to find when it was completed. The Enter_Phase action will tell me whenever the state has changed. This SQL will show the changes of state for the nominated change issue/problem report (the same object type in Windchill):
SELECT LH.action, LH.state, LH.updateStampA2
FROM wcadmin.wcadmin.ObjectHistory OH,
     wcadmin.wcadmin.LifeCycleHistory LH,
     wcadmin.wcadmin.WTChangeIssueMaster CIM,
     wcadmin.wcadmin.WTChangeIssue CI
WHERE OH.idA3A5 = CI.idA2A2
  AND OH.idA3B5 = LH.idA2A2
  AND CI.idA3masterReference = CIM.idA2A2
  AND CIM.WTCHGISSUENUMBER = 'TA00025'
  AND LH.action = 'Enter_Phase'
For a WTPart, the history all sits in the WTPart table, which will have the full history for a part, with multiple records.
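As a hedged sketch of that idea - every identifier below is a placeholder in the usual Windchill naming style and must be verified against your own schema, and the part number is made up:
SELECT P.versionIdA2versionInfo, P.iterationIdA2iterationInfo,
       P.stateSTATE, P.updateStampA2
FROM wcadmin.wcadmin.WTPart P, wcadmin.wcadmin.WTPartMaster M
WHERE P.idA3masterReference = M.idA2A2
  AND M.WTPARTNUMBER = '0000000123'
ORDER BY P.updateStampA2
Each returned row is one iteration of the part, so the updateStampA2 of the earliest row whose state column shows InWork approximates when the part entered that state.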

Unable to view uploaded timeseries data in UI chart

Running the Databus server from the command line, I have successfully uploaded timeseries data via curl, and am able to query the same data with the API. However, I'm unable to view any of the data in the table in the UI. After selecting "My Databus" -> Tables, it says "You do not belong to any groups that have tables yet. Add some groups, then tables!!!". Navigating to the database and selecting the table -> chart, no data comes back there either.
I have noticed that the query it issues is from a recent time range, while the data I loaded is for an earlier time period. Is there a default way to show the most recent data available in a table?
Is your table type relational or stream? If relational, what is the primary key?
If it's a time-series table, this URL will give you the last 10 values, because of the parameter 10 and reverse=true.
http://[yourhost]/api/firstvaluesV1/10/rawdataV1/[yourtablename]?reverse=true
If it's a relational table, you can retrieve all values like so:
http://[yourhost]/api/getdataV1/select+c+from+[yourtablename]+as+c
Replace the [yourhost] and [yourtablename] values in either URL.
We do not use the tables page much. It is better to click into the specific database, as in My Databus -> Databases, and then click on the database that has your table. We are about to add a view data link in there showing the most recent 1000 values or something like that. There is already a view chart which shows the most recent 2 hours (again, we want to change that to the most recent 1000 data points as well).
