PeopleCode TRANSFER to page without search record - PeopleSoft

Is there a way to use the PeopleCode "transfer" function to transfer (and fill in fields) to a page where the component uses INSTALLATION as the search record?
We are using FSCM 9.1 on PT 8.53.
The page I'm trying to transfer to is AP_VOUCHER_INQUIRY.
The menu path is: Accounts Payable -> Review Accounts Payable Info -> Vouchers -> Voucher
I have the business_unit and the voucher_id that I need to pass to it. Inside the page's Activate PeopleCode, I see this:
/* If Business Unit and Voucher ID are being passed to this page, this code picks it out and populates the From/To fields and executes the search statement. This code allows another page to use the TRANSFER function passing "some_record.BUSINESS_UNIT" and "some_record.VOUCHER_ID" as keys while opening Voucher Inquiry in a new browser. */
&bu = Unencode(%Request.GetParameter("BUSINESS_UNIT"));
&vchr_id = Unencode(%Request.GetParameter("VOUCHER_ID"));
&vchr_style_inq = Unencode(%Request.GetParameter("VOUCHER_STYLE_INQ"));
&vndr_setid = Unencode(%Request.GetParameter("VENDOR_SETID"));

Thanks to Darryls99, I found a way to do this:
&url = GenerateComponentContentURL(%Portal, %Node, MenuName."ENTER_VOUCHER_INFORMATION", %Market, Component."AP_VCHR_INQ", Page."AP_VOUCHER_INQUIRY", "U");
%Response.RedirectURL(&url | "&BUSINESS_UNIT=" | "BUPO" | "&VOUCHER_ID=" | &row.USM_ACTEXP_WRK2.VOUCHER_ID.value);

Related

Datasource Paging Issue (Revised Again)

See Datasource Paging Issue (Revised) for the original question.
Markus, you were kind enough to help me out with the issue of incorporating a record count into a query using a calculated datasource. I have a search form with 15 widgets: a mix of date ranges, dropdowns, text values, and ._contains, ._equals, ._greaterThanOrEquals, ._lessThanOrEquals, etc.
I have tested this extensively against MySQL and it works fine.
I have now added a 16th parameter, PropertyNames, which is a list with binding #datasource.query.filters.Property.PropertyName._in and Options left blank. The widget on the form is hidden because it is only used for additional filtering.
Logic such as the following is used so that a particular logged-in user can only view their own properties. So if they perform a search and the Property is not specified, we do:
if (params.param_Property === null && canViewAllRecords === false) {
console.log(params.param_PropertyNames); // correct output
ds.filters.Property.PropertyName._in = params.param_PropertyNames;
}
The record count (records.length) is correct, and if I for loop through the array of records the record set is correct.
However, on the results page the table displays a larger result set that ignores the PropertyNames filter. So if I search on Status 'Open' (50 results in MySQL) and then add a single value ['Property Name London SW45'] for params.param_PropertyNames, the record count is 6 and the records array has 6 entries, but the datasource displays 50. So the datasource is not filtering on the property array.
Initially I tried without adding the additional parameter and form widget, just using code such as:
if (params.param_Property === null && canViewAllRecords === false) {
console.log(params.param_PropertyNames); // correct output
ds.filters.Property.PropertyName._in = properties; // an array of properties to filter out
}
But this didn't work, hence the idea of adding a form widget and an additional parameter to the calculated record-count datasource.
If I inspect query.parameters, I see:
"param_Status": "Open",
"param_PropertyNames": ["Property Name London SW45"],
If I inspect query.filters:
name=param_Status, value=Open
name=param_PropertyNames, value=[]}]}
It looks as though the filter isn't set. Even hard-coding
ds.filters.Property.PropertyName._in = ['Property Name London SW45'],
I get the same result.
Have you got any idea what is causing this issue and what I can do as a workaround?
For a server-side solution, I would suggest editing both your SQL datasource's query script (server side) that is supposed to filter by this property list, and the server-side script for your calculated Count datasource, including the same code in each. The code would look something like this (not knowing your exact details):
var subquery = app.models.Directory.newQuery();
subquery.filters.PrimaryEmail._equals = Session.getActiveUser().getEmail();
subquery.prefetch.Property._add();
var results = subquery.run();
if(!results[0].CanViewAllRecords) {
query.filters.Property.PropertyName._in = results[0].Property.map(function(i) {return i.PropertyName;});
}
This code filters your Directory model by the current user and prefetches the Property relation; it then sets the filter only if the user's CanViewAllRecords flag is false, using the JavaScript map function to build an array of PropertyName values from the Property table. As I said, your code may not be exactly the same, depending on how you retrieve the user's CanViewAllRecords property, and of course I don't know the relation between your user and Property tables either (one-to-many or otherwise), but this should give you an idea of how to implement it on the server side.

Relational Query - 2 degrees away

I have three models:
Timesheets
Employee
Manager
I am looking for all timesheets that need to be approved by a manager (many timesheets per employee, one manager per employee).
I have tried creating datasources and prefetching both Employee and Employee.Manager, but so far I have had no success.
Is there a trick to this? Do I need to load the query and then do another load? Or create an intermediary datasource that holds both the Timesheet and Employee data or something else?
You can do it by applying a query filter to the datasource, in the onDataLoad event or another event. For example, you could bind the value of a dropdown of Managers to:
#datasource.query.filters.Employee.Manager._equals
- assuming that the datasource of the widget is set to Timesheets.
If you are linking to the page from another page, you could also call a script instead of using a preset action. On the link click, invoke the script below, passing it the desired manager object from the linking page.
function loadPageTimesheets(manager){
app.showPage(app.pages.Timesheets);
app.pages.Timesheets.datasource.query.filters.Employee.Manager._equals = manager;
app.pages.Timesheets.datasource.load();
}
I would recommend redesigning your app a little bit to use the full power of App Maker. You can go with the Directory model (Manager -> Employees) plus one table with data (Timesheets). In this case, your timesheets query could look similar to this:
// Server side script
function getTimesheets(query) {
  var managerEmail = query.parameters.ManagerEmail;
  var dirQuery = app.models.Directory.newQuery();
  dirQuery.filters.PrimaryEmail._equals = managerEmail;
  dirQuery.prefetch.DirectReports._add();
  var people = dirQuery.run();
  if (people.length === 0) {
    return [];
  }
  var manager = people[0];
  // Subordinates lookup can look fancier if you need to recursively
  // include everybody down the hierarchy chart. In that case it will
  // also make sense to update the prefetch above to include
  // reports of reports of reports...
  var subordinatesEmails = manager.DirectReports.map(function(employee) {
    return employee.PrimaryEmail;
  });
  var tsQuery = app.models.Timesheet.newQuery();
  tsQuery.filters.EmployeeEmail._in = subordinatesEmails;
  return tsQuery.run();
}

Session, Cache, or Cookies: which one to use to retain a user's search criteria? (Posted with case study)

I am developing a web application based on ASP.NET 4.0, jQuery, AJAX and JavaScript. I have a particular search page on which a user can search via multiple factors, i.e. by date, name, code, category, etc.
For example:
A) In a SearchProducts form, the user can search for a product via its unique Number, Name, Start Date/End Date, Category, etc.
B) The user can search by either one or all of the parameters, as a standard search form should allow.
C) If the user searches via Start Date and End Date, say 1 Dec 2012 to 31 Dec 2012, then for example my search results consist of 4 products, i.e. 4 products purchased from 1 Dec to 31 Dec.
D) Results are displayed in a grid, and clicking on the Product Number redirects to its View page (full details of the selected product), passing the ProductID via query string.
E) I have a requirement to let the user get back to the search results they searched for, via a Back To Search button on the View page (full details of the selected product).
Now, what I have planned is as follows:
1) When a user submits the search, I want to store a reference to the search parameters the user has entered, i.e. date, name, category, etc.
2) I will set a value in the query string to differentiate a normal request from a request via the Back To Search button.
3) Code in the search page:
if (!(IsPostBack))
{
    string tempRequestMode = string.Empty;
    if (Request.QueryString["requestMode"] != null)
    {
        tempRequestMode = Request.QueryString["requestMode"].ToString();
        if (tempRequestMode == "searchResults")
        {
            //RestoreValues();
            //Fetch results from the database again based on above results
        }
    }
}
Now, my question is:
I wanted to use the ASP.NET Cache for this purpose:
Advantages: its expiration and dependencies.
Disadvantages: it has application scope, i.e. it is not per-user as Session is.
The second option is Session:
Advantages: it is per-user.
Disadvantages: Session is more memory intensive.
I am confused about which I should use. Is there any other option, given that the search criteria differ per user and I want per-user maintenance of the data?
You can write all the search criteria to the query string.
When the user clicks the Search button, run this JavaScript:
CLIENT-SIDE
<script type="text/javascript">
var url = "Invoices.aspx?type=__type__&status=__status__&order=__order__";
var type= $("#<%= drpType.ClientID%>").val(); // Type
var status = $("#<%= drpStatus.ClientID %>").val(); // Status Parameter
var order = $("#<%= drpOrder.ClientID %>").val(); // order Parameter
window.location = url.replace("__type__", type).replace("__status__", status).replace("__order__", order);
</script>
SERVER-SIDE
protected void Page_Load(object sender, EventArgs e)
{
    _type = Request.QueryString["type"];
    if (string.IsNullOrEmpty(_type))
        _type = Enums.InvoiceUserTypes.RS.ToString(); //Default

    _status = Request.QueryString["status"];
    if (string.IsNullOrEmpty(_status)) _status = "ALL"; //Default

    _order = Request.QueryString["order"];
    if (string.IsNullOrEmpty(_order)) _order = "date"; //Default

    drpStatus.Value = _status;
    drpType.Value = _type;
    drpOrder.Value = _order;
    RunReport();
}
When the user clicks the Back button, the search parameters will be in the URL.
I think the best approach in this case is to redo the search using the stored filters once the user gets back to the search page. Any other approach brings a big drawback.
Caches expire, and if you use the default implementation they won't allow your app to scale to multiple machines, since they are local.
Using sessions is a bad idea too, because they will eat your resources and won't allow you to scale either.
If you must store the results, you should store them serialized (LOB pattern) in a database or another network-accessible resource, so you can retrieve them from any application server.
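To make that last idea concrete, here is a rough sketch of persisting each user's filters in a database so the search can simply be re-run on Back To Search (the same LOB approach works if you prefer to store the serialized results themselves). SearchCriteria, the UserSearchState table, and the AppDb connection string are just illustrative names, not from your code; JavaScriptSerializer ships with ASP.NET 4.0.

// Rough sketch only: SearchCriteria, UserSearchState and "AppDb" are illustrative names.
using System;
using System.Configuration;
using System.Data.SqlClient;
using System.Web.Script.Serialization;

[Serializable]
public class SearchCriteria
{
    public string Name { get; set; }
    public string Category { get; set; }
    public DateTime? StartDate { get; set; }
    public DateTime? EndDate { get; set; }
}

public static class SearchStateStore
{
    private static readonly string ConnStr =
        ConfigurationManager.ConnectionStrings["AppDb"].ConnectionString;

    // Serialize the criteria as JSON and upsert one row per user (the LOB column).
    public static void Save(string userName, SearchCriteria criteria)
    {
        string json = new JavaScriptSerializer().Serialize(criteria);
        using (SqlConnection conn = new SqlConnection(ConnStr))
        using (SqlCommand cmd = new SqlCommand(
            @"MERGE UserSearchState AS t
              USING (SELECT @user AS UserName) AS s ON t.UserName = s.UserName
              WHEN MATCHED THEN UPDATE SET Criteria = @criteria
              WHEN NOT MATCHED THEN INSERT (UserName, Criteria) VALUES (@user, @criteria);", conn))
        {
            cmd.Parameters.AddWithValue("@user", userName);
            cmd.Parameters.AddWithValue("@criteria", json);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }

    // Read the criteria back when the page is opened with requestMode=searchResults.
    public static SearchCriteria Load(string userName)
    {
        using (SqlConnection conn = new SqlConnection(ConnStr))
        using (SqlCommand cmd = new SqlCommand(
            "SELECT Criteria FROM UserSearchState WHERE UserName = @user", conn))
        {
            cmd.Parameters.AddWithValue("@user", userName);
            conn.Open();
            object result = cmd.ExecuteScalar();
            return result == null
                ? null
                : new JavaScriptSerializer().Deserialize<SearchCriteria>((string)result);
        }
    }
}

In the Page_Load branch that detects requestMode == "searchResults", call Load for the current user (e.g. User.Identity.Name), repopulate the widgets, and re-run the database query.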

Dynamics Ax: Alert when any record changes

I want to send an alert in AX when any field in the vendor table changes (and on create/delete of a record).
In the alert, I would like to include the previous and current value.
But it appears that you can't set up an alert for when any field in a table changes; you need to set one up for EVERY FIELD?! I hope I am mistaken.
Also, how can I send this notification to a group of people?
I have created a new class with a static method that I can easily call from any .update() method to alert me when a record changes, and what changed in the record.
It uses the built-in email templates of AX as well.
static void CompareAndEmail(str emailTemplateName, str nameField, str recipient, Common original, Common modified)
{
    UserInfo userInfo;
    Map emailParameterMap = new Map(Types::String, Types::String);
    str changes;
    int i, fieldId;
    DictTable dictTable = new DictTable(original.TableId);
    DictField dictField;
    ;
    for (i = 1; i <= dictTable.fieldCnt(); i++)
    {
        fieldId = dictTable.fieldCnt2Id(i);
        dictField = dictTable.fieldObject(fieldId);

        if (dictField.isSystem())
            continue;

        if (original.(fieldId) != modified.(fieldId))
        {
            changes += strfmt("%1: %2 -> %3 \n\r",
                              dictField.name(),
                              original.(fieldId),
                              modified.(fieldId));
        }
    }

    // Send notification email
    select Name from userInfo where userInfo.id == curUserId();
    emailParameterMap.insert("modifiedBy", userInfo.Name);
    emailParameterMap.insert("tableName", dictTable.name());
    emailParameterMap.insert("recordName", original.(dictTable.fieldName2Id(nameField)));
    emailParameterMap.insert("recordChanges", changes);
    SysEmailTable::sendMail(emailTemplateName, "en-us", recipient, emailParameterMap);
}
Then in the .update() method I just add this one call:
//Compare and email differences
RecordChangeNotification::CompareAndEmail(
"RecChange", //Template to use
"Name", //Name field of the record (MUST BE VALID)
"user#domain.com", //Recipient email
this_Orig, //Original record
this //Modified record
);
The only things I want to improve upon are:
moving the template name and recipient into a table, for easier maintenance
better formatting for the change list; I don't know how to template that (see: here)
As you have observed, the alert system is not designed for "any field" changes, only specific field changes.
This is a bogus request anyway, as it would generate many alerts. The right thing to do is to enable database logging of the VendTable table, then send a daily report (in batch) to those interested.
This is done in Administration\Setup\Database logging. There is a report in Administration\Reports. You will need to know the table number to select the table.
This solution requires a "Database logging" license key.
If you really need this feature, then you can create a class that sends a message/email with the footprint of the old record vs the new record. Then simply add some code in the table's "write"/"update"/"save" methods to make sure your class is run whenever VendTable gets edited.
But I have to agree with Jan. This will generate a lot of alerts. I'd spend some energy checking whether the modifications made in VendTable are in line with the business needs, and prohibit illegal modifications. That includes making sure only the right people have the necessary access.
Good luck!
I agree with Skaue's suggestion: just write a class to send the mail of changes in VendTable,
and execute this class in the update method of VendTable.
Thanks and regards,
Deepak Kumar

ASP.NET KB system / search engine

Our company has a "MyAccount" area behind which we would like to put a knowledge base. We have a CRM system where the help calls are recorded and some knowledge base articles are written into the database. The master problem (same basic help call) is tagged with keyword(s). We also have CHM help files for the software we sell (some users never use the internal help system; they go online), plus PDF whitepapers and tutorials in a protected directory. I would like to either buy, or quickly build, an ASP.NET solution where a user can search the database to display the help article and also show tutorials, whitepapers, or a help file from the CHM.
Requirements: It must look like our website. I have a master page, so any content page has to pretty much be white...no graphics, colors, etc.
Does anyone know a 3rd party search engine, or an example with some source code on how to use Lucene.NET to build a search index database from an existing database?
You can build such a solution with Lucene.Net.
Keep your docs in the database (as you already do) and index the docs you want with Lucene.Net.
Lucene will have its own index in the file system.
You need to provide synchronization between your docs in the DB and the Lucene index, so when a document in the DB changes, you re-index it with Lucene.
Synchronization (matching between the DB and the Lucene index) can be based on some unique key value from the DB (e.g. ID).
So, when you want to add a document to the Lucene index, you index the document content (you don't need to store the content in Lucene) and 'save' it in Lucene with the unique key value from the DB (let's say ID).
Then you can search the Lucene index and get a list of matching document IDs,
and retrieve them from your DB by those IDs and show them to the user (a minimal sketch of this search step follows the indexing example below).
Below is an example method from my project; it adds a document to the Lucene index.
The InformationAsset method argument is the document from the DB that I want to index.
This method creates a Lucene Document with a few fields:
field: the content of the doc from the DB (the InformationAsset method argument)
fieldId: the ID of the InformationAsset from the database, used to match the database and the Lucene index
fieldPubDate: the publication date; I can create advanced queries to the Lucene engine based on all of these fields.
fieldDataSource: some kind of category.
public void AddToIndex(Entities.InformationAsset infAsset, IList<Keyword> additionalKeywords)
{
    Analyzer analyzer = new StandardAnalyzer();
    IndexWriter indexWriter = new IndexWriter(LuceneDir, analyzer, false);
    Document doc = new Document();

    // string with additional keywords under which the content should also be indexed
    string addKeysStr = "";
    if (additionalKeywords != null)
    {
        foreach (Keyword keyword in additionalKeywords)
        {
            addKeysStr += " " + keyword.Value;
        }
    }
    addKeysStr += " " + m_RootKeyword;

    string contentStr;
    contentStr = infAsset.Title + " " + infAsset.Content + addKeysStr;

    // index the content field
    Field field = new Field(LuceneFieldName.Content, contentStr, Field.Store.NO, Field.Index.TOKENIZED,
                            Field.TermVector.YES);
    // ID field
    Field fieldId = new Field(LuceneFieldName.Id, infAsset.Id.ToString(), Field.Store.YES, Field.Index.UN_TOKENIZED);
    // publish date field
    Field fieldPubDate = new Field(LuceneFieldName.PublishDate,
                                   DateTools.DateToString(infAsset.PublishingDate, DateTools.Resolution.MINUTE),
                                   Field.Store.YES, Field.Index.NO_NORMS, Field.TermVector.YES);
    // DataSource field
    Field fieldDataSource = new Field(LuceneFieldName.DataSourceId, infAsset.DataSource.Id.ToString(), Field.Store.YES,
                                      Field.Index.UN_TOKENIZED);

    doc.Add(field);
    doc.Add(fieldId);
    doc.Add(fieldPubDate);
    doc.Add(fieldDataSource);
    doc.SetBoost((float)CalculateDocBoostForInfAsset(infAsset));

    indexWriter.AddDocument(doc);
    indexWriter.Optimize();
    indexWriter.Close();
}
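To complete the picture, here is a rough sketch of the search step described above: parse the user's text, search the index, read back the stored ID of each hit, and then load those records from the DB. It assumes the same LuceneDir and LuceneFieldName constants and the Lucene.Net 2.x-era API (Hits) as the indexing method; how you load the records by ID is up to your data layer.

// Rough sketch only: same class and constants as AddToIndex above, Lucene.Net 2.x Hits API.
public IList<string> SearchIds(string searchText)
{
    IList<string> ids = new List<string>();
    Analyzer analyzer = new StandardAnalyzer();
    IndexSearcher searcher = new IndexSearcher(LuceneDir);
    try
    {
        // Parse the user's text against the content field that was indexed above.
        QueryParser parser = new QueryParser(LuceneFieldName.Content, analyzer);
        Query query = parser.Parse(searchText);

        // Run the search and collect the stored ID field of each hit.
        Hits hits = searcher.Search(query);
        for (int i = 0; i < hits.Length(); i++)
        {
            ids.Add(hits.Doc(i).Get(LuceneFieldName.Id));
        }
    }
    finally
    {
        searcher.Close();
    }
    return ids;
}

Then fetch the matching InformationAssets from the database with those IDs (for example a WHERE Id IN (...) query) and display them to the user.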
