Query on schema implementation in Berkeley DB

I found the parameters below, which our code uses during the schema implementation phase. We are just using the default values, as shown:
enum { CACHE_SIZE_KBYTES     = 10000,
       LOG_SIZE_KBYTES       = 2000,
       CHKPT_COALESCE_KBYTES = 1,
       CHKPT_COALESCE_MINS   = 0,
       TXN_PER_LOG_PRUNE_CHK = 50,
       PAGE_SIZE_KBYTES      = 0,
       MAX_LOG_FILE_NAME_LEN = 25 };
My question:
Can you please help me understand the significance of these seven parameters, so that I know when to tune them as required?

Those variables are application-specific, not part of Berkeley DB.
Find where those variables are passed into Berkeley DB API calls, then read the Berkeley DB documentation at oracle.com for those calls.
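As an illustration of that advice, here is a minimal sketch of where constants like these often end up. It assumes the Oracle Berkeley DB .NET bindings (the BerkeleyDB assembly) and that the enum values are meant as kilobytes; the mapping from your enum to these particular calls is a guess that you would confirm by tracing your own code:
using BerkeleyDB; // Oracle's .NET bindings for Berkeley DB (assumed available)

static class EnvSetup
{
    // Values copied from the application's enum; the mapping below is hypothetical.
    const uint CacheSizeKBytes = 10000;
    const uint LogSizeKBytes = 2000;

    static DatabaseEnvironment Open(string home)
    {
        var cfg = new DatabaseEnvironmentConfig
        {
            Create = true,
            UseMPool = true,
            UseLogging = true,
            UseTxns = true
        };
        // CACHE_SIZE_KBYTES would typically set the memory-pool cache size.
        cfg.MPoolSystemCfg = new MPoolConfig
        {
            CacheSize = new CacheInfo(0, CacheSizeKBytes * 1024, 1)
        };
        // LOG_SIZE_KBYTES would typically cap the size of each log file.
        cfg.LogSystemCfg = new LogConfig { MaxFileSize = LogSizeKBytes * 1024 };

        return DatabaseEnvironment.Open(home, cfg);
    }
}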

Related

Streaming data out of Kusto

There is plenty of ADX documentation about streaming ingestion, but I can't find anything about streaming data out of Kusto. Even continuous export has a minimum limit on the export frequency (I think 5 minutes), and that is far from being a streaming method of exporting. Is there a way to stream a high volume of data out of ADX to Blob or ADLS Gen2?
Continuous export is the recommended approach for continuously exporting a high volume of data from Kusto, since the export is distributed. The minimum frequency is one minute; see the "frequency" section in the doc.
For other exports of a large amount of data, use the applicable method in the SDKs. For example, here is the Java SDK and here is the .Net sample:
KustoConnectionStringBuilder kcsb = new KustoConnectionStringBuilder(connectionString);
kcsb.Streaming = true;

using (KustoDataContext context = new KustoDataContext(kcsb))
using (var reader = context.ExecuteQuery(query, requestProperties: requestProperties))
using (var csvStream = new CsvFromDataReaderJitStream(reader, leaveOpen: false, writeHeader: true))
{
    // consume the CSV stream here
}
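If the goal is to land that stream in Blob storage, the Azure Storage SDK can consume it directly. A minimal sketch, assuming the Azure.Storage.Blobs package and the csvStream from the sample above; the connection string variable, container, and blob names here are made up:
using Azure.Storage.Blobs; // assumed: Azure.Storage.Blobs NuGet package

// Hypothetical destination; substitute your own storage account, container, and blob.
var blob = new BlobClient(storageConnectionString, "adx-exports", "result.csv");

// Streams the query result into the blob, overwriting any existing content.
blob.Upload(csvStream, overwrite: true);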

DateTime problems when moving to hosted environment

I have the following code:
Function SaveSession(ByVal model As ViewModelTrainingSession) As JsonResult
    ...
    ...
    ses.Date = New DateTime(model.Date.Year, model.Date.Month, model.Date.Day, 0, 0, 0, 0, DateTimeKind.Local)
    db.TrainingSessions.Add(ses)
    db.SaveChanges()
When I run this code locally (in debug) with, let's say, 2014, 12, 24... the date that is saved in my database is 2014-12-24 00:00:00.000, which is what I want.
Now when I publish my code to the server (shared hosting), the same code ends up putting 2014-12-23 00:00:00.000 in the database.
I was using an Azure database before and everything was working well. Now I use the database on the new server I'm hosted on. The code did not change, but I get that date conversion now, which baffles me. It is only when running from the published code that I get that date difference. When I run in debug mode locally (but connecting to the database remotely) I have no problem.
Any ideas?
Update
Here's what I do on the client side:
bSoft.AjaxSaveSession = function (model) {
    /* Convert the model to JSON and send it to the controller */
    var json = ko.toJSON(model);
    var targetURL = DataSrvOptimaxUrl + '/' + 'SaveSession';
    bSoft.globals.vm.showLoadingWheel(true);
    $.ajax({
        url: targetURL,
        type: "POST",
        contentType: 'application/json; charset=utf-8',
        data: json
        // (success/error handlers trimmed in the original post)
    });
};
The model contains various fields, but two of them are named Date and DateStart; they are populated with JavaScript Dates (I'm using XDate).
The model is sent to the server using the Ajax call you see above.
On the other side (the server side), I have a ViewModel (ViewModelTrainingSession) defined in VB.NET. Again, that model contains a bunch of properties, two of which are Date and DateStart...
There is some magic performed by ASP.NET: the fields are converted from JSON to VB.NET based on the field names.
Function SaveSession(ByVal model As ViewModelTrainingSession) As JsonResult
    If ModelState.IsValid Then
        Try
            '....
            'Make sure the date is saved with a time of 00:00:00.000
            ses.StartTime = New DateTime(model.Date.Year, model.Date.Month, model.Date.Day, 0, 0, 0, 0, DateTimeKind.Local)
            ses.Date = New DateTime(model.Date.Year, model.Date.Month, model.Date.Day, 0, 0, 0, 0, DateTimeKind.Local)
            '...
            db.TrainingSessions.Add(ses)
            db.SaveChanges()
I think the dates get messed up in that conversion from JSON to VB; it is not the database that is the problem.
I think I'm going to try sending the date components instead of a date: I'll have a field called year, one called month, one called day, and put the components together on the server... This should prevent conversion and time-zone issues.
You should switch from DateTimeKind.Local to DateTimeKind.Utc to avoid surprises like this one.
The hour difference stems from your server being in a different time zone than your emulator.
In general, you should convert local times to UTC before storing them, and then convert the stored UTC times back to local time immediately before displaying them. And get accustomed to seeing UTC times in your database's raw data. :-)
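As a minimal sketch of that pattern (shown in C#; the VB.NET equivalent is a direct translation):
using System;

static class DateStorage
{
    // Store: normalize the client-supplied value to UTC before it hits the database.
    public static DateTime ToStorage(DateTime local)
    {
        return DateTime.SpecifyKind(local, DateTimeKind.Local).ToUniversalTime();
    }

    // Display: convert the stored UTC value back to local time only at the UI edge.
    public static DateTime ToDisplay(DateTime storedUtc)
    {
        return DateTime.SpecifyKind(storedUtc, DateTimeKind.Utc).ToLocalTime();
    }
}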

Modify the ReadOnly memory of a hardware key (HaspFile)

I'd like to ask a question. I know it is possible to modify the memory of a hardware key by writing to the HaspFile that is ReadWrite:
Dim file As HaspFile = hasp.GetFile(HaspFileId.ReadWrite)
Dim newBytes() As Byte = New Byte() {1, 2, 3, 4, 5, 6, 7}
status = file.Write(newBytes, 0, newBytes.Length)
But I'd like to know: is it possible to modify the read-only memory part of the key without having a Master key?
It seems it is not possible to do via code?
But is it possible to do via tools such as the Vendor Suite?
Thank you very much in advance for your kind help.
From the v.5.10 Software Protection and Licensing Guide (emphasis mine):
In the context of Sentinel HASP, Read-only memory (ROM) is a segment of the memory that can contain data that the protection application can access, but cannot overwrite. Sentinel HASP keys contain two ROM segments, one of which contains Sentinel HASP Feature-based licenses. The second segment provides an area in which vendor-customized data can be stored. These segments can only be updated using remote updates.
The "remote updates" that the documentation is referring to is the "Remote Update System" (RUS), which is the C2V/V2C method of updating a key.
Since a Master Key is required to generate a remote update, that means a Master Key is required to make modifications to the read-only memory section of a key.
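For completeness, applying such a remote update from code looks roughly like the sketch below. It assumes the Sentinel HASP .NET runtime API (the Aladdin.HASP namespace); the exact Update signature may differ between LDK versions, and the V2C file itself still has to be generated with a Master key:
using Aladdin.HASP; // Sentinel HASP .NET runtime API (assumed)
using System.IO;

// update.v2c is a vendor-supplied update file, generated with a Master key.
string v2c = File.ReadAllText("update.v2c");
string ack = string.Empty;

// Applies the remote update to the attached key; this is the only code path
// that can change the read-only memory segments.
HaspStatus status = Hasp.Update(v2c, ref ack);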
The only component of the Vendor Suite that is capable of modifying the read-only memory is Business Studio (by creating an order for a remote update).

Not getting any columns except string data types populated through AIF service

I have a simple custom table in AX, for which I need to create a web service so that data can be populated by an external system. I created a Document service with all the Axd* and Ax* objects involved. Next I wrote a simple console application to try populating some dummy data. The interesting thing is that I can only get string data type columns populated; int, real, date, enum etc. columns are not coming through.
In the code below, I am only getting the StoreItemId and Barcode columns, as both are strings. Cost is a real, ErrorType is an enum, DocketDate is a date, and none of them gets any values. I have discussed this issue with many colleagues and no one is aware of what's happening. Does anyone know, or could anyone point me in some new direction? Thanks a lot.
PS: I have limited experience with AIF, so if I am missing something fundamental, please excuse me and do let me know. Thanks.
AxdEntity_MMSStagingSalesImport stagingSalesImport = new AxdEntity_MMSStagingSalesImport();
stagingSalesImport.StoreItemId = "9999";
stagingSalesImport.Barcode = "1234546";
stagingSalesImport.Cost = 22;
stagingSalesImport.ErrorType = AxdEnum_MMSImportErrorType.Posting;
stagingSalesImport.DocketDate = new DateTime(2014, 4, 4);
stagingSalesImport.IsDuplicate = AxdEnum_NoYes.Yes;
For some types, you have to specify that you have set the values, so that AX knows the difference between a null value and a value that is set:
stagingSalesImport.Cost = 22;
stagingSalesImport.CostSpecified = true;
stagingSalesImport.ErrorType = AxdEnum_MMSImportErrorType.Posting;
stagingSalesImport.ErrorTypeSpecified = true;
Thanks for replying, Klaas; I like your blog as well.
I should have responded earlier, but I fixed the issue.
I didn't try Klaas's option, but I had a look at the data policies for the inbound port and found that none of the columns were enabled. I enabled the ones I needed and made most of them required as well. And guess what, that worked. I was expecting the columns to be enabled by default.

Dynamics GP Web Service -- Returning a list of sales orders based on specific criteria

For a web application, I need to get a list or collection of all SalesOrders that meet the following criteria:
Have a WarehouseKey.ID equal to "test", "lucmo" or "Inno"
Have Lines that have a QuantityToBackorder greater than 0
Have Lines that have a RequestedShipDate greater than the current day.
I've successfully used these two methods to retrieve documents, but I can't figure out how to return only the ones that meet the above criteria.
http://msdn.microsoft.com/en-us/library/cc508527.aspx
http://msdn.microsoft.com/en-us/library/cc508537.aspx
Please help!
Short answer: your query isn't possible through the GP Web Services. Even your warehouse key isn't an accepted criterion for GetSalesOrderList. To do what you want, you'll need to drop to eConnect or direct table access. eConnect has come a long way in .NET if you use the Microsoft.Dynamics.GP.eConnect and Microsoft.Dynamics.GP.eConnect.Serialization libraries (which I highly recommend). Even in eConnect, though, you're stuck with querying based on the document header rather than line item values, so direct table access may be the only way you're going to make it work.
In eConnect, the key piece you'll need is generating a valid RQeConnectOutType. Note the "FORLIST = 1" part. That's important. Since I've done something similar, here's what it might start out as (you'd need to experiment with the capabilities of the WhereClause; I've never done more than a straightforward equal):
private RQeConnectOutType getRequest(string warehouseId)
{
    eConnectOut outDoc = new eConnectOut()
    {
        DOCTYPE = "Sales_Transaction",
        OUTPUTTYPE = 1,
        FORLIST = 1,
        INDEX1FROM = "A001",
        INDEX1TO = "Z001",
        WhereClause = string.Format("WarehouseId = '{0}'", warehouseId)
    };
    RQeConnectOutType outType = new RQeConnectOutType()
    {
        eConnectOut = outDoc
    };
    return outType;
}
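To actually send that request, the usual shape (from memory of the eConnect .NET API; verify the member names against your version) is to wrap it in an eConnectType envelope, serialize it, and pass the XML to eConnectMethods.GetEntity:
using System.IO;
using System.Xml.Serialization;
using Microsoft.Dynamics.GP.eConnect;
using Microsoft.Dynamics.GP.eConnect.Serialization;

private string fetchSalesDocuments(string gpConnectionString, RQeConnectOutType request)
{
    // Wrap the request in the standard eConnect envelope.
    eConnectType envelope = new eConnectType { RQeConnectOutType = new[] { request } };

    var serializer = new XmlSerializer(typeof(eConnectType));
    using (var writer = new StringWriter())
    {
        serializer.Serialize(writer, envelope);
        using (var methods = new eConnectMethods())
        {
            // Returns the matching document(s) as an XML string.
            return methods.GetEntity(gpConnectionString, writer.ToString());
        }
    }
}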
If you have to drop to direct table access, I recommend going through one of the built-in views. In this case, it looks like ReqSOLineView has the fields you need (LOCNCODE for the warehouseIds, QTYBAOR for backordered quantity, and ReqShipDate for requested ship date). Pull the SOPNUMBE and use them in a call to GetSalesOrderByKey.
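A sketch of that direct-table fallback, assuming plain SqlClient against the GP company database (view and column names as described above; the WHERE values come from the question):
using System;
using System.Collections.Generic;
using System.Data.SqlClient;

// Pull the SOP numbers matching the criteria, then feed each to GetSalesOrderByKey.
static List<string> GetMatchingSopNumbers(string gpDbConnectionString)
{
    const string sql = @"
        SELECT DISTINCT SOPNUMBE
        FROM ReqSOLineView
        WHERE LOCNCODE IN ('test', 'lucmo', 'Inno')
          AND QTYBAOR > 0
          AND ReqShipDate > GETDATE()";

    var sopNumbers = new List<string>();
    using (var conn = new SqlConnection(gpDbConnectionString))
    using (var cmd = new SqlCommand(sql, conn))
    {
        conn.Open();
        using (var reader = cmd.ExecuteReader())
        {
            while (reader.Read())
                sopNumbers.Add(reader.GetString(0).TrimEnd()); // GP pads keys with spaces
        }
    }
    return sopNumbers;
}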
And yes, hybrid solutions kinda suck rocks, but I've found you really have to adapt if you're going to use GP Web Services for anything with any complexity to it. Personally, I isolate my libraries by access type and then use libraries specific to whatever process I'm using to coordinate them. So I have Integration.GPWebServices, Integration.eConnect, and Integration.Data libraries that I use practically everywhere and then my individual process libraries coordinate on top of those.
