Modify the memory of a hardware key through a HaspFile which is ReadOnly - hasp

I'd like to ask a question. I know it is possible to modify the memory of a hardware key by writing to the HaspFile that is ReadWrite:
Dim file As HaspFile = hasp.GetFile(HaspFileId.ReadWrite)
Dim newBytes() As Byte = New Byte() {1, 2, 3, 4, 5, 6, 7}
status = file.Write(newBytes, 0, newBytes.Length)
But I'd like to know: is it possible to modify the read-only memory part of the key without having a Master key?
It seems it is not possible to do via code?
But is it possible to do via tools such as the Vendor Suite?
Thank you very much in advance for your kind help.

From the v.5.10 Software Protection and Licensing Guide (emphasis mine):
In the context of Sentinel HASP, Read‐only memory (ROM) is a segment
of the memory that can contain data that the protection application
can access, but cannot overwrite. Sentinel HASP keys contain two ROM
segments, one of which contains Sentinel HASP Feature‐based licenses.
The second segment provides an area in which vendor‐customized data
can be stored. These segments can only be updated using remote
updates.
The "remote updates" that the documentation is referring to is the "Remote Update System" (RUS), which is the C2V/V2C method of updating a key.
Since a Master Key is required to generate a remote update, that means a Master Key is required to make modifications to the read-only memory section of a key.
The only component of the Vendor Suite that is capable of modifying the read-only memory is Business Studio (by creating an order for a remote update).
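For what it's worth, here is what that looks like from the runtime API side. A minimal sketch mirroring your snippet (HaspFileId.ReadOnly and the exact error status returned are assumptions on my part; the only certain part is that the write will not succeed):

Dim roFile As HaspFile = hasp.GetFile(HaspFileId.ReadOnly)
Dim newBytes() As Byte = New Byte() {1, 2, 3, 4, 5, 6, 7}
Dim status As HaspStatus = roFile.Write(newBytes, 0, newBytes.Length)

If status <> HaspStatus.StatusOk Then
    ' Expected: the runtime API refuses to write to the read-only segment;
    ' only a remote update (V2C) produced with a Master Key can change it.
End If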

Related

Is there a syslog private enterprise number for custom/internal use?

So I was recently looking for a way to add extra metadata to logs and found out that syslog has me covered. I can add custom metadata using the SD-ID feature like this:
[meta@1234 project="project-name" version="1.0.0-RC5" environment="staging" user="somebody@example.com"]
The problem is that 1234 has to be a syslog private enterprise number.
I assume those are given to big companies like Microsoft or Apple, but not to indie developers.
So my question is: is there a reserved number for internal use that everyone could use without registration?
If you use RFC5424-formatted messages, you can (or could) create custom fields in the SDATA (Structured Data) part of the message.
The latter part of a custom field in the SDATA is, as you mentioned, the private enterprise number (or enterpriseId).
As defined in RFC 5424:
7.2.2. enterpriseId
The "enterpriseId" parameter MUST be a 'SMI Network Management Private Enterprise Code', maintained by IANA, whose prefix is iso.org.dod.internet.private.enterprise (1.3.6.1.4.1). The number that follows MUST be unique and MUST be registered with IANA as per RFC 2578 [RFC2578].
Of course it depends on what you're using it for: if it's only for local logs, you can use any enterpriseId, or you can even use a predefined SDATA field with a reserved SD-ID and rewrite its value. (See the syslog-ng Guide.)
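To make the format concrete, here is a minimal sketch in TypeScript (plain string assembly, no syslog library; the SD-ID meta@1234, the PRI value and the field names are just the illustrative values from the question, not registered or authoritative ones):

// Build an RFC 5424 line with one custom structured-data element.
function rfc5424Message(appName: string, msg: string, sdParams: Record<string, string>): string {
  const pri = "<134>"; // facility local0 (16) * 8 + severity informational (6)
  const timestamp = new Date().toISOString(); // RFC 3339 timestamp, as RFC 5424 requires
  // PARAM-VALUE must escape '"', '\' and ']' (RFC 5424, section 6.3.3)
  const params = Object.entries(sdParams)
    .map(([k, v]) => `${k}="${v.replace(/([\\"\]])/g, "\\$1")}"`)
    .join(" ");
  const structuredData = `[meta@1234 ${params}]`; // 1234 stands in for an enterprise number
  // PRI VERSION TIMESTAMP HOSTNAME APP-NAME PROCID MSGID STRUCTURED-DATA MSG
  return `${pri}1 ${timestamp} localhost ${appName} - - ${structuredData} ${msg}`;
}

console.log(rfc4524Demo());
function rfc4524Demo(): string {
  return rfc5424Message("my-app", "user logged in", {
    project: "project-name",
    version: "1.0.0-RC5",
    environment: "staging",
  });
}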

Ionic app: how key and value storage works

So I am trying to make a calendar app in Ionic, and I want to store events if a user creates one. I looked at the Ionic documentation, and it makes it seem very simple: I basically copied exactly what they have, with a few adjustments, but I do not know how to test it. Here is what I have:
save() {
  var n = 0;
  this.event.startTime = new Date(this.readDescription());
  this.storage.set('fooditem' + this.increaseVal(), this.event);
  this.modalCtrl.dismiss({ event: this.event });
}
The increaseVal() function just increments the key name so I have a new key for every new value (this is a temporary fix).
I know I will probably need to get the data back out after it has been saved, but right now I just need to make sure it actually saves.
I'm answering this question with the assumption that you want to know where the data is being stored and to see the value at that location.
Ionic Storage gives options to use SQLite, IndexedDB, WebSQL and localstorage as ways to store data on the device. With the first three, there is no way to access the data besides fetching it. Local Storage, however, is accessible in Chrome DevTools > Application > Local Storage.
Ionic Storage lets the developer configure which driver is used via the driverOrder option in the App Module. Simply put localstorage as the first value, i.e. driverOrder: ['localstorage', 'indexeddb', 'sqlite', 'websql'], to force the app to store data in Local Storage; you can then see the stored values in the location described above.
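For example, a minimal sketch (assuming @ionic/storage with IonicStorageModule, as in the Ionic docs the question copied from; only driverOrder and the 'fooditem' keys come from above, the rest is illustrative):

// app.module.ts - put localstorage first in the driver order
import { NgModule } from '@angular/core';
import { IonicStorageModule } from '@ionic/storage';

@NgModule({
  imports: [
    IonicStorageModule.forRoot({
      driverOrder: ['localstorage', 'indexeddb', 'sqlite', 'websql']
    })
  ]
})
export class AppModule {}

And to confirm from code that save() really persisted something, read everything back through the same this.storage instance:

// in the same page/component that calls save()
async verifySaved() {
  const keys = await this.storage.keys();           // e.g. ['fooditem0', 'fooditem1', ...]
  for (const key of keys) {
    console.log(key, await this.storage.get(key));  // get() resolves to null for a missing key
  }
}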

How to use @GetDocField in IBM Lotus Domino for a database different from the current one

I need to "compute when compose" a field in a document so Client+Formula are the only options I have here.
I am using a #DbLookup("" :"NoCache";"server" :"db" ;"View" ; "key" ; fieldName); command that looks into a different server/database and comes back with a UNID of a specific document. The UNID is valid for the server/db database, not the current one. How can I use this UNID to fetch/set a value on the remote document.
In IBM documentation I only found #GetDocField(UNID,fieldName) and #SetDocField(UNID, fieldName, value) that are for the local DB only!!!
How can one actually meaningfully use this UNID since it represents a document on a remote database. I searched for 40 minutes for an answer!
This is not possible using formula language.
In LotusScript you can use:
Dim db As New NotesDatabase( "server" , "db" )
Dim doc As NotesDocument
Set doc = db.GetDocumentByUNID( unid )
Call doc.ReplaceItemValue( "fieldname" , value )
Call doc.Save( True, False )   ' needed so the change is actually written back
If you really are not able to use LotusScript in the context you have (normally there IS an option to use LotusScript wherever/whenever you need it; you are probably just trying to use the wrong context/event), then the possibility would be to write a little LotusScript agent with the above code and hand the UNID to it via a notes.ini parameter, a profile document, or whatever fits best for you.
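If you go the agent route, it could look roughly like this (untested sketch; the server, file name, field name, value and the notes.ini variable name "targetUNID" are all placeholders you would replace):

Sub Initialize
    Dim session As New NotesSession
    Dim db As New NotesDatabase( "server" , "db.nsf" )
    Dim doc As NotesDocument
    Dim unid As String

    unid = session.GetEnvironmentString( "targetUNID" )   ' reads $targetUNID from notes.ini
    Set doc = db.GetDocumentByUNID( unid )
    Call doc.ReplaceItemValue( "fieldname" , "value" )
    Call doc.Save( True, False )
End Sub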
My reputation is too low to comment, so I have to resort to posting an answer to add a thought. Have you tried @UpdateFormulaContext?
UNID := @DbLookup(... [ReturnDocumentUniqueID]);
@Command([OpenDocument]; ... ; UNID);
@UpdateFormulaContext;
@SetDocField(@DocumentUniqueID; ...);
@UpdateFormulaContext definitely allows you to reset the 'current' document (from the IBM Domino Designer Help):
tempDate := @GetDocField(@DocumentUniqueID; "CreatedDate");
@Command([NavPrev]);
@Command([EditDocument]);
@UpdateFormulaContext;
@SetDocField(@DocumentUniqueID; "nextCreated"; tempDate)
"You can use @UpdateFormulaContext to extract values from or set values in external documents. You can even access document- and database-specific information using functions such as @DbName, @DbTitle, @Created, @DocumentUniqueID, @GetDocField, @GetField, @GetProfileDocument."
Worth a punt.

How to check which SQL query is so CPU intensive

Is there any way to check which query is so CPU intensive in a _sqlsrv2 process?
Something that gives me information about the query being executed in that process at that moment.
Is there any way to terminate that query without killing the _sqlsrv2 process?
I cannot find any official material on the subject.
Thank you for any help.
You could look into client database-request caching.
The code examples below assume you have ABL access to the environment. If not, you will have to use SQL instead, but it shouldn't be too hard to "translate" the code below.
I haven't used this a lot myself, but I wouldn't be surprised if it has some impact on performance.
You need to start caching in the active connection. This can be done in the connection itself, or remotely via the VST tables (as long as your remote session is connected to the same database), so you need to be able to identify your connections. This can be done via the process ID.
Generally how to enable the caching:
/* "_myconnection" is your current connection. You shouldn't do this */
FIND _myconnection NO-LOCK.
FIND _connect WHERE _connect-usr = _myconnection._MyConn-userid.
/* Start caching */
_connect._Connect-CachingType = 3.
DISPLAY _connect WITH FRAME x1 SIDE-LABELS WIDTH 100 1 COLUMN.
/* End caching */
_connect._Connect-CachingType = 0.
You need to identify your process first, via top or another program.
Then you can do something like:
/* Assuming pid 21966 */
FIND FIRST _connect NO-LOCK WHERE _Connect._Connect-Pid = 21966 NO-ERROR.
IF AVAILABLE _Connect THEN
DISPLAY _connect.
You could also look at the _Connect-Type. It should be 'SQLC' for SQL connections.
FOR EACH _Connect NO-LOCK WHERE _Connect._connect-type = "SQLC":
DISPLAY _connect._connect-type.
END.
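Putting the pieces together, once you have found the _sqlsrv2 connection you are interested in, you could enable caching on it and then look at what it has been running. An untested sketch (the PID and the CachingType value 3 are just the example values from above; _Connect-CacheInfo is where the cached request information shows up):

/* Enable caching for the connection found via its PID and inspect it */
FIND FIRST _Connect WHERE _Connect._Connect-Pid = 21966 NO-ERROR.
IF AVAILABLE _Connect THEN DO:
    _Connect._Connect-CachingType = 3.   /* start caching */
    PAUSE 30.                            /* let the suspect workload run for a while */
    DISPLAY _Connect._Connect-CacheInfo WITH WIDTH 100.
    _Connect._Connect-CachingType = 0.   /* stop caching */
END.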
Best of all would be to do this in a separate environment. If you can't, at least try it in a test environment first.
Here's a good guide.
You can use a Select like this:
select
c."_Connect-type",
c."_Connect-PID" as 'PID',
c."_connect-ipaddress" as 'IP',
c."_Connect-CacheInfo"
from
pub."_connect" c
where
c."_Connect-CacheInfo" is not null
But first you need to enable the connection cache - follow the example above.

Dynamics GP Web Service -- Returning list of sales order based on specific criteria

For a web application, I need to get a list or collection of all SalesOrders that meet the following criteria:
Have a WarehouseKey.ID equal to "test", "lucmo" or "Inno"
Have Lines that have a QuantityToBackorder greater than 0
Have Lines that have a RequestedShipDate greater than current day.
I've successfully used these two methods to retrieve documents, but I can't figure out how to return only the ones that meet the above criteria.
http://msdn.microsoft.com/en-us/library/cc508527.aspx
http://msdn.microsoft.com/en-us/library/cc508537.aspx
Please help!
Short answer: your query isn't possible through the GP Web Services. Even your warehouse key isn't an accepted criterion for GetSalesOrderList. To do what you want, you'll need to drop to eConnect or direct table access. eConnect has come a long way in .Net if you use the Microsoft.Dynamics.GP.eConnect and Microsoft.Dynamics.GP.eConnect.Serialization libraries (which I highly recommend). Even in eConnect, you're stuck with querying based on the document header rather than line item values, though, so direct table access may be the only way you're going to make it work.
In eConnect, the key piece you'll need is generating a valid RQeConnectOutType. Note the "ForList = 1" part. That's important. Since I've done something similar, here's what it might start out as (you'd need to experiment with the capabilities of the WhereClause, I've never done more than a straightforward equal):
private RQeConnectOutType getRequest(string warehouseId)
{
    eConnectOut outDoc = new eConnectOut()
    {
        DOCTYPE = "Sales_Transaction",
        OUTPUTTYPE = 1,
        FORLIST = 1,
        INDEX1FROM = "A001",
        INDEX1TO = "Z001",
        WhereClause = string.Format("WarehouseId = '{0}'", warehouseId)
    };
    RQeConnectOutType outType = new RQeConnectOutType()
    {
        eConnectOut = outDoc
    };
    return outType;
}
If you have to drop to direct table access, I recommend going through one of the built-in views. In this case, it looks like ReqSOLineView has the fields you need (LOCNCODE for the warehouse IDs, QTYBAOR for the backordered quantity, and ReqShipDate for the requested ship date). Pull the SOPNUMBE values and use them in calls to GetSalesOrderByKey.
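Something along these lines should get you the keys (untested sketch; the view and column names are the ones mentioned above, and the schema/owner prefix may differ in your company database):

-- Sales orders with backordered lines requested to ship after today,
-- restricted to the three warehouses from the question
SELECT DISTINCT SOPNUMBE
FROM ReqSOLineView
WHERE LOCNCODE IN ('test', 'lucmo', 'Inno')
  AND QTYBAOR > 0
  AND ReqShipDate > GETDATE()

Each SOPNUMBE that comes back can then be fed to GetSalesOrderByKey to pull the full sales document.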
And yes, hybrid solutions kinda suck rocks, but I've found you really have to adapt if you're going to use GP Web Services for anything with any complexity to it. Personally, I isolate my libraries by access type and then use libraries specific to whatever process I'm using to coordinate them. So I have Integration.GPWebServices, Integration.eConnect, and Integration.Data libraries that I use practically everywhere and then my individual process libraries coordinate on top of those.
