AzerothCore: does rate.drop.item.quality from the worldserver.conf affect world drops?

I want to modify world drop rates, but I'm not sure how the rate.drop.item settings work. They don't seem to affect all drops the way I would have expected.
I've done some testing using various values from 200 all the way up to 10,000,000, and the results seem strange. Normal enemies don't drop world blues any more often than usual, unless the item is normally on their drop table, like a blueprint (I tested this on many enemies across a few different zones). Enemies in dungeons drop many more BoE blues, but those appear to be BoE drops specific to the instance, not world drops.
Oddly, chests act as I would expect: if I set rate.drop.item.rare to a very high value, I receive several world-drop rares from a chest. The setting just doesn't seem to affect creatures.
I don't think this matters, but I was testing this with GM privileges.
Can anyone clarify more precisely how this configuration option works, and if it doesn't affect world drops, is there a config that does, or will I have to edit some tables to achieve this?
Thanks.
UPDATE
I ended up modifying the database directly. Here's the update I ran:
UPDATE acore_world.creature_loot_template
INNER JOIN reference_loot_template ON creature_loot_template.Reference = reference_loot_template.Entry
INNER JOIN item_template ON reference_loot_template.Item = item_template.entry
SET creature_loot_template.Chance = creature_loot_template.Chance * 5
WHERE creature_loot_template.Reference != 0
AND creature_loot_template.Chance < 1
AND (item_template.Quality = 3 OR item_template.Quality = 4)
AND creature_loot_template.GroupId != 0
AND reference_loot_template.Comment NOT LIKE '%ReferenceTable%'
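If you try something similar, a read-only preview with the same joins and filters (a sketch against the stock acore_world schema) lets you check which rows the update would touch before committing:
-- Preview the rows the UPDATE above would modify and their new chance.
SELECT clt.Entry, clt.Item, clt.Chance, clt.Chance * 5 AS NewChance, it.Quality
FROM acore_world.creature_loot_template clt
INNER JOIN acore_world.reference_loot_template rlt ON clt.Reference = rlt.Entry
INNER JOIN acore_world.item_template it ON rlt.Item = it.entry
WHERE clt.Reference != 0 AND clt.Chance < 1
AND it.Quality IN (3, 4)
AND clt.GroupId != 0
AND rlt.Comment NOT LIKE '%ReferenceTable%';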

The way loot is managed in AzerothCore is more complex than it seems: it is not only about the drop chance of each single item; there are also other factors, such as loot groups.
I recommend reading https://www.azerothcore.org/wiki/loot_template to understand how the core reads the loot tables.
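For example (the entry and item IDs below are made up), rows that share a non-zero GroupId are mutually exclusive - at most one of them can drop per loot roll, with explicitly chanced rows tried first and Chance = 0 rows splitting the remaining probability equally:
-- Hypothetical creature 12345: group 1 yields at most one of these per kill.
INSERT INTO creature_loot_template
(Entry, Item, Reference, Chance, GroupId, MinCount, MaxCount, Comment) VALUES
(12345, 5001, 0, 25, 1, 1, 1, 'explicit 25% chance within group 1'),
(12345, 5002, 0, 0, 1, 1, 1, 'splits the remaining 75% with the next row'),
(12345, 5003, 0, 0, 1, 1, 1, 'splits the remaining 75% with the previous row');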

Related

How to implement JS infinite scroll

I would like to display a table which has several thousand rows with complex formatting (color, font, border, etc., done on the ASP.NET Core server).
Initially, I generated an HTML copy of all the data (stored in a SQL Server database), but realised it wasn't optimal, since the generated HTML accounted for more than 50 MB.
No, I only generate about 200 rows: 100 visible and 50 hidden above and below (a cache). I would like to scroll the table freely, but when there are only 25 hidden rows left above or below, fetch new rows from the controller, which are then prepended or appended to the table. Basically, I want to give myself enough room to populate the table while I'm scrolling through the "hidden" (cache) rows.
Everything seems to work well, but I believe I need to use a web worker to run, in a background thread, the function that adds new rows to the table, independently of the table being scrolled.
Below is an excerpt of the code.
I use a debounce function to only catch the latest position of the mouse scroll.
The scroll function basically only checks whether there are enough hidden rows (cache) above or below the table. If it reaches the threshold, it either prepends (scroll upwards) or appends (scroll downwards) rows obtained from the controller.
The main issue is that I can't scroll the table while the new rows are being fetched, as the page freezes. It only takes about 1 to 2 seconds to populate the new (scrollable) rows, but it isn't smooth.
Could anyone help me improve the code (general ideas)? I also read that there are existing libraries, but I can't really get my head around them.
$('#fields-table > tbody').on('wheel', _.debounce(async function (event) {
    await scroll(); // Probably change it to a web worker or promise?
}));
async function scroll() {
    var threshold = 200; // Corresponds to approximately 50 rows (above and below).
    var above = $('#fields-table').scrollTop();
    var below = $('#fields-table > tbody').height() - $('#fields-table').height() - above;
    // Gets the scroll delta based on the table heights.
    var delta = 0;
    if (above < threshold) delta = above - threshold; // Scrolls upwards.
    if (below < threshold) delta = threshold - below; // Scrolls downwards.
    await addCacheRows(delta); // Prepends (delta < 0) or appends (delta > 0) rows obtained via the fetch API.
}
Your problem is unlikely to be resolved by a web worker. Without seeing more code I cannot tell for sure, but I suspect the code that generates your new rows is not sufficiently efficient. Remember:
Use DocumentFragment to create the HTML; do not append it to the main DOM tree row by row, because appending elements to the document triggers recalculations (see the sketch after this list).
Unless this is a LOT of data or requires lots of work server-side, you can immediately start preloading the next/previous rows. Keep the promise object and only await it once you need the rows; that's the simplest way to go about it.
Use a passive scroll event listener - Firefox even shows a console warning when you don't.
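Here is a rough sketch of the DocumentFragment point (the selector is the one from the question; the row-building is simplified):
function appendRows(rows) {
    // Build all the rows off-DOM first...
    var fragment = document.createDocumentFragment();
    rows.forEach(function (cells) {
        var tr = document.createElement('tr');
        cells.forEach(function (cell) {
            var td = document.createElement('td');
            td.textContent = cell;
            tr.appendChild(td);
        });
        fragment.appendChild(tr);
    });
    // ...then touch the live table once, so layout is recalculated
    // a single time instead of once per row.
    document.querySelector('#fields-table > tbody').appendChild(fragment);
}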
There is no way generating 200 rows of table data should take seconds. Since you use jQuery anyway (really, in 2022?), note that there are plugins for this. I don't remember which one I used, but it worked perfectly and scrolled smoothly with much more data than you have.
Thank you for your help. I realise it won't be as straightforward as I initially thought (I ran some tests with WPF virtualization as well).
Regarding the time it takes to generate the extra rows, I believe it mostly comes from the server. Sure, I can probably load new rows independently of the threshold.
I've never heard of DocumentFragment, but that's something I should definitely consider.

After Effects Expressions - controlling comps from the main comp using an "Expressions Layer"

THE GOAL I WANT TO ACHIEVE:
Control AE timelines using ONE EXPRESSION LAYER (much like using ActionScript) to trigger frequently used comps such as blinking, walking, flying, etc., for cartoon animation.
I want to animate the blinking of a cartoon character (and other actions, explained below). Rather than re-pasting the comp or the keyframe movements every time I want a blink or a particular action, I want to create a script where I can trigger the Blink comp to play. Is this possible? (Side note: a random blink through the entire movie would be nice, but I still want to know how to do this for the reasons below.)
Ideally, I would like to create an "Expressions layer" in the main comp to TRIGGER other comps to play. At certain points I would like to add triggers that call frequently used comps containing actions like blinking, walking, flying, looking left and right, etc.
IT WOULD BE AMAZING IF we could somehow trigger other comps to begin, repeat, stop, and maybe reverse, and do all of this from one main comp using an expressions layer.
WHY DO IT THIS WAY?
Why not just paste a comp in the spot you want it to play, every time you want such an action? Well, in After Effects, if you wanted a "blink comp" to play 40 times in two minutes, you would have to create 40 layers, or paste that comp's keyframes 40 times. Wouldn't it be awesome to trigger or call it whenever you wanted from one expressions layer?
We do something like this in Flash using ActionScript all the time. It would be awesome if there were a method out there to achieve this effect. This would be an OUTSTANDING tutorial, and I believe it would be very popular if someone made it. It could be used for a MULTITUDE of amazing effects and could save a ton of time for everyone. Heck, help me figure this out and perhaps I will make a tutorial.
Thank you all ye "overflowing Stackers" who contribute! :)
I found the answer and that is...
IT'S NOT POSSIBLE.
After Effects expressions cannot control other timelines. Unfortunately, you have to put an expression on each layer you want to affect.
The next best solution, which achieves something close to what I was asking, can be found at this link: motionscript.com/design-guide/marker-sync.html
We can only hope that Adobe will someday give that kind of power to expressions, as they did with ActionScript.
HOPEFULLY SOON! Anyone reading this who works for Adobe, please plead our case. Thanks!
Part 1: Reference other layers in pre-Comps
Simply replace thisComp with comp("ComName").
To reference Effect Controllers between compositions, follow this formula:
comp("ComName").layer("LayerWithExpression").effect("EffectControlerName")("EffectControllerType")
More In-depth Answer: Adobe's Docs - Skip to the Layer Sub-objects part
As I understand the Adobe documentation, only layers can be accessed, not footage. This means you will need to create your expression link using a pre-comp. Footage cannot be accessed this way, so that also means no nulls, adjustment layers, etc.
As an added bonus, if you use the Essential Graphics panel, you can put all the controllers in one pre-comp but have the controls available no matter which comp you are in. Just select it in the Essential Graphics dropdown.
Part 2: Start/End based on other layers within pre-comps:
Regarding the next part, where you want the expressions to activate based on other compositions, I recommend using the inPoint/outPoint expressions.
inPoint | Return type: Number. Returns the In point of the layer, in seconds.
outPoint | Return type: Number. Returns the Out point of the layer, in seconds.
If you utilize the start time overrides you can pull this from:
startTime | Return type: Number. Returns the start time of the layer, in seconds.
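As a sketch of how that can be wired up (the comp and layer names are made up), an Opacity expression can switch a layer on only while a trigger layer in another comp is active:
// Hypothetical names: 100% opacity only while the "Blink" layer in
// the "Main" comp is between its in and out points.
trigger = comp("Main").layer("Blink");
(time >= trigger.inPoint && time <= trigger.outPoint) ? 100 : 0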
Alternate Option:
I would recommend avoiding this, as keyframes are basically referenced by index, so things can get messed up if you add a keyframe ahead of one you were already using - definitely incorporate some error handling.
Refer to the Key attributes and methods (expression reference) Here
Part 3: Interpolation & Time Reversal
You can time-reverse the layer via right-click -> Time; otherwise this is all interpolation expressions like loopOut(), etc. You can loopOut("FOO") a pre-comp if you not only cut it correctly but also enable time remapping.
Then use this expression to loop those keyframes:
try {
    timeStart = thisProperty.key(1).time;
    duration = thisProperty.key(thisProperty.numKeys).time - timeStart;
    pingPong = false; // change to true if you want to loop the animation back & forth
    quant = Math.floor((time - timeStart) / duration);
    if (quant < 0) quant = 0;
    if (quant % 2 == 1 && pingPong == true) {
        t = 2 * timeStart + (quant + 1) * duration - time;
    } else {
        t = time - quant * duration;
    }
} catch (err) {
    t = time;
}
thisProperty.valueAtTime(t)

Applying more than one sort on a Tridion broker query

I have a broker query where I need to sort by two different fields (using JSP and Tridion 2011 SP1).
The API has the addSorting method, which I am applying.
It appears, however, that the second addSorting call is overwriting the first one rather than adding a second sort:
// Sort by Date
CustomMetaKeyColumn customMetaKeyColumnDate = new CustomMetaKeyColumn("date", MetadataType.DATE);
query.addSorting(new SortParameter(customMetaKeyColumnDate, SortParameter.DESCENDING));
// Sort by Owner
CustomMetaKeyColumn customMetaKeyColumnOwner = new CustomMetaKeyColumn("owner", MetadataType.STRING);
query.addSorting(new SortParameter(customMetaKeyColumnOwner, SortParameter.ASCENDING));
The sorts work fine individually.
Is this expected? Is addSorting really a setSorting, where only one sort can be specified, or am I missing a way to combine two sorts?
The addSorting method works just fine; however, it simply does not work for CustomMeta columns. There is already a confirmed defect regarding this, with the following summary: "SortParameter does not work with two metadata fields". This is still an open defect for 2011 SP1 and is scheduled to be fixed only in the next release.
Cheers,
Daniel.

Dynamics GP Web Service -- Returning list of sales order based on specific criteria

For a web application, I need to get a list or collection of all SalesOrders that meet the following criteria:
Have a WarehouseKey.ID equal to "test", "lucmo" or "Inno"
Have Lines that have a QuantityToBackorder greater than 0
Have Lines that have a RequestedShipDate greater than the current day.
I've successfully used these two methods to retrieve documents, but I can't figure out how to return only the ones that meet the above criteria:
http://msdn.microsoft.com/en-us/library/cc508527.aspx
http://msdn.microsoft.com/en-us/library/cc508537.aspx
Please help!
Short answer: your query isn't possible through the GP Web Services. Even your warehouse key isn't an accepted criterion for GetSalesOrderList. To do what you want, you'll need to drop to eConnect or direct table access. eConnect has come a long way in .Net if you use the Microsoft.Dynamics.GP.eConnect and Microsoft.Dynamics.GP.eConnect.Serialization libraries (which I highly recommend). Even in eConnect, you're stuck with querying based on the document header rather than line item values, though, so direct table access may be the only way you're going to make it work.
In eConnect, the key piece you'll need is generating a valid RQeConnectOutType. Note the "FORLIST = 1" part - that's important. Since I've done something similar, here's what it might start out as (you'd need to experiment with the capabilities of the WhereClause; I've never done more than a straightforward equals):
private RQeConnectOutType getRequest(string warehouseId)
{
    eConnectOut outDoc = new eConnectOut()
    {
        DOCTYPE = "Sales_Transaction",
        OUTPUTTYPE = 1,
        FORLIST = 1,
        INDEX1FROM = "A001",
        INDEX1TO = "Z001",
        WhereClause = string.Format("WarehouseId = '{0}'", warehouseId)
    };
    RQeConnectOutType outType = new RQeConnectOutType()
    {
        eConnectOut = outDoc
    };
    return outType;
}
If you have to drop to direct table access, I recommend going through one of the built-in views. In this case, it looks like ReqSOLineView has the fields you need (LOCNCODE for the warehouse IDs, QTYBAOR for backordered quantity, and ReqShipDate for requested ship date). Pull the SOPNUMBE values and use them in calls to GetSalesOrderByKey.
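If you go that route, the lookup might start out something like this (a sketch only; the view and column names are the ones mentioned above and should be verified against your GP version):
-- Sketch: order numbers with backordered lines at the three warehouses,
-- with a requested ship date after today.
SELECT DISTINCT SOPNUMBE
FROM ReqSOLineView
WHERE LOCNCODE IN ('test', 'lucmo', 'Inno')
AND QTYBAOR > 0
AND ReqShipDate > GETDATE();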
And yes, hybrid solutions kinda suck rocks, but I've found you really have to adapt if you're going to use GP Web Services for anything with any complexity to it. Personally, I isolate my libraries by access type and then use libraries specific to whatever process I'm using to coordinate them. So I have Integration.GPWebServices, Integration.eConnect, and Integration.Data libraries that I use practically everywhere and then my individual process libraries coordinate on top of those.

VBScript Out Of Memory Error

I have a classic ASP CRM that was built by a third party company. Currently, I have access to the source code and am able to make any changes required.
Randomly throughout the day, usually after some prolonged usage by users, most of my pages start getting an Out of Memory error.
The way the application is built, all the pages and scripts pull core functions from a Global.asp file. That file includes other global files as well, but the error presented shows:
Out Of Memory
WhateverScriptYouTriedToRun.asp Line 0
Line 0 is the include for the Global.asp file. Once the error occurs, it subsides for some unspecified amount of time, but then begins to reoccur. Given how the application is written, the functions it uses, and the "diagnostics" I've already done, it seems that some commonly used function is holding on to data, such as a recordset, and not releasing it properly. Other users then try to use the same function, and eventually memory just fills up, causing the error. The only way for me to effectively clear the error is to restart IIS, recycle the app pool, and restart the SQL Server services.
Needless to say, my users and I are getting annoyed...
I can't pinpoint the error, because the actual error message points at Line 0 - from there I have no idea where in the 20K lines of code it could be hanging up. Any thoughts or ideas on how to isolate this, or at least a point in the right direction to begin clearing it up? Is there a way to increase the "memory" size for VBScript? I know there is a limitation, but is it set at, say, 512 KB, and can you increase it to 1 GB?
Here are things I have tried:
Moving inline SQL statements into views
Going through several hundred scripts and ensuring that every OpenConnection & OpenRecordSet is followed by an appropriate Close.
Going through the Global File and commenting out any large SQL statements such as ApplicationLog (A function that writes the executed query into a table).
Some smaller script edits.
Common Memory Leak
You say you are closing all recordsets and connections, which is good.
But are you deleting objects?
For example:
Set adoCon = Server.CreateObject("ADODB.Connection")
Set rsCommon = Server.CreateObject("ADODB.Recordset")
'Do query stuff
'You do this:
rsCommon.close
adocon.close
'But do you do this?
Set adoCon = nothing
Set rsCommon = nothing
There is no garbage collection in classic ASP, so any objects not destroyed will remain in memory.
Also, ensure your Close/Nothing calls are run in every branch. For example:
adocon.open
rscommon.open strSQL, adoCon ' etc.
'Sql query
myData = rscommon("condition")
if(myData) then
response.write("ok")
else
response.redirect("error.asp")
end if
'close
rsCommon.close
adocon.close
Set adoCon = nothing
Set rsCommon = nothing
Nothing is closed or destroyed before the redirect, so memory is only cleared properly some of the time - not all branches of the logic lead to the cleanup code.
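One way to make that harder to get wrong (a sketch, not from the original answer) is to centralise the cleanup in a routine and call it before every exit point, including redirects:
' Sketch: call CleanUp before every response.redirect and at the end
' of the page, so no branch of logic can skip the releases.
Sub CleanUp()
    If rsCommon.State = 1 Then rsCommon.Close ' 1 = adStateOpen
    If adoCon.State = 1 Then adoCon.Close
    Set rsCommon = Nothing
    Set adoCon = Nothing
End Sub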
Better Design
Also, unfortunately, it sounds like the website wasn't designed well. I always structure my classic ASP like this:
<%
Option Explicit
'Declare all vars
Dim this
Dim that
'Open connections
Set adoCon...
adocon.open()
'Fetch required data
rscommon.open strSQL, adoCon
this = rsCommon.getRows()
rsCommon.close
'Fetch something else
rscommon.open strSQL, adoCon
that = rsCommon.getRows()
rsCommon.close
'Close connections and drop objects
adoCon.close
set adoCon = nothing
set rscommon = nothing
'Process redirects
if(condition) then
response.redirect(url)
end if
%>
<html>
<body>
<%
'Use data
for i = 0 to ubound(this, 2)
response.write(this(0, i) & " " & this(1, i) & "<br />")
next
%>
</body>
</html>
Hope some of this helped.
Have you looked at using a memory monitoring tool to see how much memory fragmentation is happening? My guess at a possible cause: an object of some size is being created, but there isn't enough room in memory to store it as one contiguous chunk. Imagine needing room to store an object that would take 100 MB: while there may be several hundred megabytes free, if the largest contiguous chunk is 90 MB, then it doesn't fit.
Debug Diagnostic Tool v1.1 would be such a tool; Bernard's articles may help in understanding how to use it.
Another thought: how much string concatenation is there in the code? I remember that where I used to work, we had problems with lots of string concatenation operations sucking up memory; that may be another idea to consider.
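To illustrate that last point (a sketch; the variable names are hypothetical): repeated s = s & "..." concatenation reallocates the growing string on every pass, whereas collecting the pieces in an array and calling Join once avoids that:
' Sketch: build the fragments in an array and concatenate once.
Dim parts(), i
ReDim parts(rowCount - 1) ' rowCount is hypothetical
For i = 0 To rowCount - 1
    parts(i) = "<tr><td>" & rows(0, i) & "</td></tr>" ' rows() is hypothetical
Next
Response.Write Join(parts, vbCrLf)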
Yeah, I could see some shock at that kind of number the first few times you see it, but if you understand what the code is doing, it may make sense why so much space gets reserved right off the bat at times.
I haven't used that debug tool specifically, but I did have a tool that took a snapshot of memory when pages hung, so I can't tell whether there was a performance impact from the tool or not. Of course, in my case I used a similar tool in 2004, so it has been a few years since I've had to research this kind of issue.
Just going to throw this in here: this problem took a long time to solve. Here's a breakdown of what we did:
We took all the inline SQL and made SQL views; every SELECT statement is now handled through a view first.
I took every single SQL INSERT and UPDATE (as much as I could without breaking the system) and put them into stored procedures (see the sketch after this list).
#2 was the one item that really made the biggest difference.
Went through several thousand scripts and ensured that variables were properly disposed of, every DB open connection was followed by a close connection, and the same for open/close recordsets.
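For what it's worth, here's roughly what those stored procedure calls look like from classic ASP (a sketch; the procedure and parameter names are hypothetical, and the magic numbers are standard ADO constants):
' Sketch: execute a stored procedure via ADODB.Command instead of
' building an inline SQL string.
Dim cmd
Set cmd = Server.CreateObject("ADODB.Command")
cmd.ActiveConnection = adoCon
cmd.CommandType = 4 ' adCmdStoredProc
cmd.CommandText = "usp_LogApplicationEvent" ' hypothetical name
cmd.Parameters.Append cmd.CreateParameter("@EventText", 200, 1, 4000, eventText) ' adVarChar, input
cmd.Execute ' no recordset returned; nothing extra to clean up
Set cmd = Nothing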
One of the slow killers was doing something like:
ID = Request.QueryString("ID")
at the top of the page. Before redirecting or closing a page, there is always a:
Set ID = Nothing
or the complete removal of the reference.
