I am trying to get the total balance of coins issued by me, i.e. Node A, across the entire Corda ecosystem/network of nodes. Is there an easy method to get all coins issued (.issue) minus those exited (.exit) that were generated by me? I have thought of two possible workarounds but do not like the design of either:
- Take the transaction snapshot and loop through the whole list to find the transactions that were self-issued (a Cash.State with an Issue command), then subtract the Exit commands, all against my own vault. I don't like this approach because of the number of records to walk through and the looping machinery involved, on top of the pagination aspect of continually looping or extending the page size.
- Query all nodes, including myself, for their current balance and sum the totals, which would be similar to [link]. However, this would produce a misleading total if any other party in the network self-issued cash in the same currency. On top of that, the ReceiveStateAndRefFlow subflow has no session timeout, so it would wait indefinitely for a node to come alive.
Any advice/comments on this issue? Responses are greatly appreciated.
One other way might be to create a balance state whose sole purpose is to keep a running count of issuances and exits: every time you issue new cash or exit cash, you would also update the balance state.
This should then serve as a quick reference point as to how much cash there is.
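A minimal sketch of such a state (Kotlin; IssuedBalanceState and its fields are names I made up for illustration, and a real version also needs a contract governing how the state may evolve, plus @BelongsToContract on Corda 4+):

import net.corda.core.contracts.Amount
import net.corda.core.contracts.ContractState
import net.corda.core.identity.AbstractParty
import net.corda.core.identity.Party
import java.util.Currency

// One state per issuer, tracking net issuance (all issues minus all exits).
data class IssuedBalanceState(
    val issuer: Party,
    val netIssued: Amount<Currency>,
    override val participants: List<AbstractParty> = listOf(issuer)
) : ContractState

Each Issue transaction would consume the previous IssuedBalanceState and output one with netIssued increased, and each Exit would do the opposite, so reading your total becomes a single vault query for the latest unconsumed state rather than a walk over the transaction history.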
I am struggling to come up with a formula that fits certain criteria and was hoping someone with a better math brain than mine might be able to help. What I have is a Google Sheets based tool that determines how much someone has purchased of a product and then calculates the number of times a special additional offer will be redeemed, based on the amount spent.
As an example, the offer has three tiers. Though the actual costs will vary for different offers, let's say the first tier is gained with a $10 purchase, the second with a $20 purchase and the third with a $35 purchase (the only real relationship between the prices is that they get higher for each tier; there is no specific pattern to the costing of different offers). So if the customer bought $35 worth of goods they would get three free gifts, if they bought $45 worth they would get 4, and a further $10 of spend (totaling $55) would then allow them to redeem 5 gifts in total. It can be thought of as filling a bucket: each time you hit a red line you get a new gift, and when the bucket is full it's emptied and the process begins again.
If the tiers of the offer were evenly spaced (e.g. $5, $10 and $15) this would be a simple case of dividing the total purchase amount, but as there is no specific relationship between the costs of the tiers (they are based on the value of the contents) I am having trouble coming up with a simple 'bucket filling' formula or calculation method that will work for any price ranges given to it. My current solution involves taking the modulus, subtracting offer amounts from the purchase amount, etc., but there are plenty of cases where it breaks. If anyone could give me a start or provide some information that might help in my quest I would be highly appreciative, and let me know if my explanation is unclear! Thanks in advance and all the best.
EDIT:
The offer has three tiers and then wraps around to the start after the initial three are unlocked once, looping until the offer has been maxed out. I would rather avoid a long sheet with a dynamic column of prices; a small, multi-cell formula would be ideal.
What you need is a lookup table. Create a table with the tier value in the left column and the corresponding number of gifts for that tier value in the right column. Then you can use VLOOKUP to match the amount spent to the correct tier.
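As a sketch using the question's numbers (the cell references are arbitrary): put the tier thresholds 0, 10, 20, 35 in D2:D5 and the cumulative gift counts 0, 1, 2, 3 in E2:E5, then

=VLOOKUP(A1, D2:E5, 2, TRUE)

gives the number of gifts for the spend in A1, because VLOOKUP with is_sorted set to TRUE matches the largest value less than or equal to the search key (the 0 row stops spends under $10 from returning #N/A). Note this covers a single pass through the tiers; the wrap-around described in the EDIT still needs the looping treated in the answer below.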
I am not quite sure how to put everything into one entire formula (is there a formula for looping and building arrays?)
From my understanding the tier amounts are variable, so every time you add a new tier with a new price limit it must be calculated against that new limit... wouldn't it be much easier to write such a module in JavaScript than in a Google Sheet?
Anyway, here is my workaround; maybe it can help you find an idea.
Example Doc
https://docs.google.com/spreadsheets/d/1z6mwkxqc2NyLJsH16NFWyL01y0jGcKrNNtuYcJS5dNw/edit#gid=0
my approach (a runnable sketch follows the steps):
- enter the purchase value
- filter all tiers that are smaller than or equal ("<=") to it (save the matched items somewhere as placeholders)
- then decrease the purchase value by the largest of the filtered tier values
- save the new purchase value somewhere and begin again with the filtering and decreasing
(this needs to be repeated until the purchase value is used up)
- after that, sum up all the placeholders
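Here is the same idea as a standalone sketch (JavaScript, as suggested above; tier thresholds are assumed to repeat identically each cycle, and the subtraction only happens on a full bucket so a partial bucket isn't counted twice):

function giftsFor(purchase, tiers) {
  const bucketSize = Math.max(...tiers); // the bucket empties at the top tier
  let remaining = purchase;
  let gifts = 0;
  while (remaining >= bucketSize) {      // full buckets: every red line is hit
    gifts += tiers.length;
    remaining -= bucketSize;
  }
  // partial bucket: count only the red lines actually reached
  gifts += tiers.filter(t => t <= remaining).length;
  return gifts;
}

console.log(giftsFor(35, [10, 20, 35])); // 3
console.log(giftsFor(45, [10, 20, 35])); // 4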
We have a very active web page with lots of Ajax and regular updates via jQuery. It can load a huge amount of data (up to 100k per minute) for every user in peak situations, and we had 2,000 people online during the last peak.
What we would like to do is count the number of concurrent users. If we're over 500 (and you're not a registered user) then bad luck, hit the road!
Has anyone got a class or some other process for this? Our server recycles every hour, so I am thinking of an application-level variable that adds one to the current count on each successful entry (gold users are exempt from the test but still count towards the quota, so we may end up with 600 users).
Has anyone else played with this idea?
TIA
Just some ideas...
' Serialize access to the shared counter while updating it
Application.Lock
Application("visitors") = Application("visitors") + 1
Application.Unlock
You should stress-test this solution up to the numbers you want to allow. My fair guess is that it will probably hold up.
Consider counting hits on the Ajax URL instead: that gives a more accurate estimate of the load. If you go by sessions, you will not know when I've left; counting via the Ajax requests gives a more accurate number of current visitors.
Just a suggestion: in GLOBAL.ASA, in Session_OnStart, you could increment a count of running sessions in some global (Application) variable.
Do not forget to decrement it in Session_OnEnd in GLOBAL.ASA.
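A minimal GLOBAL.ASA sketch of that idea (the variable name is my own choice; note that Session_OnEnd only fires when a session times out or is abandoned, not the moment a browser closes, so the count will lag a little):

<SCRIPT LANGUAGE="VBScript" RUNAT="Server">
Sub Application_OnStart
    Application("activeSessions") = 0
End Sub

Sub Session_OnStart
    ' A new session has begun: bump the shared counter
    Application.Lock
    Application("activeSessions") = Application("activeSessions") + 1
    Application.Unlock
End Sub

Sub Session_OnEnd
    ' The session timed out or was abandoned: release its slot
    Application.Lock
    Application("activeSessions") = Application("activeSessions") - 1
    Application.Unlock
End Sub
</SCRIPT>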
We've noticed lately that as our site is growing, our data in Google Analytics is getting less reliable.
One of the places we've noticed this most strongly is on the "Realtime Dashboard".
When we were getting 30k users per day, it would show about 500-600 people online at a time. Now that we are hitting 50k users per day, it's showing 200-300 people online at a time.
(Other custom metrics from within our product show that the user behavior hasn't changed much; if anything, users are currently spending longer on the site than ever!)
The daily totals in Analytics are still rising, so it's not as though it's simply missing the hits... Does anyone have any thoughts?
The only thing I can think of is that there is probably a difference in the interpretation of what constitutes a user being online.
How do you determine whether a user is online?
Unless there is explicit login/logout tracking, is it possible that it assumes a user has gone if there is no user-generated event or request from the browser within an interval of X seconds?
If that is the case then it may be worthwhile adding a hidden iframe with some JavaScript code that keeps sending a request every t seconds.
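Something like this heartbeat, for instance (plain JavaScript; /heartbeat is a placeholder endpoint and 30 seconds an arbitrary interval):

setInterval(function () {
    // Fire-and-forget ping so the server (or tracker) sees the user is still here;
    // the timestamp defeats browser caching.
    var beacon = new Image();
    beacon.src = '/heartbeat?ts=' + new Date().getTime();
}, 30000);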
You can't compare instant measures of unique, concurrent users to different time-slices of unique users.
For example, you could have a small number of concurrent unique users (say 10) and a much higher daily unique-user count like 1,000, because 1,000 different people were there over the course of the day but only 10 at any given time. The number of concurrent users isn't tightly correlated with the total daily uniques; the distribution over the course of the day may be uneven, so it's almost apples and oranges.
In the same way, monthly uniques and daily uniques can't be combined, although average daily uniques are a lower bound for monthly uniques.
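As a rough sanity check with invented numbers: average concurrency is approximately daily visits × average session length ÷ seconds in a day, so 50,000 visits a day at around 5 minutes each averages 50,000 × 300 / 86,400 ≈ 174 users online at any instant, which is in the same ballpark as the 200-300 being reported.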
I have a SQL Server database which contains stock market quotes and other related data.
This database needs to be updated at a regular interval, say every minute.
My question is:
How do I get stock quotes every 1 minute and update it to database?
I really appreciate your help.
Thanks!
You know, you are seriously asking the question from the wrong side. It's like saying "I have a car, a Mercedes, a coupe: how can I find the best road from A to B?" Totally unrelated to the car.
Same with your question: this is not a SQL or even an ASP.NET question to start with. The solution is independent of both the SQL Server used and your web technology. Your main question is:
How do I get stock quotes every 1 minute and update the database?
Here we go. I assume you (a) are talking about US stocks and (b) mean all of them, not a handful. One minute is too small an interval to make scraping sites like yahoo.com feasible: the main problem is that there are thousands of stocks (actually well into the tens of thousands), and you don't want to go scraping thousands of pages from Yahoo every minute.
At the same time, an end-retail data feed provider will not work. They support X symbols at a time, with X typically in the low hundreds, sometimes upgradable to 500 or so.
If you need stock data every minute for all US stocks, then this is technically identical to "real-time prices", which ends up costing money. In addition you need a commercial, higher-end data feed, of which I know of... one. Sorry. The cost is going to be close to or fully four digits, without (!) publication rights.
And that is NxCore: their system has an offer covering US stocks (all exchanges) in real time, a complete feed with all corrections etc. It has a native API and a C# wrapper API, so you can take the real-time data feed, update your current prices in memory and write them out to SQL Server every minute. Preferably not from ASP.NET (a baaaaad choice for something that should run 24/7 without interruption, unless you make heavy setup changes etc.) but from an installed Windows service. It takes some bandwidth; I have no real idea how much (I am getting four exchanges from them, but no stocks, only the CME Group futures: CME, CBOT, NYMEX and COMEX).
Note that with this setup you can go faster, too, but if you go fully real-time you need a serious server. We are talking about a billion updates or so per day...
An end-user SQL Server setup (i.e. little RAM and a few slow discs) won't work.
Too expensive? There are plenty of data feeds around for a lower price, but they will not give you "stocks" as in "all of them", just "a selection".
If you are OK with data that isn't real time, i.e. pulling things down at the end of the day, eoddata.com has a decent offer. You could then also pull things in via an ASP.NET page, but again... you will not have the data during the day, just after the close. The smallest granularity is 1 minute. Republication rights are again a no, but you can probably talk to them.
This isn't really SQL Server specific; a typical solution is to run a process that polls an external source (a web service or the like) at regular intervals and uses that information to update the database. You can either implement this as a simple command-line program that gets executed every minute by the task scheduler, or you can make it a Windows service that sleeps most of the time and only wakes up once a minute to do its processing. Once you have that, writing to the database is business as usual.
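A minimal sketch of such a polling loop (Kotlin for illustration; the feed URL, the "SYMBOL,PRICE" line format and the QuoteTable schema are all assumptions, and a JDBC driver for SQL Server must be on the classpath):

import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse
import java.sql.DriverManager

fun main() {
    val http = HttpClient.newHttpClient()
    val conn = DriverManager.getConnection(
        "jdbc:sqlserver://localhost;databaseName=Quotes;integratedSecurity=true")
    val update = conn.prepareStatement(
        "UPDATE QuoteTable SET Price = ?, UpdatedAt = GETDATE() WHERE Symbol = ?")

    while (true) {
        // Poll the (hypothetical) quote endpoint once per cycle.
        val request = HttpRequest.newBuilder(URI.create("https://feed.example.com/quotes")).build()
        val body = http.send(request, HttpResponse.BodyHandlers.ofString()).body()
        for (line in body.lines().filter { it.isNotBlank() }) {
            val (symbol, price) = line.split(",")   // assumes "SYMBOL,PRICE" rows
            update.setBigDecimal(1, price.toBigDecimal())
            update.setString(2, symbol)
            update.executeUpdate()
        }
        Thread.sleep(60_000)   // once a minute; a real service would use a proper scheduler
    }
}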
I'm constructing a prototype site for my company, which has several thousand employees, and I'm running into a wall regarding the implementation of a specific requirement.
To simplify it down with an example, let's say each user has a bank account with interest. Every 5 minutes or so (it can vary) the interest pays out. When users hit the site, they see a timer counting down to when the interest is next supposed to pay out.
The wall I'm running into is that it just feels dirty to have a Windows service (or whatever) constantly hitting the database looking for accounts that need to 'pay out' and taking action accordingly.
That seems to be the only solution in my head right now, but I'm fairly certain a service running a query to retrieve a sorted result set of accounts that need to be 'paid out' just won't cut it.
Thank you in advance for any ideas and suggestions!
Rather than updating records, just calculate the accrued interest on the fly.
This sort of math is pretty straightforward, and the calculations are very likely to be orders of magnitude faster than continuous updating.
Something like the following:
WITH depositswithperiods AS (
    SELECT accountid,
           depositamount,
           interestrate,
           -- whole 5-minute accrual periods elapsed since the deposit
           FLOOR(DATEDIFF(n, deposit_timestamp, GETDATE()) / 5) AS accrualperiods
    FROM deposits
)
SELECT accountid,
       SUM(depositamount) AS TotalDeposits,
       -- compound each deposit forward over its own accrual periods
       SUM(depositamount * POWER(1 + interestrate, accrualperiods)) AS Balance
FROM depositswithperiods
GROUP BY accountid
I assumed compound interest above, and no withdrawals.
Adding withdrawals would require grouping the deposits by time period, taking the sum of each group to get a net deposit per period, and then calculating the interest on those groups.
I don't know whether the interest analogy holds for your actual use case. If the database doesn't need to be kept up to date for all users at all times, you could apply the AddInterest operation for all elapsed periods at once, whenever an up-to-date value is needed: that is, whenever the value is displayed, or when the account balance is about to change.
You could do a single nightly update for all accounts.
A good thing to think about when doing this kind of thing is date/time arithmetic.
If you are charged 10 pence a minute for a phone call, there isn't a computer sitting there counting every second and working out minutes... it just records the date/time at the start and the date/time at the end.
As others suggest, just calculate it when the user tries to view it.
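The same idea as a small sketch (Kotlin for illustration; the 5-minute period and the compounding follow the SQL example above, and all names are my own):

import java.time.Duration
import java.time.Instant
import kotlin.math.pow

// Balance "as of now", derived purely from the stored deposit record;
// no background job ever needs to touch the row.
fun currentBalance(deposit: Double, ratePerPeriod: Double, depositedAt: Instant): Double {
    val periods = Duration.between(depositedAt, Instant.now()).toMinutes() / 5
    return deposit * (1 + ratePerPeriod).pow(periods.toInt())
}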