How to update SQL Server database every 1 minute? - asp.net

I have a SQL Server database which contains stock market quotes and other related data.
This database needs to be updated at a regular interval, say every 1 minute.
My question is:
How do I get stock quotes every 1 minute and update the database with them?
I really appreciate your help.
Thanks!

You know, you are seriously asking the question from the wrong side. It's like asking "I have a car, a Mercedes coupe - how can I find the best road from A to B?" - totally unrelated to the car.
Same with your question - this is not a SQL or even an ASP.NET question to start with. The solution is independent of both the SQL Server used and your web technology. Your main question is:
How do I get stock quotes every 1 minute and update the database?
Here we go. I assume you (a) are talking about US stocks and (b) mean all of them, not a handful. One minute is too small an interval to make scraping a site like yahoo.com feasible - the main problem is that there are thousands of stocks (actually more like tens of thousands), and you don't want to hit Yahoo to scrape thousands of pages per minute.
At the same time, an end-retail data feed provider will not work. They support X symbols at a time, with X typically in the low hundreds, sometimes upgradable to 500 or so.
If you need stock data every minute, for all US stocks, then this is technically identical to "real-time prices", which ends up costing money. In addition you need a commercial higher-end data feed, of which I know of... one. Sorry. The cost is going to be near or fully four digits, without (!) publication rights.
And that is NxCore - they have a data offer covering US stocks (all exchanges) in real time: a complete feed with all corrections etc., with native and C# wrapper APIs. So you can take the real-time data feed, update your current pricing in memory, and write it out to SQL Server every minute. Preferably not from ASP.NET (a baaaaad choice for something that should run 24/7 without interruption unless you make heavy setup changes etc.) but from an installed Windows service. It takes some bandwidth - I have no real idea how much (I am getting 4 exchanges from them, but no stocks, only the CME Group futures: CME, CBOT, NYMEX and COMEX).
Note that with this setup you can go faster, too, but if you go fully real time you need a serious server. We are talking about a billion updates or so per day...
An end-user SQL Server setup (i.e. little RAM and a few slow discs) won't work.
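To make the feed-plus-flush pattern concrete, here is a minimal sketch in Python: the feed callback only touches an in-memory dictionary, and a timer writes the latest prices to SQL Server once a minute. This is an illustration only - on_tick stands in for whatever callback your feed wrapper provides (it is not NxCore's actual API), and the connection string and quotes table are placeholders.

import threading
import pyodbc

latest = {}                    # symbol -> last price seen
lock = threading.Lock()

def on_tick(symbol, price):    # wired to the data feed; called per trade
    with lock:
        latest[symbol] = price

def flush_to_db():
    with lock:
        snapshot = dict(latest)          # copy so the feed is never blocked
    conn = pyodbc.connect("DSN=quotes")  # placeholder connection string
    with conn:                           # commits on successful exit
        cur = conn.cursor()
        for symbol, price in snapshot.items():
            # assumes a row per symbol already exists in the quotes table
            cur.execute("UPDATE quotes SET price = ?, updated_at = GETDATE() WHERE symbol = ?",
                        price, symbol)
    threading.Timer(60, flush_to_db).start()  # re-arm the one-minute timer

flush_to_db()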
Too expensive? There are plenty of data feeds around for a lower price, but they will not give you "stocks" as in "all of them", just "a selection".
If you are OK with data that is not real-time - i.e. pulling stuff down at the end of the day - eoddata.com has a decent offer. You could also then pull things in via an ASP.NET page, but again... you will not have the data during the day, just - well - after the close. The smallest granularity is 1 minute. Republication rights are again a no - but you can probably talk to them.

This isn't really SQL Server specific; a typical solution is that you run a process that polls an external source (a web service or the like) at regular intervals and uses this information to update the database. You can either implement this as a simple command-line program that gets executed every minute from the Task Scheduler, or you can make it a Windows service that sleeps most of the time and only wakes up once a minute to do its processing. Once you have that, writing to the database is as usual.
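A minimal sketch of that polling loop in Python, assuming a hypothetical HTTP quote endpoint and table schema (with the Task Scheduler variant you would drop the loop and the sleep and just run the body once):

import time
import requests
import pyodbc

while True:
    # the URL and JSON shape are assumptions for illustration
    quotes = requests.get("https://example.com/api/quotes", timeout=10).json()
    conn = pyodbc.connect("DSN=quotes")  # placeholder connection string
    with conn:
        cur = conn.cursor()
        for q in quotes:
            cur.execute("UPDATE quotes SET price = ?, updated_at = GETDATE() WHERE symbol = ?",
                        q["price"], q["symbol"])
    time.sleep(60)  # wake up once a minute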

Related

Azure Time Series Insights Gen2 slower than preview?

We have a couple of environments still running the Time Series Insights Preview version. It is really fast and we are very satisfied with it. However, new environments seem a lot slower on the official release. Warm path extraction is a lot slower, but still doable, while cold path extraction becomes unbearable.
EDIT: We needed to add &storeType=WarmStore to query warm data. Cool! That works really fast again! The question about the cold store still stands:
It is hard to compare the different environments because the datasets are not exactly the same, but for our new environment we have about 4.5 TB of sensor data imported into TSI.
The following screenshot shows a query that tries to retrieve one minute of data for one device (each device only sends data every 10 seconds) in the far past of 2018. However, the server returns the call after 30 seconds with a continuation token, saying it couldn't retrieve all 6 values in time. Sometimes it manages to return all 6 values, but it still takes 30 seconds.
My internet download speed, while performing the query, was over 80 Mb per second, so that shouldn't be an issue either.
Is this something we should be worried about in the new release?
Please submit a support ticket through the Azure portal with all of these details and the product team will investigate.

total registered vs. concurrent users

Is there a proper way - an equation or technique in general (or rockHardExperience) - to say, "My web application needs to support N total users, which via this equation/technique tells me that I need to support X concurrent page requests"?
From my research and/or gut feeling it seems like it would be something like:
totalLoadCapabilityRequired = (totalUsersN * 0.10) * 0.5
where 0.10 represents roughly 10% of users being online at any given time,
and the whole thing is multiplied by 50% to suggest a 50% chance of those online users executing a request at roughly the same time.
Any insights would help me make sure the support I implement in my application is on par with the demand. I expect a lot of users but don't want to over-anticipate too early. I know for starters that the org I am programming for will have 45,000 users that they want to use my system, with the anticipation that success will bring many more.
Here are a couple of things to think about:
What's the time span in which you expect the bulk of your visits? If it's an office application within the same physical company, your capacity planning should be based on an 8-hour period. If most visits will come from the same continent you can plan for a 12-hour period instead, etc. Base your visitor spread on that.
Which pages do you anticipate will be the most popular and how heavy are those pages (i.e. how many pages can you load in one second)? Get an understanding of parts that would benefit from caching to squeeze out more performance.
Don't plan based on peak load; design your app to scale and start small.
Design your app in a way that you can take run snapshots at every 500th request; you can use tools like xhprof to create files that you can run through cachegrind tools to analyze the performance as it runs.
In short, there's no catch-all formula :) For a ballpark figure your formula will probably be good enough, but take the above points into consideration.
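Plugging the 45,000 users from the question into the asker's own rule of thumb gives a quick ballpark (a sketch - the 10% and 50% factors are the question's assumptions, not measured values):

total_users = 45_000
online_share = 0.10        # assume ~10% of registered users online at once
simultaneous_share = 0.50  # assume ~50% of those request at the same time

concurrent_requests = total_users * online_share * simultaneous_share
print(concurrent_requests)  # 2250.0 concurrent page requests to plan for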

Google Analytics API recommends running queries on dates that are at least 48 hours in the past for consistent results

Does this mean I can't see the traffic I got today? Also, is this specific to the API, or does it apply to the overall Analytics system?
The reason Google recommends this is that for most of the data there is about a 24-hour delay before you see it in reports or have it available for pulling with the API. The extra 24 hours on top of that is a buffer for insurance.
So if you look at a report or pull data with the API from, say, 12 hours ago, and then wait an hour or whatever and pull the data with the same ranges/metrics/etc., the numbers won't match up, because by then more data will have become available. But it's data that was already there (people didn't take a time machine into the past and visit your site, obviously)... it was just not yet processed and available for viewing through the report/API.
A delay in data for reports (or through an API) is not unique to GA. Different reporting tools have different "lags" in data availability, depending on how their databases are set up, how they process the data, how much you are paying for the services, etc. For instance (these are the 4 major tools I've used):
Yahoo Web Analytics data is more or less real-time
Adobe/Omniture SiteCatalyst is... they say real-time, but in practice I've seen it take anywhere from instant to an hour
WebTrends has a 24 hour delay
GA has a 24 hour delay
But this isn't as big a deal as you might think. Most companies look at reports by the week, month, quarter, or year, so really the delay isn't a problem for the people who matter. The only people who really feel it are the code implementers who have to sit there and wait for data to come in when they are trying to QA an implementation or debug a potential problem.
But even then there are a lot of tools out there that let you see in real time what is physically being sent to the tool (like Firebug, Charles Proxy, etc.), which greatly helps with QA. It doesn't help as much with QAing things that require settings/alterations within the tool's interface, but still, it's a big help.

asp.net web site developer pricing

I have been approached to build some websites for a few small businesses. They want a basic, out-of-the-box, database-driven website with some standard stuff (users, authentication, a few dynamic pages, etc.). I am going to use ASP.NET MVC for this.
They have asked me how much I charge for this. My problem is that I have no frame of reference here. Should I charge a flat fee for the project or an hourly rate? Where do I start in determining the correct pricing for a website project?
Charge an hourly fee that is about 3x the hourly rate you would command in a full-time job. The 3x multiplier basically evens things out for the benefits, etc. that you won't get as a 1099 contractor.
Whatever you do, no matter how "standard" it sounds, do not charge a flat fee. Under that arrangement they have no incentive to curb feature creep. Even if you agree on a really tight spec up front, it is a recipe for disaster because it forces you to renegotiate every time they want something more. Under an hourly arrangement, feature creep works to your advantage.
Also, don't discount the hourly rate if you are a novice. Just don't bill unproductive hours. It is much easier to ease into billing more hours later than to renegotiate the price per hour.
Charge per hour.
-- edit
So attempt to 'quote' it by estimating the number of hours. Make sure your estimate is conservative.
A nice approach is to consider, in your head, the 'min', 'max', and 'standard' amounts of time, then use those to estimate the real time it will take you.
If you know that they know what they want and won't change the specs on you, go for a lump sum. That way you can work quickly.
If they are prone to change their minds and don't know what they want, go for an hourly fee. That way you won't be stuck working on their project for months without additional pay when they can't decide on exactly what they want.
I admit that I don't know much about this issue. However, I would still like to warn about the whole charge-per-hour mentality. While this approach protects the developer, it doesn't sit well with the business owner:
Charge-per-hour, to the business owner, is a liability, whereas fixed-price is just a cost. That's one.
The second thing is: if you are charging per hour, how are you going to justify your "research time"? Are you going to charge for that as well? Business owners don't like to pay for research time. Or you can stick to your old tricks, do something that has been reinvented N times, and charge for the amount of time you spend - but that would seem unethical to some.
I have billed both by the hour and by the project. It's been my experience that customers are happier with project-based billing than with hourly billing.
With that in mind, I always pad the project cost by an amount I feel will cover the times when the client decides to change their mind. Further, I keep the project plan pretty simple. For example, I don't write 4 pages on how the login screen will work. Instead, it's a single bullet point: "Login Page". This allows both them and me a little flexibility.
Because I keep things simple AND allow time for flexibility AND the clients know how much it's going to cost up front, my clients are happier and I can keep better track of my income. Also, I keep in pretty close contact with them. As long as you keep the relationship good, you'll have a long-term client.
Of course, it takes a bit of self-discipline in combination with experience to know how long things take to build. Along these lines, I never experiment on a client's dime. When I write the proposal, I already know what I'm going to use to get the job done, and I've used those tools before. Because of this I can say with confidence that a login page will take a certain amount of time to put together.
Next, don't bite off more than you can chew. If it's a big project, break it up into smaller deliverables with their own pay schedule. That way the client (or you) can decide to walk away at any point. For example, if you think the project will take 3 months, break it up into 3 pieces. Incidentally, this helps with cash flow.
Finally, don't discount your time when getting started. That scares people.
I have a flat rate I charge for sites, and I outline exactly what the client will get; anything beyond that outline gets charged at an hourly rate. The hard part is that if you're getting into a project where you're not sure how long it will take, you might want to break down the various pieces and then add at least 10 hours to that estimate. You don't want to sell yourself short, but you also don't want to overcharge the customer. Be sure you're clear up front that once the site is delivered, all changes are billed per hour or under a maintenance fee structure.
Good luck.

ASP.Net + SQL Server Design/Implementation Suggestions

I'm constructing a prototype site for my company, which has several thousand employees, and I'm running into a wall regarding the implementation of a specific requirement.
To simplify it with an example, let's say each user has a bank account that earns interest. Every 5 minutes or so (this can vary) the interest pays out. When the user hits the site, they see a timer counting down to when the interest is next supposed to pay out.
The wall I'm running into is that it just feels dirty to have a Windows service (or whatever) constantly hitting the database looking for accounts that need to 'pay out' and taking action accordingly.
This seems to be the only solution in my head right now, but I'm fairly certain a service running a query to retrieve a sorted result set of accounts that need to be 'paid out' just won't cut it.
Thank you in advance for any ideas and suggestions!
Rather than updating records, just calculate the accrued interest on the fly.
This sort of math is pretty straightforward, and the calculations are very likely to be orders of magnitude faster than continuous updating.
Something like the following:
WITH depositswithperiods AS (
    SELECT accountid,
           depositamount,
           interestrate,
           FLOOR(DATEDIFF(minute, deposit_timestamp, GETDATE()) / 5) AS accrualperiods
    FROM deposits
)
SELECT accountid,
       SUM(depositamount) AS TotalDeposits,
       SUM(depositamount * POWER(1 + interestrate, accrualperiods)) AS Balance
FROM depositswithperiods
GROUP BY accountid
I assumed compound interest above, and no withdrawals.
The addition of withdrawals would require creating a group of deposits for each time period, taking the sum of those to get a net deposit for each time period, and then calculating the interest on those groups.
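As a rough illustration of that grouping idea (sketched in Python rather than SQL, with hypothetical names; withdrawals are represented as negative amounts):

from collections import defaultdict

def balance(transactions, rate, current_period):
    """transactions: list of (period_index, signed_amount) pairs;
    deposits are positive, withdrawals negative."""
    net_by_period = defaultdict(float)
    for period, amount in transactions:
        net_by_period[period] += amount  # net deposit for each time period
    total = 0.0
    for period, net in net_by_period.items():
        # compound each period's net amount forward to the current period
        total += net * (1 + rate) ** (current_period - period)
    return total

# deposit 100 in period 0, withdraw 50 in period 6, 0.1% interest per period
print(balance([(0, 100.0), (6, -50.0)], 0.001, 12))  # ~50.91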
I don't know if the interest analogy will hold for your actual use case. If the database doesn't need to be kept up to date for all users at all times, you could apply the AddInterest operation multiple times at once when you need an up-to-date value. That is, whenever the value is displayed, or when the account balance is about to change.
You could do a single nightly update for all accounts.
A good thing to think about when doing this kind of thing is DateTime.
If you are charged 10 pence a minute for a phone call, there isn't a computer sitting there counting every second and working out the minutes... it just records the date/time at the start and the date/time at the end.
As others suggest, just calculate it when the user tries to view it.
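A minimal sketch of that calculate-on-view idea, using the question's 5-minute payout period (the function name and the stored-principal-plus-timestamp schema are assumptions):

from datetime import datetime, timezone

PERIOD_SECONDS = 5 * 60  # the question's interest "pays out" every 5 minutes

def current_balance(principal, rate_per_period, deposited_at):
    # derive the balance on demand from the stored principal and an
    # aware UTC timestamp; nothing polls or updates in the background
    elapsed = (datetime.now(timezone.utc) - deposited_at).total_seconds()
    completed_periods = int(elapsed // PERIOD_SECONDS)
    return principal * (1 + rate_per_period) ** completed_periods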
