Manage more than 1000 active Team Projects in TFS2010 or TFS2012 [closed] - scaling

Can TFS2010 or TFS2012 manage more than 1000 active Projects?
Currently we use Quality Center, where each project is a separate SQL database with about 100 user accesses per day.
I found the TFS 2010 limits in the "Scaling Team Foundation Server 2012" whitepaper:
TFS 2010 Limits
200 Team Projects per Team Project Collection
50 – 200 Active Team Project Collections per SQL Instance (range for 8 GB – 64 GB of RAM)
So would it be possible to create 10 collections with 100 team projects in each? And how much RAM would I need?
If I have 200 team projects in one collection, is that the same as having 200 collections with one project each, so that the amount of required RAM is the same?
Does anybody have experience with this number of projects?

A project collection corresponds to a database, so having 200 projects in one project collection is definitely not the same as having 200 project collections with one project each. In the first case you'll have one big database; in the second you'll have 200 databases.
As for scaling, the rules are pretty clear, you can have up to 200 projects in a project collection.
Depending on the size and activity of each project collection, you can have between 50 and 200 project collections on a SQL Server instance (which is not the same as a TFS instance!). Activity in particular greatly influences the amount of memory needed and is thus the biggest factor in deciding the number of project collections. 200 stale collections are easy to maintain, but 75 very active ones might be the limit in your case.
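The whitepaper only gives the two endpoints of that range (50 active collections at 8 GB, 200 at 64 GB) and doesn't define the curve between them. Purely as a back-of-the-envelope sketch, assuming linear scaling between those endpoints (my assumption, not the whitepaper's), a sizing estimate could look like this:

```python
# Back-of-the-envelope RAM estimate for a TFS SQL instance, interpolating
# linearly between the whitepaper's endpoints: 50 active collections at
# 8 GB and 200 active collections at 64 GB. The linear shape is an
# assumption; actual needs depend heavily on how active each collection is.

def estimate_ram_gb(active_collections: int) -> float:
    lo_coll, lo_ram = 50, 8.0    # whitepaper lower bound
    hi_coll, hi_ram = 200, 64.0  # whitepaper upper bound
    if active_collections <= lo_coll:
        return lo_ram
    if active_collections >= hi_coll:
        return hi_ram
    frac = (active_collections - lo_coll) / (hi_coll - lo_coll)
    return lo_ram + frac * (hi_ram - lo_ram)

# The asker's scenario of 10 collections (100 projects each) is well
# under the 50-collection lower bound, so ~8 GB is the floor estimate.
print(estimate_ram_gb(10))   # 8.0
print(estimate_ram_gb(125))  # 36.0
```

By that rough estimate the asker's 10 collections sit below the lower bound, so 8 GB is the floor; real memory needs are dominated by activity, as noted above.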
You can have any number of SQL Server instances linked to your TFS server, and after creating a project collection you can move it to another SQL Server host (see http://healthedev.blogspot.nl/2011/12/move-tfs-2010-project-collection.html).
When you're talking about databases of this size, make sure that TFS Reporting is installed on its own box(es), that the TFS warehouse refresh frequencies make sense (don't do every-minute updates with these amounts of data), and that the TFS_Warehouse database and the Analysis Services cube each have a SQL instance with plenty of memory and CPU to process properly.

Related

Updating SQLite DB with MSI?

We have a product with 3 main components
1) a client application
2) a network server
3) Datasets, mostly containing (read-only - from the customer's perspective) documents stored in BLOB fields within a SQLite DB
The client application can access datasets stored directly on that machine (many users are on non-networked laptops) or via the network server.
The data needs to be updated from time to time - the whole datasets can be several GB, so for updates we wish to only send out those documents that are new or have been revised. A patch in a sense. Our customers tend to like MSIs to incorporate in their own distribution strategies. Some are adamant about accepting nothing else.
How feasible is it to update a SQLite DB via MSI (without a complete overwrite of the DB file)?
I have 2 strategies in mind but both have drawbacks
1) The MSI installs some files onto the customer machine (workstation or server), and when the client or server software detects them it runs some sort of DB merge (see the sketch after this list).
2) The MSI accesses functions in a custom DLL (I'm not an MSI expert, so I don't even know if this is possible) to merge new content into the DB. I suspect custom DLLs really defeat the point of repackageability of MSIs.
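To make strategy 1 concrete, here is a minimal sketch of what the merge step could look like, assuming the MSI drops a second SQLite file (patch.db) containing only new and revised rows next to the main database, and that documents live in a table documents(id, revision, body). All file and table names are hypothetical:

```python
# Minimal sketch of the DB-merge step from strategy 1 (hypothetical names).
# Assumes the MSI has dropped patch.db next to main.db, and that both
# contain a table: documents(id INTEGER PRIMARY KEY, revision INTEGER, body BLOB).
import os
import sqlite3

MAIN_DB = "main.db"
PATCH_DB = "patch.db"

def apply_patch(main_path: str, patch_path: str) -> None:
    con = sqlite3.connect(main_path)
    try:
        con.execute("ATTACH DATABASE ? AS patch", (patch_path,))
        with con:  # single transaction: all rows merge or none do
            # Insert new documents and overwrite revised ones in one pass;
            # the patch DB contains only new/revised rows by construction.
            con.execute(
                "INSERT OR REPLACE INTO documents (id, revision, body) "
                "SELECT id, revision, body FROM patch.documents"
            )
        con.execute("DETACH DATABASE patch")
    finally:
        con.close()

if os.path.exists(PATCH_DB):
    apply_patch(MAIN_DB, PATCH_DB)
    os.remove(PATCH_DB)  # consume the patch so it isn't applied twice
```

The MSI then only has to deliver and register patch.db; the application (or server) does the transactional merge on its next start, which keeps the installer itself dumb and repackageable.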
This is far from my area of expertise, so can anyone suggest potential solutions?
Thanks for your time

Lowest level database support for a free website

I am looking to create a small personal web site using Azure web sites free tier. As part of the project I would add minimal database support. This is a hobby project so I want to stay within the constraints of the free web sites.
Here are some of the options I am considering:
SQLite
RavenDb (RavenDB Asp.Net Hosted or Embedded)
CouchDb (don't think this will work but I have not investigated enough)
.sdf file
Short of using XML files, is there a good database option for the free web site tier?
When you create an Azure website you get a 20-megabyte SQL Server database for free. Granted, that is not much, but if you are considering SQLite, perhaps that is enough.
The small print from their website:
Free and Shared (Preview) tiers include 60 minutes and 240 minutes of CPU capacity per day, respectively. These quotas are per sub-region unless noted otherwise. One 20MB Azure SQL Database and one 20MB MySQL database are available at the subscription level for the first twelve months of use; standard rates apply thereafter.
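For completeness, a minimal sketch of talking to that free 20 MB database from Python via pyodbc; the server, database, and credential values are placeholders you would replace with the values from the Azure portal, and the ODBC driver name is an assumption that depends on what is installed locally:

```python
# Minimal sketch: connecting to the free 20 MB Azure SQL database.
# Server name, database name, and credentials are placeholders; the real
# values come from the Azure portal. The driver name depends on your setup.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=tcp:yourserver.database.windows.net,1433;"
    "DATABASE=yourdb;UID=youruser;PWD=yourpassword;"
    "Encrypt=yes;TrustServerCertificate=no;"
)
cursor = conn.cursor()
cursor.execute("SELECT @@VERSION")
print(cursor.fetchone()[0])
conn.close()
```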

Using same cloudControl MySQLd addon with multiple apps [closed]

It is unclear to me how cloudControl MySQLd addon works.
My understanding of MySQLd is that it is a MySQL server that can/will work with unlimited apps.
But since all add-ons are app-based, this could also mean that I cannot use the same MySQLd server with multiple apps.
Could anyone please help me understand if one MySQLd instance can be used with multiple apps hosted on cloudControl?
There are two concepts on the cloudControl PaaS: applications and deployments. An application is basically just a way of grouping developers and deployments together. Each deployment is a distinct running version of the app from a branch matching the deployment name. More details on this can be found in the Apps, Users and Deployments documentation.
All add-ons are always per deployment. We do this because this way we can provide all credentials as part of the runtime environment. This means you don't have to keep credentials in version-controlled files, which is a huge benefit when merging between branches, because you don't risk accidentally talking to e.g. the live database from a dev deployment. Also, add-on credentials can change at any time at the add-on provider's discretion.
For this reason separation between deployments makes a lot of sense. Usually your dev deployments also don't need the same database power as the production deployment for example. So you can easily use a smaller plan or even a shared database (e.g. MySQLs) for development. You can read more about how to use this feature inside your code in the Add-on documentation.
Also, as explained earlier, add-on credentials are always provided as part of the runtime environment. Credentials can change at any time at the add-on provider's discretion. These changes are automatically provided in the environment and the app processes are restarted. If you had hard-coded the credentials, as would be required for the second app, the app would probably experience downtime.
Last but not least, it's usually very bad practice to connect to the same database from two different code bases in different repositories, which would be the reason to have a second app. This causes all kinds of potential conflicts and dependencies that make code changes and database migrations extremely hard to maintain over time. The recommended way would be to have the data owned by one code base only and provide an API to access that data from the second code base.
All this being said, it is technically possible to connect multiple deployments or even apps to the same add-on (database or anything else) but highly advised against.
If you have a good reason to connect two apps/deployments to the same database I would suggest you manually launch an RDS instance at Amazon (MySQLd is based on RDS) and provide credentials for that through the custom config add-on to both of your apps/deployments.
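Either way, the pattern is the same: read the credentials from the runtime environment instead of hard-coding them, so a credential rotation never requires a code change. A minimal sketch, with hypothetical variable names (check the add-on documentation for the actual keys your add-on exposes):

```python
# Minimal sketch: read DB credentials from the runtime environment instead
# of hard-coding them, so credential rotations don't require a code change.
# The variable names are hypothetical; use whatever keys your add-on exposes.
import os

db_config = {
    "host": os.environ["MYSQL_HOST"],
    "port": int(os.environ.get("MYSQL_PORT", "3306")),
    "user": os.environ["MYSQL_USER"],
    "password": os.environ["MYSQL_PASSWORD"],
    "database": os.environ["MYSQL_DATABASE"],
}

# Pass db_config to your MySQL client library of choice, e.g.:
# import pymysql; conn = pymysql.connect(**db_config)
```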
I hope this answers your question and also explains the reasons.

Windows Azure Can I run multiple WebSites on the same Extra small instance or Small instance [closed]

I'm evaluating MS cloud Windows Azure for hosting 3 completely separated websites.
Every website has its own database and they are not connected, so 3 websites and 3 databases.
My aim is to optimize costs for a start-up project with the possibility to scale up on demand.
I would like to know:
Whether it is possible to host 3 websites on the same instance (extra small or small).
Whether it is possible to host 3 databases in the same SQL Azure database (so that my 3 databases share the total SQL storage allowance), or whether I have to pay for a separate SQL Azure database per website.
Thanks for your time on this.
You can absolutely run multiple web sites on the same instance, starting with SDK 1.3, as full IIS is now running in Web Roles. As Jonathan pointed out with the MSDN article link, you can set up the Sites element to define each website. You should also check out the Windows Azure Platform Training Kit, which has a lab specifically around building a multi-site web role.
You can also take advantage of something like Cloud Ninja or Windows Azure Accelerator for Web Roles, which provides a multi-tenant solution that you can load into your Web Role (check out the Cloud Cover Show video here for more info).
When hosting multiple websites, remember that they're all sharing the same resources on an instance. So you might find that an Extra Small instance won't meet your performance needs (it's limited to 768MB RAM and approx. 5Mbps bandwidth). I think you'll be fine with Small instances and scaling out as you need to handle more traffic.
For the past several months, I've been running three websites on a pair of extra small instances, including albahari.com, linqpad.net and the LINQPad licensing server (which uses LINQ to SQL). The trick is to serve large static content directly from blob storage so that it's not subject to the 5MBit/second I/O bandwidth restriction. And I've never got anywhere close to running out of memory.
A pair of extra small Azure instances is a great alternative to shared hosting when you need better reliability, security and performance.
Edit: close to a year now, still no problems with multiple websites on Azure. I will never go back to shared hosting.
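To illustrate the blob-storage trick mentioned above: upload the large static assets once at deploy time and let pages link to the blob URLs, so that traffic never touches the instance. A minimal sketch using the current azure-storage-blob Python package (which postdates this answer; account, container, and file names are placeholders):

```python
# Minimal sketch: upload large static assets to Azure Blob Storage so they
# are served from blob URLs instead of consuming the instance's ~5 Mbps
# bandwidth. Account, container, and file names are placeholders.
from azure.storage.blob import BlobServiceClient, ContentSettings

service = BlobServiceClient.from_connection_string(
    "DefaultEndpointsProtocol=https;AccountName=youraccount;"
    "AccountKey=yourkey;EndpointSuffix=core.windows.net"
)
container = service.get_container_client("static")  # assumed public-read container

with open("downloads/big-installer.zip", "rb") as f:
    container.upload_blob(
        name="big-installer.zip",
        data=f,
        overwrite=True,
        content_settings=ContentSettings(content_type="application/zip"),
    )

# Pages then link to e.g.
# https://youraccount.blob.core.windows.net/static/big-installer.zip
```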
You can definitely run 3 websites in the same instance. Check out this MSDN article that shows you how to form your configuration file such that you can host multiple websites within a single role. One thing to note though since you mentioned "scaling on demand" - when you scale an instance with multiple websites, you are scaling the instance, which means all of the sites will scale together. You can't scale just one of the sites on the shared instance.
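For reference, the configuration that MSDN article describes lives in ServiceDefinition.csdef. A sketch of two sites sharing one role and one port-80 endpoint, routed by host header (all names and domains are placeholders):

```xml
<!-- ServiceDefinition.csdef sketch: two sites in one web role, both bound
     to the same port-80 endpoint and routed by host header. Names and
     domains are placeholders. -->
<ServiceDefinition name="MyService"
    xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
  <WebRole name="WebRole1" vmsize="Small">
    <Sites>
      <Site name="SiteA" physicalDirectory="..\SiteA">
        <Bindings>
          <Binding name="HttpIn" endpointName="HttpIn" hostHeader="www.sitea.example" />
        </Bindings>
      </Site>
      <Site name="SiteB" physicalDirectory="..\SiteB">
        <Bindings>
          <Binding name="HttpIn" endpointName="HttpIn" hostHeader="www.siteb.example" />
        </Bindings>
      </Site>
    </Sites>
    <Endpoints>
      <InputEndpoint name="HttpIn" protocol="http" port="80" />
    </Endpoints>
  </WebRole>
</ServiceDefinition>
```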
For the databases, in theory this can be done, but it would be "manual": you would have to put all of the tables from the three databases into the same database, and you would probably want to prefix them with some sort of indicator so that you know which table belongs to which application. This is certainly not a recommended practice, but if it works for your solution, there is nothing technical preventing you from doing it. If at all possible, I would recommend multiple databases.

ASP.NET deployment and regulatory compliance (SOX, et al) [closed]

I have a customer who is being dogged pretty hard by SOX auditors regarding the deployment practices of our ASP.NET applications. Care is taken to use appropriate file- and folder-level security and authorization. Only the few with deployment privileges can copy files up to the production server (typically done using secure FTP).
However, the file/folder-level security and the requirement of secure FTP isn't enough for the bean counters. They want system logs of who deployed what when, what version replaced what version (and why), and generally lots of other minutiae designed to keep the business from being Office Spaced (the bean counters apparently want the rounded cents all to themselves).
What are your suggestions for making the auditors happy? We don't mind throwing some dollars at this (in fact, I think we would probably throw big dollars at a good enough solution).
You probably want to look at an automated deployment solution, and you are going to need a formal change control process. We use AnthillPro; it can track what version was deployed and when.
To satisfy SOX we had a weekly meeting about what was getting deployed and when. Each deployment had to be approved by the compliance manager, and a form had to be filled out explaining what, why, and how something was being changed. Once the form was filled out, a third person had to be involved to make the change (not the person requesting or the person approving; neither of them can have access to the production environment, because of the separation-of-duties rule you have to follow). The change was based solely on what was in the "change document", with no outside communication from the person making the request. Once deployed, everyone had to sign off that it was done and when.
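Tooling aside, even a thin deployment wrapper that appends an audit record of who deployed what, when, which version replaced which, and under which approved change request covers much of what the auditors ask for. A minimal sketch, with hypothetical paths, fields, and change-ticket convention:

```python
# Minimal sketch of an audit trail written by a deployment wrapper:
# who deployed, when, which version replaced which, and the approved
# change-request ID. Paths and field names are hypothetical.
import csv
import datetime
import getpass

AUDIT_LOG = r"\\fileserver\audit\deployments.csv"  # append-only share in practice

def record_deployment(app: str, old_version: str, new_version: str,
                      change_request: str) -> None:
    with open(AUDIT_LOG, "a", newline="") as f:
        csv.writer(f).writerow([
            datetime.datetime.now().isoformat(),
            getpass.getuser(),   # who performed the deployment
            app,
            old_version,         # what was replaced...
            new_version,         # ...and by what
            change_request,      # why: the approved change-request ID
        ])

record_deployment("BillingWeb", "2.4.1", "2.5.0", "CR-1042")
```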
It shouldn't be too hard to meet the requirements; it might require some changes to your development processes, but it's definitely possible.
What you need is:
A task tracking system, showing descriptions of work, and approvals
The ability to link documents, as well as packages to this system.
A test system to test your deployments onto.
Finally, all deployments must be done via installation packages or other scripted means.
Any manual changes must be documented and approved too.
Also turn on auditing, run regular security tests, and document almost everything.
All of this is possible with a number of systems, the biggest change is the changes to your internal processes.
You might want to take a look at the auditing features provided by NTFS.
