Multiple Resources & Reporting - ms-project

I have a question about creating a custom report in Project 2013. We currently have projects going on that consist of 3 or 4 project plans with interconnected dependencies on each other. When I bring all of the plans together and try to run a resource report, I run into an issue: many of our tasks have 2 or more resources assigned to them, and when I group by resource, it does not break them out. I have tried using the "Resource Usage" view, but I need to have all of the project plans open to produce a nice-looking report, and even then it is not very presentable. I am looking to create a report that breaks out what each individual resource has assigned to them over the next 2-3 weeks. It is more a report to hand to our business leads to say "here is what we need from your teams and here's what they are assigned to." I am looking to break it out like this:
By resource - list only one person at a time (not the whole group of people when 2+ are assigned to a task), show only the hours that that person has allocated to the task, and move on.
Is there a way to do this?

The non-code (non-VBA) solution is to use a resource pool instead of having a separate list of resources in each file. Once the resources are pooled (shared) from a single file, all resource reporting can be done against that file. There are a lot of caveats that go along with doing it this way, though.
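If you do want the code/VBA route, a minimal sketch along these lines can produce the per-resource breakout. Run it from the master project with the subprojects (or the shared pool) open; the three-week window and the Debug.Print output are just illustrative choices, not the only way to do it:

    Sub ResourceBreakout()
        Dim r As Resource
        Dim a As Assignment
        Dim windowEnd As Date
        windowEnd = Date + 21   ' report window: the next three weeks

        For Each r In ActiveProject.Resources
            If Not r Is Nothing Then          ' skip blank resource rows
                Debug.Print r.Name
                For Each a In r.Assignments
                    ' only assignments that overlap the reporting window
                    If a.Start <= windowEnd And a.Finish >= Date Then
                        ' Assignment.Work is stored in minutes
                        Debug.Print "  " & a.TaskName & ": " & a.Work / 60 & " h"
                    End If
                Next a
            End If
        Next r
    End Sub

Because each resource is enumerated individually, a task with 2+ resources shows up once under each person with only that person's hours - which is the breakout described above. Swap the Debug.Print calls for writes to a text file or Excel if you need a hand-out for the business leads.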


Getting a handle on a huge classic asp project

I've been working with an ASP Classic site with over 500 files, some of which are used and some of which aren't, along with a database with hundreds of procs, functions, and tables that is in the same shape.
I need a way to get a grip on the project so I can eventually migrate it. I don't have time yet to walk through every single page and look at the SQL (stored procedures are in the database and are called properly within the ASP pages), so I'm at a loss as to how to get a handle on this.
My immediate thought is to make ASP classes and put them into the pages as I go - they'd pretty much be used for getting and setting fields, validation, and sending recordsets into display functions.
Is this a reasonable approach? Am I missing some strategy?
How would you approach this? A migration to another platform has been considered, but isn't feasible in the short term (the next couple of months).
You can try to compile the project using http://aspclassiccompiler.codeplex.com/, or you can migrate to ASP.NET MVC one page at a time (as needed) and run a mix of both in the meantime.
My simple advice is: stop thinking about code. Spend more time with the UI, actually using it, and spend time examining the database schema in detail.
Edit
If you are trying to determine which pages are active, use IIS logging to harvest the distinct pages hit. Also do some scripting to collect the names of the files in the site and text-search the files for any occurrence of those names. This information should identify the parts of the site that are rarely used or dead.
However, in all probability there will be considerable content in the "active" files which is also dead. Let me reiterate: do not actually add classes or refactor the code at this stage; you should concentrate on understanding what it does, not how it does it. Understanding the DB schema is a vital step, and then understanding which UI interactions bring about specific changes in the DB.
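To make the cross-referencing pass concrete, here is a rough VB.NET sketch of the idea (the paths, the .log extension, and the token matching are assumptions about a standard IIS W3C setup; treat the output as candidates, not verdicts):

    Imports System.Collections.Generic
    Imports System.IO
    Imports System.Linq

    Module DeadPageScan
        Sub Main()
            Dim siteRoot = "C:\inetpub\wwwroot\mysite"          ' hypothetical path
            Dim logDir = "C:\inetpub\logs\LogFiles\W3SVC1"      ' hypothetical path

            ' every .asp file in the site, and all source text in one big string
            Dim pages = Directory.GetFiles(siteRoot, "*.asp", SearchOption.AllDirectories)
            Dim allText = String.Join(vbLf,
                pages.Select(Function(p) File.ReadAllText(p).ToLower()))

            ' every .asp file name that appears anywhere in the IIS logs
            Dim hitPages As New HashSet(Of String)(
                Directory.GetFiles(logDir, "*.log").
                    SelectMany(Function(f) File.ReadLines(f)).
                    SelectMany(Function(l) l.Split(" "c)).
                    Where(Function(t) t.ToLower().EndsWith(".asp")).
                    Select(Function(t) Path.GetFileName(t).ToLower()))

            ' a page never referenced in code and never hit is probably dead
            For Each page In pages
                Dim name = Path.GetFileName(page).ToLower()
                If Not allText.Contains(name) AndAlso Not hitPages.Contains(name) Then
                    Console.WriteLine("Candidate dead page: " & page)
                End If
            Next
        End Sub
    End Module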

application to list the page elements of a URL

I need to make an application which will access a URL (like http://google.com), return the time spent loading each element (images, CSS, JS...), and compare the results with previous runs.
The application needs to be a desktop app, and I will save the information in a text file or XML and use that file to compare with previous results.
I have searched for a similar application, but found nothing.
There are some plugins for Firefox that list these elements, like YSlow or Firebug, but they are not what I need.
So I'm totally lost and I don't know how to start this work.
Is it possible to make this application? What language is best suited for this type of application?
Thanks!
This is a very subjective question, so without you elaborating more on your requirements, you may not get any useful answers.
Some things you would need to answer are: how many URLs you want to check, where you want to store the results (database, files, etc.), and whether it needs to run on the desktop or on a server.
Personally, I like the statistics that cURL gives you - DNS time, connect time, receive time, etc. - so you could write something in PHP, but I stress that this is personal preference and may not suit your situation.
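If you'd rather build the desktop app in .NET, the same per-element timing idea might look like this rough sketch (the URL, the regex, and the flat-file output are all illustrative; a real HTML parser would be more robust than a regex):

    Imports System.Diagnostics
    Imports System.IO
    Imports System.Net
    Imports System.Text.RegularExpressions

    Module PageTimer
        Sub Main()
            Dim url = "http://google.com"    ' page under test
            Dim client As New WebClient()

            ' time the base document itself
            Dim sw = Stopwatch.StartNew()
            Dim html = client.DownloadString(url)
            sw.Stop()
            Log(sw.ElapsedMilliseconds, url)

            ' naive extraction of element URLs (images, CSS, JS)
            For Each m As Match In Regex.Matches(html,
                    "(?:src|href)=""([^""]+\.(?:png|gif|jpg|css|js))""")
                Dim elementUrl = New Uri(New Uri(url), m.Groups(1).Value).ToString()
                sw.Restart()
                Try
                    client.DownloadData(elementUrl)   ' fetch the element
                Catch ex As WebException
                    ' record failures too; they affect perceived load time
                End Try
                sw.Stop()
                Log(sw.ElapsedMilliseconds, elementUrl)
            Next
        End Sub

        Sub Log(ms As Long, what As String)
            ' append so previous runs stay available for comparison
            File.AppendAllText("timings.txt",
                String.Format("{0}{1}{2}ms{3}{4}{5}",
                    Date.Now, vbTab, ms, vbTab, what, Environment.NewLine))
        End Sub
    End Module

Each run appends to timings.txt, so comparing against previous results is a matter of diffing or charting that file.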

Need help with a large ASP.NET application

We have an ASP classic ERP (very large application) that we want to rewrite using ASP.NET.
I am looking for a way to organize the application so we are going to be able to separate every program / webpage (over 400) from each other. Every program needs to be independent because many developers will work on the project at the same time.
Visual Studio builds a DLL for every project, so I was wondering if it's a good idea to make a huge solution with one project per DLL.
For example:
Customers.aspx + Customers.aspx.vb (compiled) for presentation
Customers.DLL for the object entity
CustomersManager.DLL for business logic
CustomersData.DLL for data access
This way, we would be able to deploy every program separately without altering the others. We would also have over a thousand DLLs to manage…
Does this seem like a good solution for a large-scale application?
Does anyone have a better idea?
Thanks
Source control was invented exactly for the purpose of having multiple developers concurrently work on large solutions. I do appreciate the value of having components that can be deployed independently, but perhaps that value is lost as the number of independent components that require maintenance approaches the hundreds and thousands?
Separating the application into separate presentation/business logic/DAL DLLs does make sense on a per-module basis, but not usually on a per page basis.
Consider the different functional areas of your application that are likely to share code and start there (one set of projects for each).
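As a rough sketch of that per-functional-area split (project names invented for illustration):

    ErpSolution
        Erp.Web                 ' all .aspx pages, one folder per module
        Erp.Customers           ' entities + business logic for the customers area
        Erp.Customers.Data      ' data access for the customers area
        Erp.Orders
        Erp.Orders.Data
        Erp.Common              ' shared helpers and base classes

A dozen or two projects like this keeps deployments reasonably granular without the thousand-DLL explosion.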
That seems like a huge unmanageable nightmare to me.
I've been a part of several large .NET projects, and the approach that works best is, as JeffN825 said, to use some sort of source control, along with classes that support your model (database) directly.
Folders under the project root can help you split things up logically "/Customers", "/Orders", etc.
If you want to make separate projects for your classes, that is also done quite a bit. Have a separate project containing all of your database objects. Create another project for business logic. Actually, create several business-logic projects if you feel you need them: "CustomerBO", "OrderBO", etc.
But managing over 1000 DLLs and their associated web pages... that's going to be a nightmare.
I think the question was asking whether having a separate DLL for every layer of every page was a good architecture, which it is not, if for no other reason than that Visual Studio will more likely than not crawl to a halt as it tries to load hundreds of separate projects (I shudder to think what that would do, and how impossible it would be to maintain all those DLLs).
A more reasonable solution would be to have a DLL for each layer, separate each page's code into a different file, and use a source control system. This would allow developers to share code even with the worst-of-breed source control systems. If your source control system has decent branching/merging support (i.e., not SourceSafe) - like TFS, SVN, or Git - you don't even have to worry about people working on the same file simultaneously, and then you can organize your code by function, not by page.
I'd hazard a guess from the question that there is an astronomical amount of duplicated code that could be simplified and made easier to maintain by breaking the rigid connection to the web code and reusing it. It can be amazing to see how much less code there will be. You can do the same on the UI side with judicious use of User Controls to encapsulate shared functionality. Plus, moving from ASP to .NET, you pick up things like SiteMap controls that should reduce the code footprint as well.

Excel Upload to database table

I'm looking for the best solution to allow our users to upload XLS spreadsheets so that they can be used to populate tables in our data warehouse (DW).
Our users are heavy Business Object (BO) users, and BO lets you export to XLS. When they have data in a spreadsheet that needs to be loaded to the DW, they need a process to upload the data in the XLS to the DW's db. As a result, we end up with many of these "interfaces" when I think that what we really need is a programmatic automated feed. Using Excel as a data source for inter-system feeds, in my gut, just seems like a bad idea to me.
Question #1: I'd like to see if you agree and why or why not.
OK, there is no swimming against that tide, so I now take as a given that XLS uploads are here to stay for us. Now I need to find the best solution. First, I'll explain what we do now and then what I don't like about it:
Via web pages, we provide empty XLS files (no rows) with a defined set of columns. Each file is intended to be used to update a different target destination table. In each spreadsheet is an "Upload" button. Pushing the Upload button causes the macro in the spreadsheet to serialize the contents of the file to CSV and FTP the data to a server folder. Periodically, a scheduler fires off an Informatica ETL job that uses the CSV file as input and loads the data into a custom XLS-specific staging table and then, if the records pass the edits, into the appropriate target table. Any errors encountered are logged to an error table. For each XLS file uploaded, the data ends up in a separate staging and error table that is specific to the file.
Some of the things I don't like about our process:
1) The macro code in the XLS is too exposed - it includes passwords, for example - it can be tampered with, and there are issues ensuring that users are using the latest XLS templates.
2) Business-rule edits are placed in the ETL program, where they probably should be, but because we would like to catch errors ASAP, i.e., in the spreadsheet, the edits are also added to the macro code. This results in duplicated business edits. I want these rules in one place, centrally controlled. IMHO, putting any macro code in the XLS introduces a maintenance issue, even calls to stored procedures (some of which we have) or calls to web services (we haven't yet tried to call .NET web services from XLS macros).
3) Every XLS upload template has its own process with a distinct set of staging and error tables and a custom screen for reporting the errors encountered. It seems like we need a more generalized, reusable solution.
Besides often getting data exported to XLS from BO, the users also like Excel because it is easier for editing a large number of records and less clunky than editing individual records via a web interface.
This is the general direction that I am thinking:
First, I want the users to have the editing ease of Excel, but without embedded macros in the spreadsheet. I experimented with FarPoint's grid with Excel compatibility...
http://www.fpoint.com/netproducts/spreadweb/tour/excel.aspx
...and I found that it was quite easy to let a user open an XLS file that resides on their PC, have it open in a browser, and easily access the data from server-side .NET web code. Excel isn't running locally in their browser, but the functionality of Excel is reproduced, presumably through a lot of client-side scripting that I expect would be a real pain to duplicate myself. You can even cut and paste from a local spreadsheet into the web spreadsheet. This sounds great; my biggest problem is cost. Our company is near death and won't allow us to purchase any new software.
Next, I want to identify the common components across all spreadsheet upload processing and come up with generic processing code. For example, I imagine a table which defines each of our spreadsheets and the format of each, including the column names and data-type definitions, perhaps in terms of their destination columns instead of hard-coding. Based on this template-definition table, I can generate the XLS templates for download. I can also perform simple generic edits to ensure that the data entered matches the table definition. And one common web page can be used to present the data, report data-type mismatch errors, and allow the user to correct them. I would also define a common "staging" table for storing the data, perhaps a narrow table with columns such as submission #, row num, column name, and value. No more "custom everything" is the goal.
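To make the idea concrete, this is the kind of generic shredding code I have in mind, sketched in VB.NET (all table, column, and parameter names are invented for illustration, and the connection is assumed to be open):

    Imports System.Collections.Generic
    Imports System.Data.SqlClient

    Public Class GenericUploader
        ' Assumed schema, per the template-definition idea above:
        '   TemplateColumn(TemplateId, ColumnName, DataType, DestTable, DestColumn)
        '   StagingRow(SubmissionId, RowNum, ColumnName, Value)
        Public Sub Shred(conn As SqlConnection, submissionId As Integer,
                         rows As IEnumerable(Of IDictionary(Of String, String)))
            Dim rowNum = 0
            For Each row In rows                 ' one dictionary per spreadsheet row
                rowNum += 1
                For Each kv In row               ' one staging record per cell
                    Using cmd As New SqlCommand(
                            "INSERT INTO StagingRow (SubmissionId, RowNum, ColumnName, Value) " &
                            "VALUES (@sub, @row, @col, @val)", conn)
                        cmd.Parameters.AddWithValue("@sub", submissionId)
                        cmd.Parameters.AddWithValue("@row", rowNum)
                        cmd.Parameters.AddWithValue("@col", kv.Key)
                        cmd.Parameters.AddWithValue("@val", kv.Value)
                        cmd.ExecuteNonQuery()
                    End Using
                Next
            Next
        End Sub
    End Class

One INSERT per cell is the simplest form; a production version would batch the inserts or use a table-valued parameter.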
Next I need to decide where to put the business rules. My dept's management firmly believes that all loading of data should be done by Informatica ETL batch processes, and therefore the rules/edits belong "in Informatica". I have zero experience with the Informatica tools; I am more of a .NET guy. I am therefore unsure how these rules are implemented, but I suspect that they are not reusable in the sense that a .NET web page could validate a particular record against them. You see, in some cases, when the user is not performing a bulk upload, they do have the ability to edit a specific record, and I would like the same edits that were applied by the ETL bulk-insert process to be applied to an individual update attempt on a single record via a web page. Is the solution to write a single web service or stored procedure that can be called either from the web page doing an update of a single record or thousands of times, once for each record in a bulk upload? The latter sounds inefficient.
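To illustrate what I mean by a single home for the rules, something like this sketch (the rules and names here are invented examples):

    Imports System.Collections.Generic

    Public Module CustomerRules
        ' Single home for the business edits: called once from the web page
        ' for a single-record update, or once per staged row during a bulk load.
        Public Function Validate(name As String, creditLimit As Decimal) As List(Of String)
            Dim errors As New List(Of String)
            If String.IsNullOrEmpty(name) Then errors.Add("Name is required.")
            If creditLimit < 0 Then errors.Add("Credit limit cannot be negative.")
            Return errors
        End Function
    End Module

My worry is whether calling this once per row is fast enough for bulk loads, or whether the rules would have to be mirrored as set-based logic in the ETL - which reintroduces the duplication.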
Your thoughts on anything above would be very much welcomed.
From a cost perspective, the effort you'll need to go through to re-create spreadsheet functionality on the web will exceed the cost of FarPoint or other controls. Even if you made $20 an hour, do you think you could complete a working product in under 2 weeks? I think you have the facts on your side when you discussed the maintenance issues of allowing ETL functionality to live in Excel - you have twice the amount of work to maintain the transformation rules. I think you need to convince management that in order to create a maintainable, robust solution you need some flexible utilities.
FarPoint is a good choice. There is also SpreadsheetGear, a .NET engine that interprets Excel macros and can run on a web server. It has a Win32 control that allows you to create a WinForms solution with very Excel-like interface functionality. Last time I checked, there was no web control for the product. It does an excellent job of providing Excel capability for processing large amounts of data.
Good luck. I think you will find a good solution, since you seem to have a good grasp of the pros and cons of all the different potential solutions.

User defined reports with SSRS

I have a web application which serves SQL Reporting Services reports via the ReportViewer control. Because of the complexity of some of the reports, I use RDLC reports attached to business objects.
Now I would like to expand the system and allow some form of user-defined reports. Ideally I would like the users to connect their reports to the same business objects I use to create the rdlc reports.
1) Is there a control that allows users to create/edit their own RDLC files?
2) Can RDL files be attached to business objects?
3) Any hints/tips for writing my own control to edit RDLC files? (I would think this is a lot of work and would only attempt it if there is no suitable answer to 1 or 2.)
All my development has been done in VS 2005 with SQL 2005 but I could upgrade if new features in 2008 help with the solution.
This isn't much of an answer, but at my company I have put together our own Report Builder.
We have about 30 or so Reporting Service reports that our users can access through the web or desktop application. What we wanted to do was give our users the ability to take any given section within those reports and create their own.
If there is a report we have built for them but they don't want to see the graph, they can create the same report without it. If they want to combine parts from 4 different reports to make one summary report, they can drag those sections around in our custom builder and save it.
The report builder I had to put together pulls down all the different sub-reports they have chosen and reads through the XML, adding them to a Report Builder template XML file I have created. I then have to aggregate all the parameters so as not to ask for them more than once (parameter names do have to be unique across all reports if you don't want them aggregated). This new report XML is deployed to the server, and the users can access it whenever they want.
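For anyone attempting something similar, the parameter-aggregation step amounts to merging ReportParameter nodes across RDL files while skipping duplicate names. A bare-bones VB.NET sketch of that step (the namespace URI shown is the SSRS 2005 RDL one - adjust for your server version - and the module/method names are invented):

    Imports System.Xml

    Module RdlMerge
        Sub MergeParameters(master As XmlDocument, subReport As XmlDocument)
            Const ns As String =
                "http://schemas.microsoft.com/sqlserver/reporting/2005/01/reportdefinition"
            Dim mgrM As New XmlNamespaceManager(master.NameTable)
            mgrM.AddNamespace("r", ns)
            Dim mgrS As New XmlNamespaceManager(subReport.NameTable)
            mgrS.AddNamespace("r", ns)

            Dim target = master.SelectSingleNode("//r:ReportParameters", mgrM)
            If target Is Nothing Then Exit Sub   ' master defines no parameters section

            For Each p As XmlNode In subReport.SelectNodes("//r:ReportParameter", mgrS)
                Dim name = p.Attributes("Name").Value
                ' only add the parameter if the master doesn't already ask for it
                If master.SelectSingleNode(
                        "//r:ReportParameter[@Name='" & name & "']", mgrM) Is Nothing Then
                    target.AppendChild(master.ImportNode(p, True))
                End If
            Next
        End Sub
    End Module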
I've also given them the ability to create their own cover pages, headers, and footers by dragging text boxes, images, and global variables (date run, date created, run by, page number, etc.) anywhere on a blank canvas. I then convert all the items they've dragged around and resized on this canvas into another report XML file and deploy it as a sub-report that they can add to their custom reports.
Yes, this has taken quite a bit of work, but our users love it. We're now in the process of allowing them to create a report with special groupings so the report can be run at different levels.
So it is possible, but there is no easy answer. =) I'd be glad to give advice to anyone who asks, but a direct copy of the code would be a violation of my contract; I'll do what I can outside of that.
I think SQL Reporting Services isn't meant for this kind of customization. You can hide and show controls and subreports, but features like interactive grouping aren't there.
You might look into a third-party reporting framework like Telerik's.
