This is more of a general question about which direction would be a better investment for the company.
Our company's core business application is written in Visual FoxPro and is 9+ years old. The database is huge (15+ GB), the core logic is complex, and to make matters worse the data model is terrible. The two guys who built it and have maintained it all these years are at least in their 50s, so needless to say retirement, or possibly death, could come within the next decade or so.
This VFP app drives all our core business functions and requires Terminal Services and Citrix to access it from the outside world. Our web apps have to interface with it via ODBC, and we are always having performance issues with it. The servers that run this system are also very old (Windows 2000 Server) and are falling apart.
Recently we have been having meetings about upgrading the systems that run this core app, as well as other services like email and file storage. The biggest expense, however, is buying new server hardware, OS licensing, Terminal Services licensing, Citrix licensing, etc., to solve some performance and outside-access issues we are currently having, as well as just generally bringing our systems up to date.
The price tag is going to be in the $55K to $65K range. As a web developer, my point of view is that this is a huge waste of money! My solution would be to invest that money in rewriting the core system to run on the web-based .NET platform. This would eliminate the need for Terminal Services and Citrix licensing, along with the pricey hardware and configuration management required to run it. I don't see the point in investing this kind of money in an antiquated system that should be on its way out anyway.
I am looking for some convincing arguments as to why this hardware spend is a waste of money. Hopefully there is someone here who has faced this type of situation before and can offer some points of view. The hardware upgrade seems to be the easiest road to take, because they will just have a consultant come in and do it all. A software development project would take longer, require more resources, and possibly cost a little more money.
The short-term rewrite vs. re-hardware argument cannot be won. Hardware and licenses are always cheaper than a rewrite, and hardware plus licenses seems to involve no risk.
You can't win on the ROI argument. Unless the system is trivial and you are a genius, it will always cost $100K or more to rewrite an application that actually does something. Think multiple person-years.
You might win the "technical debt" argument. Change is getting more and more complex, risky and expensive. The longer this code is perpetuated, the more risk and cost accumulate.
The real question is "start to fix now?" or "wait until it breaks and suffer later?" And that has no definite $-valued answer.
You can't compete on money, so you have to compete on risk, features, growth, maintainability, adaptability, standards compliance, security, creating unique value for each customer, etc., etc.
"We are now looking at a larger base of customers and more data". That's an argument you might be able to win.
(I'm over 50, and I'm not planning on dying any time soon. That argument doesn't win hearts and minds. Unless they're over 80, you can't really use age except as a way to get your argument ignored.)
Focus on the cost (and risk) of making changes.
Prove that you have a web-based solution that makes changes less costly and less risky.
Further, dig into what's there and find parts that can be replaced by a web framework. Code you don't write is cheaper to maintain than code you write.
Every project needs a cost-benefit analysis. If a $60,000 one-time investment will resolve all issues for the next 10 years, then it is (probably) far more economical than hiring a team of developers for even one year to build a newer, better system.
On the other hand, if it's already costing $50,000/year in maintenance, this capital cost is just to keep the system alive, and you'll need to spend another $60K a few years from now, then it warrants serious consideration of a redesign.
Or you could take the middle road and start wrapping it up in something opaque like a web service, then gradually swap out components for better (more efficient, more maintainable, etc.) internal ones. Lots of companies go this route because it defers the up-front cost of a rewrite, and if necessary you can divert IT resources elsewhere.
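To make that middle road concrete, here is a minimal sketch of the seam in C# (every name here is invented for illustration; nothing comes from the actual system). Callers depend only on an interface; the first implementation forwards to the legacy VFP data, and a rewritten backend can later be swapped in without touching the callers:

    using System;

    // A minimal sketch of the "wrap it, then swap the internals" idea.
    public interface IOrderService
    {
        string GetOrderSummary(int orderId);
    }

    // First implementation: a thin adapter that would forward to the legacy
    // VFP data (e.g. via System.Data.Odbc), so callers stop depending on it
    // directly.
    public class LegacyVfpOrderService : IOrderService
    {
        public string GetOrderSummary(int orderId)
        {
            // Real code would run an ODBC query against the VFP tables here.
            return "order " + orderId + " (from legacy VFP via ODBC)";
        }
    }

    // Later implementation: a rewritten component, swapped in behind the
    // same interface with no change to the callers.
    public class SqlOrderService : IOrderService
    {
        public string GetOrderSummary(int orderId)
        {
            return "order " + orderId + " (from the new SQL Server backend)";
        }
    }

    public static class Demo
    {
        public static void Main()
        {
            IOrderService orders = new LegacyVfpOrderService();
            Console.WriteLine(orders.GetOrderSummary(42));

            orders = new SqlOrderService(); // the gradual swap, one component at a time
            Console.WriteLine(orders.GetOrderSummary(42));
        }
    }

The value of the seam is that each slice can be migrated and tested independently, spreading the cost and risk of the rewrite over time.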
S.Lott is right, though - it's likely that you won't be able to compete on cost alone. You have to try to quantify the risks associated with these ancient systems - for example, how much it will cost the company to find and train qualified FoxPro developers if the original programmers decide to quit (or, to use the parlance of so many managers I've met, "run over by a bus")...
Just to add some further perspective to this: Before .NET (and for a few years after) I conducted most of my projects exclusively in Delphi. At the time, it really was a great choice for enterprise development. I was actually the person who didn't want to "upgrade." After a while, however, it became apparent to both myself and my higher-ups that this scared people outside the company.
Investors, auditors, everyone - they didn't like the idea that our core IT asset was done in some "obscure" language. Of course, Delphi wasn't/isn't really that obscure; there's a "delphi" tag here on SO with a count of 3340. But let's use SO as our example - here are the current counts:
c# - 57293
.net - 30577
asp.net - 26600
java - 31023
vb.net - 5996
delphi - 3340
foxpro - 69
vfp - 27
Let those numbers sink in for a while. Delphi, my tool of choice at the time, now has less than 10% of the representation of C#, and this made non-techies nervous. Foxpro/VFP is not even at 1%. I can't even remember how many times I had to answer questions like:
What happens if the lead developer (me) quits or gets run over by a bus?
How difficult/costly will it be to hire programmers in that field?
What if the vendor stops supporting it? (This almost happened)
What if we want to get outside help? Consultants? Security audits?
How easy will it be to get it to work with outside products?
Blah blah blah, worry worry worry, was how I felt at the time, and this was a product that wasn't really that obscure. In your case, we're talking about FoxPro here. FoxPro has gotten to be almost like COBOL; sure, it's still around, there are people out there who know it, but who starts a new project in FoxPro today? It's boring, it's downright ghetto. VB6 is starting to become ghetto, and VB/Access effectively replaced FoxPro so many years ago.
I'm obviously being slightly melodramatic here, but if I were you, this is the angle I would be taking. Forget about the short-term economics, forget about the age, and focus on the obscurity of the product. How many genuine, qualified responses do they think they'll get if they put a want-ad out for a FoxPro developer? What kind of pay would they have to offer for a position like that? What would the turnover be like? This may all seem remote if these two developers have been there for 20-odd years, but when you're running a multimillion-dollar business, you ought to know that it's never a good idea to stake your very survival on one or two employees - not if you can help it.
In general, supplementing a poor system with tons of hardware is a bad plan. I would probably say that it's better to rewrite, but it's hard to say without knowing the details.
Bear in mind that a decent rewrite should improve performance, reliability and maintainability, so the potential savings are large and will only increase year on year, even if the initial investment is a little more.
In order to figure out if it is worthwhile, you have to calculate, in addition to the costs of a rewrite (a rough break-even sketch follows this list):
Documenting everything the system currently does, and reverse-engineering the requirements.
Writing unit and integration tests for everything that currently exists. These probably don't exist already, but they should.
Cost of maintaining the new system. The new system isn't going to eliminate maintenance costs, merely reduce them. How much will you save?
Cost of hardware for the new system. The new system is going to have to run on something.
Licensing costs for any software/etc. that are needed for the new system. Is everything going to be open source? Or are you going to need several Visual Studio Test Editions for your developers and testers?
Cost of hiring new personnel to do the development. In addition to the straight salary costs, there are office costs. The total might be $300,000, for say 3 developers, counting salary, office space, equipment, licenses, health care benefits.
Time horizon for the saving. The saving isn't going to occur immediately. It is going to occur in the future. In the meantime, they have to still pay for the licensing for the current system, because something has to do the job until the new system is put in place.
Cash flow issues. Because of the above, in the short term they are going to need more money to fund the development. The actual costs are higher, because they essentially have to get a loan, raise equity, or bear an opportunity cost (they are going to have to forgo some other investment opportunity to pursue the rewrite).
Business risk. There may be a danger that the rewrite might cost more, work worse, or take longer than planned.
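As a toy illustration of how those line items interact, here is a short C# sketch with entirely made-up figures (substitute your own estimates); it simply compares the cumulative cost of keeping the current system against the cumulative cost of a rewrite, year by year:

    using System;

    // Toy break-even comparison with entirely made-up numbers; substitute
    // your own estimates from the list above.
    public static class BreakEven
    {
        public static void Main()
        {
            double keepUpfront = 60000;   // hardware/licensing refresh now
            double keepYearly  = 50000;   // ongoing cost of the current system
            double rewriteCost = 300000;  // e.g. ~3 developers for a year, fully loaded
            double newYearly   = 15000;   // assumed maintenance of the new system

            for (int year = 1; year <= 10; year++)
            {
                double keep    = keepUpfront + keepYearly * year;
                double rewrite = rewriteCost + newYearly * year;
                Console.WriteLine("Year {0,2}: keep = {1,9:C0}  rewrite = {2,9:C0}{3}",
                    year, keep, rewrite,
                    rewrite <= keep ? "  <-- rewrite has paid for itself" : "");
            }
        }
    }

With these particular numbers the rewrite only pays for itself around year seven; a real analysis would also discount the future cash flows and price in the risk items above, which pushes break-even out further.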
Two important numbers:
Number of "FoxPro" jobs listed in San Francisco's craigslist right now: 2.
Number of ".NET" jobs listed in San Francisco's craigslist right now: 252.
A lot of other points that have been mentioned are valid. However, you can spend as much as you want on hardware, but the fact is that if something breaks and you need help, you are going to have a heck of a time finding more people to help.
Sounds like a good time to start talking about a migration[1] to newer, better-supported technologies. (And in 10 years, when .NET is old hat, you can do it all over again :)
[1] And evolve the system, don't rewrite it. I would guess your current system grew very organically based on needs at the time. There's no way that you'll be able to completely replace all of that (at least, not without a couple of years and a few million bucks).
As a long-time VFP developer (20+ years with FoxPro/VFP, and I STILL have people asking me to write / update their systems with VFP, for a variety of reasons), I can say it's still very powerful. That said, while researching .NET and bringing much of my OOP and development experience to it, I do find some things in .NET much easier, especially the strong type-casting. On the other hand, doing a basic report REQUIRES strong type-casting to all the database tables / structures / objects, which in many cases has so far been a PITA.
The price tag for a rewrite is always a significant consideration, but so too is the collapse of ANY system... regardless of whether it's VFP, VB, Access, or something else. I would strongly suggest getting a consulting company in to help with the re-modeling of your system, and maybe to act as a project manager / mentor to your in-house programmers, who may be able to offer their talents even though it may require some training in the new development environment. This way, you get a good base of strong talent in the language, yet keep some costs down by using your own programming staff -- though you may need to hire supplemental programmers. The learning curve from VFP to .NET is real, and can still be a head-scratcher.
There are a variety of companies out there who were VFP specialists and have subsequently migrated their services to the .NET world; they may offer a perfect match for your organization, having historic knowledge and professional experience of BOTH worlds. I know they can act as mentors, too, for the development of such work.
You can only say it is a waste of money after you have analyzed the ROI - it will depend heavily on how much it costs to rewrite the system.
Classic mistake on JOS - "system is a mess, let's rewrite it".
It will be like looking at this old building and seeing a toothpick and wondering why it is there. You figure it isn't needed, and pull it out.
Suddenly the building collapses around your head :)
It might be a better idea to:
Rewrite parts of the system for better maintainability.
Optimize the system for better performance.
Abstract the FoxPro-specific parts, so they can be more easily converted to some other technology (see the sketch below).
This incremental approach would reduce risk, and provide some short-term improvements.
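For the abstraction step, here is a minimal C# sketch of what fencing off the FoxPro-specific parts might look like. The table and column names are invented, and the connection string is just the stock VFP ODBC driver format, so both would need adjusting for the real system:

    using System;
    using System.Collections.Generic;
    using System.Data.Odbc;

    // Sketch of fencing the FoxPro-specific access off behind one small class.
    public class VfpCustomerStore
    {
        private const string ConnString =
            @"Driver={Microsoft Visual FoxPro Driver};SourceType=DBC;SourceDB=c:\data\core.dbc;";

        public IList<string> GetCustomerNames()
        {
            List<string> names = new List<string>();
            using (OdbcConnection conn = new OdbcConnection(ConnString))
            using (OdbcCommand cmd = new OdbcCommand("SELECT name FROM customers", conn))
            {
                conn.Open();
                using (OdbcDataReader reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        names.Add(reader.GetString(0));
                    }
                }
            }
            return names;
        }
    }

Since nothing outside this class ever sees ODBC or VFP, swapping in a SQL Server-backed version later touches exactly one file.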
There is no magic bullet here for the company. The only way to be sure is to take the hit on a new server, to get the stability and speed benefits that brings to the existing business-critical software. Then, once that is parked for a few years, start re-engineering the thing on a different platform like .NET, if that's what you want to do. Bear in mind that you will have to migrate the VFP data into the new database structure at some point.
Although I have been using Drupal since the D4 series, I only started developing professionally for it with D6, so - although I have done various site upgrades - I have never been faced with the task of having to port my own code to a new version.
I know the Drupal community will come up with a lot of technical support about changed APIs and architectural changes (see the deadwood module for D5-D6, or even these stubs of D6-D7 how-tos for modules and themes).
However, what I am looking for with my question is more along the lines of strategic thinking; in other words, I am looking for input and advice on how to plan / implement / review the process of porting my own code, in the light of what fellow developers have learned from previous experience. Some examples:
Would you advise beginning to port my modules as soon as I have time for it, maintaining a concurrent D7 branch for some time (so I am "prepared" for D-day), or would you advise waiting until the port is actually imminent and then upgrading the modules to D7 and dropping the D6 version?
Only some of my modules have full test coverage. Would you advise completing test coverage for the D6 version, so all tests are in place to check the D7 port, or would you advise writing my tests at porting time, targeting the D7 version?
Did you find that being an early adopter gives you an edge in terms of new features and better APIs, or did you rather find it more convenient to delay the conversion so as to leverage the larger pool of readily available contrib modules?
Did you set quality standards / evaluation criteria for yourself, or did you just set the bar at "if it works, I'm happy"? Why? If you set certain standards or goals, what were they / what will they be? How did they help you?
Are there common pitfalls that you experienced in the past and that you think are applicable to the D6-D7 porting process?
Is porting a good moment to do some refactoring, or is that just going to make everything harder to put back together?
...
These questions are not an exhaustive list, but I hope they give an idea of what kind of information I am looking for. I would rather say: whatever you think is relevant and I did not list above gets a "plus"! :)
If I did not manage to express myself clearly enough, please post a comment with the info you think I should add to the question. Thank you in advance for your time!
PS: Yes I know... D7 is not yet out and it will take months before important contrib modules will be upgraded... but it's never too early to start thinking! :)
Good questions, so let's see:
(when to start porting)
This certainly depends on the complexity of the modules to port. If there are really complex/large ones, it might be useful to start early in order to find tricky spots while not being under pressure. For smaller/standard ones, I'd try to find a bigger time slot later on where I can port many of them in a row in order to get the routine stuff memorized quickly (and benefit from the probably improved documentation).
(test coverage)
Normally I'd say that having a good test coverage before starting refactoring/porting would certainly be advisable. But given that Drupal-7 introduces a major change concerning the testing framework by moving it to core, I'd expect the need to rewrite a significant amount of tests anyway. So if there is no need to maintain the Drupal-6 versions after the migration, I'd save the time/trouble and aim for increased coverage after the porting.
(early adopter vs. wait and see)
Using Drupal since the 4.7 version, we have always waited for at least the first official release of a new major version before even thinking about porting. With Drupal 6, we waited for the views module before porting our first site, and we still have some smaller projects on Drupal-5, as they are working just fine and it would be hard to justify the extra bill for our clients as long as there are still maintenance/security fixes for it. There is just so much time in a day and there is always this backlog of bugs to fix, features to add, etc., so no use playing with unfinished technology while there are more imminent things to do that would immediately benefit our clients. Now this would certainly be different if we'd have to maintain one or more 'official' contributed modules, as offering an early port would be a good thing.
I'm a bit in a bind here - being an early adopter certainly benefits the community, as someone has to find the bugs before they can get fixed, but on the other hand, it makes little business sense to fight hour after hour with bugs others might have found/fixed if you'd just waited a bit longer. As long as I have to do this for a living, I need to watch my resources, trying to strike an acceptable balance between serving the community and benefiting from it :-/
(quality standards)
"If it works, I'm happy" just doesn't cut it, as I don't want to be happy momentarily only, but tomorrow as well. So one of my quality standards is that I need to be (somewhat) certain that I 'grokked' new concepts well enough in order to not just makes things work, but make them work like they should. Now this is hard to define more precisely, as it is obviously impossible to know if one 'got it' before 'getting it', so it boils down to a gut feeling/distinction of 'yeah, it kinda works' vs. 'yup, that looks right', and one has to accept that he will quite regularly be wrong about this.
That said, one particular point I'm looking out for is 'intervene as early as possible'. As a beginner, I often tweaked stuff 'after the fact' during the theming stage, when it would have been much easier to apply the 'fix' earlier in the processing chain by means of one hook or the other. So right now, whenever I'm about to 'adjust' something in the theme layer, I deliberately take a small time-out to check whether this could be done more cleanly/compatibly within a hook earlier on. As I expect Drupal-7 to add even more hooking options, this is something I will pay extra attention to, as it usually reduces conflicts and sudden 'breaking of stuff' when adding new modules.
(common pitfalls)
Well - mainly porting too early, finding out afterwards/in between that one or more needed modules were not available for the new version at all, or only in dev/alpha/early-beta state. So I'd make sure to compile a complete list of used/needed modules first, listing their porting state, along with a quick inspection of their issue queues.
Besides that, I have so far always been very pleased with the new versions and their improvements, and I'm looking forward to Drupal-7 again.
(refactoring while porting)
One could say that porting is a rather large refactoring in itself, so there is no need to add to the complexity by restructuring non porting related stuff. On the other hand, if you already have to shred your modules to pieces anyway, why not use the opportunity to make it a major overhaul? I'd try to draw a line based on complexity - for big/complex modules, I'd do the port as straight as possible, and refactor more later on, if need be. For smaller modules, it shouldn't really matter, as the likelihood of introducing subtle bugs should be rather small.
(other stuff)
... need to think about it ...
Ok, other stuff:
Resource needs - given some of the Drupal-7 threads, it looks like they are likely to go up, so this should be evaluated before porting smaller sites that sit on a shared/restricted hosting account.
Latest versions first - This one is rather obvious and always stressed in the upgrade guides, but nevertheless worth mentioning: upgrade core and all modules to their latest current version first before doing a major upgrade, as the upgrade code is highly likely to depend on the latest table/data structures to work correctly. Given Drupal's piecemeal, one-step-at-a-time update strategy, it would be very hard to implement upgrade code that detected different pre-upgrade states and acted accordingly.
I have been tasked with automating some of the paper forms in HR. This might turn into "automate all forms" eventually, so I want to approach this in a way which will be best for the long term and will be a good framework as this project grows.
The first things that come to mind were:
- InfoPath/SharePoint (We don't use SharePoint now, and it wouldn't be an option for the next two years.)
- Workflow Foundation (I've looked into this and it does not seem too attractive or appropriate.)
Options I'm considering at this point:
- Custom ASP.NET (VB.NET) & SQL Server, which is what my team mostly writes its apps with.
- Leveraging InfoPath for creating the forms electronically. Wondering if there is a good approach to integrating this with a custom-built ASP.NET app.
- Creating the app as an MVC web app.
My questions are:
- Are there other options I might want to consider?
- Are there any starter kits or VB.NET-based open source projects out there which would be a starting point or could be used as a good reference? Here I'm mostly concerned with the workflow processing.
- Any comments from those who have gone down this path?
This is going to sound really dumb, but the biggest lesson from my many years of helping companies automate paper-form-based processes is: understand the process first. You will most likely find that no single person understands the whole thing. You will need to role-play the many paths through the process to get your head around it. And once you present your findings, everyone will be shocked because they had no idea it was that complex. Use that as an opportunity to streamline.
Automating a broken process only makes it screw up faster and tell a lot of people about it.
As far as tools go, my experience dates me, but try to go with something with these properties:
EASY to change. You WILL be changing it, so don't hard-code anything (see the sketch after this list).
Possible revision control - changes to a process may or may not affect documents already en route.
Visual workflow editing. Everyone wants this, but they'll all ask you to drive it. Still, nice tools.
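On the "don't hard-code anything" point, here is a minimal, hypothetical C# sketch of keeping the workflow itself as data rather than as code; all states and actions are invented, and in practice the transition table would live in a database so the routing can change without a redeploy:

    using System;
    using System.Collections.Generic;

    // A tiny sketch of a data-driven workflow for form approvals.
    public class FormWorkflow
    {
        // "currentState/action" -> next state; editable without code changes.
        private readonly Dictionary<string, string> transitions =
            new Dictionary<string, string>
            {
                { "Draft/Submit",          "ManagerReview" },
                { "ManagerReview/Approve", "HrReview" },
                { "ManagerReview/Reject",  "Draft" },
                { "HrReview/Approve",      "Complete" },
                { "HrReview/Reject",       "Draft" }
            };

        public string Advance(string state, string action)
        {
            string next;
            if (transitions.TryGetValue(state + "/" + action, out next))
                return next;
            throw new InvalidOperationException(
                "'" + action + "' is not a valid action in state '" + state + "'.");
        }
    }

Adding or re-ordering a step then means editing rows, not shipping a new build.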
Not sure if this helps or not - but 80% of success in automating processes is not technology.
This is slightly off topic, but related - defect tracking systems generally have workflow engines/state. (In fact, I think Joel or some other FC employee posted something about using FB for managing the initial emails and resume process)
I second the other advice about modeling the workflow before doing any coding or technology choices. You will also want this to be flexible.
As n8owl reminded us, automating a mess yields an automated mess - which is not an improvement. Many paper-forms systems have evolved over decades and can be quite redundant and unruly. Some may view "messing with the forms" as a violation of their personal fiefdoms, so watch your back ;-)
model the workflow in terms of the forms used by whom in what roles for what purposes; this documents the current process as a baseline. Get estimates of how long each step takes, both in terms of man-hours and calendar time
understand the workflow in terms of the information gathered, generated, and transmitted
consolidate the information on the forms into a new set of forms for minimal workflow
be prepared to be told "This is the way we've always done it and we're not going to change", and to gently (a) validate their feelings, (b) explain how less work is more efficient, and (c) show concrete benefits [vs. the baseline from step 1]
soft-code when possible; use processing rules when possible; web services and html forms (esp. w/jquery) will go a long way if you have an intranet
beware of canned packages (including sharepoint) unless you are absolutely certain they encompass your organization's current and future needs
good luck!
--S
I detect here a general tone of caution with regards to a workflow based approach and must agree. Be advised about the caveats of most workflow technologies which sacrifice usability for flexibility.
.NET 3.5, .NET 4.0, WPF, Silverlight, ASP.NET MVC - there's really a lot of new Microsoft technology released / on the horizon to try out these days.
(The examples I gave are all Microsoft technologies, but this can apply to any language or platform.) I am curious how this is handled in the company you work for. A few examples:
Do you have a CTO that determines what technology the company uses?
Are development teams free to choose what technology they use? For example: framework version, classic ASP.NET vs ASP.NET MVC, ADO.NET Entity Framework vs Linq2Sql or NHibernate? Or a mix of these?
What new technologies does the company you work for try out and why?
Does your company have dedicated resources (time) to try out WPF or whatever technology, just for research, or do you try things out in your spare time and try to introduce them to your company?
These are just examples to make my question clearer. To summarize, I'd like to know what this process looks like, who is responsible, and who makes the decisions. Does your company jump on the bandwagon, or is it reluctant to try new technologies? And are you comfortable with this situation?
At the company I work for, we still use .NET 2.0 (although we are now slowly switching to .NET 3.5), haven't seriously looked into ASP.NET MVC, haven't tried out WPF at all, etcetera. And some of us find it pretty hard to convince people to do so. Is it fair to expect otherwise?
At my company, we have an architecture group that determines which technologies are used. People are welcome to read up on alternative technologies and make suggestions, but at the end of the day, it's the architecture group that makes the decisions.
While this may seem restrictive, it does ensure that all of the development groups are using the same or similar technologies, and moving from one group to the next is fairly easy. As well, by having one group do all the research, you ensure that you don't waste time by having multiple groups duplicate the research effort.
Since I work in such a small company and am typically either the only developer or the lead developer in a very small group, I can usually convince my boss to use whatever I think would be best for a given project/situation.
We stick to what we know for our major and key projects within the company.
For any new "mini" projects that come along, we take the hit on the learning curve to try and build them in the latest technologies if at all possible.
This enables us to get up to speed on these things to then comfortably and safely use these technologies in our major projects as we see fit.
Where I work there is an architect team which looks at technologies from a high level and makes recommendations to the various actual teams. A subset of the architect team takes the technologies, experiments with them, and out of that produces:
Internal 1 hour overview sessions
Week long boot camps
Whitepapers/Posters
The more important the technology is, the more of that list is produced. All of that feeds into the teams, which, combined with customer requirements, actually make the decision about what technology to use.
I have a mixed answer to this question. Where I work, lower-level technical managers are usually the ones who choose a certain technology, and sometimes even the developers have the freedom to try something new. For example, I really wanted to learn about JavaScript's Prototype while working on a web site. I made the case to my boss; he was reluctant at first because nobody else knew it or had used it before, but he gave me the go-ahead. It was great for me to be able to learn Prototype and take advantage of its many built-in functions. Other, bigger projects come down from higher management and we don't really have much of a choice. Right now, my company is adopting SAP, so everything is moving in that direction. I don't necessarily want to become an SAP expert, but if I want to stay here, I'll need to at least learn how to work with it.
Every company has its own pace for innovation, and it's dependent first on the comfort level of the managers, and second on whether anybody actually does the work to research and propose using new things. When the managers start getting uncomfortable, innovation slows or stops until they get comfortable again. Some innovations they will never be comfortable with.
Keeping this in mind, I'm not sure how to answer your question about whether or not it's fair to expect more innovation than is happening. Certainly it's reasonable for you to want more; equally, once you've hit your organization's speed limit on innovation, it's not likely to change and, if it does change, it will probably take a long, long time.
I've been given rather large amounts of freedom to change things by various managers in my past, and I took advantage of it. I also ran into the limits on a regular basis, and finally dealt with my frustration by starting my own company. (This may be considered a somewhat drastic measure; certainly by doing so you reduce the time you have to research and develop the very things for which you started your company.)
These days I'm developing rather significant applications in Haskell, and I'm pleased as punch. After a year, I'm starting to get the hang of it, and I certainly have several more years ahead of me just learning what I can do with the tools I have now.
I suppose the summary of my response is: if you want to innovate more than those around you, you need to change your peer group.
I think any company that tries new technology for the sake of it, because it's bleeding edge and 'innovative', is crazy. To have a formal 'let's play with new technology to try it out' department is just nuts... unless they're in the business of providing technology consulting to other businesses.
For everyone else, technology is there to help the business get things done, not to help developers line their CVs with cool-sounding TLAs.
The company I'm working for at the moment is quite large and has a CTO that chooses 'strategic platforms'. But I have to say, if you can pick a technology, they're probably using it. They're too big to beat everyone down with the corporate stick, but they try. If the technology will work in the project and bring it in on time, then it gets used.
We need solid and proven platforms for our stuff, and we don't need anything fancy. Therefore we might go for .NET in 5-10 years or so; hopefully it's ready by then. On the other hand, Java is already mature enough, so we're using it alongside C++ and some Jython scripting. These decisions are pretty much autonomous (we're a small shop).
I don't mean to mock bleeding edge developers, but whether you need solidity or newest features obviously depends on what you're working on. Many scientists are still happily using Fortran 77.
RPO 1.0 (Runtime Page Optimizer) is a recently (today?) released component for ASP.NET and SharePoint that compresses, combines and minifies (I can't believe that is a real word) JavaScript, CSS and other things.
What is interesting is that it was developed for ActionThis.com, a NZ shop that I saw at TechEd last year. They built a site that quickly needed to be trimmed down due to the deployment scale, and this seems to be the result of some of that effort.
Anyone have any comments? Is it worthwhile evaluating this?
http://www.getrpo.com/Product/HowItWorks
Update
I downloaded this yesterday and gave it a whirl on our site. The site is large, complex and uses a lot of JavaScript, CSS, AJAX, jQuery etc., as well as URL rewriters and so on. The installation was too easy to be true, and I had to bang my head against it a few times to get it to work. The trick... entries in the correct place in the web.config, and a close read through the AdvancedSetup.txt to flip settings manually. The site renders mostly correctly, but there are a few issues which are probably due to the naming of CSS classes - it will require some close attention and a lot of testing to make sure that it fits, but so far it looks good and well worth the cost.
Second Update: We are busy trying to get RPO hooked up. There are a couple of problems with character encoding, and possibly with the composition of some of our scripts. I have to point out that the response and support from the vendor has been very positive and proactive.
Third Update: I went ahead with the process of getting RPO integrated into the site I was involved in. Although there were some hiccups, the RPO people were very helpful and put a lot of effort into improving the product and making it fit our environment. It is definitely a no-brainer to use RPO - at that cost, with those features, it is simple to just go ahead and implement it. Job done. Move on to the next task.
I decided to answer this question again after evaluating it a little.
The image combining is really amazing
The CSS and JavaScript are nicely minified
All files are cached on the server, meaning that the server isn't caned every time a request comes in
The caching is performed at the browser level too, meaning it will still work if you use an old (unsupported) browser, because you'll just receive the page uncompressed
You can see the difference yourself: Optimized vs Unoptimized
The price is as follows...
$499 until the end of september is a steal
$199 for an annual renewal is a steal
I love how RPO is plug and play.
It will take time to create a module like theirs, and depending on workload it can be worth the $750/year versus the development time it takes to re-create it.
I'm very excited about RPO and reviewing its effect on my sites.
Something I used quite recently was a page optimization module I found on Darksider's blog. It is not nearly as intense as what RPO sets out to achieve, but a nice starting block for building your own optimization module, if that's what you're after.
Clarification on the RPO price: the launch price until the end of September 2008 is $499 - and this discount is by voucher (email service#getrpo.com to get a voucher). This includes software assurance for 12 months, after which you can choose to renew for $199 or not - the software still works.
The RPO automates 8 of Steve Souders/Yahoo's principles for High Performance Web Sites - the important thing for us was making a developer-friendly tool: you can keep your resources in the format and structure that makes sense for development, and the optimization happens at runtime.
I don't want to spam this forum with sales stuff, so just email me if you have any questions - ed.robinson#aptimize.net. Thanks for looking at the RPO.
Ed Robinson, Chief Executive Officer, Aptimize Ltd
I've been a user of the RPO since beta and have it deployed in anger on two of my sites:
http://www.syringe.net.nz (My blog) and
http://www.medrecruit.com (A company in which I have an interest)
I've done a somewhat long-winded blog post on the whole 'why not just turn on caching' question here:
http://www.syringe.net.nz/2008/10/21/RuntimePageOptimizerWhyNotJustEnableCachingInIIS.aspx
The short summary version: caching is a nice-to-have for people who aren't really geared up to turn it on in IIS (it's still not super easy in IIS 6)... the real power is in combining resources, as it's latency * request count that really kills your performance.
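To put some purely illustrative (made-up) numbers on that: a page pulling 30 separate resources at roughly 100 ms of round-trip latency each spends about 3 seconds on round trips alone, and since browsers traditionally open only two parallel connections per host, parallelism only partly hides it. Combine those 30 resources into 4 requests and the same latency costs roughly 400 ms.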
Minifying and gzipping commonly called scripts and style sheets is totally worthwhile - the file-size reduction speaks for itself. That's something you can do through your web server, without the help of another product.
However, merging scripts and styles and serving them together is an interesting idea from a general 'the fewer requests the better' standpoint.
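To make the merging idea concrete, here is a bare-bones, hand-rolled sketch of it as an ASP.NET IHttpHandler in C#. The file list is invented, and a real version would also minify, cache the combined output, check the client's Accept-Encoding header before compressing, and send far-future expiry headers:

    using System.IO;
    using System.IO.Compression;
    using System.Web;

    // One handler that concatenates a fixed list of scripts and serves
    // the result gzipped, so the page makes one request instead of many.
    public class CombinedScriptHandler : IHttpHandler
    {
        private static readonly string[] Scripts = { "~/js/jquery.js", "~/js/site.js" };

        public bool IsReusable
        {
            get { return true; }
        }

        public void ProcessRequest(HttpContext context)
        {
            context.Response.ContentType = "application/x-javascript";
            context.Response.AppendHeader("Content-Encoding", "gzip");

            using (GZipStream gzip = new GZipStream(
                context.Response.OutputStream, CompressionMode.Compress))
            using (StreamWriter writer = new StreamWriter(gzip))
            {
                foreach (string script in Scripts)
                {
                    writer.Write(File.ReadAllText(context.Server.MapPath(script)));
                    writer.Write(";\n"); // keep adjacent scripts from running together
                }
            }
        }
    }

You would map this handler to a URL via an httpHandlers entry in web.config and point a single script tag at it; products like RPO essentially do all of this (plus the caching and minification) for you.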
It looks like interesting technology - I'd try it out. It almost certainly couldn't hurt.
Just had a little look; a lot of the things they offer you should be able to do yourself with a little planning and foresight (combine all JavaScript files, combine all CSS, minify, enable GZip...).
$750 a year seems a little steep, and there are no options.
(edit)
After speaking with the marketing bods, it's $499 until the end of September, and renewing the licence will be $199. That persuades me a lot more!
I'm going to give it a whirl and then see how much it improves our DEV server.
I personally have been using a product called PageBlaster by Snapsis that does caching and minification. It is primarily used in DotNetNuke applications, but if I recall correctly it can be used with any ASP.NET application, and the price is right...