From one developer to another: use as much memory and resources as possible, because the servers can handle it and we need to make our lives easier.
There are a few vomit-inducing things that pop into my mind when I hear things like that.
I fully understand that the days of CICS and nitpicking over kilobytes of memory are over, replaced by today's get-it-done-now, throw-resources-at-it world. We have scalable clouds now, right? Why bother doing something efficiently when you can just get it done and make yourself look good (or save your job, or make your resource manager look good)? Just add another node.
Something has to give somewhere though.
As noted in a previous post, adding a millisecond to each transaction might seem like nothing, but when millions of transactions are taking place and those milliseconds are multiplying like the bunnies from that Monty Python movie, there will be performance issues.
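To put rough numbers on it, here is a back-of-the-envelope sketch (the transaction volume is a made-up figure for illustration, not from any real system): one "harmless" extra millisecond per transaction quietly turns into hours of extra compute time every day.

```python
# Back-of-the-envelope: what one extra millisecond costs at scale.
# txns_per_day is a hypothetical example figure.
extra_ms_per_txn = 1
txns_per_day = 5_000_000

extra_seconds_per_day = extra_ms_per_txn * txns_per_day / 1000
extra_hours_per_day = extra_seconds_per_day / 3600

print(f"{extra_seconds_per_day:.0f} extra seconds/day")  # 5000 seconds
print(f"{extra_hours_per_day:.2f} extra hours/day")      # about 1.39 hours
```

That is nearly an hour and a half of pure added latency per day from a single millisecond, before anyone has "just added another node."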
Then the developers complain to the system administrators, users complain to the help desk (who also complain to the system administrators), and web users complain or simply take their business elsewhere, which causes the business units to complain to their managers, who meet over lunch at a bistro with IT managers, who then order the system administrators to fix it.
Resources are finite, even in a cloud.
Resources are finite on the client, too. Many in the younger generation would rather do things on their phones than sit in front of a computer, and not every phone has Flash, Silverlight, etc., or enough display space to handle the additional burden of poorly conceived code.
I just thought of a business model: I'll ask your managers if they're having performance problems, then charge by the hour to point out issues that will cause a crap-ton of meetings, lots of printed paper, and rising turmoil between the development staff and administrators, while users think I'm a minor god (or something).