There's an oft-cited technote claiming that running multiple instances somehow performs much better than running a single instance, but the technote cites no testing methodology.
In my experience load testing, the opposite is generally true: you get better performance with fewer instances. In the one specific case where you're handling a very large number of very small requests, the single-threaded nature of the datasource connection pool becomes a bottleneck, but it's easier (and less resource-intensive) to work around that by defining a modest number of identically-configured datasources and choosing one at random at the start of each request, as in the sketch below.
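A minimal CFML sketch of that workaround, assuming four identically-configured datasources named appDS1 through appDS4, all pointing at the same database (the names, the count, the request-scope variable, and the table/column are all hypothetical):

    <!--- Pick one of N identical datasources at random, once per request,
          so load spreads across N connection pools instead of one. --->
    <cfset dsCount = 4>
    <cfset request.dsn = "appDS" & randRange(1, dsCount)>

    <!--- Every query in this request then goes through the chosen pool: --->
    <cfquery name="q" datasource="#request.dsn#">
        SELECT some_column FROM some_table
    </cfquery>

In practice the cfset lines would typically live in onRequestStart() in Application.cfc, so every query in a given request sticks to the same pool.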
If you're actually memory-constrained, then you're better off with a single large 64-bit instance than with a bunch of small 32-bit instances.
(I demonstrated to one customer, via load testing, that their app had 10-15% better throughput when three of their four instances were disabled, and the techie who had pushed for Enterprise + multiple instances said, and I quote, "I don't believe you." Believe? A result that changes consistently and predictably when one variable changes somehow involves "belief"? What's happened to /science/ in this country?)