Every user consumes system resources, so you need to know how your system reacts to abnormally high demand. Keep user sessions as small as possible. It is often better to re-execute an operation than to cache data for 30 minutes that may never be accessed and that puts the system at risk. Java's SoftReferences can be a good compromise, but they require more complex coding. Be aware that some users are more expensive than others because they do more -- like buy stuff. There is no direct defense against expensive users, but you can test to ensure your system handles double the current number of expensive users (which implies that you have a way of figuring out how many expensive users you have). Web sessions are the weak point of web applications, so watch them closely.
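The SoftReference compromise can be sketched as a small cache whose entries the garbage collector may reclaim under memory pressure, so a miss simply re-executes the operation. This is a minimal sketch; the class name and compute-function parameter are illustrative, not from the source.

```java
import java.lang.ref.SoftReference;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// Sketch of a soft-reference cache: the GC may clear entries when memory
// gets scarce, so a miss falls back to recomputing the value.
public class SoftCache<K, V> {
    private final Map<K, SoftReference<V>> map = new ConcurrentHashMap<>();

    public V get(K key, Function<K, V> compute) {
        SoftReference<V> ref = map.get(key);
        V value = (ref == null) ? null : ref.get();   // null if never cached or reclaimed
        if (value == null) {
            value = compute.apply(key);               // re-execute rather than pin memory
            map.put(key, new SoftReference<>(value));
        }
        return value;
    }
}
```

The extra coding complexity the text mentions shows up in the double null check: a SoftReference can be present in the map yet already cleared, so callers must always be prepared to recompute.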
Some users are bad, either by accident or by design. Some deviate from the expected workflow and stress your system. Some users are actually programs, such as spiders or robots, that can stress your system. Legitimate search engines honor a robots.txt file, which lets you control access to your site. Others don't, leaving you two choices: add firewall rules to block unwanted IP ranges, or create a "terms of service" agreement and sic lawyers on the offending parties.
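A minimal robots.txt sketch, served from the site root; the crawler name and paths are hypothetical, and only well-behaved crawlers will obey it:

```
# Keep all crawlers out of expensive dynamic pages (hypothetical path).
User-agent: *
Disallow: /search

# Ban one misbehaving crawler entirely ("BadBot" is a made-up name).
User-agent: BadBot
Disallow: /
```

Remember this is purely advisory: a crawler that ignores robots.txt is exactly the case where the firewall rules or the lawyers come in.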
Remember that users consume memory, so make sure sessions are used as caches and not as the "database of record"; that way they can be safely purged when memory becomes scarce. Some users are weird and will do strange things that cannot be defended against. Malicious users exist; you can help your cause by knowing your network design and keeping all your software patched and up to date. Users gang up on you when your site becomes the focal point of sudden interest, such as a link from Slashdot. Use special stress tests to ensure your system can handle the extra load.
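The session-as-cache rule can be sketched as follows: the session durably holds only the user id, and everything else is a cache entry that can be purged and reloaded from the real database of record. All names here are hypothetical, and a plain Map stands in for HttpSession attributes.

```java
import java.util.Map;
import java.util.function.LongFunction;

// Sketch: treat the session as a cache. Only the user id is irreplaceable;
// the profile can always be re-fetched, so purging it loses nothing.
public class SessionProfile {
    static Object profileFor(Map<String, Object> session,
                             LongFunction<Object> loadFromDb) {
        Object p = session.get("profile");
        if (p == null) {                          // purged under memory pressure, or first use
            long userId = (Long) session.get("userId"); // the one durable fact
            p = loadFromDb.apply(userId);         // slower, but safe to repeat
            session.put("profile", p);
        }
        return p;
    }
}
```

The design point is that a memory-scarcity purge of "profile" degrades performance, not correctness, because nothing in the session is the only copy.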