  • Ensure that production builds and deployments are automated. Having people manually perform builds introduces too much risk and variability
  • Ensure that EARs can be rolled back quickly and in an automated fashion. The previously deployed EAR should always remain on the server, ready to be redeployed
  • Consider using a "Customer Experience Management" tool like Tealeaf or Coradiant. These tools can record end-user sessions and are incredibly useful for troubleshooting and recreating errors. You may want to modify ATG's logging to print the session ID with each entry to the log file
  • Ensure that an intelligent load balancing strategy is in place. Load balancing based solely on TCP pings is not acceptable, as an instance may be unusable for various reasons yet still respond to TCP pings. A good approach is to have a "healthcheck.jsp" that checks a number of application-level indicators of health and then prints "OK" or "FAIL." The load balancer (or Apache) can periodically poll healthcheck.jsp, grep for "OK" or "FAIL," and take action accordingly
  • Make sure that CSS, JavaScript, and image files are re-fetched from the server following a new code deployment. If you're not careful, these files can stay permanently cached on the client side. A common approach is to append a version or build number to static asset URLs so browsers fetch fresh copies after each deployment
  • Ensure that search engines never index URLs containing a rewritten session ID (e.g. ";jsessionid"). If your site is live, search Google for " jsessionid" to see whether any indexed pages contain rewritten URLs
  • Verify that code is in place to programmatically invalidate HTTP sessions created by bots after each HTTP request. Search engines (should) crawl your site in a stateless fashion, meaning each HTTP request creates a new HTTP session. With thousands of HTTP requests per crawl and multiple search engines crawling, the number of sessions and the memory those sessions consume can quickly get out of hand
  • Consider having a different pool of instances that handles HTTP requests from bots. A layer 7-based load balancer can direct HTTP requests from bots to that special pool. Bots can be aggressive and handling bots requires special code/configuration. In order to isolate any damage done by bots, it's a good idea to keep that traffic separate from everything else
  • Be sure to check for broken links. Use a link-checker tool like Xenu
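The session-ID logging suggestion above can be sketched as follows. This is a generic illustration, not ATG's actual logging API; the formatLogEntry helper is hypothetical, and in a real request-handling pipeline the ID would come from request.getSession().getId().

```java
// Sketch: prefix each log entry with the HTTP session ID so server-side log
// lines can be correlated with sessions recorded by a tool like Tealeaf.
public class SessionAwareLog {
    // Hypothetical helper; a real integration would plug into ATG's log listeners.
    static String formatLogEntry(String sessionId, String level, String message) {
        return String.format("[%s] %s: %s", sessionId, level, message);
    }

    public static void main(String[] args) {
        // In a servlet, sessionId would be request.getSession().getId()
        System.out.println(formatLogEntry("A1B2C3D4", "ERROR", "Checkout failed"));
    }
}
```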
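The healthcheck.jsp logic above could look roughly like this. The individual checks are stand-ins (a real page might run a trivial database query, touch the repository, or check free heap); only the single "OK"/"FAIL" token matters to the load balancer.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.Supplier;

// Sketch: run several application-level checks and print a single
// "OK" or "FAIL" token for the load balancer (or Apache) to grep.
public class HealthCheck {
    static String evaluate(Map<String, Supplier<Boolean>> checks) {
        for (Map.Entry<String, Supplier<Boolean>> e : checks.entrySet()) {
            if (!e.getValue().get()) {
                return "FAIL " + e.getKey(); // name the failing check for the logs
            }
        }
        return "OK";
    }

    public static void main(String[] args) {
        Map<String, Supplier<Boolean>> checks = new LinkedHashMap<>();
        checks.put("database", () -> true); // stand-in for e.g. "SELECT 1" succeeding
        checks.put("freeHeap", () -> Runtime.getRuntime().freeMemory() > 0);
        System.out.println(evaluate(checks)); // prints "OK"
    }
}
```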
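The cache-busting approach for static assets can be sketched like this. The assetUrl helper and BUILD_VERSION constant are illustrative assumptions; in practice the version would be stamped by the automated build and used wherever asset URLs are rendered.

```java
// Sketch: append the build version to static asset URLs so browsers
// re-fetch CSS/JavaScript/images after each deployment instead of
// serving permanently cached copies.
public class AssetVersioner {
    static final String BUILD_VERSION = "2013.04.1"; // hypothetical; set by the build

    static String assetUrl(String path) {
        return path + (path.contains("?") ? "&" : "?") + "v=" + BUILD_VERSION;
    }

    public static void main(String[] args) {
        System.out.println(assetUrl("/css/site.css")); // /css/site.css?v=2013.04.1
    }
}
```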
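Detecting bot requests so their sessions can be invalidated might be sketched as below. The user-agent substrings are examples only; in a real servlet filter you would call request.getSession(false).invalidate() after the response is committed when isBot(...) returns true.

```java
import java.util.Locale;

// Sketch: classify a request as bot traffic by its User-Agent header so the
// HTTP session it created can be invalidated after each request.
public class BotSessions {
    // Example markers; maintain this list from your own access logs.
    static final String[] BOT_MARKERS = { "googlebot", "bingbot", "slurp", "spider", "crawler" };

    static boolean isBot(String userAgent) {
        if (userAgent == null) return true; // real browsers always send a User-Agent
        String ua = userAgent.toLowerCase(Locale.ROOT);
        for (String marker : BOT_MARKERS) {
            if (ua.contains(marker)) return true;
        }
        return false;
    }

    public static void main(String[] args) {
        System.out.println(isBot("Googlebot/2.1 (+http://www.google.com/bot.html)")); // true
        System.out.println(isBot("Mozilla/5.0 (Windows NT 10.0; Win64; x64)"));       // false
    }
}
```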