[This is the third in a five-part series on ways to help make your apps run better on lower-cost phones.–Ed.]
When thinking about targeting 256MB devices, your first order of business should be to check whether your apps exceed 90MB today. If they don’t, then reducing memory usage is less urgent (but still important as I’ll mention later). If they do, then memory tuning will be required per certification requirement 5.2.5.
Leverage the memory profiler and memory-related APIs to identify opportunities to improve memory usage in your app. These tools can quickly show you your app’s peak memory usage as well as how that usage breaks down across the various states of your app.
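For quick spot checks while testing, the DeviceStatus class in Microsoft.Phone.Info exposes the relevant counters directly; a minimal sketch (the wrapper method and log format are illustrative, the properties themselves ship with Windows Phone 7.5):

```csharp
using Microsoft.Phone.Info;

public static class MemoryDiagnostics
{
    // Writes the app's current, peak, and allowed memory usage (all in bytes)
    // to the debug output. Call this at interesting points in your app's
    // lifecycle to see how close you are to the limit.
    public static void LogUsage()
    {
        long current = DeviceStatus.ApplicationCurrentMemoryUsage;
        long peak = DeviceStatus.ApplicationPeakMemoryUsage;
        long limit = DeviceStatus.ApplicationMemoryUsageLimit;

        System.Diagnostics.Debug.WriteLine(
            "Current: {0:N0} bytes, Peak: {1:N0} bytes, Limit: {2:N0} bytes",
            current, peak, limit);
    }
}
```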
A general principle to keep in mind is that loading less results in less memory usage. Loading less could mean loading less data at a time, loading fewer or smaller pieces of content, or fixing leaks that leave objects resident in memory over the lifetime of your app.
Many apps load and display lists of data (news articles, recipes, events, search results, etc.), often across several pivots on a page. If each of these pivots loads hundreds of list items (usually with an image attached to each one), memory is wasted, particularly if (1) the user never views some of the pivots or (2) the user never scrolls past the first 20 or so items in any given list. If several pages in your app display lists of data in this way, memory usage can add up quickly as users navigate through your app. Wherever possible, load less data at a time. Defaulting to a small number of items in each list and then loading more as the user requests it lets you show the most relevant content immediately (rather than harming interactivity by deferring every data load to user input like pivot selection changes) while keeping your app’s memory footprint at a reasonable level.
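As a sketch of that load-on-demand pattern (the IncrementalList type, PageSize value, and fetchPage delegate below are illustrative names, not platform APIs):

```csharp
using System;
using System.Collections.Generic;
using System.Collections.ObjectModel;

// Keeps only the items the user has asked for in memory, rather than
// loading hundreds of items per pivot up front.
public class IncrementalList<T>
{
    private const int PageSize = 20;                      // small default page
    private readonly Func<int, int, IList<T>> _fetchPage; // (offset, count) -> items

    public ObservableCollection<T> Items { get; private set; }

    public IncrementalList(Func<int, int, IList<T>> fetchPage)
    {
        _fetchPage = fetchPage;
        Items = new ObservableCollection<T>();
        LoadNextPage(); // load only the first page up front
    }

    // Call this when the user scrolls near the end of the list
    // or taps a "load more" item.
    public void LoadNextPage()
    {
        foreach (T item in _fetchPage(Items.Count, PageSize))
            Items.Add(item);
    }
}
```

Binding a ListBox to Items and triggering LoadNextPage from a scroll or "load more" gesture keeps the footprint proportional to what the user has actually viewed.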
Asset memory (for images, sound effects, etc.) can quickly add up if you’re not careful about how you load assets. The more assets you load into memory at a time, the larger your memory footprint becomes. Not only does the amount of content you load affect memory usage, but so does the size of that content. Memory usage of an image, for instance, can be approximated as width * height * bpp (bits/pixel) / 8 bytes; at 32 bpp, a 1024×1024 image consumes roughly 4MB. Higher resolution images naturally translate into higher memory usage. Loading a 1024×1024 image into memory only to scale it down to a 100×100 container consumes far more memory than necessary, and if you’re doing this for every image in a list or in a photo gallery feature, you’re using a lot more memory than you need to be. Load thumbnails where it makes sense, and load higher resolution content as the user requests it. If your content is preloaded in your app, make sure that the resolution of the content aligns with the sizes of the containers that will host it. If you’re loading images from a web service, check for options to request lower resolution versions instead of higher resolution ones. PictureDecoder.DecodeJpeg can generate thumbnails from high resolution image streams, so use this to your advantage. If you’re building a game, load assets for the on-screen experience only and flush those assets when they are no longer necessary. Hanging on to assets unnecessarily can result in OOM (out-of-memory) exceptions or general performance problems as the size of your game grows.
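A minimal sketch of the thumbnail approach using PictureDecoder.DecodeJpeg (the helper method name and the 100×100 target are illustrative; the API is in Microsoft.Phone):

```csharp
using System.IO;
using System.Windows.Media.Imaging;
using Microsoft.Phone;

public static class ImageLoader
{
    // Decodes a JPEG stream directly at thumbnail size instead of decoding
    // the full-resolution image and scaling it down in the UI. A 1024x1024
    // image at 32 bpp costs ~4 MB in memory; a 100x100 thumbnail ~40 KB.
    public static WriteableBitmap LoadThumbnail(Stream jpegStream)
    {
        // maxPixelWidth/maxPixelHeight bound the decoded size to the
        // container that will display the image.
        return PictureDecoder.DecodeJpeg(jpegStream, 100, 100);
    }
}
```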
Memory leaks can bloat your app’s footprint and should be identified and addressed. Common sources of memory leaks are circular navigation loops and failing to deregister event handlers from long-lived objects. Apps that include a home button can create circular navigation loops that fill the back stack with redundant page instances. These page instances consume memory and could cause your app to crash with an OOM (out-of-memory) exception as users navigate through it. Avoid circular navigation loops by removing home button functionality or by leveraging the new back stack manipulation APIs exposed in Windows Phone 7.5. Neglecting to deregister event handlers from static objects can prevent objects from being reclaimed by the garbage collector. Imagine a search results page where the user navigates back and forth between the details of a search result and the search results page itself. If the details pages hold references to long-lived objects, they can continue to reside in memory even after the user navigates back from them. Be sure to deregister event handlers from static objects when they are no longer needed to prevent such leaks. OnRemovedFromJournal is the recommended place to perform this task since it fires when a page is closed both via backward navigation and via the back stack manipulation APIs.
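A sketch of that pattern on a details page (SearchService and its static ResultsUpdated event are illustrative names standing in for any long-lived object your page subscribes to):

```csharp
using System.Windows.Navigation;
using Microsoft.Phone.Controls;

public partial class DetailsPage : PhoneApplicationPage
{
    protected override void OnNavigatedTo(NavigationEventArgs e)
    {
        base.OnNavigatedTo(e);
        // A static event holds a reference to this page through the handler,
        // which will keep the page alive until the handler is removed.
        SearchService.ResultsUpdated += OnResultsUpdated;
    }

    protected override void OnRemovedFromJournal(JournalEntryRemovedEventArgs e)
    {
        base.OnRemovedFromJournal(e);
        // Fires whether the page leaves via a backward navigation or via the
        // back stack manipulation APIs, so the handler is always released
        // and the page can be garbage collected.
        SearchService.ResultsUpdated -= OnResultsUpdated;
    }

    private void OnResultsUpdated(object sender, System.EventArgs e)
    {
        // refresh the UI with the latest results
    }
}
```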
Additionally, be careful with memory-intensive controls like the WebBrowser and Maps controls. IE can consume a lot of memory when rendering complex websites, and that remains true today. The WebBrowser control essentially embeds IE in your app, so if you allow users to navigate to arbitrarily complex websites, the memory IE uses is attributed to your app. This could cause your app to run out of memory if the sum of your app’s memory usage and IE’s memory usage exceeds the recommended 90MB limit. If you’d like to use the WebBrowser control, just be mindful of the content users are allowed to load. If the content isn’t very complex, then performance/memory issues shouldn’t be a concern. If users can load complex content in the control and your app’s footprint is already large, you may want to consider using the WebBrowserTask on 256MB devices to move that memory usage out of your app. Similarly, for the Maps control, you may want to leverage the BingMapsTask/BingMapsDirectionsTask to offload map memory to a system process. If a customized map experience is critical to your app, then loading fewer overlays or otherwise simplifying your map experience can help save some memory as well.
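Launching the WebBrowserTask is a one-liner-style handoff; a minimal sketch (the helper method name and URL are placeholders):

```csharp
using System;
using Microsoft.Phone.Tasks;

public static class BrowserHelper
{
    // Hands complex web content to the system browser instead of hosting it
    // in an in-app WebBrowser control, so IE's memory is charged to a system
    // process rather than counted against your app's footprint.
    public static void ShowInSystemBrowser(string url)
    {
        WebBrowserTask browser = new WebBrowserTask();
        browser.Uri = new Uri(url);
        browser.Show();
    }
}
```

The trade-off is that the user leaves your app while browsing; on 256MB devices that is often preferable to risking an OOM exception in-process.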
As I mentioned before, if you’re exceeding 90MB today, then you’re likely violating some of these principles and tuning will be required to target 256MB devices. If you’re not exceeding 90MB, tuning the memory usage of your app is less urgent but still has its benefits.
To enable apps to use up to 90MB on 256MB devices without jeopardizing the overall integrity of the system, we improved paging support in the OS with the Windows Phone 7.5 Refresh to better manage and distribute memory between background services/processes and the foreground app. While paging is generally abstracted from you, apps may perform more slowly or glitch occasionally if they push the device to its limits. Apps that use less than 60MB of memory will generally not be impacted by paging. Apps that use between 60MB and 90MB will participate in paging, increasingly so as they approach 90MB, and will be more likely to suffer slower performance or glitches depending on the rest of the activity on the system. The smaller your app’s memory footprint, the less likely it is to be impacted by this variability.
In addition to performing better, apps that use less memory are more likely to fast resume. A precondition for fast resume is that the OS has enough free RAM available to keep apps dormant (in memory) in the back stack. On 256MB devices, free RAM is not as abundant as on 512MB devices, so an app that approaches 90MB of memory usage will be tombstoned immediately as it leaves the foreground to free up RAM for the incoming app, while apps that use less memory are more likely to be kept alive in the back stack. It should be noted that the Task Switcher (tap and hold Back to visualize the back stack and quickly navigate to a given app) will always be available on 256MB devices. What we’re discussing here is merely the mechanism by which a 7.1 app resumes: fast resume (the existing app instance is returned to the foreground from memory) vs. resume from tombstoning (the app instance is torn down when it is placed in the back stack, and a new instance is created when the app returns to the foreground).
Tuning memory usage is a critical part of optimizing apps for 256MB devices. Once you become comfortable with these guidelines, building efficient apps will feel quite natural. It’s a lot easier to build efficient apps from the beginning than to try to tack on efficiency after you’ve architected your app, so take comfort in knowing that the knowledge you gain tuning your existing apps (and seeing the results you achieve) will be directly applicable to your future Windows Phone projects. At the end of the day, your users will notice and appreciate the efficiency gains.
Make sure to check out the other stories in this series:
- Optimizing Apps for Lower Cost Devices (part 1)
- Optimize Start Up Time (part 2)
- Reduce Memory Usage (part 3)
- Handle Feature Reductions (part 4)
- Respond to User Input (part 5)
Updated November 7, 2014 11:59 pm