[This is the third in a five-part series on ways to help make your apps run better on lower-cost phones.--Ed.]
When thinking about targeting 256MB devices, your first order of business should be to check whether your apps exceed 90MB today. If they don't, then reducing memory usage is less urgent (but still important as I'll mention later). If they do, then memory tuning will be required per certification requirement 5.2.5.
Leverage the memory profiler and memory-related APIs to identify opportunities to improve memory usage in your app. These tools can help you quickly understand your app's peak memory usage as well as how memory usage breaks down across the various states of your app.
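As a quick sketch of what in-app instrumentation can look like, the snippet below polls the DeviceStatus properties from Microsoft.Phone.Info once per second while you exercise each page of your app. (The MemoryWatcher class name is illustrative; DeviceStatus also exposes ApplicationMemoryUsageLimit starting with the 7.1.1 SDK update if you want to log headroom as well.)

```csharp
using System;
using System.Diagnostics;
using System.Windows.Threading;
using Microsoft.Phone.Info;

// Debug-only instrumentation: log the app's current and peak memory
// usage once per second while you walk through each page of the app.
public static class MemoryWatcher
{
    private static DispatcherTimer _timer;

    [Conditional("DEBUG")] // compiled out of release builds
    public static void Start()
    {
        _timer = new DispatcherTimer { Interval = TimeSpan.FromSeconds(1) };
        _timer.Tick += (s, e) => Debug.WriteLine(
            "Memory: {0:N0} current / {1:N0} peak (bytes)",
            DeviceStatus.ApplicationCurrentMemoryUsage,
            DeviceStatus.ApplicationPeakMemoryUsage);
        _timer.Start();
    }
}
```

Call MemoryWatcher.Start() from App's constructor and watch the Output window as you navigate; sustained growth between visits to the same page is a good hint that something is leaking.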
A general principle to keep in mind is that loading less will result in less memory usage. Loading less could mean loading less data at a time, loading less/smaller content, or fixing leaks which result in more objects residing in memory over the lifetime of your app.
Many apps load and display lists of data (news articles, recipes, events, search results, etc.), often across several pivots on a page. If each of these pivots loads hundreds of list items (usually with images attached to each one), memory is wasted, particularly if (1) the user never views particular pivots or (2) the user never scrolls past the first 20 or so items in any given list. If several pages in your app display lists of data this way, memory usage can add up quickly as users navigate through your app. Wherever possible, load less data at a time. Defaulting to a small number of items in each list and then loading more data as the user requests it lets you load the most relevant content in your UI at once (rather than potentially harming interactivity by loading data in response to user input like pivot selection changes) while keeping your app's memory footprint at a reasonable level.
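A minimal sketch of that pattern follows. The names here (NewsItem, FetchPage, the page size) are illustrative placeholders, not real APIs; the point is that only the first page of items is materialized at startup, and more are appended on demand.

```csharp
using System.Collections.Generic;
using System.Collections.ObjectModel;

// Illustrative item type -- substitute your own model class.
public class NewsItem
{
    public string Title { get; set; }
}

public class ArticleListViewModel
{
    private const int PageSize = 20; // small default keeps startup memory low
    private int _nextIndex;

    public ObservableCollection<NewsItem> Items { get; private set; }

    public ArticleListViewModel()
    {
        Items = new ObservableCollection<NewsItem>();
        LoadNextPage(); // only the first PageSize items at startup
    }

    // Bind this to a "load more" button, or call it when the user
    // scrolls near the end of the list.
    public void LoadNextPage()
    {
        foreach (NewsItem item in FetchPage(_nextIndex, PageSize))
            Items.Add(item);
        _nextIndex += PageSize;
    }

    private IEnumerable<NewsItem> FetchPage(int start, int count)
    {
        // Placeholder: call your web service with paging parameters here.
        yield break;
    }
}
```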
Asset memory (for images, sound effects, etc.) can quickly add up if you're not careful about how you load assets. The more assets you load into memory at a time, the larger your memory footprint becomes. Not only does the amount of content you load affect memory usage, but so does its size. The memory used by an image, for instance, can be approximated as width × height × bytes per pixel (typically 4 bytes for 32-bit ARGB). Higher resolution images naturally translate into higher memory usage: loading a 1024x1024 image into memory only to scale it down to a 100x100 container consumes roughly 4MB where about 40KB would do. If you're doing this for every image in a list or in a photo gallery feature, you're using far more memory than you need to. Load thumbnails where it makes sense, and load higher resolution content as the user requests it. If your content is preloaded in your app, make sure its resolution aligns with the sizes of the containers that will host it. If you're loading images from a web service, check for options to request lower resolution versions instead of full-size ones. PictureDecoder.DecodeJpeg can generate thumbnails from high resolution image streams, so use this to your advantage. If you're building a game, load assets for the on-screen experience only and flush them when they are no longer necessary. Hanging on to assets unnecessarily can result in OOM (out-of-memory) exceptions or general performance problems as your game grows.
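A small sketch of the thumbnail approach, using the PictureDecoder.DecodeJpeg overload that takes maximum pixel dimensions (the helper method and the 100x100 target are illustrative):

```csharp
using System.IO;
using System.Windows.Media.Imaging;
using Microsoft.Phone;

public static class ThumbnailLoader
{
    // Decode a JPEG stream directly at thumbnail size instead of decoding
    // at full resolution and letting the image container scale it down.
    public static WriteableBitmap LoadThumbnail(Stream jpegStream)
    {
        // The decoder produces a bitmap that fits within the given bounds,
        // so a 1024x1024 source costs on the order of 40KB here rather
        // than the ~4MB a full-resolution decode would.
        return PictureDecoder.DecodeJpeg(jpegStream, 100, 100);
    }
}
```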
Memory leaks can bloat your app's footprint and should be identified and addressed. Common sources of memory leaks are circular navigation loops and failing to deregister event handlers from long-lived objects. Apps that implement a home button can create circular navigation loops, filling the back stack with redundant page instances. These page instances consume memory and could cause your app to crash with an OOM (out-of-memory) exception as users navigate through it. Avoid circular navigation loops by removing home button functionality or by leveraging the new back stack manipulation APIs exposed in Windows Phone 7.5. Neglecting to deregister event handlers from static objects can prevent objects from being reclaimed by the garbage collector. Imagine a search results page where the user navigates back and forth between the details of a search result and the search results page itself. If the details pages hold references to long-lived objects, they can continue to reside in memory even after the user navigates back from them. Be sure to deregister event handlers from static objects when they are no longer needed to prevent such leaks. OnRemovedFromJournal is the recommended place to perform this task since it handles page closure both via backward navigation and via the back stack manipulation APIs.
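The two fixes can be sketched as follows (assumptions: a static App.SearchService object with a ResultsUpdated event stands in for your long-lived object; the home-button cleanup lives in your main page):

```csharp
using System;
using System.Windows.Navigation;
using Microsoft.Phone.Controls;

public partial class ResultDetailsPage : PhoneApplicationPage
{
    protected override void OnNavigatedTo(NavigationEventArgs e)
    {
        base.OnNavigatedTo(e);
        // Subscribing to an event on a static (long-lived) object roots
        // this page instance until the handler is removed.
        App.SearchService.ResultsUpdated += OnResultsUpdated;
    }

    // Fires when the page leaves the journal -- whether via a backward
    // navigation or via the back stack manipulation APIs -- making it the
    // reliable place to detach handlers from long-lived objects.
    protected override void OnRemovedFromJournal(JournalEntryRemovedEventArgs e)
    {
        App.SearchService.ResultsUpdated -= OnResultsUpdated;
        base.OnRemovedFromJournal(e);
    }

    private void OnResultsUpdated(object sender, EventArgs e)
    {
        // Refresh the displayed result here.
    }
}
```

For the circular-navigation case, a common pattern is to let the home button navigate forward to the main page as usual, then flatten the loop in the main page's OnNavigatedTo using the Windows Phone 7.5 back stack APIs:

```csharp
// In MainPage.OnNavigatedTo: discard the redundant history accumulated
// by "home" navigations so page instances can be reclaimed.
while (NavigationService.CanGoBack)
    NavigationService.RemoveBackEntry();
```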
Additionally, you should be careful with the use of memory-intensive controls like the WebBrowser and Maps controls. IE can consume a lot of memory when rendering complex websites, and this is true even today. The WebBrowser control essentially embeds IE in your app, so if you allow users to navigate to arbitrary complex websites in your app, the memory that IE uses will be attributed to your app. This could result in your app running out of memory if the sum of your app's memory usage plus IE's memory usage exceeds the recommended 90MB limit. If you'd like to use the WebBrowser control, just be mindful of the content that users are allowed to load. If the content isn't very complex, then performance/memory issues shouldn't be a concern. If users can load complex content in the control and your app's footprint is already large, you may want to consider using the WebBrowserTask on 256MB devices to reduce the memory usage inside your app. Similarly, for the Maps control, you may want to leverage the BingMapsTask/BingMapsDirectionsTask to offload map memory to a system process. If a customized map experience is critical to your app, then loading fewer overlays or otherwise simplifying your map experience can help save some memory as well.
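Launching the external browser via the WebBrowserTask is only a couple of lines (the URL here is a placeholder); the page then renders in IE's own process, so its memory is not charged against your app:

```csharp
using System;
using Microsoft.Phone.Tasks;

// Hand heavy web content off to Internet Explorer instead of hosting it
// in an in-app WebBrowser control.
var browser = new WebBrowserTask
{
    Uri = new Uri("http://example.com/", UriKind.Absolute)
};
browser.Show();
```

The BingMapsTask works the same way for map scenarios: set its properties (such as SearchTerm or ZoomLevel) and call Show() to open the built-in Maps app in a system process.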
As I mentioned before, if you're exceeding 90MB today, then you're likely violating some of these principles and tuning will be required to target 256MB devices. If you're not exceeding 90MB, tuning the memory usage of your app is less urgent but still has its benefits.
To enable apps to use up to 90MB on 256MB devices without jeopardizing the overall integrity of the system, we improved paging support in the OS with the Windows Phone 7.5 Refresh to better manage and distribute memory between background services/processes and the foreground app. While paging is generally abstracted from you, apps may perform slower or glitch occasionally if they push the device to its limits. Apps that use less than 60MB of memory will generally not be impacted by paging. Apps that use between 60MB and 90MB of memory will participate in paging, more so as you approach 90MB. Apps that fall in this range will be more likely to suffer from slower performance or glitches depending on the rest of the activity on the system. The smaller your app's memory footprint, the less likely it will be impacted by this variability.
In addition to having better performance, apps that use less memory are more likely to fast resume. A precondition to fast resume is that the OS has enough free RAM available to keep apps dormant (in memory) in the back stack. On 256MB devices, free RAM is not as abundant as on 512MB devices, so an app that approaches 90MB of memory usage will be tombstoned immediately as it leaves the foreground to free up RAM for the incoming app. Apps that use less memory are more likely to be kept alive in the back stack. It should be noted that the Task Switcher (tap and hold Back to visualize the back stack and quickly navigate to a given app) will always be available on 256MB devices. What we're discussing here is merely the mechanism by which a 7.1 app will fast resume (the existing app instance is returned to the foreground from memory) vs. resume from tombstoning (where the app instance is torn down when it is placed in the back stack, and a new app instance is created when the app returns to the foreground). Apps that use less memory are more likely to benefit from fast resume.
Tuning memory usage is a critical component of optimizing apps for 256MB devices. After becoming comfortable with these guidelines, it will become quite natural for you to build very efficient apps. It's a lot easier to build efficient apps from the beginning than to try and tack on efficiency after you've architected your app, so take comfort in knowing that the knowledge you gain tuning your existing apps (and seeing the results you achieve) will be directly applicable to your future Windows Phone projects. At the end of the day, your users will notice and appreciate the efficiency gains.
Make sure to check out the other stories in this series:
Well, here is the result of this lousy decision so far; needless to say, key apps are unusable on such 256MB RAM devices. Why knowingly cause fragmentation instead of standardizing on the minimum requirement set before at 512MB? The price difference does not justify the resulting fragmentation. Please scrap it and let the Nokia 610 be the only device that suffers the fate of 256MB RAM, where key apps are rendered uninstallable. See the link below.
Hello! My apologies for posting here, but I don't know where to address a question and a suggestion. From what I know, there is a possibility that Windows Phone 8 will feature Kinect integration, but the HW guys are having trouble implementing the 3 cameras needed for this to work. Suggestion: remove the IR filters from the cameras and use an IR LED to avoid issues with the tracking software, so the phone will need just 2 of them. Hope this helps, and sorry again!
Thanks for this eye opener. Memory is such a complicated issue, especially for someone who is only semi-technical. I'm bookmarking this article. http://www.geekchoice.com
A very interesting post which touches on some of the major points of getting memory consumption down. Still, in some cases these techniques will lead to worse performance - not in the application being slower, but in that the user will have to wait more often for lazy load operations to finish (e.g. fetching high-res pictures). In some cases there will be tradeoffs, so it really depends on how much caching you can and want to do to avoid degrading the experience for your users. This might be tuned after detecting a device's memory, though.
On another note the memory leaking problems can also be commonly encountered on Desktop applications so looking into those kinds of problems will pay off in any environment that uses Garbage Collection.
Are there any thoughts on perhaps adding "automated tombstoning", e.g. taking an inactive app and writing it completely to paging memory instead of leaving the task to developers? Lots of apps today sadly don't tombstone properly, and for those devs that want to optimize their apps there could still be the possibility to opt in, similarly to how it's done with the exclude entry for 256MB devices... just a thought.
That is really useful information, thanks Mike. It has highlighted a couple of areas that I had forgotten about (and that in fact occasionally produced a weird bug) - by applying these techniques, the bugs are gone and even less memory is in use than before... perfect!
This is not fragmentation. What we're discussing here is merely another instance of optional hardware components.
Apps that fit within 90MB will work well across all devices. This guidance has existed for some time and most developers do already adhere to this principle. Apps that require > 90MB can declare that requirement in their manifest to exclude the apps from devices that cannot provide the additional memory. This is the same model we use today for optional hardware components like FFC and Gyro.
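For reference, the manifest opt-out described above is a single entry in WMAppManifest.xml; the requirement name shown is the one introduced with the 7.1.1 SDK update for lower-memory device support:

```xml
<!-- Inside the App element of WMAppManifest.xml: excludes this app
     from lower-memory (256MB) devices. -->
<Requirements>
  <Requirement Name="ID_REQ_MEMORY_90" />
</Requirements>
```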
Additionally, this guidance translates directly to better performance on 512MB devices as well. You should view this as an opportunity to improve the performance of your apps for all of your users with the additional addressable market enabled by 256MB devices as additional incentive.
Why knowingly create fragmentation with your own hands, where some apps run on certain WP devices and not on others? Why not keep the minimum RAM at 512MB across the board? What difference will 256MB versus 512MB make to OEMs in device pricing? The resulting issues of having 256MB WP devices really do not make any sense.