The Dirty Little Secret About Mobile Benchmarks

 

This article has had almost 30,000 views. Thanks for reading it.

When I wrote this article over a year ago, most people believed mobile benchmarks were a strong indicator of device performance. Since then a lot has happened: both Samsung and Intel were caught cheating, and some of the most popular benchmarks are no longer used by leading bloggers because they are too easy to game. By now almost every mobile OEM has figured out how to game popular benchmarks including 3DMark, AnTuTu, Vellamo 2 and others. The iPhone hasn’t been called out yet, but Apple has been caught cheating on benchmarks before, so there is a high probability they are employing one or more of the techniques described below, such as driver tricks. Although Samsung and the Galaxy Note 3 have received a bad rap over this, the actual impact on their benchmark results was fairly small, because none of the GPU frequency optimizations that helped the Exynos 5410 scores exist on Snapdragon processors. Even with the Samsung CPU cheats, this time around the performance deltas were only 0-5%.

11/26/13 Update: 3DMark just delisted mobile devices with suspicious benchmark scores. More info.

2/1/17 Update: XDA just accused Chinese phone manufacturers of cheating on benchmarks. You can read the full article here.

Mobile benchmarks are supposed to make it easier to compare smartphones and tablets. In theory, the higher the score, the better the performance. You might have heard the iPhone 5 beats the Samsung Galaxy S III in some benchmarks. That’s true. It’s also true the Galaxy S III beats the iPhone 5 in other benchmarks, but what does this really mean? And more importantly, can benchmarks really tell us which phone is better than another?

Why Mobile Benchmarks Are Almost Meaningless

    1. Benchmarks can easily be gamed – Manufacturers want the highest possible benchmark scores and are willing to cheat to get them. Sometimes this is done by optimizing code so it favors a certain benchmark; the optimization results in a higher benchmark score but has no impact on real-world performance. Other times, manufacturers cheat by tweaking drivers to ignore certain operations, lower rendering quality to improve performance, or offload processing elsewhere. The bottom line is that almost all benchmarks can be gamed. Computer graphics card makers figured this out a long time ago, and there are many well-documented accounts of Nvidia, AMD and Intel cheating to improve their scores.

Here’s an example of this type of cheating: Samsung created a whitelist for Exynos 5-based Galaxy S4 phones which allows some of the most popular benchmarking apps to shift into a high-performance mode not available to most applications. These apps run the GPU at 532MHz, while other apps cannot exceed 480MHz. This cheat was confirmed by AnandTech, one of the most respected names in both PC and mobile benchmarking. Samsung claims “the maximum GPU frequency is lowered to 480MHz for certain gaming apps that may cause an overload, when they are used for a prolonged period of time in full-screen mode,” but it doesn’t make sense that the S Browser, Gallery, Camera and Video Player apps can all run with the GPU wide open while all games are forced to run at a much lower speed.

Samsung isn’t the only manufacturer accused of cheating. Back in June, Intel shouted at the top of their lungs about the results of an ABI Research report that claimed their Atom processor outperformed ARM chips by Nvidia, Qualcomm and Samsung. This raised quite a few eyebrows, and further research showed the Intel processor was not completely executing all of the instructions. After AnTuTu released an updated version of the benchmark, Intel’s scores dropped overnight by 20% to 50%. Was this really cheating? You can decide for yourself — but it’s hard to believe Intel didn’t know their chip was bypassing large portions of the tests AnTuTu was running. It’s also possible to fake benchmark scores outright, as in this example.

Intel has even gone so far as to create their own suite of benchmarks that they admit favor Intel processors. You won’t find the word “Intel” anywhere on the BenchmarkXPRT website, but if you check the small print on some Intel websites you’ll find they admit “Intel is a sponsor and member of the BenchmarkXPRT Development Community, and was the major developer of the XPRT family of benchmarks.” Intel also says “Software and workloads used in performance tests may have been optimized for performance only on Intel microprocessors.” Bottom line: Intel made these benchmarks to make Intel processors look good and others look bad.
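To make the whitelist mechanism concrete, here is a toy sketch of the kind of logic described above. The package names are illustrative, and the real gating happens in Samsung’s firmware and drivers, not in app-level code; only the two GPU frequencies come from the confirmed reports.

```python
# Hypothetical sketch of a benchmark whitelist. The 532MHz/480MHz caps are
# the figures reported for the Exynos 5-based Galaxy S4; the package names
# below are illustrative stand-ins.

BENCHMARK_WHITELIST = {
    "com.antutu.ABenchMark",          # AnTuTu (illustrative package name)
    "com.quicinc.vellamo",            # Vellamo
    "com.glbenchmark.glbenchmark25",  # GLBenchmark
}

def max_gpu_frequency_mhz(package_name):
    """Return the GPU frequency cap an app is allowed to reach."""
    if package_name in BENCHMARK_WHITELIST:
        return 532  # high-performance mode reserved for benchmarking apps
    return 480      # cap applied to ordinary apps, including all games

print(max_gpu_frequency_mhz("com.antutu.ABenchMark"))  # 532
print(max_gpu_frequency_mhz("com.gameloft.somegame"))  # 480
```

The benchmark app gets a performance mode no game will ever see, which is why the scores diverge from real-world behavior.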
    2. Benchmarks measure performance without considering power consumption – Benchmarks were first created for desktop PCs. These PCs were always plugged into the wall and had multiple fans and large heat sinks to dissipate the massive amounts of power they consumed. The mobile world couldn’t be more different. Your phone is rarely plugged into the wall — even when you are gaming. Your mobile device is also very limited in the amount of heat it can dissipate, and battery life drops as heat increases. It doesn’t matter if your mobile device is capable of incredible benchmark scores if your battery dies in only an hour or two. Mobile benchmarks don’t factor in the power needed to achieve a certain level of performance. That’s a huge oversight, because the best chip manufacturers spend incredible amounts of time optimizing power usage. Even though one processor might slightly underperform another in a benchmark, it could be far superior because it consumed half the power of the other chip. You’d have no way to know this without expensive hardware capable of performing this type of measurement.
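The point is easy to see with a toy performance-per-watt calculation. The scores and wattages below are invented for illustration, not measurements of any real chip:

```python
# Invented numbers illustrating performance per watt. Chip A "wins" the raw
# benchmark, but chip B does almost the same work on half the power.

def perf_per_watt(score, watts):
    return score / watts

chip_a = perf_per_watt(score=10_000, watts=4.0)  # 2500 points per watt
chip_b = perf_per_watt(score=9_500, watts=2.0)   # 4750 points per watt

print(chip_b > chip_a)  # True: the benchmark "loser" is the better mobile part
```

A raw score leaderboard hides exactly the metric that matters most on battery power.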

 

  • Benchmarks rarely predict real-world performance — Many benchmarks favor graphics performance and have little bearing on the things real consumers do with their phones. For example, no one watches hundreds of polygons draw on their screens, but that’s exactly the type of thing benchmarks do. Even mobile gamers are unlikely to see increased performance on devices which score higher, because most popular games don’t stress the CPU and GPU the same way benchmarks do. Benchmarks like GLBenchmark 2.5 focus on things like high-level 3D animations. One reviewer recently said, “Apple’s A6 has an edge in polygon performance and that may be important for ultra-high resolution games, but I have yet to see many of those. Most games that I’ve tried on both platforms run in lower resolution with an up-scaling.” For more on this topic, scroll down to the section titled “Case Study 2: Is the iPhone 5 Really Twice as Fast?”

This video shows that the iPhone 5s is only slightly faster than the iPhone 5 when it comes to real-world tests. For example, the iPhone 5s starts up only 1 second faster than the iPhone 5 (23 seconds vs. 24 seconds), and loads the Reddit.com site only 0.1 seconds faster. These differences are so small it’s unlikely anyone would even notice them. Would you believe the iPhone 4 shuts down five times faster than the iPhone 5s? It’s true (4 seconds vs. 21.6 seconds). Another video shows that even though the iPhone 5s does better on most graphics benchmarks, when it comes to real-world things like scrolling a webpage in the Chrome browser, Android devices scroll significantly faster than an iPhone 5s running iOS 7. See for yourself in this video.

 

The iPhone 5s appears to do well on graphics benchmarks until you realize that Android phones have almost 3x the pixels



  • Some benchmarks penalize devices with more pixels — Most graphics benchmarks measure performance in terms of frames per second. GFXBench (formerly GLBenchmark) is the most popular graphics benchmark. Apple has dominated the scores of this benchmark for one simple reason: the iPhone 4, 4S, 5 and 5s displays all have a fraction of the pixels flagship Android devices have. For example, in the chart above, the iPhone 5s gets a score of 53 fps, while the LG G2 gets a score of 47 fps. Most people would be impressed that the iPhone 5s scored 12.7% higher than the LG G2, but when you consider that the LG G2 is pushing almost 3x the pixels (2,073,600 pixels vs. 727,040 pixels), it’s clear the Adreno 330 GPU in the LG G2 is actually killing the GPU in the iPhone 5s. The GFXBench scores on the 720p Moto X (shown above) are further proof of this. This bias against devices with more pixels isn’t unique to GFXBench; you can see the same behavior with graphics benchmarks like Basemark X, shown below (where the Moto X beats the Nexus 4).
More proof that graphics benchmarks favor devices with lower-res displays

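One way to correct for this bias is to re-score fps results as fill rate (pixels drawn per second). A quick sketch using each phone’s native display resolution; the fps numbers are the GFXBench scores quoted above:

```python
# Re-scoring GFXBench-style fps numbers as pixels drawn per second, using
# each phone's native display resolution.

def pixels_per_second(fps, width, height):
    return fps * width * height

iphone_5s = pixels_per_second(53, 1136, 640)   # 53 fps at 727,040 pixels
lg_g2 = pixels_per_second(47, 1920, 1080)      # 47 fps at 2,073,600 pixels

print(lg_g2 / iphone_5s)  # ~2.5: the G2's GPU is doing far more work per frame
```

By this measure the “losing” phone is pushing roughly two and a half times the pixels per second of the “winner.”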

  • Some popular benchmarks are no longer relevant — SunSpider is a popular JavaScript benchmark that was designed to compare different browsers. However, according to at least one expert, the data set SunSpider uses is small enough that it has become more of a cache test. That’s one reason why Google came out with their V8 and Octane benchmark suites, both of which are better JavaScript tests than SunSpider. According to Google, Octane is based upon a set of well-known web applications and libraries, which means “a high score in the new benchmark directly translates to better and smoother performance in similar web applications.” Even though it may no longer be relevant as an indicator of JavaScript browsing performance, SunSpider is still quoted by many bloggers. SunSpider isn’t the only popular benchmark with issues; this blogger says BrowserMark also has problems.

SunSpider is a good example of a benchmark which may no longer be relevant — yet people continue to use it

  • Benchmark scores are not always repeatable – In theory, you should be able to run the same benchmark on the same phone and get the same results over and over, but this doesn’t always occur. If you run a benchmark immediately after a reboot and then run the same benchmark during heavy use, you’ll get different results. Even if you reboot every time before you benchmark, you’ll still get different scores due to memory allocation, caching, memory fragmentation, OS housekeeping and other factors like throttling.

Another reason you’ll get different scores on devices running exactly the same mobile processor and operating system is that different devices have different apps running in the background. For example, Nexus devices have far fewer apps running in the background than non-Nexus, carrier-issued devices. Even after you close all running apps, there are still apps running in the background that you can’t see — yet these apps are consuming system resources and can have an effect on benchmark scores. Some apps run automatically to perform housekeeping for a short period and then close. The number and types of apps vary greatly from phone to phone and platform to platform, which makes objective testing of one phone against another difficult.

Benchmark scores sometimes change after you upgrade a device to a new operating system. This makes it difficult to compare two devices running different versions of the same OS. For example, the Samsung Galaxy S III running Android 4.0 gets a Geekbench score of 1560, while the same exact phone running Android 4.1 gets a Geekbench score of 1781. That’s a 14% increase. The Android 4.4 OS causes many benchmark scores to increase, but not in all cases. For example, after moving to Android 4.4, Vellamo 2 scores drop significantly on some devices because it can’t make use of some aspects of hardware acceleration due to Google’s changes.

    Perhaps the biggest reason benchmark scores change over time is that they stress the processor, increasing its temperature. When the processor temperature reaches a certain level, the device starts to throttle, or reduce power. This is one of the reasons scores on benchmarks like AnTuTu change when they are run consecutively. Other benchmarks have the same problem. In this video, the person testing several phones gets a Quadrant Standard score of 4569 on the Nexus 4 on the first run and 4826 on a second run (skip to 14:25 to view).
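Given that run-to-run variation, a single score is close to meaningless. A small sketch of the obvious fix, summarizing repeated runs instead of trusting one number; the two Quadrant scores are the Nexus 4 results from the video cited above:

```python
import statistics

# Summarize repeated benchmark runs: report the median and the run-to-run
# spread rather than a single score.

def summarize(scores):
    median = statistics.median(scores)
    spread_pct = 100 * (max(scores) - min(scores)) / median
    return median, spread_pct

median, spread = summarize([4569, 4826])
print(f"median={median}, run-to-run spread={spread:.1f}%")  # spread ≈ 5.5%
```

A 5% spread between two back-to-back runs is larger than many of the device-versus-device deltas reviewers treat as meaningful.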

  • Not all mobile benchmarks are cross-platform — Many mobile benchmarks are Android-only and can’t help you compare an Android phone to the iPhone 5. Here are just a few popular mobile benchmarks which are not available for iOS and other mobile platforms: AnTuTu Benchmark, Octane, Neocore, NenaMark, Quadrant Standard and Vellamo.
  • Some benchmarks are not yet 64-bit — Android 5.0 supports 64-bit apps, but most benchmarks do not run in 64-bit mode yet. There are a few exceptions to this rule. A few Java-based benchmarks (Linpack, Quadrant) run in 64-bit mode and do see performance benefits on systems with 64-bit OS and processors. AnTuTu also supports 64-bit.
  • Mobile benchmarks are not time-tested — Most mobile benchmarks are relatively new and not as mature as the benchmarks which are used to test Macs and PCs. The best computer benchmarks are real-world, relevant, and produce repeatable scores. There is some encouraging news in this area, however, now that 3DMark is available for mobile devices. It would be nice if someone ported other time-tested benchmarks like SPECint to iOS as well.
Existing benchmarks don't accurately measure the impact of memory speed or throughput

Existing benchmarks don’t accurately measure storage performance on things like video playback

  • Inaccurate measurement of memory and storage performance — There is evidence that existing mobile benchmarks do not accurately measure the impact of faster memory speeds or storage performance (see the examples above and below). MobileBench is supposed to address this issue, but it would be better if there were a reliable benchmark that was not partially created by memory suppliers like Samsung.
Existing benchmarks don't accurately measure storage performance on things like video playback

Existing benchmarks don’t accurately measure the impact of memory speed or throughput

  • Inaccurate measurement of the heterogeneous nature of mobile devices — Only 15% of a mobile processor is the CPU. Modern mobile processors also have DSPs, image processing cores, sensor cores, audio and video decoding cores, and more, but not one of today’s mobile benchmarks can measure any of this. This is a big problem.
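The storage-measurement gap above is not hard to close. A minimal sequential-read throughput check, the kind of direct measurement most mobile benchmarks skip; here it runs against a temporary file, while on a real device you would point it at the storage you care about:

```python
import os
import tempfile
import time

# Time a sequential read of a whole file and report MB/s. Results on a
# desktop OS are dominated by caching; on-device numbers would differ.

def read_throughput_mb_s(path, chunk_size=1 << 20):
    """Read the file sequentially in 1 MB chunks and return MB/s."""
    size = os.path.getsize(path)
    start = time.perf_counter()
    with open(path, "rb") as f:
        while f.read(chunk_size):
            pass
    elapsed = time.perf_counter() - start
    return (size / (1 << 20)) / elapsed

# Create 8 MB of test data, measure, then clean up.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(os.urandom(8 << 20))
print(f"sequential read: {read_throughput_mb_s(tmp.name):.0f} MB/s")
os.remove(tmp.name)
```

Even a crude direct measurement like this says more about storage than a composite score that buries it.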

Case Study 1: Is the New iPad Air Really 2-5x as Fast As Other iPads?

There have been a lot of articles lately about the benchmark performance of the new iPad Air. The writers of these articles truly believe that the iPad Air is dramatically faster than any other iPad, but most real-world tests don’t show this to be true. This video compares 5 generations of iPads.

Benchmark tests suggest the iPad Air should be much faster than previous iPads


Results of side-by-side video comparisons between the iPad Air and other iPads:

  • Test 1 – Start Up – The iPad Air started up 5.73 seconds faster than the iPad 1. That’s 23% faster, yet the Geekbench 3 benchmark suggests the iPad Air should be over 500% faster than an iPad 2. I would expect the iPad Air to be more than 23% faster than a product that came out 3 years and 6 months ago. Wouldn’t you?
  • Test 2 – Page load times – The narrator claims the iPad Air’s new MIMO antennas are part of the reason the new iPad Air loads webpages so much faster. First off, MIMO antennas are not new in mobile devices; they were in the Kindle HD two generations ago. Second, Apple’s MIMO implementation apparently isn’t very effective, because if you freeze-frame the video just before 1:00, you’ll see the iPad 4 clearly loads all of the text on the page before the iPad Air. All of the images on the webpage load on the iPad 4 and the iPad Air at exactly the same time – even though browser-based benchmarks suggest the iPad Air should load web pages much faster.
  • Test 3 – Video Playback – On the video playback test, the iPad Air was no more than 15.3% faster than the iPad 4 (3.65s vs. 4.31s).

Reality: Although most benchmarks suggest the iPad Air should be 2-5x faster than older iPads, at best, the iPad Air is only 15-25% faster than the iPad 4 in real-world usage, and in some cases it is no faster.

Final Thoughts

You should never make a purchasing decision based on benchmarks alone. Most popular benchmarks are flawed because they don’t predict real-world performance and they don’t take power consumption into consideration. They measure your mobile device in a way that you never use it: running all-out while it’s plugged into the wall. It doesn’t matter how fast your mobile device can operate if your battery only lasts an hour. For this reason, top benchmarking bloggers like AnandTech have stopped using the AnTuTu, BenchmarkPi, Linpack and Quadrant benchmarks, but they still continue to propagate the myth that benchmarks are an indicator of real-world performance. They claim they use them because they aren’t subjective, but then they mislead their readers about their often meaningless nature.

Some benchmarks do have their place, however. Even though they are far from perfect, they can be useful if you understand their limitations. That said, you shouldn’t read too much into them. They are just one indicator, along with product specs and side-by-side real-world comparisons between different mobile devices.

Bloggers should spend more time measuring things that actually matter like start-up and shutdown times, Wi-Fi and mobile network speeds in controlled reproducible environments, game responsiveness, app launch times, browser page load times, task switching times, actual power consumption on standardized tasks, touch-panel response times, camera response times, audio playback quality (S/N, distortion, etc.), video frame rates and other things that are related to the ways you use your device.
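Most of the measurements in that list need nothing more than a stopwatch and repetition. A sketch of the approach, timing an operation several times and reporting the median; `launch_app()` here is a placeholder workload, since on a real device the harness would drive the UI (for example over adb) and wait for the first usable frame:

```python
import statistics
import time

# Time an operation several times and report the median wall-clock time,
# rather than trusting a single run.

def median_time_ms(operation, runs=5):
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        operation()
        samples.append(time.perf_counter() - start)
    return 1000 * statistics.median(samples)

def launch_app():
    # placeholder for "launch the app and wait until it is usable"
    sum(i * i for i in range(100_000))

print(f"median launch time: {median_time_ms(launch_app):.1f} ms")
```

Running the same harness on two phones gives a comparison grounded in something a user actually experiences.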

Although most of today’s mobile benchmarks are flawed, there is some hope for the future. Broadcom, Huawei, OPPO, Samsung Electronics and Spreadtrum recently announced the formation of MobileBench, a new industry consortium formed to provide more effective hardware and system-level performance assessment of mobile devices. They have a proposal for a new benchmark that is supposed to address some of the issues I’ve highlighted above. You can read more about this here.

A Mobile Benchmark Primer

If you are wondering which benchmarks are the best, and which should not be used, this article should be of use.

Benchmarks like this one suggest the iPhone 5 is twice as fast as the iPhone 4S.

Case Study 2: Is the iPhone 5 Really Twice as Fast?

Note: Although this section was written about the iPhone 5, this section applies equally to the iPhone 5s. Like the iPhone 5, experts say the iPhone 5s is twice as fast in some areas — yet most users will notice little if any differences that are related to hardware alone. The biggest differences are related to changes in iOS 7 and the new registers in the A7.

Apple and most tech writers believe the iPhone 5’s A6 processor is twice as fast as the chip in the iPhone 4S. Benchmarks like the one in the above chart support these claims. This video tests these claims.

In tests like this one, the iPhone 4S beats the iPhone 5 when benchmarks suggest it should be twice as slow.

Results of side-by-side comparisons between the iPhone 5 and the iPhone 4S:

  • Opening the Facebook app is faster on the iPhone 4S (skip to 7:49 to see this).
  • The iPhone 4S also recognizes speech much faster, although the iPhone 5 returns the results to a query faster (skip to 8:43 to see this). In a second test, the iPhone 4S once again beats the iPhone 5 in speech recognition and almost ties it in returning the answer to a math problem (skip to 9:01 to see this).
  • App launch times vary; in some cases the iPhone 5 wins, in others the iPhone 4S wins.
  • The iPhone 4S beats the iPhone 5 easily when SpeedTest is run (skip to 10:32 to see this).
  • The iPhone 5 does load web pages and games faster than the iPhone 4S, but it’s nowhere near twice as fast (skip to 12:56 on the video to see this).

I found a few other comparison videos like this one, which show similar results. As the video says, “Even with games like “Wild Blood” (shown in the video at 5:01) which are optimized for the iPhone 5’s screen size, looking closely doesn’t really reveal anything significant in terms of improved detail, highlighting, aliasing or smoother frame-rates.” He goes on to say, “the real gains seem to be in the system RAM which does contribute to improved day to day performance of the OS and apps.”

So the bottom line is: Although benchmarks predict the iPhone 5 should be twice as fast as the iPhone 4S, in real-world tests the difference between the two is not that large, and is partially due to the fact that the iPhone 5 has twice as much memory. In some cases, the iPhone 4S is actually faster, because it has fewer pixels to display on the screen. The same is true for tests of the iPad 4, which reviewers say “performs at least twice as fast as the iPad 3.” However, when it comes to actual game play, the same reviewer says, “I couldn’t detect any difference at all. Slices, parries and stabs against the monstrous rivals in Infinity Blade II were fast and responsive on both iPads. Blasting pirates in Galaxy on Fire HD 2 was a pixel-perfect exercise on the two tablets, even at maximum resolution. And zombie brains from The Walking Dead spattered just as well on the iPad 3 as the iPad 4.”

– Rick

Copyright 2012-2014 Rick Schwartz. All rights reserved. This article includes the opinions of the author and does not reflect the views of his employer. Linking to this article is encouraged.

Follow me on Twitter @mostlytech1

 

Connected Home Best Practices

Last update: February 17, 2013

This article has had over 700,000 views! Thanks for reading it.

Back in 2009, there wasn’t much information around to help people set up a home network for multimedia, so I wrote an article called Connected Home Best Practices. This is an updated version of that article.

Connected Home Benefits

Before I get into any details, I thought I’d mention some of the benefits you might experience if you follow my guidelines below.

  • Watch all of your movies on any TV in the house without inserting a disc into a DVD player

    You can use a game console like this to access your media

  • Access your entire music collection of CDs and downloaded music from your receiver, computer, tablet or smartphone. You can even play different music in every room if you want
  • View any photo you’ve ever taken on any TV in your house or any mobile device
  • Drag and drop photos from Flickr, Facebook, Photobucket or Picasa Web onto a device to view them
  • Access your media 24/7 without turning on a computer
  • Copy media from your computers to mobile devices without needing a cable
  • Access all of your media from a game console
  • Have media automatically copied from your mobile device to your backup drive when you enter your home

These are just a few of the things which are possible with a multimedia network and connected devices.

Step 1 – Preparing Your Media

The first step is to collect all of your media assets. If you haven’t scanned your analog photos and digitized your home movies, you should consider doing that first. If you have DVD movies, you might want to consider converting those to digital files as well. There is already a lot of info on the Internet about this, so I’m not covering it here. Here are some important things to consider as you prepare your media to be shared.

Some people digitize their DVD movie collections for easy access

    1. As you create your digital media library, try to use formats which are supported by the devices you plan to use. If you’re using AirPlay devices like Apple TV, you should use the following: AAC, MP3, Apple Lossless, AIFF and WAV audio; JPEG, GIF and TIFF images; H.264, MPEG-4 and Motion JPEG video (up to 720p). More info. If you’re using DLNA-certified devices you should try to stick with MP3 or LPCM audio; JPEG photos; and MPEG-2, MPEG4, WMV9 video. More info.
    2. Avoid buying copy-protected media when the same content exists in a legal, unprotected form. Unprotected media is superior because any device can play it and you’ll never have to worry about losing your licenses.
    3. When ripping audio CDs, choose high-bitrate MP3 or linear audio (WAV) over FLAC or Apple Lossless (ALAC), because not every device can play these formats. If you insist on lossless audio, make sure all of your devices can play the format you plan to use.
    4. It’s essential that all of your music files have accurate ID3 tags. Your media server uses these tags to create its navigation trees. If any of your music files are missing artist, genre or album tags, those artists/genres/albums won’t appear in the navigation tree. You can still access that media from the song list, but it’s more time consuming. Tip: Software like Tag & Rename can convert filenames into ID3 tags.
    5. Avoid editing metadata using iTunes or Windows Media Player unless you’re sure the changes you make are stored in your media as a standard tag format. Older versions of iTunes and Windows Media Player stored changes in the local database, so they would be lost when a file was moved. It’s better to use specialized software like Media Monkey, which enters metadata directly into an ID3 or EXIF tag, so it can be imported by software on any Mac or PC.

This drive backs up 2TB of media for only $159

  6. Create separate folders for each artist in your music library. Each artist folder should contain separate folders for each album. Each album folder should contain a JPEG file for the album cover. Normally this file is named “folder.jpg”. Your media server will use this file as album art. You can also embed album art in each music file as an ID3 tag, but doing this with file formats like WAV can cause problems with some media players.
  7. It’s a good idea to create separate folders for each year in your My Photos folder. Inside each year folder, you should have subfolders for different photo albums.
  8. You may want to also use software like Windows Live Photo Gallery to add tags or descriptions to your photos.
  9. And last but not least, make sure to perform regular backups of your media. After you do all this work, you won’t want to lose it.
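The tag-completeness advice above is easy to automate. A minimal sketch of the check, written against plain dictionaries to stay dependency-free; with a tagging library such as mutagen you would read the same fields from the files themselves:

```python
# Report which navigation-tree tags a track is missing. Works on plain
# dicts for illustration; a real script would read tags from the files.

REQUIRED_TAGS = ("artist", "album", "genre")

def missing_tags(tags):
    """Return the required tags a track lacks (empty values count as missing)."""
    return [t for t in REQUIRED_TAGS if not tags.get(t)]

track = {"title": "Song A", "artist": "Some Band", "album": ""}
print(missing_tags(track))  # ['album', 'genre']
```

Running a check like this over your library before the media server scans it catches the tracks that would otherwise silently vanish from the artist, album and genre views.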

Step 2 – Selecting a Media Server

    1. If you want to access your media without leaving a computer on, you’ll want to get a network-attached storage (NAS) device. Make sure your NAS has an embedded media server. Beware of older or inexpensive NAS devices; some of these have memory limitations or slow CPUs and have problems with large media collections.
    2. Do not encrypt the data on your NAS (at the partition or file level). The performance hit which occurs when you do this is massive, and the drop in transfer rates is likely to cause problems streaming video.

A NAS provides 24/7 access to your media

  3. Avoid using software like Microsoft’s Windows Media Player or Apple’s iTunes to share your media files. Premium media servers like TwonkyServer (and others) are faster and more reliable. They also support more devices and file formats.
  4. Make sure the virus scanner on your computer or NAS is not a CPU hog. Poorly designed virus scanners can cause skipping problems during playback.
  5. Avoid running software firewalls like ZoneAlarm (unless you know how to configure them so they don’t cause problems). Make sure your software firewall isn’t blocking any of the ports your media server needs to function. Check the manufacturer’s site to obtain this info.
  6. Avoid storing media on a network share. It’s better to share content from a hard drive in the same device where the media server resides. Network shares increase the traffic on your network and can be unreliable.
  7. Don’t nest your media deeply under many levels of folders. Doing so can slow down media scanning and increase the size of your media database.
  8. Be careful which folder you select as your watched folder. Do not select a folder your operating system constantly updates, like a Temp folder, bit-torrent download folder, or the Windows System folder. A watched folder with lots of changes can slow down your media server.

Step 3 – Choosing a Connected Media Player

There is no single media player that’s good at everything. Each has its own strengths and weaknesses. Here are some suggestions which will help you to choose the right media player for your needs:

    1. Consider buying products which have been DLNA-certified because they normally undergo a higher level of testing than other connected devices. You can search for DLNA-certified products here.
    2. Not all DLNA-certified devices can be externally controlled. Make sure your media player can accept media which is pushed from your computer or mobile device. Many devices can only pull media using their remote controls. You can find a list of devices that meet this requirement here.
    3. If you’re having problems with a device, check to see if a newer firmware update addresses those problems. Keep in mind installing new firmware can create problems, so you should only do so if you know it will fix a problem you’re experiencing, or you can revert to a previous version if needed.

The Sony PS3 is a photo and music player, but you can’t beam media to it.

    Before using a game console as a media player, you should be aware of their limitations:

    1. You cannot beam media from your PC or mobile device to a Sony PS3 or Nintendo Wii. The Xbox 360 can accept pushed media, but only when it’s in Media Center Extender mode.
    2. Game consoles do not support as many formats as other digital media players.
    3. The Xbox 360 does not display all of the items in the Twonky navigation tree. As a result, you won’t see things like By Folder, Artist Index, Artist Album, Genre Index, Genre/Artist.
    4. Most game consoles, connected TVs and Blu-ray players can only play a limited number of media formats. Because of this, you’re often better off purchasing a low-cost connected media player, like a WD TV, which supports a wider range of formats.

What to look for in a connected photo and video player

      1. Make sure your digital media player can automatically scale photos so they appear full-screen. Not all connected media players and TVs can do this.

Apple TV handles pushed slideshows well

      2. The best digital media players, like the PS3, have nice transitions in-between photos and let you play music in the background while you are watching a slideshow. The Xbox 360 has a nice zoom transition effect on photos. At this time, there are no media players other than Apple TV which can display a continuous slideshow of photos pushed from a computer or mobile device. In most cases, a black or blue screen appears in-between each photo. If you experience this problem, use the remote which came with your device and select a folder of images to view. If you do this, you should see nice transitions between every photo.

      3. Don’t assume all media players can stream 1080p videos. Some can only play 720p, or have stuttering problems when they play 1080p.
      4. Don’t assume your connected TV will be able to play all popular media formats. Most DLNA-certified TVs can only play MPEG-2, AVCHD and a few other formats. Don’t expect 3GP, QuickTime, DivX, MKV or YouTube videos to play on these devices. If you want to play YouTube videos, you’ll need a Samsung TV, Sony TV (2011 or later), Xbox 360 or a WD TV Live.

What to look for in a connected music player

      1. Should be able to play all popular music formats (e.g. MP3, FLAC, AAC/M4A, ALAC, WMA and Ogg Vorbis).
      2. If you don’t have a connected receiver, you should connect a media player to your stereo receiver.
      3. Don’t assume that all media players can accept music playlists that are beamed from computers or mobile devices. Some only allow one song to be sent at a time.
      4. Don’t assume all connected devices can have their volume changed externally. Most stereo receivers disable this feature.
      5. Look for media players which can be grouped so you can have the same music playing in different rooms of your home. Examples: Linn multi-room music systems, Philips Streamium players, Sonos ZonePlayers, etc.
      6. If you have a great stereo system, make sure your media player has an optical output or good digital-to-analog converters (e.g. Linn products, Sonos ZonePlayers).

What to look for in a media controller


Your computer or mobile device can act as a media controller. It can be used like a remote to play, stop or skip media, and it can also stream media directly to certain devices.

      1. Should work with DLNA-certified devices and AirPlay-certified devices like Apple TV.
      2. A good controller should be available for Android and iOS mobile devices as well as Mac and PCs.
      3. A good mobile controller should be able to automatically hand off its playlists to an always-on device like a NAS. That way playback will not stop when the mobile device leaves the network or goes into power standby.
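Under the hood, DLNA controllers find media servers and renderers with SSDP, a simple multicast discovery protocol that is part of UPnP. Here’s a rough Python sketch of the M-SEARCH request a control point sends; the search target shown is the standard UPnP MediaRenderer device type, while the timeout values are arbitrary choices for illustration.

```python
import socket

SSDP_ADDR, SSDP_PORT = "239.255.255.250", 1900

def build_msearch(search_target="urn:schemas-upnp-org:device:MediaRenderer:1"):
    """Build the SSDP M-SEARCH request a control point multicasts
    to discover DLNA renderers on the LAN."""
    return (
        "M-SEARCH * HTTP/1.1\r\n"
        f"HOST: {SSDP_ADDR}:{SSDP_PORT}\r\n"
        'MAN: "ssdp:discover"\r\n'
        "MX: 2\r\n"                      # devices should reply within 2 seconds
        f"ST: {search_target}\r\n"
        "\r\n"
    ).encode("ascii")

def discover(timeout=3.0):
    """Send the request and collect any responses (empty list if none)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    sock.sendto(build_msearch(), (SSDP_ADDR, SSDP_PORT))
    responses = []
    try:
        while True:
            data, addr = sock.recvfrom(65507)
            responses.append((addr[0], data.decode(errors="replace")))
    except socket.timeout:
        pass
    finally:
        sock.close()
    return responses

if __name__ == "__main__":
    for ip, reply in discover():
        print(ip, reply.splitlines()[0])
```

If a controller can’t see your devices, it usually means these multicast packets aren’t making it across your network, which is why the multicast router setting discussed later matters.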

Network-related Advice

Not all home networks are ready to stream high-definition video. Here are some suggestions that will make your home network multimedia-ready.

Wired Network Tips

      1. Connect your media players using a wired connection when you have a choice. Wired networks are capable of much higher throughput: Cat 6 Ethernet is up to 5 times faster than 802.11n wireless and less likely to have stuttering problems when streaming HD video. 802.11g Wi-Fi is fine for streaming music and photos, but can be problematic for HD video. The data rate required for DVD-quality video is 9.8Mbps, while Blu-ray is around 40Mbps. Although it seems like a 54Mbps wireless router should be able to handle this amount of data, in reality it probably can’t. Wired connections also don’t “drop” or have range problems like wireless connections do.
      2. Don’t back up your computer while you’re trying to stream HD video. This can cause videos to buffer.
      3. Don’t connect any of your PCs or devices to the “Internet” or “Uplink” ports on your router or switch.
      4. If you want to stream HD video over the Internet, check your Internet connection speed using sites like this. Most HD Internet movie streaming sites recommend a download speed of at least 3.0 Mbps.
      5. Use Cat 5e or Cat 6 network patch cords and cabling. They cost about the same as older Cat 5 cable and support Gigabit speeds.
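The arithmetic behind tip 1 is easy to sanity-check yourself. This Python sketch uses a rough rule of thumb — real-world Wi-Fi throughput is often only about half the advertised link rate — plus some headroom for bursts; both factors are assumptions for illustration, not measured values.

```python
def can_stream(stream_mbps, link_mbps, wireless=True, efficiency=0.5):
    """Rough feasibility check for streaming over a given link.

    Assumes real-world Wi-Fi delivers about half the advertised rate
    (the 0.5 efficiency factor is a rule of thumb), wired Ethernet
    closer to 90%, and requires 20% headroom above the stream bitrate."""
    effective = link_mbps * (efficiency if wireless else 0.9)
    return effective > stream_mbps * 1.2

# DVD-quality video (~9.8 Mbps) over a 54 Mbps 802.11g router: workable
print(can_stream(9.8, 54))                   # True
# Blu-ray-quality video (~40 Mbps) over the same router: not a chance
print(can_stream(40, 54))                    # False
# Blu-ray over wired 100 Mbps Fast Ethernet: fine
print(can_stream(40, 100, wireless=False))   # True
```

This is why a “54Mbps” router that looks fast enough on paper still stutters on Blu-ray-quality streams.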

Wireless Network Tips

      1. Dual-band Wi-Fi routers like the Linksys N600, N750, N900 or AC1750 are better for multimedia streaming because they support both the 2.4GHz and 5GHz bands. The 5GHz band is much less prone to interference. Use the 5GHz band if your devices support it and range is not a problem.
      2. If you have problems with wireless devices on your network, consider power-line networking alternatives like HomePlug Powerline AV adapters. If you go this route, make sure your adapter is HomePlug AV-compatible so you can mix and match devices from other companies. According to a recent article in Maximum PC, last-generation HomePlug AV 200 adapters were supposed to be capable of speeds up to 200Mb/s. Even though their real-world speeds were only 60-70Mb/s, that’s enough for a single HD stream. Newer devices support the IEEE 1901 standard, which is capable of theoretical speeds up to 500Mb/s and real-world speeds up to 100Mb/s. These speeds rival 100Mbps Fast Ethernet, so you should be able to stream more than one HD movie at once in your home using multiple adapters. Consider the TP-Link AV500 TL-PA511KIT or Netgear Powerline Nano 500 XAVB5101 adapters. Those are two of the best affordable adapters available today.
      3. Another good wireless alternative is Ethernet-over-coax (MoCA) adapters like these. Both MoCA and HomePlug are much more reliable than wireless connections and are capable of higher data rates. However, HomePlug is not without problems: it sometimes has issues with split-phase wiring and surge protectors. When using HomePlug, avoid AC power strips and plug the adapter directly into the wall.
      4. If your wireless devices are having trouble seeing your media server running on your LAN, check your router to make sure that multicast is enabled.
      5. To increase performance, try to use wireless channels which do not overlap with your neighbors’ networks. If you have an Android device, download the “Wi-Fi analyzer” app by FARPROC. It shows which channel has the best signal strength.
      6. Try to place your wireless router as close to your media players as possible. Every time the signal passes through a wall it drops in strength, and 5GHz wireless loses more signal than 2.4GHz when going through walls.
      7. Be aware that cordless phones, baby monitors, wireless security cameras and microwave ovens can all interfere with 2.4GHz wireless networks.
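The advice in tip 5 about non-overlapping channels follows directly from how 2.4GHz Wi-Fi channels are laid out: channel centers are only 5MHz apart, but each classic 802.11b/g channel is about 22MHz wide, which is why 1, 6 and 11 are the usual non-overlapping trio. A quick Python sketch of that rule (the helper names are my own):

```python
def center_mhz(channel):
    """Center frequency of a 2.4GHz Wi-Fi channel (channels 1-13)."""
    return 2412 + 5 * (channel - 1)

def channels_overlap(a, b, width_mhz=22):
    """Two channels overlap when their center frequencies are closer
    together than one channel width (~22 MHz for classic 802.11b/g)."""
    return abs(center_mhz(a) - center_mhz(b)) < width_mhz

print(channels_overlap(1, 6))   # False: 25 MHz apart, clear of each other
print(channels_overlap(1, 3))   # True: only 10 MHz apart, they interfere
```

So if your neighbor is on channel 6, moving from channel 4 to channel 1 or 11 gets you completely clear of them.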

Router Suggestions

      1. Having devices connected to both wired and wireless networks at the same time can cause problems. If you can’t see your media server or some media players on your network, this could be the cause.

        Free apps like this make it easy to see wireless congestion

      2. Use DHCP because it makes setup easier. Don’t use fancy network setups with multiple subnets and hubs. Doing so can introduce latencies that cause problems with UPnP data.
      3. If you want to be able to stream multiple HD videos at once, make sure there are no 10Mbps routers or switches on your LAN. Use Gigabit Ethernet switches instead.
      4. Some routers and switches work better than others for media streaming. Problems with media playback stopping or stuttering often go away when a new router or switch is used.
      5. Changing router settings can also sometimes improve media streaming performance. Before changing any router setting, write down the old setting in case you need to go back to it. If you have a busy network, collisions can occur that reduce your throughput. Lowering the fragmentation threshold can improve performance by reducing re-transmissions. Try setting the fragmentation threshold to 1,000 bytes and see if that improves media streaming. Be aware that using smaller packets adds extra overhead, so you shouldn’t set this value too low. Setting the threshold to the largest value (2,346 bytes) effectively disables fragmentation. Do not change this setting if you are not having media streaming problems. Another parameter some users experiment with is the UPnP Advertisement Period; some claim lowering it makes devices appear on the network faster.
      6. If your router has a setting to enable or disable UPnP, it’s important you understand what this setting does.
        1. “UPnP AV” is a streaming protocol that allows UPnP software to discover and communicate with other UPnP devices. If this is disabled, you may not be able to discover media servers or media players/renderers on your network. This should be enabled for proper operation.
        2. “UPnP IGD” (Internet Gateway Device protocol) allows software to automatically configure port forwarding for remote access. This setting does not have to be enabled to browse or stream media. In some countries this setting is disabled by default because it can make it easier for hackers to punch a hole in your router’s firewall and gain access to your network. Keep this disabled unless you need to allow remote access and do not know how to configure it manually. If the router lists UPnP without more details, it probably refers to UPnP IGD. Check its manual to make sure.
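The fragmentation trade-off described in tip 5 above is easy to quantify roughly: each 802.11 fragment repeats the MAC header, so a lower threshold means more overhead bytes per frame in exchange for cheaper re-transmissions. The 34-byte header size used below is an assumption for illustration and ignores ACKs and interframe spacing.

```python
import math

def fragment_overhead(frame_bytes, threshold, header_bytes=34):
    """Rough per-frame cost of 802.11 fragmentation: returns how many
    fragments a frame is split into and the total header bytes repeated
    across them (34-byte MAC header assumed; ACK overhead ignored)."""
    fragments = math.ceil(frame_bytes / threshold)
    return fragments, fragments * header_bytes

# A maximum-size 2,346-byte frame with fragmentation effectively
# disabled versus the 1,000-byte threshold suggested in tip 5:
print(fragment_overhead(2346, 2346))   # (1, 34)
print(fragment_overhead(2346, 1000))   # (3, 102)
```

Tripling the header overhead is a modest price if collisions are forcing frequent re-transmissions, but it is pure waste on a quiet network, which is why the tip says to leave the threshold alone unless you actually have streaming problems.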

The Final Word

If you’ve made it this far, you know what you need to optimize your home network for multimedia streaming. I hope you find some of this information to be useful. If you have any additional suggestions, please leave them in the Comments section.

Thanks.

– Rick

Thanks to Christian Gran, Jim Pfeifer, Angela Scheller, Cindy Vivoli, Ken Clapp and pcfe for also contributing to this article.

Copyright 2009-2016 Rick Schwartz. All rights reserved. Linking to this article is encouraged.

Follow me on Twitter @mostlytech1