The final installment of the Tracker Comparison series covers lessons learned from the actual testing. These are observations about the particular trackers we used and about the syncing process itself. When reading this, please keep in mind that all of the trackers featured in this article are from the best brands and will work exceptionally well for most of your needs. In fact, my sharpest criticisms are aimed at the devices I like the most and use even when I’m not testing for an article.
Fitbit is the standard by which all trackers are measured and deserves this elevated position. Fitbit defined much of the wearables category and has continued to improve and innovate. Testing with a Fitbit went smoothly and the results were always consistent. The only issue we discovered was that data would occasionally appear in the Fitbit smartphone app without having been synced to the fitbit.com website, which is where ChallengeRunner pulls data from. It was simple to correct by swiping down in the app to force a sync, but I can see how it could cause confusion.
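For context, a third-party service never reads data directly off your phone; it reads whatever has made it up to Fitbit’s web service. Here is a minimal sketch of checking a daily step total against fitbit.com using the public Fitbit Web API (the access token and date are hypothetical placeholders, and this is illustrative rather than ChallengeRunner’s actual integration):

```python
import requests

# Hypothetical OAuth 2.0 access token for the participant's Fitbit account.
ACCESS_TOKEN = "YOUR_FITBIT_ACCESS_TOKEN"

def fitbit_daily_steps(date_str: str) -> int:
    """Read the daily step total from fitbit.com (not from the phone app)."""
    url = f"https://api.fitbit.com/1/user/-/activities/date/{date_str}.json"
    resp = requests.get(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
    resp.raise_for_status()
    # If the phone app has not yet pushed its data up to fitbit.com,
    # this total will lag behind what the app shows on screen.
    return resp.json()["summary"]["steps"]

print(fitbit_daily_steps("2017-06-01"))  # example date
```

If the number returned here trails what the app displays, forcing a sync from the app is what brings the two back in line.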
I wear an Apple Watch every day (it’s on my wrist as I write this) and like nearly everything about it. However, I’m not totally sold on it as a fitness device. The biggest issue is its size: it needs to be large enough to interact with the touch screen, but that keeps it from being small enough to work out with and forget it is on your wrist. In addition, the screen can get scratched easily during high-intensity, strength-building workouts. My other observation is about the software used to track an activity while it is in progress. If you forget to specifically tell the watch that you are running before you start, it will ask you if you are running and may continue to do so throughout your workout. If you do not respond, it still counts steps and your distance seems to be fine. However, if the watch knows I am running, it should just record the run like every other tracker does and leave me alone while I run. Ok, I guess I was complaining a little more than I intended, but that is because I spend the most time with this device.
I have been using a Misfit Shine as a testing mule for several years, and it filled that role again for this comparison. While the Shine is not new and not a particularly flashy device, it always performed well and synced easily. I was going to give it kudos here for its endurance, but it unfortunately stopped working right before publication. I just received a Misfit Phase as a replacement, which seems like a nice device, but I will miss the Shine.
The Omron HJ-327T (they need to work on naming) I tested was the cheapest device in the group and can be found online for about 10 to 15 dollars. It is basically a smart pedometer and is good at what it does. That said, it can be frustrating to sync, and I sometimes felt like I was getting more of a workout shaking the device to wake it up than I did during the actual test run. However, the issue could very well be user error.
The Polar Loop 2 I used during the comparison performed very well until the calories-burned test, where it recorded a much higher calorie count than the other devices on the two-mile run. Since calories burned is not as concrete a measure as a step taken and must be computed from physical characteristics such as height and weight, I checked my Polar app settings; they were correct. So, what to try next? I wore the Fitbit and the Polar devices for an afternoon of general activity, and the recorded calories burned were very close. Moral of the story: while the Polar Loop’s calories burned were outside the norm in a two-mile running test, during everyday activity it seemed to count calories as accurately as the other devices.
Ok, so I left out some of the devices I used during the test, such as the Garmin, Withings and Samsung Gear. Frankly, they were great and worked as expected: they updated easily and were accurate. The data from Google Fit was a little wonky, but that had more to do with how it was being pulled from the Google website than with the trackers themselves. I expect better results once we switch to a different interface.
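For the curious, the kind of interface I have in mind is a programmatic one rather than pulling from the website. Below is a minimal sketch along those lines, using the Google Fit REST API to pull daily step buckets (the access token is a hypothetical placeholder, and this is illustrative rather than a commitment to a specific integration):

```python
import time
import requests

# Hypothetical OAuth 2.0 access token with the fitness.activity.read scope.
ACCESS_TOKEN = "YOUR_GOOGLE_ACCESS_TOKEN"

def google_fit_daily_steps(start_ms: int, end_ms: int) -> list:
    """Aggregate step counts into one-day buckets via the Google Fit REST API."""
    body = {
        "aggregateBy": [{"dataTypeName": "com.google.step_count.delta"}],
        "bucketByTime": {"durationMillis": 86400000},  # one day per bucket
        "startTimeMillis": start_ms,
        "endTimeMillis": end_ms,
    }
    resp = requests.post(
        "https://www.googleapis.com/fitness/v1/users/me/dataset:aggregate",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        json=body,
    )
    resp.raise_for_status()
    return resp.json().get("bucket", [])

# Print the step buckets for the past week.
now_ms = int(time.time() * 1000)
for bucket in google_fit_daily_steps(now_ms - 7 * 86400000, now_ms):
    print(bucket["startTimeMillis"], bucket["dataset"])
```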
Besides the differences between trackers, which, all things considered, are not very large, this comparison also surfaced some findings that might change the way you approach running a fitness challenge. For example, steps are largely generic: every device does a pretty good job of tracking them, so step counts are somewhat interchangeable across trackers. On the other hand, active minutes are more subjective and may lead to very different results for participants performing the same activity but wearing different trackers. However, that doesn’t mean you should discontinue active-minute-based challenges. They are much better at capturing activities other than running or walking (yoga or cycling, anyone?) and are more inclusive of your walking-impaired participants. This is, however, beyond the scope of this article, so I will pick it up again in a later discussion.
I do hope you enjoyed the series. It took some work to put it all together, but I’m glad we ran the comparison: we now know a little more about the data being collected at ChallengeRunner, and hopefully the comparison provides some guidance to anyone else using multiple devices in a fitness challenge.