
Comparing Privacy Policies for Wearable Fitness Trackers: Apple, Fitbit, Xiaomi and Under Armour

Masha Abarinova


Photo by Kārlis Dambrāns used with permission

WASHINGTON, September 6, 2019 – Wearable fitness trackers are just another way of monitoring personal data. Yet the market for wearables is booming, because people want to optimize their fitness routines through self-monitoring of vital signs.

Aside from personal use, fitness tracker data is valuable for health insurance companies, as it offers a more efficient way to analyze a person’s fitness patterns. John Hancock, for instance, offers the Apple Watch at a steep discount when paired with an insurance plan.

As these devices can track intricate details such as heart rate, calorie consumption and GPS coordinates, transparency about these practices is a must. Amid the diverse fitness tracker market, Apple and Fitbit are considered leading manufacturers. In addition to spearheading innovation in the fitness tech industry, Apple and Fitbit have been reported to have some of the most consumer-friendly policies.

Their competitors, on the other hand, still have some way to go before they develop a comprehensive level of security. Xiaomi and Under Armour are two fitness tracker manufacturers whose privacy and security practices have had shortcomings.

Describing these four companies this way is, of course, an oversimplification. It is important to look at each company’s approach to privacy individually and determine what benefits and gaps it has.

Apple

Apple has one of the most robust and easy-to-read privacy policies among major tech firms. Praised for the transparency of its computer and mobile devices, the Cupertino-based company puts the same effort into its wearables.

On its website, Apple states that privacy is a “fundamental human right.” It then goes into detail about the various ways in which a user’s privacy is protected. Concerns that Apple touches upon include encryption, third-party usage of app data and how users can modify and/or delete their data.

With the advent of the EU’s General Data Protection Regulation, multinational companies needed to update their policies to comply with international standards.

Last year, Apple launched a privacy portal allowing users to obtain a copy of the personal data associated with their account. This includes Apple ID info, App Store activity and data stored in iCloud. Additionally, the online portal has a page allowing users to correct their data and deactivate or delete their account.

As a result, the iPhone and Apple Watch have overlapping data protections. The Apple Watch’s terms and conditions underscore the ability to control data via the privacy portal. This data is shareable with Apple’s affiliates and is combined with other personal information the company has obtained, as well as with de-identified data.

The Apple Watch terms note that photo and location data are shared, though Apple does not guarantee that this data is accurate. Users are also informed that Apple can limit Watch use without notice, and are advised to consult a physician before starting a fitness program with the Watch.

Fitbit

Fitbit is Apple’s like-minded competitor, fully disclosing what its device does and who can access the information. Much of its privacy approach mirrors that of Apple; however, Fitbit also has a separate privacy policy for children.

From the start, Fitbit discloses that it does not produce medical devices and that its trackers are not meant to replace consultation with a doctor. In its premium services, Fitbit incorporates data from non-paying users.

In the summary of its privacy policy, Fitbit states that transparency “is the key to any healthy relationship.” Like Apple, Fitbit takes personal data such as height, weight and sleep patterns to give the user an accurate and relevant portrayal of their fitness regimen.

Fitbit writes that it may personalize exercise and activity goals based on past activity data and goals a user has previously set.

Under Fitbit’s child policy, parents have control over their child’s data until the child turns 13. Persons under 13 are not permitted to create accounts without parental consent, and parents must consent to the use of their child’s data in accordance with Fitbit’s Privacy Policy for Children’s Accounts.

A separate link outlines these policies in detail. If parents have reason to believe that their child’s data was submitted without consent, they can contact Fitbit to request the removal of that data.

Xiaomi

At a glance, this China-based tech company seems to have only minor privacy issues. It also sells fitness trackers at a much lower price than the mainstream manufacturers. However, security issues and limited disclosure give Xiaomi less credibility than its competitors.

Xiaomi specifically states that neither it nor its suppliers and distributors make any specific promises about the service. Xiaomi claims it is committed to upholding privacy policies worldwide; however, it explicitly states that any disputes over the company’s terms and conditions “will be litigated exclusively” in Chinese courts, where the user and Xiaomi “consent to personal jurisdiction.”

Another concern with the privacy agreement is that Xiaomi doesn’t clarify what happens to personal information in the event of a merger or acquisition; the company states only that users will be notified.

Furthermore, Xiaomi fitness trackers have been found vulnerable to Bluetooth MAC address surveillance, a flaw that’s not uncommon among trackers. According to a study by Open Effect, when a fitness tracker’s MAC address doesn’t change, it becomes easier for users to be monitored through location-tracking services.
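
To illustrate the kind of flaw the Open Effect researchers describe, the short sketch below (not taken from the study; it assumes the third-party Python library bleak and a computer with Bluetooth hardware) runs several Bluetooth Low Energy scans and flags any advertising address that appears in every scan. A tracker that never randomizes its MAC address would show up this way to anyone scanning nearby, which is what makes passive location monitoring possible.

import asyncio
from collections import Counter

from bleak import BleakScanner  # pip install bleak

SCAN_ROUNDS = 5      # number of separate scans (illustrative value)
SCAN_SECONDS = 5.0   # duration of each scan (illustrative value)

async def main() -> None:
    sightings = Counter()  # MAC address -> number of scans it appeared in
    names = {}             # MAC address -> last advertised name, if any

    for round_no in range(1, SCAN_ROUNDS + 1):
        devices = await BleakScanner.discover(timeout=SCAN_SECONDS)
        seen_this_round = {d.address for d in devices}
        for d in devices:
            names[d.address] = d.name or "<unnamed>"
        for addr in seen_this_round:
            sightings[addr] += 1
        print(f"scan {round_no}: {len(seen_this_round)} devices advertising")
        await asyncio.sleep(2)  # brief pause between scans

    # An address that appears in every scan is a stable identifier:
    # anyone running a similar scanner nearby could re-identify that device.
    print("\nAddresses seen in every scan (stable, hence trackable):")
    for addr, count in sightings.items():
        if count == SCAN_ROUNDS:
            print(f"  {addr}  ({names[addr]})")

if __name__ == "__main__":
    asyncio.run(main())

By contrast, a device that rotates randomized addresses would appear under a different address on each scan and could not be re-identified this simply.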

Overall, the vague and slightly intimidating nature of Xiaomi’s terms and conditions does little for the company’s transparency.

Under Armour

Under Armour faced significant scrutiny after its subsidiary MyFitnessPal was compromised in a data breach involving 150 million accounts. Yet the company’s terms and conditions still leave room for improvement. The amount of data it collects and the default sharing options available are particularly concerning.

Moreover, Under Armour’s online privacy policy hasn’t been updated in over a year. Its terms outline the different ways it collects, uses, discloses and processes personal data. It is also unclear to what extent UA shares personal data with other companies.

In addition to UA’s abundant yet vague privacy approach, UA can track a user’s location even when its app isn’t running, and sharing preferences such as Activity Stats, Community Social Data and Lookup Information are set to public by default. Setting an entire account to private would prevent those on the user’s friend list from finding them.

California residents have some leeway with UA’s privacy agreement due to the state’s legal framework. They are permitted once a year to request a list of personal data that the company disclosed to third parties if it was used for direct marketing purposes. However, UA claims that it does not share personal data for this purpose, citing the California Civil Code.

As with most multinational companies, users residing in the EU have the right to request deletion of their account.

Although UA does provide ways for users to opt out of public settings, the fact that these are on by default is alarming. UA also suffers from a lengthy and difficult-to-understand terms and conditions agreement. This suggests that privacy is not at the top of UA’s priorities, which is perplexing given the aftermath of its data breach scandal.


National Plan Required For Consumer Privacy, Congresswoman Says

Samuel Triginelli


Screenshot of Suzan DelBene from C-Span

April 1, 2021 — A Congresswoman from Washington State, who introduced federal legislation that would be the first national consumer privacy law if adopted, says the federal government is being outpaced by some states that are implementing their own consumer privacy legislation.

“There is a significant problem with consumer privacy in the US,” said Representative Suzan DelBene on Tuesday during a New Democrat Network event. DelBene introduced her Information Transparency and Personal Data Control Act, a wide-ranging federal privacy bill, on March 10. DelBene is the vice chair of the Ways and Means Committee and chair of the House New Democrat Coalition caucus.

There is no federal data privacy law, which has forced some states to pursue their own consumer data policies. That includes California and, recently, Virginia. Some have said the concern is that there will be a patchwork of different privacy legislation that may end up just confusing Americans.

“We need a uniform set of rights for consumers and standards for businesses to follow in the digital world,” DelBene said.

The bill states that companies must provide privacy policies in plain language, must allow users to opt in to personal information gathering, must disclose with whom personal information is being shared, and must submit to privacy audits every two years. The federal law would also give the government the ability to preempt existing state laws.

Simon Rosenberg, president of the New Democrat Network, said of the bill: “Together, we have a lot of work to do in the coming years to restore the promise of the Internet. One of the areas of greatest need is creating a single working privacy standard for the United States.

“In her bill, the approach Representative DelBene takes to protecting Americans’ privacy is smart, measured, and will undoubtedly be highly influential in shaping the approach Congress takes in the days ahead. It is a very welcome addition to the vital debate underway about our digital future,” Rosenberg added.

The purpose of this bill is to ensure that privacy policies are transparent and clear. “Many consumers are given lots of information with lots of legal terms, that leads them to click the accept button without knowing what they have signed up for,” DelBene said.

“There is an urgent need for consumers to understand what data is being shared,” she added. “We want to make sure there is enforcement. The law says that this will be the responsibility of the Federal Trade Commission, so the FTC must have the resources to do this.

“I think my bill is focused on privacy specifically because I think it is foundational. We build on important things, such as AI, facial recognition, and all the other issues we need to address. If we don’t start addressing the issues of data privacy, it will be hard to imagine how we will expand laws to address the broader set of issues we need to get ahead of.”

Congresswoman DelBene believes the bill can be bipartisan, but she wants to make sure Congress understands its importance. “I’m not sure Congress understands these issues, so it takes a collective effort to push it forward.”

DelBene says she’s confident that Congress will get behind the bill, despite many members of Congress who, she said, are hesitant to take that first step.



Attach Strings To Data Collection To Combat Surveillance Capitalism, Experts Suggest

Samuel Triginelli


Photo of Marietje Schaake from the European Parliament

March 29, 2021 – Laws addressing how much data can be collected should be among new regulations ensuring that data collection by big technology companies doesn’t harm Americans, according to a March 17 panel of academics at the South by Southwest conference.

The era of corporate self-regulation is now up, said Marietje Schaake of the Stanford University Cyber Policy Center, a panelist at the SXSW conference session discussing the “techno-democratic” approach to Big Tech, including what to do about surveillance capitalism.

Surveillance capitalism is an economic system centered on commodifying personal data with the core purpose of profit-making.

“We have heard many pledges, many promises, and good intention offers for solutions for self-regulatory initiatives. And the time is out for those,” she said.

Schaake said it is time for the government to attach consequences to data collection that harms the public and to set clear limits on collection practices.

“We have tried for too long, and it has led to several distractions and lost time to make sure that the rule of law is leading and that there are enforceable accountable, transparent expectations placed on these companies,” she said.

Joan Donovan, a social scientist at the Harvard Kennedy School, said the critical questions are how much data tech companies should be allowed to collect and under what conditions they should be allowed to sell it, to ensure rights aren’t violated.

“The tech sector as it is built now, relies on harvesting so much data about an individual that their products and the entire economy they are built on could not exist” if there were robust rights and privacy protections in place, Donovan said.

She said the discussion about regulating these businesses should include moving from a focus on protecting enterprise to protecting human rights.



House Energy and Commerce Chairman Frank Pallone Calls for Update to Children’s Privacy Legislation

Derek Shumway


March 11, 2021 – House Energy and Commerce Committee Chairman Frank Pallone, D-N.J., on Thursday called for an update to the Children’s Online Privacy Protection Act at a subcommittee hearing on “Kids Online During COVID: Child Safety in an Increasingly Digital Age.”

“The challenges children face online existed before the pandemic, but it’s only gotten worse,” he said.

Visiting in person with extended family and friends has become a thing of the past as the COVID-19 pandemic continues. Many other in-person activities have been replaced by video games, social media, and other video services.

Kids’ screen time has doubled during the pandemic, said Pallone. Too much screen time can increase instances of anxiety, sleep deprivation, obesity, and cyberbullying, he said.

The increased screen time due to the pandemic has turned consumers into victims of what he called harassment and dark-pattern manipulation by advertisers. Children cannot defend themselves against these predatory practices the way adults can, he said.

“Despite laws to protect children’s privacy, data collection and tracking of children is disturbingly prevalent.” He went on to say that many apps targeting children on mobile devices are notorious for collecting personal information, which is then bought and sold, resulting in advertising meant to manipulate children.

He said that digital ad spending specifically targeting children was expected to reach $1.7 billion this year. COPPA, which hasn’t been updated since 2013, needs revising, he said, because internet companies have continued to target children.

