Privacy

Comparing Privacy Policies for Wearable Fitness Trackers: Apple, Fitbit, Xiaomi and Under Armour

Photo by Kārlis Dambrāns used with permission

WASHINGTON, September 6, 2019 – Wearable fitness trackers are yet another way of collecting personal data. Even so, the market for wearables is booming, because people want to optimize their fitness routines by monitoring their own vital signs.

Aside from personal use, fitness tracker data is valuable to health insurance companies because it offers a more efficient way to analyze a person’s fitness patterns. John Hancock, for instance, offers the Apple Watch at a steep discount when paired with an insurance plan.

As these devices can track intricate details such as heart rate, calorie consumption and GPS coordinates, transparency about these practices is a must. Amid the diverse fitness tracker market, Apple and Fitbit are considered leading manufacturers. In addition to spearheading innovation in the fitness tech industry, Apple and Fitbit have been reported to have some of the most consumer-friendly policies.

Their competitors, on the other hand, still have some way to go before they develop a comprehensive approach to security. Xiaomi and Under Armour are two fitness tracker manufacturers whose privacy and security practices have shown notable shortcomings.

Describing these four companies in this manner is, of course, an oversimplification of their approaches. It is important to look at each company’s privacy practices individually and to determine their benefits and gaps.

Apple

Apple has one of the most robust and easy-to-read privacy policies among major tech firms. Praised for the transparency of its computer and mobile devices, the Cupertino-based company puts the same effort into its wearables.

On its website, Apple states that privacy is a “fundamental human right.” It then goes into detail about the various ways in which a user’s privacy is protected. Concerns that Apple touches upon include encryption, third-party usage of app data and how users can modify and/or delete their data.

With the advent of the EU’s General Data Protection Regulation, multinational companies needed to update their policies to comply with international standards.

Last year, Apple launched a privacy portal allowing users to obtain a copy of the personal data associated with their account. This includes Apple ID info, App Store activity and data stored in iCloud. Additionally, the online portal has a page allowing users to correct their data and deactivate or delete their account.

For these reasons, the iPhone and Apple Watch have overlapping data protections. The Apple Watch’s terms and conditions underscore the ability to control data via the privacy portal. This data may be shared with Apple’s affiliates and combined with other personal information the company has obtained, as well as with de-identified data.

The Apple Watch terms note that photo and location data is shared, though Apple does not guarantee that this data is accurate. Users are also informed that Apple can limit Watch use without notice, and are advised to consult a physician before starting a fitness program with the Watch.

Fitbit

Fitbit is Apple’s like-minded competitor, fully disclosing what its devices do and who can access the information. Much of its privacy approach mirrors Apple’s; however, Fitbit also has a separate privacy policy for children.

From the start, Fitbit discloses that it does not produce medical devices and that its trackers are not meant to replace consultation with a doctor. In its premium services, Fitbit also incorporates data from non-paying users.

In the summary of its privacy policy, Fitbit states that transparency “is the key to any healthy relationship.” Following Apple’s example, Fitbit uses personal data such as height, weight and sleep patterns to give the user an accurate and relevant picture of their fitness regimen.

Fitbit writes that it may personalize exercise and activity goals based on past activity data and goals a user has previously set.
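
Fitbit does not publish the formula behind this personalization, but a minimal Python sketch of the idea – assuming, purely for illustration, that a goal is nudged upward from the average of recent activity – might look like the following. The function name and the 10 percent increment are hypothetical and are not Fitbit’s actual method.

```python
from statistics import mean

def suggest_step_goal(recent_daily_steps: list[int], current_goal: int) -> int:
    """Hypothetical goal personalization; Fitbit's real algorithm is not public."""
    if not recent_daily_steps:
        return current_goal
    baseline = mean(recent_daily_steps)
    # Suggest roughly 10 percent above the recent average, never below the user's own goal.
    return max(int(baseline * 1.10), current_goal)

# Example: a user averaging 7,500 daily steps with an 8,000-step goal gets a suggested 8,250.
print(suggest_step_goal([7200, 7600, 7800, 7400], current_goal=8000))
```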

Under Fitbit’s child policy, parents have control over their child’s data until the child turns 13. Persons under 13 are not permitted to create accounts without parental consent, and parents must consent to the use of their child’s data in accordance with Fitbit’s Privacy Policy for Children’s Accounts.

A separate link outlines these policies in detail. If parents have reason to believe that their child’s data was submitted without consent, they can contact Fitbit to request the removal of that data.

Xiaomi

At a glance, this China-based tech company seems to have only minor privacy shortcomings, and it sells fitness trackers at a much lower price than the mainstream manufacturers. However, security issues and limited disclosure give Xiaomi lower credibility than its competitors.

Xiaomi specifically states that neither it nor its suppliers and distributors make any specific promises about the service. Xiaomi claims it is committed to upholding privacy policies worldwide; however, it explicitly states that any disputes over the company’s terms and conditions “will be litigated exclusively” in Chinese courts, where the user and Xiaomi “consent to personal jurisdiction.”

Another concerning aspect of the privacy agreement is that Xiaomi doesn’t clarify what happens to personal information in the event of a merger or acquisition; the company states only that users will be notified.

Furthermore, Xiaomi fitness trackers have been found vulnerable to Bluetooth MAC address surveillance, a flaw that is not uncommon among trackers. According to a study by Open Effect, when a fitness tracker’s MAC address doesn’t change, it becomes easier for users to be monitored through location sharing services.
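
To illustrate the finding, here is a minimal, self-contained Python sketch – using hypothetical scanner logs rather than any vendor’s actual code – showing how an observer who records Bluetooth advertisement addresses at two locations can link sightings of a tracker that broadcasts a fixed MAC address, while per-session randomized addresses resist that linking.

```python
import secrets
from collections import defaultdict

def random_mac() -> str:
    """Generate a randomized (locally administered) Bluetooth address."""
    first = (secrets.randbits(8) | 0x02) & 0xFE  # set the locally administered bit, clear multicast
    rest = [secrets.randbits(8) for _ in range(5)]
    return ":".join(f"{b:02x}" for b in [first] + rest)

FIXED_MAC = "c8:0f:10:aa:bb:cc"  # hypothetical tracker that never rotates its address

# Hypothetical advertisement logs captured by passive scanners at two locations.
logs = {
    "gym":    [FIXED_MAC, random_mac()],  # second device randomizes its address per session
    "office": [FIXED_MAC, random_mac()],
}

# Any address seen at more than one location links those sightings to a single device.
seen_at = defaultdict(set)
for place, addresses in logs.items():
    for addr in addresses:
        seen_at[addr].add(place)

for addr, places in seen_at.items():
    if len(places) > 1:
        print(f"{addr} is trackable across locations: {sorted(places)}")
    else:
        print(f"{addr} was seen only once; randomized addresses resist linking")
```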

Overall, the vague and somewhat intimidating nature of Xiaomi’s terms and conditions undermines the company’s transparency.

Under Armour

Under Armour faced significant scrutiny after its subsidiary MyFitnessPal was compromised in a data breach involving 150 million accounts. Yet the company’s terms and conditions still leave room for improvement. The amount of data it collects and the default sharing options available are particularly concerning.

Moreover, Under Armour’s online privacy policy hasn’t been updated in over a year. Its terms outline the different ways it collects, uses, discloses and processes personal data, but it is unclear to what extent UA shares personal data with other companies.

Compounding this broad yet vague privacy approach, UA can track a user’s location even when its app isn’t running, and sharing preferences such as Activity Stats, Community Social Data and Lookup Information are set to public by default. Setting an entire account to private mode would prevent even those on the user’s friend list from finding them.
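
To illustrate why public-by-default settings matter, here is a minimal hypothetical sketch of such a settings model in Python. The field names echo the categories above but are not UA’s actual data model; the point is simply that privacy requires an explicit opt-out.

```python
from dataclasses import dataclass

@dataclass
class SharingPreferences:
    """Hypothetical sharing settings mirroring a public-by-default design."""
    activity_stats: str = "public"
    community_social_data: str = "public"
    lookup_information: str = "public"

    def make_private(self) -> None:
        """Opt out of every public default in one step."""
        self.activity_stats = "private"
        self.community_social_data = "private"
        self.lookup_information = "private"

prefs = SharingPreferences()  # a brand-new account starts fully public
print(prefs)                  # every category defaults to "public"
prefs.make_private()          # the burden of opting out falls on the user
print(prefs)
```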

California residents have some leeway with UA’s privacy agreement due to the state’s legal framework. They are permitted once a year to request a list of personal data that the company disclosed to third parties for direct marketing purposes. However, UA claims that it does not share personal data for this purpose, citing the California Civil Code.

As with most multinational companies, users residing in the EU have the right to request deletion of their account.

Although UA does provide ways for users to opt out of public settings, the fact that these are the defaults is alarming. UA also suffers from a lengthy and difficult-to-understand terms and conditions agreement. This suggests that privacy is not at the top of UA’s priorities, which is perplexing given the aftermath of its data breach scandal.

Robocall

Public Knowledge Urges VoIP to Be Regulated Under Title II to Stop Robocalls

Title II would require VoIP services to be subject to stronger regulations already in place for telecommunication providers.

Photo of Harold Feld, Senior Vice President of Public Knowledge

WASHINGTON, August 18, 2022 – Public Knowledge is asking the Federal Communications Commission to classify facilities-based voice over Internet protocol services under Title II of the 1934 Communications Act, which it said would help the commission tackle robocalls.

The non-profit public interest group last week amended a March petition to the agency, narrowing the field of VoIP providers captured under its proposal to facilities-based interconnected VoIP services, which require a broadband connection for real-time voice communications on the public telephone network. That is instead of a broader field that included non-interconnected services, which allow voice communications through devices not connected to the phone network, such as gaming consoles.

Title II specifies authority given to the FCC to regulate “common carriers” – utilities such as landline phones, telecommunication services, and electricity. Currently, VoIP services are not included in any specific classification. Instead, the FCC relies on rules based on its ancillary authority given under Title I of the Communications Act, which provides less regulatory authority to the commission.

If classified under Title II, VoIP providers would be subject to service quality regulations, such as the prevention of ever-increasing robocalls, and to regulations ensuring affordable access to infrastructure for competitive carriers, Public Knowledge said in its petition.

The organization also said that new categorization would prevent a “crisis of legal authority” for the FCC, which already makes VoIP services subject to certain Title II regulations, such as contributions to the basic telecommunications program, the Universal Service Fund. Currently, Public Knowledge argues, regulations governing VoIP services are a collection of ad hoc rulings based on ancillary authority.

Lack of classification ‘threatens’ FCC ability to fulfill legislative mandate

Congress “deliberately used expansive terms” when defining telecommunications in the Telecommunications Act of 1996, which gave the FCC authority to regulate sectors within the communications industry, said the March petition. “At a minimum, Congress intended the FCC to regulate any service that behaves like a traditional telephone service – regardless of the underlying technology – as a telecommunications service,” read the petition.

Yet despite a lack of meaningful difference between VoIP and traditional telephone services, the FCC continues to treat VoIP services differently, said the petition. This “failure” of the FCC to classify VoIP under Title II allegedly frustrates the commission’s ability to effectively address robocalls and leaves it uncertain whether the commission has preempted its authority to regulate VoIP services.

“The FCC’s failure to classify facilities-based interconnected VoIP threatens the ability of the FCC to fulfill the most basic responsibilities entrusted to it by Congress,” stated the petition.

The burden of Title II

In a blog post on the matter, communications law firm the CommLaw Group argued that Title II VoIP providers would likely be required to obtain FCC approval prior to asset transfers, mergers and acquisitions, which it said would slow transactions considerably. Furthermore, it could open the door to “increased state regulatory oversight, requirements, and burdens,” it added.

Earlier this month, Democratic senators introduced a bill that would give the FCC regulatory authority over broadband by classifying those services under Title II. It would allow the commission greater regulatory authority to make internet service providers respect principles of net neutrality, which prohibit providers from throttling traffic on their networks, participating in paid prioritization, or blocking any lawful content. The bill, however, has been met with opposition.

Privacy

Online Protections for Children Bill Passes Committee Despite Concern over FTC Authority

Opposition to a reformed COPPA includes concern over the FTC’s ability to enact broad rule-making.

Photo of Senator Edward Markey, D-Mass.

WASHINGTON, July 28, 2022 – The Senate Committee on Commerce, Science and Transportation approved two online privacy protection bills in a Wednesday markup, including an update to legislation that would increase the age of online protections for children.

An update to the Children and Teens’ Online Privacy Protection Act (S.1628) – which builds on the original law passed in 1998 and had amendments proposed last May – would raise the age of protection from under 13 to under 16, meaning large internet companies would be prohibited from collecting the personal information of anyone under 16 without consent and banned from targeted marketing to those children. The bill passed via voice vote.

Other provisions in the bill include a mandate to create an online “eraser button” that would allow users to eliminate a child’s or teen’s personal information; implement a “Digital Marketing Bill of Rights for Minors” that limits the collection of personal information from young users; and establish a first-of-its-kind Youth Privacy and Marketing Division at the FTC, according to a summary of the bill’s key components.

“The Senate Commerce Committee this morning took a historic step towards stopping Big Tech’s predatory behavior from harming kids every day,” Senator Edward Markey, D-Mass., who introduced the amendments, said Wednesday.

The other bill, the Kids Online Safety Act (S.3663), would give parents enhanced control over their children’s online activities to “better protect their health and well-being.” The bill, introduced by Senator Richard Blumenthal, D-Conn., and Senator Marsha Blackburn, R-Tenn., passed 28-0.

The bill would put in place additional safeguards and tools, such as platforms giving minors options to protect their personal information and to disable recommendations.

“I don’t think we’ve ever had a piece of legislation that has had such strong support across groups across the country,” Senator Blumenthal said during Wednesday’s hearing. “Parents want a tool kit to protect their children online.”

The bills now move to the Senate floor.

Concern about FTC authority under new COPPA

Under COPPA 2.0, the FTC’s authority includes determining what constitute “unfair or deceptive acts” in marketing practices and enforcing violations. In May, the agency put out a policy statement specifying its focus on enforcing the existing version of the law.

Some senators voted against passing COPPA 2.0 over concern that it would give the Federal Trade Commission too much rule-making authority.

Senator Blackburn said there should be more restrictions on the FTC’s rule-making ability to prevent overreach.

Similarly, Senator Mike Lee, R-Utah, said he was not able to support the bill during markup because he is concerned about “giving a blanket ruling power to the FTC.”

“We are at our best when we carefully consider legislation and don’t rush through it,” Lee said.

Cybersecurity

Rep. Swalwell Says App Preference Bill Will Harm National Security

‘I just want to limit the ability for any bad actor to get into your device.’

Photo of Representative Eric Swalwell, D-Calif.

July 27, 2022 – Antitrust legislation that would restrict the preferential treatment of certain apps on platforms would harm national security by making apps from hostile nations more visible, claimed Representative Eric Swalwell, D-Calif., at a Punchbowl News event Wednesday.

The American Innovation and Choice Online Act is currently under review by the Senate and, if passed, would prohibit certain online platforms from unfairly preferencing products, limiting another business’ ability to operate on a platform, or discriminating against competing products and services.

The legislation would ban Apple and Google from preferencing their own first-party apps on their app stores, which would make it easier for apps disseminated from hostile nations to be seen on the online stores, Swalwell said.

“[Russia and China] could flood the app store with apps that can vacuum up consumer data and send it back to China,” said Swalwell, adding that disinformation regarding American elections would spread. “Until these security concerns are addressed, we should really pump the brakes on this.”

Swalwell asked for a House Judiciary Committee hearing with officials from the National Security Agency, the Federal Bureau of Investigation and the Department of Homeland Security to lay out what the bill would mean for national security.

“I just want to limit the ability for any bad actor to get into your device, whether you’re an individual or small business,” said Swalwell.

Lawmakers have become increasingly concerned about China’s access to American data through popular video-sharing apps, such as TikTok. Last month, Federal Communications Commissioner Brendan Carr called for Apple and Google to remove the app on the grounds that the app’s parent company, ByteDance, is “beholden” to the Communist government in China and required to comply with “surveillance demands.”

The comments follow debate over how the bill, which was introduced in the Senate on May 2 by Sen. Amy Klobuchar, D-Minn., would affect small businesses and American competitiveness globally.
