“Noom does not keep all this information for itself”
If you’ve spent literally any time on social media, or watched so much as ten minutes of TV in the days after Christmas, you’ll likely have seen adverts for weight-loss programmes such as Noom.
According to their website, Noom uses science and psychology to “help you lose weight and keep it off for good.” They do this by helping users “better understand” their relationship with food, being “more mindful” of their habits and providing the knowledge needed for “long-lasting change.”
To achieve this, Noom claims to build each user a personalised diet plan using information gathered during an initial testing process. This takes place before your account is made and asks about your medical history, such as prescribed medications and any mental health issues you might’ve experienced.
But, while the allure of losing weight and ‘keeping it off for good’ may sound appealing to some, an investigation carried out by the anti-exploitation charity Privacy International (PI) alleged that some personalised diet companies, including Noom, “are using tests to lure [in] users.”
In a nutshell: according to PI, they’re selling the same ‘personalised’ programme to a whole lot of people. Which, really, makes it not so personal… and raises the question: why does Noom need so much data from users (and what are they actually doing with it)?
“Companies selling diet programmes are increasingly targeting internet users with online tests and provide little to no clarity on what happens to your data [after],” PI told Cosmopolitan UK.
Their study showed that users completing Noom’s test are first asked whether they want to “get fit” or “lose weight,” followed by over 50 questions. Some questions ask the user’s thoughts on cognitive behaviour therapy (CBT), what triggers their urge to snack, what fitness apps they’re subscribed to, as well as meal kit subscriptions.
PI found that the answers to some of these questions, in particular whether a customer has been treated for diabetes or recently taken antibiotics, “can be considered health data.” Health data, PI points out, is classed as sensitive under the General Data Protection Regulation (GDPR), so Noom is legally obliged under EU and UK data protection law to “prove that they have taken extra steps to specifically protect these categories of data.”
While PI’s report doesn’t make any definitive claims about Noom’s compliance with data protection legislation, it does raise concerns about an apparent failure to obtain explicit consent for the collection of this sensitive data.
Data protection law aside, PI also made the interesting finding that, despite the 50-plus test questions, “the data entered did not affect the programme being sold,” and that “Noom does not keep all this information for itself.” In fact, PI’s study found that the data collected was, at the time of the investigation, “being shared with a company called FullStory” – a platform that allows companies to understand how consumers interact with their website: what they are looking at, what they are clicking on, what they are purchasing, etc.
Noom says that FullStory, although not explicitly named, was covered as a third-party service provider within its Terms and Conditions. “Noom cares about its users’ privacy and does not share its users’ data with third parties other than with its service providers,” a spokesperson for the company also told Cosmopolitan UK. But that’s not the full picture…
“The data you share doesn’t affect the programme you’re sold [and] Noom also doesn’t keep that data private…”
Since its initial investigation was carried out, PI has found that while Noom no longer shares users’ data with FullStory, it now does so with Facebook instead. When asked about this finding, PI said that the “same concerns apply as to lack of transparency and information to users.”
In response to PI’s finding that users’ data is now shared with Facebook, a spokesperson for Meta (Facebook’s parent company) said: “We have policies around the kinds of information businesses can share with us — we don’t want websites or apps sending us sensitive information about people.”
To combat this, Facebook says it has a system “built to detect and filter out this type of information.” It also claims to alert Noom if and when potentially sensitive information is identified, and to reach out to the company “directly to make sure they are complying with our policies to help protect people’s privacy.”
But, despite third-parties like Facebook claiming to do their bit to filter out sensitive data, PI argues that the lack of transparency about this data being shared in the first place is “problematic,” adding that the user’s opportunity to consent or object to this needs to be made far clearer.
“Apps can collect any sort of data including personal data,” explains Magnus Boyd, a lawyer and partner at Schillings who specialises in information security. “But, the lawfulness of that collection hinges on the question of consent.”
As Boyd points out, “Apps have to inform the user about who they will be sharing their data with [before they consent], but in reality, the third parties are not always listed in an easily accessible way. Instead, they’re often buried in lengthy terms and conditions or Privacy Notices.”
In its investigation (which also studied other ‘personalised’ diet companies), PI ultimately found that “under the pretence of finding the best diet for us and protecting our health, diet companies are only collecting more and more data about us, without providing us proper information about what happens with the data and who they share it with.”
While Noom’s T&Cs assert that third parties cannot resell users’ data, PI highlights that it is “difficult to assess based on these companies’ privacy policies what specifically happens to our data,” adding that “the privacy policies we have read provide scope for worrying practices.”
But, what does this mean for how our data may be used in the future? “Big tech companies have had their eyes on our bodies and our healthcare in the past couple of years,” says PI. “The specific appetite they have for our health data is something we need to be particularly wary of.”
And we’re already seeing evidence of this play out: Amazon US is now competing with health insurance providers by offering cheaper medications, while here in the UK the shopping giant has teamed up with the NHS to encourage people to use their Alexa devices for health queries.
The bottom line is this: our personal data has a price. And, while we’d like to think our personal data isn’t being sold off, especially when it comes to health, how can we ever be sure this isn’t the case?
Boyd says that one thing we can all do to help protect ourselves is, “as boring and tedious as it sounds… read the terms and conditions, and look out for the document called the privacy notice or privacy statement.” He adds that this is the document that sets out how the app will use and share your data. “The devil is in the detail.”
Here’s hoping tech companies make that process a lot easier and clearer moving forward, as when it comes to our health (and our data), we all know how much is at stake.
When asked specifically about PI’s claim that all users receive the same ‘personalised’ diet plan, Noom did not provide comment.