The word “FemTech” often evokes excitement and curiosity. The phrase “privacy and data security” usually does not; it’s enough to make most people’s eyes glaze over. Yet embracing privacy and data security is vital for advancing the FemTech industry and accelerating progress in women’s health in 2022 and beyond.

As a researcher, data are at the heart of everything I do. They tell me about the challenges someone faced trying to access medical care. They tell me if someone’s symptoms are considered normal or abnormal. They tell me about a person’s experience giving birth. When data are analyzed and combined, they can tell powerful stories and lead to impactful change.

The absence of data can be just as powerful as the findings generated by data. Without data, for example, it was easy for some in the medical community to initially dismiss long COVID as “just in the patient’s head.” Before measures for disrespect and abuse in childbirth existed, it was easy to turn a blind eye to the situation or to dismiss it as a rare occurrence. It’s easy to overlook problems when you don’t have data to tell you what they are or how big they are.

There’s still so much we don’t know about women’s health, simply because we lack the data. In the USA, for example, decades of data on women’s health are missing, either because women were excluded from participating in clinical trials (1977–1993) or because data were not required to be disaggregated by sex or gender. It was only in January 2016 that the USA’s National Institutes of Health started requiring scientists to consider sex as a biological variable in their preclinical research.

This exclusion of women from the data has limited our understanding of female bodies and health outcomes. The problem is not unique to the United States or even to humans: historically, animal studies have used male animals, typically male rats, as the default.

Whether it’s improving the accuracy of diagnosing heart attacks in women or designing pain treatments better tolerated by female bodies, data have enormous potential to revolutionize women’s health, and enormous good can come of it. I see this promise of a more inclusive, more evidence-based future as one of FemTech’s big appeals. For me, FemTech is about challenging the status quo in girls’, women’s, and females’ health. But I don’t think we can challenge the status quo while embracing outdated privacy and data security models. I’d argue that a company that hasn’t embraced privacy and data security from the start will never truly embrace them as a value or a priority.

How often have you unlocked your phone in public and handed it over to a stranger? Maybe the stranger needs to make an urgent phone call to his mother. His phone died, and his mother was just admitted to the hospital. You know what he’d be doing on your device (calling the hospital), and you’d be there to monitor it. So perhaps it would be OK for him to access your phone.

But what if you didn’t know why the stranger needed to access your phone? What if he were going to take your phone around the corner out of your sight? Would you still unlock it and hand it over?

I would never unlock my phone, hand it over to a random stranger on the street, and let them access my data out of sight. Yet during my first pregnancy, I downloaded numerous pregnancy apps from Apple’s App Store. I trusted Apple, and I assumed that the apps in its store met some quality or privacy standard. This was before the European Union’s General Data Protection Regulation (GDPR) took effect in May 2018. Each app I downloaded listed a seller, but the company names and websites didn’t always reveal who ran the company behind the app, in which country it operated, or how it would use my data.

For the record, Apple says today that it reviews all apps and app updates submitted to the App Store to determine whether they respect user privacy. Apple has also recently begun requiring developers to disclose privacy details that are displayed publicly on each app’s store page.

Yet, even this week, I came across a pregnancy app with over 10K reviews in the App Store that links to a website with no publicly available privacy and data security information. The website has neither the imprint nor the privacy policy required by GDPR, and only a generic Gmail address is listed. Why the lack of transparency? Who are they? Where are they? What is this unnamed person (or persons) in an unknown country doing with the data of well over 10K users?

You might be thinking, get to the point, Sam. Why should I care about boring things like an imprint or privacy policy on someone’s website? First, a lack of transparency and accountability in the healthcare field, particularly in health research, has historically resulted in human rights abuses.

Second, data policies vary quite a bit around the world. You can imagine that a company will treat your private data differently depending on whether it stores them in the European Union, the USA, Russia, China, or elsewhere. Not all countries even require companies to report data breaches, for example, or to inform the people who’ve been affected.

While companies’ approaches to privacy and data security vary by country and region, the media seem remarkably consistent in flagging privacy and security issues in FemTech. Journalists from the USA, UK, Nigeria, and beyond have raised these issues in stories with catchy headlines warning readers of “a dark side to women’s health apps” or “dangerous permissions and hidden trackers in your period app.” Other articles question whether “your sex tech devices may be spying on you” or ask, “is your pregnancy app sharing your intimate data with your boss?”

Now, some of these headlines refer to cases in which companies faced legal and financial repercussions for violating data and privacy standards. Last year, a period and fertility app settled with the USA’s Federal Trade Commission after being exposed for sharing users’ health data with advertising companies and Facebook despite promising to keep the data private. That betrayed the trust of over 100 million FemTech customers. Another popular fertility app settled with California’s Attorney General in 2020 over alleged privacy and security violations.

Some headlines refer to companies that sell users’ data to third parties for advertising purposes or to users’ employers. While this practice might seem questionable, it can be legal depending on how and where they do it. Companies, whether Facebook or FemTech, need to earn money if they offer their products for free. If the user isn’t paying, then there’s a good chance the user is the product for sale.

If you used an app with one of these business models, how would you feel being spammed with adverts for baby equipment after a miscarriage? How would you feel about a company sharing your data with your employer? Given how women, particularly pregnant women and mothers, have historically been treated in the workplace, that practice might raise particular concerns. Then again, depending on what data are shared and how, maybe it’s OK.

A true revolution in women’s health relies on data. Quality data. So what happens when the FemTech community’s data become less and less perfect?

As the headlines show, FemTech users are increasingly affected by privacy and data security failures. In the USA, the Supreme Court’s recent decision to overturn Roe v. Wade has again highlighted consumers’ valid concerns about how companies might turn over sensitive data, leading to calls for people to delete their period-tracking apps. Without global privacy and data security standards, FemTech users are developing their own privacy strategies to protect themselves: falsifying information, splitting tracking across multiple apps so that no one app has the entire picture, or simply deleting apps altogether.

Now, data are rarely perfect. That’s why there are processes for checking and cleaning up data before they are analyzed. Depending on an app’s business model and purpose, it’s probably not a big issue if my app calls me “Anna Smith” instead of “Sam Lattof,” or if it thinks my birthday is 6–12 months off from my actual birthday. If the app shares my data with third parties for advertising purposes, this false information won’t cause me physical harm.
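To make that checking step concrete, here is a minimal sketch in Python of the kind of plausibility checks a data pipeline might run before analysis. The field names, thresholds, and helper function are purely hypothetical illustrations, not any particular app’s implementation; real cleaning processes are considerably more involved.

```python
from datetime import date

def check_record(record: dict) -> list[str]:
    """Return human-readable data-quality warnings for one user record.

    All field names and thresholds below are illustrative assumptions,
    not any real app's schema.
    """
    warnings = []

    # Birthdate plausibility: flag ages implausible for a
    # pregnancy-tracking context.
    birthdate = record.get("birthdate")
    if birthdate is not None:
        age = (date.today() - birthdate).days / 365.25
        if not 13 <= age <= 60:
            warnings.append(f"implausible age: {age:.0f} years")

    # Cycle-length plausibility: values far outside the commonly
    # reported range are more likely entry errors than biology.
    cycle = record.get("cycle_length_days")
    if cycle is not None and not 15 <= cycle <= 60:
        warnings.append(f"implausible cycle length: {cycle} days")

    # Internal consistency: a due date should fall after the
    # reported last menstrual period.
    lmp, due = record.get("last_period"), record.get("due_date")
    if lmp and due and due <= lmp:
        warnings.append("due date precedes last menstrual period")

    return warnings

# Usage: records that trigger warnings get reviewed or excluded rather
# than silently feeding recommendations or population-level analyses.
print(check_record({
    "birthdate": date(2020, 1, 1),  # deliberately implausible
    "cycle_length_days": 2,
}))
```

The catch, as the next paragraphs argue, is that deliberately falsified data can be internally consistent and perfectly plausible, which is exactly why no amount of cleaning can substitute for earning users’ trust.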

But what happens when FemTech products use my false health data to develop recommendations for me or to evaluate my health? Will the inaccurate data be caught in time to prevent harm from inappropriate recommendations or incorrect health evaluations? At the population level, what exciting new insights into women’s health might we miss if FemTech users increasingly feel the need to falsify their data to maintain their privacy?

Consumer mistrust threatens the women’s health revolution and a more inclusive, evidence-based FemTech future. As the FemTech industry moves forward in 2022 and beyond, I think we will see privacy and data security become essential values for companies and a growing number of consumers. Part of this shift will come from government regulations and legislation like GDPR, but I hope to see part of it come from the FemTech industry itself. I think the industry could and should be a leader in adopting, implementing, and promoting the highest privacy and data protection standards.

As an industry, our ability to learn and innovate is only as good as our data quality.

This article originates from Dr. Lattof’s presentation at “The State of FemTech 2021–2022: Industry Review and Overlook” conference hosted by FemTech Analytics on the 27th of January 2022.

Written by Dr. Samantha Lattof

As CEO and Co-Founder of Maila Health, Dr. Samantha Lattof works to improve maternal and newborn health by bridging research and technology. Dr. Lattof has over fifteen years of experience conducting womxn’s health research and driving global health programs and projects. Her research has been used to develop health guidelines and inform policy. She has consulted for the World Health Organization’s Network for Improving Quality of Care for Maternal, Newborn and Child Health; Ipas; and LSE Enterprise. Prior to consulting, she worked for the London School of Economics and Political Science, Harvard School of Public Health, and Columbia University. Dr. Lattof earned her PhD at the London School of Economics and Political Science and her Master of Science at Harvard School of Public Health.

Improving maternal and newborn health outcomes and experiences through meaningful innovations in digital health
