I was recently asked to look at this article about an engineer who looked under the hood of TikTok to learn more about its use of users' personal data. After reading the article I wrote the following back:
My personal take is that any app that is popular is going to be under increased scrutiny, especially when that app is being used to share personal content. All apps, depending on how nosey the developer/marketer is, will attempt to grab as much data about their users as possible.
Many times the app developer may be unaware of what analytics they are actually performing, especially when using a third-party tracking tool such as Firebase (Google Analytics), Flurry, Mapbox, or even Facebook's own mobile analytics platform. Each one of these SDKs or add-ons developers use to bring a specific feature into their app adds another layer of tracking, sometimes invisible without further investigation. Here's a chart that shows the top analytics SDKs developers are adding; I bet most developers don't take the time to check how much of their users' data is left vulnerable by each.
In this case, I think TikTok is moving so quickly to keep momentum that they are just keeping everything "on" in terms of analytics tracking. When your business unit is growing exponentially, as I'm sure it is at ByteDance currently, you have lots of people asking for more and more data and information to use for many different reasons. Ad teams want location data for in-store sales attribution, the security team wants clipboard data (a very interesting recent security find), and the law enforcement liaisons need a backdoor for governments to track and infiltrate suspected criminals. Everyone gets a seat at the "data table," and no one is really assigned as the consumer advocate.
That's why it was such a big deal when tech companies started hiring people to be privacy ombudsmen: someone was finally there to gut-check what each team was asking for. I believe that the iOS version of TikTok is probably safer than Android's for this reason, and both are safer than running the app on a jailbroken phone.
As for the encryption left behind in each stage of the app's development and analytics, I just think that's TikTok looking for any way to protect their codebase. They have a vested interest in making it as difficult as possible to replicate TikTok.
I don't think TikTok is any worse than any other quickly developed app out there that hasn't yet started actively working to protect its users' information. However, more information may come to light in the future about what data is being shared with governments, which would change my mind.
In general, I work under the rule that if someone truly wants your private information, there's not much you can do to stop them, except to keep it off the web. If you have something you don't want shared, it's best not to share it, no matter how secure or private you think your device or social platform may be.
The biggest vulnerability on TikTok (Jan 2020) was discovered because someone forgot to add a security feature to the Ads FAQ page. That oversight almost certainly happened because of the speed at which ByteDance is working to build things like ads self-service tools and influencer databases. Imagine how many third-party companies are being brought in to create these kinds of tools right now, each one bringing its own flaws in data protection and security procedures.
TikTok will encounter a lot more bumps in its rise over the next few years, as did Facebook, Twitter, and Instagram. But eventually it will mature into an app that's "secure" enough for most people to be happy. Unfortunately, by then most users will have moved on to the next rising star with its own privacy flaws, and opportunistic hackers and security researchers looking to make a name for themselves will be waiting there too.