Hardware
Imagine that there lives a vicious criminal who steals and deletes puppy photos from a preschool server. It's a nasty business that can land him in jail if he's not cautious. He likes to keep everything on the mobile phone he carries with him all the time, even in the bathroom and in the shower (he's very careful). He considers the photos to be completely safe since he has set up a PIN for the device that only he knows. All his secrets, and everything that could potentially throw him into jail, live on this little piece of hardware.
But suddenly some other guy rats him out (because he considers doing such damage to little kids beneath his dignity), and serious, righteous people in suits come to the criminal's home with a warrant. He still thinks: "Hey, suckers, I've got everything on my mobile phone, which is secure as hell. All other devices are completely wiped." How wrong would this assumption be? What if he had an old Android device? Or maybe an iPhone 5S? Or an iPhone 4? Is there really a difference? Or is the only reasonable assumption that he should pack cigarettes for jail?
One of the main things to consider when answering that question: is the mobile phone's NAND or NOR (i.e. non-volatile storage) encrypted? If it's not, then, my dear fellow, he is in trouble. One could simply extract this precious little chip and insert it into another device. Now they don't need his passcode; they'd only need their own, or none at all.
Alright, say he was not completely dumb and was smart enough to get himself a device that uses full-disk encryption. Is this attack still feasible? Well, it depends on where the key is stored. If it's stored, for example, at address 0x0000000 of this NAND, then it's just a question of extracting the chip, reading the first bytes and decrypting the contents. So, how and where should this key be stored in order to make this at least harder?
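To make the problem concrete, here is a toy Python sketch (my own illustration, not any real device layout; the dump file name and the AES-CTR layout are invented) of what an attacker could do if the key really sat at the very start of an unprotected flash dump:

from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

# Toy layout: the dumped chip starts with the AES key, then a CTR nonce,
# then the encrypted user data.
with open("nand_dump.bin", "rb") as dump:   # hypothetical dump file
    key = dump.read(32)         # the "hidden" 256-bit key
    nonce = dump.read(16)       # counter-mode nonce
    ciphertext = dump.read()    # everything else

decryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).decryptor()
plaintext = decryptor.update(ciphertext) + decryptor.finalize()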
This is what Apple has been working on for a while. It got so carried away with this task that it ended up protecting its end users even from themselves.
Back then, mobile devices had several basic parts: a CPU, some storage chip (NAND or NOR) and some other pieces that made it possible to use the thing for calls and SMS. Apple decided to apply the least-privilege policy here as well and restrict end users to only those tasks they could reasonably want to do. It's a phone, after all. Why modify the OS? You don't need to. Make your calls and be happy.
Modern Apple iDevices have two AES engines: one is for the Secure Enclave only, and the other is used by both the SE and the Application Processor.
Intro
When the device is manufactured, a 256-bit AES key called the UID is written to the Secure Enclave. In earlier versions of Apple devices these UIDs were randomly generated and then fused into the chip; currently they are generated by the SE itself during manufacturing and fused using special software. There is no API (neither software nor hardware) that can read it out; it is used only by the processor's hardware AES engine. This UID key is unique to each device.
Also, when the user creates a passcode, it is turned into a cryptographic key and strengthened with the device's UID:
import hashlib
import hmac

def pbkdf2_like(user_pass_derived_crypto_key, device_uid, iterations=100_000):
    # Simplified key stretching: repeatedly mix the passcode-derived key with
    # the device UID so the result depends on both.
    result = user_pass_derived_crypto_key
    for _ in range(iterations):
        result = hmac.new(device_uid, result, hashlib.sha256).digest()
    return result

user_passcode = b"secret"
device_uid = b"123456...0"  # placeholder for the fused 256-bit UID
user_pass_derived_crypto_key = hashlib.sha256(user_passcode).digest()
se_mem_key = pbkdf2_like(user_pass_derived_crypto_key, device_uid)
The resulting se_mem_key encrypts the Secure Enclave's memory space.
In iOS and iPadOS, files are encrypted with a key entangled with the Secure Enclave's UID and an anti-replay nonce as they are written to the data volume. On A9 (and newer) SoCs, the anti-replay nonce uses entropy generated by the hardware random number generator. The anti-replay nonce support is rooted in a dedicated nonvolatile memory integrated circuit (IC). In Mac computers with the Apple T2 Security Chip, the FileVault key hierarchy is similarly linked to the UID of the Secure Enclave. In devices with A12 (and newer) and S4 SoCs, the Secure Enclave is paired with a secure storage IC for anti-replay nonce storage. The secure storage IC is designed with immutable ROM code, a hardware random number generator, cryptography engines, and physical tamper detection. To read and update nonces, the Secure Enclave and storage IC employ a secure protocol that ensures exclusive access to the nonces.
https://support.apple.com/guide/security/dedicated-aes-engine-sec4ea70a303/1/web/1
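A loose Python sketch of what "entangling" a per-file key with the UID and an anti-replay nonce might look like (this is not Apple's actual construction; the HKDF parameters and placeholder values are assumptions for illustration):

import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

device_uid = os.urandom(32)         # stands in for the fused 256-bit UID
anti_replay_nonce = os.urandom(16)  # from the hardware RNG / secure storage IC

file_key = HKDF(
    algorithm=hashes.SHA256(),
    length=32,
    salt=anti_replay_nonce,
    info=b"per-file key",
).derive(device_uid)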
All this resembles the sad story of a businessman from Eine Woche volle Samstage (1973) by Paul Maar: a man so afraid of thieves that he hid the office key in a sock, the sock in a boot, the boot in a wardrobe locked with a key, hid the wardrobe key in his desk drawer, and then lost the key to the drawer, thereby arranging an unscheduled day off for his employees. So it is with encryption on iOS devices:
A file's contents are encrypted with its own file key, which is stored in the file's metadata. The file key is wrapped with a class key. The metadata, where this wrapped key is stored, is in turn encrypted with the file system key. All of it is finally protected by an alliance of the user's passcode key and a hardware key (the UID). Wow. Let's dive a bit deeper, because it all seems rather messed up and confusing.
The user's passcode is run through a key-derivation step and entangled with the UID. A rough sketch of the wrapping chain is below.
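Here is a minimal Python sketch of that wrapping chain using standard AES key wrap (the key names mirror the description above, but the sizes and the stand-in "passcode + UID" key are simplified assumptions, not Apple's real format):

import os
from cryptography.hazmat.primitives.keywrap import aes_key_wrap, aes_key_unwrap

# A random per-file key encrypts the file contents.
file_key = os.urandom(32)

# The file key is wrapped with a class key; the wrapped key is stored in metadata,
# which is itself encrypted with the file system key (omitted here).
class_key = os.urandom(32)
wrapped_file_key = aes_key_wrap(class_key, file_key)

# The class key is in turn protected by the passcode-derived key entangled with
# the UID (modelled here as just another wrapping key).
passcode_and_uid_key = os.urandom(32)
wrapped_class_key = aes_key_wrap(passcode_and_uid_key, class_key)

# Reading a file walks the chain in reverse.
assert aes_key_unwrap(class_key, wrapped_file_key) == file_key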
Class keys
Class keys represent protection classes. Each one, except NSFileProtectionNone, is protected with the alliance of the hardware key and the user passcode key (summarized in the sketch after this list):
- Complete Protection (NSFileProtectionComplete): Data is inaccessible until the user unlocks the device.
- Protected Unless Open (NSFileProtectionCompleteUnlessOpen): Files can be created and written while the device is locked (e.g. a download finishing in the background), but once closed they cannot be opened again until the device is unlocked.
- Protected Until First User Authentication (NSFileProtectionCompleteUntilFirstUserAuthentication): The file can be accessed as soon as the user unlocks the device for the first time after booting. It remains accessible even if the user subsequently locks the device, because the class key is not removed from memory.
- No Protection (NSFileProtectionNone): Protected with the UID only. The class key is stored in "Effaceable Storage", a region of flash memory on the iOS device that allows small amounts of data to be stored and quickly erased. This is what makes fast remote wiping possible.
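As a quick mental model (my own summary, not an Apple API), the classes differ mainly in when their class key is available:

# Simplified summary of when data of each protection class can be read.
CLASS_KEY_AVAILABLE = {
    "NSFileProtectionComplete": "only while the device is unlocked",
    "NSFileProtectionCompleteUnlessOpen": "while unlocked, plus files already open when it locked",
    "NSFileProtectionCompleteUntilFirstUserAuthentication": "any time after the first unlock since boot",
    "NSFileProtectionNone": "always (protected by the UID only)",
}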
About Secure Enclave:
https://www.youtube.com/watch?v=7UNeUT_sRos
https://support.apple.com/guide/security/welcome/web
The main idea behind the scenes is that neither the applications nor even the OS itself knows the keys. The OS only sees the Keychain, while the Secure Enclave is separate: it is asked to decrypt information and does just that, returning the decrypted data.
- T2 vs T1
- Secure Enclave: a coprocessor with a key manager
Devices which have the SE:
- iPhone 5s (or later)
- iPad Air (or later)
- Mac computers that contain the T1 chip or the Apple T2 Security Chip
- Apple TV 4th generation (or later)
- Apple Watch Series 1 (or later)
- HomePod
Let's look at this scheme from the Apple website.
The Secure Enclave and the Application Processor (the main CPU) have separate boot processes, separate starting code (Boot ROM) and even separate operating systems. The SE has its own lightweight OS (based on an L4-family microkernel), and even its update process is isolated. With each boot, an ephemeral memory protection key is created.
Boot process:
- Boot ROM (Read-Only Memory) is executed like the Big Bang, out of nowhere. It's called read-only because it cannot be changed and is therefore the hardware root of trust; it's implicitly trusted. So they say, you know, if you can't trust the Boot ROM, whom can you trust? That would be a crazy world. Phew! It's such a relief that the ROM can be ultimately trusted. It contains the Apple Root CA public key (a conceptual sketch of this signature check follows this list).
- Boot ROM creates a key
- User’s key + device’s UID = ephemeral memory protection key
- Now, the ephemeral memory protection key is used to protect the Secure Enclave's memory.
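To illustrate the root-of-trust idea behind the Boot ROM step, here is a conceptual Python sketch (Apple's real image format and signature scheme differ; the RSA assumption and the key file name are mine):

from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding

# The pinned public key plays the role of the Apple Root CA key baked into ROM.
with open("root_public_key.pem", "rb") as f:   # hypothetical file
    pinned_root_key = serialization.load_pem_public_key(f.read())

def may_boot(image, signature):
    # Run the next boot stage only if its signature verifies against the pinned key.
    try:
        pinned_root_key.verify(signature, image, padding.PKCS1v15(), hashes.SHA256())
        return True
    except Exception:
        return False   # refuse to boot an unsigned or tampered image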
FileVault
TouchID
2012 - AuthenTec?
Tied at the hardware level to the A-series chip. Exposed to apps via LAContext and User Presence/ACLs, e.g.:
import LocalAuthentication

let context = LAContext()
var error: NSError?
// Is Touch ID (or Face ID) set up and available on this device?
if context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) {
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Authenticate to continue") { success, evalError in
        if success {
            // authenticated
        } else {
            // error (user cancelled, lockout, no match, ...)
        }
    }
}
Questions
But I don’t enter the passcode at the very beginning. Where does SE get it from during the very first steps of its boot?
Low Level Security
UID
- key that is fused into the Application Processor. No one knows it and it cannot be extracted from the device. Used to encrypt the contents.
GUID
-
UDID
- device identifier. It can be retrieved with iTunes.
Home button - a sapphire crystal for scratch resistance, capacitive touch for detection.
Application Sandbox
There are two users on iOS: mobile and root. When the device is not jailbroken, all applications and processes run by the user execute in mobile's context. Each application is sandboxed and certain policies (called profiles in iOS) are applied, so an application can't access other apps' protected resources. This system is called the TrustedBSD Mandatory Access Control (MAC) Framework. To access other apps' resources, entitlements are specified for each application and checked by securityd.
Entitlements
Consider a Russian woman who wants to travel to Japan, the EU and the USA. Since the October Revolution of 1917 and WWI, it's no longer possible to do that just like that: for travelling one needs a visa, and usually a separate one for each country one plans to visit.
So far this Russian woman has two visas, Schengen and US, but she has not acquired a Japanese visa. At passport control (securityd) the visas in her passport are checked, and the corresponding gate is opened if the visa is present; if not, access to that country's gate is denied. Since she doesn't have a Japanese visa, she is not entitled to travel to Japan. Since she has EU and US visas, she can travel there freely. There are, of course, dozens of other people who hold an EU or US visa, so she's not the only one entitled to get there. Basically, they are all in the same entitlement group.
It's the same here: each application has a "passport" with entitlements (an array of strings), based on which access is denied or allowed by securityd (passport control).
All entitlements are added before or during code signing of the application, hence they cannot be changed later; a rough sketch of such a check follows.
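Sticking with the passport-control analogy, here is a rough Python sketch of the idea (not securityd's real implementation; the entitlement names and bundle identifier are made up):

# Toy model of an entitlement check: the app's "passport" is its list of
# entitlement strings, granted at signing time and immutable afterwards.
APP_ENTITLEMENTS = {
    "com.example.travelapp": ["visa.schengen", "visa.us"],  # hypothetical
}

def is_allowed(app_id, required_entitlement):
    return required_entitlement in APP_ENTITLEMENTS.get(app_id, [])

print(is_allowed("com.example.travelapp", "visa.us"))      # True  (gate opens)
print(is_allowed("com.example.travelapp", "visa.japan"))   # False (access denied)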
Apple is very well known for the value it places on security and for its advanced security mechanisms. I personally have sometimes found it hard to understand specific technical controls, which is why I am writing this little article. Smartphones are known to be more secure than desktops, in part because they are more mobile and tend to get lost a lot. That's why MacBooks keep getting closer and closer to smartphones in terms of security.
Hardware
Every piece of software needs hardware to run on. Maybe that will change some day, or the hardware will turn into something different, like bioware. Who knows?
Imagine a security guard who makes sure only authorised folk enter the building. He's tough, he's often blunt and ready to engage in a tussle if it ever comes to that. He's there to keep miscreants with their pesky delinquent thoughts from passing through. He's not a bigot, though; he is on his heels to hunt down anyone indulging in any sort of felony. No culprit will be able to talk his way out of it. The only problem is that once an authorised person is inside, they may turn malicious and do lots of different bad things. For example, this slimy git may take the keys and mess them up, or even use those keys to bust into rooms he was not supposed to be in. What do we do to protect those keys? We can't allocate a guard to each person coming in; that would be a disaster!
So, here is a crazy idea: keep the keys in a separate place and don't let anyone in! The key-keeper inside will hand out and take back the keys, managing this particular problem himself. That's what the Secure Enclave is for, and that's exactly why it has its own CPU and key generator. To keep things even more secure, we have the option of asking the SE (Secure Enclave) to generate the keys and keep them inside, so they never leave it.
Modes
- Unlocked. Either you know the passcode or the device has not been locked since the acquisition.
- After first unlock. AFU is a considerably less secure state, because after the first unlock most class keys are already in memory and much of the file data can be decrypted. Some extraction procedures employ fast brute-force techniques (numerous cracking attempts; see the toy sketch after this list). You can get information about installed apps, Wi-Fi connections, some media and system files.
- Before first unlock (BFU). The device has just been rebooted or turned on, or an incorrect PIN was provided; user data is encrypted and the decryption keys are removed from RAM. Browser data and mail remain encrypted.
- DFU (Device Firmware Update). It's part of the SecureROM burned into the device. It's not the same as Recovery mode. Here are the instructions on how to put your iDevice into this mode.
- USB Restricted Mode. The iPhone will not talk to any PC it is plugged into unless you provide the device passcode. It might not even charge in some cases. Some software (for example, Belkasoft) can disable this mode, but only via an exploit.
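As a toy illustration of why a short numeric passcode is weak against offline guessing (nothing below resembles a real extraction tool; the hash and passcode are made up), consider how quickly a 4-digit keyspace is exhausted. This is also exactly why Apple entangles the passcode with the UID and enforces escalating delays inside the Secure Enclave:

import hashlib

# Pretend an examiner somehow recovered a fast, unsalted hash of the passcode.
target = hashlib.sha256(b"4351").hexdigest()

for guess in range(10_000):                  # the whole 4-digit keyspace
    candidate = f"{guess:04d}".encode()
    if hashlib.sha256(candidate).hexdigest() == target:
        print("passcode found:", candidate.decode())
        break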
Acquisition
⚠️ Keep in mind that when connected to the Internet, iDevices can be wiped remotely. It's also possible to wipe the device over Bluetooth.
Mobile devices in general, and iOS devices in particular, are a challenge for a forensic examiner. When possible, a bit-by-bit copy is usually considered the best option, but that's not always possible, especially when dealing with a mobile device.
Here are the main types of artefacts that can be acquired for examination.
- Backups. I've dedicated a separate section to this type of artefact; just a few words here. You can acquire them from iCloud or from a PC.
- Crash reports. They do not typically contain user data, however, they can be used to prove that some app existed on the iDevice. Crash logs can persist after an application is uninstalled.
- Media over AFC. It’s the data transfer between the desktop and the iDevice.
- Full acquisition (requires a jailbreak and thus exploits). This one is not forensically sound unless the evidence device was already jailbroken.
⚠️ Of course, to use any of those methods, you'd still need the device passcode.
Now, what are the main methods to acquire those artefacts?
- iTunes. Connect the device to a desktop and get the data. On macOS you don't even need iTunes, just plug it in.
- Lockdown files. If, when connecting the iDevice to the desktop computer, the user confirms the pairing request, the device pairs with the computer and creates a lockdown file to auto-approve further connections. It's unclear how long those files remain on the system. You can find them at /private/var/db/Lockdown/ on macOS (may require additional access permissions) and at C:\ProgramData\Apple\Lockdown on Windows 7-10. These files are most valuable when the device is in AFU mode (a small enumeration sketch follows this list).
- Screen capture.
- AFC (Apple File Conduit). Only for photos, audio and video.
- Jailbreak. You can read more details here. During this process the iDevice first goes into Recovery mode and then into DFU. This is not a forensically sound method unless the device was already jailbroken.
- Exploits. Some exploits do not require a full jailbreak and are thus more forensically sound. Some software, like Belkasoft, supports this type: you install an agent app on the iDevice and the agent acquires the full disk image. Notes, calls, chats, etc. can be acquired this way. There are some requirements though (for example, the machine where you perform the acquisition must have the iCloud application installed from the Microsoft Store). Learn more from their course in the references. The iDevice needs to be unlocked and trusted, and you need an Apple ID. When using an Apple Developer ID instead of a regular Apple ID, Internet access might not be needed on the device.
- Belkasoft X Brute-Force. For 4- and 6-digit passcodes. Several devices (6) are supported, currently on iOS 14 and 15 (as of 13/10/2023). It's powered by the checkm8 acquisition method to bypass the restrictions imposed when the device is locked after several unsuccessful login attempts.
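Here is a small Python sketch of enumerating pairing (lockdown) records on a macOS examination machine, as mentioned in the lockdown-files item above (the plist keys, e.g. HostID, are an assumption and may vary by OS version):

import plistlib
from pathlib import Path

# May require elevated privileges / Full Disk Access on modern macOS.
lockdown_dir = Path("/private/var/db/lockdown")

for record in lockdown_dir.glob("*.plist"):
    with record.open("rb") as f:
        pairing = plistlib.load(f)
    # The file name is usually the paired device's identifier.
    print(record.stem, pairing.get("HostID", "?"))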
References
- Apple Security Guide
- OWASP MASTG
- https://belkasoft.thinkific.com/courses/take/ios-forensics-with-belkasoft/texts/46763931-2-8-ios-agent-based-acquisition
- [1] [2] CIA hacking GID for uploading spy malware on iDevices
- [3] iPhone Wiki about GID key
- [4] How to recover data from a Mac with T2 or FileVault encryption and without a password (links to article about FileVault and T2)
- https://mobile-security.gitbook.io/mobile-security-testing-guide/ios-testing-guide/0x06d-testing-data-storage
- https://support.apple.com/guide/security/secure-enclave-overview-sec59b0b31ff/web
- https://www.youtube.com/watch?v=XhXIHVGCFFM
- https://www.theiphonewiki.com/wiki/Bootrom