
Apple; screenshot by CNET
For years, Apple has been building security into its iPhones and iPads. More than a decade ago, it added ways to encrypt information stored on the iPhone. In 2011, it launched encrypted messaging with iMessage. And in 2013, it introduced Touch ID biometric sensors to help people unlock their phones. Over the years, Apple has brought these technologies to the Mac as well, and now, with its new M1 chips for the Mac Mini, MacBook Air and MacBook Pro, it's stepping up those efforts.
On Thursday, Apple updated the platform security documents on its website, describing how Mac computers now work much more like their iPhone counterparts. The documents dive into the fine details of how the various security systems inside the computers and phones talk to one another, and how they're designed to protect an Apple user's privacy.
“Secure software requires a foundation of security built into hardware,” Apple said in its nearly 200-page security update. “That’s why Apple devices – running iOS, iPadOS, macOS, tvOS or watchOS – have security features designed into silicon.”

Apple has increasingly marketed its products as designed to protect users' privacy.
Angela Lang / CNET
It may seem strange for a company as secretive as Apple to share so many details about almost anything. The tech giant is as well known for its marketing as for its devices, and although the company shares technical details about its products on its website, that information is aimed at a general audience.
The platform security information, however, is different. Apple said it began publishing it for business customers more than a decade ago, but the company soon learned that the security researchers it works with to identify vulnerabilities in its devices found it useful too. That's part of the reason you'll find terms like “kernel integrity protection” and “pointer authentication codes,” both of which refer to parts of the company's various security systems.
Of course, Apple isn't the only company working with security researchers. Over the past decade, the tech industry at large has established “bug bounty” programs that pay outside researchers to encourage them to find vulnerabilities in their devices. Companies including Microsoft, Google and Facebook have paid out large sums and publicly recognized security experts for identifying flaws before they're widely exploited by hackers. Apple itself pays up to $1.5 million for such bounties.
Encouraging use
Apple said part of the way it designs its security systems is to encourage people to use them, or to have them run in the background without people needing to know how they work or how to use them.
For example, iMessage has encryption built in – users don't have to turn it on. And Apple designed its Touch ID fingerprint sensor and Face ID face unlock systems to encourage people to use the encryption, which is activated when they set a passcode. Before Apple built Touch ID, the company said, fewer than 49% of people used passcodes on their phones. After its launch, 92% did.
“This is important because a strong passcode or password forms the basis for how a user’s iPhone, iPad, Mac or Apple Watch cryptographically protects the user’s data,” Apple said in its security document.
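As a rough illustration of that idea, the Swift sketch below shows how a passcode can be cryptographically stretched into an encryption key using the standard PBKDF2 function from Apple's CommonCrypto library. It is a minimal, hypothetical example – the deriveKey function, salt and iteration count are invented for illustration – and it is not Apple's actual Data Protection design, which also entangles the passcode with a device-unique key inside the Secure Enclave.

```swift
import Foundation
import CommonCrypto

// Conceptual sketch only: stretch a user passcode into a 256-bit key with PBKDF2.
// Apple's real Data Protection additionally ties the passcode to a device-unique
// hardware key inside the Secure Enclave; that step is omitted here.
func deriveKey(from passcode: String, salt: Data, rounds: UInt32 = 100_000) -> Data? {
    var derivedKey = Data(count: 32)          // 256-bit output key
    let passcodeByteCount = passcode.utf8.count

    let status = derivedKey.withUnsafeMutableBytes { keyBuffer -> Int32 in
        salt.withUnsafeBytes { saltBuffer -> Int32 in
            CCKeyDerivationPBKDF(
                CCPBKDFAlgorithm(kCCPBKDF2),
                passcode, passcodeByteCount,
                saltBuffer.bindMemory(to: UInt8.self).baseAddress, salt.count,
                CCPseudoRandomAlgorithm(kCCPRFHmacAlgSHA256),
                rounds,                        // more rounds = slower brute-force
                keyBuffer.bindMemory(to: UInt8.self).baseAddress, 32
            )
        }
    }
    return status == Int32(kCCSuccess) ? derivedKey : nil
}
```

Because the key is derived rather than stored, data encrypted with it can only be unlocked by someone who knows the passcode – which is why Apple pushes so hard to get users to set one.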