At its fall product launch event, Apple further introduced Apple Intelligence, the core driving force behind the iPhone 16 series. The system can understand and create diverse content such as language and images, and it also draws on personalized context. It is powered by Apple's own models, fine-tuned to adapt to users' day-to-day activities, so that users get a smooth, intelligent interactive experience in everyday use.
For computationally intensive tasks, Apple Intelligence uses Private Cloud Compute to securely tap larger generative models, providing more powerful intelligent support while preserving privacy. Notably, in this process user data is never stored or shared with Apple.
Siri, the flagship application of Apple's intelligent ecosystem, has also received a major update. Even when users stumble over their words, Siri can follow their intent, and it now supports typing requests directly, making interaction more convenient.
Apple also demonstrated new capabilities in photo search and memory creation. Users can now find photos by description, such as "Shani dancing in a red skirt", or search for specific moments within videos. Likewise, with a prompt such as "Children and Aunt Fiona learning to knit", users can easily create a memory movie from their library.
Apple Intelligence is currently available in beta in US English, with localized English versions expected to follow in October of this year. Additional languages, including Chinese, French, Japanese, and Spanish, are planned for next year.