All consumer electronics are unfinished symphonies. The annual refresh cycle means a new model arrives every year, tempting you with new features and nudging you toward an upgrade. You’ve probably noticed how gadgets age after a few years – more like milk than wine, especially if you’re an early adopter.
The iPhone isn’t immune to this. Some say it’s planned obsolescence, and there’s probably some truth to that. But more than that, it’s the relentless march of feature enhancements. And while skipping a release may give you a little FOMO, the reality is that most new releases are incremental. Each generation is another step in a gradual evolution.
iPhone 16 and Apple Intelligence: Subtle Evolution or Game-Changer?
Announced at the “Glowtime” event in Cupertino, the iPhone 16 lineup brings generative AI to Apple devices with Apple Intelligence. This in-house AI platform is designed to enhance the iOS experience with a range of features carefully woven throughout. Apple is finally in the generative AI space in a way that feels like Apple. However, while Apple Intelligence may spark what some call a “supercycle” of iPhone upgrades, Apple is being very restrained, adding value through refinement, not revolution.
Apple Intelligence: A Different Approach to Generative AI
Unlike the large, data-hungry models that power popular generative AI platforms like ChatGPT and Google’s Gemini, Apple Intelligence uses smaller, less opaque models, a choice driven by Apple’s focus on privacy, user trust, and device performance. Instead of relying on cloud-based servers to handle complex AI operations, Apple’s approach keeps data processing closer to the user for efficiency and security.
This is all about adding to what you already love. For example, Mail now has AI-driven summaries and message generation to get you to the heart of your inbox faster. Photos has better object recognition, so you can find people, places, and moments more easily. Apple wants to make the user experience feel more intuitive without overwhelming you with big, complex changes.
The User Experience
Apple Intelligence may not have the immediate “wow” of earlier generative AI experiences, but that’s deliberate. Early adopters and industry insiders have already seen AI tools for text and image generation and were wowed by their novelty and scope. Apple’s approach is different: instead of trying to impress you with sheer capability, it weaves intelligent assistance into the fabric of the experience.
Supercycle – and Hurdles
Apple’s gradual rollout of Apple Intelligence could spark a “supercycle” of upgrades; analysts are comparing the moment to the arrival of the first 5G iPhone. A supercycle is a surge in sales beyond the usual upgrade cadence, driven here by the appeal of the latest AI features. However, since Apple Intelligence is rolling out in phases, the sales bump may not be as big as with previous “must-have” upgrades.
And Apple’s “subtle” approach may not immediately grab consumers’ attention. Those looking for a big new experience may not see Apple Intelligence as the kind of novelty the original iPhone or the first 5G models were. However, if Apple Intelligence proves itself a quiet but powerful tool over time, it may shift consumer attitudes toward long-term upgrades and solidify Apple’s reputation for smooth, user-focused technology.
Apple Intelligence’s First Test: Real-World Usability
How Apple Intelligence will do in the real world will depend on how well it integrates with everyday tasks. Generative AI is still new, and Apple is betting its incremental approach will feel more natural and less overwhelming than other generative AI out there.
For example, Reminders and Calendar now offer AI-driven insights and event suggestions that boost productivity without requiring much user input. The changes are designed to feel intuitive, evolving based on user behavior rather than direct commands. This subtle integration, combined with Apple’s focus on privacy, could make for a sophisticated and easy-to-adopt experience.
A New Era of Apple Camera Innovation
Apple has always treated the camera as the crown jewel of its iPhone releases, and the iPhone 16 is no exception. This time, the upgrade comes with Camera Control, a new physical button that harks back to the days of more straightforward, hands-on devices. It builds on the idea of the iPhone 15’s Action button but takes it further, offering a tactile, interactive experience for photography enthusiasts.
Unlike the Action button, the Camera Control does more than just open the camera app. Once the camera is open, users can use the touch-sensitive button to swipe through camera options, modes, and settings. This added level of control brings an element of immediacy to the experience, which is exactly the kind of simple yet meaningful tool Apple likes to ship.
Visual Intelligence: Apple’s Answer to Google Lens
The Camera Control button is just the beginning of Apple’s camera innovation. It’s an entry point for Visual Intelligence, Apple’s take on AI-driven augmented reality that brings real-time object recognition and interaction to the device. Often compared to Google Lens, Visual Intelligence helps users analyze, identify, and interact with objects, locations, and information in their environment.
Visual Intelligence will be part of the camera system and will work seamlessly with Apple Intelligence to create a new kind of experience. For example, you can take a photo of a book cover, and Visual Intelligence will identify the title, author, and availability in nearby stores. The feature arrives in beta this October, a sign that Apple is taking a cautious but committed approach to rolling out new AI features.
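Visual Intelligence itself has no public API at launch, but VisionKit’s ImageAnalyzer, the framework behind the older Visual Look Up feature, hints at the plumbing. Here’s a minimal sketch of the general approach; treat it as an illustration, not how Visual Intelligence is implemented:

```swift
import UIKit
import VisionKit

// A minimal sketch using VisionKit's ImageAnalyzer (iOS 16+), the framework
// behind Visual Look Up. Visual Intelligence has no public API at launch;
// this only illustrates the general on-device analysis flow.
@MainActor
func analyze(_ image: UIImage, in imageView: UIImageView) async throws {
    let analyzer = ImageAnalyzer()
    // Ask for live text plus subject lookup (plants, pets, landmarks, etc.).
    let configuration = ImageAnalyzer.Configuration([.text, .visualLookUp])
    let analysis = try await analyzer.analyze(image, configuration: configuration)

    // Attach the result to an interaction so the user can tap recognized
    // subjects and text directly in the image view.
    let interaction = ImageAnalysisInteraction()
    imageView.addInteraction(interaction)
    interaction.analysis = analysis
    interaction.preferredInteractionTypes = .automatic
}
```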
Apple’s Branding Tightrope with Apple Intelligence
Apple’s AI ambitions come with their own set of challenges. Unlike other tech companies that are all in on cloud-driven, large-scale AI, Apple is taking a restrained, privacy-focused approach with Apple Intelligence. This tightrope walk means balancing cutting-edge innovation with Apple’s existing brand of security and minimalism.
As Apple rolls out Apple Intelligence across its devices, it has to promote it without overwhelming users. Tech enthusiasts will appreciate the nuance of Apple Intelligence’s AI, but the broader audience will need to be introduced to it through practical applications in the iOS experience. Apple’s approach is to let the feature speak for itself as it becomes part of users’ daily habits, from Mail’s AI-driven summaries to Photos’ object recognition.
Apple Intelligence Availability: Roadblocks and Rollouts
Apple’s rollout of Apple Intelligence, the in-house AI platform that powers the iPhone 16’s most advanced features, faces several obstacles to adoption. Along with a staggered feature release schedule, regulatory hurdles have effectively blocked Apple Intelligence in some of Apple’s biggest markets. This limited availability could keep the iPhone 16 from achieving the supercycle Apple has enjoyed in previous product cycles.
Regulatory Hurdles: EU and China Exclusions
The iPhone 16’s AI features won’t be available to Apple’s EU or China customers — at least not at launch. According to a company statement to the Financial Times, the EU’s Digital Markets Act (DMA) has introduced “regulatory uncertainties” that prevent Apple from launching Apple Intelligence, iPhone Mirroring, and SharePlay Screen Sharing this year. These regulatory hurdles mean EU users will miss out on Apple Intelligence’s AI features, including features that would otherwise be reasons to upgrade.
China is another big roadblock for Apple Intelligence. A Chinese version is expected in 2025, but, as the South China Morning Post points out, it’s unclear whether China’s evolving AI regulations will delay or even block the feature. For now, Apple can’t offer one of the iPhone 16’s biggest selling points to a big chunk of its global user base.
Apple Intelligence in the US: A Better Outlook
The US will see Apple Intelligence with the iOS 18.1 update. After testing the developer beta, I can say the feature is close to public release, though a few minor bugs are still being ironed out. Apple’s cautious rollout matches its overall approach: fine-tuning generative AI for a better user experience without sacrificing its standards for privacy and reliability.
One key aspect of the US release is the “opt-in” setup. Given the current skepticism around generative AI, Apple’s decision to make Apple Intelligence opt-in is a nod to user choice and privacy. Users have to enable the feature manually in the settings menu, a slight inconvenience, but one that means users are actively choosing to use AI-driven features.
Some Features Available Even If Apple Intelligence is Off
Even if Apple Intelligence is turned off, some of its features will still be available. “Clean Up,” Apple’s photo editing tool, which is similar to Google’s Magic Eraser, is one of them. It’s in the Photos app but requires a one-time download upon first use. This gives users a taste of Apple Intelligence without having to turn on the whole AI suite, which may appeal to those who are still skeptical of generative AI.
Apple Intelligence Writing Tools: A Gentle Intro to Generative AI
As Apple Intelligence arrives with the iPhone 16, Writing Tools is the most widely available AI feature across the Apple ecosystem. It’s in native apps like Pages and Mail and is designed to work with third-party apps so developers can access Apple’s AI writing features. This is Apple’s first big foray into generative AI, positioning AI as a helper that works behind the scenes.
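On the developer side, Apple says standard text controls pick up Writing Tools automatically on iOS 18, and UIKit exposes a handful of properties to tune or opt out of the behavior. As a minimal sketch, assuming a stock UITextView:

```swift
import UIKit

// A minimal sketch: on iOS 18, standard text views get Writing Tools for
// free; these properties tune how the integration behaves.
func configureEditor(_ textView: UITextView) {
    if #available(iOS 18.0, *) {
        // .complete allows full inline rewriting; .limited keeps suggestions
        // in an overlay panel; .none opts the view out entirely.
        textView.writingToolsBehavior = .complete

        // Optionally restrict what Writing Tools may produce,
        // e.g. plain text and lists but no tables.
        textView.allowedWritingToolsResultOptions = [.plainText, .list]
    }
}
```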
Why Writing Tools is Apple’s First GenAI Release
Writing Tools offers the kind of text generation that made platforms like ChatGPT popular. But instead of reinventing how we interact with our devices, Apple is bringing AI to the iPhone in a familiar, useful way, meeting people where they are: in the writing and note-taking tools they already use. Rather than overwhelming users with “AI magic,” Apple is giving them targeted, productivity-focused upgrades.
How to Use Writing Tools: Simple and Fast
Using Writing Tools feels native and unobtrusive. Highlight some text, tap Writing Tools in the popup menu, and a secondary menu opens with options:
Proofread: Checks grammar and clarity.
Rewrite: Suggests alternative text for highlighted text.
Friendly: Makes text sound warmer and more conversational.
Professional: Makes the text more formal and polished.
Concise: Shortens the text while keeping the essence.
Summary/Key Points: Summarizes long text into key points.
List/Table: Converts text into a list or table.
As someone who likes to craft each word, I thought I’d skip these features. But I’ve come to like the Summary/Key Points tool, especially when reviewing long email threads or organizing research notes. It’s actually helpful in simplifying communication. The List/Table option feels a bit redundant since it just adds bullet points without making the text clearer or more organized. I’d love to see more adaptive features here — like a smart outline tool that restructures the text based on logical points or priorities.
User Feedback for Future Updates
Apple’s thumbs-up and thumbs-down icons are a nice touch, allowing users to give specific feedback on suggestions. This is especially important in an AI setting where nuance is key, and every user has their own preferences. I use the thumbs down on the more formal or friendly rephrasing options as they can sometimes miss the subtlety of my original tone. With Apple’s history of iterative updates, I hope this feedback loop will lead to more targeted and context-sensitive suggestions.
First Impressions and Limitations
As someone who enjoys writing, I find some of the stylistic options, like Friendly and Professional, a bit too formulaic to be useful. That said, I can see how others might find them handy, especially when switching between writing contexts. Concise and Summary have been useful, especially for long emails and notes where brevity is key. Writing Tools doesn’t try to take over the writing process, which is a nice change of pace from AI tools that feel like they’re trying to “replace” the writer rather than help them.
But sometimes, AI’s rephrasing can feel generic, almost like it’s being too cautious about changing too much. If Apple added a “creative rewrite” option, I think it would add more range to the writing enhancements without losing the user’s tone.
Apple Intelligence and Siri: A Long Overdue Refresh for the Smart Assistant
With Apple Intelligence rolling out on the iPhone 16, Siri is one of the most hyped updates, especially for those who have seen smart assistants plateau over the past 10 years. Since its release 13 years ago, Siri has struggled to keep up with the pace of AI advancements. Now, with Apple Intelligence, Siri finally gets a major refresh that redefines what Apple’s assistant can do and how it does it.
The New Siri Interface: A Clean and Modern Look
The most obvious change in Siri is the new interface. Apple has replaced the glowing orb with a smooth, glowing border that outlines the screen when Siri is active. This new design isn’t just pretty; it’s functional, too. When you activate Siri, the subtle animation doesn’t block the text, which I find much more user-friendly. It’s an attractive but unobtrusive visual cue that lets you know the assistant is listening, a nice touch that makes the experience feel more modern.
The playful screen animation is fun, but Apple has kept it minimal. The interface refresh is subtle, yet enough to signal that Siri’s capabilities are on a whole new level. It’s a nice touch that shows Apple’s commitment to user-centric design.
New Capabilities: Contextual Awareness and Task Assistance
While the interface is new, the assistant’s most exciting features are its capabilities. Apple Intelligence powers a better understanding of user intent, which has been a long time coming. Siri is now better at understanding what you’re asking even if you stumble over your words, making interactions feel more natural and less annoying.
For practical tasks, Siri’s updates are immediately noticeable. You can now ask Siri to do specific things in apps, like log medications in the Health app. These features make Siri feel less like a basic voice-command tool and more like an integral part of the iOS experience. In my testing, Siri’s improved contextual awareness was genuinely useful; for example, it can draw on recent requests and the current screen content to inform its response, a subtle but powerful feature.
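Under the hood, apps expose these actions to Siri through Apple’s App Intents framework. Here’s a rough sketch of the pattern; the intent and parameter names are hypothetical, not Apple’s actual Health intent:

```swift
import AppIntents

// A hypothetical intent illustrating how an app exposes an action that
// Siri can invoke, e.g. "Log my medication." The names are invented for
// illustration; the Health app's real intents differ.
struct LogMedicationIntent: AppIntent {
    static var title: LocalizedStringResource = "Log Medication"

    @Parameter(title: "Medication Name")
    var medicationName: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // In a real app, persist the entry to your data store here.
        return .result(dialog: "Logged \(medicationName).")
    }
}
```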
Rolling Out in Stages: Apple’s Iterative Approach to Siri’s AI
As expected, Apple is taking an iterative approach to Siri’s updates. Not all of Siri’s new features will be available at launch; several big-ticket items will roll out in future iOS updates. Advanced contextual awareness and a better understanding of past interactions are among those. This may be frustrating for users who want everything now, but it’s a practical approach. By gradually rolling out features, Apple can test and refine Siri based on real-world feedback and make sure the assistant works seamlessly across devices.
Photographic Intelligence
Apple Intelligence’s photo editing takes a giant leap forward with features like Clean Up. The new tool gives users a level of convenience and capability that feels like a polished built-in feature rather than a complex AI process.
What is Clean Up?
Clean Up lets you enhance your photos by removing unwanted objects. It’s Apple’s answer to Google’s Magic Eraser, the feature that set the standard for photo editing in the age of generative AI. Using Clean Up is simple: open a photo and circle the object or detail you want to remove. The software then works its magic, generating a background approximation to replace the unwanted object.
This is built on top of Apple’s existing object recognition tech, enabling features like background removal in photos. It’s a simple concept, but execution is what matters in photo editing.
Real-world Experience With Clean Up
From my personal testing, Clean Up works well for simple edits. For example, I used it to remove a random pedestrian from a landscape shot, and it filled in the gap nicely, making the final image look natural. But like any AI-based tech, it has its limits. I found it struggled with complex backgrounds, especially those with intricate patterns or multiple colors.
For example, when I tried to remove a stray branch from a photo of a lake surrounded by trees, the algorithm had trouble getting the background right. It’s impressive in many cases but not foolproof, and users should expect occasional imperfections.
Object Recognition and Wildlife Photography
One of the best uses of the new object recognition is in wildlife photography. After moving from the city to the country, I decided to put the feature to the test with the various animals around my home. It was fun to see how well the software would recognize different creatures. For example, it quickly identified an eastern chipmunk lounging on the armrest of my Adirondack chair, a neat showcase of the software’s capabilities.
But my pet rabbit June was a funny test. The software alternately labeled her as a cat and a mammal. While it’s technically true she’s a mammal, the mix-up shows the limitations of current AI in distinguishing between similar animals. These quirks add a bit of humor to the user experience.
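Photos’ classifier isn’t a public API, but Vision’s built-in image classification request illustrates the same kind of on-device labeling, taxonomy quirks included: broad labels like “mammal” ride alongside specific ones. A minimal sketch:

```swift
import Vision

// A minimal sketch of on-device image labeling with Vision's built-in
// classifier. Photos uses its own pipeline; this just illustrates why a
// rabbit can plausibly come back tagged as both "mammal" and "cat".
func classifyAnimals(in cgImage: CGImage) throws -> [String] {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(cgImage: cgImage)
    try handler.perform([request])

    // Keep reasonably confident labels; the taxonomy returns broad terms
    // ("mammal") alongside specific ones ("chipmunk"), ranked by confidence.
    let observations = request.results ?? []
    return observations
        .filter { $0.confidence > 0.3 }
        .map { "\($0.identifier) (\(String(format: "%.2f", $0.confidence)))" }
}
```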
Search
Beyond Clean Up, Apple Intelligence introduces new search capabilities within the Photos app. Users can now search for images with more complex keyword strings, making it easier to find specific photos in an extensive library. For example, instead of just searching for “dog,” you can search for “dog at the beach” and get more precise results. This new level of search makes the app more usable, so you can find the images you want without having to scroll through hundreds of thumbnails.
iPhone 16 Camera: Professional Quality
When it comes to upgrades, the camera is always the star, and the iPhone 16 is no exception. Apple knows we love photography and wants to make it better with every iteration. The iPhone 16 camera system is Apple’s attempt to marry advanced tech with ease of use.
A Dual Focus
Apple’s camera development is driven by two main goals: matching the quality of a standalone camera and simplifying the photography experience for everyday users. This dual approach means photography enthusiasts and casual users can have a great experience without navigating complex settings.
Matching Standalone Cameras
The first goal of the iPhone 16 camera system is to match the performance of standalone cameras as closely as possible. This means improving the sensors and the image signal processor (ISP), the two most important parts of the camera. The addition of features like a 5x telephoto lens and macro shooting takes the iPhone’s photography capabilities to the next level.
The 3D sensor-shift optical image stabilization is cool. This technology reduces blur in photos, making your images sharp, even in low light or when you’re moving. It’s a game changer for anyone who wants to capture spontaneous moments without worrying about camera shake.
Easy to Use
The second goal is to help non-experts get great results without having to dive into the camera settings. In today’s fast-paced world, people want their photos to look good instantly. The iPhone 16 delivers on that by adding features that automate parts of the photography process.
For example, the camera app will automatically choose the optimal settings for the shooting environment. You can trust your photos will look good and be well-exposed without adjusting anything. That’s a relief for those who don’t have the time or inclination to learn the intricacies of photography.
Video and Audio
The iPhone 16 also improves video capabilities. Improved mic quality and wind sound reduction are big pluses for video recording. Good sound can make or break a video, especially in noisy environments.
For casual videographers, the ability to isolate voices within the frame is a nice addition. This is perfect for recording friends in a busy restaurant or during outdoor events where wind noise would otherwise ruin the audio. Professional videographers may still use standalone mics for more controlled sound capture, but the iPhone 16 provides great options for spontaneous, high-quality video shooting.
iPhone 16 Components: New Tech, New Experience
With the iPhone 16 series, Apple has changed its chip strategy and overall design to make the experience more cohesive across the lineup. All new devices get either the A18 or A18 Pro chip, which means Apple is finally bringing Apple Intelligence to the whole lineup (in the iPhone 15 series, it was available only on Pro models).
Unified Chip
By putting the A18 and A18 Pro chips in every iPhone 16, Apple eliminates the frustration of iPhone 15 owners who had access to certain features only on Pro models. The move streamlines the experience and puts advanced features in more hands.
A18 Pro Chip
The A18 Pro takes performance to the next level with a 16-core Neural Engine, 6-core CPU, and 6-core GPU. Apple still has a way to go in the AAA gaming space, but the improvements in graphics—faster hardware-accelerated ray tracing, mesh shading, and dynamic caching—make the iPhone a more competitive gaming platform. This is good news for mobile gamers who want more device performance.
Durability and Battery Life
For almost a decade, I’ve been saying that smartphones need better durability and battery life. Apple has definitely improved durability with a new, tougher generation of Ceramic Shield glass. I haven’t done drop tests myself, but early reviews say the iPhone 16 withstands daily use better than its predecessors.
Better Battery Life
Battery life is another big focus. While Apple didn’t disclose the exact increase in capacity, it says the iPhone 16 Pro Max has the best battery life of any iPhone ever. The more power-efficient A18 chip helps here, too: you can leave the charger at home for the day without worrying about battery drain. I’ve experienced this personally; my phone easily lasts a full day on a single charge.
Repairability
Apple didn’t mention repairability during the event, which felt like a big miss, given the Right to Repair movement. However, there are some improvements in this area—a new adhesive design makes it easier to repair the device, and the Repair Assistant has been added in iOS 18. This is a good start for consumers who want to fix their devices instead of replacing them.