Apple finds its price ceiling, readies to unleash AI

Another year, another launch of iPhone models characterised by incremental tweaks - this year infused with Apple Intelligence to, among other things, supercharge Siri.

Apple has gradually ratcheted up the price of its flagship iPhone devices in recent years but, sensing the chill economic winds blowing in the US, has decided to keep pricing for its iPhone 16 line-up the same as for the iPhone 15 series.

In fact, New Zealand pricing takes a dip - the iPhone 16 will go on sale for $1,599 (iPhone 15 $1,649), iPhone 16 Plus $1,799 (iPhone 15 Plus $1,849), iPhone 16 Pro $1,999 (iPhone 15 Pro $2,099) and iPhone 16 Pro Max $2,399 (iPhone 15 Pro Max $2,499). The low New Zealand dollar relative to the US dollar may have something to do with that.

Analysts were anticipating a bump in pricing to pay for the more sophisticated hardware and the development costs associated with Apple Intelligence, which debuts on the iPhone 16 phones and higher-end iPhone 15 models running iOS 18. The market has warmly greeted previous iPhone price increases as a way of propping up revenue from Apple’s phone business. Apple stock lost some ground before ending flat, but the market appears to have accepted that Apple has found its price ceiling for now.

A few highlights for me among the flurry of Apple announcements this morning:

Apple Watch sleep apnoea notifications: The new Apple Watch features a Breathing Disturbances metric which will let the wearer know if signs of sleep apnoea are detected. Sleep apnoea is a disorder that causes breathing to repeatedly stop and start during sleep. I know several people who suffer from it, and it can have major debilitating health consequences.

For more informed conversations with their healthcare providers, users can export a PDF that shows when sleep apnoea may have occurred, three months of breathing disturbance data, and additional information. Educational articles are also available within the Health app to help users learn more about sleep apnoea.

Many smartwatches have sleep tracking, but alerting people specifically to possible sleep apnoea may spur them to get the medical treatment and make the lifestyle changes they need to address the condition. The feature hasn’t got US FDA approval yet, but Apple is anticipating it will come through shortly and plans to launch the feature in 150 countries this month. The sleep apnoea function will be available on the Apple Watch Series 9, Apple Watch Series 10, and Apple Watch Ultra 2.


AirPods Pro as hearing aids: Apple is essentially turning its AirPods Pro earbuds into a “clinical-grade hearing aid” for users with mild to moderate hearing loss. Users first take a Hearing Test to assess their hearing and set up the feature.

“After set-up, the feature enables personalised dynamic adjustments so users have the sounds around them boosted in real time. This helps them better engage in conversation, and keeps them connected to the people and environment around them,” Apple claims.

It won’t replace dedicated hearing aids for users who require them for all-day use - who would want to walk around all day with AirPods stuck in their ears? But I know a lot of people who have mild hearing loss and won’t take the next step of investigating hearing aids, which are pretty expensive. This could be a useful intermediate measure, depending on how effective the technology is.

Apple continues: “The user’s personalised hearing profile is automatically applied to music, movies, games, and phone calls across their devices, without needing to adjust any settings. Users can also set up the Hearing Aid feature with an audiogram created by a hearing health professional.”

The hearing aid tech will be available on the new AirPods Pro 2 ($479) in conjunction with an Apple device running iOS 18.


Apple Intelligence: Android phone makers’ integration of AI has been underwhelming to date. Can Apple change that? The iPhone 16 line-up goes big on AI, with the first features becoming available next month. There’s dedicated Apple silicon with a neural engine built to run Apple’s generative models. Most tasks will be completed on the device for greater privacy and security - the rest will be sent to Apple’s Private Cloud Compute.

The system-wide Writing Tools will assist in writing better notes and emails, and there’s automatic transcription and summarisation of audio recordings, which you currently need a third-party app like Otter for. Call recording within the Phone app is a great feature and, if activated, informs the other participants for privacy reasons. Inbox summaries should do for free what Microsoft and Google charge users a monthly fee for.

Siri is getting a major AI upgrade, which could make the digital voice assistant a lot more useful. 

“With richer language-understanding capabilities, communicating with Siri is more natural and flexible,” Apple promises. 

“Siri follows along when users stumble over their words, and maintains context from one request to the next. Users can type to Siri at any time, and switch fluidly between text and voice as they accelerate everyday tasks. Siri also now has extensive product knowledge to answer thousands of questions about features on iPhone and other Apple devices.”

We will see about that!

Camera Control: Apple’s high-end iPhones, the Pro and Pro Max models, are popular among creatives for high-quality photography and video production. In that sense, the addition of a new physical button to these phones makes sense. As Apple explains:

“It has a tactile switch that powers the click experience, a high-precision force sensor that enables the light press gesture, and a capacitive sensor that allows for touch interactions. A new camera preview helps users frame the shot and adjust other control options — such as zoom, exposure, or depth of field — to compose a stunning photo or video by sliding their finger on the Camera Control. 

“Later this spring, Camera Control will be updated with a two-stage shutter to automatically lock focus and exposure on a subject with a light press, letting users reframe the shot without losing focus. Additionally, developers will be able to bring Camera Control to third-party apps such as Kino, which will offer users the ability to adjust white balance and set focus points, including at various levels of depth in their scene,” Apple adds.

