Why Apple Watch is Apple’s Most Important Product

I think the Apple Watch is the best – or should I say most important? – device the tech giant makes. It’s amazing for notifications, productivity and fitness, but most of all it’s vital for health and safety.

As I explained on Sky News Weekend Edition in the video above, Apple’s latest Watch campaign includes a couple of Australian survival stories that give compelling reasons as to why having a smartwatch is a good idea.

The two Aussies are Bruce Mildenhall and Lexie Northcott. We spoke with them last week and they had two very different lifesaving experiences with Apple Watch.

BRUCE’S STORY – FALL DETECTION

Back in 2021, Bruce Mildenhall was enjoying his regular bike ride in Victoria’s Macedon Ranges when a kangaroo hopped out of nowhere.

After hitting the kangaroo and coming off his bike, he was knocked out. Fortunately for Bruce, his Apple Watch detected the fall and notified his emergency contacts and emergency services.

Mildenhall remembers waking up in an ambulance and hearing his wife pounding on the door asking if he was alive. He was taken to hospital where he spent a week recovering from a dislocated shoulder and fractured ribs. 

There’s no doubt fall detection helped Mildenhall receive medical attention quickly; without it, he could have been stuck for hours or days. It’s also worth considering a cellular Apple Watch: once set up, it works independently of the iPhone, and it will call 000 even if it doesn’t have an active plan.

Bruce says he’s fully recovered from his injuries and he’s back on the bike, cycling the same route like a champ.

LEXIE NORTHCOTT – LOW HEART RATE NOTIFICATIONS

This is a powerful story about taking the data from your Apple Watch seriously.

In 2019 Lexie Northcott was just 16 when she received an Apple Watch as a birthday present. Soon after she started wearing the watch, she began receiving low heart rate notifications – daily.

As a young and fit person, Lexie dismissed the notifications. But the notifications kept coming, several times a day.

A year after receiving the watch, Lexie was visiting a doctor on another matter when her mother, Karla, mentioned the Apple Watch notifications. The doctor assumed Lexie would be fine considering her age but suggested she get an ECG (electrocardiogram) just to be sure. 

A week later, things escalated dramatically.

During the ECG, doctors informed Lexie that she was at extreme risk of heart failure. They rushed to Melbourne, where she underwent heart surgery.

Lexie is doing fine now and admits to feeling safer because of her Apple Watch. Her mum, Karla, believes the Apple Watch saved her daughter’s life. Without it, they would not have known to mention a low heart rate concern to doctors.

This is why the Apple Watch is the most important device made by Apple, in my opinion. If your parents are getting older and they have an iPhone, seriously consider getting them an Apple Watch.

There is no downside, just up.

How to Use Apple Intelligence Outside the United States

We have an update about the availability of Apple Intelligence for users outside the United States. Or, more accurately, for speakers of “non-American” English.

Remember Apple stated at WWDC, “Apple Intelligence is free for users, and will be available in beta as part of iOS 18, iPadOS 18, and macOS Sequoia this spring in U.S. English.”

That’s U.S. ENGLISH if you want Apple Intelligence to work on your device.

Of course this setting is easily changed, but it wasn’t until today that we received confirmation that Apple Intelligence will work for Australians (or anyone else) selecting U.S. English outside the U.S.A.

Apple could have decided to geo-block access even after the user changed languages, but that concern was put to rest today. Anyone who changes their language to U.S. English in both general settings and Siri settings will be able to join in the Apple Intelligence fun.

As an Australian English user, I showed how easy it is to change language settings to U.S. English during Sunday’s tech segment with Tim Gilbert on Sky News Weekend Edition.

DJURO SEN TALKS APPLE INTELLIGENCE WITH TIM GILBERT ON SKY NEWS WEEKEND EDITION

The above TV segment was broadcast live before Apple confirmed that ANYONE with a device switched to U.S. English will be able to access Apple Intelligence.

Some features, software platforms, and additional languages will arrive next year. Apple Intelligence will be available on iPhone 15 Pro, iPhone 15 Pro Max, and iPad and Mac with M1 and later, with Siri and device language set to U.S. English.
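The stated requirements boil down to a simple check. Here’s a minimal Python sketch of those rules – the function name and device lists are my own illustration, not an Apple API:

```python
# Hypothetical helper encoding Apple's stated eligibility rules as of WWDC 2024.
SUPPORTED_IPHONES = {"iPhone 15 Pro", "iPhone 15 Pro Max"}
SUPPORTED_CHIPS = {"M1", "M2", "M3", "M4"}  # "M1 and later" for iPad and Mac


def eligible_for_apple_intelligence(device, chip, language):
    """Return True if the device meets the stated requirements."""
    if language != "en-US":  # Siri and device language must be U.S. English
        return False
    if device in SUPPORTED_IPHONES:
        return True
    # iPad and Mac need an M1 or later Apple silicon chip
    return chip in SUPPORTED_CHIPS


print(eligible_for_apple_intelligence("iPhone 15 Pro", None, "en-US"))  # True
print(eligible_for_apple_intelligence("iPhone 15", None, "en-US"))      # False
```

Note that the language gate applies to every device type, which is exactly why switching to U.S. English matters for Australians.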

Clearly it means giving up some local language support, but if I can get early access to Apple Intelligence, I’m OK with that.

I am desperate to get a smarter version of Siri but we could be waiting until next year to see the full benefits of Apple Intelligence.

WWDC 2024: Canon and Blackmagic Design to Help Push Apple Vision Pro Sales

Content is king and that’s why Apple needs Canon and Blackmagic Design cameras to capture high quality spatial and immersive video to wow potential buyers of Vision Pro headsets. We’ve been able to shoot Vision Pro spatial video for some time on the iPhone but soon professionals and expert amateurs will be able to capture better quality images, specifically for Apple’s premium device.

At WWDC Apple announced that Vision Pro will soon be selling in Australia, China, Hong Kong, Japan, Singapore, Canada, France, Germany, and the U.K. More markets means more content.

At an RRP of A$5,999, the Vision Pro needs to blow you away. Fortunately there will be more ways to gather stunning content thanks to Canon and Blackmagic Design. One for the enthusiasts and the other for pros only.

CANON RF-S7.8mm F4 STM DUAL lens for EOS R7

The lens shown by Apple in its keynote is the Canon RF-S7.8mm F4 STM DUAL lens for its popular APS-C R7 camera. It’s under development specifically for Apple Vision Pro spatial content. The new DUAL lens will be available between September and December this year.

The RF-S7.8mm F4 STM DUAL lens features a field angle that is similar to a person’s field of view and it’s equipped with a high-speed autofocus mechanism.

If you can’t wait for the Apple specific lens then try the one below.

CANON RF-S 3.9mm F3.5 STM DUAL FISHEYE

The lens above is the (soon-to-be-released) RF-S 3.9mm F3.5 STM DUAL FISHEYE. It’s an APS-C (VR) lens and joins the RF 5.2mm F2.8L DUAL FISHEYE in the EOS mirrorless lineup.  

The lens will only be compatible with the EOS R7 camera at launch. The recording angle is 144 degrees, but it results in a natural, forward-facing view of the world. It’s also Canon’s first 3D lens with autofocus, using One Shot AF. This helps a lot. You’ll also find a rear-mounted filter holder for both screw-on and gelatin filters.

The RF-S 3.9mm F3.5 STM DUAL FISHEYE is priced at A$1,999 RRP and should be available late June.

I’ve shot immersive 180-degree videos with Canon’s super-fast dual fisheye lens and, although they were impressive, I really didn’t have anywhere to show them off. Still, I tried my best in the Sky News segment below.

The Canon RF 5.2mm f/2.8L Dual Fisheye lens was the world’s first lens for digital interchangeable lens cameras to enable 180° VR shooting to a single sensor. Using a Canon R5, you could record 8K 30p. Resolution is extremely important when it comes to video on VR/AR headsets. The higher the better.

Apple Vision Pro’s ultra-high-resolution display system uses micro-OLED technology to squeeze 23 million pixels into two displays. Each one is the size of a postage stamp, delivering more pixels than a 4K TV to each eye.
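Those numbers are easy to sanity-check with simple arithmetic. A standard 4K TV panel is 3840 x 2160 pixels, so splitting 23 million pixels across two displays does indeed give each eye more than a 4K TV:

```python
# Sanity-checking Apple's display claim.
total_pixels = 23_000_000           # across both micro-OLED displays
pixels_per_eye = total_pixels // 2  # 11.5 million per display
uhd_4k = 3840 * 2160                # a standard 4K TV panel

print(pixels_per_eye)           # 11500000
print(uhd_4k)                   # 8294400
print(pixels_per_eye > uhd_4k)  # True: more pixels per eye than a 4K TV
```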

That’s where Blackmagic Design’s URSA Cine Immersive camera system comes in, set to reach new standards in immersive video quality on the Apple Vision Pro.

BLACKMAGIC URSA CINE IMMERSIVE

“We are thrilled to announce the first-ever commercial camera system and post-production software that supports Apple Immersive Video, giving professional filmmakers the tools to create remarkable stories with this powerful new format pioneered by Apple,” said Grant Petty, Blackmagic Design CEO.

“Built on the new URSA Cine platform, URSA Cine Immersive features a fixed, custom, stereoscopic 3D lens system with dual 8K image sensors that can capture 16 stops of dynamic range.”

Blackmagic URSA Cine Immersive uses a fixed, custom lens system pre-installed on the body, which is designed specifically for Apple Immersive Video. The dual sensors deliver a jaw-dropping 8160 x 7200 resolution per eye with pixel-level synchronisation.

The custom lens system is designed specifically for URSA Cine’s large format image sensor with extremely accurate positional data that’s read and stored at time of manufacturing. This immersive lens data — which is mapped, calibrated and stored per eye — then travels through post production in the Blackmagic RAW file itself.

Cinematographers can shoot 90fps stereoscopic 3D immersive cinema content to a single file.

The camera comes with 8TB of high performance network storage built in. It records directly to the included Blackmagic Media Module, and can be synced to Blackmagic Cloud and DaVinci Resolve media bins in real time. This gives over 2 hours of Blackmagic RAW in 8K stereoscopic 3D immersive.
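Those figures imply a serious write speed. As a rough back-of-the-envelope estimate (my own arithmetic from the stated 8 TB and “over 2 hours”, not a published spec), the camera sustains on the order of a gigabyte per second:

```python
# Implied data rate: 8 TB of storage for roughly 2 hours of footage.
storage_bytes = 8 * 10**12  # 8 TB (decimal terabytes)
duration_s = 2 * 60 * 60    # 2 hours in seconds

bytes_per_second = storage_bytes / duration_s
gigabits_per_second = bytes_per_second * 8 / 10**9

print(round(bytes_per_second / 10**9, 2))  # 1.11 GB/s
print(round(gigabits_per_second, 1))       # 8.9 Gbit/s
```

A sustained rate near 9 Gbit/s also helps explain why the spec list below includes 10G Ethernet for network connections.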

DaVinci Resolve with Apple Immersive video support for Apple Vision Pro will be released later this year. Blackmagic customers will be able to edit Apple Immersive Video shot on the URSA Cine Immersive camera. A new immersive video viewer will let editors pan, tilt and roll clips for viewing on 2D monitors or on Apple Vision Pro for an even more immersive editing experience.

Editors will also be able to bypass transitions rendered by Apple Vision Pro using FCP XML metadata, giving them clean master files. Export presets will enable quick output into a package which can be viewed directly on Apple Vision Pro.

Blackmagic URSA Cine Immersive Features

  • Dual custom lens system for shooting Apple Immersive Video for Apple Vision Pro.
  • 8K stereoscopic 3D immersive image capture.
  • 8160 x 7200 resolution per eye with pixel level synchronisation.
  • Massive 16 stops of dynamic range.
  • Lightweight, robust camera body with industry standard connections.
  • Generation 5 Colour Science with new film curve.
  • Dual 90 fps capture to a single Blackmagic RAW file.
  • Includes high performance Blackmagic Media Module 8TB for recording.
  • High speed Wi-Fi, 10G Ethernet or mobile data for network connections.
  • Optional Blackmagic URSA Cine EVF.
  • Includes DaVinci Resolve Studio for post production.

DaVinci Resolve Immersive Features

  • Support for monitoring on Apple Vision Pro from the DaVinci Resolve timeline.
  • Ability to edit Blackmagic RAW Immersive video shot on Blackmagic URSA Cine Immersive.
  • Immersive video viewer for pan, tilt and roll.
  • Automatic recognition of Apple Immersive Video (left and right eye) for dual file stereo immersive content.
  • Option to bypass transitions rendered by Apple Vision Pro.
  • Export and deliver native files for viewing on Apple Vision Pro.

Availability and Price

Blackmagic URSA Cine Immersive and the update to DaVinci Resolve will be available later this year.

WWDC 2024: Apple Intelligence

Apple’s Worldwide Developers Conference (WWDC) is underway in California and yes, AI is the feature item. ChatGPT is part of the package, but not all of it. Apple has been using various types of AI for tasks like computational photography for some time, but now the tech giant is all in on generative AI.

AI NOW MEANS

“APPLE INTELLIGENCE”

Apple Intelligence will be available later this year, integrated into iOS 18, iPadOS 18, and macOS Sequoia. Private Cloud Compute adds computational capacity between on-device processing and larger, server-based models that run on dedicated Apple silicon servers.

“Our unique approach combines generative AI with a user’s personal context to deliver truly helpful intelligence,” said Apple CEO Tim Cook.

SIRI

Siri gets a badly needed overhaul thanks to Apple Intelligence. One of the main issues with Siri has been the lack of integration with the iPhone’s features and lack of understanding of context. That’s all changing with AI. We can look forward to more intelligent language capabilities, plus more natural and more effective communication.

If you hesitate or stumble, Siri can follow your conversation while maintaining context from one request to the next. Impressively, users can type to Siri, and switch between text and voice – nice. It can even give tech support by answering questions about how to do something on the iPhone or iPad.

Siri also looks different. It will light up around the display edges when active, giving a clear indication that it is listening.

Siri will be much more useful thanks to AI. If a friend texts a user their new address in Messages, the receiver can say, “Add this address to their contact card.” Siri will work across apps too.

I often run into trouble trying to play a podcast via Siri that I’ve been listening to since 2006. Now I can say things like, “Play that podcast that Mick recommended,” and Siri will locate and play the episode, without the user having to remember whether it was mentioned in a text or an email. Or, “When is Mum’s flight landing?” and Siri will find the flight details and cross-reference them with real-time flight tracking to give an arrival time.

If Siri delivers on all these changes in real life, while I’m driving my car, I’ll be doing cartwheels to celebrate.

CHATGPT

Later this year ChatGPT access will be integrated into iOS 18, iPadOS 18, and macOS Sequoia but Apple’s put a deliberate roadblock in place. Users will be asked before any questions are sent to ChatGPT, along with any documents or photos, and Siri then presents the answer directly.

Privacy protections when using ChatGPT include obscuring IP addresses, and OpenAI won’t store requests. ChatGPT’s data-use policies will apply when users connect their account. Apple users can access it for free without creating an account, and ChatGPT subscribers can connect their accounts and access paid features right from these experiences.

ChatGPT will be accessible from Apple’s Writing Tools ecosystem, and through Compose, users can also access ChatGPT’s image tools.

PRIVACY

Apple has always been big on privacy. Sometimes it’s held the company back while others have made profitable use of our precious personal data. To make AI work on a device, it needs to get to know us – our habits and our preferences.

This requires trust.

On-device processing is a big part of that for Apple, and the iPhone, for example, is a very secure device for running AI. For more complex requests that require more processing power, Private Cloud Compute extends the privacy and security of Apple devices into the cloud to unlock even more intelligence.

With Private Cloud Compute, Apple Intelligence can flex and scale its computational capacity and draw on larger, server-based models for more complex requests. These models run on servers powered by Apple silicon, so only data relevant to the user’s task is processed on Apple servers, and it is never stored or made accessible to Apple.

PHOTOS

You’ll be able to search through your photo library using natural language. Using Apple Intelligence anyone can search for “Dick surfing in a red shirt,” or “Jessie with ink on her face.”

A really handy feature will be searching within videos. You’ll be able to find specific moments without having to scroll through hundreds of minutes of video.

Just as on Android AI devices, iOS users can use the new Clean Up tool to identify and remove distracting objects in the background.

Memories allows users to create a story by just typing a description. Apple Intelligence will choose the photos and videos, create a storyline and arrange them into a movie, with AI suggesting matching songs from Apple Music.

During this process all photos and videos are kept private on device.

WRITING

Users can rewrite, proofread, and summarise the text they write in Mail, Notes, Pages, and third-party apps. AI will allow users to choose from different versions of their written content.

MAIL

In Mail there’s a new Priority Messages section at the top of the inbox. Instead of previewing the first few lines of each email, users can see summaries without needing to open a message. When it comes to long threads, key details can be selected and Smart Reply gives suggestions for a quick response, while AI identifies questions in an email.

NOTIFICATIONS

Priority Notifications appear at the top of the stack to show what’s most important. Summaries help users scan long or stacked notifications, showing key details right on the Lock Screen. To stop notifications from being a distraction, Reduce Interruptions will show only those that might need immediate action, like an urgent medical call.

TRANSCRIPTION

In the Notes and Phone apps, users can now record, transcribe, and summarise audio. When a recording is initiated while on a call, participants are automatically notified, and once the call ends, Apple Intelligence generates a summary to help recall key points.

IMAGE PLAYGROUND

Image Playground allows users to create images in three styles: Animation, Illustration, or Sketch. It’s a dedicated app which also operates within apps like Messages. In Notes, users can access Image Playground through the new Image Wand in the Apple Pencil tool palette. You will also find it in Keynote, Freeform, and Pages plus third-party apps.

GENMOJI

Something I’ve always wanted to do on the fly is create my own emoji. Now we can. Users can create an original ‘Genmoji’ just by typing a description or based on their photos of friends and family. As with emojis, Genmoji can be added inline to messages, or shared as a sticker or reaction in a Tapback.

AVAILABILITY

Apple Intelligence is free for Apple device users, and will be available in beta as part of iOS 18, iPadOS 18, and macOS Sequoia later this year in U.S. English.

Some features, software platforms, and additional languages will arrive next year. Apple Intelligence will be available on iPhone 15 Pro, iPhone 15 Pro Max, and iPad and Mac with M1 and later, with Siri and device language set to U.S. English.
