

iPhone 15 and iPhone 15 Plus Get 48MP, USB-C, Dynamic Island


CAMERA

The iPhone 15 and its bigger sibling have benefitted from several features found in last year’s iPhone 14 Pro models. Although the phones still use two cameras, the main camera is now 48MP, allowing for a 2x Telephoto option. Combined with the second camera, users can shoot as if the phone has three cameras: 0.5x, 1x and 2x, a first for an iPhone dual-camera system.

iPhone 15 main camera is now 48MP.

This really opens up the base models to far better images, with a quad-pixel sensor and 100 per cent Focus Pixels for fast autofocus. The Main camera gives users a new 24MP super-high-resolution default image using the power of computational photography, managing image quality and file size in one swift process.

0.5x, 1x, 2x for the first time on an iPhone dual-camera system.

For the first time, users can take portraits without having to switch to Portrait mode. When there’s a person, dog, or cat in the frame, or when a user taps to focus, iPhone automatically captures depth information, so users can turn photos into portraits later in the Photos app on iPhone, iPad, or Mac. For greater creative control, users can also adjust the focus point after the photo has been taken.

Computational photography is the key to great images on iPhone 15

Night mode is improved with sharper details and more vivid colours. New Smart HDR captures subjects and the background with more true-to-life renderings of skin tones, while ensuring photos have brighter highlights, richer midtones, and deeper shadows when viewed in the Photos app. This advanced HDR rendering is also available to third-party apps.

USB-C

As expected, the iPhone 15 series will ship with USB-C instead of the Lightning port, which has served Apple well for over a decade. Personally, I’m relieved this has finally happened. The Lightning cable is the one cord I always forget when travelling.

USB-C replaces the Lightning Port

The main reason for that forgetfulness has been the success of MagSafe. I have MagSafe chargers everywhere, so the last thing I think of is packing a Lightning cable. USB-C also enables faster speeds when offloading video, something Image Matrix Tech will cover in depth in the iPhone 15 Pro stories.

DISPLAY AND APPEARANCE

Dynamic Island was a clever way to make use of the camera cutout at the top of the display in the iPhone 14 Pro models. Now it’s on the 6.1-inch and 6.7-inch displays of the iPhone 15. The Island expands and adapts so users can see the next direction in Maps; easily control music; and, with third-party app integrations, get real-time updates on food delivery, ride sharing, sports scores, travel plans, and more.

Dynamic Island now on iPhone 15

The Super Retina XDR display’s peak HDR brightness now reaches up to 1600 nits, and in full sun the peak outdoor brightness reaches up to 2000 nits, twice as bright as the previous series.

For the first time in a smartphone, colour is infused throughout the back glass, which comes in pink, yellow, green, blue, and black. The back glass is strengthened with an optimised dual-ion exchange process before being polished with nanocrystalline particles and etched to create a textured matte finish. A new contoured edge on the aerospace-grade aluminium enclosure and the Ceramic Shield front cover give better protection.

Apple’s Kaiann Drance Launches iPhone 15

PERFORMANCE

The A16 Bionic chip has two high-performance cores that use 20 per cent less power and four high-efficiency cores that together deliver speed and extended battery life. The 5-core GPU has 50 per cent more memory bandwidth for smooth graphics when streaming videos and playing games. A new 16-core Neural Engine is capable of nearly 17 trillion operations per second, enabling even faster machine learning computations for features like Live Voicemail transcriptions in iOS 17 and third-party app experiences — all while protecting critical privacy and security features using the Secure Enclave. 

A16 Bionic chip

SAFETY FEATURES

Apple clearly sees its devices as personal safety tools, whether it’s the Apple Watch saving lives by detecting irregular heartbeats or the iPhone with its Emergency SOS via satellite. The good news is the satellite service continues from the iPhone 14 and is currently available in 14 countries, including Australia and NZ. It will come to Spain and Switzerland later this month.

IPHONE SOS VIA SATELLITE SAVES HIKERS

Apple’s satellite service will be expanded to include roadside assistance in the US.

Apple is miles ahead in this area. I’m excited to see that roadside breakdown assistance is also being rolled out in the United States. It would make a huge impact if available in Australia. As we all know, you don’t have to go far on country roads in Australia or New Zealand to lose mobile phone coverage. So let’s make it happen Down Under, Apple!

Both models feature the second-generation Ultra Wideband chip, enabling two iPhone devices with this chip to connect at three times the range of the previous generation. This opens up a new way to use Precision Finding for Find My friends, so iPhone 15 users can share their location and find each other, even in crowds. Precision Finding is built with the same privacy protections that users have come to trust in Find My.

Pricing and Availability

iPhone 15 and iPhone 15 Plus will be available in pink, yellow, green, blue, and black in 128GB, 256GB, and 512GB storage capacities, starting at RRP A$1,499 inc. GST and RRP A$1,649 inc. GST respectively.

iOS 17 will be available as a free software update on Tuesday, 19 September.



How to Use Apple Intelligence Outside the United States


We have an update about the availability of Apple Intelligence for users outside the United States. Or, more accurately, for “non-American English” speakers.

Remember Apple stated at WWDC, “Apple Intelligence is free for users, and will be available in beta as part of iOS 18, iPadOS 18, and macOS Sequoia this spring in U.S. English.”

That’s U.S. ENGLISH if you want Apple Intelligence to work on your device.

Of course, this setting is easily changed, but it wasn’t until today that we received confirmation that Apple Intelligence will work for Australians (or anyone else) selecting U.S. English outside the U.S.A.

Apple could have decided to geo-block access even after the user changed languages, but that concern was dismissed today. Anyone who changes their language to U.S. English in general settings and Siri will be able to join in the Apple Intelligence fun.

As an Australian English user, I showed how easy it is to change language settings to U.S. English during Sunday’s tech segment with Tim Gilbert on Sky News Weekend Edition.

DJURO SEN TALKS APPLE INTELLIGENCE WITH TIM GILBERT ON SKY NEWS WEEKEND EDITION

The above TV segment was broadcast live before Apple confirmed ANYONE with a device switched to U.S. English will be able to access Apple Intelligence.

Some features, software platforms, and additional languages will arrive next year. Apple Intelligence will be available on iPhone 15 Pro, iPhone 15 Pro Max, and iPad and Mac with M1 and later, with Siri and device language set to U.S. English.

Clearly it means giving up some local language support, but if I can get early access to Apple Intelligence, I’m OK with that.

I am desperate to get a smarter version of Siri but we could be waiting until next year to see the full benefits of Apple Intelligence.



WWDC 2024: Canon and Blackmagic Design to Help Push Apple Vision Pro Sales


Content is king and that’s why Apple needs Canon and Blackmagic Design cameras to capture high quality spatial and immersive video to wow potential buyers of Vision Pro headsets. We’ve been able to shoot Vision Pro spatial video for some time on the iPhone but soon professionals and expert amateurs will be able to capture better quality images, specifically for Apple’s premium device.

At WWDC Apple announced that Vision Pro will soon be selling in Australia, China, Hong Kong, Japan, Singapore, Canada, France, Germany, and the U.K. More markets means more content.

At a RRP of A$5,999, the Vision Pro needs to blow you away. Fortunately there will be more ways to gather stunning content thanks to Canon and Blackmagic Design. One for the enthusiasts and the other for pros only.

CANON RF-S7.8mm F4 STM DUAL lens for EOS R7

The lens shown by Apple in its keynote is the Canon RF-S7.8mm F4 STM DUAL lens for its popular APS-C R7 camera. It’s under development specifically for Apple Vision Pro spatial content. The new DUAL lens will be available between September and December this year.

The RF-S7.8mm F4 STM DUAL lens features a field angle that is similar to a person’s field of view and it’s equipped with a high-speed autofocus mechanism.

If you can’t wait for the Apple specific lens then try the one below.

CANON RF-S 3.9mm F3.5 STM DUAL FISHEYE

The lens above is the (soon-to-be-released) RF-S 3.9mm F3.5 STM DUAL FISHEYE. It’s an APS-C (VR) lens and joins the RF 5.2mm F2.8L DUAL FISHEYE in the EOS mirrorless lineup.  

The lens will only be compatible with the EOS R7 camera at launch. The recording angle is 144 degrees, but it results in a natural, forward-facing view of the world. It’s also Canon’s first 3D lens with autofocus, using One Shot AF, which helps a lot. You’ll also find a rear-mounted filter holder for both screw-on and gelatin filters.

The RF-S 3.9mm F3.5 STM DUAL FISHEYE is priced at A$1,999 RRP and should be available late June.

I’ve shot immersive 180-degree videos with Canon’s super-fast dual fisheye lens and, although they were impressive, I really didn’t have anywhere to show them off. I tried my best in the Sky News segment below, though.

The Canon RF 5.2mm f/2.8L Dual Fisheye lens was the world’s first lens for digital interchangeable lens cameras enabling 180° VR shooting to a single sensor. Using a Canon R5 you could record 8K 30p. Resolution is extremely important when it comes to video on VR/AR headsets. The higher the better.

Apple Vision Pro’s ultra-high-resolution display system uses micro-OLED technology to squeeze 23 million pixels into two displays. Each one is the size of a postage stamp, delivering more pixels than a 4K TV to each eye.
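As a quick sanity check of my own on that claim (the 3840 x 2160 panel size is the standard UHD 4K figure, not something Apple quoted):

```python
# Does each Vision Pro eye really get more pixels than a 4K TV?
total_pixels = 23_000_000      # Apple's figure across both displays
per_eye = total_pixels // 2    # 11,500,000 pixels per eye
uhd_4k = 3840 * 2160           # 8,294,400 pixels on a standard 4K TV

print(per_eye > uhd_4k)        # True: roughly 38 per cent more per eye
```

So the marketing line holds up: each postage-stamp display carries well over a 4K TV’s worth of pixels.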

That’s where Blackmagic Design’s URSA Cine Immersive camera system is set to reach new standards in immersive video quality on the Apple Vision Pro.

BLACKMAGIC URSA CINE IMMERSIVE

“We are thrilled to announce the first-ever commercial camera system and post-production software that supports Apple Immersive Video, giving professional filmmakers the tools to create remarkable stories with this powerful new format pioneered by Apple,” said Grant Petty, Blackmagic Design CEO.

“Built on the new URSA Cine platform, URSA Cine Immersive features a fixed, custom, stereoscopic 3D lens system with dual 8K image sensors that can capture 16 stops of dynamic range.”

Blackmagic URSA Cine Immersive uses a fixed, custom lens system pre-installed on the body, which is designed specifically for Apple Immersive Video. The sensor delivers a jaw-dropping 8160 x 7200 resolution per eye with pixel level synchronisation.

The custom lens system is designed specifically for URSA Cine’s large format image sensor with extremely accurate positional data that’s read and stored at time of manufacturing. This immersive lens data — which is mapped, calibrated and stored per eye — then travels through post production in the Blackmagic RAW file itself.

Cinematographers can shoot 90fps stereoscopic 3D immersive cinema content to a single file.

The camera comes with 8TB of high performance network storage built in. It records directly to the included Blackmagic Media Module, and can be synced to Blackmagic Cloud and DaVinci Resolve media bins in real time. This gives over 2 hours of Blackmagic RAW in 8K stereoscopic 3D immersive.
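As a rough back-of-the-envelope calculation of my own (not a Blackmagic figure), 8TB holding about two hours of footage implies a hefty sustained data rate, which helps explain the 10G Ethernet option in the spec list:

```python
# Implied average data rate if 8 TB holds ~2 hours of
# 8K stereoscopic Blackmagic RAW.
capacity_gb = 8 * 1000                 # decimal TB -> GB
seconds = 2 * 3600                     # two hours

rate_gb_per_s = capacity_gb / seconds  # ~1.11 GB/s
rate_gbit_per_s = rate_gb_per_s * 8    # ~8.9 Gbit/s

print(round(rate_gb_per_s, 2), round(rate_gbit_per_s, 1))
```

That works out to roughly 1.1 GB/s, or close to 9 Gbit/s, right at the edge of what a 10G Ethernet link can move in real time.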

DaVinci Resolve with Apple Immersive video support for Apple Vision Pro will be released later this year. Blackmagic customers will be able to edit Apple Immersive Video shot on the URSA Cine Immersive camera. A new immersive video viewer will let editors pan, tilt and roll clips for viewing on 2D monitors or on Apple Vision Pro for an even more immersive editing experience.

Editors will also be able to bypass transitions rendered by Apple Vision Pro using FCP XML metadata, giving them clean master files. Export presets will enable quick output into a package that can be viewed directly on Apple Vision Pro.

Blackmagic URSA Cine Immersive Features

  • Dual custom lens system for shooting Apple Immersive Video for Apple Vision Pro.
  • 8K stereoscopic 3D immersive image capture.
  • 8160 x 7200 resolution per eye with pixel level synchronisation.
  • Massive 16 stops of dynamic range.
  • Lightweight, robust camera body with industry standard connections.
  • Generation 5 Colour Science with new film curve.
  • Dual 90 fps capture to a single Blackmagic RAW file.
  • Includes high performance Blackmagic Media Module 8TB for recording.
  • High speed Wi-Fi, 10G Ethernet or mobile data for network connections.
  • Optional Blackmagic URSA Cine EVF.
  • Includes DaVinci Resolve Studio for post production.

DaVinci Resolve Immersive Features

  • Support for monitoring on Apple Vision Pro from the DaVinci Resolve timeline.
  • Ability to edit Blackmagic RAW Immersive video shot on Blackmagic URSA Cine Immersive.
  • Immersive video viewer for pan, tilt and roll.
  • Automatic recognition of Apple Immersive Video (left and right eye) for dual file stereo immersive content.
  • Option to bypass transitions rendered by Apple Vision Pro.
  • Export and deliver native files for viewing on Apple Vision Pro.

Availability and Price

Blackmagic URSA Cine Immersive and the update to DaVinci Resolve will be available later this year.



WWDC 2024: Apple Intelligence


Apple’s Worldwide Developers Conference (WWDC) is underway in California and yes, AI is the headline feature. ChatGPT is part of the package, but not all of it. Apple has been using various types of AI for tasks like photography for some time, but now the tech giant is all in with generative AI.

AI NOW MEANS

“APPLE INTELLIGENCE”

Apple Intelligence will be available later this year when integrated into iOS 18, iPadOS 18, and macOS Sequoia. Private Cloud Compute adds computational capacity between on-device processing and larger, server-based models that run on dedicated Apple silicon servers.

“Our unique approach combines generative AI with a user’s personal context to deliver truly helpful intelligence,” said Apple CEO Tim Cook.

SIRI

Siri gets a badly needed overhaul thanks to Apple Intelligence. One of the main issues with Siri has been the lack of integration with the iPhone’s features and lack of understanding of context. That’s all changing with AI. We can look forward to more intelligent language capabilities, plus more natural and more effective communication.

If you hesitate or stumble, Siri can follow your conversation while maintaining context from one request to the next. Impressively, users can type to Siri, and switch between text and voice – nice. It can even give tech support by answering questions about how to do something on the iPhone or iPad.

Siri also looks different. It will light up around the display edges when active, giving a clear indication that it is listening.

Siri will be much more useful thanks to AI. If a friend texts a user their new address in Messages, the receiver can say, “Add this address to his contact card.” Siri will work across apps too.

I often run into trouble using Siri to play a podcast I’ve been listening to since 2006. Now I can say things like, “Play that podcast that Mick recommended,” and Siri will locate and play the episode, without me having to remember whether it was mentioned in a text or an email. Or, “When is Mum’s flight landing?” and Siri will find the flight details and cross-reference them with real-time flight tracking to give an arrival time.

If Siri delivers on all these changes in real life, while I’m driving my car, I’ll be doing cartwheels to celebrate.

CHATGPT

Later this year, ChatGPT access will be integrated into iOS 18, iPadOS 18, and macOS Sequoia, but Apple has put a deliberate roadblock in place: users will be asked before any questions are sent to ChatGPT, along with any documents or photos, and Siri then presents the answer directly.

Privacy protections when using ChatGPT include the obscuring of IP addresses and OpenAI won’t store requests. ChatGPT’s data-use policies will apply when users connect their account. Apple users can access it for free without creating an account, and ChatGPT subscribers can connect their accounts and access paid features right from these experiences.

ChatGPT will be accessible from Apple’s Writing Tools ecosystem, and through Compose users can also access ChatGPT’s image tools.

PRIVACY

Apple has always been big on privacy. Sometimes that has held the company back while others have made profitable use of our personal data. To make AI work on a device, it needs to get to know us: our habits and our preferences.

This requires trust.

On-device processing is a big part of that for Apple, and the iPhone, for example, is a very secure device on which to run AI. For more complex requests that need more processing power, Private Cloud Compute extends the privacy and security of Apple devices into the cloud to unlock even more intelligence.

With Private Cloud Compute, Apple Intelligence can flex and scale its computational capacity, drawing on larger, server-based models for more complex requests. These models run on servers powered by Apple silicon, so only data relevant to the user’s task is processed on Apple servers, and it is never stored or made accessible to Apple.

PHOTOS

You’ll be able to search through your photo library using natural language. Using Apple Intelligence anyone can search for “Dick surfing in a red shirt,” or “Jessie with ink on her face.”

A really handy feature will be searching within videos. You’ll be able to find specific moments without having to scroll through hundreds of minutes of video.

Just like Android AI devices, iOS users can use the new Clean Up tool to identify and remove distracting objects in the background.

Memories allows users to create a story by just typing a description. Apple Intelligence will choose the photos and videos, create a storyline and arrange them into a movie. AI will offer song suggestions to match from Apple Music.

During this process all photos and videos are kept private on device.

WRITING

Users can rewrite, proofread, and summarise text they write, including in Mail, Notes, Pages, and third-party apps. AI also lets users choose from different versions of their written content.

MAIL

In Mail, there’s a new Priority Messages section at the top of the inbox. Instead of previewing the first few lines of each email, users can see summaries without needing to open a message. For long threads, key details can be surfaced at a glance, and Smart Reply suggests quick responses after AI identifies questions in an email.

NOTIFICATIONS

Priority Notifications appear at the top of the stack to show what’s most important, and summaries help users scan long or stacked notifications for key details right on the Lock Screen. To stop notifications from becoming a distraction, Reduce Interruptions shows only those that might need immediate action, like an urgent medical call.

TRANSCRIPTION

In the Notes and Phone apps, users can now record, transcribe, and summarise audio. When a recording is initiated while on a call, participants are automatically notified, and once the call ends, Apple Intelligence generates a summary to help recall key points.

IMAGE PLAYGROUND

Image Playground allows users to create images in three styles: Animation, Illustration, or Sketch. It’s a dedicated app which also operates within apps like Messages. In Notes, users can access Image Playground through the new Image Wand in the Apple Pencil tool palette. You will also find it in Keynote, Freeform, and Pages plus third-party apps.

GENMOJI

Something I’ve always wanted to do on the fly is create my own emoji. Now we can. Users can create an original ‘Genmoji’ just by typing a description or based on their photos of friends and family. As with emojis, Genmoji can be added inline to messages, or shared as a sticker or reaction in a Tapback.

AVAILABILITY

Apple Intelligence is free for Apple device users, and will be available in beta as part of iOS 18, iPadOS 18, and macOS Sequoia later this year in U.S. English.

Some features, software platforms, and additional languages will arrive next year. Apple Intelligence will be available on iPhone 15 Pro, iPhone 15 Pro Max, and iPad and Mac with M1 and later, with Siri and device language set to U.S. English.
