F1 Broadcast Technology Embraces AI in 2024

The most advanced sport in the world is turning to Artificial Intelligence to solve a problem broadcasters have struggled with for decades. Formula One is using AI to create 3x slow-motion replays from standard-speed cameras. Dedicated slow-motion cameras are costly and add yet more sources to an already crowded vision-switching board.

Frame interpolation adds extra frames to give the impression of a higher frame rate recording, producing smooth playback and avoiding choppy replays or video stutter. The latest AI systems add frames far more convincingly than previous methods, giving F1 viewers smooth action replays from any track camera. You can see how this type of AI slow motion works in this Samsung S24 Ultra story; although impressive, it's underpowered compared to the F1 cameras.
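At its simplest, frame interpolation means synthesising new frames between the ones the camera actually captured. The sketch below uses a naive linear blend between two frames to illustrate the idea; it is not F1's system, and real AI interpolators estimate per-pixel motion rather than blending, which is what avoids the ghosting this simple approach produces.

```python
import numpy as np

def interpolate_frames(frame_a, frame_b, n_new):
    """Naive linear-blend interpolation: generate n_new frames
    between frame_a and frame_b by weighted averaging.
    Illustrative only - motion-compensated/AI interpolators
    predict where pixels move instead of averaging them."""
    frames = []
    for i in range(1, n_new + 1):
        t = i / (n_new + 1)                  # blend weight, 0 < t < 1
        mid = (1 - t) * frame_a + t * frame_b
        frames.append(mid.astype(frame_a.dtype))
    return frames

# Two tiny 2x2 greyscale "frames". Inserting two synthetic frames
# between each real pair triples the frame count - roughly what a
# 3x slow-motion replay from a standard-speed camera requires.
a = np.zeros((2, 2), dtype=np.float64)
b = np.full((2, 2), 90.0)
mids = interpolate_frames(a, b, 2)
print([m[0, 0] for m in mids])  # [30.0, 60.0]
```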

AI is also used on some of the graphics, like car tracking and virtual graphics signage, for up to eight cameras. AI is used to position signage within the frame so it looks real when cars drive in front of it. There’s no doubt AI will make its way into more of the F1 broadcast in the future but these are the main implementations of Artificial Intelligence right now.

A small group of journalists (including Image Matrix Tech) was hosted by Lenovo at the Australian Grand Prix in Melbourne last week. The company not only sponsors F1, it supplies a vast number of displays, computers and expertise to the race broadcast and technical setup team.

Lenovo consumer and business laptops, desktops, monitors, tablets and smartphones are used by F1's 600-plus staff.

As you can see in the Weekend Edition video above, Lenovo’s High-Performance Computing (HPC) and server solutions are critical to F1’s success. Devices collect on-premise data, improve data storage at the races and help create engaging content.

It’s important to note that the F1 broadcast is split between the track setup and broadcast home base in the UK. The ETC (track-side Event Technical Centre) resembles a mobile NASA mission control. Aside from sending all camera feeds to home base, the ETC sends one 4K HDR mixed feed back to the UK where graphics and replays etc are added to the broadcast for the global feed. This local mix is vision switched by the director. F1 is way too fast for a director to call out camera selections to a dedicated vision mixer.

This is where low latency communication is important. The team in the UK needs to make sure it’s not making a mess of the original track feed. This is not easy. Once you start overlaying the live track mix with replays, graphics, team audio and more, you run the risk of breaking the flow of the telecast. But as you can see from Sunday’s race and every race before it, the F1 broadcast team has its act together.

28 UHD track cameras are used at most events, with an additional 4-6 'special' unmanned cameras mounted on bridges, in kerbs, on Armco and on bollards. In addition, there is a gyro-stabilised heli-cam, one cable camera, five RF cameras in the pit lane, three pit wall cameras, two roaming cameras and two podium cameras.

More than 93 onboard cameras are deployed across the cars at every race, with up to nine per car: forward and rear views from the roll hoop, a face camera, a helmet camera, a pedal camera, and an additional camera on either the side of the chassis or the nose. Every car also carries a non-live 360-degree camera for extra content, collecting 480GB of footage from the race and at least 72GB from qualifying, with extra coverage of drivers of special interest during the P1, P2 and P3 practice sessions.

“Around 500 terabytes of data is transferred per event weekend”

Around 500 terabytes of data are transferred per event weekend, with bandwidth peaking at around 7.5 Gbps at the start of an event. This is done via fibre optic cable with a 600-millisecond round trip.
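A quick back-of-envelope check puts those two figures in perspective: even at the quoted 7.5 Gbps peak, moving 500 TB takes roughly six days of continuous transfer, which suggests the link runs hard well beyond the live sessions themselves. The arithmetic below assumes decimal units (1 TB = 10^12 bytes, 1 Gbps = 10^9 bits/s).

```python
# How long would 500 TB take at the quoted 7.5 Gbps peak rate?
total_bits = 500e12 * 8          # 500 terabytes expressed in bits
peak_bps = 7.5e9                 # 7.5 gigabits per second
seconds = total_bits / peak_bps
hours = seconds / 3600
days = hours / 24
print(f"{hours:.0f} hours (~{days:.1f} days)")  # 148 hours (~6.2 days)
```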

The brand-new state of the art production and technology facility in Kent handles most of the processing and publishing systems. This includes remote colour correction of all broadcast cameras. The camera operators at the track control focus, position and zoom. All the other settings, including white balance and exposure are controlled in the UK.

Helicopter vision is handled the same way: remote white balance, remote iris and remote colour grade.

Lenovo’s virtualisation platform provides 1.16 THz of CPU across 448 CPU cores, 3.5TB of RAM and 480TB of storage. It deploys over 200 network switches carrying 350 VLANs over a 200 Gbps backbone.

147 bespoke microphones are deployed around the track and are fitted on cars across F1, F2, F3, F1 Academy and Porsche, and also to the FIA cars, supplying high quality stereo audio to the production.

The ETC is the largest, most complex transportable facility of its type in the world. When assembled it covers an area of 25 m x 15 m and houses 750 pieces of equipment running over 40 bespoke software systems. The F1 crew starts assembling the facility 10 days out from the event. If there’s an interruption to the link back to the UK, the local ETC is capable of producing the world feed broadcast on its own.

F1 produces over 470 hours of live TV over the season and over 205 hours of post-produced content, including global news packages, promos, documentaries and features for TV broadcast.

Image Matrix Tech also asked about drones (of course we did) and F1 is trying to work them into the broadcast, but it’s not easy. Some drone work has been done, but the speed of the cars, broadcast payload, crowd safety and flight time are all problems that need to be solved before drones come close to replacing helicopters.

As you can see by the Red Bull video above, the potential is great, but there’s a long way to go for reliable drone coverage in F1.


WWDC 2024: Apple Intelligence


Apple’s Worldwide Developers Conference (WWDC) is underway in California and yes, AI is the feature item. ChatGPT is part of the package but not all of it. Apple has been using forms of AI to power tasks like photography for some time, but now the tech giant is all in with generative AI.

AI NOW MEANS

“APPLE INTELLIGENCE”

Apple Intelligence will be available later this year when integrated into iOS 18, iPadOS 18, and macOS Sequoia. Private Cloud Compute adds computational capacity between on-device processing and larger, server-based models that run on dedicated Apple silicon servers.

“Our unique approach combines generative AI with a user’s personal context to deliver truly helpful intelligence,” said Apple CEO Tim Cook.

SIRI

Siri gets a badly needed overhaul thanks to Apple Intelligence. One of the main issues with Siri has been the lack of integration with the iPhone’s features and lack of understanding of context. That’s all changing with AI. We can look forward to more intelligent language capabilities, plus more natural and more effective communication.

If you hesitate or stumble, Siri can follow your conversation while maintaining context from one request to the next. Impressively, users can type to Siri, and switch between text and voice – nice. It can even give tech support by answering questions about how to do something on the iPhone or iPad.

Siri also looks different. It will light up around the display edges when active, giving a clear indication that it is listening.

Siri will be much more useful thanks to AI. If a friend texts a user their new address in Messages, the receiver can say, “Add this address to their contact card.” Siri will work across apps too.

I often run into trouble trying to play, via Siri, a podcast I’ve been listening to since 2006. Now I can say things like, “Play that podcast that Mick recommended,” and Siri will locate and play the episode without me having to remember whether it was mentioned in a text or an email. Or, “When is Mum’s flight landing?” and Siri will find the flight details and cross-reference them with real-time flight tracking to give an arrival time.

If Siri delivers on all these changes in real life, while I’m driving my car, I’ll be doing cartwheels to celebrate.

CHATGPT

Later this year ChatGPT access will be integrated into iOS 18, iPadOS 18, and macOS Sequoia but Apple’s put a deliberate roadblock in place. Users will be asked before any questions are sent to ChatGPT, along with any documents or photos, and Siri then presents the answer directly.

Privacy protections when using ChatGPT include the obscuring of IP addresses and OpenAI won’t store requests. ChatGPT’s data-use policies will apply when users connect their account. Apple users can access it for free without creating an account, and ChatGPT subscribers can connect their accounts and access paid features right from these experiences.

ChatGPT will be accessible from Apple’s Writing Tools ecosystem, and through Compose users can also access ChatGPT image tools.

PRIVACY

Apple has always been big on privacy. Sometimes that has held the company back while others have made profitable use of our personal data. To make AI work on a device, it needs to get to know us: our habits and our preferences.

This requires trust.

On-device processing is a big part of that for Apple; the iPhone, for example, is a very secure device on which to run AI. For more complex requests that require more processing power, Private Cloud Compute extends the privacy and security of Apple devices into the cloud to unlock even more intelligence.

With Private Cloud Compute, Apple Intelligence can flex and scale its computational capacity and draw on larger, server-based models for more complex requests. These models run on servers powered by Apple silicon, so that only data relevant to the user’s task is processed on Apple servers, and it is never stored or made accessible to Apple.

PHOTOS

You’ll be able to search through your photo library using natural language. Using Apple Intelligence anyone can search for “Dick surfing in a red shirt,” or “Jessie with ink on her face.”

A really handy feature will be searching within videos. You’ll be able to find specific moments without having to scroll through hundreds of minutes of video.

Just like Android AI devices, iOS users can use the new Clean Up tool to identify and remove distracting objects in the background.

Memories allows users to create a story by just typing a description. Apple Intelligence will choose the photos and videos, create a storyline and arrange them into a movie. AI will offer song suggestions to match from Apple Music.

During this process all photos and videos are kept private on device.

WRITING

Users can rewrite, proofread, and summarise text they write, including in Mail, Notes, Pages, and third-party apps. AI will let users choose from different versions of their written content.

MAIL

In Mail there’s a new Priority Messages section at the top of the inbox. Instead of previewing the first few lines of each email, users can see summaries without needing to open a message. When it comes to long threads, key details can be selected and Smart Reply gives suggestions for a quick response, while AI identifies questions in an email.

NOTIFICATIONS

Priority Notifications appear at the top of the stack to show what’s most important. Summaries help users scan long or stacked notifications to show key details right on the Lock Screen. To stop notifications from being a distraction – Reduce Interruptions – will only show notifications that might need immediate action, like an urgent medical call.

TRANSCRIPTION

In the Notes and Phone apps, users can now record, transcribe, and summarise audio. When a recording is initiated while on a call, participants are automatically notified, and once the call ends, Apple Intelligence generates a summary to help recall key points.

IMAGE PLAYGROUND

Image Playground allows users to create images in three styles: Animation, Illustration, or Sketch. It’s a dedicated app which also operates within apps like Messages. In Notes, users can access Image Playground through the new Image Wand in the Apple Pencil tool palette. You will also find it in Keynote, Freeform, and Pages plus third-party apps.

GENMOJI

Something I’ve always wanted to do on the fly is create my own emoji. Now we can. Users can create an original ‘Genmoji’ just by typing a description or based on their photos of friends and family. As with emojis, Genmoji can be added inline to messages, or shared as a sticker or reaction in a Tapback.

AVAILABILITY

Apple Intelligence is free for Apple device users, and will be available in beta as part of iOS 18, iPadOS 18, and macOS Sequoia later this year in U.S. English.

Some features, software platforms, and additional languages will arrive next year. Apple Intelligence will be available on iPhone 15 Pro, iPhone 15 Pro Max, and iPad and Mac with M1 and later, with Siri and device language set to U.S. English.


What You Need to Know Ahead of Windows Recall Release


Microsoft’s next-generation AI-powered Copilot+ PCs will soon be available, but one controversial Windows feature has already undergone a drastic overhaul. Recall is – as Microsoft says – ‘a new way to instantly find something you’ve previously seen on your PC’ using on-device AI.

Every few seconds Recall takes a snapshot of what appears on your screen. These images are analysed locally by AI so you are not sending anything to the cloud. Recall can then offer you a timeline of your computer use through a visual interface. It really is like having a ‘photographic memory’ of all the apps, websites, images and documents that you’ve interacted with on your PC.
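The snapshot-and-search loop described above can be imagined as a timeline of (timestamp, extracted text) entries built up locally. The toy sketch below illustrates that shape only: the class name is hypothetical, text extraction is faked with a caller-supplied string where Recall would use on-device OCR/AI, and the real store is encrypted rather than a plain Python list.

```python
import time

class SnapshotTimeline:
    """Toy illustration of a Recall-style local index: each snapshot
    is reduced to extracted text and stored against a timestamp.
    Not Microsoft's implementation - the real store is encrypted
    and the text comes from on-device AI analysis of the screen."""
    def __init__(self):
        self._entries = []   # list of (timestamp, extracted_text)

    def capture(self, extracted_text, timestamp=None):
        ts = timestamp if timestamp is not None else time.time()
        self._entries.append((ts, extracted_text.lower()))
        return ts

    def search(self, query):
        """Return timestamps of snapshots whose text mentions the query."""
        q = query.lower()
        return [ts for ts, text in self._entries if q in text]

timeline = SnapshotTimeline()
timeline.capture("Quarterly report - revenue chart.xlsx", timestamp=100)
timeline.capture("Flight booking MEL-LHR confirmation", timestamp=160)
print(timeline.search("flight"))  # [160]
```

Searching returns the moments in the timeline to jump back to, which is the ‘photographic memory’ effect the feature promises.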

Obviously this would be a treasure trove for cyber criminals if they could access it. Security experts are worried it could be abused once a user logs into their device. Once the drive is decrypted, the history recorded by Recall is potentially accessible by a bad actor.

As I mentioned in the Sky News Weekend Edition segment above, ethical hacker and Offensive Cybersecurity Advocate Alexander Hagenah created TotalRecall to encourage Microsoft to make changes. And thankfully, Microsoft did.

“Even before making Recall available to customers, we have heard a clear signal that we can make it easier for people to choose to enable Recall on their Copilot+ PC and improve privacy and security safeguards,” said Pavan Davuluri, Corporate Vice President, Windows + Devices.

“With that in mind we are announcing updates that will go into effect before Recall (preview) ships to customers on June 18.”

So, not only has Microsoft made Recall opt-in only, users will also need to take more security steps to activate and actively use the feature once logged into Windows 11.


RECALL SECURITY CHANGES

  1. If you don’t proactively choose to turn it on, it will be off by default.
  2. Windows Hello enrolment is required to enable Recall. In addition, proof of presence is also required to view your timeline and search in Recall.
  3. Additional layers of data protection, including “just in time” decryption protected by Windows Hello Enhanced Sign-in Security (ESS), so Recall snapshots will only be decrypted and accessible when the user authenticates. Microsoft has also encrypted the search index database.

MORE SECURITY NOTES ON RECALL

  • Snapshots are stored locally. Copilot+ PCs have new NPUs (Neural Processing Unit) and this allows on device AI processing. No internet or cloud connections are used to store and process snapshots. Microsoft says your snapshots are yours and they are not used to train the AI on Copilot+ PCs.
  • Snapshots are not shared. Recall does not send your snapshots to Microsoft. Snapshots are not shared with any other companies or applications. Recall doesn’t share snapshots with other users who are signed into the same device, and per-user encryption ensures even administrators cannot view other users’ snapshots.
  • You will know when Recall is saving snapshots. You’ll see Recall pinned to the taskbar when you reach your desktop. You’ll have a Recall snapshot icon on the system tray letting you know when Windows is saving snapshots.
  • Digital rights managed or InPrivate browsing snapshots are not saved. Recall does not save snapshots of digital rights managed content or InPrivate browsing in supported web browsers.
  • You can pause, filter and delete what’s saved at any time. You can disable saving snapshots, pause them temporarily, filter applications and websites from being in snapshots, and delete your snapshots at any time.
  • Enterprise and customer choice. For customers using managed work devices, your IT administrator is provided the control to disable the ability to save snapshots. Your IT administrator CANNOT enable saving snapshots on your behalf.

Microsoft also posted on its blog:

“In our early internal testing, we have seen different people use Recall in the way that works best for them. Some love the way it makes remembering what they’ve seen across the web so much easier to find than reviewing their browser history. Others like the way it allows them to better review an online course or find a PowerPoint. And people are taking advantage of the controls to exclude apps they don’t want captured in snapshots, from communication apps or Teams calls, or to delete some or all their snapshots. This is why we built Recall with fine-grained controls to allow each person to customise the experience to their comfort level, ensuring your information is protected and that you are in control of when, what and how it is captured.”

I’m attending the Australian launch of Copilot+ PCs at Microsoft, so if there are any more changes, I’ll report back to you.


Exclusive Preview: Samsung Lights Up Vivid with AI Inspired Display


Samsung is back at Vivid Sydney, Australia’s annual light show spectacular, with an AI inspired display that will give Instagrammers plenty of sweet content to share online. Image Matrix Tech had an exclusive preview tour last night and this is what you can expect.

Lots of Light Globes

Located in the prime position of First Fleet Park, The Rocks – the Samsung ‘Chorus of Light’ is made up of 20,000 RGB light globes hanging from 1,600 strings (cables). It’s open from 6pm tonight, every night until June 15, 2024.

AI Inspired

‘Chorus of Light’ is inspired by Galaxy AI. Samsung is using the event to promote the AI features introduced with the S24 range earlier this year. The display will use Galaxy AI to help with photography – or, as Samsung calls it, NIGHTOGRAPHY.

Interactive

As you walk among the hanging lights you’ll be directed to a couple of photo ‘platforms’ where a very polite guide will take a photo for you with an S24 Ultra. They can share it with you immediately using Quick Share. This works by generating a QR Code. It’s very fast and works with iPhone too. Given the team knows how the lighting works, it’s probably better they take the shot.
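A QR-based photo handoff like the one described above can be modelled as generating a short-lived, unguessable link that the receiving phone scans and fetches over ordinary HTTPS, which is why it works across iPhone and Android alike. The sketch below shows only that idea; the URL shape and token scheme are assumptions for illustration, not Samsung’s actual Quick Share protocol, and rendering the link as an actual QR image would use a separate library.

```python
import secrets

def make_share_link(base_url="https://share.example.com"):
    """Generate a one-time download link of the kind a QR-based
    photo handoff could encode. Hypothetical URL shape and token
    scheme - not Samsung's actual Quick Share protocol."""
    token = secrets.token_urlsafe(16)   # cryptographically unguessable
    return f"{base_url}/d/{token}"

link = make_share_link()
print(link)  # e.g. https://share.example.com/d/3kX9...
```

Because each token is random and single-purpose, possession of the scanned link is the only credential the receiving phone needs.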

The other part of the interactive journey pays homage to Vivid Sydney 2024’s theme of ‘Humanity’. Visitors are asked to record a personal message of hope for the future in one of the 16 languages available with Galaxy AI. Each attendee’s message will then be live translated and expressed as a unique light display. The visuals are made possible by renowned international media artist Susan Kosti.

The recorded messages will later feature in an original track produced by multi-genre Australian musician and beat maker, Ta-ku. The track will be crafted entirely from vocal samples recorded within the Chorus of Light installation and launched on Samsung’s YouTube on 12 June for the world to hear.

Great View

The other section is a massive viewing platform that allows visitors to look down on the whole display. There are Galaxy phones and staff to help you capture the scene if you want the best result. The platform holds about 60 people, while the number moving through the hanging lights is tightly controlled.

Wait – There’s More!

At the end of the tour, visitors can scan a QR code for a chance to win a travel prize. The prize pack includes a Galaxy S24 Ultra, Galaxy Buds2 Pro, Galaxy Watch6, and a $5K travel voucher.

Opening Times

The display is open to visitors from 6pm each night at First Fleet Park in The Rocks for the duration of Vivid Sydney. Vivid Sydney runs from Friday 24 May to Saturday 15 June 2024. Plan your Vivid Sydney trip at vividsydney.com.
