The New iPhone 15, Blackmagic Camera App, Camera to Cloud, and Why it All Matters
Alissa Valentina Knight
Writer, Director and Executive Producer | Recovering Editor and Colorist | Business Inquiries: [email protected]
On Friday, September 22nd, the new iPhone 15 became available worldwide with the long-awaited USB-C connector and, for the first time ever in any smartphone, hardware-based ray tracing. Ray tracing is a feature historically found only in higher-end gaming PCs, and something even the highest-end Macs don't offer built in. Games and other apps that support ray tracing can render more realistic shadows, water, lighting, and atmospheric effects (Apple Insider, 2023). The iPhone 15 also brought iPhone filmmakers a 48-megapixel rear camera with pixel binning, giving it far better low-light performance (the ability to shoot in poor lighting conditions without introducing as much noise in the shadows).
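To make the pixel-binning idea concrete, here's a minimal sketch of how a 2x2 bin averages groups of four photosites into one larger effective pixel. This is an illustration only, not Apple's actual imaging pipeline; the sensor dimensions and simple averaging are assumptions for the example.

```python
import numpy as np

def bin_2x2(sensor: np.ndarray) -> np.ndarray:
    """Average each 2x2 block of photosites into a single output pixel.

    A 48 MP readout binned 2x2 yields a 12 MP image whose pixels each
    gather roughly 4x the light, which is why binning helps in low light.
    Illustrative sketch only, not Apple's actual demosaic/binning pipeline.
    """
    h, w = sensor.shape
    assert h % 2 == 0 and w % 2 == 0, "expects even dimensions"
    return sensor.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# Example: an 8000x6000 (48 MP) mock readout becomes 4000x3000 (12 MP).
mock_sensor = np.random.rand(8000, 6000).astype(np.float32)
binned = bin_2x2(mock_sensor)
print(mock_sensor.shape, "->", binned.shape)  # (8000, 6000) -> (4000, 3000)
```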
However, something exciting has also come to the iPhone that few people are talking about: the Blackmagic Camera app, which I'll refer to as BMCA in this article.
But why is the announcement of the BMCA such a big deal? What makes it different from the built-in camera app? Why does it matter that it offers camera-to-cloud capabilities, and what the heck is that anyway?
Over the last year, camera-to-cloud (C2C) has been capturing the attention of filmmakers and manufacturers everywhere, with Frame.io (now Adobe), Atomos, RED, and other companies offering C2C in many of their products. But what is C2C, and why should you care?
"It went up to the Cloud. You can't get it down from the Cloud??Nobody understands the Cloud, it's a mystery” (Sex Tape, 2014)
C2C is the ability of a device to send proxy files, and even original RAW camera files, to the cloud directly from the camera (or at most one hop away, through a device like the Atomos Connect or Shogun Connect) while the Director of Photography is still on set shooting. It allows the post-production team (editors, for example) to immediately see what's being shot on set without having to wait until filming is done. With C2C, editors can contact the camera unit and request reshoots of specific scenes before the crew and talent are "wrapped." More often than not, cast and crew are flown in from other states or even countries for a production, so flying everyone back for a reshoot is cost-prohibitive, which C2C solves. The cost savings created by letting the post-production team review footage inside the NLE while it's still being shot are therefore astronomical.
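To make the workflow concrete, here's a minimal sketch of a C2C-style uploader: it watches a proxy folder and pushes each new clip to the cloud so editors can start cutting while the crew is still shooting. This is not Blackmagic's or Frame.io's actual API; the folder path and the upload_to_cloud function are hypothetical placeholders you'd swap for your cloud provider's SDK.

```python
import time
from pathlib import Path

# Hypothetical paths -- substitute your own camera proxy folder.
PROXY_DIR = Path("/Volumes/CAMERA_CARD/proxies")
POLL_SECONDS = 5

def upload_to_cloud(clip: Path) -> None:
    """Placeholder: a real C2C pipeline would call your cloud provider's SDK
    (Frame.io, S3, Blackmagic Cloud, etc.) here."""
    print(f"Uploading {clip.name} ({clip.stat().st_size / 1e6:.1f} MB)...")

def watch_and_upload() -> None:
    """Poll the proxy folder and upload any clip we haven't seen yet."""
    seen: set[Path] = set()
    while True:
        for clip in sorted(PROXY_DIR.glob("*.mov")):
            if clip not in seen:
                upload_to_cloud(clip)
                seen.add(clip)
        time.sleep(POLL_SECONDS)

if __name__ == "__main__":
    watch_and_upload()
```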
As an aside, NLE stands for non-linear editing, as opposed to the linear editing used in the tape days. In linear editing, clips from different source tapes were copied onto a single master tape in the order they would appear in the final cut. A non-linear editing system (NLE) lets you edit digital video files in any order using computer software, such as DaVinci Resolve.
Blackmagic Camera App
At Blackmagic Design's (BMD) April 2022 event, the company released a line of new cloud storage products, the Blackmagic Cloud Store and Blackmagic Cloud Pod, built around NVMe SSDs, and alongside them launched a new cloud service dubbed Blackmagic Cloud. For the first time in any NLE, users of DaVinci Resolve could store their project files in the cloud, allowing easier collaboration with remote teams. The company's new cloud-native collaboration direction was undoubtedly influenced by the COVID-19 work-from-home reality we now live in. I'd be remiss, though, if I didn't mention that Hollywood operated in distributed work environments well before COVID-19, most commonly in post-production, with editors and visual effects artists located in different parts of the world, not just in "Tinseltown." COVID-19 certainly sped things up for companies supporting work-from-anywhere.
In the first version of Blackmagic Cloud, users could only store their project files in the cloud. The new version now lets users send their proxies, and even their in-camera original RAW files, directly to the cloud as well -- not just their DaVinci Resolve project files. Beyond storing proxies and originals in the cloud, users of the BMCA can also send all of the video they record with their iPhone directly from the camera app to the cloud, making their footage immediately available to editors and colorists anywhere in the world.
Even more exciting is the ability to send the original camera files to the cloud along with the proxies. The iPhone 13 Pro was the first iPhone to support recording in Apple ProRes, giving filmmakers far more gradable footage in post, and the iPhone 15 Pro adds Apple Log recording on top of that. The new BMCA lets the user switch between the different codecs with ease. Blackmagic also has its own RAW format, Blackmagic RAW (BRAW), which I hope will be introduced into newer versions of the app down the road.
While more professional camera apps do exist, such as Filmic Pro, they aren't free. Filmic Pro used to be a one-time purchase, but the company shifted to a subscription model in 2022, with subscriptions starting at $2.99/week or $39.99 for a full year. That makes the BMCA far more appealing: it's free, and it's designed for the cinematographer who is used to dialing in a shutter angle (e.g., 180 degrees) instead of a shutter speed, setting the aperture/iris (e.g., f/1.8) for a shallow depth of field and that creamy bokeh, and working with zebras, camera to cloud, and the other features it offers (Figure 1).
Configurable Options
While the native iPhone camera app is clearly limited in what you can set, the BMCA is packed full of configurable options for the cinematographer. I'll go over only the most important options and what they mean below.
Tapping the SETTINGS icon (Figure 2) brings up the simple, easy-to-use menu interface that has long been a hallmark of Blackmagic Design. While we shoot on RED and ARRI cameras at Knight Studios, I have to say Blackmagic's UX really is unmatched, especially compared with the menu systems of other companies.
So what does this all mean? You really don't need to burn the storage and computational power required to film in ProRes 4444 unless you're using the BMCA as an A-cam or B-cam on a movie. If you're just doing some urban videography, vlogging, or run-and-gun shooting, ProRes 422 should be just fine.
If the explanations for H.265 vs H.264 didn't make much sense, here is a visual representation of the differences (Figure 3).
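To put the codec choice in storage terms, here's a quick back-of-the-envelope calculator. The bitrates below are rough illustrative assumptions (ballpark figures for UHD at 30 fps), not official Apple or Blackmagic specifications; check the app's own storage estimates for your exact settings.

```python
# Rough storage-per-minute comparison for common codec choices.
# Bitrates are illustrative ballpark figures for UHD (3840x2160) at 30 fps,
# NOT official specs -- actual rates vary with frame rate, content, and encoder.
APPROX_BITRATES_MBPS = {
    "ProRes 4444": 1000,   # assumption: roughly 1 Gb/s
    "ProRes 422 HQ": 700,  # assumption
    "ProRes 422": 470,     # assumption
    "H.264": 100,          # assumption: high-bitrate consumer 4K
    "H.265": 50,           # assumption: ~half of H.264 for similar quality
}

def gb_per_minute(bitrate_mbps: float) -> float:
    """Convert a bitrate in megabits per second to gigabytes per minute."""
    return bitrate_mbps / 8 * 60 / 1000

for codec, mbps in APPROX_BITRATES_MBPS.items():
    print(f"{codec:>14}: ~{gb_per_minute(mbps):.1f} GB/min")
```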
Color Space
This deserves an entire article of its own later, but for now I'll quickly break down the differences between Rec.709, Rec.2020, P3 D65, and Apple Log. In post-production color grading, understanding color theory and color spaces is vital.
Figure: the P3 D65 color space compared against Rec.2020 and Rec.709.
For more information on the Apple Log recording available only on the new iPhone 15 Pro, Derek Wise wrote a great blog post at 9to5Mac (Wise, 2023).
In post-production, colorists work within a specific color space, so if you change this from the default of Rec.709, you probably already know which color space you tend to use in your NLE. For example, I use a Color Space Transform node in DaVinci Resolve to convert footage from our RED V-Raptor and ARRI Alexa Mini LF, which both record to their native RAW formats, into my working color space of ACES (ACEScct) or Rec.709. Since the BMCA doesn't support ACES, I have mine set to Rec.709.
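As a rough illustration of what a color space transform does under the hood, here's a sketch that remaps Rec.2020 primaries to Rec.709 by going through CIE XYZ. The matrices are the standard published primaries (D65 white point), but note this only handles the gamut conversion of linear light; it ignores the transfer functions and gamut mapping that a real Color Space Transform node also deals with.

```python
import numpy as np

# Standard RGB-to-XYZ matrices (linear light, D65 white point).
REC2020_TO_XYZ = np.array([
    [0.636958, 0.144617, 0.168881],
    [0.262700, 0.677998, 0.059302],
    [0.000000, 0.028073, 1.060985],
])
REC709_TO_XYZ = np.array([
    [0.412391, 0.357584, 0.180481],
    [0.212639, 0.715169, 0.072192],
    [0.019331, 0.119195, 0.950532],
])

def rec2020_to_rec709(rgb_linear: np.ndarray) -> np.ndarray:
    """Convert linear Rec.2020 RGB to linear Rec.709 RGB via XYZ.

    Values outside [0, 1] after conversion are out of the Rec.709 gamut;
    a real pipeline would gamut-map rather than simply clip them.
    """
    xyz = rgb_linear @ REC2020_TO_XYZ.T
    rec709 = xyz @ np.linalg.inv(REC709_TO_XYZ).T
    return np.clip(rec709, 0.0, 1.0)

# A fully saturated Rec.2020 green lands outside Rec.709 and gets clipped.
print(rec2020_to_rec709(np.array([0.0, 1.0, 0.0])))
```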
Anamorphic Desqueeze
There are two categories of lenses: spherical and anamorphic. Anamorphic lenses squeeze the image horizontally as it's projected onto the sensor, giving you a much wider field of view than is possible with a spherical lens. This creates the more "cinematic" look typically found in Hollywood movies. Moment and other lens manufacturers have begun making anamorphic lenses for phones that mount in front of the phone's lens using proprietary cases. Not all of these lenses are created equal, as anamorphic lenses come with different squeeze factors (1.33x, 1.55x, or even a higher squeeze ratio of 2.0x). If you're using an anamorphic lens, set this de-squeeze factor to view the image properly within the BMCA (Figure 6).
Because the human eye is spherical, spherical lenses attempt to mimic what the human eye sees, whereas anamorphic lenses capture more horizontal field of view than the eye is capable of (Figure 7).
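The math behind the setting is simple multiplication: the recorded width is stretched back out by the squeeze factor. The sketch below shows the resulting delivery aspect ratios for the common squeeze factors; the 4K recording resolution used here is just an illustrative assumption.

```python
def desqueeze(width: int, height: int, squeeze_factor: float) -> tuple[int, int, float]:
    """Return the desqueezed width, height, and resulting aspect ratio."""
    desqueezed_width = round(width * squeeze_factor)
    return desqueezed_width, height, desqueezed_width / height

# Example: a 3840x2160 recording shot through common anamorphic adapters.
for factor in (1.33, 1.55, 2.0):
    w, h, ar = desqueeze(3840, 2160, factor)
    print(f"{factor}x squeeze -> {w}x{h} (aspect ratio {ar:.2f}:1)")
```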
Lock White Balance
I'm honestly so glad Blackmagic Design thought of this. So what happens with auto white balance (WB), and why does it matter? On a set, the Digital Imaging Technician (DIT) is typically responsible for shot-matching multiple cameras (at Knight Studios, we'll typically run three: A-cam: ARRI Alexa Mini LF, B-cam: RED V-Raptor, C-cam: RED Komodo). The DIT ensures the white balance is the same across all cameras so colorists spend less time in post trying to shot-match them. When a camera's white balance is set to auto, you can get a WB color shift in the middle of a take, which is a real pain for colorists to correct in post, so on cinema cameras you typically "set it and forget it" to ensure it doesn't change mid-shot. Auto WB is atypical on cinema cameras, whereas it's the default setting in the iPhone's native camera app.
Upload Clips
This is where the BMCA flexes its muscles. This is your camera-to-cloud setting, where either proxies, or originals AND proxies, are uploaded to Blackmagic Cloud. Be careful with this choice, because cloud storage isn't exactly cheap. If you're fine with using that beautiful, highly anticipated USB-C port on the new iPhone 15 to transfer the original video files off the phone to your editing station over a cable, then do that for the originals and send just your proxies to the cloud. Proxies can be used for editing in DaVinci Resolve without needing the originals; think of a proxy as a low-resolution version of the high-resolution original RAW file. So instead of a 5 GB original, you'd be working with a 50 MB proxy. Don't, of course, color grade with your proxies. They should only be used for editing.
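To get a feel for the difference this makes over a typical location internet connection, here's a quick estimate of upload times for an original versus a proxy. The file sizes come from the example above, and the 20 Mb/s uplink is an illustrative assumption, not a measurement.

```python
def upload_minutes(file_size_gb: float, uplink_mbps: float) -> float:
    """Estimate upload time in minutes for a file over a given uplink."""
    size_megabits = file_size_gb * 1000 * 8
    return size_megabits / uplink_mbps / 60

UPLINK_MBPS = 20  # assumed on-location uplink speed

for label, size_gb in [("5 GB original", 5.0), ("50 MB proxy", 0.05)]:
    minutes = upload_minutes(size_gb, UPLINK_MBPS)
    print(f"{label}: ~{minutes:.1f} min at {UPLINK_MBPS} Mb/s")
```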
As I mentioned, the costs of the different storage options in Blackmagic Cloud can climb quickly.
While some of the larger color houses might use the 100 TB or even 1 PB plan, as a mobile filmmaker I can't imagine you'd need anything more than the 500 GB or 2 TB tier for proxies generated on your iPhone.
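For a rough sense of how far those tiers go, the sketch below estimates hours of proxy footage per tier. The proxy bitrate is an assumption for a modest H.264/H.265 proxy, not a figure from Blackmagic; your actual proxies will vary with resolution and codec.

```python
PROXY_BITRATE_MBPS = 10  # assumed proxy bitrate; actual proxies vary

def hours_of_proxies(storage_tb: float, bitrate_mbps: float = PROXY_BITRATE_MBPS) -> float:
    """Estimate how many hours of proxy footage fit in a storage tier."""
    storage_megabits = storage_tb * 1000 * 1000 * 8
    return storage_megabits / bitrate_mbps / 3600

for tier_tb in (0.5, 2.0):
    print(f"{tier_tb} TB: ~{hours_of_proxies(tier_tb):.0f} hours of proxies")
```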
LUTs
Lookup tables (LUTs), such as show LUTs, are used by cinematographers in-camera to give the Director a sense on set of what the final image might look like once a colorist applies that same LUT in post. The idea is that the flat, log footage a cinematographer sends to Rec.709 client monitors looks washed out; applying a display LUT brings it closer to what the final cut will look like. The LUTs setting within the BMCA lets you load different LUTs, for example a Kodak 2383 film print emulation LUT for a more filmic look. I should note these are display LUTs used for viewing only; the LUT is not burned into the recorded video file.
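For the curious, here's a bare-bones sketch of what "applying a LUT" means in code: each input color is looked up in a 3D table and replaced with the table's output color. This uses nearest-neighbour lookup on a tiny made-up 2x2x2 cube for brevity; real LUTs (a 33- or 65-point .cube file, for instance) are interpolated trilinearly or tetrahedrally.

```python
import numpy as np

# A made-up 2x2x2 "LUT" indexed as lut[r, g, b] -> output RGB.
# Real display LUTs are 17/33/65 points per axis and loaded from .cube files.
LUT_SIZE = 2
lut = np.array([
    [[[0.05, 0.05, 0.10], [0.10, 0.10, 0.90]],
     [[0.10, 0.85, 0.15], [0.20, 0.90, 0.95]]],
    [[[0.90, 0.15, 0.10], [0.95, 0.20, 0.90]],
     [[0.95, 0.90, 0.20], [0.98, 0.98, 0.98]]],
])

def apply_lut_nearest(rgb: np.ndarray) -> np.ndarray:
    """Map input RGB (0..1) through the LUT using nearest-neighbour lookup."""
    idx = np.clip(np.round(rgb * (LUT_SIZE - 1)).astype(int), 0, LUT_SIZE - 1)
    return lut[idx[..., 0], idx[..., 1], idx[..., 2]]

# A flat, log-like near-white gets pushed toward the LUT's graded values.
print(apply_lut_nearest(np.array([0.9, 0.9, 0.9])))  # -> [0.98 0.98 0.98]
```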
Summary
While YouTubers are publishing unboxing videos of the new iPhone 15, discussing "fingerprint gate" on the phone's new titanium sides, and doing drop tests, there's so much more that was brought to market (pun intended) with the new iPhone 15. The USB-C port, in lieu of the Lightning port, brings the ability to record your RAW video files directly to an external SSD connected to the phone. I remain excited to see what's to come from Blackmagic Design in future versions of the app. While many BMD cultists are saddened that the new Blackmagic cinema camera didn't sport the box design of ARRI and RED cameras, the BMCA opens up a whole new market for Blackmagic, and while I'm not a mobile-phone cinematographer, Mel and I have decided to switch to the iPhone 15 + the BMCA for filming our on-set behind-the-scenes (BTS) footage, where we used to use the Sony A7S III.
So for now, I'm excited about what's possible with the first release of the BMCA and where it goes from here. The BMCA, coupled with what Apple is doing to democratize filmmaking for filmmakers and creators who can't yet afford a RED or ARRI cinema camera, is something we should all be watching closely.
Knight Studios Insider
Want early access to Knight Studios shows, prize giveaways, guest appearances on future shows, behind-the-scenes interviews, and access to our filmmakers community? Register for free for the new Knight Studios Insider program at https://www.knightstudios.co/sign-up
Bibliography
About Apple ProRes. (2023, August 22). Apple Support. https://support.apple.com/en-ca/102207
Monette, M. (2022). H.264 vs. H.265: Video codecs compared. Epiphan Video. https://www.epiphan.com/blog/h264-vs-h265/
Hosalikar, S. (2023). Difference between H.264 and H.265. Gumlet. https://www.gumlet.com/learn/h264-vs-h265/
Understanding color spaces. (n.d.). UnRavel. https://www.unravel.com.au/understanding-color-spaces
Qazi, W. (2021). Resolve color management | DaVinci Resolve 17 tutorial. Qazi & Co Color Grading Studio. https://waqasqazi.com/blog/resolve-color-management-davinci-resolve-17-tutorial
Wise, D. (2023, September 15). iPhone 15's new LOG recording and why it matters [Video]. 9to5Mac. https://9to5mac.com/2023/09/15/iphone-15s-new-log-recording-and-why-it-matters-video/
C, B. (2023). Anamorphic lens v spherical lens: What's the difference? SIRUI Official Store. https://store.sirui.com/en-ca/blogs/guides/anamorphic-lens-v-spherical-lens-what-s-the-difference