Head to Head: Apple Final Cut Pro vs Adobe Premiere Pro

There are some debates that stand the test of time. Chocolate or vanilla? (Both). Crunchy peanut butter or smooth? (Smooth). Nikon or Canon? (Pentax). But among video editors, especially the ones on YouTube, one scuffle comes up more than any other: Apple Final Cut Pro or Adobe Premiere Pro?

They both have ‘Pro’ in the name, so according to Apple nomenclature rules they should both be excellent. But for all the head-to-head editing shootouts and ‘why I switched’ anecdotes from disgruntled Adobe and/or Apple users, what matters in the end is raw performance.



The tests

How quickly you can edit a video from start to finish in either Premiere Pro or Final Cut is largely a matter of personal preference and familiarity with each application’s quirks. Pure performance, on the other hand, is measurable. So we took an 8K project filmed on the Sony a1, compiled it into two identical 4K timelines with identical effects, scoured the settings to ensure everything was as similar as reasonably possible, and then ran both of these video editors through the same battery of tests.

Note: preview codec, target bitrates, and other settings in Adobe Premiere Pro were based on analyzing the Final Cut Pro files.

Apple Final Cut Pro                      | Adobe Premiere Pro
Render All – 4K ProRes 4:2:2             | Render In to Out – 4K ProRes 4:2:2
Export Master File                       | Export Using Sequence Settings
Export H.264 – Better Quality            | Export H.264 – Target Bitrate 51Mbps
Export HEVC – 8-bit                      | Export H.265 – Target Bitrate 15Mbps
Automatic Stabilization – 15 Second Clip | Warp Stabilize – 15 Second Clip

If you’re curious, here’s the full video.

Coming up with tests that were close to identical was tricky because Final Cut Pro gives you less control over how and what you can render and export unless you also buy Apple’s Compressor software. For example, the difference between H.264 ‘Faster Encode’ and H.264 ‘Better Quality’ isn’t explained anywhere in Apple’s documentation. It makes only a slight difference in total bitrate, and may be similar to Premiere Pro’s choice of CBR vs VBR 1-pass vs VBR 2-pass encoding, but we have no way of knowing for sure.

We took 8K footage from a Sony a1, compiled it into two identical timelines with identical effects, scoured the settings to ensure everything was identical, and ran both of these video editors through the same battery of tests.

Similarly, previews for this piece were set by default to 4K ProRes 4:2:2 in Final Cut’s Project Settings, with no option to change the resolution of your previews without changing the resolution of the entire project/timeline or going through the additional step of generating proxy media.

To keep things as even as possible, all Final Cut Pro exports were done at ‘Better Quality’ and all Premiere Pro exports were configured to match the bitrate of the Final Cut File using VBR 1-pass encoding. Previews were rendered with identical settings in both programs, and ‘Use Previews’ was checked when exporting the master file (i.e. Match Sequence Settings in Premiere), since Final Cut will use the rendered previews by default.
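If you want to replicate this kind of settings-matching yourself, one option is to inspect each export and compare codec, resolution, and bitrate directly. The sketch below is a minimal example of that idea, not part of the workflow used for these tests; it assumes ffprobe is installed, and the file names are placeholders.

```python
# Minimal sketch: compare codec, resolution and bitrate of two exports
# using ffprobe (assumes ffprobe is installed and on the PATH).
# File names are placeholders, not the files used in these tests.
import json
import subprocess

def video_stream_info(path):
    """Return codec, resolution and bitrate of the first video stream."""
    cmd = [
        "ffprobe", "-v", "error",
        "-select_streams", "v:0",
        "-show_entries", "stream=codec_name,width,height,bit_rate",
        "-of", "json",
        path,
    ]
    out = subprocess.run(cmd, capture_output=True, text=True, check=True)
    stream = json.loads(out.stdout)["streams"][0]
    return {
        "codec": stream.get("codec_name"),
        "resolution": f"{stream.get('width')}x{stream.get('height')}",
        # bit_rate may be absent for some codecs/containers, hence the default
        "bitrate_mbps": int(stream.get("bit_rate", 0) or 0) / 1_000_000,
    }

if __name__ == "__main__":
    for f in ["fcp_export.mov", "premiere_export.mp4"]:  # placeholder paths
        print(f, video_stream_info(f))
```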

The computers

All 5 tests were run on 3 different computers: a 13-inch Intel MacBook Pro, a 24-inch M1 iMac, and a Razer Blade 15 Advanced. Specs were essentially maxed out on all three machines (see below), and each computer was fully charged and/or plugged in, with no other programs running in the background to take up memory, CPU, or GPU resources.

Test machine specs:

            | 13″ MacBook Pro                  | 24″ iMac                 | Razer Blade 15 Advanced
CPU         | 10th-Gen Intel Core i7-1068NG7   | Apple Silicon M1         | 10th-Gen Intel Core i7-10875H
Cores       | 4 cores/8 threads                | 8 cores                  | 8 cores/16 threads
Clock Speed | 2.3GHz Base, 4.1GHz Boost        | 3.2GHz Max               | 2.3GHz Base, 5.1GHz Boost
GPU         | Intel Iris Plus with 1536MB VRAM | 8-core Apple Silicon GPU | NVIDIA RTX 3080 with 16GB VRAM
RAM/Memory  | 32GB 3733MHz LPDDR4X             | 16GB unified memory      | 32GB Dual-Channel 2933MHz DDR4
Storage     | 4TB integrated SSD               | 512GB integrated SSD     | 1TB M.2 NVMe SSD

Obviously we couldn’t run the Final Cut tests on the Razer laptop, but we felt it was important to include a high-powered Windows machine with an NVIDIA GPU in order to demonstrate the benefits of CUDA hardware acceleration in Premiere Pro. In fact, it’s the RTX 3080 laptop GPU inside the Razer Blade that really turned this head-to-head into a fair fight. When set to ‘Software Only’ encoding, you can expect these same exports and renders to take a brutal 3x to 5x longer.

It was important to include a high-powered Windows machine with an NVIDIA GPU in order to demonstrate the benefits of CUDA hardware acceleration

Unfortunately, we didn’t have an AMD laptop on hand to see how a Ryzen CPU or Radeon GPU would have fared compared to the Intel, Apple Silicon, and NVIDIA hardware tested here, but stay tuned because we have more head-to-head comparisons and computer reviews planned for the coming months.

The results

You can see the full results of our testing in the graphs below. Each time is the average of at least three consecutive runs of each render, export, or stabilization task, with outliers thrown out if the system happened to glitch. Obviously, in this context, shorter bars mean better performance.
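As a rough illustration of that averaging approach (we're not publishing our exact script here, so the outlier threshold below is an assumption), dropping a glitched run before averaging might look something like this:

```python
from statistics import median, mean

def average_runtime(times_sec, tolerance=0.25):
    """Average a list of run times (seconds), dropping obvious outliers.

    A run is treated as an outlier if it deviates from the median by more
    than `tolerance` (25% here, an arbitrary illustrative threshold).
    """
    med = median(times_sec)
    kept = [t for t in times_sec if abs(t - med) / med <= tolerance]
    return mean(kept)

# Example: three runs of an export, one of which glitched and ran long
print(average_runtime([415, 419, 600]))  # the 600s run is discarded -> 417.0
```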

The first chart shows Final Cut Pro performance, comparing the MacBook Pro against the iMac:

The second compares Premiere Pro performance across all three machines. Note that the iMac was tested using the Arm-optimized Beta version of Premiere Pro:

The third and final graph shows Premiere Pro vs Final Cut Pro on the same scale, using the Razer as a high water mark for Premiere performance on Windows:

For those who prefer numbers, the table below shows all the benchmarks we ran, with the fastest time for each task marked with an asterisk. You may spot a pattern here.

            | Apple Final Cut Pro | Adobe Premiere Pro
            | MacBook  | iMac     | MacBook  | iMac    | Razer Blade
Render All  | 09:57    | 05:12*   | 25:53    | 07:40   | 08:50
Master File | 02:07    | 01:24    | 00:37    | 00:16*  | 00:41
H.264       | 06:55    | 04:19*   | 26:12    | 07:28   | 08:12
H.265       | 02:59    | 01:55*   | 25:09    | 07:16   | 08:06
Stabilize   | 00:55    | 00:25*   | 02:36    | 02:06   | 03:13

The takeaways

You can, of course, draw your own conclusions, but we noticed three major takeaways from these numbers.

1. Nothing beats a well-optimized app

We all hate on Apple’s walled garden from time to time, but such tight integration of hardware and software comes with perks. Not only does Final Cut Pro on the M1 iMac sweep all but one category; just compare the Final Cut results from the relatively meager 13-inch MacBook Pro against the Premiere Pro results from the beefy Razer Blade 15. Even without a discrete GPU, and with four fewer CPU cores, the MacBook Pro running Final Cut still outperformed the Razer running Premiere in several benchmarks.

The MacBook Pro/Final Cut combo was able to export an H.264 file 1 minute and 17 seconds faster than the Razer in Premiere, while the H.265/HEVC export ran a full 5 minutes and 7 seconds faster. The Razer was still able to render previews and produce a master file more quickly, but it’s not the massive performance gain you would expect when going from a 4-core CPU and integrated graphics to an 8-core CPU and an RTX 3080.

Even without a discrete GPU and with four fewer CPU cores, the MacBook Pro/Final Cut still outperformed the Razer/Premiere in several benchmarks.
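If you want to verify those time differences against the results table yourself, the arithmetic is just a conversion from mm:ss to seconds; here is a tiny sketch (the helper names are ours, purely for illustration):

```python
def mmss_to_seconds(t):
    """Convert an 'mm:ss' string from the results table into seconds."""
    minutes, seconds = t.split(":")
    return int(minutes) * 60 + int(seconds)

def gap(a, b):
    """Return the difference between two 'mm:ss' times as 'Xm YYs'."""
    diff = mmss_to_seconds(b) - mmss_to_seconds(a)
    return f"{diff // 60}m {diff % 60:02d}s"

# H.264: MacBook Pro in Final Cut vs Razer Blade in Premiere
print(gap("06:55", "08:12"))  # 1m 17s
# H.265/HEVC: same pairing
print(gap("02:59", "08:06"))  # 5m 07s
```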

Word to the wise: if you’re using a lower-end Intel-based Mac to do your video editing, and especially if you’re using high-resolution source footage, use Final Cut Pro. It will be roughly 3x to 4x faster than Premiere in nearly every category. The difference isn’t quite as drastic once you upgrade to Apple Silicon, but even there, you’re still looking at a significant bump in performance over Premiere.

2. If you are using Premiere Pro on a Windows machine, you will benefit hugely from a discrete GPU

Our Razer Blade 15 Advanced comes with the latest and greatest NVIDIA RTX 3080 laptop GPU complete with 16GB of dedicated VRAM. That will cost you a pretty penny, but even if you can’t afford the newest machine with the latest specs, picking up a laptop with a discrete GPU makes a big difference to both render and export times thanks to CUDA hardware acceleration.

It’s one of the main reasons the Intel MacBook Pro fares so badly in Premiere Pro, and we wouldn’t expect an equivalent PC with Intel integrated graphics to do any better.

3. When using the Arm-optimized Beta version of Premiere Pro, the M1 iMac was surprisingly fast

Here we see, once again, that Apple has something very special on its hands with the M1 chip. Unfortunately, the Intel version of Premiere Pro (running via Rosetta 2 emulation) was a mess on our M1 iMac: springing memory leaks, crashing, and causing all sorts of headaches. Before you know it, the app has eaten up 90+GB of system memory and you have to force quit before the operating system crashes.

Fortunately, the current M1-optimized Beta is surprisingly stable and much faster. So much faster that it allowed the iMac to outperform the much more expensive Razer laptop in every single test. This bodes very well for future Apple Silicon devices already churning around in the rumor mill, as well as Arm-based Windows laptops.

Apple Final Cut Pro X

Pros:
  • Faster than Premiere Pro in most editing and exporting tasks
  • Well optimized for lower spec machines
  • Previews can render in the background while you keep editing
  • Available as one-time purchase

Cons:
  • Not compatible with Windows
  • Minimal control over preview and export files
  • Exports proprietary XML file that cannot be used in Premiere Pro
  • Library, Project, and Event system can be confusing to newcomers

Adobe Premiere Pro

Pros:
  • Granular control over previews, export files, and more
  • Compatible with Mac and Windows
  • Seamless integration with the rest of Adobe’s Creative Cloud library
  • Support for significant GPU acceleration

Cons:
  • Slower than Final Cut when using equivalent hardware
  • Resource intensive, crashes frequently
  • Poorly optimized for lower spec machines
  • Can’t render and edit at the same time
  • Subscription model is a drag

Raw performance is never the whole story, as I’m sure several people are busy writing in the comments section right now (hi guys!). Which app you use has just as much to do with the amount of control you demand, the color grading tools you prefer, and which corporation’s ethos you would rather subsidize.

In all things Apple, you give up control in exchange for stability, speed, and a seamless experience across macOS and iOS devices. In all things Adobe, you give up a little sanity and a monthly offering of cash or credit in exchange for the features, tools, and granular controls that many working pros demand.

Consider your own needs (and hardware) and choose wisely… or just say ‘screw it’ and download a copy of DaVinci Resolve.

New Pen not yet on the drawing board, says OM System


OM System’s Director of Product Planning, Hiroki Koyama, and VP for Brand Strategy and Product Planning, Kazuhiro Togashi, at CP+ 2025

Photo: Dale Baskin

“We are considering the new Pen concept as OM System brand,” says OM System’s Kazuhiro Togashi, VP for Brand Strategy and Product Planning.

We spoke at the CP+ trade show in Yokohama, Japan, and he reassured us that the arrival of the OM-3 with a Pen-F style ‘creative dial’ on the front doesn’t close the door on the rangefinder-style series.

“There’s a different concept between OM-3 and Pen-F series,” he explains: “basically the Pen-F series is about ultimate beauty and the ultimate craftsmanship. Whereas OM-3’s core concept is to take authentic and great creative photos.”

But, he says, it’s too soon to know what a future Pen might look like. “We think the camera’s design must realize the concept of the product, so we don’t start to decide the camera design before deciding the camera’s concept: the product concept must come first.”

“Therefore, we haven’t yet decided if the product design for a new Pen will look like the Pen-F or similar to the E-P7 because we haven’t decided on the product concept.”

But what’s clear is that OM System does plan to continue the Pen line.

The continued appeal of dedicated cameras

We asked Togashi what he thought makes shooting with a dedicated camera special at a time when smartphone image quality has become so good.

“Experience is very important,” he says: “There’s a different kind of experience between smartphones and a camera. For example, I personally love to use a smartphone, but just to record; without any emotional feeling.”

“When a user decides the moment with their camera, maybe their feelings are being moved by such an attempt: they’re not just recording, there’s more to it.”

“It’s like with professional sportsmen. They have to prepare to give their best performance during the game. They are always training before the game.”

“When you get a perfect photo, you feel a win”

“In the case of photos, photo enthusiasts always think or calculate before taking a photograph. Before you take something, you consider the place, or you think about which position is better, or what sort of atmosphere or angle: you calculate before you take the photo.”

“This is like a serious game, just as it is for a football or baseball player. And when you get a perfect photo, you feel a win. ‘I win, by myself.’ I don’t know if many people can get a similar experience by taking photos with a smartphone.”

And he thinks this difference should remain, even as the image quality gap narrows. “Smartphones’ development speed is very high, and in the future, the difference between smartphones and cameras might become very small,” he says: “however, the difference in experience is a bit bigger.”

We’re not the company to make an enthusiast compact

Despite this, and in spite of rising sales of compacts, Togashi says we shouldn’t expect an enthusiast compact.

“As for the current popularity of compact digital cameras, lower-priced models seem to be selling very well worldwide, but we feel that this is a temporary trend,” he says: “We are continuing to study the development of a successor to the TG series, but currently we don’t have any plans to introduce other compact camera concepts.”

“We don’t have any plans to introduce other compact camera concepts”

“As for high-end compact digital cameras, we recognize that there is a dedicated user base that remains a valued segment of the market, however, at OM System, we are focused on developing products that align with the evolving needs of photographers, ensuring we deliver the best possible innovation and performance across our lineup.”

“When we were Olympus, our brand was known for high-end compact cameras like the XZ series and Stylus 1. However, since becoming OM System, we no longer carry high-end compact cameras. Instead, we focus on cameras that align with broader market needs, including those of younger generation photographers. Given the significant investment required – not only in research and development but also in reestablishing a high-end compact brand image – such a product would be challenging to make profitable.”

The TG series endures…

OM System TG-7
The TG series of rugged, waterproof cameras continues to have an audience, the company says.

Image: OM System

But the TG series definitely has a future, says Togashi, because it has a dedicated user base.

“TG still survives and is well received by the market,” he says: “Outdoor enthusiasts want to capture their activities and adventures. Also families look for ways to preserve special memories—whether it’s their children playing in the pool or on vacations and situations like that.”

“On the other hand, professional scuba divers or climbers continue to rely on the TG series. For them, safety is very important during these extreme activities, and the TG series remains a trusted tool in these challenging environments.”

“Both types of users continue to use the TG series, setting it apart from other high-end compact cameras. Their main priorities are mobility and ease of operation, rather than smartphone connectivity. They love the operation and mobility.”

…but a high-end TG would be challenging

These specific requirements might rule out a higher-end TG, he suggests.

“We’re always talking about the successor of the TG series and whether to add a new, higher TG line, maybe using a bigger sensor, or perhaps a TG-DSLR.”

“We’re always thinking about the possibilities. However, as of today we don’t have any best answer to realize this concept because the requirement for TG series is very hard. For example, making a large or removable lens drop resistant is very difficult.”

Also, he says, keeping the size down is important: “if we adopted a bigger sensor and we maintain the same optical zoom range, the body would need to be very big. That means such a TG would lose the mobility concept.”

Togashi didn’t seem enthused by our suggestion of a prime lens: “A lot of TG users’ photographic needs are different from enthusiasts’, so they like to use a zoom lens. They like to enlarge subjects in their photos, therefore they always use tele-zoom.”

Director of Product Planning Hiroki Koyama raises another concern: “We also give priority to close-up capability. TG can be used very close to the subject. If we choose a bigger sensor size, the close-up capability will be reduced. The current sensor size is the best balance, but we’ll try to study the concept.”

The OM System lens range

On the subject of lenses, we asked whether they believe the current Micro Four Thirds lens lineup includes all the options an OM-3 user might want.

“Still not yet,” says Togashi: “We are also trying to develop small and light and bright lenses or something like that. We have space to make new lenses in the future. I can’t disclose [the details], but yes.”

Choosing the right lenses to add isn’t always easy, he suggests: “People always ask ‘will you make a pancake lens?’,” he says: “but then the pancake lens sales are not so good in general. But still, we’ll continue to consider it.”


Interview conducted by Dale Baskin and Richard Butler, answers edited for flow.



Tips for taking epic shots of tonight’s ‘blood moon’ total lunar eclipse


A lunar eclipse, captured by Jamie Malcolm-Brown in November 2021. Used with permission.

Editor’s note: This article was originally published in 2022. We have updated it with information about the current eclipse as a service to readers.


Starting tonight, March 13, and running into the early hours of tomorrow, March 14, skywatchers in the Americas will be able to view the first total lunar eclipse of the year. The moon will turn a ‘blood red’ hue for a brief period as it passes entirely into Earth’s shadow, which happens when the sun, Earth and moon line up. Depending on where you are located, there is a specific time you can witness this phenomenon.

Time and Date, a top-ranking site for times and time zones, has created a useful tool that allows you to make a plan by entering your viewing location. From there, it gives you pertinent information, including the total duration, what time each phase of the eclipse starts, the direction the moon will travel, and its altitude during these phases. A helpful animation gives you a visual of how it will appear, minute by minute, once it starts.

Details of the March 13, 2025 total lunar eclipse
Time and Date created a free tool to help you plan your total lunar eclipse viewing, depending on your location. This is the data for Seattle, WA, where DPReview’s headquarters is located.

If you plan on bringing your camera out for the ‘blood moon’, photographer Jamie Malcolm-Brown has some helpful tips for camera settings. Describing his process for capturing a lunar eclipse in 2021, he tells DPReview that ‘it was taken with [a] 200-600mm lens at 600mm, ISO 800, F6.3, at 1/3 sec. I bracketed the shots at 5 shots with an EV (exposure value) change of 1. Next time I would probably bracket 5 shots but with only an EV change of .3. The final image was cropped fairly significantly to fill the frame with the moon.’
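To make the bracketing arithmetic concrete, here is a small sketch (ours, not Malcolm-Brown’s) of how a five-shot bracket around a 1/3 sec base exposure works out with 1 EV steps versus 0.3 EV steps, assuming aperture and ISO stay fixed so each +1 EV doubles the shutter time:

```python
def bracket(base_shutter_sec, ev_step, shots=5):
    """Shutter times for a symmetric exposure bracket around a base exposure.

    With aperture and ISO fixed, each +1 EV doubles the shutter time and each
    -1 EV halves it, so a shot offset by n steps uses base * 2 ** (n * ev_step).
    """
    half = shots // 2
    return [base_shutter_sec * 2 ** (n * ev_step) for n in range(-half, half + 1)]

base = 1 / 3  # the 1/3 sec base exposure mentioned above

# Five shots at 1 EV spacing (what was used) vs 0.3 EV (what he'd try next time)
for step in (1.0, 0.3):
    times = ", ".join(f"{t:.2f}s" for t in bracket(base, step))
    print(f"EV step {step}: {times}")
# EV step 1.0: 0.08s, 0.17s, 0.33s, 0.67s, 1.33s
# EV step 0.3: 0.22s, 0.27s, 0.33s, 0.41s, 0.51s
```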

While useful for capturing more detail on the moon’s surface, you don’t necessarily need a long lens that extends to 600mm to photograph the blood moon. John Weatherby released a quick, helpful tutorial on Instagram outlining his process for getting the best images possible. For one, you can shoot at a focal length between 100mm and 200mm if you want to include a foreground.

Weatherby also explains that having a sturdy tripod and ball head is an absolute necessity. Ensuring that the lens is locked in securely will yield clearer images of the moon. Using the camera’s shutter delay or self-timer, or an external remote, will also help prevent blurry shots as the camera is likely to shake a bit once you press the shutter. PhotoPills, an app that helps you identify where the moon will travel in accordance with your specific location, is recommended as well.

It’s important to check the weather in your area as cloud coverage can potentially conceal the moon completely. Windy.com is a free app available on desktop, iOS and Android that, in my opinion, does a decent job of forecasting weather patterns. It’ll give you a visual of where clouds will appear at specific dates and times so you can determine the best place to set up in your state or country.

Windy.com, a free app, is an effective tool for forecasting weather elements, including cloud coverage.

The next total lunar eclipse will take place on September 7 and will be visible in parts of Asia, Africa and Australia. If skywatching interests you, and you’re in or near one of the locations where the eclipse is visible and the weather permits, I recommend getting out for a few hours and witnessing this wonderful event first-hand.





Fast and fun: Photographer captures the thrill of Formula 1 with Lego


Photo: Benedek Lampert

This weekend marks the start of the 2025 Formula 1 season, and one photographer is kicking things off with a series of photographs to celebrate. With a fine focus on detail and many hours of work, toy photographer Benedek Lampert has recreated F1 moments using Lego. This project is just the latest for Lampert, who has previously created life-like scenes of Lego versions of the Eiffel Tower and Shackleton’s Endurance.

In September 2024, Lego and F1 announced a partnership that included releasing numerous F1 Lego sets, some of which featured more realistic-looking models of F1 team cars. Lampert managed to get his hands on the entire starting grid and set to work on creating highly detailed, life-like photographs of the Lego F1 cars.

Sample gallery
Photos: Benedek Lampert

As with all of Lampert’s work, nearly everything was done in camera. “It’s extremely important to me that these are actual photos and not AI-generated graphics,” he explained. That meant lots of hands-on time to build sets and problem-solve special effects. He built the track scenery and crafted unique sets that allowed him to get motion blur, spinning wheels, smoke and water vapor without any editing work. Lampert explained that the only things he added while editing were the cloud texture in the sky and the rear lights in one image.

All said and done, Lampert says the project took 70 hours for the 10 final images. The photo shoot portion of the project took five days, with ten- to twelve-hour days at times. You can see how he meticulously created each image in the behind-the-scenes video below, as well as the photos in the gallery above.


