* Cleaning up markdown, getting it to conform to markdownlint.
* Adding a lint config line. I'm not going with all of them.
* More MD cleanup.
* Removing warning; we think the HDR section is pretty good.
* Updated the link.

Signed-off-by: Sam.Richards@taurich.org <Sam.Richards@taurich.org>
Please reach out to us, particularly if you are having a problem: it's likely somebody else shares that problem, and if we have not documented it, the documentation may need to be updated.
ColorPreservation.md
# RGB to YCrCb Conversion <a name="yuv"></a>
We would like ffmpeg to do as little as possible in terms of color space conversion, i.e. what comes in, goes out. The problem is that most codecs prefer to convert from RGB to YUV (technically YCrCb). Do be aware that a number of codecs do support native RGB encoding (including h264, hevc, vp9 and av1), but RGB encoding is not typically supported in web browsers.
The main problem is that ffmpeg by default assumes that any unknown still image format has a color space of [rec601](https://en.wikipedia.org/wiki/Rec._601), which is very unlikely to be the color space your source media was generated in. So unless you tell it otherwise, it will attempt to convert from that colorspace, producing a color shift.
For examples comparing these see: [here](https://academysoftwarefoundation.github.io/EncodingGuidelines/tests/chip-chart-yuvconvert/compare.html)
## colormatrix filter
```console
-vf "colormatrix=bt470bg:bt709"
```
This is the most basic colorspace filtering. bt470bg is essentially part of the bt601 spec. See: [https://www.ffmpeg.org/ffmpeg-filters.html#colormatrix](https://www.ffmpeg.org/ffmpeg-filters.html#colormatrix)
* only supports 8bpc (8-bit per component) pixel formats
* It's slower than the alternatives.
## colorspace filter
```console
-vf "colorspace=bt709:iall=bt601-6-625:fast=1"
```
The colorspace filter is a better quality filter; it uses SIMD so it is faster too, and it can also support 10-bit. `-vf "colorspace=bt709:iall=bt601-6-625:fast=1"` encodes for the output being bt709, rather than the default bt601 matrix. `iall=bt601-6-625` says to treat all of the input properties (colorspace, primaries and transfer function) as having the bt601-6-625 label. `fast=1` skips gamma/primary correction, which is faster but not mathematically correct. See: [https://ffmpeg.org/ffmpeg-filters.html#colorspace](https://ffmpeg.org/ffmpeg-filters.html#colorspace)
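As a sketch of where this filter sits in a complete invocation (the filenames are hypothetical, and the output tagging flags are a common companion so downstream tools know the result is bt709, an assumption here rather than something this page prescribes):

```python
# Hypothetical input/output names; the -vf value is the filter shown above.
cmd = [
    "ffmpeg", "-i", "input_%04d.png",
    "-vf", "colorspace=bt709:iall=bt601-6-625:fast=1",
    # Tag the output stream so players know what they are receiving.
    "-colorspace", "bt709", "-color_primaries", "bt709", "-color_trc", "bt709",
    "output.mp4",
]
print(" ".join(cmd))
```

Where ffmpeg and media are available, the list can be executed with `subprocess.run(cmd, check=True)`.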
## scale filter

The scale filter uses the libswscale library. It is similar to colorspace, but with image resizing and level handling built in. See: [https://www.ffmpeg.org/ffmpeg-filters.html#scale-1](https://www.ffmpeg.org/ffmpeg-filters.html#scale-1)
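As a sketch (assuming the same bt601-to-bt709 case as the examples above; filenames are hypothetical), the matrix conversion can be expressed through scale's `in_color_matrix`/`out_color_matrix` options:

```python
# scale performs the YCbCr matrix conversion here, no resize needed.
vf = "scale=in_color_matrix=bt601:out_color_matrix=bt709"
cmd = ["ffmpeg", "-i", "input_%04d.png", "-vf", vf, "output.mp4"]
print(" ".join(cmd))
```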
Note: there are a lot of other flags often used with the swscale filter (such as `-sws_flags spline+full_chroma_int+accurate_rnd`), but they have minimal impact on the RGB to YCrCb conversion if you are not resizing the image. For more details on this see the [SWS Flags](EncodeSwsScale.html) section.
There are three approaches for what to use for the timecode:
* [Convert the start frame number](#start-frame-as-timecode) to the related timecode.
* Use the timecode from the [original plate](#start-frame-as-original-plate-timecode)
* A "fixed" timecode for all deliverables
## Start Frame as Timecode
It's extremely common to use a start frame of 1001 for a shot at the beginning of production, rather than frame 0. The three big reasons for this are:
Converting the frame number to timecode can be done using OTIO:
```python
import opentimelineio as otio

start_frame = 1001
frame_rate = 24.0
# Convert the absolute frame number into a timecode string.
print(otio.opentime.RationalTime(start_frame, frame_rate).to_timecode())
```
## Start Frame as Original Plate Timecode
This has a similar benefit in terms of conform: you can add or remove frames and the conform will do the right thing. However, it requires *a lot* more tracking, since if frames are trimmed off the beginning you will need to calculate the new timecode. Equally problematic is having multiple plates, since you would need to track which clip is the baseline for the timecode, and make sure any deliveries for the shot use that timecode appropriately.
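The recalculation after a head trim can be sketched in plain Python (non-drop-frame timecode is assumed, and the plate start and trim counts are illustrative):

```python
def frames_to_timecode(frame: int, fps: int = 24) -> str:
    # Non-drop-frame: split an absolute frame count into HH:MM:SS:FF.
    ff = frame % fps
    total_seconds = frame // fps
    return "{:02d}:{:02d}:{:02d}:{:02d}".format(
        total_seconds // 3600, (total_seconds // 60) % 60, total_seconds % 60, ff)

# A plate starting at 10:00:00:00 (24 fps), trimmed by 8 frames at the head:
plate_start = 10 * 3600 * 24            # plate start expressed in frames
print(frames_to_timecode(plate_start + 8))
```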
## Reel Name
While tracking the timecode for dailies may be too complex, it can be extremely useful when making proxies for source camera files. But the timecode alone is not enough; you also need the reel name, which is typically closely mapped to the filename of the original camera files.
For a QuickTime, the reel name can be defined with the `-metadata:s:v:0` flag:
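A sketch of such an invocation, assembled in Python (the filenames and reel name value are hypothetical; reel names are typically derived from the original camera filename):

```python
# Hypothetical names; -metadata:s:v:0 targets the first video stream.
src, dst, reel = "A001C002_230101_R1AB.mov", "A001C002_proxy.mov", "A001C002"
cmd = [
    "ffmpeg", "-i", src,
    "-metadata:s:v:0", "reel_name=" + reel,
    "-c", "copy", dst,
]
print(" ".join(cmd))
```

Where ffmpeg is installed, the list can be executed with `subprocess.run(cmd, check=True)`.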
However, metadata for reel-name is not consistently supported across applications such as Resolve, AVID MC, and Premiere.
To get Resolve to import the reel name, you need to change how the reel name is defined, which is set under the project settings (see below). Note that this can be done after the media has been added to the media pool.
For Media Composer, you will find much more flexibility wrapping the MXF file in an AAF (see below).
For examples of the conform workflow, see: [VFX Subclipping relink](https://www.youtube.com/watch?app=desktop&v=gbReqyofLLE).
## AVID Media Composer Workflows
Deciding whether to create Op1a vs. OpAtom does depend on which version of Media Composer you are using. Newer ones tend to prefer OpAtom, but you should check with your editor.
Part of the decision is whether you want a single file to also contain the audio, and whether you want to additionally use AAF files (see below).
If an AVID imports a media file with no timecode, it will default to 01:00:00:00.
For this reason it can be desirable to use one of the above approaches, but do work with editorial to confirm what they would like.
[OpAtom](EncodeDNXHD.html#op-atom-mxf) files do not get directly imported into the AVID; instead you copy them directly into the `/Users/Shared/AvidMediaComposer/Avid MediaFiles/MXF/{NUMBER}` folder (e.g. `/Users/Shared/AvidMediaComposer/Avid MediaFiles/MXF/2`) on macOS, or `C:\Avid MediaFiles\MXF\{NUMBER}` on Windows. You can make a higher-numbered folder, but Media Composer will also scan existing folders. Media Composer scans for new files and creates (or updates) a msmMMOB.mdb file, which is a database of the MOB IDs of the files. This database file can then be dragged into an Avid bin to import the new files.
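The copy step can be sketched as a small helper (the folder layout is the one described above; the function name and defaults are assumptions for illustration, not an Avid API):

```python
import shutil
from pathlib import Path

def deliver_opatom(mxf_files, mediafiles_root, folder_number=2):
    """Copy OpAtom MXF files into <mediafiles_root>/MXF/<folder_number>.

    Media Composer scans that folder itself and creates or updates the
    msmMMOB.mdb database; this helper only copies the media into place.
    """
    dest = Path(mediafiles_root) / "MXF" / str(folder_number)
    dest.mkdir(parents=True, exist_ok=True)
    for f in mxf_files:
        shutil.copy2(f, dest)
    return dest
```

For example, `deliver_opatom(["shot.mxf"], "/Users/Shared/AvidMediaComposer/Avid MediaFiles")` on macOS.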
Ideally with AAF files, you would be importing MXF files (like the example above) to minimize the import time to the AVID (so it doesn't require any media transcoding).
A simple example of this is to convert all your clips to raw DNxHD files, e.g.:
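A minimal sketch of such a conversion, assembled in Python (the scaling, frame rate, and bitrate flags are illustrative assumptions, not this guide's vetted settings; 1080p24 at 36 Mb/s in yuv422p is one valid DNxHD operating point, and `-f rawvideo` writes the bare DNxHD bitstream):

```python
# Hypothetical filenames.
src, dst = "shot_v001.mov", "shot_v001.dnxhd"
cmd = [
    "ffmpeg", "-i", src,
    "-vf", "scale=1920:1080,fps=24,format=yuv422p",
    "-c:v", "dnxhd", "-b:v", "36M",   # illustrative DNxHD operating point
    "-f", "rawvideo", dst,
]
print(" ".join(cmd))
```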
and then to wrap these resulting files in an AAF with:
```python
import aaf2
import os, sys

# ...

for filename in sys.argv[1:]:
    # ...
    # mob.import_audio_essence("sample.wav", edit_rate)  # Modify if you have audio too.
```
In this simplistic example, I'm overwriting the Shot and Scene metadata columns, which should then show up in the bin when the resulting AAF files are dragged into a bin. For a more complex version of this see: [aaf_embed_media_tool](https://github.com/markreidvfx/pyaaf2/blob/main/examples/aaf_embed_media_tool.py).