
Saturday, August 1, 2020

keyframes

Links: keyframe :: (wikipedia) a keyframe is an "I" frame -- an intraframe. types of frames :: (wikipedia) I, P, & B frames.

These blog posts are application oriented. Sometimes, however, a little theory helps the application. Simply stated, keyframes are a type of i-frame manually added by a user editing a video, increasing the total i-frame count of the video.

i-frames

Video cameras record a chronological procession of pictures, frame by frame. They do so with 3 types of frames: i, p, and b. The i-frames are complete pictures, like a photo, of what is being recorded. As the camera records, it takes an i-frame, then several p- or b-frames, then another i-frame, and so on. The p- and b-frames refer to an associated i-frame to complete the picture during playback. The Wikipedia graphic below shows the sequence. The i-, p-, b-frame schema was created to save memory space.

Most video devices insert i-frames about every 8 seconds -- every 240 frames or so at the 30fps I mostly shoot -- when recording video. Newer codecs, H264 comes to mind, set these intervals dynamically: shorter intervals when action increases, and longer intervals when action decreases.

keyframes

When software users edit video effects, say a dissolve transition, their editing software adds an i-frame to the beginning and end of the effect. These post-recording, user-added i-frames are in addition to the i-frames embedded by the camera during recording. Only these post-recording, user-inserted i-frames are properly termed "keyframes".

Further nomenclature confusion can arise when software uses proprietary terms for keyframe or i-frame intervals. For example, ffmpeg calls i-frame intervals GOPs ("Groups of Pictures"), without regard to whether the interval boundaries are keyframes or camera-inserted i-frames.

raw video analysis

When I import uncut clips, I first detect and save all the i-frame time stamps to a file I can refer to while editing. If it's a simple edit without transitions, and all my cuts fall on i-frames, I might not need to add keyframes and re-encode the video. How do I get a list of i-frame time stamps? Suppose I have a clip, "foo.mp4".

$ ffprobe -loglevel error -skip_frame nokey -select_streams v:0 -show_entries frame=pkt_pts_time -of csv=print_section=0 foo.mp4 >iframes_foo.txt 2>&1
The output will be a list of timestamps, which we can easily see by catting the file.
$ cat iframes_foo.txt
0.000000
4.771433
10.410400
18.752067
27.093733
...

To determine the exact frame number (always an integer) of an i-frame, multiply its time stamp by the recording FPS. I can determine the FPS with a basic $ ffprobe foo.mp4; in this example, it revealed 29.97 FPS. So...

29.97 x 4.771 = 142.99 or frame 143.
29.97 x 10.4104 = 311.99 or frame 312.
29.97 x 18.7521 = 561.99 or frame 562.

...and so on. We could write a short program to calculate this or export it into Excel/Gnumeric/Sheets. But this is for only a single clip, and of course I want this information for each of my clips.
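Here is a minimal sketch of such a program -- a Bash loop over every MP4 in the directory, reusing the ffprobe flags from above, with awk doing the multiplication. The file naming and rounding are just assumptions to illustrate the idea.

#!/bin/bash
# sketch: for each clip, save i-frame timestamps plus frame numbers
# to iframes_<clip>.txt -- assumes a constant frame rate per clip
for clip in *.mp4; do
    # r_frame_rate comes back as a fraction, eg 30000/1001
    fps=$(ffprobe -v error -select_streams v:0 \
        -show_entries stream=r_frame_rate -of csv=p=0 "$clip")
    ffprobe -loglevel error -skip_frame nokey -select_streams v:0 \
        -show_entries frame=pkt_pts_time -of csv=print_section=0 "$clip" |
    awk -v r="$fps" 'BEGIN { split(r, p, "/"); fps = p[1] / p[2] }
        { printf "%s frame %d\n", $1, $1 * fps + 0.5 }' > "iframes_${clip%.*}.txt"
done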

repair - ffmpeg

Sometimes the keyframes become unmanageable for some reason and need to be cleaned up. Typically, re-encoding is required, but there is a methodology.
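As one hedged example (the GOP size here is an arbitrary assumption), re-encoding with libx264's "-g" switch -- the same switch noted in the keyframes section further down -- forces a regular i-frame interval, here one every 30 frames:

$ ffmpeg -i foo.mp4 -c:v libx264 -g 30 -c:a copy foo-fixed.mp4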

Thursday, July 30, 2020

blender 2.8 notes - vse, video

contents
  • clip matching
  • setup and render
  • keyframes
  • plugins (necessary plugins)
  • watermark

NB: set rendering output configuration prior to editing, esp FPS


Blender or, as I call it, "Death from 1000 Cuts", is vast, almost an operating system. It is Python + FFmpeg based. KDenLive, the other major Linux GUI editor, is MLT based. Whether using Blender for animation or video (this post concerns video), a reasonably good understanding of keyframes goes a long way. The reason is that portions of edits which don't require keyframes can be done with a CLI editor; Blender is likely to be used for sliding text on or off a frame, etc.

what has to match?

  • frame rate (in ffmpeg: avg_frame_rate) must match in the final edit. Having a uniform frame rate, dimensions, bitrate, and so on, makes for easier editing. So if using just a few varied clips, it's worthwhile to recode them to match prior to importing. Obviously, frame rate will ultimately be forced in the final render, and it can be jerry-rigged during editing if desired...
    Dealing with Variable Framerates (7:03) Michael Chu, 2019. Audio unaffected by framerate, but we want the video to correspond. This is a manual solution when many clip-types are present.
  • tbn, tbc, tbr These are ffmpeg (Blender's backend) names for timing information beyond the fps. The time_base (tbc) is simply the inverse of the time_scale (tbn), but there is not necessarily one frame for each tick of the timebase (see the 2nd link below). The timescale (tbn) is often 90K.
    Variable names w/descriptions (page) GitHub, 2020. What each variable pertains to. Inaccurately substitutes "timebase" for "timescale".
    Mismatch between clips (page) 2013. An example timing mismatch between clips. Timebase vs. Codec Time base (page) 2013. Variations between these two can lead to problems.
    Container vs. Codec (page) Stack Overflow, 2013.
    FFmpeg timebase (page) Alibaba, 2018. Description of how it is obtained.
    NB: 90,000 is the typical tbn because it is divisible by 24, 25, and 30, although 600 will work, a la Quicktime.
  • bit rate helpful if matched; quality varies if not. Along with the i-frame interval, bit rate determines the quality of a clip. It's OK for bit rate to differ across project clips -- they can still be fused -- as long as one understands that each clip's quality will vary accordingly.
  • i-frame interval these can vary from clip to clip and change throughout editing as keyframes are needed. However, attention should be paid to this again at the final render to see that an efficient, hopefully dynamic, setting is selected. In ffmpeg itself, the i-frame interval is defined by "Groups of Pictures".
  • PTS presentation time stamps; mismatches between clips show up as DTS/PTS timestamp errors during concatenation (see the PiP post below). A comparison sketch follows this list.
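To see how closely a set of clips already matches on these parameters, a hedged one-liner along these lines (the fields are standard ffprobe stream entries) prints frame rate, timebase, and bit rate per clip:

$ for f in *.mp4; do echo "== $f"; \
ffprobe -v error -select_streams v:0 \
-show_entries stream=avg_frame_rate,time_base,bit_rate \
-of default=noprint_wrappers=1 "$f"; done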

setup (also render)

I strongly suggest configuring one's output render settings as the first step of any Blender project. A consistent frame rate, codec, and other details, set to the desired output, force clips into alignment from the start. As a bare minimum, set the final rendering FPS when getting started. That said, the final render requires i-frame interval adjustments. The newer codecs will do this dynamically, so that if there are periods of zero action, i-frame intervals can expand to, say, one every 10 seconds, etc.

Dealing with Variable Framerates (7:03) Michael Chu, 2019. Audio unaffected by framerate, but we want the video to correspond. This is a manual solution when many clip-types are present.
Blender 2.8 Setup (19:27) Mikeycal Meyers, 2019. Render settings begin at 5:00.

The directory structure follows a pretty standard setup with a user file in ~/.config/blender/[version]/startup.conf, but there are also files inside /usr/share/blender/, which is where add-ons seem to live. So it may be that there are configurations to save from both of these sources.

Keyframes (and other) in 2.8 (9:48) Code, Tech, and Tutorials, 2019. Solid inadvertent tour around the VSE via a simple edit. Shows how to do transitions simply without inserting them, pressing "i" for keyframe.
Ffmpeg blur box (14:48) Thilakanathan, 2019. Fast, but covers nearly everything. Rendering in last 5 minutes. Comments have a lot of tips.

render troubleshoot

I've put these rendering fixes at the top of the blog to help determine how to preventatively configure settings.

  • grey: the timeline is rendered but appears to have a sheen of gray over the entire playback, as if through a dirty window. Inside the little camera icon, go all the way to the bottom to "Color Management" and change it from Filmic to Standard.

keyframes

Video keyframes are a large editing topic I cover separately, but a few notes here. Fundamentally, keyframes are a hand-entered subset of i-frames, added by users. Since they are a sub-type of i-frame, all keyframes are i-frames, but not all i-frames are keyframes.

  • "i" to add, "Alt+i" to delete
  • keyframes can be edited in Timeline, Dope Sheet, or Graph Editor. Only manually added keyframes appear in these editors, not generic i-frames.
    Keyframe manipulation (6:18) Michael Chu, 2020.
  • the final render should eliminate as many keyframes as possible, to decrease the size of our output file

How to Delete all Keyframes (page) TechJunkie, 2018. This is in the older version of Blender but has loads of solid keyframe information.
Keyframes (and other) in 2.8 (9:48) Code, Tech, and Tutorials, 2019. Solid inadvertent tour around the VSE via a simple edit. Shows how to do transitions simply without inserting them, pressing "i" for keyframe.

plugins

Plugins appear to live in /usr/share/blender/[ver]/scripts/addons. There are some key ones to have.

proxy encoding

If a person has a gaming system, this is not a concern. However, many systems edit smoother if the video is proxied; otherwise the system lags and jumps during playback. Proxying does not work well if a person has elaborate transitions that require exact keyframes and so on.

Proxy a clip or clips (4:17) Waylight Creations, 2019. How to deal with a slower system.

sound effects

Occasionally, we'd like to add a sound effect to this or that in Blender without bringing in a video clip, and so on.

Inserting Sound effects in Blender (11:11) Oracles Stream School, 2020. OBS-based tutorial, using the computer, not a capture card.

Sunday, May 17, 2020

PiP pt 2 -- more ffmpeg

contents
  • blur -- somewhat complex
  • structure/workflow
  • color balance
  • crossfade
  • de-interlace
  • gif
  • glossary
  • keyframes -- also my full post
  • rotation -- simple in ffmpeg
  • slideshow -- complex! use a GUI app
  • watermark

NB: 1) try to make all cuts on keyframes, 2) "overlay" filter operations cost an unreal amount of render time


Links: 1) PiP Pt I   2) hardware for a system that will render

Impossibly, the simple crossfade is one of the most elusive video effects in Linux. You'd think simple transitions would be managed by any of the 10 GUIs out there, but the 10 GUIs are only intermittently usable, whenever QT upgrades, or GTK upgrades, etc. So the only totally reliable way to do video is through a CLI, eg ffmpeg or melt. It's a complete pain in the ass, of course, so I use GUIs when they're actually running. Ex: Pitivi (and Shotcut subsequently - QT5), Flowblade, or Olive are OK for crossfades -- not perfect, but OK. One month, GTK got an upgrade, and thus we learn that GTK is not backwards compatible... poof! PiTiVi gone...

$ pitivi
ERROR - The following hard dependencies are unmet:
==================================================
- gtk not found on the system
# pacman -S gtk
error: target not found: gtk
# pacman -S gtk2
warning: gtk2-2.24.32-2 is up to date -- reinstalling

... WTF?!! So there's little choice but to go CLI for one's mental health -- it's our lives we're giving up to edit.

You can pick any GUI application for some of this. For example, there's final-edit color balancing in Blender (Blender is so complex it's practically an OS itself), but even color balancing can be done inside ffmpeg (see 0612 TV w/NERDfirst around 6:00).

Besides impossible crossfades, the challenge within ffmpeg is keeping the number of renders to a minimum. We can use ffplay to preview or, since ffmpeg is non-destructive (non-linear), we can do a test render and check it. And don't forget ffprobe to get information out of the files. Also recall that, if we just add an audio track, we can copy the video codec and add audio without degrading the video whatsoever.
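As a sketch of that previewing, ffplay accepts the same -vf graphs as ffmpeg, so a filter -- here the colorbalance call used further down this post -- can be eyeballed without writing any file:

$ ffplay -vf "colorbalance=rs=-0.25,colorbalance=bm=-0.25" foo.mp4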

other options

And of course, one should also not overlook the MLT framework, eg "melt", for command line transitions -- if it's still being developed; not sure. In 2020, there's bleeding-edge on the AUR (mlt-git), and in "extra" (mlt). With MLT you will need a frame-counting tool, not just a second-counting tool, to make edits.

MLT Melt Transitions (13:48) Kris Occhipinti, 2012.

structure and workflow

See the graphics below. Each edit requires several files (see my typical file structure). The grouping below is typical but of course can't include hand-drawn items: 1) storyboard, 2) timing notes for audio sync with narration, and so forth. How to pause and linger on a frame? The best storyboard may be a video describing the drawings while doing a UMLet workflow.



If you imagine the workflow graphic above as 3-D, one can see that each item is not equal. In the 3rd column, there's a place to get started on code, and that's because the most flexible TTS is via Python, but Python itself is a deep subject requiring thoughtful installation. So just making the TTS possible is likely a 2-week project.

blur (+1 render)

This is a complex ffmpeg filter. Three processes are accomplished: a box region is cropped out, it's blurred, and it's re-overlaid onto the original video. An added difficulty is determining the x:y dimensions for the region.

$ ffmpeg -i foo-in.mp4 -filter_complex \
"[0:V]crop=100:100, boxblur=20[fg]; \
[0:V][fg]overlay=(main_w-200):30" foo-out.mp4

Ffmpeg blur box (6:45) Cool IT Help, 2019. A bit of a strong accent, but no BS.

concatenate (+1 render)

The +1 render is for getting one's clips ready: rotating, accomplishing any dissolve edits ahead of time. There's no render during the actual concatenation. Try different orderings of your clips in a simple M3U, playing the M3U with vlc or xplayer. You can keep filenames simple, avoiding full absolute paths, if you place the M3U in the same directory with the clips. The format...

# This is a comment
somevideo.mp4
anothervideo.mp4
  • finalize the mix order in the m3u
  • open vlc and be sure "repeat" is disabled (else it will go into infinite loop). close vlc.
  • $ vlc playlist.m3u --sout "#gather:std{access=file,dst=vlcmerged.mp4}" --sout-keep
...in 5-6 seconds, this command concatenated 31 clips into a 7 minute 720P h264 video with sound, no tearing, and so on. It's production quality. The process is described here, albeit with Windows conventions.

Concatenating with ffmpeg can also be done without rendering (-c copy), but this will likely lead to DTS timestamp errors and tearing in playback. Even when one renders, the quality has not yet been as high as the vlc method above. Additionally, ffmpeg doesn't work with an M3U. One has to make a text file with a special syntax (viz, file 'foo.mp4'), one file per line; repeats are OK, however.
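For example, a minimal playlist.txt, reusing the file names from the M3U above (the repeated clip is just to show repeats are allowed):

file 'somevideo.mp4'
file 'anothervideo.mp4'
file 'somevideo.mp4'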

$ ffmpeg -f concat -i playlist.txt -c:v libx264 -an grouped.mp4
I find that playback of the resulting file is often jerky or paused, even when all files use the same encoder. There is discussion (render, tbr) about how to manage this, but I've never found anything to prevent it. If I had time, I would troubleshoot until a fix.

colorbalance (+1 render)

Not sure if other filters could be chained in to keep everything to a single render; note that the two colorbalance calls below are already chained in one -vf graph, so this is one render as it stands.

$ ffmpeg -i foo.mp4 -vf "colorbalance=rs=-0.25,colorbalance=bm=-0.25" evencolor.mp4

How to use FFMpeg - Advanced Pt1 (19:37) 0612 TV w/NERDfirst, 2015. Color balance at the 7:30 mark.
Another great site, and he also has a video attached.

crossfade (+2 render)

Typically, crossfades (aka "dissolves") are the most complex thing in Linux video editing, and a separate post will likely be forthcoming. Some progress has occurred, however. This recent article is a must-read on the subject.

$ ffmpeg -i invid1.mp4 -i invid2.mp4 -filter_complex xfade=transition=fade:duration=2:offset=24 outputvid.mp4

... where the duration of the crossfade is 2 seconds, and it begins at the 24 second mark. Note that this effect renders with extremely high (hot) CPU cycling, even for a short clip.

Pan and Zoom Slideshow (26:50) Chipper Videos, 2019. Blender 2.8. Crossfade is shown in approximately the last 5 minutes, but accurate. Most is concerned with Ken Burns.
Crossfade (0:20) doufuwang, 2016. Shows the crossfade or dissolve effect perfectly.
Tech Notes: FFmpeg Multi Fades In Out (37:09) Steve AB4EL, 2017. Hilarious. Drones on and on. Takes 3 pictures and makes slide show. (8:30)fade transitions
FFmpeg Cross fades (2:25) A Forum, 2020. One of the best out there. Command given above is taken from here.

de-interlace (+1 render)

De-interlacing increases the frame rate. It could take you from 30 to 60, for example.

Sample Video Editing Workflow using FFmpeg (19:33) Rick Makes, 2019. Covers de-interlacing (2:00) to get rid of lines, cropping, audio library (13:30), and so on.

$ ffmpeg -i foo.mp4 -vf "bwdif=1" -c:a copy defoo.mp4

glossary

  • frame rate vs. fps: these appear identical, and it's unfortunate. It's nuanced, like i-frame and keyframe. Framerate (-r) is an input parameter used for setting fps; fps is the stream speed. Ex: I want a 6 second video of 3 photos -- I set fps to 2, and framerate to 12.
  • ripple-cut: cut per usual, but then the remaining footage magnets over to the previous clip to fill the space

scripts

There are two types we might want to consider:

  • bash: easiest for ffmpeg sequence, not sure for tts
  • python: using pip we can get some tts modules, send to WAV

We want to make bash scripts for the ffmpeg actions.
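A sketch of the pattern (the filter and output directory are placeholders): loop over the clips, apply one ffmpeg action, and write results to a separate directory so the originals stay untouched.

#!/bin/bash
# sketch: batch one ffmpeg action over every clip in the directory
mkdir -p out
for f in *.mp4; do
    # placeholder action -- swap in any filter from this post
    ffmpeg -i "$f" -vf "colorbalance=rs=-0.25" -c:a copy "out/$f"
done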

timing

If I want to add sound effects or narration, how do we at once watch the film and also enter sounds? The best way is to watch the film and take timing notes. Perhaps you want to pause on a frame for a few seconds while you discuss something. While you're watching the soundless video, just note how long you hold down the pause key to do your talking, say 5 seconds, and then hold that frame for the same duration using a loop filter.
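A hedged sketch of that hold, with the frame position as an assumption: at 30fps, pausing on frame 300 for 5 seconds means generating 150 copies of it, and the setpts call re-spaces the timestamps after the loop (audio dropped here, since the narration is recorded separately):

$ ffmpeg -i foo.mp4 -an -vf \
"loop=loop=150:size=1:start=300,setpts=N/FRAME_RATE/TB" heldframe.mp4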

reverse

This one is just for video. Note that the reverse filter buffers the entire clip in memory, so keep the clips short.
$ ffmpeg -i foo.mp4 -vf reverse reversed.mp4
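If the audio should run backwards too, areverse is the audio twin of the video filter -- a quick sketch:

$ ffmpeg -i foo.mp4 -vf reverse -af areverse reversedav.mp4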

rotation (+1 render)

Few do it better than NERDfirst. The reason for the division seen in rotate values (eg PI/2) is that ffmpeg expresses the angle in radians, and the rotation is clockwise. The most common use is righting an upside-down video.

$ ffmpeg -i foo.mp4 -filter:v "rotate=PI" fout.mp4
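For a quarter turn, a sketch straight from the rotate filter's options: PI/2 rotates 90 degrees clockwise, and ow/oh swap the output dimensions so the rotated frame isn't cropped:

$ ffmpeg -i foo.mp4 -filter:v "rotate=PI/2:ow=ih:oh=iw" fout.mp4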

How to use FFMpeg (12:48) 0612 TV w/NERDfirst, 2015. At the 11:00 mark, he describes rotation, and scaling just before it.

slideshow (+1 or 2 render)

The simplest slideshow just cuts from one image to the next at a set time, say 10 seconds. This is pretty easy to do on an entire directory of numbered pix, say foo01.png, foo02.png, foo03.png, etc. The "%02d" indicates how many digits are in the numbering system.

$ ffmpeg -framerate 1/10 -i foo%02d.png -c:v libx264 -r 30 -pix_fmt yuv420p slideshow.mp4

Similar to the Bash above is a two-step slideshow. Run a batch file on a set of pictures to make short 5 second video clips of them all -- the first render. At that point, we can do Ken Burns on some clips and just fade one to the next on the others -- the second render. Here, I have 3 x 5 second clips, and first installed ffmpeg-concat ($ yay ffmpeg-concat) so that I didn't have to use the complicated overlay filter. (Shotcut is a GUI alternative here.)

The command below fades up and down from black for each clip; it is not a dissolve between them. For those using Windows, a caret "^" breaks up the command in a batch file instead of a backslash.
$ ffmpeg -i foo1.mp4 -i foo2.mp4 \
-i foo3.mp4 -filter_complex \
"[0:v]fade=in:st=0:d=1,fade=out:st=4:d=1[f0]; \
[1:v]fade=in:st=0:d=1,fade=out:st=4:d=1[f1]; \
[2:v]fade=in:st=0:d=1,fade=out:st=4:d=1[f2]; \
[f0][f1][f2]concat=n=3[v]" \
-map "[v]" foocombine.mp4

Pan and Zoom Slideshow (26:50) Chipper Videos, 2019. Blender 2.8. Most is concerned with Ken Burns, but crossfade at the end.
Basic Slideshow (8:09) Luke Smith, 2017. Easy to put all of these together, but nothing here about Ken Burns.
Tech Notes: FFmpeg Multi Fades In Out (37:09) Steve AB4EL, 2017. Hilarious. Drones on and on. Takes 3 pictures and makes slide show. (8:30)fade transitions

watermarking (+1 render)

Accomplished with the "overlay" filter, and also possible in a batch (2019, Cool IT Help, seek to 5:00 min); the bash equivalent being:

#!/bin/bash
for file in *.mp4; do
    ffmpeg -i "$file" -i watermark.jpg -filter_complex "overlay=20:20" "${file%.*}_wm.mp4"
done

keyframes

Keyframing is a large topic, thus a separate post will be developed, but why is this a challenging topic?

  • keyframes are not physically part of a video. They are placemarks temporarily created and indexed inside whatever application is being used to edit the video. Being application-specific, they must be learned for each application.
  • Nomenclature issues abound. Keyframes and i-frames are similar and related, so some use them interchangeably. There is the further confusion that iframes in HTML (inline frames) are an entirely different concept from i-frames in video editing.
  • ffmpeg refers to keyframe intervals as a "GOP", or Group of Pictures. The interval is set with the "-g" switch, but this requires re-encoding.
  • keyframes are used differently in animation and video, so that learning basic video keyframes means sifting through many Google results for animation keyframing, ambiguously labeled. This is so common that it's hard to avoid (wastefully) learning animation usage while attempting to learn video usage.
  • Blender, which is so vast and complex that it's like learning another programming language even before one considers keyframes, is probably the only GUI for which it's worth taking the time to learn keyframes. This means learning Blender just to get to the keyframe level.

Keyframes in Blender 2.8 (7:40) Blender, 2019. Animation-centric video, but significant application to video, since keyframes apply to both. 1:30 Keyframes can be managed in the Dopesheet, Graph editor, and Timeline.
Keyframes (and other) in 2.8 (9:48) Code, Tech, and Tutorials, 2019. Solid inadvertent tour around the VSE via a simple edit. Shows how to do transitions simply without inserting them, pressing "i" for keyframe.

gifs

Definitely an important concept for texting; however, a person must go online to get this done: an unknown site has your vids and any related IP. On one's own machine, you'd think it would be quick, as 3rd party apps such as gifify appear to make it so, but these almost never work. This means we're dealing with f**king seeking syntax once again.

Beginning at the 2:30 mark, create 10 seconds of sequential PNGs using a 2 digit format, resized down after determining the clip's native resolution with ffprobe. Eg, GoPro footage is natively 1280x720, so I drop it to 640x360 or 320x180; Zoom is natively 640x360, so I leave it, or resize it to 320x180.

ffmpeg -ss 00:02:30 -i foo.mp4 -t 0:10 -s 240x135 %02d.png

Concatenate the PNG's (or use JPG's) into a GIF

ffmpeg -i %02d.png output.gif

These tend to be made at 25 fps, so I slow it down sometimes for slo-mo, eg 12 frames per second...

ffmpeg -framerate 12/1 -i %02d.png output.gif
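A hedged one-command alternative that skips the intermediate PNGs: ffmpeg's palettegen/paletteuse pair builds an optimized 256-color palette, which usually yields a noticeably cleaner GIF. The seek, duration, fps, and scale values are just the example numbers from above:

$ ffmpeg -ss 00:02:30 -t 10 -i foo.mp4 -filter_complex \
"fps=12,scale=320:-1:flags=lanczos,split[a][b];[a]palettegen[p];[b][p]paletteuse" \
output.gif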

Thursday, July 23, 2015

blender odds and ends (250Mb)

In Arch, Blender is a 60Mb download and roughly a 250Mb installation. Several associated dependencies install with it, most of which are likely to already be installed.

3 button mouse

The number one undocumented hassle when installing Blender. Users can opt for 3 button mouse emulation (Preferences -> Input Tab), but 3 button emulation leads to overlap problems between X's management of mouse events and Blender's. For example, Blender's 3 button emulation of object rotation is "Alt+LMB". But in X, "Alt+LMB" are the strokes to grab an active window and move it around the desktop. What happens when a Blender user presses "Alt+LMB" while in "3-button emulation"? The entire window moves instead of the object inside Blender's window.

Solution: X mouse strokes can be altered by creating a "SectionLayout" file and putting it in /etc/X11/xorg.conf.d/. Time consuming, considering no such extra files or time are needed if a person has a $10 (Logitech M110, Ebay) 3 button mouse. Additionally, if you have a 2 button mouse with a scroll wheel, the scroll wheel is typically a disguised 3rd button which can (in addition to being rolled) be pressed directly down until it clicks. Clicking and holding the wheel down while moving the mouse around is how to rotate around an object inside Blender.

numpad

Also in Preferences -> Input Tab is numpad emulation. On a laptop this is necessary: there's obviously no numpad on standard laptops. As users might expect, numpad emulation allows using the number keys across the top of the keyboard instead of a numpad.

selection/deselection - extrusion

There are tens of YouTube tutorials about extrusion, apparently a basic Blender feature. However, four of the seven steps for extrusion were not mentioned in any of the videos. Accordingly, for the first several hair-pulling days I attempted to extrude, the result was invariably new, unattached duplicate boxes, NOT a connected extrusion from the current box.

The unexplained step, discovered only inadvertently (auggh), is that start-up boxes are, by default, already selected. So de-select ("A") the start-up box and then select a side, or however many one wishes to extrude. When something is selected in Blender, it changes from grey to gold:

  1. TAB to select "Edit" Mode
  2. Be in "Solid" view, not wireframe view
  3. Be in Face View, not Vertices View
  4. Use "A" to deselect/select all. Select faces by flying around the cube (MMB), and selecting the faces one wishes (Shift - LMB).
  5. Press "X", "Y", or "Z", to obtain the respective axis of the extrusion. Or, if one wishes to freehand it; "G"
  6. Press "E". You can also express it as E, then "2", or any other number. This will extrude that many grid squares along the selected axis.
  7. Move the mouse (no buttons), which will pull the extrusion. L click once it the shape is satisfactory.

floor plans

Links: Render DXF to 3-D
Floor plans are a common use of Blender for those not doing animations. Users can take standard .dxf line-art files, import them, and extrude them into complete floor plans with some additional work. Additionally, textures can be downloaded and added to one's textures library.