Real motion blur without an expensive plugin!

Have you come up with a good Moho trick? Need help solving an animation problem? Come on in.

Moderators: Fahim, Distinct Sun, Víctor Paredes, erey, Belgarath, slowtiger

Rudiger
Posts: 786
Joined: Sun Dec 18, 2005 2:25 am

Real motion blur without an expensive plugin!

Post by Rudiger » Sun Jun 27, 2010 1:42 pm

I was frustrated that there seem to be no free solutions for adding realistic looking motion blur to video, so I came up with my own.

I tried to emulate what happens with a real film camera by generating a whole bunch of sub-frames for each animation frame and integrating them all to get the resulting exposure. As it turns out, this is an existing technique called "temporal oversampling", which is used by most 3D animation programs that support high-quality motion blur.

Here are the steps I took:
1. Rescale the entire document by a factor of 100. This gives 100 sub-frames per animation frame, meaning an object can move up to 100 pixels without disrupting the blur effect.
2. Export the entire animation to PNGs.
3. Run a Python Imaging Library script to perform the integration on the sub-frames. I wrote this myself and it's not too complicated, but if anyone's interested in obtaining it, they can PM me.
4. Compose the integrated PNGs into a video file.
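The integration in step 3 could be sketched along these lines. My actual script isn't posted here, so this is just a minimal sketch with made-up filenames, using Pillow and NumPy:

```python
# Minimal sketch of step 3: average each group of sub-frame PNGs into one
# motion-blurred output frame. Filenames and SUBFRAMES are illustrative.
from PIL import Image
import numpy as np

SUBFRAMES = 100  # sub-frames rendered per output frame (the 100x rescale)

def integrate_frame(paths):
    """Average a list of sub-frame images into one motion-blurred frame."""
    acc = None
    for p in paths:
        img = np.asarray(Image.open(p).convert("RGB"), dtype=np.float64)
        acc = img if acc is None else acc + img
    acc /= len(paths)  # uniform weighting simulates a shutter open all frame
    return Image.fromarray(acc.astype(np.uint8))

# e.g. output frame 0 would be the average of sub-frames 0..99:
# integrate_frame(["frame_%05d.png" % i for i in range(SUBFRAMES)]).save("out_00000.png")
```

Averaging in float before converting back to 8-bit avoids rounding error accumulating over 100 sub-frames.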

For a 10 second animation (330 frames at 30 fps), it took about 15 minutes to render all 33000 subframes and the resulting PNG files took up about 500MB. It then took another 15 minutes to integrate them.

Here is the result:
Without motion-blur: http://www.youtube.com/watch?v=G13UZDYS5g0
With motion-blur: http://www.youtube.com/watch?v=VqJVWTyljL4

I'm still not sure how to make videos look good on YouTube, so you can download the original x264 AVI files from here:
Without motion-blur: http://users.on.net/~alexical/Anime%20S ... 20blur.avi
With motion-blur: http://users.on.net/~alexical/Anime%20S ... 20blur.avi

As you can see, it's actually not that complex, so I'm hoping that Mike will be able to integrate it into AS one day. That would be much better, as it could then be adaptive, using the amount of motion at a given frame to determine how many sub-frames to generate.
User avatar
J. Baker
Posts: 1037
Joined: Wed Mar 23, 2005 7:22 pm
Location: USA
Contact:

Post by J. Baker » Mon Jun 28, 2010 8:10 am

Looks really good from what I can tell. I would like to see some other animations done using the same technique, just for my understanding. Does your script create a greater blur for faster-moving objects than slower ones? I'll have to look up some more info about "temporal oversampling". ;)
Rudiger
Posts: 786
Joined: Sun Dec 18, 2005 2:25 am

Post by Rudiger » Mon Jun 28, 2010 8:39 am

J. Baker wrote:Looks really good from what I can tell. I would like to see some other animations done using the same technique, just for my understanding. Does your script create a greater blur for faster-moving objects than slower ones? I'll have to look up some more info about "temporal oversampling". ;)
Yes it does. If you pause the video when the balls are moving fast, you should see much more motion blur than when they are moving slowly. Actually, the script doesn't need to worry about that. It just simulates the shutter being open between frames and the objects that move faster will naturally have more motion blur.

The "temporal oversampling" just refers to the fact that you have to render many sub frames between each frame so you can integrate them to simulate the effect of the open shutter.

I would like to try it on more animations as well, but it's a bit of a pain to use at the moment. I would like to use Anime Studio's command-line mode to write a command-line renderer that could render an anme file to a video file with motion blur in one step. I think I could also use my nudge-key scripts to make it adaptive (i.e. vary the number of sub-frames depending on the amount of motion in each frame), which would make it much faster as well.
Genete
Posts: 3483
Joined: Tue Oct 17, 2006 3:27 pm
Location: España / Spain

Post by Genete » Mon Jun 28, 2010 11:28 am

Synfig Studio has native motion blur with variable parameters:
http://www.synfig.org/wiki/Releases/0.62.01
:-P
-G
ponysmasher
Posts: 370
Joined: Thu Aug 05, 2004 2:23 am
Location: Los Angeles
Contact:

Post by ponysmasher » Mon Jun 28, 2010 12:18 pm

I've tried doing this in After Effects, but it seems that AE will only use a certain number of frames. Rendering by a factor of 100 looked the same as a factor of 10 (or something like that), so objects couldn't move too fast.

I don't know anything about this Python Imaging Library script, maybe that will actually use all frames?

It could also be that I did something wrong in AE.

But, yeah, this technique will give better results than the Realviz plugin.
Rudiger
Posts: 786
Joined: Sun Dec 18, 2005 2:25 am

Post by Rudiger » Mon Jun 28, 2010 5:16 pm

Genete wrote:Synfig Studio has native motion blur with variable parameters:
http://www.synfig.org/wiki/Releases/0.62.01
:-P
-G
Hey, wadaya know! My result looks exactly like the Constant version in Synfig. That's encouraging! It would be interesting to implement the other algorithms as well. I'm guessing they just give more emphasis to the more recent subframes.
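That guess about emphasizing recent sub-frames could be sketched as different weightings applied before summing. The mode names below are just illustrative guesses at what Synfig's variants might do, not its actual code:

```python
# Hypothetical sub-frame weighting schemes. "constant" is the uniform average
# used in my script; "linear" weights later (more recent) sub-frames more.
def weights(n, mode="constant"):
    if mode == "constant":
        w = [1.0] * n
    elif mode == "linear":  # later sub-frames count more
        w = [float(i + 1) for i in range(n)]
    else:
        raise ValueError("unknown mode: %s" % mode)
    total = sum(w)
    return [x / total for x in w]  # normalize so overall exposure is preserved

# constant: every sub-frame contributes equally
assert weights(4) == [0.25, 0.25, 0.25, 0.25]
```

Normalizing the weights to sum to 1 keeps the brightness of the blurred frame the same as an unblurred one.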
User avatar
heyvern
Posts: 6964
Joined: Fri Sep 02, 2005 4:49 am

Post by heyvern » Mon Jun 28, 2010 5:59 pm

Animation Master has "sub frames". They call it... uh... er... can't remember what it's called. Anyway, Animation Master uses the technique you describe to do FANTASTIC motion blur. You can set the "sub frame sampling"; higher values give better quality but slower renders. Another advantage of sub-frames in Animation Master is that it will do fantastic antialiasing. By "blending" sub-frames together, it really smooths the edges nicely.

This would be AWESOME in Anime Studio. I never even thought of that as a feature. It would be kind of tricky though. For example, in Animation Master you can actually set a keyframe on a sub-frame, e.g. frame 23.6, and put in motion keys. When you render using sub-frames, it includes that sub-frame motion when blending frames.
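Evaluating a key at a fractional frame like 23.6 is just interpolation at a non-integer time. A toy sketch with made-up key values (real interpolators would also handle ease curves, not just linear):

```python
# Toy sketch: sampling an animated value at a fractional frame.
def lerp(a, b, t):
    """Linear interpolation between a and b for t in [0, 1]."""
    return a + (b - a) * t

# value keyed at frame 23 -> 0.0 and frame 24 -> 10.0, sampled at frame 23.6
value = lerp(0.0, 10.0, 23.6 - 23)
```

A renderer with sub-frame support simply feeds fractional times like this into its existing interpolators.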

Even without having this in Anime Studio it is a really really really really COOOL technique!

I should look at python scripting... maybe there could be a way to script the sub frames based on motion? So you could get the variable motion blur?

-vern
Rudiger
Posts: 786
Joined: Sun Dec 18, 2005 2:25 am

Post by Rudiger » Mon Jun 28, 2010 6:40 pm

heyvern wrote:
I should look at python scripting... maybe there could be a way to script the sub frames based on motion? So you could get the variable motion blur?

-vern
Not sure what you mean by "variable motion blur". Fast-moving objects already blur more than slow-moving objects, since they travel further across the frame in the interval between frames. Perhaps you are thinking of vector blur, where you determine the motion vector for each pixel in a frame and apply a varying motion-blur filter depending on the amount of motion. This is how motion-blur filters in compositing programs work, as they don't have access to the sub-frames.

However, in the case of 2D and 3D renderers, you can generate as many sub-frames as you need to achieve a smooth blur effect. It would be great, though, if you used the maximum amount of motion in each frame to calculate the minimum number of sub-frames you have to generate. That would save heaps of time and storage on redundant sub-frames.
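A back-of-envelope version of that adaptive calculation could look like this (samples_per_px is a made-up quality knob, not anything from AS):

```python
import math

def subframes_needed(max_motion_px, samples_per_px=1.0):
    """Minimum sub-frame count so consecutive samples land no more than
    1/samples_per_px pixels apart, giving a smooth (gap-free) blur streak."""
    return max(1, math.ceil(max_motion_px * samples_per_px))

# an object moving 25 px during one frame needs ~25 sub-frames;
# a nearly static frame needs only 1
```

A frame with no motion then renders at full speed, while only the fast frames pay the sub-frame cost.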

With regard to Python, there's something called Lunatic Python, which I've always wanted to try. In theory, it would let you access all of the Python libraries, including the imaging library, from within AS! The possibilities are almost scary!
User avatar
heyvern
Posts: 6964
Joined: Fri Sep 02, 2005 4:49 am

Post by heyvern » Mon Jun 28, 2010 7:02 pm

My mistake. You are right, I was confused. Motion-blur "quality" would be based on how many "sub-frames" are used. A really fast motion would need MORE sub-frames to look smooth, but the total sub-frame count would be a global value for the entire animation. I think the trick here would be to only use the motion-blur "trick" in areas that really need it.

-vern
User avatar
Jean_R
Posts: 68
Joined: Sat Nov 03, 2007 8:02 pm

Post by Jean_R » Mon Jun 28, 2010 7:19 pm

I like the way you made a subtle motion blur in your example.

I'm still very disappointed that this topic was rejected by Slowtiger, who argued that Anime Studio is not meant to process the images it creates the way image-processing software would.
User avatar
J. Baker
Posts: 1037
Joined: Wed Mar 23, 2005 7:22 pm
Location: USA
Contact:

Post by J. Baker » Mon Jun 28, 2010 7:46 pm

Any good documentation on how this works precisely?
Rudiger
Posts: 786
Joined: Sun Dec 18, 2005 2:25 am

Post by Rudiger » Tue Jun 29, 2010 8:53 am

J. Baker wrote:Any good documentation on how this works precisely?
Unfortunately, there doesn't seem to be a lot out there :(.
This was the best I could find:
http://en.wikipedia.org/wiki/Motion_blur
http://en.wikipedia.org/wiki/Temporal_anti-aliasing

Hopefully, it will be enough once you look at my python script.
User avatar
J. Baker
Posts: 1037
Joined: Wed Mar 23, 2005 7:22 pm
Location: USA
Contact:

Post by J. Baker » Tue Jun 29, 2010 9:28 am

Rudiger wrote:
J. Baker wrote:Any good documentation on how this works precisely?
Unfortunately, there doesn't seem to be a lot out there :(.
This was the best I could find:
http://en.wikipedia.org/wiki/Motion_blur
http://en.wikipedia.org/wiki/Temporal_anti-aliasing

Hopefully, it will be enough once you look at my python script.
Yeah, I couldn't find much when searching. Thanks Rudiger! ;)
Genete
Posts: 3483
Joined: Tue Oct 17, 2006 3:27 pm
Location: España / Spain

Post by Genete » Tue Jun 29, 2010 1:29 pm

Although this is not a development thread, here is the code that does the motion blur in Synfig.
Notice how the context is evaluated at different times, and how its alpha is later scaled down based on the subsample type. Everything is then alpha-composited back into one single frame. Since parameter evaluation in Synfig is not frame-based but floating-point time-based, the subsampling can be done inside the animation program; no rescaling or external compositing is needed.
That makes me think that, in the current state of the art of Anime Studio (frame dependence), it is not possible to produce that kind of motion blur unless you scale the frame rate up and later back down, so that layer parameters can be evaluated between two frames or at fractions of frames. Meanwhile, only frame-based motion blur is possible in AS, unless Mike invents something new.
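That frame-rate workaround amounts to a simple index mapping: sub-frame k of original frame f becomes an ordinary integer frame in the rescaled document. A sketch, where FACTOR is the hypothetical rescale factor from the first post:

```python
FACTOR = 100  # document frame rate multiplied by 100, playback speed unchanged

def subframe_index(frame, k):
    """Frame number in the rescaled document for sub-frame k (0 <= k < FACTOR)
    of original frame `frame`."""
    return frame * FACTOR + k
```

So a frame-based renderer never has to evaluate at fractional times; the fractions become whole frames in the rescaled timeline.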
-G
Rudiger
Posts: 786
Joined: Sun Dec 18, 2005 2:25 am

Post by Rudiger » Tue Jun 29, 2010 2:20 pm

Genete wrote:Although this is not a development thread, here is the code that does the motion blur in Synfig.
Notice how the context is evaluated at different times, and how its alpha is later scaled down based on the subsample type. Everything is then alpha-composited back into one single frame. Since parameter evaluation in Synfig is not frame-based but floating-point time-based, the subsampling can be done inside the animation program; no rescaling or external compositing is needed.
That makes me think that, in the current state of the art of Anime Studio (frame dependence), it is not possible to produce that kind of motion blur unless you scale the frame rate up and later back down, so that layer parameters can be evaluated between two frames or at fractions of frames. Meanwhile, only frame-based motion blur is possible in AS, unless Mike invents something new.
-G
Since you can change the frame rate in AS without changing the animation speed, I'm guessing that time is internally represented by a float and is only presented as individual frames in the scripting interface for convenience. That brings me to a question: does Synfig have a built-in scripting language like AS?
Post Reply