Shotgun Action Menu Item (AMI) Server

The #1 request at every studio is the ability to run pipeline actions through the Shotgun web interface. Luckily SG allows this through the use of Action Menu Items (AMIs). The gist is: choose an entity type in the web interface (e.g. Version), name your new item (say, Render Slates…), and set up an IP address to receive this information. SG then encapsulates the user’s selection of that entity type (either one or multiple entities) and sends the data to the receiving machine, redirecting the user’s web browser in the process. With that data and connection, the machine at the other end can perform any action (say, submit all selected shots to the farm) and provide the end user with visual feedback (success or failure).
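To give a sense of the receiving end: Shotgun delivers the AMI data as a form-encoded HTTP POST. Here’s a minimal sketch of parsing that payload in Python – the field names (`entity_type`, `selected_ids`, `project_name`) follow Shotgun’s documented AMI format, but treat this as an illustration rather than my server’s actual code:

```python
from urllib.parse import parse_qs

def parse_ami_payload(body):
    """Parse the form-encoded POST body Shotgun sends when an AMI is clicked."""
    fields = {key: values[0] for key, values in parse_qs(body).items()}
    # selected_ids arrives as a single comma-separated string of entity ids
    ids = [int(i) for i in fields.get("selected_ids", "").split(",") if i]
    return {
        "entity_type": fields.get("entity_type"),
        "selected_ids": ids,
        "project_name": fields.get("project_name"),
    }
```

From there, a plugin keyed to the entity type can take over and do the actual work.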

I developed a plugin-based AMI Server (philosophically similar to the Shotgun Event Server) for just this purpose, and I’m very happy with the result. The key is not only to perform the action, but to provide feedback to the user in a friendly, readable way; the user should be able to tell at a glance whether the action succeeded or failed. I even had a friend who does CSS design whip up a nice-looking CRT-style output because, why not 😉

Check out the video below to see it in action!

AMI Server operating at Phosphene LLC

Introducing the tks Suite

an expandable, customizable framework for compositing artists and supervisors

Years ago I was fortunate enough to spend nearly a year working with the wonderful guys at East Side Effects in NYC – not only did I get to work on a film for some of my favorite directors (the Coen brothers), but I was able to build the pipeline of my deepest artist dreams!

In the subsequent years I have kept tinkering with it, until I arrived at something I’m extremely proud of. The primary advantage (IMHO) of the tks Suite is that it was developed by someone who started as an artist, and it therefore provides the tools I wanted with the simple, easy-to-understand GUI I always craved. What started as a purely Nuke-based panel for artists has morphed into a full-featured, program-agnostic, Qt-based GUI system for managing VFX workflow.

  • pure Qt-based implementation
  • presents Artists/Supervisors/Producers with only the information they need, with quick access to the actions they need to perform
  • modular “Action” system – adjust standard actions (Submit to Render Farm, Create Version/Publish) based on per-project settings
  • automated VFX pipeline, from creating V0s through review, publish, and delivery creation
  • artist-focused for clarity and simplicity
  • tested and perfected on multiple AAA projects
Continue Reading →

Detecting a Nuke Panel closing

This is a simple, straightforward-style post. [EDIT: lol, I wish – click here to see what I ACTUALLY ended up doing]. I am working on updating my tks “Suite” of tools for Shotgun–Nuke integration, and I wanted to be sure I was doing everything as safely as possible. As such, I wanted to make sure that when a user “hides” the panel in Nuke, my panel cleans up after itself. (I noticed that, by default, when the panel is “closed” with the X button, the panel itself and its thread keep going and going in the background.)

With some dir() inspection and some super() magic, I was able to identify the following methods to override if you want to add special Nuke panel close and open logic:

class NotesPanel(QWidget):
    # QWidget comes from PySide2.QtWidgets in Nuke 11+ (PySide in older versions)
    def __init__(self, scrollable=True):
        super(NotesPanel, self).__init__()
        self.scrollable = scrollable

    def hideEvent(self, *args, **kwargs):
        # this method fires when the user closes (hides) the panel
        super(NotesPanel, self).hideEvent(*args, **kwargs)

    def showEvent(self, *args, **kwargs):
        # this method fires when the user opens (shows) the panel
        super(NotesPanel, self).showEvent(*args, **kwargs)

Short and sweet! Hopefully this will help others who are looking for similar logic.
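For the curious, the cleanup itself usually means signalling the panel’s background thread to stop. Here’s a Qt-free sketch of that pattern – the `PollingWorker` name and its stand-in polling loop are illustrative, not the code from my actual panel:

```python
import threading
import time

class PollingWorker(object):
    """Background worker that can be stopped cleanly from hideEvent()."""
    def __init__(self, interval=0.01):
        self._stop = threading.Event()
        self._interval = interval
        self.ticks = 0
        self._thread = threading.Thread(target=self._run)
        self._thread.daemon = True

    def start(self):
        self._thread.start()

    def _run(self):
        while not self._stop.is_set():
            self.ticks += 1                  # stand-in for a Shotgun poll
            self._stop.wait(self._interval)  # sleeps, but wakes early on stop()

    def stop(self):
        # call this from the panel's hideEvent(); blocks until the loop exits
        self._stop.set()
        self._thread.join()
```

The important part is that stop() is synchronous – by the time hideEvent() returns, the thread is actually dead rather than orphaned.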

UPDATED: Well, nothing is ever that simple, is it?

Continue Reading →

Elements Ingest Tool

handling ingest of bulk data with style!

When I was first asked to begin thinking about a tool to handle and sort files coming into a busy VFX house, I was pretty hesitant. How could you possibly correctly sort BG plates from stock footage, editorial QTs from plates, set data from delivery manifests??

It wasn’t until we came up with the idea to utilize the Shotgun Toolkit’s publisher (tk-multi-publish2) that it started to seem doable. While my experience with the publisher hadn’t been all stellar up to that point, this was really a place for it to shine. Specifically – the publisher excels at handling large quantities of different file types all at once, determining info about them and offering a nicely designed GUI to the end user.
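The core of that sorting boils down to mapping incoming files to categories. Here’s a rough sketch of the idea – the categories and patterns below are purely illustrative, not the studio’s real ingest rules:

```python
import os
import re

# Illustrative extension/pattern rules -- NOT the actual ingest spec.
RULES = [
    ("editorial_qt",      re.compile(r"\.(mov|mp4)$", re.I)),
    ("plate",             re.compile(r"\.(exr|dpx)$", re.I)),
    ("delivery_manifest", re.compile(r"\.(csv|xlsx?)$", re.I)),
    ("set_data",          re.compile(r"\.(obj|fbx|abc)$", re.I)),
]

def classify(path):
    """Return a category name for an incoming file, or 'unsorted'."""
    name = os.path.basename(path)
    for category, pattern in RULES:
        if pattern.search(name):
            return category
    return "unsorted"
```

In the publisher this kind of logic lives in per-type collector plugins, which is exactly why tk-multi-publish2 was such a good fit: each file type gets its own handler and its own spot in the GUI.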

Continue Reading →

Nuke and Shotgun Integration

In my years at the Molecule in NYC, first as a freelancer and then as a perma-lancer, I had lots of time to observe how a mid-sized studio manages the sometimes huge number of shots that need to be tracked on a daily basis. When I started, the studio was using ftrack to manage its shot pipeline, and while poking around online I saw that ftrack had a Python API, and a lightbulb went off – Nuke + Python + ftrack = ? Awesomeness, anyway.

So I went to work on building a Nuke-based shot-tracking system, first for artists but then expanding to include supervisors as well. I called it “the dashboard”, and though it started small it quickly became essential to the Molecule’s pipeline. When the studio switched tracking packages to Shotgun, a lot more functionality was exposed and the whole thing just got 50% better. You can catch a quick glimpse of it in this promotional video from Autodesk – at around 0:55, the awesome Rick Shick talks about how he uses it almost exclusively in his role as comp supervisor, instead of the web-based interface:

Gone were the days of manually creating contact sheets for each project; now every artist and supervisor had access to dynamic contact sheets, through which they could see and change statuses, read and post notes and images, and quickly open any shot.

JPEG compression in nuke (in-line)

I’ve been annoyed with the basic JPEG workflow in Nuke for a while – the only way to get that magic compression look was to write the image out as a JPEG and then read it back in. So when I discovered BlinkScript last week, building this tool was a great way to learn.

Introducing – JPEGer! It includes a quality slider, and options for chroma and alpha (JPEG-ing an alpha is untested right now). Interestingly enough, in our testing the results were identical across multiple machines, so this should be safe for a render farm. I’m not sure if it’s coded in the most efficient way, but it seems to take only a few seconds per frame, even on 4K images, which seems workable.
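For the curious: the “magic compression look” comes mostly from quantizing 8×8 DCT blocks, with the quality slider scaling a baseline quantization table. This sketch shows the standard IJG quality-to-table mapping in Python – it illustrates the principle, not the gizmo’s actual Blink code:

```python
# Baseline JPEG luminance quantization table (JPEG spec, Annex K).
BASE_LUMA = [
    16, 11, 10, 16, 24, 40, 51, 61,
    12, 12, 14, 19, 26, 58, 60, 55,
    14, 13, 16, 24, 40, 57, 69, 56,
    14, 17, 22, 29, 51, 87, 80, 62,
    18, 22, 37, 56, 68, 109, 103, 77,
    24, 35, 55, 64, 81, 104, 113, 92,
    49, 64, 78, 87, 103, 121, 120, 101,
    72, 92, 95, 98, 112, 100, 103, 99,
]

def scaled_quant_table(quality):
    """IJG-style mapping: quality slider (1-100) -> scaled 8x8 quant table.

    Lower quality -> bigger divisors -> coarser DCT coefficients -> blockier look.
    """
    quality = max(1, min(100, quality))
    scale = 5000 // quality if quality < 50 else 200 - quality * 2
    return [max(1, min(255, (q * scale + 50) // 100)) for q in BASE_LUMA]
```

At quality 50 the baseline table is used as-is; at 100 every divisor collapses to 1 (nearly lossless), and at very low quality the divisors saturate, which is where the characteristic blocking comes from.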

Update 12/8/22 – At some point the Blink syntax changed and the tool started to fail in newer versions of nuke. This has been fixed! [check out this forum post on the nuke dev forum for more info]


JPEGer Gizmo
tested on Nuke 11/12/13, Win 10

Nuke Ramped Defocus

If I have an image in Nuke and want to control the falloff of its depth of field, ordinarily I would pump a ramp into the mask input of a Defocus node. However, this only works if I want my defocus to start at 0 and move to some value. If I want the smallest amount of defocus to be something other than 0 – say, 2 – I can’t get a good result, even if I adjust the start value of my mask input (ramping from .2–1 instead of 0–1). If I feed this ramp into the mask input of a Defocus, this is the result:


Instead of moving from a defocus of 2 to a defocus of 10, I get the original un-blurred result with the 10 defocus merged on top of it at 20%, creating a halo.

With a few merges and two defocuses instead, I can get a much better result:


So I grouped this little setup and posted it here. Simply pipe a 0–1 gradient into the ramp input, set your minimum and maximum blur values, and you’ll get a much smoother result.
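The idea behind the gizmo can be sketched in 1-D: blur the image twice (at the minimum and maximum radius) and blend between the two passes per pixel, rather than masking a single defocus. In this sketch `box_blur` is a crude stand-in for Nuke’s Defocus, not its actual kernel:

```python
def box_blur(samples, radius):
    """Tiny 1-D box blur -- a stand-in for Nuke's Defocus node."""
    if radius <= 0:
        return list(samples)
    out = []
    n = len(samples)
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(samples[lo:hi]) / (hi - lo))
    return out

def ramped_defocus(samples, ramp, min_radius, max_radius):
    """Blend per pixel between a min-blur pass and a max-blur pass.

    ramp is 0-1 per pixel: 0 -> fully min_radius blur, 1 -> fully max_radius blur.
    This avoids the halo you get from masking a single defocus.
    """
    lo = box_blur(samples, min_radius)
    hi = box_blur(samples, max_radius)
    return [a + (b - a) * t for a, b, t in zip(lo, hi, ramp)]
```

Because every pixel is always *some* blend of two blurred results, the minimum defocus is actually honored instead of bleeding the sharp original through.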


tested on Nuke 9.0v4

Nuke VHS Noise Group


I’ve recently had to create (and re-use) a customizable VHS tracking-error-style effect in Nuke, and have collected the node group in the attached file. It’s mostly controlled by some knobs on the “MASTER” node, though there are some other tweakable elements as well. Unfortunately there’s no documentation, so you’ll have to play around with it, but I’ve been getting good results from it.


tested on Nuke 8.0v1

Nuke multiGrad

***updated 3/9/2015

I’ll be the first to admit that I’m no expert at rotoscoping, but in my experience one of the first tools I look for is a multi-gradient – something I can use to paint out large sections of the image and then add detail to. So I was surprised that, besides this Shake-like 4-point gradient, there wasn’t anything in Nuke that did what I wanted. The Shake gradient is nice, but I was really looking for one that would allow me to move the points around in space, and would interpolate colors around the points as well as in between them. With some patience and a lot of math (point-slope, anyone?) I created a gizmo that does just that. Below you can see the results on a racecar:

Of course, it’s not perfect, and there would need to be fine-tuning around the edges, but all the rotoing in that image was done with this tool alone. The user pipes the source image into “Source” and a mask into “matte”, and the gizmo will comp the result onto the source automatically, preserving the source alpha (the user can turn this feature off under the “Matte” tab). It also includes Python scripting that automatically grabs the color of the source at the points, which makes painting out areas really quick and easy.
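As an illustration of the interpolation behavior (colors blending both between and around the points), here’s one way to get that kind of effect with inverse-distance weighting – a sketch of the concept, not the gizmo’s actual point-slope math:

```python
def multi_grad(points, colors, x, y, power=2.0):
    """Inverse-distance-weighted blend of point colors at (x, y).

    Each point pulls the result toward its color; the pull falls off with
    distance, so colors interpolate between points AND extrapolate around them.
    """
    weights = []
    for px, py in points:
        d2 = (x - px) ** 2 + (y - py) ** 2
        if d2 == 0.0:
            # exactly on a point: return that point's color unchanged
            return colors[points.index((px, py))]
        weights.append(d2 ** (-power / 2.0))
    total = sum(weights)
    return tuple(
        sum(w * c[i] for w, c in zip(weights, colors)) / total
        for i in range(len(colors[0]))
    )
```

Sampling exactly on a point gives that point’s color, and the center of four symmetric points gives an even blend – the same qualitative behavior described above.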

Control Panel

Because of the math involved, points 1 & 2 must be at the top, and 3 & 4 at the bottom, or it will start to act screwy. If it needs to rotate, you’re better off translating the whole result.


tested on Nuke 9.0v4

Houdini CloudMaker OTL


I’ve been working on some cloud-related projects, and I needed a streamlined way to make many different clouds – with art-directed shapes – quickly and simply. I’ve come up with an OTL that takes rough input geometry and then builds a wispy, fractal-y volume for rendering.

The asset is implemented in Volume VOPs, using simple noise equations to extrapolate cloud-like edges. It creates an intermediate geometry out of metaballs, then creates a volume and adds noise. With this, you can quickly and easily create simple cloud shapes and convert them into high-quality clouds for rendering.
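The “simple noise equations” at the heart of this are the classic value-noise-plus-fractal-sum recipe. Here’s a 1-D Python sketch of that layering (illustrative only – the OTL itself does this in Volume VOPs over 3-D positions):

```python
import math

def value_noise_1d(x, seed=0):
    """Deterministic hash-based value noise (illustrative, not Houdini's)."""
    def hash01(i):
        h = math.sin(i * 127.1 + seed * 311.7) * 43758.5453
        return h - math.floor(h)  # fractional part -> pseudo-random in [0, 1)
    i = math.floor(x)
    t = x - i
    t = t * t * (3.0 - 2.0 * t)   # smoothstep between lattice values
    return hash01(i) * (1.0 - t) + hash01(i + 1) * t

def fbm(x, octaves=4, lacunarity=2.0, gain=0.5):
    """Fractal sum of noise octaves -- the layering that gives wispy edges."""
    total, amp, freq, norm = 0.0, 1.0, 1.0, 0.0
    for _ in range(octaves):
        total += amp * value_noise_1d(x * freq)
        norm += amp
        amp *= gain      # each octave contributes less...
        freq *= lacunarity  # ...at a higher frequency
    return total / norm  # normalized back into [0, 1)
```

In the OTL the equivalent value perturbs the volume’s density near the metaball surface, which is what turns the smooth blobby geometry into something cloud-like.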


cloudMaker OTL
requires houdini 12.5 or higher