Importing Modules Into Unreal Blueprints Using a Blueprint Function Library

General / 03 August 2023

Intro

In my post Improving the MegaAssemblies Workflow I mentioned that the method we were using to create Blueprint Functions via Python was no longer supported by Epic and would need to be revisited. After some exploration, and a closer look at the way Epic appears to want users to interface with Python in Blueprints, I have landed on a new way to integrate Python modules into Unreal that adheres to Epic's current rules.

Prerequisites

This solution assumes you have already enabled the Python Editor Script Plugin and set up your Unreal Python environment with appropriate paths to your tools, and possibly set up any additional pathing through a startup script. More information about this process can be found here on Epic's website.
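
For reference, here is a minimal sketch of what such a startup script might look like; Unreal's Python plugin will run an init_unreal.py it finds on its Python paths at startup, and the tools directory below is purely a hypothetical placeholder for your own pipeline.

# init_unreal.py - startup script sketch; the tools path is a placeholder
import sys

# Hypothetical location of your studio's Python tools
TOOLS_PATH = "D:/ProjectTools/Python"

if TOOLS_PATH not in sys.path:
    sys.path.append(TOOLS_PATH)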

The Problem

As referenced in that previous post, Epic redirects users to the Execute Python Script, Execute Python Command, and Execute Python Command (Advanced) nodes which are available in Editor-Only Blueprints. This allows you to write Python snippets directly in the Blueprint, but it is a limited way to write code in general, and even more so at the scale of a studio pipeline built on a large code base.

When I first started investigating ways to do this dynamically, I attempted to use the Execute Python Script node, as it accepts any number of inputs and can return any number of outputs. This worked as expected, and using the network and code in the following snippets I was able to pass the name of a module and import it when running a Bluetility tool. However, I immediately found a catch with this method.

Initial tests with Execute Python Script

import importlib

# Using built in __import__ so we can pass the name of the module as a variable
tool = __import__(tool_name)    

# Reload the tool to make sure it's up to date
importlib.reload(tool)

# Run the tool via the built in run func
tool.run()

As a matter of modularity and simplicity for other Tech Artists, I tried to migrate my new Blueprint network to a Blueprint Function Library. Doing this would make creating additional Python tools for Bluetility uses much easier, as users would only have to create their Bluetility, call my import_python_tool node, and pass the name of their module. For reasons unknown, the Execute Python Script node is not available within Blueprint Function Libraries. My best guess is that Blueprint Function Libraries are currently geared toward gameplay related Blueprints, and therefore limit access to the Python nodes as Python is not supported for runtime applications. This immediately broke the plan, so back to the drawing board we went.

The Solution

During my investigation I noticed I could still call the Execute Python Command node from the Blueprint Function Library. This node only accepts a single input for our Python command, and no additional inputs or outputs are available. However, utilizing a Format Text node, we can format text very similarly to using str.format() in Python. This allows us to pass a variable into our text, then push that text to the Execute Python Command node as our script and run it. While it is effectively a workaround for the functionality of the Execute Python Script node, it does allow us to create modular Blueprint Functions for our Blueprint Function Library which accept variables and perform tasks through Python.

The Blueprint Function in our Blueprint Function Library.

In this first image is the Blueprint Function for our Blueprint Function Library. You can see that all we are doing is passing a Text Input into a Format Text node in the first argument position. One note here: I did notice this node acts up at times, and I recommend typing out your code before connecting your input, as that seemed the most stable approach for me.

import importlib

# import tool, reload tool via importlib to ensure it's up to date
import {0}
importlib.reload({0})

# run the tool via its run function
{0}.run()

In this code snippet, we are importing importlib so we can refresh our code each time it's run, which is great both for testing and for making sure any updates to the module in the Tools directory reach users even if they leave the editor open. You'll notice the {0} entries in the code, which is where we are sending our text input. You can of course use multiple inputs, designating them {0} - {N} to match the number of inputs on the Format Text node. Lastly, we call a function named run in our module, which runs our script. You can of course do this differently, but for us it was a simple standard to include a run function in all of our Unreal tools written in Python.
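
To make the formatting behavior a little more concrete, here is roughly what the final command looks like in plain Python once the text input has been substituted in. The module name my_unreal_tool is just a hypothetical placeholder:

# Plain Python illustration of what the Format Text node builds for us
command_template = (
    "import importlib\n"
    "import {0}\n"
    "importlib.reload({0})\n"
    "{0}.run()"
)

# 'my_unreal_tool' is a placeholder module name; the resulting string is what
# gets pushed to the Execute Python Command node
command = command_template.format("my_unreal_tool")
print(command)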

Using Our New Solution

Calling our Blueprint Function Library Func from a new Bluetility Function.

Now that we have a Blueprint Function Library that contains our import script, we simply need to call this function from any Bluetility in which we need to import a Python script. From there, we pass in the exact name of the module we want to import, exactly as you would type it in an import statement in Python. At this point, on calling our Bluetility, the script will import and refresh our Python module, then call its run function.

One caveat to this is that, as I alluded to earlier, Epic appears to assume that all Blueprint Function Libraries will be used in game, and to run the script we must provide a World Context input. This is easily done by calling Get Active World and passing it into the appropriate slot. This does not really do anything in our situation, but it is mandatory for the script to compile. Hopefully in the future we will see some expansion of Blueprint Function Libraries that allows us to designate a library as editor-only and bypass this.


Improving the MegaAssemblies Workflow

General / 11 August 2022

Prefabs are an incredibly common, extremely powerful tool in any world-builder's kit. If you want to place a campfire, you can combine a firepit, VFX, lights, and sound effects into one asset to be placed and updated together. However, Unreal Engine 5 doesn’t have a true prefab system. I, like many others, was ecstatic when Epic first teased the Packed Level Instance Blueprints (aka MegaAssemblies) system for Unreal 5. When I first got my hands on the early access version of UE5, I found building out MegaAssemblies to be a tedious, confusing process, even after doing it a few times. To make matters worse, the system only supports static meshes. This means a major feature of traditional prefabs – the ability to package multiple asset types together – remains missing.

In an effort to implement a pseudo-prefab system in UE5, I spent countless hours digging through docs, attempting to build them manually through various combinations of features. I landed on a happy marriage between Packed Level Instance Blueprints, Level Instances, and the default in-level editing features in UE5. This results in a more modern feel to building out “prefabs” that is as simple as selecting your assets and using the right-click context menu.

For more information on the creation of this tool, please see my longer post here.

Quick example of building a prefab from various assets, inside a level in real time.

Examples of editing the prefab, as well as the Packed Level Instance Blueprint/MegaAssembly within the prefab.


Layered Materials - Vertex Color Manager in Maya

General / 01 August 2022

In modern development layered materials are all the rage. One of the cheaper ways to denote where to assign materials is vertex color, and in this post we will explore a dynamic solution for accurately setting and managing those vertex colors in Maya. Using this system artists can avoid manually entering values and simply assign predetermined values to their selection. This reduces the tedium of assigning tightly packed values within the same channel and allows much more nuanced control over the vertex color without artists having to remember specific values or formulas. It also reduces human error by cutting down on the detailed knowledge artists are required to remember. As long as they know which vertex color channel and which layer they wish to assign their selection to, they can work mistake-free.
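
To give a rough idea of the approach, here is a minimal sketch of assigning a predetermined value to a single vertex color channel on the current selection in Maya. The layer names and values are hypothetical placeholders, not our production values:

import maya.cmds as cmds

# Hypothetical layer values; the real tool maps named layers to studio-defined values
LAYER_VALUES = {'layer_1': 0.25, 'layer_2': 0.5, 'layer_3': 0.75}

def assign_layer_to_selection(layer, channel='r'):
    """Set a single vertex color channel on the current component selection to the layer's value."""
    # polyColorPerVertex accepts per-channel flags, so only the chosen channel is touched
    kwargs = {channel: LAYER_VALUES[layer], 'colorDisplayOption': True}
    cmds.polyColorPerVertex(**kwargs)

# Example: assign the current selection to layer 2 in the green channel
assign_layer_to_selection('layer_2', channel='g')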

For more information on the creation of this tool, please see my longer post here.



Automating Light-Map Creation in Maya

General / 29 July 2022

Intro

When using light-maps in Unreal, texel density is key. However, because these light-maps can often be quite small, getting this right can be a frustrating process that requires artists to lay out their light-map UVs with the correct padding to avoid splitting pixels, which creates shadow artifacts on objects in the scene. To make matters worse, the size of the object in the scene is the ultimate determining factor for what size light-map the object should use. This leaves artists working in Maya without the proper context in which to make that decision, which leads to back and forth between Maya and Unreal, or more commonly to simply changing the light-map size in Unreal and hoping it doesn’t produce artifacts.

To improve this workflow, we can provide tools to estimate the light-map size based on the surface area of the object, as well as a suite of tools to help simplify their creation. 

Overview

As this tool was requested by my Environment team, specifically as a dropdown menu, I worked within the constraints of Maya’s dropdown options. In the future, I think this UI would be better suited to its own dockable window.

The UI


The Light-map Tools Dropdown Menu.

1. Estimate Light-Map Size will first take the surface area of the selected object, then display a window with the estimated appropriate light-map size for the object. This is also called behind the scenes to lay out the light-maps without the user needing to do anything. The main benefit here is that the user can run this operation to get the appropriate light-map size, then simply paste that value into the static mesh options in Unreal to set the correct light-map resolution when importing.

2. In the Light-Map Creation section we handle the creation of the light-map UV channel, with various options to fit our workflow, which utilizes multiple UV channels. Additionally, a checkbox determines whether we run the entire layout process at once on creation, making this a truly one-click operation.

3. In the Light-Map Layout Options section, we handle some options based on the needs of specific assets using radio buttons. These options simply determine how we handle the layout of the UV shells.

4. Layout Light-Map UVs will run the layout process with the appropriate padding for the scale of the object by first estimating the lightmap size, then using the selected options to determine the needs of the asset before laying out the UVs. 




Demonstrating the light-map estimation feature.

In the GIF above we see a quick example of three vastly different scale objects, and their estimated lightmap size. It is important to note that in order to estimate the light-map size, our transformations must be frozen.
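
If you need to freeze transforms before running the estimate, the standard Maya call (shown here for convenience) handles it on the current selection:

import maya.cmds as cmds

# Freeze translate, rotate, and scale so the measured surface area reflects the object's real scale
cmds.makeIdentity(apply=True, translate=True, rotate=True, scale=True)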



Demonstrating the light-map layout feature.

In the GIF above we see a quick example of creating and laying out our light-map UVs from our existing UVs. In the first example, we see the process of duplicating UV 0 to UV 1, then laying them out. In the second example we select the “Layout on Creation” option, and we can also see that the UVs retain their current scale ratios when laying out.

The Breakdown

The biggest benefit in this tool is the automatic estimation of the appropriate scale for the light-map. All of the other features are standard Maya options with fewer clicks, reducing the opportunity for human error by automatically applying the appropriate settings. 

Interpreting the Ask

Because the Environment Team was spending a lot of time trying to get their light-maps laid out appropriately at the right scale, the ask was a relatively open-ended request to improve that workflow by providing some tools to help the artists make those decisions. Light-maps were universally disliked and frequently complained about.

I interpreted this as an opportunity to eliminate as much of the human interaction as possible to reduce the frustration it was causing, so I set out with the following goals: 

  • Remove the guesswork from estimating the light-map size, and find a formula to produce consistent results. 
  • Remove the need to manually input values to lay out light-maps at the appropriate resolution. 
  • Reduce the amount of effort and input needed across the board. 

The Formula

This was by far the trickiest part of building this tool, and it is unfortunately not a one-size-fits-all solution for every project, because texture budgets vary and what worked for us may not be the right solution for others. Because there is no hard and fast rule on texel density for light-maps (likely because the visualizer is customizable, since the light-map budget is different for every project), I first had to confirm with our lighting team and Environment lead that we were happy with the current settings. Once we had decided on our visualizer scale, I was able to set out on trying to get everything into the “green.”

In the end, I used a series of 1m cubes scaled to different sizes, manually adjusting their light-map sizes to figure out the appropriate surface-area-to-light-map-texture-size ratio. The formula that worked for us came out to:

SizeEstimate = 12 * (√(SurfaceArea / 6) / 10)

Because artists often worked with groups for a more manageable scene, I iterated over a list of objects, using cmds.polyEvaluate to retrieve the surface area of each of the objects, then added them all together. From there, we used our formula.

import math

import maya.cmds as cmds


def calculate_lightmap_size(obj):
    """Calculate total area of selected objects, then estimate light-map size."""
    surface_area = 0

    for o in obj:
        # polyEvaluate returns the surface area of the mesh
        surface_area += cmds.polyEvaluate(o, area=True)

    size_est = 12 * (math.sqrt(surface_area / 6) / 10)
    lm_size = find_nearest_size(size_est)

    return lm_size

Making it Usable

The above formula would consistently hit exact texture sizes in the correct circumstances (i.e. a perfect cube with even measurements), but with real assets we would get odd numbers. To round to the nearest acceptable power-of-two texture size, I defined a list of accepted light-map sizes, and we compare the result of our formula against it to find the nearest option, seen below.

LM_SIZES = [16, 32, 64, 128, 256, 512, 1024, 2048]


def find_nearest_size(input_key):
    """Compare estimated ideal light-map size to nearest standard texture size."""
    return LM_SIZES[min(range(len(LM_SIZES)), key=lambda i: abs(LM_SIZES[i] - input_key))]

Calculating Padding

Now that we know what texture size we are using, the next step in automating this process is ensuring we apply the appropriate shell and tile padding values when laying out the UVs. To do this, we simply take our desired padding values in texels and divide them by the map size we found earlier.

def calculate_padding(map_size):
    """Calculate the padding value by dividing texel value by map size."""
    shell_pad = 4.0 / float(map_size)
    tile_pad = 1.0 / float(map_size)

    return shell_pad, tile_pad

Creating the UV Channel

All of the above is great, but without the proper UV channel we can’t actually do anything with this information. Because this tool was built in reaction to an existing process, I knew we already had some assets which would have light-map UV channels, and others which would not. To accommodate the varying number of UV channels, and to ensure the light-map UVs all end up in the appropriate channel, we check over our UV sets and make sure the correct set is created and selected before proceeding.

def check_uv_sets(obj, copy_uvs=True, set_to_copy=0):
    """Check UV sets for lightmap UV set. Create or rename as needed."""
    for o in obj:
        uv_sets = cmds.polyUVSet(o, query=True, allUVSets=True)

        # handle light-map naming
        if 'lightmap' not in uv_sets:
            if len(uv_sets) > 1:
                # a secondary UV set already exists, so rename it to 'lightmap'
                lm_set = uv_sets[1]
                cmds.polyUVSet(o, rename=True, newUVSet='lightmap', uvSet=lm_set)
            else:
                cmds.polyUVSet(o, create=True, uvSet='lightmap')

            # handle light-map copy
            if copy_uvs:
                copy_set = uv_sets[set_to_copy]
                cmds.polyUVSet(o, copy=True, newUVSet='lightmap', uvSet=copy_set)

            # make the light-map set the current UV set so the layout runs on it
            cmds.polyUVSet(o, currentUVSet=True, uvSet='lightmap')

Laying Out the UVs

At this point, we have calculated everything we need to lay out our UVs at the correct texel density with the correct padding, and we have ensured we have the correct channel created, populated, and selected. From here the last thing left is to lay out the UVs in the light-map channel. To do this, we simply pass the object, map size, shell and tile padding, and our scale ratio selection from the UI to pymel.other.u3dLayout. This will fail if we have non-manifold geometry, so we wrap it in a try/except and show a dialog if something goes wrong.

def layout_uvs(obj, map_size, shell_pad, tile_pad, scale_ratio=0):
    """Lays out the current UV channel with appropriate settings. Scale ratio is an int between 0-2, which changes
    the scale mode during layout. 0 = None, 1 = Preserve 3D ratio, 2 = Preserve UV ratio."""
    try:
        pymel.other.u3dLayout(obj, res=map_size, spc=shell_pad, mar=tile_pad, scl=scale_ratio)
    except Exception:
        cmds.confirmDialog(title='Something went wrong!',
                           message='Something went wrong during the UV layout process. Most commonly this is due to '
                                   'non-manifold geometry, please clean up the geometry before continuing!',
                           button=['Ok'],
                           defaultButton='Ok')

In Action

The last step in the process here is simply setting the appropriate light-map resolution on the static mesh in Unreal. To do so, we open up the static mesh details and paste the copied value from the Light-Map Estimator under General Settings > Light Map Resolution.
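
If you would rather skip the manual paste, something along these lines should also work from the Unreal Python console. Note that the asset path is a placeholder and this is only a sketch, not part of the tool itself:

import unreal

# Placeholder asset path; point this at your own static mesh
asset_path = "/Game/Meshes/SM_Example"
static_mesh = unreal.EditorAssetLibrary.load_asset(asset_path)

# Set the Light Map Resolution property to the value from the Light-Map Estimator
static_mesh.set_editor_property("light_map_resolution", 256)
unreal.EditorAssetLibrary.save_asset(asset_path)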


The Light Map Resolution option under the Static Mesh Details panel.

Before and After comparison. 