Running a NeuralNet live in Maya in a Python DG Node


Last updated 4 years ago

Procedural flower and plant classification displayed inside Autodesk Maya's UI

You’ll need these resources to follow this tutorial

DG (Dependency Graph) nodes are the atomic elements that make up a Maya scene. You can list all the DG nodes in your scene by unchecking the option to list only the acyclic ones (DAGs) in your outliner.

And you can inspect how DG nodes are connected using the Node Editor (Windows > Node Editor).

You can create DG nodes with custom inputs, outputs, and computations in Python or C++. I’ll use Python for this tutorial.

To declare your custom DG node, you’ll need to create a Python plug-in for Maya. This is just a .py file where you declare the name of your node, its attributes, and the computations it should perform. Here is the collapsed anatomy of a Python plug-in that implements a single DG node. You can download this template in the resources for this article.

import maya.api.OpenMaya as om # (1) Load OpenMaya

# (2) Inform Maya we are using OpenMaya2
def maya_useNewAPI():
    pass

# (3) Declare global node params and other global vars
nodeName = 'templateNode'
nodeTypeID = om.MTypeId(0x60011)

class templateNode(om.MPxNode): # (4) Here we declare the computation
    ...

def create():
    ...

def init(): # (5) Here we declare the node's attributes
    ...

def _toplugin(mobject):
    ...

def initializePlugin(mobject):
    ...

def uninitializePlugin(mobject):
    ...

In the code above we load the OpenMaya API (1) and inform Maya we’ll be using OpenMaya2 (2) by declaring a maya_useNewAPI function. This is a convention. Then (3) we give the node a name and a unique hexadecimal ID (more conventions). If by any chance you have another node registered under the same ID Maya will not load your plug-in.

We create a class based on OpenMaya’s MPxNode, a class for custom Maya nodes. This is where we define the computation. We define functions to create and initialize the node. We define a function to declare the properties of the plugin. And finally, we define the functions that should be called for plugin initialization and uninitialization.

The things to look out for are (4) the definition of the templateNode class and (5) the init() function.

Init function

In the init function we create all of the node’s input (1) and output (2) attributes. Attributes are based on the appropriate OpenMaya classes, such as MFnNumericAttribute for numbers and MFnTypedAttribute for other types of data. Floats should be declared with OpenMaya’s kFloat type. Inputs should be writable, while outputs should not be, for they are the result of the node’s computation.

def init():
    # (1) Setup input attributes
    nAttr = om.MFnNumericAttribute()  # Maya's numeric attribute class
    kFloat = om.MFnNumericData.kFloat  # Maya's float type
    templateNode.a = nAttr.create('a', 'a', kFloat, 0.0)
    nAttr.hidden = False
    nAttr.keyable = True
    templateNode.b = nAttr.create('b', 'b', kFloat, 0.0)
    nAttr.hidden = False
    nAttr.keyable = True
    # (2) Setup the output attribute
    templateNode.result = nAttr.create('result', 'r', kFloat)
    nAttr.writable = False
    nAttr.storable = False
    nAttr.readable = True
    # (3) Add the attributes to the node
    templateNode.addAttribute(templateNode.a)
    templateNode.addAttribute(templateNode.b)
    templateNode.addAttribute(templateNode.result)
    # (4) Set the attribute dependencies
    templateNode.attributeAffects(templateNode.a, templateNode.result)
    templateNode.attributeAffects(templateNode.b, templateNode.result)

After adding attributes to the node (3) you’ll need to specify which inputs trigger the computation of specific outputs (4). In our template example we’ll sum the values of ‘a’ and ‘b’, thus changes in both ‘a’ and ‘b’ affect the result.
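The dirty-propagation idea behind attributeAffects can be sketched outside Maya as a toy dependency graph. This is purely illustrative (none of the names below are Maya API): changing an input marks its dependent outputs dirty, and outputs are recomputed lazily, only when requested.

```python
# Toy sketch of dependency-graph dirty propagation -- NOT the Maya API.
# affects plays the role of attributeAffects: when 'a' or 'b' changes,
# 'result' is marked dirty and recomputed only when someone asks for it.
class ToyNode:
    def __init__(self):
        self.values = {'a': 0.0, 'b': 0.0}
        self.affects = {'a': ['result'], 'b': ['result']}
        self.dirty = {'result'}   # nothing computed yet
        self.cache = {}

    def set_attr(self, name, value):
        self.values[name] = value
        self.dirty.update(self.affects.get(name, []))  # mark outputs dirty

    def get_attr(self, name):
        if name in self.dirty:                         # lazy recompute
            self.cache[name] = self.values['a'] + self.values['b']
            self.dirty.discard(name)
        return self.cache[name]

node = ToyNode()
node.set_attr('a', 1.5)
node.set_attr('b', 2.0)
print(node.get_attr('result'))  # 3.5
```

Maya's evaluation engine does something conceptually similar: compute() is only called for plugs that are dirty and being pulled on.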

Computation

The computation is defined in the compute method of our MPxNode based class.

class templateNode(om.MPxNode):
    '''A template Maya Python DG Node.'''
    def compute(self, plug, datablock):
        # (1) Get handles from MPxNode's data block
        aHandle = datablock.inputValue(templateNode.a)
        bHandle = datablock.inputValue(templateNode.b)
        resultHandle = datablock.outputValue(templateNode.result)

        # (2) Get data from handles
        a = aHandle.asFloat()
        b = bHandle.asFloat()

        # (3) Compute
        c = a+b

        # (4) Output
        resultHandle.setFloat(c)

The data comes from Maya’s dependency graph through the data block. From it, we retrieve the handles for inputs and outputs. We then retrieve values from the input’s handles, perform the computations, and set the values in the output handles.

Result

If your code is correct you can tell Maya about the location of your plug-in. Edit the Maya.env file (that lives in your Username\Documents\Maya\MayaVersion\ folder) and include the following line:

MAYA_PLUG_IN_PATH="FolderWhereYourPluginLives"
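For example, on Windows the entry in Maya.env might look like the following (the folder path here is illustrative; point it at the actual folder containing your .py file):

```
MAYA_PLUG_IN_PATH=C:\Users\yourname\Documents\maya\plug-ins
```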

Load your plug-in in the Maya plug-in manager, and search for the name of your node in Maya’s Node Editor. You can test if computations are being performed correctly by changing the inputs and hovering your mouse over the outputs to see the results. Now let’s implement a Neural Network inside this Python DG node.

Running a Neural Network model inside the DG node

Input and output attributes

In this example, we will use the Neural Network we trained in the previous tutorial, which classifies types of plants based on the sizes of petals and sepals. If you haven’t followed that tutorial, I highly recommend you do so; you can download the trained model and the Maya scene in the resources for this article. After changing the name and ID of our template node, we’ll change its input and output attributes to match those of the neural network. The trained model has 4 inputs (sepal length, sepal width, petal length, and petal width) and 3 outputs (the probabilities of being type Setosa, Virginica, or Versicolor). For simplicity, we won’t output each probability as a separate scalar value, but rather one string with the list of all probabilities and another string with the name of the winner. This is how the init function should look:

def init():
    # (1) Setup input attributes
    nAttr = om.MFnNumericAttribute()
    tAttr = om.MFnTypedAttribute()
    kFloat = om.MFnNumericData.kFloat
    kString = om.MFnData.kString
    irisModel.sepalLen = nAttr.create('sepalLength','sl', kFloat, 0.0)
    nAttr.hidden = False
    nAttr.keyable = True
    irisModel.sepalWid = nAttr.create('sepalWidth','sw', kFloat, 0.0)
    nAttr.hidden = False
    nAttr.keyable = True
    irisModel.petalLen = nAttr.create('petalLength','pl', kFloat, 0.0)
    nAttr.hidden = False
    nAttr.keyable = True
    irisModel.petalWid = nAttr.create('petalWidth','pw', kFloat, 0.0)
    nAttr.hidden = False
    nAttr.keyable = True

    #(2) Setup the output attributes
    irisModel.win = tAttr.create('winner', 'w', kString)
    tAttr.writable = False
    tAttr.storable = False
    tAttr.readable = True
    irisModel.prob = tAttr.create('probabilities', 'p', kString)
    tAttr.writable = False
    tAttr.storable = False
    tAttr.readable = True

    #(3) Add the attributes to the node
    irisModel.addAttribute(irisModel.sepalLen)
    irisModel.addAttribute(irisModel.sepalWid)
    irisModel.addAttribute(irisModel.petalLen)
    irisModel.addAttribute(irisModel.petalWid)
    irisModel.addAttribute(irisModel.win)
    irisModel.addAttribute(irisModel.prob)

    #(4) Set the attribute dependencies
    irisModel.attributeAffects(irisModel.sepalLen, irisModel.win)
    irisModel.attributeAffects(irisModel.sepalWid, irisModel.win)
    irisModel.attributeAffects(irisModel.petalLen, irisModel.win)
    irisModel.attributeAffects(irisModel.petalWid, irisModel.win)
    irisModel.attributeAffects(irisModel.sepalLen, irisModel.prob)
    irisModel.attributeAffects(irisModel.sepalWid, irisModel.prob)
    irisModel.attributeAffects(irisModel.petalLen, irisModel.prob)
    irisModel.attributeAffects(irisModel.petalWid, irisModel.prob)

Model computation

To load the trained model and feed data to it you’ll need to load the Keras and Numpy libraries, so make sure you add the following code to the beginning of your Python plug-in:

import numpy as np
from keras.models import load_model

You can change the computation of the node to use the new input and output attributes that were created (1). Then we get the float values from the inputs and make NumPy floats out of them so Keras doesn’t throw warnings (2). Build an NP array to feed your model, as discussed in the previous tutorial, load the model from the ‘h5’ file, and get the predictions (3). Once that is done you can set the ‘winner’ and ‘probabilities’ outputs (4).

class irisModel(om.MPxNode):
    '''A node computing the outputs of a Neural Network trained for the classification of plants.'''
    def compute(self, plug, data):
        # (1) Get data handles
        slHandle = data.inputValue(irisModel.sepalLen)
        swHandle = data.inputValue(irisModel.sepalWid)
        plHandle = data.inputValue(irisModel.petalLen)
        pwHandle = data.inputValue(irisModel.petalWid)
        winHandle = data.outputValue(irisModel.win)
        probHandle = data.outputValue(irisModel.prob)

        # (2) Get input data
        sepalLen = np.float32(slHandle.asFloat())
        sepalWid = np.float32(swHandle.asFloat())
        petalLen = np.float32(plHandle.asFloat())
        petalWid = np.float32(pwHandle.asFloat())

        # (3) Compute output
        plantData = np.array([sepalLen,sepalWid,petalLen,petalWid])
        plantData = plantData.reshape((1,4))
        model = load_model('C:\\Users\\gusta\\Downloads\\iris.h5')
        prediction = model.predict(plantData)

        # (4) Output value
        winHandle.setString(str(np.argmax(prediction)))
        probHandle.setString(str(prediction))

Performance-wise, there is a problem with this code: we are loading the model from disk at every evaluation. From a usability perspective, it would also be better to load the model with a file browser and to output the winner’s name instead of its index. Let’s address these issues.

To load the model only once you can define it outside of the computation node. One easy way to do this would be to declare it alongside your global variables, as such:

# Declare global node params and other global vars
nodeName = 'irisModel'
nodeTypeID = om.MTypeId(0x60006)
model = load_model('C:\\Users\\gusta\\Downloads\\iris.h5')

Removing hardcoded file path

If you don’t want to hardcode the path to the model, and you eventually want to make it a node attribute, a better alternative is to create a cache that only gets updated when the path to the model changes:

# Implement a class for caching loaded models
class ModelCache:
    '''An interface for loading and caching Keras models.'''
    filePath = ''
    model = None
    def getOrLoad(self, filePath):
        if filePath == self.filePath:   # cache hit: skip the disk read
            return self.model
        self.filePath = filePath
        self.model = load_model(filePath)
        return self.model
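Outside Maya, you can sanity-check the caching behavior by substituting a stub for Keras’s load_model. Everything below is a hypothetical test harness (the stub and the file names are made up), not production code:

```python
load_calls = []

def load_model(filePath):
    '''Stub standing in for keras.models.load_model.'''
    load_calls.append(filePath)
    return 'model:' + filePath

class ModelCache:
    '''An interface for loading and caching models.'''
    filePath = ''
    model = None
    def getOrLoad(self, filePath):
        if filePath == self.filePath:   # cache hit: skip the disk read
            return self.model
        self.filePath = filePath
        self.model = load_model(filePath)
        return self.model

cache = ModelCache()
cache.getOrLoad('iris.h5')
cache.getOrLoad('iris.h5')          # second call is served from the cache
assert load_calls == ['iris.h5']    # the loader ran only once
```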

If you choose to use this ModelCache you’ll need to create a global instance of it that can be called during computation time, like so:

# Declare global node params and other global vars
nodeName = 'irisModel'
nodeTypeID = om.MTypeId(0x60006)
modelCache = ModelCache() # an instance of our model caching and loading class

To expose the model path to the user, create a string attribute in the init function (and register it with addAttribute, like the other attributes):

irisModel.filePath = tAttr.create('filePath', 'fp', kString)
tAttr.usedAsFilename = True

You can load this input attribute as you would any other. If you need a detailed implementation of this code please check the resources for this article.

Adding class (plant) names

Finally, to output the name of the winning class instead of its index, all you need is a dictionary like this:

plantNames = {
    0: 'Iris Setosa',
    1: 'Iris Virginica',
    2: 'Iris Versicolor'
}
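Wiring the dictionary into the compute method means looking up the winning index in plantNames before setting the output. Here is a pure-Python sketch of that lookup (the probabilities are hypothetical, and a max over indices stands in for np.argmax):

```python
plantNames = {
    0: 'Iris Setosa',
    1: 'Iris Virginica',
    2: 'Iris Versicolor'
}

# Hypothetical output of model.predict for one sample: shape (1, 3)
prediction = [[0.02, 0.91, 0.07]]

# Index of the highest probability, like np.argmax(prediction)
winnerIndex = max(range(3), key=lambda i: prediction[0][i])
winner = plantNames[winnerIndex]
print(winner)  # Iris Virginica
```

Inside the node, the output line then becomes winHandle.setString(plantNames[int(np.argmax(prediction))]) instead of setting the raw index.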

In conclusion

When everything is ready and properly connected it should look like this:

Iris model running live inside a custom Python DG node

In the first tutorial of this series, you learned how to train a Neural Network to classify any group of scalar values in your Maya scene. You have seen how this works in the example of a procedural flower whose parameters can be used to classify its type. In that first tutorial, the classification was done from within Maya’s script editor. Now you have learned how to do the same computation directly inside Maya’s dependency graph. This means you can use your Neural Network’s output to drive any other node in Maya interactively! In the image above these outputs influence annotation objects, but they could drive any other Maya nodes. I hope you can see the potential in this.
