How To
If you want to set different hotkeys or buttons to toggle blendshapes, animations, or anything else, you can do so with either the Hotkey node or the WebSocket node.
The Hotkey node can be set to trigger with specific keyboard inputs, but it will not work if the program is in GPU Priority mode or running with admin privileges.
The WebSocket node triggers with specified messages received via WebSocket from other programs, like the StreamDeck or Streamer.bot. To use this node, you will need to activate the WebSockets Receiver under Settings > Misc > WebSockets.
To set up buttons on your StreamDeck that send WebSocket commands, you will need to install a WebSocket plugin from Elgato's Marketplace. Then make a new WebSocket Message button. Put the VNyan address in the URL field and your toggle message in the Message field. The message can be anything; it just needs to match what will be put in the VNyan node.
The VNyan address will be the following, with [PORT] being replaced by your port number:
ws://localhost:[PORT]/vnyan
Below is an image of what this will look like in StreamDeck:
In VNyan, make a WebSocket Command node and put the same text that you entered as the Message. Now, anytime that button is pressed, this node will trigger. You can now wire it to what you need. Below is an example of a WebSocket toggle to toggle the Fun blendshape clip:
Tutorial on using WebSockets to set up VNyan buttons on StreamDeck: https://youtu.be/EEgqxriPWMM?si=9_DEqX30DN6LoTvl
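If you want to test a toggle without a StreamDeck, a small script can send the same kind of message. Below is a minimal Python sketch assuming the third-party websocket-client package; the port number and the "ToggleFun" message are placeholders, so swap in your own port and whatever text you put in your WebSocket Command node.
```python
# Minimal sketch for testing a toggle without a StreamDeck, assuming the
# third-party websocket-client package (pip install websocket-client).
# Replace the port with the one from Settings > Misc > WebSockets, and the
# message with whatever text you put in your WebSocket Command node.
from websocket import create_connection

ws = create_connection("ws://localhost:8000/vnyan")  # 8000 is a placeholder port
ws.send("ToggleFun")                                 # placeholder toggle message
ws.close()
```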
If you want your graph to run continuously, you can use a Timer node that triggers itself with a Start Timer node to create a loop. This can be used if you want to continuously animate your model's bones with Rotation nodes, or if you want something to trigger when a blendshape passes a certain value.
Adding an Application Start node will have this loop begin when VNyan starts up. Setting the Milliseconds to trigger to 0 ensures that this loop will run every frame.
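As a rough code analogy (not how VNyan runs internally, just an illustration of the pattern), the loop is a callback that re-arms its own timer, started once at application start:
```python
# Rough code analogy of the loop: a callback that re-arms its own timer,
# started once at application start. All function names here are made up.
import threading

def do_per_frame_work():
    pass                      # placeholder for whatever the Timer node is wired to

def start_timer(milliseconds=0):
    # "Start Timer": schedule the Timer to fire after the given delay.
    # 0 ms mirrors "Milliseconds to trigger = 0", i.e. run again right away.
    threading.Timer(milliseconds / 1000.0, on_timer).start()

def on_timer():
    # "Timer" node: do the work, then re-arm itself to keep the loop going.
    do_per_frame_work()
    start_timer()

start_timer()                 # "Application Start": kick the loop off once
```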
If you want to create a toggle for your graph to turn parts or all of it on or off, you can use a toggle system like in the below image.
Set up a parameter to be a "Flag" which will control the graph's on and off state. By using a Parameter Filter node that reads this flag parameter (the green node on the bottom right), we can control whether anything connected to it runs or not. Using the Application Start node, we can make sure the initial state of our toggle is "on".
We can toggle this parameter to flip between on and off states using a setup like the four nodes in the middle of the image. Anytime the Toggle node is triggered, it will check the current state of your flag, and flip it.
With this set up, you can use a Call Trigger node (black node shown in the example) in other graphs or linked to a hotkey or websocket to toggle your graph on and off.
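In code terms, this toggle system boils down to a boolean flag, a trigger that flips it, and a gate that checks it. The Python sketch below (all names made up for illustration) mirrors that structure:
```python
# Rough analogy of the toggle graph: a flag parameter, a trigger that flips
# it, and a gate (Parameter Filter) that only lets the rest of the graph run
# while the flag is on. All names here are made up for illustration.
graph_enabled = True          # Application Start: initial toggle state is "on"

def toggle_graph():
    """Stands in for the Toggle trigger: flip the flag each time it fires."""
    global graph_enabled
    graph_enabled = not graph_enabled

def run_graph_body():
    """Stands in for the Parameter Filter gate in front of the toggled nodes."""
    if not graph_enabled:
        return                # flag is "off": nothing downstream runs
    print("graph body ran")   # placeholder for the toggled part of the graph

run_graph_body()              # runs
toggle_graph()                # flag is now "off"
run_graph_body()              # filtered out, nothing happens
```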
VNyan has a ChatBlob object as a droppable by default that you can use. These blobs have a text field above them that you can use to show usernames of chat members who trigger the droppable.
The object looks for text from a text parameter called <redeemer>. If this is not set, then you will see the text "<redeemer>" above the object. To change this, you will need to use a Set Text Parameter node beforehand to change the <redeemer> parameter to whatever you want.
If you want to show the username of someone who triggered a chat blob drop (like with a Twitch redeem), you could use a setup like the one below. The Channel Points node sets a built-in parameter called <username> with the redeemer's name, so you can copy that into <redeemer> to show usernames above the Chat Blob.
VNyan offers a REST API that listens to POST/GET messages sent to the address:
http://localhost:8069
The body of the POST message should be JSON in the following format:
{
    "action": "ActionName",
    "payload":
    {
        "key1": "value",
        "nyan": "nyaa"
    }
}
The action can be named anything. In VNyan, you'll then need to add an API Message node and type the same action name into the Action field. The node will then trigger whenever a message with that action is sent. The payload part of the JSON contains key/value pairs of strings that can be read into a VNyan dictionary, which lets you pass custom values into the application.
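For example, sending the message above from Python might look like the following sketch (using the requests package; "MyAction" and the payload keys are only examples and need to match your API Message node and the keys you read from the dictionary):
```python
# Minimal sketch of calling the REST API from Python with the requests
# package (pip install requests). "MyAction" and the payload keys are only
# examples; they need to match your API Message node and whatever keys you
# read out of the dictionary in VNyan.
import requests

body = {
    "action": "MyAction",
    "payload": {
        "key1": "value",
        "nyan": "nyaa",
    },
}

requests.post("http://localhost:8069", json=body)
```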
Since VNyan release 1.4.1, it is also possible to use GET messages. For GET messages, everything should be passed on the query string. The action name must be set in the action parameter, and every other parameter will be added to the output dictionary as a key/value pair. GET messages do not use the JSON data. Use the following format:
http://localhost:8069/?action=ActionName&value1=nyaa&nyan2=NyanNyan
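The equivalent GET request from Python could look like this (again a sketch with the requests package, reusing the example values from the URL above):
```python
# The same idea as a GET request: the action goes in the "action" query
# parameter and every other parameter becomes a key/value pair in the
# output dictionary. Sketch only, using the example values from above.
import requests

requests.get(
    "http://localhost:8069/",
    params={"action": "ActionName", "value1": "nyaa", "nyan2": "NyanNyan"},
)
```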
You can find an extension by Swolekat for connecting SAMMI to VNyan here: https://github.com/swolekat/vnyan-sammi-extension/tree/v1.0.0
To send commands to VNyan from Mix It Up, you can use the SendWebSocket plugin by LumKitty: https://github.com/LumKitty/SendWebSocket Place the exe file somewhere in your file directory. In Mix It Up, create an Action for an External Program (under Action Groups). Point the program path to the exe file, and send the following as your argument (replacing [WEBSOCKETMESSAGE] with the message you want to listen for in VNyan):
ws://localhost:8000/vnyan [WEBSOCKETMESSAGE]
The Twitch Integrated Throwing System (T.I.T.S.) by remasuri3 is a VTuber program for throwing 3D objects at your model. VNyan has its own built-in throw system, but if you would like to use T.I.T.S. as well, you can use VNyan's VMC Layers.
T.I.T.S. comes with a VSeeFace connection mode that will work with VNyan as well.
- In T.I.T.S., start the VSeeFace Connection and copy (or change) the port number
- In VNyan, enter that port number into one of the VMC Receiver settings.
- Set the Track Spine slider to 0.5 and all other sliders to 0.
- In either Webcam Tracking (if you use webcam) or ARKit Tracking (if you use phone tracking), set the Apply Spine slider to 0.5.
That should be it and you will see your model react properly!
Notes
T.I.T.S. applies its effect to your model's Spine and Chest bones, which is why only the Spine slider needs to be applied. Webcam and ARKit tracking override VMC tracking, which is why we need to turn down the sliders on those tracking systems as well.
Node Graphs can override VMC tracking too, such as if you have a pose, animation or MMD file playing. In those cases, you would have to turn down the Apply Spine sliders for those nodes as well.
Bone Rotation nodes will also override this tracking, but Additive Bone Rotation nodes will not.
Here are a bunch of example graphs for you to use as reference in your projects! Feel free to adapt or modify any of these.
This is a sample graph that reads the amount of bits cheered and splits them into 1-bit, 100-bit, and 1000-bit throwables.
Replace the items in the three Throw Item nodes with whatever you want to throw. Keep in mind that 99 bits will throw 99 items, while 100 bits throws only one of the 100-bit items, so even though the number is correct, the visual effect is much smaller. You could also set the minimum amount to trigger to 100.
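The underlying arithmetic is just integer division of the cheered amount into 1000-, 100-, and 1-bit buckets, roughly like this Python sketch (the function name is illustrative):
```python
# Illustrative arithmetic behind the split: integer-divide the cheered amount
# into 1000-bit, 100-bit and 1-bit throwables.
def split_bits(amount):
    thousands, remainder = divmod(amount, 1000)
    hundreds, ones = divmod(remainder, 100)
    return thousands, hundreds, ones

print(split_bits(99))    # (0, 0, 99) -> 99 small items
print(split_bits(100))   # (0, 1, 0)  -> a single 100-bit item
print(split_bits(2345))  # (2, 3, 45)
```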
This graph takes in a string as <input>, lets you set a word in <wordToFind>, and counts how many times the word appears in your input. This could be used to filter chat messages and see how often certain keywords appear.
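One plain-code reading of that logic is below (a Python sketch; the actual graph may match words differently, for example by substring rather than whole words):
```python
# One plain-code reading of the graph: count case-insensitive, whole-word
# matches of <wordToFind> inside <input>.
def count_word(input_text, word_to_find):
    return input_text.lower().split().count(word_to_find.lower())

print(count_word("nyan nyan nyaa NYAN", "nyan"))  # 3
```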
This graph uses the motion of your model's right hand to make your connected Lovense device vibrate.
There are two components to this. The first (purple nodes) uses a loop to check the change in position (calculated as [DeltaPos]) of your right hand bone. It uses a buffer parameter so the value increases and decreases gradually depending on whether your hand is in motion. The loop polls every 100 ms to give it time to detect the motion; if it ran at 0 ms, [DeltaPos] would only ever be very small and would change very rapidly.
The second component (red nodes) is a separate loop that applies the calculated [MovementBuffer] to vibration strength.
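As a rough sketch of what those two loops are doing (Python, with every function and constant standing in for a VNyan node; the exact scaling and easing in the real graph may differ):
```python
# Rough sketch of the two loops: poll the right-hand position every 100 ms,
# compute how far it moved ([DeltaPos]), ease a [MovementBuffer] value up or
# down, and apply that buffer as the vibration strength. Every function and
# constant below is a stand-in for the corresponding VNyan node.
import math
import time

def get_right_hand_position():
    return (0.0, 0.0, 0.0)                # stand-in for reading the hand bone position

def set_vibration_strength(strength):
    pass                                   # stand-in for the Lovense vibration node

movement_buffer = 0.0
last_pos = get_right_hand_position()

while True:
    time.sleep(0.1)                        # 100 ms polling rate so the motion is measurable
    pos = get_right_hand_position()
    delta_pos = math.dist(pos, last_pos)   # [DeltaPos]: distance moved since the last poll
    last_pos = pos

    # Ease the buffer toward "moving" or "still" instead of jumping instantly.
    target = min(delta_pos * 10.0, 1.0)    # scale factor chosen arbitrarily
    movement_buffer += (target - movement_buffer) * 0.2

    set_vibration_strength(movement_buffer)  # second loop: apply the buffer as strength
```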
VNyan - https://suvidriel.itch.io/vnyan