have a gltf file and want to play back its animation on a video stream #63
I know Jellyfish / Membrane has a demo videoroom for WebRTC, but the earlier steps are unclear.
Hi @fire!
You have to fragment the data you want to send. In the case of Jellyfish, we only allow WebRTC and RTSP ingress, so you would have to generate such a stream and send it to Jellyfish. In the case of Membrane, you could send data on your own using e.g. UDP or TCP, but with UDP you have to deal with packet loss and congestion control yourself, so I don't think it's a good idea.
We don't support this in Jellyfish, at least at the moment. In Membrane, you can read some image, for example a PNG, decode it, and treat it as one raw video frame. Then just emit it at the framerate you want. The other way would be to do a similar thing on the client side and send it via WebRTC to Jellyfish. We do something similar in our dashboard: we have a very simple static image, we create a MediaStreamTrack from a canvas with some framerate, and send it as a WebRTC stream to Jellyfish.
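The server-side variant described above (decode an image once, then emit it repeatedly at a fixed framerate) can be sketched outside of Membrane as a plain generator. Everything here is illustrative: the 4x4 raw RGB frame and the 25 fps rate are made-up values, not anything from the Jellyfish/Membrane APIs.

```python
from fractions import Fraction

def emit_static_frames(frame: bytes, fps: int, count: int):
    """Yield (pts_seconds, frame_bytes) pairs for a single decoded image
    repeated at a fixed framerate, mimicking a raw video source."""
    interval = Fraction(1, fps)
    for i in range(count):
        yield (i * interval, frame)

# A hypothetical 4x4 RGB24 "decoded PNG": 4*4 pixels * 3 bytes each, solid red.
raw_frame = bytes([255, 0, 0]) * 16

timestamps = [float(pts) for pts, _ in emit_static_frames(raw_frame, fps=25, count=5)]
print(timestamps)  # → [0.0, 0.04, 0.08, 0.12, 0.16]
```

A real pipeline would pace the emission with a timer (or let the sink pull on demand) instead of yielding as fast as possible; the timestamps are what a downstream encoder would use as PTS.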
I'm not sure if I follow you. What would you like to render in Blender or Godot?
You mentioned the approach of reading an image, for example a PNG, decoding it, and treating it as one raw video frame, then emitting it at the desired framerate. So one way is that, for example, Blender would take the glTF, do the processing, and emit 4K output PNG frames (say 500 of them) one by one. Edit: the link I attached is a 3D animation.
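The Blender step described above could be driven headless from a server process. The `-b`, `-P`, `-o`, `-F`, `-s`, `-e`, and `-a` flags are Blender's standard command-line options (background mode, run a Python script, output pattern, format, start/end frame, render animation); the import script name and glTF path are hypothetical placeholders, as is the `--` convention of passing extra arguments through to the script.

```python
import shlex

def blender_render_command(import_script: str, gltf_path: str,
                           out_pattern: str, start: int, end: int) -> list[str]:
    """Build a headless Blender invocation: run a Python script (which
    would import the glTF), then render the animation frames to PNG."""
    return [
        "blender", "-b",          # background (no UI)
        "-P", import_script,      # script that loads the glTF into the scene
        "-o", out_pattern,        # e.g. /tmp/frames/frame_#### -> frame_0001.png
        "-F", "PNG",
        "-s", str(start),
        "-e", str(end),
        "-a",                     # render the whole animation
        "--", gltf_path,          # passed through to the script via sys.argv
    ]

cmd = blender_render_command("import_gltf.py", "model.gltf",
                             "/tmp/frames/frame_####", 1, 500)
print(shlex.join(cmd))
```

A Membrane source element could then watch the output directory (or read Blender's stdout) and push each finished PNG downstream as it appears.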
I'm interested in WebRTC ingress. Do you think REST ingress is a good idea? I think it's possible to treat the Blender render frames as a Membrane pipeline, but it needs the large glTF file, some parameters, and Blender.
I'm interested in coding some of these features if you're able to draft a design.
Could you please describe your use case more precisely? What do you want to achieve? Where is the client side and where is the server side?
**Blender Render Frames as a Membrane Pipeline**

```mermaid
sequenceDiagram
    participant C as Client
    participant S as Server
    C->>S: Uploads glTF file
    S->>S: Loads glTF file
    loop For each frame in model.frames
        S->>S: Processes frame with parameters
        S->>S: Saves processed frame
    end
```
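The server-side loop in the diagram above can be sketched as plain Python. Every type and function here (the `Model` stand-in, `load_gltf`, `process_frame`) is a hypothetical placeholder for illustration, not a real Membrane or glTF API.

```python
from dataclasses import dataclass, field

@dataclass
class Model:
    # Stand-in for a parsed glTF: just a list of frame indices here.
    frames: list[int] = field(default_factory=list)

def load_gltf(path: str) -> Model:
    """Placeholder loader: a real server would parse the uploaded glTF."""
    return Model(frames=list(range(3)))

def process_frame(frame: int, params: dict) -> bytes:
    """Placeholder for rendering one frame with the given parameters."""
    return f"frame-{frame}-scale-{params['scale']}".encode()

def render_all(path: str, params: dict) -> list[bytes]:
    """The loop from the diagram: load, then process and save each frame."""
    model = load_gltf(path)
    saved = []
    for frame in model.frames:
        saved.append(process_frame(frame, params))
    return saved

print(render_all("model.gltf", {"scale": 2}))
```

In a Membrane design, `load_gltf` would live in the element's init callback and the loop body would become the demand-driven buffer-producing callback.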
**OSC-Delivered Person Animation Frames from Motion Capture Solutions**

```mermaid
sequenceDiagram
    participant GMP as Google Media Pipe
    participant OSC as OSC
    participant BR as Blender Render
    participant T as Twitch
    loop For each person in motion capture data
        GMP->>OSC: Sends person animation frames in real-time
        loop For each frame in animation frames
            OSC->>BR: Translates frame into membrane pipeline format
            BR->>BR: Processes frame as membrane pipeline in real-time
        end
        BR->>T: Streams processed frames as video in real-time
    end
```
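A real implementation of the OSC hop in the diagram would use a library such as python-osc, but the wire format is simple enough to sketch with the stdlib. This is a minimal float-only encoder/decoder following the OSC 1.0 rules (null-terminated strings padded to 4-byte boundaries, a type tag string starting with `,`, big-endian float32 arguments); the `/pose/hip` address is a made-up example.

```python
import struct

def _pad(s: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per the OSC 1.0 spec."""
    s += b"\x00"
    return s + b"\x00" * (-len(s) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    """Encode an OSC message carrying only float32 arguments."""
    tags = "," + "f" * len(floats)
    return (_pad(address.encode()) + _pad(tags.encode())
            + b"".join(struct.pack(">f", f) for f in floats))

def parse_osc(data: bytes):
    """Decode the address and float32 arguments of a float-only message."""
    addr_end = data.index(b"\x00")
    address = data[:addr_end].decode()
    off = (addr_end + 4) & ~3            # skip padding after the address
    tag_end = data.index(b"\x00", off)
    tags = data[off:tag_end].decode()
    off = (tag_end + 4) & ~3             # skip padding after the type tags
    args = [struct.unpack_from(">f", data, off + 4 * i)[0]
            for i, t in enumerate(tags[1:]) if t == "f"]
    return address, args

msg = osc_message("/pose/hip", 0.5, 1.25)
print(parse_osc(msg))  # → ('/pose/hip', [0.5, 1.25])
```

Such messages would typically arrive on a UDP socket; the "translates frame into membrane pipeline format" step would then repack the decoded bone values into whatever buffer format the Blender-driving pipeline expects.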
I'm trying to discover a design for this sample problem.
See also https://lox9973.com/ShaderMotion/player-gltf.html, which takes a video capture and plays back 3D character motion from the video frames.