Interactivity / live streaming for use in explanatory / educational apps #2775
Replies: 3 comments
-
Also, the above description could be considered level 0 of interactivity. There could then be a level 1, which would integrate interactive features such as panning and zooming from the dash/plotly library, so that a user can visualize a function, see how it reacts to parameter changes, and zoom in/out, pan, etc. for better usability. Now, how could this be achieved?
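A toy sketch of what that "level 1" could mean, in plain Python. Everything here (`InteractivePlot`, `set_param`, `zoom`) is a hypothetical illustration, not a manim or dash/plotly API: a plot re-samples its function whenever a parameter or the viewport changes, which is the core of what a slider or zoom widget would drive.

```python
import math
from typing import Callable, List, Tuple

class InteractivePlot:
    """Toy model of 'level 1' interactivity (hypothetical, not a manim API):
    recompute the sampled curve whenever a parameter or the viewport changes."""

    def __init__(self, func: Callable[[float, float], float],
                 param: float = 1.0,
                 viewport: Tuple[float, float] = (-5.0, 5.0),
                 samples: int = 11):
        self.func = func          # f(x, param) -> y
        self.param = param
        self.viewport = viewport
        self.samples = samples
        self.points = self._sample()

    def _sample(self) -> List[Tuple[float, float]]:
        x0, x1 = self.viewport
        step = (x1 - x0) / (self.samples - 1)
        return [(x0 + i * step, self.func(x0 + i * step, self.param))
                for i in range(self.samples)]

    def set_param(self, value: float) -> None:
        """React to a slider change by re-sampling the curve."""
        self.param = value
        self.points = self._sample()

    def zoom(self, factor: float) -> None:
        """Shrink or grow the viewport around its center (zoom in/out)."""
        x0, x1 = self.viewport
        center, half = (x0 + x1) / 2, (x1 - x0) / 2 * factor
        self.viewport = (center - half, center + half)
        self.points = self._sample()

plot = InteractivePlot(lambda x, a: a * math.sin(x))
plot.set_param(2.0)   # user moves a slider: amplitude doubles
plot.zoom(0.5)        # user zooms in: viewport shrinks to (-2.5, 2.5)
```

In a dash/plotly front end, `set_param` and `zoom` would be called from widget callbacks, and the recomputed `points` would be pushed back to the figure.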
-
Hi @AdamJel. As you can see by searching the repository or the Discord channel, we have discussed and considered interactivity a number of times. As you have already seen, people are currently working on a livestreaming feature. In the case of interactivity, however, the general sense is that it cannot be done with the technology currently underlying manim; a major rewrite would be necessary to allow for it. @eulertour is working on something that will eventually lead to that, and I will let him speak about it. If you are interested in implementing such features, any contributions are more than welcome!
-
I'm currently working on both a JS client for displaying graphics with WebGL and the interface in manim to communicate with it. It will lay the groundwork for the type of interactivity you're describing by allowing animations to play asynchronously and in real time, but it will be a bit more work after that to get to the point where the graphics will be reflected from a live terminal.
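To make the asynchronous, real-time part concrete, here is a stdlib-only sketch of that kind of frame streaming. The protocol and all names (`serve_frames`, the JSON frame payload) are invented for illustration; this is not the actual interface being built. The renderer side pushes serialized frames over a socket as they are produced, and the client consumes them as they arrive rather than after rendering finishes:

```python
import asyncio
import json

async def serve_frames(reader, writer, n_frames=3):
    """Renderer side (hypothetical): push JSON-encoded frames, one per
    line, as soon as each is ready, then close the connection."""
    for i in range(n_frames):
        frame = {"frame": i, "mobjects": [{"id": "circle", "x": i * 0.1}]}
        writer.write((json.dumps(frame) + "\n").encode())
        await writer.drain()
    writer.close()
    await writer.wait_closed()

async def receive_frames(host, port):
    """Client side (in reality a WebGL JS client): draw frames as they
    arrive, in real time, instead of waiting for a finished video."""
    reader, writer = await asyncio.open_connection(host, port)
    frames = []
    while line := await reader.readline():
        frames.append(json.loads(line))
    writer.close()
    return frames

async def main():
    # Port 0 lets the OS pick a free port for this demo.
    server = await asyncio.start_server(serve_frames, "127.0.0.1", 0)
    port = server.sockets[0].getsockname()[1]
    frames = await receive_frames("127.0.0.1", port)
    server.close()
    await server.wait_closed()
    return frames

frames = asyncio.run(main())
```

A live terminal session would then be one more producer feeding `serve_frames`-style updates instead of a fixed animation.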
-
Description of proposed feature
Hi, IMO it would be great if the output of this library were not only video but also some sort of permanent live stream, which could be used for interactivity. By interactivity I mean that the output would change based on changes in the code. That could be used in a terminal (as in #441), or in Jupyter or dash/plotly. That way, simple Python-only apps could be built, so that learners don't have to passively absorb a video but can interact with it and thus learn through their own exploration.
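One way to picture that "output changes when the code changes" loop, as a hypothetical sketch rather than an existing manim feature: watch the scene file and call a render callback on every save. `render_scene` and `watch` are invented names here, and a real version would invoke the renderer instead of just re-reading the source.

```python
import os
import tempfile
import threading
import time

def render_scene(path):
    """Stand-in for re-rendering: just re-read the 'scene code'.
    A real implementation would hand the file to the manim renderer."""
    with open(path) as f:
        return f.read()

def watch(path, on_change, polls=20, interval=0.05):
    """Poll the file and re-render whenever its mtime or size changes."""
    def stamp():
        st = os.stat(path)
        return (st.st_mtime_ns, st.st_size)
    last = stamp()
    for _ in range(polls):
        time.sleep(interval)
        current = stamp()
        if current != last:
            last = current
            on_change(render_scene(path))

# Demo: create a 'scene file', then edit it while the watcher is running.
fd, path = tempfile.mkstemp(suffix=".py")
os.close(fd)
with open(path, "w") as f:
    f.write("circle = Circle()")

def edit():
    with open(path, "w") as f:
        f.write("circle = Circle(radius=2)")

renders = []
threading.Timer(0.2, edit).start()  # simulate the user saving an edit
watch(path, renders.append)
os.remove(path)
```

The same loop could sit behind a terminal (#441), a Jupyter cell, or a dash callback; only the `on_change` target differs.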
Could something like this be done? What are possible ways to implement this?
Thank you