
User-defined flow maps for CPU particle direction #14744

Open
PatrickRyanMS opened this issue Jan 25, 2024 · 1 comment
PatrickRyanMS commented Jan 25, 2024

We currently have three ways to determine the motion path for particles in our standard particle system without using a custom update function:

  • Direction1, Direction2 set on emitter
  • Normal for shape or mesh emitters
  • Perturbation of the particle path with a noise texture

With these options, we can make particles take many types of linear paths, especially when you add in the gravity parameter. However, if we want particles to flow around certain objects or along a particular path, like a circle, we need a custom update function. One way to get a bit closer would be a feature where users create their own flow map and have particles take their directions from it.
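For reference, the direction math a custom update function would apply for something like a circular path is small. This is an illustrative sketch in plain TypeScript, not Babylon.js API; the function name and types are made up for the example:

```typescript
// Sketch of the math a custom update function would compute to make
// particles orbit a center point: the velocity at a position is the
// tangent of the circle through that point.
type Vec2 = { x: number; y: number };

function circularFlowDirection(pos: Vec2, center: Vec2): Vec2 {
  // Vector from the orbit center to the particle.
  const rx = pos.x - center.x;
  const ry = pos.y - center.y;
  const len = Math.hypot(rx, ry);
  if (len === 0) return { x: 0, y: 0 }; // undefined at the center
  // Rotate the radial vector 90 degrees to get the unit tangent.
  return { x: -ry / len, y: rx / len };
}

// A particle at (1, 0) orbiting the origin moves in +Y.
const dir = circularFlowDirection({ x: 1, y: 0 }, { x: 0, y: 0 });
// dir.x ≈ 0, dir.y ≈ 1
```

A flow map would replace this analytic function with a texture lookup, so artists could author arbitrary paths without writing code.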

Here are some considerations:

  • The flow map could be used in screen space, so each particle can determine where it is in the scene and take its vector from the corresponding pixel in the texture
  • Using an updatable texture could enable things like particle attractors, since the vector data stored in the texture could either attract or repel particles. Think of updating the texture with the mouse position, so a gradient follows the cursor on screen and affects the flow of nearby particles.
  • Maybe there is a world where node material can serve as a source for this. We can already make a procedural texture in node material; can we then use it as the flow map that drives particles?
  • There is another option where we may want to import a flow map baked from a fluid simulation, or even a hand-painted flow map. The ability to assign a loaded texture, or to load that texture into node material for further manipulation, would be great.
  • I think screen space would be the most common use case, but there is also the notion of UV space for mesh emitters. Let's say we have a particle system on a mesh in local space; could the flow map be used in UV space to make particles flow along the mesh in a particular way? For example, with a ground mesh where you want fireflies in certain areas, a flow pattern could drive motion in X and Y while the Z component gives a direction to keep the particles moving up and down along the mesh.
  • A strength component would also be useful. All of our emitters have an emit direction anyway; having a flow strength per particle (either static, random, or gradient) would let particles be influenced by the flow map in a procedural way. Maybe particles are flowing in one direction and then a "wind" suddenly swirls them around before they relax back into their original path. This could be done by ramping the strength of the flow map up and then back down.
  • Additionally, with screen-space flow maps, maybe there is also a concept of a projection direction onto a particle volume. This would allow a flow map to affect a specific volume of particles as if the texture were projected from a specific direction, regardless of the camera position. This would be useful for environmental particles where the camera may be overhead.
  • There is also the concept of a 2D experience using a flow map for particles. Maybe the camera is fixed and we only care about (X, Y) vectors. A user could then channel-pack two flow maps into one texture load by placing two different 2D flow maps in R and G, then B and A. A system like node material would give users the flexibility to do this and unpack the textures in a node graph before generating the final texture used in the scene.
  • Additionally, there is a notion of the strength of a flow map. The flow map could be taken as the literal motion of the particles, but a user may want to vary its strength based on position, say stronger in the center. In this case, using the alpha channel as a 0-1 strength scalar would be useful. Again, if we can use our node tools, updating the strength by modifying the alpha channel would unlock a lot of potential.
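To make the texture-encoding bullets concrete, here is a minimal sketch of decoding one RGBA8 flow-map texel. This is not existing Babylon.js API; the function name and channel layout are assumptions for illustration (R/G hold a signed 2D direction, A is the 0-1 strength scalar described above; B could hold a Z component or part of a second packed map):

```typescript
// Decode one RGBA8 flow-map texel into a signed 2D direction plus a
// strength scalar. Channel layout here is an assumption: R/G = vector,
// A = strength.
function decodeFlowTexel(
  r: number, // 0-255
  g: number, // 0-255
  a: number  // 0-255
): { x: number; y: number; strength: number } {
  // Remap each byte from [0, 255] to a signed component in [-1, 1],
  // the usual flow-map convention (128 ≈ no flow).
  const toSigned = (c: number) => (c / 255) * 2 - 1;
  return {
    x: toSigned(r),
    y: toSigned(g),
    strength: a / 255, // alpha as flow strength, 0 = no influence
  };
}

// Full +X, full -Y, full strength:
const texel = decodeFlowTexel(255, 0, 255);
// texel.x = 1, texel.y = -1, texel.strength = 1
```

The per-particle velocity would then be something like `direction * strength * flowStrength`, where the last factor is the per-particle strength (static, random, or gradient) from the bullet above.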

Much of this was just off-the-top-of-my-head design, so the bullets are a little out of order. But hopefully it helps form the basis of what the feature could be. Ping me with questions!

@PatrickRyanMS PatrickRyanMS added this to the Future milestone Jan 25, 2024
@thomlucc thomlucc modified the milestones: Future, 8.0 Mar 12, 2024

This issue has been automatically staled because it has been inactive for more than 14 days. Please update to "unstale".

@github-actions github-actions bot added the stale label Mar 27, 2024
@deltakosh deltakosh removed the stale label Apr 26, 2024