Isotropix Forums

Scattering and Baking Fundamentals: Filesize and RAM

General Discussion about Isotropix and CG related topics

Re: Scattering and Baking Fundamentals: Filesize and RAM

Post by Ken_M » Wed Dec 26, 2018 11:23 pm

Hi,
Another question related to the scattering.

I'm not sure how to use multiple decimation textures to control a point cloud.
How do I set up the nodes properly to use a scope texture as well as a gradient slope at the same time? It only allows me to plug in one at a time.

I tried using a texture blend node, but it doesn't seem to integrate both decimations.

Thanks,

Ken
Ken_M
 
Posts: 59
Joined: Sun Jun 03, 2018 12:57 am

Re: Scattering and Baking Fundamentals: Filesize and RAM

Post by pulverfass » Thu Dec 27, 2018 1:12 am

Change the blend mode to multiply or add; that should work. By default the blend mode is set to normal, so depending on the mix value either one input or the other is used, or a mix of both, determined by the percentage value.
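For intuition, here is roughly what the blend modes do per sample (a plain-Python sketch of the math, not Clarisse's actual implementation; `mix` is the blend percentage as 0..1):

```python
def blend_normal(a, b, mix):
    # 'normal': linear mix between the two inputs; at mix=1 only b survives
    return a * (1.0 - mix) + b * mix

def blend_multiply(a, b):
    # 'multiply': a point survives only where BOTH decimation masks keep it
    return a * b

def blend_add(a, b):
    # 'add': union of the two masks, clamped to 1
    return min(a + b, 1.0)

# scope texture says keep (1.0), slope texture says discard (0.0):
scope, slope = 1.0, 0.0
print(blend_normal(scope, slope, 0.5))  # a point half-kept, neither mask fully honored
print(blend_multiply(scope, slope))     # discarded, both masks respected
```

This is why 'multiply' is usually what you want for stacking decimation textures: a point is kept only where every mask keeps it.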
pulverfass
 
Posts: 29
Joined: Mon Mar 05, 2018 1:10 pm

Re: Scattering and Baking Fundamentals: Filesize and RAM

Post by Ken_M » Thu Dec 27, 2018 11:41 pm

Thanks very much, Pulverfass! :)

'Multiply' works great.
Ken_M
 
Posts: 59
Joined: Sun Jun 03, 2018 12:57 am

Re: Scattering and Baking Fundamentals: Filesize and RAM

Post by Ken_M » Thu Mar 14, 2019 2:40 am

Hi,
This is a question that's been bothering me for a while now...
[Attachment: point_cloud_move_v001.JPG]

If I have a sphere and use it as the geometry support to scatter a point cloud, the points scatter on the sphere. However, if I then move or resize the sphere, the point cloud just stays put at the original size/location.

Do I absolutely need to remember to parent the point cloud to the sphere before I start moving things, or is there another way?

I'm not sure I understand the proper workflow for this situation.
On simpler tests I've made a 'combiner' of the sphere when I'm happy with the sphere's position, and then set the point cloud's 'geometry support' to the combiner instead of the original sphere.
When I move the combiner later, I often forget to parent the point cloud to it and wind up with points that are offset from the move.

Is there a checkbox or something to make sure the points stick to the sphere no matter how I move or resize it?

Sometimes on larger projects I don't notice that I've accidentally nudged a combiner, when I should have moved the original shape, until after I've waited a while for a significant scatter to calculate.

I've gone over the documentation and a couple of the scatter tutorials again, but I don't see the answer.

Thanks for the help!

Ken
Ken_M
 
Posts: 59
Joined: Sun Jun 03, 2018 12:57 am

Re: Scattering and Baking Fundamentals: Filesize and RAM

Post by dboude » Thu Mar 14, 2019 9:42 am

Hi,

A simple workaround consists of combining the object(s) you want to generate points on and using that combiner as the support for the point cloud. The combiner has to stay at the center of the world, but you can freely move the source object(s) and the points will follow. With this approach, though, the points are regenerated each time you move the object. That's why it's better to parent your point cloud.

Personally, I use a script that creates points on geometries and parents the point cloud to wherever the geometries are.

python code

# Create a point cloud for each selected object

ix.enable_command_history()
sel = ix.selection
count = sel.get_count()

if count == 0:
    # nothing selected: create a stand-alone point cloud
    ix.cmds.CreateObject("PTC_", "GeometryPointCloud")
else:
    for i in range(count):
        target = sel[i]
        namet = target.get_name()

        if target.is_kindof("Geometry") or target.is_kindof("SceneObjectCombiner") or target.is_kindof("SceneObjectScatterer"):
            ptc = ix.cmds.CreateObject("PTC_" + namet, "GeometryPointCloud")
            # temporarily disable 'parent in place' so the point cloud
            # inherits the geometry's transform instead of compensating for it
            pip = ix.application.get_prefs().get_bool_value("layout", "parent_in_place")
            if pip:
                ix.cmds.SetParentInPlace(False)
                ix.cmds.SetParent([str(ptc) + ".parent"], [target], [0])
                ix.cmds.SetParentInPlace(True)
            else:
                ix.cmds.SetParent([str(ptc) + ".parent"], [target], [0])

            ix.cmds.SetValues([str(ptc) + ".geometry"], [target])
            ix.cmds.SetValues([str(ptc) + ".point_count"], ["100"])
        else:
            ix.log_error("You didn't select a geometry !!!")
ix.disable_command_history()


Select the geometry you want points on and execute.

Cheers ;)
Démian
Isotropix
Technical Artist - Clarisse Specialist
dboude
 
Posts: 881
Joined: Mon Jul 03, 2017 10:51 am

Re: Scattering and Baking Fundamentals: Filesize and RAM

Post by Ken_M » Sun Mar 17, 2019 8:53 am

Thanks very much for the help and the script, Démian! I'll give it a try :)
Ken_M
 
Posts: 59
Joined: Sun Jun 03, 2018 12:57 am

Re: Scattering and Baking Fundamentals: Filesize and RAM

Post by Kramon » Sun Mar 17, 2019 4:11 pm

From my experience: I write all my point clouds to disk as Alembic. That way I still get super quick, snappy load and save times, and the point cloud also reads back almost instantly, even when it's big.
Kramon
 
Posts: 82
Joined: Sat Nov 15, 2014 8:15 pm

Re: Scattering and Baking Fundamentals: Filesize and RAM

Post by Ken_M » Sun Mar 17, 2019 10:10 pm

Thanks Kramon! Does this bake the point cloud into a specific shape? For example, if I had a point cloud scattered on a sphere with a slope texture decimating the points, does saving it to Alembic lock it to that half-sphere shape, or does it stay procedural?
Is it similar to saving a baked scatter as an archive?

Thank you.
Ken_M
 
Posts: 59
Joined: Sun Jun 03, 2018 12:57 am

Re: Scattering and Baking Fundamentals: Filesize and RAM

Post by dboude » Wed Mar 20, 2019 9:42 am

Hi,

When you bake a point cloud (which is procedural), Clarisse creates a point container with the points inside. It's the same container you use when you paint particles. The points are static.

In the same way, when you export a point cloud as Alembic, the point positions are stored ---> static.

What I like to do is bake my dynamic point clouds and use the baked points. Then I deactivate the dynamic point clouds (Ctrl+D). That way I keep my procedural setup in case I have to tweak it, but it is not evaluated when I open the scene.
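The procedural-vs-baked distinction can be sketched in plain Python (just an analogy, not the Clarisse API — the class and function names here are made up for illustration):

```python
import random

class ProceduralCloud:
    """Re-scatters every time it is evaluated -- like a live GeometryPointCloud."""
    def __init__(self, support_radius, count, seed=1):
        self.support_radius = support_radius
        self.count = count
        self.seed = seed

    def evaluate(self):
        # deterministic scatter that tracks the *current* support:
        # resize the support and re-evaluate, and the points move too
        rng = random.Random(self.seed)
        return [rng.uniform(-self.support_radius, self.support_radius)
                for _ in range(self.count)]

def bake(cloud):
    """Snapshot the points -- like baking into a static particle container."""
    return list(cloud.evaluate())

cloud = ProceduralCloud(support_radius=1.0, count=3)
baked = bake(cloud)          # static snapshot; nothing to re-evaluate on load
cloud.support_radius = 10.0  # tweak the procedural setup afterwards
# the snapshot is unaffected; only a fresh evaluate() sees the change
assert baked != cloud.evaluate()
```

The baked list is cheap to load and never recomputed, while the procedural object stays in the scene (deactivated) in case the setup needs tweaking — the same trade-off as baking vs keeping the live point cloud.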

Cheers ;)
Démian
Isotropix
Technical Artist - Clarisse Specialist
dboude
 
Posts: 881
Joined: Mon Jul 03, 2017 10:51 am

Re: Scattering and Baking Fundamentals: Filesize and RAM

Post by Ken_M » Sun Mar 24, 2019 4:19 am

Thank you, Démian!

[Attachment: point_cloud_ram_usage_v001.JPG]

I've been playing with baking particle containers (hopefully I'm following your suggestion correctly). As you can see, I've been using File -- Reference -- Export Context.

I'm up to 210 million points (all exported contexts) and RAM usage is steady at around 16-18 GB. That's great :D . A file like this would have crashed repeatedly if I'd tried to keep the point clouds procedural; it would have eaten my RAM up to the ceiling and then crashed.

Is it more efficient, file-performance-wise, when dealing with large point clouds to:
A) File -- Reference --> Export Context, or
B) save the context containing the particle container as Alembic, and then re-import the Alembic point cloud?

It seems easier for my brain to understand A... but I know Kramon mentioned exporting to Alembic. Are there advantages to that over just exporting the context?

I like how with 'export context' I can make it local again if I want to modify it with particle paint. That's a good feature.

Thanks again for the help!

Ken
Ken_M
 
Posts: 59
Joined: Sun Jun 03, 2018 12:57 am
