Isotropix Forums

Scattering and Baking Fundamentals: Filesize and RAM


Post by Ken_M » Wed Dec 05, 2018 12:30 pm

Before I knew what baking was I often left all my scattering unbaked. I had a point cloud with a decimation texture and then a 'scatterer' attached to that to scatter vegetation, etc.

I found this worked fine in tests when I only had a few scatters going on. Back in October I tried some tests with a lot of scattering: grass, stones, plants, etc., each scattered a hundred million times over a kilometer of groundplane.
Clarisse can handle a lot, but sometimes opening the file again to work on it resulted in a bunch of scatters trying to calculate simultaneously, which crashed the project.

I returned to the tutorials and found that you can bake scatters down, which tends to lessen the load when the file is first opened. However, I also noticed that for every 10-100 million point scatter I baked, my file size increased by almost a gigabyte.

Do the more experienced users have thoughts on when it's better to bake a scatter vs. leaving it to calculate? Is baking designed for smaller numbers? Are there hazards to having giant multi-gigabyte Clarisse project files?
I'm working on a 10GB Photoshop file right now, so I'm used to large file sizes... but I wanted to make sure I wasn't breaking some fundamental rule and ruining my project.

After testing out/learning Clarisse for some months now, I'm also wondering how scattering affects RAM. Are the crashes I sometimes get during the calculation of large scatters related to RAM?

I have 32 gigabytes of RAM. I've noticed Clarisse will sometimes creep towards the maximum, but it never hits the RAM errors I get in other 3D software. When I go to render some of my heaviest scenes, RAM usage will drop to 22/32 GB or less.

I also ran into a bit of an issue with baking large scatters... I'm not sure if I'm doing something wrong. I have my D drive set up as the Clarisse cache in preferences, but when I press bake there's suddenly a huge influx of data to my C drive. My C drive is a tiny SSD with the operating system on it, so I've sometimes hit crashes from running out of disk space there during a bake. I try to keep it as clear as possible, and the data seems to go away when I exit Clarisse or move past the baking.

Are there ways of handling particularly heavy files with lots of scattering going on in them? Maybe there are some tricks beyond the tutorials that people have picked up for handling very heavy scenes?

Is there an ideal amount of RAM for Clarisse, or is it 'the sky's the limit'?

Thank you,

Ken
Ken_M
 
Posts: 49
Joined: Sun Jun 03, 2018 12:57 am

Re: Scattering and Baking Fundamentals: Filesize and RAM

Post by dboude » Wed Dec 05, 2018 2:46 pm

Hi Ken,

First of all, yes, the sky is the limit. The more RAM you have, the more complexity you can handle. However, we don't do magic. When a point cloud is generated, each point has to be stored in RAM. So if you generate 10 million points and then decimate 8 million of them, the evaluation first has to store 10M points and then decimate 8M of them. In the end, 2M points are stored in your RAM, but in the meantime Clarisse stored all 10M...
In that case, you should use the "point UV sampler" geometry node, which generates points according to a texture. Basically, you should use the inverse of your current decimate texture. You will save time on point generation, and you won't creep toward the maximum.
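To put rough numbers on that (a sketch only; the 32 bytes per point is an illustrative assumption, not Clarisse's actual per-point cost):

```python
# Decimating after generation: peak storage is the full cloud.
total_points = 10_000_000            # points generated
decimated = 8_000_000                # points discarded by the decimate texture
kept = total_points - decimated      # 2M points remain in the end

BYTES_PER_POINT = 32                 # illustrative assumption only

# Peak RAM with generate-then-decimate: the full 10M must exist first.
peak_mb_decimate = total_points * BYTES_PER_POINT / 1e6   # ~320 MB at peak

# Peak RAM with a point UV sampler: only the kept points are ever generated.
peak_mb_sampler = kept * BYTES_PER_POINT / 1e6            # ~64 MB at peak
```

Whatever the real per-point cost is, the ratio is the point: the sampler approach never allocates the points the decimate texture would throw away.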

Another thing: if you use a 1 million point cloud to generate 1 million blades of grass, Clarisse stores 1 million points. But if you first make a clump of 10 blades from a point cloud with 10 points, and then scatter that clump 100,000 times, you still get 1 million blades of grass, but only 10 + 100,000 points are stored in RAM. You save roughly 10x the RAM.
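The clump arithmetic above, spelled out in plain Python (just counting points, nothing Clarisse-specific):

```python
# Direct scatter: one scatter point per blade of grass.
blades_direct = 1_000_000
points_direct = blades_direct                         # 1,000,000 points in RAM

# Clump scatter: a 10-blade clump, scattered 100,000 times.
blades_per_clump = 10
clump_instances = 100_000
blades_clumped = blades_per_clump * clump_instances   # still 1,000,000 blades
points_clumped = blades_per_clump + clump_instances   # only 100,010 points in RAM

savings = points_direct / points_clumped              # just under 10x fewer points
```

The same reasoning is why the later suggestion of bigger clumps for background hills works: increasing `blades_per_clump` while decreasing `clump_instances` keeps the blade count while shrinking the stored point count further.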

Cheers ;)
Démian
Isotropix
Technical Artist - Clarisse Specialist
dboude
 
Posts: 750
Joined: Mon Jul 03, 2017 10:51 am

Re: Scattering and Baking Fundamentals: Filesize and RAM

Post by Ken_M » Thu Dec 06, 2018 5:08 am

Thanks Démian! I appreciate the tips.
I'll test out the 10-blade clump scattered 100,000 times to get a million blades. That sounds a lot more efficient on my system than what I was trying to do.

If I'm trying to cover a bunch of hills in dense grass, is this the best way of doing it? Or should I start looking into fur or other systems?

Thanks for all the help you've given me.

Ken

Re: Scattering and Baking Fundamentals: Filesize and RAM

Post by dboude » Thu Dec 06, 2018 10:31 am

I would go that way, yes. Once the blade geometry is loaded, you will be able to shape your scatterer with good feedback. If you use fur, you trigger the generation of strands each time you make a change.

For background hills, you can use big clumps to cover bigger areas in order to optimize the overall point count.

Cheers ;)
Démian
Isotropix
Technical Artist - Clarisse Specialist

Re: Scattering and Baking Fundamentals: Filesize and RAM

Post by Ken_M » Mon Dec 10, 2018 3:03 am

Thank you, Démian.
That clears up a question I've had for a while.

Are there drawbacks to having a Clarisse file with lots of baked scatters that is several gigabytes in size? After following your advice, the scatters that I'm actually able to bake are making my file slightly more stable.

What sorts of file sizes are you hitting in your work, or have you seen other people hit for large environments? I don't think the tutorial files I was following along with ever went over a few megabytes.

Sometimes when my work crashes, I find Clarisse will immediately create a crash auto-save file. Is this more difficult for Clarisse to do with file sizes over a few gigabytes?
I have had some auto-save files (sometimes created during a crash) that are completely empty when I open them.

I've had Photoshop and ZBrush files get up into the 15GB range. I'm just making sure that doing the same in Clarisse won't result in an unstable project file.

Thanks for all the tips and suggestions so far,

Ken

Re: Scattering and Baking Fundamentals: Filesize and RAM

Post by dboude » Mon Dec 10, 2018 10:53 am

Hi,

I have never dealt with 15GB files in Clarisse. However, I always try to keep my point clouds procedural (lightweight scene) and use scatterer baking only when I need to make precise tweaks.

A good thing to do is to create references from your baked scatterers, so the main scene itself stays lightweight. To do this, put one or more baked scatterers in a context and go to File > Reference > Export Context. Clarisse will create a file with your baked scatterer inside it and make a reference to it in your main scene. That way the embedded point data of your scatterer stays external to your project file, and the auto-save can do its job much faster.

Then if you want to edit the referenced baked scatterer you can:
- make an override on it
- or make the reference local > make your tweak > re-export as a reference

Cheers ;)

PS: for the empty auto-save files, which version of Clarisse do you use?
Démian
Isotropix
Technical Artist - Clarisse Specialist

Re: Scattering and Baking Fundamentals: Filesize and RAM

Post by Ken_M » Tue Dec 11, 2018 2:36 pm

Hi Démian,
Thanks again for the awesome information. Sending baked scatters to references is a great solution.
I think that's one of my new favorite features! I'm going to have to try it out more. My current file is around 1.5GB with a bunch of baked scatters, so sending those to references should clean it up nicely. Thank you.

The blank auto-save files were occurring in 3.6, I think, and on a different computer; I was borrowing a friend's machine back then. Now Clarisse just disappears and blinks to the desktop without an error window if it happens to crash during a bake.
I think I can assess it's a 'not enough RAM' situation, because the crash occurs instantly, as soon as the RAM graph in my task manager bumps the ceiling.

I guess the issue I'm having now is trying to cover a large portion of an aerial matte-painting shot with grass/vegetation. If I leave the scatter procedural, it sometimes takes a while to generate each scatter. I'm cautious about having too many procedural scatters going at once, because that has tended to crash other test files I've done.

I used your idea of creating large groupings of grass prior to scattering, so I'm scattering swaths of combined grass/foliage several meters wide, but I'm still running into RAM crashes.

Is there any way to cover an area with multiple smaller baked scatters? I wanted to use your tip about sending baked portions to references to ease up on system resources... but I can't bake large-scale scatters, because the RAM spikes up and then crashes Clarisse.

If I turn down the scatter count my hills start looking really sparse....

My original goal was to scatter the vegetation evenly, bake it, and then use particle paint to erase paths and roadways in the grass (I'm building a medieval-town type of scene). I'm not sure if there's a better workflow for this.

I have an Intel i7 quad-core with 32GB of RAM. I would love to fit more RAM into this machine, but I can't; it seems to be the maximum. I bought the computer prior to learning of Clarisse's existence, so initially I was trying to find a machine in my price range with a decent amount of VRAM for Octane. Now I kind of wish I'd gone with the one that had more RAM slots, as I'm not using Octane anymore.

Thanks for all your help so far,

Ken

Re: Scattering and Baking Fundamentals: Filesize and RAM

Post by dboude » Wed Dec 12, 2018 10:47 am

Ken_M wrote: Is there any way to cover an area with multiple smaller baked scatters? I wanted to use your tip about sending baked portions to references to ease up on system resources... but I can't bake large-scale scatters, because the RAM spikes up and then crashes Clarisse.


Hi Ken,

What you can do is isolate parts of your scatterer with scopes and bake it in multiple smaller parts, say 3-4, and then scatter those parts again to repopulate your layout.

Here is an example scene in which I control the decimation of scatterer instances with a scope texture/object:
Scope_Scaterrer_Extract.project


If it's hard to find the connections between objects, use these buttons to navigate between connected objects:
Input_Output_Connections.PNG




The idea is to place the scope on an area you want to bake, then bake the scatterer. Move the scope to another area, re-bake, and so on.

Cheers
Démian
Isotropix
Technical Artist - Clarisse Specialist

Re: Scattering and Baking Fundamentals: Filesize and RAM

Post by Ken_M » Thu Dec 13, 2018 3:22 pm

Thank you very much for the file! Your solutions work great.

It also taught me how scopes work. Bonus! :)

Thank you! I appreciate you taking the time to set up a file for me demonstrating everything.

Re: Scattering and Baking Fundamentals: Filesize and RAM

Post by dboude » Thu Dec 13, 2018 4:19 pm

You're welcome ;)
Démian
Isotropix
Technical Artist - Clarisse Specialist
