appleseed Users Forum

Appleseed archviz test


First of all, congratulations on this webpage. I love it: it is beautiful, smart and very professional.
Secondly: this software rocks!! I suspected as much when I played around with it a few years ago, that after a few releases it would turn out great. And here we are :smiley: This year’s GSoC really sped things up.
And big thanks to Joel for the exporter!!
One thing bothers me (at least): appleseed doesn’t support camera shift, which is very much needed in many cases. If there’s a wishlist, I’d make a wish for it!
Other than that appleseed is fast, and easy to use through the exporter, great work.
Here’s a test render. The scene was originally set up for Vray; I converted it to fit appleseed (oh yes, Joel, is it possible to have group instances supported in the addon? I had to turn the bushes into a particle system because instances didn’t work) and rendered it. I faked the camera shift with a black plane, which gets the fewest samples to speed up rendering, then cropped the image in the compositor. It’s rendered with adaptive sampling, min:32 max:256.
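In case anyone wants to reproduce the fake camera shift, here is a minimal sketch of the overscan-and-crop arithmetic (the function and numbers are my own illustration, not part of the exporter; it assumes the film is enlarged proportionally so the focal length stays fixed):

```python
# Faking camera shift by overscan-and-crop:
# render a taller frame, then crop a width x height window offset vertically.
def shifted_crop(width, height, shift_y):
    """Return (render_w, render_h, crop_x, crop_y) for a vertical
    lens-shift fraction shift_y (positive = shift the view upward)."""
    extra = abs(int(round(shift_y * height)))  # extra rows to render
    render_h = height + extra
    # positive shift: keep the top of the enlarged frame
    crop_y = 0 if shift_y >= 0 else extra
    return width, render_h, 0, crop_y

# e.g. a 10% upward shift on a 1920x1080 frame:
print(shifted_crop(1920, 1080, 0.10))  # (1920, 1188, 0, 0)
```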
Hope you like it. C&C are welcome!!!
Regards, Zsolt

Hi Zsolt,

Very nice work! The grass is a bit too straight and dark (no transparency?) but the building looks very good.

Thanks for the kind words about appleseed. I’ll open an issue for camera shift (edit: here we go).

Keep up the good work, and please keep us posted about your work!


Thanks! I agree about the grass straightness. The darkness is a result of compositing; I won’t say it was done on purpose, but I kind of liked it, so I decided to keep it. If I have some spare time I’ll add wind to the grass (the trees already have armatures for animation) and make a small animation test. Render time in this open scene can be kept really low with appleseed; actually, exporting the geometry is a bit slow, but that’s understandable, since there are quite a lot of polys that Blender has to fight with.
And thanks for the camera shift in advance ! :smiley:


Very cool! :slight_smile:

The color gradient is maybe a bit strong and the grass is too vertical.

I also think the sky has a weird “blue” (a result of the strong color gradient?).

Good work! :slight_smile:

Hey! I’m now playing around with an interior scene, which needs more optimisation than the previous one. Maybe my questions are stupid, but please help me out. The scene is not lit very well, so not many photons reach the area; because of this, if I don’t sample the scene enough there will be black dots (pixels without any photons or rays, I guess), so I increased the samples to, for example, min:64 max:2048, but this slows the render to a crawl. My question is: do antialiasing passes work like in yafaray? That is, can I save some time by setting the samples to a lower value and making more passes, so the adaptive sampler leaves out the already converged pixels after the first pass and works only on the problematic ones to reduce noise? So let’s say instead of one pass of min:64 max:2048, I set 4 passes of min:16 max:512 or something like that; would it be faster?? Thanks in advance!!


Which lighting engine are you using in appleseed, Unidirectional Path Tracing or SPPM?

Can you show the renders you are currently getting on this interior scene?

Using multiple passes and fewer samples per pass will not speed up rendering: the worst-case number of samples per pixel is the same either way.
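A back-of-the-envelope sketch of why (plain Python, my own illustration): with the numbers from the question above, the worst-case per-pixel sample budget is identical.

```python
# Worst-case samples per pixel: passes * max_samples_per_pass.
# The adaptive sampler stops early on converged pixels either way,
# so splitting the budget across passes doesn't shrink the total work.
def worst_case_budget(passes, max_samples):
    return passes * max_samples

one_pass  = worst_case_budget(1, 2048)  # min:64 max:2048, single pass
four_pass = worst_case_budget(4, 512)   # min:16 max:512, four passes
print(one_pass, four_pass)  # 2048 2048 -> same total budget
```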

Hey! Path tracing. Well, the noise is mostly in the patio, where there is glass, water, and bumpy-glossy corten steel… Hmm… would SPPM be better? This image is not that big and it took almost 2 hours to render. After all, the interior isn’t that noisy, but it took time. The main problem is the patio, as I’ve mentioned. By the way, thanks for the fast reply! :smiley:

Where are the light sources?

Maybe you can post that scene so I can have a look?

Sure, I can post it when I get to my machine, but I guess it’s not needed at all, because it’s dead simple: the whole scene is lit by the environment EXR image. The lamps hardly cast any light through their diffuse transmitter covers. So it’s IBL only. Is that not enough?

Sure, a sky is enough to illuminate a scene. So do I understand correctly that light from the sky has to go through the glass windows in order to illuminate the interior? If so, that would be a difficult situation for most renderers since the interior is lit entirely by caustics.

You should try disabling the other light sources to make sure they aren’t the ones causing the noise outside. Alternatively, you can assign the environment EDF, the other EDFs and the lights to distinct render layers and check which one contributes the most noise.

I’m also curious to know how you achieved the diffuse transmitters on top of the light?

Hey! The lighting is not coming through glass, because the windows you can see reflected in the patio’s window are simple holes without glass planes. The patio’s glass is a glass with IOR 1.0, so there’s no refraction, blended with a specularity layer: a specular BTDF mixed with a specular BRDF. Anyway, caustics are switched off. The noise in the patio is caused by the small lamps… It’s a pity. You were right. The translucent lamp shader is a diffuse BTDF with transmittance=1 plus a layer of Kelemen BRDF. OK, the fireflies disappear with the lamps off. :smiley: Any advice to speed things up? I’ve tried to use SPPM, but I totally don’t know how to set it up properly. :smile:
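A side note on the IOR 1.0 glass trick: with matching indices of refraction, Snell’s law bends nothing and the Fresnel reflectance drops to zero, which is easy to verify numerically (plain Python, my own sketch, using Schlick’s approximation for the normal-incidence reflectance):

```python
import math

def fresnel_schlick_r0(n1, n2):
    """Normal-incidence reflectance (Schlick's R0)."""
    return ((n1 - n2) / (n1 + n2)) ** 2

def snell_refract_angle(theta_i, n1, n2):
    """Refracted angle from Snell's law: n1*sin(t_i) = n2*sin(t_t)."""
    return math.asin(n1 * math.sin(theta_i) / n2)

# IOR 1.0 glass: no bending and no Fresnel reflection at normal incidence,
# so the refraction contributes no noise; only the added specular layer reflects.
print(fresnel_schlick_r0(1.0, 1.0))                     # 0.0
print(snell_refract_angle(math.radians(30), 1.0, 1.0))  # ~0.5236 rad, i.e. still 30 degrees
# Ordinary glass (IOR 1.5) for comparison:
print(round(fresnel_schlick_r0(1.0, 1.5), 3))           # 0.04
```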

Still a very noisy patio: adaptive min:250 max:2500, about 100 minutes of rendering. This time I used the physical sky, suspecting it would give a smoother result. The water and the door frame are still noisy though. Any idea for getting a better result? Should I increase the samples even more?

OK, I’ve started rendering it in progressive mode with uniform sampling. I let it render for 3 hours. I’m still not happy with the quality, and 3 hours is a lot of time for such a small image. Anyway, here’s the render.

I wonder if it isn’t the glass that breaks everything. Do you have any render without the glass? :slight_smile:

Nice work @jarmaizsolt! I’m still interested in getting your scene to figure out why performance isn’t so good.