appleseed Users Forum

Battle against Cycles

Figured it out: after mirroring, the objects ended up with a negative scale, so the LEDs were shining inward.

In Cycles the glass looks like plastic.
In appleseed the glass has magic and beauty.

1 Like

If I turn on Bump, an artifact appears. How do I fix it?

Test scene:!FxImnQKC!969AVsbTh8zQFAiewHUu6Vbq5bVgftyeY-CekOTK8sE

Great to see you making progress on this project and getting pleasing results out of appleseed.

Let’s try to figure out the cause of these artifacts. I’m trying to repro with the scene you sent on Discord.

My deep gratitude to you for finding a solution.
Everything worked out.
Indeed, the problem was in the reflections.
To be completely sure, I re-modeled the problem object. It seems everything is all right now.


Hi. Apologies for the late reply.

You have full control over the encoding/transfer function, the RGB primaries and white point of the textures, and the RGB primaries and white point of your working/rendering space, with chromatic adaptation when a change in white point requires it (Bradford CAT).
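For readers unfamiliar with chromatic adaptation, here is a minimal sketch of a Bradford CAT in plain Python. This is not appleseed's implementation; the Bradford cone-response matrix and the D65/D50 white points are standard published constants, and by construction the source white maps exactly onto the destination white.

```python
# Sketch of a Bradford chromatic adaptation transform (CAT).
# Not appleseed source code; the matrix is the standard Bradford
# cone-response matrix, white points are CIE D65 and D50 (XYZ, Y = 1).

BRADFORD = [
    [ 0.8951,  0.2664, -0.1614],
    [-0.7502,  1.7135,  0.0367],
    [ 0.0389, -0.0685,  1.0296],
]

def mat_vec(m, v):
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def inverse3(m):
    # Cofactor expansion for a 3x3 matrix inverse.
    a, b, c = m[0]
    d, e, f = m[1]
    g, h, i = m[2]
    det = a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)
    return [
        [(e*i - f*h)/det, (c*h - b*i)/det, (b*f - c*e)/det],
        [(f*g - d*i)/det, (a*i - c*g)/det, (c*d - a*f)/det],
        [(d*h - e*g)/det, (b*g - a*h)/det, (a*e - b*d)/det],
    ]

def bradford_adapt(xyz, src_white, dst_white):
    # Von Kries-style scaling in Bradford cone space.
    src_cone = mat_vec(BRADFORD, src_white)
    dst_cone = mat_vec(BRADFORD, dst_white)
    cone = mat_vec(BRADFORD, xyz)
    scaled = [cone[k] * dst_cone[k] / src_cone[k] for k in range(3)]
    return mat_vec(inverse3(BRADFORD), scaled)

D65 = [0.95047, 1.0, 1.08883]
D50 = [0.96422, 1.0, 0.82521]

# Adapting the D65 white itself lands exactly on the D50 white.
adapted = bradford_adapt(D65, D65, D50)
```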

The default working space is scene-linear with sRGB/Rec.709 RGB primaries and a D65 white point.
Presently we support only these primaries and white point, but work is in progress to support ITU-R BT.2020 and ACEScg/AP1, and even user-set primaries and white points.

Leaving that aside for now, and assuming your texture was also encoded with these primaries and white point, you just choose the transfer function to apply in the “input transfer function” section, and the correct transform will be applied.
See also

Keep in mind that textures encoded with the sRGB transfer function are not encoded with a gamma of 2.2. The sRGB transfer function is a piecewise function with a linear part in the shadows and a gamma curve with an exponent of 2.4 elsewhere. The average slope of this piecewise function approximates a gamma of 2.2, but what you want is to use the correct transfer function for your sources.
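As a small illustration (plain Python, not appleseed code), the standard sRGB decoding function and a pure gamma of 2.2 nearly agree in the mid-tones but diverge strongly in the shadows:

```python
# The sRGB decoding function: a linear segment in the shadows plus an
# offset-and-scaled power segment with exponent 2.4. This is NOT the
# same curve as a pure gamma of 2.2.

def srgb_to_linear(v):
    """Decode a nonlinear sRGB value in [0, 1] to scene-linear."""
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

def gamma22_to_linear(v):
    """Decode assuming a pure gamma of 2.2 (the common approximation)."""
    return v ** 2.2

# Mid-tones nearly match (gamma 2.2 is a good average slope)...
assert abs(srgb_to_linear(0.5) - gamma22_to_linear(0.5)) < 0.005
# ...but the curves diverge strongly in the shadows.
assert srgb_to_linear(0.02) / gamma22_to_linear(0.02) > 3
```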

There’s plenty of information in this great article by Thomas Mansencal:

The reason the CMS is not enabled by default is that, ideally, you would want to preprocess your textures with something like OpenImageIO’s maketx in order to build mipmaps and optimize opaque or monochrome textures; while doing so, you could (and should) convert the textures from the color space they are encoded in into your rendering/working space.

By doing this you move (expensive) color transformations out of the shading stage. You’ll notice considerable performance improvements when, for instance, rendering UDIMs, where you can have a large number of very high resolution tiles.

Given the assumptions above (sRGB/Rec.709 RGB primaries and a D65 white point), maketx can transform sRGB-encoded textures into scene-linear-encoded textures via its built-in “srgb” and “linear” color space transformations.
You would just invoke it with something like

maketx .... <my options here> ... --colorconvert "sRGB" "linear" <filename>.tif

Remember that if you are converting an sRGB-encoded texture to scene-linear encoding, you have to take the destination bit depth into account in order to avoid losing color precision. Converting an sRGB-encoded 8 bpc TIFF into a scene-linear 8 bpc TIFF will lose color precision unless you increase the bit depth of the destination TX file.

maketx .... --colorconvert "sRGB" "linear" -d uint16 <filename>.tif

This would get you a properly “linearized” (16 bpc) texture, and you wouldn’t need to perform the color transformations at the shader level; it was all done offline.
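The precision-loss point can be demonstrated numerically with a plain Python sketch (not maketx itself): decoding all 256 8-bit sRGB codes to scene-linear and re-quantizing to 8 bits collapses many distinct shadow values, while a 16-bit destination keeps all of them distinct.

```python
# Why an 8-bit scene-linear destination loses precision when
# "linearizing" an 8-bit sRGB texture: the shadow portion of the sRGB
# curve packs many codes into a tiny linear range.

def srgb_to_linear(v):
    # Standard sRGB decoding function.
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

codes = range(256)  # every possible 8-bit sRGB code value
linear = [srgb_to_linear(c / 255.0) for c in codes]

lin8 = {round(x * 255) for x in linear}     # re-quantize to 8 bits
lin16 = {round(x * 65535) for x in linear}  # re-quantize to 16 bits

# 8-bit linear collapses many distinct sRGB codes (shadow detail is
# lost); 16-bit linear preserves all 256 of them.
print(len(lin8), len(lin16))
```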
If you want more information, this is also a great read:

I hope this helps.


Also note that blenderseed includes a texture conversion tool:

1 Like

I am very glad that so many smart people have appeared in my topic.

1 Like

Texturing is not appleseed’s strongest side. Textures are not visible in the viewport, and I have no idea how, under such conditions, to assemble a scene that makes heavy use of textures, all of which would also need to be converted.
Materials and textures are not visible, which makes it very hard to work. Here Cycles wins mercilessly.

But I see how strong your team is. You will definitely get there; I’m sure of it.
And I will try to fill the appleseed portfolio with excellent work, at least the pieces not covered by a non-disclosure agreement (which is often the case in my practice).

1 Like

Thanks for the kind words and honest feedback, very appreciated. Contributions to our portfolio are also most welcome!

I thank everyone for participating in this glorious battle!
I think we won!

This is not a synthetic comparison.
It is the result of one project in the hands of one user:
in Cycles I tried my hardest to achieve a result, and in appleseed I tried exactly the same.

The Cycles image is slightly post-processed in GIMP.
The appleseed image is a pure render.

Maximum size:!VxI1QYpI!eaBsXO4XQzPEeEvJvwG4ZLkkI39IlukA40BKBYM4e8I

I found appleseed to be a strong renderer, perhaps the best.
Yes, the developers still have work to do, but what already exists lets us produce top-class results.
Huge respect for the development team.


Thanks for the kind words, and thanks for sharing your work!

I want to share my observations, from the perspective of a novice.

  • Display of textures and materials on objects in the preview window.
  • Material ID and Object ID. Without the ability to get masks for objects, compositing becomes impossible, and without compositing there is no power. Luxrender recently implemented object selection through colors. It is very convenient, requires no setup in advance, and is better than what is implemented in Cycles.

These two things, in my opinion, are the most critical for artists. Of extreme importance.

  • A node to save a color. This is very necessary when setting up a material. Luxrender has the best implementation of this.

  • There is no way to copy a material. The absence of this basic function is surprising.

  • Ray Bias. A completely unclear parameter.
    Not for artists.

  • I really liked the simplicity of the render settings. This is for artists; it’s great.

  • Adaptive sampling. It copes with its task very well, and eliminates the need to split the render into layers with different sampler settings.

  • I didn’t like the denoiser. But in truth, I have met only one good denoiser, in Octane. Luxrender’s is good, but it devours system resources.

  • I liked the environment types: a wide range for all occasions. I especially liked the “gradient” type.

1 Like

The reply came from the customer.
I note that this is the 25th revision!

“Hard to give any comments :) Everything looks pretty cool :)
The only thing that bothers me is that the button is dark.”

Well, that was the last shot in this battle.

Now I have to make 11 more such models, and 5 renders of each.
Where is my coffee with honey?


I’m not sure what you mean by this. Textures and materials are fully usable in the preview window and interactive rendering.

In my understanding, this cube should look pink, not black.
Every material is displayed as black.
The same goes for textures in the viewport shading mode: all objects are black.

Ah, you’re talking about the 3D viewport.

Coloring objects according to the material’s base or diffuse color is something we’re doing in the 3ds Max plugin; maybe we could do the same in Blender? Going beyond that by showing textures would be nice, but I don’t know how much work that would be.

Increase the lamp intensity to 75 or 100. The default is very low.

Edit: or do you mean the viewport “Material” mode? That’s a trickier proposition: it involves translating appleseed shaders into the GLSL shading system that the material view uses. That’s way above my coding skill level at this point.

Oh oh oh…
oh oh…
I cannot find the words,
at least any that exist in the English language.

1 Like

Yes, for that mode they wrote custom GLSL shaders that mimic the appearance of Cycles materials.

And as far as I know that system is not exposed to Python so we wouldn’t be able to access it anyway.

Unfortunately, render engine addons face some serious limitations when integrating into Blender. Thanks to Simon Wendsche we managed to cheat around the slow mesh export by using a backdoor into the raw data structures, but other systems don’t have the same tricks.