Following concerns raised in the commit that changed the name initially,
rB2e19509e60b39837, it makes more sense to keep the "Surface" name for
this node because it has a specific meaning that should not be confused
with other types of subdivision.
This commit includes nodes to build the following primitives:
- Cone
- Cylinder
- Circle
- Cube
- UV Sphere
- Ico Sphere
- Line
- Plane/Grid
In general the inputs are the same as the corresponding operators
in the 3D view.
**Line Primitive**
The line primitive has two modes: adding vertices between two end
points, or adding vertices each at an offset from the start point.
For the former mode, there is a choice between a vertex count
and a distance between each point.
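A minimal Python sketch of the two modes (only the vertex-count variant of the first mode is shown; function and parameter names are illustrative, not the node's actual inputs):
```
# Sketch of the two line modes; names are illustrative, not Blender API.

def line_between_points(start, end, count):
    """Place `count` vertices evenly between two end points (inclusive)."""
    if count < 2:
        return [list(start)]
    step = [(e - s) / (count - 1) for s, e in zip(start, end)]
    return [[s + step[i] * n for i, s in enumerate(start)] for n in range(count)]

def line_from_offset(start, offset, count):
    """Place `count` vertices, each at a fixed offset from the previous one."""
    return [[s + o * n for s, o in zip(start, offset)] for n in range(count)]

print(line_between_points((0, 0, 0), (0, 0, 2), 5))
print(line_from_offset((0, 0, 0), (0, 0, 0.5), 5))
```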
**Plane Primitive**
This commit includes the "Plane" and "Grid" primitives as one node.
Generally primitives are named after the simpler form of the shape they
create (e.g. "Cone" can also make more complex shapes). Also, generally
you want to tweak the number of subdivisions anyway, so defaulting to
plane is not an inconvenience. And generally having fewer redundant
base primitives is better.
**Future Improvements**
A following patch proposes to improve the speed of the cylinder, cone,
and sphere primitives: D10730. Additional possible future improvements
would be adding subdivisions to the cube node and rings to the cone
and cylinder nodes.
Differential Revision: https://developer.blender.org/D10715
In the current implementation, cryptomatte passes are connected to the node
and elements are picked by using the eyedropper tool on a special pick channel.
This design has two disadvantages: connecting all passes individually
and always having to switch to the picker channel are both tedious.
With the new design, the user selects the RenderLayer or Image from which the
Cryptomatte layers are directly loaded (the type of pass is determined by an
enum). This allows the node to automatically detect all relevant passes.
Then, when using the eyedropper tool, the operator looks up the selected
coordinates from the picked Image, Node backdrop or Clip and reads the picked
object directly from the Renderlayer/Image, therefore allowing picking in any
context (e.g. by clicking on the Combined pass in the Image Viewer). The
sampled color is looked up in the metadata and the actual name is stored
in the cryptomatte node. This also makes it possible to remove a hash by simply
removing the name from the matte id.
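Conceptually, the picked value just has to be matched against the Cryptomatte manifest. A rough Python sketch, assuming the manifest has already been decoded into a name-to-float-id mapping (illustrative only, not the actual compositor code):
```
# Illustrative only: match a sampled Cryptomatte id against decoded metadata.
def lookup_picked_name(sampled_id, manifest, tolerance=1e-9):
    """manifest: dict mapping object/material names to their float ids."""
    for name, float_id in manifest.items():
        if abs(float_id - sampled_id) <= tolerance:
            return name
    return None  # nothing picked, or id not present in this layer

manifest = {"Cube": 0.1303386, "Suzanne": -0.5270713}  # made-up example ids
print(lookup_picked_name(0.1303386, manifest))  # -> "Cube"
```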
Technically there is some loss of flexibility because the Cryptomatte pass
inputs can no longer be connected to other nodes, but since any compositing
done on them is likely to break the Cryptomatte system anyway, this isn't
really a concern in practice.
In the future, this would also make it possible to automatically translate
values to names by looking up the value in the associated metadata of the input,
or to get a better visualization of overlapping areas in the Pick output, since
we could blend colors now that the output doesn't have to contain the exact value.
Idea + Original patch: Lucas Stockner
Reviewed By: Brecht van Lommel
Differential Revision: https://developer.blender.org/D3959
Currently (in geometry nodes) you can delete the group input or group
output nodes with no way to get them back without copy and paste. This
adds them to the "Group" submenu of the add menu so at least there is
a way to add them back.
Additionally, these nodes are moved to the "Group" submenu for all node
editors. This makes sense since they are not like the other input or
output nodes; they really just relate to how groups are organized.
Differential Revision: https://developer.blender.org/D10241
The Attribute Convert node provides functionality to change attributes
between different domains and data types. Previously, it was impossible to
write to a UV Map attribute with the attribute math nodes since they
did not output a 2D vector type. This makes it possible to
"convert into" a UV map attribute.
The data type conversion uses the implicit conversions provided by
`nodes/intern/node_tree_multi_function.cc`.
The `Auto` domain mode chooses the domain based on the following rules (see the sketch after this list):
1. If the result attribute already exists, use that domain.
2. If the result attribute doesn't exist, use the source attribute domain.
3. Otherwise use the default domain (points).
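A small Python sketch of that decision logic (function and parameter names are made up for illustration):
```
# Illustrative decision logic for the "Auto" domain mode.
DEFAULT_DOMAIN = "POINT"

def choose_domain(result_attribute_domain, source_attribute_domain):
    """Each argument is the attribute's domain, or None if it doesn't exist."""
    if result_attribute_domain is not None:   # 1. result attribute already exists
        return result_attribute_domain
    if source_attribute_domain is not None:   # 2. fall back to the source attribute
        return source_attribute_domain
    return DEFAULT_DOMAIN                     # 3. default to points

print(choose_domain(None, "CORNER"))  # -> "CORNER"
print(choose_domain(None, None))      # -> "POINT"
```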
See {T85700}
Differential Revision: https://developer.blender.org/D10624
This patch adds a node that removes an attribute if possible;
otherwise it adds an error message.
Differential Revision: https://developer.blender.org/D10697
This makes the following changes to the names of the two
geometry nodes subdivision nodes:
- `Subdivision Surface` -> `Subdivide Smooth`
- `Subdivision Surface Simple` -> `Subdivide`
Most of the benefit is that the names are shorter, but it also better
mirrors the naming of operations in edit mode, and phrases the names
more like actions. This was discussed with the geometry nodes team.
This commit adds a simple string input node, intended for use in the
attribute workflow to make using the same attribute name in multiple
places easier. The node is a function node similar to the existing vector
input node.
Ref T84971
Differential Revision: https://developer.blender.org/D10316
Add the Simple subdivision option to Geometry nodes, as a new node
instead of part of the existing subdivision node because of future
backend changes to the Simple option. (See T85584)
https://developer.blender.org/D10409
These are similar to the regular "Combine XYZ" and "Separate XYZ" nodes,
but they work on attributes. They will make it easier to switch between
vector attributes and float attributes.
Differential Revision: https://developer.blender.org/D10308
This node takes a volume and generates a mesh on its "surface".
The surface is defined by a threshold value.
Currently, the node only works on volumes generated by the
Points to Volume node. This limitation will be resolved soonish.
Ref T84605.
Differential Revision: https://developer.blender.org/D10243
This node calculates a distance from each point to the closest position
on a target geometry, similar to the vertex weight proximity modifier.
Mapping the output distance to a different range can be done with an
attribute math node after this node.
A drop-down changes whether to calculate distances from points,
edges, or faces. In points mode, the node also calculates distances
from point cloud points.
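In points mode the computation boils down to a nearest-point search. A brute-force Python sketch (the actual node is far more efficient; names here are illustrative):
```
import math

# Brute-force distance from each point to the closest target point.
def proximity_distances(points, target_points):
    distances = []
    for p in points:
        closest = min(math.dist(p, t) for t in target_points)
        distances.append(closest)
    return distances

print(proximity_distances([(0, 0, 0), (2, 0, 0)], [(1, 0, 0), (5, 5, 5)]))
# -> [1.0, 1.0]
```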
Design task and use cases: T84842
Differential Revision: https://developer.blender.org/D10154
This node outputs true when geometry nodes are currently being evaluated
for the viewport and false for final renders.
Ref T85277.
Differential Revision: https://developer.blender.org/D10302
Implements a node to get collection objects.
These objects are then passed along as instances in the node tree.
Follow up tasks:
Multiple nodes do not support instancing yet: T85159
Changing collection offset does not trigger a refresh: T85274
Reviewed By: Jacques, Dalai, Hans
Differential Revision: http://developer.blender.org/D10151
This implements a new geometry node based on T84606.
It is the first node that generates a `VolumeComponent`.
Differential Revision: https://developer.blender.org/D10169
This node allows sampling a texture for every vertex based on some
mapping attribute. Typical attribute names are the name of a uv map
(e.g. "UVMap") and "position". However, every attribute that can be
converted to a vector implicitly is supported.
It should be noted that as of right now, uv map attributes can only be
accessed after a Point Distribute node.
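Conceptually, each vertex's mapping attribute is used as a 2D coordinate into the texture. A nearest-neighbor Python sketch with made-up data structures (not the actual texture evaluation):
```
# Illustrative nearest-neighbor sampling of an image for each vertex.
def sample_texture(mapping, image, width, height):
    """mapping: per-vertex (u, v) pairs; image: row-major list of RGBA tuples."""
    colors = []
    for u, v in mapping:
        x = min(int(u * width), width - 1)
        y = min(int(v * height), height - 1)
        colors.append(image[y * width + x])
    return colors

image = [(0, 0, 0, 1), (1, 1, 1, 1)]          # a 2x1 "texture"
print(sample_texture([(0.1, 0.0), (0.9, 0.0)], image, 2, 1))
```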
Ref T82584.
Differential Revision: https://developer.blender.org/D10121
The translate node moves every point in the geometry, and the scale
node multiplies the "scale" attribute of the input geometry by its input.
While these operations are already possible with the "Attribute" nodes,
these new nodes fit nicely with the nodes specifically for changing the
"rotation" attribute that already exist, and they provide a simpler way
to do the same thing.
Differential Revision: https://developer.blender.org/D10100
This adds a new Align Rotation to Vector node based on the mockup
in T83669.
Reviewers: HooglyBoogly, simonthommes
Differential Revision: https://developer.blender.org/D10081
This patch implements the same operations and interface as the regular
vector math node, but it runs for every element of the attribute. This
should expand what's possible with geometry nodes quite a bit.
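The idea is simply to apply the chosen vector operation to each element of the attribute arrays. A minimal Python sketch showing only two of the operations (names are illustrative):
```
# Illustrative per-element vector math over attribute arrays.
def attribute_vector_math(operation, a_values, b_values):
    ops = {
        "ADD": lambda a, b: tuple(x + y for x, y in zip(a, b)),
        "CROSS_PRODUCT": lambda a, b: (
            a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0],
        ),
    }
    op = ops[operation]
    return [op(a, b) for a, b in zip(a_values, b_values)]

print(attribute_vector_math("ADD", [(1, 0, 0)], [(0, 2, 0)]))  # -> [(1, 2, 0)]
```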
Differential Revision: https://developer.blender.org/D9914
This node updates the "rotation" attribute on points.
Multiple ways to specify the rotation are supported.
Differential Revision: https://developer.blender.org/D9883
Ref T83668.
This new node increases the radiance of an image by a scalar value.
Previously, the only way to adjust the exposure of an image was with a
math node or using the scene's color management.
Reviewed By: jbakker
Differential Revision: https://developer.blender.org/D9677
Ref: T82651
Normally people use "Combine XYZ" to input a vector, but it is more
interesting to have an explicit vector input.
So this is basically "Combine XYZ" without any input sockets, the values
are stored in the node itself.
Differential Revision: https://developer.blender.org/D9885
This patch adds two related nodes, a node for separating points
and mesh vertices based on a boolean attribute input, and a node
for creating boolean attributes with comparisons.
See the differential for an example file and video.
Point Separate (T83059)
The output in both geometries is just point data, contained in the mesh
and point cloud components, depending on which components had data in the
input geometry. Any points with the mask attribute set to true will be
moved from the first geometry output to the second. This means that
for meshes, all edge and face data will be removed. Any point domain
attributes are moved to the correct output geometry as well.
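The core behavior is a partition of point data by the boolean mask. A simplified Python sketch where attribute storage is reduced to plain dicts of lists (illustrative, not the real component code):
```
# Illustrative partition of points and their point-domain attributes by a mask.
def point_separate(mask, attributes):
    """attributes: dict of name -> per-point list; returns (kept, separated)."""
    kept, separated = {}, {}
    for name, values in attributes.items():
        kept[name] = [v for v, m in zip(values, mask) if not m]
        separated[name] = [v for v, m in zip(values, mask) if m]
    return kept, separated

attrs = {"position": [(0, 0, 0), (1, 0, 0), (2, 0, 0)], "radius": [0.1, 0.2, 0.3]}
print(point_separate([False, True, False], attrs))
```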
Attribute Compare (T83057)
The attribute compare does the "Equal" and "Not Equal" operations by
comparing vectors and colors based on their distance from each other.
For other operations, the comparison is between the lengths of the
vector inputs. In general, the highest complexity data type is used
for the operation, and a new function to determine that is added.
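A rough Python sketch of that comparison logic for vector inputs (the threshold and operation names are illustrative, not the node's exact parameters):
```
import math

# Illustrative Attribute Compare logic for vector inputs.
def compare_vectors(operation, a, b, threshold=0.001):
    if operation in {"EQUAL", "NOT_EQUAL"}:
        # Equal/Not Equal compare the distance between the two vectors.
        equal = math.dist(a, b) <= threshold
        return equal if operation == "EQUAL" else not equal
    # Other operations compare the lengths of the two vectors.
    length_a, length_b = math.hypot(*a), math.hypot(*b)
    return {
        "GREATER_THAN": length_a > length_b,
        "LESS_THAN": length_a < length_b,
    }[operation]

print(compare_vectors("EQUAL", (1, 0, 0), (1, 0, 0.0005)))    # -> True
print(compare_vectors("GREATER_THAN", (2, 0, 0), (1, 1, 0)))  # -> True
```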
Differential Revision: https://developer.blender.org/D9876
This node can be used to mix two attributes in various ways.
The blend modes are the same as in the MixRGB shader node.
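A few of the blend modes, sketched per element in Python (only a small subset, with illustrative names and standard MixRGB-style formulas):
```
# Illustrative per-element blending, mirroring a few MixRGB-style modes.
def mix_attribute(blend_mode, factor, a, b):
    def mix(x, y):
        return x * (1.0 - factor) + y * factor
    if blend_mode == "MIX":
        return tuple(mix(x, y) for x, y in zip(a, b))
    if blend_mode == "MULTIPLY":
        return tuple(mix(x, x * y) for x, y in zip(a, b))
    if blend_mode == "ADD":
        return tuple(mix(x, x + y) for x, y in zip(a, b))
    raise ValueError(blend_mode)

print(mix_attribute("MIX", 0.5, (1.0, 0.0, 0.0), (0.0, 0.0, 1.0)))
```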
Differential Revision: https://developer.blender.org/D9737
Ref T82374.
This commit adds a node that fills every element of an attribute
with the same value. Currently it supports float, vector, and color
attributes. An immediate use case is for "billboard" scattering.
Currently people are using the same input to a Random Attribute node's
min and max input to fill every element of a vector with the same value,
which is an unintuitive way to accomplish the same thing.
Differential Revision: https://developer.blender.org/D9790
This patch adds support for AOVs in EEVEE. AOV Outputs can be defined in the
render pass tab and used in shader materials. Both Object and World based
shaders are supported. The AOV can be previewed in the viewport using the
renderpass selector in the shading popover.
AOV names that conflict with other AOVs are automatically corrected. AOV
conflicts with render passes get a warning icon. The reason behind this is that
changing render engines/passes can change the conflict, but you might not notice
it. Changing this automatically would also make the materials incorrect, so best
to leave this to the user.
**Implementation**
The patch copies the AOV structures of Cycles into Blender. The goal is
that Cycles will use Blender's AOV definitions. The logic of these structures
is implemented in the Blender kernel (`layer.c`).
The GLSL shader of any GPUMaterial can hold multiple outputs (the main output
and the AOV outputs); the right output is selected based on the renderPassUBO.
This selection uses a hash that encodes the AOV structure. The full AOV needs
to be encoded when actually drawing the material pass, since the AOV type changes
the behavior of the AOV and isn't known yet when the GLSL is compiled.
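Conceptually, a material output only writes when its encoded AOV hash matches the one requested by the render pass. A very rough Python sketch of that idea (this is not the actual GLSL/C code; names and the hash function are made up):
```
import zlib

# Illustrative: encode an AOV (name + type) into a hash and use it to select
# which output a material pass should write.
AOV_TYPE_COLOR = 1
AOV_TYPE_VALUE = 2

def aov_hash(name, aov_type):
    return zlib.crc32(name.encode()) ^ aov_type

def select_output(requested_hash, material_outputs):
    """material_outputs: dict of hash -> callable producing that output."""
    return material_outputs.get(requested_hash)

outputs = {aov_hash("MyMask", AOV_TYPE_VALUE): lambda: 1.0}
writer = select_output(aov_hash("MyMask", AOV_TYPE_VALUE), outputs)
print(writer() if writer else "this pass does not write this AOV")
```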
**Future Developments**
* The AOV definitions in the render layer panel aren't shared with Cycles.
Cycles should be migrated to use the same view layer AOVs. During a previous
attempt this failed because the AOV validation in Cycles and in Blender has
implementation differences that made it crash when an AOV name was invalid.
This could be fixed by extending the external render engine API.
* Add support to Cycles to render AOVs in the 3d viewport.
* Use a drop down list for selecting AOVs in the AOV Output node.
* Give user feedback when multiple AOV output nodes with the same AOV name
exist in the same shader.
* Fix viewing single channel images in the image editor [T83314]
* Reduce viewport render time by only rendering the needed draw passes. [T83316]
Reviewed By: Brecht van Lommel, Clément Foucault
Differential Revision: https://developer.blender.org/D7010
This is the initial merge from the geometry-nodes branch.
Nodes:
* Attribute Math
* Boolean
* Edge Split
* Float Compare
* Object Info
* Point Distribute
* Point Instance
* Random Attribute
* Random Float
* Subdivision Surface
* Transform
* Triangulate
It includes the initial evaluation of geometry node groups in the Geometry Nodes modifier.
Notes on the Generic attribute access API
The API adds an indirection for attribute access (a rough sketch follows the benefits list). That has the following benefits:
* Most code does not have to care about how an attribute is stored internally.
This is mainly necessary because we have to deal with "legacy" attributes
such as vertex weights and attributes that are embedded into other structs
such as vertex positions.
* When reading from an attribute, we generally don't care what domain the
attribute is stored on. So we want to abstract away the interpolation that
adapts attributes from one domain to another domain (this is not
actually implemented yet).
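A rough Python sketch of what this indirection buys: calling code reads through one interface whether the data lives in a dedicated array or is embedded in another struct (this mirrors the idea only, not the actual C++ classes):
```
# Illustrative read-accessor indirection over different internal storages.
class ReadAttribute:
    def __len__(self):
        raise NotImplementedError
    def get(self, index):
        raise NotImplementedError

class ArrayReadAttribute(ReadAttribute):
    """Attribute stored as a plain array."""
    def __init__(self, values):
        self.values = values
    def __len__(self):
        return len(self.values)
    def get(self, index):
        return self.values[index]

class VertexPositionReadAttribute(ReadAttribute):
    """'Legacy' attribute embedded in another struct (vertex positions)."""
    def __init__(self, vertices):
        self.vertices = vertices
    def __len__(self):
        return len(self.vertices)
    def get(self, index):
        return self.vertices[index]["co"]

def average(attribute):
    # Calling code does not care how the attribute is stored internally.
    return sum(attribute.get(i) for i in range(len(attribute))) / len(attribute)

print(average(ArrayReadAttribute([1.0, 2.0, 3.0])))  # -> 2.0
```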
Other possible improvements for later iterations include:
* Actually implement interpolation between domains.
* Don't use inheritance for the different attribute types. A single class for read
access and one for write access might be enough, because we know all the ways
in which attributes are stored internally. We don't want more different internal
structures in the future. On the contrary, ideally we can consolidate the different
storage formats in the future to reduce the need for this indirection.
* Remove the need for heap allocations when creating attribute accessors.
It includes commits from:
* Dalai Felinto
* Hans Goudey
* Jacques Lucke
* Léo Depoix
The design for how we approach the "Everything Nodes" project
has changed. We will focus on a different part of the project initially.
While future me will likely refer back to some of the code I remove here,
there is no point in keeping this code around in master currently.
It would just confuse other developers working on the project.
This does not remove the simulation modifier and data block. Those are
just cleaned up, so that the boilerplate code can be reused in the future.
The hardcoded age limit is now gone. The behavior can be implemented
with an Age Reached Event and Kill Particle node. Other utility nodes
to handle age limits of particles can be added later. Adding an
Age Limit attribute to particles on birth will be useful for some effects,
e.g. when you want to control the color or size of a particle over its
life time.
The Random Float node takes a seed currently. Different nodes will
produce different values even with the same seed. However, the same
node will generate the same random number for the same seed every
time. The "Hash" of a particle can be used as seed. Later, we'd want
to have more modes in the node to make it more user friendly.
Modes could be: Per Particle, Per Time, Per Particle Per Time,
Per Node Instance, ...
Also a Random Vector node will be useful, as it currently has to be
built using three Random Float nodes.
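The determinism described above can be pictured as hashing the seed together with a per-node and per-particle value. A hedged Python sketch of that idea (made-up helper, not the actual implementation):
```
import hashlib

# Illustrative: the same (seed, node, particle) always yields the same float,
# while different nodes diverge even for the same seed.
def random_float(seed, node_instance_id, particle_hash):
    data = f"{seed}:{node_instance_id}:{particle_hash}".encode()
    digest = int.from_bytes(hashlib.blake2b(data, digest_size=8).digest(), "big")
    return digest / float(2**64)  # map to [0, 1)

print(random_float(seed=42, node_instance_id=1, particle_hash=7))
print(random_float(seed=42, node_instance_id=2, particle_hash=7))  # differs
```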
The following nodes work now (although things can still be improved of course):
Particle Birth Event, Particle Time Step Event, Set Particle Attribute and Execute Condition.
Multiple Set Particle Attribute nodes can be chained using the "Execute" sockets.
They will be executed from left to right.
Object sockets work now, but only the new Object Transforms and the
Particle Mesh Emitter nodes use them. The emitter does not actually
use the mesh surface yet. Instead, new particles are just emitted around
the origin of the object.
Internally, handles to object data blocks are passed around in the network,
instead of raw object pointers. Using handles has a couple of benefits:
* The caller of the function has control over which handles can be resolved
and can therefore limit access to specific data. The set of data blocks that
is accessed by a node tree should be known statically. This is necessary
for a proper integration with the dependency graph.
* When the pointer to an object changes (e.g. after restarting Blender),
all handles are still valid.
* When an object is deleted, the handle is invalidated without causing crashes.
* The handle is just an integer that can be stored per particle and can be cached easily.
The mapping between handles and their corresponding data blocks is
stored in the Simulation data block.
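A small Python sketch of the handle idea: particles store only an integer, and the owning data block holds the mapping, which may or may not resolve (names are illustrative, not the actual implementation):
```
# Illustrative handle map: integer handles resolve to data blocks, or to None
# once the data block is gone, without dangling pointers.
class DataBlockMap:
    def __init__(self):
        self._next_handle = 1
        self._blocks = {}

    def add(self, data_block):
        handle = self._next_handle
        self._next_handle += 1
        self._blocks[handle] = data_block
        return handle

    def resolve(self, handle):
        return self._blocks.get(handle)  # None if deleted or never registered

    def remove(self, data_block):
        for handle, block in list(self._blocks.items()):
            if block == data_block:
                del self._blocks[handle]

handles = DataBlockMap()
h = handles.add("Cube object")
print(handles.resolve(h))      # -> "Cube object"
handles.remove("Cube object")
print(handles.resolve(h))      # -> None, handle invalidated but safe
```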
This commit adds the initial set of particles nodes. These are fairly
low level and are expected to be put into groups that we ship with Blender.
See D7384 for a description of the individual nodes.
Reviewers: brecht
Differential Revision: https://developer.blender.org/D7384
This adds an embedded node tree to the simulation data block DNA.
The UI in the `Simulation Editor` has been updated to show a list
of simulation data blocks, instead of individual node trees.
The new `SpaceNodeEditor.simulation` property wraps the existing
`SpaceNodeEditor.id` property. It allows scripts to get and set
the simulation data block that is being edited.
Reviewers: brecht, mont29
Differential Revision: https://developer.blender.org/D7301