Documentation for new attrib() function
float angle;

void setup() {
  size(400, 400, P3D);
  noStroke();
}

void draw() {
  background(0);
  pointLight(200, 200, 200, width/2, height/2, -200);
  translate(width/2, height/2);
  rotateY(angle);
  beginShape(QUADS);
  normal(0, 0, 1);
  fill(50, 50, 200);
  // A scalar attribute named brightness, affecting the next two vertices
  attrib("brightness", 0.1);
  // A vector attribute named tangent, affecting the first vertex
  attrib("tangent", 0.1, 0.8, 0.1);
  vertex(-100, +100);
  // Another tangent vector, affecting the second vertex
  attrib("tangent", -0.3, 1, 0);
  vertex(+100, +100);
  fill(200, 50, 50);
  // A new brightness value, affecting the last two vertices
  attrib("brightness", 0.5);
  attrib("tangent", 0.5, 0.5, 0.1);
  vertex(+100, -100);
  attrib("tangent", 0.1, -0.9, 0);
  vertex(-100, -100);
  endShape();
  angle += 0.01;
}
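Note that the custom attributes in this example are only consumed when a custom shader that declares them is active; without one, the sketch runs but the values are ignored. A minimal sketch of the Processing-side setup might look like the following (the shader file names are hypothetical, and the GLSL in them would need to declare a matching vec3 tangent and float brightness):
PShader attribShader;

void setup() {
  size(400, 400, P3D);
  // Hypothetical shader files; they must declare the custom attributes
  // (e.g. "attribute vec3 tangent;" and "attribute float brightness;")
  attribShader = loadShader("attribfrag.glsl", "attribvert.glsl");
  noStroke();
}

void draw() {
  background(0);
  shader(attribShader);  // activate the custom shader before drawing
  // ...then build the QUADS geometry with attrib()/vertex() calls as above
}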
Hey, @codeanticode, where would you put the reference entry for attrib()? Reference: https://processing.org/reference/
Inside Rendering, Rendering/Shaders, or somewhere else?
Also, how would you describe attrib()? Any text or ideas are helpful.
In principle, I would put it under Rendering/Shaders, because custom vertex attributes won't work without the corresponding shader code.
I will write a paragraph describing attrib() shortly.
@REAS What about this:
"The attrib*() functions allow attaching custom values to each vertex in the scene. Processing by default handles several per-vertex attributes: position, color, normal, texture coordinates, etc. These attributes are used by the renderer to determine how the geometry will look on the screen as result of applying the built-in shaders that compute texturing, lighting, etc. However, if the user sets a custom shader that does some additional rendering calculations, then she might need to pass additional information to the her shader, in the form of custom attributes. These attributes can be of three types: position, normal, color, and other. The first three are meant to specify xyz coordinates, normal coordinates, and color components, respectively. The third type can be use to pass any kind of attribute value."
I added a simple example demonstrating the use of custom attributes: https://github.com/processing/processing-docs/tree/master/content/examples/Demos/Graphics/MeshTweening
Hi, I ended up here because I didn't manage to make the attrib() method work with POINTS shapes. Looking at the PShapeOpenGL code for the initPolyBuffers() and initPointBuffers() methods, it seems that the custom attributes are not used in the latter. Is there a reason for that?
My particular problem is that I would like to add the normal information to the POINTS shape to use it in the vertex shader, to do something similar to your example above, but with points instead of QUADS.
I found someone else describing the same problem on the forum: https://forum.processing.org/two/discussion/18688/how-does-the-point-shader-work-with-custom-attributes
Hi, sorry for the slow reply. You cannot set custom attributes for point or line shapes. The main reason for not including this functionality was that points and lines are handled in separate rendering paths from regular polys (points and lines are always screen-facing and are rendered by specialized shaders), so supporting custom attributes for them would have required additional complexity in the GL renderer, and it seemed to me at the time that the main advantage of custom attributes would be for regular polygonal shapes.
There is also the problem of how to differentiate between stroke and poly attributes when rendering a shape without making the API too cumbersome. Consider something like this:
stroke(0);
beginShape(QUADS);
fill(255);
normal(1, 0, 0);
attribNormal("tang", 0, 1, 0);
attrib("thing", 3.0);
vertex(0, 0, 0);
...
endShape();
We don't know whether these two attributes, tang and thing, are meant for the quads or for their strokes. Happy to hear your thoughts on this.
Not sure if this is still an active issue, but I was hoping to use attrib() with POINTS earlier today... I had a point cloud with temporal information and wanted to activate visibility (alpha) on a vertex-by-vertex basis depending on the current system time... I ended up passing the time variable into the shader as alpha, but it's a hack and not entirely obvious for someone trying to follow my code... https://www.instagram.com/p/BilVIwfAESK/?taken-by=johnbcarpenter. I get what you're saying above, though. That said, PShape + PShader is really nice work... thanks Andres
ch_data.beginShape(POINTS);
ch_data.strokeWeight(3.5);
for (int i = 0; i < ch_locs.length; i++) {
  PVector loc = ch_locs[i].copy();
  ch_data.stroke(node_Hs[i], node_Ss[i], node_Bs[i], ch_time[i]); // passing time in as alpha
  ch_data.vertex(loc.x, loc.y, loc.z);
}
ch_data.endShape(CLOSE);
I spent time on this and I'm stuck. Should only attrib() be documented, or should attribPosition(), attribNormal(), and attribColor() also be documented? I'm leaning in favor of keeping all of this in the JavaDoc right now, but I'm not sure how general or specific this syntax is. If documented in the main web reference, they will go into the Rendering category and the description @codeanticode wrote above will be used.
@REAS Do you still need some more info about this API to update the docs?
@codeanticode I'm wondering if they would be in the Web reference at all, and if they are, whether it should stop at attrib() or include the variants for Normal, Color, etc.
The attribute functions are pretty advanced and require knowledge of shaders, so it makes sense to document them only in the JavaDoc for the time being. Right now, the only place they are mentioned at all is in a couple of examples, I think.
The family of Bezier curve functions looks similar, since you have bezier(), bezierDetail(), bezierPoint(), and bezierTangent(), and each is documented separately.
I'm in favor of documenting them only in the JavaDoc because of their specificity.
Sounds good to me!