colorAt does not return the proper color
Hi all, I am using a solid white rectangle (10 by 10 pixels) and I am trying (unsuccessfully) to read the color of a pixel and get it back as a value.
This is my code:

```lua
local img = hs.image.imageFromPath("test-white.png")
local col = img:colorAt({2,2})
print(hs.inspect(col))
```
Since the entire image is solid white (#FFFFFF), this is the expected result:

```lua
{ alpha = 1.0, blue = 1.0, green = 1.0, red = 1.0 }
```

Instead, this is the observed result:

```lua
{ __luaSkinType = "NSColor", alpha = 1.0, blue = 0.99984186887741, green = 1.0, red = 0.99989891052246 }
```
What am I doing wrong?

Ummmm, good question. I can only assume it's either a maths/rounding issue (maybe you simply need to round the value to one decimal place?), or something to do with ColorSync. @asmagill might have some ideas?
`hs.image:colorAt()` is fairly basic:

```objc
/// hs.image:colorAt(point) -> table
/// Method
/// Reads the color of the pixel at the specified location.
///
/// Parameters:
///  * `point` - a `hs.geometry.point`
///
/// Returns:
///  * A `hs.drawing.color` object
static int colorAt(lua_State* L) {
    LuaSkin *skin = [LuaSkin shared] ;
    [skin checkArgs:LS_TUSERDATA, USERDATA_TAG, LS_TTABLE, LS_TBREAK] ;
    // Source: https://stackoverflow.com/a/33485218/6925202
    @autoreleasepool {
        NSImage *theImage = [skin luaObjectAtIndex:1 toClass:"NSImage"] ;
        NSPoint point = [skin tableToPointAtIndex:2] ;
        // Source: https://stackoverflow.com/a/48400410
        [theImage lockFocus];
        NSColor *pixelColor = NSReadPixel(point);
        [theImage unlockFocus];
        [skin pushNSObject:pixelColor];
    }
    return 1;
}
```
...however, looking at it now, it is using the deprecated `NSReadPixel()` function, so it might be worth looking into something like this.
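If this gets revisited, one non-deprecated approach (a rough sketch only, untested) would be to read the pixel from an `NSBitmapImageRep` via `colorAtX:y:` instead of going through `lockFocus`/`NSReadPixel()`:

```objc
// Hypothetical replacement for the body above -- same arguments, but reads
// the pixel from a bitmap representation of the image:
@autoreleasepool {
    NSImage *theImage = [skin luaObjectAtIndex:1 toClass:"NSImage"] ;
    NSPoint point     = [skin tableToPointAtIndex:2] ;
    NSBitmapImageRep *rep = [NSBitmapImageRep imageRepWithData:[theImage TIFFRepresentation]] ;
    // colorAtX:y: addresses the rep's own pixel grid (top-left origin)
    NSColor *pixelColor = [rep colorAtX:(NSInteger)point.x y:(NSInteger)point.y] ;
    [skin pushNSObject:pixelColor] ;
}
```

Whether this also sidesteps the ColorSync conversion seen above would need to be verified; it depends on the color space of the bitmap rep produced by `TIFFRepresentation`.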
Has a solution ever been found for this? Can the StackOverflow answer be merged? I'm happy to provide a PR if that's the way forward.