infidel-sensor
[Bug]: Unexpected behaviour of the sensor ADC to Diameter
What happened?
The sensor is flashed with infidel_release_ee.ino, the host board with host_ee_prog.ino, so calibration happens over I2C. I have drill bits of different sizes to calibrate with: 1.4, 1.5, 1.6, 1.7, 1.8, 1.9 mm [for my sensor, 2 mm doesn't fit at all, 1.9 mm doesn't even go all the way in, and 1 mm doesn't move the lever].
I expect the normal behaviour of the sensor to be as coded in infidel_release_ee.ino and as indicated in the docs, i.e. the orange curve on the graph: ADC increases with decreasing diameter.
What actually happens with my readings (Command 6) is the opposite (blue curve): my ADC readings increase with diameter.
Of course this breaks the interpolation function convert2dia in infidel_release_ee.ino, so I get incorrect readings when the calibrated values are imported into EEPROM starting from the biggest diameter to the smallest, because the lookup relies on if (dia_table[i][0] > in) ... return (out); break.
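For reference, here is a minimal sketch of how I understand that lookup. This is not the exact firmware code: the function name is suffixed with _sketch, NUM_POINTS is my own name, and only the 617/999 endpoints in the table come from firmware.rst, the intermediate rows are made-up placeholders. The point is that the loop only walks the table correctly if column 0 (the ADC values) is ascending with the row index:

```cpp
#include <stdint.h>

// Placeholder calibration table: {ADC reading, diameter in µm}.
// Only the 617/999 endpoints come from firmware.rst; the intermediate rows
// are made-up values for illustration, NOT the firmware defaults.
// Column 0 (ADC) must be ascending for the loop below to work.
const int NUM_POINTS = 6;
int16_t dia_table[NUM_POINTS][2] = {
  { 617, 2000 },
  { 700, 1800 },
  { 780, 1600 },
  { 850, 1400 },
  { 920, 1200 },
  { 999, 1000 },
};

// Sketch of the lookup as I understand it: find the first row whose ADC
// value exceeds the input, then linearly interpolate between that row and
// the previous one.
int16_t convert2dia_sketch(int16_t in) {
  for (int i = 1; i < NUM_POINTS; i++) {
    if (dia_table[i][0] > in) {
      int32_t out = dia_table[i - 1][1]
          + (int32_t)(in - dia_table[i - 1][0])
            * (dia_table[i][1] - dia_table[i - 1][1])
            / (dia_table[i][0] - dia_table[i - 1][0]);
      return (int16_t)out;
    }
  }
  return dia_table[NUM_POINTS - 1][1];  // above the last point: clamp
}
```

With my values imported from biggest diameter to smallest, column 0 ended up descending instead, which is presumably why the lookup misbehaved for me.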
Another issue is that the ADC range is narrower than expected. The ADC range given in firmware.rst is [617, 999] for drill bits from 1 to 2 mm; for me the values are spread from 26 to 282 for drill bits from 1.4 to 1.9 mm. The resolution is about 2 µm per ADC count (≈ 500/256), which is still OK, but I'm wondering whether this shift towards lower ADC values and the slight reduction in range is normal. Probably not...
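Spelling that estimate out: 1.9 mm − 1.4 mm = 500 µm of diameter is mapped onto 282 − 26 = 256 ADC counts, so roughly 500 / 256 ≈ 2 µm of diameter per ADC count, assuming the response is roughly linear over that span.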
Any solutions?
To solve the incorrect readings, I simply flipped the calibration table so that the ADC values are sorted in ascending order along the row index of the table.
This solved the problem and my readings now seem accurate. However, I'm wondering whether I'm the only one seeing this deviation from the docs, or whether this is something that should be corrected in the documentation.
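In case it helps anyone else hitting the same thing, here is a hypothetical sanity check that could be run after loading the table from EEPROM. The function name and its placement are my own invention, not part of the firmware:

```cpp
#include <stdint.h>

// Hypothetical helper (not in the firmware): returns true only if the ADC
// column (column 0) is strictly ascending with the row index, which is the
// ordering the interpolation quietly relies on.
bool adcColumnAscending(const int16_t table[][2], int numPoints) {
  for (int i = 1; i < numPoints; i++) {
    if (table[i][0] <= table[i - 1][0]) {
      return false;  // ordering violated at row i
    }
  }
  return true;
}
```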
Additional info
I got the unpopulated PCB from Aisler, without altering anything in the SMT files there, and the BOM from Kitspace without deviations.
I have not been able to reproduce this error. I'm wondering if this might have something to do with the polarity of the magnet or hall effect sensor? Long shot, but I can't find anything in the software which would explain this strange phenomenon. Either that, or I don't understand the issue.
I'm not actually sure if it even matters what side of the magnet is facing the sensor, but the polarity of the sensor itself definitely matters. That said, I think your hypothesis that the calibration ordering was wrong seems like a much more plausible one than mine now that I think about it.
Having the ADC values in ascending order definitely was the right thing to do; before that, convert2dia shouldn't really have worked (it would only output the highest value). According to the default table in infidel_release_ee.ino, the diameter should get smaller with bigger readings, so I don't really know what's going on here.
@CCMCAGP do you have calipers to measure the filament yourself?
Today I'm going to try removing the lever and putting in another one with the magnet facing the opposite way. I think the lever being inserted the wrong way round is the most plausible option. The other thing is, I have an N45 magnet instead of an N35. I would assume that would make the sensor react "better" rather than worse, but then again, I don't know.
I interpreted the instructions in the video as "it doesn't matter which of the 3 pins of the sensor goes where on the PCB, as long as you install the magnet such that its polarity doesn't make the LED light up..." (with calibration.ino flashed).
As for the diameters: yes, I have a caliper, and actually even a micrometer, so I'm able to measure the drill bit sizes accurately. But as I said, I couldn't follow the calibration procedure per se, because the 2 mm and 1 mm drill bits either didn't fit or didn't trigger anything.
I don't know about the strength of the magnets, but if you get a usable signal it should be fine. The orientation of the sensor does actually matter (+, -, O, in that order; you wouldn't want to swap them). Again, if you get a signal, this should be fine. The missing calibration points for 1 mm and 2 mm don't matter as long as there are enough points to interpolate from. What I suspect is happening with the current calibration table is that your readings are 'mirrored'. So, for example, for a reading of 800 it will show the correct diameter of 1659. From there, if the reading decreases, the diameter should increase, since the magnet is further away from the sensor; but according to your table, the readings actually increase with diameter. Can you check whether this mirrored behaviour is actually the case? If so, you might be able to just turn the 2nd column of the table upside down (the diameter mappings).
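To make "turn the 2nd column upside down" concrete, here's a rough sketch of what I mean; the table layout and the helper name are assumptions on my side, not taken from the firmware:

```cpp
#include <stdint.h>

// Rough sketch: reverse only the diameter mappings (column 1) while keeping
// the ADC readings (column 0) where they are. Table layout is assumed, not
// copied from the firmware.
void flipDiameterColumn(int16_t table[][2], int numPoints) {
  for (int i = 0; i < numPoints / 2; i++) {
    int16_t tmp = table[i][1];
    table[i][1] = table[numPoints - 1 - i][1];
    table[numPoints - 1 - i][1] = tmp;
  }
}
```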