
Simple high precision temperature calibration

Open ruslan-khudyakov opened this issue 6 years ago • 77 comments

  • I'm submitting a ...

    • [ ] Bug report
    • [x] Feature request
    • [ ] Translation
  • Setting up of tip reference temperature as 100°C (temperature of boiling water)

The simplest way to calibrate is to dip the tip into boiling water (water boils at 100 °C / 212 °F).

The currently used method (ambient temperature) expects ≈30°C and is not accurate.

Procedure:

  1. Heat the water to a boil;
  2. Dip tip in the water;
  3. Press "Calibrate"
  4. Done! )))

ruslan-khudyakov avatar May 14 '18 10:05 ruslan-khudyakov

Hi, This has been suggested before, and I might implement this as a second stage calibration.

However, "water" does not boil at 100°C. Pure H2O does, when it's at 1 atmosphere (and thus standard pressure). Any deviation from this can cause errors in the measurement.

At the moment, the temperature sensor in the handle is used (hence the warning to ensure the handle is at the same temperature as the tip). This provides a temperature in 0.1°C increments that is used for the calibration.

Ralim avatar May 14 '18 10:05 Ralim
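Ralim's pressure point can be made concrete with a back-of-the-envelope estimate. The sketch below is an editorial illustration using the Clausius–Clapeyron relation and an isothermal barometric approximation; the constants are textbook values, not anything from this thread or from IronOS:

```python
import math

R = 8.314          # gas constant, J/(mol*K)
DH_VAP = 40660.0   # enthalpy of vaporisation of water near 100 C, J/mol
T0 = 373.15        # boiling point of water at 1 atm, K

def boiling_point_c(pressure_atm):
    """Clausius-Clapeyron estimate of water's boiling point at a given pressure."""
    inv_t = 1.0 / T0 - (R / DH_VAP) * math.log(pressure_atm)
    return 1.0 / inv_t - 273.15

def pressure_at_altitude_atm(height_m):
    """Isothermal barometric approximation (scale height ~8400 m)."""
    return math.exp(-height_m / 8400.0)

print(boiling_point_c(1.0))                             # 100.0 at sea level
print(boiling_point_c(pressure_at_altitude_atm(1000)))  # ~96.6 at 1000 m
```

This agrees with the "~97°C at 1000 m" figure quoted later in the thread, and shows the calibration error from altitude alone can reach a few degrees.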

Hi, thanks for the answer!

Right now my room is at 23°C, yet after calibration the iron shows the tip as 33°C. (It was in standby mode, etc.)

However, that is a long way from 0.1°C…

Yes, the ideal conditions for 100°C are sea level and 1 atmosphere, but even at 1000 meters water still boils at about 97°C. How many people live half a mile above sea level? )))

P.S. as a second way it would be cool!

ruslan-khudyakov avatar May 14 '18 10:05 ruslan-khudyakov

What about calibrating at 37 degrees Celsius? More practical than boiling water and more accurate than "room temperature". Also, come to think of it, the iron can double as a thermometer. ;)

yschaeff avatar May 14 '18 11:05 yschaeff

@yschaeff, yeah, "rectal calibration mode", LOL!!!

ruslan-khudyakov avatar May 14 '18 11:05 ruslan-khudyakov

~~The temp calibration should simply be a user-settable offset like the voltage, so it doesn't matter whether you do it at 15°C, 37°C or 100°C. Everyone owns a thermometer of some sort.~~

Scratch that, I didn't read Ralim's post with any actual intelligence. (Need more coffee.)

JohnEdwa avatar May 14 '18 12:05 JohnEdwa

Anyway, we need a better way to calibrate properly - at least a two-point calibration, i.e. two temperatures - 30 and 100°C? What do you think?

Eldenroot avatar May 14 '18 19:05 Eldenroot

The best way in my opinion is 100°C plus a manual user-settable offset, like the voltage.

ruslan-khudyakov avatar May 14 '18 19:05 ruslan-khudyakov

37 and 100 for a two-point calibration would be the best. One-point calibration is not precise.

Eldenroot avatar May 14 '18 20:05 Eldenroot

@Eldenroot I'm not against a multiple point calibration option. But generally it will just provide a means for the user to mess up the calibration of the iron.

Going further would involve calibrating the thermocouple and the op-amp; calibrating out the tolerance of the op-amp circuitry would require two stages of calibration, at two different ambient temperatures, to allow for a change of the cold junction.

At the moment the firmware assumes an average value for the temperature gain of the system, and the calibration is there to remove the offset that remains unaccounted for after cold-junction correction.

The actual correction required for the cold junction can vary from tip to tip, along with the tolerance of the op-amp and the temperature curve of the tip.

So the calibration required for this would ultimately inflate to a four-step process, done for two different tips, to profile the unit fully - which would still only improve the actual tip temperature slightly.

Note that there is currently an issue with the screen showing a higher temperature than is actually at the tip, due to the filtering done for the UI. This is most pronounced at the low end of the scale and diminishes when the iron is operating. This offset is a bug that will be handled in future.

Ralim avatar May 15 '18 11:05 Ralim
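For reference, the gain-plus-offset correction being discussed reduces to simple algebra once you have two known reference points. A minimal editorial sketch (the raw readings and reference temperatures here are made-up numbers for illustration, not actual TS100 values):

```python
def two_point_cal(raw_lo, true_lo, raw_hi, true_hi):
    """Solve temperature = gain * raw + offset from two known reference points."""
    gain = (true_hi - true_lo) / (raw_hi - raw_lo)
    offset = true_lo - gain * raw_lo
    return gain, offset

def corrected(raw, gain, offset):
    """Apply the calibration to a raw reading."""
    return gain * raw + offset

# Hypothetical: raw reading 310 at a 30 C reference, 1030 at a 100 C reference.
gain, offset = two_point_cal(310.0, 30.0, 1030.0, 100.0)
print(corrected(670.0, gain, offset))  # the midpoint raw reads back as ~65 C
```

A one-point calibration can only fix `offset`; correcting a gain error that grows with temperature (as reported later in this thread) inherently needs the second point.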

OK, you are right. I am looking forward to seeing this bug fixed. Keep up the good work - great firmware.

Eldenroot avatar May 15 '18 11:05 Eldenroot

We have two available constants — 1. Boiling water: 100°C; 2. Melting Ice: 0°C.

The best for home use.

ruslan-khudyakov avatar May 15 '18 12:05 ruslan-khudyakov

Using melting ice & boiling water seems a bit weird to me. The more obvious and relevant calibration point would be the melting point of some eutectic solder alloy like normal 63/37. Besides, a multimeter with a thermocouple is pretty common equipment, and cheap Hakko tip thermometer clones are available for under 10 bucks.

I noticed on my TS100 that the temperature is completely off:

http://www.minidso.com/forum.php?mod=viewthread&tid=3168&extra=page%3D1

This is also not a constant offset; it increases with the set temperature, so a single bias value would not be enough. Calibrating against the second temperature sensor at room temperature does not seem useful, as the error grows with temperature and at 20°C or so it is negligible.

I would really appreciate a feature in the custom firmware that allows this to be mitigated, as with both the current official and custom firmware the difference between actual and displayed temperature is >30°C.

blitzcode avatar May 16 '18 19:05 blitzcode

Melting & boiling points weird? Sorry, but the Celsius scale is based on them. )))

ruslan-khudyakov avatar May 16 '18 20:05 ruslan-khudyakov

Yes, I think so. It's going to be tricky / messy / unreliable / impossible to heat the iron to exactly 100C with boiling water or cool it to exactly 0C with a bunch of ice cubes. Slowly increasing the set temperature till you reach the well-defined melting point of 63/37 solder (which you likely have) seems much simpler and more reliable. The melting point of solder is also closer to the temperatures which you'll actually be using the iron at, making the calibration more likely to be useful in the presence of any non-linearities like the ones I observed when measuring my TS100. Besides, if you have an external reference like a DMM or tip thermometer you can also save yourself from messing around with kitchen supplies to calibrate your iron ;-)

But I'm not an expert on measuring & calibrating soldering irons, I'm happy with any solution that would allow me to get my TS100 to heat up with <10C error from the set temperature.

blitzcode avatar May 16 '18 20:05 blitzcode

@blitzcode, the iron in boiling water will be at exactly 100°C, and we will get an accuracy of ±1°C (or better).

ruslan-khudyakov avatar May 16 '18 20:05 ruslan-khudyakov

If you really want to know and calibrate the tip temperature, get a Hakko FG-100 clone; you can find them for under $15. Or if you really want to cheap out, the sensors are just bog-standard K-type thermocouples that almost any multimeter can read, and can be bought for $4 per 10 pcs.

Talking of the FG-100:

@Ralim, I tested my three tips (B2, BC2 and D24), and only the original B2 was anywhere near accurate; the other two read notably low. Before each test I let the iron cool down and did the tip temperature calibration. Or did I just get two dodgy tips?

[image: tiptemp graphs]

JohnEdwa avatar May 16 '18 22:05 JohnEdwa

Hi, I'll try and get something a bit better in a coming firmware.

Can I ask what tips everyone is using? I only have offset issues on the non-production tip I was sent and on Hakko tips.

It is looking like some models of tips have a much higher error than others.

I own two of the clone Hakko FG-100 units, and they don't give the same result (a difference of about 10°C).

When testing, my older BC2 tips and C1 tips come in really close to the set point; however, the newer BC2 tip is about 15°C off.

I have noticed a significant offset can occur if the offset error cancellation is done with a warm handle or a warm tip.

At the least, I will try to get out an updated firmware that displays the temperature offset when I'm next working on this.

Also, none of the tips are rated above 400°C, and prolonged use above 400°C can degrade the temperature accuracy.

Ralim avatar May 16 '18 22:05 Ralim

@JohnEdwa A clone does not guarantee accuracy. (The original HAKKO FG-100 has a tolerance of ±3°C.)

ruslan-khudyakov avatar May 16 '18 22:05 ruslan-khudyakov

@JohnEdwa

Exactly, this stuff is not expensive. Nice graphs! ;-)

Here are my measurements from the other thread:

Set  TipTherm Fluke
300  267      269
350  308      314
400  355      358

TipTherm = FG-100 clone, Fluke is a Fluke DMM with a K-type thermocouple. I used a D24 tip. I also have a BC2 tip, also very inaccurate.

It seems the FG-100 clone and my DMM agree reasonably closely. If I place the DMM thermocouple on the D24 tip, the measurements are a few °C below the FG-100; if I place it a bit higher on the tip, they're a few degrees above.

I also found this video:

https://www.youtube.com/watch?v=DEEaLMv6dog&feature=youtu.be&t=12m49s

Some users claim they get very accurate temperatures with their TS100; I certainly have an error >30°C. The official firmware is no better. Both my tips have this huge error.

blitzcode avatar May 16 '18 22:05 blitzcode

@Ralim After replacement tip we need recalibration. All tips a little different. (thermocouple position, weight, volume, etc)

ruslan-khudyakov avatar May 16 '18 22:05 ruslan-khudyakov

@ruslan-khudyakov

Clone does not guarantee accuracy. (Original HAKKO FG-100 tolerance of ±3°C)

Of course not - you pay $250 for that promise of accuracy, but the device itself is just a K-type thermocouple reader, and the sensor is a K-type thermocouple with a piece of metal crimped on top - both super simple things for a clone to do at a fraction of the price of the 'name brand'.

And seeing that my $15 clone is within 1.7°C of my Brymen BM869 (0.3% ± 1.5°C), and that testing the same temperature with a regular K-type gives a difference of just 0.8°C, I'd say it's accurate enough for 16% of the cost, wouldn't you say?

[images: device, sensor]

JohnEdwa avatar May 16 '18 23:05 JohnEdwa

@JohnEdwa A 1.7°C difference between the clone and the BM869 doesn't mean anything. Accuracy is the tolerance from the real, correct temperature, not a comparison between cheap gadgets.

Here we don't know what temperature you really had. We just know that these measurements are close to each other. To assess the accuracy we need a temperature standard (etalon).

ruslan-khudyakov avatar May 16 '18 23:05 ruslan-khudyakov

@ruslan-khudyakov

Sure, we have no idea what the actual spec for the FG-100 clone is, how accurate and precise it actually is, or whether its measurements are linear at all. But we do know it for the BM869 - it has a 0.3% ± 1.5°C spec on temperature. For the 294.5°C I measured, the real temperature should be between 292°C and 297°C. And when that same temperature was tested with a different thermocouple, the BM869 read 293.7°C, meaning it should be between 291°C and 296°C.

And in this instance the FG-100 measured 292°C, which fits. If I wanted to, I could check whether my FG-100 clone also fits the 0.3% ± 1.5°C spec - all I would have to do is a bucketload of measurements with both, checking that they all fall within the range of the BM869 spec.

JohnEdwa avatar May 17 '18 00:05 JohnEdwa
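The spec arithmetic in this exchange is easy to mechanise. A small editorial sketch of a "percent of reading plus floor" bound check (the 0.3% ± 1.5°C figures are the ones quoted above for the BM869; the consistency check is a generic interval-overlap test, not a feature of either meter):

```python
def spec_bounds(reading_c, pct=0.003, floor_c=1.5):
    """Return the (low, high) interval implied by a 'pct of reading + floor' accuracy spec."""
    err = reading_c * pct + floor_c
    return reading_c - err, reading_c + err

def overlaps(a, b):
    """True if two (low, high) intervals intersect, i.e. the readings are mutually consistent."""
    return a[0] <= b[1] and b[0] <= a[1]

lo, hi = spec_bounds(294.5)
print(round(lo, 1), round(hi, 1))  # 292.1 296.9 - matches the 292..297 range quoted above
print(overlaps(spec_bounds(294.5), spec_bounds(293.7)))  # True: the two readings agree
```

Note this only checks consistency between instruments; as pointed out above, tying either one to the true temperature still needs a reference standard.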

I got a KU tip today; a quick test at a 320°C set point shows ~290°C on the tip thermometer, so it seems the temperature is completely wrong with this one as well. I don't think there's much need to discuss the finer details of measuring temperature and calibrating soldering irons; the problem appears to be of a different order of magnitude ;-)

blitzcode avatar May 17 '18 11:05 blitzcode

Can I ask what tips everyone is using?

BC2

LarsSimonsen avatar Jun 22 '18 19:06 LarsSimonsen

Just received my C1 tip, and it is way off as well. I redid the measurements and graphs to suit, though this time I went from 200 to 400 instead of 150-450.

JohnEdwa avatar Jul 21 '18 15:07 JohnEdwa

FYI: I'm working on this at the moment, slowly, as I have time/patience. @JohnEdwa Since you have a few tips at your disposal, could you give me a hand by taking a spread of 2-4 temperatures over the range per tip and recording the "Rtip" measurement that is now shown in the debug menu (long-hold the rear button on the idle screen)? They don't need to be perfect temperatures; even rough values would give an idea of the different tip curves.

Ralim avatar Aug 02 '18 03:08 Ralim

@Ralim Sure, though these tips cool off really fast and even more so if I actually try to measure the temperature at the same time, so a way to see Rtip while the iron is on would be immensely helpful.

JohnEdwa avatar Aug 03 '18 12:08 JohnEdwa

Hi All,

I bought two TS100s, one with the full kit of tips. I have now made some tests and this is what I have come up with. I have a Fluke 87 Mk V with a thermocouple. I used all tips un/calibrated and the latest Ralim release (May 7th, 2018).

  1. It is very important to wet the tip with solder when measuring the tip temperature, and to give it time; otherwise smaller tips read much lower than larger tips, and the time is needed to heat the external Fluke thermocouple. These measurements showed the tips were in the range of 10-20 °C below the set temperature, at approx. 200 °C.

  2. When using a 63/37 tin/lead solder with a melting point of about 183 °C, all my tips melt the solder in the range 190-200 °C. The important thing here as well is to overheat the tip and wet it with solder before going down in temperature to test the melting point; otherwise, as above, smaller tips give higher readings.

I have a Weller TS80 and that one is off by about 30 °C (it only has an analog dial, though).

The problem I see when going up in temperature is that the tips deviate more and more the hotter they get; this is also confirmed by JohnEdwa's measurements.

Thinking out loud, I wonder if this is partly due to the design of the tips, where the thermocouple and the heater are integrated. If you hold a fixed temperature you will have a more or less fixed PWM duty cycle for that temperature, and the higher you go, the higher the duty cycle. At higher temperatures, if the wire (heater + thermocouple) within the tip does not reach equilibrium, you measure a higher temperature than the outer part of the tip actually has. The temperature gradient (cooling of the tip) also gets steeper the higher you go in temperature.

Does this make any sense?

)) P

P.S. I would try to push the ADC measurement as far away from the PWM pulse as possible. (I have not looked at the exact implementation, so perhaps this is already the case?)

Repled avatar Aug 05 '18 22:08 Repled

@JohnEdwa Good point, I'll try to get you a testing build soon.

@Repled

I agree with your two points completely, and when measuring I usually try to ensure the top of the thermocouple is inside of the solder glob.

The deviation is a combination of non-linearity, plus the fact that above 400°C the tip starts to run out of ADC/op-amp headroom.

The design comes into this because heating and measuring the tip temperature are mutually exclusive, and there is a recovery time after the end of the heating pulse train for the sensor to stabilise and the op-amp to desaturate. The tip appears to recover faster than the op-amp.

In the current system the hardware is set up to trigger the ADC automatically after a delay at the end of the PWM period. This delay is not perfect, and on some devices the first ADC sample will catch the tail end of this recovery, which leads to a slight over-reading. However, this only occurs at full PWM duty (usually only during the main heat-up), and is mostly hidden by the PID and the thermal reaction time of the tip.

https://ralimtek.com/stm32_double_pwm/

That is a short article on how I set up the timers to get nicely scheduled ADC readings.

Ralim avatar Aug 05 '18 22:08 Ralim
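The scheduling constraint Ralim describes (sample only after the heater pulse plus the op-amp recovery time, and before the next pulse begins) can be sketched as a simple timing check. This is an illustrative editorial model only, not the actual IronOS timer code, and the numbers are arbitrary:

```python
def safe_sample_window(period_us, duty, settle_us):
    """Span of the PWM period where an ADC read is safe: after the heater
    pulse plus the op-amp recovery time, and before the next pulse starts.
    Returns None when the duty cycle leaves no safe window - e.g. full-power
    heat-up, where the first sample catches the recovery tail."""
    heat_end_us = period_us * duty
    start_us = heat_end_us + settle_us
    if start_us >= period_us:
        return None
    return (start_us, period_us)

print(safe_sample_window(1000, 0.5, 100))  # (600.0, 1000) - plenty of room
print(safe_sample_window(1000, 1.0, 100))  # None - full duty leaves no clean sample
```

The `None` case corresponds to the over-reading Ralim mentions during full-duty heat-up, when the fixed trigger delay cannot wait out the op-amp recovery.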