It's been a while since I shared here, mostly because BC continues to run great, and my setup has been stable.
Problem
That said, I kept having issues with Tilt accuracy: changing the battery throws off its factory calibration, which is why the Tilt/Tilt2 app has its own calibration feature. I was getting frustrated with changing my offset each time I measured my wort with my EasyDens to maintain some level of accuracy, since 1) that wastes wort, and 2) my Supabase database carries inaccurate numbers until I adjust the offset (and I want some accuracy for a personal phone app and data analysis project I am doing).
Solution approach
Calibrating the Tilt for BC has been covered a bit in this thread, but I wanted to offer my approach to get it accurate enough for government work without having to calibrate with a custom wort/sugar solution each time the battery is changed and without using a CSV lookup table. It requires a script, and the Tilt must be properly calibrated in pure water (with another gravity tool, like an EasyDens, confirming the water reads 1.000).
Assumptions
I work from the assumption that Tilt calibrates each device at the factory with the OEM battery, hardcoding both the highest calibration point (e.g., 1.120) and the lowest (1.000). From there, the assumption is that Tilt uses a linear scale factor to convert X degrees of tilt angle to Y gravity. Since the Tilt's upper calibration point can't be changed, we have to work with whatever it thinks the scale is between that highest point and the new 1.000. If the battery weight makes that scale factor smaller or larger, we have to deal with it. But it is still linear in some way.
With that, since the gravity drop during fermentation is also linear, we can use the OG of the wort to set our own scale factor, where the Tilt reading is the independent variable and the actual SG (as referenced by an EasyDens or standard hydrometer, for example) is the dependent variable. By working from the OG of the given beer, you don't have to do a full-scale calibration when you change the battery (e.g., making a solution with a known gravity and recording what the Tilt thinks it is). It gets enough precision to make the Tilt data meaningful for whatever you need. You also don't need a lookup table or spreadsheet (though I use an Excel sheet with math similar to the script below to see what I should expect, so I can check that my script works well).
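If you want to see the underlying math outside of BC, here is a minimal Python sketch of the same rescaling. The function name and the sample gravities are hypothetical, purely for illustration:

```python
def adjust_tilt_sg(tilt_sg: float, tilt_og: float, actual_og: float) -> float:
    """Rescale a Tilt SG reading using the ratio of actual to Tilt gravity points.

    "Gravity points" are the decimal portion of an SG reading (SG - 1).
    The ratio is the slope: actual points per Tilt-reported point.
    """
    ratio = (actual_og - 1) / (tilt_og - 1)
    return 1 + (tilt_sg - 1) * ratio

# Hypothetical example: Tilt read 1.054 at pitch, EasyDens read 1.050.
# A later Tilt reading of 1.020 rescales down proportionally.
print(round(adjust_tilt_sg(1.020, 1.054, 1.050), 4))  # → 1.0185
```

Note that the correction scales with the reading: at 1.000 (pure water, where the Tilt was calibrated) the adjustment is zero, and it grows toward the OG end of the range.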
I hope this helps someone else who battles Tilt accuracy issues after changing the battery.
BC setup for the code
To start, you need your Tilt device set up without any calibration properties. You also need three globals: one for your Tilt's OG measurement, one for the actual OG (as measured by other means), and one for your adjusted SG value. Mine are automatically set based on Brewfather data, but you can just set the globals in BC when you transfer the wort to the fermenter.
The code
You can use and adjust this script accordingly for your layout/format. For me, "BB1" is used at the beginning of all my devices and globals for my BrewBuilt fermenter (1 being used to distinguish a potential future 2nd BrewBuilt fermenter; I do the same for my Spike and Fermzilla fermenters).
Code:
//Initialize the local script variables
new value oldTilt //Gets what the Tilt device shows so the script will trigger when the value changes
new value ogEasyDens //My measured OG, which I get from my EasyDens
new value ogEasyDens2 //Used for some math
new value ogTilt //What the Tilt showed at the start of fermentation
new value ogTilt2 //Used for some math
new value ratio //My scale factor
new value sgTilt //What the Tilt currently shows
new value sgAdj //The new SG value when set back to an actual SG format
oldTilt = "BB1 Tilt" SG //Set what the starting Tilt value is at when the script starts
[Loop]
wait "BB1 Tilt" SG != oldTilt //Wait until the Tilt's shown value doesn't match the old one
oldTilt = "BB1 Tilt" SG //Reset the oldTilt value so this can trigger again on the next change
ogTilt = "BB1 Tilt OG" value //Get data from the global used to store the Tilt's OG value
ogTilt2 = ogTilt - 1 //Subtract 1 so you are left only with the decimal value (or "gravity points")
ogEasyDens = "BB1 OG" value //Get data from the global used to store the actual OG value
ogEasyDens2 = ogEasyDens - 1 //Subtract 1 so you are left only with the decimal value (or "gravity points")
ratio = ogEasyDens2 / ogTilt2 //Calculate the scale ratio (the slope... dependent variable over the independent variable)
ratio precision = 6 //Get more precision out of the ratio for more accuracy in the new value (i.e., minimize inaccurate rounding)
sgTilt = "BB1 Tilt" SG - 1 //Get the current Tilt value and subtract 1, leaving only the decimal value (or "gravity points")
sgAdj = sgTilt * ratio //Multiply the gravity points by the scale value
sgAdj precision = 6 //Adjust to 6 decimals of precision to limit rounding
sgAdj += 1 //Add the 1 back to show an actual gravity value
"BB1 Gravity" value = sgAdj //Set the adjusted gravity global to the calculated actual SG value
goto "Loop" //Return to loop and wait for the Tilt to have a change in its value
Now you should see the gravity with better accuracy and can use that adjusted SG global for whatever you need (e.g., show it on a custom site/app, push it to Brewfather, trigger fermenter actions, etc.).
Limitations
As the calibration is based on OG readings from the Tilt and another instrument, the scale value is less precise than if you made a high-gravity solution and calculated a static scale value to use until the battery is changed again. But I don't want to make unnecessary high-gravity wort for this, and so far this has been accurate enough for (as a DC resident) government work.
My spot-check gravity readings during fermentation with my EasyDens have matched this script's output within one gravity point, and that difference is just due to rounding, which we can't really control.
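To put a rough number on that rounding effect, here is a quick Python illustration (the scale factor and gravity values are made up): the Tilt reports SG to three decimals, so the reading itself can be off by up to 0.0005 before our scale factor ever touches it, which keeps the adjusted value well within one gravity point (0.001) of the truth.

```python
ratio = 0.050 / 0.054            # example scale factor from an OG calibration
true_sg = 1.0204                 # hypothetical "true" Tilt-side gravity
reported_sg = round(true_sg, 3)  # what a three-decimal Tilt actually reports

adj_true = 1 + (true_sg - 1) * ratio
adj_reported = 1 + (reported_sg - 1) * ratio

# Worst-case input quantization is 0.0005, shrunk further by the ratio,
# so the residual error stays under one gravity point.
print(abs(adj_true - adj_reported))
```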
Even cooler would be if we could set the Tilt device's calibration properties via script, so we wouldn't need a separate global for the adjusted value, e.g.:
"BB1 Tilt" offset = -1
"BB1 Tilt" linearmultiplier = ratio
"BB1 Tilt" offset = +1
But I get how tricky that would be, since the calibration values in the device properties have to be set in order and stored on the device.