Notes from Sand Hill



XTension Tech Notes



ID: TN.analog
Subject: General discussion of analog data collection.
Date: 10/6/97
Applicability: All versions of XTension
Contributor: Michael

Just what is involved in reading such things as temperature and humidity, light levels, water pressure, or the strain of a weight ?

Our human senses give us the ability to tell the difference between even slight variations of temperature and humidity, light and dark. We are very adept at it, but that ability is also very social: what is 'cold' or 'warm' or 'hot' to one human can easily be communicated to another human by all sorts of signals and words.

Computers however must be told what 'HOT' and 'COLD' are, and with respect to what.

This would be simple if 'HOT' meant 'ON' and 'COLD' meant 'OFF'. But what about 'WARM' or 'COOL', 'BRIGHT' and 'DIM', or 72 degrees versus 85 ?

There are electronic devices which can sense changes in temperature or pressure, sound, light and even chemical concentrations.

These "sensors" all translate the current 'level' of what it detects (like temperature), into a voltage which is proportional to the input 'level'.

By itself, this voltage is useless to the Macintosh. It must be translated further into a value which can be stored in the computer and manipulated.

So there needs to be a small hardware 'interface' which samples the input voltage and translates it into a digital value and then sends this value to the Macintosh in a format which makes it easy to identify and use.

At the Macintosh, and using XTension, these values need to be displayed to a human, and perhaps used to calculate some response ( to a 'high' temperature, for example ).


An example to make it a little easier:

This technote works through an example: a simple temperature sensor which is monitored by the ADB I/O from Beehive Technologies, and sampled by scripts which are scheduled to run by XTension.

The graphic below will help in the discussion following...

[Graphic: "Analog ?"]

The whole thing appears very complex at first, but once you have the idea, you can see that you won't have to deal with the numbers or the scripts very often.

In this example, we're going to use a common temperature sensor which is cheap ( $3.00 ) and is compatible with the analog inputs of the ADB I/O.

This sensor is the LM34 by National Semiconductor, and is probably one of the most commonly used temp sensors. It has a 'Range' of -40 degF to 230 degF, which means that it is only sensitive to that span. Temperatures above 230 degF or below -40 degF would be meaningless or distorted by the sensor.

The LM34 needs only 3 wires connected between its terminals and the input screw posts of one of the ADB I/O analog channels. ( Yes, you can have up to 4 temp sensors on a single ADB I/O. )

The LM34 'translates' the temperature into a voltage which is relative to the 5 volt reference (Vref) that is provided by the ADB I/O. Thus, the output voltage of the sensor varies between zero and 5 volts, representing a temperature variation between -40 and 230 degF.

If we divide 5 volts by the range of temperature ( 270 degrees total ), we get a value of 0.0185 volts. This value, about 18.5 millivolts, represents ONE degree Fahrenheit, so says the sensor.

So, if the temperature is 70 degrees F, the sensor will output a voltage proportional to the number of degrees ABOVE the low limit of the sensor.

That is the difference between -40 degF and +70 degF, or 110 total degrees,

SO: 110 * 0.0185 volts = 2.035 volts
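
If you want to play with these numbers, the same arithmetic can be run as a plain AppleScript ( it uses no XTension verbs, so any script editor will run it; the variable names here are only illustrative ):

-- A sketch of the sensor arithmetic above.
set SensorLow to -40 -- low end of the LM34 range, in degF
set VoltsPerDeg to 5 / 270 -- the 5 volt reference spread over 270 degrees
set TheTemp to 70 -- the example temperature, in degF
set OutputVolts to (TheTemp - SensorLow) * VoltsPerDeg
-- OutputVolts comes out to about 2.04 volts, agreeing with the 2.035 above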


Now, when this voltage is 'sampled' by the interface (ADB I/O), it is still just a 'voltage', and not in a form that can be used by a computer program or a human.

The voltage needs to be translated into a 'digital' form, and then sent to the Macintosh and XTension for processing and display.


The ADB I/O samples the voltage using an '8-bit analog to digital converter'. This is a semiconductor chip which specifically translates a 0 to 5 volt input into an 8-bit 'raw count', which is then sent to the Macintosh via the Apple Desktop Bus, and then to XTension.

This 'raw count' can represent 256 different values, and the lowest value is zero, so we'll let zero raw counts equal the lowest sensor temperature of -40 degF. Of course, the highest raw count of 255 will then represent 230 degF; but remember, the total range is 270 degF.

It can easily be seen then that each raw count is equivalent to

( 40 + 230 ) / 255 = 270 / 255, or about 1.05 degrees F per count


Therefore, for the example of 70 degF, we would see a raw count of :

( 70 - (-40) ) degrees / 1.05 degrees per count = about 105 raw counts
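
As another plain AppleScript sketch ( again, no XTension verbs are needed, and the names are only illustrative ), the conversion runs in both directions:

-- Degrees F to raw counts, and back again.
set DegPerCount to 1.05 -- degrees F per raw count, from above
set TheTemp to 70 -- degF
set RawCounts to round ((TheTemp + 40) / DegPerCount)
-- RawCounts comes out to 105, matching the worked example
set BackToDeg to (RawCounts * DegPerCount) - 40
-- BackToDeg comes out to about 70 degF again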


This raw value is passed from the ADB I/O whenever a script in XTension requests it. But it must be further processed in order to make it meaningful to humans and to the scripts that you write.


Don't forget that the ADB I/O must be configured once (maybe in the Startup Script). The following line will configure all 4 channels of port B as analog inputs :

--Set up the ADB I/O port B as 4 analog inputs.
configure ADBIO unit 1 port B as {analog in, analog in, analog in, analog in}
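
For instance, a Startup Script might combine that configuration with a first reading, so that "Temperature" has a value as soon as XTension starts up ( this sketch reuses only the verbs shown in this technote; the unit and channel numbers are just the ones from our example ):

--Configure port B, then take a first sample so "Temperature" isn't stale.
configure ADBIO unit 1 port B as {analog in, analog in, analog in, analog in}
set value of "Temperature" to (get ADBIO unit 1 port B channel 1) as integer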

Now we create a script called "Take the Temp" which samples the temperature on channel 1 of port B of the ADB I/O :

set value of "Temperature" to (get ADBIO unit 1 port B channel 1) as integer

This script should be scheduled to run every 3 minutes by the scheduled events dialog, but we still need to do some more processing, because we've only read the 'raw counts' from the ADB I/O.

In the same way that the sensor and the ADB I/O encoded the temperature, we need to un-encode it.

Remembering that each 'raw count' is equal to about 1.05 degF, we need to multiply the "RAW" data by 1.05 to get degrees Fahrenheit above -40 degF. If we then subtract the 40 degrees, we get a normalized temperature in degrees above and below Zero F:

set DegPerCount to 1.05
set Off_set to 40
set RAW to (get ADBIO unit 1 port B channel 1) as integer
set value of "Temperature" to ((round (RAW * DegPerCount)) - ((Off_set as integer))

(The database unit "Temperature" is set up as a 'pseudo unit' which has no X-10 address, and is defined as 'dimmable'.)
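
As a quick sanity check against the earlier arithmetic: a RAW reading of 105 counts gives ( round (105 * 1.05) ) - 40 = 110 - 40 = 70 degF, which is the example temperature we started with.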

The value in "Temperature" will now be sampled and updated every 3 minutes by the scheduled script "Take the Temp".

What you do from there is up to you, but you might add something like :

set DegPerCount to 1.05
set Off_set to 40
set RAW to (get ADBIO unit 1 port B channel 1) as integer
set value of "Temperature" to (round (RAW * DegPerCount)) - Off_set
--
if (value of "Temperature") > 78 then
speak "It's getting warm, please turn on the A/C"
else if (value of "Temperature") < -10 then
speak "It's too cold for me !"
end if

(Yes, you can have negative values in "Temperature".)


Calibrating your new 'Thermometer'

Now that we have all of the parts and scripts, we need to verify that the readings that we display in "Temperature" are at least CLOSE to the real temperature.

By a simple procedure, we can test how accurate we are at 32 degF, the freezing point of water:

Put the temperature probe in a plastic (zip-close) bag.
Allow the wires to the probe to extend out of the bag.
Put the bag and probe into a bowl of ice cubes and water.
Allow this to settle, making sure that water doesn't get in the bag.
After 5-10 minutes, the reading for "Temperature" will probably be wrong !

No worries, we need only adjust things a bit.

What is the difference between the value displayed for "Temperature" and 32 degrees?

If, for example, the displayed value is 43, then we are off by 11 degrees.

We need only adjust the "Off_set" value in the script from 40 to 51 (an effective offset of -40 to -51), and the displayed value should then equal 32.
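
In general, using the names from the script above:

new Off_set = old Off_set + ( displayed value - known temperature ) = 40 + ( 43 - 32 ) = 51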

Of course we will want to test the cold water with a reasonable thermometer, and then we have one end of our 'comfort zone' tested. We still need to check out the higher temperatures.

Just let the probe sit close to a standard wall thermostat for a few minutes, making sure that it hangs in the air and is not touching anything.

If we are lucky, the database "Temperature" will be very close to the temp displayed by the wall thermostat.

If it is only slightly off, we can adjust the "Off_set" value to more closely match the temperature range of the standard human habitat.

If there is too much difference, then adjusting the "Off_set" value alone will make temperatures at the lower end inaccurate. If you have to adjust by more than 5 degrees or so, we may need to adjust the value of each 'raw count', the "DegPerCount" in the script.

Note the 'raw counts' we get at 32 degF, and the raw counts we get when the probe measures 72 degF. The temperature difference, divided by the difference in raw counts, gives the NEW value for "DegPerCount":

( 72 - 32 ) degrees / ( raw counts at 72 - raw counts at 32 ) = NEW value for "DegPerCount"
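
For example ( the raw readings here are hypothetical, just to show the arithmetic ): if the probe reported 68 raw counts in the ice bath and 106 raw counts at 72 degF, the new "DegPerCount" would be ( 72 - 32 ) / ( 106 - 68 ) = 40 / 38, or about 1.05.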

Substituting the new "DegPerCount" into the equation, the processed value for "Temperature" should be accurate to within +/- 1 degree, at least within the normal human range.

Should you want to extend that range, you need only calibrate the equation again using two or more "Known" temperatures as above.


WHOA ! That's too simple !

True, we cheated by not describing all sorts of sensors and their characteristics. And we didn't talk about linearization, 'dead spots', etc. All of these are part of the analog world; however, the description above is complete enough to convey the basics of range, offset, and calibration for many different sensor types.


This has been a hasty technote, and I apologize for its lack of completeness.
But many have asked for this, so you will probably see it change often.

Also, please check out the other technotes and links on our website: Ideas Page.

Additional help is available on our website: www.shed.com
Or directly to Michael and Paul Ferguson at: 407-349-5960
Or mail to Sand Hill Engineering Inc. Box 517 Geneva FL 32732
All icons, graphics and text copyright ©1997 Sand Hill Engineering Inc.