Appendix B - ADC Timing

The conversion rate or sampling speed for the dataTaker analog to digital converter is dependent on a number of factors. These include

the time required to determine which Schedules and channels are to be scanned

the time required to test whether an autocalibration is required now, and if so the time required to perform an autocalibration

the time required to select each channel for conversion

the settling time after selecting a new channel, during which the sensor signal is allowed to stabilize

the time required for the actual analog to digital conversion of the input signal

the time required to change gain (range) if necessary, and to repeat the analog to digital conversion

the time required to convert the primary data from the analog to digital converter into electrical units

the time required to linearise data in electrical units to data in physical units using internal functions, or user defined polynomials or spans

The timing diagram for analog to digital conversion by the dataTaker is shown in the ADC Timing Diagram at the end of this appendix. The time components of the analog to digital conversion process are described below.

Note: The ADC timing given below assumes that return of data to the host in real time is disabled (/r). Any RS232 COMMS or NETWORK communications will increase these times, depending on baud rate.

Schedule Execution Setup Time

The schedule execution setup time is the period required to determine which schedules if any are due for execution, and to carry out autocalibration of the analog to digital converter if necessary.

If no schedules are due for execution, then other functions such as communications, digital inputs and counters are attended to before the schedules are checked again. The process of checking the schedules for eligibility for execution takes 5 mS.

When a schedule becomes due for execution and autocalibration is enabled (/K), the zero voltage reference is checked to determine whether an autocalibration is required. This check takes 35 mS.

If autocalibration is required, then this is carried out before the schedule executes and any input channels are sampled. Autocalibration takes a further 550 mS to complete.

Therefore the schedule execution setup time can vary from 5 mS if auto-calibration is disabled (/k), to 40 mS if autocalibration is enabled but not required, to 590 mS if an autocalibration is carried out.

Following the schedule execution setup time, the sampling of the input channels in the schedule list begins. During sampling of the input channels, a number of time-consuming processes occur.

Channel Selection Time

This is the time taken for the analog input multiplexer to select an analog input channel for sampling. The channel selection time is approximately 10 mS, and is fixed.

Channel Settling Time

The channel settling time is the time allowed between selecting a particular analog channel on the input multiplexer and commencing analog to digital conversion. This period allows the output of the sensor to settle after the channel is selected and the sensor begins to supply bias current to the logger input.

The channel settling time is 10 mS by default. However the channel settling time can be defined within the range of 0 to 30000 mS using the Parameter10 command, to match the output characteristics of the sensors being monitored. The conversion rate or sampling speed of the dataTaker can be modified by altering the channel settling time (see below).
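
For example, using DeTransfer, a longer settling time of 100 mS (an illustrative value only) could be allowed for a slow or high impedance sensor by the command

P10=100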

Analog to Digital Conversion Time

During the analog to digital conversion time, the input signal on the selected channel is converted to its basic digital equivalent. The conversion time is fixed at one mains or line cycle, which is 20 mS for a 50 Hz line frequency and 16.67 mS for a 60 Hz line frequency. The analog to digital conversion process is linked to the mains or line period to maximize line hum rejection.

The local mains or line frequency is primarily defined by the setting of bit 1 of the DIP switch, which allows selection of 50 or 60 Hz. Alternatively, the line frequency can be set within the range of 48 to 1000 Hz using the Parameter11 command. The conversion rate or sampling speed of the dataTaker can be modified by altering the line frequency setting (see below).
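
For example, using DeTransfer, the line frequency setting could be matched to a 60 Hz supply, as an alternative to changing the DIP switch, by the command

P11=60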

Calculation and Linearization Time

Following analog to digital conversion of an input signal, the raw data is first converted into the fundamental electrical units of mV, mA, Ohms or Hz, and then linearized into physical units of temperature, strain or loop percentage, or scaled by a user defined polynomial or span as required.

The calculation and linearization of raw data for each channel is performed while the next channel in the schedule list is being converted (see ADC Timing Diagram). The calculation and linearization time for each channel does not influence the overall sampling speed, unless the conversion time is reduced to less than the calculation and linearization time (see below).

 


The calculation and linearization time varies considerably, from 4 mS for a voltage signal to 25 mS for a thermocouple with a user defined polynomial attached. Approximate calculation and linearization times for the different analog input channel types are listed in the table below.

Signal or Sensor Type                          Calculation and Linearization Time (mS)

Voltage                                        4.0
Current                                        5.0
Current Loop                                   6.0
Resistance - 2 & 3 wire                        6.0
Resistance - 4 wire                            7.0
Frequency                                      6.0
Period                                         5.0
LM35                                           4.5
LM335/AD590                                    5.0
LM34                                           6.0
RTD - 4 wire                                   9.0
Thermocouple - zero voltage reference          3.0
Thermocouple - junction temperature            4.5
Thermocouple - first in schedule               13.0
Thermocouple - subsequent in schedule          9.0
Thermistor                                     9.0
Digital state and counters                     2.0
Calculation                                    2 - 10
Polynomial                                     Add 2 - 8
Span                                           Add 2 - 3
Averaging                                      Add 1 - 2
Minimum and Maximum                            Add 1 - 2
Integration                                    Add 1 - 2

Calculation and linearization times for calculations, polynomials and spans depend on the complexity of the expression.
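
For example, a 4 wire RTD channel with averaging applied would take approximately 9.0 + 1 to 2 = 10 to 11 mS to calculate and linearize, since the times for channel options are added to the base time for the channel type.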

Default ADC Sampling Speed

The analog to digital converter sampling speed is determined by the combination of channel selection, settling, conversion, calculation and linearization times.

At 50 Hz line frequency, the default channel selection, settling and conversion times total 45 mS. This combined time translates to a sampling speed of

23 samples/sec for 20 channels in a Schedule

20 samples/sec for 1 channel in a Schedule

At 60 Hz line frequency, the default channel selection, settling and conversion times total 37 mS. This combined time translates to a sampling speed of

25 samples/sec for 20 channels in a Schedule

22 samples/sec for 1 channel in a Schedule

Increasing ADC Sampling Speed

The conversion rate or sampling speed of the dataTaker can be increased to approximately three times the default sampling speed.

Disable Autocalibration

Disabling the autocalibration switch (/k) reduces the schedule execution setup time, because the logger no longer checks whether a calibration is required and does not autocalibrate. Setting the calibration interval Parameter0 to a large value also prevents autocalibration from occurring; however, the checks are still performed.

This time saving is minor; however, if an autocalibration does occur during rapid data collection, there will be a 500 to 600 mS gap in the data set while the autocalibration is carried out.
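
For example, using DeTransfer, autocalibration can be disabled by the command

/k

and re-enabled by the command

/K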

Increasing Mains or Line Frequency Setting

The dataTaker does not actually take its timing from the local mains or line frequency. Instead, the logger simulates the local mains or line frequency, which it then uses to time analog to digital conversion in synchrony with the local frequency. This reduces mains or line hum induced errors.

Increasing the mains or line frequency setting of the logger decreases the simulated mains or line period. Since analog to digital conversion occurs over one mains or line period, this then decreases the analog to digital conversion time or increases the conversion rate. Note however that such actions will reduce hum rejection by the analog to digital converter, and will introduce some errors if the area is electrically noisy.

Using DeTransfer, the line frequency can be increased to 100 Hz (which decreases the ADC conversion period to 10 mS) by the command

P11=100

The total channel selection-settling-conversion time is reduced to 10 + 10 + 10 = 30 mS. This combined time translates to an increased sampling speed of a little more than 33 samples/sec (other factors not being limiting).

The mains or line frequency can also be set by assigning the frequency to System Variable 8 as follows

8SV=100

If the analog to digital conversion time is decreased to less than the linearization time, then the sampling speed will be determined by the settling and linearization times rather than by the settling and analog to digital conversion times. Under these circumstances the selection, settling and conversion of the next channel will not commence until the linearization of the previous channel has finished.

Decreasing the Channel Settling Time

As described above, the channel settling time is the period allowed between selecting an analog channel on the input multiplexer and commencing analog to digital conversion. The default is 10 mS, but the channel settling time can be defined within the range of 0 to 30000 mS using the Parameter10 command, to match the output characteristics of the sensors being monitored.

Using DeTransfer, the channel settling time can be decreased to 1 mS by the command

P10=1

The total channel selection-settling-conversion time is reduced to 10 + 1 + 20 = 31 mS. This combined time translates to a sampling speed of a little more than 32 samples/sec at the normal mains frequency (other factors not being limiting).

However, if this is combined with an increased mains or line frequency setting (see above), the total channel selection-settling-conversion time is reduced to 10 + 1 + 10 = 21 mS, which translates to a sampling speed of a little more than 47 samples/sec.
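
For example, using DeTransfer, these two settings can be combined by the commands

P11=100
P10=1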

The channel settling time can also be set by assigning the period to System Variable 7 as follows

7SV=1

Again, if the linearization time is the dominant factor, then reducing channel settling time will have no real effect.

Autoranging

When Schedules are entered into the dataTaker, all analog channels are set to an initial gain or range which is optimal for the particular signal type.

When each channel is subsequently sampled, it is autoranged to obtain a reading at the highest possible resolution. This range is then retained and becomes the initial range when the channel is next sampled.

However, autoranging reduces the apparent sampling rate. Input signals that are outside the currently selected range for a channel must be re-converted at one or more other ranges to obtain an acceptable reading. This may involve two or three successive conversions of the channel before suitable data is obtained, or before the channel is deemed to be 'out of range'.

For example, the second analog channel in the ADC Timing Diagram below is out of range on the first conversion, requiring a second conversion after a gain change. In this case the analog to digital converter takes 70 mS to read the channel, rather than the 40 mS required if the first reading had been in range.

Therefore the sampling rate of the analog to digital converter will be slower when signals are continuously varying between ranges, than if signals stay in the same range.

Digital Channels

Digital input and counter channels in schedule lists are skipped during the channel selection phase. These channels are directly read and linearized following calculation and linearization of the previous analog channel, while the next analog channel is converting (see fourth and fifth channels in the ADC Timing Diagram below).

Digital and counter channels are read in less than 2 mS.

Data Destination

Following execution of a Schedule, resulting data is then logged into the data storage memory if data logging is enabled (LOGON), and returned to the host computer if data return is enabled (/R).

The time taken to log the data into the data storage memory is 2 mS plus 0.5 mS per data point.
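
For example, logging a schedule which returns 10 data points takes approximately 2 + (10 x 0.5) = 7 mS.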

The time taken to return the data to the host computer depends on the number of characters to be transmitted for each data item, and the setting for the RS232 COMMS baud rate.

However, transmitting data to the host computer will generally slow the sampling rate to less than the default sampling rate. The increased sampling rates discussed above will only be achieved if return of data to the host computer in real time is disabled (/r) and all data is logged during the data collection session.
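
For example, using DeTransfer, return of data in real time can be disabled and data logging enabled by the commands

/r
LOGON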

Maximum Sampling Rate

The maximum sampling rate possible with the dataTaker 50, 500 and 600 series data loggers is a little over 70 samples per second.

This is achieved by setting the mains or line frequency to the upper limit of P11=1000, setting channel settling time to the lower limit of P10=1, logging data in real time, disabling return of data to the host computer, and reading only simple voltage signals which have minimal calculation requirements.

Using DeTransfer, this maximum sampling rate can be achieved with the commands

/r
/k
P0=100
P10=1
P11=1000
LOGON

Using DeLogger, these commands can be entered in the Program Builder Settings tab. In the Settings tab, right click Special Commands and select Properties… Enter the commands into the Pre or Post Schedule Initialization Commands.

 


ADC Timing Diagram
