
Input Impedance

The input impedance of a network is a measure of the opposition to current flow (impedance), both static (resistance) and dynamic (reactance), that the network presents to the external electrical source driving it. When the input signal is a current, a low input impedance is ideal, as in a transimpedance amplifier (TIA), which converts an input current into an output voltage. When a large voltage is wanted across the input, a high input impedance is preferred, because a higher impedance develops a larger share of the source voltage across it.
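As a rough illustration of this loading effect (a minimal sketch; the source voltage and impedance figures below are invented for the example, not taken from any specification), the source impedance and the input impedance form a voltage divider:

```python
# Loading effect of input impedance: the source's internal impedance and the
# input impedance of the connected network form a voltage divider.
def input_voltage(v_source, z_source, z_input):
    """Voltage actually developed across the input (resistive approximation)."""
    return v_source * z_input / (z_source + z_input)

v_src = 1.0     # 1 V source (hypothetical value)
z_src = 600.0   # 600 ohm source impedance (hypothetical value)

print(input_voltage(v_src, z_src, 1e3))   # low input impedance  -> ~0.63 V
print(input_voltage(v_src, z_src, 1e6))   # high input impedance -> ~0.999 V
```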

Bandwidth

Bandwidth is defined as a range within a band of frequencies or wavelengths. It is also defined as the amount of data that can be transmitted in a fixed amount of time. For digital devices, bandwidth is usually expressed in bits per second (bps) or bytes per second; for analogue devices, it is expressed in cycles per second, or hertz (Hz).

Bandwidth can be used in two different ways:

1.   Baseband – uses the whole band to transmit one signal of information; this allows you to send or receive data, but not at the same time.

2.   Broadband – transmits many signals through the whole band; this allows you to send and receive at the same time.
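To connect the bits-per-second figure to the amount of data moved in a fixed time (a minimal sketch; the 100 Mbps link speed is just an example value):

```python
# Amount of data a link can move in a fixed time, given its bandwidth in bits per second.
def bytes_transferred(bandwidth_bps, seconds):
    return bandwidth_bps * seconds / 8   # 8 bits per byte

# A hypothetical 100 Mbps link running for 10 seconds moves 125,000,000 bytes (125 MB).
print(bytes_transferred(100e6, 10))
```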

 

Sensitivity

Sensitivity is defined as the ratio of the output signal to the corresponding input signal for a specified set of operating conditions. Similarly, gain is the ratio of an amplifier's output signal voltage to its input signal voltage; if this ratio is less than unity, it is called attenuation. The sensitivity of a measuring device or instrument depends on its principle of operation and its design. Many devices or instruments are designed to have a linear relationship between input and output signals and thus provide a constant sensitivity over the operating range.
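As a small sketch of the gain/attenuation distinction (example voltages only), expressing the ratio in decibels makes attenuation show up as a negative value:

```python
import math

# Voltage gain in decibels; a ratio below unity gives a negative result, i.e. attenuation.
def gain_db(v_out, v_in):
    return 20 * math.log10(v_out / v_in)

print(gain_db(2.0, 1.0))   # ~ +6.02 dB : gain
print(gain_db(0.5, 1.0))   # ~ -6.02 dB : attenuation
```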

 

Accuracy

Accuracy can be defined as the amount of uncertainty in a measurement with respect to an absolute standard. Accuracy specifications usually contain the effect of errors due to gain and offset parameters. Offset errors can be given in a unit of measurement, such as volts or ohms, and are independent of the magnitude of the input signal being measured; an example might be a ±1.0 millivolt (mV) offset error, regardless of the range or gain settings. In contrast, gain errors do depend on the magnitude of the input signal and are expressed as a percentage of the reading, such as ±0.1%.
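A minimal sketch of how the two error types combine into a worst-case error band for one reading (the ±1.0 mV and ±0.1 % figures are the illustrative values used above):

```python
# Worst-case error band for a reading: a fixed offset error plus a gain error
# expressed as a percentage of the reading.
def error_band(reading, offset_error, gain_error_percent):
    return offset_error + abs(reading) * gain_error_percent / 100

# A 5 V reading with a ±1.0 mV offset error and a ±0.1 % gain error -> ±6.0 mV overall.
print(error_band(5.0, 0.001, 0.1))
```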

 

Resolution

Resolution refers to how fine a measurement a meter can make. By knowing the resolution of a meter, you can determine whether it is possible to see a small change in the measured signal.

Resolution can appear in a specification as the maximum resolution, which is the smallest value the meter can resolve on its lowest range. In many specifications for a DMM (Digital Multimeter), range and resolution are related. Some meters offer an auto-range function that selects the best range for the measurement, giving the relevant reading at the best resolution.
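For example (a minimal sketch; the count figure and range are hypothetical), the smallest step a display-count style DMM can show on a given range is:

```python
# Smallest step a digital multimeter can display on a given range.
def resolution(counts, full_scale):
    return full_scale / counts

# A hypothetical 6000-count meter on its 6 V range resolves 1 mV steps.
print(resolution(6000, 6.0))   # 0.001 V
```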

Task 4

 

Test specifications will ensure the validity and consistency of measurements when testing electronic components, because a test specification is a list of rules that must be adhered to so that the component can be confirmed to be what it claims to be, without any defects.

It is very important that components function correctly. When a consumer buys a product, they expect it to do what it is supposed to do, but this is not always the case. It is the engineer’s job to ensure that each and every component is tested and deemed safe to use.

The implications of a poor test
specification for any component are enormous in terms of Cost, Safety and
Corporate Reputation.

Below I have devised a list of test parameters, which includes:

·         The type of test and measurement equipment used

·         The way that the equipment is connected

·         Voltage

·         Current

·         Frequency

·         Waveform type

·         The component tolerance

·         Temperature

An electronic test specification is a set of rules which must be applied when testing an electronic component or system. This means that whenever the component or system is tested, it is tested in the same way. Therefore we should expect the same or very similar results each time a test is carried out. When tested in this way, components or systems whose behavior does not fall within the expected parameters are deemed to be defective.
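As a rough sketch of that idea (the 1000 Ω value and ±5 % tolerance below are hypothetical, not from any real specification), a single test-specification entry can be expressed as an expected value plus a permitted tolerance, and a measured component either passes or is deemed defective:

```python
# One entry of a hypothetical test specification: expected value and permitted tolerance.
def within_spec(measured, expected, tolerance_percent):
    """Return True if the measured value lies within the specified tolerance."""
    limit = abs(expected) * tolerance_percent / 100
    return abs(measured - expected) <= limit

# A nominally 1000-ohm resistor with a ±5 % tolerance: 1040 ohms passes, 1100 ohms fails.
print(within_spec(1040, 1000, 5))   # True  -> within spec
print(within_spec(1100, 1000, 5))   # False -> deemed defective
```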

A Test Specification is a detailed summary of what scenarios will be tested, how they will be tested and how often they will be tested for a given feature. Test Specifications can be used to verify compliance with internal, national or international standards. These standards ensure the validity and consistency of the measurement.

In
your report explain the importance of test specifications as an aid to ensuring
the validity and consistency of measurements taken during testing of electronic
equipment.

Task 3

 

 

Reference standards with known values, chosen at selected points covering the range of interest, are measured with the instrument in question. A functional relationship is then established between the values of the standards and the corresponding measurements.
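A minimal sketch of that idea, assuming a simple two-point (straight-line) calibration; the standard values and raw readings below are invented for the example:

```python
# Two-point linear calibration: fit reading = gain * true_value + offset from two
# reference standards, then invert the fit to correct later readings.
def fit_calibration(std_lo, read_lo, std_hi, read_hi):
    gain = (read_hi - read_lo) / (std_hi - std_lo)
    offset = read_lo - gain * std_lo
    return gain, offset

def corrected(reading, gain, offset):
    return (reading - offset) / gain

gain, offset = fit_calibration(0.0, 0.02, 10.0, 10.12)   # hypothetical 0 V and 10 V standards
print(corrected(5.07, gain, offset))                     # 5.0 after correction
```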

Measuring instruments have a degree of accuracy that degrades over time. This is typically caused by normal use; however, a loss of accuracy can also be caused by electrical or mechanical shock. In addition, a hazardous manufacturing environment can also affect the accuracy of the instrument. Calibration improves the accuracy of the measuring device, and accurate measurement improves product quality.

Calibrating an instrument varies from product to product, but the process itself generally involves using the piece of equipment to measure samples of one or more known values, also known as calibrators. The results are then used to establish a relationship between the measurement technique used by the instrument and the known values.


Calibration tolerances should be determined from a combination of factors. These factors are:

·         Manufacturer’s specified tolerance

·         Consistency with similar instruments at your facility

·         Capability of available test equipment

·         Requirements of the process

There are two main characteristics that are needed when calibrating an instrument. These are precision and accuracy.

Precision is the degree to which repeated measurements under unchanged conditions show the same result.

Accuracy is the degree of closeness of measurements of a quantity to its actual true value.

Tolerance is the permissible deviation from a specified value; it may be expressed in measurement units, percent of span, or percent of reading.
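To make the distinction concrete (a small sketch with invented readings), precision relates to the spread of repeated readings and accuracy to how close their mean is to the true value:

```python
import statistics

readings = [9.98, 10.02, 9.99, 10.01, 10.00]   # repeated measurements (hypothetical)
true_value = 10.00

precision_spread = statistics.stdev(readings)              # small spread -> good precision
accuracy_error = statistics.mean(readings) - true_value    # small offset -> good accuracy

print(precision_spread, accuracy_error)
```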

Instrument calibration is intended to eliminate or reduce bias in an instrument’s readings over a range of values.

Calibration refers to the act of
evaluating and adjusting the precision and accuracy of measurement equipment.

The company also requires a report in which you describe the principles and the need for the calibration of an item of electronic test equipment.

Task 2

 

Function

·         Generates different electrical waveforms over a wide range of frequencies

 

Features

·         10MHz to 6000MHz frequency range

·         Locking to external frequency standard

·         Fast stepping sweep with dwell times down to 10ms

·         User compensation tables for specific test set-ups

·         Storage for 12 generator set-ups and 16 sweep lists

·         Full remote control through RS232, USB, LAN and GPIB

 

Characteristics

·         Frequency Range – 10MHz to 6000MHz

·         ±1 ppm over 15°C to 30°C; ±2 ppm over 5°C to 40°C

·         Level Range – -110dBm to +7dBm amplitude, 0.1dB steps

·         Setting Resolution – 0.1dB (or 0.01µV to 1mV) by direct keyboard entry, or in user-set increments of 0.1dB to 100dB (or 0.01µV to 100mV)

·         Accuracy – Better than ±2dBm

 

Oscilloscope

An oscilloscope’s primary function is to provide a graph of a signal’s voltage over time. This is useful for measuring such things as clock frequencies, duty cycles of pulse-width-modulated signals, propagation delay, or signal rise and fall times. It can also alert you to the presence of glitches in your logic or bouncing switches.
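For instance (a small sketch with made-up edge times, not from any captured trace), the period, frequency and duty cycle of a pulse train follow from the timestamps of its edges:

```python
# Times (in seconds) of a rising edge, the next falling edge, and the next rising
# edge of a hypothetical PWM signal captured on an oscilloscope.
t_rise1, t_fall, t_rise2 = 0.0, 3e-6, 10e-6

period = t_rise2 - t_rise1                 # 10 microseconds
frequency = 1 / period                     # 100 kHz
duty_cycle = (t_fall - t_rise1) / period   # 0.3 -> 30 % duty cycle

print(frequency, duty_cycle)
```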

 

Logic Probe

A logic probe is a hand-held test probe used for analysing and troubleshooting the logical states of a digital circuit.

Function

·         Test AC/DC voltage

·         Test continuity – Current

 

Features

·         Phase rotation test system

·         Fully compliant with GS38

·         Fully operational voltage indication even when batteries are discharged

·         Voltage display – 12 to 690V AC/DC

·         Optical and acoustic indication

·         Auto power on – less than 12V AC/DC

 

Characteristics

·         LED resolution – 12, 24, 50, 120, 230, 400, 690V

·         LCD resolution – 1V ±3%, 8 digits

·         Voltage range – 100 to 690V AC

·         Measuring range – 0 to 400kΩ

·         Frequency range – 50 to 400Hz (Phase rotation test frequency, 45 to 65Hz)
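As a trivial illustration of what a logic probe reports (the 0.8 V and 2.0 V thresholds are assumed TTL-style values, not figures from the device above):

```python
# Classify a measured node voltage into the states a simple logic probe indicates.
def logic_state(voltage, v_low=0.8, v_high=2.0):
    if voltage <= v_low:
        return "LOW"
    if voltage >= v_high:
        return "HIGH"
    return "FLOATING / INVALID"

print(logic_state(0.2))   # LOW
print(logic_state(3.3))   # HIGH
print(logic_state(1.4))   # FLOATING / INVALID
```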

 

 

 

 

Reverse power protection prevents signals traveling in the wrong
direction from damaging the source.

An on-board reference is a built-in source, usually a clock, that supplies timing information.

Operation features to consider include on-board reference,
on-board oscillator, reverse power protection, and battery power.

The frequency range specifies the range of output frequencies the generator can produce. The frequency resolution is the smallest frequency increment the generator can produce. The generator’s internal clock determines the frequency accuracy, which is a measure of how accurately the source frequency can be set.
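As a minimal sketch of how range and resolution constrain a setting (the 10 MHz to 6000 MHz range and 1 Hz step below are example figures, not taken from a datasheet):

```python
# Quantise a requested output frequency to the generator's resolution and check
# that it falls inside the specified frequency range.
F_MIN, F_MAX = 10e6, 6000e6   # frequency range (example figures)
F_STEP = 1.0                  # frequency resolution: 1 Hz (example figure)

def set_frequency(f_requested):
    if not (F_MIN <= f_requested <= F_MAX):
        raise ValueError("outside the generator's frequency range")
    return round(f_requested / F_STEP) * F_STEP

print(set_frequency(145.2e6))   # accepted and rounded to the nearest 1 Hz step
```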

The maximum input channels specification refers to the maximum number of analogue input channels of all types, general and specific.

 

Characteristics

·         Description – Function Generator, 4 MHz, Digital

·         Min Frequency Range – 0.5 Hz

·         Max Frequency Range – 4 MHz

·         Amplitude – 0.1 to 10 Vp-p (50 Ω load)

·         DC Offset Voltage – ±10 V

·         Attenuator – ±2%, –20 dB

·         Impedance – ±2% (50 Ω)

·         Square Wave Symmetry – ≤2% (1 Hz to 100 kHz)

·         Sine Wave Distortion – <2% (1 Hz to 100 kHz)

Functions

·         Setting amplitude

·         Frequency range

·         Frequency fine adjustment

·         Various waveforms, the most common of which are triangle, sine, and rectangular pulse

·         Operating Modes – Normal, Sweep, VCF

Signal generator

A signal generator is an electronic device that generates repeating or non-repeating electronic signals in either the analogue or the digital domain. It is generally used in designing, testing, troubleshooting, and repairing electronic or electroacoustic devices, though it often has artistic uses as well.

Features to consider when looking at signal generators are:

·         Maximum Input Channels

·         Frequency Range

·         Frequency Resolution

·         Frequency Accuracy

·         Switching Speed

Amprobe 15XP-A Handheld Digital Multimeter

A multimeter is a device consisting of one or more meters, such as an ammeter and voltmeter, used to measure two or more electrical quantities in an electric circuit, such as voltage, resistance, and current.

The function of a Fluke 179 Digital Multimeter is to accurately measure values like:

·         Alternating Current (AC) and Direct Current (DC)

·         Voltage and Current

·         Resistance

·         Waveform

·         Distortion Measurement

·         Accurate measurement of waveform parameters

·         Impedance

·         And Logic Level

When using the Fluke 179 Digital Multimeter, there are many benefits and features. These consist of:

·         6000 count LCD with 33 segment bargraph

·         True rms (AC) measurement

·         EN61010-1 safety standard compliance to Cat IV 600V & Cat III 1000V

·         Resistance, continuity and diode measurement

·         Capacitance measurement up to 10000µF

·         Frequency measurement up to 100kHz

·         Min/Max/Average recording

·         Display

·         Separate battery and fuses access without breaking the calibration seal

·         Input and Output connectors

·         Attenuators

·         Portable

·         External Bus Interfaces

·         In-built calibration facilities

·         Manual and Automatic Range selection

In addition to this, there are many characteristics of this piece of equipment. The multimeter’s characteristics are:

·         Input impedance

·         Output impedance

·         Resolution

·         Accuracy

·         Distortion

·         Bandwidth

·         Input signal range

·         Output level

·         Sample rate

·         And trigger sources

Assignment 1

Author: