Input Impedance
The input impedance of a network is a measure of the opposition to current flow (impedance), both static (resistance) and dynamic (reactance), that the load network presents to the external electrical source. If a current input is needed, a low input impedance is ideal, as in a transimpedance amplifier (TIA), which converts current to voltage. If a high voltage is wanted across the input, the input impedance is increased, which in turn increases the voltage developed across it.

Bandwidth
Bandwidth is defined as a range within a band of frequencies or wavelengths. Bandwidth can be used in two different ways:
1. Baseband – uses the whole band to transmit one signal of information; this allows you to send or receive data, but not at the same time.
2. Broadband – transmits many signals through the whole band, allowing you to send and receive at the same time.
Bandwidth is also defined as the amount of data that can be transmitted in a fixed amount of time. For digital devices, bandwidth is usually expressed in bits per second (bps) or bytes per second. For analog devices, bandwidth is expressed in cycles per second, or hertz (Hz).
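The data-rate sense of bandwidth can be illustrated with a short Python sketch; the file size and link speed below are example figures I have chosen, not values taken from any equipment in this report:

```python
# Sketch: bandwidth as "data per unit time". Example numbers only.

def transfer_time_seconds(size_bytes: int, bandwidth_bps: float) -> float:
    """Time to move size_bytes over a link rated in bits per second."""
    return (size_bytes * 8) / bandwidth_bps

# A 1 MB file over an 8 Mbit/s link: 8,000,000 bits / 8,000,000 bps = 1 s
t = transfer_time_seconds(1_000_000, 8_000_000)
print(t)  # 1.0
```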
Sensitivity
Sensitivity is defined as the ratio of the output signal to the corresponding input signal for a specified set of operating conditions. Similarly, gain is the ratio of the amplifier's output signal voltage to its input signal voltage. If the amplification ratio is less than unity, it is called attenuation. The sensitivity of a measuring device or instrument depends on its principle of operation and design. Many devices and instruments are designed to have a linear relationship between input and output signals and thus provide a constant sensitivity over the operating range.
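The gain and attenuation ratios described above are commonly expressed in decibels; here is a minimal Python sketch with illustrative voltages of my own choosing:

```python
import math

# Sketch: gain as output/input ratio; a ratio below unity is an attenuation.
# The voltage values are illustrative, not measured.

def gain_db(v_out: float, v_in: float) -> float:
    """Voltage gain in decibels: 20*log10(Vout/Vin)."""
    return 20 * math.log10(v_out / v_in)

print(gain_db(10.0, 1.0))  # 20.0 -> amplification
print(gain_db(0.5, 1.0))   # about -6.02 -> attenuation (ratio < 1)
```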
Accuracy
Accuracy can be defined as the amount of uncertainty in a measurement with respect to an absolute standard. Accuracy specifications usually contain the effect of errors due to gain and offset parameters. Offset errors can be given in a unit of measurement, such as volts or ohms, and are independent of the magnitude of the input signal being measured; an example might be given as a ±1.0 millivolt (mV) offset error, regardless of the range or gain settings. In contrast, gain errors do depend on the magnitude of the input signal and are expressed as a percentage of the reading, such as ±0.1%.

Resolution
Resolution can appear in a specification labelled as maximum resolution; this is the smallest value the meter can resolve on its lowest range. In many specifications for a DMM (digital multimeter), range and resolution are related.
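As a sketch of how range, resolution and accuracy combine, the following assumes a hypothetical 6000-count meter with a ±(0.1% of reading + 2 counts) specification; the figures are chosen to echo the examples above, not quoted from a datasheet:

```python
# Sketch of how range, counts, and an accuracy spec combine for a DMM.
# The 6000-count figure and ±(0.1% + 2 counts) spec are example values.

def resolution(range_volts: float, counts: int) -> float:
    """Smallest step the meter can display on a given range."""
    return range_volts / counts

def worst_case_error(reading: float, gain_pct: float, offset_counts: int,
                     range_volts: float, counts: int) -> float:
    """Uncertainty = gain error (% of reading) + offset error (in counts)."""
    return reading * gain_pct / 100 + offset_counts * resolution(range_volts, counts)

res = resolution(6.0, 6000)                     # 0.001 V step on the 6 V range
err = worst_case_error(5.0, 0.1, 2, 6.0, 6000)  # ±0.007 V on a 5 V reading
print(res, err)
```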
Some meters offer an auto-range function, which selects the best range for the measurement, giving the relevant reading at the best resolution. Resolution refers to how fine a measurement a meter can make. By knowing the resolution of a meter, you can determine whether it is possible to see a small change in the measured signal.

Task 4
Test specifications ensure the validity and consistency of measurements when testing electronic components, because a test specification is a list of rules that must be adhered to so that the component is what it says it is, without any defects. It is very important that components function correctly; when a consumer buys a product they expect it to do what it is supposed to do, but this is not always the case.
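As an illustration of how a test specification makes every unit get checked in the same way, it can be treated as a table of parameters and allowed limits; the parameter names and limits below are invented examples, not from a real specification:

```python
# Sketch: a test specification as a table of parameters and limits.
# All limits here are invented example values.

SPEC = {
    "voltage_v":    (4.75, 5.25),   # e.g. nominal 5 V with a ±5 % tolerance
    "current_ma":   (0.0, 20.0),
    "frequency_hz": (49.0, 51.0),
}

def check(measurements: dict) -> list:
    """Return the parameters that fall outside the specified limits."""
    failures = []
    for name, (lo, hi) in SPEC.items():
        if not (lo <= measurements[name] <= hi):
            failures.append(name)
    return failures

unit = {"voltage_v": 5.1, "current_ma": 25.0, "frequency_hz": 50.0}
print(check(unit))   # ['current_ma'] -> this unit is deemed defective
```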
It is the engineer's job to ensure that each and every component is tested and deemed safe to use. The implications of a poor test specification for any component are enormous in terms of cost, safety and corporate reputation.

Below I have devised a list of test parameters; this will include:
· The type of test and measurement equipment used
· The way that the equipment is connected
· Voltage
· Current
· Frequency
· Waveform type
· The component tolerance
· Temperature

An electronic test specification is a set of rules which must be applied when testing an electronic component or system.
This means that whenever the component or system is tested, it is tested in the same way, so we should expect the same or very similar results each time a test is carried out. When tested in this way, components or systems whose behaviour does not fall within the expected parameters are deemed to be defective. A test specification is a detailed summary of what scenarios will be tested, how they will be tested and how often they will be tested for a given feature. Test specifications can be used to verify compliance with internal, national or international standards.
These standards ensure the validity and consistency of the measurement. In your report, explain the importance of test specifications as an aid to ensuring the validity and consistency of measurements taken during the testing of electronic equipment.

Task 3
Reference standards with known values for selected points covering the range of interest are measured with the instrument in question. Then a functional relationship is established between the values of the standards and the corresponding measurements. Measuring instruments have a degree of accuracy that degrades over time.
This is typically caused by normal use; however, a loss of accuracy can also be caused by electrical or mechanical shock. In addition, a hazardous manufacturing environment can affect the accuracy of the instrument. Calibration improves the accuracy of the measuring device, and accurate measurement improves product quality.
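The relationship-building step described above can be sketched as a simple two-point calibration; the calibrator values and meter readings below are invented for illustration:

```python
# Sketch: establish the functional relationship between known calibrator
# values and the instrument's raw readings (two-point linear calibration).
# The standard values and readings are illustrative numbers.

def fit_two_point(raw_lo, true_lo, raw_hi, true_hi):
    """Return (slope, offset) so that true = slope * raw + offset."""
    slope = (true_hi - true_lo) / (raw_hi - raw_lo)
    offset = true_lo - slope * raw_lo
    return slope, offset

# Meter reads 0.10 V at a 0.00 V standard and 9.90 V at a 10.00 V standard.
slope, offset = fit_two_point(0.10, 0.00, 9.90, 10.00)

def correct(raw: float) -> float:
    """Apply the fitted relationship to correct a raw reading."""
    return slope * raw + offset

print(round(correct(5.00), 3))   # corrected mid-scale reading
```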
Calibrating an instrument may vary from product to product, but the process itself generally involves using the piece of equipment to test samples of one or more known values, also called calibrators. The results are then used to establish a relationship between the measurement technique used by the instrument and the known values.

Calibration tolerances should be determined from a combination of factors. These factors are:
· Manufacturer's specified tolerance
· Consistency with similar instruments at your facility
· Capability of available test equipment
· Requirements of the process

Tolerance is the permissible deviation from a specified value; it may be expressed in measurement units, percent of span, or percent of reading. Accuracy is the degree of closeness of measurements of a quantity to its actual true value. Precision is the degree to which repeated measurements under unchanged conditions show the same result. There are two main characteristics needed when calibrating an instrument: precision and accuracy. Instrument calibration is intended to eliminate or reduce bias in an instrument's readings over a range of values. Calibration refers to the act of evaluating and adjusting the precision and accuracy of measurement equipment. The company also requires a report in which you describe the principles and need for the calibration of an item of electronic test equipment.
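The difference between a tolerance taken as percent of reading and one taken as percent of span can be shown numerically; the span and reading below are example figures:

```python
# Sketch: the same ±1 % tolerance gives different limits depending on
# whether it is percent of reading or percent of span. Example figures.

def tol_percent_of_reading(reading: float, pct: float) -> float:
    """Tolerance band in measurement units, scaled by the reading."""
    return reading * pct / 100

def tol_percent_of_span(span: float, pct: float) -> float:
    """Tolerance band in measurement units, fixed across the span."""
    return span * pct / 100

reading, span = 20.0, 100.0   # e.g. 20 V measured on a 0-100 V span
print(tol_percent_of_reading(reading, 1.0))  # 0.2 -> tighter at low readings
print(tol_percent_of_span(span, 1.0))        # 1.0 -> constant over the span
```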
Task 2

Function
· Generates different electrical waveforms over a wide range of frequencies
Features
· 0MHz to 6000MHz frequency range
· Locking to external frequency standard
· Fast stepping sweep with dwell times down to 10ms
· User compensation tables for specific test set-ups
· Storage for 12 generator set-ups and 16 sweep lists
· Full remote control through RS232, USB, LAN and GPIB
Characteristics
· Level Range – −110dBm to +7dBm amplitude, 0.1dB steps
· Frequency Range – 10MHz to 6000MHz
· Accuracy – Better than ±2dBm; ±1 ppm over 15°C to 30°C, ±2 ppm over 5°C to 40°C
· Setting Resolution – 0.1dB (or 0.01µV to 1mV) by direct keyboard entry, or in user-set increments of 0.1dB to 100dB (or 0.01µV to 100mV)

Oscilloscope
An oscilloscope's primary function is to provide a graph of a signal's voltage over time. This is useful for measuring such things as clock frequencies, duty cycles of pulse-width-modulated signals, propagation delay, or signal rise and fall times. It can also alert you to the presence of glitches in your logic or bouncing switches.

Logic Probe
A logic probe is a hand-held test probe used for analysing and troubleshooting the logical states of a digital circuit.
Function
· Test AC/DC voltage
· Test continuity – Current
Features
· Phase rotation test system
· Fully compliant with GS38
· Fully operational voltage indication even when batteries are discharged
· Voltage display – 12 to 690V AC/DC
· Optical and acoustic indication
· Auto power on – less than 12V AC/DC
Characteristics
· LED resolution – 12, 24, 50, 120, 230, 400, 690V
· LCD resolution – 1V ±3%, 8 digits
· Voltage range – 100 to 690V AC
· Measuring range – 0 to 400kΩ
· Frequency range – 50 to 400Hz (Phase rotation test frequency, 45 to 65Hz)

Reverse power protection prevents signals travelling in the wrong direction from damaging the source. An on-board reference is a source of information, usually the clock, which supplies timing information.
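The oscilloscope measurements described above (frequency and duty cycle of a pulse train) can be sketched numerically; the sampled waveform, sample rate and logic threshold below are invented for illustration:

```python
# Sketch: estimating frequency and duty cycle of a pulse train from equally
# spaced voltage samples, as an oscilloscope's automatic measurements do.
# Sample data, sample rate, and threshold are invented example values.

SAMPLE_RATE = 1000.0   # samples per second
# One period every 10 samples, high for 3 of them -> 100 Hz, 30 % duty.
samples = [5.0, 5.0, 5.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0] * 4

def rising_edges(data, threshold=2.5):
    """Indices where the signal crosses the threshold upwards."""
    return [i for i in range(1, len(data))
            if data[i - 1] < threshold <= data[i]]

edges = rising_edges(samples)
period_samples = edges[1] - edges[0]
frequency_hz = SAMPLE_RATE / period_samples
duty_cycle = sum(v > 2.5 for v in samples) / len(samples)
print(frequency_hz, duty_cycle)   # 100.0 Hz, 0.3
```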
Operation features to consider include an on-board reference, an on-board oscillator, reverse power protection, and battery power. The frequency range specifies the range of output frequencies the generator can produce. The frequency resolution is the smallest frequency increment the generator can produce. The generator's internal clock determines the frequency accuracy, which is a measure of how accurately the source frequency can be set. The maximum input channels figure refers to the maximum number of analogue input channels, of both general and specific types.
· Description – Function Generator, 4MHz, Digital
· DC Offset Voltage – ±10V
· Square Wave Symmetry – ≤2% (1Hz to 100kHz)
· Impedance – 50Ω, ±2%
· Attenuator – −20dB, ±2%
· Amplitude – 0.1 to 10Vp-p (50Ω load)
· Sine Wave Distortion – <2% (1Hz to 100kHz)
· Max Frequency Range (MHz) – 4
· Min Frequency Range (Hz) – 0.5
Characteristics
· Operating Modes – Normal, Sweep, VCF

Signal generator
A signal generator is an electronic device that generates repeating or non-repeating electronic signals in either the analogue or the digital domain. It is generally used in designing, testing, troubleshooting, and repairing electronic or electroacoustic devices, though it often has artistic uses as well.
Functions
· Setting amplitude
· Frequency range
· Frequency fine
· Various waveforms, the most common of which are triangle, sine, and rectangular pulse
Features to consider when looking at signal generators are:
· Maximum Input Channels
· Frequency Range
· Frequency resolution
· Frequency Accuracy
· Switching Speed
In addition to this, there are many characteristics of this piece of equipment:
· Input impedance
· Output impedance
· Resolution
· Accuracy
· Distortion
· Bandwidth
· Input signal range
· Output level
· Sample rate
· Trigger sources
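The common waveforms listed above (sine, triangle, rectangular pulse) can be sketched as lists of samples; this is only an illustration of the waveform shapes, not how any particular generator synthesises them:

```python
import math

# Sketch: one cycle of the common function-generator waveforms as sample
# lists. Amplitude and sample count are arbitrary illustration values.

N = 8       # samples per cycle
AMP = 1.0

sine     = [AMP * math.sin(2 * math.pi * i / N) for i in range(N)]
square   = [AMP if i < N // 2 else -AMP for i in range(N)]
triangle = [AMP * (1 - 4 * abs(i / N - 0.5)) for i in range(N)]  # -1 .. +1 .. -1

print(square)   # first half +1.0, second half -1.0
```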
With the Fluke 179 Digital Multimeter there are many benefits and features. These consist of:
· 6000 count LCD with 33 segment bargraph
· True rms (AC) measurement
· EN61010-1 safety standard compliance to Cat IV 600V & Cat III 1000V
· Resistance, continuity and diode measurement
· Capacitance measurement up to 10000µF
· Frequency measurement up to 100kHz
· Min/Max/Average recording
· Display
· Separate battery and fuses access without breaking the calibration seal
· Input and Output connectors
· Attenuators
· Portable
· External Bus Interfaces
· In-built calibration facilities
· Manual and Automatic Range selection
The function of a Fluke 179 Digital Multimeter is to accurately measure values like:
· Alternating Current (AC) and Direct Current (DC)
· Voltage and Current
· Resistance
· Waveform
· Distortion Measurement
· Accurate measurement of waveform parameters
· Impedance
· And Logic Level
Amprobe 15XP-A Handheld Digital Multimeter
A multimeter is a device consisting of one or more meters, such as an ammeter and voltmeter, used to measure two or more electrical quantities in an electric circuit, such as voltage, resistance, and current.
Assignment 1