
Logic Analyzer

A logic analyzer is an instrument for analyzing the logical relationships in a digital system. It is a kind of bus analyzer belonging to the data-domain class of test instruments: built around the concept of a bus (multiple lines), it observes and tests the data flowing on many data lines at the same time, which makes it very effective for testing and analyzing complex digital systems. A logic analyzer uses a clock to capture digital signals from the device under test and display them; its most important role is to determine timing. Unlike an oscilloscope, a logic analyzer does not resolve many voltage levels; it normally displays only two (logic 1 and logic 0). After a reference voltage is set, a comparator judges the signal under test: anything above the reference is High, anything below it is Low, and the sequence of High and Low values forms a digital waveform.


Instrument definition

For example, suppose a logic analyzer samples the signal under test at 200 Hz with the reference voltage set to 1.5 V. During measurement the analyzer takes one sample every 5 ms; a sample above 1.5 V is High (logic 1) and one below 1.5 V is Low (logic 0). Connecting these 1s and 0s produces a simple waveform in which the engineer can look for abnormal behavior (bugs). In short, a logic analyzer does not display voltage values when it measures a signal, only the distinction between High and Low; to measure voltages you must use an oscilloscope. Besides the display of voltage values, another difference between a logic analyzer and an oscilloscope is the number of channels. A typical oscilloscope has only 2 or 4 channels, while logic analyzers range from 16, 32, or 64 channels up to several hundred, giving the logic analyzer the advantage of testing many channels simultaneously.
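The threshold comparison described above can be sketched in a few lines. This is a minimal illustration, not any instrument's actual firmware; the sample rate and threshold match the example, and the voltage trace is invented.

```python
SAMPLE_RATE_HZ = 200   # one sample every 5 ms, as in the example above
THRESHOLD_V = 1.5      # comparator reference voltage

def digitize(voltages, threshold=THRESHOLD_V):
    """Map analog samples to logic levels: above the threshold -> 1, else 0."""
    return [1 if v > threshold else 0 for v in voltages]

# Hypothetical sampled voltages from the signal under test:
samples = [0.1, 0.2, 3.2, 3.3, 3.1, 0.3, 0.2, 3.4]
print(digitize(samples))   # [0, 0, 1, 1, 1, 0, 0, 1]
```

Connecting the resulting 1s and 0s over time is exactly the "digital waveform" the instrument displays: all voltage information other than the threshold comparison is discarded.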

According to differences in hardware design, the logic analyzers on the market can be roughly divided into stand-alone (single-unit) logic analyzers and PC-based, card-type virtual logic analyzers that must be combined with a computer. A stand-alone logic analyzer integrates the test software, the operating controls, and the display into one instrument; a card-type virtual logic analyzer must be used with a computer, and its display is separate from the acquisition hardware. In terms of overall specifications, stand-alone logic analyzers have reached very high standards: sampling rates up to 8 GHz, channel counts expandable beyond 300, and relatively deep storage. Stand-alone logic analyzers have traditionally been expensive, ranging from tens of thousands to hundreds of thousands of RMB, which few general users can afford. Card-type virtual logic analyzers based on a computer interface provide corresponding performance at lower cost, but they require a computer. Advances in technology have also gradually merged the functions of the oscilloscope and the logic analyzer into a hybrid instrument, the mixed-signal oscilloscope (MSO), also known as a mixed-signal test instrument.


Protocol analysis

Like an oscilloscope, a logic analyzer collects the specified signals and presents them to the developer graphically; the developer then checks these signals against the relevant protocol to decide whether an error has occurred. Although the graphical display is very convenient, manually analyzing a long series of signals is not only tedious but also extremely error-prone.

In an era of rapid technological development, everything pursues efficiency, and automation and intelligence have become the direction of protocol analysis. Guided by this idea, protocol analysis functions have appeared in and been developed for a variety of test instruments. Most developers can easily find errors, debug hardware, and speed up development through the protocol analysis functions of test tools such as logic analyzers, helping projects finish quickly and with high quality.

On this question, the developers at Guangzhou Zhiyuan Electronics Co., Ltd. proposed a new answer: protocol analysis is a unified way of making full use of a logic analyzer's resources in a given application field. Whether sampling frequency, storage space, or trigger depth, a logic analyzer's resources are limited; only by fully combining them with the relevant parts of the protocol can their effectiveness be maximized.

Protocol decoding is the basis of protocol analysis. Only correct decoding produces results that others can accept, and only correct decoding can provide more information about errors.

Protocol triggering makes full use of the limited trigger depth and storage space while providing more numerous and more reliable triggers, offering an efficient tool for quickly finding and locating errors.

Error recognition is the main role of the logic analyzer, and it rests on protocol decoding and protocol triggering: only a powerful protocol triggering function can capture errors, and only correct protocol decoding can reveal them.

Information prompts make full use of resources such as color and multiple views to present the results of protocol decoding effectively, so that users can quickly find the information they need. Well-designed prompts can also allocate processing resources sensibly, saving the user's time.

Main feature

The role of the logic analyzer is to display the operation of a digital system in a form that is easy to observe, so that the system can be analyzed and faults located. Its main features are as follows:

It has a sufficient number of input channels.

It has a variety of flexible triggering modes to ensure accurate positioning within the observed data stream (for software, it can track any program segment during system operation; for hardware, it can detect and display glitch interference in the system).

It has a memory function, so it can observe single-shot and non-periodic data and diagnose random faults.

It has a delay capability, to help analyze the cause of a failure.

It has a data qualification (filtering) function, so it can select the data to be acquired and discard irrelevant data.

It has a variety of display modes: programs can be displayed as characters, mnemonics, or assembly language; data can be displayed in binary, octal, decimal, hexadecimal, and so on; and timing diagrams can display the timing relationships between signals.

It can drive time-domain instruments in order to reproduce the true waveform of the signal under test and facilitate fault location.

It has reliable glitch detection capability.

Instrument classification

Logic analyzers fall into two categories: the logic state analyzer (LSA) and the logic timing analyzer. The basic structure of the two types is similar; the main differences lie in the display mode and the timing (clocking) mode.

The logic state analyzer displays the detected logic states using the characters 0 and 1 or mnemonics. The display is intuitive, and erroneous codes can be found quickly in a large volume of data, which facilitates functional analysis. A logic state analyzer performs real-time state analysis of a system, examining the information on the bus under the control of the system clock. It has no internal clock generator; it records under the clock of the system under test and works synchronously with it. It is mainly used to analyze the software of a digital system and is a powerful tool for tracing and debugging programs and analyzing software failures.

The logic timing analyzer is used to examine the transmission of digital signals and their timing relationships relative to the system clock. It is equipped with its own clock generator, records data under the control of this internal clock, and works asynchronously with the system under test. It is mainly used for the analysis, debugging, and maintenance of digital hardware.

Working principle

The working process of a logic analyzer consists of data collection, storage, triggering, and display. Because it uses digital storage technology, acquisition and display can proceed separately or simultaneously, and when necessary the stored data can be displayed repeatedly to aid analysis of the problem.

The system under test is connected to the logic analyzer through its probes (a logic analyzer probe is a set of several probes with small contact pins, designed to reach high-density integrated circuits), which monitor the data flow of the system under test and deliver it as parallel data to a comparator. In the comparator each input signal is compared with an externally set threshold level: a signal above the threshold produces a high level on the corresponding line, and a signal below it produces a low level, shaping the input waveform into clean logic levels. The compared and shaped signals are sent to a sampler and sampled under the control of a clock pulse. The sampled values are stored in memory in order, organized on a first-in, first-out basis. When a display command is received, the stored values are read out in sequence and displayed according to the selected display mode.

Development History

The first generation of logic analyzers appeared in 1973; they were slow and simple, offering only basic triggering and display, and timing analysis and state analysis were performed by two separate types of instrument. The second generation was marked by microcomputer control, combining timing analysis and state analysis, which made software and hardware analysis of microcomputer systems convenient. The main features of the third generation were high speed, many channels, large storage capacity, and analysis capabilities typified by system performance analysis. The fourth generation was marked by single-chip logic analyzers with more complete performance.

Display form

After the logic analyzer writes the measured signal into memory in digital form, the control circuitry can stably display all or part of the stored data on the screen as needed. The usual display modes are as follows.

Timing display

The timing display presents the contents of memory on the screen as waveform diagrams expressed in logic levels: a series of square-wave-like traces in which the high level represents "1" and the low level represents "0". Since the displayed waveform is a reconstruction rather than the actual signal, it is also called a "pseudo waveform".
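A pseudo waveform can be reconstructed from stored logic levels alone, which is why it is "pseudo": only the threshold-crossing information survives. Below is a toy text rendering, purely illustrative, of how stored bits become a square-wave view.

```python
def pseudo_waveform(bits, width=3):
    """Render stored logic levels as a two-row ASCII square wave.

    Top row carries the high segments ('-'), bottom row the low
    segments ('_'); `width` characters are drawn per sample.
    """
    hi = "".join(("-" * width) if b else (" " * width) for b in bits)
    lo = "".join((" " * width) if b else ("_" * width) for b in bits)
    return hi + "\n" + lo

print(pseudo_waveform([0, 1, 1, 0, 1]))
```

The rendering shows only two levels per sample, exactly as the real display does; any analog detail between the threshold crossings is gone.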

Status table display

The status table display presents the contents of memory on the screen as numeric values in binary, octal, decimal, hexadecimal, and so on.

Graphical display

The graphical display uses the X direction as the time axis and the Y direction as the data axis. The digital values to be displayed are converted into analog quantities by a D/A converter and plotted on the screen in the order they are read from memory, forming a dot-matrix image.

Image display

The image display shows the entire contents of memory at once as a dot diagram. Each memory word is divided into a high-order and a low-order part, which are converted to analog values by the X- and Y-direction D/A converters and fed to the X and Y channels of the display; each memory word then lights one point on the screen.

Instrument function

As mentioned earlier, most logic analyzers are a combination of two instruments: a timing analyzer and a state analyzer.

Timing analysis

A timing analyzer is the part of a logic analyzer that resembles an oscilloscope, and it displays information the same way: the horizontal axis represents time and the vertical axis represents amplitude. The timing analyzer first samples the input waveform and then uses a user-defined voltage threshold to decide whether the signal is high or low. It can only distinguish high from low, with no intermediate levels, so a timing analyzer is like a digital oscilloscope with only 1 bit of vertical resolution. Consequently a timing analyzer cannot be used to measure analog parameters: if you use timing analysis to measure the rise time of a signal, you are using the wrong instrument. But if you want to check the timing relationships of signals on several lines, a timing analyzer is a reasonable choice. If the signal was in one state at the previous sample and another state at the current sample, the analyzer knows the input transitioned at some moment between the two samples, but not the precise moment. In the worst case, the uncertainty is one sampling period.
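The worst-case uncertainty of one sampling period can be made concrete with a quick calculation. The rates below are illustrative, not a claim about any specific instrument.

```python
def edge_uncertainty_ns(sample_rate_hz):
    """Worst-case timing uncertainty of an asynchronously sampled edge:
    the edge can land anywhere within one sampling period."""
    return 1e9 / sample_rate_hz   # one sampling period, in nanoseconds

print(edge_uncertainty_ns(100_000_000))    # 10.0 ns at 100 MS/s
print(edge_uncertainty_ns(2_000_000_000))  # 0.5 ns at 2 GS/s
```

This is why a higher timing sampling rate directly translates into finer timing resolution: halving the sample period halves the worst-case error in locating an edge.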

Transition timing

If we want to sample and store data over a long period during which little changes, transition timing can use the memory effectively. With transition timing, the analyzer stores only the samples collected after a signal transition, together with the time elapsed since the previous transition.

Glitch capture

Glitches in digital systems are a headache, and some timing analyzers have glitch capture and triggering capabilities that make it easy to track down unpredictable glitches. Timing analysis can effectively sample the input data and track any transitions between samples, so glitches can be readily identified. In timing analysis, a glitch is defined as any transition that crosses the logic threshold more than once between samples. Displaying glitches is very useful: it allows triggering on a glitch and displaying the data leading up to it, which helps determine the glitch's cause.
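The glitch definition above can be simulated directly: compare a fast comparator stream against a slower sample clock and flag any sampling interval containing more than one threshold crossing. This is a conceptual sketch; real analyzers do this in hardware, and the rates here are invented.

```python
def find_glitches(fine_bits, decimation):
    """Return start indices of sampling intervals (each `decimation`
    fine points long) in which the signal transitions more than once,
    i.e. crosses the logic threshold multiple times between samples."""
    glitchy = []
    for start in range(0, len(fine_bits) - decimation, decimation):
        window = fine_bits[start:start + decimation + 1]
        transitions = sum(a != b for a, b in zip(window, window[1:]))
        if transitions > 1:            # more than one crossing = glitch
            glitchy.append(start)
    return glitchy

# A one-point-wide pulse between sample instants is a glitch;
# the later, clean edge is not.
fine = [0, 0, 0, 1, 0, 0, 0, 0, 1, 1, 1, 1]
print(find_glitches(fine, decimation=4))   # [0]
```

A plain sampler running at the slow rate would miss the pulse entirely, which is exactly why dedicated glitch capture hardware exists.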

State analysis

A state of a logic circuit is a sample of the bus or signal lines taken when the data is valid. The main difference between timing analysis and state analysis is this: timing analysis samples under the control of an internal clock, asynchronously with the system under test; state analysis samples under the control of the clock of the system under test, synchronously with it. Use a timing analyzer to see "when" an event occurred, and a state analyzer to see "what" event occurred. Timing analyzers usually display data as waveforms; state analyzers usually display data as listings.

Technical index

Number of channels

Where a logic analyzer is needed for a comprehensive analysis of a system, all signals of interest should be brought into the analyzer, so its channel count should be at least: the word length of the system under test (the number of data bus lines) + the number of control bus lines of the system under test + the number of clock lines. On this basis, an 8-bit computer system requires at least 34 channels. The mainstream products of several manufacturers, such as Tektronix, offer as many as 340 channels, while most products on the market are 16-to-34-channel logic analyzers.

Sufficient timing resolution

Timing sampling rate

For timing sampling and analysis, sufficient timing resolution requires a sufficiently high timing-analysis sampling rate, and it is not only high-speed systems that need a high sampling rate. The sampling rate of mainstream products is as high as 2 GS/s, at which rate details down to 0.5 ns can be seen.

State analysis rate

In state analysis, the logic analyzer's sampling reference clock is the working clock of the device under test (an external clock from the logic analyzer's point of view). The highest rate this clock may take is the logic analyzer's maximum state analysis rate; in other words, it is the fastest operating frequency of a system that this logic analyzer can analyze. The state analysis rate of mainstream products is 300 MHz, and the highest can reach 500 MHz or more.

Record length per channel

The memory of a logic analyzer stores the sampled data for comparison, analysis, and conversion (for example, converting the captured signals into a non-binary representation such as assembly language, C, or C++). The benchmark when choosing the memory length is that it should be larger than the largest block we may need to observe after the system has been divided as finely as it can be.

Test Fixture

The logic analyzer is connected to the device under test through probes, and the test fixture plays a very important role. There are many types of test clips, such as flying-lead sets and clip-on heads.


The logic analyzer connects to the device under test through probes. The probe serves as the signal interface and occupies an important position in maintaining signal integrity. A logic analyzer differs from a digital oscilloscope: although amplitude variation relative to the upper and lower limits is unimportant in itself, amplitude distortion translates into timing error. A logic analyzer has probes with dozens to hundreds of channels and frequency responses from tens to hundreds of MHz; keeping the relative delay between probes minimal and the amplitude distortion low is a key parameter characterizing probe performance. Passive probes from Agilent and active probes from Tektronix are the most representative high-end logic analyzer probes.

The strength of the logic analyzer lies in its insight into the timing relationships of signals across many channels. Unfortunately, any slight difference between channels produces channel-to-channel timing skew. In some logic analyzers this skew can be reduced to a minimum, but a residual value always remains. General-purpose logic analyzers, such as the Tektronix TLA600 or the Agilent HP16600, have a timing skew of approximately 1 ns across all channels. The probe is therefore very important; for details, see "Test Accessories and Connected Probe" on this site.

a. The resistive load of the probe, that is, how much current the probe shunts away from the system once connected. In a digital system the current drive capability is such that loads of a few kΩ or more generally have a negligible shunting effect; the impedance of popular logic analyzer probes is generally between 20 and 200 kΩ.

b. The capacitive load of the probe: the equivalent capacitance the probe presents when connected to the system, generally between 1 and 30 pF. In high-speed systems the influence of the capacitive load on the circuit is far greater than that of the resistive load. If this value is too large it will directly affect the signal edges throughout the system, change the behavior of the whole circuit, and distort the logic analyzer's real-time view of the system, so that what we see is no longer the system's original characteristics.

c. The ease of use of the probe: how difficult it is to attach the probe to the system. As chip packaging density increases, packages such as BGA, QFP, TQFP, PLCC, and SOP have appeared; the minimum IC pin pitch is now below 0.3 mm, and bringing signals out cleanly, especially from BGA packages, is genuinely difficult. Discrete devices are also shrinking, with typical sizes down to 0.5 mm × 0.8 mm.

d. Compatibility with the debugging features already on the circuit board.

e. The openness of the system: as calls for data sharing grow louder, the openness of the instruments we use becomes more and more important. The operating system of the logic analyzer has evolved from the dedicated systems of the past to the Windows interface, which is very convenient in use.


If your work involves digital logic signals, you will have occasion to use a logic analyzer. You should therefore choose one that meets the functions you need without greatly exceeding them. Most users prefer an easy-to-operate instrument, with fewer steps in its function controls and fewer, less complicated menus.

On the other hand, if you need the fastest, largest class of logic analyzer with strong analysis capability, ready-made solutions exist. Such instruments have almost no channel-to-channel delay or probe loading effects. A slight oversight here, however, and you may pay tens of thousands of dollars in "tuition" to gain the experience.

Being able to capture the signal at all is the first priority; knowing that the captured data is useful data then depends on the capability of the logic analyzer.


The logic analyzer is mainly used to locate the specific waveform data present when a system error occurs, and to infer the cause of the error by examining that data, so that a targeted solution can be found.

There are two main ways to locate erroneous waveform data with a logic analyzer. One is to capture a large amount of data during operation and then search it for the error point; this takes time and effort, and because the analyzer's storage capacity is limited, the target waveform is not always captured. The other is to use triggering to start capturing when specific waveform data arrives, precisely locating the target waveform.

The concept of triggering first appeared on the analog oscilloscope: the oscilloscope stops acquiring when a signal matching a preset waveform arrives and draws the waveform on the screen. The logic analyzer applies the same concept to the analysis of digital systems.

During the operation of a digital system the data is, in most cases, continuous, and the logic analyzer must store the data it observes; but its storage depth is limited. This is like taking a fixed-length cut from a conveyor belt: the amount of data extracted depends on the analyzer's memory depth. Triggering lets you observe, around the moment a specific waveform occurs, the state of the related signals before and/or after that condition. In practice this is the trigger-position setting. If the trigger position is set so that the trigger starts the capture, the memory begins storing collected data when the trigger event occurs and continues until the memory is full. If the trigger is set to end the capture, the memory continuously stores collected data until the trigger event occurs and then stops; if the memory fills before the trigger event has occurred, new data automatically overwrites the oldest stored data.
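The two trigger positions described above can be modeled with a fixed-depth ring buffer, where a full buffer overwrites its oldest entry. This is a conceptual sketch under simplifying assumptions (the trigger is a single matching sample value, and `collections.deque` stands in for the acquisition memory).

```python
from collections import deque

def capture(stream, trigger_value, depth, position="end"):
    """Fill a depth-limited buffer around a trigger event.

    position="start": store from the trigger onward until memory is full.
    position="end":   store continuously, overwriting the oldest data,
                      and stop when the trigger occurs.
    """
    buf = deque(maxlen=depth)          # full buffer drops oldest sample
    triggered = False
    for sample in stream:
        if position == "start":
            if not triggered and sample == trigger_value:
                triggered = True
            if triggered:
                buf.append(sample)
                if len(buf) == depth:  # memory full: stop acquiring
                    break
        else:                          # position == "end"
            buf.append(sample)
            if sample == trigger_value:
                break                  # trigger event ends the capture
    return list(buf)

data = [1, 2, 3, 4, 9, 5, 6, 7, 8]    # 9 is the trigger event
print(capture(data, trigger_value=9, depth=4, position="end"))    # [2, 3, 4, 9]
print(capture(data, trigger_value=9, depth=4, position="start"))  # [9, 5, 6, 7]
```

Note how "end" preserves the history leading up to the event (useful for finding an error's cause), while "start" preserves what happened afterward.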


The structure of a logic analyzer includes a storage control unit, and the size of its memory determines its storage depth. Modern logic analyzers store data at enormous bandwidths. For example, the storage bandwidth of the LAB6052 logic analyzer from Guangzhou Zhiyuan Electronics Co., Ltd. is 500 MS/s × 32 bits, i.e. 16 Gbps; neither data transmission (USB 2.0 runs at 480 Mbps) nor data analysis (PC software) can keep up in real time. The logic analyzer therefore first stores the data in memory and then hands it over for analysis.

If you need to capture a data stream continuously, the logic analyzer must have enough memory to record the entire event. Storage depth is closely tied to sampling speed: the depth you need depends on the total time span to be measured and the time resolution required. The longer the single measurement and the higher the sampling frequency, the greater the storage depth required.

In traditional mode, storage depth × sampling resolution = sampling time. This means that, for a given sampling resolution, a larger storage depth directly increases the single capture time, so more waveform data can be observed and analyzed; and for a given capture time, it allows the sampling frequency to be raised, so a more faithful signal can be observed.
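The relation above is simple arithmetic: sampling time = storage depth ÷ sampling rate (the sample period is the sampling resolution). The figures below are illustrative.

```python
def capture_time_s(depth_samples, sample_rate_hz):
    """Total capture window: storage depth divided by sampling rate."""
    return depth_samples / sample_rate_hz

# 1 Msample of depth at 500 MS/s captures only 2 ms of activity:
print(capture_time_s(1_000_000, 500_000_000))   # 0.002
# Quadrupling the depth at the same rate quadruples the time window:
print(capture_time_s(4_000_000, 500_000_000))   # 0.008
```

This is the trade-off the paragraph describes: at fixed depth, raising the sampling rate shrinks the window, so deep memory is what lets an analyzer keep both high resolution and a long capture.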

Choose to use


After the microprocessor appeared in the early 1970s, 4-bit and 8-bit buses emerged, and the two-channel input of the traditional oscilloscope could not satisfy the observation of 8-bit bytes. Testing microprocessors and memories required instruments different from time-domain and frequency-domain equipment, and data-domain test equipment came into being. Soon after HP introduced the state analyzer and Biomation launched the timing analyzer (the two were very different at first), users began to accept this data-domain test instrument as the ultimate means of testing digital circuits, and before long the state analyzer and the timing analyzer merged into the logic analyzer.

By the late 1980s logic analyzers had become more sophisticated, and of course harder to use. Multi-level tree triggering, for example, was introduced to handle complex events of the IF/THEN/ELSE kind. This sort of combinational triggering is certainly more flexible, but it is also not easy for most users to master.

The probes of logic analyzers are increasingly important. Probing problems arise when clipping onto the 16 pins of a through-hole component or onto in-line pins spaced only 0.1 inch apart, and providing hundreds of channel connections to devices operating at 200 MHz is a real problem. Adapters, clips, and auxiliary hooks come in many forms, but the best approach is to design an inexpensive test fixture to which the logic analyzer connects directly, forming a reliable and compact contact.

Development trend

The basic direction of the logic analyzer has found its answer in the continuing fusion of computers and instruments. Tektronix's TLA600 series focuses on usability and discovery capability, that is, how the instrument behaves and how to build a distinctive architecture. The interface uses Microsoft's Windows and is very easy to drive. Improving the ability to discover signals necessarily involves changes to the structure of the instrument: among all the data to be processed, time-related data is emphasized, and different types of information are shown in multiple windows. For a microprocessor, for example, it is best to view the timing, the state, and the disassembled source code at the same time, with the cursors in each window tracking one another.

Triggering has always been a problem in traditional logic analyzers. The TLA600 series provides users with a trigger library, which simplifies the setup of complex trigger events and lets you concentrate on the test problem instead of spending time adjusting trigger settings. The library contains many easy-to-understand trigger setups that can serve as starting points, usually to be modified from there. The need for special triggering capability is only part of the problem: besides triggering directly on an error event, users also want to look back over the preceding period to find the source of the error and its relationships. Fine-grained triggering and deep memory together improve the advanced triggering capability.

Using Windows on the PC platform, in addition to its many well-known benefits, means that given the right software and tools you can control the instrument remotely over the Internet, extract source code and symbols from target file formats, support Microsoft's COM/DCOM standards, and run all kinds of control operations on the processor.

How to choose?

If a digital circuit fails, we generally consider using a logic analyzer to check its integrity, and finding an existing fault is not difficult. But would you consider a logic analyzer in other situations? For example: first, how do you observe whether the system under test, when executing the program we prepared in advance, is actually running as we designed it? If we write MOV A, B to the system and it executes ADD A, B, what are the consequences? Second, how do you monitor the actual working state of the software in real time, rather than setting breakpoints with DEBUG and similar tools and checking whether certain preset variables or memory locations hold the values we expect? And there are third, fourth, and many more such problems to be solved.

We usually divide a digital system into a hardware part and a software part. Developing and designing such systems involves many tasks: preliminary hardware circuit design, software specification and initial coding, hardware debugging, software debugging, system finalization, and so on. Almost every step requires the help of a logic analyzer; but given the differing budgets and staffing of each organization, and since many systems do not need every step repeated, we can divide the use of logic analyzers into the following levels:

The first level: simply look at common faults in the hardware system, such as the waveforms of clock and other signals, and whether the signals contain glitches that seriously affect the system.

The second level: analyze the timing of each signal in the hardware system carefully, in order to make the best use of system resources and eliminate the faults that timing analysis can expose;

The third level: analyze the hardware's execution of the software, to ensure that the written program is executed by the hardware system exactly as intended;

The fourth level: It is necessary to monitor the execution of the software in real time and debug the software in real time.

The fifth level: systematically dissect and analyze the software and hardware of an existing system, in order to fully understand and master that system's software and hardware.

From the requirements of these levels it is clear that not all of them need a very high-end logic analyzer. Users at the first level may even solve their problems with a well-featured oscilloscope. For the levels above, you can choose a corresponding instrument. Logic analyzers in fact come in several classes:

1. An ordinary 2-to-4-channel digital storage oscilloscope, such as the TDS3000 series (with the TDS3TRG advanced trigger module). Using its advanced trigger functions (such as pulse-width trigger, runt-pulse trigger, and AND, OR, or XOR combinations between channels), we can find the signal we want to see and locate and eliminate some faults, while the oscilloscope remains available for other purposes. Since this uses only the additional functions of an oscilloscope, it is the most economical approach.

2. When the oscilloscope does not have enough channels, a multi-channel instrument with simple timing-analysis functions can be chosen, such as an early logic analyzer or one of today's mixed-signal oscilloscopes, for example Agilent's 546××D oscilloscopes.

3. A computer plug-in type with relatively simple functions and modest speed: a Windows-based virtual instrument in which most functions are implemented in software. Many domestic manufacturers produce such products.

4. A fixed, non-expandable machine whose sampling rate, trigger functions, and analysis functions are all powerful, for example the TLA600 series.

5. A modular plug-in card machine with stronger functions and better expandability. Users can thus select the grade of instrument that matches their needs.

When to use?

The logic analyzer is widely recognized as the best tool for verifying and debugging digital designs. It can verify whether a digital circuit is working properly and help users find and eliminate faults. It captures and displays many signals at once and analyzes their timing and logic relationships. For intermittent faults that are difficult to capture, some logic analyzers can detect low-frequency transient interference and violations of setup and hold times. During the integration of software and hardware, a logic analyzer can trace the execution of embedded software and analyze the efficiency of program execution, supporting final system optimization. In addition, some logic analyzers can correlate source code with specific hardware activity in the design.

When you need to complete the following tasks, please use a logic analyzer:

· Debug and verify the operation of the digital system;

· Simultaneous tracking and correlation of multiple digital signals;

· Detect and analyze timing violations and transient states on buses;

· Track the implementation of embedded software.
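The setup/hold checking mentioned above can be sketched in a few lines. The following Python function (a simplified illustration under assumed inputs, not an instrument feature) flags any data transition that falls inside the setup/hold window around a sampling clock edge:

```python
def check_setup_hold(clock_edges_ns, data_edges_ns, t_setup_ns, t_hold_ns):
    """Report data transitions that fall inside the setup/hold window
    around any sampling clock edge (all times in nanoseconds)."""
    violations = []
    for clk in clock_edges_ns:
        for d in data_edges_ns:
            # Data must be stable from (clk - t_setup) to (clk + t_hold).
            if clk - t_setup_ns < d < clk + t_hold_ns:
                violations.append((clk, d))
    return violations

# Clock edges every 100 ns; one data edge lands 1 ns before a clock edge,
# violating a 2 ns setup requirement.
clocks = [100, 200, 300]
data = [60, 199, 250]
print(check_setup_hold(clocks, data, t_setup_ns=2, t_hold_ns=1))
# → [(200, 199)]
```

Real analyzers perform this comparison in hardware on every acquisition, but the logical test is the same: each data transition is checked against the stability window around the clock.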

Application range

With the development of large-scale integrated circuits and microcomputers, modern digital systems have become computerized. The introduction of the microcomputer greatly improved the ability of systems to perform complex tasks; at the same time, however, traditional test equipment became unable to effectively test and analyze digital systems, especially microcomputer systems. This is because data in a digital system is transmitted as multiple spatially distributed bits that together form data words in a defined format. The transmitted data stream is a sequence of data words with discrete time as its independent variable, rather than a waveform with continuous time as its independent variable. Parameters that matter in analog signal analysis, such as signal amplitude, are therefore less important in digital signal analysis, which focuses instead on whether each signal is above or below a certain threshold level, and on the timing relationships among the digital signals and the system clock.
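This threshold view of a signal is easy to make concrete. The sketch below (plain Python, mirroring the 200 Hz / 1.5 V example given earlier in this article) converts sampled voltages into logic levels with a single reference threshold, just as a logic analyzer's input comparator does:

```python
def to_logic(voltages, v_ref=1.5):
    """Convert sampled voltages into logic levels: above the reference
    threshold is High (1), at or below it is Low (0)."""
    return [1 if v > v_ref else 0 for v in voltages]

# Samples taken every 5 ms at a 200 Hz sample rate (values in volts):
samples = [0.2, 0.3, 3.1, 3.3, 3.2, 0.1, 0.4]
print(to_logic(samples))  # → [0, 0, 1, 1, 1, 0, 0]
```

The resulting sequence of 1s and 0s, drawn against sample time, is the digital waveform the analyzer displays; the actual voltage values are discarded, which is why an oscilloscope is still needed to measure voltage.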
