Definition of: MHz (MegaHertz)

MHz, or MegaHertz, means one million clock cycles per second. The unit measures the maximum frequency at which an electronic circuit can switch, or “change state.” Since every data change or computation involves a state change, the clock frequency sets an upper limit on performance. GigaHertz (GHz) is one billion cycles per second, or 1,000 MegaHertz (MHz).
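
To make these unit relationships concrete, here is a minimal Python sketch that converts an example clock speed between GHz, MHz, and raw cycles per second, and derives the length of a single cycle. The 3.5 GHz figure is an arbitrary assumption for illustration, not a value from this definition.

```python
HZ_PER_MHZ = 1_000_000        # 1 MHz = one million cycles per second
MHZ_PER_GHZ = 1_000           # 1 GHz = 1,000 MHz

clock_ghz = 3.5               # assumed example clock speed
clock_mhz = clock_ghz * MHZ_PER_GHZ
clock_hz = clock_mhz * HZ_PER_MHZ

cycle_time_ns = 1e9 / clock_hz  # duration of one clock cycle, in nanoseconds

print(f"{clock_ghz} GHz = {clock_mhz:,.0f} MHz = {clock_hz:,.0f} cycles per second")
print(f"One cycle lasts about {cycle_time_ns:.3f} ns")
```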

Because Hertz is a unit of frequency, it is also used for radio frequencies, but that usage has nothing to do with computing performance.

Frequency in MHz is often used as a metric for chips (CPUs, GPUs) and memory (RAM, VRAM), but also for data buses (data transport lanes). In all cases, the frequency defines the rhythm at which things happen. The higher that rhythm (the clock), the more potential for changes and computations.

Relation to performance

Frequency is one of many factors in performance. It is also possible to scale performance by processing more data per cycle. For example, with each clock tick, computer makers have made it possible to handle larger chunks of data (8, 16, 32, 64, 128, 256 bits or more). This is called the data “width.”
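
As a rough sketch of how width multiplies what each tick can move, peak raw throughput is simply data width times clock frequency. The 64-bit width and 100 MHz clock below are assumed values, chosen only for illustration.

```python
clock_hz = 100 * 1_000_000    # assumed 100 MHz clock
width_bits = 64               # assumed 64-bit data width

bits_per_second = width_bits * clock_hz          # peak raw throughput in bits/s
megabytes_per_second = bits_per_second / 8 / 1e6

print(f"A {width_bits}-bit path at 100 MHz moves up to "
      f"{megabytes_per_second:.0f} MB/s (theoretical peak)")
```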

It is also possible to process more data streams by adding more sub-processors, or cores. CPUs commonly have two to 16 cores, while GPUs have thousands of computing cores, all running at the same clock (frequency).
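
Core count works the same way, as another multiplier: a theoretical peak is clock times operations per cycle per core times number of cores. The CPU and GPU figures below are hypothetical assumptions, only meant to show why a chip with thousands of cores can reach much higher raw throughput at a similar clock.

```python
def peak_ops_per_second(clock_hz: float, ops_per_cycle: int, cores: int) -> float:
    """Theoretical peak: every core finishes ops_per_cycle operations on every tick."""
    return clock_hz * ops_per_cycle * cores

# Hypothetical chips, for illustration only.
cpu_peak = peak_ops_per_second(clock_hz=3.5e9, ops_per_cycle=8, cores=8)
gpu_peak = peak_ops_per_second(clock_hz=1.5e9, ops_per_cycle=2, cores=4096)

print(f"CPU peak: {cpu_peak / 1e9:,.0f} billion ops/s")
print(f"GPU peak: {gpu_peak / 1e9:,.0f} billion ops/s")
```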

All this data must travel from memory to the cores, which is why a fast data bus is required. A balanced system is designed so that none of the performance-defining elements ends up being a bottleneck for the others.
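
As a sketch of the “balanced system” idea, one can compare how much data the cores could consume per second with what a hypothetical memory bus can deliver; whichever side is smaller becomes the bottleneck. All figures below are made-up assumptions.

```python
# All numbers are assumptions, chosen only to illustrate the bottleneck check.
core_demand_gb_s = 3.5 * 8 * 8   # assumed: clock (GHz) x cores x bytes needed per cycle
bus_supply_gb_s = 50.0           # assumed memory-bus bandwidth, in GB/s

bottleneck = "memory bus" if bus_supply_gb_s < core_demand_gb_s else "compute cores"
print(f"Cores could consume ~{core_demand_gb_s:.0f} GB/s, the bus delivers "
      f"~{bus_supply_gb_s:.0f} GB/s -> bottleneck: {bottleneck}")
```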
