If I had to pick two things that fascinate me about nature, I would pick symmetry and synchronization. In his famous TED talk, mathematician Steven Strogatz explored how flocks of creatures (like birds, fireflies and fish) manage to synchronize and act as a unit, even though no one is giving orders. Similarly, the book “The Equation That Couldn’t Be Solved” explores symmetry in everything from biology and physics to music and the visual arts.
A lot of scientific and mathematical discoveries are based on concepts associated with symmetry and synchronization. Albert Einstein used symmetry as a guiding principle when he devised his General Theory of Relativity. The mathematical physicist James Clerk Maxwell demonstrated the symmetry between electric and magnetic fields.
Initial synchronization is a critical procedure in communication systems. It comes into the picture when a user equipment (UE) tries to connect to a network for the very first time, or when it hands over to a new cell. This procedure allows the UE to synchronize with the network in both time and frequency, identify itself to the network, and acquire knowledge about the network it is trying to connect to. To aid the UE in this process, synchronization signals are defined: the network transmits them, and the UE detects them to synchronize itself with the network. Given their importance, a good amount of effort goes into the design of these signals. One design requirement is that synchronization signals should be based on mathematical sequences with a sharp ambiguity function (in time and frequency offset), so that time and frequency impairments do not cause mis-detection at the UE end.
This blog post explores how these synchronization signals have changed with each generation of cellular communication standards, from 2G-GSM to 5G-NR.
GSM uses TDMA for transmitting information, and it uses the Frequency Correction Burst (FCB) as its frame synchronization signal. The FCB consists of a single Continuous-Wave (CW) tone (a pure sine wave) transmitted at a frequency 67.708 kHz above the nominal carrier frequency of the downlink signal. The tone is transmitted for 148 symbol intervals (148/270833 s = 546.4 microseconds) in the first time slot of every tenth frame. The task of the receiver, then, is to detect this pure sine wave in order to lock onto the downlink frame structure. This may look like a trivial signal-processing task. However, detecting a sine wave that is buried deep under noise and impaired by frequency offset and other distortions can be a daunting task.
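To get a feel for why this is tractable despite the low SNR, here is a minimal sketch (not a real GSM receiver) that buries a CW tone at the FCB offset in noise and recovers it by locating the FFT peak. The sample rate, tone offset, and burst length come from the numbers above; the noise level and FFT size are illustrative assumptions.

```python
import numpy as np

fs = 270833.0       # sample rate = GSM symbol rate, one sample per symbol
f_tone = 67708.0    # FCB tone offset above the carrier, in Hz
n = 148             # FCB burst length in symbols

t = np.arange(n) / fs
tone = np.exp(2j * np.pi * f_tone * t)

rng = np.random.default_rng(0)
noise = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
rx = tone + 2.0 * noise   # per-sample SNR of about -6 dB

# The FFT integrates the tone coherently over all 148 samples, so the
# tone's bin stands well above the noise floor even at negative SNR.
spectrum = np.abs(np.fft.fft(rx, 1024))
freqs = np.fft.fftfreq(1024, 1 / fs)
f_est = freqs[np.argmax(spectrum)]
print(f"estimated tone offset: {f_est / 1e3:.2f} kHz")
```

The coherent processing gain here is roughly 10·log10(148) ≈ 21.7 dB, which is what rescues the tone from under the noise; a real receiver must additionally cope with an unknown carrier offset and with locating the burst in time.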
In 4G-LTE, there are two types of synchronization signals: the Primary Synchronization Signal (PSS) and the Secondary Synchronization Signal (SSS). The PSS is based on a class of complex exponential sequences known as Zadoff-Chu (ZC) sequences, which belong to the family of Constant Amplitude Zero Autocorrelation (CAZAC) waveforms. The SSS is based on maximum-length sequences (m-sequences). The sequence used for the SSS is an interleaved concatenation of two length-31 m-sequences s0(m0)(n) and s1(m1)(n) (also known as “short codes”), where m0 and m1 are the indices for the short codes/sequences s0 and s1, respectively. An m-sequence is a pseudo-random binary sequence created by cycling through every possible non-zero state of a shift register of length n, resulting in a sequence of length 2^n - 1 (here n = 5, giving the length-31 sequences).
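The CAZAC property is easy to verify numerically. The sketch below generates the length-63 ZC sequence used by the LTE PSS (roots 25, 29, 34 select the three PSS; the DC puncturing and OFDM mapping of TS 36.211 are omitted here) and checks both constant amplitude and zero cyclic autocorrelation.

```python
import numpy as np

def zadoff_chu(u, n_zc=63):
    # ZC sequence for odd length n_zc: x_u(n) = exp(-j*pi*u*n*(n+1)/n_zc)
    n = np.arange(n_zc)
    return np.exp(-1j * np.pi * u * n * (n + 1) / n_zc)

zc = zadoff_chu(25)

# Constant Amplitude: every sample lies on the unit circle.
assert np.allclose(np.abs(zc), 1.0)

# Zero Autocorrelation: cyclic autocorrelation vanishes at every non-zero lag
# (this holds because gcd(u, n_zc) = 1).
for lag in range(1, 63):
    assert abs(np.vdot(zc, np.roll(zc, lag))) / 63 < 1e-9

print("length-63 ZC sequence is CAZAC")
```

Constant amplitude keeps the transmitter's peak-to-average power low, and zero autocorrelation gives the correlator a single sharp timing peak, which is exactly what a synchronization signal needs.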
5G-NR defines the same two synchronization signals as 4G-LTE, but the underlying sequences are different: the PSS is based on an m-sequence and the SSS on a Gold sequence. The NR-PSS is a length-127 BPSK-modulated m-sequence; in the frequency domain, three cyclic shifts (0, 43, 86) produce the three PSS signals that carry the cell ID index within a cell ID group. The NR-SSS sequence is generated by multiplying two BPSK-modulated m-sequences whose cyclic shifts are determined by the cell ID.
Why does 5G-NR use an m-sequence for the PSS instead of a ZC sequence as in 4G-LTE? The article linked below explains the reason. In a nutshell, the LTE PSS based on Zadoff-Chu (ZC) sequences has a time-frequency ambiguity problem: the time-frequency auto-correlation of each LTE PSS contains several significant side lobes besides the main lobe at zero time delay and zero carrier frequency offset (CFO). This can lead to false PSS/SSS detection. http://www.dpi-proceedings.com/index.php/dtcse/article/viewFile/26293/25707
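The ambiguity is easy to reproduce. In the sketch below (illustrative, length-63 ZC with root 25), a CFO of exactly one subcarrier turns the sequence into a cyclic shift of itself, so the correlator sees a full-magnitude peak at the wrong time lag: a frequency error masquerades perfectly as a timing error.

```python
import numpy as np

N, u = 63, 25
n = np.arange(N)
zc = np.exp(-1j * np.pi * u * n * (n + 1) / N)   # LTE-style ZC sequence
cfo = np.exp(2j * np.pi * 1 * n / N)             # CFO of one subcarrier

# Cyclic cross-correlation of the clean sequence against the CFO-hit one.
corr = np.array([abs(np.vdot(zc, np.roll(zc * cfo, -lag)))
                 for lag in range(N)])

# The peak keeps full magnitude but appears at a non-zero lag:
# for a ZC sequence, a frequency shift is equivalent to a time shift.
print("peak lag:", int(np.argmax(corr)),
      "normalized peak:", round(float(corr.max()) / N, 3))
```

An m-sequence does not have this property: under a CFO its correlation peak collapses instead of sliding to another lag, which is why NR moved the PSS to an m-sequence.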
Why does 5G-NR use a Gold sequence for the SSS instead of an interleaved concatenation of two m-sequences with different cyclic shifts, as in 4G-LTE?
LTE cell searches can result in false alarms (FAs), where non-existent cells (also known as false cells or ghost cells) are detected. One cause is two nearby cells whose SSSs share a short code. For example, the cell ID NID(1)=1 has m0=0, but so do NID(1)=30, 59, 87, 114, 160, and 165. Accordingly, if one cell has NID(1)=30 and a nearby cell has NID(1)=114, they have the same s0(m0)(n). This overlap in constituent short codes is known as a “short code collision”, which, in turn, can result in a ghost cell being detected. Other causes of ghost cells include imperfect PSS/SSS cross-correlations between two cell IDs and random noise/interference. [Ref: System and method for cell search enhancement in an LTE system, United States Patent 9432922]. In LTE, this issue can be alleviated to a certain extent by verifying the cell ID via the always-on common reference signal (CRS), which is scrambled by the cell ID. NR has no always-on reference signal, so the problem is instead solved inherently by proper SSS sequence selection.
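The NR answer is to multiply the two m-sequences rather than interleave them, giving a Gold-style sequence in which the shift pair (m0, m1) is unique for every cell ID. A sketch of the construction, following the recurrences, initial states, and shift formulas of 3GPP TS 38.211 §7.4.2.3:

```python
import numpy as np

def m_seq(taps):
    # length-127 m-sequence x(i+7) = sum of tapped bits mod 2,
    # initial state x(0) = 1, x(1..6) = 0 (per TS 38.211 §7.4.2.3)
    x = np.zeros(127, dtype=int)
    x[0] = 1
    for i in range(127 - 7):
        x[i + 7] = sum(x[i + t] for t in taps) % 2
    return x

def nr_sss(n_id1, n_id2):
    x0 = m_seq((4, 0))                     # x0(i+7) = (x0(i+4)+x0(i)) mod 2
    x1 = m_seq((1, 0))                     # x1(i+7) = (x1(i+1)+x1(i)) mod 2
    m0 = 15 * (n_id1 // 112) + 5 * n_id2   # shift of the first sequence
    m1 = n_id1 % 112                       # shift of the second sequence
    n = np.arange(127)
    return (1 - 2 * x0[(n + m0) % 127]) * (1 - 2 * x1[(n + m1) % 127])

# The LTE short-code-collision pair from the text (NID(1)=30 vs 114):
# in NR the resulting sequences stay distinct, with low cross-correlation.
s_a, s_b = nr_sss(30, 0), nr_sss(114, 0)
print(int(s_a @ s_a), int(s_a @ s_b))
```

Because every cell ID maps to a distinct (m0, m1) pair, no two cells ever share a constituent short code the way they can in LTE, which removes this source of ghost cells by construction.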