Channel raster is the distance between neighboring channels in a frequency band.
During the cell search and initial synchronization procedure, the user equipment (UE) tries to synchronize with the network in both time and frequency, identify itself to the network, and acquire knowledge about the network to which it is trying to connect.
A naïve cell search procedure looks as follows:
1) Do a frequency scan of all the configured frequency bands using the step size defined by the channel raster. For example, if the channel raster is 200 kHz, the UE performs a search every 200 kHz in all the bands it supports. In effect, the UE calculates the DL received signal power and detects a set of candidate frequencies.
2) Search for synchronization signals over all the candidates detected in step 1.
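Step 1 above can be sketched as a simple loop over the raster. This is a minimal illustration, not a real PHY implementation: the band edges, threshold, and the `measure_dl_power` callback (which would sit on top of the RF front end) are all hypothetical placeholders.

```python
def naive_frequency_scan(band_start_khz, band_end_khz, raster_khz,
                         power_threshold_dbm, measure_dl_power):
    """Step through a band on the channel raster and return the candidate
    center frequencies (in kHz) whose measured DL power exceeds the
    threshold. measure_dl_power is a hypothetical callback that returns
    the received DL power in dBm at a given frequency."""
    candidates = []
    for freq_khz in range(band_start_khz, band_end_khz + 1, raster_khz):
        if measure_dl_power(freq_khz) > power_threshold_dbm:
            candidates.append(freq_khz)
    return candidates
```

With a 200 kHz raster over a 60 MHz band this loop already makes 300 power measurements, which is why the channel raster directly drives search time, and why the practical optimizations described below matter.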
In practical systems the above naïve procedure is not followed; the UE implements a more efficient search algorithm, for example by performing spectrum analysis on the frequencies before engaging the search operation. Also note that this kind of sophisticated frequency search only needs to be done the very first time. Once the UE successfully acquires the synchronization signal, later initial access procedures can be based on a priori information from the previous cell search results.
In any case, the channel raster defines the step size for this search procedure and consequently influences the time the UE takes to complete the initial search. The channel raster for GSM is 200 kHz for all bands. WCDMA and TD-SCDMA also use a channel raster of 200 kHz for all bands. The selected center frequency must therefore be an integer multiple of 200 kHz. However, there are exceptions.
The center frequency of an LTE carrier is assumed to be on a 100 kHz channel raster (also known as the carrier grid), which allows LTE UEs to search a limited number of carrier frequencies in the bands they support and synchronize within a reasonable amount of time after being activated. The synchronization signals are centered around the DC subcarrier, so there is no decoupling between the channel raster and the synchronization raster.
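The 100 kHz grid is visible in the LTE EARFCN numbering: the downlink frequency is F_DL = F_DL_low + 0.1 × (N_DL − N_Offs-DL), so each EARFCN step moves the carrier by exactly 100 kHz. The sketch below uses this formula with a two-band excerpt of the band parameters; the values are our reading of 3GPP TS 36.101 (Section 5.7.3) and should be verified against the spec before use.

```python
# Excerpt of LTE DL band parameters: band -> (F_DL_low in MHz, N_Offs_DL).
# Illustrative values only; check 3GPP TS 36.101 Table 5.7.3-1.
LTE_DL_BANDS = {
    1: (2110.0, 0),
    3: (1805.0, 1200),
}

def earfcn_to_dl_mhz(band, n_dl):
    """Convert a DL EARFCN to its carrier center frequency in MHz,
    per F_DL = F_DL_low + 0.1 * (N_DL - N_Offs_DL)."""
    f_dl_low, n_offs = LTE_DL_BANDS[band]
    return f_dl_low + 0.1 * (n_dl - n_offs)
```

Consecutive EARFCNs differ by 0.1 MHz, which is exactly the 100 kHz carrier grid the UE must sweep when it has no prior information.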
5G-NR is expected to be deployed in wide frequency bands, so frequency scanning using the channel raster is not efficient. Further, there may be confusion between NR and LTE networks, as both employ OFDM-based waveforms in the downlink. Furthermore, “alien” waveform confusion may occur when massive MTC waveforms or on-demand signals exist in the frequency bands. [Ref: US patent application 2018/0131487]
5G-NR therefore decouples the channel raster and the synchronization raster: [Ref: R1-1608967]
- Frequency raster (used for synchronization and NR cell search) is sparser than carrier/channel raster.
- Frequency raster (used for synchronization and NR cell search) depends on frequency bands.
- NR UE does not assume a fixed frequency location of the synchronization signal(s) relative to the center of the carrier.
- A frequency raster sparser than the channel raster means the center frequency locations of the synchronization signal and of the physical carrier can differ.
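In NR the synchronization raster is enumerated by the Global Synchronization Channel Number (GSCN). The sketch below covers only the sub-3 GHz range, where SS_REF = N × 1200 kHz + M × 50 kHz and GSCN = 3N + (M − 3)/2 with M ∈ {1, 3, 5}; these formulas are our reading of 3GPP TS 38.101-1 Table 5.4.3.1-1 and should be checked against the spec. The point of the example is only that the resulting grid is far sparser than the LTE-style 100 kHz channel raster.

```python
def gscn_to_ss_ref_khz(gscn):
    """Map a GSCN in the sub-3 GHz range to the SS block center frequency
    SS_REF in kHz, inverting GSCN = 3N + (M-3)/2 with M in {1, 3, 5}
    and 1 <= N <= 2499 (per our reading of TS 38.101-1)."""
    for m in (1, 3, 5):
        n, rem = divmod(gscn - (m - 3) // 2, 3)
        if rem == 0 and 1 <= n <= 2499:
            return n * 1200 + m * 50
    raise ValueError("GSCN outside the sub-3 GHz range")
```

For a given M, stepping N moves SS_REF in 1.2 MHz jumps, so the UE has an order of magnitude fewer sync-signal candidates to try than it would have carrier positions on a 100 kHz grid, which is exactly the search-time benefit the decoupling is after.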
To arrive at this design, there were extensive discussions in 3GPP around the following questions.
Question 1: Is there any benefit for UE power consumption in allowing the candidate frequency locations of synchronization signal(s) for NR to be sparser than for LTE?
Question 2: If there is benefit, what are possible synchronization signal raster per frequency ranges and/or bands?
Question 3: What should be the relation between the candidate frequency locations of synchronization signal(s) and the frequency locations of the center of the NR carrier bandwidth?
Question 4: Are there drawbacks in allowing that the candidate frequency locations of synchronization signal(s) are sparser than the possible frequency locations of the center of NR carrier bandwidth?
Question 5: What about the scenarios where synchronization signal(s) of neighbor cells may not be on the same center frequency, and the impact on intra-/inter-frequency RRM measurements?