Gradient-based optimization method for interference suppression of linear arrays by the amplitude-only and phase-only control
Published online by Cambridge University Press: 28 September 2021
Abstract
This paper presents a gradient-based optimization method for interference suppression in linear arrays by controlling the electrical parameters of each array element, namely amplitude-only and phase-only control. The gradient-based optimization algorithm (GOA), an efficient optimizer, is applied to the anti-interference array problem, which is generally solved by evolutionary algorithms. The goal of this method is to maximize the main-beam gain while minimizing the peak sidelobe level (PSLL) under a null constraint. To control the nulls precisely and synthesize the radiation pattern accurately, the full-wave method of moments is used to account rigorously for the mutual coupling among the array elements. The search efficiency is greatly improved because gradient (sensitivity) information is exploited in solving the optimization problem: the sensitivities of the design objective and the constraint function with respect to the design variables are derived analytically, and the optimization problems are solved with the GOA. The GOA produces the desired nulls at the specified positions, minimizes the PSLL, and greatly shortens the computation time compared with commonly used non-gradient methods such as the genetic algorithm and the cuckoo search algorithm.
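The paper's full-wave MoM model and analytical sensitivities are not reproduced in this abstract. As a rough illustration of the gradient-based idea only, the sketch below applies steepest descent to phase-only null placement for an idealized isotropic uniform linear array (no mutual coupling); the penalty weight `mu`, the angles, and all function names are assumptions for this toy example, not the paper's formulation.

```python
import numpy as np

def array_factor(phi, theta, d=0.5):
    """Array factor of an isotropic ULA (element spacing d in wavelengths)
    with unit-amplitude, phase-only weights exp(j*phi)."""
    n = np.arange(len(phi))
    return np.exp(1j * (phi + 2 * np.pi * d * n * np.sin(theta))).sum()

def gradient(phi, theta_null, theta_main=0.0, mu=0.05, d=0.5):
    """Analytical gradient of  |AF(theta_null)|^2 - mu*|AF(theta_main)|^2
    with respect to the element phases (penalty form of the null constraint)."""
    n = np.arange(len(phi))
    g = np.zeros_like(phi)
    for theta, sign in ((theta_null, 1.0), (theta_main, -mu)):
        a = np.exp(1j * (phi + 2 * np.pi * d * n * np.sin(theta)))
        af = a.sum()
        # d|AF|^2 / dphi_n = -2 Im( a_n * conj(AF) )
        g += sign * (-2.0) * np.imag(a * np.conj(af))
    return g

N = 16
phi = np.zeros(N)                      # start from a uniform broadside array
theta_null = np.deg2rad(20.0)          # assumed interference direction
for _ in range(3000):                  # plain steepest descent
    phi -= 0.02 * gradient(phi, theta_null)

print(abs(array_factor(phi, theta_null)))  # driven toward a deep null
print(abs(array_factor(phi, 0.0)))         # main beam stays close to N
```

Because the objective's gradient is available in closed form, each iteration costs only two array-factor evaluations, which is the efficiency argument the abstract makes against population-based methods such as genetic or cuckoo search algorithms.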
- Type
- Antenna Design, Modelling and Measurements
- Information
- International Journal of Microwave and Wireless Technologies , Volume 14 , Issue 8 , October 2022 , pp. 1002 - 1008
- Copyright
- Copyright © The Author(s), 2021. Published by Cambridge University Press in association with the European Microwave Association