The limitations to ultra-low sidelobe performance are explored using a 32-element linear array, operating at L-band, containing transmit/receive (T/R) modules with 12-bit phase shifters. With conventional far-field calibrations, the average sidelobe level of the array was about −40 dB, whereas considerably lower sidelobe performance is expected from such an array in theory. Initially, sidelobe performance was thought to be limited by inadequate calibration. An examination of individual array element patterns, however, revealed a mirror-symmetric ripple attributable to edge effects in a small array, and simulations indicated that more precise calibrations would not compensate for these element-pattern differences. An adaptive calibration technique was therefore developed which iteratively adjusts the attenuator and phase-shifter commands to create nulls in the antenna pattern in the directions of the nulls of a theoretical antenna pattern. With adaptive calibration, the average sidelobe level can be lowered to about −60 dB. The technique can also be used for interference suppression by implementing antenna patterns with deep nulls in specified directions.
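The abstract does not give the details of the adaptive nulling procedure, but the idea of iteratively adjusting quantized element commands to force nulls at prescribed directions can be illustrated with a minimal NumPy sketch. The element spacing (half wavelength), the null directions, and the least-squares projection used to place the nulls are all illustrative assumptions, not the paper's method; the 12-bit phase quantization of the phaser commands follows the array described in the text.

```python
import numpy as np

N = 32        # number of elements, as in the paper
d = 0.5       # element spacing in wavelengths (assumed)

def steering(theta_deg):
    """Array steering vector for a uniform linear array."""
    n = np.arange(N)
    return np.exp(2j * np.pi * d * n * np.sin(np.radians(theta_deg)))

def quantize_phase(w, bits=12):
    """Round element phases to the nearest 12-bit phaser command."""
    step = 2 * np.pi / 2**bits
    return np.abs(w) * np.exp(1j * np.round(np.angle(w) / step) * step)

# Illustrative null directions, standing in for the nulls of a
# theoretical low-sidelobe reference pattern.
null_dirs = [15.0, 23.0, 31.0]
A = np.column_stack([steering(t) for t in null_dirs])

w = np.ones(N, dtype=complex)   # start from uniform illumination
for _ in range(50):
    # Project the weights onto the null space of the constraint
    # matrix, which places exact nulls at null_dirs...
    w = w - A @ np.linalg.solve(A.conj().T @ A, A.conj().T @ w)
    # ...then re-quantize to 12-bit phaser commands, which slightly
    # perturbs the nulls; iterating drives the two steps to agreement.
    w = quantize_phase(w)

def pattern_db(w, theta_deg):
    """Normalized array-factor magnitude in dB at one direction."""
    af = w.conj() @ steering(theta_deg)
    return 20 * np.log10(np.abs(af) / N + 1e-12)

for t in null_dirs:
    print(f"{t:5.1f} deg: {pattern_db(w, t):7.1f} dB")
```

Even with 12-bit quantization, the residual level at the constrained directions stays far below the −40 dB sidelobe floor of a conventionally calibrated array, while the main beam at broadside is essentially unaffected, which is the mechanism the interference-suppression claim relies on.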