Interview with Dan the validation man

Interview between Dan and Robert

"Of course the antenna design algorithms have to be validated! That's what you'd expect from any good software!"

"Aren't antenna design routine validation and model validation the same thing?"

After hearing comments like these, I decided to turn the spotlight on Dan, your validation man, to find out what validation is all about. He is the person charged with ensuring that the design algorithms used in Antenna Magus, and the antenna models exported from it, are validated and can be trusted.

My chat with Dan went something like this:

Me: Hello Dan, I heard a lot of positive feedback from our users that they love the fact that the exported simulation models are ready to run. Can you please explain how you do it?

Dan: It is always great when we get positive feedback, but I definitely can’t take all the credit. It is a team effort between the engineers and myself.

Me: Some people think model validation and design routine validation is the same thing. Can you explain the differences?

Dan: These are two completely different concepts, but they are validated simultaneously. It is almost like solving for two unknowns using a pair of simultaneous equations.

Me: Oh, OK... that is interesting. Can you explain the process in a bit more detail?

Dan: Well, the engineers give me the algorithms that are used to design the antennas in Antenna Magus. There are often several groups of design objectives that have to be tested, so I pick a wide range of combinations and do multiple designs to see if the results match the given design criteria. I effectively treat each algorithm as a black box and test whether it works as specified, without trying to understand what is going on inside it. For instance, if a patch is designed for 7 dBi gain on a 1 mm Rogers substrate with a 100 Ohm impedance, I set up a simulation model using the designed values and then check the results to confirm that the specified performance is achieved.
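The black-box check Dan describes can be sketched as a small test harness. This is purely illustrative: `design_patch` and `simulate_gain` are hypothetical stand-ins for an Antenna Magus design routine and a CEM solver, and the numbers are made up, not real solver output.

```python
# Hypothetical sketch of black-box design-routine validation.
# design_patch and simulate_gain are stand-ins for the real design
# algorithm and a full-wave CEM simulation; values are illustrative only.

def design_patch(gain_dbi, substrate_mm, impedance_ohm):
    """Black-box design algorithm: returns physical dimensions (stub)."""
    # The tester never looks inside this function; only its outputs matter.
    return {"width_mm": 12.0 * substrate_mm, "length_mm": 9.5 * substrate_mm}

def simulate_gain(dimensions):
    """Stand-in for simulating the designed antenna in a CEM tool."""
    return 7.1  # dBi, pretend solver output for this design

def validate(spec_gain_dbi, tol_db=0.5):
    """Design for a spec, simulate the result, compare against the spec."""
    dims = design_patch(spec_gain_dbi, substrate_mm=1.0, impedance_ohm=100.0)
    achieved = simulate_gain(dims)
    passed = abs(achieved - spec_gain_dbi) <= tol_db
    return passed, achieved

passed, achieved = validate(7.0)
print(f"spec met: {passed}, simulated gain: {achieved} dBi")
```

In practice this loop would run over many combinations of design objectives, with a failed check sent back to the engineer rather than patched by the tester.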

Me: What do you do if you don’t get what you expect?

Dan: Well, I don’t try to fix it myself! I inform the engineer about the problem, and he has to fix it and resubmit the design for validation. Sometimes this process goes through a couple of reworks or iterations before I am happy that the design works as expected.

Me: So you use CEM analysis to test the design, but how do you know that the problem isn’t with the simulation model you are using?

Dan: This is what makes it interesting. The engineers do compare their simulation models against literature and measurements for specific cases, but that doesn’t come close to covering the whole Antenna Magus design range! I analyse hundreds of designs with multiple CEM tools, using different CEM methods over a wide range of design objectives, and compare the results across tools and methods. If all the simulated results match, I am confident that the simulation models are good and that any unexpected results are due to problems with the design algorithm. If there isn’t good correlation between the different simulation tools and methods, I first ask the engineer to check the models for problems before I carry on with the validation.

Me: Oh, so that is why you said the design and model validation are interdependent.

Dan: Exactly. An interesting case is the E/H plane horn. Initially the design did not pass, due to greater-than-expected variation in the beamwidth (up to ±4 dB at the design frequency!). The simulated results were identical for different CEM methods in both FEKO and CST MICROWAVE STUDIO, so I knew the problem had to be with the design approach. After the engineers spent a lot of time searching for the mistake, they realised that diffractions inside the waveguide could indeed cause such a variation and that it was not easy to compensate for this in the design. They made a couple of changes, and we were able to get the performance of the algorithm to be acceptable for a first-order design.

Me: Wow! That is interesting.

Dan: Yes it was very interesting. For some cases this effect was insignificant but the validation routine showed that there are cases where the diffractions cannot be ignored in the design approach.

Me: How many simulations have you done to validate the designs and models of the 101 antennas currently in Antenna Magus?

Dan: Hahaha! I stopped counting a long time ago. For some antennas I have run over 2000 simulations; others validate more quickly, but I would say an average of 1300 runs per antenna. Remember, some antennas have multiple models per export tool.

Me: That works out to a total of more than 130 000 simulations!

Dan: Well, I actually think that is quite a conservative estimate, as it doesn’t include all the simulations that have to be repeated when errors are found in models and design algorithms.

Me: I never realised that so much effort goes into the validation process, but it does put me at ease to know that everything is so thoroughly tested. Thank you for your time, Dan, and for helping to ensure that Antenna Magus gives working antenna designs and simulation models that really are ready to run!

Dan: It is a pleasure!

Author: Robert Kellerman
