I have a strategy that uses a few filters, namely OsMA, Stochastic, RSI and CCI. Besides each indicator's parameters, I have an on/off input for each filter:
input bool use_osma = true;
input bool use_sto  = true;
input bool use_rsi  = true;
input bool use_cci  = true;
input int  fOs   = 120;
input int  sOs   = 70;
input int  sigOs = 45;
input int  K     = 19;
input int  D     = 7;
input int  S     = 11;
input int  RSIp  = 2;
input int  CCIp  = 4;
Now when I optimise the strategy, I also want to check which combination of filters works best. The problem is that when one of the filters is turned off (say, for example, the OsMA), the optimiser still runs passes with different OsMA settings, wasting unnecessary testing time.
So is there a way to skip these passes over an indicator's settings when that indicator is turned off?
I can’t think of a way to do it with INIT_PARAMETERS_INCORRECT.
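For context, the pattern usually suggested with that return code is to reject a pass from `OnInit()` whenever a disabled filter's parameters deviate from some fixed baseline (here I use the defaults from the inputs above as the baseline; that choice is arbitrary). This is a minimal sketch, not a full solution:

```mql5
int OnInit()
  {
   // During optimisation only: if a filter is switched off, accept exactly
   // one combination of its parameters (the defaults) and discard the rest.
   if(MQLInfoInteger(MQL_OPTIMIZATION))
     {
      if(!use_osma && (fOs!=120 || sOs!=70 || sigOs!=45))
         return(INIT_PARAMETERS_INCORRECT);
      if(!use_sto  && (K!=19 || D!=7 || S!=11))
         return(INIT_PARAMETERS_INCORRECT);
      if(!use_rsi  && RSIp!=2)
         return(INIT_PARAMETERS_INCORRECT);
      if(!use_cci  && CCIp!=4)
         return(INIT_PARAMETERS_INCORRECT);
     }
   // ... normal initialisation ...
   return(INIT_SUCCEEDED);
  }
```

Note the caveat: with a full (slow complete) optimisation the tester still enumerates the rejected combinations, it just abandons each one before processing any ticks, so the saving is large but not total; with genetic optimisation rejected passes also steer the search away from those regions.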