If we use interpreter.SetDefaultNumberType(DefaultNumberType.Decimal), we can't access default library functions such as string-related or Math-related methods
#311 · Closed · ATEEKGIT opened this issue on Jul 19, 2024 · 3 comments · Fixed by #313
If we use interpreter.SetDefaultNumberType(DefaultNumberType.Decimal); as in the code below:
interpreter = new Interpreter().SetDefaultNumberType(DefaultNumberType.Decimal);
then evaluating the following expression against a loaded string variable strVariable fails:
strVariable.Substring(0, 40)
If we create the interpreter with interpreter = new Interpreter() instead, the same expression evaluates fine.
I have tried to add references for the string and Math libraries as below, but it still does not work; it only works if we remove the SetDefaultNumberType option.
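For reference, a minimal sketch of the failing setup described above (the sample value and the plainInterpreter name are illustrative, not from the original report):
using DynamicExpresso;

// Illustrative repro of the reported behaviour.
var interpreter = new Interpreter().SetDefaultNumberType(DefaultNumberType.Decimal);
interpreter.SetVariable("strVariable", new string('x', 50)); // sample value, not from the report

// Throws: the literals 0 and 40 are typed as decimal, and string has no Substring(decimal, decimal) overload.
var result = interpreter.Eval("strVariable.Substring(0, 40)");

// Works when the default number type is left unchanged, as described in the report.
var plainInterpreter = new Interpreter();
plainInterpreter.SetVariable("strVariable", new string('x', 50));
var ok = plainInterpreter.Eval("strVariable.Substring(0, 40)");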
Thank you @ATEEKGIT for the bug report. Do you think it is possible for you to create a PR with a unit test that demonstrates the bug? Otherwise I will try to work on it in the future.
Thanks for the attention. Unfortunately I am not very familiar with GitHub and just came here to report the bug, as it was a major issue for us in production. The problem can be reproduced with just two lines of code (see the sketch after this list):
a. Create an instance of the interpreter as below:
interpreter = new Interpreter().SetDefaultNumberType(DefaultNumberType.Decimal);
b. Try to evaluate any function of Math, String or DateTime; it will not work.
Adding a reference to any library to the interpreter does not help either.
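A sketch of the same failure with a Math call; this is my own example, not from the report, and it assumes Math.Pow fails for the same reason (the integer literals are typed as decimal, which does not implicitly convert to double):
using DynamicExpresso;

var mathInterpreter = new Interpreter().SetDefaultNumberType(DefaultNumberType.Decimal);

// Presumably throws for the same reason as the Substring case:
// 2 and 8 are typed as decimal, and Math.Pow(double, double) has no decimal overload.
var pow = mathInterpreter.Eval("Math.Pow(2, 8)");

// Casting the literals to a type the method accepts should work,
// mirroring the (int) cast workaround shown later in this thread.
var powOk = mathInterpreter.Eval("Math.Pow((double)2, (double)8)");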
// Sample value assumed for the test; any string starting with "AA" works here.
var a = "AABB";
var interpreter2 = new Interpreter().SetDefaultNumberType(DefaultNumberType.Decimal);
interpreter2.SetVariable("a", a);
// Expected to throw: the literals 0 and 2 are typed as decimal, and Substring is not defined for decimal arguments.
Assert.Throws<NoApplicableMethodException>(() => interpreter2.Eval("a.Substring(0, 2)"));
// It works if we cast the arguments back to int.
Assert.AreEqual("AA", interpreter2.Eval("a.Substring((int)0, (int)2)"));
Why do you use SetDefaultNumberType in your scenario?
Can someone help with this issue? We need to set the default number type and still have access to the standard libraries, extension methods, etc.
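Until the fix referenced above (#313) is available, the workaround grounded in this thread is to cast integer-valued arguments explicitly; another possible stopgap (my suggestion, not from the thread) is to keep a second interpreter without the decimal default for expressions that call string, Math or DateTime members:
using DynamicExpresso;

// Workaround from this thread: cast the integer arguments explicitly.
var decimalInterpreter = new Interpreter().SetDefaultNumberType(DefaultNumberType.Decimal);
decimalInterpreter.SetVariable("strVariable", "some long text value");
var part = decimalInterpreter.Eval("strVariable.Substring((int)0, (int)10)");

// Possible stopgap (not from the thread): a plain interpreter for library-heavy expressions,
// which evaluates them fine because literals default to int.
var plainInterpreter = new Interpreter();
plainInterpreter.SetVariable("strVariable", "some long text value");
var part2 = plainInterpreter.Eval("strVariable.Substring(0, 10)");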