I know you’ve addressed aspects of this topic before, but despite searching the forums I’m still not clear on why explicit test macro variables such as %::TESTID::Reply% cannot be used in another test’s parameters. There may be a good reason, but would you mind reviewing the following scenario and revisiting this?
For several years we’ve been using a two-test structure that we built from recommendations on this forum, e.g. in http://www.ks-soft.net/cgi-bin/phpBB/vi ... php?t=4265. The two tests work together to let a Text Log test scan a log file produced by a program that generates random file names.
OUR CURRENT IMPLEMENTATION
First we use Test 1 (a Shell Script test), TestID 478:
It scans a folder for the most recently modified file and returns that filename in its %Reply% value. We then run an HM Script action profile that copies this test’s %Reply% into a %udv_filename% user-defined variable.
Our Text Log test then uses d:\logs\%udv_filename% as its target, with "Translate macros" enabled, so the most recent log file is scanned. It works perfectly, except that we have cluttered up our UDV list with values we don’t really need to see.
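For reference, the logic our Test 1 shell script performs can be sketched in Python. This is only a minimal illustration of the "find the most recently modified file" step; the function name is my own, and the real test is a shell script, not Python:

```python
import os

def newest_file(folder):
    """Return the name of the most recently modified file in `folder`,
    mimicking what the Test 1 shell script does before HostMonitor
    stores the result in %udv_filename%. Returns None if the folder
    contains no files."""
    names = [f for f in os.listdir(folder)
             if os.path.isfile(os.path.join(folder, f))]
    if not names:
        return None
    # Pick the entry with the largest modification timestamp.
    return max(names, key=lambda f: os.path.getmtime(os.path.join(folder, f)))
```

The returned name is what ends up appended to d:\logs\ in the Text Log test parameter.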
PROPOSED REVISED SOLUTION
Recently I had to set this up again and thought: why am I going through the extra effort of saving the value in a UDV when test 478’s reply is already available in %::478::Reply%? So I created my new Text Log test and put d:\logs\%::478::reply% in its "File to test" parameter.
As I see now from reviewing the forum again, it’s not supposed to work, and I get an "RMS: Test Error" response.
I’m unclear on why this isn’t allowed. If it were, I could eliminate an HM Script and an action profile, as well as reduce clutter in our UDV list. And since I don’t let all staff create scripts or action profiles, but they can create tests, they could be more self-sufficient. Lastly, I don’t need these results to be persistently stored. I think allowing this would add flexibility to test creation and simplify configurations.
Thanks so much for your consideration.
FYI: I’m running beta release 10.51f.
Using Explicit Test Macro Variables in Test Parameters
Maybe it's better to use the %NewestFile% variable instead?
E.g.
File to test: D:\Logs\DB\%NewestFile%
"Translate macros" option should be marked
Please check for more details about %NewestFile% variable at:
http://www.ks-soft.net/hostmon.eng/mfra ... htm#macros
seelye wrote:I created my new Text Log test and used this test parameter: d:\logs\%::478::reply%. As I see now, by reviewing the forum again, it's not supposed to work and I get an "RMS: Test Error" response. I'm unclear why this isn't allowed?
One of the reasons: it's bad for performance.
When the test name, comment, and other fields depend only on static parameters (e.g. the target hostname or target file), HostMonitor does not need to rescan the test list and change test names after each test probe.
Regards
Alex
Okay, the performance impact is definitely a good reason to avoid this. If I understand correctly, the issue is that test parameter variables can be (and often are) used to generate the TestName and Test comment fields automatically, and it is the dynamic nature of those names and comments that would cause the performance degradation. Right?! Makes a lot of sense. And I suppose if someone did something crazy with tests referencing each other, they could create a loop with the variables. That would be bad!
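The "loop with the variables" worry above amounts to a cycle in a reference graph. A minimal sketch of how such a loop could be detected, modeling tests and their %::ID::Reply%-style references as a plain dict (the IDs and the model are hypothetical, not HostMonitor internals):

```python
def has_reference_cycle(refs):
    """refs maps a test ID to the set of test IDs its parameters
    reference via %::ID::Reply%-style macros (hypothetical model).
    Returns True if any chain of references loops back on itself,
    using a standard depth-first search with three node colors."""
    WHITE, GRAY, BLACK = 0, 1, 2   # unvisited, in progress, done
    color = {t: WHITE for t in refs}

    def visit(t):
        color[t] = GRAY
        for nxt in refs.get(t, ()):
            if color.get(nxt, WHITE) == GRAY:
                return True        # reached a test still "in progress": a cycle
            if color.get(nxt, WHITE) == WHITE and visit(nxt):
                return True
        color[t] = BLACK
        return False

    return any(visit(t) for t in refs if color[t] == WHITE)
```

For example, test 478 referencing a test that in turn references 478 would be flagged, while a one-way chain would not.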
So I'm back to using dynamic UDVs? I'm not using more than a handful right now, but I can see where I may want to use more to store data that I'd like to pass into test parameters. If I created 100 or 1000 UDVs and had tests running an HM Script every 3 minutes that updated them, would that cause a performance issue? I haven't tried this yet, and so far my HostMonitor seems to be performing well, but is there anything I should watch out for? Also, are UDVs stored in the HML file?
I have another question about the performance impact of using test macro variables. I've started to build some complex logic into the dependency expressions of what I call "Intelligent Tests". One of these has a dependency expression that references six other tests, so that we can send a very precise alarm for a specific condition. It's hard to explain in an email but, suffice it to say, I'm starting to build more of these "Intelligent Alerts". Each is really nothing more than a PING ::1 test with the alert reversed so it is always BAD; because it has an expression dependency, the test waits on its master until the other six tests align into a predictable yet complex failure pattern. This keeps things clean, and it's really cool to be able to report a human-readable status rather than the individual sensor values. However, if I build a few hundred of these, will I create a performance issue? Or does reading test macro variables in a dependency expression not cause a problem?
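To make the idea concrete, the dependency expression of one such "Intelligent Test" can be sketched as a predicate over the six referenced tests' statuses. This is a hedged Python model, not HostMonitor expression syntax, and every test name and status pattern below is invented for illustration:

```python
def intelligent_alert(status):
    """status maps test names (hypothetical) to their current state.
    The alert fires only when all six sensors match one specific
    failure pattern, like the dependency expression described above."""
    pattern = {
        "web_ping":  "Bad",
        "db_ping":   "Bad",
        "disk_free": "Ok",
        "cpu_load":  "Ok",
        "service_a": "Bad",
        "service_b": "Ok",
    }
    # True only when every sensor is in exactly the expected state.
    return all(status.get(name) == want for name, want in pattern.items())
```

Until the pattern matches, the reversed PING ::1 test would simply keep waiting on its master expression.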
Thanks again for bearing with my questions.
seelye wrote:So I'm back to using dynamic UDV's?
As I understand it, some application (or applications) generates a lot of files using different name patterns, and all these files must be stored in the same folder?
You cannot change the name patterns?
You cannot use separate folders for different kinds of files?
You cannot move old (checked/processed) files to another folder?
seelye wrote:If I created 100 or 1000 UDVs and had tests running an HMS every 3 minutes that updated the UDV, would that cause a performance issue? Also, are UDV stored in the HML?
100 should not be a problem. 1000 is probably OK as well, but that would need testing...
seelye wrote:I've started to build some complex logic within the dependency expressions of what I call "Intelligent Tests". One of these Intelligent Tests had a dependency expression that references six other tests so that we can send a very precise alarm for a specific condition. If I build a few hundred of these, will I create a performance issue? Or does reading these Test Macro Variables in a dependency expression not cause a problem?
Version 10 processes such expressions faster, so a few hundred should be OK.
However, it depends on many factors.
How many tests do you have? How many folders?
How many different expressions are used? Does each test use a unique "master" expression, so there would be a few hundred different expressions?
Regards
Alex