My ODBC Logging Setup
-
- Posts: 76
- Joined: Wed Mar 21, 2007 5:51 am
- Location: Klarup (Aalborg), Denmark
Just wanted to share my ODBC logging setup. Please note this is a work in progress and the data is from a development/test system:
Part 1:
The table in SQL 2005 - the Database is called HostMonitor:
USE [HostMonitor]
GO
/****** Object: Table [dbo].[HMLOG] Script Date: 04/24/2007 14:39:42 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_PADDING ON
GO
CREATE TABLE [dbo].[HMLOG](
[EVENTTIME] [varchar](100) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
[TESTNAME] [varchar](200) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
[STATUS] [varchar](100) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
[REPLY] [varchar](200) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
[TESTMETHOD] [varchar](100) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
[EVENTTIMESTAMP] [datetime] NOT NULL,
[TESTID] [nvarchar](100) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
[INTERVAL] [nvarchar](100) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
[ALERTTHRESHOLD] [varchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
[SIMPLESTATUS] [varchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
[HOSTADDR] [varchar](100) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
[FOLDER] [varchar](200) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
[FULLPATH] [varchar](200) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
[ACKNOWLEDGEDAT] [varchar](200) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[ACKNOWLEDGEDBY] [varchar](100) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[ACKCOMMENT] [varchar](500) COLLATE SQL_Latin1_General_CP1_CI_AS NULL
) ON [PRIMARY]
GO
SET ANSI_PADDING OFF
Notice the EVENTTIMESTAMP column has the actual insert time set in it; it is used for correct sorting, so do create indexes! The varchar sizes can also be improved a bit.
I tried MS SQL timestamps on an SQL Server running on top of VMware ESX - this did not work for some reason, so I use GetDate() instead - which is also human-readable and easy to handle. Check out http://support.microsoft.com/kb/931279
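Following the indexing advice above, a minimal sketch could look like this (the index names and the second index's key columns are my own assumptions, not from the original setup):

```sql
-- Supports range filters and ORDER BY on the timestamp column.
CREATE NONCLUSTERED INDEX IX_HMLOG_EventTimestamp
    ON dbo.HMLOG (EVENTTIMESTAMP);

-- Supports per-test lookups sorted by time.
CREATE NONCLUSTERED INDEX IX_HMLOG_TestName_EventTimestamp
    ON dbo.HMLOG (TESTNAME, EVENTTIMESTAMP);
```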
In HostMonitor Options -> Backup Log -> Database I have this SQL query (remember to set up the ODBC source as well):
Insert into hmlog (eventtime, testname, status, reply, testmethod,testid,interval,alertthreshold,simplestatus,hostaddr,folder,fullpath,eventtimestamp,acknowledgedat,acknowledgedby,ackcomment) VALUES ('%DateTime%', '%TestName%', '%Status%', LEFT('%Reply%',199), '%TestMethod%','%TestID%','%Interval%','%AlertThreshold%','%SimpleStatus%','%HostAddr%','%Folder%','%FullPath%',GetDate(),'%AcknowledgedAt% ','%AcknowledgedBy% ',LEFT('%AckComment%',499))
Notice the LEFT function is implemented because output from the "Text Log" test can be very long, and if it is longer than the column can handle, an ODBC error is raised (and that is bad if syslog.htm is monitored, because the ODBC error text itself is quite long... think it over).
I log all events in the database; logging only status changes would make it impossible to track trends/values (via the REPLY column).
Well, that takes care of logging to the ODBC Source.
Part 2:
To make sure that logging continues, an ODBC test reports the time since the last entry in the table:
select top 1 DATEDIFF(s, eventtimestamp,getdate()) from sla Order by eventtimestamp desc
This gives the difference in seconds between now and the last entry; in my case (full logging) it must be at most 30 seconds. If the test fails, check logging to the ODBC source. Also, use an IP address instead of the server name in the ODBC setup, so that logging does not fail when DNS does.
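An equivalent variant of the freshness check (my own suggestion, not from the original setup) that lets the server use a simple MAX instead of a TOP 1 sort:

```sql
-- Seconds since the newest row; returns NULL if the table is empty.
SELECT DATEDIFF(s, MAX(eventtimestamp), GETDATE()) AS SecondsSinceLastEntry
FROM dbo.HMLOG;
```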
Also, I have various checks on the MS SQL Server itself: service running, agent running, disks not filling up - and watch out for a growing transaction log, a common MS SQL mistake.
Part 3:
Over the table HMLOG I have some practical Views, both for the LogAnalyzer and various Web-Applications:
USE [HostMonitor]
GO
/****** Object: View [dbo].[LogAnalyzer1Day] Script Date: 04/24/2007 14:50:37 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE VIEW [dbo].[LogAnalyzer1Day]
AS
SELECT TESTNAME, STATUS, REPLY, TESTMETHOD, EVENTTIMESTAMP, EVENTTIME, ACKNOWLEDGEDAT, ACKNOWLEDGEDBY
FROM dbo.HMLOG
WHERE (EVENTTIMESTAMP > GETDATE() - 1)
GO
I have 1Day, 2Days, 3Days, 1Week, 2Weeks, etc. - all that needs to change is GETDATE()-Z, where Z is the number of days.
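For example, a 1Week variant would be identical to the 1Day view above apart from the offset (a sketch following the same pattern):

```sql
CREATE VIEW [dbo].[LogAnalyzer1Week]
AS
SELECT TESTNAME, STATUS, REPLY, TESTMETHOD, EVENTTIMESTAMP, EVENTTIME, ACKNOWLEDGEDAT, ACKNOWLEDGEDBY
FROM dbo.HMLOG
WHERE (EVENTTIMESTAMP > GETDATE() - 7)
GO
```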
I also have some very important tests that are used for calculating SLAs (Service Level Agreements) for customers. These are all in a folder called "0SLA", and similar to the above I have views like this:
USE [HostMonitor]
GO
/****** Object: View [dbo].[SLA] Script Date: 04/24/2007 14:52:11 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE VIEW [dbo].[SLA]
AS
SELECT EVENTTIME, TESTNAME, STATUS, REPLY, TESTMETHOD, EVENTTIMESTAMP, TESTID, INTERVAL, ALERTTHRESHOLD, SIMPLESTATUS, HOSTADDR,
FOLDER, FULLPATH
FROM dbo.HMLOG
WHERE (FULLPATH = 'Root\0SLA\')
Part 4:
A few web-app inspirations (with a little pseudo code) I have made over the HMLOG table - the moment the test data is in a SQL database, the possibilities are endless:
Monitoring actual status for each test in the "0SLA" folder:
Get all tests:
sql="SELECT DISTINCT TestName FROM SLA ORDER BY TestName"
Run a loop with this:
sql="SELECT TOP 1 Status FROM LogAnalyzer1Day WHERE TestName='" & TestName & "' ORDER BY EVENTTIMESTAMP DESC"
This will give the current status for the test named in the variable TestName.
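The two queries above can also be collapsed into one round trip (my own sketch; CROSS APPLY requires SQL Server 2005 or later):

```sql
-- Latest status per distinct test name, in a single query.
SELECT t.TESTNAME, s.STATUS
FROM (SELECT DISTINCT TESTNAME FROM dbo.SLA) AS t
CROSS APPLY (
    SELECT TOP 1 STATUS
    FROM dbo.LogAnalyzer1Day AS l
    WHERE l.TESTNAME = t.TESTNAME
    ORDER BY l.EVENTTIMESTAMP DESC
) AS s;
```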
The PDA version for our QTEK devices:
A detailed view over a test:
Part 5:
For most of the views, I have made an ODBC Log like this:
I prefer to keep filtering and the like in the SQL database, but that is of course a matter of opinion.
Part 6:
May be continued here... posting of actual ASP and/or .NET files etc.
-
- Posts: 76
- Joined: Wed Mar 21, 2007 5:51 am
- Location: Klarup (Aalborg), Denmark
Well, the story continues - since my upgrade to 6.80 yesterday, my database looks like this:
USE [HostMonitor]
GO
/****** Object: Table [dbo].[HMLOG] Script Date: 05/01/2007 10:16:18 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_PADDING ON
GO
CREATE TABLE [dbo].[HMLOG](
[EVENTTIME] [varchar](100) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
[TESTNAME] [varchar](200) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
[STATUS] [varchar](100) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
[REPLY] [varchar](8000) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
[TESTMETHOD] [varchar](100) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
[EVENTTIMESTAMP] [datetime] NOT NULL,
[TESTID] [nvarchar](100) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
[INTERVAL] [nvarchar](100) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
[ALERTTHRESHOLD] [varchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
[SIMPLESTATUS] [varchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
[HOSTADDR] [varchar](100) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
[FOLDER] [varchar](200) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
[FULLPATH] [varchar](200) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
[ACKNOWLEDGEDAT] [varchar](200) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
[ACKNOWLEDGEDBY] [varchar](100) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
[ACKCOMMENT] [varchar](500) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
[HTTPPAGE] [varchar](8000) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
[HTTPCODE] [varchar](20) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
[HTTPHEADER] [varchar](300) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
[ROWID] [uniqueidentifier] NOT NULL
) ON [PRIMARY]
GO
SET ANSI_PADDING OFF
Most URL tests are changed to HTTP tests, as I want to capture the HttpHeader, HttpCode and HttpPage into the table. A ROWID column is also created, so each test event has a unique identifier. This makes the SQL query in HostMonitor look like this:
Insert into hmlog (rowid,eventtime, testname, status, reply, testmethod,testid,interval,alertthreshold,simplestatus,hostaddr,folder,fullpath,eventtimestamp,acknowledgedat,acknowledgedby,ackcomment,httppage,httpcode,httpheader) VALUES (NEWID(),'%DateTime%', '%TestName%', '%Status%', LEFT('%Reply%',7999), '%TestMethod%','%TestID%','%Interval%','%AlertThreshold%','%SimpleStatus%','%HostAddr%','%Folder%','%FullPath%',GetDate(),'%AcknowledgedAt% ','%AcknowledgedBy% ',LEFT('%AckComment%',499),LEFT('%HttpPage%',7999),LEFT('%HttpCode%',19),LEFT('%HttpHeader%',299))
To utilize this and find problems, I have created a drill-down system as a web app:
Step 1 - Select a Folder
Step 2 - Select a Test
Step 3 - Select a specific test-event in the test
Step 4 - Display all gathered info on the test:
A nice feature is that by right-clicking any test - via a "Custom Menu profile" - a web page appears with the last 1000 test events for the test:
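A sketch of the query behind such a page (the %TestID% macro would be resolved by HostMonitor; a web app would pass it as a parameter instead):

```sql
-- Last 1000 events for one test, newest first.
SELECT TOP 1000 EVENTTIME, STATUS, REPLY, HTTPCODE, ACKNOWLEDGEDBY
FROM dbo.HMLOG
WHERE TESTID = '%TestID%'
ORDER BY EVENTTIMESTAMP DESC;
```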
-
- Posts: 2832
- Joined: Tue May 16, 2006 4:41 am
- Contact:
-
- Posts: 76
- Joined: Wed Mar 21, 2007 5:51 am
- Location: Klarup (Aalborg), Denmark
Thanks
Just wanted to feed back that this really helped me get started with logging to SQL. The biggest challenge I was facing was data types for the macros, but I can see that you just dumped it all as text and then did any numerical manipulation within SQL, which has worked out all the issues I had been facing.
Thanks again.
Paul
-
- Posts: 76
- Joined: Wed Mar 21, 2007 5:51 am
- Location: Klarup (Aalborg), Denmark
Glad to hear that. At the moment I am actually running on a MySQL database, due to different circumstances, but there is not much difference.
I had to adjust logging levels though; my initial ambition was full logging on all tests, but that produced 500,000 rows every day, which killed my databases after a week, when there could be over 4 million rows...
So my default logging level is now "write a record when test status changes + midnight logging" - and for some tests I run the "reply" logging level.
Please also notice that using the HTTP test instead of the URL test gives more feedback in the macros, like the header and HTTP body.
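If fuller logging is ever needed again, a scheduled purge keeps the table bounded (a sketch; the 90-day retention window is my own assumption):

```sql
-- Delete log rows older than 90 days; run e.g. nightly via SQL Server Agent.
DELETE FROM dbo.HMLOG
WHERE EVENTTIMESTAMP < GETDATE() - 90;
```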
-
- Posts: 76
- Joined: Wed Mar 21, 2007 5:51 am
- Location: Klarup (Aalborg), Denmark
Well, you won't believe it - now we are on Oracle.
So from MS SQL -> MySQL -> Oracle - been there, done that.
A few weeks ago we switched to Oracle, mainly due to the fact that we are using Oracle in production, and I don't have to maintain the database (as I would with MS SQL and MySQL).
So now we are on a virtual Oracle (VMware ESX underneath, Red Hat Enterprise 4 and Oracle 10g) - and the performance is pretty good.
The HM setup now covers 4 physical locations and approx. 1500 tests.
So I just wanted to show my updated Webinterface, using data from HM stored in the Oracle Database:
-
- Posts: 2832
- Joined: Tue May 16, 2006 4:41 am
- Contact:
mos-eisley wrote: So now we are on a virtual Oracle (VMware ESX underneath, Red Hat Enterprise 4 and Oracle 10g) - and the performance is pretty good. The HM setup now covers 4 physical locations and approx. 1500 tests.
Thank you for the feedback! This information is important for us!
mos-eisley wrote: So I just wanted to show my updated Webinterface, using data from HM stored in the Oracle Database
Wow! Looks awesome! Thank you, your work is good advertising of HostMonitor's power.
BTW. We are going to release a new command line utility: LogVisualizer. LogVisualizer may use an HTML, text, dbf or ODBC log to build a chart and save it into a specified file (.gif). The chart is built for a certain period of time for a particular test. So, you will be able to integrate LogVisualizer into your system to show a graphical representation of the logged data.
Regards,
Max
Please help me write an SQL query to create a report like yours - raw SLA measurement. I want to get these columns in the result table:
1)start datetime
2)end datetime
3)difference
4)from status
5)to status
6)translated (UP or DOWN)
No matter which time period and test name - that I create myself. In the DB I write a standard row:
1)testname
2)status
3)eventtime
4)reply
5)eventid
What do you do with the statuses 'Checking' and 'Disable'/'Enable'?
MS SQL 2005 | Working variant:
IF OBJECT_ID('tempdb..#101') IS NOT NULL DROP TABLE #101;
Select EventTime,Status,Simple_Status,Reply,TestId
Into #101
From HMLog
--Where Convert(Char(10),EventTime,104)='17.11.2010' And TestId=2113;
Where TestId=2313;
--Where Convert(Char(10),EventTime,104)='17.11.2010';
--Select * From #101 Order by 1;
IF OBJECT_ID('tempdb..#102') IS NOT NULL DROP TABLE #102;
Select EventTimeBeg,StatusBeg,EventTimeEnd,
(Select Status From #101 Where TestId=sel.TestId And EventTime=sel.EventTimeEnd And Status<>sel.StatusBeg) StatusEnd,
Simple_Status,TestId
Into #102
From (
Select a.EventTime EventTimeBeg,a.Status StatusBeg,Min(b.EventTime) EventTimeEnd,a.Simple_Status,a.TestId
From #101 a,#101 b
Where Substring(a.Reply,1,6)<>'ACK BY'
And Substring(b.Reply,1,6)<>'ACK BY'
And a.TestId=b.TestId
And a.Status<>b.Status
And a.EventTime<=b.EventTime
Group by a.EventTime,a.Status,a.Simple_Status,a.TestId
) Sel;
--Select * From #102 Order by 1;
IF OBJECT_ID('tempdb..#103') IS NOT NULL DROP TABLE #103;
Select v1.EventTimeBeg,v1.EventTimeEnd,DATEDIFF(s,v1.EventTimeBeg,v1.EventTimeEnd) Duration,
v1.TestId,v1.StatusBeg,v1.StatusEnd,v1.Simple_Status,sel.EventTimeReply,sel.Reply,
DATEDIFF(s,v1.EventTimeBeg,sel.EventTimeReply) DurationACK
Into #103
From #102 v1 Left join (
--Select Max(v.EventTimeEnd) EventTimeEnd,r.EventTimeReply,r.Reply,r.TestId
Select Min(v.EventTimeEnd) EventTimeEnd,r.EventTimeReply,r.Reply,r.TestId
From (
Select EventTime EventTimeReply,LTrim(Substring(Reply,7,Len(Reply)-6)) Reply,Status,TestId From #101 Where Substring(Reply,1,6)='ACK BY'
) r,#102 v
--Where r.EventTimeReply>=v.EventTimeEnd
--And r.Status=v.StatusEnd
Where r.EventTimeReply<=v.EventTimeEnd
And r.Status=v.StatusBeg
Group by r.EventTimeReply,r.Reply,r.TestId
) sel On v1.TestId=sel.TestId And v1.EventTimeEnd=sel.EventTimeEnd;
Select EventTimeBeg,EventTimeEnd,Duration "Duration,s",TestId,StatusBeg,StatusEnd,Simple_Status,EventTimeReply "EventTimeACK",Reply "ACK by",DurationACK From #103
--where Simple_Status = 'DOWN'
Order by 1;
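Building on the #103 result set above, a per-test availability percentage over the observed window could be derived like this (my own sketch, charging DOWN interval durations against the total observed span):

```sql
-- Availability % per test: 1 - (total DOWN seconds / observed seconds).
SELECT TestId,
       100.0 * (1.0 - 1.0 * SUM(CASE WHEN Simple_Status = 'DOWN' THEN Duration ELSE 0 END)
                      / NULLIF(DATEDIFF(s, MIN(EventTimeBeg), MAX(EventTimeEnd)), 0)) AS AvailabilityPct
FROM #103
GROUP BY TestId;
```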