prewikka.diff

NEWS
----
as well as message summary (require twisted.names and twisted.internet),
see the additional dns_max_delay settings parameter in prewikka.conf.

- In the alert summary view, handle portlist and ip_version service fields,
  and show alert messageid.

- Fix exception when rendering ToolAlert.
...
- Only perform the additional database request when using Sensor localtime:
  this brings a performance improvement of about 36% on aggregated queries,
  when using either frontend localtime (the default) or UTC time.

- JQuery support: port most of the javascript code to make use of JQuery.
  Add show/hide effect to CSS popups. More filtering functionality in the
  SensorListing view.

- Cleanup the Authentication class, so that the upper Prewikka layer can act
  depending on whether the backend supports user creation / deletion. Anonymous
...
- Better integration of CGI authentication, allowing user listing and deletion.

- Report template exceptions directly to the user.

- Fix exception if an alert analyzer name is empty.
...
  (which is a minor issue since the user is already authenticated). Thanks
  to Helmut Azbest <helmut.azbest@gmail.com> for the fix.

- Fix a typo making mod_python use the parent method (patch from
  Helmut Azbest <helmut.azbest@gmail.com>).

- In the configuration file, recognize sections even if there is whitespace
  at the beginning of the line.

- Localization fixes, by Sebastien Tricaud <toady@gscore.org>, and
  Bjoern Weiland.
...
* 2007-05-26, prewikka-0.9.11.2:

- In case a database schema upgrade is required, or the Prewikka
  database does not exist, make the error available from the Prewikka
  console, rather than exiting badly (which previously required the
  user to parse the web server log in order to find out the problem).

* 2007-05-25, prewikka-0.9.11.1:
...
- Fix incorrect locale switch when accessing certain pages.

* 2007-05-21, prewikka-0.9.11:

- Prewikka has been internationalized: users may choose the language
  used in their settings tab. Additionally, you may specify
  a default locale using the "default_locale" configuration keyword.

- Brazilian Portuguese translation, by Edelberto Franco Silva <edeunix@edeunix.com>.
- French translation, by Sebastien Tricaud <sebastien@gscore.org>.
- German translation, by Bjoern Weiland <mail@bjou.de>.
- Russian translation, by Valentin Bogdanov <bogdanov.valentin@gmail.com>.
- Spanish translation, by Carlo G. Añez M. <carlo.anez@gmail.com>.

- New powerful and scalable agent view, grouping agents together by
  Location and Node.

- In the Alert/Heartbeat summary view, number analyzers backward so that
  it reflects the ordering in the analyzer list.

- Improved support for resizing menus.

- Fix a konqueror rendering bug with the inline filter.

- Various bug fixes.

* 2007-04-05, prewikka-0.9.10:

- Don't show all sources and targets when they reach a predefined limit;
  instead, provide an expansion link.

- Add two new views in the Events section: CorrelationAlert and ToolAlert.

- Ability to filter and aggregate on all IDMEF paths. If the filtered path is
  an enumeration, automatically provide the list of possible values.

- Add a combo box for the user to choose which criteria operator to use.

- Provide an enumeration filter for the type of alert (Alert, CorrelationAlert,
  ToolAlert, OverflowAlert).

- Prewikka can now aggregate by analyzer.

- When a session expires and the user logs in, the user is redirected to the page
  they attempted to access when the session expired.

- When an error occurs, the default Prewikka layout is now preserved.

- Correct handling of empty values for hash key generation. Fix #204.

- Use the new libpreludedb function that returns the results as well as the number
  of results. This avoids using COUNT() in some places (namely, this speeds up
  non-aggregated views by ~50%).

- Avoid iterating the list of database results more than needed.

- Support the IDMEF Action, SNMPService, and WebService classes.

- Improved support for small screen resolutions.

* 2007-02-06, prewikka-0.9.9:

- Improve database performance by reducing the number of queries. (Paul Robert Marino)

- Activate CleanOutput filtering (lots of escaping fixes).

- More action logging.

- Bug fixes with the error pages' Back/Retry buttons.

- Fix error on group by user (#191).

- Fix template compilation error with Cheetah version 2 (#184).

* 2006-11-23, prewikka-0.9.8:

- Save/load user configuration when using CGI authentication mode (#181).

- Show the Prewikka version in the About page (#177).

- Use the Python logging facility (available backends: stderr, file, smtp,
  syslog); multiple simultaneous handlers supported (#113).

- Fix anonymous authentication.
...
* 2006-08-18, prewikka-0.9.7.1:

- Fix filter interface bug introduced in 0.9.7.

- Improved error reporting on filter creation.
...
* 2006-08-16, prewikka-0.9.7:

- Use preludedb_delete_(alert|heartbeat)_from_list(). Requires
  libpreludedb 0.9.9. Provides a deletion performance improvement
  of around 3000%.

- Handle multiple listed sources/targets properly. Separate
  source/target in the message listing.

- Make the host command/Information link available from the Sensor
  listing.

- Always take care of the "external_link_new_window" configuration
  parameter.

- Make external command handling more generic. Allow specifying
  command line arguments.

- Allow defining an unlimited number of external commands rather than
  only a defined subset (fix #134).

- Avoid toggling several popups at once in the HeartbeatListing.
...
- New address and node name lookup provided through the prelude-ids.com service.

- Link to the new prelude-ids.com port lookup instead of the broken portsdb
  database (fix #162).

- Various bug fixes.
...
- Show multiple sources/targets in message listing/summary.

- Fix invalid use of socket.inet_ntoa() to read the ICMP Gateway Address,
  which is stored as a string (#156).

- Fix aggregation on IDMEF-Paths that are not strings.
...
- Intelligent display for CorrelationAlert. Include correlated
  alert information in the alert listing.

- Intelligent printing of network-centric information.

- Fix Cheetah compilation for the heartbeat page.
...
- Distribute the SQLite schema.

- Fix exception in the heartbeat analysis view when the heartbeat_count
  or heartbeat_error_margin settings are explicitly set (#124).

- Fix Cheetah 1.0 heartbeat listing exception (#119).

- Open external links in new windows by default. Add a configuration option
  to disable opening external links in a new window (#61).

- Provide the ability to specify the configuration file that Prewikka
  uses (#117).

- Sanitize the limit parameter in case the input value is not correct,
  instead of triggering an exception (#118).

- Handle the preludeDB "file" setting (for use with SQLite-like databases).

- Fix filter saving issue in the heartbeat listing.
...
- Add an "Unlimited" timeline option.

- Fix classification escaping problem that could lead to an empty
  listing when unwinding alerts with classification text containing a backslash.

- Don't print an unnecessary separator when the protocol field is
  empty in the alert listing.

- Improve Correlation Alert display. Allow focus both on the Correlation Alert
  summary and on the correlated alert listing.

- Don't propagate the "save" parameter, so that the user doesn't end up saving
  settings without knowing about it.

* 2005-11-30, prewikka-0.9.1:

- Resolve the protocol number from the message summary view.

- Separate port and protocol values, so that we don't end up
  linking the protocol to portdb if there is no port.

- Ability to set up IDMEF filters using iana_protocol_name and iana_protocol_number.
...
- Sanitize timeline year values on systems which do not support times
  exceeding 2^31-1. Fix #104.

- Mark CorrelationAlert explicitly in the AlertListing.

- Make the inline filter mark more visible.

- Ability for the user to save settings for the current view.
...
- Fix a bug where clicking the IP address popup would cause
  Firefox to go back to the top of the page. Fix #112.

- Don't hardcode the path to /usr/bin/python, but resort to
  /usr/bin/env to find it.
...
- Minor rendering fix.

- Handle service.iana_protocol_name / service.iana_protocol_number
  as well as service.protocol.
...
- Show the target file in the message listing.

- Much more information in the alert summary view.
  Especially useful for users of integrity checkers.
...
- XHTML conformance in most of the code.

- Fix possible exception with filtered classification text.

- Allow filtering on heartbeat.analyzer.name.
...
  Fix Javascript warnings. Correct URL escaping. Make it work
  better in Apple's Safari browser.

- More error checking when saving custom filters. Error out in case a
  filter references non-existing criteria. Add the substr operator.

- Fix bug in the whole alert/heartbeat navigation system; simplify
  and clean up the code; always report the current filtered field 'action' to
  the user.

- Make the mouse pointer behave like it does for javascript links on the Alert
  listing table head.

- Fix alert mixup when expanding an aggregated classification with different
  severities.

- Fix low/mid/high/none severity filtering.

- Fix a bug where agents with multiple addresses would disappear.
...
- Add a heartbeat_error_margin configuration keyword.

- Saving modifications to an existing filter now works.

- Make prewikka.cgi catch exceptions that are raised during the prewikka
  initialization step and display an error screen to the user instead of
  a server internal error.
...
- Update the Auth cookie expiration time.

- Fix escaping issue.

* 2005-04-05, prewikka-0.9.0-rc4:
...
  login for no apparent reason.

- Set default session expiration time to 60 minutes.

* 2005-03-31, prewikka-0.9.0-rc3:

- Installation cleanup / bugfix.
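The 0.9.10 entry above notes that Prewikka moved to a libpreludedb call that returns both the result set and its row count, removing the need for separate COUNT() queries. A minimal Python sketch of that idea, mirroring the lazily-cached DbResult wrapper seen later in this diff (all names here are illustrative, not the actual libpreludedb API):

```python
class LazyResult:
    """Result wrapper whose total length is known upfront.

    Sketch only: the backend is assumed to hand back (rows, count) in a
    single round trip, so len(result) never issues a separate COUNT()
    query. Names are illustrative, not the libpreludedb API.
    """

    def __init__(self, cursor, count):
        self._cursor = iter(cursor)
        self._len = count
        self._rows = []  # rows fetched so far, replayed on re-iteration

    def __len__(self):
        return self._len  # no extra database query

    def __iter__(self):
        for row in self._rows:    # replay the cache first
            yield row
        for row in self._cursor:  # then drain the cursor, caching as we go
            self._rows.append(row)
            yield row


# Usage: the count arrives with the rows, so len() is free.
result = LazyResult(cursor=[("alert", 1), ("alert", 2)], count=2)
print(len(result))   # 2, without re-running the query
print(list(result))  # [('alert', 1), ('alert', 2)]
```

Rows are only pulled from the cursor as they are consumed, which matches the changelog's "avoid iterating the list of database results more than needed".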
prewikka/IDMEFDatabase.py | ||
---|---|---|
36 | 36 |
class IDMEFTime(object): |
37 | 37 |
def __init__(self, res): |
38 | 38 |
self._res = res |
39 |
|
|
39 | ||
40 | 40 |
def __del__(self): |
41 | 41 |
idmef_time_destroy(self._res) |
42 | 42 | |
... | ... | |
45 | 45 | |
46 | 46 |
def __int__(self): |
47 | 47 |
return idmef_time_get_sec(self._res) |
48 |
|
|
48 | ||
49 | 49 |
def __float__(self): |
50 | 50 |
return float(idmef_time_get_sec(self._res)) + float(idmef_time_get_usec(self._res)) / 10 ** 6 |
51 |
|
|
51 | ||
52 | 52 |
def toYMDHMS(self): |
53 | 53 |
return time_to_ymdhms(time.localtime(idmef_time_get_sec(self._res))) |
54 |
|
|
54 | ||
55 | 55 |
def __getattribute__(self, name): |
56 | 56 |
if name is "sec": |
57 | 57 |
return idmef_time_get_sec(self._res) |
... | ... | |
98 | 98 |
except KeyError: |
99 | 99 |
return None |
100 | 100 | |
101 |
|
|
102 |
|
|
101 | ||
102 | ||
103 | 103 |
class Message: |
104 | 104 |
def __init__(self, res, htmlsafe): |
105 | 105 |
self._res = res |
106 | 106 |
self._value_list = None |
107 | 107 |
self._htmlsafe = htmlsafe |
108 |
|
|
108 | ||
109 | 109 |
def __del__(self): |
110 | 110 |
idmef_message_destroy(self._res) |
111 | 111 | |
... | ... | |
115 | 115 |
def __iter__(self): |
116 | 116 |
if not self._value_list: |
117 | 117 |
raise TypeError, "iteration over a non-sequence" |
118 |
|
|
118 | ||
119 | 119 |
self._list_iterator = 0 |
120 | 120 |
return self |
121 | 121 | |
... | ... | |
124 | 124 |
return idmef_value_get_count(self._value_list) |
125 | 125 | |
126 | 126 |
return 1 |
127 |
|
|
127 | ||
128 | 128 |
def next(self): |
129 | 129 |
next = idmef_value_get_nth(self._value_list, self._list_iterate) |
130 | 130 |
if not next: |
... | ... | |
132 | 132 | |
133 | 133 |
value = self._convert_value(next, self._root + "(%d)" % self._list_iterate) |
134 | 134 |
self._list_iterate += 1 |
135 |
|
|
135 | ||
136 | 136 |
return value |
137 | 137 | |
138 | 138 |
def _convert_value(self, idmef_value, key): |
... | ... | |
143 | 143 |
value._value_list = idmef_value |
144 | 144 |
if self._value_list: |
145 | 145 |
idmef_value_ref(idmef_value) |
146 |
|
|
146 | ||
147 | 147 |
elif idmef_value_get_type(idmef_value) != IDMEF_VALUE_TYPE_CLASS: |
148 | 148 |
value = convert_idmef_value(idmef_value) |
149 | 149 |
if not self._value_list: |
150 | 150 |
idmef_value_destroy(idmef_value) |
151 |
|
|
151 | ||
152 | 152 |
else: |
153 | 153 |
if not self._value_list: |
154 | 154 |
idmef_value_destroy(idmef_value) |
155 |
|
|
155 | ||
156 | 156 |
value = Message(idmef_message_ref(self._res), self._htmlsafe) |
157 | 157 |
value._root = key |
158 | 158 | |
159 | 159 |
return value |
160 |
|
|
160 | ||
161 | 161 |
def _get_raw_value(self, key): |
162 | 162 |
path = idmef_path_new_fast(key) |
163 | 163 |
idmef_value = idmef_path_get(path, self._res) |
164 |
|
|
164 | ||
165 | 165 |
if idmef_value: |
166 | 166 |
ret = self._convert_value(idmef_value, key) |
167 | 167 |
else: |
168 |
if idmef_path_is_list(path, -1):
|
|
168 |
if idmef_path_is_ambiguous(path):
|
|
169 | 169 |
ret = [] |
170 | 170 |
else: |
171 | 171 |
ret = None |
172 |
|
|
172 | ||
173 | 173 |
idmef_path_destroy(path) |
174 | 174 |
return ret |
175 | 175 | |
... | ... | |
181 | 181 |
return escape_value(self._get_raw_value(key)) |
182 | 182 |
else: |
183 | 183 |
return self._get_raw_value(key) |
184 |
|
|
184 | ||
185 | 185 |
def match(self, criteria): |
186 | 186 |
if type(criteria) is list: |
187 | 187 |
criteria = " && ".join(criteria) |
... | ... | |
192 | 192 | |
193 | 193 |
return ret |
194 | 194 | |
195 |
def get(self, key, default=None, htmlsafe=None):
|
|
195 |
def get(self, key, default=None, htmlsafe=None): |
|
196 | 196 |
if htmlsafe != None: |
197 | 197 |
htmlsafe_bkp = self._htmlsafe |
198 | 198 |
self._htmlsafe = htmlsafe |
199 |
|
|
200 |
val = self[key]
|
|
199 | ||
200 |
val = self[key] |
|
201 | 201 |
if val == None: |
202 | 202 |
val = default |
203 |
|
|
203 | ||
204 | 204 |
if htmlsafe != None: |
205 | 205 |
self._htmlsafe = htmlsafe_bkp |
206 |
|
|
206 | ||
207 | 207 |
return val |
208 | 208 | |
209 | 209 |
def getAdditionalData(self, searched, many_values=False): |
... | ... | |
213 | 213 |
meaning = self["%s.additional_data(%d).meaning" % (self._root, i)] |
214 | 214 |
if meaning is None: |
215 | 215 |
break |
216 |
|
|
216 | ||
217 | 217 |
if meaning == searched: |
218 | 218 |
value = self["%s.additional_data(%d).data" % (self._root, i)] |
219 |
|
|
219 | ||
220 | 220 |
if not many_values: |
221 | 221 |
return value |
222 |
|
|
222 | ||
223 | 223 |
values.append(value) |
224 | 224 | |
225 | 225 |
i += 1 |
... | ... | |
252 | 252 |
self._rows = [ ] |
253 | 253 |
self._has_cache = False |
254 | 254 |
self._res, self._len = results |
255 |
|
|
255 | ||
256 | 256 |
def __iter__(self): |
257 | 257 |
if self._has_cache: |
258 | 258 |
return iter(self._rows) |
259 | 259 |
else: |
260 | 260 |
return self |
261 |
|
|
261 | ||
262 | 262 |
def __len__(self): |
263 | 263 |
return self._len |
264 |
|
|
264 | ||
265 | 265 |
def __del__(self): |
266 | 266 |
if self._res: |
267 | 267 |
self._db_delete(self._res) |
268 |
|
|
268 | ||
269 | 269 |
def __getitem__(self, key): |
270 | 270 |
if isinstance(key, types.SliceType): |
271 | 271 |
start, stop, step = key.start, key.stop, key.step |
272 | 272 |
index = start + stop |
273 | 273 |
else: |
274 | 274 |
index = key |
275 |
|
|
275 | ||
276 | 276 |
if not self._has_cache: |
277 | 277 |
for r in self: |
278 | 278 |
if len(self._rows) >= index: |
279 | 279 |
break |
280 |
|
|
280 | ||
281 | 281 |
return self._rows[key] |
282 |
|
|
283 |
def next(self):
|
|
282 | ||
283 |
def next(self): |
|
284 | 284 |
if self._res == None: |
285 | 285 |
raise StopIteration |
286 | 286 | |
... | ... | |
292 | 292 |
raise StopIteration |
293 | 293 | |
294 | 294 |
row = self._db_convert_row(values) |
295 |
|
|
295 | ||
296 | 296 |
self._rows.append(row) |
297 | 297 |
return row |
298 |
|
|
298 | ||
299 | 299 | |
300 | 300 |
class DbResultValues(DbResult): |
301 | 301 |
def __init__(self, selection, results): |
302 | 302 |
self._selection = selection |
303 | 303 |
DbResult.__init__(self, results) |
304 |
|
|
304 | ||
305 | 305 |
def _db_get_next(self): |
306 | 306 |
return preludedb_result_values_get_next(self._res) |
307 |
|
|
307 | ||
308 | 308 |
def _db_delete(self, result): |
309 | 309 |
if self._selection: |
310 | 310 |
preludedb_path_selection_destroy(self._selection) |
311 |
|
|
311 | ||
312 | 312 |
if result: |
313 | 313 |
preludedb_result_values_destroy(result) |
314 |
|
|
314 | ||
315 | 315 |
def _db_convert_row(self, values): |
316 | 316 |
row = [] |
317 | 317 |
for value in values: |
... | ... | |
319 | 319 |
row.append(None) |
320 | 320 |
else: |
321 | 321 |
row.append(convert_idmef_value(value)) |
322 |
idmef_value_destroy(value)
|
|
323 |
|
|
322 |
idmef_value_destroy(value) |
|
323 | ||
324 | 324 |
return row |
325 |
|
|
325 | ||
326 | 326 |
class DbResultIdents(DbResult): |
327 | 327 |
def _db_get_next(self): |
328 | 328 |
return preludedb_result_idents_get_next(self._res) |
329 |
|
|
329 | ||
330 | 330 |
def _db_delete(self, result): |
331 | 331 |
if result: |
332 | 332 |
preludedb_result_idents_destroy(result) |
333 |
|
|
333 | ||
334 | 334 |
def _db_convert_row(self, value): |
335 | 335 |
return value |
336 | 336 | |
337 | 337 |
class IDMEFDatabase: |
338 | 338 |
_db_destroy = preludedb_destroy |
339 | 339 |
_db = None |
340 |
|
|
340 | ||
341 | 341 |
def __init__(self, config): |
342 | 342 |
settings = preludedb_sql_settings_new() |
343 | 343 |
for param in "file", "host", "port", "name", "user", "pass": |
... | ... | |
360 | 360 |
raise "libpreludedb %s or higher is required (%s found)." % (wanted_version, cur) |
361 | 361 |
else: |
362 | 362 |
raise "libpreludedb %s or higher is required." % wanted_version |
363 |
|
|
363 | ||
364 | 364 |
self._db = preludedb_new(sql, None) |
365 | 365 | |
366 | 366 |
def __del__(self): |
367 | 367 |
if self._db: |
368 | 368 |
self._db_destroy(self._db) |
369 |
|
|
369 | ||
370 | 370 |
def _getMessageIdents(self, get_message_idents, criteria, limit, offset, order_by): |
371 | 371 |
if type(criteria) is list: |
372 | 372 |
if len(criteria) == 0: |
373 | 373 |
criteria = None |
374 | 374 |
else: |
375 | 375 |
criteria = " && ".join(criteria) |
376 |
|
|
376 | ||
377 | 377 |
if criteria: |
378 | 378 |
criteria = idmef_criteria_new_from_string(criteria) |
379 | 379 | |
380 | 380 |
idents = [ ] |
381 |
|
|
381 | ||
382 | 382 |
if order_by == "time_asc": |
383 | 383 |
order_by = PRELUDEDB_RESULT_IDENTS_ORDER_BY_CREATE_TIME_ASC |
384 | 384 |
else: |
385 | 385 |
order_by = PRELUDEDB_RESULT_IDENTS_ORDER_BY_CREATE_TIME_DESC |
386 |
|
|
387 |
try:
|
|
386 | ||
387 |
try: |
|
388 | 388 |
result = get_message_idents(self._db, criteria, limit, offset, order_by) |
389 | 389 |
except: |
390 | 390 |
self._freeDbParams(criteria=criteria) |
391 | 391 |
raise |
392 |
|
|
392 | ||
393 | 393 |
if criteria: |
394 | 394 |
idmef_criteria_destroy(criteria) |
395 |
|
|
395 | ||
396 | 396 |
if not result: |
397 |
return [ ]
|
|
398 |
|
|
397 |
return [ ] |
|
398 | ||
399 | 399 |
return DbResultIdents(result) |
400 |
|
|
400 | ||
401 | 401 |
def getAlertIdents(self, criteria=None, limit=-1, offset=-1, order_by="time_desc"): |
402 | 402 |
return self._getMessageIdents(preludedb_get_alert_idents2, criteria, limit, offset, order_by) |
403 | 403 | |
... | ... | |
406 | 406 | |
407 | 407 |
def _getLastMessageIdent(self, type, get_message_idents, analyzerid): |
408 | 408 |
criteria = None |
409 |
if analyzerid != None: |
|
410 |
criteria = "%s.analyzer(-1).analyzerid == '%s'" % (type, str(analyzerid)) |
|
409 |
if analyzerid is not False: |
|
410 |
if analyzerid is None: |
|
411 |
criteria = "! %s.analyzer(-1).analyzerid" % (type) |
|
412 |
else: |
|
413 |
criteria = "%s.analyzer(-1).analyzerid == '%s'" % (type, str(analyzerid)) |
|
411 | 414 | |
412 | 415 |
idents = get_message_idents(criteria, limit=1) |
413 | 416 | |
414 | 417 |
return idents[0] |
415 | 418 | |
416 |
def getLastAlertIdent(self, analyzer=None):
|
|
419 |
def getLastAlertIdent(self, analyzer=False):
|
|
417 | 420 |
return self._getLastMessageIdent("alert", self.getAlertIdents, analyzer) |
418 | 421 | |
419 |
def getLastHeartbeatIdent(self, analyzer=None):
|
|
422 |
def getLastHeartbeatIdent(self, analyzer=False):
|
|
420 | 423 |
return self._getLastMessageIdent("heartbeat", self.getHeartbeatIdents, analyzer) |
421 | 424 | |
422 | 425 |
def getAlert(self, ident, htmlsafe=False): |
... | ... | |
442 | 445 |
def _freeDbParams(self, selection=None, criteria=None): |
443 | 446 |
if selection: |
444 | 447 |
preludedb_path_selection_destroy(selection) |
445 |
|
|
448 | ||
446 | 449 |
if criteria: |
447 | 450 |
idmef_criteria_destroy(criteria) |
448 |
|
|
451 | ||
449 | 452 |
def getValues(self, selection, criteria=None, distinct=0, limit=-1, offset=-1): |
450 | 453 |
if type(criteria) is list: |
451 | 454 |
if len(criteria) == 0: |
452 | 455 |
criteria = None |
453 | 456 |
else: |
454 | 457 |
criteria = " && ".join([ "(" + c + ")" for c in criteria ]) |
455 |
|
|
458 | ||
456 | 459 |
if criteria: |
457 | 460 |
criteria = idmef_criteria_new_from_string(criteria) |
458 |
|
|
461 | ||
459 | 462 |
my_selection = preludedb_path_selection_new() |
460 | 463 |
for selected in selection: |
461 | 464 |
my_selected = preludedb_selected_path_new_string(selected) |
... | ... | |
466 | 469 |
except: |
467 | 470 |
self._freeDbParams(my_selection, criteria) |
468 | 471 |
raise |
469 |
|
|
472 | ||
470 | 473 |
if criteria: |
471 |
idmef_criteria_destroy(criteria)
|
|
472 |
|
|
474 |
idmef_criteria_destroy(criteria) |
|
475 | ||
473 | 476 |
if not result: |
474 | 477 |
 			preludedb_path_selection_destroy(my_selection)
 			return [ ]

 		return DbResultValues(my_selection, result)

 	def _countMessages(self, root, criteria):
 		return self.getValues(["count(%s.create_time)" % root], criteria)[0][0]

 	def countAlerts(self, criteria=None):
 		return self._countMessages("alert", criteria)
...
 			index += 1
 			analyzer_paths.append(path)

-		return analyzer_paths
+		return analyzer_paths

 	def getAnalyzer(self, analyzerid):
 		ident = self.getLastHeartbeatIdent(analyzerid)
 		heartbeat = self.getHeartbeat(ident)

-		index = 0
-		while True:
-			if not heartbeat["heartbeat.analyzer(%d).name" % (index + 1)]:
-				break
-			index += 1
-
-		analyzer = { }
-		analyzer["analyzerid"] = analyzerid
-		analyzer["name"] = heartbeat.get("heartbeat.analyzer(%d).name" % index)
-		analyzer["model"] = heartbeat.get("heartbeat.analyzer(%d).model" % index)
-		analyzer["version"] = heartbeat.get("heartbeat.analyzer(%d).version" % index)
-		analyzer["class"] = heartbeat.get("heartbeat.analyzer(%d).class" % index)
-		analyzer["ostype"] = heartbeat.get("heartbeat.analyzer(%d).ostype" % index)
-		analyzer["osversion"] = heartbeat.get("heartbeat.analyzer(%d).osversion" % index)
-		analyzer["node_name"] = heartbeat.get("heartbeat.analyzer(%d).node.name" % index)
-		analyzer["node_location"] = heartbeat.get("heartbeat.analyzer(%d).node.location" % index)
-
-		i = 0
-		analyzer["node_addresses"] = [ ]
-		while True:
-			address = heartbeat.get("heartbeat.analyzer(%d).node.address(%d).address" % (index, i))
-			if not address:
-				break
-			analyzer["node_addresses"].append(address)
-			i += 1
-
-		analyzer["last_heartbeat_time"] = heartbeat.get("heartbeat.create_time")
-		analyzer["last_heartbeat_interval"] = heartbeat["heartbeat.heartbeat_interval"]
-		analyzer["last_heartbeat_status"] = heartbeat.getAdditionalData("Analyzer status")
-
-		return analyzer
+		path = []
+		analyzer = {}
+		analyzerd = { "path": path, "node_addresses": [], "node_name": None, "node_location": None }
+
+		for a in heartbeat["analyzer"]:
+			path.append(a["analyzerid"])
+			analyzer = a
+
+		for column in "analyzerid", "name", "model", "version", "class", "ostype", "osversion":
+			analyzerd[column] = analyzer.get(column, None)
+
+		analyzerd["node_name"] = analyzer.get("node.name")
+		analyzerd["node_location"] = analyzer.get("node.location")
+
+		for addr in analyzer.get("node.address.address", []):
+			analyzerd["node_addresses"].append(addr)
+
+		analyzerd["last_heartbeat_time"] = heartbeat.get("heartbeat.create_time")
+		analyzerd["last_heartbeat_interval"] = heartbeat.get("heartbeat.heartbeat_interval")
+		analyzerd["last_heartbeat_status"] = heartbeat.getAdditionalData("Analyzer status")
+
+		return analyzerd
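The new getAnalyzer body above walks the heartbeat's analyzer list once instead of probing indexed IDMEF paths in a while loop. A minimal sketch of that flattening step, using plain dicts as a stand-in for the real IDMEF heartbeat object (the `heartbeat` structure and the `flatten_analyzer` name are illustrative, not prewikka API):

```python
def flatten_analyzer(heartbeat):
    """Collect the analyzer path and the attributes of the emitting analyzer.

    `heartbeat` is a plain-dict stand-in for the IDMEF heartbeat message:
    its "analyzer" list runs from the topmost relay down to the sensor
    that actually emitted the heartbeat.
    """
    path = []
    analyzer = {}
    analyzerd = {"path": path, "node_addresses": [],
                 "node_name": None, "node_location": None}

    for a in heartbeat["analyzer"]:
        path.append(a["analyzerid"])
        analyzer = a  # after the loop, this is the last (emitting) analyzer

    for column in ("analyzerid", "name", "model", "version",
                   "class", "ostype", "osversion"):
        analyzerd[column] = analyzer.get(column)  # missing fields become None

    return analyzerd


hb = {"analyzer": [{"analyzerid": 1, "name": "relay"},
                   {"analyzerid": 2, "name": "prelude-lml", "model": "Prelude LML"}]}
info = flatten_analyzer(hb)
print(info["path"], info["name"], info["model"])
```

The path records every relay the heartbeat passed through, while the displayed attributes come only from the final list entry, the sensor itself.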
prewikka/templates/SensorListing.tmpl

 		});

 		\$(".fieldset_toggle2").click(function(){
-			\$(this).prev().find(".fieldset_toggle").click();
-			return false;
+			\$(this).prev().find(".fieldset_toggle").click();
+			return false;
 		});

 		\$("td.offline, td.online, td.missing, td.unknown").click(function(){
prewikka/templates/utils.tmpl

   <tbody>
    <tr class="table_row_even">
     <td>#echo $analyzer.name or "n/a" #</td>
-    <td>
+    <td>
      #if $analyzer.model
       $analyzer.model
       #if $analyzer.version
...
      n/a
      #end if
     </td>
-    <td>
+    <td>
      #if $analyzer.ostype
       $analyzer.ostype
       #if $analyzer.osversion
...
      n/a
      #end if
     </td>
-    <td>#echo $analyzer.node_name or "n/a" #</td>
+    <td>#echo $analyzer.node_name or "n/a" #</td>
     <td>#echo $analyzer.node_location or "n/a" #</td>
-    <td>
+    <td>
      #if len($analyzer.node_addresses) > 0
      #for $address in $analyzer.node_addresses
       $address<br/>
...
     #else
      n/a
     #end if
-    </td>
+    </td>
    </tr>
   </tbody>
  </table>
prewikka/views/sensor.py

 	if time.time() - int(heartbeat_time) > int(heartbeat_interval) + error_margin:
 		return "missing", _("Missing")

 	return "online", _("Online")
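The check above classifies a sensor as "missing" when its latest heartbeat is older than the advertised heartbeat interval plus a configurable error margin. A standalone sketch of that rule (the `heartbeat_status` name and the injectable `now` parameter are mine, added for testability; the surrounding view also handles other statuses not visible in this fragment):

```python
import time


def heartbeat_status(last_heartbeat_time, heartbeat_interval,
                     error_margin=3, now=None):
    """Classify a sensor from its most recent heartbeat.

    A sensor whose last heartbeat is older than its advertised interval
    plus an error margin is considered missing; otherwise it is online.
    """
    if now is None:
        now = time.time()
    if now - int(last_heartbeat_time) > int(heartbeat_interval) + error_margin:
        return "missing"
    return "online"


now = 1000.0
print(heartbeat_status(900, 60, error_margin=3, now=now))  # 100s > 63s -> "missing"
print(heartbeat_status(950, 60, error_margin=3, now=now))  # 50s <= 63s -> "online"
```

The margin absorbs scheduling jitter and clock skew, so a heartbeat that arrives a few seconds late does not immediately flag the sensor as missing.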

 def analyzer_cmp(x, y):
 	xmiss = x["status"] == "missing"
 	ymiss = y["status"] == "missing"

 	if xmiss and ymiss:
 		return cmp(x["name"], y["name"])

 	elif xmiss or ymiss:
 		return ymiss - xmiss

 	else:
 		return cmp(x["name"], y["name"])

 def node_cmp(x, y):
 	xmiss = x["missing"]
 	ymiss = y["missing"]

 	if xmiss or ymiss:
-		return ymiss - xmiss
+		return ymiss - xmiss
 	else:
 		return cmp(x["node_name"], y["node_name"])
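analyzer_cmp and node_cmp above are Python 2 three-way comparators: they push missing sensors to the front of the listing and otherwise order entries alphabetically. The same ordering can be expressed as a sort key (a sketch using the field names from the diff; this is not prewikka code):

```python
def analyzer_sort_key(analyzer):
    # False sorts before True, so missing analyzers come first;
    # ties then break alphabetically by name, as in analyzer_cmp.
    return (analyzer["status"] != "missing", analyzer["name"])


analyzers = [
    {"status": "online", "name": "prelude-lml"},
    {"status": "missing", "name": "snort"},
    {"status": "online", "name": "auditd"},
]
analyzers.sort(key=analyzer_sort_key)
print([a["name"] for a in analyzers])  # -> ['snort', 'auditd', 'prelude-lml']
```

Key-based sorting replaces `cmp` in Python 3; `functools.cmp_to_key` would also adapt the original comparators unchanged.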
...
 	def init(self, env):
 		self._heartbeat_count = int(env.config.general.getOptionValue("heartbeat_count", 30))
 		self._heartbeat_error_margin = int(env.config.general.getOptionValue("heartbeat_error_margin", 3))
-

 	def render(self):
 		analyzers = { }
...

 		locations = { }
 		nodes = { }

-		for analyzer_path in self.env.idmef_db.getAnalyzerPaths():
-			analyzerid = analyzer_path[-1]
+		for analyzerid in self.env.idmef_db.getAnalyzerids():
 			analyzer = self.env.idmef_db.getAnalyzer(analyzerid)

 			parameters = { "analyzerid": analyzer["analyzerid"] }
 			analyzer["alert_listing"] = utils.create_link("sensor_alert_listing", parameters)
 			analyzer["heartbeat_listing"] = utils.create_link("sensor_heartbeat_listing", parameters)
...
 				analyzer["node_name_link"] = utils.create_link(self.view_name,
 									       { "filter_path": "heartbeat.analyzer(-1).node.name",
 										 "filter_value": analyzer["node_name"] })

 			if analyzer["node_location"]:
 				analyzer["node_location_link"] = utils.create_link(self.view_name,
 										   { "filter_path": "heartbeat.analyzer(-1).node.location",
 										     "filter_value": analyzer["node_location"] })

 			node_key = ""
 			for i in range(len(analyzer["node_addresses"])):
 				addr = analyzer["node_addresses"][i]
 				node_key += addr

 				analyzer["node_addresses"][i] = {}
 				analyzer["node_addresses"][i]["value"] = addr
 				analyzer["node_addresses"][i]["inline_filter"] = utils.create_link(self.view_name,
...
 							   utils.create_link("Command",
 									     { "origin": self.view_name,
 									       "command": command, "host": addr })))

 			analyzer["status"], analyzer["status_meaning"] = \
 				get_analyzer_status_from_latest_heartbeat(analyzer["last_heartbeat_status"],
 									  analyzer["last_heartbeat_time"],
...

 			analyzer["last_heartbeat_time"] = utils.time_to_ymdhms(time.localtime(int(analyzer["last_heartbeat_time"]))) + \
 							  " %+.2d:%.2d" % utils.get_gmt_offset()

 			node_location = analyzer["node_location"] or _("Node location n/a")
 			node_name = analyzer.get("node_name") or _("Node name n/a")
 			osversion = analyzer["osversion"] or _("OS version n/a")
 			ostype = analyzer["ostype"] or _("OS type n/a")
 			addresses = analyzer["node_addresses"]

 			node_key = node_name + osversion + ostype

 			if not locations.has_key(node_location):
 				locations[node_location] = { "total": 1, "missing": 0, "unknown": 0, "offline": 0, "online": 0, "nodes": { } }
 			else:
...
 				locations[node_location]["nodes"][node_key] = { "total": 1, "missing": 0, "unknown": 0, "offline": 0, "online": 0,
 										"analyzers": [ ],
 										"node_name": node_name, "node_location": node_location,
-										"ostype": ostype, "osversion": osversion,
+										"ostype": ostype, "osversion": osversion,
 										"node_addresses": addresses, "node_key": node_key }
 			else:
 				locations[node_location]["nodes"][node_key]["total"] += 1

 			status = analyzer["status"]
 			locations[node_location][status] += 1
 			locations[node_location]["nodes"][node_key][status] += 1
...
 				locations[node_location]["nodes"][node_key]["analyzers"].insert(0, analyzer)
 			else:
 				locations[node_location]["nodes"][node_key]["analyzers"].append(analyzer)

 		self.dataset["locations"] = locations


 class SensorMessagesDelete(SensorListing):
 	view_name = "sensor_messages_delete"
...
 		if self.parameters.has_key("heartbeats"):
 			criteria = "heartbeat.analyzer(-1).analyzerid == %d" % long(analyzerid)
 			self.env.idmef_db.deleteHeartbeat(self.env.idmef_db.getHeartbeatIdents(criteria))

 		SensorListing.render(self)


...
 	def init(self, env):
 		self._heartbeat_count = int(env.config.general.getOptionValue("heartbeat_count", 30))
 		self._heartbeat_error_margin = int(env.config.general.getOptionValue("heartbeat_error_margin", 3))

 	def render(self):
 		analyzerid = self.parameters["analyzerid"]

 		analyzer = self.env.idmef_db.getAnalyzer(analyzerid)
 		analyzer["last_heartbeat_time"] = str(analyzer["last_heartbeat_time"])
 		analyzer["events"] = [ ]
 		analyzer["status"] = "abnormal_offline"
 		analyzer["status_meaning"] = "abnormal offline"

 		start = time.time()
 		idents = self.env.idmef_db.getHeartbeatIdents(criteria="heartbeat.analyzer(-1).analyzerid == %d" % analyzerid,
 							      limit=self._heartbeat_count)
...
 			analyzer["events"].append({ "value": "sensor is down since %s" % older_time, "type": "down"})
 			if newer:
 				event = None

 				if newer_status == "starting":
 					if older_status == "exiting":
 						event = { "value": "normal sensor start at %s" % str(newer_time),
...
 					if abs(int(newer_time) - int(older_time) - int(older_interval)) > self._heartbeat_error_margin:
 						event = { "value": "abnormal heartbeat interval between %s and %s" % (str(older_time), str(newer_time)),
 							  "type": "abnormal_heartbeat_interval" }


 				if newer_status == "exiting":
 					event = { "value": "normal sensor stop at %s" % str(newer_time),