Fixed the urlfilter bug #0000552

The config previously contained all filter destinations, which is why squidGuard
exhausted memory in each filter process even when a filter destination was not
used by any rule.

All destinations were added because the time-constraint rules had no way to add
the destinations they needed themselves, so the config parser simply added every
destination.

Now the config parser checks whether the destination is enabled in general; if
not, it checks whether the destination is needed by a time constraint.
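A minimal sketch of that decision, written in Python rather than the parser's Perl, with made-up data shapes (the real code reads comma-separated time-constraint records; the field names here are illustrative only):

```python
def destination_enabled(category, filter_settings, time_constraints):
    """Decide whether a blacklist category should be written to the
    squidGuard config: keep it if it is enabled in general, otherwise
    only if an active time constraint references it.
    (Illustrative sketch, not the actual IPFire code.)"""
    if filter_settings.get("FILTER_" + category.upper()) == "on":
        return True
    # Each hypothetical constraint: (list of categories, active flag).
    return any(active and category in categories
               for categories, active in time_constraints)

# Only "ads" is enabled in general; "porn" is kept by an active time rule.
settings = {"FILTER_ADS": "on"}
constraints = [(["porn"], True), (["violence"], False)]
print(destination_enabled("ads", settings, constraints))       # True
print(destination_enabled("porn", settings, constraints))      # True
print(destination_enabled("violence", settings, constraints))  # False
```

With this check, a destination that is neither enabled globally nor referenced by any active time constraint is skipped entirely, so squidGuard never loads its domain lists.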
This commit is contained in:
root
2010-02-05 13:27:11 +01:00
parent 3675d563b8
commit 8f451574dd


@@ -2,7 +2,7 @@
###############################################################################
# #
# IPFire.org - A linux based firewall #
-# Copyright (C) 2007 Michael Tremer & Christian Schmidt #
+# Copyright (C) 2010 Michael Tremer & Christian Schmidt #
# #
# This program is free software: you can redistribute it and/or modify #
# it under the terms of the GNU General Public License as published by #
@@ -2973,7 +2973,24 @@ sub writeconfigfile
	foreach $category (@categories) {
		$blacklist = $category;
		$category =~ s/\//_/g;
#		if ( $filtersettings{"FILTER_".uc($category)} ne "on" ){next;}
		if ( $filtersettings{"FILTER_".uc($category)} ne "on" ){
			my $constraintrule = "false";
			foreach (@tclist){
				chomp;
				@tc = split(/\,/);
				$tc[13] =~ s/\//_/g;
				if ($tc[15] eq 'on' && $tc[13] =~ $category){
					$constraintrule = "true";
				}
			}
			if ( $constraintrule eq "false"){
				next;
			}
		}
		print FILE "dest $category {\n";
		if (-e "$dbdir/$blacklist/domains") {
			print FILE " domainlist $blacklist\/domains\n";
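For each category that survives the new check, the loop above writes a squidGuard destination block. Based on the `print FILE` statements shown, the output for a hypothetical `ads` category with a domain list on disk would begin like this (the diff is truncated here; a complete squidGuard `dest` block is closed with `}` and may also reference a `urllist`):

```
dest ads {
 domainlist ads/domains
```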