auto-learn and HTTP

wkuypers
Posts: 79
Joined: Mon Nov 28, 2005 4:34 am

auto-learn and HTTP

Post by wkuypers »

I have made a PHP page that answers in the body with "reject" or "accept" (and nothing else, no title, nothing), but the log file keeps telling me it is not able to validate. Is it OK to use a PHP page?
The query seems to be correct, of the form validate.php?domaine=xxx&user=zzz

What's wrong?

Thanks,

Willem
wkuypers
Posts: 79
Joined: Mon Nov 28, 2005 4:34 am

solution

Post by wkuypers »

I think I found the solution. The body and head were empty, but the tags themselves were still there. I eliminated them too, and it's OK now.
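For anyone hitting the same problem, here is a minimal sketch of the kind of page that appears to work. The address list and lookup are made up for illustration; the essential part is the output, which must be the bare word "accept" or "reject" with no <html>, <head>, or <body> tags around it:

```php
<?php
// Hypothetical sketch of validate.php, answering requests of the form
// validate.php?domaine=xxx&user=zzz. The response body is exactly
// "accept" or "reject" -- no HTML tags, no title, no extra markup.

$domain = isset($_GET['domaine']) ? $_GET['domaine'] : '';
$user   = isset($_GET['user'])    ? $_GET['user']    : '';

// Stand-in lookup; replace with a real check against your user list.
$known = array('willem@example.com', 'info@example.com');
$valid = in_array(strtolower($user . '@' . $domain), $known, true);

header('Content-Type: text/plain');
echo $valid ? 'accept' : 'reject';
```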
wkuypers
Posts: 79
Joined: Mon Nov 28, 2005 4:34 am

another question about autolearn

Post by wkuypers »

Wouldn't it be better to cache only the good email addresses, and not the bad ones? Two arguments:
If a client sends an email to an address by mistake, and that address is created afterwards, the client has to wait until the end of the caching period (min. 1 month) to get through.
Obviously the auto-learn feature is there so I don't get bothered by a client phoning to say the email isn't working (and so I don't have to delete the entry manually).

Another thing: one of the domains is subject to mail-bombing on false addresses (100 in 5 minutes, around the clock). If I activate the option for that domain, within a month there will be thousands of cached non-existent email addresses that will never be used again.

Thanks,
RollerNetSupport
Site Admin
Posts: 598
Joined: Wed Nov 17, 2004 10:05 pm
Location: Nevada
Contact:

Post by RollerNetSupport »

You're probably right about not caching negative responses. It is part of the design, so we can easily stop caching them. I'd wanted to let it go for a while to see if there was any point to caching the negative responses, but it seems there isn't for several reasons:

* Dictionary attacks rarely repeat the same addresses.

* Although auto-learn without negative caching will propagate the dictionary attacks to the end side, this will happen anyway. Blacklisting should be used to stop an attack in progress.

* Entries not in the table will still not be accepted when the final destination (or verification URL) is not available; only the cached positive responses actually matter in this case.

* Potential for large database tables, as you mentioned; though this isn't a concern in the short term, since the database regularly handles millions of log entries.

So, we'll probably look at only caching positive responses.
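To make the intended behavior concrete, here is a rough sketch of positive-only caching. The function names, the array-based cache, and the one-month TTL are illustrative assumptions, not Roller Network's actual implementation:

```php
<?php
// Sketch of positive-only auto-learn caching (illustrative only).

define('CACHE_TTL', 30 * 24 * 3600); // keep positives for about a month

// Stub standing in for the HTTP call to the verification URL; it would
// return 'accept', 'reject', or null on error/timeout.
function query_verification_url($address) {
    return ($address === 'willem@example.com') ? 'accept' : 'reject';
}

function verify_recipient($address, &$cache) {
    // A cached positive answer is trusted even when the final
    // destination or verification URL is down.
    if (isset($cache[$address]) && $cache[$address] > time()) {
        return true;
    }

    $result = query_verification_url($address);

    if ($result === 'accept') {
        // Only positive responses are learned, so the rejects from a
        // dictionary attack never fill up the table.
        $cache[$address] = time() + CACHE_TTL;
        return true;
    }

    // Rejects and errors are not cached; the next attempt asks again.
    return false;
}
```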
Technical Support support@rollernet.us
Roller Network LLC