I cannot say how much data is read every time someone opens a profile page, but since all information read from the hard drive is then sent to the client that requested it, the bandwidth generated is a pretty good indicator of the server load produced by the request.
Here is something I tried to tell you: the amount of data you get back is not proportional to CPU/HDD time; take the "search" function as an example. Yet you keep insisting that the amount of data is a good meter. No, it's not.
How do you know that I was not there?
This surely means you weren't there otherwise you would say so.
Searching for "forum lag" in this forum does not return any results.
There is lag right now, yet there is no topic about it (and your search would not find one). Are you done with your nonsense?
Is it my fault if you don't know the circumstances?
Is it my fault?
Or maybe I said it's your fault?
But I think that loading one thread creates a much higher server load, because much more content has to be read out of the database.
This content can and will be cached.
So when 260 people are online and at least half of them use the forum while they are online, this should result in much more read activity on the server's hard drives.
Take your time to understand how that number was calculated. Most of those people could be referred visitors who only view the main page. Also, that was in 2007; hint: different forum version and server configuration?
Yes, it doesn't rule it out, but it is quite different from what you would expect from the constant high server load a DoS attack typically produces. A more typical sign would be high response times or no response at all, continuously, not just for a few minutes from time to time.
It does not rule out a DoS attack; it just shows that it wasn't very effective. Also, nobody accused you of an intentional DoS attack; it was unintended, so the symptoms will vary.
I was NOT loading big chunks out of the database. Why do you think that? Have you ever seen a user profile page? For most profiles, no more than 1000 characters ≈ 1 KB has to be read. For every thread you load, more than ten times that amount has to be read in order to send all posts to the client.
80% of the traffic is the same thing over and over; data that is very rarely used is not cached. Also, it's funny how you think databases work.
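To illustrate the caching point: under an LRU (least-recently-used) policy, the threads everyone keeps opening stay in memory while rarely viewed profile pages get evicted and must be read from disk again. This is a minimal sketch of that idea; the class and key names are made up for illustration, and real database page caches are considerably more elaborate:

```python
from collections import OrderedDict


class LRUCache:
    """Tiny LRU cache: frequently requested pages stay, rarely used ones are evicted."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.store = OrderedDict()  # insertion order doubles as recency order

    def get(self, key):
        if key not in self.store:
            return None  # cache miss: would require a disk read
        self.store.move_to_end(key)  # mark as most recently used
        return self.store[key]

    def put(self, key, value):
        self.store[key] = value
        self.store.move_to_end(key)
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)  # evict least recently used entry


cache = LRUCache(capacity=2)
cache.put("thread1", "hot thread content")
cache.put("thread2", "warm thread content")
cache.get("thread1")                      # thread1 is now the most recent
cache.put("profile42", "rare profile")    # pushes out the least recent entry
```

After the last `put`, "thread2" has been evicted while the repeatedly accessed "thread1" survives, which is why hot forum threads can be served from cache while one-off profile lookups hit the disk.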
Most of the indicators you thought you had against me turned out to be uncertain or outright wrong, and from those uncertain indicators you drew a wrong conclusion and even officially stated that the forum lag was my fault!
Quit whining.
you didn't even apologize.
I don't like people that whine all the time.
But I would definitely never have publicly accused someone in an official forum while having nothing more than indicators.
You accuse me of being wrong, while you only have indicators suggesting it wasn't you DoSing it. You have no proof it wasn't you; you have no logs. I feel so wrongly accused, I should whine just like you.
I don't want this to turn into a discussion about personal things, and I won't reply to any responses to this, but please just think about what you post and what effect it might have on the user you are addressing.
Oh, the horror: I said something bad about you.
Seriously, quit whining; it's just boring, and I don't want to reach a common conclusion with a person who can only whine about how he was treated.
After a long while I came to the conclusion that pretty much every captcha is crackable, but that ReCaptcha was not.
Everything is crackable; it's only a question of how much time and money you invest in cracking it.
While reading the site jitspoe mentioned, something called proof of work came to mind. The idea is to hash a string repeatedly until the first X bits of the digest are zero. That forces bots to spend computation time before they can register. With a suitably chosen difficulty, it pretty much ruins the economics of automated registration, yet a human's browser can compute the proof in the background while they are filling in the form.
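The scheme above can be sketched roughly like this (a minimal hashcash-style proof of work in Python; the function names, the SHA-256 choice, and the challenge format are my own assumptions, not taken from any existing registration system):

```python
import hashlib


def leading_zero_bits(digest: bytes) -> int:
    """Count how many zero bits the digest starts with."""
    bits = 0
    for byte in digest:
        if byte == 0:
            bits += 8
            continue
        for shift in range(7, -1, -1):  # count leading zeros within this byte
            if byte >> shift:
                return bits
            bits += 1
    return bits


def solve(challenge: str, difficulty: int) -> int:
    """Brute-force a nonce so that sha256(challenge:nonce) has `difficulty` leading zero bits."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
        if leading_zero_bits(digest) >= difficulty:
            return nonce
        nonce += 1


def verify(challenge: str, nonce: int, difficulty: int) -> bool:
    """The server checks a submitted nonce with a single hash."""
    digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
    return leading_zero_bits(digest) >= difficulty


nonce = solve("register:someuser", difficulty=12)  # takes a few thousand hashes on average
assert verify("register:someuser", nonce, difficulty=12)
```

The asymmetry is the point: solving takes on average 2^difficulty hash attempts, while verifying takes exactly one, so the server can hand the challenge out with the registration form and check the answer cheaply. Raising the difficulty by one bit roughly doubles the average work a bot has to do per account.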